Tuesday, November 18, 2014

Disabling Cell Broadcast/Service Messages in Latest Android Phones

I love my newly bought Moto G (2nd Gen) Android phone. But every now and then, an annoying pop-up floats on the screen, asking me to subscribe to promotional offers. Pressing the 'OK' button automatically subscribes to the offer, and we don't want that to happen, as it costs money.
[Screenshot: a Tata Docomo VAS 'dive-in' pop-up]

Googling revealed that these are 'Cell Broadcast' messages, broadcast by your network operator. They are also known by other terms like 'Service Messages'.

In earlier Android versions, you could disable them directly from the message settings, where the option goes by different names such as 'Cell Broadcast', 'Push Messages' or 'Service Messages'.

But with the Moto G (and maybe in Android 4.4 and above), this option is missing from the message settings, and there is nothing similar in the call or wireless-and-network menus.

Calling the network operator gives the same answer: "Please do it yourself. The option is in your phone. Please disable 'Service Messages' in your phone." We explored almost all the menu options, but nothing sounded similar.

Finally, after hours of struggle, we found the option.

Option-A


"The menu option is there, but called by a different name! It is "Emergency Alerts"

[Screenshot: the 'Emergency Alerts' settings screen]

The individual alerts are named differently (like 'threats to life', 'display child abduction emergency', etc.), but disabling these options did the trick.

There are also other ways to disable 'Cell Broadcast' messages, given below.

Option-B [Use another device, temporarily, to deactivate 'Cell Broadcast']

As 'Cell Broadcast' is not tied to an individual device, but to your SIM in general, we can opt for the workaround below.

a) Find an old phone which has an option to enable/disable Broadcast Channels and Messages (in my case it's a Samsung D500).

b) Insert the first SIM in that old phone and go to the settings menu for Broadcast messages.

c) At first, Channels and Messages usually appear as Disabled. Change the setting for both Channels and Messages to Enabled in the old phone.

d) Switch the old phone off and on again.

e) Go to the settings menu for Broadcast messages. Change the settings for both Channels and Messages to Disabled.

f) Switch the old phone off and on again.

g) Switch off the old phone and remove the SIM card.

h) Repeat steps (a-g) for the second SIM.

i) Insert the SIM cards back into the new phone and switch it on.

Original source here.

Option-C [Subscribe to Do Not Disturb (DND) option of your network operator]


Subscribe to or request the Do Not Disturb option from your network provider. Most of them support it; simply send an SMS to a toll-free number, or register for it on their website.

The Do Not Disturb (DND) function on most PBX or PABX systems prevents calls from ringing on an extension for which DND is activated. Wiki link here.

Note: For Tata Docomo, simply call or SMS 1909 (toll-free): SMS 'STOP' to 1909. Original source here.


Conclusion


Hope this might be helpful to someone facing a similar issue.

Friday, November 7, 2014

'Dynamic', to ease 'Reflection' - Microsoft.NET

One of the nice features that we're quite obsessed with in the Microsoft.NET languages (C#, VB.NET) is

'Reflection'

You might already know the capabilities of 'Reflection' in .NET. Most often, it is heavily used along with 'Custom Attributes'.

But we've found a very good use for it:

Reflection to achieve 'Late Binding' in our code

(Especially in scenarios where you are integrating your application with external COM servers, like Out-Of-Process servers [EXE] or In-Process servers [DLL], written in other native languages like Visual C++ [ATL or MFC])

This is not limited to integration with COM servers; we can also leverage 'Late Binding' with .NET types (especially those residing in other assemblies).

In other terms,

through 'Reflection' you can remove the version dependency

on a referenced assembly or COM object in your project.

But implementing 'Version Independence' through the Reflection APIs is quite complex and untidy. The new 'dynamic' keyword in C# helps specifically on that front: you don't have to dig through the available methods/properties with the Reflection APIs; it is done under the hood.

Before going into details, we should be aware of the two well-known scenarios below:

Early Binding
and
Late Binding


Early Binding


To understand the scenario, let's take a case study that was a real exercise for our team. We had to create a .NET Windows application that extends 'Office Outlook' notifications. By default, Outlook does not show 'Desktop Notifications' for new mails arriving at 'Secondary Mailboxes'. We were trying to add that functionality, long awaited by our support team, as they have dedicated 'Secondary Mailboxes' for their support-related mails. Until then, they had to periodically check those mailboxes for new mails. Quite tedious, right? So if we could show the typical 'Outlook Mail Notification' for new mails arriving at 'Secondary Mailboxes', that would be a huge time saver for our support team.

So obviously, we needed to create a .NET Windows application integrated with 'Office Outlook'. As you may know, Office Outlook is a COM server (Out-Of-Process, EXE). At the time of development, we all had Office Outlook 2007 installed on our machines. So we just took the easy way out: we created the .NET project and added the 'Office Outlook 2007' COM reference from the COM tab. The implementation was quite easy and we completed the project in weeks.

Deployment commenced, and the support team was quite happy with the features. They didn't have to periodically scan the mailboxes anymore! A few more months passed, until the inevitable happened!

The IT infrastructure team rolled out a new version of Office (Office 2012), so Office Outlook 2007 was upgraded to Office Outlook 2012. To our surprise, our application crashed.

Why? This is an

'Early Binding' scenario, where our application requires its dependencies (in this case Outlook) to be available with the exact version that was referenced during development.


So what could be the solution? Simple: just recompile the project, changing the reference to Office Outlook 2012 instead of Office Outlook 2007. But what happens when a new version of Office (say, Office365) is deployed again? Every time, we have to recompile and redeploy our application.

Obviously this is not a real solution. The actual solution is 'Late Binding'.

Late Binding


If we can access 'Office Outlook' at run time, without caring about the version, that is the most viable solution. We don't have to recompile the application every time a new Office upgrade happens.

This 'Late Binding' is achieved through a technique called 'Reflection' in .NET.

The application we've just talked about is detailed in this blog. It has been completely implemented through '.NET Reflection' and it works with any version of Office Outlook (2007, 2012, 365, etc.). If you're curious about the implementation, you can also download the source from the above link.

OK, that's the whole theory. Now, if you explore the '.NET Reflection' APIs, they are quite difficult to work with (we will soon get into that with a real example). That means we are compromising 'Ease of Implementation' to achieve 'Version Independence'. Can we have the best of both worlds?

i.e. 'Ease of Implementation' and 'Version Independence'. Yes you can, with the 'dynamic' feature of C#.


We can understand the concepts with a simple example and code walkthrough.



Code Sample (Early Binding [Assembly Reference] ---> Late Binding [Reflection] ---> Late Binding [Dynamic])



For this exercise, we will do the below with 'Office Word' (a COM server written in Visual C++!): a simple Office Word automation from a .NET program.

From our .NET application, we will perform the following:

a. Create the Word Application Object
b. Create a Blank Word Document Object
c. Make the 'Word' application visible
d. Close the Word application


Early Binding Implementation:


[Screenshot: Early Binding code; 'Interop.Word' appears under the project's References]

As we've already discussed, we have a static dependency on 'Interop.Word' (in our case, Office Word 2007). But the implementation is quite straightforward and easy.
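Since the original screenshot survives only as a placeholder, here is a minimal sketch of what the early-bound version looks like, assuming a project reference to the 'Microsoft.Office.Interop.Word' assembly and C# 4 or later (so the optional COM parameters can be omitted):

using Word = Microsoft.Office.Interop.Word;

class EarlyBindingDemo
{
    static void Main()
    {
        // a. Create the Word application object (compile-time bound to the referenced interop version)
        var wordApp = new Word.Application();
        // b. Create a blank Word document
        var document = wordApp.Documents.Add();
        // c. Make the 'Word' application visible
        wordApp.Visible = true;
        // d. Close the Word application
        wordApp.Quit();
    }
}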

Late Binding Implementation (Reflection):


[Screenshot: Reflection-based Late Binding code; no 'Office Word Interop' under the project's References]

As you can see, the implementation is quite lengthy and tedious with 'Reflection'. But here is the best part: you have not even referenced the 'Office Word Interop' in the project's 'References' section, and that is why it works with every version of 'Office Word'.
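Again, in place of the lost screenshot, here is a minimal sketch of the Reflection-based version. 'Word.Application', 'Documents', 'Add', 'Visible' and 'Quit' are the standard Word object model names; everything is resolved at run time, with no interop reference in the project:

using System;
using System.Reflection;

class ReflectionDemo
{
    static void Main()
    {
        // a. Resolve the COM type by its ProgID at run time -- no interop reference needed
        Type wordType = Type.GetTypeFromProgID("Word.Application");
        object wordApp = Activator.CreateInstance(wordType);

        // b. wordApp.Documents.Add() -- a blank document
        object documents = wordType.InvokeMember("Documents",
            BindingFlags.GetProperty, null, wordApp, null);
        object document = documents.GetType().InvokeMember("Add",
            BindingFlags.InvokeMethod, null, documents, null);

        // c. wordApp.Visible = true
        wordType.InvokeMember("Visible",
            BindingFlags.SetProperty, null, wordApp, new object[] { true });

        // d. wordApp.Quit()
        wordType.InvokeMember("Quit",
            BindingFlags.InvokeMethod, null, wordApp, null);
    }
}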

Late Binding Implementation (Dynamic):



[Screenshot: 'dynamic'-based Late Binding code; no 'Office Word Interop' under the project's References]

As you can see, with the 'dynamic' keyword you have the best of both worlds. You don't have the dependency on the 'Office Word Interop', and at the same time you have an easy implementation, just like the 'Early Binding' scenario.
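And a minimal sketch of the 'dynamic' version, under the same assumptions as above. The dispatch plumbing we hand-wrote with Reflection is now generated by the compiler:

using System;

class DynamicDemo
{
    static void Main()
    {
        // Still no interop reference: the type is resolved by ProgID at run time
        Type wordType = Type.GetTypeFromProgID("Word.Application");
        dynamic wordApp = Activator.CreateInstance(wordType);

        dynamic document = wordApp.Documents.Add(); // b. blank document
        wordApp.Visible = true;                     // c. make Word visible
        wordApp.Quit();                             // d. close Word
    }
}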

Conclusion:


'Early Binding' (static dependencies) may be a problem while referencing COM/external assemblies in your .NET program.

'Late Binding' can resolve this dependency issue, with 'Reflection'.
But 'Reflection' makes your code lengthier and more complex.


The 'dynamic' keyword brings you the best of both worlds: the ease of 'Early Binding' plus 'Version Independence'.

Comments, Suggestions! Most Welcome!

Dynamic JSON parsers in Microsoft.NET

We are aware of the typical static JSON parsers, which can parse a JSON string into a predefined POCO C# object. This is practical for scenarios where you have a pre-defined JSON schema before you design your C# classes. But your JSON object structure (schema) must not change; if it does, your code breaks while parsing the dynamic JSON string into your static C# object.

But mostly,

JSON is dynamic in nature

That's why most NoSQL databases use it, or at least a derivative of it (like BSON in MongoDB). We have a situation here where we have to parse such dynamic JSON objects, which can change over time or change based on other elements in the JSON.

For example:

If you represent a 'system event' as JSON, you have a different JSON for a 'KeyEvent' compared to a 'MouseEvent'. See the examples below.

"Key Event" example:
{
    eventType: "Key",
    Args:
    {
        KeyCode: "35",
        Special: "Shift"
    }
}


"Mouse Event" example:
{
    eventType: "Mouse",
    Args:
    {
        Button: "Left",
        Point:
        {
            X: "30",
            Y: "45"
        }
    }
}



This is just a simple sample that shows how dynamic JSON can change over time. Static JSON parsers won't do well in such scenarios.

Our objective was to parse such dynamic JSON and put the values as key/value pairs into an RDBMS database.

Our research on dynamic JSON parsers turned up two options:



1. System.Json Namespace (From Microsoft)

2. Newtonsoft.Json Namespace (From Newtonsoft)


Both can be installed through NuGet in Visual Studio.

Install Newtonsoft JSON through package manager console
PM> Install-Package Newtonsoft.Json

Install System.Json through package manager console
PM> Install-Package System.Json -Version 4.0.20126.16343

System.Json Namespace



The Microsoft way of doing it. But it has a major limitation: if you have multiple JSON objects in the same string, it does not support parsing them, unless you split them by your own means.

The below shows a simple usage of System.Json for parsing a JSON string.

string jsonStr = "{\"Name\":\"Test\",\"Marks\":[\"20\",\"81\"]}";
JsonValue parsedJsonObject = JsonValue.Parse(jsonStr);
switch (parsedJsonObject.JsonType)
{
    case JsonType.String:
    case JsonType.Number:
    case JsonType.Boolean:
        // JSON primitive: get the value by converting it to a string
        string value = Convert.ToString(parsedJsonObject);
        break;
    case JsonType.Array:
        JsonArray jArray = parsedJsonObject as JsonArray;
        for (int index = 0; index < jArray.Count; ++index)
        {
            JsonValue jArrayItem = jArray[index];
            // Now recursively parse each array item, i.e. jArrayItem
        }
        break;
    case JsonType.Object:
        JsonObject jObject = parsedJsonObject as JsonObject;
        foreach (string key in jObject.Keys)
        {
            JsonValue jSubObject = jObject[key];
            // Now recursively parse each sub-item, i.e. jSubObject
        }
        break;
}


Newtonsoft.Json Namespace



An excellent JSON parser. It also performs well relative to other parsers when you have a bulk of JSON objects to parse.

The below shows a simple usage of Newtonsoft.Json for parsing a JSON string.

string jsonStr = "{\"Name\":\"Test\",\"Marks\":[\"20\",\"81\"]}";
JToken parsedJsonObject = JToken.Parse(jsonStr);
switch (parsedJsonObject.Type)
{
    case JTokenType.Raw:
    case JTokenType.Boolean:
    case JTokenType.Bytes:
    case JTokenType.Date:
    case JTokenType.Float:
    case JTokenType.Guid:
    case JTokenType.String:
    case JTokenType.TimeSpan:
    case JTokenType.Integer:
    case JTokenType.Uri:
        // JSON primitive: get the value by converting it to a string
        string value = Convert.ToString(parsedJsonObject);
        break;
    case JTokenType.Array:
        JArray jArray = parsedJsonObject as JArray;
        for (int index = 0; index < jArray.Count; ++index)
        {
            JToken jArrayItem = jArray[index];
            // Now recursively parse each array item, i.e. jArrayItem
        }
        break;
    case JTokenType.Object:
        JObject jObject = parsedJsonObject as JObject;
        foreach (JProperty property in jObject.Properties())
        {
            JToken jSubObject = jObject[property.Name];
            // Now recursively parse each sub-item, i.e. jSubObject
        }
        break;
}
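To make the recursion hinted at in the comments concrete, here is a sketch of a small helper (the 'Flatten' method is our own illustration, not from any library) that walks a parsed Newtonsoft.Json token tree and emits one path/value pair per leaf, which is essentially what our RDBMS key/value load needed:

using System;
using Newtonsoft.Json.Linq;

class FlattenDemo
{
    static void Main()
    {
        string jsonStr = "{eventType:'Mouse',Args:{Button:'Left',Point:{X:'30',Y:'45'}}}";
        Flatten(JToken.Parse(jsonStr), "");
        // Prints:
        //   eventType = Mouse
        //   Args.Button = Left
        //   Args.Point.X = 30
        //   Args.Point.Y = 45
    }

    // Walk the token tree and emit one "path = value" pair per leaf.
    static void Flatten(JToken token, string path)
    {
        switch (token.Type)
        {
            case JTokenType.Object:
                foreach (JProperty property in ((JObject)token).Properties())
                    Flatten(property.Value,
                        path.Length == 0 ? property.Name : path + "." + property.Name);
                break;
            case JTokenType.Array:
                JArray array = (JArray)token;
                for (int index = 0; index < array.Count; ++index)
                    Flatten(array[index], path + "[" + index + "]");
                break;
            default:
                // Leaf value (string, number, boolean, ...)
                Console.WriteLine(path + " = " + token);
                break;
        }
    }
}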


Also, Newtonsoft has one more advantage over System.Json:

it can parse multiple JSON objects included in the same string

For example, it can parse a JSON string like this:
string jsonStr = "{hello:'there'} {goodbye:'you',Thanks:'you'}";

The below code snippet shows how to parse a string that contains multiple JSON objects. It reads the two different JSON objects in two while-loop iterations.

string jsonStr = "{hello:'there'} {goodbye:'you',Thanks:'you'}";
using (var stream = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(jsonStr)))
{
    var serializer = new JsonSerializer();
    using (var reader = new StreamReader(stream))
    using (var jsonReader = new JsonTextReader(reader))
    {
        // Setting to read multiple JSON objects, if present
        jsonReader.SupportMultipleContent = true;
        while (jsonReader.Read())
        {
            var obj = serializer.Deserialize(jsonReader);
        }
    }
}


Conclusion




So in essence, 'System.Json' is the obvious option for those who always want packages provided by Microsoft. For client environments where security is a major concern, this may be the most viable option.

But if you need more advanced options, like parsing a JSON string that contains multiple JSON objects, you can rely on the 'Newtonsoft.Json' namespace.

If you're aware of other such parsers, mention them in the comments.


High Performing Linux File System With EXT4 (Journal Off)

For virtualization purposes, we required a very high-performing Linux file system (host machine: Lubuntu 14.04 LTS). Earlier, we'd tried the 'EXT2' file system, but it was not impressive enough.

Later we learned that 'EXT4' is the best file system for high performance and stability. As you probably know, 'EXT4' is a journaling file system by default, which costs a bit on the performance front.

So if you opt for the combination below, you will get the ultimate file system performance, which benefits your host machine as well as guests (I/O performance):


1. Use EXT4 File System

2. Switch Off 'Journaling'

3. Enable "Write Back" mode

So, say you have your partition already formatted with 'EXT4' (journaling is on by default). Now, from the command line, do the below operations to switch off the journal. Note that the partition must be unmounted (or mounted read-only) while the journal is removed.

For example, say our EXT4 partition is '/dev/sda5':

# Enable writeback mode. Provides the best ext4 performance.
tune2fs -o journal_data_writeback /dev/sda5
# Delete journal option
tune2fs -O ^has_journal /dev/sda5

Now recheck the file system.

# Required fsck
e2fsck -f /dev/sda5
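To confirm the journal is really gone, you can list the filesystem features with the standard e2fsprogs tools; 'has_journal' should no longer appear:

# 'has_journal' should be absent from this list now
tune2fs -l /dev/sda5 | grep 'Filesystem features'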

We're done! We can further optimize performance with the additional EXT4 mount options below (fstab options: data=writeback, noatime, nodiratime):


/dev/sda5 /opt ext4 defaults,data=writeback,noatime,nodiratime 0 0

OK, the above applies if we already have an 'EXT4' file system.


What if we have an EXT2 file system that we need to turn into an 'EXT4' one with the above options? Just reformat it as EXT4 first (using the command below) and then perform the steps above. Note that mkfs erases all data on the partition, so back up anything you need first.
# Format /dev/sda5 as ext4 (this destroys the existing data)
mkfs.ext4 /dev/sda5

Hope this might be useful for someone doing similar experiments!

Single Sign-On With Office365 (Windows Azure) from Corporate Domains

Recently we gave an Office365 introduction, as part of our customer migrating their Office 2007 environments to the Office365 subscription-based model.

The most surprising part of the training was

the demonstration of corporate users logging directly in to Office365 (hosted in the Windows Azure cloud), without entering any credentials, from their Windows 7 laptops.


Wow! That's quite amazing.

But how could it be? Typically, an organization has its own domain (Active Directory/domain forest) and domain controllers. Let's say it is an on-premises Active Directory (e.g. was.int.mycompany1.org). Usually it is a well-defined boundary, within which user access and authorization are defined and restricted.

Normally, two separate domains have no security context in common, so they are logically separated. A user in the first domain has nothing to do with, and no access to, resources in the other domain (say, between was.int.mycompany1.org and was.int.mycompany2.org). But if the situation demands it, you can set up a common security context between two independent private domains using 'Domain Trusts'. In that case we say

"one domain trusts another domain, and a security context has been set up between the two using Trusted Domain Objects"

Using "Trusts", user in one domain can be authenticated and authorized in another domain. But that will be happening for 'Private' domains/organizations.

But "Windows Azure" is a public cloud that hosts Office365. So it is not a private domain and have to support multiple independent organizations any way. So how this really works? How it really separate one organization context from another, while validating users?

The answer lies in the fact that

"Windows Azure Active Directory is not a normal AD, but it is Multi-Tenant Active Directory"


So it supports multiple organizations and keeps them separate from each other. But how does 'Windows Azure Active Directory' know about the corporate user identities that are stored privately within the organization's on-premises Active Directory?

The user objects get synchronized/copied between the corporate active directory and the Windows Azure active directory.

This is done using a software component called 'DirSync', working in combination with

"Active Directory Federation Services (ADFS)"


The below figure will give you a better idea of this.

[Figure: the on-premises Active Directory synchronized to Windows Azure Active Directory via DirSync]


As you can see, the bottom oval represents your organization's domain (Active Directory), and it's being synced to 'Windows Azure Active Directory' using the 'DirSync' component.

That means once you've logged in to your on-premises domain controller, you need not log in again to access your 'Office365' applications from the 'Windows Azure' cloud. As 'Windows Azure AD' now has your details, it lets you in instantly: this is 'Zero Sign-On'.

This is called,

'Federated Identity' in Office365

and it is the most seamless of the identity scenarios. The organization's IT support team doesn't have to track two separate user identities, but can use the same identity as the corporate domain. It gives easy and fine-grained control, as adding/removing a user from the Office365 cloud simply means adding/removing the user object in the corporate Active Directory.

Apart from this, two more identity scenarios are possible with an Office365 subscription, though not as seamless as 'Federated Identity'. They are listed below.


[Figure: the three Office365 identity scenarios]


Cloud Identity:
You have a separate user identity for each user, other than the one used to log in to your corporate Active Directory. You are simply left with two separate user identities: one for logging in to the Office365 cloud and one for logging in to your corporate domain.

Directory & Password Synchronization:
Similar to 'Federated Identity', but synchronized in one way only (i.e From corporate AD to Windows Azure AD). Also the passwords are stored as hash values in 'Windows Azure AD'. User can use the same corporate credentials for their Office365 cloud as well. (But they have to re-enter it, each time they access Office365)