Saturday 31 January 2015

Information entropy as an anthropomorphic concept

After this post I wrote a paper on this subject; you can find it on arxiv.org.

One of the most interesting opinions I have encountered about entropy comes from E.T. Jaynes and E.P. Wigner (link). According to them, entropy is an anthropomorphic concept, in the sense that within a single physical system many thermodynamic systems can be found. The physical system can be examined from many points of view, each time examining different variables and therefore calculating entropy differently. In this post and in the paper on arxiv.org that followed it (link), I discuss how I think this concept can be applied to information entropy, that is, how Shannon’s definition of entropy can fit Jaynes’s and Wigner’s view. I also present a case in which I have used this combination of ideas in the past, in an Android application.
Information entropy was defined by Claude Shannon (link). It quantifies the amount of information transmitted by a channel or, more generally, the information content of a message M. Message M is composed from a set of n symbols; let P(i) be the probability of appearance of symbol i in M. Then, over all n symbols, the entropy is H = -Σ P(i) log P(i), where P(i) equals the frequency of appearance of i in M.
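As a rough illustration of this definition, here is a small Python sketch (the function name and example message are my own) that computes H for a message, taking each symbol’s frequency of appearance as its probability:

import math
from collections import Counter

def shannon_entropy(message):
    # P(i) is the frequency of appearance of symbol i in the message
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("hello world"))  # entropy in bits per symbol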
Let’s generalize this idea. Let S be a set of objects whose properties we describe using v variables for each object, and let each variable take values from a range [a, b] of discrete values. We now have a lot of information about S, and we can use entropy to analyze it. Exactly as Jaynes describes for the examination of physical systems, we may choose any subset v’ of v in order to examine the information content of S. The choice of v’ depends on how we wish to examine S, and this is what gives our analysis its anthropomorphic character. Using information entropy we can examine the distribution of the properties described by v’ over the objects of S.
For simplicity, let us examine S considering two properties of its objects, described by variables X and Y. For variable X, with P(x) the probability of appearance of any value x, the entropy is H(X) = -Σ P(x) log P(x). Similarly for Y, H(Y) = -Σ P(y) log P(y). In order to combine these two entropies we have to consider whether they are dependent.
It is known that for independent X and Y, H(X, Y) = H(X) + H(Y).
For the joint probability P(x, y), the joint entropy is H(X, Y) = -ΣΣ P(x, y) log P(x, y).
The conditional entropy H(X | Y), which uses the joint probability P(x, y) when X depends on Y, is H(X | Y) = -ΣΣ P(x, y) log(P(x, y)/P(y)).
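The following Python sketch illustrates these quantities for objects described by two properties. The helper names and toy data are my own, and the conditional entropy is computed through the chain rule H(X | Y) = H(X, Y) - H(Y), which is equivalent to the formula above:

import math
from collections import Counter

def entropy(values):
    # H(X) = -sum P(x) log2 P(x) over the observed values of one property
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def joint_entropy(xs, ys):
    # H(X, Y) = -sum P(x, y) log2 P(x, y) over pairs of properties
    counts = Counter(zip(xs, ys))
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def conditional_entropy(xs, ys):
    # Chain rule: H(X | Y) = H(X, Y) - H(Y); equals H(X) when X and Y are independent
    return joint_entropy(xs, ys) - entropy(ys)

# Two properties recorded for the same set of objects (toy data)
xs = ["a", "b", "a", "c"]
ys = [1, 1, 2, 2]
print(entropy(xs), joint_entropy(xs, ys), conditional_entropy(xs, ys))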
These considerations about the entropy variables differ from Shannon’s classical definitions. Shannon defines the conditional probability as the probability of appearance of symbol x given the appearance of symbol y in the message, whereas the probabilities above concern two properties x and y of a single discrete object.
Without realizing it at the time, I had used all of the above in the development of an Android application; you may find it here or here. The goal of the application was to rate the strength of passwords on handheld devices. A strong password M must be complex, but since we focus on handheld devices, the frequency of appearance of each character in M is not the only property we may consider. The user has to switch the device’s keyboard layout in order to type upper case characters, lower case characters, symbols or numbers. This makes the input of M more involved than on desktop computers, and we may take this property into account when rating the complexity of M.
The rating of password strength still concerns the characters of a simple string M, but let us view them as discrete objects with more than one property. Variable X is defined on the set of characters available to the user, and P(x) is the frequency of appearance of each character x in M. Variable Y is the property of each character of being an upper case character, a lower case character, a symbol or a number, and P(y) is the probability of y being one of these classes. The two variables are independent; X does not depend on Y. As a result, H(X, Y) = H(X) + H(Y). Calculating entropy using this last equation provides a more accurate analysis of the information content of M.
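A minimal sketch of such a rating, assuming a simple split of characters into four classes, might look like the following. This is not the code of the actual application, only an illustration of combining H(X) and H(Y) for one password:

import math
from collections import Counter

def entropy(values):
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def char_class(ch):
    # Y: the class a character belongs to, i.e. which keyboard layer
    # the user has to switch to on a handheld device
    if ch.isupper():
        return "upper"
    if ch.islower():
        return "lower"
    if ch.isdigit():
        return "digit"
    return "symbol"

def password_rating(password):
    # X (the characters) and Y (their classes) are treated as independent,
    # so the combined information content is H(X) + H(Y)
    chars = list(password)
    classes = [char_class(c) for c in chars]
    return entropy(chars) + entropy(classes)

print(password_rating("Passw0rd!"))  # higher values suggest a more complex password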
For a more detailed presentation see my paper, currently published on arxiv.org.

Friday 30 January 2015

The illegal and misleading practices of mobogenie.com

Lately I found out that there is a site called mobogenie.com on which two of my Android apps are published. I have not submitted them to this site and I do not want them published there. But this has not stopped them from presenting the apps on the site and putting a download button next to them.
When you press the download button you get an .apk file that is supposed to be the app you are viewing on the site; it has the same name. In reality it is the “mobogenie” app, which tricks you into installing it on your device. Then, perhaps, you will have the chance to get the app that you actually wanted. In their disclaimer they admit that they have no control over app submissions; they do not really know whether the app you submit is actually yours. A few days ago there was a link on the main page with directions on how to claim your own app from mobogenie if you find it there without having submitted it yourself.
Of course I sent an e-mail; they responded that they would look into it. Two weeks later, no answer.
It turns out that they have built a whole market in which it is not clear whether they have the right to distribute the apps that they distribute. This practice is illegal; at least in my case, the EULA of my apps makes clear that they may be distributed only by authorized distributors, and mobogenie is not one of them.
It is also misleading. The app that they first let you download is not the one you asked for.
Most app stores and markets have an app of their own that they ask you to install before you start downloading apps. But it is quite clear when you are downloading the store’s app and when you are downloading the app that you want. Before submitting an app to legitimate app stores like Amazon and Google Play you go through an extensive identity check; they even cross-check the information you provide with the IRS. Mobogenie could do something like this, but they do not seem to care, as they are located in Asia and you probably won’t bother to file an international legal complaint.
Besides, the apps of mine that I saw on mobogenie are distributed for free on Amazon and Google Play; the mobogenie people probably think that I should not mind having my free apps on their site. They could have asked first.