The case of FaceApp, the application that uses artificial intelligence to age a face, has put the spotlight on a common trend to which more users are paying attention.
When FaceApp is installed, it warns that all user data will be used and even transferred to third parties, so by agreeing to its terms and conditions, users lose control of their privacy.
Unfortunately, few users pay attention to the process of information acquisition utilized by app makers and social media giants, until it is too late. Instead of reading the terms and conditions, users simply click accept without thinking about the consequences.
It should come as no surprise that many mobile apps can collect as much information as they want without explicit consent from users, and without people noticing that their privacy no longer exists.
Thousands of applications circumvent the limitations imposed by their own policies, or by app platforms such as the Google Play Store and Apple’s App Store, to spy on people without authorization.
Think about the following questions for a moment:
Why does a mobile flashlight need to know a user’s location? Why does a photo-retouching application need access to the microphone? Why would a voice recorder need access to a person’s contact list?
In principle, these apps do not need such permissions to operate. When apps get hold of a contact list, a recording function or a microphone, they are usually after an extremely valuable asset: user data.
Users can grant or deny applications permission to access their location, contacts or files stored on the phone. But an investigation by a team of cybersecurity experts revealed that up to 12,923 apps have found ways to keep collecting private information even after users explicitly denied them permission.
This study highlights how difficult it is for users to safeguard their privacy.
Researchers from the International Computer Science Institute (ICSI) in Berkeley, together with the University of Calgary and AppCensus, analyzed a total of 88,000 applications from the Play Store and observed how thousands of them access information, such as location or device data, that the user had previously denied them access to.
The experts have not yet made public the full list of apps that engage in these practices. But according to the investigation, the list includes the app of Disneyland in Hong Kong, Samsung’s navigation app and the Chinese search engine Baidu.
The number of potential users affected by these findings is in the “hundreds of millions.”
This is a serious infraction, because the Android operating system requires apps to ask for consent to access data through permission prompts.
Consent works in a very similar way in both physical and non-physical worlds, especially when it comes to personal data.
It’s like in the case of a rape in which the victim expressly says no, but the abuser still commits the crime.
Narseo Vallina-Rodríguez, co-author of the study, points out that “it is not clear if there will be patches or updates for the billions of Android users that today use versions of the operating system with these vulnerabilities”.
Google has not specified if it intends to withdraw apps that violate its terms from the market or if it will take any action in relation to the applications that, according to the study, access the users’ data without the relevant permission.
Google has assured that the problem will be solved with Android Q, the next version of its operating system. The company intends to launch six beta versions throughout the year before releasing the final version during the third quarter of the year.
Of course, the question everyone asks is, how do the applications access the user’s private information without the necessary permissions?
The apps circumvent the control mechanisms of the operating system through side channels and covert channels.
Vallina makes the following comparison: “To enter a house [the user’s data], you can go through the door with the key the owner has given you [the permission], but you can also get in without the owner’s consent by exploiting a vulnerability in the door [a side channel] or with the help of someone who is already inside [a covert channel]. You can open a door with a key, but you can also find a way to open it without that key.”
The same happens when trying to access a device’s geolocation. An app may not have access to the GPS, but it can find a way to obtain the user’s position anyway.
One way to do this is through the metadata embedded in the photographs taken by the smartphone’s owner.
By default, each photograph taken by an Android user contains metadata such as the position and time at which it was taken. Several apps access the user’s location history simply by asking for permission to read the memory card, because that is where the photographs are stored, without ever having to request access to the GPS.
This is the case of Shutterfly, a photo-editing application. Researchers verified that it collected GPS coordinates from users’ images despite being denied permission to access their location.
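The photo-metadata leak works because EXIF data stores coordinates as degrees, minutes and seconds plus a hemisphere reference, readable by any app with storage access. A minimal Python sketch of the conversion, using hypothetical EXIF values (the tag names follow the EXIF GPS convention; no real photo is read here):

```python
# Sketch: how EXIF GPS metadata in a photo reveals location.
# EXIF stores latitude/longitude as (degrees, minutes, seconds)
# plus a hemisphere reference ("N"/"S", "E"/"W"); an app that can
# read storage can recover this without any GPS permission.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds to decimal degrees."""
    decimal = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative.
    return -decimal if ref in ("S", "W") else decimal

# Hypothetical GPS tags, as they might appear in a photo on the SD card.
exif_gps = {
    "GPSLatitude": (40, 24, 59.04), "GPSLatitudeRef": "N",
    "GPSLongitude": (3, 42, 9.0),   "GPSLongitudeRef": "W",
}

lat = dms_to_decimal(*exif_gps["GPSLatitude"], exif_gps["GPSLatitudeRef"])
lon = dms_to_decimal(*exif_gps["GPSLongitude"], exif_gps["GPSLongitudeRef"])
print(f"Photo taken near: {lat:.4f}, {lon:.4f}")
```

Because photos accumulate over time, reading these tags across a whole gallery yields not just one position but a movement history.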
It is also possible to obtain geolocation through the Wi-Fi access point, using the MAC address of the router: an identifier assigned by the manufacturer that can be correlated with existing databases to determine the user’s position “with a fairly accurate resolution”.
For an application to access this information legitimately, the user must grant a permission on the smartphone called “Wi-Fi connection information”.
But some apps obtain this data without that permission being granted. To do so, they extract the MAC address of the router, which the device obtains through the Address Resolution Protocol (ARP), used to connect to and discover the devices on a local network.
That is, applications can read a file that exposes the MAC address of the Wi-Fi access point: “If you read that file, which the operating system exposes without requiring any permission, you can learn the user’s geolocation completely without their knowledge.”
Many of these data leaks or abuses of user privacy are carried out by libraries: third-party services or mini-programs included in the application’s code.
These libraries run with the same privileges as the app that contains them, and users are often unaware that they exist.
“Many of these services have a business model that is based on obtaining and processing personal data,” says the researcher.
For example, applications such as the Hong Kong Disneyland Park app use the map service of the Chinese company Baidu.
Through it, they can access information such as the IMEI and other identifiers that Baidu’s libraries store on the SD card, without needing any permission.
Samsung’s health and navigation applications, which are installed on more than 500 million devices, have also relied on these kinds of libraries.
“The library itself exploits those vulnerabilities to access that data for its own purposes. It is not clear whether the app’s developer then accesses that data through the library,” he explains.
Vallina says that in their next study they will analyze the ecosystem of third-party libraries and the purposes for which the data is obtained.
They will also study the monetization models that exist in Android, and how transparent applications are about what they actually do versus what their privacy policies say they do.
To prevent these practices, co-author Joel Reardon points out the importance of carrying out research of this type, with the aim of “finding these errors and preventing them”.
If application developers can circumvent permissions, does it make sense to ask users for permission?
The researcher emphasizes that applications cannot circumvent every control mechanism, and that doing so will gradually become more difficult.
“The permission system has many flaws, but it still serves an important purpose,” he says.
These practices, carried out without users’ consent, violate, among other regulations, the General Data Protection Regulation (GDPR) and Spain’s Organic Law on Data Protection.
Under the GDPR, the developers of these applications could face fines of up to 20 million euros or 4% of the company’s annual turnover.
Their illegal access to private information could even constitute a crime against privacy that could lead to prison sentences.
Most of the responsibility lies with the developers. But the Google Play Store, the Apple App Store and the platforms that give applications access to their users’ data (like Facebook in the Cambridge Analytica case) also have a duty of oversight: to ensure that the applications they accept into their stores, or to which they grant access to user data, are safe.
Although each party is responsible for its own actions, some authority is needed to review the security of ICT applications and services before they reach the market.
In other sectors, some type of certification guarantees that a product or service is safe. No one would think of authorizing the circulation of cars with faulty brakes, let alone unsafe medicines, food or toys. In the ICT sector, however, it is normal for applications and services to be launched with security holes that are then patched on the fly.