The study found that many mobile apps may have hidden, programmed behaviors that the average user would be totally unaware of. Apps normally operate by responding to user input, which can range from typed text to swipes and button presses.
In this particular study, 150,000 apps were examined: 100,000 chosen for their popularity in the Google Play store, another 20,000 chosen for their popularity in an alternative market, and 30,000 pre-installed apps that ship on Android devices.
Of those apps, 8.5 percent (12,706 apps) contained some kind of programming labeled by the research team as "backdoor secrets": hidden commands in the app that trigger background behaviors unknown to the user.
Some apps contained programmed master passwords that would allow anyone who knows the master password to potentially access private data. Others had secret keys that could trigger hidden options, such as bypassing a pay-to-play screen.
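A backdoor of this kind often amounts to little more than a hardcoded comparison buried in the app's input-handling code. The sketch below is a hypothetical illustration of the pattern, not code from any app in the study; the function name and password value are invented.

```python
# Hypothetical sketch of a "backdoor secret": a hardcoded master
# password hidden inside an app's login check. Invented example,
# not taken from any app examined in the study.

MASTER_PASSWORD = "debug-override-1234"  # invented value

def check_login(entered_password: str, stored_password: str) -> bool:
    """Return True if login should succeed."""
    # Normal path: compare against the user's stored password.
    if entered_password == stored_password:
        return True
    # Hidden path: a hardcoded master password grants access to any
    # account. Nothing in the app's interface hints that it exists.
    if entered_password == MASTER_PASSWORD:
        return True
    return False
```

To an ordinary user the app behaves normally; only someone who knows the hardcoded value can trigger the hidden path, which is why such behavior is invisible without inspecting the app's code.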
Another 4,028 apps (about 2.7 percent) were found to block content when it contained specified keywords meant to be censored, or when it involved cyberbullying or discrimination.
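Keyword-based blocking like this can be as simple as matching user input against a hidden blocklist compiled into the app. A minimal, hypothetical sketch (the list contents and function name are invented for illustration):

```python
# Hypothetical sketch of hidden keyword-based content blocking.
# The blocklist contents are invented placeholders.

BLOCKLIST = {"blockedword", "bannedterm"}

def is_blocked(message: str) -> bool:
    """Return True if the message contains any blocklisted keyword."""
    words = message.lower().split()
    return any(word in BLOCKLIST for word in words)

def submit_post(message: str) -> str:
    """Silently drop blocked posts; the user sees no explanation."""
    if is_blocked(message):
        return "post rejected"
    return "post published"
```

Because the list ships inside the app itself rather than being enforced visibly by a server policy, users have no way to know which words are being filtered or why.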
Researchers have recently discovered that mobile apps that work with Bluetooth devices have an internal design problem that makes them vulnerable to hackers. The problem is inherent to the way Bluetooth devices communicate with the apps used to control them.
Think of any common Bluetooth device: wearable health trackers, or smart devices like a thermostat, speaker, or home assistant. What allows app and device to find each other is a broadcast UUID, or universally unique identifier, which connects the app to the smart device and allows the two to communicate.
This identifier is also embedded in the mobile app's code; while that is essential for communication, it also makes the identifier vulnerable to hackers via the app itself.
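Because the UUID is compiled into the app, anyone who unpacks the app's code can recover it and then scan for nearby devices broadcasting the same identifier. A simplified sketch of that idea, with invented values (a real attack would use an actual BLE scanner rather than the simulated advertisement list here; the example UUID is the standard Bluetooth heart-rate service identifier):

```python
import re

# Hypothetical: a fragment of an app's decompiled source containing
# a hardcoded service UUID (here, the standard heart-rate service).
app_source = '''
SERVICE_UUID = "0000180d-0000-1000-8000-00805f9b34fb"
'''

# Step 1: extract the identifier from the app code with a pattern match.
UUID_PATTERN = re.compile(
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}"
)
target_uuid = UUID_PATTERN.search(app_source).group(0)

# Step 2: look for that UUID among advertisements picked up nearby.
# Simulated here as a list; a real scan would use BLE hardware.
nearby_advertisements = [
    {"device": "tracker-A", "uuid": "0000180f-0000-1000-8000-00805f9b34fb"},
    {"device": "tracker-B", "uuid": "0000180d-0000-1000-8000-00805f9b34fb"},
]
matches = [ad["device"] for ad in nearby_advertisements
           if ad["uuid"] == target_uuid]
```

This is the essence of the fingerprinting risk: the identifier that lets the legitimate app recognize its device equally lets anyone else recognize, and single out, that device.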
Even so, researchers say this doesn't mean you should throw away your smart devices.
Realizing this, the researchers built their own hacking device to test the extent of the vulnerability. Sweeping an area about a mile wide around Ohio State's campus, they set their sniffing program loose. Of some 5,800 devices detected, 94.6 percent (about 5,500 devices) were vulnerable to fingerprinting attacks, and 7.4 percent (431 devices) were vulnerable to unauthorized access and eavesdropping-style attacks.
Devices vulnerable to the latter kind of attack had flaws in the initial pairing between device and app that put them at risk of hacking. According to the researchers, app developers need to tighten defenses during this initial pairing process to fix the problem.