Some app makers delayed updating to a more secure version of Google’s Android platform just so they could continue to harvest and sell users’ personal data, finds new research from the Robert H. Smith School of Business. The working paper published Oct. 2 reveals the tactic backfired for the devious developers, who suffered hits to their reputations and revenue. 

Siva Viswanathan, Dean's Professor of Information Systems and Digital Innovation, and Maryland Smith Ph.D. candidate Raveesh Mayya looked at what happened when Google rolled out Android 6.0 in 2015 with tighter privacy rules. Before the version dubbed “Marshmallow,” app developers could use blanket permissions to gain broad access to sensitive personal information like location data, contact lists and photos.

“They sell the data, and you never know who it’s being sold to and what it’s being used for,” Viswanathan said. “The key thing is, until very recently, with mobile apps you didn’t even have a choice. The only way to download and use any app was to accept all the permissions it sought.”

And users often weren’t reading that entire list of disclosures, Mayya said. “They were used to just scrolling to the bottom and clicking ‘accept.’” That gave app makers access to information consumers might not have been comfortable sharing had they paid attention.

But in recent years, as consumers have become much more sensitive to privacy issues in the wake of scandals like the Cambridge Analytica-Facebook firestorm—when a firm hired by the Trump campaign gained access to information on millions of users—platforms have tightened access to user data. Google’s upgraded privacy options in Android 6.0 require app makers to ask for individual permissions to sensitive data, allowing users to pick and choose the information apps can peruse.
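The shift is visible in an app's Android manifest. The fragment below is illustrative (the package name and permission list are hypothetical, not from the paper): before Marshmallow, everything declared here was granted in bulk at install time; once an app targets API level 23 or higher, permissions Android classes as "dangerous," such as location and contacts, must also be requested one at a time while the app runs, and the user can deny each individually.

```xml
<!-- Illustrative manifest fragment; package name and permissions are examples. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.flashlight">

    <!-- Pre-Marshmallow: declaring a permission here was enough. Users had to
         accept the whole list at install time or not install the app at all. -->
    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
    <uses-permission android:name="android.permission.READ_CONTACTS" />

    <!-- Targeting API 23+ opts the app into runtime permissions: the entries
         above must now also be requested individually at runtime, and the
         user may grant some and deny others. -->
    <uses-sdk android:targetSdkVersion="23" />
</manifest>
```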

Google gave Android app developers a three-year window to upgrade, offering an interesting opportunity for the researchers to see when app makers decided to upgrade, track how they changed privacy settings and find out what happened afterward.

To do so, Mayya and Viswanathan developed their own app to track all the others and installed more than 13,600 Android apps, then monitored which apps changed their permissions each month and what data they sought access to before and after upgrading.
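The core of that monitoring can be sketched as a simple diff over monthly snapshots. This is not the authors' actual pipeline, just a minimal illustration: given a time-ordered list of the permissions an app requests, flag the months in which it stopped requesting "dangerous" permissions (the permission names and sample app history below are hypothetical).

```python
# Illustrative sketch, not the researchers' actual code: detect when an app
# drops permissions from Android's "dangerous" class between monthly snapshots.

DANGEROUS = {"ACCESS_FINE_LOCATION", "READ_CONTACTS", "READ_EXTERNAL_STORAGE"}

def dropped_permissions(snapshots):
    """snapshots: list of (month, set_of_permissions), in time order.
    Returns a list of (month, dropped_set) for each month in which the app
    stopped requesting at least one dangerous permission."""
    changes = []
    for (_, prev), (month, cur) in zip(snapshots, snapshots[1:]):
        dropped = (prev - cur) & DANGEROUS
        if dropped:
            changes.append((month, dropped))
    return changes

# Hypothetical flashlight app that sheds location access in March 2016.
history = [
    ("2016-01", {"CAMERA", "ACCESS_FINE_LOCATION"}),
    ("2016-02", {"CAMERA", "ACCESS_FINE_LOCATION"}),
    ("2016-03", {"CAMERA"}),
]
print(dropped_permissions(history))  # [('2016-03', {'ACCESS_FINE_LOCATION'})]
```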

The first thing the researchers noticed was that not all the apps were quick to upgrade, prompting them to figure out why. 

“We didn’t expect apps to have a strategic intent to delay upgrading,” Viswanathan said. “We thought it would be more about operational issues or things like that. But when we see that apps of a certain type—the ones asking a lot of nonessential permissions to sell user data or display personalized advertisements—are the ones delaying, then you begin to see that there’s a strategic intent.”

Then the researchers looked at what happened to the apps after they finally upgraded to the tighter privacy rules, and found that delaying the upgrade didn’t pay off in the long run. With the eventual upgrade, those apps couldn’t keep asking for unnecessary data to sell. And the fact that they took so long to upgrade didn’t sit well with consumers. 

“Users are less likely to give you those permissions, but more importantly, they give you lower ratings,” Mayya said. “They become more sensitive.”

The apps that delayed upgrades ended up receiving more negative reviews from consumers, which pushed them further down the rankings in the Google Play Store, leading to fewer downloads.

“The apps that delayed upgrading paid a price,” Mayya said. “Their bad behavior didn’t pay off in the long run.”