It’s time to audit your code: it appears some no-code/low-code components used in iOS and Android apps may not be as secure as you thought. That’s the big takeaway from a report revealing that disguised Russian software is being used in apps by the US military, the CDC, the British Labour Party, and other entities.
When Washington becomes Siberia
The problem is that code developed by a company called Pushwoosh has been distributed in thousands of apps from thousands of entities. These include the Centers for Disease Control and Prevention (CDC), which says it was led to believe Pushwoosh was based in Washington when the developer is actually based in Siberia, Reuters explains. A visit to Pushwoosh’s Twitter feed shows the company still claiming to be based in Washington, DC.
The company provides code and data-processing support that can be used in apps to profile what smartphone users do online and to send personalized alerts. CleverTap, Braze, OneSignal, and Firebase offer similar services. To be fair, Reuters has no evidence that the data collected by the company has been misused. But the fact that the firm is based in Russia is problematic, as the information it handles is subject to Russian data law, which could pose a security risk.
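To make the stakes concrete, here is a minimal sketch of how this kind of SDK typically gets wired into an iOS app. ThirdPartyEngagementKit is a hypothetical stand-in for any vendor kit of this sort (it is not Pushwoosh’s actual API); the point is that the push token, and usually plenty of device metadata with it, is handed straight to someone else’s servers.

```swift
import UIKit
import UserNotifications
// Hypothetical third-party push/analytics SDK -- a placeholder, not any real vendor's API.
import ThirdPartyEngagementKit

final class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Ask the user for permission to show notifications.
        UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .badge, .sound]) { granted, _ in
            if granted {
                DispatchQueue.main.async {
                    application.registerForRemoteNotifications()
                }
            }
        }
        return true
    }

    func application(_ application: UIApplication,
                     didRegisterForRemoteNotificationsWithDeviceToken deviceToken: Data) {
        // This is the step worth auditing: the device token leaves your app
        // and lands on the vendor's infrastructure, wherever that happens to be.
        ThirdPartyEngagementKit.register(deviceToken: deviceToken)
    }
}
```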
It may not be, of course, but it’s unlikely that any developer handling data that could be seen as sensitive would want to take that risk.
What is the background?
While there are many reasons to be suspicious of Russia at this point, every nation has its own third-party component developers that may or may not put user security first. The challenge is figuring out which ones do and which ones don’t.
The reason code like this from Pushwoosh gets used in applications is simple: money and development time. Mobile app development can be expensive, so to reduce costs, some apps use off-the-shelf code from third parties for certain tasks. Doing so keeps costs down, and since we’re moving quickly toward no-code/low-code development environments, we’re going to see more of these modular building blocks used in app development.
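To illustrate how low the barrier is, here is a hypothetical Swift Package Manager manifest (the vendor name and URL below are placeholders, not a real dependency): a single line pulls a vendor’s code, and its data-handling behavior, into your app.

```swift
// swift-tools-version:5.7
// Package.swift -- one dependency line is all it takes to ship a vendor's code inside your app.
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        // Who wrote this, where do they operate, and what does it phone home?
        .package(url: "https://example.com/vendor/EngagementSDK.git", from: "1.0.0")
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["EngagementSDK"])
    ]
)
```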
That’s fine, as modular code can bring huge benefits to apps, developers, and businesses, but it highlights an issue any business using third-party code needs to examine.
Who owns your code?
How secure is the code? What data is collected using the code, where does this information go, and what power does the end user (or company whose name is on the app) have to protect, delete or manage this data?
There are other challenges: when using such code, is it updated regularly? Does the code itself remain secure? How rigorously is the software tested? Does the code contain any undisclosed tracking scripts? What encryption is used, and where is the data stored?
The problem is that if the answer to any of these questions is “don’t know” or “none”, then the data is at risk. This emphasizes the need for robust security assessments around the use of any modular component code.
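One way to turn those questions into something enforceable is to put a choke point between your app and any embedded SDK. A minimal sketch, with a hypothetical AnalyticsVendor standing in for whichever kit you actually use: events pass through a wrapper that forwards only an approved allow list of fields, so compliance review has one place to look instead of every call site.

```swift
import Foundation

// Hypothetical stand-in for an embedded analytics/push SDK.
enum AnalyticsVendor {
    static func track(_ name: String, properties: [String: String]) {
        // Imagine the real SDK call here.
    }
}

struct AuditedAnalytics {
    // The allow list is the policy: anything not named here never leaves the app.
    private static let allowedProperties: Set<String> = ["screen", "plan_tier", "app_version"]

    static func track(_ name: String, properties: [String: String]) {
        let filtered = properties.filter { allowedProperties.contains($0.key) }
        AnalyticsVendor.track(name, properties: filtered)
    }
}

// Usage: the email address is silently dropped before anything reaches the vendor.
AuditedAnalytics.track("checkout_viewed",
                       properties: ["screen": "checkout",
                                    "email": "user@example.com"])
```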
Data compliance teams need to test these things thoroughly – “minimum tests” are not enough.
I would also argue that an approach in which all collected data is anonymized makes a lot of sense. That way, should information leak, the chance of misuse is minimized. (The danger with personalization technologies that lack robust protection for information at every point in the exchange is that this data, once collected, becomes a security risk.)
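For what anonymizing, or at least pseudonymizing, can look like in practice, here is a minimal sketch of my own (not something from the report): hash an identifier with an app-held salt before it is handed to any embedded SDK, so the raw account ID never leaves the device.

```swift
import Foundation
import CryptoKit

// Pseudonymize an identifier before sharing it with any third-party service.
// Note: this is pseudonymization rather than full anonymization; the salt and
// any accompanying fields still need protecting.
func pseudonymizedID(for userID: String, salt: String) -> String {
    let digest = SHA256.hash(data: Data((salt + userID).utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Example: the analytics payload carries the hash, never the raw account ID.
let externalID = pseudonymizedID(for: "account-12345", salt: "per-app-secret")
print(externalID)
```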
Surely the implications of Cambridge Analytica illustrate why obfuscation is a necessity in a connected age?
Apple seems to understand this risk. Pushwoosh code is used in around 8,000 iOS and Android apps. It is important to note that the developer says the data it collects is not stored in Russia, but experts cited by Reuters point out that this may not protect it from exfiltration.
In a way, it doesn’t matter, since security is about anticipating risk rather than waiting for danger to strike. Given how many companies fail after being hacked, it’s better to be safe than sorry when it comes to security policy.
That’s why any company whose development team relies on off-the-shelf code should ensure that third-party code complies with its security policy. After all, it’s your code, with your company’s name on it, and any misuse of that data due to inadequate compliance testing will be your problem.
Please follow me on Twitter, or join me in the AppleHolics bar & grill and Apple Discussions groups on MeWe. Also now on Mastodon.