A Q&A with Jack Walsh of ICSA Labs
With the proliferation of mobile devices, businesses from all sectors are now offering apps for consumer and employee use. However, data insecurity, the potential for lost personal information and a lack of developer experience pose a major liability for companies providing mobile apps. I talked to Jack Walsh, mobility programs manager of ICSA Labs, about the major security and privacy issues connected to mobile apps.
So many companies are offering mobile apps these days. What are some of the key issues that risk managers should be aware of?
One of the main problems is that apps are typically developed by a third party. In the olden days, when everyone simply used a computer, applications came from large, well-known and respected companies such as Microsoft and IBM, which followed lengthy processes and procedures to better ensure safety, security and functionality. With hundreds of thousands of apps available now, we have many smaller players developing mobile apps, and processes and procedures can differ significantly from one developer to the next. Even if you choose someone with experience, this is a relatively young field and it’s challenging to ensure that things are done properly. The second problem is that no one is able to test all of these apps—part of the trouble is that there are currently few good tools out there. The third concern is that apps need to comply with whatever regulatory guidance applies to the developing company’s industry, and that is going to remain a concern for many companies.
What kind of security risks are companies facing now?
An app can actually be malicious, doing something it was either intentionally or inadvertently made to do. For instance, there are disreputable ad networks out there that have served up malware, and this is becoming more prevalent. Another area to watch is third-party libraries—a developer who doesn’t want to build everything from scratch might link in an existing library outside of their control, and that can get you into trouble as well. Beyond that, apps might have vulnerable code or contain dead or recycled code; they might be overpermissioned or underpermissioned.
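The “overpermissioned” concern can actually be checked mechanically. As a minimal sketch (the allowlist and manifest contents here are hypothetical, and real review tools are more thorough), a script can parse an Android manifest with Python’s standard library and flag any permission the app has no stated need for:

```python
import xml.etree.ElementTree as ET

# Namespace Android uses for the android:name attribute
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

def flag_excess_permissions(manifest_xml, expected):
    """Return permissions declared in the manifest but absent from the allowlist."""
    root = ET.fromstring(manifest_xml)
    declared = {
        elem.get(ANDROID_NS + "name")
        for elem in root.iter("uses-permission")
    }
    return sorted(declared - expected)

# Hypothetical allowlist: permissions this app is expected to request
EXPECTED = {"android.permission.INTERNET"}

# Hypothetical manifest for a simple content-display app
manifest = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
  <uses-permission android:name="android.permission.SEND_SMS"/>
</manifest>"""

# Flags READ_CONTACTS and SEND_SMS as unexpected for this app
print(flag_excess_permissions(manifest, EXPECTED))
```

A simple diff like this won’t judge whether a permission is justified, but it surfaces the question a risk manager should then put to the developer.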
Is there a ‘case study’ you can offer as a lesson?
There have been issues with regard to libraries—there are people out there who intentionally create a harmless-looking library that contains hidden malware. It’s not an everyday occurrence, but we’re seeing it more. In Russia and China we’ve also seen many apps that surreptitiously send SMS messages to premium numbers, billed to the user without the user’s knowledge. This has happened multiple times recently. There are a number of studies of the top 100 free and paid Apple iOS and Google Android apps that describe how those apps—many of which we all use every day—are not safeguarding users’ sensitive information.
What can a company do to mitigate its mobile app risk exposure?
Well, first and foremost, they should ask questions of their developer and find out how they’re building the app and what sort of processes and procedures they follow. Many times a developer will give what seems like an acceptable answer—“I use a secure platform for app development”—but the risk manager should dig deeper. Press them to find out what’s being done beyond that in the way of testing. Who else, if anyone, looks at the code or tests the resulting app for the developer? It’s always easier for someone else to see mistakes than for the person who wrote the code. Most importantly, does the app adequately protect sensitive information when it stores and transmits it? Does it use encryption and other storage and data-transfer safeguards? Is it possible to defeat the authentication mechanism? Can other apps get access to the data through improper use of APIs in one or more libraries that are outside the direct control of the app developer? There are best practices out there to avoid these problems. Note, too, that testing should not be a one-time thing. Apps need to be tested throughout their life cycle: any time the code is upgraded or the operating system gets an update, the app should be looked at again. There’s no such thing as too much testing.
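One concrete instance of the “can the authentication mechanism be defeated?” question above: comparing secret tokens with ordinary string equality can leak timing information, because the comparison stops at the first mismatched character. A minimal sketch in Python (the token values here are hypothetical) uses the standard library’s constant-time comparison instead:

```python
import hmac

def verify_token(presented: str, stored: str) -> bool:
    """Constant-time comparison: runtime does not depend on how many
    leading characters match, unlike a plain == on strings."""
    return hmac.compare_digest(presented.encode(), stored.encode())

# Hypothetical stored API token for illustration only
STORED_TOKEN = "s3cr3t-token-value"

print(verify_token("s3cr3t-token-value", STORED_TOKEN))  # True
print(verify_token("wrong-guess", STORED_TOKEN))         # False
```

This is exactly the kind of detail that a second set of eyes, or an outside tester, tends to catch and the original author tends to miss.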
In summary…
Mr. Walsh, who tests mobile apps for security and safety, raises an important red flag here. All types of businesses and organizations, both tech and non-tech, are deploying newer technologies that can drastically increase their cyber liability risk exposure. Just the other day I saw that even my local hoagie shop was offering its own mobile app—God only knows where that user data is going!
Privacy matters. Moreover, if your mobile app doesn’t have a privacy policy that is transparent about the kinds of data being collected, stored and shared with third parties, it can land your company in major legal trouble. “Wrongful data collection,” or running afoul of your own privacy policy, is one of the fastest-growing areas of litigation in the cyber risk area. See, for example, the state of California suing Delta Air Lines over its mobile app, pursuing fines of $2,500 per download. Not having a privacy policy in this day and age is simply reckless.
Note: eRisk Hub® users, please see our free mobile app privacy policy template by Ron Raether, Esq. of Faruki Ireland & Cox P.L.L. in the Tools section.