
This post is from the Arxan blog and has not been updated since the original publish date.

Last Updated Apr 03, 2019 — Application Security expert

Vulnerability Epidemic in Financial Mobile Apps - Episode 1 [Video]

Application Security

 

Summary of research and findings


Well, we're here today to talk about the state of application security. And our guest, Alissa Knight, is here with some interesting research to talk about.

I'm Alissa Knight. I am a security analyst with Aite Group, and I am a 19-year veteran as a recovering hacker.

Great. Aaron?

I'm Aaron Lint. I'm Arxan's VP of research and chief scientist.

And I'm Rusty Carter. I'm the vice president of product management for Arxan. So Alissa, can you tell us a little bit about your research and how you went about it?

Sure. So I had started this research project into application security, specifically around application shielding, and how systemic it was within the financial services industry that companies were not shielding their apps. And so I had worked with my research director to identify three apps in each financial services sector that we were going to examine, which included wealth management, retail banking--basically all of the individual verticals within the horizontal sector.

And the results were very shocking. I knew that they would be bad, but I didn't know that they would be that bad. Out of all of the apps, only one company had implemented code obfuscation and application shielding to protect its source code. And that was not a US company. So the problem is very systemic across the financial services industry.

And so the methodology I used was this: after downloading the application to an Android device--I used an Android tablet running Android 7--and using a tool to extract it off of the device, I loaded it onto my local system for what's called static code analysis. There, I used different tools, like Apktool, to unpack the APK file, and then loaded the result into a decompiler to decompile it.
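The staging workflow described here--pull the APK off the device, unpack it, then decompile it--can be sketched as a small script. The `apktool d` and `jadx -d` invocations are the tools' standard command lines, but the helper functions and paths are illustrative assumptions, not part of the original research:

```python
import subprocess
from pathlib import Path

def build_decompile_cmds(apk_path: str, out_dir: str) -> list[list[str]]:
    """Build the commands for a basic static-analysis staging pass:
    unpack resources/smali with apktool, then decompile the DEX
    bytecode to Java with jadx. Assumes both tools are on PATH."""
    out = Path(out_dir)
    return [
        # apktool d <apk> -o <dir>: decode resources and smali
        ["apktool", "d", apk_path, "-o", str(out / "apktool")],
        # jadx -d <dir> <apk>: decompile to readable Java sources
        ["jadx", "-d", str(out / "jadx"), apk_path],
    ]

def stage_apk(apk_path: str, out_dir: str) -> None:
    """Run the staging pipeline; raises CalledProcessError on failure."""
    for cmd in build_decompile_cmds(apk_path, out_dir):
        subprocess.run(cmd, check=True)
```

Once staged this way, the decompiled output is what the rest of the static analysis reads through.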

And these are all the applications that are just right off of the Google Play Store?

Correct. So I didn't go to any third-party app stores. Everything was the official app distribution from the financial institution--or the FI, I'll call them--within the Google Play Store.

So these are the things that we've been reading about that are supposed to be secure and scanned and protected.

Right. You would think especially within the financial services industry that these would be highly protected, that you wouldn't even be able to decompile it, that it would just be garbage. But it was shocking.

The results that I found when performing this analysis included private keys being stored inside the app, some of them sitting in subdirectories with no encryption and no passwords. I could just load them into Keychain on the Mac without being prompted for a password. And we're not talking only about car insurance apps; some of them were wealth management companies, where security should be a top priority.
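A finding like hardcoded private keys can be surfaced by scanning the decompiled output for PEM markers. This scanner is a minimal sketch of that kind of check, not a tool used in the research:

```python
from pathlib import Path

# PEM header lines that indicate an embedded key
PEM_MARKERS = (
    "-----BEGIN RSA PRIVATE KEY-----",
    "-----BEGIN EC PRIVATE KEY-----",
    "-----BEGIN PRIVATE KEY-----",
)

def find_embedded_keys(root: str) -> list[str]:
    """Return paths (relative to root) of files containing PEM key headers."""
    base = Path(root)
    hits = []
    for path in base.rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip it
        if any(marker in text for marker in PEM_MARKERS):
            hits.append(str(path.relative_to(base)))
    return sorted(hits)
```

Pointed at a decompiled APK directory, this flags exactly the kind of unprotected key material described above.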

Some of the other findings included what looked to be debug logging: they were logging absolutely everything, including user input, to the app's log files. Like I said, the only way I could describe it was that it looked like debug output, where absolutely everything was going to log files.
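The pattern described here--raw user input flowing straight into the app's logs--and the obvious fix can be sketched in a few lines. The function names and redaction policy are illustrative assumptions:

```python
import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("payments")

def redact(value: str, keep: int = 2) -> str:
    """Mask all but the first `keep` characters of a sensitive value."""
    if len(value) <= keep:
        return "*" * len(value)
    return value[:keep] + "*" * (len(value) - keep)

# Risky: the kind of debug logging found in the apps -- secrets and
# user input end up verbatim in the log files.
def login_verbose(username: str, password: str) -> None:
    log.debug("login attempt user=%s password=%s", username, password)

# Safer: log the event, never the secret; redact identifiers if needed.
def login_safe(username: str, password: str) -> None:
    log.debug("login attempt user=%s", redact(username))
```

The safer variant keeps the log useful for debugging while ensuring nothing recoverable about the credential reaches the log files.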

If you took your skills, applied them to this research, and used some common tools that were available, were there any really acute issues that you found--issues that someone without your level of skill would be able to discover, and that would put these businesses at risk?

I would say that the most critical findings were in the P2P payment money transfer apps. Each app was categorized, and I tried to anonymize the data as much as possible, especially in the report that will be published, by using percentages. But I can tell you that the highest number of critical findings--where you could potentially take over an account, steal the user's account credentials, or even intercept and replay the traffic--were in the P2P money transfer apps.

The second worst category of findings were definitely in the retail banking apps.

And then everything else followed suit from there. We did include healthcare providers--three major health insurance companies were selected--and those findings were just about as devastating as the money transfer apps.

With the health care apps, the findings were different, but it's almost as if the developers who wrote the code didn't realize that you could actually access the Android file system to see what they were storing in clear text files on the device. It's almost a security-through-obscurity approach. And this is health care information--PHI that, in any other circumstance, on a server or a file server or a shared folder, would have been encrypted at rest. Instead, it was just being written to log files or text files on the Android OS.

So I averaged roughly 8 and 1/2 minutes per app to get to what I would call a point of staging, where the apps were staged and I was ready to actually analyze the code. And then I spent the rest of the week, a period of five days, just looking at source code and log files, plus what's called dynamic code analysis. Beyond the static analysis, I set up what's called Burp Suite to intercept the traffic going out, and I was just as alarmed by the findings on the network side.

But aren't all the apps using HTTPS or SSL to connect?

You would think so, especially for the financial services apps. But in my findings, there were quite a few URLs and quite a bit of traffic being passed over plain HTTP. Some of the other findings included QA and dev URLs in the code; I actually connected to those APIs, and they were live and would respond to TCP requests. A simple flip of a switch in the code, or what have you, would basically redirect that traffic to a QA or a dev server.
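Findings like cleartext HTTP endpoints and leftover QA/dev URLs can be pulled out of decompiled sources with a simple pattern scan. A minimal sketch--the patterns and file layout are assumptions, not the ones used in the research:

```python
import re
from pathlib import Path

HTTP_URL = re.compile(r"http://[^\s\"'<>]+")  # cleartext endpoints
DEV_HOST = re.compile(
    r"https?://[^\s\"'<>]*(?:qa|dev|staging)[^\s\"'<>]*",
    re.IGNORECASE,
)  # leftover QA/dev/staging hosts

def scan_urls(root: str) -> dict[str, list[str]]:
    """Scan decompiled .java sources for risky URLs."""
    findings: dict[str, list[str]] = {"cleartext": [], "dev": []}
    for path in Path(root).rglob("*.java"):
        text = path.read_text(errors="ignore")
        findings["cleartext"] += HTTP_URL.findall(text)
        findings["dev"] += DEV_HOST.findall(text)
    return findings
```

Run against a jadx output directory, anything in the `cleartext` bucket is traffic an on-path attacker can read, and the `dev` bucket is the kind of live QA/dev endpoint described above.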

Did you see similar findings with certificate pinning?

Yes. The PKI side of the findings was very surprising as well. Like I said, something as simple as failing to protect the private keys on the file system, or hard-coding them in the app, was surprising across multiple sectors. It wasn't just one sector--the issues seemed to be systemic across all of the individual verticals.
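Certificate pinning--checking the server's certificate against a value the app ships with--can be sketched in a few lines. For simplicity this pins the SHA-256 fingerprint of the whole DER-encoded certificate; real deployments usually pin the SPKI instead, and the byte strings here are placeholders, not real certificates:

```python
import base64
import hashlib

def cert_pin(der_cert: bytes) -> str:
    """SHA-256 fingerprint of the DER-encoded certificate, base64-encoded."""
    return base64.b64encode(hashlib.sha256(der_cert).digest()).decode()

def is_pinned(der_cert: bytes, pinned: set[str]) -> bool:
    """Accept the connection only if the presented cert matches a known pin."""
    return cert_pin(der_cert) in pinned
```

The app ships with the expected pin set baked in; any certificate that hashes to a different value--even one signed by a trusted CA--is rejected, which is what blocks the interception-and-replay scenario described earlier.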

 
