*This blog entry was originally published on May 24, 2015 on the original Polito Blog by Ian Duffy. It was re-posted on October 3, 2017 due to migrating to a new blog platform.*
Recently we were contacted by a client who was developing an internal Android application for their customers to use. The application was provided to us by the client as a .apk file with no source code or documentation, and we were asked to test it out and determine if there were any security vulnerabilities present.
Our typical workflow involves loading the app onto a device or into an emulator and exercising its functionality to see what it does. We also monitor network communications to determine which server(s) the application talks to and what protocols it uses to do so. In this particular case, the application used SSL/TLS to talk to the server.
We used Burpsuite to attempt to MITM the web traffic between the app and the server, but the app threw an SSL certificate error, indicating it was not happy with Burpsuite's SSL certificate. The app refused to execute, which somewhat complicated our analysis.
Thinking we might trick Android into trusting the certificate (which might in turn prompt the app to trust it), we tried installing the Burpsuite CA certificate into the Android device's trusted certificate store. Normally, an application checks the device's trusted CAs, finds the Burpsuite certificate, and proceeds with the SSL connection. In this case, the trick was not effective - it appeared that the application was using certificate pinning, so that only the server's specific SSL certificate would be accepted. Any other certificate would result in the app refusing to launch. So now what?
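For reference, Android stores trusted CAs as PEM files named by the certificate's subject hash. The sketch below shows how such a file is prepared; we generate a throwaway CA purely so the example is self-contained (in practice you would export Burpsuite's CA certificate instead, and the file/CN names here are our own):

```shell
# Generate a throwaway CA so the example stands alone; in practice,
# export Burpsuite's CA certificate instead.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=Demo CA" \
    -keyout demo-ca.key -out demo-ca.pem -days 1 2>/dev/null

# Android names trusted CA files <subject_hash_old>.0 under
# /system/etc/security/cacerts/ (writable on a rooted device).
HASH=$(openssl x509 -subject_hash_old -noout -in demo-ca.pem)
cp demo-ca.pem "$HASH.0"
echo "push to device as: /system/etc/security/cacerts/$HASH.0"
```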
We installed the Android SSL TrustKiller app from iSEC Partners onto our rooted device to see whether it would bypass the SSL certificate pinning code within the application. This app hooks various SSL methods traditionally used by applications to establish SSL/TLS connections between client and server. For rudimentary Android SSL functionality, it has proven effective at bypassing certificate pinning and allowing the SSL connection to be subverted. In this case, it was not effective - the SSL error persisted and the app refused to launch. In addition, we suspected that the app had checks to detect whether the device was rooted and would refuse to launch if it detected a rooted device. It was time to try a different approach.
I extracted the contents of the .apk file and used dex2jar to convert the classes.dex file into a .jar file. I then decompiled the resulting .jar file with JD-GUI so that I could see how the developers had implemented their SSL certificate pinning within the application.
What became apparent was that the developers of this application had implemented some very specific checks in addition to the certificate pinning. They had implemented a custom TrustManager that checked specific attributes of the SSL certificate presented by the server against the pinned certificate, including verification of the hostname in the SSL certificate. Because of these custom checks, iSEC Partners' Android SSL TrustKiller would not be effective.
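To make the comparison concrete: pinning usually boils down to hard-coding a fingerprint of the server's certificate (or of its public key) and comparing it during the handshake. The openssl commands below, run against a throwaway certificate generated for the example (the CN and file names are ours), show the two fingerprints an app might pin:

```shell
# Throwaway server certificate, purely so the demo is self-contained.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=example.internal" \
    -keyout server.key -out server.pem -days 1 2>/dev/null

# Whole-certificate fingerprint (matches only this exact certificate):
openssl x509 -in server.pem -noout -fingerprint -sha256

# Public-key (SPKI) hash, the form used by HPKP-style public-key pinning:
openssl x509 -in server.pem -noout -pubkey \
  | openssl pkey -pubin -outform DER \
  | openssl dgst -sha256 -binary | openssl enc -base64
```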
Okay, so we've identified the code that does the checking -- now what?
Using apktool, we can decode the .dex (Dalvik executable) file into smali, a sort of assembly-language representation of Dalvik bytecode. The nice thing about smali files is that they can be edited and recompiled into an Android application (.apk) using apktool.
The first step is to disassemble the application:
```
apktool d <apk file name>
```
Once this has finished, apktool will have created a folder with the same name as the original apk file. Inside this folder you will typically find the decoded AndroidManifest.xml, an apktool.yml metadata file, and subdirectories such as res/ (resources), original/, and smali/ (the disassembled code).
Underneath the smali folder, you will find a .smali file for each of the classes within the original .apk file. From our analysis in JD-GUI, we knew the classpaths for the classes that contained the SSL checking. We then edited the corresponding smali files and removed the SSL certificate checking methods. In this case, the classes were throwing SSLExceptions if the certificates did not match, so we commented out the code that threw the exception and let the methods return normally (without having thrown an exception).
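As an illustration of the edit pattern (the method name, string, and register numbers here are invented; the real decompiled output will differ): find the code path that throws on a pin mismatch, comment out the throw, and fall through to a normal return:

```smali
# Before: the custom check throws when the pinned certificate mismatches.
.method private verifyPinnedCert()V
    .locals 2
    const-string v1, "certificate pin mismatch"
    new-instance v0, Ljavax/net/ssl/SSLException;
    invoke-direct {v0, v1}, Ljavax/net/ssl/SSLException;-><init>(Ljava/lang/String;)V
    throw v0
.end method

# After: the throw is commented out and the method returns normally.
.method private verifyPinnedCert()V
    .locals 2
    # const-string v1, "certificate pin mismatch"
    # new-instance v0, Ljavax/net/ssl/SSLException;
    # invoke-direct {v0, v1}, Ljavax/net/ssl/SSLException;-><init>(Ljava/lang/String;)V
    # throw v0
    return-void
.end method
```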
Once the smali code has been edited, you can recompile it back into an .apk file:
```
apktool b <apk file name folder>
```
Apktool will scan the smali code and rebuild it into a full-fledged .apk file that you can run on your device.
As a side note, I found throughout this engagement that smali is a little strange to look at, but once you stare at it for a while you can figure out what it's doing, and the longer you stare at it the more sense it makes. For those who don't like staring at code, there is a good reference guide here. Also, the decompile-edit-recompile process can be time consuming, since any little error will cause the compilation to fail, the app to crash at runtime, or mysterious runtime anomalies, which suck! Either way, I'd recommend scripting/automating this entire process for future engagements to minimize the time required.
One other step is required before you can run the .apk on your device: digitally signing the .apk file. To do so, follow the instructions here to generate a signing certificate, and use jarsigner (included with the Java JDK) to sign the application. Once that is done, install the application and test it.
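Putting it together, the decompile / edit / rebuild / sign / install cycle is easy to script. The sketch below uses standard tool invocations, but the script name, keystore, and key alias are our own invention; it is written to a file and syntax-checked here so the example stands alone, and you would run it on a machine with apktool, the JDK, and adb installed:

```shell
cat > repatch.sh <<'EOF'
#!/bin/sh
set -e
APK="$1"
DIR="${APK%.apk}"

apktool d -f "$APK" -o "$DIR"            # decompile to smali
# ... edit the .smali files (or apply a saved patch) here ...
apktool b "$DIR" -o "$DIR-patched.apk"   # rebuild

# One-time: generate a self-signed signing key.
[ -f test.keystore ] || keytool -genkeypair -keystore test.keystore \
    -alias testkey -storepass password -keypass password \
    -keyalg RSA -validity 365 -dname "CN=Test"

jarsigner -keystore test.keystore -storepass password \
    "$DIR-patched.apk" testkey

adb install -r "$DIR-patched.apk"
EOF
sh -n repatch.sh && echo "repatch.sh: syntax OK"
```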
After a few iterations of fixing and tweaking things within the .smali files and recompiling, we were able to get the application to talk to the server via Burpsuite, and could proceed with our penetration test.
The lesson here for mobile application developers is that anyone can modify your code and make it do bad things. Developers should implement integrity checks on both the application and server side to verify the authenticity of their application before allowing it to execute trusted functions. Web applications, especially those that interact with mobile applications, should flag any session in which anomalous data is received and terminate it immediately. Developers often assume that because they've implemented security checks on the client side (certificate pinning, rooted-device checking, etc.), they can trust any traffic that comes in from the device. This has proven time and again to be a bad idea, and my experience here only reinforces that user data from mobile applications cannot be trusted.
Polito, Inc. offers a wide range of security consulting services including penetration testing, vulnerability assessments, incident response, digital forensics, and more. If your business or your clients have any cyber security needs, contact our experts and experience what Masterful Cyber Security is all about.