Finding and fixing software vulnerabilities has become a major struggle for most software development companies. While there is generally no alternative to fixing vulnerabilities, such efforts are a major cost factor, which is why companies have a vital interest in focusing their secure software development activities so that they obtain an optimal return on this investment.
Thus, investigating which factors have the largest impact on the actual fix time is an important research area. To shed some light on this area, we analyzed the times for fixing security vulnerabilities at SAP. The results of our study have been published in the Journal on Data Science and Engineering (DSEJ) [1].
Apache Cordova is a widely used framework for writing mobile apps that follows the “hybrid app” paradigm. A hybrid app is a mobile app that is partly implemented in platform-neutral HTML5/JavaScript and partly in platform-specific languages (e.g., Java or Objective-C).
As part of our work on developing static analysis techniques for Cordova apps [1], we analyzed Cordova apps from Google Play: we took the Top 1000 apps (as ranked by Google in spring 2015) and checked whether they contain a config.xml file that belongs to the Cordova framework. Using this criterion, we identified 50 Cordova apps. Thus, according to our analysis, only 5% of the Top 1000 apps use Cordova.
How these apps use Cordova, however, differs significantly. Many apps do, in fact, use Cordova as intended: the app is written in JavaScript, and the Java part is unmodified and simply loads the entry-point HTML file set in the Cordova configuration file. Some apps, however, significantly change the Java part. The most extreme ones do not ship any HTML or JavaScript code in the APK at all and simply specify one hard-coded URL in Java to be loaded, which is often just the mobile version of the vendor's website, hosted remotely.
Some apps choose a middle ground: they first load Activities like regular Android apps and embed HTML and JavaScript code only into some parts of the app, using Cordova plugins to communicate back and forth. Such irregular Cordova apps are the exception, and they are significantly harder to analyze statically, as they change the way Cordova is integrated into the app.
Many plugins take callback functions and pass them through to their exec call. Plugins that do not simply yield a result to be passed to the success callback (e.g., plugins that are just supposed to execute a command) often do not receive a fail callback either. Some of these actions could indeed fail, but such a failure would not be propagated to the app code, because no fail callback has been passed.
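The pattern can be sketched with a hypothetical plugin wrapper. All names here are illustrative, and the native bridge is stubbed so the sketch is self-contained; in a real plugin, exec would be obtained via require('cordova/exec').

```javascript
// Stub simulating the Cordova native bridge for this sketch only.
// We pretend the native side rejects the 'clear' action.
function exec(success, fail, service, action, args) {
  if (action === 'clear') {
    fail('native error: clear not supported');
  } else {
    success(args);
  }
}

// Typical plugin wrapper: forward BOTH callbacks, so native failures
// propagate back to the app instead of being silently dropped.
function clearCache(success, fail) {
  exec(success, fail || function (err) {
    // Fallback if the app passed no fail callback at all.
    console.error('unhandled plugin error:', err);
  }, 'CachePlugin', 'clear', []);
}

let result = null;
clearCache(
  function () { result = 'ok'; },
  function (err) { result = 'failed: ' + err; }
);
console.log(result); // the simulated native side rejects 'clear'
```

Without the fail callback (or the fallback shown above), the simulated native error would vanish silently, which is exactly the problem observed in the analyzed plugins.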
Plugins generally have the character of libraries, where the JavaScript part rarely does more than encapsulate the exec calls; no other mechanisms are used to conduct cross-language calls. The official Cordova plugins adhere to these guidelines. Our work is intended for this kind of plugin.
Anyone can write Cordova plugins, however, and not all developers adhere to these guidelines. One plugin we found, apparently written just for the specific app it ships with, does not contain any JavaScript code at all; instead, the exec calls are made directly in the app code itself. Other plugins represent the other extreme and implement quite a bit of the plugin logic on the JavaScript side, even logic that could just as well have been written in Java. Yet other plugins do not even use exec to communicate with their Java side, but use methods that are otherwise used internally by the Cordova framework. The reason for these unnecessary workarounds remains unclear.
One plugin found in these Cordova apps is special in a different way: combining Java and JavaScript was apparently not enough, as the APK contained native libraries accessed via JNI to perform some basic arithmetic calculations. Since JSON strings are passed from the JavaScript part via Java to the C part, the attack surface becomes even larger.
I am looking forward to my first OWASP meeting in Sheffield (it is actually the second meeting of the Sheffield OWASP Chapter). I will give a talk on my experiences in introducing and implementing a security testing strategy within a large (more than 25,000 developers) and international software development team. There will be more interesting talks as well (and free beer and pizza).
Looking forward to a great OWASP meeting in Sheffield (and I am sure it will not be the last one)!
Looking forward to a great week in Trento attending the SECENTIS PhD Winter School on Security and Trust of Next Generation Enterprise Information Systems. It will be a week full of interesting lectures on building security and privacy-aware enterprise systems.
I will give a talk entitled Static Analysis - The Workhorse of an End-to-End Security Testing Strategy, in which I give a broad overview of static program analysis and report on our experiences in using static analysis at SAP.
Everybody developing software should, in fact, accept the challenge to develop secure software. This is not an easy challenge: it requires an end-to-end security development life-cycle (SDLC) that nicely integrates with your software development processes.
Security testing is an important part of any security development life-cycle (SDLC) and, thus, should be part of any software development life-cycle. Still, security testing is often understood as an activity performed by security testers in the time between “end of development” and “offering the product to customers”. Fixing bugs that late in the development process is not only expensive; it also conflicts with agile development in general and the DevOps model in particular.
SAP’s Security Testing Strategy enables developers to find security vulnerabilities early by applying a variety of different security testing methods and tools. When you want to integrate security testing into your (agile) software development, most people emphasize how important a security awareness program for both developers and managers is. While security awareness is important, our experience is that developer awareness is even more important: listen to your developers and help them. Recall that building secure systems is much more difficult than finding a successful attack.
Do not expect your developers to become security experts (or penetration testers) – expect them to become security aware, and help them with developer-friendly tools that spot security vulnerabilities early during development and that are nicely integrated into the tools and workflows the developers already use. And, finally, make the process of fixing issues as easy and painless as possible: the effort of fixing an issue should never be the main reason for not fixing it. If you want to learn more about SAP’s Security Testing Strategy, you can watch my presentation at OWASP AppSec 2014 on YouTube (slides are also available).