<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
    <title>logicalhacking.com: Posts tagged tips&amp;tricks</title>
    <link href="https://logicalhacking.com/blog/tags/tips%26tricks/index.xml" rel="self" />
    <link href="https://logicalhacking.com" />
    <id>https://logicalhacking.com/blog/tags/tips%26tricks/index.xml</id>
    <author>
        <name>Achim D. Brucker</name>
        <email>adbrucker@0x5f.org</email>
    </author>
    <updated>2020-01-03T00:00:00Z</updated>
    <entry>
    <title>Unsanitize Safelinks</title>
    <link href="https://logicalhacking.com//blog/2020/01/03/safelinks/" />
    <id>https://logicalhacking.com//blog/2020/01/03/safelinks/</id>
    <published>2020-01-03T00:00:00Z</published>
    <updated>2020-01-03T00:00:00Z</updated>
    <summary type="html"><![CDATA[<article>
  <header>
    <div class="meta">
      Posted on 
      <time datetime="2020-01-03" pubdate data-updated="true"> 3 January 2020</time>
       by  <a href="https://www.brucker.ch/">Achim D. Brucker</a>, 
      licensed under <a href="https://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND 4.0</a>.
      <div class="tags"><a href="/blog/tags/security/index.html">security</a> | <a href="/blog/tags/safelinks/index.html">safelinks</a> | <a href="/blog/tags/phishing/index.html">phishing</a> | <a href="/blog/tags/tips%26tricks/index.html">tips&amp;tricks</a></div>
      <meta name="fediverse:creator" content="@adbrucker@fediscience.org">
    </div>
    <h1 class="entry-title">
      <a href="/blog/2020/01/03/safelinks/">Unsanitize Safelinks</a>
    </h1>
  </header>
  <p>Both the home/personal online offerings of Microsoft Outlook (e.g., Outlook.com,
Office 365 Home, or Office 365 Personal) and the professional Office 365
offerings (e.g., as part of Office 365 Advanced Threat Protection) might rewrite
links in received emails with the goal of protecting users against certain
threats (e.g., phishing).</p>
<!-- MORE -->
<p>For various reasons, one might want to rewrite these “safelinks” back into their
original form.</p>
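<p>The rewriting itself is conceptually simple: a safelink wraps the original
target, percent-encoded, in the <code>url</code> query parameter of a
<code>*.safelinks.protection.outlook.com</code> link. A minimal Python sketch of
this idea (the function name is illustrative; the actual unsanitize-safelinks
script may behave differently):</p>

```python
# Minimal sketch: restore the original targets of Outlook safelinks.
# A safelink carries the original URL percent-encoded in its "url"
# query parameter; this is NOT the actual unsanitize-safelinks code.
import re
from urllib.parse import urlparse, parse_qs

SAFELINK = re.compile(r'https://[a-z0-9.-]+\.safelinks\.protection\.outlook\.com/\S+')

def unsanitize(text):
    """Replace every safelink in `text` with its original URL."""
    def restore(match):
        query = parse_qs(urlparse(match.group(0)).query)
        # Keep the safelink unchanged if it has no "url" parameter.
        return query.get('url', [match.group(0)])[0]
    return SAFELINK.sub(restore, text)
```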
<p>The script
<a href="https://git.logicalhacking.com/adbrucker/unsanitize-safelinks">unsanitize-safelinks</a>
does exactly this. It can, for example, be used for displaying mails nicely in
<a href="https://www.mutt.org">mutt</a> or other text-based mail programs. In your
“.muttrc” you need to add/edit the following configuration:</p>
<pre class="muttrc"><code>set display_filter=&quot;unsanitize-safelinks&quot;</code></pre>
<p>If you want to also rewrite the links when using tools such as urlscan, use:</p>
<pre class="muttrc"><code>macro index,pager \cb &quot;&lt;pipe-message&gt; unsanitize-safelinks| urlscan&lt;Enter&gt;&quot;</code></pre>
<p>And the following trick rewrites the links prior to editing a message (e.g., when replying):</p>
<pre class="muttrc"><code>set editor =&quot;unsanitize-safelinks -i %s &amp;&amp; $EDITOR %s&quot;</code></pre>
<p>Finally, if links should also be rewritten when viewing the HTML part, you need to
edit your <code>.mailcap</code> entry for the type <code>text/html</code>:</p>
<pre class="mailcap"><code>text/html; unsanitize-safelinks -i --html %s &amp;&amp; /usr/bin/sensible-browser %s; description=HTML Text; nametemplate=%s.html</code></pre>
<h2 class="sectionAnchor" id="availability">Availability <a href="#availability">§</a></h2>
<p>The project is licensed under a 2-clause BSD license and available at:
<a href="https://git.logicalhacking.com/adbrucker/unsanitize-safelinks" class="uri">https://git.logicalhacking.com/adbrucker/unsanitize-safelinks</a>.</p>
</article>
]]></summary>
</entry>
<entry>
    <title>A LaTeX Style For Self-Archiving Copies of Papers</title>
    <link href="https://logicalhacking.com//blog/2018/02/21/self-archiving-papers-with-latex/" />
    <id>https://logicalhacking.com//blog/2018/02/21/self-archiving-papers-with-latex/</id>
    <published>2018-02-21T00:00:00Z</published>
    <updated>2018-02-21T00:00:00Z</updated>
    <summary type="html"><![CDATA[<article>
  <header>
    <div class="meta">
      Posted on 
      <time datetime="2018-02-21" pubdate data-updated="true">21 February 2018</time>
       by  <a href="https://www.brucker.ch/">Achim D. Brucker</a>, 
      licensed under <a href="https://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND 4.0</a>.
      <div class="tags"><a href="/blog/tags/tips%26tricks/index.html">tips&amp;tricks</a> | <a href="/blog/tags/latex/index.html">latex</a> | <a href="/blog/tags/publishing/index.html">publishing</a></div>
      <meta name="fediverse:creator" content="@adbrucker@fediscience.org">
    </div>
    <h1 class="entry-title">
      <a href="/blog/2018/02/21/self-archiving-papers-with-latex/">A LaTeX Style For Self-Archiving Copies of Papers</a>
    </h1>
  </header>
<p>Luckily, an increasing number of publishers allow authors of (academic) papers
to publish a pre-print of their accepted papers on their personal website or
their institutional website. This eases access to those papers significantly, as
the “official” version on the publishers’ website is often behind a paywall.
Most publishers require that the pre-prints published by the author contain a
statement referring to the official version.</p>
<p>Thus, the only remaining question is: how to produce a pre-print containing this
reference with as little effort as possible. If you are using LaTeX for writing
your papers, the <em>authorarchive</em> package might be the solution.</p>
<!-- MORE -->
<p>Adding the self-archiving note to a paper formatted with Springer’s LNCS style
is as easy as adding</p>
<div class="sourceCode" id="cb1"><pre class="sourceCode tex"><code class="sourceCode latex"><span id="cb1-1"><a href="#cb1-1" aria-hidden="true" tabindex="-1"></a><span class="bu">\usepackage</span>[LNCS,</span>
<span id="cb1-2"><a href="#cb1-2" aria-hidden="true" tabindex="-1"></a>   key=brucker-authorarchive-2016,</span>
<span id="cb1-3"><a href="#cb1-3" aria-hidden="true" tabindex="-1"></a>   year=2016,</span>
<span id="cb1-4"><a href="#cb1-4" aria-hidden="true" tabindex="-1"></a>   publication={Anonymous et al. (eds). Proceedings of the International</span>
<span id="cb1-5"><a href="#cb1-5" aria-hidden="true" tabindex="-1"></a>       Conference on LaTeX-Hacks, LNCS~42. Some Publisher, 2016.},</span>
<span id="cb1-6"><a href="#cb1-6" aria-hidden="true" tabindex="-1"></a>   startpage={42},</span>
<span id="cb1-7"><a href="#cb1-7" aria-hidden="true" tabindex="-1"></a>   doi={00/00_00},</span>
<span id="cb1-8"><a href="#cb1-8" aria-hidden="true" tabindex="-1"></a>   doiText={00/00<span class="fu">\_</span>00},</span>
<span id="cb1-9"><a href="#cb1-9" aria-hidden="true" tabindex="-1"></a>   nocopyright</span>
<span id="cb1-10"><a href="#cb1-10" aria-hidden="true" tabindex="-1"></a> ]{<span class="ex">authorarchive</span>}</span></code></pre></div>
<p>to the preamble of your paper. The package also supports advanced features such
as adding bibliographic entries (e.g., for BibTeX) into the final PDF.</p>
<p>The “authorarchive” package is a LaTeX style for producing author
self-archiving copies of (academic) papers. It is available on
<a href="https://ctan.org/pkg/authorarchive">CTAN</a>, and development versions are
available in the <a href="https://git.logicalhacking.com/adbrucker/authorarchive">authorarchive git
repository</a>. The package
is dual-licensed under a 2-clause BSD-style license and the LPPL version 1 or
any later version.</p>
<h3 class="sectionAnchor" id="links">Links <a href="#links">§</a></h3>
<ul>
<li><a href="https://ctan.org/pkg/authorarchive">CTAN</a> (release versions)</li>
<li><a href="https://git.logicalhacking.com/adbrucker/authorarchive">git repository</a> (development versions)</li>
</ul>
</article>
]]></summary>
</entry>
<entry>
    <title>Static Analysis of Cordova Apps</title>
    <link href="https://logicalhacking.com//blog/2018/01/26/static-analysis-of-cordova-apps/" />
    <id>https://logicalhacking.com//blog/2018/01/26/static-analysis-of-cordova-apps/</id>
    <published>2018-01-26T00:00:00Z</published>
    <updated>2018-01-26T00:00:00Z</updated>
    <summary type="html"><![CDATA[<article>
  <header>
    <div class="meta">
      Posted on 
      <time datetime="2018-01-26" pubdate data-updated="true">26 January 2018</time>
       by  <a href="https://www.brucker.ch/">Achim D. Brucker</a>, 
      licensed under <a href="https://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND 4.0</a>.
      <div class="tags"><a href="/blog/tags/cordova/index.html">cordova</a> | <a href="/blog/tags/android/index.html">android</a> | <a href="/blog/tags/mobile%20apps/index.html">mobile apps</a> | <a href="/blog/tags/sast/index.html">sast</a> | <a href="/blog/tags/security/index.html">security</a> | <a href="/blog/tags/appsec/index.html">appsec</a> | <a href="/blog/tags/tips%26tricks/index.html">tips&amp;tricks</a></div>
      <meta name="fediverse:creator" content="@adbrucker@fediscience.org">
    </div>
    <h1 class="entry-title">
      <a href="/blog/2018/01/26/static-analysis-of-cordova-apps/">Static Analysis of Cordova Apps</a>
    </h1>
  </header>
  <p><a href="https://cordova.apache.org">Apache Cordova</a> is a widely used framework for
writing mobile apps that follows the “hybrid app” paradigm. A hybrid app is a
mobile app that is partly implemented in platform-neutral HTML5/JavaScript and
partly in platform-specific languages (e.g., Java or Objective C).</p>
<p>Static (data flow) analysis of hybrid apps that supports the analysis of both the
platform-independent and the platform-specific parts in a unified way
(e.g., for finding injection attacks) is an unsolved problem.</p>
<!-- MORE -->
<p>The main problem with statically analyzing Cordova apps is that many
vulnerabilities in Cordova applications exploit data flows that cross the
boundary between HTML/JavaScript and native code. Thus, a static analysis tool
should be able to analyze these cross-language data flows.</p>
<p>There are, in principle, three ways of implementing a static analysis
of cross-language data flows in Cordova apps:</p>
<ol type="1">
<li><strong>A (deep) analysis of the Cordova framework:</strong> In this approach, the full Cordova
framework source code, all plugin source code, and the
implemented application are analyzed.
<ul>
<li><em>Advantages:</em>
<ul>
<li>A very precise computation of all data flows is possible.</li>
<li>Only a very limited amount of manual modeling of sinks and sources is required.</li>
</ul></li>
<li><em>Disadvantages:</em>
<ul>
<li>Computationally very expensive. The analysis might take hours
even for very small extensions.</li>
</ul></li>
</ul></li>
<li><strong>Modeling the core API of Cordova:</strong> In this approach, the cross-language
interfaces of the core Cordova framework are modeled, avoiding the need to
analyze the framework statically. Only the application itself and all used
plugins are analyzed.
<ul>
<li><em>Advantages:</em>
<ul>
<li>Allows for analyzing the application in the context of custom or
modified plugins.</li>
<li>Usually very fast (a few minutes, even for complex applications).</li>
</ul></li>
<li><em>Disadvantages:</em>
<ul>
<li>If the framework changes, a specialist needs to update the model.</li>
</ul></li>
</ul></li>
<li><strong>Modeling the Cordova plugins:</strong> In this approach, the Cordova
framework and all plugins are modeled, i.e., their sources and
sinks are configured in the static analysis tool. Only the
application code itself is statically analyzed.
<ul>
<li><em>Advantages:</em>
<ul>
<li>Very fast.</li>
</ul></li>
<li><em>Disadvantages:</em>
<ul>
<li>No detection of vulnerabilities caused by modified or custom
plugins.</li>
</ul></li>
</ul></li>
</ol>
<p>We consider the second approach a good compromise between thoroughly
analyzing all possible cross-language data flows and performance (i.e.,
avoiding repeatedly re-scanning the same framework code). We implemented this
approach in a
<a href="https://git.logicalhacking.com/DASCA/DASCA">prototype</a> and its evaluation shows
that it reliably detects cross-language data flows in Cordova applications. For
more details, have a look at our <a href="https://distrinet.cs.kuleuven.be/events/essos/2016/">ESSoS
2016</a> paper
<span class="citation" data-cites="brucker.ea:cordova-security:2016">[1]</span>.</p>
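<p>To illustrate the second approach: instead of analyzing the framework code,
the analysis only consults a hand-written model of which bridge calls act as
taint sources or sinks. A toy Python sketch (all service and action names are
invented for illustration; this is not the DASCA implementation):</p>

```python
# Toy model-based taint check in the spirit of approach 2: the Cordova
# bridge is not analyzed but modeled as a table of sources and sinks.
# Service/action names below are invented for this sketch.
BRIDGE_MODEL = {
    ("Geolocation", "getLocation"): "source",  # produces sensitive data
    ("InAppBrowser", "open"):       "sink",    # dangerous with tainted input
}

def has_cross_language_flow(trace):
    """Detect a flow from a modeled source to a modeled sink.
    `trace` is a list of (service, action, value) bridge-call records."""
    tainted = set()
    for service, action, value in trace:
        kind = BRIDGE_MODEL.get((service, action))
        if kind == "source":
            tainted.add(value)        # value produced here is sensitive
        elif kind == "sink" and value in tainted:
            return True               # sensitive value reaches a sink
    return False
```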
<h3 class="sectionAnchor unnumbered" id="references">References <a href="#references">§</a></h3>
<div id="refs" class="references csl-bib-body" role="doc-bibliography">
<div id="ref-brucker.ea:cordova-security:2016" class="csl-entry" role="doc-biblioentry">
<div class="csl-left-margin">[1] </div><div class="csl-right-inline">A. D. Brucker and M. Herzberg, <span>“On the static analysis of hybrid mobile apps: A report on the state of Apache Cordova nation,”</span> in <em>International symposium on engineering secure software and systems (ESSoS)</em>, J. Caballero and E. Bodden, Eds. Heidelberg: Springer-Verlag, 2016, pp. 72–88. doi: <a href="https://doi.org/10.1007/978-3-319-30806-7_5">10.1007/978-3-319-30806-7_5</a>. Author copy: <a href="http://logicalhacking.com/publications/brucker.ea-cordova-security-2016/" class="uri">http://logicalhacking.com/publications/brucker.ea-cordova-security-2016/</a></div>
</div>
</div>
</article>
]]></summary>
</entry>
<entry>
    <title>Secure Your Software Supply Chain</title>
    <link href="https://logicalhacking.com//blog/2016/07/20/secure-supply-chain/" />
    <id>https://logicalhacking.com//blog/2016/07/20/secure-supply-chain/</id>
    <published>2016-07-20T00:00:00Z</published>
    <updated>2016-07-20T00:00:00Z</updated>
    <summary type="html"><![CDATA[<article>
  <header>
    <div class="meta">
      Posted on 
      <time datetime="2016-07-20" pubdate data-updated="true">20 July 2016</time>
       by  <a href="https://www.brucker.ch/">Achim D. Brucker</a>, 
      licensed under <a href="https://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND 4.0</a>.
      <div class="tags"><a href="/blog/tags/sdlc/index.html">sdlc</a> | <a href="/blog/tags/security/index.html">security</a> | <a href="/blog/tags/appsec/index.html">appsec</a> | <a href="/blog/tags/securitytesting/index.html">securitytesting</a> | <a href="/blog/tags/tips%26tricks/index.html">tips&amp;tricks</a> | <a href="/blog/tags/floss/index.html">floss</a> | <a href="/blog/tags/opensource/index.html">opensource</a> | <a href="/blog/tags/webinar/index.html">webinar</a></div>
      <meta name="fediverse:creator" content="@adbrucker@fediscience.org">
    </div>
    <h1 class="entry-title">
      <a href="/blog/2016/07/20/secure-supply-chain/">Secure Your Software Supply Chain</a>
    </h1>
  </header>
<p>The question whether FLOSS (Free/Libre and Open-Source Software) is more or less
secure than proprietary software is often not the right question to ask. The
much more important question is: how to integrate FLOSS components securely into
a Secure Software Development Process? Moreover, if you think about it, the
potential challenges in the secure integration of FLOSS components are also
challenges in integrating other types of third-party components. As a software
vendor, you are ultimately responsible for the security of the overall product,
regardless of which technologies and components were used in building it (you can
either read more, or watch the <a href="https://youtu.be/zUDaP0m-gFU">video of our AppSecEU
presentation</a>).</p>
<!-- MORE -->
<p>Ideally, third-party components should, security-wise, be treated as your own
code and, thus, they impact all aspects of the Secure Software Development
Lifecycle.</p>
<figure>
<img src="/blog/images/sdlc-third-party.png" id="id" class="zoom" style="width:100.0%" alt="How Third-Party Components Affect the SDLC" />
<figcaption aria-hidden="true">How Third-Party Components Affect the SDLC</figcaption>
</figure>
<p>Before we continue, let’s quickly review the three most important types of
third-party components:</p>
<ul>
<li><strong>Commercial libraries, outsourcing, and bespoke software:</strong> This category
comprises all third-party components that are developed or purchased/licensed
under a commercial licensing agreement that also contains a
support/maintenance agreement. This software is usually only available to
developers after the procurement department has bought it (thus, it is easy to
track its use within a software company) and any support/maintenance
obligations can (at least in principle) be pushed to the supplier.</li>
<li><strong>Freeware:</strong> This is the <em>gratis</em> software that you usually get as a binary (or
with a license that forbids modification). Good examples of this category are
device drivers that you get as a necessity to operate the hardware that you
actually bought, or any non-FLOSS software that you can download for free. As
this software is gratis, it is usually not covered by a maintenance
contract (no warranty), i.e., you have neither a guarantee of getting fixes nor
of future updates.</li>
<li><strong>Free/Libre Open Source Software:</strong> These are software components that are
subject to an approved <a href="https://opensource.org/licenses">FLOSS license</a>, which
guarantees access to the source code <em>and</em> also the right to modify the
sources and distribute modified versions of them. This allows you to maintain
the components yourself or to pay the vendor or a third-party provider to
maintain a version for you – in case you need a different maintenance
strategy than the one offered by the upstream authors. While this is not a strict
requirement, such components are usually also available gratis, e.g., as a
download from the Internet.</li>
</ul>
<p>Freeware is ubiquitous, i.e., easily available to developers without triggering
formal processes. Thus, it is the most problematic category, as its use is usually
hard to track, and it is usually hard to get fixes or updates in a timely manner
(or any maintenance guarantees). FLOSS is also easily available – but it does
not have the maintenance problem, as you could fix it yourself, and there are
also plenty of companies offering support for FLOSS components. Thus, when
you are tracking the use of FLOSS (as well as the use of proprietary third-party
components) in your organization, proprietary and FLOSS components differ mainly
in one aspect: FLOSS, by definition, provides you with the possibility to fix
issues yourself (or ask an arbitrary third party to do it for you).</p>
<p>Let’s face the truth: any third-party component (like any self-developed code) can
contain vulnerabilities that need to be fixed during the lifecycle of the
consuming applications. Thus, instead of asking which type of component is more
secure (answer: neither, there is bad and good software in both camps), it is
more important to control/plan the risk and effort associated with consuming
third-party components.</p>
<p>Thus, FLOSS just provides you with one additional opportunity: fixing the issues
yourself. Moreover, when doing research in software security, FLOSS has the
additional advantage that data about software versions, vulnerabilities, and
fixes is available that can be used for validating research ideas. For example,
<strong>we are researching methods</strong></p>
<ul>
<li>for assessing whether a specific version of a software component is actually
vulnerable. This helps to avoid unnecessary updates, as public vulnerability
databases such as <a href="https://nvd.nist.gov/">NVD</a> usually over-approximate the
range of vulnerable versions,</li>
<li>for estimating the maintenance effort (and risk) of third-party components that
are consumed by a larger product. The goal is to provide, already in the design
phase of an application, an estimate of how much effort in the long run is caused
by the selected third-party components.</li>
</ul>
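<p>The first research question above can be made concrete with a small sketch:
given an NVD-style “affected up to version X” range, check whether an inventory
entry falls inside it (the version numbers below are invented, and a hit only
means “needs a closer look”, precisely because such ranges over-approximate):</p>

```python
# Sketch: match an inventory entry against an NVD-style vulnerable
# version range. Version numbers here are invented for illustration.
def parse_version(v):
    """Turn a dotted version string like '2.4.10' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def in_reported_range(version, last_affected):
    """True if `version` lies within the reported range [.., last_affected].
    A hit warrants closer analysis; public databases over-approximate."""
    return parse_version(version) <= parse_version(last_affected)
```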
<p>We have already published preliminary results <span class="citation" data-cites="dashevskyi.ea:foss-costs:2016">[1]</span> and we
are expecting much more to come in the (near) future.</p>
<p>Of course, one would also like to precisely predict the risk (or the likelihood
that vulnerabilities are detected in a specific third-party component during the
maintenance period of the consuming applications). Sadly, our research shows
that this is not (easily) possible and, again, the wrong question to ask.</p>
<p>Let’s get back to some pragmatic recommendations if you are using third-party
components in general and FLOSS components in particular as part of your
software development. As we cannot predict future vulnerabilities easily, we
focus on strategies for controlling the risk and effort – which should be,
anyway, the main focus of a good project manager.</p>
<h4 class="sectionAnchor" id="strategies-for-controlling-security-risks">Strategies For Controlling Security Risks <a href="#strategies-for-controlling-security-risks">§</a></h4>
<p>To control (minimize) the risk of third-party components, we recommend
integrating the management of third-party components into your Secure Software
Development Lifecycle right from the start and obtaining them from trustworthy
sources (and, if you are in the lucky situation of being able to select a component
from various candidates providing the necessary functionality, we have some
tips as well):</p>
<ul>
<li><em>Integration in your Secure Software Development Life Cycle:</em>
<ul>
<li><strong>Maintain a detailed software inventory:</strong> To be able to fix vulnerable
third-party components as soon as possible, a complete <em>software
inventory</em> is a must. Recall the morning when
<a href="https://en.wikipedia.org/wiki/Heartbleed">Heartbleed</a> was published –
did you know which of the applications that you offer to customers
used the vulnerable OpenSSL version? If you can determine the affected
applications of your offering, you can immediately focus on fixing them,
with no time wasted in first finding out what to fix. This minimizes your risk
as well as the risk of your customers.</li>
<li><strong>Actively monitor vulnerability databases:</strong> Not all third-party vendors in
general, and FLOSS projects in particular, actively notify their
customers individually about vulnerabilities or fixes. Thus, it is your
obligation to actively monitor the public vulnerability databases daily
for new vulnerabilities. This includes not only general databases such
as <a href="https://nvd.nist.gov/">NVD</a>; you also need to monitor project-specific
vulnerability pages (not all projects issue CVEs!). For your key
third-party components, we also strongly recommend subscribing to their
security mailing lists (or other news channels).</li>
<li><strong>Assess the project-specific risk of third-party components:</strong> The potential
risk of a vulnerability in a third-party component depends on how <em>you</em>
are using/consuming it. Thus, your <em>threat modeling</em> approach needs to
cover the use of third-party components to assess the project-specific
risks and to develop project-specific mitigation strategies.</li>
</ul></li>
<li><em>Obtaining components (or sources):</em>
<ul>
<li><strong>Download from trustworthy sources:</strong> To avoid working with
malicious components you should always ensure that components are
obtained from a trustworthy source (i.e., the original upstream
vendor or a trustworthy and reliable distributor/third-party
provider). While it seems obvious, this boils down to
downloading sources only via http<strong>s</strong> (or other authenticated
channels) and checking the signatures or checksums. Finally,
don’t forget to check the scripts (Maven scripts, installation
scripts, makefiles, etc.) that download
dependencies during build or deployment – a surprisingly large
number of them download dependencies via non-authenticated
channels (nor do they check the validity of the download).</li>
</ul></li>
<li><em>Project Selection:</em>
<ul>
<li><p><strong>Prefer projects with private bug trackers:</strong> Being able to
report security issues to a FLOSS project in private allows you
to discuss potential fixes with the community without putting
your customers, or the other customers of the FLOSS component, at
risk (e.g., by inadvertently publishing a 0-day).</p></li>
<li><p><strong>Prefer projects with a mature (healthy) Secure Development Lifecycle:</strong> As
nobody is immune from security vulnerabilities, it is important to select
projects that take security seriously. A good indicator is the maturity
level of their Secure Software Development Lifecycle, e.g., as assessed by
answering questions such as:</p>
<ul>
<li>does the project document security fixes/patches (reducing the
risk of “secret” security fixes)?</li>
<li>does the project document security guidelines?</li>
<li>does the project use security testing tools?</li>
</ul></li>
</ul>
The <a href="https://www.coreinfrastructure.org/programs">Badge Program</a> of the <a href="https://www.coreinfrastructure.org/programs">Core
Infrastructure Initiative</a> is
currently developing guidelines and a certification program to allow projects
to make users aware that they have a mature Secure Software Development
Lifecycle.</li>
</ul>
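<p>The checksum check recommended above can be sketched as follows (the file path
and expected digest are placeholders; a real check must use the digest published
by the upstream project over an authenticated channel):</p>

```python
# Sketch: verify a downloaded artifact against its published SHA-256
# checksum. Path and expected digest are placeholders.
import hashlib

def sha256_matches(path, expected_hex):
    """Return True if the file at `path` has the expected SHA-256 digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large downloads do not need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex.lower()
```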
<h4 class="sectionAnchor" id="strategies-for-controlling-effort">Strategies For Controlling Effort <a href="#strategies-for-controlling-effort">§</a></h4>
<p>To control (minimize) the effort caused by third-party components, again, the
Secure Software Development Lifecycle is the most important part to
look at – followed by the project selection.</p>
<ul>
<li><em>Secure Software Development Life Cycle</em>
<ul>
<li><strong>Update early and often:</strong> Based on our analysis of API changes and
published CVEs for many Java-based FLOSS projects, we
recommend updating early and often. Overall, this ensures you have the latest
fixes and keeps the upgrade effort low, as only a few APIs (if any)
have changed.</li>
<li><strong>Avoid your own forks (collaborate with the FLOSS community):</strong> Maintaining your own
forks for fixing security issues increases your effort and does not
contribute back to the community. For both you and the community, a
collaboration is the much better model – and as you selected projects
that support a private security bug tracker, you can work with the
community without putting your customers at risk.</li>
</ul></li>
<li><em>Project selection</em>
<ul>
<li><strong>Large user base:</strong> A large user base has at least two effects: first, more
security issues will be detected and reported. While, at first sight,
this seems counter-productive, as more patches will also be released that
you need to integrate, it results in a more secure product in the long
term (and you want to avoid severe undetected vulnerabilities at all
costs). Second, a larger user base results in more people who know the
component and can provide support.</li>
<li><strong>Active development community:</strong> This is an easy one: an active development
community will more likely result in timely and well-documented security
fixes and, moreover, more people who can help you fix issues that you
encounter.</li>
<li><strong>Technologies you are familiar with:</strong> This also seems obvious, but
still, it is a recommendation that is often ignored. By choosing
technologies you are familiar with, the effort of taking over maintenance
yourself – if necessary – is much lower. More importantly, familiarity
with the component and its infrastructure also allows you to assess the
severity of vulnerabilities in your actual usage scenarios. Lastly, if you do
choose components using technologies you are not familiar with, build up
the necessary knowledge (give your developers time and resources to get
familiar with the new technology).</li>
<li><strong>Compatible maintenance strategy/lifecycle:</strong> Your software has a certain
maintenance lifecycle, and so do the third-party components. If you are
building software for larger enterprises, it is not unlikely that you are
providing support for ten years or more – if your third-party vendors
only provide support for one year, you are in an unlucky situation.</li>
<li><strong>Smaller (in terms of code size) and less complex might be better:</strong> Less
code often means fewer vulnerabilities, and less complex implementations
are easier to maintain. Thus, if you only need a UUID generator that can
be implemented in 20 lines of code, you might prefer to implement it
yourself or choose a dedicated project providing this functionality
instead of using a complex and generic “utilities” framework that has
several hundred thousand lines of code.</li>
</ul></li>
</ul>
<h3 class="sectionAnchor" id="supplementary-material">Supplementary Material <a href="#supplementary-material">§</a></h3>
<ul>
<li><a href="https://youtu.be/zUDaP0m-gFU">Recording of our talk at the OWASP AppSecEU 2016</a></li>
<li><a href="https://www.brucker.ch/bibliography/abstract/talk-brucker.ea-owasp-third-party-security-2016">Slides of our OWASP AppSecEU talk</a></li>
</ul>
<h3 class="sectionAnchor unnumbered" id="references">References <a href="#references">§</a></h3>
<div id="refs" class="references csl-bib-body" role="doc-bibliography">
<div id="ref-dashevskyi.ea:foss-costs:2016" class="csl-entry" role="doc-biblioentry">
<div class="csl-left-margin">[1] </div><div class="csl-right-inline">S. Dashevskyi, A. D. Brucker, and F. Massacci, <span>“On the security cost of using a free and open source component in a proprietary product,”</span> in <em>International symposium on engineering secure software and systems (ESSoS)</em>, J. Caballero and E. Bodden, Eds. Heidelberg: Springer-Verlag, 2016, pp. 190–206. doi: <a href="https://doi.org/10.1007/978-3-319-30806-7_12">10.1007/978-3-319-30806-7_12</a>. Author copy: <a href="http://logicalhacking.com/publications/dashevskyi.ea-foss-costs-2016/" class="uri">http://logicalhacking.com/publications/dashevskyi.ea-foss-costs-2016/</a></div>
</div>
</div>
</article>
]]></summary>
</entry>
<entry>
    <title>Cordova Security Considerations</title>
    <link href="https://logicalhacking.com//blog/2016/05/12/cordova-security/" />
    <id>https://logicalhacking.com//blog/2016/05/12/cordova-security/</id>
    <published>2016-05-12T00:00:00Z</published>
    <updated>2016-05-12T00:00:00Z</updated>
    <summary type="html"><![CDATA[<article>
  <header>
    <div class="meta">
      Posted on 
      <time datetime="2016-05-12" pubdate data-updated="true">12 May 2016</time>
       by  <a href="https://www.brucker.ch/">Achim D. Brucker</a>, 
      licensed under <a href="https://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND 4.0</a>.
      <div class="tags"><a href="/blog/tags/mobile/index.html">mobile</a> | <a href="/blog/tags/appsec/index.html">appsec</a> | <a href="/blog/tags/cordova/index.html">cordova</a> | <a href="/blog/tags/security/index.html">security</a> | <a href="/blog/tags/tips%26tricks/index.html">tips&amp;tricks</a></div>
      <meta name="fediverse:creator" content="@adbrucker@fediscience.org">
    </div>
    <h1 class="entry-title">
      <a href="/blog/2016/05/12/cordova-security/">Cordova Security Considerations</a>
    </h1>
  </header>
<p>More and more (mobile) apps are written in Apache Cordova (or its
proprietary variants such as PhoneGap or SAP Kapsel). Apache Cordova
is a framework that allows developers to easily write (mobile) apps for many
different platforms using a hybrid development approach, i.e.,
combining web development technologies (HTML5 and JavaScript) with
native development languages such as Java or Objective C.</p>
<p>Combining web and native technologies creates new security challenges
as, e.g., an XSS attacker becomes more powerful. For example, an XSS
vulnerability might allow an attacker to access the calendar of a
device or delete the address book.</p>
<!-- MORE -->
<h3 class="sectionAnchor" id="overview">Overview <a href="#overview">§</a></h3>
<p>On the one hand, Cordova apps are HTML5 applications, i.e., they share
all typical features (e.g., JavaScript code that is downloaded at
runtime) and security risks (e.g., XSS) of web applications. On the
other hand, Cordova apps share the features (e.g., full device access)
and security risks (e.g., SQL injections, privacy leaks) of native apps.</p>
<figure>
<img src="/blog/images/android-cordova-architecture.png" id="id" class="zoom" style="width:100.0%" alt="The Cordova Architecture" />
<figcaption aria-hidden="true">The Cordova Architecture</figcaption>
</figure>
<p>To limit the typical web application threats, WebViews (which execute
the HTML5/JavaScript part of a Cordova app) re-use the
well-known security mechanisms from web browsers, such as the
same-origin policy. Moreover, WebViews are separated from the regular
web browsers on Android, e.g., WebViews have their own cache and
cookie store. Still, there are subtle differences that make
implementing secure Cordova apps a challenge even for experienced
web application developers.</p>
<p>A plugin is a mechanism for drilling holes into the sandbox
of a WebView, making the traditional web attacker much more powerful
as, e.g., an XSS attack might grant access to arbitrary device
features. The root cause for such vulnerabilities can be located in
Cordova itself (e.g.,
<a href="https://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-4710">CVE-2013-4710</a>
or
<a href="https://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2014-1882">CVE-2014-1882</a>)
or in programming and configuration mistakes by the app developer.</p>
<h3 class="sectionAnchor" id="recommendations">Recommendations <a href="#recommendations">§</a></h3>
<p>Do not forget that Cordova apps are <em>web applications</em>, thus, you need
to</p>
<ul>
<li>do secure JavaScript programming</li>
<li>use the Content Security Policy and the same-origin policy</li>
<li>always use http<strong>s</strong> and validate SSL certificates; dynamically
loaded code from third parties can be dangerous
(even if “iframed”)</li>
<li>…</li>
</ul>
<p>And keep in mind that the WebView sandbox is not as protective as the one in
modern desktop browsers.</p>
<p>Cordova apps are <em>native</em> (Java, Objective C, Swift, .NET, …) apps and, thus,
you need to apply the best practices of native development, such as:</p>
<ul>
<li>do secure Java/Objective-C/… programming</li>
<li>do not trust validations done in the JavaScript part of the
plugin (do input validation in the native part)</li>
<li>…</li>
</ul>
<p>Cordova apps are <em>mobile</em> apps, and you need to use the security features of the
mobile platform correctly, e.g.,</p>
<ul>
<li>do use as few permissions as possible</li>
<li>…</li>
</ul>
<p>Finally, Cordova apps are <em>Cordova apps</em>:</p>
<ul>
<li>use plugin white-listing (and only ship plugins that you really need)</li>
<li>if you need only very limited features of a plugin (e.g., read-only
access to the calendar), it is more secure to remove the unnecessary
features in the native part</li>
<li>read the <a href="https://cordova.apache.org/docs/en/5.4.0/guide/appdev/security/index.html">Cordova security guide</a></li>
<li>use the latest version of Cordova</li>
<li><em>regularly</em> monitor the
<a href="https://web.nvd.nist.gov/view/vuln/search-results?query=Cordova&amp;search_type=all&amp;cves=on">NVD</a>
for new vulnerabilities</li>
<li>due to
<a href="https://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-4710">CVE-2013-4710</a>,
do not use Cordova on Android versions below 4.1</li>
</ul>
<p>And did you know that</p>
<div class="sourceCode" id="cb1"><pre class="sourceCode xml"><code class="sourceCode xml"><span id="cb1-1"><a href="#cb1-1" aria-hidden="true" tabindex="-1"></a>&lt;<span class="kw">application</span><span class="ot"> android:debuggable=</span><span class="st">&quot;true&quot;</span> /&gt;</span></code></pre></div>
<p>on Android <em>disables</em> the certificate checks in WebViews.</p>
</article>
]]></summary>
</entry>

</feed>
