Posted by Achim D. Brucker, licensed under CC BY-ND 4.0.

Formal Firewall Conformance Testing

If you have ever tried to enforce a network policy in a large data center, i.e., needed to configure its various firewalls and routers, you will agree that this is a tedious and error-prone task. This is even more true if you need to maintain and change those policies over a long period of time. Understanding the actual policy enforced in a non-trivial network setup (e.g., a data center with multiple fall-back connections) is harder still.

One way of ensuring that important security (access control) properties of a network hold and are not broken during reconfiguration is testing. We developed a specification-based (model-based) testing approach for network policies that allows us to represent network policies in a high-level language, to optimize these policies, and to generate test cases that can be executed directly in a real-world network.

Our approach supports complex policies that require network address translation (NAT) or port forwarding, as well as stateful and stateless protocols (thus, we capture the concepts of widely used network filters such as iptables). For the following example, let's limit ourselves to simple packet filters and a very simple network policy, illustrated by the following table:

source     destination   protocol   port   action
internet   dmz           udp        25     allow
internet   dmz           tcp        80     allow
dmz        intranet      tcp        25     allow
intranet   dmz           tcp        993    allow
intranet   internet      udp        80     allow
any        any           any        any    deny

In our example, we have three sub-networks (internet, intranet, and dmz), where the intranet (i.e., the internal network) should be protected from the outside world (i.e., the internet). This representation is rather similar to the usual textbook presentation of network policies. In our formalism (or DSL), we can express this policy as follows: \[ \newcommand{\cmd}[1]{\operatorname{\color{blue}{#1}}} \newcommand{\fw}[1]{\operatorname{#1}} \begin{align} \cmd{definition}~\mathrm{TestPolicy} \cmd{where}\\ \mathrm{TestPolicy} &= \fw{allow\_port} \fw{udp} 25 \fw{internet} \fw{dmz} \\ &\oplus \fw{allow\_port} \fw{tcp} 80 \fw{internet} \fw{dmz} \\ &\oplus \fw{allow\_port} \fw{tcp} 25 \fw{dmz} \fw{intranet} \\ &\oplus \fw{allow\_port} \fw{tcp} 993 \fw{intranet} \fw{dmz} \\ &\oplus \fw{allow\_port} \fw{udp} 80 \fw{intranet} \fw{internet} \\ &\oplus D_U \end{align} \] where \(D_U\) is the policy that denies all traffic.
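
To give a feel for the semantics of these combinators, here is a minimal Python sketch (the actual formalization lives in Isabelle/HOL on top of HOL-TestGen, not in Python): policies are modeled as partial functions from packets to decisions, \(\oplus\) as a left-biased override, and the sub-net membership tests use made-up placeholder prefixes. Packets are simplified to a quadruple of source address, destination address, protocol, and destination port.

```python
# A minimal Python sketch of the policy combinators; illustrative only.
# Sub-net prefixes are hypothetical, and packets are simplified to
# (src_ip, dst_ip, protocol, dst_port).
from typing import Callable, Optional, Tuple

IP = Tuple[int, int, int, int]
Packet = Tuple[IP, IP, str, int]            # (src_ip, dst_ip, protocol, dst_port)
Policy = Callable[[Packet], Optional[str]]  # None means "no decision yet"

# Hypothetical sub-net membership tests (prefixes chosen for illustration only).
def dmz(ip: IP) -> bool:      return ip[:2] == (172, 16)
def intranet(ip: IP) -> bool: return ip[:2] == (10, 0)
def internet(ip: IP) -> bool: return not (dmz(ip) or intranet(ip))

def allow_port(proto: str, port: int, src, dst) -> Policy:
    """Allow exactly the matching traffic; leave all other packets undecided."""
    def p(pkt: Packet) -> Optional[str]:
        src_ip, dst_ip, pr, dst_port = pkt
        if src(src_ip) and dst(dst_ip) and pr == proto and dst_port == port:
            return "allow"
        return None
    return p

def deny_all(pkt: Packet) -> Optional[str]:
    """The policy D_U: deny every packet."""
    return "deny"

def override(*policies: Policy) -> Policy:
    """The ⊕ operator: the first policy that makes a decision wins."""
    def p(pkt: Packet) -> Optional[str]:
        for pol in policies:
            decision = pol(pkt)
            if decision is not None:
                return decision
        return None
    return p

TestPolicy = override(
    allow_port("udp", 25, internet, dmz),
    allow_port("tcp", 80, internet, dmz),
    allow_port("tcp", 25, dmz, intranet),
    allow_port("tcp", 993, intranet, dmz),
    allow_port("udp", 80, intranet, internet),
    deny_all,
)
```

Modeling "no decision" as None mirrors the partiality of the allow rules: each rule only decides the packets it matches, and the final \(D_U\) turns the composition into a total policy.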

Using this policy definition, we can state our test specification as follows: \[ \begin{align} \cmd{test\_spec} \fw{test:}~~ & P~x \Longrightarrow FUT~x = \mathrm{TestPolicy}~x \end{align} \] This specifies that, for all packets satisfying the predicate \(P\), the firewall implementation under test (\(FUT\)) should behave as the specification. Thus, \(P\) allows us to express additional test constraints, e.g., not to generate test cases for traffic within a single sub-net (which, most likely, cannot be monitored easily).
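
Operationally, the test specification can be read as a conformance check: for every packet that satisfies \(P\), compare the verdict of the implementation against the specification. The following sketch, continuing the Python model above, illustrates this reading; note that HOL-TestGen derives the test cases symbolically rather than by replaying candidate packets as done here, and the cross-subnet constraint is merely one plausible choice for \(P\).

```python
# Illustrative reading of the test specification: P x ==> FUT x = TestPolicy x.
# Builds on the sketch above (IP, Packet, Policy, dmz, intranet, TestPolicy).

def net_of(ip: IP) -> str:
    """Map an address to its sub-net, using the hypothetical predicates above."""
    if dmz(ip):      return "dmz"
    if intranet(ip): return "intranet"
    return "internet"

def P(pkt: Packet) -> bool:
    """Test constraint: only consider packets that cross a sub-net boundary."""
    src_ip, dst_ip, _, _ = pkt
    return net_of(src_ip) != net_of(dst_ip)

def check_conformance(fut: Policy, packets) -> None:
    """Assert FUT x = TestPolicy x for every packet x with P x."""
    for pkt in packets:
        if P(pkt):
            assert fut(pkt) == TestPolicy(pkt), f"non-conformance on {pkt}"
```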

Using our automated test case generation approach, which is implemented on top of HOL-TestGen, we obtain test cases that check both that packets that should be accepted (i.e., pass through the network) are actually accepted, and that all other packets are denied. A (simplified) test case for a denied packet looks as follows: \[ FUT(1,((8,13,12,10),6,\fw{tcp}),((172,168,2,1),80,\fw{tcp}),data)= \lfloor\fw{deny}()\rfloor \]
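
As a usage example, we can replay this test case against the Python model above (which abstracts from the packet id and the source port of the real test case). Under the hypothetical sub-net prefixes of the sketch, 172.168.2.1 lies outside both the dmz and the intranet prefixes, so no allow rule matches and only the catch-all \(D_U\) applies:

```python
# The denied packet from the generated test case, replayed against the model.
# (The model drops the packet id and source port of the original test case.)
pkt = ((8, 13, 12, 10), (172, 168, 2, 1), "tcp", 80)
assert TestPolicy(pkt) == "deny"
```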

This approach is described in detail in our journal paper [1], and the implementation is available as part of the HOL-TestGen distribution. Our approach inspired colleagues at Microsoft Research to develop a similar approach based on Z3, which is now used for testing the configuration of the Azure data centers.

References

[1]
A. D. Brucker, L. Brügger, and B. Wolff, “Formal firewall conformance testing: An application of test and proof techniques,” Software Testing, Verification & Reliability (STVR), vol. 25, no. 1, pp. 34–71, 2015, doi: 10.1002/stvr.1544.
