Posted on by Achim D. Brucker, licensed under CC BY-ND 4.0.

Formal Firewall Conformance Testing

If you have ever tried to enforce a network policy in a large data centre, i.e., needed to configure the various firewalls and routers, you will agree that this is a tedious and error-prone task. This is even more true if you need to maintain and change those policies over a long period of time. Understanding the policy that is actually enforced in a non-trivial network setup (e.g., a data centre with multiple fall-back connections) is harder still.

One way of ensuring that important security (access control) properties of a network hold and are not broken during reconfiguration is testing. We developed a specification-based (model-based) testing approach for network policies that allows us to represent network policies in a high-level language, to optimise the policies, and to generate test cases that can be executed directly in a real-world network.

Our approach supports complex policies that require network address translation (NAT) or port forwarding as well as stateful and stateless protocols (thus, we capture the concepts of widely used network filters such as iptables). For the following example, let us limit ourselves to simple packet filters and a very simple network policy, illustrated by the following table:
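To illustrate why stateful protocols matter for testing, here is a minimal sketch (not part of the HOL-TestGen formalisation; all names are illustrative) of the difference between stateless and stateful filtering: a stateful filter keeps a connection table, so a reply packet is allowed only if the corresponding outbound packet was seen first.

```python
# Toy connection-tracking table: entries are (src, dst, port) triples.
established = set()

def stateful_filter(src, dst, port, outbound):
    """Hypothetical stateful packet filter: outbound packets are allowed
    and recorded; inbound packets are allowed only as replies to a
    previously recorded outbound connection."""
    if outbound:
        established.add((src, dst, port))
        return "allow"
    # Inbound: allowed only if it replies to a tracked connection.
    return "allow" if (dst, src, port) in established else "deny"

print(stateful_filter("intranet-host", "web-server", 80, outbound=True))   # allow
print(stateful_filter("web-server", "intranet-host", 80, outbound=False))  # allow (reply)
print(stateful_filter("attacker", "intranet-host", 80, outbound=False))    # deny
```

A test generator for such a filter must therefore produce test *sequences* of packets, not just individual packets, which is exactly the complication stateful protocols introduce.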

source    destination  protocol  port  action
internet  dmz          udp       25    allow
internet  dmz          tcp       80    allow
dmz       intranet     tcp       25    allow
intranet  dmz          tcp       993   allow
intranet  internet     udp       80    allow
any       any          any       any   deny
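The table above can be read as a first-match rule list with a default deny. The following sketch models that reading in Python; the subnet names, field order, and the `evaluate` helper are illustrative, not part of the formalisation:

```python
# First-match rule list corresponding to the policy table above.
RULES = [
    # (source,    destination, protocol, port, action)
    ("internet", "dmz",      "udp", 25,  "allow"),
    ("internet", "dmz",      "tcp", 80,  "allow"),
    ("dmz",      "intranet", "tcp", 25,  "allow"),
    ("intranet", "dmz",      "tcp", 993, "allow"),
    ("intranet", "internet", "udp", 80,  "allow"),
]

def evaluate(src, dst, proto, port):
    """Return the action of the first matching rule; deny by default
    (the catch-all any/any/any/any deny rule)."""
    for r_src, r_dst, r_proto, r_port, action in RULES:
        if (r_src, r_dst, r_proto, r_port) == (src, dst, proto, port):
            return action
    return "deny"

print(evaluate("internet", "dmz", "tcp", 80))       # allow
print(evaluate("intranet", "internet", "tcp", 80))  # deny (only udp 80 is allowed)
```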

In our example, we have three sub-nets (internet, intranet, and dmz), and the intranet (i.e., the internal network) should be protected from the outside network (i.e., the internet). This representation is close to the usual textbook representation of network policies. In our formalism (or DSL), we can express this policy as follows: \[ \newcommand{\cmd}[1]{\operatorname{\color{blue}{#1}}} \newcommand{\fw}[1]{\operatorname{#1}} \begin{align} \cmd{definition}~\mathrm{TestPolicy} \cmd{where}\\ TestPolicy &= \fw{allow\_port} \fw{udp} 25 \fw{internet} \fw{dmz} \\ &\oplus \fw{allow\_port} \fw{tcp} 80 \fw{internet} \fw{dmz} \\ &\oplus \fw{allow\_port} \fw{tcp} 25 \fw{dmz} \fw{intranet} \\ &\oplus \fw{allow\_port} \fw{tcp} 993 \fw{intranet} \fw{dmz} \\ &\oplus \fw{allow\_port} \fw{udp} 80 \fw{intranet} \fw{internet} \\ &\oplus D_U \end{align} \] where \(D_U\) is the policy that denies all traffic.

Using this policy definition, we can state our test specification as follows: \[ \begin{align} \cmd{test\_spec} \fw{test:}~~ & P~x \Longrightarrow FUT~x = \mathrm{TestPolicy}~x \end{align} \] This specifies that the firewall implementation under test (\(FUT\)) should, for all packets \(x\) satisfying the predicate \(P\), behave as the specification. The predicate \(P\) thus allows us to specify additional test constraints, e.g., not to generate test cases that test traffic within a single sub-net (which, most likely, cannot be monitored easily).
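One plausible instance of such a constraint \(P\) can be sketched as follows. The IP-to-subnet mapping here is entirely made up for illustration (only 172.168.2.1 appears in the article's example packet; the 10.x range for the intranet is an assumption), but it shows the shape of a "must cross a subnet boundary" predicate:

```python
def subnet_of(ip):
    """Toy classification of an IPv4 address into the three subnets.
    The address ranges are illustrative assumptions."""
    if ip.startswith("10."):
        return "intranet"
    if ip.startswith("172.168."):
        return "dmz"
    return "internet"

def P(src_ip, dst_ip):
    """Test constraint: only keep packets that cross a subnet boundary,
    since intra-subnet traffic never reaches the firewall."""
    return subnet_of(src_ip) != subnet_of(dst_ip)

print(P("10.0.0.1", "10.0.0.2"))       # False: intra-subnet, skip
print(P("8.13.12.10", "172.168.2.1"))  # True: internet -> dmz, worth testing
```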

Using our automated test case generation approach, which is implemented on top of HOL-TestGen, we obtain test cases that check both that packets that should be accepted (i.e., pass through the network) actually are accepted and that all others are denied. A (simplified) test case for a denied packet looks as follows: \[ FUT(1,((8,13,12,10),6,\fw{tcp}),((172,168,2,1),80,\fw{tcp}),data)= \lfloor\fw{deny}()\rfloor \]
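Executing such a test case amounts to injecting the packet into the network and comparing the observed verdict with the expected one. A minimal sketch of that step, with a stub in place of the real firewall under test (the stub and the packet tuple layout of id, source, destination, payload are illustrative assumptions):

```python
# The deny test case from above: packet id 1, from 8.13.12.10 port 6 (tcp)
# to 172.168.2.1 port 80 (tcp), carrying some payload.
packet = (1, ((8, 13, 12, 10), 6, "tcp"), ((172, 168, 2, 1), 80, "tcp"), "data")

def fut_stub(pkt):
    """Stand-in for the real firewall under test: a default-deny filter
    with an empty allow list, used here only to show the harness shape."""
    return "deny"

expected = "deny"  # verdict demanded by the generated test case
observed = fut_stub(packet)
print("verdict:", observed, "- test", "passed" if observed == expected else "FAILED")
```

In the real setup, `fut_stub` is replaced by actually sending the packet across the firewall and observing whether it arrives.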

This approach is described in detail in our journal paper [1], and the implementation is available as part of the HOL-TestGen distribution. Our approach has inspired colleagues at Microsoft Research to develop a similar approach using Z3 that is now used for testing the configuration of the Azure data centres.


1. Brucker, A. D., Brügger, L., and Wolff, B. "Formal Firewall Conformance Testing: An Application of Test and Proof Techniques." Software Testing, Verification & Reliability (STVR) 25, no. 1 (2015): 34–71. doi:10.1002/stvr.1544
