Abstract
The firewall is usually the first line of defence in ensuring network security. However, the management of manually configured firewall rules has proven to be complex, error-prone and costly for large networks. Even with error-free rules, the presence of defects in the firewall implementation or device can render the network insecure. Evaluating the effectiveness of the policy and the correctness of its implementation requires a thorough analysis of network traffic data. We present a set of algorithms that simplify this analysis. By analysing only the firewall log files, using aggregation and heuristics, we regenerate the effective firewall rules, i.e., what the firewall is really doing. By comparing these with the original rules, we can readily determine whether there are anomalies in the original rules or defects in the implementation. Our experiments show that the effective firewall rules can be regenerated to a high degree of accuracy from a small amount of data.
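The abstract describes regenerating effective firewall rules from log data by aggregation and heuristics. The sketch below is only a minimal illustration of that general idea, not the authors' algorithms: it assumes a simplified log record of (source, destination, port, protocol, action) and uses a naive hit-count heuristic to group entries into candidate rules.

```python
from collections import defaultdict

# Hypothetical log entries: (src_ip, dst_ip, dst_port, protocol, action).
# The field layout and the aggregation heuristic are illustrative assumptions,
# not the method described in the paper.
LOG = [
    ("10.0.0.5", "192.168.1.10", 80, "tcp", "allow"),
    ("10.0.0.6", "192.168.1.10", 80, "tcp", "allow"),
    ("10.0.0.5", "192.168.1.20", 22, "tcp", "deny"),
    ("10.0.0.7", "192.168.1.10", 80, "tcp", "allow"),
]

def infer_effective_rules(log, min_hits=2):
    """Group log entries by (dst, port, protocol, action).

    Any group observed at least `min_hits` times is reported as a candidate
    effective rule, with the source crudely generalised to 'any'.
    """
    counts = defaultdict(int)
    for src, dst, port, proto, action in log:
        counts[(dst, port, proto, action)] += 1

    rules = []
    for (dst, port, proto, action), hits in counts.items():
        if hits >= min_hits:
            rules.append({"src": "any", "dst": dst, "port": port,
                          "proto": proto, "action": action, "hits": hits})
    return rules

if __name__ == "__main__":
    # Candidate effective rules could then be compared against the configured
    # rule set to flag anomalies or implementation defects.
    for rule in infer_effective_rules(LOG):
        print(rule)
```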
| Original language | English |
|---|---|
| Pages (from-to) | 3-22 |
| Number of pages | 20 |
| Journal | International Journal of Internet Protocol Technology |
| Volume | 5 |
| Issue number | 1-2 |
| DOIs | |
| Publication status | Published - Apr 2010 |
Keywords
- Firewall implementation validation
- Firewall log analysis
- Firewall rule analysis
- Firewall rule anomaly detection
- Internet protocol
- Network security
- Network traffic mining
ASJC Scopus subject areas
- Computer Networks and Communications