This setup is one giant collection of regression tests, like every project should have. Mingled in with them was some basic security testing from a user's perspective as well. However, what we did not have was a way to automate testing hack attempts against our sites. One reason is the sheer number of possible hacks, and combinations of hacks, that should be tested for.
Enter the OWASP ZAP tool. A group of security experts formed a non-profit to educate people about security, and one of that organization's outputs is a free (and well maintained) tool for attacking your own sites and producing reports about any security issues it finds. The best part is that this tool has a well developed REST API, so you can run it in an automated fashion.
The ZAP tool has a decent API UI to help you learn it, but beyond that the API is lacking documentation, which is the reason for this post.
If you open the ZAP tool GUI and click Tools / Options / API, you can find your API key and a few API-related settings you can adjust. By default, the tool serves a web UI at http://localhost:8080/. Clicking the Local API link on that page takes you to the REST API help and demo area, where you can run each call.
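Before scripting anything bigger, it can help to confirm you can reach the API at all. Below is a minimal sketch (assuming Python with the requests library and a placeholder API key) that calls the read-only core/view/version endpoint as a connectivity check.

```python
# Minimal connectivity check. Assumptions: ZAP is running locally on port 8080
# and API_KEY holds the key shown under Tools / Options / API.
import requests

API_KEY = "yourapikey"   # placeholder
BASE = "http://localhost:8080"

# core/view/version is a simple read-only call, useful to verify the key and port.
resp = requests.get(f"{BASE}/JSON/core/view/version/",
                    params={"zapapiformat": "JSON", "apikey": API_KEY})
print(resp.json())
```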
In the GUI you can enter the base URL for your site and click Attack. That is an easy way to run a one-off attack against your site, but if you want to automate the process, there are several REST calls involved.
The first thing you need to do is start the application from the command line. On Windows it installs to:
C:\Program Files\OWASP\Zed Attack Proxy\zap.bat
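For a fully unattended run, ZAP can also be started headless. Here is a rough sketch, assuming Python's subprocess module and ZAP's -daemon and -port command-line flags; adjust the path and the wait for your setup.

```python
# Sketch of launching ZAP headless before scripting against it.
# Assumptions: the Windows install path above, and the -daemon/-port flags
# for headless mode.
import subprocess
import time

ZAP_BAT = r"C:\Program Files\OWASP\Zed Attack Proxy\zap.bat"
proc = subprocess.Popen([ZAP_BAT, "-daemon", "-port", "8080"])

time.sleep(30)  # crude wait for ZAP to finish starting; polling the API is more robust
```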
Then call these REST endpoints (a Python sketch tying them all together follows the list):
- NewSession (clears all unsaved scan history for a fresh start; you could load a saved session instead if desired)
http://localhost:8080/JSON/core/action/newSession/?zapapiformat=JSON&apikey=[yourapikey]
- SetMode (sets the scan mode to attack level)
http://localhost:8080/JSON/core/action/setMode/?zapapiformat=JSON&apikey=[yourapikey]&mode=attack
- Spider.Scan (spiders the URL to find all the links to attack; this has to run before an AScan attack, because the attack is only performed on URLs already in the ZAP session. The spider is the most efficient way of crawling one or more pages and loading all the found URLs into memory. The attack only targets distinct URLs, so it does not matter if the spider duplicates URLs while crawling pages.)
http://localhost:8080/JSON/spider/action/scan/?zapapiformat=JSON&apikey=[yourapikey]&url=[an escaped url to crawl]&recurse=true
- AScan.Scan (fires off an attack against all links the spider found under this base URL)
http://localhost:8080/JSON/ascan/action/scan/?zapapiformat=JSON&apikey=[yourapikey]&url=[an escaped url to lookup]&recurse=true
- JsonReport (gets the full report for all scans and attacks in JSON format)
http://localhost:8080/OTHER/core/other/jsonreport/?apikey=[yourapikey]
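Tying the calls above together, here is a sketch of the whole sequence in Python. It assumes ZAP is already running on localhost:8080, uses placeholder values for the API key and target URL, and polls the spider and ascan status views (not listed above) to wait for each phase to finish before moving on.

```python
# Sketch of the automated scan/attack sequence described above.
# Assumptions: ZAP is running on localhost:8080, API_KEY and TARGET are
# placeholders, and the spider/ascan "status" views are used for polling.
import time
import requests

BASE = "http://localhost:8080"
API_KEY = "yourapikey"          # from Tools / Options / API
TARGET = "http://example.com"   # base URL of the site to scan

def zap_get(path, **params):
    """Call a ZAP JSON endpoint and return the parsed response."""
    params.update({"zapapiformat": "JSON", "apikey": API_KEY})
    resp = requests.get(f"{BASE}{path}", params=params)
    resp.raise_for_status()
    return resp.json()

# 1. Fresh session, then switch to attack mode.
zap_get("/JSON/core/action/newSession/")
zap_get("/JSON/core/action/setMode/", mode="attack")

# 2. Spider the target so its URLs are loaded into the ZAP session.
scan_id = zap_get("/JSON/spider/action/scan/", url=TARGET, recurse="true")["scan"]
while int(zap_get("/JSON/spider/view/status/", scanId=scan_id)["status"]) < 100:
    time.sleep(2)

# 3. Active scan (attack) everything the spider found under the base URL.
scan_id = zap_get("/JSON/ascan/action/scan/", url=TARGET, recurse="true")["scan"]
while int(zap_get("/JSON/ascan/view/status/", scanId=scan_id)["status"]) < 100:
    time.sleep(5)

# 4. Pull the JSON report covering all scans and attacks.
report = requests.get(f"{BASE}/OTHER/core/other/jsonreport/",
                      params={"apikey": API_KEY})
with open("zap-report.json", "wb") as f:
    f.write(report.content)
```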
There are many more options and other REST endpoints you can call to customize things from here and to perform other, more detailed automated attacks. However, those above are the basics required to create an automated scan and attack. Hopefully that provides enough of a jump start for others to get the concept down and take the process further on their own.