Monday, April 10, 2023

Add Manually Crawled Links to Burp Enterprise Scans

I've used Burp Suite Pro for many years, and I have a particular way I like to use scanning as part of my workflow. I generally do my manual testing first, then send individual HTTP requests to a predefined scanning task as I finish with them. This ensures I see and get hands-on with each page and API request, and I can be intentional about both manually testing them and making sure they are scanned properly afterward. There's a bit more to it, but I'll skip those details as they aren't the point of this article.

More recently, I've been helping an organization set up Burp Enterprise to scan their applications. Burp Enterprise and Burp Pro use the same scanning engine, but Burp Enterprise doesn't let you get as granular about what you want to scan. I can't, for example, crawl the whole application manually and then feed those URLs to a Burp Enterprise scan.

In this case, however, that's exactly what I wanted to do. The target included several single-page applications, and Burp Enterprise could use a helping hand. So, I first crawled the application manually using Burp Suite Pro, exercising as much of its functionality as I could. Then, I highlighted all the requests in my proxy history, right-clicked, and chose "Copy URLs".
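Depending on how much manual browsing you did, that copied list can be long and full of near duplicates. It can be worth tidying it up before handing it over; below is a minimal Python sketch (the filenames are placeholders I made up) that strips query strings and fragments, dedupes, and sorts the list:

```python
from urllib.parse import urlsplit, urlunsplit

# Read the URLs copied out of Burp's proxy history via "Copy URLs".
# "proxy-urls.txt" is just a placeholder filename for this sketch.
with open("proxy-urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

cleaned = set()
for url in urls:
    scheme, netloc, path, query, fragment = urlsplit(url)
    # Drop query strings and fragments so near-duplicate requests
    # to the same path collapse into a single entry.
    cleaned.add(urlunsplit((scheme, netloc, path, "", "")))

# Write a sorted, deduplicated list ready to give to Burp Enterprise.
with open("deduped-urls.txt", "w") as f:
    f.write("\n".join(sorted(cleaned)) + "\n")
```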

Next, I needed to get these URLs into Burp Enterprise somehow. My first attempt was to paste them into the "Include URLs" box found when creating a site (shown below).