Logging BurpSuite with ELK Stack

During a penetration testing assessment, accountability is very important, both for customer satisfaction and for the ability to recreate a crash or deletion if the situation arises.  There are many ways to accomplish this, such as using an Nginx reverse proxy or a Squid caching proxy and logging all of the details.  However, these can be cumbersome to set up on the fly during an engagement with a quick deadline, and they may not be an option at all if you are off-site or working on an internal web application where certain ports are not allowed to egress through the firewall.

"ELK" is an acronym for three open source projects: Elasticsearch, Logstash, and Kibana.  Elasticsearch is a search and analytics engine.  Logstash is a server-side data processing pipeline that can ingest data from multiple sources, transform the data, and then send it to a "stash" like the Elasticsearch mentioned above.  Finally, Kibana is the visualization layer, where you can build pretty graphs and charts.

The ELK Stack is highly popular because it fills a gap in the log analytics space.  There are other, more polished products, but like most enterprise software the price is astronomical.  ELK keeps the open source model and remains free to license and use.  Reportedly, ELK is downloaded more times in a single month, across multiple mediums, than Splunk's total customer count.  (Not that Splunk isn't nice, but a quick and easy-to-set-up free equivalent of Splunk is hard to beat.)

One way to accomplish this is to run the ELK Stack locally on the remote/attacking machine.  However, this presents another issue: we need to be able to stand up a fresh machine, or reload one quickly, to meet that quick deadline.  This is where the magic of Docker comes into play.

Docker is a platform designed to make it easier to create, deploy, and run applications by using containers.  Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it out as one package.  By shipping the entire application as one package, a developer or system administrator can be assured that the application will run on any other machine, regardless of any customized settings on that machine that might differ from the machine the application was built on.  The container model allows us to download and run entire multi-application environments with single commands, which is perfect for systems that need to be set up in a hurry.
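
As a quick illustration of that single-command model (the images and version tag below are only examples, not something this guide depends on), one docker run is enough to pull an image and start a fully packaged application:

# Pull and run a throwaway test container in one command
docker run --rm hello-world

# The same model works for full services, e.g. a single-node Elasticsearch
# exposed on its default port (version tag chosen purely as an example)
docker run -d -p 9200:9200 -e "discovery.type=single-node" \
    docker.elastic.co/elasticsearch/elasticsearch:6.2.4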

Here is a quick intro on how to install Docker and Docker-Compose.  This was set up on a fresh Kali 2018.2 rolling release.  If you need more resources on installing or configuring Docker on your system, there are a plethora of guides and tutorials on the web.

#!/bin/bash

# Add Docker's apt repository, remove old versions of Docker, and install the latest Docker CE
curl -fsSL https://download.docker.com/linux/debian/gpg | sudo apt-key add -
echo 'deb https://download.docker.com/linux/debian stretch stable' > /etc/apt/sources.list.d/docker.list
apt-get update
apt-get remove docker docker-engine docker.io -y
apt-get install docker-ce -y
systemctl start docker
systemctl enable docker
docker --version

# Install Docker-Compose
curl -L https://github.com/docker/compose/releases/download/1.21.2/docker-compose-$(uname -s)-$(uname -m) -o /usr/local/bin/docker-compose
chmod +x /usr/local/bin/docker-compose
docker-compose --version

Now that Docker is installed, the entire ELK Stack can be stood up with two commands.

git clone https://github.com/deviantony/docker-elk.git && cd docker-elk
docker-compose up -d

The first command pulls down a GitHub repository that contains the entire ELK Stack configuration and then moves into the downloaded folder.  The second command is the Docker-Compose command.  Docker Compose is a tool for defining and running multi-container Docker applications.  With Compose, you use a Compose file to configure your application's services; then, with a single command, you create and start all of the services from that configuration, which is exactly what the second command does.  The -d flag at the end tells the Docker engine to run all of the services in the background.  If you are trying to debug or investigate why part of your ELK Stack is not working, leaving off the -d can be beneficial.  To make sure your ELK Stack is up and running, you can simply type docker ps, which will list all of the running containers.  You should see something similar to this:
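
If you would rather verify from the command line than eyeball the full docker ps output, a couple of quick checks along these lines should work; the port assumes the default exposed by the docker-elk Compose file, and newer versions of that repository enable X-Pack security, in which case Elasticsearch will ask for the elastic user's credentials.

# Show just the container names and their status
docker ps --format '{{.Names}}: {{.Status}}'

# Elasticsearch should answer with a JSON banner on its default HTTP port
curl 'http://localhost:9200'

# Cluster health should report green or yellow once the node has settled
curl 'http://localhost:9200/_cluster/health?pretty'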

Now it is time to tie BurpSuite into the ELK Stack by adding an extension to the Community Edition or Professional version of BurpSuite.  Select the Extender tab at the top of the BurpSuite application, and then choose BApp Store.  It should be noted that there are many other great extensions for BurpSuite, but unless you have a Professional license many of them will be restricted.  From the BApp Store choose the Logger++ extension and then click the Install button.  Once installed, the Logger++ extension will add a new tab at the top of your BurpSuite application.

Now we must configure the Logger++ extension to forward log data to the Elasticsearch listening address.  Inside the Logger++ extension there is an Options tab, which will take you to the configuration page.  About halfway down the configuration page you will see the Elasticsearch subsection, shown in the screenshot below.  Take note of the Cluster Name, docker-cluster; this is defined by the Compose setup above, and if you change the Cluster Name to anything else it will not connect correctly.  The Index field can be changed to a customer name or to BurpSuite if you like.  Once this information is set up appropriately, you can click the Enabled button and logging to the ELK Stack will begin.
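
If you want to double-check the cluster name before clicking Enabled, the same curl used earlier to confirm Elasticsearch is up also reports the cluster name Logger++ needs to match (port and open access assumed from the default docker-elk setup):

# Look for "cluster_name" : "docker-cluster" in the JSON response
curl 'http://localhost:9200'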

In order to actually see data being transferred to Elasticsearch, we must first generate some data.  I am going to skip over the actual steps in BurpSuite and assume that you can already generate some requests in one way or another.  For this demonstration I have chosen to spider Reddit's homepage for a few requests.  If you click the Logger++ tab at the top and then choose the View Logs tab, you will be able to see the requests being sent in near real-time.

Once you have generated some logs, we can finally inspect and configure the Kibana portion of the ELK Stack.  You can navigate to Kibana by going to http://localhost:5601.  You will be greeted with the screen shown in the first screenshot below.  At this point the data has already arrived in Elasticsearch, and Kibana is waiting for us to categorize it for visualization.
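
You can also confirm from the command line that the documents have landed in Elasticsearch before touching Kibana; listing the indices should show your Burp index with a growing document count (the index name will be whatever you configured in Logger++ earlier).

# List all indices along with their document counts
curl 'http://localhost:9200/_cat/indices?v'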

By choosing the Management tab you will see that Kibana needs an index pattern before it can display the data that has already been sent over.  You should notice the logger index displayed below.  Inside the Index pattern input field you will need to type the Index Name you defined earlier in your BurpSuite configuration.  Make sure to leave the -* at the end of the index pattern; that way, if you ever decide to add a date or a customer code to the Index Name, the wildcard at the end of logger-* will still match it.  The final step will ask if you would like to define a Time Filter Field.  My recommendation is to use the requesttime provided by BurpSuite Logger++; other options are responsetime or no Time Filter Field at all.  Once you click the final Create Index Pattern button, the data will be available for viewing under the Discover tab on the left-hand side of the page.
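
If you are curious which fields are available for the Time Filter Field (or for later visualizations), you can pull back a single raw document straight from Elasticsearch; the pattern below assumes the default logger index name used earlier.

# Fetch one document from any index matching the pattern to inspect its fields
curl 'http://localhost:9200/logger*/_search?size=1&pretty'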

As you can see from the screenshot below, this is what the data sent from the Logger++ extension will look like.  Uh oh.  Not seeing any data in your records?  Make sure to set the time frame appropriately.  Sometimes Kibana does not interpret the request time in the correct format, and a quick work-around is to simply set the time frame to the past 5 years.  The option to change the time frame is in the upper right-hand corner of the Kibana dashboard.

You have now set up BurpSuite Logger++ to forward logs into a full ELK Stack.  From here you can utilize the Kibana visualization to create charts and graphs.