
Performance Testing – Vegeta Attack!

Performance testing is a crucial field of modern software development. Given that the majority of today's application communication is web-based, it is more important than ever. However, it still attracts less interest than automated functional testing, and publications on load testing usually focus on mature, complex tools like JMeter or Gatling. In this post I'd like to introduce the command line usage of a super simple and lightweight tool for performance testing of HTTP services: Vegeta.

Who is Vegeta 

Besides being a Dragon Ball character, Vegeta is a simple load testing tool written in Go that can be used both from the command line and as a library in your project. To use Vegeta, download the executables from the project's GitHub releases page. It is recommended to add the executable to your PATH for ease of use.

For Mac users, you can download vegeta directly from Homebrew repositories:

$ brew update && brew install vegeta

Vegeta’s Arsenal 

Usage is pretty straightforward; you just need to follow the scheme:

vegeta [global flags] [command flags]

Global flags let you customize the number of CPUs to use, or profile CPU and heap usage. You can also check the version of your vegeta executable (-version).
Next you choose a command, one of three: attack, report and dump, and then set command flags, which are options for each command. The attack command deals with your attack target. Here you can set the request body, attach a server certificate, set headers, define the req/s rate and much more. The report command lets you create a report of your tests, either as a simple stdout summary or, for example, as neat HTML plots. Dump is a very handy command that logs information about all your requests to stdout or saves it to a file. You can view the full option list with:

$ vegeta

(yes, that simple!)

Prepare to Battle 

Enough theory, let's make some kamehame-ha! We will perform load testing on a simple GET/POST service on our local machine (and yes, load testing your localhost isn't the brightest idea, but it's for educational purposes only). We will use the standalone WireMock stubs from one of my previous posts, which create a REST application mock. Basically, we want to test two methods:

GET /user/1

POST /addNewUser
{
    "login" : "lukaszroslonek",
    "www" : ""
}

First Clash 

Let’s say we want to perform 3 seconds attack with default req/s number, using 2 CPUs. We also want to analyse test results in form of simple text output in console. In order to do so, we’ll execute following command in console:

$ echo "GET http://localhost:8080/user/1" | vegeta -cpus 2 attack -duration=3s | vegeta report

In the first part of the above command, we define the HTTP method and the endpoint under test. In the second part, we set the global flag for the number of CPUs and declare the attack to last 3 seconds. The last part is where we tell Vegeta to prepare the test report, which is printed to stdout by default. If everything went well, we should see output like:

$ echo "GET http://localhost:8080/user/1" | vegeta -cpus 2 attack -duration=3s | vegeta report
Requests [total, rate] 150, 50.34
Duration [total, attack, wait] 2.981607925s, 2.979999934s, 1.607991ms
Latencies [mean, 50, 95, 99, max] 1.291686ms, 1.4065ms, 1.842859ms, 1.994408ms, 2.132038ms
Bytes In [total, mean] 10350, 69.00
Bytes Out [total, mean] 0, 0.00
Success [ratio] 100.00%
Status Codes [code:count] 200:150
Error Set:
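As a side note (this is my sketch, not part of Vegeta itself), the text report is easy to feed into a simple pass/fail gate in a CI script. Below we recreate just the Success line from the output above for illustration; in practice you would save the whole report with vegeta report -output=report.txt and point awk at that file:

```shell
# Recreate the relevant report line (normally written by vegeta report):
cat > report.txt <<'EOF'
Success [ratio] 100.00%
EOF

# Extract the numeric success ratio (third field of the Success line):
ratio=$(awk '/^Success/ {gsub("%", "", $3); print $3}' report.txt)
echo "success ratio: $ratio"

# Exit non-zero when the ratio drops below 100, to break the pipeline:
awk -v r="$ratio" 'BEGIN { exit (r < 100) ? 1 : 0 }' && echo "gate passed"
```

The field positions assume the report layout shown above; if a newer Vegeta version changes the format, the awk expressions need adjusting.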

From the report output we can familiarize ourselves with some important metrics, like success ratio, latency or request rate.

Second Clash 

For the second method, we want to perform a 1 second attack at 100 req/s with the default number of CPUs. The HTTP method and endpoint should be read from a text file, and we want to produce a nice HTML file with a plot report. To do so, we first need to prepare two files.


target.txt:

POST http://localhost:8080/addNewUser

body.json:

{
    "login" : "lukaszroslonek",
    "www" : ""
}
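One convenient way to create both files is straight from the shell with heredocs, which keep the JSON quoting intact (the file names and contents match the ones used in this example):

```shell
# Targets file: one "METHOD URL" entry per line.
cat > target.txt <<'EOF'
POST http://localhost:8080/addNewUser
EOF

# Request body passed to every request via the -body flag.
cat > body.json <<'EOF'
{
    "login" : "lukaszroslonek",
    "www" : ""
}
EOF
```

With more endpoints to hit, you would simply add more METHOD URL lines to target.txt.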

Now we can execute following command:

$ vegeta attack -targets=target.txt -body=body.json -rate=100 -duration=1s | vegeta report -reporter=plot -output=report.html

In this test, we provide the targets list in the target.txt file and the request body in body.json. We also set the report type to plot and redirect the output to the report.html file, which creates a nice looking, interactive plot:

Final Flash 

Results from a single test can be stored in a results.bin file by adding the -output=results.bin flag to the attack command. We can use that file for further analysis of the test output. Let's say we want to create a histogram of request duration times. To achieve that, we need to redirect the output of cat results.bin to vegeta report:

$ cat results.bin | vegeta report -reporter="hist[0,100ms,200ms,300ms]" -output=hist.txt

…and we've just created a text file hist.txt with the following content:

$ cat hist.txt
Bucket # % Histogram
[0, 100ms] 4 100.00% ############################################################
[100ms, 200ms] 0 0.00%
[200ms, 300ms] 0 0.00%
[300ms, +Inf] 0 0.00%
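The histogram is plain text, so it too can back a hypothetical latency gate: count how many requests landed outside the fastest bucket and fail the run if any did. The snippet below recreates the sample hist.txt from above; note that a bucket label like "[0, 100ms]" splits into two awk fields, so the request count is field 3:

```shell
# Recreate the sample histogram (normally written by vegeta report):
cat > hist.txt <<'EOF'
Bucket # % Histogram
[0, 100ms] 4 100.00% ############################################################
[100ms, 200ms] 0 0.00%
[200ms, 300ms] 0 0.00%
[300ms, +Inf] 0 0.00%
EOF

# Sum the request counts of every bucket after the first (rows 3 and up):
slow=$(awk 'NR > 2 { sum += $3 } END { print sum + 0 }' hist.txt)
echo "requests slower than 100ms: $slow"
```

Here the count is 0, since every request hit the first bucket; adjust the bucket boundaries in the -reporter string to match your own latency budget.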

Continue reading 

If you want to continue reading and expand your knowledge in the area of REST and microservices, I recommend these books:

  • Building Microservices – one of the most important books for me, everything you want to know about microservices is here
  • Java For Testers: Learn Java fundamentals fast – test automation does not require complex programming knowledge. Learn fundamentals of Java for test automation. From tester to testers!
  • RESTful Web APIs – another great book about REST architecture. Lots of practical knowledge about designing and consuming RESTful APIs


Vegeta is a very simple tool that lets you perform simple load tests of your HTTP service. Although it does not offer as many possibilities as JMeter, LoadUI or Gatling, it's a great fit whenever you need a quick, scriptable load test without heavy setup.
