Toad World Blog

Integrate database load testing in your build pipelines

Apr 29, 2020 3:30:00 AM by Mathew Phan

A follow-up to my previous video post: if you're interested in integrating database performance testing into a build pipeline in Jenkins or other CI tools, check out this tutorial, where I walk you through scripting a database load test using Benchmark Factory's REST API.


In this video we discuss:

  • A summary of the previous video
  • A line-by-line explanation of scripting out Benchmark Factory jobs
  • Invoking the script in Jenkins

Thanks for watching!

 

Free 30-day trial

If you liked what you learned today about load testing and think you could benefit from Benchmark Factory, why not try it out? In a world full of constant change and unpredictability, Benchmark Factory allows DBAs and developers to stay a step ahead. Try Benchmark Factory free for 30 days.

Ready to purchase Benchmark Factory now? Buy now.

Talk to a Benchmark Factory sales professional.

 

More blogs

Blog: Automate database load testing with Benchmark Factory®

Blog: How to create and replay a SQL Server workload with Benchmark Factory.

Video: Introduction to Benchmark Factory.

Blogs: How to create and use the Benchmark Factory REST API feature.

Benchmark Factory REST API - Part 1

Benchmark Factory REST API - Part 2

 

Learn more

Download this tech brief: Benchmark Factory for Databases.

Visit our Benchmark Factory product page to learn how you can increase your database development process speed by up to 40%.

 

Questions?

If you have any questions about load testing, visit the Toad Benchmark Factory Forums to ask a question. If you have input you want to share on how to make Benchmark Factory better, submit it to our Idea Pond.

 

Share on social media

If you think your colleagues would benefit from this blog, share it now on social media with the buttons located at the top of this blog post. Thanks!

 

Transcript from webcast:

Hello again, my name is Mathew Phan, a software consultant at Quest Software specializing in our Toad database management solutions. Welcome to the second video of a two-part series about automated database load testing using Benchmark Factory's REST API. In the first video, we covered how Benchmark Factory provides a UI for database load testing, scale testing, and performance testing, and how Benchmark Factory can be accessed through a REST API.

PowerShell as a scripting language

We went through some examples of issuing those REST commands through Postman. In this video, we will use PowerShell as a scripting language to issue those REST API commands. Then we'll take our PowerShell script and put it into Jenkins, a continuous integration system. That way, you can embed database testing as part of your application release cycles.

 

You can use whatever scripting language you want as long as it supports REST, such as Python, Groovy, or VB. The process should be similar even though the syntax won't be exactly the same. Now let's walk through how the script works. First, you'll need PowerShell 3.0 or higher to support REST calls, so here's a quick check: if the version is less than 3, we throw an error.
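Since the video notes that any REST-capable language will do (Python among them), the sketches in this post use Python. A minimal equivalent of the version guard might look like this; the 3.6 floor is my own choice for f-string support, not something the video specifies:

```python
import sys

# The PowerShell script requires PowerShell 3.0+ for Invoke-RestMethod.
# The equivalent guard in this Python sketch bails out early on
# interpreters too old to run the rest of the examples (3.6 chosen
# here for f-string support).
if sys.version_info < (3, 6):
    raise RuntimeError("Python 3.6 or higher is required to run this script")
print("Python version OK:", sys.version.split()[0])
```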

 

Set variables

Next, we're going to set some variables. The state of a job in Benchmark Factory is 1, 2, 3, 4, or 5, which isn't very descriptive, so this just attaches text to those values. A state of 1 means the job is complete, a state of 2 means the job is running, and so on. You can view the definitions of the states in Benchmark Factory's REST API documentation. A Benchmark Factory job may have more than one test, and those tests have states as well, so here again we're just assigning names to the states. You can pretty much leave these lines as they are. Line 33, however, is where you want to start customizing. It sets a variable called BMFServer, which is the location where Benchmark Factory is installed, so put the hostname or IP address of the machine where Benchmark Factory is installed. In my example I happen to have it installed on the same computer I'm running the script from, so I'm going to use the name localhost.
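In a Python sketch (one of the REST-capable alternatives the video mentions), that setup might look like the following. The two state labels are the ones the transcript names, and the port comes from the URI shown later in the video; check the REST API documentation for the full list of states:

```python
# Human-readable labels for Benchmark Factory job/test states. The
# transcript names states 1 (complete) and 2 (running); see the BMF
# REST API documentation for the remaining codes.
JOB_STATES = {
    1: "Job Complete",
    2: "Job Running",
}

# Hostname or IP of the machine where Benchmark Factory is installed
# (localhost in the video, since the script runs on the same machine).
BMF_SERVER = "localhost"

# Base URI for the jobs endpoint; port 30100 is the one shown in the video.
BASE_URI = f"http://{BMF_SERVER}:30100/api/jobs"
print(BASE_URI)
```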

 

The next part is just a catch-all for any error we haven't planned for, like the network going down or something else wildly unforeseen. Then we clear out any errors before we start. PowerShell lets me organize tasks into functions, so we're going to leverage that for readability. We'll start with the main method. The purpose of the script is to create a job, run that job, check the results, then delete the job. If you have a Benchmark Factory UI console, it's far easier to create the job in the UI than it is to create it through PowerShell, but I'll create the job through the script as an example so you can see what it looks like.

 

We're going to create some variables so that if we need to update the script later with new values, we can change them in one part of the script instead of having to hunt down every occurrence. In this main method we'll first create a variable called job name; we'll call the job mytest SQL scalability. Then we'll create a variable called user load and set it to 10; we're going to make a job with 10 users. Then we'll set the connection we want to use. This connection was already created in Benchmark Factory's console, so I'll just put the name here: Oracle_ORCLPDB. Then we'll set the value for $SQL. This is the single statement I'm going to be testing for scalability; you can put whatever SQL query or function call you want to test here. For this test I want to make sure the service level agreement is met: my database should return the results of the query in an average of 3 seconds per user, so I'm going to set the response time average to 3. After setting these variables, we call some functions: add scalability job, run job, and validate SLA. We'll define these functions in just a moment. After we run them, we delete the job. That was just a couple of lines of code, so I didn't feel it was necessary to separate it into its own function; I just put it here in the main method.
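Sketched in Python, the top of the main method might bundle those values in one place. The names mirror the video; the SQL text below is a placeholder for whatever statement you want to test:

```python
def job_config():
    """All the tunable values from the top of the main method, in one
    place, so later changes touch a single spot. The SQL statement is a
    placeholder; substitute the query you want to scale-test."""
    return {
        "job_name": "mytest SQL scalability",
        "user_load": 10,                       # virtual users
        "connection_name": "Oracle_ORCLPDB",   # connection created in the BMF console
        "sql": "SELECT COUNT(*) FROM orders",  # placeholder statement
        "sla_avg_seconds": 3,                  # target average response time per user
    }

# The main flow then reads: add the scalability job, run it, validate
# the SLA, and finally delete the job -- each a small function that
# takes its inputs from this config.
```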

 

Define a variable called URI

While we're here, let's take a look at this first RESTful call to Benchmark Factory and break it down. First we write "deleting job" to the output with the job name as part of the string, so it should say "deleting job mytest SQL scalability" in my console. Now, here's the actual delete command in PowerShell. To make it a little cleaner, we define a variable called URI and concatenate the pieces of the HTTP request together, saving the result as URI. It should read http://localhost:30100/api/jobs/mytest SQL scalability. Again, we're just doing this to make things look cleaner. Now it's time to invoke the REST command, so we call Invoke-RestMethod, a PowerShell command, specify the method as Delete, and pass in the URI as $URI.

 

Which, again, is just the string here. This is done so that whenever we want to change the hostname or job name, we update the variable rather than having to find all the instances of these URIs and update them individually. All right, so again: set some variables, add the scalability job, run the job, validate the SLA, then delete the job.
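As a Python sketch using only the standard library: the URI shape follows the transcript, and the delete call itself is shown but needs a running Benchmark Factory server, so only the URI builder is exercised here:

```python
import urllib.request

def job_uri(job_name, server="localhost", port=30100):
    """Build the jobs-endpoint URI the transcript concatenates by hand:
    http://<server>:<port>/api/jobs/<job name>."""
    return f"http://{server}:{port}/api/jobs/{job_name}"

def delete_job(job_name, server="localhost"):
    """Equivalent of Invoke-RestMethod -Method Delete -Uri $URI."""
    print(f"Deleting job {job_name}")
    req = urllib.request.Request(job_uri(job_name, server), method="DELETE")
    with urllib.request.urlopen(req) as resp:  # requires a running BMF server
        return resp.status
```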

 

Let's take a look at the first function, add scalability job, which is defined right here. This adds the Benchmark Factory job. Again, you don't have to do it this way; it would be easier to create the job from the Benchmark Factory console UI, but if you wanted to create a job from a script, here's an example. The function first prints "creating job mytest SQL scalability" to the console. This example uses XML and has an XML string attached to it. I'm going to create the XML string with the following tags: the name tag holds the same job name, mytest SQL scalability; the connection comes from the connection name variable; the test name comes from an earlier variable; the SQL query or function comes from the variable we defined; and the user load variable is 10 users. So: create a job with this connection, these test names, these parameters, and this user load, and set that equal to SQL job XML. Using variables here is super handy. Again we create a URI variable; it's http:// plus the BMF server, which in my case is localhost, plus the port number and /api/jobs. This time we invoke the REST command as a POST: the content type is defined as text/xml, the URI is the new URI variable, and the body is the SQL job XML found here. I'm also saving the result in a response variable. That way you keep the value of the response in case you want to print out codes or use it for debugging later; you can print $response to the console.
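The same job-creation POST, sketched in Python. The XML tag names below only illustrate the shape the transcript describes; they are not a verified schema, so consult the Benchmark Factory REST API documentation for the exact tags. The POST itself needs a live server, so only the payload builder is exercised:

```python
import urllib.request

def scalability_job_xml(job_name, connection_name, test_name, sql, user_load):
    """Assemble the XML body for the job-creation POST. Tag names are
    illustrative (they follow the transcript's description, not a
    verified schema); see the BMF REST API docs for the real layout."""
    return (
        f"<Job><Name>{job_name}</Name>"
        f"<ConnectionName>{connection_name}</ConnectionName>"
        f"<Test><Name>{test_name}</Name>"
        f"<SQL>{sql}</SQL>"
        f"<UserLoad>{user_load}</UserLoad>"
        f"</Test></Job>"
    )

def add_scalability_job(xml_body, server="localhost"):
    """POST the XML to /api/jobs, as Invoke-RestMethod -Method Post does,
    and return the response body for debugging."""
    req = urllib.request.Request(
        f"http://{server}:30100/api/jobs",
        data=xml_body.encode(),
        method="POST",
        headers={"Content-Type": "text/xml"},
    )
    with urllib.request.urlopen(req) as resp:  # requires a running BMF server
        return resp.read().decode()
```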

 

Run job

The next function is run job; let's take a look at that one. We print "starting job mytest SQL scalability" to the console and create a URI again, this time localhost, the port number, and /api/jobs/mytest SQL scalability. Then we invoke the REST method again; this time the method is a PUT, the content type is text/xml, the URI is as defined above, and the body is the tag <Status>2</Status>. That means: change the status of this job to 2, and if you recall, a status of 2 means run the job, so go ahead and run it. The next part is a little bonus. We loop for 10 minutes checking on the job status. First we wait 5 seconds before the first check. Then we go to the URI for the job name, mytest SQL scalability, with /status appended, because we want to get the status of that job. So: wait 5 seconds and get the job status; if the status is not equal to job running, say "job completed". If the job is not running, it must be completed, so break out of the loop and we're done. But if the job is still running, wait another 10 seconds and check again. We go back to the top and check the job status, and keep doing that until either the job completes or we reach 10 minutes, in which case the job has timed out and we print that message.
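The polling loop sketched in Python. The status-fetching call is injected as a callable (in a real run it would GET the job's /status endpoint) so the loop logic can be tested without a live server; the intervals mirror the transcript's 5-second first wait and 10-second polls:

```python
import time

JOB_RUNNING = 2  # PUT <Status>2</Status> tells BMF to run the job

def wait_for_completion(get_status, timeout_s=600, poll_s=10, first_wait_s=5):
    """Poll until the job leaves the running state or the timeout
    elapses, mirroring the transcript's 10-minute loop. get_status is
    a callable returning the numeric job state; in a real run it would
    GET http://<server>:30100/api/jobs/<job name>/status."""
    time.sleep(first_wait_s)
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if get_status() != JOB_RUNNING:
            print("Job completed")
            return True
        time.sleep(poll_s)
    print("Timed out waiting for job to complete")
    return False
```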

 

Run index

The next part here is a little helper function called get run index. A job can have multiple tests, and if you run the job multiple times you'll have multiple test runs, so this function counts how many test runs there have been. It also finds the number of the last test run and saves it as a value. That will make more sense when we talk about the next function. We check all the test runs, and if a test run has the test name we're looking for, we count it; count how many times you've seen this test show up, so if this test has run 20 times, return the value 20.
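In Python, that helper reduces to a count over the run records. The record shape here (a list of dicts with a test_name field) is an assumption for illustration; the real script reads the runs from the REST API:

```python
def last_run_index(test_runs, test_name):
    """Count how many recorded runs belong to test_name; since runs are
    numbered sequentially, the count is also the index of the most
    recent run (the transcript's get-run-index helper). test_runs is a
    list of run records, each assumed to carry a 'test_name' field."""
    return sum(1 for run in test_runs if run["test_name"] == test_name)
```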

 

Validate SLA

Our last function is validate SLA. Again, we first print that we're going to validate. We get the number of times this test has run; let's say it ran 20 times, so we get the values for the 20th test run. We check test run number 20 and see whether that test is still running or has completed. If the test state is not equal to complete, then maybe the test is still running or did not complete successfully, so we print that test 20 did not successfully complete. Then we take a look at test run number 20 and load up the user load report: invoke a GET for test run number 20 and ask for the user load results. Then, for each of the users in that user load, we check the response time. If you remember, there were 10 users, so we go to each of those 10 users and check how long it took each user to finish their load. User 1 might have run in 2 seconds, user 2 in 3 seconds, user 4 in 1 second, and so on. We get the average across the 10 users and compare it to my SLA, which was 3 seconds. If the average response time of the 10 users is less than 3 seconds, we pass; if it's more than 3 seconds, we fail this user load test. And at the bottom, we simply execute the main method.
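The SLA check itself is just an average and a comparison; in Python it might look like this. The video treats an average over the SLA as a failure and under it as a pass; hitting the SLA exactly is ambiguous in the transcript, so I've treated it as a pass here:

```python
def sla_passed(user_response_times, sla_avg_seconds=3):
    """Average the per-user response times from the user-load report and
    compare against the SLA, as the validate-SLA step does. An average
    exactly at the SLA is counted as a pass (my assumption; the video
    only covers strictly above and strictly below)."""
    avg = sum(user_response_times) / len(user_response_times)
    print(f"Average response time: {avg:.2f}s (SLA: {sla_avg_seconds}s)")
    return avg <= sla_avg_seconds
```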

 

All right, let's go ahead and run this. I'll click the play button here to run the script, and while it's running we'll also bring up Benchmark Factory. You can see the job was created and the job is running. My PowerShell output says it's creating the job and starting the job; you can see the job mytest SQL scalability is running. It's running test SQL number 1 here and kicking off a 10-user virtual load.

 

The job is completed. We verified the SLA value, got the test results for test run number 60, and it passed the SLA with a 10-user load. Then we deleted the job from Benchmark Factory, so if I go back to Benchmark Factory, the job has been deleted. So: we created the job, ran it, and the job completed; we validated the SLA for test run number 60, and we passed.

 

Jenkins

Then we went ahead and deleted the job. All right, now that I've validated my PowerShell script, let's put it into Jenkins. In Jenkins, I go to Manage Jenkins, open the plugin manager, and search for the PowerShell plugin. I have the plugin already installed and can find it in my installed list down here. Since it's installed, I can invoke PowerShell scripts as part of a Jenkins build pipeline. Let's go back to my Jenkins homepage. Here I have a pipeline called DB load test; let's take a look and configure it. My project name is DB load test. I could tap into a source control management system so that any time code is checked in, this pipeline runs automatically. Going down to the build step, you can see I've selected PowerShell and basically pasted in the same PowerShell script. Because I have the PowerShell plugin, I can just add a step here, select Windows PowerShell, and paste in my PowerShell code as I had it before.

 

I'm going to save this. I could say Build Now, or have it run on a schedule or some other trigger, but I'll run Build Now. So build number five is running here, and you can see it has invoked Benchmark Factory and is running my scale test. This puts a database test inside an application release: every time the code is updated, or on a schedule, this test runs against my database, whether it's a scale test, a performance test, a reliability test, and so on. I can go back to build number 5, look at the console output, and in Jenkins I can see that it created the test, started it, the test completed, and test run number 61 here passed for a user load of 10.

So to recap: Benchmark Factory can run load tests, performance tests, and scale tests to ensure database reliability. You can run these tests from Benchmark Factory's console, but by leveraging its new REST API you can embed database testing as part of your build and release management cycles and integrate it into your existing CI/CD strategies and systems. In this video, we covered an example that uses PowerShell to call Benchmark Factory and run a scalability test from Jenkins.

I hope this was helpful and gave you some ideas for the future. Thank you for watching. 


Tags: DevOps Jenkins Benchmark Factory Tutorial Benchmark Factory Videos Automation CI

Mathew Phan

Written by Mathew Phan

Mathew Phan began his career as a technical support engineer for Toad for Oracle soon after graduating from the University of California, Irvine, where he obtained a bachelor's degree in computer and information science. Mat has devoted his entire 14-year career to the Toad family of database solutions and has held various positions helping to shape the direction and growth of Toad. Early in his tenure, his focus was on Oracle databases from a developer's point of view. Now, as a systems consultant, he continues to work with customers to build effective solutions that meet their needs.