Go vs Rust Web API Performance Testing — Rust Baseline Part #1

It is a common belief that Rust APIs are among the most performant because of the language's semantics and its compile-time, ownership-based memory management. Rust is often compared to C++ in terms of performance. What also makes me curious is how Rust compares with Go. So I decided to establish a baseline for a fairly simple API written in Rust. This article highlights the performance results.

I would recommend reading an article I wrote a few days back comparing Go and Rust; that article is the motivation for this one. Go, being a highly productive systems language, is a natural performance comparison for Rust. But what is the real performance difference between Go and Rust? Can we quantify it? That's why I wanted to see this for myself.

Setup

The setup is divided into two phases, each with its own test.

For the first setup, I created a new REST API in Rust. It's a simple API that takes a JSON input, deserializes it, validates it, and then returns a response.
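To make that concrete, here is a minimal sketch of the request payload's data model and a plausible validation step. The struct and field names mirror the JSON body used in the k6 script further below, but the types and the validation rule (allocations must sum to 100%) are my assumptions, since the article doesn't show the server code. The real service would additionally derive serde's Deserialize and register a handler with whichever web framework it uses.

```rust
// Hypothetical data model for the /experiments payload; field names match
// the JSON body sent by the k6 script. In the real service these structs
// would derive serde's Deserialize so the framework can parse the body.
#[derive(Debug)]
struct Variant {
    name: String,
    allocation_percent: f64,
}

#[derive(Debug)]
struct Experiment {
    name: String,
    variants: Vec<Variant>,
}

// One plausible validation rule (an assumption -- the article only says the
// API "validates"): the name must be non-empty and allocations sum to 100%.
fn validate(e: &Experiment) -> Result<(), String> {
    if e.name.is_empty() {
        return Err("experiment name must not be empty".into());
    }
    let total: f64 = e.variants.iter().map(|v| v.allocation_percent).sum();
    if (total - 100.0).abs() > f64::EPSILON {
        return Err(format!("allocations sum to {}, expected 100", total));
    }
    Ok(())
}

fn main() {
    // Mirrors the payload from the k6 script: two variants at 50% each.
    let e = Experiment {
        name: "new_home_page".into(),
        variants: vec![
            Variant { name: "blue_button".into(), allocation_percent: 50.0 },
            Variant { name: "red_button".into(), allocation_percent: 50.0 },
        ],
    };
    assert!(validate(&e).is_ok());
    println!("valid experiment: {}", e.name);
}
```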

The second setup adds an MD5 hash computation to every request, to see how that impacts performance. I believe that this slightly CPU-intensive operation will have some impact on API performance. I may be wrong!

I Dockerize this build and then run a performance test on a new VM on GCP. This ensures that I can standardize the environment and repeat the same test for the equivalent API in Go.

Dockerfile

The Dockerfile for my Rust application is a simple multi-stage build:

FROM rust:1.63.0 AS build

WORKDIR /src/openab
COPY . .
RUN cd management-server && cargo install --path .
RUN ls -al /usr/local/cargo/bin

FROM debian:stable-slim
COPY --from=build /usr/local/cargo/bin/management-server /bin
CMD ["/bin/management-server"]
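For reference, building and running this image might look like the following. The image tag is hypothetical, and I'm assuming the server listens on port 3000, since that's the port the k6 script targets.

```shell
# Build the multi-stage image defined by the Dockerfile above
docker build -t management-server .

# Run it, exposing the API on port 3000 (the port the k6 script targets)
docker run --rm -p 3000:3000 management-server
```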

The load generator runs on a separate VM with the same machine configuration. We'll use k6 to generate the load. Below are the script and the command used to run the test.

Script

import http from 'k6/http';

export default function () {
  const url = 'http://x.x.x.x:3000/experiments';
  const payload = JSON.stringify({
    "name": "new_home_page",
    "variants": [
      {
        "name": "blue_button",
        "allocation_percent": 50.0
      },
      {
        "name": "red_button",
        "allocation_percent": 50.0
      }
    ]
  });
  const params = {
    headers: {
      'Content-Type': 'application/json',
    },
  };
  http.post(url, payload, params);
}

Test Command

k6 run --vus 3000 --iterations 1000000 script.js

The server runs on an e2-medium machine (2 cores, 4 GB RAM). We'll run the test for 1 million requests with 3,000 virtual users concurrently hitting the server. This should be a significant load for a performance test.

Results Scenario 1 — Without Hash Generation

Below are the results for our first test.

[k6 results screenshot for Scenario 1]

Looking at the results, it felt pretty good. With just 2 cores and 4 GB RAM, we achieved a throughput of roughly 9.5K requests per second with 3,000 virtual users. CPU peaked at 83% on the machine.

Results Scenario 2 — With Hash Generation

For this scenario, I updated the codebase to generate an MD5 hash of the experiment name concatenated with the Unix timestamp. I replace the experiment name with the hash and return that in the response.

use std::time::{SystemTime, UNIX_EPOCH};

// Hash "<name>-<unix seconds>" and replace the name with the hex digest
let since_the_epoch = SystemTime::now().duration_since(UNIX_EPOCH).unwrap();
let digest = md5::compute(format!("{}-{}", e.name, since_the_epoch.as_secs()));
e.name = format!("{:x}", digest);

I ran the same test with the exact same setup.

[k6 results screenshot for Scenario 2]

The results are quite comparable. Even with the added MD5 hash, throughput dropped by only about 30 requests per second. There is a very small increase in latency (response times), attributable to the hash computation.

Conclusion

It was interesting to see the results, and I'm quite happy with the throughput. 9.5K requests per second with 3,000 virtual users, including JSON serialization and deserialization, is a particularly good baseline for an e2-medium machine. One thing I noticed was that CPU usage was consistent, with no spikes. There is not a lot of variance in request latencies or CPU usage with 3,000 virtual users simultaneously hitting the service. The results are very predictable.

That's all for the Rust baseline. In the next article, I'll write the exact same program in Go and try to get baseline numbers for that API. It will be interesting to see what happens. I'm excited about this one!

Please stay tuned and follow me for more!


Go vs Rust Web API Performance Testing — Rust Baseline Part #1 was originally published in Level Up Coding on Medium, where people are continuing the conversation by highlighting and responding to this story.


This content originally appeared on Level Up Coding - Medium and was authored by Shanmukh Sista


