Difference between these 2 scenarios in load testing

六月ゝ 毕业季﹏ submitted on 2019-12-25 01:28:36

Question


I'm creating a load test, and I have done this so far: threads: 100, ramp-up period: 100, loops: 2

My understanding is that 1 thread will be fired per second, and after the first batch of 100 threads is done, another round of 100 threads will be fired the same way, again 1 second apart (same as the first loop). Is this correct? Also, what would happen if I had this:

threads: 200, ramp-up period: 200, loops: 1

Does it mean 1 thread per second and a total of 200 threads? Is this equivalent to the first case? Please help; I'm getting very weird results while testing, hence this question.


Answer 1:


No, the second scenario is not the same as the first.

Remember these points (assuming 'Delay Thread creation until needed' is selected):

Interval between thread start-ups = (Ramp-up Period) / (Number of Threads); equivalently, the thread creation rate is (Number of Threads) / (Ramp-up Period) threads per second.
Each thread executes independently of the others.
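As a quick sanity check on that formula, here is a minimal sketch (plain Python, not JMeter code; `startup_interval` is just an illustrative helper) that computes the start-up interval for both plans from the question:

```python
# Minimal sketch (illustrative helper, not part of JMeter):
# the gap between consecutive thread start-ups is ramp-up / threads.
def startup_interval(num_threads, ramp_up_seconds):
    """Seconds between two consecutive thread start-ups."""
    return ramp_up_seconds / num_threads

print(startup_interval(100, 100))  # scenario 1: 1.0 s between threads
print(startup_interval(200, 200))  # scenario 2: 1.0 s between threads
```

Both plans start one new thread per second; the difference is only in how many threads eventually exist and how many requests each one sends.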

First Scenario:

Thread creation rate = 100 / 100 = 1 thread per second: every second a new thread is created, so after 100 seconds you will have 100 threads/users.

Once the first thread is created, it sends its first request. Once that request completes, it does not wait unless you have explicitly set a timer; since the loop count is 2, it immediately sends another request. So each user sends 2 requests to the server, but the second request is sent only after the first one has completed. It does not matter whether the other threads have already sent their requests or received their responses.
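To make the per-thread behaviour concrete, here is a rough sketch (assuming no timers and a hypothetical fixed response time; `thread_timeline` is an illustrative helper, not a JMeter API) of when a single thread's two iterations run:

```python
# Hypothetical illustration: a thread with loop count 2 runs its
# iterations back to back, each waiting for the previous response.
def thread_timeline(start_second, loops, response_time):
    """Return (request_start, request_end) pairs for a single thread."""
    timeline, t = [], start_second
    for _ in range(loops):
        timeline.append((t, t + response_time))
        t += response_time  # the next loop starts only after this response
    return timeline

# A thread started at t=5 s, two loops, 30 s responses:
print(thread_timeline(5, 2, 30))  # [(5, 35), (35, 65)]
```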

Second Scenario:

Thread creation rate = 200 / 200 = 1 thread per second, so after 200 seconds you will have 200 threads/users. Here each user sends only one request to the server.

What is the difference?

Let's assume the server takes 300 seconds to process a request.

First Scenario:

After 100 seconds, all 100 users have sent their first request to the server. Because each request takes 300 seconds to process, those 100 users are still waiting for a response; a thread does not send its second request until its own first request has completed. So even after 200 seconds, the server has only 100 concurrent users.

Second Scenario:

Here, the server has 200 concurrent users after 200 seconds, so the load on the server is higher than in the first scenario. The server's response time may therefore be higher than in the first scenario because of the greater load.
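A back-of-the-envelope simulation (again plain Python, under the same assumptions: threads start 1 second apart, every request takes exactly 300 seconds, no timers; `active_requests` is an illustrative helper) reproduces the numbers above:

```python
# Illustrative simulation, not JMeter: count in-flight requests at a moment.
def active_requests(num_threads, loops, response_time, at_time):
    active = 0
    for i in range(num_threads):   # thread i starts at second i
        start = i
        for _ in range(loops):     # iterations run back to back per thread
            end = start + response_time
            if start <= at_time < end:
                active += 1
            start = end
    return active

# Scenario 1: 100 threads x 2 loops -> 100 concurrent requests at t=200 s
print(active_requests(100, 2, 300, at_time=200))
# Scenario 2: 200 threads x 1 loop  -> 200 concurrent requests at t=200 s
print(active_requests(200, 1, 300, at_time=200))
```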



Source: https://stackoverflow.com/questions/23741713/difference-between-these-2-scenarios-in-load-testing
