Question
I'm trying to do user acceptance testing on an application which becomes unresponsive when a particular URL parameter is included in the GET request.
Steps
I have curl and run the crafted GET request: I copied the curl-for-Unix syntax from the browser to an Ubuntu server and made some changes:

curl 'https://abc.ai/getMultiDashboard/demouser' -H 'Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _gid=GA1.2.1366208807.1601560229; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0Ellc3NUb2tlbiUyMiUzQSUyMjA2MTk3NjM3NTgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3A8ZGd7Mol31n_Y8OCLq39dHoo3_mIlRhZ.pFQWz5gG9McKsQLzOikcTBmmb2Wcrxo%2B9u9iPpqoyxw; pageUrl=/#/dashboard/18; _gat_gtag_UA_97985973_5=1' "https://abc.ai/getTagTrends/E1_CPU_PERCENTAGE/2020-9-12%2013:4:0/202**'23548'**0-09-15|%2013:04:00"

The ** asterisks are not part of the actual values; I use them to demarcate my injected value.

Using a small bash script I have generated 1000s of (unique) payload combinations for curl:
#!/bin/bash
# Each iteration appends one line of curl arguments to URL.txt,
# with a random value injected into the second URL (>> so the 1000 lines accumulate).
for ((i=0; i<1000; ++i)); do
  echo "'https://abc.ai/getMultiDashboard/demouser' -H 'Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _gid=GA1.2.1366208807.1601560229; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMm4lMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMmZyaWVuZHMlMjIlM0ElMjIlMjIlMkMlMjJhdXRoJTIyJTNBJTIyZWQ0YjVhNDFkMzJlY2U4MzQ3Mzk0ZjlkZTU5YThjMWQlMjIlMkMlMjJyZWZlcmVyJTIyJTNBJTIyaXJpZGl1bS1wcmVwcm9kLmVtcGlyaWMuYWklMjIlMkMlMjJhY2Nlc3NUb2tlbiUyMiUzQSUyMjA2MTk3NjM3NTgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3A8ZGd7Mol31n_Y8OCLq39dHoo3_mIlRhZ.pFQWz5gG9McKsQLzOikcTBmmb2Wcrxo%2B9u9iPpqoyxw; pageUrl=/#/dashboard/18; _gat_gtag_UA_97985973_5=1' \"https://abc.ai/getTagTrends/E1_CPU_PERCENTAGE/2020-9-12%2013:4:0/202'$((1 + RANDOM % 10000000))'0-09-15|%2013:04:00\"" >> URL.txt
done
The final command for testing (a one-liner), which fails:
cat URL.txt | xargs -I{} -- curl -O {}
Output:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:01 --:--:--     0
Expected output (what I get when I run curl manually with a line copied from the URL file):
[{"dashboard_id": 18, "user_id": "demouser", "dashboard_name": "My_dashboard_1", "description": "Test description One", "creation_date": "2020-09-21 10:13:00", "dashboard_config": null, "id": 5}]
<html>
<head><title>504 Gateway Time-out</title></head>
<body>
<center><h1>504 Gateway Time-out</h1></center>
<hr><center>nginx/1.18.0</center>
To troubleshoot, I used set -x on the shell command line, but I can't see how the request is crafted and handled by the curl processes. The curl output (above) shows all-zero values in every field, which would suggest a malformed request; that isn't actually the case, since I have manually run the URL payloads from URL.txt multiple times and they work.
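One way to see exactly what command xargs builds from each line (a sketch, not something from my run above; -t is standard xargs tracing and prints each constructed command to stderr before executing it):

cat URL.txt | xargs -t -I{} -- curl -O {}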
I want to generate as many parallel requests as possible, without waiting for the first one to finish.
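The shape I am aiming for is roughly this (a sketch only; -P N is GNU xargs and keeps up to N curl processes running at once):

cat URL.txt | xargs -P 50 -I{} -- curl -O {}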
Debug
Running it with -v using the one-liner (showing only the important lines):
> GET /getMultiDashboard/demouser -H Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _gid=GA1.2.1722546791.1601890062; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3AKsRWcfNnOkbDHh1e65C3NwiDSZMx4LYg.zxLIymu488Ii5Z2%2Brz0qiwS17BzK2P7A0OoTSCHlMQM; pageUrl=/ HTTP/1.1
> Host: abc.ai
> User-Agent: curl/7.58.0
> Accept: */*
>
{ [5 bytes data]
< HTTP/1.1 400 BAD_REQUEST
< Content-Length: 0
< Connection: Close
When I run it with curl alone, not via xargs, I get the correct output and no 400 Bad Request:
> Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMm4lMjIlM0ElMjJkZW1vdXNJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _gid=GA1.2.1722546791.1601890062; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3AKsRWcfNnOkbDHh1e65C3NwiDSZMx4LYg.zxLIymu488Ii5Z2%2Brz0qiwS17BzK2P7A0OoTSCHlMQM; pageUrl=/#/dashboard; _gat_gtag_UA_97985973_5=1
>
< HTTP/1.1 200 OK
< Content-Type: text/html; charset=utf-8
< Date: Mon, 05 Oct 2020 09:48:51 GMT
< ETag: W/"3b4-gP1vMAXMzUZy+pt7cwyOmQslPT8"
< Server: nginx/1.18.0
< Strict-Transport-Security: max-age=15552000; includeSubDomains
< Vary: Accept-Encoding
< X-Content-Type-Options: nosniff
< X-DNS-Prefetch-Control: off
< X-Download-Options: noopen
< X-Frame-Options: SAMEORIGIN
< X-XSS-Protection: 1; mode=block
< Content-Length: 948
< Connection: keep-alive
<
* Connection #0 to host abc.ai left intact
[{"dashboard_id": 18, "user_id": "demouser", "dashboard_name": "My_dashboard_1", "description": "Test description One", "creation_date": "2020-09-21 10:13:00", "2020-08-12 09:08:00", "dashboard_config": {}, "sort_id": 4, "id": 2}, {"dashboard_id": 5}]* Found bundle for host abc.ai: 0x55836cf75a50 [can pipeline]
* Re-using existing connection! (#0) with host abc.ai
* Connected to abc.ai (52.86.136.249) port 443 (#0)
> GET /getTagTr/E1_CP/2020-9-12%2013:4:0/202'6368'0-09-15|%2013:04:00 HTTP/1.1
> Host: abc.ai
> User-Agent: curl/7.58.0
> Accept: */*
> Cookie: _ga=GA1.2.561275388.1601468723; _hjid=ecd3d778-b7f5-4f7f-b3ef-6f9f12b13d66; 54651cc_an=4; _hjTLDTest=1; 54651cc_data=JTdCJTIyaWQlMjIlM0ElMjJkZW1vdXNlciUyMiUyQyUyMmjM3NTgwOGE2N2RmZjlhMmJlOWJmODE5NDQzJTIyJTdE; 54651cc_loggedin=1; 54651cc_sound=true; 54651cc_read=true; 54651cc_popup=true; 54651cc_disablelastseen=false; 54651cc_usertype=loginuser; _gid=GA1.2.1722546791.1601890062; _hjIncludedInPageviewSample=1; _hjAbsoluteSessionInProgress=0; abc=s%3AKsRWcfNnOkbDHh1e65C3NwiDSZMx4LYg.zxLIymu488Ii5Z2%2Brz0qiwS17BzK2P7A0OoTSCHlMQM; pageUrl=/#/dashboard; _gat_gtag_UA_97985973_5=1
Answer 1:
Having multiple curl arguments and options in the same file adds a complication which probably isn't worth working around. Basically,

echo "http://example.com -H 'X-Hello: Hello'" | xargs curl -O

passes the entire argument to echo as a single string to curl, which interprets it as the URL to fetch.
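To make that visible, here is a tiny sketch mirroring the -I{} form used in the question (printf stands in for curl, and the header name is made up, purely for illustration):

echo "http://example.com -H X-Hello-Header" | xargs -I{} -- printf 'one argument: [%s]\n' {}
# prints: one argument: [http://example.com -H X-Hello-Header]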
My suggestion would be to put the URL and any other arguments on the command line, and only store the -H option's argument in the file.
for ((i=0; i<1000; ++i)); do
curl -O http://example.com -H "$(sed "s/%|/%$((1 + RANDOM))|/" xm.cookiefile)"
done
and run 400 (or whatever) of these jobs in parallel, perhaps just as regular background processes, or maybe with xargs if you think it adds value. (Maybe also look at GNU parallel, which simplifies some aspects of this.)
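A rough sketch of the background-process variant (the concurrency cap uses wait -n, which needs bash 4.3 or newer; the numbers and xm.cookiefile are the same illustrative names as above):

for ((i=0; i<1000; ++i)); do
    curl -s -O http://example.com -H "$(sed "s/%|/%$((1 + RANDOM))|/" xm.cookiefile)" &
    # once roughly 400 requests are in flight, wait for one to finish before starting another
    if (( i >= 400 )); then wait -n; fi
done
wait    # let the last batch finish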
I took out the big modulo because it's not doing anything; $RANDOM produces integers in the range 0-32767, so if you need a much bigger number, maybe paste together multiple $RANDOM numbers, or maybe use a different random source.
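For instance, one way to paste two $RANDOM values together into a larger number (a sketch; the result spans 0 to 2^30-1):

big=$(( RANDOM * 32768 + RANDOM ))   # RANDOM is 0-32767, so this gives 0..1073741823
echo "$big"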
Source: https://stackoverflow.com/questions/64195729/stress-testing-uri-using-xargs-curls-bash-script-failing-with-status-empty