Multiple reads from a txt file in bash (parallel processing)
Here is a simple bash script that checks HTTP status codes:

    while read -r url
    do
        urlstatus=$(curl -o /dev/null --silent --head --write-out '%{http_code}' "${url}" --max-time 5)
        echo "$url $urlstatus" >> urlstatus.txt
    done < "$1"

I am reading URLs from a text file, but the script processes only one URL at a time, which takes too long. GNU parallel and xargs also processed one line at a time when I tested them.

How can I process several URLs simultaneously to improve the timing? In other words, how do I run the checks over the URL file in parallel rather than over bash commands one by one?
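For reference, here is a minimal sketch of the kind of parallel invocation I am after, assuming GNU xargs with the -P option is available and that the per-URL check is wrapped in an exported function. The function name check_url and the limit of 10 parallel jobs are illustrative choices, not part of the original script:

    #!/usr/bin/env bash
    # Check one URL and print "<url> <status>", using the same curl flags as above.
    check_url() {
        local url=$1
        local status
        status=$(curl -o /dev/null --silent --head --write-out '%{http_code}' --max-time 5 "$url")
        echo "$url $status"
    }
    export -f check_url   # make the function visible to the bash that xargs starts

    # Read URLs (one per line) from the file given as $1 and run up to
    # 10 checks at the same time; results are appended to urlstatus.txt.
    xargs -P 10 -I {} bash -c 'check_url "$1"' _ {} < "$1" >> urlstatus.txt

The -P value controls how many curl processes run at once; raising it increases concurrency at the cost of load on the network and the target hosts. GNU parallel can express the same idea (its -j option sets the job count), though the quoting details differ.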