I'm trying to get an embedded tweet from Twitter, so I'm using cURL to fetch the JSON. I wrote a little test, but the request takes around 5 seconds, and it's just as slow when I run it locally.
The final solution that sped it up was this:
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4 );
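Applied in context, the fix looks something like this (a sketch; the oEmbed endpoint and option set are illustrative). The 5-second delay is typically the resolver waiting on an IPv6 (AAAA) lookup that eventually times out, so forcing IPv4 skips it:

```php
<?php
// Force IPv4 resolution so cURL never waits on a failing IPv6 lookup.
function fetchEmbedJson($tweetUrl)
{
    $ch = curl_init('https://publish.twitter.com/oembed?url=' . urlencode($tweetUrl));
    curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4); // IPv4 only
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);         // return the body from curl_exec
    $json = curl_exec($ch);
    curl_close($ch);
    return $json; // JSON string, or false on failure
}
```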
Regards
With respect to environment: I've observed that cURL in PHP typically runs very fast, except on hosts with a weak CPU and slow network. For example, on localhost with my MAMP installation curl is fast, and on a larger Amazon instance curl is fast. But on small, low-quality shared hosting I've seen it be noticeably slower to connect, though I'm not sure exactly why, and it certainly wasn't 5 seconds slower.
To help determine whether it's PHP or your environment, try interacting with curl via the command line. If it still takes 5 seconds there, you can at least rule out your PHP code as the problem.
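You can also get cURL's own per-phase timing breakdown from within PHP, which shows whether the time is going to DNS lookup, the TCP connect, or the transfer itself (a sketch; pass any URL you like):

```php
<?php
// Report how long each phase of a cURL request took, in seconds.
function curlTimings($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    $timings = [
        'namelookup' => curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME), // DNS resolution
        'connect'    => curl_getinfo($ch, CURLINFO_CONNECT_TIME),    // TCP handshake done
        'total'      => curl_getinfo($ch, CURLINFO_TOTAL_TIME),      // whole request
    ];
    curl_close($ch);
    return $timings;
}
```

A large `namelookup` value points at DNS (and the IPv4-forcing fix above); a large `connect` value points at the network path to the host.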
The best speed up I've ever had was reusing the same curl handle.
Replace $ch = curl_init( $json_url ); with curl_setopt($ch, CURLOPT_URL, $url);. Then, outside the functions, create a single handle with $ch = curl_init();. You'll need to declare $ch as global inside the functions to access it.
Reusing the curl handle keeps the connection to the server open. This only works if the server is the same between requests, as yours are.
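The reuse pattern described above might look like this (a sketch following the answer's global-handle approach; the endpoint is illustrative). With keep-alive, the TCP and TLS handshakes happen once instead of on every request:

```php
<?php
// One handle, created once outside the functions.
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

// Each call reuses the open connection; only the URL changes.
function fetchJson($url)
{
    global $ch;                          // access the shared handle
    curl_setopt($ch, CURLOPT_URL, $url); // swap the URL per request
    return curl_exec($ch);
}
```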
Try setting
curl_setopt($ch, CURLOPT_TCP_FASTOPEN, 1);
to activate TCP Fast Open.
Support was added in cURL 7.49.0 and exposed in PHP 7.0.7.
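Because the constant only exists on sufficiently new PHP/libcurl builds, it's worth guarding (a sketch; the URL is illustrative):

```php
<?php
$ch = curl_init('https://api.example.com/endpoint'); // illustrative URL
// CURLOPT_TCP_FASTOPEN requires PHP >= 7.0.7 and libcurl >= 7.49.0,
// so only set it when the constant is available.
if (defined('CURLOPT_TCP_FASTOPEN')) {
    curl_setopt($ch, CURLOPT_TCP_FASTOPEN, 1); // send data in the TCP SYN
}
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
```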
Try setting
curl_setopt($ch, CURLOPT_ENCODING, '');
— it enables gzip compression of the response.
To speed up cURL, I recommend creating a dedicated class for the API (e.g. ApiClient) and using one shared cURL handle, changing only the URL for each request. Also cut out name-resolution requests and use gzipped responses.
I needed to process about 1 million entities every day from an API server that limited us to a single concurrent connection, so I wrote this class. I hope it helps others optimise their curl requests.
class ApiClient
{
    const CURL_TIMEOUT = 3600;
    const CONNECT_TIMEOUT = 30;
    const HOST = 'api.example.com';
    const API_TOKEN = 'token';

    /** @var resource cURL handle. Reused every time for optimization purposes */
    private $ch;

    /** @var string URL for the API. Computed once in the constructor for optimization purposes */
    private $url;

    public function __construct()
    {
        // Micro-optimization: concatenation has a small per-call cost,
        // but over millions of sequential requests it adds up.
        $this->url = 'https://' . self::HOST . '/v1/entity/view?token=' . self::API_TOKEN . '&id=';

        // $host stores name-resolution information (like an /etc/hosts entry):
        // the host, the port (443 for HTTPS, 80 for HTTP), and the IPv4 address
        // the name should resolve to.
        $host = [implode(':', [
            self::HOST,
            443,
            gethostbyname(self::HOST),
        ])];

        $this->ch = curl_init();
        curl_setopt($this->ch, CURLOPT_ENCODING, ''); // Accept the server's gzip compression (may not work on every server)
        curl_setopt($this->ch, CURLOPT_RESOLVE, $host); // Cuts out all DNS lookups
        curl_setopt($this->ch, CURLOPT_TIMEOUT, self::CURL_TIMEOUT); // Don't wait longer than an API call can possibly take
        curl_setopt($this->ch, CURLOPT_CONNECTTIMEOUT, self::CONNECT_TIMEOUT); // Give up if the server doesn't respond within CONNECT_TIMEOUT
        curl_setopt($this->ch, CURLOPT_RETURNTRANSFER, true); // Return output from curl_exec
    }

    /** @throws \Exception */
    public function requestEntity($id)
    {
        curl_setopt($this->ch, CURLOPT_URL, $this->url . $id);
        $data = curl_exec($this->ch);
        if (curl_error($this->ch)) {
            throw new \Exception('cURL error (' . curl_errno($this->ch) . '): ' . curl_error($this->ch));
        }
        return $data;
    }

    public function __destruct()
    {
        curl_close($this->ch);
    }
}
Also, if you aren't limited to a single connection to the server, you can use the curl_multi_* functions.