- PHP >= 5.3 (compatible up to PHP 7.0 and HHVM)
- Stable: used in many projects
- Fast requests with minimal overhead
- Run a query in a single line
- Parallel requests (multi-request), enabled by default
- Asynchronous requests
- Request balancing (rate limiting)
- No callbacks required
The recommended way to install php-mcurl-client is through Composer.
$ composer require khr/php-mcurl-client:3.*
{
    "require": {
        "khr/php-mcurl-client": "~3.0"
    }
}
use MCurl\Client;
$client = new Client();
echo $client->get('http://example.com');
$result = $client->get('http://example.com');
echo (!$result->hasError()
    ? 'Ok: ' . $result
    : 'Error: ' . $result->error . ' (' . $result->errorCode . ')')
    , PHP_EOL;
echo $client->get('http://example.com', [CURLOPT_REFERER => 'http://example.net/']);
echo $client->post('http://example.com', ['post-key' => 'post-value'], [CURLOPT_REFERER => 'http://example.net/']);
/** @var Result[] $results */
$results = $client->get(['http://example.com', 'http://example.net']);
foreach($results as $result) {
echo $result;
}
$urls = ['http://example.com', 'http://example.net', 'http://example.org'];
foreach($urls as $url) {
$client->add([CURLOPT_URL => $url]);
}
// wait for all requests to complete
/** @var Result[] $results */
$results = $client->all();
$urls = ['http://example.com', 'http://example.net', 'http://example.org'];
foreach($urls as $url) {
$client->add([CURLOPT_URL => $url]);
}
while($result = $client->next()) {
echo $result;
}
while($result = $client->next()) {
$urls = fun_get_urls_for_parse_result($result);
foreach($urls as $url) {
$client->add([CURLOPT_URL => $url]);
}
echo $result;
}
while($client->run() || $client->has()) {
while($client->has()) {
// non-blocking: a result is already available
$result = $client->next();
echo $result;
}
// ... more asynchronous work can go here ...
}
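A filled-in sketch of the event-loop pattern above, using only the methods shown in these examples (`add`, `run`, `has`, `next`); the Composer autoloader and example URLs are assumptions:

```php
<?php
require 'vendor/autoload.php';

use MCurl\Client;

$client = new Client();
foreach (['http://example.com', 'http://example.net'] as $url) {
    $client->add([CURLOPT_URL => $url]);
}

// Interleave your own work with processing results as they arrive.
while ($client->run() || $client->has()) {
    while ($client->has()) {
        $result = $client->next(); // non-blocking: a result is ready
        echo $result->httpCode, ' ', $result->info['url'], PHP_EOL;
    }
    // ... other asynchronous work goes here ...
}
```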
$result = $client->add([CURLOPT_URL => $url], ['id' => 7])->next();
echo $result->params['id']; // prints 7
/** @var Result $result */
$result->body; // string: response body
$result->json; // object; @see json_decode()
$result->getJson(true); // array; @see json_decode()
$result->headers['content-type']; // requires $client->enableHeaders();
$result->info; // @see curl_getinfo()
$result->info['total_time']; // e.g. 0.001
$result->hasError(); // true if curl_error is non-empty or HTTP code >= 400
$result->hasError('network'); // only if curl_error is non-empty
$result->hasError('http'); // only if HTTP code >= 400
$result->getError(); // returns the error message when hasError() is true
$result->httpCode; // e.g. 200
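Putting the accessors above together, a typical response check might look like the following sketch (only members listed above are used; the example URL is a placeholder):

```php
<?php
require 'vendor/autoload.php';

use MCurl\Client;

$client = new Client();
$client->enableHeaders(); // needed for $result->headers below

$result = $client->get('http://example.com');

if ($result->hasError('network')) {
    echo 'Network error: ', $result->getError(), PHP_EOL;
} elseif ($result->hasError('http')) {
    echo 'HTTP error: ', $result->httpCode, PHP_EOL;
} else {
    echo 'Fetched in ', $result->info['total_time'], 's, type ',
        $result->headers['content-type'], PHP_EOL;
}
```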
These cURL options are applied to every request:
$client->setOptions([CURLOPT_REFERER => 'http://example.net/']);
Include response headers in each result:
$client->enableHeaders();
Set the maximum number of requests executed in parallel:
$client->setMaxRequest(20); // allow up to 20 parallel requests
To spread requests over a time interval, use $client->setSleep(). This helps you avoid overloading the target server when fetching dynamic content by capping the request rate per interval. Example:
$client->setSleep(20, 1);
No more than 20 requests will be started per second.
For static content, it is recommended to limit the download speed instead, so the transfers do not saturate the channel. Note that PHP's ^ operator is bitwise XOR, not exponentiation, so the per-request limit must be computed with * and /. Example:
// channel: 10 Mbit/s, i.e. 10 * 1024 * 1024 / 8 bytes per second
$client->setMaxRequest(123);
$client->setOptions([CURLOPT_MAX_RECV_SPEED_LARGE => (int) ((10 * 1024 * 1024 / 8) / 123)]);
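The per-request limit is just the channel's bytes-per-second split across the parallel slots. In plain PHP arithmetic (the 10 Mbit/s channel and 123 slots are the assumptions from the example above):

```php
<?php
// Assumed channel capacity of 10 Mbit/s, expressed in bytes per second.
$channelBytesPerSec = (int) (10 * 1024 * 1024 / 8); // 1310720

// Assumed number of parallel request slots.
$maxRequests = 123;

// Per-request cap so all parallel transfers together fit the channel.
$perRequestLimit = intdiv($channelBytesPerSec, $maxRequests);
echo $perRequestLimit, PHP_EOL; // 10656
```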
$client->get('http://example.com/image.jpg', [CURLOPT_FILE => fopen('/tmp/image.jpg', 'w')]);
To reduce memory usage, you can write the query result in a temporary file.
$client->setStreamResult(Client::STREAM_FILE); // all results are written to temporary files
For more examples, see tests/ and the source code.