
PHP curl_multi_close Performance Tuning Tips

gitbox 2025-05-29

In high-concurrency scenarios, PHP developers often use the curl_multi_* family of functions to issue requests concurrently, whether to improve API response times or to speed up batch data crawling. Among these functions, curl_multi_close() is used to close a curl_multi handle; if it is used improperly, it can drag down overall performance. This article explores how to improve the execution efficiency of PHP programs by optimizing the use of curl_multi_close().

Understanding the basic role of curl_multi_close

After creating a multi handle with curl_multi_init() , we typically add several individual cURL handles with curl_multi_add_handle() and then drive the concurrent requests with curl_multi_exec() . Once all requests have completed, curl_multi_close() closes the multi handle and frees its resources.

On the surface, curl_multi_close() is just a cleanup step and seems unrelated to performance. In practice, after a large number of concurrent requests, closing the handle in the wrong way or at the wrong time can cause memory leaks and lingering resources, and can even lengthen total execution time.

Common performance issues and incorrect usage

  1. Closing too early: calling curl_multi_close() before all requests have actually finished causes some of them to be aborted, which forces retries and wastes resources.

  2. Easy handles not cleaned up correctly: if the individual easy handles are not removed with curl_multi_remove_handle() before closing, they are torn down implicitly, which adds destruction overhead (see the sketch after this list).

  3. Batches that are too large: managing thousands of easy handles at once makes even a correctly placed curl_multi_close() call extremely slow.
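
To make these pitfalls concrete, below is a minimal anti-pattern sketch (the endpoint URL is purely illustrative) that combines mistakes 1 and 2: the multi handle is closed before the request has finished, and the easy handle is never removed.

 // Anti-pattern: do NOT do this in real code
$multiHandle = curl_multi_init();

$ch = curl_init("https://gitbox.net/api/endpoint_1");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_multi_add_handle($multiHandle, $ch);

// Only a single pass: $running may still be greater than 0 here
curl_multi_exec($multiHandle, $running);

// Closed too early, and $ch was never removed with curl_multi_remove_handle()
curl_multi_close($multiHandle);
curl_close($ch);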

Optimization strategies

1. Correct closing order

Before calling curl_multi_close() , perform the cleanup in the following order:

 $multiHandle = curl_multi_init();
$chList = [];

for ($i = 0; $i < 100; $i++) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, "https://gitbox.net/api/endpoint_$i");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($multiHandle, $ch);
    $chList[] = $ch;
}

// Perform all requests
do {
    $status = curl_multi_exec($multiHandle, $running);
    if ($running) {
        // Wait for activity on the handles to avoid busy-waiting
        curl_multi_select($multiHandle);
    }
} while ($running && $status === CURLM_OK);

// Remove and close each individual handle
foreach ($chList as $ch) {
    curl_multi_remove_handle($multiHandle, $ch);
    curl_close($ch);
}

// Finally, close the multi handle
curl_multi_close($multiHandle);

Summary of key points:

  • First remove every easy handle from the multi handle with curl_multi_remove_handle() .

  • Then close each easy handle with curl_close() .

  • Only call curl_multi_close() once every easy handle has been removed and closed.

Handled in this order, memory leaks and delayed resource release are largely avoided.
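
If you also need the response bodies, read them before the easy handles are closed. Here is a small sketch, assuming the $multiHandle and $chList variables from the example above and that CURLOPT_RETURNTRANSFER is set on each handle:

 // Collect each response with curl_multi_getcontent() before cleanup
$responses = [];
foreach ($chList as $ch) {
    $responses[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($multiHandle, $ch);
    curl_close($ch);
}

// Finally, close the multi handle
curl_multi_close($multiHandle);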

2. Control the number of concurrent requests

Don't send every request at once. Instead, set a maximum concurrency and process the URLs in batches. For example, handle only 20 requests at a time:

 $multiHandle = curl_multi_init();
$chList = [];
$maxConcurrent = 20;

$urls = [];
for ($i = 0; $i < 1000; $i++) {
    $urls[] = "https://gitbox.net/api/endpoint_$i";
}

$index = 0;
do {
    // Add up to $maxConcurrent requests to the multi handle
    $chList = [];
    for ($i = 0; $i < $maxConcurrent && $index < count($urls); $i++, $index++) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $urls[$index]);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($multiHandle, $ch);
        $chList[] = $ch;
    }

    do {
        $status = curl_multi_exec($multiHandle, $running);
        if ($running) {
            // Wait for activity to avoid busy-waiting
            curl_multi_select($multiHandle);
        }
    } while ($running && $status === CURLM_OK);

    // Remove and close
    foreach ($chList as $ch) {
        curl_multi_remove_handle($multiHandle, $ch);
        curl_close($ch);
    }

} while ($index < count($urls));

curl_multi_close($multiHandle);

Effects:

  • The system load stays under control

  • Memory usage does not spike

  • curl_multi_close() can release its resources quickly

3. Enable HTTP/2 or connection reuse when available

If the server (such as gitbox.net ) supports HTTP/2, multiple requests can be multiplexed over a single connection, which significantly reduces resource overhead. Enable CURLPIPE_MULTIPLEX as follows:

 // Request HTTP/2 on each easy handle
curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_2_0);
// Allow multiplexing on the multi handle (note: curl_multi_setopt, not curl_setopt)
curl_multi_setopt($multiHandle, CURLMOPT_PIPELINING, CURLPIPE_MULTIPLEX);

With fewer underlying connections to tear down, curl_multi_close() also completes faster.
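
To verify that HTTP/2 was actually negotiated, you can inspect a completed easy handle. This is a small sketch assuming a reasonably recent PHP/cURL build that exposes CURLINFO_HTTP_VERSION:

 // Check the protocol version negotiated for a finished handle
$version = curl_getinfo($ch, CURLINFO_HTTP_VERSION);
if ($version === CURL_HTTP_VERSION_2_0) {
    echo "Served over HTTP/2\n";
} else {
    echo "Fell back to an older HTTP version\n";
}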

Summary

Although curl_multi_close() is only the final step of the cURL concurrency workflow, using it correctly directly affects the stability and performance of PHP programs under high-concurrency, high-frequency requests. By closing easy handles in the right order, controlling concurrency, and taking advantage of connection reuse, we can significantly improve the system's overall response speed and resource utilization.

High-performance PHP applications are often built on exactly these small, detailed optimizations.