
How to Efficiently Handle and Optimize Performance When mysql_fetch_array Returns Large Result Sets?

gitbox 2025-08-28

When working with MySQL databases in PHP, mysql_fetch_array is a commonly used function for fetching rows from a query result set. However, when the result set is extremely large, using mysql_fetch_array naively can cause performance bottlenecks, high memory usage, and slow page responses. This article uses practical code examples to explain how to handle large result sets efficiently and optimize performance.


1. Avoid loading all data at once

mysql_fetch_array returns one row per call, so your loop only touches a single row at a time. However, with the default buffered queries, the client library still holds the entire result set in memory, so a query that returns too many rows can consume significant resources regardless of how you fetch.

Optimization ideas:

  • Use pagination to limit the amount of data returned per query

  • Use cursor-based queries to process data step by step


2. Use pagination to process in batches

Pagination is a common method for handling large result sets. By applying LIMIT and OFFSET, you can control the number of rows returned per query, avoiding memory overload.

<?php
$mysqli = new mysqli("gitbox.net", "user", "password", "database");

$limit  = 100; // number of rows per page
$page   = isset($_GET['page']) ? max(1, intval($_GET['page'])) : 1; // guard against page=0 or negatives
$offset = ($page - 1) * $limit;

$sql = "SELECT * FROM large_table LIMIT $limit OFFSET $offset";
$result = $mysqli->query($sql);

while ($row = $result->fetch_array(MYSQLI_ASSOC)) {
    // process each row
    print_r($row);
}
?>

This approach breaks data into smaller chunks, processes them in batches, and reduces server load.
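When the whole table has to be processed, for example in a CLI maintenance job, the same batching idea can drive a loop until the data is exhausted. One caveat: large OFFSET values get slower, because MySQL still reads and discards all the skipped rows. Keyset pagination, which filters on the last id seen, avoids that. Below is a minimal sketch, assuming large_table has an auto-increment primary key id and that the mysqlnd driver is available (it provides $stmt->get_result()):

<?php
$mysqli = new mysqli("gitbox.net", "user", "password", "database");

// Keyset pagination: remember the last id seen instead of using OFFSET,
// so each batch is an index range scan rather than a skip-and-discard.
$lastId    = 0;
$batchSize = 100;

do {
    $stmt = $mysqli->prepare(
        "SELECT id, name, email FROM large_table WHERE id > ? ORDER BY id LIMIT ?"
    );
    $stmt->bind_param("ii", $lastId, $batchSize);
    $stmt->execute();
    $result = $stmt->get_result();

    $rows = 0;
    while ($row = $result->fetch_assoc()) {
        // process each row
        $lastId = $row['id'];
        $rows++;
    }
    $stmt->close();
} while ($rows === $batchSize); // stop when a batch comes back short
?>

Because each batch starts from an indexed id value, page 10,000 costs the same as page 1, which LIMIT/OFFSET cannot guarantee.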


3. Use more efficient database interfaces — mysqli or PDO

The mysql_* functions were deprecated in PHP 5.5 and removed entirely in PHP 7.0. It is recommended to use mysqli or PDO, which provide better performance and support prepared statements.

For example, using mysqli_fetch_array:

<?php
$mysqli = new mysqli("gitbox.net", "user", "password", "database");

$sql = "SELECT * FROM large_table";
$result = $mysqli->query($sql);

while ($row = $result->fetch_array(MYSQLI_ASSOC)) {
    // process each row
    print_r($row);
}
?>
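For comparison, here is the same paginated fetch with PDO and a prepared statement — a minimal sketch, assuming the same hypothetical host and credentials as above. Note that LIMIT and OFFSET placeholders must be bound as integers with PDO::PARAM_INT:

<?php
$pdo = new PDO(
    "mysql:host=gitbox.net;dbname=database;charset=utf8mb4",
    "user",
    "password",
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// Prepared statement: the driver handles parameter binding safely.
$stmt = $pdo->prepare("SELECT id, name, email FROM large_table LIMIT :limit OFFSET :offset");
$stmt->bindValue(':limit', 100, PDO::PARAM_INT);
$stmt->bindValue(':offset', 0, PDO::PARAM_INT);
$stmt->execute();

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // process each row
    print_r($row);
}
?>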


4. Reduce query size by selecting only required fields

Avoid using SELECT *. Query only the necessary fields to reduce data transfer and memory usage.

$sql = "SELECT id, name, email FROM large_table LIMIT 100";

5. Optimize queries with indexes

Ensure indexes exist on the columns used in WHERE, JOIN, and ORDER BY clauses; without them, MySQL falls back to a full table scan, which is especially costly on large tables.

CREATE INDEX idx_name ON large_table(name);
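To confirm that a query actually uses the index, MySQL's EXPLAIN shows the chosen access path; a type column of ALL indicates a full table scan (the name value below is just a placeholder):

EXPLAIN SELECT id, name, email FROM large_table WHERE name = 'alice';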

6. Use server-side cursors (stored procedures or deferred queries)

If supported by the database, you can use cursors or stored procedures to process rows on the database side, reducing memory usage in PHP.
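On the PHP side, the closest built-in equivalent is an unbuffered query: the result set stays on the server and rows are streamed to PHP one at a time. Here is a minimal sketch using mysqli's MYSQLI_USE_RESULT mode with the same hypothetical connection as before; the trade-off is that the connection cannot run another query until the result has been fully read or freed:

<?php
$mysqli = new mysqli("gitbox.net", "user", "password", "database");

// MYSQLI_USE_RESULT streams rows from the server instead of buffering
// the whole result set in PHP memory.
$result = $mysqli->query("SELECT id, name, email FROM large_table", MYSQLI_USE_RESULT);

while ($row = $result->fetch_assoc()) {
    // process each row; memory stays flat regardless of result size
    print_r($row);
}

// Free the streamed result before issuing another query on this connection.
$result->free();
?>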


7. Adjust PHP memory and execution time appropriately

For scripts that genuinely need to handle large volumes of data, you can temporarily increase memory limits and execution time. However, this should be considered a workaround, not a fundamental solution.

ini_set('memory_limit', '512M');
set_time_limit(300);

Conclusion

  • Avoid loading all data at once; prioritize pagination or batch processing

  • Use mysqli or PDO instead of deprecated mysql_* functions

  • Select only required fields, avoid SELECT *

  • Ensure queries are supported by proper indexes for better efficiency

  • Use server-side cursors or stored procedures to offload processing from PHP

  • Adjust PHP configuration as a supplementary optimization measure

By following these practices, you can significantly improve the performance and efficiency of handling large result sets with mysql_fetch_array.