When working with MySQL databases in PHP, mysql_fetch_array is a commonly used function for fetching rows from a query result set. When the result set is extremely large, however, fetching it naively with mysql_fetch_array can cause performance bottlenecks, high memory usage, and slow page responses. This article uses practical code examples to explain how to handle large result sets efficiently and optimize performance.
mysql_fetch_array retrieves data one row at a time from the result set, so your PHP code never has to assemble the full dataset into a single array. Keep in mind, however, that the default buffered query mode still holds the entire result set in client memory, so a query that returns too many rows can consume significant resources no matter how the rows are fetched.
Optimization ideas:
Use pagination to limit the amount of data returned per query
Use cursor-based queries to process data step by step
Pagination is a common method for handling large result sets. By applying LIMIT and OFFSET, you can control the number of rows returned per query, avoiding memory overload.
<?php
// Legacy mysql_* API, to match mysql_fetch_array (see the mysqli version below)
$conn = mysql_connect("gitbox.net", "user", "password");
mysql_select_db("database", $conn);

$limit = 100; // number of rows per page
$page = isset($_GET['page']) ? intval($_GET['page']) : 1;
$offset = ($page - 1) * $limit;

$sql = "SELECT * FROM large_table LIMIT $limit OFFSET $offset";
$result = mysql_query($sql, $conn);

while ($row = mysql_fetch_array($result)) {
    // process each row
    print_r($row);
}
?>
This approach breaks data into smaller chunks, processes them in batches, and reduces server load.
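For a command-line script that needs to walk the entire table rather than serve one page per request, the same LIMIT/OFFSET idea can be wrapped in a loop that advances through the data batch by batch. Below is a minimal sketch using mysqli; the table and column names are placeholders.
<?php
$mysqli = new mysqli("gitbox.net", "user", "password", "database");

$batchSize = 1000;
$offset = 0;

do {
    $result = $mysqli->query("SELECT id, name FROM large_table LIMIT $batchSize OFFSET $offset");
    $rowCount = $result->num_rows;

    while ($row = $result->fetch_assoc()) {
        // process each row
    }

    $result->free();         // release the buffered batch before fetching the next one
    $offset += $batchSize;
} while ($rowCount === $batchSize); // a short batch means we reached the end
?>
Note that MySQL must scan and discard all rows preceding a large OFFSET, so for very deep scans a keyset condition such as WHERE id > $lastId is usually faster than an ever-growing offset.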
The mysql_* functions were deprecated in PHP 5.5 and removed entirely in PHP 7. It is recommended to use mysqli or PDO instead, both of which offer better performance and support prepared statements.
For example, using mysqli_fetch_array:
<?php
$mysqli = new mysqli("gitbox.net", "user", "password", "database");

$sql = "SELECT * FROM large_table";
$result = $mysqli->query($sql);

while ($row = $result->fetch_array(MYSQLI_ASSOC)) {
    // process each row
    print_r($row);
}
?>
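Prepared statements pair naturally with pagination, since the LIMIT and OFFSET values can be bound as integer parameters instead of being interpolated into the SQL string. Below is a minimal sketch with mysqli; note that get_result() requires the mysqlnd driver, which is the default in modern PHP builds.
<?php
$mysqli = new mysqli("gitbox.net", "user", "password", "database");

$limit = 100;
$page = isset($_GET['page']) ? intval($_GET['page']) : 1;
$offset = ($page - 1) * $limit;

$stmt = $mysqli->prepare("SELECT id, name, email FROM large_table LIMIT ? OFFSET ?");
$stmt->bind_param("ii", $limit, $offset); // bind both values as integers
$stmt->execute();
$result = $stmt->get_result();

while ($row = $result->fetch_assoc()) {
    // process each row
    print_r($row);
}

$stmt->close();
?>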
Avoid using SELECT *. Query only the necessary fields to reduce data transfer and memory usage.
$sql = "SELECT id, name, email FROM large_table LIMIT 100";
Ensure proper indexes exist on the fields used in query conditions to avoid full table scans and improve query speed; running EXPLAIN on a query will confirm whether it actually uses the index.
CREATE INDEX idx_name ON large_table(name);
If supported by the database, you can use cursors or stored procedures to process rows on the database side, reducing memory usage in PHP.
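MySQL cursors are only available inside stored procedures, but from PHP you can get a similar streaming effect with an unbuffered query: passing MYSQLI_USE_RESULT to mysqli's query() fetches rows from the server one at a time instead of buffering the whole result set in client memory. Below is a minimal sketch; the caveat is that the result must be fully read or freed before the same connection can run another query.
<?php
$mysqli = new mysqli("gitbox.net", "user", "password", "database");

// MYSQLI_USE_RESULT streams rows from the server instead of buffering them all
$result = $mysqli->query("SELECT id, name FROM large_table", MYSQLI_USE_RESULT);

while ($row = $result->fetch_assoc()) {
    // process each row; memory usage stays flat regardless of table size
}

$result->free(); // required before issuing any further query on this connection
?>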
For scripts that genuinely need to handle large volumes of data, you can temporarily increase memory limits and execution time. However, this should be considered a workaround, not a fundamental solution.
ini_set('memory_limit', '512M');
set_time_limit(300);
Avoid loading all data at once; prioritize pagination or batch processing
Select only required fields, avoid SELECT *
Ensure queries are supported by proper indexes for better efficiency
Use server-side cursors or stored procedures to offload processing from PHP
Adjust PHP configuration as a supplementary optimization measure
By following these practices, you can significantly improve the performance and efficiency of handling large result sets with mysql_fetch_array.