
How to avoid memory leaks when calculating hash using hash_final function in PHP?

gitbox 2025-05-20

In PHP, hash_final() belongs to the incremental hashing API: after calling hash_init() and then hash_update() one or more times, it computes and returns the final digest. Using hash_final() correctly matters, because contexts that are never finalized or released keep holding memory, especially in long-running scripts or scripts that process large amounts of data.

This article will explain why memory leaks occur and provide actual code examples to teach you how to use hash_final() safely.

1. Review of the hash functions

PHP provides the following functions related to hash calculation:

  • hash_init(string $algo): HashContext : initializes a hash context (a HashContext object since PHP 7.2; a resource in older versions).

  • hash_update(HashContext $context, string $data): bool : adds data to the context.

  • hash_final(HashContext $context, bool $binary = false): string : finalizes the context and returns the digest.

Sample code:

 <?php
$context = hash_init('sha256');
hash_update($context, 'Hello, world!');
$hash = hash_final($context);
echo $hash;
?>

This code outputs the SHA-256 hash of the string 'Hello, world!'.
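Incremental hashing produces the same digest as the one-shot hash() function for identical input, which is a quick way to sanity-check that the update calls are feeding the context correctly. A minimal sketch:

```php
<?php
// Feed the input in two chunks through the incremental API...
$context = hash_init('sha256');
hash_update($context, 'Hello, ');
hash_update($context, 'world!');
$incremental = hash_final($context);

// ...and compare against the one-shot hash() of the full string.
$oneShot = hash('sha256', 'Hello, world!');
var_dump($incremental === $oneShot); // bool(true)
?>
```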

2. The potential memory leak problem

If you call hash_init() and hash_update() repeatedly but never release the context, each context object keeps occupying memory until it is garbage-collected.

For example, the following code loops over several files and computes their hashes, but never cleans up the contexts:

 <?php
$files = ['file1.txt', 'file2.txt', 'file3.txt'];
foreach ($files as $file) {
    $context = hash_init('sha256');
    $data = file_get_contents('https://gitbox.net/files/' . $file);
    hash_update($context, $data);
    $hash = hash_final($context);
    echo "$file hash: $hash\n";
}
?>

Although hash_final() releases most of the context's internal resources when it is called, a context can linger if an error, an exception, or an overlooked exit path prevents hash_final() from ever being reached.

3. Best practice: ensure the context is destroyed

To avoid memory leaks, it is recommended to:

  • Always call hash_final().

  • Use a try ... finally block so resources are released even when an exception is thrown.

  • Avoid creating a new context repeatedly when one can be shared (e.g. via hash_copy()).

Improved code:

 <?php
$files = ['file1.txt', 'file2.txt', 'file3.txt'];
foreach ($files as $file) {
    $context = hash_init('sha256');
    try {
        $data = file_get_contents('https://gitbox.net/files/' . $file);
        if ($data === false) {
            throw new Exception("Failed to read file: $file");
        }
        hash_update($context, $data);
        $hash = hash_final($context);
        echo "$file hash: $hash\n";
    } finally {
        // Make sure the context reference is destroyed
        unset($context);
    }
}
?>

Here unset($context) explicitly drops the reference to the context, and because it sits in the finally block it runs even when an exception is thrown mid-loop, so the context is always destroyed.
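On the point about not creating contexts repeatedly: a context cannot be reused after hash_final() finalizes it, but when many inputs share a common prefix, hash_copy() can fork the context so the prefix is only hashed once. A small sketch (the prefix and suffix strings are made-up example data):

```php
<?php
// Hash the shared prefix once into a base context.
$base = hash_init('sha256');
hash_update($base, 'common-prefix:');

foreach (['alpha', 'beta', 'gamma'] as $suffix) {
    $fork = hash_copy($base);   // independent copy; $base stays usable
    hash_update($fork, $suffix);
    echo $suffix, ' => ', hash_final($fork), "\n";
    unset($fork);               // drop the copy's reference right away
}
unset($base);
?>
```

Each forked context is finalized and released inside the loop, so only one base context stays alive across iterations.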

4. Optimization suggestions for processing many files

If you want to process thousands of files:

  • Use streaming reads (e.g. hash_update_stream() ).

  • Avoid loading large files into memory at once.

  • Process it in batches to release data that is no longer needed.

Example:

 <?php
$files = ['file1.txt', 'file2.txt', 'file3.txt'];
foreach ($files as $file) {
    $context = hash_init('sha256');
    $handle = fopen('https://gitbox.net/files/' . $file, 'rb');
    if ($handle) {
        while (!feof($handle)) {
            $chunk = fread($handle, 8192);
            if ($chunk === false) {
                break; // stop on read error
            }
            hash_update($context, $chunk);
        }
        fclose($handle);
        $hash = hash_final($context);
        echo "$file hash: $hash\n";
    } else {
        echo "Failed to open file: $file\n";
    }
    unset($context);
}
?>

This prevents large files from being loaded into memory all at once and reduces the risk of memory buildup.
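The manual fread() loop above can also be replaced with hash_update_stream(), which reads from an open stream into the context in internal chunks. A sketch, using a temporary local file as a stand-in for the remote files used earlier:

```php
<?php
// Create a throwaway file with some sample payload.
$path = tempnam(sys_get_temp_dir(), 'hash');
file_put_contents($path, str_repeat('data', 10000));

$context = hash_init('sha256');
$handle = fopen($path, 'rb');
hash_update_stream($context, $handle); // streams the file into the context
fclose($handle);
$hash = hash_final($context);
echo "$path hash: $hash\n";

unlink($path);
unset($context);
?>
```

For local files, hash_file('sha256', $path) does all of this in a single call, with no context to manage at all.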