
Use stream_bucket_make_writeable to read blocks of data in a stream

gitbox 2025-05-29

When processing streams in PHP, we sometimes need to work with stream data at a lower level, for example reading individual data blocks (buckets) to implement custom buffering. PHP's stream filter mechanism provides a powerful interface for this, and stream_bucket_make_writeable is one of its key functions: it lets us take data blocks out of the stream's buffer so we can read and manipulate them.

This article explains the role of stream_bucket_make_writeable, shows how to read data blocks from PHP streams, and demonstrates its use with a practical example.


1. What is stream_bucket_make_writeable?

stream_bucket_make_writeable is part of PHP's stream filter API and operates on the underlying php_stream_bucket structure. It takes a data block (bucket) out of the stream's internal bucket brigade and returns it as a writable bucket object. The bucket contains the data currently available, which we can read or modify directly.

The function is most commonly used when implementing custom stream filters, giving developers fine-grained control over data as it flows through a stream, for example to compress, encrypt, or decode it.


2. Basic process of reading data blocks

Reading a data block using stream_bucket_make_writeable requires roughly the following steps:

  1. Create and register a custom stream filter. The filter class must extend php_user_filter and override its filter method.

  2. Call stream_bucket_make_writeable inside the filter method. This function retrieves a bucket from which the data can be read.

  3. Process the retrieved data block. You can read it, modify it, or pass it on to the next filter.

  4. Return the processing result to complete the filter operation. A minimal skeleton of these steps is sketched below; section 3 then walks through a complete working example.
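To make these steps concrete, here is a bare skeleton that maps them onto code. The class and filter names (SkeletonFilter, "skeletonfilter") are placeholders chosen for this sketch; the complete, working example in the next section fills in the details.

 <?php
// Bare skeleton of the four steps above; class and filter names are placeholders.
class SkeletonFilter extends php_user_filter {   // Step 1: extend php_user_filter
    public function filter($in, $out, &$consumed, $closing) {
        // Step 2: take buckets out of the input brigade one by one
        while ($bucket = stream_bucket_make_writeable($in)) {
            // Step 3: read or modify $bucket->data here
            $consumed += $bucket->datalen;
            stream_bucket_append($out, $bucket);
        }
        // Step 4: return the result of the filtering pass
        return PSFS_PASS_ON;
    }
}

// Step 1: register the filter so it can be attached to streams by name
stream_filter_register("skeletonfilter", "SkeletonFilter");
?>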


3. Practical example: Custom stream filter reads data blocks

The following example shows how to define a simple custom stream filter that reads data blocks with stream_bucket_make_writeable inside its filter method and prints their contents:

 <?php
class MyReadFilter extends php_user_filter {
    public function filter($in, $out, &$consumed, $closing) {
        // Loop over all the data blocks in the input brigade
        while ($bucket = stream_bucket_make_writeable($in)) {
            // Read the data contained in the bucket
            $data = $bucket->data;

            // Here we simply print the data that was read
            echo "Data block contents: " . $data . "\n";

            // Record the number of bytes consumed
            $consumed += $bucket->datalen;

            // Pass the bucket on to the next filter or the output stream
            stream_bucket_append($out, $bucket);
        }
        return PSFS_PASS_ON;
    }
}

// Register a custom filter
stream_filter_register("myreadfilter", "MyReadFilter") or die("Failed to register filter");

// Usage example: read data from a file stream and apply the filter
$fp = fopen("gitbox.net/sample.txt", "r");

// Attach a custom filter to the stream
stream_filter_append($fp, "myreadfilter", STREAM_FILTER_READ);

// Reading the file content triggers the filter's filter method
while (!feof($fp)) {
    fread($fp, 8192);
}

fclose($fp);
?>
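As an alternative to attaching the filter with stream_filter_append, a filter registered with stream_filter_register can also be applied through PHP's php://filter stream wrapper. The sketch below assumes the MyReadFilter class above has already been defined and registered; the path sample.txt is a placeholder for any readable local file.

 <?php
// Assumes MyReadFilter is defined and registered as "myreadfilter" (see above);
// sample.txt is a placeholder path.
$content = file_get_contents("php://filter/read=myreadfilter/resource=sample.txt");
?>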

4. Key Notes

  • stream_bucket_make_writeable($in) : takes a bucket out of the input bucket brigade and returns it. The returned bucket object exposes the properties data (the string data) and datalen (the data length), which can be used when reading or modifying the block (see the sketch after this list).

  • After processing a bucket, you must call stream_bucket_append($out, $bucket) to pass it to the output brigade so that the stream processing chain continues to run normally.

  • $consumed tells the stream layer how many bytes of input the filter has consumed, which affects how the stream's buffers are managed.

  • The filter method should normally return PSFS_PASS_ON to indicate that the data was processed and passed on successfully.
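Because the returned buckets are writable, a filter can also change the data before passing it on, which is the basis for the compression, encryption, and decoding use cases mentioned earlier. Below is a minimal sketch of such a modifying filter; the class name MyUpperFilter and the filter name "myupperfilter" are made up for this illustration.

 <?php
// Hypothetical example: a filter that uppercases the data in each bucket.
class MyUpperFilter extends php_user_filter {
    public function filter($in, $out, &$consumed, $closing) {
        while ($bucket = stream_bucket_make_writeable($in)) {
            // Account for the bytes consumed from the input
            $consumed += $bucket->datalen;

            // Modify the bucket data in place (strtoupper does not change the length)
            $bucket->data = strtoupper($bucket->data);

            // Hand the modified bucket to the output brigade
            stream_bucket_append($out, $bucket);
        }
        return PSFS_PASS_ON;
    }
}

stream_filter_register("myupperfilter", "MyUpperFilter");
?>

Attaching this filter to a stream with stream_filter_append($fp, "myupperfilter", STREAM_FILTER_READ) would then uppercase everything read from $fp.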


5. Summary

stream_bucket_make_writeable is the core interface for reading stream data blocks in the PHP stream filter mechanism, allowing developers to directly manipulate data in the stream buffer. Through custom stream filters, we can implement complex data processing logic, such as real-time compression, encryption, log monitoring, etc.

Mastering the underlying data block operations of streams can help you gain greater flexibility and performance when processing large files and network streaming data.

If you run into performance bottlenecks in stream operations, or need fine-grained data processing within streams, it is worth studying stream_bucket_make_writeable and the related stream filter APIs in depth.