
PostgreSQL unbuffered queries and PHP (cursors)

From using MySQL, I’ve used the ‘unbuffered queries’ feature a number of times. It’s where you don’t fetch the entire resultset into memory at once – which is necessary if you’re retrieving more data than you have memory available. It also often gets the first results/data back to you sooner.

Pseudo PHP code could be a bit like :

$pdo = new PDO("mysql:host=whatever;dbname=whatever", $user, $pass);
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
$stmt = $pdo->prepare("SELECT * FROM big_table");
$stmt->execute();
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    echo $row['field'];
}

Today, I found myself needing to do the same but with PostgreSQL, and it’s been some time since I used it in anger… so had to do some research.

Unfortunately, PostgreSQL doesn’t have the exact same behaviour available (so I can’t just change the PDO DSN/connection info), and I ended up discovering CURSOR queries – example PHP code below.

$curSql = "DECLARE cursor1 CURSOR FOR SELECT * FROM big_table";
$con = new PDO("pgsql:host=whatever dbname=whatever", "user", "pass");
$con->beginTransaction(); // cursors require a transaction.
$stmt = $con->prepare($curSql);
$stmt->execute(); // actually declare the cursor.

$innerStatement = $con->prepare("FETCH 1 FROM cursor1");

while ($innerStatement->execute() && $row = $innerStatement->fetch(PDO::FETCH_ASSOC)) {
    echo $row['field'];
}

$con->commit(); // ending the transaction also closes the cursor.
(The $innerStatement->execute() needs calling on each loop iteration to advance the cursor, and we then fetch the row it returned.)
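One round trip per row can be slow over a network. PostgreSQL’s FETCH also accepts a row count, so a batched variant is possible – a sketch below, reusing the same placeholder connection details and big_table from above (the batch size of 100 is just an illustrative choice):

```php
<?php
// Sketch: pull rows from the cursor 100 at a time instead of one per FETCH.
$con = new PDO("pgsql:host=whatever dbname=whatever", "user", "pass");
$con->beginTransaction(); // cursors require a transaction.
$con->exec("DECLARE cursor1 CURSOR FOR SELECT * FROM big_table");

$batch = $con->prepare("FETCH 100 FROM cursor1");
do {
    $batch->execute();
    $rows = $batch->fetchAll(PDO::FETCH_ASSOC); // at most 100 rows in memory at once
    foreach ($rows as $row) {
        echo $row['field'];
    }
} while (count($rows) === 100); // a short batch means the cursor is exhausted

$con->exec("CLOSE cursor1");
$con->commit();
```

Memory use is bounded by the batch size rather than the full resultset, while cutting the number of FETCH round trips by the same factor.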

3 replies on “PostgreSQL unbuffered queries and PHP (cursors)”

Here is a bit of code you might find interesting if you are using PHP 5.5 or higher:

$pdo = new PDO('pgsql:'); // Just for reference, I usually use a class to wrapper all this

function fetchCursor($sql, $idCol = false) {
  global $pdo;
  // nextCursorId() is an undefined function, but
  // the objective of it is to create a unique Id for each cursor.
  $cursorID = nextCursorId();
  try {
    $pdo->exec("DECLARE {$cursorID} NO SCROLL CURSOR FOR ({$sql})");
    $stm = $pdo->prepare("FETCH NEXT FROM {$cursorID}");

    if ($stm) {
      while ($stm->execute() && $row = $stm->fetch(PDO::FETCH_ASSOC)) {
        if (is_string($idCol) && array_key_exists($idCol, $row)) {
          yield $row[$idCol] => $row;
        } else {
          yield $row;
        }
      }
    }
  } catch (Exception $ex) {
    // Anything you want
  } finally {
    // Do some clean up after the loop is done.
    // This is in a "finally" block because if you break the parent loop, it still gets called.
    $pdo->exec("CLOSE {$cursorID}");
  }
}

And what you can do next with that is just use the function in a loop – a while loop or a foreach loop – and with the foreach loop you can, if you want, specify the column you want as the key variable of the loop, as in foreach (fetchCursor($sql, 'id') as $id => $row).
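A minimal usage sketch of the fetchCursor() generator described in this comment – the "users" table and its "id"/"name" columns are hypothetical, and the surrounding transaction is still needed because the cursor lives inside it:

```php
<?php
// Iterate the generator; only one row is held in PHP memory at a time.
$pdo->beginTransaction();

foreach (fetchCursor("SELECT * FROM users", "id") as $id => $row) {
    echo "{$id}: {$row['name']}\n";
    if ($id > 1000) {
        break; // safe: the generator's "finally" block still CLOSEs the cursor
    }
}

$pdo->commit();
```

Breaking out of the foreach early is the case the "finally" block is there for: PHP runs it when the generator is destroyed, so the cursor is closed either way.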

Thanks. This was extremely useful!
I hardly found any information online regarding this topic…

[…] I am producing large query results that can exceed a sizable PHP memory limit (128 Mb). I’ve been trying to address this and came across unbuffered queries. Unfortunately, these only seem to be supported by MYSQL while I’m using PostgreSQL. There does seem to be a similar process for PostgreSQL involving cursors, but I’m unsure how to implement them within the Drupal framework using db_query(). Demo implementation with plain PHP can be found here. […]
