
I have an AWS Lambda function written in Node.js that performs a set of recursive Postgres database calls, which fail every time on the 81st call with the following error:

remaining connection slots are reserved for non-replication superuser connections

I'm assuming I am leaking something at the Postgres level, but I believe I am adhering to the recommended calls for performing a single pooled query as described at https://node-postgres.com/features/pooling. I've simplified my code as shown below so that I'm only executing the same query every time, and the result is still the same. The function testHarness is what initiates the logic within my Lambda function. The intent is to execute a query against Postgres and, once it has completed, fire off the query again, repeating 500 times in this example. It always fails when the 81st call occurs. The DB_CONNECT environment variable contains the connection information, including a "max" value of 3.

function testHarness(cb){
    _test(0, cb);
}

function _test(pos, cb){
    console.log(pos);
    _testData(function (err, data){
        if (err) return cb(err);
        if (pos < 500){
            _test(pos + 1, cb);
        }
        else{
            return cb(null, 'done');
        }
    });
}
function _testData(cb){
    const { Pool } = require('pg')
    const pool = new Pool(JSON.parse(process.env.DB_CONNECT));
    const sql = 'SELECT id, url, pub_date, description, title, duration FROM episodes WHERE feed_id = $1 ORDER BY pub_date DESC LIMIT 10';

    pool.query(sql, ['28c65c8d-f96a-4499-a854-187eed7050bd'], (err, result) => {
        if (err) throw err;
        return cb(err, result);
    })
}
You should create the pool instance globally, not on each call to _testData. – Bergi

1 Answer


So the problem is the leaking Pool objects that you create in the _testData function. After using a Pool you have to shut it down; you can find this in the documentation under the "Shutdown" heading, which says to call:

  pool.end()

However, the way you are using the Pool does not make sense. It is better to create it in the testHarness function so the connections can be reused, saving the connection overhead and letting your code run faster:

function testHarness(cb){
    const { Pool } = require('pg')
    const pool = new Pool(JSON.parse(process.env.DB_CONNECT));

    _test(pool, 0, function(err, data){
      pool.end();
      cb(err, data);
    });
}

function _test(pool, pos, cb){
    console.log(pos);
    _testData(pool, function (err, data){
        if (err) return cb(err);
        if (pos < 500){
            _test(pool, pos + 1, cb);
        }
        else{
            return cb(null, 'done');
        }
    });
}
function _testData(pool, cb){
    const sql = 'SELECT id, url, pub_date, description, title, duration FROM episodes WHERE feed_id = $1 ORDER BY pub_date DESC LIMIT 10';

    pool.query(sql, ['28c65c8d-f96a-4499-a854-187eed7050bd'], (err, result) => {
        if (err) return cb(err);
        return cb(null, result);
    })
}

I am not an AWS user, but I assume it behaves like any other Postgres database service; you might need to change the code a bit to fit the AWS environment.

Also, don't you have the ability to use the async/await pattern? It is a lot easier to comprehend.
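For reference, an async/await version of the simplified harness might look roughly like this (an untested sketch; it keeps a single pool for the whole run, and the query and feed id are copied from the question):

```javascript
// Same query as in the question.
const SQL = 'SELECT id, url, pub_date, description, title, duration ' +
            'FROM episodes WHERE feed_id = $1 ORDER BY pub_date DESC LIMIT 10';

// Run the query `times` times sequentially against the given pool.
// Taking the pool as a parameter keeps the function reusable and testable.
async function runQueries(pool, feedId, times) {
  for (let pos = 0; pos < times; pos++) {
    console.log(pos);
    await pool.query(SQL, [feedId]);
  }
  return 'done';
}

async function testHarness() {
  const { Pool } = require('pg'); // required here so the pool is created in one place
  const pool = new Pool(JSON.parse(process.env.DB_CONNECT));
  try {
    return await runQueries(pool, '28c65c8d-f96a-4499-a854-187eed7050bd', 500);
  } finally {
    await pool.end(); // release the pooled connections exactly once
  }
}
```

The try/finally ensures pool.end() runs even if a query rejects, which replaces the manual pool.end() call in the callback version above.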