
I am working on tuning the AWS RDS (Postgres) performance.

With the current setting (default),

db.t2.micro

max_connections: {DBInstanceClassMemory/31457280} // => about 34 connections
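A quick sanity check of that default formula, assuming DBInstanceClassMemory equals the nominal 1 GiB of RAM on a db.t2.micro (in practice AWS reports slightly less, since some memory is reserved for the OS):

```python
# Default RDS Postgres formula: DBInstanceClassMemory / 31457280
memory_bytes = 1 * 1024 ** 3           # nominal RAM of a db.t2.micro
max_connections = memory_bytes // 31457280
print(max_connections)                 # -> 34
```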

I ran a performance test using Gatling with 100 concurrent requests (one connection per request). The report showed 78 successful requests and 22 failed requests.

I have a few questions:

1/ How can RDS support up to 78 connections when max_connections is 34?

2/ Is there any dynamic way to adjust max_connections when the number of requests exceeds it?

Note: I know how to change the value of max_connections by hardcoding it.

I don't understand your second question: if every time the connection limit is reached you raise the limit, then why not set the higher limit to begin with? Do you mean you want to be able to raise or lower it dynamically as part of some other automated process? – IMSoP
1. Check the Gatling documentation. 2. Futile move. Since you use t2.micro, once you run out of CPU credits, the CPU will be throttled to 25% of the load. – mootmoot
@IMSoP, yes, I want a dynamic way to raise or lower max_connections. Presetting a limit is not what I am looking for; the number of requests varies from 10 to 1000 or more, so a fixed value doesn't work for my case. – Phong Vu
But what are you varying it based on? The reason I ask is that "dynamic" could mean "whenever this condition occurs, automatically do this"; it could mean "at this time of day, do this"; or it could mean "from my script/application written in language/framework X, do this based on complex business logic". The variant "if the limit is reached, make it higher, and do nothing else" makes no sense to me, because if the upper limit is going to eventually reach 1000 "dynamically", why not set it to 1000 straight away? – IMSoP
@IMSoP, as you said, I could set it to the maximum number I can predict. For example: a school has 1000 students, and they all want to view the calendar, so I could set max_connections to 1000. But isn't that wasteful on a day when only 10 students make requests? In terms of cost, how can I optimise it? – Phong Vu

1 Answer


1) At any point in time, if the number of concurrent connections exceeds max_connections, RDS responds with a "too many connections" error for new requests until it is able to release existing connections; the server rejects any excess connection attempts. So while you run your performance test with Gatling, log into Postgres and check with a query how many connections are actually being established to process the requests.
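One way to do that check is to count rows in pg_stat_activity. A minimal sketch, assuming a DB-API driver such as psycopg2 connected to your RDS endpoint:

```python
def count_connections(conn):
    """Return the number of current backends, counted from pg_stat_activity.

    `conn` is any DB-API connection (e.g. from psycopg2.connect(...)
    pointed at the RDS endpoint).
    """
    with conn.cursor() as cur:
        cur.execute("SELECT count(*) FROM pg_stat_activity;")
        return cur.fetchone()[0]
```

Run this repeatedly during the Gatling test to see how close you actually get to the max_connections limit.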

2) As far as dynamically changing max_connections is concerned: max_connections is a static parameter in Postgres, meaning a DB instance restart is required before a change takes effect. (In MySQL, by contrast, it is a dynamic parameter.) I recommend against dynamically modifying this parameter in production environments. But if you really want to change it dynamically for testing purposes, you can try the following (untested) method:

  1. Create a CloudWatch alarm MaxDbConnection with the condition DatabaseConnections > 25 (taking t2.micro as an example).

  2. Create an SNS topic exclusively for this alarm and add it under the Actions section of the MaxDbConnection alarm.

  3. Create a Lambda function and select the SNS topic created above as its trigger.

  4. In the Lambda function, write logic to modify the max_connections parameter via the ModifyDBParameterGroup API using any SDK of your choice. Also verify the change was applied using the DescribeDBParameters API.
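The steps above could be sketched as a Lambda handler like the following (untested, using boto3). The parameter group name "custom-pg" and the new value "50" are placeholders; note that because max_connections is static, "pending-reboot" is the only valid ApplyMethod, so the new value still only takes effect after a restart:

```python
import json

def lambda_handler(event, context, rds_client=None):
    """Raise max_connections when the MaxDbConnection alarm fires via SNS.

    `rds_client` is injectable for testing; inside AWS Lambda it defaults
    to a real boto3 RDS client.
    """
    if rds_client is None:
        import boto3  # available by default in the Lambda runtime
        rds_client = boto3.client("rds")

    # The SNS trigger delivers the CloudWatch alarm as a JSON string.
    message = json.loads(event["Records"][0]["Sns"]["Message"])
    if message.get("NewStateValue") != "ALARM":
        return {"changed": False}

    # Bump max_connections; pending-reboot is required for static parameters.
    rds_client.modify_db_parameter_group(
        DBParameterGroupName="custom-pg",          # placeholder name
        Parameters=[{
            "ParameterName": "max_connections",
            "ParameterValue": "50",                # placeholder value
            "ApplyMethod": "pending-reboot",
        }],
    )

    # Verify the change was recorded in the parameter group.
    params = rds_client.describe_db_parameters(
        DBParameterGroupName="custom-pg"
    )["Parameters"]
    current = next(p for p in params
                   if p["ParameterName"] == "max_connections")
    return {"changed": True, "max_connections": current["ParameterValue"]}
```

Again, this only changes the stored parameter value; the instance must still restart before the new limit applies, which is a large part of why this approach is a poor fit for production.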