Read the PostgreSQL log file and extract the SQL statements with their execution times, in execution order. With this method, a database administrator can analyze and identify issues, and a back-end developer can reproduce bugs from the production environment. The method reads the log stream, not the WAL, so it has no impact on the database server. In this tutorial, the database system is containerized and AWS CloudWatch collects its logs.
CloudWatch setup
Go to IAM and create an AWS role that has permission to create and upload log streams, as follows:
For security, make sure you add restrictive conditions (source IP, etc.).
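A minimal identity policy granting log stream creation and upload might look like the following sketch. The log group ARN is an assumption (it should match your container's log group), and you should tighten it further with `Condition` blocks such as `aws:SourceIp` as noted above:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:log-group:/docker/postgres*"
    }
  ]
}
```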
Install the AWS credentials for CloudWatch in the Docker environment.
Create the file /etc/systemd/system/docker.service.d/aws-credentials.conf
Add the following content to aws-credentials.conf:
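For the Docker daemon's `awslogs` logging driver, credentials are typically passed to the daemon through a systemd drop-in. A minimal sketch (the key values are placeholders, replace them with your own):

```ini
[Service]
Environment="AWS_ACCESS_KEY_ID=<your-access-key-id>"
Environment="AWS_SECRET_ACCESS_KEY=<your-secret-access-key>"
```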
Then execute these commands to apply the changes:
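The systemd drop-in only takes effect after reloading the daemon configuration and restarting Docker. The `docker run` line below is an assumed example showing how the container would ship its logs via the `awslogs` driver; the region, log group, and image tag are assumptions, not taken from the original:

```shell
# Reload systemd so it picks up the new drop-in, then restart Docker
sudo systemctl daemon-reload
sudo systemctl restart docker

# (Assumed example) run PostgreSQL with the awslogs logging driver
docker run -d --name pg \
  --log-driver=awslogs \
  --log-opt awslogs-region=us-east-1 \
  --log-opt awslogs-group=/docker/postgres \
  postgres:15
```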
Configure the postgresql.conf file:
You can adjust the log_min_duration_statement variable to capture SQL statements whose execution takes longer than a certain threshold, 1000 ms in this case. Copy the remaining settings verbatim to the bottom of the postgresql.conf file.
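The logging section might look like the following sketch. Only `log_min_duration_statement = 1000` comes from the text above; the other values are common choices for container logging, not taken from the original:

```conf
# postgresql.conf -- logging section
log_destination = 'stderr'             # container stderr is picked up by the awslogs driver
logging_collector = off                # keep logs on stderr instead of log files
log_min_duration_statement = 1000     # log statements that run longer than 1000 ms
log_line_prefix = '%m [%p] %u@%d '    # timestamp, PID, user@database
```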
Lambda Log Parser
In this section, I am going to build a Lambda function that pulls and parses the CloudWatch stream every 15 minutes (triggered by a CloudWatch event) and saves the result to AWS S3. Here is a sample code snippet:
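A minimal sketch of such a parser, assuming Python on Lambda with boto3. The log group name (`/docker/postgres`), the bucket name (`my-sql-log-bucket`), and the log line format (produced by `log_min_duration_statement`) are assumptions; the parsing logic is kept separate from the AWS calls so it can be tested locally:

```python
import json
import re
import time

# Matches log lines emitted by log_min_duration_statement, e.g.
# "2024-01-01 10:00:00 UTC [123] app@mydb LOG:  duration: 1523.456 ms  statement: SELECT 1"
DURATION_RE = re.compile(
    r"duration:\s+(?P<ms>\d+(?:\.\d+)?)\s+ms\s+statement:\s+(?P<sql>.*)",
    re.DOTALL,
)

def parse_slow_statements(messages):
    """Extract duration and SQL text from raw CloudWatch log messages."""
    results = []
    for msg in messages:
        m = DURATION_RE.search(msg)
        if m:
            results.append({
                "duration_ms": float(m.group("ms")),
                "statement": m.group("sql").strip(),
            })
    return results

def lambda_handler(event, context):
    # boto3 ships with the Lambda runtime; imported here so the parsing
    # logic above can be tested without AWS credentials.
    import boto3

    logs = boto3.client("logs")
    s3 = boto3.client("s3")

    now_ms = int(time.time() * 1000)
    window_ms = 15 * 60 * 1000  # matches the 15-minute trigger interval

    # Page through the last 15 minutes of the (assumed) log group.
    events, token = [], None
    while True:
        kwargs = {
            "logGroupName": "/docker/postgres",
            "startTime": now_ms - window_ms,
            "endTime": now_ms,
        }
        if token:
            kwargs["nextToken"] = token
        resp = logs.filter_log_events(**kwargs)
        events.extend(resp["events"])
        token = resp.get("nextToken")
        if not token:
            break

    statements = parse_slow_statements(e["message"] for e in events)
    s3.put_object(
        Bucket="my-sql-log-bucket",
        Key=f"slow-queries/{now_ms}.json",
        Body=json.dumps(statements).encode(),
    )
    return {"count": len(statements)}
```

The statements land in S3 as timestamped JSON files, so a developer can replay them in order against a staging database.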
The EventBridge rule triggers the function every 15 minutes, as shown in the figure.
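Equivalently, the schedule can be created from the AWS CLI. The rule name, function name, region, and account ID below are placeholders for illustration:

```shell
# Create a rule that fires every 15 minutes
aws events put-rule \
  --name sql-log-parser-every-15min \
  --schedule-expression "rate(15 minutes)"

# Point the rule at the Lambda function
aws events put-targets \
  --rule sql-log-parser-every-15min \
  --targets "Id"="1","Arn"="arn:aws:lambda:us-east-1:123456789012:function:sql-log-parser"

# Allow EventBridge to invoke the function
aws lambda add-permission \
  --function-name sql-log-parser \
  --statement-id eventbridge-invoke \
  --action lambda:InvokeFunction \
  --principal events.amazonaws.com \
  --source-arn arn:aws:events:us-east-1:123456789012:rule/sql-log-parser-every-15min
```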
And that's it! If you have any questions, you can email me. Thanks for reading!