bcgov/TheOrgBook

Implement/Activate automatic Vacuum Analyse on the TOB DB.

Closed this issue · 3 comments

Parts of TheOrgBook use cached data that can become stale and incorrect over time. Running Vacuum Analyse resolves this issue.

Having the database run Vacuum Analyse automatically helps keep the caches refreshed when there is a lot of activity, such as during credential loading.

Figure out how to add the settings into the regular database migrations.

This was done on the BC Registries Agent's event-db, but that database is built using discrete scripts.

Environments to update:

  • devex-von-tools - build configuration

  • devex-von-bc-tob-dev

  • devex-von-bc-tob-test

  • devex-von-bc-tob-prod

  • devex-bcgov-dac-dev

  • ontvon-von-tools - build configuration

  • ontvon-von-dev

  • ontvon-von-test

  • ontvon-von-prod

The autovacuum_vacuum and autovacuum_analyze settings are adjusted globally for the server in postgresql.conf. In our case that is /var/lib/pgsql/data/userdata/postgresql.conf, which is on persistent storage.

The current settings can be seen by running: select name, setting from pg_settings where name like '%autovacuum%';

The defaults don't run autovacuum_vacuum and autovacuum_analyze often enough for our purposes when loading data.

A more aggressive scale factor would run the processes more often:

  • autovacuum_analyze_scale_factor = 0.002;
  • autovacuum_vacuum_scale_factor = 0.001;
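To see why these scale factors make autovacuum fire more often, recall the trigger formula from the PostgreSQL documentation: a table is analyzed once the number of changed tuples exceeds autovacuum_analyze_threshold + autovacuum_analyze_scale_factor × reltuples (vacuum uses the analogous vacuum settings). A minimal sketch of that arithmetic, using the default threshold of 50 and a hypothetical table of 1,000,000 rows:

```python
# Sketch of PostgreSQL's autovacuum trigger formula:
#   changed_tuples > threshold + scale_factor * reltuples
# The 1,000,000-row table size is a hypothetical example.

def autovacuum_trigger(reltuples, threshold=50, scale_factor=0.2):
    """Rows that must change before autovacuum fires on a table."""
    return threshold + scale_factor * reltuples

table_rows = 1_000_000

# Default analyze settings (scale factor 0.1): fires only after
# ~100,050 changed rows on a million-row table.
default_analyze = autovacuum_trigger(table_rows, 50, 0.1)
print(default_analyze)     # 100050.0

# Proposed aggressive setting (0.002): fires after ~2,050 changed rows.
aggressive_analyze = autovacuum_trigger(table_rows, 50, 0.002)
print(aggressive_analyze)  # 2050.0
```

This lines up with the behaviour observed after the change: an analyze roughly every 2,000 records on a large table, instead of waiting for a tenth of the table to change.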

Simply editing the postgresql.conf file is possible; however, that does not protect the settings if the PVC is dropped and recreated, or if the data directory is deleted entirely.

Best practice for the image is to extend the configuration using the s2i extensions explained here: Extending Image
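As a sketch of that approach, assuming the sclorg PostgreSQL s2i image's extension points (files placed under postgresql-cfg/ in the build context are included into postgresql.conf at container start), a small config fragment could carry the settings so they survive a PVC rebuild. The filename below is hypothetical:

```ini
# postgresql-cfg/autovacuum.conf (hypothetical filename)
# Included into postgresql.conf at startup by the s2i image,
# so the settings are baked into the image rather than the PVC.
autovacuum = on
autovacuum_analyze_scale_factor = 0.002
autovacuum_vacuum_scale_factor = 0.001
```

Because the fragment lives in the image rather than on persistent storage, dropping and recreating the PVC no longer loses the tuning.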

Added support for customizing the Postgres configuration: #565

The new configuration settings trigger a vacuum analyse roughly every 2,000 records, which keeps the summary page more closely in line with the actual record counts while large volumes of credentials are being loaded.