
Getting the max ID of integer columns in databases in Postgres 9.6/12

0 votes
0 answers
45 views
I am looking to get the max ID of each integer primary key column across our database servers. The function I currently use:
CREATE OR REPLACE FUNCTION intpkmax() RETURNS
   TABLE(schema_name name, table_name name, column_name name, max_value integer)
   LANGUAGE plpgsql STABLE AS
$$BEGIN
   /* loop through tables with a single integer column as primary key */
   FOR schema_name, table_name, column_name IN
      SELECT sch.nspname, tab.relname, col.attname
         FROM pg_class tab
            JOIN pg_constraint con ON con.conrelid = tab.oid
            JOIN pg_attribute col ON col.attrelid = tab.oid
            JOIN pg_namespace sch ON sch.oid = tab.relnamespace
         WHERE con.contype = 'p'
            AND array_length(con.conkey, 1) = 1
            AND col.attnum = con.conkey[1]
            AND col.atttypid = 'integer'::regtype
            AND NOT col.attisdropped
   LOOP
      /* get the maximum value of the primary key column */
      EXECUTE 'SELECT max(' || quote_ident(column_name) ||
              ') FROM ' || quote_ident(schema_name) ||
              '.' || quote_ident(table_name)
         INTO max_value;
      /* return the next result */
      RETURN NEXT;
   END LOOP;
END;$$;
I then insert the results of the function into a table:
SELECT * into public.maxIDValue FROM public.intpkmax();
I need to generate a report once a month, or possibly on a shorter timeframe; I can schedule it via cron, but I need the data first. I am currently running this on a test environment with no traffic, but wanted to ask:
- Would this function cause locking on any tables? The environments it would run on are heavily transactional.
- Is there any way of keeping the function in the public schema of the postgres database but calling it so that it runs against each database on the server? If so, is it possible for the table the data is inserted into to also record the database name (datname)? I tried joining pg_database but had no joy. A rough sketch of what I mean is below.
- Running the function against our largest database (600+ GB) takes some time; is there a quicker alternative?
Any help is much appreciated.
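The sketch below is roughly the result I am after for the per-database run. None of this is in place yet and the setup is assumed: it relies on the dblink extension being installed in the database that holds the report, on intpkmax() having been created in each target database (which is exactly what I would like to avoid), and on public.maxIDValue having an extra datname column:

/* sketch only: loop over every connectable database on the server and
   collect the per-database results into one table, tagged with datname */
DO
$$
DECLARE
   db name;
BEGIN
   FOR db IN
      SELECT datname FROM pg_database
      WHERE NOT datistemplate AND datallowconn
   LOOP
      INSERT INTO public.maxIDValue (datname, schema_name, table_name, column_name, max_value)
      SELECT db, r.schema_name, r.table_name, r.column_name, r.max_value
         FROM dblink('dbname=' || db,
                     'SELECT * FROM public.intpkmax()')
            AS r(schema_name name, table_name name, column_name name, max_value integer);
   END LOOP;
END;
$$;

The idea would then be to schedule that DO block with a cron entry that runs psql against the reporting database.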
Asked by rdbmsNoob (459 rep)
Nov 29, 2021, 05:22 PM