I am attempting to extract tables from multiple schemas in the same database. The tables are similar enough that they are easy to identify and work with using metadata queries like this one:
```sql
SELECT DISTINCT TABLE_NAME
FROM information_schema.columns
WHERE TABLE_NAME LIKE 'form_entries%'
```
The wildcard stands in for the only part of the table name that varies. Each of these tables has a similar format but a differing number of columns.
I can write a Python script that loops over the schemas, dumps each of the tables I need, and puts them all into a single schema. That is the only approach I have at the moment.
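To make the question concrete, here is a minimal sketch of that loop. It uses in-memory SQLite databases attached as separate "schemas" purely as a runnable stand-in; on the real server you would connect with a MySQL/PostgreSQL driver and query `information_schema.columns` (as above) instead of `sqlite_master`. Schema and table names here are invented for illustration.

```python
import sqlite3

# Stand-in setup: 'main' plays the role of the consolidated target schema,
# and two attached in-memory databases play the role of the source schemas.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH ':memory:' AS schema_a")
conn.execute("ATTACH ':memory:' AS schema_b")

# Two form_entries tables with differing columns, as in the question.
conn.execute("CREATE TABLE schema_a.form_entries_contact (id INTEGER, email TEXT)")
conn.execute("INSERT INTO schema_a.form_entries_contact VALUES (1, 'a@example.com')")
conn.execute("CREATE TABLE schema_b.form_entries_survey (id INTEGER, q1 TEXT, q2 TEXT)")
conn.execute("INSERT INTO schema_b.form_entries_survey VALUES (1, 'yes', 'no')")

consolidated = []
for schema in ("schema_a", "schema_b"):
    # SQLite's per-schema sqlite_master stands in for information_schema here.
    tables = conn.execute(
        f"SELECT name FROM {schema}.sqlite_master "
        "WHERE type = 'table' AND name LIKE 'form_entries%'"
    ).fetchall()
    for (table,) in tables:
        # Copy each matching table into the consolidated schema wholesale,
        # preserving whatever columns it happens to have.
        conn.execute(f"CREATE TABLE main.{table} AS SELECT * FROM {schema}.{table}")
        consolidated.append(table)

print(sorted(consolidated))
```

Because the tables have different column sets, each one is copied as-is rather than unioned into a single wide table; reconciling columns would be a second step.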
I am somewhat familiar with ETL tools like Kettle (Pentaho Data Integration), but I don't have enough experience to know whether there is an ETL workflow, or simply a better way, to accomplish this.
**Purpose**
Each of the tables involved contains form-submission data. Each table corresponds to a different form, so the column names vary. The ultimate goal is analysis of the form data.
There are many steps before we get to the analysis we want, but we are stuck at the first hurdle: how widely spread out the data is.
Asked by David Hamilton
(101 rep)
Jan 11, 2022, 02:32 PM
Last activity: May 22, 2025, 07:08 PM