Column ___ of relation ___ does not exist in CSV migration -- is Liquibase ignoring the schema I configured?
0 votes · 0 answers · 149 views
edit: I fought this too long and went with SQL INSERT statements instead; since I was generating the CSV from a JS script, it was easy to translate over. I think there's a bug in this loadData command. I asked the same question in the Liquibase forums and got no responses. I recommend using Flyway instead of Liquibase; it seems to be more active.
I'm trying to insert CSV data. My config looks like this:
{
  "description": "DS-5001: insert boundary data",
  "tagDatabase": {
    "tag": "v2.1.0"
  },
  "loadData": {
    "dbms": "postgresql",
    "columns": [
      { "column": { "name": "id", "type": "STRING" } },
      { "column": { "name": "name", "type": "STRING" } },
      { "column": { "name": "state_id", "type": "STRING" } },
      { "column": { "name": "state_abbr", "type": "STRING" } },
      { "column": { "name": "state_name", "type": "STRING" } },
      { "column": { "name": "country_code", "type": "STRING" } },
      { "column": { "name": "country_name", "type": "STRING" } },
      { "column": { "name": "geom", "type": "STRING" } }
    ],
    "commentLineStartsWith": "//",
    "encoding": "UTF-8",
    "file": "../csv/0000-geom-data.csv",
    "quoteChar": "\"",
    "schemaName": "boundaries_justincollum",
    "relativeToChangelogFile": true,
    "separator": ",",
    "tableName": "boundary_area",
    "usePreparedStatements": true
  }
},
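For reference, a CSV header row consistent with the columns configured above would look like this (reconstructed from the config; I can't paste the real file here):

```csv
id,name,state_id,state_abbr,state_name,country_code,country_name,geom
```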
The table looks like:
CREATE TABLE boundaries_justincollum.boundary_area (
id varchar NOT NULL,
"name" varchar NOT NULL,
state_id varchar NOT NULL,
state_abbr varchar NOT NULL,
state_name varchar NOT NULL,
country_code varchar NOT NULL,
country_name varchar NOT NULL,
geom varchar NOT NULL,
CONSTRAINT pk_county2 PRIMARY KEY (id)
);
But when I run the migration I get an error:
"23-09-21 17:03:15] SEVERE [liquibase.integration] Migration failed for changeset changelog/changesets/000210_add_geoms.json::000210::jcollum:",
"son: liquibase.exception.DatabaseException: org.postgresql.util.PSQLException: ERROR: current transaction is aborted, commands ignored until end of transaction block",
"aused by: ERROR: column \"\"id\"\" of relation \"boundary_area\" does not exist",
"ition: 48",
(The logs are a little messed up because it's running in an AWS Lambda that's running locally.)
In fact it can't find any of the columns in the boundary_area table (many similar entries in the logs: column ___ of relation ___ does not exist). But when I select them it's fine:
SELECT id FROM boundaries_justincollum.boundary_area ma LIMIT 10; -- returns 10 rows!
Command line:
JAVA_OPTS="-Xms2048m -Xmx4096m" liquibase update --url=jdbc:postgresql://[redacted]?currentSchema=boundaries_justincollum,public --default-schema-name=boundaries_justincollum --username=*** --password=*** --changelog-file="changelog/index.json" --log-level=INFO
It's acting like it's looking in the public schema -- the table does not exist in the public schema. I built a very similar migration using SQL files (we have about 20 existing migrations like that) and it worked fine. But as soon as I tried to use loadData I started encountering issues.
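To rule out the connection itself resolving to the wrong schema, here is a quick sanity check I can run with the same credentials (plain PostgreSQL, nothing Liquibase-specific):

```sql
-- Schema that unqualified names resolve to in this session
SELECT current_schema();
-- Full effective search path, including implicit schemas like pg_catalog
SELECT current_schemas(true);
SHOW search_path;
```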
Am I missing something, or is this a bug in Liquibase's loadData? I have another migration that inserts large amounts of data, but it uses INSERT INTO ... with a generated SQL file. That works fine. But I'd rather keep the data in CSV; that's easier to edit.
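For comparison, the INSERT-based approach I fell back to is just schema-qualified statements like this (one row shown; the values here are made-up placeholders):

```sql
-- Schema-qualifying the table sidesteps whatever schema resolution
-- loadData is doing; values below are placeholders, not real data
INSERT INTO boundaries_justincollum.boundary_area
  (id, name, state_id, state_abbr, state_name, country_code, country_name, geom)
VALUES
  ('id-1', 'example-name', 'state-1', 'XX', 'example-state', 'XX', 'example-country', '...');
```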
Asked by jcollum
(229 rep)
Sep 22, 2023, 03:35 PM
Last activity: Sep 26, 2023, 03:21 PM