Most efficient method to import bulk JSON data from different sources into PostgreSQL?
0 votes | 1 answer | 3332 views
I need to import data from thousands of URLs; here is an example of the data:
>[{"date":"20201006T120000Z","uri":"secret","val":"1765.756"},{"date":"20201006T120500Z","uri":"secret","val":"2015.09258"},{"date":"20201006T121000Z","uri":"secret","val":"2283.0885"}]
Since COPY doesn't support the JSON format, I've been using this to import the data from some of the URLs:
CREATE TEMP TABLE stage(x jsonb);
-- Fetch one URL and load its body (a single-line JSON array) as one staged row
COPY stage FROM PROGRAM 'curl https:// .....';
-- Expand the JSON array into rows matching test_table's columns
INSERT INTO test_table
SELECT f.*
FROM stage,
     jsonb_populate_recordset(NULL::test_table, x) AS f;
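For reference, jsonb_populate_recordset maps JSON keys to same-named columns of the target table, so test_table has columns named date, uri and val; its exact definition isn't shown here, but a hypothetical version (column types guessed from the sample data) would look something like:

-- Hypothetical shape of test_table, inferred from the sample JSON;
-- the column types are only guesses based on the sample values.
CREATE TABLE test_table (
    date timestamptz,
    uri  text,
    val  numeric
);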
But this is inefficient, since it creates a staging table for every import and fetches a single URL at a time.
I would like to know if it is possible (through a tool, script or command) to read a file with all the URLs and copy their data into the database.
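Something along these lines is roughly what I have in mind, as an untested sketch only: it assumes GNU xargs and curl are available to the server process, that a hypothetical file /path/to/urls.txt holds one URL per line, and that each response is a single-line JSON array (the trailing echo keeps each response on its own line so COPY sees one row per URL):

CREATE TEMP TABLE stage(x jsonb);

-- Run one curl call per URL listed in the (hypothetical) file;
-- the echo after each call puts each JSON array on its own line for COPY
COPY stage FROM PROGRAM $$xargs -a /path/to/urls.txt -I{} sh -c 'curl -s "{}"; echo'$$;

INSERT INTO test_table
SELECT f.*
FROM stage,
     jsonb_populate_recordset(NULL::test_table, x) AS f;

Whether something like this (or a dedicated tool or script) is the right way to do it at scale is exactly what I'm asking.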
Asked by Lautaro Aguilera
(1 rep)
Apr 6, 2021, 03:56 AM
Last activity: Jun 19, 2022, 04:10 AM