Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
0
votes
1
answers
1305
views
Connection rejected based on ACL filtering, but the ACL is disabled?
I'm trying to connect to an Oracle Cloud database through DataGrip (following JetBrains's instructions), but I'm getting an error:
```
DBMS: Oracle (no ver.)
Case sensitivity: plain=mixed, delimited=exact
[66000][12506] ORA-12506: TNS:listener rejected connection based on service ACL filtering.
```
However, the ACL is clearly disabled:
What could possibly be causing this error, and how can I fix it?

andyinnie
(1 rep)
Nov 21, 2023, 01:57 AM
• Last activity: Aug 3, 2025, 04:06 PM
2
votes
1
answers
710
views
Datagrip issues for Oracle 19c database
I've used Datagrip for a while with both SQL Server and MySql, but the experience with Oracle has been very different so far. I'm having issues with a lot of basic features I took for granted with the other databases I've worked with. There is little information about these issues in the Datagrip documentation and it's hard to believe that this tool is so lacking for Oracle.
Typical issues:
- Struggling to synchronize with incoming updates to the schema and similar changes (might be performance related)
- Types, columns, tables, etc. aren't being resolved
- There are user schemas with connections (e.g. triggers) to tables they don't own, which I can't see anywhere in Datagrip.
Any input on resources or solutions is much appreciated!
Øystein Seel
(141 rep)
Jun 29, 2021, 07:43 AM
• Last activity: Apr 14, 2025, 01:10 AM
1
votes
2
answers
387
views
DataGrip showing prints during a transaction
I have a long-running update query, so I used the old, well-known trick of dividing the query into smaller chunks of 1000 rows in order to print the status of the query and calm the anxiety.
Unfortunately, DataGrip seems to write all the prints to the output at the end of the query, which makes it useless.
Is it possible to print the updates as they happen, the same way as Visual Studio?
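For context, if the server is SQL Server (the Visual Studio comparison suggests so, though that's an assumption), the buffering is server-side: PRINT output is held until the buffer fills or the batch ends. A common workaround is RAISERROR at severity 0 with WITH NOWAIT, which flushes each message immediately. A minimal sketch, with the chunked update reduced to a placeholder:

```sql
-- Sketch for SQL Server: RAISERROR with severity 0 and WITH NOWAIT flushes
-- each message to the client immediately, unlike PRINT, which is buffered
-- until the batch completes.
DECLARE @done INT = 0, @msg NVARCHAR(200);
WHILE @done < 10000
BEGIN
    -- ... update the next chunk of 1000 rows here ...
    SET @done = @done + 1000;
    SET @msg = CONCAT('Processed ', @done, ' rows');
    RAISERROR (@msg, 0, 1) WITH NOWAIT;  -- severity 0 = informational
END
```

Whether the messages then appear incrementally still depends on the client; SSMS shows them as they arrive.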


s.demuro
(303 rep)
Feb 7, 2023, 07:05 PM
• Last activity: Mar 30, 2023, 12:43 PM
0
votes
0
answers
152
views
Describe table from Athena data source in DataGrip fails
Whenever I attempt to Describe an AWS Athena table in DataGrip it fails. The error states that it is a permissions error, but I am fairly certain it is not. In fact, I can describe tables using SQLWorkbench/J and the Athena console without issue. It is only a problem in DataGrip.
The query I am running is simply DESCRIBE schemaName.TableName;
Is this normal, an issue, or something I am doing incorrectly in DataGrip?

Mattboy91
(1 rep)
Mar 1, 2023, 11:20 PM
• Last activity: Mar 2, 2023, 01:37 PM
0
votes
1
answers
6874
views
Datagrip EXPLAIN ANALYZE Gui?
In Datagrip (using 2018.2.4), you can right-click on a query and choose 'Explain', which gives nicely formatted output. Is there any way to get this nicely formatted output using 'Explain Analyze' instead?
Bob Woodley
(103 rep)
Oct 24, 2018, 02:42 PM
• Last activity: Sep 12, 2022, 05:36 PM
1
votes
1
answers
2893
views
How to view query history in datagrip when the icon is not available?
On the datagrip website:
> On the toolbar of every console there is a button. Press it to see the history of all the queries which were run against this data source. Speed search also works!
>
> Also, don't forget about the Local History of each file.
But I do not see this icon. So how can I make it visible?


Daan
(167 rep)
Apr 1, 2022, 08:03 AM
• Last activity: Jul 31, 2022, 11:24 PM
0
votes
1
answers
683
views
Cannot connect to a pluggable database via IDE
I have an Oracle XE database on computer1.
When I try to connect to it via sqlplus from computer2, it works fine, for both cdb and pdb connections, e.g.:
rlwrap sqlplus timon@//192.168.0.115:9121/xepdb1
SQL*Plus: Release 21.0.0.0.0 - Production on Tue Jun 21 09:40:00 2022
Version 21.6.0.0.0
Copyright (c) 1982, 2022, Oracle. All rights reserved.
Enter password:
Last Successful login time: Tue Jun 21 2022 09:39:11 +03:00
Connected to:
Oracle Database 18c Express Edition Release 18.0.0.0.0 - Production
Version 18.4.0.0.0
SQL>
When I try to connect to it via the DataGrip IDE:
- I can connect to the container database with username "sys as sysdba"
- I cannot connect to the pluggable database xepdb1. The settings are the following:
The error is the following:
How can I fix it?
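For what it's worth, a PDB such as xepdb1 registers with the listener as a service name, not a SID, so a driver dialog set to the SID connection type will typically fail for it. A sketch of a URL-only setting that mirrors the working sqlplus connect string above (assuming DataGrip's bundled thin JDBC driver):

```
jdbc:oracle:thin:@//192.168.0.115:9121/xepdb1
```

The //host:port/service form is EZConnect syntax and resolves xepdb1 as a service name.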


Timofey Pasichnik
(3 rep)
Jun 21, 2022, 07:05 AM
• Last activity: Jun 21, 2022, 08:57 AM
0
votes
1
answers
3200
views
Communications link failure SSH Tunnel on Datagrip
I need to connect to RDS through SSH for now (Local -> EC2 -> RDS).
So I tried to connect to MySQL via the SSH Tunnel option in Datagrip; however, it cannot connect to the server.
This is my workbench screenshot. It works.
This is my Datagrip screenshot.
I wrote the host as the RDS endpoint, but it failed.
[08S01] Communications link failure The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. No appropriate protocol (protocol is disabled or cipher suites are inappropriate).
I tried MySQL Workbench with the Standard TCP/IP over SSH option. It works well.
Is there a difference between TCP/IP over SSH and SSH Tunnel? Or did I do something wrong?
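One way to rule DataGrip's own tunnel implementation in or out is to open the tunnel manually and point DataGrip at the local end. A sketch, where EC2_HOST, the key path, and RDS_ENDPOINT are placeholders, not values from the question:

```shell
# Forward local port 3307 through the EC2 bastion to the RDS endpoint,
# without running a remote command (-N).
ssh -N -L 3307:RDS_ENDPOINT:3306 -i ~/.ssh/key.pem ec2-user@EC2_HOST

# Then connect DataGrip to 127.0.0.1:3307 with its SSH option turned off.
```

If this works, the issue is in DataGrip's tunnel settings; if it still fails with the "No appropriate protocol" message, the TLS negotiation between the JDBC driver and MySQL is the more likely culprit.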




Minkyu Kim
(127 rep)
Mar 4, 2022, 12:31 AM
• Last activity: Mar 10, 2022, 10:52 AM
0
votes
0
answers
202
views
Datagrip connecting to databases on launch
I'm using Jetbrains Datagrip to connect to MySQL servers. My problem is that Datagrip connects to the various databases I created as soon as I launch it. That's not a problem with development databases, but I really don't want that for production databases. Is there a way to tell Datagrip to connect to production databases only when I say so?
Alexis Dufrenoy
(101 rep)
Oct 18, 2021, 08:52 AM
0
votes
2
answers
282
views
MySQL configuration. Query doesnt get returned result (DataGrip, DBeaver, PHP). Something times out
I am running a simple INSERT INTO ... SELECT ... ON DUPLICATE KEY UPDATE ... query using DataGrip, DBeaver, or PHP, and it never registers the result.
The SELECT inside takes about 10 minutes, and the query always completes (the table is updated), but the program / PHP process doesn't get the result and just waits.
Using SHOW FULL PROCESSLIST I can see the query being executed; after 10 minutes it transforms into "command: sleep" and just hangs there.
My MySQL version is 5.7.33-0ubuntu0.16.04.1 hosted on AWS EC2 and some relevant variables from SHOW VARIABLES:
Variable_name Value
connect_timeout 3600
delayed_insert_timeout 300
have_statement_timeout YES
innodb_flush_log_at_timeout 1
innodb_lock_wait_timeout 100
innodb_rollback_on_timeout OFF
interactive_timeout 1800
lock_wait_timeout 31536000
net_read_timeout 3600
net_write_timeout 3600
rpl_stop_slave_timeout 31536000
slave_net_timeout 60
wait_timeout 28800
Any ideas what to change? Any missing configuration to be updated? Thank you.
honzaik
(101 rep)
Sep 2, 2021, 04:43 PM
• Last activity: Sep 8, 2021, 10:11 AM
0
votes
1
answers
660
views
Why DataGrip automatically creates Foreign Key relation when 'table_id' column is added
Not long ago I started using DataGrip from Jetbrains. I am using postgres. When I add a column 'table_id', a foreign key relation is automatically created between tables, though I didn't add any FOREIGN KEY statements. Do you know why the relation is created automatically when I just added tablename_id?


kenzo
(3 rep)
Jul 18, 2021, 06:33 PM
• Last activity: Jul 19, 2021, 07:07 AM
2
votes
1
answers
5152
views
How to connect Datagrip with AWS Athena using an assumed role?
I am able to make Athena queries through the AWS CLI. This requires me to assume a role first (I use awsume). The role requires MFA.
Now I want to configure Datagrip for this as well. However there are two options for authenticating:
* User/pass - I can put in my AWS key ID and secret, but I think it would fail without the MFA token as well (there is no field for it in Datagrip 2021.1).
* AWS profile - I figured out what to put into "Profile" - it's the source_profile in my role's profile in my ~/.aws/config. However, I can't figure out what to put under User.
If I put in:
* The name of the role's profile
* The name of the role
* The ARN of the role
* My AWS username
They all fail with the error:
>The specified database user/password combination is rejected: [HY000] [Simba][AthenaJDBC](100071) An error has been thrown from the AWS Athena client. User: arn:aws:iam::[NUMBER]:user/[MYUSERNAME] is not authorized to perform: athena:ListDataCatalogs [Execution ID not available]
>
>com.simba.athena.amazonaws.services.athena.model.AmazonAthenaException: User: arn:aws:iam::[NUMBER]:user/[MYUSERNAME] is not authorized to perform: athena:ListDataCatalogs (Service: AmazonAthena; Status Code: 400; Error Code: AccessDeniedException; Request ID: [UUID]; Proxy: null
I get this same error, with my username, even if I put in the role as a User in the Datagrip dialog box.
Haterind
(151 rep)
Jun 23, 2021, 09:13 PM
• Last activity: Jun 29, 2021, 09:58 PM
0
votes
1
answers
1343
views
How can I connect to an Oracle Autonomous Data Warehouse with a 3rd party IDE (DataGrip)?
I'm trying to connect to an Oracle Autonomous Data Warehouse database with Jetbrains DataGrip. Oracle provides me with a wallet file (a zip), which contains tnsnames.ora, a keystore, ojdbc.properties, and some other files.
I'm having a lot of trouble using this information to connect to the database using DataGrip. I found a thread on the DataGrip support forums, but I'm not having any luck with that either.
Jetbrains support thread: https://intellij-support.jetbrains.com/hc/en-us/community/posts/360001792539-Connect-with-Oracle-Cloud
Relevant Oracle documentation: https://docs.oracle.com/en/cloud/paas/autonomous-data-warehouse-cloud/adwud/connect-using-client-application.html
**What I did:**
1. Created the 'TNS_ADMIN' environment variable and set it to:
C:\\Users\\xxx\\Documents\\[folder with wallet files]
2. Added the Oracle JDBC driver files (ojdbc8.jar, osdt_cert.jar, oraclepki.jar, osdt_core.jar) to the standard Oracle driver in DataGrip
3. edited the 'sqlnet.ora' file to include the path to the wallet files
4. Added the following to the Data Source VM Options:
-Doracle.net.tns_admin=C:\\Users\\xxx\\Documents\\[folder with wallet files]
-Djavax.net.ssl.trustStore=truststore.jks
-Djavax.net.ssl.trustStorePassword=[password]
-Djavax.net.ssl.keyStore=keystore.jks
-Djavax.net.ssl.keyStorePassword=[password]
-Doracle.net.ssl_server_dn_match=true
-Doracle.net.ssl_version=1.2
5. Set connection type to URL only
6. Tried different connection strings in the URL field:
jdbc:oracle:thin:@//adb.eu-frankfurt-1.oraclecloud.com:1522/xxxxxx_adw1_high.adwc.oraclecloud.com?TNS_ADMIN=C:\\Users\\xxx\\Documents\\[folder with wallet files]
jdbc:oracle:thin:@xxxxxx_adw1_high.adwc.oraclecloud.com?TNS_ADMIN=C:\\Users\\xxx\\Documents\\[folder with wallet files]
jdbc:oracle:thin:@//adb.eu-frankfurt-1.oraclecloud.com:1522/mnr6yzqr22jgywm_adw1_high.adwc.oraclecloud.com
**Result:**
Connection to ADW1 failed.
IO Error: Got minus one from a read call, connect lapse 32 ms.,
Authentication lapse 0 ms.
I have also tried using the 'Service name' and 'TNS' connection types and filled in the info from tnsnames.ora. No dice, same error.
Also tried explicitly setting the 'tcp.validnode_checking' parameter to null.
(The connection works fine with sqldeveloper)
What's the proper way to do this?
honeybees
(103 rep)
Jan 25, 2020, 07:45 PM
• Last activity: Apr 27, 2021, 10:12 AM
0
votes
1
answers
745
views
Incorrect string value: '\\xD0\\xA2\\xD0\\x9E\\xD0\\x9E...' for column Username in MySQL DataGrip specific
I know that such error can be caused by incorrect character set applied to some table, but in my case it's a bit different:
I have some procedure with select, create temporary table and cursor...
The actual problem is that if I DROP/CREATE the procedure from JetBrains's DataGrip and CALL it, the error appears. But if I DROP/CREATE the SAME procedure, without any modification, from other software (for example, I've used DBForge), CALL works just as expected! What can cause such behavior? I've used DataGrip for years now, and this is the first time I've started to worry about recreating procedures on a production database.
Additional information: the error is caused by the FETCH of the Username column in the cursor loop body. If I comment out that part, everything works. So there is a problem FETCHING Username into my local variable DECLARE Username VARCHAR(300);. The sizes of the variables are the same.
0x49D1
(101 rep)
Oct 15, 2020, 11:19 AM
• Last activity: Oct 15, 2020, 12:31 PM
2
votes
2
answers
1892
views
DataGrip Issue Load Data Local Infile
I am having trouble using the LOAD DATA LOCAL INFILE statement in Datagrip. Suppose I have source_data.csv such as:
rating,statement
-2,"I'm hungry!"
5,"I'm satisfied."
1,"Ok, but could be better."
After running the following code block in DataGrip:
# loading_sample_table.sql
USE sample_db;
DROP TABLE IF EXISTS sample_table;
CREATE TABLE sample_table (rating int, statement varchar(50), id serial);
LOAD DATA
LOCAL INFILE 'absolute/path/to/my/source_data.csv'
INTO TABLE sample_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '\''
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(rating, statement);
I receive the following error:
The used command is not allowed with this MySQL version
Running SHOW VARIABLES LIKE "%version%"; in either the MySQL shell or DataGrip produces
+--------------------------+-------------------------------+
| Variable_name | Value |
+--------------------------+-------------------------------+
| immediate_server_version | 999999 |
| innodb_version | 8.0.18 |
| original_server_version | 999999 |
| protocol_version | 10 |
| slave_type_conversions | |
| tls_version | TLSv1,TLSv1.1,TLSv1.2,TLSv1.3 |
| version | 8.0.18 |
| version_comment | MySQL Community Server - GPL |
| version_compile_machine | x86_64 |
| version_compile_os | macos10.14 |
| version_compile_zlib | 1.2.11 |
+--------------------------+-------------------------------+
Then, if I try the same command as above but just drop LOCAL from the code block, I receive the following error:
[HY000] Can't get stat of 'absolute/path/to/my/source_data.csv' (OS errno 13 - Permission denied)
Also, I checked SHOW GLOBAL VARIABLES LIKE 'local_infile'; and can see:
+---------------+-------+
| Variable_name | Value |
+---------------+-------+
| local_infile | ON |
+---------------+-------+
However, if from my terminal I run:
mysql --local-infile=1 -uroot -p < 'absolute/path/to/my/loading_sample_table.sql'
After entering my root password, **the file runs successfully**. Ideally, I would like to do everything in DataGrip. I'm unsure how to debug this issue.
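Since the command line succeeds only with --local-infile=1, the client has to opt in to LOCAL capability as well as the server. In DataGrip the client is the JDBC driver, and MySQL Connector/J exposes this as the allowLoadLocalInfile connection property; whether your bundled driver version honours it is an assumption worth verifying. A sketch of the URL form:

```
jdbc:mysql://localhost:3306/sample_db?allowLoadLocalInfile=true
```

The same property can usually be added as a key/value pair on the data source's Advanced tab instead of in the URL.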
Vivek Jha
(155 rep)
May 9, 2020, 12:14 AM
• Last activity: May 9, 2020, 11:08 PM
0
votes
1
answers
725
views
Login failed for User 18456 State 5. User id works on other machines
I have a really weird login issue I've been pursuing. I experience this only on my Macbook Pro (2018) and only when logging into this one specific SQL Server 2016 database directly through a database client.
Login attempts through any database client to this specific database are met with:
> Login failed for user ''. (Msg 18456, Level 14).
Failure state is 5, "invalid user id".
These credentials work fine when used as credentials for a local copy of the java application that this DB backs.
These credentials work fine when used to login from another machine.
User ID/password does not work **only** when used to log in through a database client, and **only** on my machine. Tested with DataGrip and SQLPro for SQL Server. Both applications can log into other SQL Server databases without issues.
I generated a new user account to no avail; the issue persists. I can connect to other database servers through 1433, and my application can connect to **this** database when running locally with the same credentials.
I am connected through a VPN, but I don't think that has anything to do with it.
I would suspect that it's some kind of encoding issue with the request, but the problem persists across multiple applications.
Anybody have any ideas?
Andrew Ross
(1 rep)
Oct 3, 2019, 06:16 PM
• Last activity: Feb 26, 2020, 12:56 PM
0
votes
2
answers
589
views
PostgreSQL Select statement produces errors in datagrip but not pgAdmin4
I can run the following script without errors in pgAdmin4:
SELECT * from tbl
I have tried the following in datagrip:
SELECT * from tbl
and
SELECT * from public.tbl
Neither of the above scripts worked. The following error was returned:
[2018-05-12 21:16:44] [42P01] ERROR: relation "public.tbl" does not exist
[2018-05-12 21:16:44] Position: 15
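When the identical query works in one client and fails in another, the two sessions are often connected to different databases, or resolve schemas through different search_path values. A quick diagnostic, plain SQL with nothing client-specific, to run in both pgAdmin4 and datagrip:

```sql
-- Compare the output of these in both clients; any mismatch explains the error.
SELECT current_database(), current_schema();
SHOW search_path;

-- List every schema that actually contains a table named 'tbl':
SELECT schemaname, tablename FROM pg_tables WHERE tablename = 'tbl';
```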
Alex
(111 rep)
May 13, 2018, 01:23 AM
• Last activity: Feb 25, 2020, 04:52 PM
0
votes
1
answers
254
views
postgres function definition not saving top whitespace
For reference, I am running Postgres 9.6. I'd like to write functions inside of Postgres in such a way that the top of the function is saved as written.
create or replace function
apqp.apqp_list_gentex_tool_report_getlist(_omni_search text DEFAULT ''::text
, _cust_id integer DEFAULT 0
)
returns TABLE(reason text
, program text
, mold_number text
, rev text
, stone_part_number text
, part_description text
, cust_request_num text
, status text
)
language sql
as $$ -- and so on
When I build this, if I re-look up the function definition in datagrip, the top part loses the whitespace I put in there and it looks like this.
create or replace function apqp.apqp_list_gentex_tool_report_getlist(_omni_search text DEFAULT ''::text, _cust_id integer DEFAULT 0) returns TABLE(reason text, program text, mold_number text, rev text, stone_part_number text, part_description text, cust_request_num text, status text, kickoff date, mold_delivered date, first_shot date, ppap_submission_date date, interim_approval_expiration_date date, tool_shop text, manager text, apqp_status text, apqp_list_id integer, comments text)
language sql
as $$
Is there a setting in postgres, or some way to save to the database when I CREATE a function, so that it will keep my whitespace at the top? I'd like to be able to re-pull the definition while maintaining my whitespace.
Daniel L. VanDenBosch
(408 rep)
Aug 7, 2019, 06:12 PM
• Last activity: Aug 7, 2019, 11:49 PM
2
votes
2
answers
1713
views
How to set "PRAGMA foreign_keys = ON" each time I'm connecting to sqlite db via Datagrip
In my pet project I'm working with a small sqlite db, and I've just come to the realization that, in order for ON DELETE CASCADE to actually do its thing, I need to set PRAGMA foreign_keys = ON each time I connect to the db.
However, I'm using Datagrip, and since it's quite a new product to me I don't know the best way to set this option every time.
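For context on why this recurs: SQLite's foreign_keys pragma is off by default and is per-connection state, so every new session has to enable it before ON DELETE CASCADE will fire. A minimal demonstration with made-up table names:

```sql
PRAGMA foreign_keys = ON;  -- must be re-issued in every new connection

CREATE TABLE parent (id INTEGER PRIMARY KEY);
CREATE TABLE child  (id INTEGER PRIMARY KEY,
                     parent_id INTEGER REFERENCES parent(id) ON DELETE CASCADE);
INSERT INTO parent VALUES (1);
INSERT INTO child  VALUES (10, 1);

DELETE FROM parent WHERE id = 1;  -- with the pragma ON, the child row goes too
SELECT count(*) FROM child;       -- 0 with the pragma ON, 1 without it
```

In DataGrip, a startup/initialization script field in the data source options, if your version offers one, is the natural place for the pragma; the field's exact name varies by version, so that part is an assumption.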
shabunc
(767 rep)
Jun 30, 2019, 06:58 PM
• Last activity: Jul 1, 2019, 09:09 AM
1
votes
1
answers
2282
views
Moving a large table from Localhost to remote server
Postgres 11, Windows 10, pgAdmin 4
I have a large post-gis enabled table (29million rows, 20GB + a 5GB spatial index) stored on my local machine (localhost).
I want to move this table to an AWS hosted server instance (also Postgres). The table has 2 columns: an id column and a geometry column.
**Attempt 1**: Using dblink, the transfer doesn't work:
CREATE TABLE destschema.dest_table AS
SELECT fid, wkb_geometry FROM dblink('host=localhost
user=postgres
password=password
dbname=sourcedb',
'select fid, wkb_geometry
from sourceschema.sourcetable') as linktable(
fid VARCHAR,
wkb_geometry geometry)
The error is: 'FATAL: password authentication failed for user "postgres"', even though the localhost connection credentials are correct. Why does it throw this error? I have previously had success using dblink to transfer tables from one remote cloud server to another remote cloud server. It just seems to not like moving a table from localhost to a remote server.
**Attempt 2**: using pg_dump to create a .backup file of the table (works ok), then upload it to the destination server using psql:
psql -U your_user_name your_db_name < your_dump_file
This method took 12 hours to insert 5 million rows, so I cancelled the process. I don't want to have to wait 72 hours to move this table...
**Attempt 3**: Using PgAdmin's interface to backup the table to a .backup file and then restoring that .backup file to the destination database. However, after waiting 6 hours it was still uploading... I stop the process.
**Attempt 4**: Eventually I got around this issue by purchasing DataGrip (by Jetbrains) and exporting the 29mil rows to 3 CSV files, then importing the data back into the destination db. All the while using the DataGrip GUI. The exporting phase took around 8 mins in total, the uploading took 2.5 hours in total. Then I had to manually add a spatial index to the new table.
What is the quickest, simplest way of moving a large table from localhost to remote? Surely I've missed a trick or two...
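For comparison with the attempts above, a common approach is to stream a single-table dump straight into the remote database. A sketch; REMOTE_HOST, REMOTE_USER, and destdb are placeholders, and the table/column names are taken from Attempt 1:

```shell
# Stream the table (schema + data + its indexes) from localhost into the
# remote database in one pass. pg_dump emits COPY by default, which loads
# far faster than row-by-row INSERTs, and it places CREATE INDEX after the
# data, so the GiST index is built only once the rows are in.
pg_dump -t sourceschema.sourcetable sourcedb \
  | psql -h REMOTE_HOST -U REMOTE_USER -d destdb
```

If a plain psql restore was slow before, a dump taken with --inserts, or an index that already existed on the destination during the load, are the usual causes.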
Theo F
(151 rep)
Jun 5, 2019, 03:26 PM
• Last activity: Jun 6, 2019, 01:51 PM
Showing page 1 of 20 total questions