
Database Administrators

Q&A for database professionals who wish to improve their database skills

Latest Questions

1 vote
1 answer
756 views
MongoDB Compass exports string and number "columns" to CSV as booleans: 1 becomes "true", 2 or null becomes "false"
When I export my MongoDB collection to a CSV, string and number "column" values are converted to "true" for 1 and "false" for 2 and null. How do I retain the values as numbers, strings, and integers when exporting to CSV from Compass?
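For reference, one workaround (a sketch; the database, collection, and field names are placeholders, and it assumes the MongoDB database tools are installed) is to export with mongoexport instead of Compass, since it writes the values as stored:

```shell
# export selected fields to CSV without type coercion
mongoexport --uri="mongodb://localhost:27017/mydb" \
  --collection=mycoll \
  --type=csv \
  --fields=field1,field2,field3 \
  --out=export.csv
```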
CoolElectricity (21 rep)
Dec 9, 2020, 05:19 PM • Last activity: Jul 26, 2025, 03:04 AM
0 votes
1 answer
208 views
SQL Server 15 GB backup file shows "No backupset selected to be restored" when trying to restore a single table from a weekly backup
I have a 15 GB SQL Server backup file, but when I click through to export a single table from it, the restore dialog says "No backupset selected to be restored" and the backup set list is empty. See attached. Why is that? Is it permissions? [seeTheBackupScreenShot][1] [1]: https://i.sstatic.net/3OxEG.png
Padraig O'Hara
Sep 14, 2021, 04:11 AM • Last activity: Jun 24, 2025, 10:03 AM
2 votes
1 answer
232 views
Export data from PostgreSQL
I have some text, like XML code, in a field of a table. I want to export the data from that field into a file without any changes. I've tried a lot of variants, but all of them change the text. The most successful variants were: COPY (SELECT alll FROM super.a_a) TO 'D:\alll.txt' CSV; But extra quotes "" appeared. COPY (SELECT alll FROM super.a_a) TO 'D:\alll.txt' CSV QUOTE ' '; But extra spaces appeared. In my field I have data like name="firstname", and when I use any of these variants I get something like name=""firstname"" or name=\n\r"firstname\n\r""
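For what it's worth, a sketch of one way around COPY's CSV quoting (assuming you can run psql against the server; the database name is a placeholder): psql's unaligned, tuples-only mode writes each value verbatim, with no quoting added:

```shell
# -A: unaligned output, -t: tuples only (no header/footer), -o: output file
psql -d mydb -A -t -c "SELECT alll FROM super.a_a" -o alll.txt
```

Note that any newlines inside the values are written as-is, which may or may not be what you want.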
bable (21 rep)
Jun 29, 2016, 12:30 PM • Last activity: May 31, 2025, 01:04 AM
0 votes
1 answer
259 views
Export Wizard never starts Pre-executing
I'm using the Import/Export Wizard in SSMS 2012 to store some data in a flat file, as I have to move the data from one SQL Server to another. I've tried connecting directly to the destination server through the Export Wizard, but it fails some time after the export begins due to a connection timeout, so I stick to the flat-file method, even though the direct-connection method is really nice. I've used the flat-file method with a row count of ~92+ million rows, but now I'm reaching 1.1 billion (calculated) rows and the Executing and Pre-execute phases in the Export Wizard don't seem to cooperate at all. I started the export 3 hours ago, and it still looks as in the screenshot below (wizard stuck before the Pre-execute phase). Any idea? I have seen similar waiting times for the Pre-execute and Executing phases, where the rows started to count in the Pre-execute phase after 10-20 minutes, but 3 hours is longer than I've ever seen before.
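As a hedged alternative for tables this size (server, database, and table names below are placeholders): bcp streams rows directly to a flat file without the wizard's Pre-execute phase:

```shell
# -T: trusted (Windows) authentication, -c: character format
bcp "SELECT * FROM dbo.BigTable" queryout bigtable.dat -S MyServer -d MyDb -T -c
```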
dmh (13 rep)
Nov 11, 2015, 11:33 AM • Last activity: May 25, 2025, 01:07 PM
0 votes
1 answer
263 views
Load data from MySQL into Elasticsearch with Logstash
I'm using Logstash to load my MySQL database into Elasticsearch. My configuration is the following:

```
input {
    jdbc {
        jdbc_connection_string => "jdbc:mysql://[ip]:3306/nextline_dev"
        jdbc_user => "[user]"
        jdbc_password => "[pass]"
        #schedule => "* * * * *"
        #jdbc_validate_connection => true
        jdbc_driver_library => "/path/mysql-connector-java-6.0.5.jar"
        jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
        statement => "SELECT * FROM Account"
    }
}
output {
    elasticsearch {
        index => "account"
        document_id => "%{id}"
        hosts => ["127.0.0.1:9200"]
    }
}
```

But I have some questions. I want to schedule more than one query, but the index will always be account. Can I make the output index dynamic? And how can I use more than one statement (to export more than one table)?
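For what it's worth, a common pattern (a sketch, untested against this setup; the second table and index names are placeholders) is to tag each jdbc input through @metadata and reference that tag in the output:

```
input {
    jdbc {
        # ... same connection settings as above ...
        statement => "SELECT * FROM Account"
        add_field => { "[@metadata][target_index]" => "account" }
    }
    jdbc {
        # ... same connection settings, second table ...
        statement => "SELECT * FROM Orders"
        add_field => { "[@metadata][target_index]" => "orders" }
    }
}
output {
    elasticsearch {
        index => "%{[@metadata][target_index]}"
        document_id => "%{id}"
        hosts => ["127.0.0.1:9200"]
    }
}
```

Using @metadata keeps the routing field out of the indexed documents.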
BlueSeph (121 rep)
Mar 15, 2019, 08:29 PM • Last activity: May 20, 2025, 03:02 PM
2 votes
1 answer
2811 views
Powershell Restore Azure SQL Database to local with Restore-Database
I have written a powershell script that uses Start-AzureSqlDatabaseExport & Get-AzureStorageBlob to export my Azure DB to a bacpac and download it locally, the next step is to restore the bacpac to my local SQL DB using Restore-Database, but below fails with "the media family on device is incorrectly formatted". Restore-Database -dbName "MyDB" -backupFile "C:\MyAzureBackup.bacpac" Is there a way to get this working with Restore-Database, or is there an alternative command that will do this?
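The media-family error suggests Restore-Database is treating the bacpac as a native .bak, which it is not. A sketch of the usual alternative (assuming SqlPackage.exe is on the PATH; the target database should not already exist or should be empty):

```powershell
SqlPackage.exe /Action:Import /SourceFile:"C:\MyAzureBackup.bacpac" /TargetServerName:"localhost" /TargetDatabaseName:"MyDB"
```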
BikerP (121 rep)
Dec 22, 2014, 10:02 AM • Last activity: May 1, 2025, 09:08 AM
1 vote
1 answer
565 views
Is there a way to specify which schema the database task "Export Data-tier Application" exports for the other entities besides the tables?
I noticed in the Export Data-tier Application database task, you can specify which schema to export data from in regards to tables, but I also want to exclude certain schemas for the other object types too such as views and stored procedures. Is this possible?
J.D. (40893 rep)
Feb 13, 2020, 06:51 PM • Last activity: Apr 26, 2025, 05:05 PM
4 votes
2 answers
158 views
MySQL: How to export the output of a non-query SQL statement?
In MySQL 8 Community Server, you can export a SQL query's output by executing the following command in the MySQL shell (as the most basic form):

```mysql
SELECT * FROM cientifico INTO OUTFILE '/var/lib/mysql-files/cientifico-data.txt';
```

And it works as expected. But a non-query statement such as the following:

```mysql
SHOW PROCESSLIST INTO OUTFILE '/var/lib/mysql-files/processlist.txt';
```

throws the following error:

> ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'INTO OUTFILE '/var/lib/mysql-files/processlist.txt'' at line 1

Same as:

```mysql
SHOW DATABASES INTO OUTFILE '/var/lib/mysql-files/databases.txt';
```

giving:

> ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'INTO OUTFILE '/var/lib/mysql-files/databases.txt'' at line 1

**Question**

* How can I export the output of such a statement? I assume the INTO OUTFILE syntax works only for queries; what about non-query statements?
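As a sketch of one workaround: many SHOW commands have information_schema equivalents, and those are ordinary queries, so INTO OUTFILE works on them:

```mysql
-- equivalent of SHOW PROCESSLIST
SELECT * FROM information_schema.PROCESSLIST
INTO OUTFILE '/var/lib/mysql-files/processlist.txt';

-- equivalent of SHOW DATABASES
SELECT SCHEMA_NAME FROM information_schema.SCHEMATA
INTO OUTFILE '/var/lib/mysql-files/databases.txt';
```

Alternatively, from the operating-system shell, mysql -e "SHOW PROCESSLIST" > processlist.txt captures the output of any statement.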
Manuel Jordan (229 rep)
Apr 18, 2025, 03:31 AM • Last activity: Apr 22, 2025, 08:27 PM
0 votes
2 answers
4992 views
MongoDB CSV Export: One Array/Column & Array Index on rows, or vice versa
I have a MongoDB schema that looks like this:

```
const batchSchema = mongoose.Schema({
    _id: mongoose.Schema.Types.ObjectId,
    time: {type: [Number]},
    tmp: {type: [Number]},
    hum: {type: [Number]},
    co2: {type: [Number]},
    coolRelay: {type: [Boolean]},
    humRelay: {type: [Boolean]},
    fanRelay: {type: [Boolean]},
    lightRelay: {type: [Boolean]},
});
```

My goal is to have a CSV file I can import into Excel for use in creating charts, graphs, and other visuals based on the data. Using Studio 3T (https://robomongo.org/), this doesn't seem to be possible unless I'm just not understanding the settings. It doesn't really matter if the data is organized horizontally or vertically, as long as each array is in its own column/row, excluding the _id field. 3T seems to be capable only of placing them all along the same row. I just tried MongoDB's native CLI CSV export tool but found it didn't seem to come with my release. I'm running version 3.6 on openSUSE, arm64, on a Raspberry Pi. mongoexport as a command was not found, and when I ran the suggested cnf mongoexport to find a package containing it, nothing was returned either. I know that I can export one file per array using 3T, then cut/paste them together; however, I hadn't planned on paying the annual license for 3T and was only using the free trial to get a more usable GUI during development. I was hoping to write a predefined bash script that could be reused repeatedly to export each "batch" document as it is completed. Is there something I am missing about mongoexport, or is this a limitation of the aarch64 release? I really don't want to have to go back, learn, and re-write everything for PostgreSQL at this point. I mean, I will if I have to, but wanted to check in with you all first.
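For reference, if mongoexport can't be installed on the Pi, a short script in the mongo shell can emit one row per array index (a sketch; the batches collection name is an assumption, and it assumes all arrays in a document have the same length):

```javascript
// run as: mongo mydb dump_batch.js > batch.csv
var doc = db.batches.findOne();  // pick one completed batch document
print("time,tmp,hum,co2,coolRelay,humRelay,fanRelay,lightRelay");
for (var i = 0; i < doc.time.length; i++) {
    print([doc.time[i], doc.tmp[i], doc.hum[i], doc.co2[i],
           doc.coolRelay[i], doc.humRelay[i],
           doc.fanRelay[i], doc.lightRelay[i]].join(","));
}
```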
AustinFoss (119 rep)
Mar 31, 2019, 03:00 AM • Last activity: Apr 18, 2025, 09:00 AM
1 vote
1 answer
2360 views
Oracle - Delete dmp files from ASM diskgroup
I want to delete my export files daily from the ASM diskgroup with a crontab job, and I want to prepare a script for it.

```
ASMCMD> ls
exp1.dmp
exp2.dmp
exp3.dmp
exp4.dmp
ASMCMD> pwd
+DATA/EXP
```

How can I prepare this script? I prepared a template for it, but I couldn't script it.
```
set_gridenv
asmcmd
cd +DATA/EXP
rm -rf exp*.dmp
exit
```
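A sketch of how this might be scripted for cron (assuming set_gridenv is a sourceable script that sets the ASM environment; asmcmd accepts a single command on its command line, and rm -f suppresses the wildcard confirmation prompt):

```shell
#!/bin/bash
# assumption: set_gridenv exports ORACLE_HOME/ORACLE_SID for the ASM instance
. set_gridenv
asmcmd rm -f "+DATA/EXP/exp*.dmp"
```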
jrdba (55 rep)
Sep 21, 2020, 10:45 AM • Last activity: Apr 8, 2025, 03:11 AM
1 vote
3 answers
9106 views
How to export a document containing table names, column names and column descriptions from SQL Server 2008?
I created a database in SQL Server 2008 and I wrote some column descriptions to describe them. I now want to export a document with the table design including the description for each column so it can be printed and discussed with the teams.
Amr Elgarhy (169 rep)
Dec 7, 2011, 03:13 PM • Last activity: Feb 26, 2025, 03:06 PM
0 votes
0 answers
64 views
Copy table data from one RDS MySQL server to another RDS MySQL server
I have ten standalone MySQL RDS instances, each with separate credentials and configurations. Here's an example for clarity (dummy credentials provided): Instance ABC:
```
MySQL Host: abc.dbsjhbns.east-j-rds.amazonaws.com
Username: user1
Password: [REDACTED]
SSH Host: ec2-22-33-22.compute-amazonaws.com
SSH User: ec2
SSH Key: [REDACTED]
```

Instance DEF:

```
MySQL Host: def.dbsjhbns.east-j-rds.amazonaws.com
Username: user2
Password: [REDACTED]
SSH Host: ec2-22-33-22.compute-amazonaws.com
SSH User: ec2
SSH Key: [REDACTED]
```
Both instances have the same schema, called Enterprise, with most tables being identical. My goal is to copy data for a specific table from instance ABC to DEF (and vice versa). ## Solutions I have considered: ### Federated tables While possible, this would require creating federated tables for all tables in both databases, which is inefficient and has its limitations. ### mysqldump This involves exporting data to a .sql file and then importing it into the other instance. While feasible, it's a lengthy process, especially for frequent transfers. ## What I’m Looking For: Are there alternative solutions to efficiently transfer table data between these standalone RDS instances? Are there any third-party tools or best practices that can simplify this process? The data transfer is not very frequent. It might be required once or twice a month or occasionally once or twice a week, depending on the specific need at the time.
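For a single table, a piped mysqldump avoids the intermediate file entirely; a sketch using the hosts from the question (the table name is a placeholder, and in practice the passwords would come from option files rather than interactive prompts):

```shell
mysqldump -h abc.dbsjhbns.east-j-rds.amazonaws.com -u user1 -p \
  --single-transaction --set-gtid-purged=OFF Enterprise my_table \
| mysql -h def.dbsjhbns.east-j-rds.amazonaws.com -u user2 -p Enterprise
```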
RajeevKorthiwada (1 rep)
Jan 24, 2025, 12:06 PM • Last activity: Jan 27, 2025, 05:24 AM
2 votes
2 answers
2366 views
Unexpected newlines in json output in Postgresql
I'm squashing a data structure to JSON objects using the PostgreSQL JSON functions; however, there are unexpected newlines in the output. Basically, it creates JSON objects and puts them in a list. This is the SQL:

```
copy (
    select json_agg(Y.*) from (
        select table_logical_id as "logicalId",
            max(table_actual_name) as "tableName",
            max(table_display_name) as "displayName",
            max(table_restriction_expression) as "restrictionExpression",
            json_strip_nulls(json_agg(
                json_build_object(
                    'field', column_actual_name,
                    'type', case column_type when 'DATE_TIME' then 'DATE' else column_type end,
                    'displayName', column_display_name,
                    'displayLocation', column_display_location
                ) ORDER BY column_order ASC
            )) as "columns",
            coalesce(
                json_agg(
                    json_build_object(
                        'field', column_actual_name,
                        'direction', case column_is_asc_sort when true then 'ASC' else 'DESC' end
                    ) order by column_sort_order desc
                ) filter (where column_sort_order is not null)
            , '[]') as "defaultSort"
        from my_table_config
        group by table_logical_id
    ) Y
) TO STDOUT WITH (FORMAT TEXT, ENCODING 'UTF8');
```

The output contains a literal \n (so 2 characters, not the line feed control character) between each JSON record, like this:

    ..."direction" : "ASC"}]}, \n {"logicalId ...

Is this a bug in PostgreSQL? Or, how can I prevent this?

UPDATE: Exporting as BINARY as suggested in the answer does not work, as that returns something that starts with:

    00000000: 5047 434f 5059 0aff 0d0a 0000 0000 0000  PGCOPY..........
    00000010: 0000 0000 0100 7480 cc5b 7b22 6c6f 6769  ......t..[{"logi
    00000020: 6361 6c49 6422 3a22 4155 4449 545f 3230  calId":"AUDIT_20
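For what it's worth, one thing I would try: json_agg keeps the whitespace it generates between array elements, while jsonb always re-serializes compactly, so casting the aggregate should drop the embedded newlines (a sketch against the query above):

```
copy (
    select json_agg(Y.*)::jsonb
    from ( /* same subquery as above */ ) Y
) TO STDOUT WITH (FORMAT TEXT, ENCODING 'UTF8');
```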
Rob Audenaerde (213 rep)
Oct 31, 2022, 09:46 AM • Last activity: Jan 18, 2025, 02:07 PM
0 votes
3 answers
41035 views
Encountering "Snapshot too old" error during execution of expdp command
I am facing an issue while performing an expdp command in my production DB (Oracle 11g in a Windows environment):

    cmd> expdp 'sys/123@PROD as sysdba' DUMPFILE=BACKUP_02082017_BACKUP.dmp LOGFILE=BakupLog_02082017_BACKUP.log SCHEMAS=A1,B2,C3,D4.. exclude=statistics consistent=y

It was taking more than 1 day to export the database, sized 7 GB. But my issue is that the export errors out with this message:

> ORA-31693: Table data object "owner"."PASSWORD_HISTORY" failed to load/unload and is being skipped due to error: ORA-02354: error in exporting/importing data ORA-01555: snapshot too old: rollback segment number 19 with name "_SYSSMU19_255734752$" too small

I set my retention policy to 16500 from the default 900; even so, the same error occurred. I am planning to increase the retention policy up to 10 hrs, i.e. 36000. Is that viable? I am not sure whether my undo tablespace is capable of this. Providing some more details:

    > show parameter undo_%;
    NAME              TYPE     VALUE
    ----------------- -------- --------
    undo_management   string   AUTO
    undo_retention    integer  16500
    undo_tablespace   string   UNDOTBS1

    > select file_name, tablespace_name, trunc(bytes/1024/1024) mb, trunc(maxbytes/1024/1024) mm
      FROM dba_data_files where tablespace_name = 'UNDOTBS1';
    FILE_NAME                                 TABLESPACE_NAME   MB     MM
    ----------------------------------------- ----------------- ------ ------
    C:\APP\ADMIN\ORADATA\PROD\UNDOTBS01.DBF   UNDOTBS1          5630   32767

    Size of undo with current undo_retention:
    Actual Undo size [MBytes]: 5630
    UNDO retention [Sec]: 16500
    Needed Undo Size [MBytes]: 909.433359

I am stuck with this issue. Can anyone please advise how to deal with this error? Thanks in advance.
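On viability: by the numbers in the question, about 909 MB of undo covered 16,500 s of retention, so 36,000 s should need roughly 909 * 36000 / 16500, about 2 GB, well inside the 32,767 MB MAXSIZE shown. A sketch of the two changes (datafile path taken from the question):

```
ALTER SYSTEM SET undo_retention = 36000 SCOPE = BOTH;

ALTER DATABASE DATAFILE 'C:\APP\ADMIN\ORADATA\PROD\UNDOTBS01.DBF'
  AUTOEXTEND ON NEXT 512M MAXSIZE 32767M;
```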
SHS (152 rep)
Feb 8, 2017, 01:09 PM • Last activity: Dec 10, 2024, 03:35 PM
3 votes
2 answers
17118 views
Can we export data from PostgreSQL to xlsx file?
I want to export the result of a query to xlsx file format. I know we can export it to csv file, but I want to export it to xlsx file. Like the following: copy(select name_related from hr_employee limit 3) to '/tmp/ABC.xlsx' delimiter ',' xlsx header;
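COPY has no xlsx format; CSV is the closest built-in. A sketch of exporting CSV and converting it in a second step (assumes Gnumeric's ssconvert is installed; any CSV-to-xlsx converter would do):

```shell
psql -d mydb -c "\copy (select name_related from hr_employee limit 3) to '/tmp/ABC.csv' csv header"
# ssconvert infers the input/output formats from the file extensions
ssconvert /tmp/ABC.csv /tmp/ABC.xlsx
```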
Abdul Raheem Ghani (562 rep)
May 3, 2016, 10:20 AM • Last activity: Dec 7, 2024, 02:54 PM
0 votes
0 answers
103 views
SSMS - Automatically save SQL query result to a file
I use SSMS for SQL queries. I've found how to produce a JSON result with `FOR JSON AUTO` or `FOR JSON PATH`. Now I want to schedule saving a query result to a JSON file with a maintenance plan. How can I do that with SSMS, without bcp?
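One route that stays inside a scheduled SQL Agent job (a sketch; server, database, query, and path are placeholders): a PowerShell job step with Invoke-Sqlcmd. FOR JSON results are split across multiple rows, so the fragments are concatenated before writing:

```powershell
$rows = Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDb" `
    -Query "SELECT * FROM dbo.MyTable FOR JSON PATH" -MaxCharLength 2147483647
# FOR JSON output arrives as one or more string fragments; join them
($rows | ForEach-Object { $_.ItemArray[0] }) -join "" |
    Out-File "C:\Exports\result.json" -Encoding utf8
```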
David C. - ProServ (13 rep)
Nov 20, 2024, 12:02 AM • Last activity: Nov 20, 2024, 12:05 AM
1 vote
1 answer
7769 views
Import .pde file on PL/SQL Developer
How can I import my .pde file which I already exported from another database by PL/SQL Developer?
Yasin Okumuş (143 rep)
Feb 16, 2012, 05:32 PM • Last activity: Aug 29, 2024, 01:50 PM
0 votes
0 answers
122 views
How can I export PostgreSQL data from multiple tables joined by foreign keys, preferably as insert statements?
I'd like to automate exporting data from one database to be inserted into another. The data is stored in multiple tables joined by foreign keys, so rows from the parent table need to be inserted before the child tables. An example:

Tables:

```
create table table_a (id bigserial primary key, a_name text null);
create table table_b (id bigserial primary key, b_name text null, table_a_id bigint not null references table_a);
create table table_c (id bigserial primary key, c_name text null, table_b_id bigint not null references table_b);
```

Now I want to select all the data associated with table_a.id = 1. I can select all of that data following foreign keys using:

```
select * from table_a ta
join table_b tb on ta.id = tb.table_a_id
join table_c tc on tb.id = tc.table_b_id and ta.id = 1;
```

I get:

    id,a_name,id,b_name,table_a_id,id,c_name,table_b_id
    1,a_test,2,b_test,1,1,c_test,2

But is there a way to export all of the data in a way that is easy to import into another database? My rich DB client (DataGrip) can export data, but only one table at a time, and my real example has far more than 3 tables to export. Ideally I would like INSERT statements, but CSV is okay if it works. I'll assume that there won't be any primary key conflicts. Thanks for your help!
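One way to get INSERT statements with nothing but psql (a sketch for the three example tables; format's %L quotes text literals, %s leaves the numeric ids bare, and the generated output should be run parents-first):

```
SELECT format('INSERT INTO table_a (id, a_name) VALUES (%s, %L);', id, a_name)
FROM table_a WHERE id = 1
UNION ALL
SELECT format('INSERT INTO table_b (id, b_name, table_a_id) VALUES (%s, %L, %s);',
              id, b_name, table_a_id)
FROM table_b WHERE table_a_id = 1
UNION ALL
SELECT format('INSERT INTO table_c (id, c_name, table_b_id) VALUES (%s, %L, %s);',
              id, c_name, table_b_id)
FROM table_c tc
WHERE EXISTS (SELECT 1 FROM table_b tb
              WHERE tb.id = tc.table_b_id AND tb.table_a_id = 1);
```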
Kramer (101 rep)
Jul 11, 2024, 06:46 PM • Last activity: Jul 11, 2024, 10:41 PM
0 votes
1 answer
144 views
Export-DbaScript: How can I make it make a directory if it doesn't already exist?
I like to use Export-DbaScript to export to folders that are named C:\TargetFolder\Database\Schema. Often, I don't already have a Database\Schema folder, which will cause Export-DbaScript to fail to write to that folder. Is there any way to make Export-DbaScript create the folder if it doesn't already exist? I'm expecting something like a -Force parameter, but I have not found it in the documentation. I'm doing
```
Export-DbaScript `
  -InputObject $_ `
  -NoPrefix `
  -NoClobber `
  -FileName "$($MyLocation)\$($_.Database)\$($_.Name).sql"
```
When the path doesn't exist, I get `System.IO.DirectoryNotFoundException`. I'm on version 2.1.18
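As far as I know there is no -Force switch, but creating the folder first works, and New-Item -Force is a no-op when the directory already exists (a sketch against the snippet above):

```powershell
$dir = "$MyLocation\$($_.Database)"
New-Item -Path $dir -ItemType Directory -Force | Out-Null  # no-op if it exists
Export-DbaScript -InputObject $_ -NoPrefix -NoClobber -FileName "$dir\$($_.Name).sql"
```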
J. Mini (1225 rep)
Jul 5, 2024, 06:35 PM • Last activity: Jul 7, 2024, 03:07 PM
0 votes
1 answer
117 views
Microsoft SQL Server Import Export Wizard on Remote SQL Server
Is it possible to run the SQL Server Import/Export Wizard purely on a remote server (e.g., from one database to another on the same remote server)? If I set up the source/destination connections in the Wizard on my local machine, I can only enter connections to the remote server as made from my local machine. I cannot enter the connections as the remote server would make them. If I then run the Wizard, it transfers the data via my local machine, even though it would be much faster if the remote server connected directly to the source/destination. For example: - A single SQL Server on a remote machine with two databases: DB1 and DB2. - SSMS on a local machine connected to the SQL Server on the remote machine. - Task: copy a (very large) table from DB1 to DB2. - Setup: Import/Export Wizard with connections to both databases. - Issue: data transfers via the local machine (from DB1 to the local machine to DB2). I have looked at the Wizard documentation, but I couldn't find any description of this issue. Am I missing something? I would guess that most SQL Servers run on some remote machine and not locally.
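For two databases on the same instance, a plain cross-database INSERT...SELECT runs entirely on the server; SSMS only submits the statement (table and column names are placeholders):

```sql
-- executes server-side; no rows travel to the workstation
INSERT INTO DB2.dbo.BigTable (col1, col2)
SELECT col1, col2
FROM DB1.dbo.BigTable;
```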
Phil C (103 rep)
Jun 26, 2024, 07:19 AM • Last activity: Jun 26, 2024, 08:18 AM
Showing page 1 of 20 total questions