
Database Administrators

Q&A for database professionals who wish to improve their database skills

Latest Questions

0 votes
2 answers
1939 views
How can I use pg_dump to export a single table to CSV that can then be imported by Oracle SQL Loader?
Thanks in advance for any help on this. **Problem Summary:** I'm looking for the most efficient way to export a single table with a large number of records (100M+) from Postgres/Greenplum so that it can be imported by Oracle SQL*Loader. **Research Background:** I know from my research so far that the pg_dump utility is more efficient than Postgres COPY, so I do NOT want to use the COPY command. pg_dump has many pluses:

1. It can use multiple threads/cores.
2. It can dump a single table.
3. But can it output CSV?

**My Main Question:** The critical thing I can't figure out yet is how to get pg_dump to export to CSV or fixed-width plain-text output. **A sidebar question:** Other than "the pg_dump --format=custom means the data is compressed", I can't seem to find a detailed description of what exactly the "custom" pg_dump format does to the data. The word "custom" implies that the output follows a controllable schema, but I haven't been able to locate documentation of how this works.
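For orientation, a minimal sketch of the two tools under discussion, assuming a table named mytable in a database mydb; pg_dump's single-table dump is shown alongside the COPY CSV export the asker is trying to avoid, since pg_dump itself has no CSV output mode:

```sql
-- Single-table dump in the compressed "custom" archive format (not CSV):
--   pg_dump --format=custom --table=mytable mydb > mytable.dump

-- CSV is produced by COPY; this writes the table server-side as CSV:
COPY mytable TO '/tmp/mytable.csv' WITH (FORMAT csv);
```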
zigmoo (9 rep)
Oct 12, 2022, 04:39 PM • Last activity: May 14, 2025, 01:05 PM
2 votes
3 answers
1855 views
sqlldr running slow after indexes created
Let me explain: I ran one session of sqlldr while indexes were being created on six fields of the table. The indexes are now complete, but sqlldr runs very slowly; it loads one file every 30 minutes. When I try any alteration to disable the indexes, it won't let me and throws an error (ORA-00054 and ORA-00604), even if I run another sqlldr. Any suggestion will be highly appreciated. Below is a sample of my control file header:

    OPTIONS(DIRECT=TRUE,ROWS=10000,BINDSIZE=209700000,READSIZE=209700000)
    load data
    append
    into table name
    FIELDS TERMINATED by '!' OPTIONALLY ENCLOSED by '"'
    trailing nullcols

When I run the following query:

    select * from dba_dml_locks;

I get:

    session_id, owner, name, mode_held, mode_requested, last_convert, blocking_others
    123, username, tablename, exclusive, none, 1228, not blocking

This is my daily scheduler, loading around 8000+ files (about 20 GB), so any suggestion is welcome.
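One approach sometimes suggested for this pattern, offered as a hedged sketch rather than a confirmed fix: skip index maintenance during the direct-path load and rebuild the indexes once at the end (the index name below is hypothetical):

```sql
-- Direct path load without maintaining indexes on every file:
--   sqlldr user/pass control=load.ctl direct=true skip_index_maintenance=true

-- The load leaves affected indexes UNUSABLE; rebuild them once afterwards:
ALTER INDEX my_table_idx REBUILD;
```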
hi4ppl (141 rep)
Feb 11, 2015, 03:32 AM • Last activity: Apr 21, 2025, 04:03 PM
3 votes
2 answers
9482 views
NLS_NUMERIC_CHARACTERS in SQL loader control file
I am using a control file to import data from a CSV file into a table with sqlldr, where a large number of fields are numeric values whose integer and decimal parts are separated by dots. The control file works just fine because I cast each number to the input my table expects by using:

    TO_NUMBER(:MyVariable, '9999999999D999999', 'NLS_NUMERIC_CHARACTERS = ''.,''')

Although the control file works just fine, I would like to know whether it is possible to use:

    ALTER SESSION SET NLS_NUMERIC_CHARACTERS = ","

That way I wouldn't need to worry about differences between the character used in NLS and the one used to create the file, nor about the actual number of significant decimal digits.
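For context, a minimal control-file sketch of the per-column cast described above (the table, file, and column names are hypothetical):

```sql
LOAD DATA
INFILE 'data.csv'
APPEND INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  amount "TO_NUMBER(:amount, '9999999999D999999', 'NLS_NUMERIC_CHARACTERS = ''.,''')"
)
```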
gRomano (31 rep)
Jul 1, 2015, 03:59 PM • Last activity: Dec 2, 2024, 03:25 PM
0 votes
0 answers
44 views
Loading a file with SQL*Loader where the header record needs to become column 1 in the table to distinguish the client name
I have 3 clients sending their own data files. The files share the same detail record layout, need to be inserted into the same staging table, and are distinguished only by their header record.

File 1:

    Clientname1 date
    Field1 Field2 Field3
    ....
    ....
    ....

File 2:

    Clientname2 date
    Field1 Field2 Field3
    ....
    ....
    ....

I basically want to load these 3 files into the same staging table, but with column 1 of the table holding the client name from the header record, so I can tell which client's file was loaded. How do I add the header record's client name as column 1 of the staging table? When we append the files there is no way to know which client sent a record, as all detail records are exactly the same. The output in the table should look like:

    Clientname1 Field1 Field2 Field3
    Clientname1 Field1 Field2 Field3
    Clientname1 Field1 Field2 Field3
    Clientname2 Field1 Field2 Field3
    Clientname2 Field1 Field2 Field3
    Clientname2 Field1 Field2 Field3
    Clientname3 Field1 Field2 Field3
    Clientname3 Field1 Field2 Field3
    Clientname3 Field1 Field2 Field3

Thank you
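One hedged approach, since SQL*Loader cannot easily carry a value from a header record into the detail records that follow it: generate one control file per client that skips the header line and supplies the client name as a constant (the file, table, and column names are hypothetical):

```sql
OPTIONS (SKIP=1)                       -- skip the one-line header record
LOAD DATA
INFILE 'client1.dat'
APPEND INTO TABLE staging_table
FIELDS TERMINATED BY ' '
TRAILING NULLCOLS
(
  client_name CONSTANT 'Clientname1',  -- hard-coded per client control file
  field1,
  field2,
  field3
)
```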
Rahul Vaidya (1 rep)
Aug 30, 2024, 12:40 PM • Last activity: Aug 30, 2024, 01:14 PM
-1 votes
1 answer
154 views
Expecting "," or ")", found "nvarchar2"
Here is my SQL*Loader version: Release 11.2.0.2.0 - Production on Fri Nov 10 00:39:56 2023, running on Ubuntu 20. I am trying to load data from a CSV file with the following control file:

    OPTIONS (SKIP=1)
    LOAD DATA
    INFILE '/home/oracle/airports.csv'
    INTO TABLE airportDemoData
    fields terminated by ',' optionally enclosed by '"'
    (
      country nvarchar2(48),
      country_code varchar2(2),
      city nvarchar2(48),
      city_code varchar2(3),
      airport nvarchar2(84),
      airport_code varchar2(3)
    )

Here is a sample of my CSV file:

[screenshot of the CSV file]

Here is my sqlldr command, run as the oracle user:

    sqlldr login/pass control=airport.ctl LOG=airport.log

Here is the error that I am facing:

    SQL*Loader: Release 11.2.0.2.0 - Production on Fri Nov 10 00:39:56 2023
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

    SQL*Loader-350: Syntax error at line 7.
    Expecting "," or ")", found "nvarchar2".
    country nvarchar2(48),
            ^

Please help me point out where I am making a mistake. Best Regards
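A hedged note on the likely cause: a SQL*Loader field list takes SQL*Loader datatypes such as CHAR, not database column types like NVARCHAR2 or VARCHAR2; a sketch of the field list rewritten accordingly:

```sql
(
  country      CHAR(48),
  country_code CHAR(2),
  city         CHAR(48),
  city_code    CHAR(3),
  airport      CHAR(84),
  airport_code CHAR(3)
)
```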
Java_Beginner (1 rep)
Nov 9, 2023, 09:25 PM • Last activity: Nov 9, 2023, 09:57 PM
0 votes
0 answers
92 views
Problem starting sqlldr with parfile=parfile.par
I have installed Oracle Client 21.11 under Ubuntu. sqlplus works fine so far; only sqlldr in combination with a parfile does not. When I call

    sqlldr user.../pw...@ORCLPDB1 parfile=parfile.par

I get the following:

    SQL*Loader: Release 21.0.0.0.0 - Production on Sat Sep 23 17:07:40 2023
    Version 21.11.0.0.0
    Copyright (c) 1982, 2023, Oracle and/or its affiliates. All rights reserved.

    Path used: Direct

No error message, and nothing happens. If I enter only

    sqlldr user.../pw...@ORCLPDB1

without parfile=, I get a control = prompt, and after I enter my control file

    control = /t/cdp/cdp7/sp3b/data/out/mandant-201/tmp/KEY_0_00020230923122819.ctl

    SQL*Loader: Release 21.0.0.0.0 - Production on Sat Sep 23 17:19:14 2023
    Version 21.11.0.0.0
    Copyright (c) 1982, 2023, Oracle and/or its affiliates. All rights reserved.

    Path used: Conventional
    Commit point reached - logical record count 1
    Commit point reached - logical record count 2
    Commit point reached - logical record count 3

    Table KEY_0_000XMLF:
      119 Rows successfully loaded.
    Check the log file: KEY_0_00020230923122819.log for more information about the load.

the file loads without any problem. Does somebody have an idea why sqlldr is not picking up the parfile? What could the error be?
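For reference, a minimal parfile sketch of the keyword=value form sqlldr expects, one parameter per line (the paths are hypothetical):

```
control=/path/to/load.ctl
log=/path/to/load.log
data=/path/to/input.dat
```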
Joachim Bönnen
Sep 23, 2023, 05:25 PM • Last activity: Sep 24, 2023, 09:12 AM
0 votes
2 answers
110 views
Efficient approach to Query DB with data in text file
I've got an Oracle DB set up and want to query it after I receive data in a text file. The number of product names in the text file is around a million. After searching a lot I've found two ways of doing this, but I'm not sure which will be more efficient:

1. Use SQL*Loader to load the data into a table, then query using WHERE IN or a join.
2. Query the DB for each product name in a loop.

The second option is only a consideration because I think an insert (using the loader) plus a select will be more expensive than selects alone. P.S.: I would prefer less load on the DB server over less elapsed time. The file contents are something like:

    ABC
    BCD
    FGS
    LTB
    .
    .
    NHL

I have to query the DB as follows.

Option 1: run SQL*Loader on products.txt, load the contents into TMPTABLE, then:

    SELECT PRICE FROM PRICEMAP
    WHERE PRODUCT_NAME IN (SELECT PRODUCT_NAME FROM TMPTABLE);

Option 2:

    Foreach $productname in products.txt:
        SELECT PRICE FROM PRICEMAP WHERE PRODUCT_NAME = $productname
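A third route worth noting, offered as a hedged sketch: an external table lets Oracle read the flat file in place, so the join runs without either a conventional load or a million round trips (the directory object and names are hypothetical):

```sql
-- Assumes a directory object pointing at the folder that holds products.txt
CREATE TABLE products_ext (product_name VARCHAR2(50))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('products.txt')
);

SELECT p.price
FROM   pricemap p
JOIN   products_ext e ON e.product_name = p.product_name;
```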
Anubhav (21 rep)
Feb 11, 2020, 07:24 PM • Last activity: Feb 12, 2020, 06:00 PM
0 votes
1 answer
212 views
SQL*Loader and DC firewall timeout
We are loading a huge dataset using SQLLDR, and there are index creations that last a couple of hours or even more. The firewall drops the connection because there is no data transfer during index creation for over an hour. Oracle doesn't see the session as idle and doesn't drop the connection, but the firewall kills it. Is there any workaround for this issue other than changing the firewall settings?
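One workaround often suggested for firewall-killed quiet connections, offered as a hedged sketch: enable Dead Connection Detection in the server-side sqlnet.ora, so probe packets cross the firewall during the long silent phase (the interval shown is illustrative):

```
# server-side sqlnet.ora: send a keepalive probe every 5 minutes
SQLNET.EXPIRE_TIME = 5
```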
Mikhail Aksenov (430 rep)
Feb 6, 2020, 01:17 PM • Last activity: Feb 6, 2020, 01:23 PM
0 votes
1 answer
1780 views
When is the best time to use RMAN vs when to use Data Pump?
And also... if I back up a database with RMAN, is it *not* possible to use `impdp` to restore small parts of that database?
Phillip (11 rep)
Jan 25, 2019, 02:16 PM • Last activity: Jan 25, 2019, 08:05 PM
0 votes
1 answer
775 views
sqlldr syntax error
    oracle@tdc2mdb07> cat rsm.ctl
    **************************************************
    LOAD DATA
    infile '/tmp/20181012_bkp.txt'
    REPLACE INTO TABLE ASSET_ODS.ROADSIDE_MESSAGE
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    (
      RM_ID NOT NULL NUMBER(12),
      RM_COMMS_DATE TIMESTAMP(3),
      RM_DETECTION_DATE TIMESTAMP(3),
      RM_TSMC_ID NOT NULL NUMBER(3),
      RM_CHANNEL NOT NULL NUMBER(1),
      RM_SEQUENCE_NO NUMBER(10),
      RM_MESSAGE_CLASS NUMBER(3),
      RM_MESSAGE_TYPE NUMBER(5),
      RM_RAW_DATA RAW(1200),
      RM_STATUS NOT NULL NUMBER(1),
      RM_SEQUENCE_NUMBER_STATUS VARCHAR2(2),
      RM_TOLLING_SEGMENT_ID NUMBER(3),
      RM_ASSET_ID NOT NULL VARCHAR2(4)
    )

    oracle@tdc2mdb07> sqlldr \'/ as sysdba\' rsm.ctl

    SQL*Loader: Release 12.1.0.2.0 - Production on Mon Oct 15 10:04:06 2018
    Copyright (c) 1982, 2014, Oracle and/or its affiliates. All rights reserved.

    SQL*Loader-350: Syntax error at line 7.
    Expecting "," or ")", found "NOT".
    RM_ID NOT NULL NUMBER(12),
          ^
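A hedged note on the error: SQL*Loader field lists accept loader datatypes only, and no constraints; NOT NULL and NUMBER(12) belong in the table DDL, not the control file. A sketch of a few fields rewritten with loader datatypes (the timestamp mask is an assumption about the input format):

```sql
(
  RM_ID             INTEGER EXTERNAL,   -- NOT NULL is enforced by the table, not the loader
  RM_COMMS_DATE     TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF3",
  RM_DETECTION_DATE TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF3",
  RM_RAW_DATA       RAW(1200)
)
```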
user161696
Oct 14, 2018, 11:07 PM • Last activity: Oct 18, 2018, 07:58 PM
1 vote
0 answers
1564 views
SQL Loader intermittently throws "ORA-12516: TNS:listener could not find available handler with matching protocol stack"
I am running a weekly batch job which loads data from txt/csv files into the database using SQL*Loader, but SQL*Loader intermittently throws the error below:

> SQL*Loader-704: Internal error: ulconnect: OCIServerAttach [0]
> ORA-12516: TNS:listener could not find available handler with matching protocol stack

As suggested in similar posts, we increased the maximum number of sessions/connections, which reduced the number of failures but did not fully resolve the issue. Do we also need to increase the maximum number of concurrent processes, since in the screenshot below I can see that Oracle's concurrent processes are exhausted? If not, please share your thoughts on any other solution to the cause.

[screenshot of session/process utilization]
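A hedged way to check whether the processes limit, rather than sessions, is the ceiling being hit:

```sql
-- Compare current and high-water usage against the configured limits
SELECT resource_name, current_utilization, max_utilization, limit_value
FROM   v$resource_limit
WHERE  resource_name IN ('processes', 'sessions');
```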
Abhinav Soni (11 rep)
Sep 4, 2017, 10:04 AM • Last activity: Sep 4, 2017, 01:12 PM
0 votes
1 answer
1419 views
Loading Oracle ctl export file into postgresql
I have a number of tables exported from an Oracle database in the form of .ctl files with the data included within the file. I'm looking for a way to import these into PostgreSQL, but the EnterpriseDB toolkit seems to want a direct link to an already running Oracle instance. A program similar to SQL*Loader would be excellent.
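One hedged route, assuming the control files carry their rows inline after a BEGINDATA marker: strip everything up to BEGINDATA into a plain data file, then feed it to PostgreSQL's COPY with the delimiter the control file declares (the table name and delimiter below are assumptions):

```sql
-- Rows extracted from the .ctl file after BEGINDATA, saved as /tmp/table1.dat
COPY table1 FROM '/tmp/table1.dat' WITH (FORMAT csv, DELIMITER '|');
```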
Stevetech (119 rep)
Jul 12, 2017, 07:12 PM • Last activity: Jul 17, 2017, 05:18 PM
3 votes
1 answer
6170 views
Example of loading into RDS from S3 using SQL*Loader
Does anyone have a working sample of loading data into an Amazon AWS RDS Oracle database using SQL*Loader where the source file is on S3? In principle it's clearly possible. The Amazon RDS Data Import Guide for Oracle documents using sqlldr. But the only sample there loads data from a local EC2 instance. I have worked with RDS and with S3 and with SQL*Loader before, but I haven't worked with all 3 together. One sample would be really useful. (Related advice would be welcome too... but a sample is all I'm really looking for.)
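For what it's worth, a hedged sketch of the usual two-step pattern, since sqlldr cannot read s3:// URLs directly: copy the file down with the AWS CLI, then load it into the RDS endpoint over SQL*Net (the bucket, file names, and the MYRDS tnsnames.ora alias are hypothetical):

```
# fetch the source file from S3
aws s3 cp s3://my-bucket/data.csv /tmp/data.csv

# load into the RDS Oracle instance through a tnsnames.ora alias
sqlldr myuser/mypass@MYRDS control=load.ctl data=/tmp/data.csv log=load.log
```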
mdahlman (357 rep)
Jul 24, 2013, 11:25 PM • Last activity: Jun 8, 2017, 06:45 PM
1 vote
0 answers
420 views
Modification of the sql loader control file
I am working on a script which uses Oracle SQL*Loader to load data into a table. However, everywhere I look, SQL*Loader control files contain only statements like:

    load data
    into table TABLE_NAME
    TRUNCATE
    FIELDS TERMINATED BY ';'
    TRAILING NULLCOLS
    (Field1, Field2, Field3, Field4)

Can I put a SELECT statement in this control file? I want to verify whether a field in the incoming record has a value for which a record can be found in an existing table containing a similar field; if yes, the record should be loaded, and if no, a message should go to a log file. For example, for an incoming record p, I want to check whether the count of records for (select * from table_k where field_demo = p.field1) is greater than zero. If it is, insert the record into the table; otherwise write a message to the log file.
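A hedged alternative, since a control file cannot run arbitrary SELECTs to accept or reject rows: load everything into a staging table first, then move only the matching rows with a set-based insert and inspect the rest (the table and column names are hypothetical):

```sql
-- Move rows whose field1 has a match in table_k
INSERT INTO target_table
SELECT s.*
FROM   staging_table s
WHERE  EXISTS (SELECT 1 FROM table_k k WHERE k.field_demo = s.field1);

-- The non-matching rows remain visible for logging or review
SELECT s.*
FROM   staging_table s
WHERE  NOT EXISTS (SELECT 1 FROM table_k k WHERE k.field_demo = s.field1);
```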
private ryan (11 rep)
Jul 17, 2016, 08:08 PM • Last activity: Oct 21, 2016, 05:31 AM
0 votes
2 answers
3532 views
Using output value of column from function for another column in SQL*Loader
I have asked the same question on StackOverflow. I don't know whether this site is linked to that one, and thus whether that question will be seen by the professionals here, so I am adding it here too. Please let me know if that is inappropriate.

I am trying to define a SQL*Loader control file where col1 is determined by a user function which takes a couple of parameters with values from the input file. This part works fine. I want to use the value returned by that function as a parameter for another user-defined function, and that does not work: the value sent to the second function is not the calculated value of the first column. Is this possible? Can someone give me a sample? Here is my control file:

    (
      col1 BOUNDFILLER POSITION(1:8),
      col2 INTEGER EXTERNAL "schema_own.get_col2_val(:col1, :col6)",
      col3 POSITION(29:33) CHAR,
      col4 "sysdate",
      col5 constant " ",
      col6 POSITION(9:11) CHAR,
      col7 "sysdate",
      col8 POSITION(12:18) CHAR,
      col9 POSITION(19:28) CHAR,
      col10 POSITION(29:33) CHAR,
      col11 POSITION(52:63) CHAR,
      col12 "(col2)", --constant "TEST",
      col13 "schema_own.get_col13_value(:col1, :col2)",
      LAST_UPDATE_TS "sysdate",
      LAST_UPDATE_USER_I constant "SQLLDR"
    )

What seems to be happening is that the second parameter in get_col13_value contains the FIRST character of col6 instead of the number that col2 should have received from the call to get_col2_val. I tried col12 with both (col2) and (:col2), with the same outcome. Am I missing something? Can someone also help me understand when to use col2 vs. :col2 in SQL*Loader?
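A hedged note on the likely cause: in a control file, :name refers to a field's value as read from the datafile, not to the result of another column's SQL expression, so :col2 never carries get_col2_val's output. One workaround is to nest the first call inside the second expression:

```sql
col13 "schema_own.get_col13_value(:col1, schema_own.get_col2_val(:col1, :col6))",
```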
adbdkb (103 rep)
Jan 25, 2015, 03:28 PM • Last activity: Aug 31, 2016, 05:15 AM
4 votes
2 answers
4262 views
Loading XML documents to Oracle 11g DB with control file
I am using an Oracle 11g XML database and trying to load XML documents into it with a control file and the sqlldr utility. All of these XML files have an element that contains a date string with a time stamp (with the letter T in the middle of it). Oracle rejects the date string because of the T, and thus the XML file is not loaded into the DB. I want to apply the Oracle function TO_TIMESTAMP_TZ to the date string during the load, but I do not know how to do it; that's where I need help. If there is any other way to import the XML (with the timestamped date string), I will try that also. Here is the date entry in the XML file:

    2013-04-20T21:02:52.468-04:00

And here is the entire control file:

    load data
    infile 'filelist.dat'
    append
    into table STXP
    xmltype(XMLDATA)
    (
      filename filler char(120),
      XMLDATA lobfile(filename) terminated by eof
    )

I believe I can also execute the above control file with the sqlldr utility from the SQL*Plus command line, but I'm not sure about this option. If it is possible, I guess I can ALTER SESSION (to somehow format the date string) on the command line before executing the control file. The filelist.dat mentioned above lists the input XML files, one file per line. The date entry above is required in each XML file. Each XML file has about 50 different elements, some required and some optional. I would greatly appreciate your help.

UPDATE: I successfully registered the schema that contains the definition for the date string, along with 100 other schemas, using a script. Since this script is very large, I am posting only two registration portions of it:

    DECLARE
      SCHEMAURL VARCHAR2( 100 );
      SCHEMADOC VARCHAR2( 100 );
    BEGIN
      SCHEMAURL := 'http://www.some.org/stxp/DataTypes.xsd';
      SCHEMADOC := 'DataTypes.xsd';
      DBMS_XMLSCHEMA.registerSchema(
        SCHEMAURL,
        BFILENAME( 'XSD_DIR', SCHEMADOC ),
        LOCAL     => TRUE,   -- local
        GENTYPES  => TRUE,   -- generate object types
        GENBEAN   => FALSE,  -- no java beans
        GENTABLES => TRUE,   -- generate object tables
        OWNER     => USER );

      SCHEMAURL := 'http://www.some.org/stxp/STXP.xsd';
      SCHEMADOC := 'STXP.xsd';
      DBMS_XMLSCHEMA.registerSchema(
        SCHEMAURL,
        BFILENAME( 'XSD_DIR', SCHEMADOC ),
        LOCAL     => TRUE,   -- local
        GENTYPES  => TRUE,   -- generate object types
        GENBEAN   => FALSE,  -- no java beans
        GENTABLES => TRUE,   -- generate object tables
        OWNER     => USER );
    END;
    /

The second registration above is the last one in the script, and it creates the table STXP, into which I am trying to load about 800 XML files. Each XML file has a root element called stxp. This is the relevant definition of the date string: [XSD snippet lost from the original post] And this is how I am using the above definition: [XSD snippet lost from the original post] When I make the above element optional (for testing purposes) and remove the date string entry (shown near the top of this question) from my XML file, the XML file loads successfully into the Oracle XML database. When I put the entry back (because it is required), Oracle rejects it. Because I let Oracle populate the STXP table with data from the XML files, I am not sure whether I can set a trigger to pre-process the date string from the input XML file before it is saved in the database. I think there is a way to do it in the control file.
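For reference, a hedged illustration of the conversion the asker wants to apply; this format mask matches the sample value shown above:

```sql
SELECT TO_TIMESTAMP_TZ('2013-04-20T21:02:52.468-04:00',
                       'YYYY-MM-DD"T"HH24:MI:SS.FF3TZH:TZM')
FROM dual;
```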
Asif (41 rep)
Apr 21, 2013, 11:06 PM • Last activity: Aug 16, 2016, 08:07 AM
2 votes
3 answers
4051 views
Repopulating Oracle tables using data from Excel/CSV file
I'm developing a Java EE application that uses Hibernate/JPA. Some of the data I want to use comes from another business unit's spreadsheet that I need to periodically (re)load into 5 Oracle tables, overwriting any existing data. I'm using a shell script that kicks off 5 SQL*Loader control files, where each control file specifies filler fields to skip because the Excel file is not normalized:

    OPTIONS(skip=2)
    load data
    REPLACE
    --#########################################
    into table TABLE_NAME
    fields terminated by "," optionally enclosed by '"'
    TRAILING NULLCOLS
    (
      ACCOUNT_ID,
      REST_STOP_ID,
      DESCRIPTION FILLER,
      GROUP FILLER,
      FUNCTION,
      JOB_ID "substr(:JOB_ID, 0, INSTR(:JOB_ID,' - ',1))",
      etc...

The first import of this data into empty tables using SQL*Loader and a CSV file from the spreadsheet works fine, but every subsequent run fails with one of the following, depending on the control-file option.

When using REPLACE:

    SQL*Loader-926: OCI error while executing delete/truncate (due to REPLACE/TRUNCATE keyword)
    ORA-02292: integrity constraint (SCHEMA.FK5BA979794B0A176A) violated - child record found

When using TRUNCATE:

    SQL*Loader-926: OCI error while executing delete/truncate (due to REPLACE/TRUNCATE keyword)
    ORA-02266: unique/primary keys in table referenced by enabled foreign keys

I've also tried reordering which tables are populated first, but no luck. I've also tried disabling constraints:

    ALTER TABLE TABLE_NAME DISABLE CONSTRAINT SYS_C0090398
    ORA-02297: cannot disable constraint (SCHEMA.SYS_C0090398) - dependencies exist (0 rows affected)

My workaround so far is to drop all the tables, restart my application so Hibernate recreates the 5 tables, and then run my SQL*Loader script. Right now I'm in development, so it's not a big deal, but in production I can't keep restarting and recreating tables. (In production I'll create the tables outside of Hibernate; for now the persistence option hibernate.hbm2ddl.auto is set to update.) I'm looking for an elegant, (semi)automatic solution to periodically repopulate these 5 tables with the data provided in the CSV/Excel file.
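A hedged sketch of the usual ordering fix: a parent table's keys can't be truncated, nor its constraints disabled, while enabled foreign keys still reference them, so disable the child-side FKs first, load, then re-enable (the names are hypothetical):

```sql
ALTER TABLE child_table DISABLE CONSTRAINT fk_child_parent;
-- run the 5 sqlldr loads here, parents before children
ALTER TABLE child_table ENABLE CONSTRAINT fk_child_parent;
```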
jeff (133 rep)
Sep 8, 2014, 03:20 PM • Last activity: Aug 8, 2016, 04:07 PM
2 votes
1 answer
1105 views
SQL*Loader-926, ORA-04031 unable to allocate x bytes of shared memory
A customer received this error while importing several files. The import routine (VB.NET) uses SQL*Loader (Release 11.2.0.3.0) via Process on each table separately (64 in total) and has been called weekly for many years. Last week the following error was thrown, along with the message value used for ROWS parameter changed from 1024 to 795 (all tables are imported with ROWS=1024; no BINDARRAY parameter is specified):

> SQL*Loader-926: OCI error while executing delete/truncate (due to REPLACE/TRUNCATE keyword) for table x
> ORA-04031: unable to allocate 4160 bytes of shared memory ("shared pool","unknown object","sga heap(1,0)","modification ")

I googled ORA-04031 and determined the current memory figures:

- 3.08 GB shared pool (select sum(bytes), pool from v$sgastat group by pool)
- 922 MB free memory (select * from v$sgastat where name = 'free memory')
- 1.18 GB shareable memory (select sum(SHARABLE_MEM) from v$db_object_cache)

To resolve the issue, several posts pointed to the views V$SGA_TARGET, V$PGA_TARGET, V$SGA_TARGET_ADVICE, V$PGA_TARGET_ADVICE, V$SGA_DYNAMIC_COMPONENTS and so on, but I have no DBA privilege (table or view does not exist). So I'd like to know: can I resolve this with SQL*Loader settings alone, or must a DBA investigate?

EDIT: Control file:

    LOAD DATA INFILE ''
    INTO TABLE x
    TRUNCATE
    ()

Call:

    sqlldr.exe userid=... log=... ctl=... bad=... rows=1024
Frederick (23 rep)
Oct 4, 2015, 06:30 PM • Last activity: Oct 5, 2015, 09:39 AM
2 votes
0 answers
508 views
oracle bulk loader processing columns with newline characters
I'm trying to load a file with description columns that may contain newline characters. The file is Unix-format UTF-16, tab-separated, and the description columns are enclosed in ". The problem is that the bulk loader treats the newline as end of record even when it falls between the " characters, so it creates more rows than there should be. Here's an example of the records I tried to load (each description contains an embedded newline):

    key_column  description
    1  "very long
    description"
    2  "some more ""
    description"
    3  "even more '
    description"

Here's what I have for my control file:

    load data
    CHARACTERSET UTF16
    infile 'C:\LoadWSRworkfolder\book1.txt'
    badfile 'C:\LoadWSRworkfolder\wsl2106.bad'
    append into table load_table1
    fields terminated by " " optionally enclosed by '"'
    trailing nullcols
    (
      key_column,
      description char(4000)
    )
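A hedged workaround commonly used for embedded newlines: give each record an explicit terminator string and declare it with the infile "str" clause, assuming the file can be produced with a marker such as |~| at the end of every record:

```sql
load data
CHARACTERSET UTF16
infile 'C:\LoadWSRworkfolder\book1.txt' "str '|~|'"
badfile 'C:\LoadWSRworkfolder\wsl2106.bad'
append into table load_table1
```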
Kevin Tianyu Xu (121 rep)
May 14, 2015, 12:38 AM • Last activity: May 14, 2015, 12:44 AM
1 vote
1 answer
3064 views
Load file based on line number in sqlldr
I am trying to load header and detail records from a file. The only way I can tell the difference between header and detail records is the line number: the first line is always the header, and the remaining lines are detail records. How can I refer to the line number of the file when doing a conditional load? I would like to do something like:

    LOAD DATA
    APPEND INTO TABLE x
    WHEN [line_number] = 1
    ([mapping for header line])
    WHEN [line_number] > 1
    ([mapping for detail lines])

The reason I ask is that I take the primary key from the header record in the database and insert it into each detail record. That key comes from a sequence that isn't populated until the insert into the header table. I have to load the header and detail records in the same transaction because I can't call currval on the sequence until I call nextval, which happens only in the header insert. Is there a way for me to do this?
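A hedged partial answer to the line-number part: SQL*Loader can capture each record's number into a column with the RECNUM keyword, which a staging table can then use to separate the header (record 1) from the details (the table and column names are hypothetical):

```sql
LOAD DATA
APPEND INTO TABLE staging_x
(
  line_no  RECNUM,                 -- 1 for the header record, >1 for details
  raw_line POSITION(1:4000) CHAR
)
```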
Raja (31 rep)
Apr 21, 2015, 09:07 PM • Last activity: Apr 22, 2015, 01:55 PM