Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
0 votes • 1 answer • 233 views
How do I use the CLOB datatype to store unlimited characters in a column for each row?
I want to use the CLOB datatype instead of VARCHAR2(4000) and LONG so that I can store an unlimited number of characters. Declaring the column as CLOB is easy, but the main problem is that when I insert a paragraph and then query it back at runtime, only a single line is shown. As I am new to this, your guidance will help me a lot. Thanks.
I have tried this; at runtime it only displays a limited number of characters. I want to store a one-page paragraph in a single row and have all of the paragraph's characters displayed when I query it.
Note:
Using SQL*PLUS
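This is usually a display issue, not a storage issue: by default SQL*Plus shows only the first 80 characters of a CLOB (the `LONG` setting) and wraps output at `LINESIZE`. A minimal sketch of the session settings, with a hypothetical table `my_docs` and column `text_col` for illustration:

```
-- SQL*Plus display settings (session-level; they do not affect what is stored)
SET LONG 100000        -- show up to 100,000 characters of each CLOB
SET LONGCHUNKSIZE 1000 -- fetch the CLOB in 1,000-character pieces
SET LINESIZE 200       -- line width before wrapping

-- Hypothetical table and column names, for illustration only
SELECT text_col FROM my_docs WHERE id = 1;
```

With `LONG` raised, SQL*Plus should print the full paragraph rather than truncating it.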

Faisal.softech
(1 rep)
Sep 20, 2022, 07:58 AM
• Last activity: Jun 5, 2025, 09:07 AM
0 votes • 1 answer • 3437 views
Inserting BLOB/CLOB over database links
I'm trying to find the best way to load a remote table that has a BLOB column (the BLOB only contains text, so a CLOB would have been better).
So far we have reduced the load time from 218 minutes to 2 minutes by putting a view on the source table and converting the BLOB to VARCHAR2 (simplified):
CREATE OR REPLACE FORCE EDITIONABLE VIEW "TABLE_V" ("ID", "DATE", "CLOB_PT1", "CLOB_PT2", "CLOB", CONSTRAINT "TABLE_V_PK" PRIMARY KEY ("ID") RELY DISABLE) AS
SELECT
"ID",
"DATE",
UTL_RAW.CAST_TO_VARCHAR2(DBMS_LOB.SUBSTR(BLOB, 2000, 1)) "CLOB_PT1",
UTL_RAW.CAST_TO_VARCHAR2(DBMS_LOB.SUBSTR(BLOB, 2000, 2001)) "CLOB_PT2",
CASE WHEN DBMS_LOB.GETLENGTH(BLOB) > 4000 THEN TO_CLOB(BLOB) END "CLOB"
FROM OWNER.TABLE;
Because we cannot guarantee that the BLOB will keep fitting into the two VARCHAR2 parts, we put anything bigger into a separate CLOB column in the view.
Now the idea is to merge the VARCHAR2 parts or the CLOB back into one CLOB in our staging area.
Here comes the challenge: ORA-22992: cannot use LOB locators selected from remote tables.
Concatenating the two VARCHAR2 parts into the target CLOB column works fine:
INSERT
/*+ APPEND PARALLEL */
INTO SA.TESTTABLE
(
ID ,
DATE ,
CLOB
)
SELECT
TABLE_V_1.ID ,
TABLE_V_1.DATE ,
TABLE_V_1.CLOB_PT1||TABLE_V_1.CLOB_PT2
FROM
OWNER.TABLE_V_1@DB_LNK TABLE_V_1
Inserting the CLOB into the target CLOB column works fine:
INSERT
/*+ APPEND PARALLEL */
INTO SA.TESTTABLE
(
ID ,
DATE ,
CLOB
)
SELECT
TABLE_V_1.ID ,
TABLE_V_1.DATE ,
CLOB
FROM
OWNER.TABLE_V_1@DB_LNK TABLE_V_1
Using a CASE expression to insert either the concatenated VARCHAR2 parts or the CLOB column fails:
INSERT
/*+ APPEND PARALLEL */
INTO SA.TESTTABLE
(
ID ,
DATE ,
CLOB
)
SELECT
TABLE_V_1.ID ,
TABLE_V_1.DATE ,
CASE WHEN TABLE_V_1.CLOB IS NULL
THEN TABLE_V_1.CLOB_PT1||TABLE_V_1.CLOB_PT2
ELSE TABLE_V_1.CLOB
END
FROM
OWNER.TABLE_V_1@DB_LNK TABLE_V_1
I also tried concatenating all three columns by default (VARCHAR2 parts 1 and 2 are empty when the BLOB is larger than 4000 bytes): same error.
Why is this failing?
Any suggestions on getting this to work?
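Since both single-branch statements work on their own, one workaround sketch is to avoid mixing the remote CLOB locator and the VARCHAR2 parts inside a single expression, and instead run two inserts, each handling one case. Table and column names are taken from the question; this is untested against an actual database link:

```
-- Rows small enough to have been split into the two VARCHAR2 parts
INSERT /*+ APPEND PARALLEL */ INTO SA.TESTTABLE (ID, "DATE", "CLOB")
SELECT t.ID, t."DATE", t.CLOB_PT1 || t.CLOB_PT2
FROM OWNER.TABLE_V_1@DB_LNK t
WHERE t."CLOB" IS NULL;

-- Rows whose content exceeded 4000 bytes and landed in the view's CLOB column
INSERT /*+ APPEND PARALLEL */ INTO SA.TESTTABLE (ID, "DATE", "CLOB")
SELECT t.ID, t."DATE", t."CLOB"
FROM OWNER.TABLE_V_1@DB_LNK t
WHERE t."CLOB" IS NOT NULL;
```

Because neither statement evaluates a CASE over the remote LOB locator, each should behave like the two working statements above.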
R. Sluiter
(1 rep)
Nov 6, 2020, 11:30 AM
• Last activity: May 27, 2025, 04:05 PM
0 votes • 1 answer • 1310 views
UTL_HTTP call inside trigger leads to ORA-06502 when inserting from select
A developer needs to make a SOAP call from an Oracle 11gR1 (11.1.0.7 PSU 24) instance using UTL_HTTP.
He created the following stored procedure:
CREATE OR REPLACE PROCEDURE CALL_WS (URL IN VARCHAR2, REQUEST IN VARCHAR2,
RESPONSE OUT VARCHAR2, ERRLOG OUT VARCHAR2) IS
L_HTTP_REQ UTL_HTTP.REQ;
L_HTTP_RESP UTL_HTTP.RESP;
L_HTTP_URL_V VARCHAR2 (32767);
IP_INPUTDATA_V VARCHAR2 (32767);
L_OUTPUTDATA_V VARCHAR2 (32767);
BEGIN
L_HTTP_URL_V := URL;
IP_INPUTDATA_V := REQUEST;
ERRLOG := NULL;
-- Configuration
UTL_HTTP.SET_DETAILED_EXCP_SUPPORT (TRUE);
L_HTTP_REQ := UTL_HTTP.BEGIN_REQUEST (L_HTTP_URL_V, 'POST', 'HTTP/1.1');
-- Authentification (not necessary at the moment)
--UTL_HTTP.SET_AUTHENTICATION (L_HTTP_REQ, L_HTTP_USERNAME_V, L_HTTP_PASSWORD_V);
UTL_HTTP.SET_PERSISTENT_CONN_SUPPORT (L_HTTP_REQ, FALSE);
-- Headers
UTL_HTTP.SET_HEADER (L_HTTP_REQ, 'Content-Type', 'text/xml');
UTL_HTTP.SET_HEADER (L_HTTP_REQ, 'charset', 'UTF-8');
UTL_HTTP.SET_HEADER (L_HTTP_REQ, 'Content-Length', LENGTH (IP_INPUTDATA_V));
--WRITES SOME TEXT DATA IN THE HTTP REQUEST BODY
UTL_HTTP.WRITE_TEXT (L_HTTP_REQ, IP_INPUTDATA_V);
-- GET HTTP RESPONSE
L_HTTP_RESP := UTL_HTTP.GET_RESPONSE (L_HTTP_REQ);
-- GET THE RESPONSE TEXT VALUE
UTL_HTTP.READ_TEXT (L_HTTP_RESP, L_OUTPUTDATA_V);
-- Check HTTP status code
IF (L_HTTP_RESP.STATUS_CODE <> 200) THEN
ERRLOG := 'HTTP REQUEST CALL FAILED. STATUS CODE IS ' || L_HTTP_RESP.STATUS_CODE;
END IF;
UTL_HTTP.END_RESPONSE (L_HTTP_RESP);
-- Set Output parameter
RESPONSE := L_OUTPUTDATA_V;
EXCEPTION
WHEN OTHERS THEN
RESPONSE := NULL;
ERRLOG := SQLERRM;
END;
And a BEFORE INSERT trigger as follows:
CREATE OR REPLACE TRIGGER TR_kkk_AFTERIN
BEFORE INSERT OR UPDATE ON CHU_kkk
REFERENCING NEW AS N OLD AS O
FOR EACH ROW
DECLARE
-- Variable declarations
l_URL VARCHAR2 (32767);
l_REQUEST VARCHAR2 (32767);
l_RESPONSE VARCHAR2 (32767);
l_ERRLOG VARCHAR2 (32767);
v_def definitions.compterendu%TYPE;
BEGIN
IF :n.traite = 0 THEN
-- Variable initializations
l_URL := 'http://zz/uu/services.asmx ';
v_def := NULL;
-- Get request SOAP model in a Omnipro definition
SELECT VALUE INTO v_def
FROM definitions
WHERE cle LIKE 'kkk';
IF v_def IS NOT NULL THEN
-- Replace values in the request SOAP
l_REQUEST := REPLACE(v_def, '@@DPTID@@', :n.dptid);
l_REQUEST := REPLACE(l_REQUEST, '@@USERID@@', :n.userid);
l_REQUEST := REPLACE(l_REQUEST, '@@APPTID@@', :n.apptid);
l_REQUEST := REPLACE(l_REQUEST, '@@REFID@@', :n.refid);
l_REQUEST := REPLACE(l_REQUEST, '@@REFAPP@@', :n.refapp);
l_REQUEST := REPLACE(l_REQUEST, '@@DTEDEB@@', TO_CHAR(:n.dtedeb, 'yyyy-mm-dd'));
l_REQUEST := REPLACE(l_REQUEST, '@@DTEFIN@@', TO_CHAR(:n.dtefin, 'yyyy-mm-dd'));
l_REQUEST := REPLACE(l_REQUEST, '@@NUMNAT@@', :n.numnat);
l_REQUEST := REPLACE(l_REQUEST, '@@NUMREF@@', :n.numref);
-- Call
CALL_WEB_SERVICE (URL => l_URL,
REQUEST => l_REQUEST,
RESPONSE => l_RESPONSE,
ERRLOG => l_ERRLOG);
IF l_ERRLOG IS NOT NULL THEN
:n.traite := 9;
:n.reflog := l_ERRLOG || ' : ' || SUBSTR (l_RESPONSE, 0, 1900);
ELSE
:n.traite := 1;
END IF;
:n.dtetrt := SYSDATE;
END IF;
END IF;
EXCEPTION
WHEN OTHERS THEN
-- Consider logging the error and then re-raise
:n.traite := 9;
:n.reflog := SQLERRM;
:n.dtetrt := SYSDATE;
--RAISE;
END;
Everything works well for an INSERT statement with a single row, but it fails whenever he runs:
INSERT INTO ... VALUES ...
COMMIT;
INSERT INTO ... VALUES ...
COMMIT;
He gets an ORA-06502.
If he runs each INSERT separately, there is no error.
I suggested that he investigate with the SQL Developer debugger, but we are not sure it will lead anywhere.
What could cause this?
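Two common causes of ORA-06502 with UTL_HTTP are a response longer than the single `READ_TEXT` buffer and a `Content-Length` computed with `LENGTH` (characters) instead of `LENGTHB` (bytes) when the body contains multibyte UTF-8 data. A sketch of a more defensive call, with placeholder URL and request body (the variable names are illustrative, not the developer's):

```
DECLARE
  l_req    UTL_HTTP.REQ;
  l_resp   UTL_HTTP.RESP;
  l_chunk  VARCHAR2(32767);
  l_body   VARCHAR2(32767) := '<soap/>';  -- placeholder request body
  l_output CLOB;
BEGIN
  l_req := UTL_HTTP.BEGIN_REQUEST('http://zz/uu/services.asmx', 'POST', 'HTTP/1.1');
  UTL_HTTP.SET_HEADER(l_req, 'Content-Type', 'text/xml');
  -- LENGTHB counts bytes; LENGTH counts characters and under-reports for UTF-8
  UTL_HTTP.SET_HEADER(l_req, 'Content-Length', LENGTHB(l_body));
  UTL_HTTP.WRITE_TEXT(l_req, l_body);
  l_resp := UTL_HTTP.GET_RESPONSE(l_req);
  BEGIN
    LOOP  -- read until END_OF_BODY instead of a single READ_TEXT call
      UTL_HTTP.READ_TEXT(l_resp, l_chunk, 32767);
      l_output := l_output || l_chunk;
    END LOOP;
  EXCEPTION
    WHEN UTL_HTTP.END_OF_BODY THEN
      UTL_HTTP.END_RESPONSE(l_resp);
  END;
END;
/
```

If the second INSERT's response happens to be larger (for example, a SOAP fault with a stack trace), the original single-buffer `READ_TEXT` would overflow `L_OUTPUTDATA_V`, which would explain why the statements succeed individually but fail in sequence.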
Jefferson B. Elias
(536 rep)
Jan 21, 2021, 10:39 AM
• Last activity: Aug 21, 2021, 10:01 AM
4 votes • 1 answer • 1215 views
How to migrate Unicode UTF-8 CLOB data from Oracle to SQL Server 2017 UTF-?
I am currently assisting in a migration of this application from Oracle 12c to SQL Server 2017. Initially I performed table inserts using OPENQUERY against Oracle. I discovered that the tables containing CLOB data could not be migrated with a single table-to-table INSERT query; otherwise we would end up with dirty data. I can do the inserts by dynamically generating a single INSERT statement per row and running thousands of lines of INSERT statements. The problem is that I've now come across a table with over 300k records containing CLOB data. The one-record-at-a-time insert is taking an extremely long time and at this rate may run over 24 hours, which is unacceptable.
What is my best option for migrating such a large amount of CLOB data from Oracle? Should I use the bulk tools like BCP, BULK INSERT or OPENROWSET?
**Edit/Update:** I have since learned that my troubles are most likely due to the UTF-8 encoding at the source Oracle system. Both methods I've used for migrating CLOB and BLOB data have resulted in mismatched or missing rows.
UTF-8 has some known issues with SQL Server. The ones I'm dealing with in particular, Linked Server OPENQUERY and SSIS project deployments, are both fixed in SQL Server 2019.
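Before rerunning a bulk load, it may help to reconcile the two sides row by row to see which rows were corrupted or dropped. A sketch, assuming a hypothetical key column `id` and CLOB column `doc`:

```
-- Oracle side: per-row character length of the CLOB
SELECT id, DBMS_LOB.GETLENGTH(doc) AS char_len
FROM   src_table
ORDER  BY id;

-- SQL Server side: DATALENGTH returns bytes in the stored encoding,
-- so compare row counts and per-row character lengths (LEN) rather
-- than expecting identical byte totals for multibyte data
SELECT id, DATALENGTH(doc) AS byte_len, LEN(doc) AS char_len
FROM   dbo.src_table
ORDER  BY id;
```

Rows whose character lengths disagree are candidates for the UTF-8 conversion problem described above.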
Geoff Dawdy
(1143 rep)
May 1, 2019, 05:12 PM
• Last activity: Apr 3, 2021, 07:01 PM
1 vote • 1 answer • 4941 views
How to use GROUP BY on a CLOB Column
I'm trying to run this query, which selects a CLOB column (flow.IDFONCTIONNEL):
SELECT
flow.flowid,
min(flow.CONTEXTTIMESTAMP) contextTime,
flow.STATUT,
flow.IDFONCTIONNEL,
flow.ETAT
FROM Flux flow
WHERE flow.FLOWCODE = 'HELLO'
AND flow.CONTEXTTIMESTAMP BETWEEN '06/01/20 11:36:21,566000000' AND '06/07/20 11:36:21,566000000'
GROUP BY flow.flowid, flow.STATUT , flow.ETAT, flow.IDFONCTIONNEL
ORDER BY contextTime desc
When I run this query, I get the error:
ORA-00932: inconsistent datatypes: expected - got CLOB
This is because flow.IDFONCTIONNEL is a CLOB. If I remove this column from the SELECT clause it works fine, but I need this column in the output.
I have seen a post suggesting DBMS_LOB.SUBSTR to work around this problem, so I tried:
SELECT
flow.flowid,
min(flow.CONTEXTTIMESTAMP) contextTime,
flow.STATUT,
DBMS_LOB.SUBSTR(flow.IDFONCTIONNEL,4000,1) as idf1,
DBMS_LOB.SUBSTR(flow.IDFONCTIONNEL,8000,4001) as idf2,
flow.ETAT
FROM Flux flow
WHERE flow.FLOWCODE = 'HELLO'
AND flow.CONTEXTTIMESTAMP BETWEEN '06/01/20 11:36:21,566000000' AND '06/07/20 11:36:21,566000000'
GROUP BY flow.flowid, flow.STATUT,flow.ETAT, idf1 ,idf2
ORDER BY contextTime desc
But I get ORA-00904: "IDF2": invalid identifier.
Does anyone have an idea why it is not working?
Thanks a lot
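In Oracle, a column alias from the SELECT list cannot be referenced in the GROUP BY clause (only in ORDER BY), which is what triggers the ORA-00904. Also, DBMS_LOB.SUBSTR returning into a VARCHAR2 in SQL is capped at 4000 characters, so the second slice should also request 4000. A sketch of the corrected query, repeating the expressions in GROUP BY:

```
SELECT
    flow.flowid,
    MIN(flow.CONTEXTTIMESTAMP) contextTime,
    flow.STATUT,
    DBMS_LOB.SUBSTR(flow.IDFONCTIONNEL, 4000, 1)    AS idf1,
    DBMS_LOB.SUBSTR(flow.IDFONCTIONNEL, 4000, 4001) AS idf2,
    flow.ETAT
FROM Flux flow
WHERE flow.FLOWCODE = 'HELLO'
  AND flow.CONTEXTTIMESTAMP BETWEEN '06/01/20 11:36:21,566000000'
                                AND '06/07/20 11:36:21,566000000'
GROUP BY flow.flowid,
         flow.STATUT,
         flow.ETAT,
         -- aliases are not visible here, so the expressions are repeated
         DBMS_LOB.SUBSTR(flow.IDFONCTIONNEL, 4000, 1),
         DBMS_LOB.SUBSTR(flow.IDFONCTIONNEL, 4000, 4001)
ORDER BY contextTime DESC;
```

Note this only covers the first 8000 characters of the CLOB; values longer than that would need additional slices or a different grouping strategy.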
Dallincha
(15 rep)
Jul 23, 2020, 09:07 AM
• Last activity: Jul 23, 2020, 09:42 AM
1 vote • 1 answer • 244 views
Are LOBs also backed up if I take dump of all partitions of table individually?
I am exporting a large table in Oracle through Data Pump, taking an export of each partition one by one. The table also has a LOB column. Are the LOBs backed up along with the partitions? I ask because the partitions seem very small compared to the LOBs.
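LOB data is stored in its own segments, separate from the table partitions, which is why the partitions look small on disk. One way to see where the bytes actually live is to compare segment sizes (the owner and table names below are placeholders):

```
-- Compare the table/partition segments with the LOB segments they own
SELECT s.segment_name, s.segment_type, ROUND(SUM(s.bytes)/1024/1024) AS mb
FROM   dba_segments s
WHERE  s.owner = 'OWNER'
AND    (s.segment_name = 'BIG_TABLE'
        OR s.segment_name IN (SELECT l.segment_name
                              FROM   dba_lobs l
                              WHERE  l.owner = 'OWNER'
                              AND    l.table_name = 'BIG_TABLE'))
GROUP  BY s.segment_name, s.segment_type
ORDER  BY mb DESC;
```

If most of the size sits in LOBSEGMENT / LOB PARTITION rows, the partition dump files being small does not by itself mean the LOB data was skipped.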
user3102917
(23 rep)
Nov 6, 2019, 09:37 AM
• Last activity: Feb 18, 2020, 12:05 PM
1 vote • 1 answer • 245 views
Migrating 2+ GB Oracle CLOB column to SQL Server VARCHAR(MAX)
I have a column in an Oracle database containing HTML data with embedded images, which makes some records over 2 GB in size for that column. I have successfully migrated the rest of the database to SQL Server except for this CLOB column. With SQL Server's varchar(max) unable to hold anything over 2 GB, what are my options?
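It may help to first identify which rows actually exceed the 2 GB varchar(max) ceiling on the Oracle side, so that only those need special handling (for example, FILESTREAM or splitting the value). The table and column names below are placeholders:

```
-- DBMS_LOB.GETLENGTH returns characters for a CLOB; multibyte content
-- can exceed 2 GB of bytes at a lower character count, so use a
-- conservative threshold when screening candidates.
SELECT id, DBMS_LOB.GETLENGTH(html_doc) AS char_len
FROM   docs
WHERE  DBMS_LOB.GETLENGTH(html_doc) > 2147483647 / 2
ORDER  BY char_len DESC;
```

Rows under the limit can go into varchar(max) normally; only the flagged rows need an alternative storage strategy.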
Geoff Dawdy
(1143 rep)
Apr 1, 2019, 09:52 PM
• Last activity: Apr 1, 2019, 10:45 PM