Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
1
votes
1
answers
5284
views
timestamp with timezone issue with TZR vs TZH:TZM
I have a Hibernate application using Oracle JDBC that inserts/updates the following column:
COLUMN_A TIMESTAMP WITH TIME ZONE DEFAULT systimestamp NOT NULL
While it automatically appends a time zone, it uses TZR, e.g. UTC, and the weird thing about this data type in Oracle is that it retains the time zone format used at insertion when you select it without any formatting, [explained here](https://stackoverflow.com/questions/51417100/timestamp-with-timezone-retains-timezone-format).
With this said, we are using GoldenGate to replicate this data to MongoDB, but as far as I know it requires these timestamps to contain only TZH:TZM formatting. So I have an issue where one side is blocked by Oracle JDBC, where insertion is done with TZR, and the other side is blocked by GoldenGate, where data is expected with TZH:TZM.
Is there any way to handle this issue? Can I forbid a certain format for TIMESTAMP WITH TIME ZONE? Using NLS_TIMESTAMP_TZ_FORMAT works for SELECT formatting, but I was told it is not usable for GoldenGate. For the INSERT case I was able to rely on the default value of the column, and using systimestamp does insert with TZH:TZM, but for the UPDATE case I am unable to achieve this. Totally stuck!
I have [this question](https://stackoverflow.com/questions/51412424/update-column-to-default-value-without-specifying-it-in-spring-data-jpa) focused on the application side of this issue. I am wondering if there is anything that can be done on the DB side?
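For the UPDATE case, one DB-side idea (a minimal sketch only, assuming a trigger on the table is acceptable; table and trigger names are placeholders) is to normalize the stored value to a fixed numeric offset on every write, so the column never carries a region name like UTC:

```sql
-- Hypothetical sketch: force COLUMN_A to carry a TZH:TZM offset instead of a TZR
-- region name, regardless of what the application binds. MY_TABLE is a placeholder.
CREATE OR REPLACE TRIGGER trg_my_table_tz_offset
BEFORE INSERT OR UPDATE OF column_a ON my_table
FOR EACH ROW
BEGIN
  -- SYS_EXTRACT_UTC drops the region name and converts to UTC;
  -- FROM_TZ re-attaches a plain '+00:00' offset.
  :NEW.column_a := FROM_TZ(SYS_EXTRACT_UTC(:NEW.column_a), '+00:00');
END;
/
```

Whether this actually satisfies GoldenGate would still need to be verified against the trail, since the capture process reads the stored representation rather than any session-level format.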
bvrakvs
(111 rep)
Jul 19, 2018, 06:18 AM
• Last activity: Jul 19, 2025, 03:08 AM
0
votes
1
answers
1012
views
Goldengate timestamp (FFF)
Working with a GoldenGate 19.1.0.0.211019 replicat inbound to Oracle 19.0.0.0.0, consuming a trail file incoming from NonStop (i.e. a non-Oracle database).
The column in question from NonStop is a string containing the value "20210723112304812" in the format "yyyymmddhhmmssfff", i.e. a timestamp.
In my replicat, I use the following COLMAP in the appropriate MAP ... TARGET statement:
COLMAP ( usedefaults,
TS_UPDATE = @DATE ( 'YYYY-MM-DD HH:MI:SS.FFF', 'YYYYMMDDHHMMSSFFF', TS_UPDATE ) );
However, once it finishes, the data stored in Oracle (the column is defined as TIMESTAMP(3)) is:
23-jul-2021 11:23:04.001
and other dates get either a ".001" or ".000" value for the fractional second every time.
I have also tried:
@DATE ( 'YYYY-MM-DD HH:MI:SS.FFFFFF', 'YYYYMMDDHHMMSSFFFFFF', TS_UPDATE )
with the same results.
What format setting should I be using to import the fractional seconds properly from the non-Oracle DB into an Oracle TIMESTAMP(3) column?
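As a sanity check independent of the replicat (plain Oracle SQL, not GoldenGate syntax), the incoming 17-character string does parse cleanly to a 3-digit fractional second, which suggests the loss happens in the @DATE mapping rather than in the data itself:

```sql
-- Parse the NonStop string directly in SQL to confirm the intended target value.
SELECT TO_TIMESTAMP('20210723112304812', 'YYYYMMDDHH24MISSFF3') AS ts_update
FROM   dual;
-- Expected result: 23-JUL-21 11.23.04.812 (fractional second preserved)
```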
Ditto
(504 rep)
Jan 19, 2022, 03:32 PM
• Last activity: Jun 5, 2025, 03:03 PM
0
votes
1
answers
559
views
Exclude Supplemental Logs from Import
I am trying to import tables from a production environment to a test environment. The production environment has supplemental logs on a number of tables for GoldenGate replication. The supplemental logs are being included in the import, which isn't a problem, but they aren't needed, so I would like to exclude them.
I am currently using a par file to do the import, and while I feel like I should be able to exclude the supplemental logs, I don't see an obvious exclusion type:
SCHEMAS=PROD_SCHEMA
REMAP_SCHEMA=PROD_SCHEMA:TEST_SCHEMA
REMAP_TABLESPACE=\"TBS_1\":TEST_TBS
REMAP_TABLESPACE=\"TBS_2\":TEST_TBS
REMAP_TABLESPACE=\"TBS_3\":TEST_TBS
parallel=32
cluster=n
DIRECTORY=DPUMP
TABLE_EXISTS_ACTION=REPLACE
DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
EXCLUDE=TABLE:"IN ('TABLE1','TABLE2')"
EXCLUDE=TRIGGER
EXCLUDE=CONSTRAINT
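If Data Pump offers no exclusion path for the log groups, one fallback (a sketch; it assumes the supplemental log groups are simply unwanted on the target) is to drop them after the import by generating the statements from the dictionary:

```sql
-- Generate DROP statements for every supplemental log group left in the imported schema.
SELECT 'ALTER TABLE "' || owner || '"."' || table_name ||
       '" DROP SUPPLEMENTAL LOG GROUP ' || log_group_name || ';' AS drop_stmt
FROM   dba_log_groups
WHERE  owner = 'TEST_SCHEMA';
```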
Patrick
(698 rep)
Feb 3, 2021, 04:46 PM
• Last activity: Apr 6, 2025, 01:07 PM
1
votes
1
answers
646
views
Oracle Materialized View - Is it Safe to Use Only ROWID as Record Identifier?
This is a question related to Oracle 12c.
I'm currently trying to create MVs as data sources for GoldenGate replication. We need **fast refresh** because we have to replicate 60,000+ rows at a time, and deleting and reinserting all rows seems inefficient.
The base tables have unique columns which act as composite primary keys, but there is no primary key constraint defined on them. They're PeopleSoft tables, so we prefer not to add PK constraints if possible, as they're used in so many applications.
I have built the MVs using only the **WITH ROWID** clause for the materialized view logs, and they seem to be working fine. But I don't know if this is good practice; should I add PK constraints on all the master tables instead, or is there another way?
Feedback is appreciated. Thanks.
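For reference, a minimal sketch of the two options being weighed (placeholder object names, not tested against PeopleSoft tables):

```sql
-- Current approach: log changes by ROWID, no PK constraint required on the base table.
CREATE MATERIALIZED VIEW LOG ON ps_base_table WITH ROWID;

CREATE MATERIALIZED VIEW mv_ps_base_table
  REFRESH FAST ON DEMAND WITH ROWID
  AS SELECT * FROM ps_base_table;

-- Alternative being considered: add the PK constraint and use a PK-based MV log.
-- ALTER TABLE ps_base_table ADD CONSTRAINT pk_ps_base_table PRIMARY KEY (col1, col2);
-- CREATE MATERIALIZED VIEW LOG ON ps_base_table WITH PRIMARY KEY;
```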
Lee
(61 rep)
Jul 17, 2019, 04:32 AM
• Last activity: Mar 11, 2025, 03:04 PM
0
votes
0
answers
85
views
ORA-12715 - Oracle GoldenGate cannot connect to database
I installed an Oracle 21c database and Oracle GoldenGate 21.3 on a Windows server. I can connect to my PDB using SQL*Plus and SQL Developer from another PC, but when I try to dblogin from GGSCI I get an error.
This is the environment I set on my server:
ORACLE_HOME = D:\oracle21c\WINDOWS.X64_213000_db_home
LD_LIBRARY_PATH = D:\oracle21c\WINDOWS.X64_213000_db_home\lib
TNS_ADMIN = D:\oracle21c\app\oracle\homes\OraDB21Home1\network\admin
NLS_LANG = AMERICAN_AMERICA.AR8MSWIN1256
OGG_HOME = OGG_HOME
PATH = ...;D:\oracle21c\WINDOWS.X64_213000_db_home\bin;D:\oracle21c\WINDOWS.X64_213000_db_home\lib
I get this error:
GGSCI (Oracle-test) 2> dblogin userid ggs@mypdb
Password:
Error: Failed to initialize timezone information. Check location of ORACLE_HOME.
I set ORA_TZFILE in the environment:
ORA_TZFILE = D:\oracle21c\WINDOWS.X64_213000_db_home\oracore\zoneinfo\timezlrg_35.dat
now I get this error:
GGSCI (Oracle-test) 2> dblogin userid ggs@mypdb
Password:
Error: OCI Error ORA (status = 12715-Error while trying to retrieve text for error ORA-12715
)
I changed the mgr file in dirprm like this:
PORT 7809
USERID ggs@mypdb, PASSWORD ggs
and when I run start manager in GGSCI, I get this error:
2024-11-25T02:29:57.852-0800 INFO OGG-03541 Oracle GoldenGate Manager for Oracle, MGR.prm: Oracle Environment Variables:
TNS_ADMIN = D:\oracle21c\WINDOWS.X64_213000_db_home\network\admin
PATH = C:\Program Files\Common Files\Oracle\Java\javapath;D:\oracle21c\WINDOWS.X64_213000_db_home\bin;D:\oracle21c\WINDOWS.X64_213000_db_home\lib;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;D:\oracle21c\ords\bin;D:\ggs;C:\Users\Administrator\AppData\Local\Microsoft\WindowsApps;.
2024-11-25T02:29:57.978-0800 ERROR OGG-00303 Oracle GoldenGate Manager for Oracle, MGR.prm: OCI Error ORA (status = ١٢٧١٥-Error while trying to retrieve text for error ORA-12715
).
2024-11-25T02:29:57.994-0800 ERROR OGG-01668 Oracle GoldenGate Manager for Oracle, MGR.prm: PROCESS ABENDING.
I haven't configured an extract, pump, or replicat yet!
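Since ORA-12715 is "invalid character set specified", one quick diagnostic (a sketch, run in SQL*Plus against the same PDB) is to compare what the database was created with against the NLS_LANG value set for the GoldenGate environment:

```sql
-- The character-set part of NLS_LANG (here AR8MSWIN1256) must be a client character
-- set Oracle recognizes, and ideally convertible to the database character set below.
SELECT parameter, value
FROM   nls_database_parameters
WHERE  parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET',
                     'NLS_LANGUAGE', 'NLS_TERRITORY');
```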
sadegh
(101 rep)
Nov 25, 2024, 10:43 AM
2
votes
1
answers
702
views
Whether or Not to Use DataPump with Oracle GoldenGate
We are setting up GoldenGate to keep 3 of our Oracle databases in sync. Regarding configuration:
We plan to have an instance of GoldenGate running on all 3 nodes. Each node will run an extract on its (local) transaction/redo logs to create local trail files, then run Replicat (locally) against the other 2 databases.
These connections use TCP/IP, so there should not be any lost packets? If a Replicat sending over the network fails, will GG retry until it succeeds, or just move on (resulting in the databases going out of sync)?
**VS**
The data pump/checkpoint process seems to be the industry standard for replication over the network, remedying the process above.
What are the advantages of using data pumps/checkpoints in this setup versus what I described above?
Does the data pump/checkpoint process add additional syncing for redundancy?

Lance Allison
Mar 4, 2020, 06:29 PM
• Last activity: Jul 12, 2024, 05:39 AM
0
votes
0
answers
69
views
Practices to maintain historical data via replication?
We have a use case like that of banks, where we generate a large number of transactions every day, say in a transaction table and a few other tables with a large number of inserts. Transactions are mostly inserted and updated within a day or two. Our database is Oracle 12c (planned upgrade to 19). Coming to the use case: we need to set up a big data cluster (Oracle or Mongo/Cassandra), say X, so we can support people fetching account statements from older periods.
So, in our primary (and Data Guard replicated standby) Oracle database we should keep 3 months of data, and people querying for data older than 3 months must use cluster X.
What are the industry practices for building such replications?
The process must be:
- Stable, resilient and well monitored
- Cluster X must be scalable
- Highly consistent and must never lose data
Our initial thoughts:
Set up an Oracle GoldenGate pipeline from the source DB to a y-node sharded Oracle DB cluster. Although Oracle 19c sharded DB does support adding a node with auto-balancing, I'm reluctant to use a relational DB. Not using a relational DB and instead replicating to DBs like Mongo/Cassandra also makes me skeptical about instabilities in CDC pipelines and NoSQL inconsistency.
What do banks do in such cases?
Sachin Verma
(839 rep)
Jan 7, 2024, 05:00 PM
• Last activity: Jan 7, 2024, 05:05 PM
1
votes
0
answers
30
views
dblogin cannot connect to SYSTEM DSN
I'm trying to install GoldenGate 19.1.0.0 on Windows Server. When I try to connect to SQL Server 2019 with dblogin, I get the error in the picture. Any idea how to solve it?

KhoiNguyen
(11 rep)
Dec 14, 2023, 05:57 PM
0
votes
1
answers
3441
views
Oracle goldengate service is not starting: Failed to retrieve the name of a missing Oracle redo log
I am trying to start a GG service; when I try 'start my_service' it shows as starting, but it doesn't start and the status always changes from starting to stopped.
I need to get this GoldenGate service started. I don't know what has happened to it or what I should set to solve the problem. Help me out, guys.
My service name is: MY_SERVICE
Here is the log:
2020-02-17 14:52:37 INFO OGG-00987 Oracle GoldenGate Command Interpreter for Oracle: GGSCI command (oracle): start MY_SERVICE.
2020-02-17 14:52:37 INFO OGG-00963 Oracle GoldenGate Manager for Oracle, mgr.prm: Command received from GGSCI on host [192.168.1.100 ]:37022 (START EXTRACT MY_SERVICE ).
2020-02-17 14:52:37 INFO OGG-00960 Oracle GoldenGate Manager for Oracle, mgr.prm: Access granted (rule #6).
2020-02-17 14:52:37 INFO OGG-00975 Oracle GoldenGate Manager for Oracle, mgr.prm: EXTRACT MY_SERVICE starting.
2020-02-17 14:52:37 INFO OGG-00992 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: EXTRACT MY_SERVICE starting.
2020-02-17 14:52:37 INFO OGG-03059 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Operating system character set identified as UTF-8.
2020-02-17 14:52:37 INFO OGG-02695 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: ANSI SQL parameter syntax is used for parameter parsing.
2020-02-17 14:52:37 INFO OGG-02095 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Successfully set environment variable NLS_LANG=AMERICAN_AMERICA.AL32UTF8.
2020-02-17 14:52:37 INFO OGG-02095 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Successfully set environment variable ORACLE_SID=MYSNAME.
2020-02-17 14:52:37 INFO OGG-02095 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Successfully set environment variable ORACLE_HOME=/u01/app/oracle/product/12.1.0/db_1.
2020-02-17 14:52:37 INFO OGG-02095 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Successfully set environment variable NLS_LANG=AMERICAN_AMERICA.AL32UTF8.
2020-02-17 14:52:37 INFO OGG-02095 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Successfully set environment variable ORACLE_SID=MYSNAME.
2020-02-17 14:52:37 INFO OGG-02095 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Successfully set environment variable ORACLE_HOME=/u01/app/oracle/product/12.1.0/db_1.
2020-02-17 14:52:38 INFO OGG-03522 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Setting session time zone to source database time zone 'GMT'.
2020-02-17 14:52:38 INFO OGG-01815 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Virtual Memory Facilities for: BR
anon alloc: mmap(MAP_ANON) anon free: munmap
file alloc: mmap(MAP_SHARED) file free: munmap
target directories:
/u01/ogg/BR/MY_SERVICE.
2020-02-17 14:52:38 INFO OGG-01851 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: filecaching started: thread ID: 140383342216960.
2020-02-17 14:52:38 INFO OGG-01815 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Virtual Memory Facilities for: COM
anon alloc: mmap(MAP_ANON) anon free: munmap
file alloc: mmap(MAP_SHARED) file free: munmap
target directories:
/u01/ogg/dirtmp.
2020-02-17 14:52:38 INFO OGG-01639 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: BOUNDED RECOVERY: ACTIVE: for object pool 1: p49266_extr.
2020-02-17 14:52:38 INFO OGG-01640 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: BOUNDED RECOVERY: recovery start XID: 0.0.0.
2020-02-17 14:52:38 INFO OGG-01641 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: BOUNDED RECOVERY: recovery start position: SeqNo: 11932, RBA: 25311476, SCN: 1.353105371 (4648072667), Timestamp: 2019-12-30 16:06:40.000000, Thread: 2.
2020-02-17 14:52:38 INFO OGG-01642 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: BOUNDED RECOVERY: recovery end position: SeqNo: 11932, RBA: 25311476, SCN: 1.353105371 (4648072667), Timestamp: 2019-12-30 16:06:40.000000, Thread: 2.
2020-02-17 14:52:38 INFO OGG-01643 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: BOUNDED RECOVERY: CANCELED: for object pool 1: p49266_extr.
2020-02-17 14:52:38 INFO OGG-01579 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: BOUNDED RECOVERY: VALID BCP: CP.MY_SERVICE.000002823.
2020-02-17 14:52:38 INFO OGG-01629 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: BOUNDED RECOVERY: PERSISTED OBJECTS RECOVERED: >.
2020-02-17 14:52:40 WARNING OGG-02045 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Database does not have streams_pool_size initialization parameter configured.
2020-02-17 14:52:40 INFO OGG-02248 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Logmining server DDL filtering enabled.
2020-02-17 14:54:47 INFO OGG-00987 Oracle GoldenGate Command Interpreter for Oracle: GGSCI command (oracle): info all.
2020-02-17 14:55:41 ERROR OGG-02037 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: Failed to retrieve the name of a missing Oracle redo log.
2020-02-17 14:55:41 ERROR OGG-00662 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: OCI Error retrieving bind info for query (status = 100).
2020-02-17 14:55:41 ERROR OGG-01668 Oracle GoldenGate Capture for Oracle, MY_SERVICE.prm: PROCESS ABENDING.
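Given the BOUNDED RECOVERY messages above, the extract needs redo from around SCN 4648072667 (thread 2, sequence 11932, dated 2019-12-30), so one hedged first check is whether that redo is still available to the database at all:

```sql
-- Sketch: is the archived log covering the extract's recovery SCN still registered
-- and present on disk? (Requires access to V$ views.)
SELECT thread#, sequence#, first_change#, next_change#, name, deleted, status
FROM   v$archived_log
WHERE  4648072667 BETWEEN first_change# AND next_change#
ORDER  BY thread#, sequence#;
```

If that log has been deleted (for example by RMAN retention), the usual options are to restore it or to re-create/alter the extract to begin at a point for which redo still exists.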
J.Col
(9 rep)
Feb 17, 2020, 08:46 AM
• Last activity: Aug 5, 2023, 04:04 AM
0
votes
1
answers
1231
views
Upgrade Oracle DB from 12C to 19C with minimum downtime
We are planning to apply the steps below:
1. Export the live DB running on Oracle version 12c
2. Import into the new DB set up on version 19c
3. At this point, the 12c and 19c schema structures will be exactly the same (we tested up to here and it works fine)
4. Add a new Data Guard configuration on the DB running on Oracle 12c pointing to the upgraded DB set up on version 19c
5. **Once the Data Guard is ready and the DB import completes, we point Data Guard at the upgraded DB and wait for the sync to complete**
Can you please review and confirm whether this is achievable?
At this time we plan to use Data Guard; if that is not possible, we will use GoldenGate.
rajan dawra
(1 rep)
Jan 23, 2023, 02:59 AM
• Last activity: Jun 18, 2023, 01:02 PM
0
votes
0
answers
117
views
Converting Wire Protocol from Oracle to Postgres and Vice-Versa
Naive question:
Assume we have an application which was originally written for Oracle as the DBMS. If I want to write an intermediary layer that gives this application the independence to work with a PostgreSQL DBMS as well, I presume that I would be required to do the following:
SQL: Application --> ODBC API (Oracle Wire Protocol) --> My Layer --> ODBC API (PostgreSQL Wire Protocol) --> PostgreSQL DBMS
Response: Application <-- ODBC API (Oracle Wire Protocol) <-- My Layer <-- ODBC API (PostgreSQL Wire Protocol) <-- PostgreSQL DBMS
ultimate cause
(101 rep)
Nov 6, 2022, 12:39 PM
-2
votes
1
answers
122
views
AWS TO OCI GOLDENGATE REPLICATION
I want to use Oracle GoldenGate with my multi-cloud database setup. I'm just wondering whether I'm required to take down my primary database to avoid data discrepancies, or whether GoldenGate can sync the data even while my primary is running, as I set up GoldenGate between my primary and DR databases.
GoodGuyChaddy
(21 rep)
Jul 13, 2022, 06:22 PM
• Last activity: Jul 13, 2022, 10:36 PM
0
votes
1
answers
289
views
Oracle golden gate to BigQuery
I'm trying to set up GoldenGate to sync data to BigQuery. When I start pushing the initial load, my extract exports all the data, and from the replicat stats I can see the records are received, but they are not pushed to BigQuery.
## Extractor
GGSCI (bhuvi-oracle) 12> info extract load1
EXTRACT LOAD1 Last Started 2020-05-07 16:52 Status STOPPED
Checkpoint Lag Not Available
Log Read Checkpoint Table HR.TEST
2020-05-07 16:52:41 Record 32768
Task SOURCEISTABLE
## Replicat stats
GGSCI (bhuvi-1) 1> stats bq
Sending STATS request to REPLICAT BQ ...
gg.handlerlist=bigquery
Start of Statistics at 2020-05-07 16:53:33.
Replicating from HR.TEST to bhuvi_orcl.test:
*** Total statistics since 2020-05-07 16:52:42 ***
Total inserts 32768.00
Total updates 0.00
Total deletes 0.00
Total discards 0.00
Total operations 32768.00
## Log file on GoldenGate for Big Data:
2020-05-07T16:59:04.895+0000 INFO OGG-00987 Oracle GoldenGate Command Interpreter: GGSCI command (root): start bq.
2020-05-07T16:59:04.904+0000 INFO OGG-00963 Oracle GoldenGate Manager, mgr.prm: Command received from GGSCI on host [10.18.0.61]:42130 (START REPLICAT BQ ).
2020-05-07T16:59:04.905+0000 INFO OGG-00975 Oracle GoldenGate Manager, mgr.prm: REPLICAT BQ starting.
2020-05-07T16:59:04.964+0000 INFO OGG-00995 Oracle GoldenGate Delivery, bq.prm: REPLICAT BQ starting.
2020-05-07T16:59:04.967+0000 INFO OGG-03059 Oracle GoldenGate Delivery, bq.prm: Operating system character set identified as UTF-8.
2020-05-07T16:59:04.967+0000 INFO OGG-02695 Oracle GoldenGate Delivery, bq.prm: ANSI SQL parameter syntax is used for parameter parsing.
2020-05-07T16:59:05.173+0000 INFO OGG-15052 Oracle GoldenGate Delivery, bq.prm: Using Java class path: .:ggjava/ggjava.jar:ggjava/resources/lib/optional/log4j-api-2.9.1.jar:ggjava/resources/lib/optional/log4j-core-2.9.1.jar:ggjava/resources/lib/optional/log4j-slf4j-impl-2.9.1.jar.
2020-05-07T16:59:10.243+0000 INFO OGG-01815 Oracle GoldenGate Delivery, bq.prm: Virtual Memory Facilities for: COM
anon alloc: mmap(MAP_ANON) anon free: munmap
file alloc: mmap(MAP_SHARED) file free: munmap
target directories:
/opt/ogg/dirtmp.
2020-05-07T16:59:10.243+0000 INFO OGG-00996 Oracle GoldenGate Delivery, bq.prm: REPLICAT BQ started.
2020-05-07T16:59:10.251+0000 INFO OGG-02243 Oracle GoldenGate Delivery, bq.prm: Opened trail file /opt/ogg/dirdat/oo000000 at 2020-05-07 16:59:10.249063.
2020-05-07T16:59:10.251+0000 INFO OGG-03506 Oracle GoldenGate Delivery, bq.prm: The source database character set, as determined from the trail file, is UTF-8.
2020-05-07T16:59:10.253+0000 INFO OGG-06505 Oracle GoldenGate Delivery, bq.prm: MAP resolved (entry HR.TEST): MAP "HR"."TEST", TARGET bhuvi_orcl.test.
2020-05-07T16:59:10.253+0000 INFO OGG-02756 Oracle GoldenGate Delivery, bq.prm: The definition for table HR.TEST is obtained from the trail file.
2020-05-07T16:59:10.253+0000 INFO OGG-06511 Oracle GoldenGate Delivery, bq.prm: Using following columns in default map by name: ID.
2020-05-07T16:59:10.253+0000 INFO OGG-06510 Oracle GoldenGate Delivery, bq.prm: Using the following key columns for target table bhuvi_orcl.test: ID.
TheDataGuy
(1986 rep)
May 7, 2020, 05:15 PM
• Last activity: May 6, 2022, 12:58 PM
0
votes
1
answers
347
views
Do we need additional license to dump goldengate trail logs to kafka?
We have an Oracle (12c) GoldenGate license only (we do not have Oracle GoldenGate for Big Data).
We need to transfer the GG trail logs to Kafka and then to HDFS.
Do we need to buy the Oracle GoldenGate for Big Data license to use Kafka Connect?
**Research we did:**
These are the Kafka connect options we found:
1. Kafka connect for Big Data: probably needs an additional license
https://docs.oracle.com/en/middleware/goldengate/big-data/19.1/gadbd/using-kafka-handler.html#GUID-2561CA12-9BAC-454B-A2E3-2D36C5C60EE5
2. Kafka Connect handler: Where do we get it? Which licenses do we need?
https://docs.oracle.com/en/middleware/goldengate/big-data/19.1/gadbd/using-kafka-connect-handler.html
Other links:
https://stackoverflow.com/questions/56107652/how-can-i-configure-the-kafka-connect-handler-in-ogg-for-hdfs
jimmiing
(3 rep)
Dec 31, 2021, 10:00 AM
• Last activity: Dec 31, 2021, 11:39 AM
0
votes
1
answers
520
views
oracle goldengate replication
Please, I need help. I am working on Oracle 19c to perform a database replication using GoldenGate. I created two databases (source and destination), and the 3 processes (extract, pump and replicat) are running well, but when I insert data into one database for it to replicate to the other, it does not replicate. Here is the link to the tutorial I used:
https://googleweblight.com/i?u=https%3A%2F%2Fmedium.com%2F%40madhukaudantha%2Fsetting-up-oracle-goldengate-12-9d9546a538d6&geid=NSTN&hl=en-US
These are the warnings from the log file:
2021-08-30T05:52:38.284+0100 WARNING OGG-01877 Oracle GoldenGate Manager for Oracle, MGR.prm: Missing explicit accessrule for server collector.
2021-08-30T05:54:01.811+0100 WARNING OGG-01842 Oracle GoldenGate Capture for Oracle, EXTRACT.prm: CACHESIZE PER DYNAMIC DETERMINATION (7.66G) LESS THAN RECOMMENDED: 64G (64bit system)
vm found: 11.66G
Check swap space. Recommended swap/extract: 128G (64bit system).
2021-08-30T05:54:01.811+0100 INFO OGG-01639 Oracle GoldenGate Capture for Oracle, EXTRACT.prm: BOUNDED RECOVERY: ACTIVE: for object pool 1: p11804_extr.
2021-08-30T07:48:02.156+0100 WARNING OGG-00938 Oracle GoldenGate Manager for Oracle, MGR.prm: Manager is stopping at user request.
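One common first check when all three processes run but nothing arrives on the target (offered as a guess, not a confirmed diagnosis) is whether the source database and the mapped tables have the supplemental logging that capture needs:

```sql
-- On the source database: minimal supplemental logging should be YES for capture,
-- and force logging is commonly recommended as well.
SELECT supplemental_log_data_min, force_logging
FROM   v$database;

-- Per table, ADD TRANDATA should have created supplemental log groups:
SELECT owner, table_name, log_group_name, log_group_type
FROM   dba_log_groups
WHERE  owner = 'HR';   -- placeholder schema
```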
Franck Yoba
(1 rep)
Aug 31, 2021, 11:56 AM
• Last activity: Sep 1, 2021, 07:16 AM
0
votes
1
answers
2086
views
How to enable "only" drop partition DDL for Oracle Golden Gate?
I'm struggling with how to do this and hoping somebody could assist with some ideas.
This is a general Oracle GoldenGate question; the version shouldn't matter, but I am currently running on:
Golden Gate: 12.2.0.1.170919
Oracle DB: 12.1.0.2.0
Using Golden Gate, Integrated Extract (and Replicat).
I have setup a test table using Interval partitioning (by month).
I have setup a GG flow to capture the data/rows for this table.
This works perfectly, no issues.
I then try to setup/enable DDL replication for this table.
I add the following to the extract .prm file:
DDL INCLUDE OBJNAME myschema.*
and that works perfectly. It captures ALL ddl ..
ALTER TABLE
TRUNCATE TABLE
CREATE INDEX
... etc.
Now, I want to restrict the DDL so that it captures ONLY :
ALTER TABLE myschema.* DROP PARTITION ...
that's it ... no TRUNCATEs, no DROPs, no other ALTER TABLE
ie ALTER TABLE myschema.mytable MODIFY ( col ....
don't want to capture that - for example.
I've tried some of the following .. all to no avail:
DDL INCLUDE OBJNAME myschema.* &
INCLUDE OPTYPE DROP &
INCLUDE OBJTYPE 'PARTITION'
this still allows ALL DDL
DDL INCLUDE OBJNAME myschema.* &
INCLUDE OPTYPE DROP &
INCLUDE OBJTYPE 'PARTITION' &
EXCLUDE ALL
This doesn't allow ANY DDL at all ..
DDL INCLUDE OBJNAME myschema.* &
INCLUDE OPTYPE DROP &
INCLUDE OBJTYPE 'PARTITION' &
EXCLUDE OPTYPE TRUNCATE &
EXCLUDE OPTYPE CREATE
That manages to prevent TRUNCATEs and CREATEs ..
however, I can't add "ALTER" or "DROP" .. as it starts affecting/preventing the DROP PARTITION clause.
I've opened a ticket with Oracle to get some feedback from them .. however, so far, not much luck there.
I was hoping somebody here might actually know how to do this?
Thanks!
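One further variant that might be worth testing (assuming the INSTR filtering option is available in this GoldenGate release; treat the exact syntax as unverified) is to filter on the DDL statement text itself rather than on OPTYPE/OBJTYPE combinations:

```
DDL INCLUDE OBJNAME myschema.* &
    INSTR 'DROP PARTITION'
```

The intent is that only statements whose text contains 'DROP PARTITION' are captured, sidestepping the OPTYPE/OBJTYPE interactions shown above.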
Ditto
(504 rep)
Sep 9, 2020, 12:46 PM
• Last activity: Oct 1, 2020, 02:42 PM
-1
votes
1
answers
1845
views
Golden Gate Privilege Missing for using Encryption
dblogin userid ogg_user, password AADAAAAAAAAAAAIAKAQIJAGAMFUFKFBGQHXBJIWGECTEIAZHJFYALFFIBGHGJBPFBCPBKEPCRCRHAESJ, AES128, encryptkey securekey1
ERROR: Unable to connect to database using user ogg_user. Please check privileges.
ORA-01017: invalid username/password; logon denied.
What privileges are required to get the connection working? I have granted everything required for a GoldenGate user, including DBA, but the error persists.
Let me know.
Thanks!
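For reference, the grants typically given to a GoldenGate database user look roughly like the sketch below (placeholder names; exact requirements vary by version and capture mode). That said, ORA-01017 usually points at the credentials themselves, often the ENCRYPTKEY used when the encrypted password was generated, rather than at missing privileges:

```sql
-- Sketch of a typical GoldenGate user setup (not a definitive list).
CREATE USER ogg_user IDENTIFIED BY "a_real_password";
GRANT CREATE SESSION, RESOURCE TO ogg_user;
GRANT SELECT ANY DICTIONARY, FLASHBACK ANY TABLE TO ogg_user;
-- Packaged privilege grant for capture/apply (available from 11.2.0.4 onwards):
EXEC dbms_goldengate_auth.grant_admin_privilege('OGG_USER');
```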
Gaurav Bhaskar
(109 rep)
Jun 16, 2016, 11:59 AM
• Last activity: Aug 29, 2019, 03:26 PM
1
votes
2
answers
2119
views
Goldengate warning on table with primary key
Installed GoldenGate version 18.1.0.0.0 for Oracle 11gR2. The table has a primary key, and yet GoldenGate complains that no key is found; see below.
GGSCI (server01.localdomain as gguser@UPGR) 3> add trandata scott.books
2019-01-23 22:44:42 WARNING OGG-06439 No unique key is defined for table BOOKS. All viable columns will be used to represent the key, but may not guarantee uniqueness. KEYCOLS may be used to define the key.
2019-01-23 22:44:42 INFO OGG-15130 No key found for table SCOTT.BOOKS. All viable columns will be logged.
2019-01-23 22:44:42 INFO OGG-15135 TRANDATA for instantiation CSN has been added on table SCOTT.BOOKS
GGSCI (server01.localdomain as gguser@UPGR) 10> info trandata scott.books
Logging of supplemental redo log data is enabled for table SCOTT.BOOKS.
Columns supplementally logged for table SCOTT.BOOKS: "AUTHOR", "ID", "LANGUAGE_ID", "TITLE", "YEAR_PUBLISHED".
Prepared CSN for table SCOTT.BOOKS: 978016
This is the DDL of scott.books from Oracle SQL Developer:
CREATE TABLE "SCOTT"."BOOKS"
("ID" NUMBER(*,0),
"TITLE" VARCHAR2(26 BYTE),
"AUTHOR" VARCHAR2(23 BYTE),
"YEAR_PUBLISHED" NUMBER(*,0),
"LANGUAGE_ID" NUMBER(*,0),
CONSTRAINT "PK_ID" PRIMARY KEY ("ID")
)
Can someone point me in the right direction?
Thanks
From dba_constraints:
SQL> SELECT
2 constraint_name,
3 constraint_type,
4 status,
5 validated
6 FROM
7 dba_constraints
8 WHERE
9 owner = 'SCOTT'
10 AND table_name = 'BOOKS';
CONSTRAINT_NAME C STATUS VALIDATED
------------------------------ - -------- -------------
PK_ID P ENABLED VALIDATED
user168186
Jan 23, 2019, 09:59 PM
• Last activity: Jan 25, 2019, 07:08 AM
0
votes
1
answers
385
views
Golden Gate activity auditing with SQL?
In a scenario where Oracle GoldenGate is used to replicate a primary site with an Oracle RAC database to a secondary site (and active/active back), we suspect unexpected changes coming from the supposedly unused secondary site.
The issue is a bit hard to debug as we do not have direct DBA access. Is there an easy way, with unprivileged SQL access on the primary side, to see whether any changes are being received from the other database?
Can I see counters or timestamps of OGG activity which would help me track down the DML that was made?
As I understand it, I could see changes from the OGG user by setting up triggers or auditing; however, neither is available in this situation.
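One very rough, unprivileged indicator (it cannot attribute changes to the OGG user, only show that DML happened) is the DML monitoring view visible for any table you can already see:

```sql
-- DML counts accumulated since the last statistics collection on each table.
-- APP_SCHEMA is a placeholder for the replicated schema you can query.
SELECT table_owner, table_name, inserts, updates, deletes, timestamp
FROM   all_tab_modifications
WHERE  table_owner = 'APP_SCHEMA'
ORDER  BY timestamp DESC NULLS LAST;
```

Comparing these counters against what the application itself is known to have done can hint at writes arriving from elsewhere, but it is no substitute for real auditing.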
eckes
(1456 rep)
Oct 26, 2017, 08:28 AM
• Last activity: Aug 1, 2018, 02:22 PM
2
votes
1
answers
1108
views
Migration of table from MySQL to Oracle using GoldenGate
I have a table of size 120 GB in MySQL 5.7 which I need to migrate to Oracle 12c. I am using Oracle GoldenGate for the initial load of the table.
I tried an initial load of the full table, but it consumed all the RAM and I had to cancel it.
I have even tried running the initial load over primary key ranges, 1/20th of the table at a time, but it abends after a certain time with:
> ERROR OGG-01233 Send TCP params error: TCP/IP error 110 (Connection timed out)
Is there any better approach to get this completed?
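One approach sometimes used for very large initial loads (a sketch with hypothetical names; @RANGE and the RMTHOST TCP options are standard parameters, but the values here are guesses) is to run several initial-load extracts in parallel, each taking one slice of the table, with enlarged TCP buffers:

```
EXTRACT iload1
RMTHOST target_host, MGRPORT 7809, TCPBUFSIZE 1000000, TCPFLUSHBYTES 1000000
RMTTASK replicat, GROUP rload1
-- slice 1 of 4, split on the primary key column
TABLE mydb.bigtable, FILTER (@RANGE (1, 4, id));
```

Three more extracts using @RANGE (2, 4, id) through @RANGE (4, 4, id) would cover the rest; each task then stays small enough that a single timeout does not throw away the whole load.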
samir sahu
(21 rep)
Jul 30, 2018, 09:50 PM
• Last activity: Jul 31, 2018, 02:52 PM