Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
2
votes
1
answers
179
views
Can data streams from Informatica to SQL Server be multi-threaded?
Suppose we have a database server running SQL Server, and we are moving data from another database server, through our Informatica server, onto that SQL Server machine. The SQL Server machine has four processors. Is it possible to force the connections to be multi-threaded so the data is sent more quickly? How can this be done?
Right now, only one of the processors is being used on the SQL Server database server.
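For context, the instance-wide cap that SQL Server puts on per-query parallelism can be inspected and changed with `sp_configure`. This is a minimal sketch (the setting names are standard; the value 4 is just illustrative for a four-processor box) — note that it governs parallelism *within* a query, not across client connections, so a single Informatica session is still one stream:

```sql
-- Show the current instance-wide parallelism cap (0 = use all schedulers).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism';

-- Illustrative: allow up to 4 parallel workers per query.
EXEC sp_configure 'max degree of parallelism', 4;
RECONFIGURE;
```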
JustBeingHelpful
(2116 rep)
Jul 21, 2015, 08:30 PM
• Last activity: Nov 6, 2024, 10:14 PM
0
votes
1
answers
1181
views
Turn on multi-threading in Oracle for ETL purposes
As you can see from this question, I was able to configure multi-threading in SQL Server and build a workflow in Informatica to multi-thread a job to move data faster into SQL Server.
https://dba.stackexchange.com/questions/107966/multi-thread-informatica-connections-to-use-different-processors-on-target-datab
Now I'd like to do the same thing in Oracle. Does anyone know the feature in Oracle that's comparable to the "Max Degree of Parallelism" setting in SQL Server? I'd like to move data on each core of the server in parallel.
I'm strictly dealing with INSERT statements, so DML.
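For reference, the closest Oracle counterpart for INSERT workloads is parallel DML, which must be enabled per session. A hedged sketch (table names and the degree of 4 are illustrative, not from the original setup):

```sql
-- Parallel DML is off by default and must be enabled in the session.
ALTER SESSION ENABLE PARALLEL DML;

-- Illustrative: direct-path, degree-4 parallel insert into a staging table.
INSERT /*+ APPEND PARALLEL(stage_tbl, 4) */ INTO stage_tbl
SELECT * FROM source_tbl;

COMMIT;  -- direct-path loaded rows are not visible until commit
```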
JustBeingHelpful
(2116 rep)
Jan 11, 2016, 06:46 PM
• Last activity: Sep 20, 2024, 09:06 PM
0
votes
2
answers
728
views
What causes disk space growth from Informatica load onto SQL Server?
We've been dealing with huge disk growth on SQL Server from Informatica. After the load, the database grows to 2.4 TB; after a database shrink, it drops to 1.05 TB. What is likely causing this? What settings can we check in Informatica and/or SQL Server on our next run to troubleshoot this, even by trial and error?
EDIT:
There are two ways to move data using an Informatica mapping: running straight SQL via an SQL override in the SQL Transformation type, or using the built-in data streams of Informatica's out-of-the-box functionality. In this case, we are using data streams. When data streams are used, straight SQL is still executed, but Informatica generates the SQL code behind the scenes. We are loading in 1,000,000-record increments. I thought perhaps Informatica might be telling SQL Server to allocate disk space as it loads, but I'm not even sure what command(s) to look for if it did.
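One place to start the guess/check (a hedged sketch using standard SQL Server catalog views; nothing here is specific to the original environment) is comparing allocated versus used space per file, and checking the recovery model — a fully logged million-row batch load can balloon the log and force file growth that a shrink later reclaims:

```sql
-- Allocated vs. used space per file of the current database
-- (sizes are stored in 8 KB pages).
SELECT name,
       type_desc,
       size * 8 / 1024 AS allocated_mb,
       FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024 AS used_mb,
       growth
FROM sys.database_files;

-- FULL recovery logs every inserted row until the next log backup;
-- SIMPLE or BULK_LOGGED lets log space be reused between batches.
SELECT name, recovery_model_desc FROM sys.databases;
```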
JustBeingHelpful
(2116 rep)
Mar 24, 2015, 08:43 PM
• Last activity: Mar 24, 2023, 12:04 PM
1
votes
1
answers
117
views
Learning resources for Data Engineering
I recently started my career as a Data Engineer, and every day I explore the dynamics of the field and learn something new. I want to ask if there are any dedicated websites, courses, or learning paths available online for this relatively new field.
Note: Let me know if I have asked this question on the wrong site, and point me to the correct platform for it.
curious_nustian
(119 rep)
Nov 10, 2021, 09:27 AM
• Last activity: Nov 10, 2021, 11:40 AM
1
votes
0
answers
668
views
Informatica MD5 function vs SQL Hashbytes MD5 function
I am hoping someone has some crossover experience here. I am attempting to use the HASHBYTES function in SQL Server to match a value that is being generated by the MD5 function in Informatica.
Has anyone ever dug into this issue? My first assumption is that since the systems are different, the actual calculation of the checksum might have minute differences. It could also be how I am concatenating the values prior to applying the HASHBYTES MD5 function.
I am at a loss, and am beginning to think it won't ever be possible to match these values (as many have said online). Any help would be greatly appreciated.
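A common culprit in mismatches like this (a hedged sketch, not a diagnosis of this specific case) is character encoding: HASHBYTES hashes the raw bytes it receives, so the same text as VARCHAR (single-byte) and NVARCHAR (UTF-16) produces different MD5s, and undelimited concatenation can make distinct inputs collide:

```sql
-- Same text, different raw bytes, different MD5s.
SELECT HASHBYTES('MD5', CAST('abc' AS varchar(10)))  AS md5_varchar,
       HASHBYTES('MD5', CAST('abc' AS nvarchar(10))) AS md5_nvarchar;

-- Without a delimiter, 'ab'+'c' and 'a'+'bc' hash identically;
-- a separator keeps column boundaries distinct.
SELECT HASHBYTES('MD5', 'ab' + '|' + 'c') AS md5_delimited;
```

Matching the Informatica side usually comes down to agreeing on the codepage and the concatenation delimiter before hashing.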
TestMcTesterson
(111 rep)
Aug 23, 2019, 08:05 PM
• Last activity: Aug 23, 2019, 08:27 PM
1
votes
2
answers
14316
views
procedure for postgres to create table if not exist
I want to create a table if it doesn't exist. I tried the code below:
create or replace function create_table() returns void as
$$
begin
if not exists(select *
from pg_tables
where schemaname = 'Public'
and tablename = 'test')
then
create table test
(
the_id int not null,
name text
);
end if;
end;
$$
language 'plpgsql';
When I execute this function the first time:
select create_table();
the table gets created, but when I execute it again I get the error below:
ERROR: relation "test" already exists
CONTEXT: SQL statement "create table test
(
the_id int not null,
name text
)"
PL/pgSQL function create_table() line 8 at SQL statement
********** Error **********
ERROR: relation "test" already exists
SQL state: 42P07
Context: SQL statement "create table test
(
the_id int not null,
name text
)"
PL/pgSQL function create_table() line 8 at SQL statement
How can I achieve this? Also, I want to call this function from the Informatica pre-SQL target session property, so I want to call it with the table name as a parameter.
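For what it's worth, PostgreSQL 9.1+ supports CREATE TABLE IF NOT EXISTS directly, and a table name can be passed into a PL/pgSQL function via format(). A hedged sketch (the function signature and columns are illustrative, mirroring the question's table):

```sql
-- Simplest form: no function needed on PostgreSQL 9.1 or later.
CREATE TABLE IF NOT EXISTS test (
    the_id int NOT NULL,
    name   text
);

-- Parameterized variant callable from a pre-SQL step;
-- %I quotes the identifier safely against injection.
CREATE OR REPLACE FUNCTION create_table(p_table text) RETURNS void AS
$$
BEGIN
    EXECUTE format(
        'CREATE TABLE IF NOT EXISTS %I (the_id int NOT NULL, name text)',
        p_table);
END;
$$ LANGUAGE plpgsql;

-- Usage: SELECT create_table('test');
```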
user3253227
(11 rep)
Jan 17, 2018, 08:05 AM
• Last activity: Jan 17, 2018, 07:50 PM
2
votes
1
answers
672
views
Data Warehouse - DB2 to SQL Server: how & what impacts on ETL?
I'm working towards migrating an IBM DB2 database to SQL Server, however the DB2 database sits in a Data Warehouse environment.
As I am very new to Data Warehousing, DB2 and DB Migration between different types of databases, I would like to ask about how to best approach this and understand what components are involved (listing or linking elsewhere is fine - happy to read a bunch of things or learn something new).
The context:
- The ETL tool in use, and to be kept in use, is Informatica
- The only application that queries the DB is Cognos
- There are 4 DB2 DBs in use on 2 (non-windows) hosts, one of these is 'H1' which has a 'DB2-DW-PROD' DB used as part of the Data Warehouse
- I am interested in migrating the 'DB2-DW-PROD' DB to some space I have on an existing (Windows) SQL Server host, 'H2' under the name 'SQL-DW-PROD'. This host currently has several DBs on it for other purposes (in one instance) and can have SSRS/SSIS or other services/resources installed or configured if needed.
- I would like to clean up the poorly maintained DB a little bit during the migration, if possible, based off what has not been used/updated in the past couple of years.
What I would like to know, directly or indirectly:
- Should I create a new instance on H2 or is a single new database enough?
- Does the new instance/DB need particular resources? If so, where from?
- Should I use a particular Microsoft tool or use some type of import/export to get the data I need? (Is Informatica involved in this process?)
- Someone mentioned 'metadata tables' for Informatica and/or Cognos, should I be wary of something?
- Another mentioned maintaining sequence values. How is this best achieved? Is it needed?
- Any changes on the Informatica side are not my personal concern but the work of another; as such, is there anything I should provide this person (apart from the address of the new DB)?
I'm very new in this field, so anything dumbed down is greatly appreciated. That being said, anything complicated but necessary or good to know is as well. It's a steep learning curve for me but I'm kind of stumped right now and need a hand.
I have no control over the process - Informatica and Cognos must be kept as they are. Also, at this stage, I am not able to view the DB2 DB but I know it is not too large. At the very, very most, assume it's 1 TB for the purpose of this question.
At the moment I am simply trying to understand the process I will follow and what I will need to look out for when the time comes.
EDIT: I have received plenty of feedback on the instance vs. database part of the question, which I greatly appreciate. However, I am still confused as to how I should approach migrating the data across, i.e. 'how, or with what, should I migrate such that I have everything I need to support Informatica and Cognos?'
The most pressing issue is that I don't understand any of Informatica's/Cognos's dependencies or know how DB2 works. I am not confident that simply copying user table data across is enough and would really like some confirmation or pointers to know exactly what to do.
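On the sequence-values bullet specifically: SQL Server 2012+ has native SEQUENCE objects, so one approach (a sketch under assumptions — the sequence name and captured value are illustrative) is to read each DB2 sequence's current position and recreate it on SQL Server with a matching START WITH, so numbering continues where DB2 left off:

```sql
-- DB2 side (run there): see where the sequence currently is.
-- SELECT NEXT VALUE FOR my_seq FROM SYSIBM.SYSDUMMY1;

-- SQL Server side: recreate the sequence to continue the numbering.
CREATE SEQUENCE dbo.my_seq
    AS bigint
    START WITH 100001   -- illustrative: value captured from DB2
    INCREMENT BY 1;

SELECT NEXT VALUE FOR dbo.my_seq;  -- picks up from the migrated value
```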
SillyGhost
(23 rep)
May 11, 2017, 08:40 AM
• Last activity: Oct 9, 2017, 06:21 PM
2
votes
2
answers
11958
views
How to resolve red invalid icon for Informatica mapping
I'm having a difficult time resolving a red "invalid" icon within Informatica. I've tried every possible scenario for checking in / checking out this object (a mapping). It currently does not have any dependencies. How can I fix this?

JustBeingHelpful
(2116 rep)
Jan 30, 2014, 12:32 AM
• Last activity: Jan 6, 2017, 09:09 AM
2
votes
2
answers
3276
views
How to increase ETL performance in Informatica for Netezza as a source and SQL Server as a target?
What settings or configuration on the Informatica server, in the Informatica software itself, or on the database servers can be changed to increase Informatica ETL throughput? What are some benchmarks we can set to troubleshoot performance? We are specifically using Netezza as a source and SQL Server as a target.
Please exclude multi-threading and Informatica partitioning from this question.
This is what we've done in the past:
- restart servers every so often
- remove indexes on target tables in SQL Server before ETL load
- increase commit level
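As a sketch of the second bullet (table and index names are illustrative, not from the original environment), disabling nonclustered indexes before the load and rebuilding afterwards avoids per-row index maintenance without dropping the index definitions:

```sql
-- Before the load: disable nonclustered indexes on the target table.
ALTER INDEX ix_target_col ON dbo.target_tbl DISABLE;

-- ... Informatica session loads dbo.target_tbl here ...

-- After the load: one sequential rebuild instead of per-row maintenance.
ALTER INDEX ix_target_col ON dbo.target_tbl REBUILD;
```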
JustBeingHelpful
(2116 rep)
Aug 19, 2015, 04:33 PM
• Last activity: Mar 13, 2016, 10:50 AM
3
votes
1
answers
17459
views
Why do I get the incorrect error "ORA-01775: looping chain of synonyms" when the base table does not exist?
I use Informatica to manage some ETL processes that load data into an Oracle 9i data warehouse.
Today I got the below error in the Informatica session logs:
Message: Database driver error...
CMN_1022 [DELETE FROM SOME_TABLE
WHERE PERIOD_NAME = 'OCT-12'
ORA-01775: looping chain of synonyms
Database driver error...
Function Name : executeDirect
SQL Stmt : DELETE FROM SOME_TABLE
WHERE PERIOD_NAME = 'OCT-12'
Oracle Fatal Error
Database driver error...
Function Name : ExecuteDirect
Oracle Fatal Error
]
It turned out to be a typographical error: the table name was misspelt. The Oracle error obviously sent us in the wrong direction, so I just wanted to understand why this error was shown when there was no issue with synonyms.
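As background for why the misleading error appears (hedged — this is the usual mechanism, not confirmed for this exact database): when a named table does not exist, Oracle falls back to synonym resolution, and a leftover PUBLIC synonym whose target was dropped (or that resolves back to itself) makes the resolution loop, surfacing ORA-01775 instead of the expected ORA-00942. Dangling synonyms can be listed like this:

```sql
-- Synonyms whose target object no longer exists; a PUBLIC synonym
-- left behind after its table was dropped is the classic culprit.
SELECT s.owner, s.synonym_name, s.table_owner, s.table_name
FROM   all_synonyms s
LEFT JOIN all_objects o
       ON  o.owner = s.table_owner
       AND o.object_name = s.table_name
WHERE  o.object_name IS NULL
  AND  s.synonym_name = 'SOME_TABLE';  -- illustrative name from the log
```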
Kent Pawar
(283 rep)
Jan 3, 2013, 08:44 AM
• Last activity: Oct 2, 2015, 02:02 PM
1
votes
1
answers
1694
views
Multi-thread Informatica connections to use different processors on target database server
See this question for reference.
https://dba.stackexchange.com/questions/107680/can-data-streams-from-informatica-to-sql-server-be-multi-threaded
I've got most of my problem figured out. I've split up the one large table into four smaller tables. I'm now moving the data from the four source tables to four target tables. Each in their own mapping. In Workflow Manager, each mapping is a separate session, inside of its own workflow. And then I have a master workflow with four command tasks calling the four workflows.
How do I isolate the connections so that each command task gets its own connection to the target SQL Server database, with each running on a different processor?
EDIT:
Informatica also has functionality called "Partitioning", which does exactly what I did here: it splits the data up however you want. But some Informatica server-level setup is required for this to work.

JustBeingHelpful
(2116 rep)
Jul 24, 2015, 02:59 AM
• Last activity: Aug 20, 2015, 08:59 PM
0
votes
1
answers
1442
views
Why does Informatica hang and say "Not Responding" when connecting to ODBC Data Source
Informatica just hangs when trying to open this dialog. The ODBC connection exists in Control Panel > Administrative Tools > ODBC Data Sources.
Source Analyzer > Sources (menu) > Import from Database

JustBeingHelpful
(2116 rep)
Apr 13, 2015, 04:40 PM
• Last activity: Apr 13, 2015, 04:43 PM
0
votes
1
answers
1035
views
How do I debug an ETL process that is unable to commit any records?
I used a DB link to connect to another DB and delete a couple of records. Later I ran an Informatica Workflow (ETL tool) to load data into that database using a DB connection that connects directly to it.
The workflow hasn't committed any records for two hours (it should run for only 30 minutes or so), so I am guessing it is because I was not able to run COMMIT over the DB link. *The ETL tool's logs don't provide any error or debug information at this point.*
Any ideas on how I could debug this? I ran another ETL process to update a table and then ran COMMIT, but even after that the first process keeps running without committing any records. I am not a DBA expert, so I am probably misinterpreting the database behavior; I would really appreciate any suggestions. Thanks!
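One way to check whether the earlier DB-link session is still holding an open transaction that blocks the workflow (a hedged sketch against standard Oracle dynamic views; the `blocking_session` column assumes a reasonably recent Oracle version, and SELECT privileges on `v$` views are required):

```sql
-- Sessions that are blocked, and which session is blocking them.
SELECT sid, serial#, username, blocking_session, event, seconds_in_wait
FROM   v$session
WHERE  blocking_session IS NOT NULL;

-- Open (uncommitted) transactions; an idle one left behind by the
-- DB-link session will sit here until it commits or rolls back.
SELECT s.sid, s.username, t.start_time
FROM   v$transaction t
JOIN   v$session     s ON s.taddr = t.addr;
```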
Kent Pawar
(283 rep)
Feb 12, 2013, 05:51 PM
• Last activity: Feb 12, 2013, 06:33 PM
2
votes
1
answers
2848
views
Informatica Source Qualifier not working from Windows ODBC data source (system DSN) from Excel xlsx file
**Reference:**
https://community.informatica.com/message/62128
**Windows ODBC Workflow:**
Windows > Control Panel > Administrative Tools > Data Sources (ODBC) > click "System DSN" tab > click "Add" > choose "Microsoft Excel Driver (*.xlsx)" >
Data Source Name: "aaaaaaaaa"
Version: Excel 12.0
click "Select Workbook" button > choose xlsx file
**Informatica PowerCenter Designer Workflow:**
Sources > Import from Database >
ODBC Data Source: "aaaaaaaaa"
click "Connect" button > the button's name changes to "Re-Connect"
**Error:**
It doesn't seem to find the data in the worksheet. I have 3 columns with 10 rows of data.

JustBeingHelpful
(2116 rep)
May 30, 2012, 05:14 PM
• Last activity: Jun 21, 2012, 06:08 AM
Showing page 1 of 14 total questions