Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
0
votes
0
answers
14
views
Linked Server failure on clustered SQL Server
I have two clustered Microsoft SQL Servers (`SQLA` & `SQLB`) installed, and confirmed that both of the servers have an ODBC connector for a local PostgreSQL server.
From that ODBC connection, I have a linked server created for use in some stored procedures that fails at least once a fortnight with this error message:
> Cannot initialize the data source object of OLE DB provider "MSDASQL" for linked server "POSTGRESP23"
When troubleshooting the issues, the ODBC connector on both `SQLA` and `SQLB` tests successfully from the System DSN menu on the server; the error originates from the linked server.
Linked Server settings:
```sql
EXEC master.dbo.sp_addlinkedserver @server = N'POSTGRESP23', @srvproduct=N'', @provider=N'MSDASQL', @datasrc=N'PSQLPROD'
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'POSTGRESP23',@useself=N'False',@locallogin=NULL,@rmtuser=N'postgres',@rmtpassword='########'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'collation compatible', @optvalue=N'false'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'data access', @optvalue=N'true'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'dist', @optvalue=N'false'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'pub', @optvalue=N'false'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'rpc', @optvalue=N'false'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'rpc out', @optvalue=N'false'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'sub', @optvalue=N'false'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'connect timeout', @optvalue=N'0'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'collation name', @optvalue=null
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'lazy schema validation', @optvalue=N'false'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'query timeout', @optvalue=N'0'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'use remote collation', @optvalue=N'true'
EXEC master.dbo.sp_serveroption @server=N'POSTGRESP23', @optname=N'remote proc transaction promotion', @optvalue=N'true'
```
Currently, to fix this quickly and keep downtime low, I simply delete the linked server and recreate it, pointing it to the same ODBC object. However, this is not a sustainable process.
Can anyone suggest where to look when troubleshooting? I'm at a loss.
**Additional Information**
`psqlODBC_X64` is installed on both machines already, from https://odbc.postgresql.org/
System DSN settings:
NathanM
Jul 30, 2025, 10:48 PM
• Last activity: Aug 4, 2025, 03:37 AM
2
votes
1
answers
465
views
Cosmos DB ODBC driver installation problem
We are trying to install Cosmos DB ODBC Driver following the instructions provided [in this Microsoft Azure resource](https://learn.microsoft.com/en-us/azure/cosmos-db/odbc-driver) and are running into the following error:
>The setup routines for Microsoft DocumentDB ODBC Driver ODBC driver could not be loaded due to a system error code 126: The specified module could not be found. (C:\Program Files\Microsoft DocumentDB ODBC Driver\lib\DocumentDBODBC_sb64.dll).
We've tried unsuccessfully on Windows Server 2008 R2 and Windows Server 2012 R2. We have it working on several Windows 10 desktops, just not on server OSes, unfortunately. We tried several things, like installing using Run as administrator, from the command line, with both the 32-bit and 64-bit versions installed, and then with one or the other. We opened the location of the DLL to 'Everyone/Full control'. We checked that the Visual C++ redistributables are installed, although we're not sure which specific ones are needed.
Any advice would be greatly appreciated.
Ben Saunders
(21 rep)
Jan 11, 2018, 01:56 PM
• Last activity: Jul 26, 2025, 06:08 AM
0
votes
1
answers
35
views
Clickhouse - Oracle ODBC connection error
I am trying to create a connection between my Oracle and ClickHouse databases, so I can query Oracle through ClickHouse like this: `SELECT * FROM odbc('DSN=OracleODBC-21', 'sys', 'test')`.
I have successfully installed unixODBC, Oracle Instant Client, and Oracle ODBC for the client.
Also, I configured my `.odbc.ini` and `.ini` files, so I can access Oracle:
```
[oracle@host ~]$ isql -v OracleODBC-21
+---------------------------------------+
| Connected!                            |
...
SQL> select * from sys.test;
+----+------+
| ID | DATA |
+----+------+
|  0 |  123 |
+----+------+
```
User `clickhouse` can also do this, but with some environment variables set:
```
[oracle@host ~]$ sudo -u clickhouse bash -c "export LD_LIBRARY_PATH=/opt/oracle/instantclient_21_19; isql -v OracleODBC-21"
+---------------------------------------+
| Connected!                            |
...
```
But when I try to query Oracle in ClickHouse:
```
host :) select * from odbc('DSN=OracleODBC-21','sys','test');

SELECT *
FROM odbc('DSN=OracleODBC-21', 'sys', 'test')

Query id: d263cc54-bd51-4a97-94c0-085177149947

Elapsed: 9.529 sec.

Received exception from server (version 25.6.2):
Code: 86. DB::Exception: Received from localhost:9000. DB::HTTPException. DB::HTTPException: Received error from remote server http://127.0.0.1:9018/columns_info?use_connection_pooling=1&version=1&connection_string=DSN%3DOracleODBC-21&schema=sys&table=test&external_table_functions_use_nulls=1 . HTTP status code: 500 'Internal Server Error', body length: 267 bytes, body: 'Error getting columns from ODBC 'std::exception. Code: 1001, type: nanodbc::database_error, e.what() = contrib/nanodbc/nanodbc/nanodbc.cpp:1275: IM004: [unixODBC][Driver Manager]Driver's SQLAllocHandle on SQL_HANDLE_HENV failed (version 25.1.5.31 (official build))'
'. (RECEIVED_ERROR_FROM_REMOTE_IO_SERVER)
```
Will be grateful for any advice.
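Since `isql` succeeds for the `clickhouse` user only once `LD_LIBRARY_PATH` is exported, and the IM004 `SQLAllocHandle` failure is reported by the odbc-bridge, a likely suspect is that the `clickhouse-odbc-bridge` process starts without that variable. On Linux, a running process's environment can be inspected via `/proc/<pid>/environ` (NUL-separated); a small illustrative parser, with the pid lookup left out:

```python
def parse_environ(raw: bytes) -> dict:
    """Parse the NUL-separated contents of /proc/<pid>/environ into a dict."""
    env = {}
    for entry in raw.split(b"\x00"):
        if b"=" in entry:
            key, _, value = entry.partition(b"=")
            env[key.decode()] = value.decode()
    return env

# Usage against a real bridge process (pid found via e.g. pgrep odbc-bridge):
# with open(f"/proc/{pid}/environ", "rb") as f:
#     print(parse_environ(f.read()).get("LD_LIBRARY_PATH"))

sample = b"LD_LIBRARY_PATH=/opt/oracle/instantclient_21_19\x00PATH=/usr/bin\x00"
print(parse_environ(sample)["LD_LIBRARY_PATH"])
```

If the variable is missing there, setting it for the ClickHouse service itself (e.g. in its systemd unit) rather than in an interactive shell would be the thing to try.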
pashkin5000
(101 rep)
Jul 22, 2025, 05:58 AM
• Last activity: Jul 22, 2025, 06:14 PM
0
votes
1
answers
148
views
Speeding up linking Oracle tables in Microsoft Access
Many people in my office use Oracle tables by linking them in Access (either 2010 or 2016) databases.
They always complain about the fact that, when Access is about to present them the list of tables they have access to in a specific Oracle instance for selection, they have to wait 10 to 15 minutes for the selection window to pop up.
The reason seems to be the huge number of accessible tables, in turn related to the huge number of accessible schemas, on that particular instance, so that the resulting list contains tens of thousands of table names.
There was a similar problem with some DB2 databases, but it was solved by adding a 'SCHEMALIST' option in the ODBC configuration (db2cli.ini) for those databases, to limit the number of schemas visible to the connection.
I had a look at Oracle ODBC documentation, but couldn't find a similar option, so I'm stuck.
Does somebody have a trick to share for filtering the schemas/tables visible to an ODBC connection to Oracle, or otherwise speeding up linking those tables?
aatoma
(1 rep)
Apr 9, 2020, 11:24 AM
• Last activity: Jul 19, 2025, 12:08 AM
0
votes
1
answers
197
views
What are the pitfalls of using "Preserve" for Cursor Behavior in Informix ODBC DSN?
I am converting an Informix SE to an IDS database. The programs I have use ODBC System DSNs to communicate with the database.
For SE, we use the sesoctcp protocol. The program code would open a Recordset from the DSN, loop through each record, save each record to a local Access DB (execute INSERT INTO), and then perform a MoveNext on the Recordset.
For IDS, we use the onsoctcp protocol. With the same code, the program throws an error on MoveNext.
I believe the Recordset is being closed... I added code to perform a MoveLast/MoveFirst on the Recordset before the loop, and the MoveNext no longer throws an error. This change would require me to update potentially 400 programs.
On the other hand, I can forgo the code changes and instead change the DSN Cursor Behavior to "1 - Preserve", and the MoveNext does not throw an error. My limited understanding is that if I leave it on "0 - Close", the driver decides when to close the recordset instead of letting the program decide.
The only documentation I could find for Cursor Behavior doesn't explain the values:
https://www.ibm.com/support/knowledgecenter/en/SSGU8G_11.70.0/com.ibm.odbc.doc/ids_odbc_069.htm
So... what are the pitfalls of using "Preserve" for Cursor Behavior in Informix ODBC DSN?

kttii
(101 rep)
Sep 5, 2017, 02:50 PM
• Last activity: Jun 21, 2025, 02:07 PM
0
votes
1
answers
203
views
Apache Ignite as a linked server in SQL Server?
I tried to set up a linked server via an ODBC connection in SQL Server. It seemed to connect, but showed all the caches as empty names in SSMS. Also, I couldn't get a select query to work.
Has anybody had any luck with doing this?
If so, how did you write the select statement?
W. Sprague
(1 rep)
Dec 14, 2018, 11:34 PM
• Last activity: Jun 18, 2025, 08:03 PM
0
votes
1
answers
5028
views
Excel VBA Connection String for ODBC System DSN
I am trying to create an Oracle connection string in Excel VBA. After lots of research, I have a working connection string:
```vba
Set Oracle_conn = New ADODB.Connection
Oracle_conn.Open "ODBC; Driver='Oracle in instantclient_19_9'; Data Source='Data Mart'; UserID=" & ID & "; Password=" & Password & ";"
```
However, this only works when my ODBC Data Source is a **User** DSN. When the ODBC Data Source is a **System** DSN, I end up with:
"[Oracle][ODBC][Ora]ORA-01017: invalid username/password; logon denied"
The reason I am using **System** DSN is because my code runs from a shared PC, so System DSN works for all users without each individual user needing to create a **User** DSN.
Any advice on how to tweak my code such that it will run with a **System** DSN would be greatly appreciated. Thank you.
Edit 1: System DSN connects successfully if I manually go to Data ribbon > Get Data > From Other Sources > From ODBC
Edit 2: I still haven't been able to figure out how to connect to a System DSN with VBA code. Maybe it's a bug with the ODBC driver? I realize this is a bit of an obscure question, but I'm still hopeful someone can help me connect. Thank you!
Andy
(1 rep)
Feb 15, 2023, 02:56 PM
• Last activity: May 26, 2025, 11:06 PM
1
votes
1
answers
272
views
What is MS Access ODBC SQL syntax for a table with AutoIncrement and Memo columns?
I am migrating a full framework .NET app to Core. It used ADOX to create an MS Access database and tables. It used OLE DB to populate data. In ADOX, the data type was `ADOX.DataTypeEnum.adInteger` and the syntax from C# was:
```csharp
fld.Properties["Autoincrement"].Value = true;
fld.Properties["Seed"].Value = 1;
fld.Properties["Increment"].Value = 1;
```
For memo, the ADOX data type was `ADOX.DataTypeEnum.adLongVarWChar`.
Since we can no longer use ADOX under .NET Core, I have to use SQL DDL through ODBC. When I attempt to create a table using syntax similar to
```sql
create table t (
    col1 int not null identity(1,1),
    col2 varchar(512)
)
```
I get a syntax error. When I try to adjust this for Access according to this and other related articles in the same documentation tree, I also get a syntax error.
```sql
create table t (
    col1 counter,
    col2 text
)
```
What is the correct syntax for these columns that is compatible with .NET Core and MS Access ODBC driver?
Gandon Jank
(37 rep)
Dec 28, 2020, 08:02 PM
• Last activity: May 14, 2025, 12:02 PM
0
votes
1
answers
281
views
What is an ODBK file and how to restore that?
I have recently come across a situation where I am required to restore an ODBK file. Basically, this is a database file of an Intergy server (an EHR platform). I don't know how to restore that file into a supported DB engine, or how I can view the data. Upon searching, I came to know that this is an OpenEdge DB file.
The server is running a Postgres engine as well but there is no table related to that EHR.
Hasnain
(1 rep)
Apr 13, 2020, 03:42 PM
• Last activity: May 13, 2025, 08:06 PM
0
votes
1
answers
776
views
ODBC Calling Fill - Unexpected closed connection after 2 hours
Using PostgreSQL 12.3 (Ubuntu 12.3-1.pgdg18.04+1) on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0, 64-bit
I use the PostgreSQL Unicode (x64) driver, version 13.00.00.00.
I have a query that is executed through an ODBC connection in a PowerShell 5.1 script. I use the Fill() method to retrieve about 3500 records daily. When the script works, it takes 2-5 minutes to execute and retrieve the data.
The problem is that the script "fails" approximately half of the time. When this occurs, the PowerShell script only stops after 2 hours and 30 seconds.
We double-checked the Postgres logs, and when this occurs, we see that the query always completed successfully within 6 minutes. I don't know what to look for. Any idea?
Below is the error we got:
```
Executed as user: NT Service\SQLSERVERAGENT. A job step
received an error at line 94 in a PowerShell script. The
corresponding line is '(New-Object system.Data.odbc.
odbcDataAdapter($cmd)).fill($ds) | out-null '. Correct
the script and reschedule the job. The error information
returned by PowerShell is: 'Exception calling "Fill" with
"1" argument(s): "The connection has been disabled." '.
Process Exit Code -1.
```
I'm not too familiar with PostgreSQL.
Thanks!
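While the root cause is unknown, one stopgap is to give the blocking Fill() call a client-side deadline, so the job fails fast instead of hanging for 2.5 hours. The pattern, sketched here in Python for illustration rather than PowerShell (a PowerShell equivalent would run the fetch in a background job and use `Wait-Job -Timeout`):

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def run_with_timeout(fn, seconds, *args):
    """Run fn(*args) in a worker thread; raise TimeoutError after `seconds`.
    Note: on timeout the worker thread is abandoned, not killed."""
    pool = ThreadPoolExecutor(max_workers=1)
    try:
        return pool.submit(fn, *args).result(timeout=seconds)
    finally:
        pool.shutdown(wait=False)  # don't block waiting on an abandoned worker

# Usage: wrap the slow fetch so the job aborts after, say, 10 minutes.
def fetch_rows():
    return 3500  # stands in for the adapter Fill() call

print(run_with_timeout(fetch_rows, 600))
```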
Actarus
(3 rep)
May 10, 2021, 09:27 PM
• Last activity: Apr 24, 2025, 01:01 AM
0
votes
1
answers
330
views
SSIS Package is writing corrupt data, but only when run as a Job
Our environment is the following:
SQL Server 2019 (15.0.4153.1) + SQL Integration Services (15.0.2000)
We have a DTSX package that uses an OLE DB connection manager to connect to a SQL Server to read data from, and a 32-bit ODBC connection manager using the Oracle Instant Client 21 to write the data to.
The package itself is absolutely stupid and simple and just reads data from a view and without any modification writes the data to the target table.
We have two variants of the DTSX package: the first (and original) package, which ran on a SQL Server 2012 for years without problems, has a transformation step included which casts the data read from the SQL Server to different types before writing it to the target table, e.g. DT_I4 was cast to DT_NUMERIC. It seems that this step was necessary back then, but can be omitted now.
Now the observation is that the package works perfectly fine when executed directly from Visual Studio (2019). The data is written correctly, here is a simplified example:
```
Given Name | Last Name
-----------------------
John       | Doe
William    | Smith
Claire     | Anderson
Jane       | Scott
```
But when run as a Job via the SQL Server Agent (details below), the result in the target table looks like this:
```
Given Name | Last Name
-----------------------
William    | Smit
Willi      | Smi
Wil        | Smith
William    | Sm
```
I have absolutely no clue how that can be even possible, because in my naive world (which usually does not involve Microsoft products, so I am absolutely not familiar with how SSIS actually works) rows are processed as rows and nothing like this can ever happen.
Not only are the rows mixed up, the contents of the rows are also corrupted and truncated arbitrarily.
Has anyone observed this problem and knows a solution?
The job we created consists of one single step which executes an Integration Service Package. We already tried a) reading the file directly from the file system and b) reading the file from a file storage on the SQL Server where we imported the DTSX. Both yield the same result. Integration Service Catalog is unfortunately not an option for us.
The package runs in the 32-bit environment due to the 32-bit ODBC driver we needed to use (because the Visual Studio 2019 where the package is built is 32-bit software).
The target server version configuration property in the package is set to SQL Server 2019 (was SQL Server 2012 before).
The most irritating things here are a) that the same DTSX Package ran for several years in an SQL Server 2012 without any problems and b) that the Package also runs from Visual Studio without any problems. Only when executed as a Job in the SQL Server, the data is completely corrupt.
*More Information:*
The data source for the Oracle Database is a system data source which is used by both the SQL Server and Visual Studio when executing. So I can rule out that different ODBC drivers are used.
Credentials for the database are stored within the package, and the protection mode is EncryptSensitiveInformationWithPassword. The question in the comment was which permissions the user has on the Oracle database. The user we use is the schema owner.
The user executing the package, though, is different: it is the logged-in user (local admin) on the machine when executing the package from Visual Studio, whereas when the package is executed by the SQL Server Agent, I suspect the SYSTEM user is doing that.
UniversE
(126 rep)
Jun 1, 2022, 10:10 AM
• Last activity: Apr 20, 2025, 11:04 AM
1
votes
1
answers
2016
views
MS Access - Creating a report based on a pass-through query that requires a parameter
I am attempting to make a report in Access that uses parameters as well as a pass-through query.
I'm open to using a temporary table but the guides I can find using temporary tables all use an Insert query which doesn't work with my pass-through query.
I see a lot of mention of creating a recordset from the pass-through query and then copying the recordset to a temporary table, but have not had luck locating instructions on how to do that.
Some background: I have a PostgreSQL server that is aggregating multiple databases via a foreign data wrapper. Across those databases I have multiple table names that are reused.
I can do a normal pass-through query using the front-end query properties, but this doesn't handle parameters for the pass-through query.
I can build a form using ADO objects by setting the form's recordset equal to a recordset created from a pass-through query (`Set Me.Recordset = MyRecordset`). This allows me to create the query string dynamically, so I can use parameters! One step closer, but not a report.
Reports do not support setting themselves to a recordset the way forms do, so I'm back to the drawing board.
Any suggestion on how to accomplish my goal? Should I give up and have Postgres store the results in a table, link that table, and call it a day?
BGCDC
(167 rep)
Sep 24, 2018, 08:20 PM
• Last activity: Apr 18, 2025, 08:02 AM
0
votes
1
answers
1935
views
Error "[Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified" when importing Excel data to SQL Server
I work on SQL Server 2017, and I need to import data from Excel 2016 to SQL Server 2017. I am using a Python script to do that.
I created an ODBC data source named `Testserver` and tested it successfully. The path `G:\ImportExportExcel` has full-control permissions for ALL APPLICATION PACKAGES and Everyone.
My instance name is AHMEDSALAHSQL, my PC name is DESKTOP-L558MLK, named pipes are enabled, and the instance allows remote connections.
When I run the script below:
```sql
declare @ImportPath NVARCHAR(MAX)='G:\ImportExportExcel'
declare @DBConnectionString NVARCHAR(MAX) = 'dsn=Testserver;Uid=sa;Pwd=321'
declare @ImportAll BIT=0
declare @CombineTarget BIT=0
declare @ExcelFileName NVARCHAR(200)='dbo.studentsdata'
declare @ExcelSheetName NVARCHAR(50)='students2'

SELECT @ImportPath = CASE WHEN RIGHT(@ImportPath,1) = '\' THEN @ImportPath ELSE CONCAT(@ImportPath,'\') END
DECLARE @Serv NVARCHAR(200) = CONCAT(CHAR(39),CHAR(39),@@SERVERNAME,CHAR(39),CHAR(39))

DECLARE @ValidPath TABLE (ValidPathCheck BIT)
INSERT @ValidPath
EXEC sp_execute_external_script
 @language =N'Python',
 @script=N'
import os
import pandas as pd
d = os.path.isdir(ImportFilePath)
OutputDataSet = pd.DataFrame([d],columns=["Filename"])'
,@params = N'@ImportFilePath NVARCHAR(MAX)'
,@ImportFilePath = @ImportPath

DECLARE @PythonScript NVARCHAR(MAX) =CONCAT('
import pandas as pd
import os
import glob
from revoscalepy import RxSqlServerData, rx_data_step
sqlConnString = "Driver=Testserver;Server=Serv; ',@DBConnectionString,'"
Filefolderepath = ImportFilePath+"*.xlsx"
if ImportAll ==0:
    Filename =ImportFilePath+ExcelFileName+".xlsx"
    exists = os.path.isfile(Filename)
    if exists and ExcelSheetName in pd.ExcelFile(Filename).sheet_names:
        Output = pd.read_excel(Filename, sheetname=ExcelSheetName, na_filter=False).astype(str)
        if not Output.empty:
            sqlDS = RxSqlServerData(connection_string = sqlConnString,table = "".join(fl for fl in ExcelFileName if fl.isalnum())+"_"+"".join(sh for sh in ExcelSheetName if sh.isalnum()))
            rx_data_step(input_data = Output, output_file = sqlDS,overwrite = True)
    else:
        print("Invalid Excel file or sheet name")')

EXEC sp_execute_external_script
 @language = N'Python'
,@script = @PythonScript
,@params = N'@ImportFilePath NVARCHAR(MAX),@ImportAll BIT,@CombineTarget BIT,@ExcelFileName NVARCHAR(200),@ExcelSheetName NVARCHAR(50),@Serv NVARCHAR(200)'
,@ImportFilePath = @ImportPath
,@ImportAll = @ImportAll
,@CombineTarget = @CombineTarget
,@ExcelFileName = @ExcelFileName
,@ExcelSheetName = @ExcelSheetName
,@Serv = @Serv
```
I get this error when I run the query:
```
Msg 39004, Level 16, State 20, Line 0
A 'Python' script error occurred during execution of 'sp_execute_external_script' with HRESULT 0x80004004.
Msg 39019, Level 16, State 2, Line 0
An external script error occurred:
[Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
Error in execution. Check the output for more information.
DataStep error: [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
```
**So can anyone help me solve this issue?**
I added the ODBC connection on my PC and tested it successfully:
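One thing worth checking in the script above: the embedded connection string sets `Driver=Testserver`, but `Testserver` is the name of a DSN, not of an installed driver, and a `Driver=` keyword naming nothing the Driver Manager knows produces exactly "Data source name not found and no default driver specified". A small illustrative sketch of the two connection-string styles (the driver name below is an assumption; check what is actually installed):

```python
def build_conn_str(**parts):
    """Assemble an ODBC connection string from keyword/value pairs."""
    return ";".join(f"{key}={value}" for key, value in parts.items())

# DSN-based: the Driver Manager looks the name up among configured DSNs.
dsn_style = build_conn_str(DSN="Testserver", Uid="sa", Pwd="321")

# DSN-less: the driver is named explicitly, so no DSN lookup is involved.
driver_style = build_conn_str(
    Driver="{ODBC Driver 17 for SQL Server}",  # assumed installed driver name
    Server=r"DESKTOP-L558MLK\AHMEDSALAHSQL",
    Database="master",
    Uid="sa",
    Pwd="321",
)
print(dsn_style)
print(driver_style)
```

Also note that the external script runs under the SQL Server Launchpad service, not your interactive account, so a User DSN created for your own login (or created in the wrong 32/64-bit ODBC administrator) may not be visible to it.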

user3223372
(1 rep)
Apr 16, 2022, 02:56 AM
• Last activity: Apr 16, 2025, 09:02 PM
0
votes
0
answers
48
views
ODBC select * skips columns reading from sqlServer
I'm missing knowledge here; I only know Oracle.
The situation:
Oracle 12.1 (Linux) connects to SQL Server (Windows) using dg4odbc and the MS SQL Server ODBC driver for Linux. Reading from SQL Server via `select * from schema.table@dblink` works, basically.
The problem:
Every column that has a **non-null entry in column "domain_catalog"** in `INFORMATION_SCHEMA.columns` is missing!
E.g. `select * from INFORMATION_SCHEMA.columns where table_name = 'X'` yields all 242 columns, while `select * from X` delivers only ~64 columns. There seems to be no scheme to this; there are bigint columns with domain_catalog values and bigint columns without values in domain_catalog.
Now it could be the Oracle ODBC interface, I know. So I experimented with Microsoft's sqlcmd on Linux, versions 11 and 18, each with its respective ODBC driver, of course.
And, for example, `exec sp_columns BUV_BUCHUNGSVARIANTE` yields only two(!) rows, while `select * from BUV_BUCHUNGSVARIANTE` displays all 10 columns.
Just MS tools + MS ODBC involved, no Oracle.
What I need to know:
1. Why are columns with domain_catalog not null missing from `select *`?
2. What is the meaning of "domain_catalog" on a column level?
3. Why would a developer design columns with and without a domain_catalog entry?
4. **What privilege(s) does the connecting SQL Server account need to access all columns of a table?**
tia
peter
dipr
(1 rep)
Apr 5, 2025, 07:57 PM
0
votes
1
answers
8532
views
Error Calling Python module function DbMySQLFE.connect
Migrating a database from SQL Server to MySQL using "MySQL Workbench 6.3 CE", with an ODBC data source as the default method to connect to the RDBMS.
-> Source Selection: Test Connection is successful
-> Target Selection: Test Connection gives "Error Calling Python module function DbMySQLFE.connect"
I have researched a lot, and a few folks blame the SSH connection over TCP. If that's the case, can I disable SSH over TCP?
I have attached the screenshot. Please respond, it's really urgent. Thanks.

Jr. dhh
(17 rep)
Apr 26, 2018, 01:26 PM
• Last activity: Mar 18, 2025, 02:01 AM
0
votes
0
answers
86
views
SSIS ODBC Destination to MariaDB is slow – How to enable bulk inserts?
I'm trying to copy data from SQL Server to MariaDB using SSIS.
- I’ve configured a DSN using the official MariaDB ODBC connector.
- My SSIS data flow consists of a simple source query (retrieving ~5,000 records) and an ODBC Destination.
- However, the write performance is extremely slow.
After enabling MariaDB's general log, I noticed that SSIS is generating one INSERT statement per row, significantly slowing down the process.
I tested an external third-party component, which was much faster. From the logs, it appears to use a bulk insert approach, like this:
```sql
INSERT INTO tablename (col_1, col_2, col_3, col_4)
VALUES
(7529766, NULL, 5, '2024-01-01 03:17:17'),
(7529767, NULL, 5, '2024-01-01 03:17:29'),
(7529768, NULL, 5, '2024-01-01 03:17:32')...
```
I’d like to achieve similar performance without relying on paid third-party components.
Is there a way to configure SSIS's ODBC Destination to perform batch inserts instead of row-by-row inserts?
Are there any MariaDB ODBC driver settings that can improve performance?
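The multi-row form above can be produced by batching rows client-side when the destination component won't do it; a rough sketch in Python (table and column names are illustrative, and real code should use parameter binding rather than the naive literal formatting shown here):

```python
def batch_insert_sql(table, columns, rows, batch_size=1000):
    """Yield multi-row INSERT statements with up to batch_size rows each."""
    def literal(value):
        if value is None:
            return "NULL"
        if isinstance(value, (int, float)):
            return str(value)
        return "'" + str(value).replace("'", "''") + "'"  # demo-only escaping

    head = f"INSERT INTO {table} ({', '.join(columns)}) VALUES "
    for start in range(0, len(rows), batch_size):
        chunk = rows[start:start + batch_size]
        values = ", ".join(
            "(" + ", ".join(literal(v) for v in row) + ")" for row in chunk
        )
        yield head + values

rows = [
    (7529766, None, 5, "2024-01-01 03:17:17"),
    (7529767, None, 5, "2024-01-01 03:17:29"),
]
for stmt in batch_insert_sql("tablename", ["col_1", "col_2", "col_3", "col_4"], rows):
    print(stmt)
```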
Mattia Nocerino
(512 rep)
Feb 26, 2025, 09:04 AM
• Last activity: Feb 28, 2025, 01:21 PM
0
votes
0
answers
30
views
Is there any way to hide tables when using a ODBC connection in Excel?
When using the ODBC data source, or the Microsoft Query Wizard to connect to a PostgreSQL database, I can see all tables; even though I only have permission to see several. The wizard even shows me the names of the table columns. Is there any way to prevent the tables that I don't have permission to read from being shown?
Evil Elf
(101 rep)
Feb 25, 2025, 01:26 PM
1
votes
1
answers
18258
views
unixODBC - [IM002][unixODBC][Driver Manager] Data source name not found and no default driver specified
Folks,
We installed unixODBC 2.3.7 on AIX 7.1 to connect to Oracle 11g. After setting up the odbc.ini and odbcinst.ini files and trying to connect with isql, we got the following error:
```
# isql -v ORACLE
[IM002][unixODBC][Driver Manager]Data source name not found and no default driver specified
[ISQL]ERROR: Could not SQLConnect
```
Before executing the isql command, we ran the following export commands:
```
export ODBCSYSINI=/usr/local/etc
export PATH=$PATH:/usr/local/bin
export TNS_ADMIN=/u01/app/oracle/product/11.2.0.4/client_1/network/admin
export ODBCINI=/usr/local/etc/odbc.ini
export LIBPATH=/u01/app/oracle/product/11.2.0.4/client_1/lib:/usr/local/lib:/u01/app/oracle/product/11.2.0.4/client_1
export LD_LIBRARY_PATH=/u01/app/oracle/product/11.2.0.4/client_1/lib:/usr/local/lib:/u01/app/oracle/product/11.2.0.4/client_1
```
And we ran the `odbcinst -j` command too:
```
unixODBC 2.3.7
DRIVERS............: /usr/local/etc/odbcinst.ini
SYSTEM DATA SOURCES: /usr/local/etc/odbc.ini
FILE DATA SOURCES..: /usr/local/etc/ODBCDataSources
USER DATA SOURCES..: /usr/local/etc/odbc.ini
SQLULEN Size.......: 4
SQLLEN Size........: 4
SQLSETPOSIROW Size.: 2
Illegal instruction(coredump)
```
Our /usr/local/etc/odbc.ini file is:
```ini
[ODBC Data Sources]
ORACLE = Oracle ODBC Test

[ORACLE]
Application Attributes = T
Attributes = W
BatchAutocommitMode = IfAllSuccessful
BindAsFLOAT = F
CloseCursor = F
DisableDPM = F
DisableMTS = T
Driver = Oracle11gdriver
DSN = ORACLE
EXECSchemaOpt =
EXECSyntax = T
Failover = T
FailoverDelay = 10
FailoverRetryCount = 10
FetchBufferSize = 64000
ForceWCHAR = F
Lobs = T
Longs = T
MaxLargeData = 0
MetadataIdDefault = F
QueryTimeout = T
ResultSets = T
ServerName = ORAODBC.customer.com
SQLGetData extensions = F
Translation DLL =
Translation Option = 0
DisableRULEHint = T
UserID = user
Password = pass
StatementCache=F
CacheBufferSize=20
UseOCIDescribeAny=F
```
And our /usr/local/etc/odbcinst.ini is:
```ini
[Oracle11gdriver]
Description = Oracle ODBC driver for Oracle 11g
Driver = /u01/app/oracle/product/11.2.0.4/client_1/lib/libsqora.so
```
Finally, our tnsnames.ora is:
```
ORAODBC =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = customer.com.br)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = ORAODBC.customer.com)
    )
  )
```
Any idea what is going on?
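One sanity check for IM002, sketched in Python (illustrative; the odbc.ini above is close enough to plain INI syntax for `configparser`): the name passed to `isql` must match a section header in the odbc.ini the Driver Manager actually reads, i.e. the one found via `ODBCSYSINI`/`ODBCINI`, and it should also be listed under `[ODBC Data Sources]`.

```python
import configparser

# Abbreviated contents of /usr/local/etc/odbc.ini from the question.
odbc_ini = """
[ODBC Data Sources]
ORACLE = Oracle ODBC Test

[ORACLE]
Driver = Oracle11gdriver
DSN = ORACLE
ServerName = ORAODBC.customer.com
"""

cfg = configparser.ConfigParser()
cfg.read_string(odbc_ini)

dsn = "ORACLE"  # the name passed to: isql -v ORACLE
print(dsn in cfg.sections())            # the [ORACLE] section exists
print(dsn in cfg["ODBC Data Sources"])  # and it is registered as a data source
```

The `Illegal instruction(coredump)` at the end of the `odbcinst -j` output is also suspicious in its own right; a Driver Manager build that crashes while merely printing its configuration may fail in stranger ways when loading `libsqora.so`.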
Nati
Sep 17, 2018, 09:55 PM
• Last activity: Feb 22, 2025, 02:01 AM
-1
votes
2
answers
1763
views
SQL agent doesn't start after ODBC upgrade
For security reasons, I would like to upgrade the ODBC driver from version 17 to version 18, but when I do this, the SQL Server Agent does not start. How do I configure it to start?
The log says: Windows could not start the SQL Server Agent (MSSQLSERVER) service on Local Computer. Error 1053: The service did not respond to the start or control request in a timely fashion.
[sqagtres] OnlineThread: Error 41d bringing resource online.
RVrv
(1 rep)
Nov 20, 2023, 01:40 PM
• Last activity: Jan 8, 2025, 03:27 PM
1
votes
2
answers
6818
views
Sqlcmd: Error: Microsoft ODBC Driver 17 for SQL Server : Data source name not found and no default driver specified
I recently upgraded from SQL Server 2017 Express to SQL Server 2019 Express and installed Microsoft SQL Server ODBC Driver 18. However, a sqlcmd script to back up all databases, using Ola Hallengren's maintenance solution, still wants to use Microsoft ODBC Driver 17. I uninstalled Microsoft ODBC Driver 17 for SQL Server. Now I am getting the below error when I run the script in a .cmd file via the Windows Task Manager.
>Sqlcmd: Error: Microsoft ODBC Driver 17 for SQL Server : Data source name not found and no default driver specified.
How can I get sqlcmd to use the Microsoft SQL Server ODBC Driver 18?
Mike Pace
(11 rep)
Jul 22, 2022, 06:26 PM
• Last activity: Dec 30, 2024, 01:07 AM
Showing page 1 of 20 total questions