Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
0
votes
1
answers
487
views
Importing bacpac file using SqlPackage
How can I specify the database maximum size when importing a bacpac file using SqlPackage? The import auto-sizes the maximum to 32GB, but my actual DB size is over 35GB. Can I use DatabaseMaximumSize=(INT60) instead of DatabaseMaximumSize=(INT32)?
e.g. import parameters /p:DatabaseEdition=GeneralPurpose /p:DatabaseServiceObjective=GP_Gen5_2 /p:DatabaseMaximumSize=(INT60)
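For context, the (INT32) suffix in the SqlPackage documentation denotes the property's data type (an integer number of gigabytes) rather than a fixed 32GB value, so a larger number can be passed directly. A minimal sketch, with the file path and connection details as placeholders:
SqlPackage.exe /Action:Import /SourceFile:"C:\backups\MyDb.bacpac" /TargetConnectionString:"Server=tcp:myserver.database.windows.net,1433;Initial Catalog=MyDb;User ID=username;Password=***;Encrypt=True;" /p:DatabaseEdition=GeneralPurpose /p:DatabaseServiceObjective=GP_Gen5_2 /p:DatabaseMaximumSize=60
Whether 60GB is accepted still depends on the chosen edition and service objective allowing that size.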
Cho Wai Tun
(1 rep)
Jun 12, 2021, 07:12 AM
• Last activity: Jun 30, 2025, 07:08 AM
0
votes
2
answers
343
views
SSIS Package hanging with two Constraints pointed at one Task: Can you have multiple constraints pointing to one Task?
In the example I pasted in the image, I have 5 Tasks. I don't think it's important what kind of Tasks I'm using--but I could be wrong, of course--because the main issue seems to be how the constraints (the arrows) flow.
I have one flow that goes: Task 1, Task 2, Task 3, Task 4, Task 5.
The other flow goes: Task 1, Task 2, Task 5.
What I'm trying to do is: If Task 2 fails, skip Task 3 and 4 and go right to Task 5. If Task 2 succeeds, go ahead and run Tasks 3, 4, and 5.
Unfortunately, when I debug the package, the package effectively hangs and never executes Task 5, regardless of whether the Task 2 constraint is set to Success or Failure:
Is there some design limitation where you cannot assign multiple constraints, pointing to the same Task?
I'd rather not have boilerplate, where Task 5 is created twice, once for each flow. Sure, in this example it's only one Task, and I could duplicate Task 5 so the package would run.
But for scaling, it's woefully inefficient and a change management nightmare to create n number of duplicate Tasks, just to skip two Tasks.
How can I make this work as the flow scales? Is it possible for a Task to have multiple constraints pointing at it and I'm just not setting something correctly? If not, what are my options? Could using a container help me out here?
Thank you for your help in advance.


user3621633
(275 rep)
Feb 8, 2024, 11:40 PM
• Last activity: May 10, 2025, 05:02 PM
-1
votes
1
answers
313
views
Why does SQLPackage include my table-valued function in deployments when it hasn't changed?
I'm using SQLPackage in TFS to automate an SSDT project/DACPAC deployment with SQL Server 2014. I have a table-valued function that appears in the deployment report and script with every deployment, even though the source code and the compiled definition rarely change. For example, I can do a new build and deployment with no source code changes, and my deployment report will look like this (the corresponding SQL script will have a definition matching what's already in the DB):
I would expect the deployment to have nothing in it. It's the only object for which this happens, in a project containing thousands of objects. Anybody had an experience like this with SQLPackage/TFS/DACPAC before?
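One way to see what difference SqlPackage thinks it has found is to generate a deploy report against the same target; DeployReport is a standard SqlPackage action, while the paths and connection string below are placeholders:
SqlPackage.exe /Action:DeployReport /SourceFile:"Drop\MyProject.dacpac" /TargetConnectionString:"Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=True" /OutputPath:"C:\temp\deploy_report.xml"
The report XML lists each planned operation, which can make it easier to spot whether the function is being altered in its own right or refreshed because of a dependency.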
John Hurtubise
(9 rep)
Apr 30, 2021, 12:55 AM
• Last activity: Apr 30, 2025, 02:10 PM
0
votes
0
answers
50
views
DACPAC deployment hangs at Updating database (start)
We are currently in the middle of creating a proof of concept using DACPACs as our deployment/release method rather than our in-house deployment product. The two databases that I am currently using to test are Azure SQL DB databases, and our repo is stored in Azure DevOps. We have been using the pipelines in DevOps to deploy.
The dacpac/sqlproj contains many tables, procedures, etc, and unfortunately must include many very large post deployment scripts. These scripts are used to manage the data in lookup tables we use in the app [a bunch of inserts that are used in a merge for the target table]. I mention this because I suspect that these may be involved, but I'm not sure.
When I initially tested this, it did deploy to one of the databases successfully (I did not try the other at that point), but this only had one large post-deployment script. I have added the rest in, and now the dacpac publish just does not seem to do anything after getting to "Updating database (start)" when running the release in DevOps. There have been a couple of times where the release seemed to be cancelled by Azure (a request to deprovision the agent), and none of the logs are available for the step that hangs.
I have also resorted to attempting to publish in VS via SSDT, but that also just hangs for hours. Today, I started trying to use the command-line tool to deploy, and it did actually start to refresh the target DB, but then hung very very early into a procedure refresh. I have tried again multiple times, and every attempt has resulted in the hang at Updating database.
sp_who2 shows a SPID in the database that has a CPU time of ~2000, but it is sleeping and AWAITING COMMAND. This never seems to change. The Azure Portal also shows a spike in resources when the dacpac publish starts, but it then drops to 0 and stays there. I cannot seem to find any further information about this in particular.
Below is the command I am using, which is essentially the same as what is in the DevOps pipeline.
.\SqlPackage /Action:Publish /SourceFile:"C:\Users\ME\source\repos\REPORNAME\PROJECTNAME\bin\Debug\PROJECTNAME.dacpac" /TargetConnectionString:"Server=tcp:.database.windows.net,1433;Initial Catalog=;Persist Security Info=False;User ID=username;Password=;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
Does this seem like a resource contention issue? Can anyone point me to some resources or have any idea what might be hanging up, here?
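A session that is sleeping and AWAITING COMMAND is generally idle from the server's point of view, which suggests the client (SqlPackage) is the side doing the work or waiting. These generic DMV queries can confirm whether anything is still executing and show the last statement the session ran; the session_id is a placeholder for the SPID reported by sp_who2:
-- Active request for the session (empty if the server really is idle)
SELECT r.status, r.command, r.wait_type, r.wait_time, r.blocking_session_id
FROM sys.dm_exec_requests AS r
WHERE r.session_id = 51;
-- Last statement the session ran before going idle
SELECT s.status, s.last_request_end_time, t.text AS last_sql
FROM sys.dm_exec_sessions AS s
JOIN sys.dm_exec_connections AS c ON c.session_id = s.session_id
OUTER APPLY sys.dm_exec_sql_text(c.most_recent_sql_handle) AS t
WHERE s.session_id = 51;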
scarr030
(115 rep)
Apr 23, 2025, 03:58 PM
0
votes
0
answers
33
views
Sqlpackage.exe is dropping default constraints and computed columns
I'm deploying database schema-only (no data) changes to our Azure SQL Managed Instance via the Azure DevOps task SqlAzureDacpacDeployment@1, and for some tables it is always dropping and re-creating computed columns and/or default constraints. Some of these tables are very large, and dropping/re-adding a computed column or default constraint is very time consuming.
Does anyone know why this is happening, and is there a way to prevent it? As mentioned, this is a schema-only deployment; there is no data being inserted.
This is a snippet of what is being executed by the DevOps task:
"C:\Program Files\Microsoft SQL Server\160\DAC\bin\SqlPackage.exe" /Action:Publish
/SourceFile:"D:\a\1\drop\CCDatabase\bin\Release\CCDatabase.dacpac" /TargetServerName:"***"
/TargetDatabaseName:"***" /AccessToken:"********" /p:TreatVerificationErrorsAsWarnings=True
/p:DropObjectsNotInSource=True /p:DoNotDropObjectTypes=Users;Logins;RoleMembership;Permissions;Credentials;DatabaseScopedCredentials;LinkedServerLogins;LinkedServers;
/p:BlockOnPossibleDataLoss=False
This is what I see in the log, and I've verified the columns/constraints are being dropped via Sql Profiler.
Dropping Default Constraint [VZ].[DF_Location_CreatedDateUtc]...
Dropping Default Constraint [VZ].[DF_Location_Hold_CreatedDateUtc]...
Dropping Default Constraint [VZ].[DF_Location_Stage_CreatedDateUtc]...
Altering Table [CRM].[QuarterlyGoal]...
Altering Table [D365].[CUSTCUSTOMERV3STAGING]...
Starting rebuilding table [SM].[TechnicianAvailability]...
Starting rebuilding table [VZ].[Location]...
Starting rebuilding table [VZ].[Location_Hold]...
Starting rebuilding table [VZ].[Location_Stage]...
Here is an example of a table that is being rebuilt:
This one had the default constraint [DF_Location_Hold_CreatedDateUtc] dropped and the table data copied to a temp table, which was then renamed back to the original table name:
Table definition:
CREATE TABLE [VZ].[Location_Hold]
(
[VehicleNumber] VARCHAR(30) NOT NULL,
[UpdateUtc] DATETIME NULL,
[UpdateLocal] DATETIME NULL,
[Latitude] DECIMAL(9,6) NULL,
[Longitude] DECIMAL(9,6) NULL,
[GeoLocation] AS geography::STPointFromText('POINT(' + CAST([Longitude] AS VARCHAR(20)) + ' ' + CAST([Latitude] AS VARCHAR(20)) +')', 4326) PERSISTED,
[SpeedKm] DECIMAL(5, 2) NULL,
[DisplayState] VARCHAR(20) NULL,
[CreatedDateUtc] DATETIME NULL CONSTRAINT [DF_Location_Hold_CreatedDateUtc] DEFAULT SYSUTCDATETIME(),
CONSTRAINT [PK_Location_Hold] PRIMARY KEY ([VehicleNumber])
)
GO
This is what was captured in the profiler trace:
BEGIN TRANSACTION;
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
SET XACT_ABORT ON;
CREATE TABLE [VZ].[tmp_ms_xx_Location_Hold] (
[VehicleNumber] VARCHAR (30) NOT NULL,
[UpdateUtc] DATETIME NULL,
[UpdateLocal] DATETIME NULL,
[Latitude] DECIMAL (9, 6) NULL,
[Longitude] DECIMAL (9, 6) NULL,
[GeoLocation] AS geography::STPointFromText('POINT(' + CAST ([Longitude] AS VARCHAR (20)) + ' ' + CAST ([Latitude] AS VARCHAR (20)) + ')', 4326) PERSISTED,
[SpeedKm] DECIMAL (5, 2) NULL,
[DisplayState] VARCHAR (20) NULL,
[CreatedDateUtc] DATETIME CONSTRAINT [DF_Location_Hold_CreatedDateUtc] DEFAULT SYSUTCDATETIME() NULL,
CONSTRAINT [tmp_ms_xx_constraint_PK_Location_Hold1] PRIMARY KEY CLUSTERED ([VehicleNumber] ASC)
);
IF EXISTS (SELECT TOP 1 1
FROM [VZ].[Location_Hold])
BEGIN
INSERT INTO [VZ].[tmp_ms_xx_Location_Hold] ([VehicleNumber], [UpdateUtc], [UpdateLocal], [Latitude], [Longitude], [SpeedKm], [DisplayState], [CreatedDateUtc])
SELECT [VehicleNumber],
[UpdateUtc],
[UpdateLocal],
[Latitude],
[Longitude],
[SpeedKm],
[DisplayState],
[CreatedDateUtc]
FROM [VZ].[Location_Hold]
ORDER BY [VehicleNumber] ASC;
END
DROP TABLE [VZ].[Location_Hold];
EXECUTE sp_rename N'[VZ].[tmp_ms_xx_Location_Hold]', N'Location_Hold';
EXECUTE sp_rename N'[VZ].[tmp_ms_xx_constraint_PK_Location_Hold1]', N'PK_Location_Hold', N'OBJECT';
COMMIT TRANSACTION;
SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
This table simply had the computed column dropped and then re-added:
Original table definition:
CREATE TABLE [CRM].[QuarterlyGoal]
(
[QuarterlyGoalRecId] INT IDENTITY (1, 1) NOT NULL,
[StartDate] DATE NOT NULL,
[EndDate] DATE NOT NULL,
[UserId] INT NOT NULL,
[RevenueAmount] DECIMAL(11, 2) NOT NULL,
[IsActiveQuarter] AS
(CASE WHEN (
[StartDate] = CAST(GETDATE() AS DATE))
THEN 1
ELSE 0
END),
CONSTRAINT [PK_QuarterlyGoal] PRIMARY KEY CLUSTERED ([QuarterlyGoalRecId] ASC),
CONSTRAINT [FK_QuarterlyGoal_UserId] FOREIGN KEY ([UserId]) REFERENCES [dbo].[Users] ([Id])
);
This is what was in the profiler trace:
ALTER TABLE [CRM].[QuarterlyGoal] DROP COLUMN [IsActiveQuarter];
...
ALTER TABLE [CRM].[QuarterlyGoal]
ADD [IsActiveQuarter] AS (CASE WHEN ([StartDate] = CAST (GETDATE() AS DATE)) THEN 1 ELSE 0 END);
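When the scripted definition looks identical to what is deployed, a rebuild like this is often triggered by a small model-level difference (whitespace, column order, or SET options) rather than a real change. A hedged sketch of comparison-related publish properties that are sometimes added to cut this noise, run here as a Script action so the plan can be reviewed first; whether they remove these particular rebuilds depends on what the diff actually finds:
"C:\Program Files\Microsoft SQL Server\160\DAC\bin\SqlPackage.exe" /Action:Script /SourceFile:"D:\a\1\drop\CCDatabase\bin\Release\CCDatabase.dacpac" /TargetServerName:"***" /TargetDatabaseName:"***" /AccessToken:"********" /OutputPath:"C:\temp\preview.sql" /p:IgnoreWhitespace=True /p:IgnoreColumnOrder=True /p:BlockOnPossibleDataLoss=False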
Greg
(582 rep)
Apr 10, 2025, 08:22 PM
1
votes
2
answers
1147
views
SqlPackage importing very large bacpacs to Managed Instance
I want to import a database from a .bacpac file to a SQL Server Managed Instance.
I have read SqlPackage Import parameters and properties in the documentation. It says that there is a flag DatabaseMaximumSize=(INT32).
I wanted to know if there's a limit to what SqlPackage can support. For example, if I have 8 GB of RAM available, will SqlPackage be able to load .bacpac files larger than that, meaning it doesn't load it all into memory?
I'm not talking about runtime performance, but memory allocation. What happens if the bacpac is larger than my available RAM?
DacFx in .NET Core, for example, has an import method that receives a bacpac, but the option to tell it not to load everything into memory doesn't work there.
CodeMonkey
(111 rep)
Mar 3, 2021, 07:14 AM
• Last activity: Feb 12, 2025, 04:48 AM
0
votes
1
answers
57
views
How do I synchronise data from Azure SQL DB to its newer version without changing the schema?
I have read about SqlPackage tool, and I found it amazing, very useful.
I'm struggling though to understand how it can work with the following scenario.
I have an Azure SQL DB, let's call it "MyDb", in a production environment, version 1.0.
I would like to spin up a new environment. This new environment will have a fresh "MyDb" with an updated schema, version 1.5, and no data.
"MyDb" can indeed receive schema updates and I would like to test my application using this DB with data coming from production.
I would like to know if SqlPackage can migrate data from "MyDb" v1.0 to "MyDb" v1.5, keeping the schema in the target but migrating the data from source (v1.0) to target (v1.5).
I quickly tested SqlPackage, but it works rather in the opposite direction: "MyDb" v1.0 receives the schema updates from "MyDb" v1.5. This would mean the original data is kept and the schema updated, but in my opinion the procedure would be cumbersome, with too many steps:
1. Create a copy of "MyDb" v1.0 from the production environment in the test environment
2. Create a new "MyDb" v1.5 in the test environment;
3. Run SqlPackage to apply schema changes, with "MyDb" v1.5 as source and "MyDb" v1.0 as target;
4. Destroy "MyDb" v1.5 in the test environment.
Perhaps this is the ideal sequence and I can't expect too much from this tool.
In my mind, I thought this would be possible:
1. Create a new "MyDb" v1.5 in the test environment;
2. SqlPackage updates "MyDb" v1.5 with the data from "MyDb" v1.0 from production environment, keeping the schema in "MyDb" v1.5 intact. It would also "figure out" how to migrate data if there are schema changes, but maybe I'm too optimistic.
Thank you.
Alberto Solano
(101 rep)
Oct 23, 2024, 02:32 PM
• Last activity: Nov 4, 2024, 05:38 PM
0
votes
0
answers
133
views
SqlPackage.exe not honoring DoNotDropObjectTypes
When deploying SQL Server changes, I'm seeing objects being dropped despite requesting that they not be dropped. For example, in /p:DoNotDropObjectTypes I specified Services, but in the deployment output it fails while attempting to drop the Services that exist.
What am I doing wrong?
Here's the pipeline task:
- task: SqlAzureDacpacDeployment@1
displayName: Deploy Azure SQL Db
inputs:
azureSubscription: ${{ parameters.serviceConnection }}
authenticationType: servicePrincipal
serverName: ${{ parameters.sqlServerName }}
databaseName: ${{ parameters.sqlDatabaseName }}
DacpacFile: ${{ parameters.dacpacFile }}
AdditionalArguments: |
/p:TreatVerificationErrorsAsWarnings=True
/p:DropObjectsNotInSource=True
/p:DoNotDropObjectTypes=Users;Logins;RoleMembership;Permissions;Credentials;DatabaseScopedCredentials;LinkedServerLogins;LinkedServers;Services;MessageTypes;
/p:BlockOnPossibleDataLoss=False
Here's the deployment failing while trying to drop a Service:
Could not deploy package.
Warning SQL72032: If this deployment is executed, [dbo_t1_xxxxx_Receiver] will be dropped and not re-created.
Warning SQL72032: If this deployment is executed, [dbo_t2_xxxxx_Receiver] will be dropped and not re-created.
Warning SQL72032: If this deployment is executed, [dbo_t3_xxxxx_Receiver] will be dropped and not re-created.
Warning SQL72032: If this deployment is executed, [dbo_t4_xxxxx_Receiver] will be dropped and not re-created.
Warning SQL72032: If this deployment is executed, [dbo_t5_xxxxx_Receiver] will be dropped and not re-created.
Warning SQL72032: If this deployment is executed, [dbo_t1_xxxxx_Sender] will be dropped and not re-created.
Warning SQL72032: If this deployment is executed, [dbo_t2_xxxxx_Sender] will be dropped and not re-created.
Warning SQL72032: If this deployment is executed, [dbo_t3_xxxxx_Sender] will be dropped and not re-created.
Warning SQL72032: If this deployment is executed, [dbo_t4_xxxxx_Sender] will be dropped and not re-created.
Warning SQL72032: If this deployment is executed, [dbo_t5_xxxxx_Sender] will be dropped and not re-created.
Error SQL72014: Framework Microsoft SqlClient Data Provider: Msg 15151, Level 16, State 1, Line 1
Cannot drop the service 'dbo_t4_xxxxx_Sender', because it does not exist or you do not have permission.
Error SQL72045: Script execution error. The executed script:
DROP SERVICE [dbo_t4_xxxxx_Sender];
System.Management.Automation.RemoteException
System.Management.Automation.RemoteException
System.Management.Automation.RemoteException
The Azure SQL DACPAC task failed. SqlPackage.exe exited with code 1.
Finishing: Deploy Azure SQL Db
Greg
(582 rep)
Aug 23, 2024, 02:25 AM
0
votes
1
answers
41
views
.DTSX package is there a way to wrap a workflow into a transaction
I have a dtsx package that imports data from a csv into a table; it truncates the table before it loads the data.
The problem is that sometimes, for one reason or another, the package fails, and I end up with an empty table. I would like to wrap the workflow inside some kind of transaction and restore the table in case of failure.
Is there any way to achieve this?
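One common pattern, sketched here under the assumption that the data flow can be pointed at a staging table (dbo.Staging_Import and dbo.Target are placeholder names), is to load the staging table first and then perform the truncate-and-copy in a single Execute SQL Task so the swap is atomic:
SET XACT_ABORT ON;  -- roll everything back on any error
BEGIN TRANSACTION;
    TRUNCATE TABLE dbo.Target;
    INSERT INTO dbo.Target (Col1, Col2, Col3)
    SELECT Col1, Col2, Col3
    FROM dbo.Staging_Import;
COMMIT TRANSACTION;
If the INSERT fails, the transaction rolls back and dbo.Target keeps its previous rows. SSIS also has a TransactionOption=Required setting on containers, but that relies on MSDTC being available, which is why the staging-table approach is often simpler.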
Axeltherabbit
(117 rep)
Jul 9, 2024, 01:51 PM
• Last activity: Jul 11, 2024, 11:16 AM
0
votes
1
answers
61
views
Why is SQLPackage suddenly including every scalar function in my SSDT project for ALTER in my deployment?
I have a SQL 2019 SSDT project with a large number of scalar and table-valued functions. Since yesterday, my Windows TFS (on-prem) deployment project's SQLPackage scripting step has been including every scalar function in the project (~300) in the deployment script (with ALTER). I spot-checked the first dozen: in every one, the current compiled definition in the target database is identical to the definition SQLPackage intends to execute. I checked the git repo, and I don't see any commits that change the project's sqlproj file. Has anyone seen anything like this before?
John Hurtubise
(9 rep)
Jun 10, 2024, 07:18 PM
• Last activity: Jun 14, 2024, 12:12 PM
-1
votes
1
answers
441
views
Need help with DBMS_COMPRESSION.GET_COMPRESSION_RATIO
I need some help on how to run the DBMS_COMPRESSION.GET_COMPRESSION_RATIO procedure against all the tables of a database, to see whether implementing compression is worth it or not.
This is the call I'm running on a single table, and I need a way to automate it so that I don't have to execute it manually table by table.
SET SERVEROUTPUT ON
DECLARE
blkcnt_cmp PLS_INTEGER;
blkcnt_uncmp PLS_INTEGER;
row_cmp PLS_INTEGER;
row_uncmp PLS_INTEGER;
cmp_ratio NUMBER;
comptype_str VARCHAR2(32767);
BEGIN
DBMS_COMPRESSION.GET_COMPRESSION_RATIO (
scratchtbsname => 'USERS' ,
ownname => 'OWNER' ,
objname => 'TABLE_NAME' ,
subobjname => NULL ,
comptype => DBMS_COMPRESSION.COMP_ADVANCED,
blkcnt_cmp => blkcnt_cmp,
blkcnt_uncmp => blkcnt_uncmp,
row_cmp => row_cmp,
row_uncmp => row_uncmp,
cmp_ratio => cmp_ratio,
comptype_str => comptype_str,
subset_numrows=> DBMS_COMPRESSION.comp_ratio_minrows,
objtype => DBMS_COMPRESSION.objtype_table
);
DBMS_OUTPUT.put_line( 'Number of blocks used by the compressed sample of the object : ' ||
blkcnt_cmp);
DBMS_OUTPUT.put_line( 'Number of blocks used by the uncompressed sample of the object : ' ||
blkcnt_uncmp);
DBMS_OUTPUT.put_line( 'Number of rows in a block in compressed sample of the object : ' ||
row_cmp);
DBMS_OUTPUT.put_line( 'Number of rows in a block in uncompressed sample of the object : ' ||
row_uncmp);
DBMS_OUTPUT.put_line ('Estimated Compression Ratio of Sample : ' ||
cmp_ratio);
DBMS_OUTPUT.put_line( 'Compression Type : ' ||
comptype_str);
END;
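A minimal sketch of the same call wrapped in a loop over the data dictionary, assuming the scratch tablespace is USERS and 'OWNER' is the schema to scan (both placeholders); tables that are too small for sampling raise an error, so those are caught and skipped:
SET SERVEROUTPUT ON
DECLARE
  blkcnt_cmp    PLS_INTEGER;
  blkcnt_uncmp  PLS_INTEGER;
  row_cmp       PLS_INTEGER;
  row_uncmp     PLS_INTEGER;
  cmp_ratio     NUMBER;
  comptype_str  VARCHAR2(32767);
BEGIN
  FOR t IN (SELECT owner, table_name
              FROM all_tables
             WHERE owner = 'OWNER'
             ORDER BY table_name)
  LOOP
    BEGIN
      DBMS_COMPRESSION.GET_COMPRESSION_RATIO (
        scratchtbsname => 'USERS',
        ownname        => t.owner,
        objname        => t.table_name,
        subobjname     => NULL,
        comptype       => DBMS_COMPRESSION.COMP_ADVANCED,
        blkcnt_cmp     => blkcnt_cmp,
        blkcnt_uncmp   => blkcnt_uncmp,
        row_cmp        => row_cmp,
        row_uncmp      => row_uncmp,
        cmp_ratio      => cmp_ratio,
        comptype_str   => comptype_str,
        subset_numrows => DBMS_COMPRESSION.comp_ratio_minrows,
        objtype        => DBMS_COMPRESSION.objtype_table);
      DBMS_OUTPUT.put_line(t.table_name || ': estimated ratio ' || cmp_ratio || ' (' || comptype_str || ')');
    EXCEPTION
      WHEN OTHERS THEN
        -- e.g. tables with too few rows for sampling; report and move on
        DBMS_OUTPUT.put_line(t.table_name || ': skipped - ' || SQLERRM);
    END;
  END LOOP;
END;
/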
Daniel Soto
(3 rep)
Dec 8, 2023, 09:20 PM
• Last activity: Dec 10, 2023, 12:33 AM
4
votes
2
answers
4746
views
Exclude certain schema along with unnamed constraints in SSDT
### Task
1. Automate database deployment (SSDT/dacpac deployment with CI/CD)
2. The database is a 3rd party database
3. It also includes our own customized tables/SP/Fn/Views in separate schemas
4. Should exclude 3rd party objects while deploying the database project(dacpac) to Production
5. Thanks to Ed Elliott for the AgileSqlClub.DeploymentFilterContributor. I used the dll to filter out the schema successfully.
### Problem
1. The 3rd party schema objects (tables) are defined with unnamed constraints (default/primary key) when the tables are created. Example:
CREATE TABLE [3rdParty].[MainTable]
(ID INT IDENTITY(1,1) NOT NULL,
CreateDate DATETIME DEFAULT(GETDATE())) --There is no name given to default constraint
2. When I generate the script for deployment using sqlpackage.exe, I see following statements in the generated script.
Generated the script using:
>"C:\Program Files\Microsoft SQL Server\150\DAC\bin\sqlpackage.exe" /action:script /sourcefile:C:\Users\User123\source\repos\DBProject\DBProject\bin\Debug\DBProject.dacpac /TargetConnectionString:"Data Source=MyServer; Initial Catalog=MSSQLDatabase; Trusted_Connection=True" /p:AdditionalDeploymentContributorPaths="C:\Program Files\Microsoft SQL Server\150\DAC\bin\AgileSqlClub.SqlPackageFilter.dll" /p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor /p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSchema(3rdParty)" /outputpath:"c:\temp\script_AfterDLL.sql"
Script Output:
/*
Deployment script for MyDatabase
This code was generated by a tool.
Changes to this file may cause incorrect behavior and will be lost if
the code is regenerated.
*/
...
...
GO
PRINT N'Dropping unnamed constraint on [3rdParty].[MainTable]...';
GO
ALTER TABLE [3rdParty].[MainTable] DROP CONSTRAINT [DF__MainTabl__Crea__59463169];
...
...
...(towards the end of the script)
ALTER TABLE [3rdParty].[MainTable_2] WITH CHECK CHECK CONSTRAINT [fk_518_t_44_t_9];
3. I cannot alter 3rd party schema due to company restrictions
4. There are many **unnamed constraint** drops and WITH CHECK CHECK CONSTRAINT statements generated in the script.
### Questions
1. How can I remove the lines that **DROP unnamed constraints** on 3rd party schemas? Even though the dll excludes the 3rd party schema, these unnamed constraints are still scripted/deployed, and they are not added back afterwards either.
2. How can I skip generating WITH CHECK CHECK CONSTRAINT statements on 3rd party schemas?
Also, I found another issue. The deployment will not succeed due to:
> Rows were detected. The schema update is terminating because data loss might occur
### Output
/*
The column [3rdParty].[MainTable_1].[Col1] is being dropped, data loss could occur.
The column [3rdParty].[MainTable_1].[Col2] is being dropped, data loss could occur.
The column [3rdParty].[MainTable_1].[Col3] is being dropped, data loss could occur.
The column [3rdParty].[MainTable_1].[Col4] is being dropped, data loss could occur.
*/
IF EXISTS (select top 1 1 from [3rdParty].[MainTable_1])
RAISERROR (N'Rows were detected. The schema update is terminating because data loss might occur.', 16, 127) WITH NOWAIT
GO
I tried various combinations of parameters with no luck: /p:ExcludeObjectType=Defaults, or /p:DropObjectsNotInSource=False /p:DoNotDropObjectType=Defaults, etc.
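Two standard SqlPackage publish properties may also be worth testing, hedged here as options rather than a confirmed fix for the unnamed-constraint case: DropConstraintsNotInSource (so constraints present only in the target are left alone) and ScriptNewConstraintValidation (which controls whether WITH CHECK CHECK CONSTRAINT statements are emitted after data motion). A sketch based on the command above:
"C:\Program Files\Microsoft SQL Server\150\DAC\bin\sqlpackage.exe" /action:script /sourcefile:C:\Users\User123\source\repos\DBProject\DBProject\bin\Debug\DBProject.dacpac /TargetConnectionString:"Data Source=MyServer; Initial Catalog=MSSQLDatabase; Trusted_Connection=True" /p:AdditionalDeploymentContributorPaths="C:\Program Files\Microsoft SQL Server\150\DAC\bin\AgileSqlClub.SqlPackageFilter.dll" /p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor /p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSchema(3rdParty)" /p:DropConstraintsNotInSource=False /p:ScriptNewConstraintValidation=False /outputpath:"c:\temp\script_test.sql"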
Santhoshkumar KB
(581 rep)
Jun 8, 2020, 06:53 AM
• Last activity: Dec 5, 2023, 03:07 PM
2
votes
0
answers
350
views
Oracle 19C Auto Compile fails on Package With SQL_Macro but ALTER PACKAGE ... COMPILE Succeeds
We have a package SCHEMA.MYPACKAGE with a procedure SCHEMA.MYPACKAGE.GETDATA that calls a sql macro SCHEMA.SQLMACROFUNCTION that exists within the same schema but outside the package. The macro is called within a select statement and qualified by schema name even though it's within the same schema, like this:
SELECT col1,col2,col3
FROM SCHEMA.SQLMACROFUNCTION(param1, param2, param3)
ORDER BY col1 DESC;
The problem is that when anything invalidates the package (for example, adding a column to a table the package depends on) and the next procedure call tries to auto-compile the package as Oracle is supposed to do, the compile fails with
> PLS-00201: identifier 'SQLMACROFUNCTION' must be declared
leaving the package in an invalid state until it's manually compiled with
ALTER PACKAGE SCHEMA.MYPACKAGE COMPILE;
which always succeeds.
Anyone have an explanation for this behavior?
I considered permissions, but the package and the sql_macro function are in the same schema.
I considered function name qualification, but it's already called with schemaname.objectname.
Also, although I can't corroborate this with the Oracle documentation, I read somewhere that functions (in this case a SQL macro) have to be executed in order to validate the code, whereas an auto-compile only parses the code, which might explain the behavior.
There are obvious workarounds to this issue (substituting the function code into the procedure in place of the function call resolves the auto-compile issue), but I'm curious to understand why it's happening and am interested in some documentation, if it's out there.
Thanks in advance.
lightwing
(31 rep)
Jun 21, 2023, 06:57 PM
0
votes
2
answers
2951
views
SQL Server Agent not running SSIS package
I have followed many articles to create an SSIS pipeline package, deploy it and schedule it.
I'm trying to transfer data between two remote servers, using an OLE DB driver that connects to a SQL Server database as the source and an OLE DB provider for ODBC drivers that connects to a MySQL database as the destination...
The pipeline works very well in Visual Studio and when I deploy it and execute it from SSMS, but when I create a Job I get the following error:
**NOTE**: I am using the 32-bit ODBC driver for MySQL.
Any ideas?
**UPDATE (had to update because image not visible to everyone)**
ERROR :
Data Flow Task:Error: OLE DB Destination failed validation and returned error code 0xC020801C.
Data Flow Task:Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "remotemysql.com.uVt96HeM3y.uVt96HeM3y1" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
Package1:Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "MSDataShape" Hresult: 0x80004005 Description: "Data provider could not be initialized.".
An OLE DB record is available. Source: "Microsoft OLE DB Provider for ODBC Drivers" Hresult: 0x80004005 Description: "[Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified".

Hamza Hamdani
(17 rep)
Mar 8, 2022, 02:44 PM
• Last activity: Mar 9, 2022, 10:13 AM
0
votes
1
answers
465
views
SqlPackage Import - How to choose vCore purchase model?
I read the following documentation: Properties specific to the Import action.
Looking at the switch description for DatabaseEdition, I do not see a choice for the vCore Azure SQL Database purchase model.
Is it not possible to choose the vCore model while using the SqlPackage import feature?
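For what it's worth, the vCore-style editions appear to be expressed through the same DatabaseEdition/DatabaseServiceObjective pair used in other questions on this page (for example GP_Gen5_2 for General Purpose, Gen5, 2 vCores). A sketch with placeholder file and connection details:
SqlPackage.exe /Action:Import /SourceFile:"C:\backups\MyDb.bacpac" /TargetConnectionString:"Server=tcp:myserver.database.windows.net,1433;Initial Catalog=MyDb;User ID=username;Password=***;Encrypt=True;" /p:DatabaseEdition=GeneralPurpose /p:DatabaseServiceObjective=GP_Gen5_2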
SqlWorldWide
(13707 rep)
Jan 20, 2022, 11:56 AM
• Last activity: Jan 21, 2022, 05:34 PM
0
votes
1
answers
547
views
Is it possible to apply filters on SqlPackage?
We develop a lot of modules separately. This means the Blog module has its own database in development. All objects of each module start with the module schema, so the Posts table inside the Blog module is named blog.Posts.
However, in production we deploy all modules to one target database.
For example, this means that in production I will have these objects:
blog.Posts
content.Pages
seo.Configurations
menu.Items
email.Templates
Now, I want to compare the Blog module database with the production database. I'm using SqlPackage for this purpose.
The point is that SqlPackage creates a huge drop script, because it thinks that all other tables from other modules that exist in the production database should be dropped.
Is there a way to tell SqlPackage to only compare the objects of a given schema? Can I configure SqlPackage to only compare the blog schema?
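One property that directly targets the huge drop script is DropObjectsNotInSource; the schema-filtering deployment contributor shown in the SSDT question earlier on this page is another route. A sketch with placeholder names, generating a script rather than publishing so the result can be reviewed first:
SqlPackage.exe /Action:Script /SourceFile:"Blog.dacpac" /TargetConnectionString:"Data Source=ProdServer;Initial Catalog=ProdDb;Trusted_Connection=True" /p:DropObjectsNotInSource=False /OutputPath:"C:\temp\blog_diff.sql"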
Ali EXE
(215 rep)
Dec 19, 2021, 08:32 AM
• Last activity: Dec 20, 2021, 12:17 AM
1
votes
1
answers
218
views
sqlpackage.exe fails on SignalR objects
Issue:
When running sqlpackage.exe to apply a sql dacpac, it occasionally fails on broker/SignalR related objects, which breaks the build:
*** Could not deploy package.
Error SQL72014: .Net SqlClient Data Provider: Msg 15151, Level 16, State 1, Line 1 Cannot drop the service 'Xxxxxx_xxxxx_0de7d69e-48ac-4a2b-95ef-d758d69b2a1e_Receiver', because it does not exist or you do not have permission.
Error SQL72045: Script execution error. The executed script:
DROP SERVICE [Xxxxxx_xxxxx_0de7d69e-48ac-4a2b-95ef-d758d69b2a1e_Receiver];
These are our sqlpackage.exe command-line parameters:
/p:TreatVerificationErrorsAsWarnings=True /p:DoNotAlterChangeDataCaptureObjects=True /p:DropIndexesNotInSource=True /p:DropObjectsNotInSource=True /p:ExcludeObjectTypes=Users;Logins;RoleMembership;Permissions;Credentials;DatabaseScopedCredentials;LinkedServerLogins;LinkedServers /p:BlockOnPossibleDataLoss=False
Again, the failure rate is roughly 1 out of every 10 runs, and no one is manually removing these objects, so I have no idea why it would not be able to drop an object it knows about. Any ideas on a good solution to get it to apply consistently?
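Since the failing objects are Service Broker services that SignalR creates at runtime, one approach sometimes used (sketched here, not verified against this setup) is to keep broker objects out of the comparison altogether by extending the exclusion list with the broker object type names (Services, Queues, Contracts, MessageTypes, assuming the SqlPackage version in use accepts them here); the rest of the line mirrors the parameters above:
/p:TreatVerificationErrorsAsWarnings=True /p:DoNotAlterChangeDataCaptureObjects=True /p:DropIndexesNotInSource=True /p:DropObjectsNotInSource=True /p:ExcludeObjectTypes=Users;Logins;RoleMembership;Permissions;Credentials;DatabaseScopedCredentials;LinkedServerLogins;LinkedServers;Services;Queues;Contracts;MessageTypes /p:BlockOnPossibleDataLoss=False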
Greg
(582 rep)
Mar 12, 2021, 08:29 PM
• Last activity: Mar 13, 2021, 02:57 PM
1
votes
1
answers
723
views
Deploying dacpac through sqlpackage.exe with NETWORK SERVICE user
I have a database release pipeline in Azure DevOps Server which deploys our dacpac files through PowerShell commands that run sqlpackage.exe to publish the database. I am using the NETWORK SERVICE user to publish the databases to our DEV and QA environments, as I wanted to avoid putting the username and password in the release pipeline. We will be deploying these dacpac files to production soon, and the production server is outside our network. Are there security concerns with using the NETWORK SERVICE user to deploy the dacpac files in a production environment? What alternatives are there besides putting the username and password in secret variables?
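If the production server supports Azure AD (Entra ID) authentication, one alternative that keeps credentials out of pipeline variables is token-based auth: SqlPackage accepts an /AccessToken parameter (it appears in a deployment command elsewhere on this page), and the token can be fetched at deploy time by the pipeline's service principal or managed identity. A sketch with placeholder names, assuming the Azure CLI is available on the agent:
$token = az account get-access-token --resource https://database.windows.net/ --query accessToken -o tsv
.\SqlPackage.exe /Action:Publish /SourceFile:"MyDb.dacpac" /TargetServerName:"myserver.database.windows.net" /TargetDatabaseName:"MyDb" /AccessToken:$token
For on-premises targets, running the deployment under a dedicated domain or group managed service account with only the permissions it needs is the usual alternative to an explicit username and password.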
user216359
Sep 28, 2020, 09:02 PM
• Last activity: Sep 29, 2020, 12:42 PM
3
votes
1
answers
849
views
SqlPackage resolving table that doesn't exist
I'm trying to validate the integrity of one of our SQL Server 2016 databases by using sqlpackage.exe to extract the dacpac and validate the schema:
./sqlpackage.exe /DiagnosticsFile:"diagnostics_users.txt" /Action:Extract /TargetFile:"users.dacpac" /SourceConnectionString:"connstring" /p:VerifyExtraction=true
When I run it, I get the output
Resolving references in schema model
Successfully extracted database and saved it to file
However, when I check the source database, there is a stored procedure which references a table in a database that does not exist. If I try to run it in SSMS, I get the error Invalid object name '[Old_Database].dbo.VIEWER_LOG'. However, SqlPackage did not flag this as an error.
When I search in the diagnostic logs, I do see the following:
Microsoft.Data.Tools.Diagnostics.Tracer Verbose: 1 : ColumnResolver: #34027998 ResolvedDescriptor for column:
ResolvedDescriptor: [DATE_TIME_STAMP]
Potentials(1):
[SqlColumn : P1 - [OLD_DATABASE.dbo.VIEWER_LOG.DATE_TIME_STAMP]]
Available Column Sources:
ColumnSource VIEWER_LOG (affinity = 1) with no known columns
There is a table in the target database called Viewer_Log, but not one reachable as OLD_DATABASE.dbo.VIEWER_LOG, since OLD_DATABASE.dbo does not exist. Shouldn't SqlPackage be flagging this as an error when it resolves references? Is there a configuration option available to detect and flag this?
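As a way to find these references before extraction, cross-database dependencies can be listed from the catalog views; this is a generic diagnostic query against sys.sql_expression_dependencies, not something SqlPackage runs itself:
SELECT OBJECT_SCHEMA_NAME(d.referencing_id) AS referencing_schema,
       OBJECT_NAME(d.referencing_id)        AS referencing_object,
       d.referenced_database_name,
       d.referenced_schema_name,
       d.referenced_entity_name
FROM sys.sql_expression_dependencies AS d
WHERE d.referenced_database_name IS NOT NULL
  AND d.referenced_database_name <> DB_NAME();   -- references that point outside the current database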
Daryl1976
(191 rep)
Jun 26, 2020, 08:00 PM
• Last activity: Jul 2, 2020, 09:11 PM