Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
0
votes
1
answers
487
views
importing bacpac file using sql package
How can I specify the database maximum size when importing a bacpac file using SqlPackage? The maximum size of 32 GB is auto-sizing, but my actual DB size is over 35 GB. Can I use DatabaseMaximumSize=(INT60) instead of DatabaseMaximumSize=(INT32)?
e.g. import parameters /p:DatabaseEdition=GeneralPurpose /p:DatabaseServiceObjective=GP_Gen5_2 /p:DatabaseMaximumSize=(INT60)
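For what it's worth, in the SqlPackage docs "(INT32)" describes the parameter's *type* rather than literal syntax to paste, so the value is most likely a plain number of gigabytes, e.g. `/p:DatabaseMaximumSize=60`. A minimal sketch (Python, with hypothetical server and database names) that assembles such an import command:

```python
# Sketch: build a SqlPackage import command line. The "(INT32)" in the
# docs denotes the parameter's type; the value itself is a plain integer
# (maximum size in GB), so 60 fits fine in an INT32.
def build_import_command(bacpac, server, database, max_size_gb):
    return [
        "sqlpackage",
        "/Action:Import",
        f"/SourceFile:{bacpac}",
        f"/TargetServerName:{server}",        # hypothetical server name
        f"/TargetDatabaseName:{database}",    # hypothetical database name
        "/p:DatabaseEdition=GeneralPurpose",
        "/p:DatabaseServiceObjective=GP_Gen5_2",
        f"/p:DatabaseMaximumSize={max_size_gb}",
    ]

cmd = build_import_command("my.bacpac", "myserver.database.windows.net", "MyDb", 60)
print(" ".join(cmd))
```

The last element shows the shape of the size property; verify the exact property name against your SqlPackage version's `/p:` list.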
Cho Wai Tun
(1 rep)
Jun 12, 2021, 07:12 AM
• Last activity: Jun 30, 2025, 07:08 AM
8
votes
1
answers
5365
views
SQLPackage : Import or Export SQL Azure BACPAC without users
I know how to use `SQLpackage.exe`. I can export from SQL Azure and create a bacpac. I can also import to SQL Server via `SQLPackage.exe`.
I want to export or import without the users in the database. The users are causing the normal import to fail.
> Error importing database: could not import package.
>
> Error SQL72014 Cannot grant, deny or remove permissions to or from special roles
I can't delete the users before I export, or exclude them from the import.
Currently we have to back up the database, restore it on SQL Azure, delete the users on this second database, then create a bacpac from the restored database.
How do I get around this?
Can I use `IgnoreUserSettingsObjects` with my sqlpackage? I tried using the parameter but it fails.

Peter PitLock
(1405 rep)
Apr 18, 2017, 02:00 PM
• Last activity: May 14, 2025, 02:02 PM
0
votes
1
answers
340
views
Change the schema of table while restoring the database from bacpac file?
I have a **bacpac** file which I generated from SQL Server, and now I am restoring the database from this **bacpac** file into an Azure Managed Instance.
Here, I am trying to change the schema of the tables according to the requirement.
The question is: can I change the schema of the tables while restoring the database into the Azure Managed Instance?
For instance, I have 10 tables under one schema and I want to change them to a different schema.
Robin
(1 rep)
Oct 8, 2022, 06:28 AM
• Last activity: Jan 22, 2025, 11:06 AM
0
votes
1
answers
513
views
The store schema does not contain the Element class SqlXmlCompressionOption
I hope you are all well.
I am importing a bacpac file from Azure SQL to SQL Server on-premises with SSMS, and I am receiving the error below. Can somebody help me or point me in the right direction?
Details
Could not load schema model from package. (Microsoft.SqlServer.Dac)
------------------------------
Program Location:
at Microsoft.SqlServer.Dac.DacPackage.LoadModel(IPackageSource packageSource, Boolean ignoreUnresolvedExternalErrors)
at Microsoft.SqlServer.Dac.Extensions.DacExtensions.GetCollationString(IPackageSource packageSource)
at Microsoft.SqlServer.Dac.Extensions.DacExtensions.GetCollationString(BacPackage package)
at Microsoft.SqlServer.Management.Dac.DacWizard.CreateDatabaseOnTargetWorkItem.DoWork()
at Microsoft.SqlServer.Management.TaskForms.SimpleWorkItem.Run()
===================================
The store schema does not contain the Element class SqlXmlCompressionOption. (Microsoft.Data.Tools.Schema.Sql)
------------------------------
Program Location:
at Microsoft.Data.Tools.Schema.SchemaModel.DataSchemaModel.ModelEntityFactory.CreateElement(String elementClassName, String nameString, Int32 line, Int32 column)
at Microsoft.Data.Tools.Schema.SchemaModel.DataSchemaModel.PossibleElementState.ReadElement()
at Microsoft.Data.Tools.Schema.SchemaModel.DataSchemaModel.ModelContentsState.ProcessStartElement()
at Microsoft.Data.Tools.Schema.SchemaModel.XmlDeserializerEngine.ReadData(DeserializerState initialState, Boolean keepCurrentReaderPosition)
at Microsoft.Data.Tools.Schema.SchemaModel.DataSchemaModel.DeserializeXml(TextReader input, ErrorManager errors, String source, Action`3 constructorParametersSetter)
at Microsoft.Data.Tools.Schema.SchemaModel.DataSchemaModel.DeserializePackage(SqlPackage package, ErrorManager errors, Action`3 constructorParametersSetter)
at Microsoft.SqlServer.Dac.DacPackage.DeserializePackage(SqlPackage package, DacSchemaModelStorageType modelStorageType, ErrorManager errorManager, DataSchemaModelHeader& header)
at Microsoft.SqlServer.Dac.DacPackage.LoadModel(IPackageSource packageSource, Boolean ignoreUnresolvedExternalErrors)
SQL Server on-premises version: SQL Server 2019
SQL Azure version: (shown only in a screenshot)
Wesley Neves
(11 rep)
Aug 8, 2022, 09:53 PM
• Last activity: Jul 3, 2024, 11:06 PM
0
votes
1
answers
1164
views
SQL72014 error importing bacpac with multiple columns using a UDT bound to a rule
#### Apology
First off, apologies for length - this is a weird issue so I'm trying to give enough detail to make it reproducible.
---
#### tl;dr
So I think I've found an issue while **exporting** a `*.bacpac` Data Tier Application between two MS SQL Servers to migrate an instance of an Inedo ProGet database, which creates an inconsistent bacpac. Trying to import this bacpac results in an error, and I'd appreciate a second pair of eyes on the problem to make sure I'm not doing anything daft...
> **Microsoft SQL Server Management Studio**
>
> Could not import package.
>
> Error SQL72014: .Net SqlClient Data Provider: Msg 2714, Level 16, State 3, Procedure YNINDICATOR_Domain, Line 1 There is already an object named 'YNINDICATOR_Domain' in the database.
>
> Error SQL72045: Script execution error. The executed script:
>
> CREATE RULE [dbo].[YNINDICATOR_Domain]
>
> AS @Ind COLLATE Latin1_General_BIN IN ('Y', 'N')
>
> (Microsoft.SqlServer.Dac)
I *think* the root cause is a possible bug in the "Export data-tier application" process when there are multiple schema columns using the same User Defined Type, and that UDT is bound to a validation Rule - the result seems to be a corrupt `*.bacpac` file that can't be imported without doing some manual tweaking first.
My question is:
* **Part 1**: Am I doing something wrong, or is this a known (or new) bug?
* **Part 2**: If it's a bug, do you have any idea where I can report it?
---
### Schema
The following re-creates a small part of the schema from the Inedo ProGet database that demonstrates the issue. (FWIW, I'm currently using SQL Server 2022 v16.0.1105.1, but I'm pretty sure it occurs in other versions as well).
It basically does this:
* Creates a User Defined Type `[dbo].[YNINDICATOR]`
* Binds it to a rule that restricts the values to `'Y'` and `'N'`
* Creates two tables that each have a column of type `YNINDICATOR` (see `[dbo].[CustomLanguages].[Active_Indicator]` and `[dbo].[ClusterNodes].[Primary_Indicator]`)
CREATE TYPE [dbo].[YNINDICATOR] FROM [char](1) NULL
GO
CREATE RULE [dbo].[YNINDICATOR_Domain]
AS
@Ind COLLATE Latin1_General_BIN IN ('Y', 'N');
EXEC sp_bindrule 'YNINDICATOR_Domain', 'YNINDICATOR'
GO
CREATE TABLE [dbo].[CustomLanguages](
[CustomLanguage_Id] [int] IDENTITY(1,1) NOT NULL,
[Culture_Name] [varchar](50) NOT NULL,
[Language_Name] [nvarchar](100) NOT NULL,
[Active_Indicator] [dbo].[YNINDICATOR] NOT NULL,
[CustomLanguage_Xml] [xml] NOT NULL,
CONSTRAINT [PK__CustomLanguages] PRIMARY KEY CLUSTERED
(
[CustomLanguage_Id] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
CREATE TABLE [dbo].[ClusterNodes](
[Server_Name] [nvarchar](64) NOT NULL,
[NodeType_Code] [char](1) NOT NULL,
[LastUpdated_Date] [datetime] NOT NULL,
[Primary_Indicator] [dbo].[YNINDICATOR] NOT NULL,
[Node_Configuration] [xml] NOT NULL,
CONSTRAINT [PK__ClusterNodes] PRIMARY KEY CLUSTERED
(
[Server_Name] ASC,
[NodeType_Code] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
ALTER TABLE [dbo].[ClusterNodes] WITH CHECK ADD CONSTRAINT [CK__ClusterNodes__NodeType_Code] CHECK (([NodeType_Code]='S' OR [NodeType_Code]='W'))
GO
ALTER TABLE [dbo].[ClusterNodes] CHECK CONSTRAINT [CK__ClusterNodes__NodeType_Code]
GO
---
### Error
Exporting the above schema into a Data Tier Application bacpac file completes without any errors, but attempting to re-import it as a new database gives the "There is already an object named 'YNINDICATOR_Domain'" error quoted in the tl;dr above.
---
### Investigation
I've unzipped the bacpac and found that when there is *more than one* column defined using the `YNINDICATOR` UDT, the .xml file inside the bacpac contains something like this, with **two** elements defined for `[dbo].[YNINDICATOR_Domain]` - the first contains the *column* references, and the second contains the UDT reference:
**Broken bacpac - two column references and a UDT reference -> two SqlRule elements**
... snip ...
... snip ...
This is presumably the cause of the "There is already an object named 'YNINDICATOR_Domain'" error, because it tries to create the SqlRule twice - once for each element.
If I **DROP** either of the columns using the UDT and re-export a Data Tier Application, the bacpac contains a single `Type="SqlRule"` node which contains both the column *and* UDT references:
**Working bacpac - one column reference and a UDT reference -> one SqlRule element**
... snip ...
... snip ...
and if I use `sp_unbindrule` to remove the binding, the bacpac contains just a single element - it no longer creates a second one for the UDT binding, as it obviously doesn't exist any more:
**Working bacpac - two column references and _no_ UDT reference -> one SqlRule element**
I can't really use either of these options though, as they don't match the actual database schema for the ProGet application, but it's interesting to note how the column and UDT binding get serialised depending on whether there's one or more-than-one column using the UDT.
---
### Workaround
If I *manually* hack around with the .xml in the original bacpac file and merge the two `SqlRule` elements so there's only one that contains the columns *and* the Rule binding, I can import it fine and it creates all the schema objects without error:
**Working bacpac - two column references and a UDT reference hacked into _one_ SqlRule element**
This isn't *very* useful though, as I don't want to have to keep doing this each time I migrate the database (I won't be doing it *many* times, but enough for this to be a painful workaround).
Side note: if I then re-export *this* database, it reverts back to the original form of two SqlRule elements - one that contains the columns and one that contains the rule binding.
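A note on the hand-editing above: as far as I know, a bacpac's Origin.xml stores a SHA-256 checksum of model.xml (in a `<Checksum Uri="/model.xml">` element, if memory serves), so after merging the SqlRule elements the stored value has to be recomputed or the import may reject the package as tampered. A sketch of the recomputation:

```python
import hashlib

def model_checksum(model_xml_bytes):
    # Assumption: Origin.xml records the SHA-256 of model.xml as
    # uppercase hex; after hand-editing model.xml, paste this value
    # back into the <Checksum Uri="/model.xml"> element.
    return hashlib.sha256(model_xml_bytes).hexdigest().upper()

print(model_checksum(b"<DataSchemaModel/>"))
```

Verify the checksum element name and encoding against your own bacpac before relying on this.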
---
### Update
The problem also happens with the Az CLI:
PS> az version
{
"azure-cli": "2.56.0",
"azure-cli-core": "2.56.0",
"azure-cli-telemetry": "1.1.0",
"extensions": {
"storage-preview": "1.0.0b1"
}
}
PS> az sql db export `
--subscription "My Subscription" `
--resource-group "my-resource-group" `
--server "my-sql-server" `
--name "ProGet" `
--admin-user "my-admin-user" `
--admin-password "my-admin-password" `
--storage-uri "https://mystorageaccount.blob.core.windows.net/proget-db-backups/proget.bacpac" `
--storage-key-type "StorageAccessKey" `
--storage-key "my storage key"
Downloading the blob and unpacking it shows the same problem with multiple `SqlRule` elements.
---
### Recap
If you've got this far then thanks for reading, and just to recap, my question in the "tl;dr" at the top was:
* **Part 1**: Am I doing something wrong, or is this a known (or new) bug?
* **Part 2**: If it's a bug, do you have any idea where I can report it?
Any help greatly appreciated.
mclayton
(143 rep)
Jan 11, 2024, 10:15 PM
• Last activity: Jan 30, 2024, 07:07 PM
7
votes
3
answers
15842
views
Bacpac Import fails with error 'The Element or Annotation class SqlDatabaseOptions does not contain the Property class CatalogCollation'
I want to restore a production Azure database on my local machine for development purposes.
So I exported the production database to a `.bacpac` file and downloaded that file locally.
Now I am trying to restore that file on my local machine using the sqlpackage utility, but it is throwing the following error:
> *** Error importing database: An error occurred during deployment plan
> generation. Deployment cannot continue. The Element or Annotation
> class SqlDatabaseOptions does not contain the Property class
> CatalogCollation.
Can someone please tell me what this error means and how to fix it?
Thanks in advance.

Ketan
(507 rep)
Jun 5, 2018, 10:11 AM
• Last activity: Mar 15, 2023, 07:21 AM
3
votes
0
answers
502
views
Find SQL version of Azure server
I have been restoring bacpacs provided by a third party on my on-prem SQL Server via PowerShell. I copy the bacpac over from an SFTP site onto my local drive. This time I have been getting errors which lead me to believe that the bacpac has been created on a higher version of SQL.
My on-prem version is:
**Microsoft SQL Server 2019 (RTM-GDR) (KB5014356) - 15.0.2095.3 (X64) Apr 29 2022 18:00:13 Copyright (C) 2019 Microsoft Corporation Standard Edition (64-bit) on Windows Server 2019 Standard 10.0 (Build 17763: ) (Hypervisor)**
On enquiring with the third party, they told me that the bacpac has been created on:
**Microsoft SQL Azure (RTM) - 12.0.2000.8 Nov 16 2022 04:43:19 Copyright (C) 2022 Microsoft Corporation.**
How can I compare the two to find out if the Azure version is higher?
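A caveat worth knowing before comparing: Azure SQL Database has reported version 12.0 for years regardless of the features it ships, so a bacpac exported from Azure can fail to import on an on-prem server even though 15.0 > 12.0 numerically. For build strings that *are* directly comparable (e.g. two on-prem servers), a small sketch:

```python
def parse_build(version_string):
    # "15.0.2095.3" -> (15, 0, 2095, 3); Python tuples compare element-wise,
    # so newer builds compare as greater.
    return tuple(int(part) for part in version_string.split("."))

on_prem = parse_build("15.0.2095.3")   # SQL Server 2019 build from the question
azure = parse_build("12.0.2000.8")     # Azure SQL build from the question
print(on_prem > azure)  # numerically True, but see the Azure 12.0 caveat above
```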
mail_sady
(51 rep)
Feb 3, 2023, 09:55 AM
• Last activity: Feb 3, 2023, 10:58 AM
2
votes
1
answers
4508
views
Importing BACPAC file from azure to SQL creating an empty database
Exported Data-Tier_application from an Azure database to create a BACPAC of it.
> SQL editions I'm trying to use to import the BACPAC:
> SQL Server 2016 express.
> SQL Server 2017 Developer Edition.
I dropped some views and procedures to fix errors, and that fixed everything.
I now have a 3 GB BACPAC file, and when I try to import it (right-click > Import Data-tier Application on the Databases node in SSMS) into my SQL Server 2016 or 2017, I receive this error message:
> TITLE: Microsoft SQL Server Management Studio
> ------------------------------
>
> Could not import package. Warning SQL72012: The object
> [DATABASE_NAME_Data] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
> Warning SQL72012: The object [DATABASE_NAME_Log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
>Error SQL72014: .Net SqlClient Data Provider: Msg
> 7609, Level 17, State 5, Line 2 Full-Text Search is not installed, or
> a full-text component cannot be loaded. Error SQL72045: Script
> execution error. The executed script: CREATE FULLTEXT INDEX ON
> [dbo].[Table]
> ([Nome] LANGUAGE 1033, [field_1] LANGUAGE 1033)
> KEY INDEX [PK_Table]
> ON [table];
It shouldn't be possible for this DATA and LOG to already exist, because it's the first time I'm restoring it.
Renaming the .BACPAC file to .ZIP, I could see a Data folder with a lot of BCP files.
The bacpac file is 3 GB.
Other answers from websites just say "wait until the end", but the export ended 4 days ago.
I have another BACPAC file from a different database and I can import it to SQL Server normally. I followed the same steps for both BACPAC files when exporting.


Racer SQL
(7546 rep)
Aug 15, 2019, 12:19 PM
• Last activity: Aug 23, 2022, 10:00 PM
5
votes
1
answers
1407
views
What are the .BCP files inside a .bacpak file?
Diving on dba.stackexchange.com I found a nice answer that taught me how to open a `.bacpac` file with File Explorer.
I gave it a try with the database `AdventureWorks2008R2` and all I did was:
1. Save the database as a `.bacpac` file
2. Rename the file extension to `.zip`
3. Unzip it
And there you go: you can see the tables of the database as folders, and inside each folder you can see the raw data.
But when I open the folder `Person.Address` I see there are 38 files in it with the extension `.BCP`.
At the beginning I thought they were the pages into which the data is split. But those cannot be pages, because they weigh about `67KB` each. Pages should weigh `8KB`.
So I checked the number of pages with this query:
-- Total # of pages, used_pages, and data_pages for a given heap/clustered index
SELECT
t.NAME AS TableName,
p.rows AS RowCounts,
SUM(a.total_pages) AS TotalPages,
SUM(a.used_pages) AS UsedPages,
(SUM(a.total_pages) - SUM(a.used_pages)) AS UnusedPages
FROM
sys.tables t
INNER JOIN
sys.indexes i ON t.OBJECT_ID = i.object_id
INNER JOIN
sys.partitions p ON i.object_id = p.OBJECT_ID AND i.index_id = p.index_id
INNER JOIN
sys.allocation_units a ON p.partition_id = a.container_id
WHERE
t.NAME NOT LIKE 'dt%'
AND t.is_ms_shipped = 0
AND i.OBJECT_ID > 255
GROUP BY
t.Name, p.Rows
ORDER BY
t.Name
And I found out that the `Person.Address` table has `689` pages.
So what are those 38 `.BCP` files?
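The package can also be inspected without File Explorer, since it is just a zip. A sketch (Python, assuming the `Data/<Schema.Table>/*.BCP` layout seen after renaming to `.zip`) that counts the chunk files per table; the chunks appear to be the table's exported data stream cut into pieces by size, which is why their count and size have no relationship to 8 KB database pages:

```python
import zipfile
from collections import Counter

def bcp_chunks_per_table(bacpac_path):
    # Count the *.BCP chunk files under each table's Data/ folder.
    # Layout assumption: Data/<Schema.Table>/<chunk>.BCP
    counts = Counter()
    with zipfile.ZipFile(bacpac_path) as z:
        for name in z.namelist():
            parts = name.split("/")
            if len(parts) == 3 and parts[0] == "Data" and parts[2].upper().endswith(".BCP"):
                counts[parts[1]] += 1
    return counts
```

So 38 files for `Person.Address` just means the exporter split that table's data into 38 chunks.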
Francesco Mantovani
(1695 rep)
Aug 16, 2021, 03:00 PM
• Last activity: Aug 16, 2021, 04:43 PM
3
votes
2
answers
1429
views
Free some space on a full Express Edition instance
Using Microsoft SQL Server 2016 (RTM) - 13.0.1601.5 (X64) Express Edition (64-bit)
I have a database that reaches the maximum size (10 GB) every month.
In order to free up some space, I export a bacpac, create a new database from the bacpac, and shrink it twice.
The database goes from 10 GB to 6 GB, and the application using this database gets faster (I think the bacpac round-trip refreshes the indexes).
Is there a SQL command to do this? Can I do this on a production database?
Pedro Adão
(33 rep)
May 14, 2021, 10:58 AM
• Last activity: May 21, 2021, 11:58 AM
4
votes
1
answers
1561
views
MS Azure remote back up with SqlPackage
I wanted to export a .bacpac file from an MS Azure database, from my local machine, with a command like this:
~~~
sqlpackage.exe /Action:Export /ssn:tcp:.database.windows.net,1433 /sdn: /su: /sp: /tf: /p:Storage=File
~~~
The DB was apparently found and the tool reported it started extracting the schema from the database, but then it fails:
~~~
Extracting schema
Extracting schema from database
Time elapsed 00:00:05.17
*** An unexpected failure occurred: .NET Core should not be using a file backed
model..
~~~
To my surprise, I couldn't find anything similar on the web. Is this some kind of an MS Azure bug?
userfuser
(147 rep)
Dec 9, 2019, 09:24 AM
• Last activity: Apr 21, 2021, 03:01 PM
0
votes
1
answers
808
views
Is the BACPAC from a TDE encrypted DB also encrypted?
**If you create a BACPAC of an encrypted database, will the data in the BACPAC be encrypted?**
We're currently using on-prem SQL Server and use TDE to encrypt our customer databases. We also have a requirement to maintain customer directed backups for up to 10 years. A backup of an encrypted DB and retention is not an issue. We're looking to migrate our platform to Azure SQL DB and thus will lose the ability to do customer directed backups. I'm exploring the use of BACPAC to meet our needs. If you create a BACPAC of an encrypted database, will the data in the BACPAC be encrypted?
SqlNightOwl
(45 rep)
Feb 23, 2021, 09:47 PM
• Last activity: Feb 24, 2021, 08:55 AM
4
votes
1
answers
507
views
SQL Server on linux unable to export to bacpac
I'm trying to export my database to a .bacpac file. I get the following error:
(provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)
The command I'm using is:
/opt/mssql/bin/sqlpackage /action:Export /SourceServerName:localhost /SourceDatabaseName:MyDB /targetfile:'/media/jake/1F171B4C75D06418/MyDB.bacpac'
The database I'm trying to back up is on a different ssd than my mssql installation. I don't know if that makes a difference.
I've been googling around for how to fix this, and most people on Windows had a similar problem solved by making sure that various Windows services were running, and starting them if they weren't. I'm not sure how to check for the same thing on Linux, but I think all the services I need are currently running, because I can connect to the database via `sqlcmd` as well as through JetBrains' DataGrip application. So I know that a connection is possible. I just don't know why `sqlpackage` is having trouble connecting...
I've also tried variations of this command on the `SourceServerName` parameter:
/opt/mssql/bin/sqlpackage /action:Export /SourceServerName:tcp:localhost,1433 /SourceDatabaseName:MyDB /targetfile:'/media/jake/1F171B4C75D06418/MyDB.bacpac'
And the error is slightly different:
(provider: TCP Provider, error: 0 - No such host is known.)
Jake Smith
(141 rep)
Jul 1, 2017, 07:42 PM
• Last activity: Jul 12, 2020, 05:06 PM
1
votes
1
answers
1984
views
What is the best way to include a large amount of configuration data in a SQL project?
I have a BIDS `SQL Server Database Project` with a database that I publish dynamically to various servers (SQL 2016). It initially starts blank and everything is parameterized. Everything is in Azure DevOps and deploys to places.
I want to include/pack in (mostly static) configuration **data** for a few tables. Initially I used the script wizard to script the tables, then just put that script text in my post-deployment file (`InitConfigs.PostDeployment1.sql`).
I've come across one config table that, when scripted, is a nearly 600 MB script. The size isn't the problem; it just seems like there must be a proper/better way to embed it in the project.
Some things I've tried:
* Export to a flat file and embed it somehow. - Seems messy, and I have to figure out a way to import it in my post-deployment script
* Export a single-table bacpac. - Doesn't seem to work, because it tries to include the entire database
* Use the script wizard to generate a giant `*.sql` file. - I can add it to the project and reference it with SQLCMD `:r`. Seems odd.
Is there some sort of data package dacpac/bacpac-style file that can wrap everything up nicely and can be easily imported? Perhaps compressed too?
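One further variation on the flat-file idea (a sketch, not an established feature of SQL projects): keep the data as a CSV in the repo and generate the post-deployment script from it at build time, so only the generator is hand-written. Table and column names below are hypothetical:

```python
import csv
import io

def rows_to_inserts(csv_text, table):
    # Turn CSV rows into INSERT statements for a post-deployment script.
    # Naive T-SQL string escaping: single quotes are doubled.
    reader = csv.DictReader(io.StringIO(csv_text))
    lines = []
    for row in reader:
        cols = ", ".join(row.keys())
        vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row.values())
        lines.append(f"INSERT INTO {table} ({cols}) VALUES ({vals});")
    return "\n".join(lines)

sample = "ConfigKey,ConfigValue\nRetryCount,5\nGreeting,It's on\n"
print(rows_to_inserts(sample, "dbo.AppConfig"))
```

For a 600 MB table this keeps the repository artifact small-ish and diffable, at the cost of a build step; everything is quoted as strings here, so typed columns would need extra handling.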
Alex K
(175 rep)
Apr 20, 2020, 11:40 PM
• Last activity: Apr 21, 2020, 03:50 PM
2
votes
1
answers
468
views
Importing a single large table in Azure
I have an existing DB called `MyDb` in Azure SQL Server.
I have a bacpac of another DB with several tables in it. I'm interested in importing **one single table** (that table has no FK, which makes things easier) into a dedicated table `MyDb.dbo.ImportedTable`. The final goal is to be able to do some data reconstruction using that table.
The problems are:
- `MyDb.dbo.ImportedTable` is ~60 GB large
- The main column in that table is an `NVARCHAR(MAX)`. That forbids me from using *Elastic queries* in Azure; they time out, since Elastic queries hate anything larger than `NVARCHAR(4000)` (I tried)
I guess a good approach is:
1. Use `BCP`, but I only have the binary `*.bcp` files (15,000 of them) that are inside the `bacpac` archive (opened as a zip, in its Data folder)
But I'm unable to make it work, especially because I find no documentation about the `*.bcp` file format used in the `bacpac`.
*tl;dr* What is a good approach to import a single ~60 GB table from a bacpac into an existing database in Azure SQL Server?
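One possible angle (a sketch under the assumption, matching what opening the bacpac as a zip shows, that each table's chunks live under `Data/<Schema.Table>/` and are native-format bulk copy data): extract just that table's `*.bcp` chunks and feed them to the `bcp` utility with the native-format flag (`-n`), chunk by chunk. The extraction part:

```python
import os
import zipfile

def extract_table_chunks(bacpac_path, table, dest_dir):
    # Pull only Data/<table>/*.BCP out of the bacpac (a zip archive).
    # Assumption: each chunk is native bulk-copy data, so once extracted
    # it could be loaded with something like: bcp <target> in <chunk> -n ...
    prefix = f"Data/{table}/"
    extracted = []
    with zipfile.ZipFile(bacpac_path) as z:
        for name in z.namelist():
            if name.startswith(prefix) and name.upper().endswith(".BCP"):
                target = os.path.join(dest_dir, os.path.basename(name))
                with z.open(name) as src, open(target, "wb") as dst:
                    dst.write(src.read())
                extracted.append(target)
    return extracted
```

Treat the native-format assumption as something to verify on one small chunk before scripting all 15,000.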
Askolein
(131 rep)
Dec 13, 2019, 06:12 PM
• Last activity: Dec 15, 2019, 03:03 AM
1
votes
0
answers
744
views
Import bacpac containing security policies and temporal-tables with schemabinding
I've created a `bacpac` backup of my database (both Azure and local with SSMS). Trying to import the `bacpac` to the same or another machine (again both Azure and local), the import always fails with this error:
>Error SQL72014: .Net SqlClient Data Provider: Msg 3729, Level 16, State 4, Line 1 Cannot ALTER 'SecTable' because it is being referenced by object 'sec_SecTable'. Error SQL72045: Script execution error. The executed script: ALTER TABLE [dbo].[SecTable] SET (SYSTEM_VERSIONING = OFF); ...
SQL-Server versions: 14.0.3223.3 and 15.0.1800.32.
Already tried different versions of SSMS (17, 18) and `SqlPackage.exe` (Visual Studio 2019 and SSMS 17/18) to execute the import/export of the `bacpac`.
The smallest DB creation script that will reproduce this problem is:
CREATE TABLE [dbo].[SecTable](
[Id] [uniqueidentifier] NOT NULL DEFAULT (newid()),
[SystemHistoryStart] [datetime2](7) GENERATED ALWAYS AS ROW START NOT NULL,
[SystemHistoryEnd] [datetime2](7) GENERATED ALWAYS AS ROW END NOT NULL,
[SecValue] [int] NOT NULL
CONSTRAINT [PK_SecTable] PRIMARY KEY CLUSTERED
(
[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY],
PERIOD FOR SYSTEM_TIME ([SystemHistoryStart], [SystemHistoryEnd])
) ON [PRIMARY]
WITH ( SYSTEM_VERSIONING = ON ( HISTORY_TABLE = [dbo].[SecTable_History]))
GO
-- Function
CREATE FUNCTION [dbo].[sf_SchemaBoundTableFunction]
(
@secValue int
)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
SELECT 1 AS Result
GO
-- security policy
CREATE SECURITY POLICY [dbo].[sec_SecTable]
ADD BLOCK PREDICATE [dbo].[sf_SchemaBoundTableFunction]([SecValue]) ON [dbo].[SecTable] BEFORE DELETE
WITH (STATE = ON, SCHEMABINDING = ON)
GO
Any suggestions or workarounds for how to solve this problem, or how to configure the import/export of the `bacpac` without removing the `schemabinding`? Is it maybe a bug, or am I doing something wrong?
AKL
(11 rep)
Sep 16, 2019, 03:22 PM
• Last activity: Sep 16, 2019, 05:12 PM
2
votes
0
answers
399
views
Is there a way to programmatically get the version of a .bacpac file without unzipping and inspecting the xml?
The situation is that we get a copy of the latest production bacpac, then restore it locally when setting up our dev environment. During the setup, we do an explicit check for the DacFX dll but target a specific version to warn the user if they are missing the required DacFx installation.
$DacPath = "C:\Program Files\Microsoft SQL Server\130\DAC"
// ... code to check x86 path if x64 not found and throw error if none
$DacAssembly = "$DacPath\bin\Microsoft.SqlServer.Dac.dll"
We *COULD* continue to just update it as Azure updates the SQL instance, but I feel there's probably a more sane way to check the export version of the bacpac. This would allow us to automatically roll forward our version check as it increases.
Failed, my google-fu has...
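For reference, since a bacpac is a zip archive, peeking at the model's version attributes is cheap even without fully unpacking it. A sketch; the attribute names on `model.xml`'s root element (`SchemaVersion`, `FileFormatVersion`) are assumptions to verify against a real bacpac:

```python
import xml.etree.ElementTree as ET
import zipfile

def bacpac_model_versions(bacpac_path):
    # Read version-ish attributes straight off model.xml's root element
    # without extracting the archive to disk. Attribute names are an
    # assumption - check them against an actual export.
    with zipfile.ZipFile(bacpac_path) as z:
        root = ET.fromstring(z.read("model.xml"))
    return {k: v for k, v in root.attrib.items() if "Version" in k}
```

This still "inspects the xml", but only a few KB of it, so it can run as a fast pre-flight check before picking a DacFx dll.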
Doc Rinehart
(21 rep)
Jun 14, 2018, 03:25 PM
• Last activity: Jun 14, 2018, 03:54 PM
2
votes
2
answers
4373
views
Create bacpac from SQL Server 2008
Is it possible to export bacpac file from SQL SERVER 2008 (running 10.50.6220)? I can only see option for dacpac.
JS_Diver
(165 rep)
Apr 25, 2017, 10:07 AM
• Last activity: Apr 27, 2017, 07:46 PM
Showing page 1 of 18 total questions