Importing a single large table in Azure
2 votes, 1 answer, 468 views
I have an existing DB called `MyDb` in Azure SQL Server.
I have a bacpac of another DB with several tables in it. I'm interested in importing **one single table** (that table has no FK, which makes things easier) into a dedicated table `MyDb.dbo.ImportedTable`. The final goal is to be able to do some data reconstruction using that table.
The problems are:
- `MyDb.dbo.ImportedTable` is ~60 GB large
- The main column in that table is an `NVARCHAR(MAX)`. That rules out *Elastic queries* in Azure: they time out on anything larger than `NVARCHAR(4000)` (I tried)
I guess a good approach is:
1. Use `BCP`, but I only have the binary `*.bcp` files (15'000 of them) that are inside the bacpac archive (opened as a zip, in its `Data` folder).

I'm unable to make it work, though, especially because I can find no documentation about the `*.bcp` file format used in the bacpac.
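For reference, the per-table `*.bcp` chunks can at least be pulled out of the archive programmatically, since a bacpac is an ordinary zip with one `Data/<Schema>.<Table>/` folder per table. Below is a minimal, hedged sketch of that extraction step only (the table name `dbo.ImportedTable` and the chunk file names are illustrative; the internal layout of each `.bcp` chunk itself remains undocumented, which is the actual blocker):

```python
import os
import tempfile
import zipfile

def extract_table_bcp(bacpac_path, table_folder, dest_dir):
    """Extract all *.bcp chunks for one table from a bacpac.

    A bacpac is a zip archive; each table's data sits under
    Data/<Schema>.<Table>/ as numbered *.bcp files.
    """
    prefix = f"Data/{table_folder}/"
    extracted = []
    with zipfile.ZipFile(bacpac_path) as z:
        for name in z.namelist():
            if name.startswith(prefix) and name.endswith(".bcp"):
                target = os.path.join(dest_dir, os.path.basename(name))
                with z.open(name) as src, open(target, "wb") as out:
                    out.write(src.read())
                extracted.append(target)
    return sorted(extracted)

# Demo with a synthetic mini-"bacpac" (hypothetical contents, for illustration):
tmp = tempfile.mkdtemp()
bacpac = os.path.join(tmp, "demo.bacpac")
with zipfile.ZipFile(bacpac, "w") as z:
    z.writestr("Data/dbo.ImportedTable/TableData-000.bcp", b"\x00\x01")
    z.writestr("Data/dbo.ImportedTable/TableData-001.bcp", b"\x02\x03")
    z.writestr("Data/dbo.OtherTable/TableData-000.bcp", b"\xff")

files = extract_table_bcp(bacpac, "dbo.ImportedTable", tmp)
print(len(files))  # prints 2
```

Whether the extracted chunks can then be fed to the `bcp` utility in native mode is exactly what the question is asking; the extraction itself is the easy part.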
*tl;dr* What is a good approach to import a single ~60 GB table from a bacpac into an existing database in Azure SQL Server?
Asked by Askolein
(131 rep)
Dec 13, 2019, 06:12 PM
Last activity: Dec 15, 2019, 03:03 AM