Catching errors in bulk insert

0 votes
0 answers
201 views
I am currently adding a new feature to my ASP.NET website, the goal is to upload a CSV file and have it processed by a SQL Server stored procedure. To perform the bulk insert, I am using this command in my procedure:
SET @SQLBulk = '
        BEGIN TRANSACTION
        BULK INSERT '+ @Prod_Table +
        ' FROM ''' + @CSVFilePath + '''
        WITH
        (
            ERRORFILE = '''+ @ErrorFile +''',
            CHECK_CONSTRAINTS,
            FIELDTERMINATOR = '';'',
            ROWTERMINATOR = ''\n'',
            FIRSTROW = 2,
            FORMAT = ''CSV'',
            MAXERRORS = 10
        )
        COMMIT TRANSACTION'
         
    -- Execute the dynamic query
    BEGIN TRY
        PRINT CAST(@SQLBulk as ntext)
        EXEC sp_executesql @SQLBulk
    END TRY
    BEGIN CATCH
    ROLLBACK TRANSACTION
 
    IF ERROR_NUMBER() IN (100, 7330) -- Format/syntax error in the file
        BEGIN
            SET @ErrorMessage = 'ERROR:: The file is badly formatted: it does not contain the right number of fields and/or contains the wrong data types'
        END
    IF ERROR_NUMBER() IN (547) -- Referential integrity (foreign key / check) constraint violation
        BEGIN
            SET @ErrorMessage = 'ERROR:: ...'
        END
    IF ERROR_NUMBER() IN (515) -- NOT NULL constraint violation (515: cannot insert NULL)
        BEGIN
            SET @ErrorMessage = 'ERROR:: ...'
        END
    IF ERROR_NUMBER() IN (2627, 2601) -- Unique constraint / unique index violation
        BEGIN
            SET @ErrorMessage = 'ERROR:: ...'
        END
    RAISERROR (@ErrorMessage, 15, 1)
 
    END CATCH
 
END
Now I want to log errors related to constraints (unique, foreign key, etc.). I've seen that the ERRORFILE option lets me store errors in log files, but unfortunately it only covers errors related to the file itself (wrong data type, wrong number of fields, etc.), not data integrity.

Specifically, my problem is that when BULK INSERT raises an error, I have no way of knowing which line caused it, and as soon as there is a single bad row, everything after it is blocked. So if my file contains 1000 rows of which 50 are bad, I would have to re-upload the file up to 50 times to get everything inserted.

What I want is to process the entire file in one pass: insert everything if there are no errors, and if there are errors, log them (in a table or a file, whichever is convenient). Thank you for your help!
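For reference, here is the kind of pattern I have in mind (a rough, untested sketch; the table names `Products`, `Products_Staging`, and `Products_Errors` and the column layout are made up for illustration): bulk insert into a permissive staging table with no constraints, then move rows into the real table one at a time, logging each row that fails.

```sql
-- Sketch only: Products, Products_Staging and Products_Errors are hypothetical names.
-- 1) Load the raw file into a staging table with no constraints (e.g. all NVARCHAR
--    columns), so BULK INSERT itself never fails on integrity rules.
BULK INSERT Products_Staging
FROM 'C:\uploads\products.csv'
WITH (FORMAT = 'CSV', FIRSTROW = 2, FIELDTERMINATOR = ';', ROWTERMINATOR = '\n');

-- 2) Move rows one by one into the real table; log the rows that violate constraints.
DECLARE @Id INT, @Name NVARCHAR(100), @Price NVARCHAR(50);
DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT Id, Name, Price FROM Products_Staging;
OPEN cur;
FETCH NEXT FROM cur INTO @Id, @Name, @Price;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        INSERT INTO Products (Id, Name, Price)
        VALUES (@Id, @Name, TRY_CAST(@Price AS DECIMAL(10, 2)));
    END TRY
    BEGIN CATCH
        -- Record the offending row together with the SQL Server error details.
        INSERT INTO Products_Errors (Id, Name, Price, ErrorNumber, ErrorMessage)
        VALUES (@Id, @Name, @Price, ERROR_NUMBER(), ERROR_MESSAGE());
    END CATCH;
    FETCH NEXT FROM cur INTO @Id, @Name, @Price;
END;
CLOSE cur;
DEALLOCATE cur;
```

A set-based variant (pre-filtering the staging rows with `TRY_CAST` and `NOT EXISTS` checks before a single `INSERT ... SELECT`) would scale better than a cursor for large files, but the row-by-row version makes the per-row error logging explicit.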
Asked by Helmi KAMEL (1 rep)
Apr 8, 2024, 09:42 AM