
I've been trying to run the following script to read a file from Azure Blob Storage.

--------------------------------------------
-- CREATING CREDENTIAL (shared access signature)
--------------------------------------------
CREATE DATABASE SCOPED CREDENTIAL dlcred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-12-01T07:28:58Z&st=2019-08-31T23:28:58Z&spr=https,http&sig=<signature from storage account>';

--------------------------------------------
-- CREATING SOURCE
--------------------------------------------

CREATE EXTERNAL DATA SOURCE datalake
WITH (
    TYPE =  BLOB_STORAGE,
    LOCATION='https://<storageaccount>.blob.core.windows.net/<blob>', 
    CREDENTIAL = dlcred
);
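
Side note: the SAS token itself can be sanity-checked outside SQL Server with the Azure CLI - a rough sketch of such a check is below, where the account, container and token are placeholders:

# List the blobs in the container using the same SAS token
az storage blob list \
    --account-name <storageaccount> \
    --container-name <container> \
    --sas-token "<sas token>" \
    --output table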

Originally, the script worked just fine, but later on it started giving the following error when running the last query below:

Cannot bulk load because the file "test.txt" could not be opened. Operating system error code 86 (The specified network password is not correct.)

--------------------------------------------
-- TEST
--------------------------------------------
SELECT CAST(BulkColumn AS XML)
FROM OPENROWSET
(
 BULK 'test.xml',
 DATA_SOURCE = 'datalake', 
 SINGLE_BLOB
) as xml_import

The same error happens if I create the credential with a service principal or an access key instead. I've tried literally everything and logged a ticket with Azure support, but they are struggling to replicate the error.

I feel like the issue lies outside the storage account and the SQL Server itself - Azure has a whole bunch of services that can be activated or deactivated against a subscription, and I suspect one of them is preventing us from successfully mapping the storage account.

Has anyone encountered this error? And if so, how did you solve it?

Do you want to load a file from Azure Data Lake Storage into Azure SQL? – Jim Xu
Did you find a solution for this? – Ibrahim Magdy
I'm working through the same problem right now, I'll let you know what I find. – James McShane

1 Answer


I was able to get this issue resolved with Microsoft support. I granted the Storage Blob Data Contributor role on the storage account to the managed identity of the SQL Server instance, then ran the SQL statements from the managed-identity example in section F of the BULK INSERT documentation: https://docs.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-ver15#f-importing-data-from-a-file-in-azure-blob-storage.
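
The role assignment itself can be done in the portal or with the Azure CLI. A rough sketch of the CLI route (all resource names below are placeholders for your own environment):

# Look up the principal ID of the SQL server's managed identity
az sql server show \
    --name <server-name> \
    --resource-group <resource-group> \
    --query identity.principalId --output tsv

# Grant that identity the Storage Blob Data Contributor role on the storage account
az role assignment create \
    --assignee <principal-id-from-above> \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"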

Preserving the code solution below:

CREATE DATABASE SCOPED CREDENTIAL msi_cred WITH IDENTITY = 'Managed Identity';
GO
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH ( TYPE = BLOB_STORAGE,
          LOCATION = 'https://****************.blob.core.windows.net/curriculum'
          , CREDENTIAL= msi_cred );

BULK INSERT Sales.Invoices
FROM 'inv-2017-12-08.csv'
WITH (DATA_SOURCE = 'MyAzureBlobStorage');

In order to do this, the SQL Server instance requires a managed identity to be assigned to it. This can be done at creation time with the --assign-identity flag.
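
A rough sketch of that with the Azure CLI (names are placeholders; --assign-identity is the part that matters):

# Create the logical SQL server with a system-assigned managed identity
az sql server create \
    --name <server-name> \
    --resource-group <resource-group> \
    --location <region> \
    --admin-user <admin-login> \
    --admin-password <admin-password> \
    --assign-identity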