I have a CSV file in Azure Blob storage and I want to insert its contents into an MS SQL table. The CSV file and the table I'm using in my code have the same number of columns.
I used the BULK INSERT command as described here: Importing data from a file in Azure blob storage.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'xxxxxxxxx';
CREATE DATABASE SCOPED CREDENTIAL AzureBlobStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<my SAS token here, without the leading question mark (?)>';
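For reference, the SECRET I'm passing is just the query-string portion of the SAS URL with the leading '?' removed; a hypothetical example of the shape (placeholder values only, not my real token):
CREATE DATABASE SCOPED CREDENTIAL ExampleBlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
-- placeholder SAS query string: no leading '?', real signature elided
SECRET = 'sv=2021-08-06&ss=b&srt=co&sp=rl&se=2024-12-31T23:59:59Z&sig=<signature>';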
CREATE EXTERNAL DATA SOURCE GSCSVFileAzureBlobStorage
WITH ( TYPE = BLOB_STORAGE,
LOCATION = 'https://myaccount.blob.core.windows.net/csvfileshare',
CREDENTIAL= AzureBlobStorageCredential );
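As a sanity check, I assume queries like these should show the credential and the data source exactly as created above:
-- list the scoped credentials and external data sources in this database
SELECT name, credential_identity FROM sys.database_scoped_credentials;
SELECT name, type_desc, location, credential_id FROM sys.external_data_sources;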
BULK INSERT RawItemData
FROM 'itemdata_csv_test.csv'
WITH (DATA_SOURCE = 'GSCSVFileAzureBlobStorage',
FORMAT = 'CSV',
FIELDTERMINATOR = ',',
FIRSTROW = 2);
If I use a local file path, the bulk insert works as expected, but when I read the file from blob storage I get this error:
Cannot bulk load because the file "itemdata_csv_test.csv" could not be opened. Operating system error code 32 (The process cannot access the file because it is being used by another process.).
How can I find out exactly where the issue is?
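For example, would a minimal read test like the sketch below (my assumption, using OPENROWSET against the same external data source) be a reasonable way to tell whether the problem is in the blob access itself rather than in the load into the table?
-- read the raw file through the same external data source;
-- if this also fails with OS error 32, the problem is on the storage/SAS side, not the target table
SELECT BulkColumn
FROM OPENROWSET(
    BULK 'itemdata_csv_test.csv',
    DATA_SOURCE = 'GSCSVFileAzureBlobStorage',
    SINGLE_CLOB) AS FileContents;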