I feel obliged to post some info even if it's not a full answer.
I was getting this error:
Msg 4860, Level 16, State 1, Line 58 Cannot bulk load. The file "container/folder/file.txt" does not exist or you don't have file access rights.
I believe the problem might have been that I generated my SAS key with a start time of "right now", but that time is UTC, meaning that here in Australia the key would only become valid in ten hours' time. So I generated a new key with a start date a month earlier and it worked.
The SAS (Shared Access Signature) is a big string that you generate from the Shared Access Signature area of your storage account in the Azure portal. Remove the leading ? before pasting it into your SQL script.
Below is my full script with comments.
-- Target staging table
IF object_id('recycle.SampleFile') IS NULL
CREATE TABLE recycle.SampleFile
(
Col1 VARCHAR(MAX)
);
-- more info here
-- https://blogs.msdn.microsoft.com/sqlserverstorageengine/2017/02/23/loading-files-from-azure-blob-storage-into-azure-sql-database/
-- You can use this to conditionally create the master key
select * from sys.symmetric_keys where name like '%DatabaseMasterKey%'
-- Run once to create a database master key
-- Can't create credentials until a master key has been generated
-- Here, zzz is a password that you make up and store for later use
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'zzz';
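If you want the check and the create in one step, here is a minimal sketch (same made-up 'zzz' password, and assuming the key name contains 'DatabaseMasterKey' as in the query above):
-- Only create the master key if it doesn't already exist
IF NOT EXISTS (SELECT * FROM sys.symmetric_keys WHERE name LIKE '%DatabaseMasterKey%')
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'zzz';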
-- Create a database credential object that can be reused for external access to Azure Blob
CREATE DATABASE SCOPED CREDENTIAL BlobTestAccount
WITH
-- Must be SHARED ACCESS SIGNATURE to access blob storage
IDENTITY= 'SHARED ACCESS SIGNATURE',
-- Generated from Shared Access Signature area in Storage account
-- Make sure the start date is at least a few days before
-- otherwise UTC can mess you up because it might not be valid yet
-- Don't include the ? or the endpoint. It starts with 'sv=', NOT '?' or 'https'
SECRET = 'sv=2016-05-31&zzzzzzzzzzzz';
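If the SAS later expires (or the start time is wrong, as happened to me above), you don't have to drop and recreate everything; here is a sketch of rotating just the secret on the same credential (the token value is a placeholder):
-- Swap in a freshly generated SAS token without touching the data source
ALTER DATABASE SCOPED CREDENTIAL BlobTestAccount
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2016-05-31&new-token-goes-here';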
-- Create the external data source
-- Note location starts with https. I've seen examples without this but that doesn't work
CREATE EXTERNAL DATA SOURCE BlobTest
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://yourstorageaccount.blob.core.windows.net',
CREDENTIAL= BlobTestAccount);
BULK INSERT recycle.SampleFile
FROM 'container/folder/file'
WITH ( DATA_SOURCE = 'BlobTest');
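My file is just one big VARCHAR(MAX) column per row. If your file is delimited, you can add the usual BULK INSERT options; a sketch, assuming a comma-delimited file with a header row and a target table whose columns match the file (recycle.DelimitedSample, the file name and the terminators are placeholders):
BULK INSERT recycle.DelimitedSample
FROM 'container/folder/file.csv'
WITH (
    DATA_SOURCE = 'BlobTest',
    FORMAT = 'CSV',           -- CSV parsing is available in Azure SQL Database / SQL Server 2017+
    FIRSTROW = 2,             -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a'
);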
-- If you're fancy you can use these to work out if your things exist first
select * from sys.database_scoped_credentials
select * from sys.external_data_sources
DROP EXTERNAL DATA SOURCE BlobTest;
DROP DATABASE SCOPED CREDENTIAL BlobTestAccount;
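If you want the script to be safely re-runnable, the existence checks and the drops can be combined; a minimal sketch using the same object names:
-- Only drop objects that actually exist (data source first, since it depends on the credential)
IF EXISTS (SELECT * FROM sys.external_data_sources WHERE name = 'BlobTest')
    DROP EXTERNAL DATA SOURCE BlobTest;
IF EXISTS (SELECT * FROM sys.database_scoped_credentials WHERE name = 'BlobTestAccount')
    DROP DATABASE SCOPED CREDENTIAL BlobTestAccount;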
One thing that this won't do that ADF does is pick up a file based on a wildcard. That is, if I have a file called ABC_20170501_003.TXT, I need to list it explicitly in the BULK INSERT load script, whereas in ADF I can just specify ABC_20170501 and it automatically wildcards the rest.
Unfortunately there is no (easy) way to enumerate files in blob storage from SQL Server. I eventually got around this by using Azure Automation to run a PowerShell script that enumerates the files and registers them into a table that SQL Server can see. This sounds complicated, but Azure Automation is a very useful tool to learn and use, and it works very reliably.
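For illustration, here is a sketch of the SQL side of that pattern; recycle.FileList and its FileName/Loaded columns are hypothetical names for the table the PowerShell runbook populates, one row per blob:
DECLARE @FileName NVARCHAR(400), @Sql NVARCHAR(MAX);
-- Loop over the blobs the runbook registered and bulk insert each one
DECLARE file_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT FileName FROM recycle.FileList WHERE Loaded = 0;
OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @FileName;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @Sql = N'BULK INSERT recycle.SampleFile FROM ''' + @FileName
             + N''' WITH (DATA_SOURCE = ''BlobTest'');';
    EXEC sys.sp_executesql @Sql;
    UPDATE recycle.FileList SET Loaded = 1 WHERE FileName = @FileName;
    FETCH NEXT FROM file_cursor INTO @FileName;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;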
More opinions on ADF:
I couldn't find a way to pass the filename that I loaded (or other info) into the database.
Do not use ADF if you need data to be loaded in the order it appears in the file (i.e. as captured by an identity field). ADF will try and do things in parallel. In fact, my ADF did insert things in order for about a week (i.e. as recorded by the identity) then one day it just started inserting stuff out of order.
The timeslice concept is useful in limited circumstances (when you have cleanly delineated data in cleanly delineated files that you want to drop neatly into a table). In any other circumstances it is complicated, unwieldy and difficult to understand and use. In my experience real world data needs more complicated rules to work out and apply the correct merge keys.
I don't know the cost difference between importing files via ADF versus BULK INSERT, but ADF is slow. I don't have the patience to hack through Azure blades to find metrics right now, but you're talking 5 minutes in ADF vs 5 seconds with BULK INSERT.
Try Azure Data Factory V2. It is vastly improved, and you are no longer bound to timeslices.