Jun 7, 2017 11:00:00 AM by Denny Cherry
One of my clients recently needed to upload tens of terabytes of data into Azure Blob Storage. This gave us the perfect opportunity to use the Azure Import/Export Service, which allows you to ship hard drives to Microsoft with your data on them; the data is then copied up to Azure Blob Storage, and you can move it around from there as needed. All of this is done using the Azure Import/Export Tool, which is a command line tool with a few quirks.
The biggest quirk we ran into getting the Azure Import/Export Tool working was that it doesn’t support quotes around parameter values unless there are spaces in the folder or file names. So the fix here: don’t use spaces, and don’t use quotes. For example, at first I was trying to use the following in my command line.
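The exact command from the post wasn’t preserved, but a hedged sketch of the quoted form that failed would look something like this (the paths, session name, and CSV file names are all hypothetical):

```
WAImportExport.exe PrepImport /j:"C:\Journal\Journal1.jrn" /id:session1 /logdir:"C:\AzureLogs" /InitialDriveSet:"C:\Config\Drives.csv" /DataSet:"C:\Config\DataSet.csv"
```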
But what I needed to use was actually this.
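The working command isn’t preserved either; the same hypothetical invocation with the quotes stripped out would look like:

```
WAImportExport.exe PrepImport /j:C:\Journal\Journal1.jrn /id:session1 /logdir:C:\AzureLogs /InitialDriveSet:C:\Config\Drives.csv /DataSet:C:\Config\DataSet.csv
```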
That’s a pretty small difference, but an important one. And the error message that the tool gives doesn’t say that you have invalid characters in the logdir parameter; it just tells you that you have invalid characters in the path or file name. That means it could be any of the paths or file names, and you have to specify several of them, including the journal file (/j), the log folder (/logdir), the drive configuration layout for which drives it’s writing to (/InitialDriveSet), and which folders to pull onto those drives (/DataSet).
Another annoying thing was that WAImportExport.exe didn’t like having /DataSet at the right-hand end of the command line; it only worked when it was at the left-hand side. Now, this was before I figured out the double-quotes issue, and that may have been part of it, but with double quotes on all the parameters and the DataSet parameter on the right-hand side, it complained that no DataSet parameter was provided.
When configuring the DataSet CSV file, you need to put the container name in all lowercase.
Overall I was pretty impressed with how the process worked. The CSV files were pretty easy to put together. The DataSet file that you provide just tells the application which folders to move where. In this example I’m moving the files from C:\something to the “something” container in my blob storage account (you can use a network share instead of a local folder).
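The post’s original CSV isn’t reproduced here, but based on the documented DataSet file format, a file that maps C:\something to a container named “something” would look roughly like this (note the all-lowercase container name in the destination prefix):

```
BasePath,DstBlobPathOrPrefix,BlobType,Disposition,MetadataFile,PropertiesFile
"C:\something\",something/,BlockBlob,rename,"None",None
```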
In the InitialDriveSet parameter you tell the application which drives that are attached to your computer that it’s using to ship the data.
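Again as a hedged sketch of the documented InitialDriveSet format (the drive letter is hypothetical and the BitLocker key is a placeholder), a single drive that is already formatted and already encrypted would be listed like this:

```
DriveLetter,FormatOption,SilentOrPromptOnFormat,Encryption,ExistingBitLockerKey
X,AlreadyFormatted,SilentMode,AlreadyEncrypted,<existing-BitLocker-key>
```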
In my case the drive was already formatted and the disk was already BitLocker-encrypted.
The application has some quirks, like I said earlier in the post. But once those were figured out, it was pretty easy.
Tags: SQL Server
Written by Denny Cherry
I am a Senior SQL Server DBA at CDW with 10 years of IT experience, mostly as a software developer building web and windows based applications (VB, VB.NET, C#, C++ and a smidge of Java). I have always found database design and set based logic interesting, so 3 years ago I took the plunge and became a DBA, soon after I discovered people would tell anyone who would listen all about the SQL Server internals. I was hooked. I have not looked back since. The things I say represent my opinion and in no way represent the views or opinions of my employer or coworkers.