Copy Azure SQL Database to local machine (SQL 2008 R2)

You would have thought that in these days of point and click, a simple task like copying a database from one place to another would be a walk in the park. So when I wanted to take a copy of a SQL database from Azure and move it to my local machine, I thought it would surely be straightforward. It’s not. Here’s what I came up with:

In Azure make sure you have a Storage account set up, with a container inside. My container is called ‘files’.

In Azure go to your database. Click the Export button down the bottom.

A new window will pop up. The file name will probably be pre-filled. Select your blob storage account (if you only have one then it’s already selected). Select a container (mine is ‘files’). Enter your SQL Server login name and password – these are the credentials you set up when you added the database. Click OK and the export should start – you’ll see the status down the bottom.

Once the export is done, you can click to view your Storage container, and you should see the export file there.

The trick now is to get the file from Azure to your local machine. I use Azure Storage Explorer, which is a free download. Use it to download the file from your container.

If you have SQL Server 2012, then you are home free, because you can import the BACPAC file in SQL Server Management Studio: right-click the Databases folder under a server in the Object Explorer and choose “Import Data-tier Application”.

If like me you are on SQL Server 2008 R2, you need some help.

Create a new blank database on your local machine.
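
If you prefer to script this step rather than click through SSMS, here’s a minimal T-SQL sketch (the name AzureCopy is just a placeholder – use whatever you plan to pass to SqlPackage below):

    -- Hypothetical name: match whatever you use as DATABASENAME below.
    CREATE DATABASE AzureCopy;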

Go to http://msdn.microsoft.com/en-us/jj650014 and install the SQL Server Data Tools for Visual Studio 2010. After installing, go and find SqlPackage.exe; I located it in C:\Program Files\Microsoft SQL Server\110\DAC\bin. Open up a DOS prompt and use the following command:

SqlPackage.exe /a:Import /sf:c:\Desktop\FILENAME.bacpac /tdn:DATABASENAME /tsn:SQLSERVERNAME
  • FILENAME.bacpac is the file you downloaded from Azure, wherever it is located on your machine.
  • DATABASENAME is the name of the new blank database you created a minute ago.
  • SQLSERVERNAME is the name of the SQL server on your local machine.

I followed these steps, albeit for a fairly small file, and it worked perfectly.

nvarchar(max) doesn’t work on Azure SQL with Classic ASP

This is one of those completely obscure issues that one hits from time to time – nvarchar(max) fields behave oddly on Microsoft Azure SQL – yet it all works fine on SQL Server 2008 on my local machine.

I battled this for an hour before figuring out that trying to return data from an nvarchar(max) column via Classic ASP on Azure yields a blank value (or weird binary garbage) – and simultaneously stumbling across someone else who’s hit exactly the same problem recently.

It’s fine to have a table with an nvarchar(max) column in Azure. I can push data into it, and I can see the data via SSMS, but returning it via ASP – nada.
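
To make that concrete, here’s a hypothetical sketch of the kind of table involved (the names are mine, not from my actual schema):

    -- Hypothetical table: Body is the column that comes back blank
    -- when read through Classic ASP against Azure SQL.
    CREATE TABLE dbo.Articles (
        ArticleId int IDENTITY(1,1) PRIMARY KEY,
        Body nvarchar(max)
    );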

I’m returning the data via a UDF, so I altered the table definition returned by the function so that the column is nvarchar(4000). The function grabs the data from the (max) column and sticks it into the output column without trouble, and it returns values fine – I don’t even need a CAST.
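
As a rough illustration of that workaround, reusing the hypothetical dbo.Articles table above (the function name and shape are my assumptions, not my production code):

    -- Sketch: the base column is nvarchar(max), but the function's return
    -- table declares nvarchar(4000), which Classic ASP on Azure reads fine.
    -- Values longer than 4000 characters won't fit, hence the restriction.
    CREATE FUNCTION dbo.GetArticleBody (@ArticleId int)
    RETURNS @Result TABLE (Body nvarchar(4000))
    AS
    BEGIN
        INSERT INTO @Result (Body)
        SELECT Body  -- implicit conversion from nvarchar(max)
        FROM dbo.Articles
        WHERE ArticleId = @ArticleId;
        RETURN;
    END;

The ASP page can then select from the function instead of reading the (max) column directly.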

It’s frustrating because I expect these columns to hold large blocks of text, and now I’m restricted to 4,000 characters.

I haven’t had time to explore whether messing with the connection strings might do the trick.