Migrating Data to SQL Azure Using BCP
The bcp utility bulk-copies data between an instance of Microsoft SQL Server and a data file, which makes it useful for moving data between servers. This utility is installed with SQL Server and requires no knowledge of T-SQL syntax. If you aren't familiar with the bcp utility, don't confuse its functionality with that of the Import/Export Wizard in SQL Server. Although the bcp documentation refers to what bcp does as a bulk copy, be aware that you can't bcp data from a source into a destination with a single statement. You must first bcp the data out of the source; then, you can bcp the data in to the destination.
This article describes how to use the bcp utility to export data from a local database and import it into your SQL Azure database. It also discusses some things to watch out for when using the bcp utility with SQL Azure.
The bcp utility has no GUI; it's a command prompt-driven utility. But don't let that intimidate you, especially given what you're using it for here: although its many options can make it seem overwhelming at first, basic usage is quite simple. The basic syntax for the bcp utility is as follows:
bcp table direction filename -S servername -U username -P password
- table is the source or destination table based on the direction parameter.
- direction is in or out, depending on whether you're copying data into the database or out of the database.
- filename is the filename you're copying data to or from.
- servername is the name of the server you're copying data to or from.
- username is the username used to connect to either the local or SQL Azure database.
- password is the password associated with the username.
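Putting those pieces together, a minimal export command might look like the following sketch. The server name, credentials, and file path are placeholders for your own environment, and the -n and -q switches are explained shortly:

```shell
# Sketch only: server name, credentials, and path are placeholders.
# Exports dbo.Users to a native-format data file (-n), with quoted
# identifiers enabled on the connection (-q).
bcp TechBio.dbo.Users out c:\data\users.dat -S localhost -U sa -P MyPassword -n -q
```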
Let's get started by exporting the data from your source database.
Exporting the Data
Begin by copying data out of your local SQL instance. Open a command prompt, and type the command shown in Figure 1. Enter your own values for the server name, the target directory, and the username and password for your local server. (The password is blanked out in Figure 1.)
Figure 1 - Using bcp to Export Data
Notice that in this example you're using the out keyword for the direction parameter. That's because you're copying data out of SQL Server.
The -n parameter performs the bulk-copy operation using the native database data types of the data. The -q parameter executes the SET QUOTED_IDENTIFIER ON statement in the connection between the bcp utility and your SQL Server instance.
After you type in the command, press the Enter key to execute the bcp utility. In mere milliseconds, 105 rows are exported and copied to the user.dat file. Now, do the same for the Docs and UserDocs tables.
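The export commands for the other two tables follow the same pattern; for example, using the same placeholder server name and paths as before:

```shell
# Export the Docs and UserDocs tables the same way as Users.
bcp Techbio.dbo.Docs out c:\scott\docs.dat -S Scott-PC -U sa -P Password -n -q
bcp Techbio.dbo.UserDocs out c:\scott\userdoc.dat -S Scott-PC -U sa -P Password -n -q
```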
Importing the Data
The next step is to copy the data into the cloud, specifically to your SQL Azure TechBio database. The syntax for copying data into a database is very similar to the syntax for copying data out: you use the in keyword and specify the server name and credentials for your SQL Azure database, as shown in Figure 2.
Figure 2 - Uniqueidentifier data type error during bcp import
After you type in the command, press Enter to execute the bcp utility. Only one row is copied over, and then an error is generated, stating that an unexpected end-of-file (EOF) was encountered. This error isn't specific to SQL Azure; the bcp utility has issues with columns of the uniqueidentifier data type. You can find posts and blogs all over the Internet about this problem. The solution is to execute the following T-SQL against the Users table in your SQL Azure database:
ALTER TABLE Users DROP COLUMN rowguid
The cool thing is that you don't need to re-export the data. You can re-execute the bcp import command. Do that, as shown in Figure 3, and all 105 rows are imported. Then, use the same syntax to import the Docs and UserDocs data.
Figure 3 - Successful bcp import
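As an aside, when an import fails partway like this, bcp's -e switch can make troubleshooting easier: it writes rows that couldn't be loaded to an error file you can inspect afterward. A hypothetical example, with placeholder server name and paths:

```shell
# Sketch: same import command as before, plus -e so that any rejected
# rows are written to c:\scott\user.err for inspection.
bcp Techbio.dbo.Users in c:\scott\user.dat -S servername.database.windows.net -U SQLScott@servername -P Password -n -q -e c:\scott\user.err
```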
Don't forget to put the rowguid column back on the Users table. You can do this by using the same syntax as before:
ALTER TABLE Users ADD rowguid uniqueidentifier
Next, let's make this a little simpler and put the export and import together, so you aren't running the statements one at a time.
Putting the Export and Import Together
You can put the export and import processes together into a single file to make them easier to use. To do so, open Notepad, and type in the following, replacing the italicized information with the appropriate information for your environment:
bcp Techbio.dbo.Users out c:\scott\user.dat -S Scott-PC -U sa -P Password -n -q
bcp Techbio.dbo.Docs out c:\scott\docs.dat -S Scott-PC -U sa -P Password -n -q
bcp Techbio.dbo.UserDocs out c:\scott\userdoc.dat -S Scott-PC -U sa -P Password -n -q
bcp Techbio.dbo.Users in c:\scott\user.dat -S servername.database.windows.net -U SQLScott@servername -P Password -n -q
bcp Techbio.dbo.Docs in c:\scott\docs.dat -S servername.database.windows.net -U SQLScott@servername -P Password -n -q
bcp Techbio.dbo.UserDocs in c:\scott\userdoc.dat -S servername.database.windows.net -U SQLScott@servername -P Password -n -q
Save the file as AzureBCP.cmd, and navigate to its location. Double-click the file to execute it. A command window appears, showing the results of the bcp commands that export and import the data.
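If you have a POSIX shell available, the six commands can also be generated from a table list instead of typed by hand. This is only a sketch: the server names, credentials, and paths are placeholders, and it prints the commands rather than running them, so you can review them first:

```shell
#!/bin/sh
# Sketch: build the export/import command pair for each table.
# Scott-PC, servername, SQLScott, Password, and the c:\scott paths
# are all placeholders for your environment.
cmds=""
for table in Users Docs UserDocs; do
  datafile="c:\\scott\\${table}.dat"
  cmds="${cmds}bcp Techbio.dbo.${table} out ${datafile} -S Scott-PC -U sa -P Password -n -q
bcp Techbio.dbo.${table} in ${datafile} -S servername.database.windows.net -U SQLScott@servername -P Password -n -q
"
done
# Print the generated commands instead of executing them.
printf '%s' "$cmds"
```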
As stated earlier, SQL Server BOL is full of information about how to use the bcp utility. This section is a brief introductory look at how to use it to move data from your local SQL Server instance to SQL Azure. The bcp utility is a bulk-copy method of moving data: it lacks SSIS's ability to convert data from one data type to another, as well as SSIS's workflow components. But if all you're interested in is moving data from one table to a similar destination table, bcp is your best friend.
Scott Klein is co-founder of Blue Syntax, a consulting and training company specializing in the Azure platform. Scott has been working with SQL Server for nearly 20 years, with companies large and small in fields such as medicine, finance, and retail. Scott is a veteran author who has written a number of books, including the soon-to-be-released Pro SQL Azure. Scott is a SQL Server MVP and President of the South Florida SQL Server Users Group and the newly formed Azure PASS Virtual Chapter. Scott also organizes and runs South Florida SQL Saturday and speaks frequently at SQL Saturday and .NET Code Camp events as well as user groups. He was lucky enough to speak at the 2008 European PASS conference and recently returned from Germany, where he provided a 3-day deep-dive training at the PASS Camp.
His online presence includes: