How to Copy SQL Server Database in Amazon RDS

The easiest way of duplicating a database in SQL Server is by using backup and restore. Unfortunately, this feature is not available in Amazon RDS because there is no file system access.

This is a walkthrough for copying a database in RDS:

Open SQL Server Management Studio

Open SQL Server Management Studio and connect to the database server. All steps are also available in Express Edition.

Generate script

First you need to generate a script for all the objects in the database, excluding the data. You do this by right-clicking the source database and selecting Tasks > Generate Scripts…

Select the objects you want to duplicate in your database.

Generate-Scripts-1

Amazon adds a DDL trigger named rds_deny_backups_trigger to each database to prevent creating a backup. If you don’t have any DDL triggers of your own, don’t select them at all; otherwise make sure this trigger is excluded from the generated script.

Also don’t select the master user created when initializing the RDS database server. Every new database will already have this user.
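To double-check what should be excluded, you can first list the database-level DDL triggers and users in the source database. A small sketch (the principal_id > 4 filter simply skips the built-in principals dbo, guest, sys and INFORMATION_SCHEMA):

```sql
-- Database-level DDL triggers (rds_deny_backups_trigger should show up here)
SELECT name FROM sys.triggers WHERE parent_class_desc = 'DATABASE';

-- SQL users, skipping the built-in principals
SELECT name FROM sys.database_principals
WHERE type = 'S' AND principal_id > 4;
```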

Click Next.

Generate-Scripts-2

Select Save to new query window. You could opt for saving it to file, but in my case I only use the script once.

Click on Advanced.

Generate-Scripts-3

  • Set Script USE DATABASE to False
  • Set Script Indexes to True
  • Set Script Triggers to True

Click Next and Next.

Wait until the script is created.

Create new database

Now create the new empty database by right clicking on Databases. Select New Database…

Enter the name of the database and press OK.
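If you prefer T-SQL over the dialog, creating the empty database is a one-liner (MyDatabaseCopy is a hypothetical name):

```sql
CREATE DATABASE MyDatabaseCopy;
```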

Change the connection of the generated script window to the new database and run the script.

Disable triggers and foreign key constraints

Run the following script on the new database to disable triggers and foreign key constraints:

--Disable foreign keys on all tables
DECLARE @table_name SYSNAME;
DECLARE @schema_name SYSNAME;
DECLARE @cmd NVARCHAR(MAX);
DECLARE table_cursor CURSOR FOR
    SELECT sys.tables.name, sys.schemas.name AS schemaName
    FROM sys.tables
    INNER JOIN sys.schemas ON sys.tables.schema_id = sys.schemas.schema_id;
OPEN table_cursor;
FETCH NEXT FROM table_cursor INTO @table_name,@schema_name;
WHILE @@FETCH_STATUS = 0 BEGIN
SELECT @cmd = 'ALTER TABLE '+QUOTENAME(@schema_name)+'.'+QUOTENAME(@table_name)+' NOCHECK CONSTRAINT ALL';
EXEC (@cmd);
FETCH NEXT FROM table_cursor INTO @table_name,@schema_name;
END
CLOSE table_cursor;
DEALLOCATE table_cursor;
GO 

--Disable triggers on all tables (note: this part assumes all triggers and tables are in the dbo schema)
DECLARE @enable BIT = 0;
DECLARE @trigger SYSNAME;
DECLARE @table SYSNAME;
DECLARE @cmd NVARCHAR(MAX);
DECLARE trigger_cursor CURSOR FOR SELECT trigger_object.name trigger_name,
 table_object.name table_name
FROM sysobjects trigger_object
JOIN sysobjects table_object ON trigger_object.parent_obj = table_object.id
WHERE trigger_object.type = 'TR';

OPEN trigger_cursor;
FETCH NEXT FROM trigger_cursor INTO @trigger, @table;

WHILE @@FETCH_STATUS = 0 BEGIN
 IF @enable = 1
 SET @cmd = 'ENABLE ';
 ELSE
 SET @cmd = 'DISABLE ';

 SET @cmd = @cmd + ' TRIGGER dbo.'+QUOTENAME(@trigger)+' ON dbo.'+QUOTENAME(@table)+' ';
 EXEC (@cmd);
 FETCH NEXT FROM trigger_cursor INTO @trigger, @table;
END

CLOSE trigger_cursor;
DEALLOCATE trigger_cursor;

GO

Copy all data

The next step is to copy all the data from the source database to the new database. For this you use the SQL Server Import and Export Wizard, which you open by right-clicking the source database and selecting Tasks > Export Data…

First enter the connection details for the source database.

import-export-1

Then enter the connection details for the target database.

import-export-2

Highlight all tables by holding the Shift key and clicking on the first and last table, then click one of the checkboxes to the left of the tables. This will select all tables. Make sure not to select any views, because there are no destination tables to insert into.

import-export-4

Now keep the highlight and click on Edit Mappings…

import-export-5

Check Enable identity insert. Close the dialog by clicking OK.

Click Next.

Choose Run immediately.

import-export-6

Click Next, Next and Finish.

Wait until the import completes.
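Before moving on, it can be worth sanity-checking the copy by comparing row counts per table on the source and target databases. A query sketch using the partition metadata (fast, and accurate enough for a comparison):

```sql
-- Approximate row count per table; run on both databases and compare
SELECT s.name AS schema_name, t.name AS table_name, SUM(p.rows) AS row_count
FROM sys.tables t
INNER JOIN sys.schemas s ON t.schema_id = s.schema_id
INNER JOIN sys.partitions p ON p.object_id = t.object_id AND p.index_id IN (0, 1)
GROUP BY s.name, t.name
ORDER BY s.name, t.name;
```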

Enable triggers and foreign key constraints

Enable all triggers and foreign key constraints on the target database by running this script:

--Enable foreign keys on all tables
DECLARE @table_name SYSNAME;
DECLARE @schema_name SYSNAME;
DECLARE @cmd NVARCHAR(MAX);
DECLARE table_cursor CURSOR FOR
    SELECT sys.tables.name, sys.schemas.name AS schemaName
    FROM sys.tables
    INNER JOIN sys.schemas ON sys.tables.schema_id = sys.schemas.schema_id;
OPEN table_cursor;
FETCH NEXT FROM table_cursor INTO @table_name,@schema_name;
WHILE @@FETCH_STATUS = 0 BEGIN
SELECT @cmd = 'ALTER TABLE '+QUOTENAME(@schema_name)+'.'+QUOTENAME(@table_name)+' CHECK CONSTRAINT ALL';
EXEC (@cmd);
FETCH NEXT FROM table_cursor INTO @table_name,@schema_name;
END
CLOSE table_cursor;
DEALLOCATE table_cursor;
GO
--Enable triggers on all tables (note: this part assumes all triggers and tables are in the dbo schema)
DECLARE @enable BIT = 1;
DECLARE @trigger SYSNAME;
DECLARE @table SYSNAME;
DECLARE @cmd NVARCHAR(MAX);
DECLARE trigger_cursor CURSOR FOR SELECT trigger_object.name trigger_name,
 table_object.name table_name
FROM sysobjects trigger_object
JOIN sysobjects table_object ON trigger_object.parent_obj = table_object.id
WHERE trigger_object.type = 'TR';

OPEN trigger_cursor;
FETCH NEXT FROM trigger_cursor INTO @trigger, @table;

WHILE @@FETCH_STATUS = 0 BEGIN
 IF @enable = 1
 SET @cmd = 'ENABLE ';
 ELSE
 SET @cmd = 'DISABLE ';

 SET @cmd = @cmd + ' TRIGGER dbo.'+QUOTENAME(@trigger)+' ON dbo.'+QUOTENAME(@table)+' ';
 EXEC (@cmd);
 FETCH NEXT FROM trigger_cursor INTO @trigger, @table;
END

CLOSE trigger_cursor;
DEALLOCATE trigger_cursor;
GO
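Note that CHECK CONSTRAINT ALL re-enables the foreign keys without re-validating the rows that were imported while they were disabled, so SQL Server marks the constraints as not trusted. If you want them re-validated and trusted again, one option is the undocumented but widely used sp_MSforeachtable procedure:

```sql
-- WITH CHECK re-validates existing rows; this can take a while on large tables
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';
```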

Set permissions

Only the users were imported, not their roles and permissions. These have to be added manually.
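Re-adding them is a matter of a few statements per user. A sketch with hypothetical names (app_user is an example user):

```sql
EXEC sp_addrolemember 'db_datareader', 'app_user';
EXEC sp_addrolemember 'db_datawriter', 'app_user';
GRANT EXECUTE ON SCHEMA::dbo TO app_user;
```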

Update

Dan commented that the SQL scripts didn’t support schemas. He provided me with an updated version that takes schemas into account. I updated the scripts in this post. Thanks Dan for this great contribution! An update for triggers will follow soon.

Error “Keyword not supported: ‘data source’.” using Octopus Deploy

We recently started using Octopus Deploy for our deployments. After the last deployment I got this strange error:

System.ArgumentException: Keyword not supported: 'data source'.
Stack Trace: 
at System.Data.EntityClient.DbConnectionOptions.ParseInternal(Hashtable parsetable, String connectionString, Hashtable synonyms)
at System.Data.EntityClient.DbConnectionOptions..ctor(String connectionString, Hashtable synonyms)
at System.Data.EntityClient.EntityConnection.ChangeConnectionString(String newConnectionString)
at System.Data.Entity.Internal.LazyInternalConnection.TryInitializeFromAppConfig(String name, AppConfig config)
at System.Data.Entity.Internal.LazyInternalConnection.Initialize()
...

The exception stack trace indicated that it had something to do with Entity Framework. I assumed the connection string could have been misconfigured on the server. We use the Configuration variables feature of Octopus Deploy to replace some connection strings in the web.config file.

The value for the connection string looked alright:

Quot-Octopus-Deploy

The connection string was for Entity Framework; its value contains another connection string inside quotes.

Then I had a look at the actual web.config file on the server. This showed something interesting in the connectionStrings section:

<add name="DataModelContainer" connectionString="metadata=[...] connection string=&amp;quot;data source=[....]&amp;quot;" providerName="System.Data.EntityClient"/>

Apparently the ‘&quot;’ was encoded twice to ‘&amp;quot;’. This was done when Octopus Deploy replaced the connection string value.

The solution was to not use the XML-encoded value in Octopus Deploy. So the ‘&quot;’ had to be replaced with ‘"’.
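In other words, the Octopus variable should hold the raw value; the encoding into the web.config happens once, on substitution. Schematically, with the placeholders kept as in the original:

```
Wrong (already XML-encoded, gets encoded a second time):
metadata=[...] connection string=&quot;data source=[....]&quot;

Right (raw value with literal quotes):
metadata=[...] connection string="data source=[....]"
```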

Passing a List to a Stored Procedure Using a User Defined Table Type

By default stored procedures only allow you to use scalar values as parameters. Sometimes you would like to use a list of values, for instance when selecting a set of items or inserting multiple values in one statement.

Previously, I would either use a comma-separated string that had to be parsed, or an XML type. Recently I discovered a third – probably better – way of doing this: by using a User Defined Table Type. This is supported from SQL Server 2008 onwards.

First you create the User Defined Table Type:

CREATE TYPE VarcharList AS TABLE (
	Value varchar(50) NOT NULL PRIMARY KEY
)

Then you use it as a parameter in a stored procedure:

CREATE PROCEDURE GetItemsByIDs
    @ids VarcharList READONLY
AS
BEGIN
    SELECT * FROM Items
    WHERE ID IN (SELECT Value FROM @ids);
END

The parameter has to be defined as readonly to make this work.
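Calling the procedure from T-SQL works the same way: declare a variable of the table type, fill it, and pass it along. The values here are of course hypothetical IDs:

```sql
DECLARE @ids VarcharList;
INSERT INTO @ids (Value) VALUES ('A1'), ('B2'), ('C3');
EXEC GetItemsByIDs @ids;
```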

In your code calling the stored procedure you have to add a parameter using the SqlDbType.Structured type. You set the TypeName property with the name of the Table Type you just created.

SqlParameter sqlParameter = command.Parameters.Add("ids",
    SqlDbType.Structured);
sqlParameter.TypeName = "VarcharList";
var sqlMetaData = new SqlMetaData("Value",
    SqlDbType.VarChar, 50);
// Project each ID into a SqlDataRecord matching the table type's single column
sqlParameter.Value = IDs.Select(id =>
{
    var record = new SqlDataRecord(sqlMetaData);
    record.SetString(0, id);
    return record;
});

One of the possible types you can use as the value for the parameter is IEnumerable<SqlDataRecord>. In the example above LINQ is used to create it.

Interestingly, you will get an error when the parameter value doesn’t contain any rows. I added an extra argument check to prevent this:

public IList<Item> GetItemsByIds(ICollection<string> ids)
{
    if (ids == null) throw new ArgumentNullException("ids");
    if (ids.Count == 0) throw new ArgumentException
        ("Collection cannot be empty", "ids");

I have not done any performance tests, but I imagine this will perform better than the other options because the parameter is already structured like SQL Server's native types.