This script copies all the tables out of one version of a database in native BCP format and places the files in a directory of your choice, defaulting to a directory called BCPFiles in your user area. The files are placed in subdirectories based on the server name and database name, just to keep things neat. It will then, if you wish, copy the data into a target server. At first glance, this routine may seem over-complicated, but it is designed so that it can perform this task for a whole list of source-and-target database pairs, for multi-database applications that need it.
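To make the file layout concrete, here is a minimal sketch of the naming scheme in Python (not part of the script; the function name and the sample server and database names are invented for the example). An instance name such as MyServer\SQL2019 contains a backslash, which the script replaces before using it as a directory name, and awkward characters in schema or table names are replaced in the file name:

```python
import re

def bcp_file_path(root, server, database, schema, table):
    # Instance names like 'MyServer\SQL2019' contain a backslash, so it is
    # replaced ('+' in the script's default) to make a valid directory name
    server_dir = server.replace('\\', '+')
    # Schema and table names may contain characters that are awkward in
    # file names; the script replaces \ / : . with '-'
    filename = re.sub(r'[\\/:.]', '-', f"{schema}_{table}") + '.bcp'
    return f"{root}\\{server_dir}\\{database}\\Data\\{filename}"

print(bcp_file_path(r'C:\Users\Me\BCPFiles', r'MyServer\SQL2019',
                    'MyDatabase', 'dbo', 'Orders'))
# → C:\Users\Me\BCPFiles\MyServer+SQL2019\MyDatabase\Data\dbo_Orders.bcp
```

The same sanitised file name is computed again on the import side, which is why both sides must use the same replacement rules.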
If the script is used to copy data from one database to another, the target must be an empty version of the same database (meaning one with the same table schema) with no data in it. If the two versions have different table structures, you'll get an error. If you use this system and you change the table structure or constraints, you will need to run your migration code until the source and target schemas match again.
BCP must be installed to run this script. It comes with SSMS, so you probably have it already. In the past you sometimes needed to create an alias for BCP, but I think that problem has gone away.
To get started, fill in the connection strings for your source ($DataSource) and target ($DataTarget). You also need to fill in an array of objects, each of which defines a source and target database, so that you can process a whole list of databases. If you have to use credentials rather than integrated Windows security, you add the user ID but not the password. The first time you run the routine you will be asked for the password, which is then stored as an encrypted string in a file in your user area, protected by NTFS security. Be warned, though, that you shouldn't allow anyone else to run this on your PC while logged in with your credentials.
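The password handling follows a simple caching pattern: look for a per-login file in the user area; if it exists, read the stored credential, otherwise prompt once and save it for next time. The script does this with Get-Credential and Export-CliXml/Import-CliXml; the sketch below shows only the first-run/subsequent-run logic in Python, with invented names and a plain file standing in for the encrypted one:

```python
import os

def get_cached_secret(cache_path, prompt_fn):
    # If a cached credential already exists for this login, reuse it...
    if os.path.isfile(cache_path):
        with open(cache_path) as f:
            return f.read()
    # ...otherwise ask the user once and cache the answer for next time
    secret = prompt_fn()
    with open(cache_path, 'w') as f:
        f.write(secret)
    return secret
```

On the second and subsequent runs the prompt function is never called, which is why the script only asks for the password once per login-and-server pair.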
If you don't want to do both the copying out and the copying in together in a single operation, you merely assign $null to either $DataSource or $DataTarget as appropriate (if you are just copying out, make the target $null; if you are just copying in, make the source $null).
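The two null-checks between them give three useful modes. A small sketch of the decision, with an invented function name, just to show the combinations:

```python
def transfer_mode(data_source, data_target):
    # Both set: export from the source, then import into the target
    if data_source is not None and data_target is not None:
        return 'copy out and copy in'
    if data_source is not None:
        return 'copy out only'   # target is $null
    if data_target is not None:
        return 'copy in only'    # source is $null; needs $FileSourceDirectory
    return 'nothing to do'
```

Note that in the copy-in-only mode the script cannot derive the subdirectory from the source server name, which is why $FileSourceDirectory must be set in that case.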
This script uses SMO to get the list of tables from the database, and command-line BCP to do the copying of the data. I've published other ways of doing this over the years.
$Filepath = "$env:USERPROFILE\BCPFiles" # local directory to save build-scripts to
$DataSource = @{ 'ConnectionString' = 'Server=MySourceServer;Persist Security Info=False' }; # server name and instance
$DataTarget = @{ 'ConnectionString' = 'Server=MyTargetServer;User Id=sa;Persist Security Info=False' }; # server name and instance
$FileSourceDirectory = 'MyServer' <# if you are reading files in only, the script needs to know the subdirectory of your root directory to use to get the right files #>
if ($DataSource -eq $null -and $FileSourceDirectory -eq $null)
{
    Write-Error 'the script needs to know the subdirectory of your root directory to use'
    break;
}
$Databases = @(@{ 'source' = 'MyDatabase'; 'target' = 'MyNewDatabase' })
$slash = '+' # the string that replaces the 'slash' in an instance name for files etc.
# set "Option Explicit" to catch subtle errors
Set-PSDebug -Strict
$ErrorActionPreference = "stop" # you can opt to stagger on, bleeding, if an error occurs
# load the sqlserver module
$popVerbosity = $VerbosePreference
$VerbosePreference = "SilentlyContinue" # the import process can be very noisy if you are in verbose mode
Import-Module sqlserver -DisableNameChecking # load the SQLPS functionality
$VerbosePreference = $popVerbosity
if (!(Test-Path -Path $Filepath -PathType Container))
{ $null = New-Item -ItemType Directory -Path $Filepath }
@($DataSource, $DataTarget) | where { $_ -ne $null } | foreach {
    $csb = New-Object System.Data.Common.DbConnectionStringBuilder
    $csb.set_ConnectionString($_.ConnectionString)
    if ($csb.'user id' -ne '') # then it is using SQL Server credentials
    {
        <# Oh dear, we need to get the password, if we don't already know it #>
        $SqlEncryptedPasswordFile = `
            "$env:USERPROFILE\$($csb.'user id')-$($csb.server.Replace('\', $slash)).xml"
        # test to see if we know about the password in a secure string stored in the user area
        if (Test-Path -Path $SqlEncryptedPasswordFile -PathType leaf)
        { # has already got this set for this login, so fetch it
            $SqlCredentials = Import-CliXml $SqlEncryptedPasswordFile
        }
        else # then we have to ask the user for it (once only)
        { # hasn't got this set for this login
            $SqlCredentials = Get-Credential -Credential $csb.'user id'
            $SqlCredentials | Export-CliXml -Path $SqlEncryptedPasswordFile
        }
        $_.ServerConnection = New-Object `
            "Microsoft.SqlServer.Management.Common.ServerConnection" `
            ($csb.server, $SqlCredentials.UserName, $SqlCredentials.GetNetworkCredential().password)
        $csb.Add('password', $SqlCredentials.GetNetworkCredential().password)
    }
    else
    {
        $_.ServerConnection = New-Object `
            "Microsoft.SqlServer.Management.Common.ServerConnection" `
            ($csb.server)
    }
    $_.csb = $csb
    try # now we make an SMO connection to the server, using the connection string
    {
        $_.srv = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $_.ServerConnection
    }
    catch
    {
        Write-Error "Could not connect to SQL Server instance $($DataSource.csb.server) $($error[0]). Script is aborted"
        exit -1
    }
} <# all this work just to maintain passwords! #>
if ($DataSource -ne $null)
{
    $DirectoryToSaveTo = $DataSource.csb.server.Replace('\', $slash)
    if ($DataSource.srv.Version -eq $null)
    { Throw "Can't find the instance $($DataSource.csb.server)" }
    Write-Verbose "writing data out to $directoryToSaveTo"
    $DataSource.srv.Databases[$Databases.source].Tables |
      Select Name, Schema |
      foreach {
        <# calculate where it should be saved #>
        $directory = "$($FilePath)\$($DirectoryToSaveTo)\$($Databases.Source)\Data"
        <# check that the directory exists #>
        if (-not (Test-Path -PathType Container $directory))
        {
            <# we create the directory if it doesn't already exist #>
            $null = New-Item -ItemType Directory -Force -Path $directory;
        }
        $filename = "$($_.Schema)_$($_.Name)" -replace '[\\\/\:\.]', '-'
        Write-Verbose "Writing out $($_.Schema).$($_.Name) to $($directory)\$filename.bcp"
        If ($DataSource.csb.'user id' -eq '') <# OK. Easy, a trusted connection #>
        { # native format -n, trusted connection -T
            $Progress = BCP "$($_.Schema).$($_.Name)" out "$($directory)\$filename.bcp" `
                -n -T "-d$($Databases.source)" "-S$($DataSource.csb.server)"
        }
        else <# if not a trusted connection we need to provide a userid and password #>
        {
            $Progress = BCP "$($_.Schema).$($_.Name)" out "$($directory)\$filename.bcp" `
                -n "-d$($Databases.source)" "-S$($DataSource.csb.server)" `
                "-U$($DataSource.csb.'user id')" "-P$($DataSource.csb.password)"
        }
        if (-not ($?) -or $Progress -like '*Error*') # if there was an error
        {
            throw ("Error with data export of $($directory)\$filename.bcp - $Progress");
        }
    }
}
if ($DataTarget -ne $null)
{
    if ($DataSource -ne $null)
    { $DirectoryToLoadFrom = $DataSource.csb.server.Replace('\', $slash) }
    else
    { $DirectoryToLoadFrom = $FileSourceDirectory }
    if ($DataTarget.srv.Version -eq $null)
    { Throw "Can't find the instance $($DataTarget.csb.server)" }
    If ($DataTarget.srv.Databases[$Databases.target] -eq $null)
    { Throw "Can't find the database $($Databases.target) on instance $($DataTarget.csb.server)" }
    Write-Verbose "Reading data in from $DirectoryToLoadFrom"
    $DataTarget.srv.Databases[$Databases.target].Tables |
      Select Name, Schema |
      foreach {
        # calculate where it was saved
        $directory = "$($FilePath)\$($DirectoryToLoadFrom)\$($Databases.Source)\Data"
        $filename = "$($_.Schema)_$($_.Name)" -replace '[\\\/\:\.]', '-'
        $progress = '';
        Write-Verbose "Reading in $($_.Schema).$($_.Name) from $($directory)\$filename.bcp"
        if ($DataTarget.csb.'user id' -ne '')
        {
            $Progress = BCP "$($Databases.target).$($_.Schema).$($_.Name)" in "$($directory)\$filename.bcp" -q -N -E `
                "-U$($DataTarget.csb.'user id')" "-P$($DataTarget.csb.password)" "-S$($DataTarget.csb.server)"
        }
        else
        {
            $Progress = BCP "$($Databases.target).$($_.Schema).$($_.Name)" in `
                "$($directory)\$filename.bcp" -q -N -T -E `
                "-S$($DataTarget.csb.server)"
        }
        if (-not ($?) -or $Progress -like '*Error*') # if there was an error
        {
            throw ("Error with data import of $($directory)\$filename.bcp - $Progress");
        }
    }
    try # re-enable and check all the constraints on the target database
    {
        $DataTarget.srv.ConnectionContext.ExecuteNonQuery("
            use [$($Databases.target)]
            EXEC sp_msforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT all'")
    }
    catch
    {
        Write-Error "there was a constraint error! Script is aborted"
        exit -1
    }
}