I have a table with 1 million records that I need to move to another table in another database. I'm using a stored proc to get the data; it fills a data adapter, and the data is then bcp'd into the new table.
We're on SQL Server 2005 and C# 4. We will be moving to SQL Server 2012 or 2014 and Visual Studio 2015 with C# 4.6 or 5.0, if there are any newer functions that would make this work well.
For 10k records the process takes less than 1 second. For 500k records, the data adapter runs out of memory and the process fails. Batching at 100k records, the SELECT statement becomes the issue: returning 100k records at a time takes 2 minutes per loop.
Is there a way (or what is wrong with my code below) to keep the data adapter from being filled, and instead map the columns and have BulkCopy stay server side and just push the records from the source database to the new table, maybe like SSIS does? The bulk copy itself seems lightning fast, but the adapter Fill fails because it runs out of memory trying to populate the adapter with 1 million records. Without doing 1 row at a time, I'd just like to move the data between tables. One table has 27 columns, 5 of which are not in table 2 (which has 32 columns), and some columns are not named the same in both tables.
This is a Proof of Concept (PoC).
sourceConn_transDB.Open();

SqlCommand sourceCommand = new SqlCommand(queryString, sourceConn_transDB);
sourceCommand.CommandTimeout = 600;

DataTable table = new DataTable();
using (var adapter = new SqlDataAdapter(sourceCommand))
{
    WriteLineWithTime("Adapter Fill");
    adapter.Fill(table);   // this is where the OutOfMemoryException is thrown
}

if ((table == null) || (table.Rows.Count <= 0))
    break;   // exits the batching loop when no more rows are returned

using (SqlBulkCopy bulk = new SqlBulkCopy(targetConn_reportDB, SqlBulkCopyOptions.KeepIdentity, null) { DestinationTableName = "PatientEvent" })
{
    bulk.ColumnMappings.Add(new SqlBulkCopyColumnMapping("PatientID", "PatientID"));
    // ... remaining column mappings ...
    bulk.WriteToServer(table);
}
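A minimal sketch of a streaming alternative, reusing the connections, queryString, and table name from the snippet above: instead of Fill-ing a DataTable, pass a SqlDataReader directly to SqlBulkCopy.WriteToServer so rows stream from source to destination without the whole result set being buffered in memory. The "OldName"/"NewName" column pair is a placeholder for the real mismatched column names, and EnableStreaming requires .NET 4.5+ (it can simply be left off on .NET 4.0).

```csharp
using System.Data;
using System.Data.SqlClient;

sourceConn_transDB.Open();
targetConn_reportDB.Open();

using (SqlCommand sourceCommand = new SqlCommand(queryString, sourceConn_transDB))
{
    sourceCommand.CommandTimeout = 600;

    using (SqlDataReader reader = sourceCommand.ExecuteReader())
    using (SqlBulkCopy bulk = new SqlBulkCopy(targetConn_reportDB,
                                              SqlBulkCopyOptions.KeepIdentity, null))
    {
        bulk.DestinationTableName = "PatientEvent";
        bulk.BulkCopyTimeout = 600;   // seconds; 0 means no limit
        bulk.BatchSize = 10000;       // send rows to the server in 10k chunks
        bulk.EnableStreaming = true;  // .NET 4.5+: stream the reader instead of buffering it

        // Map only the 27 source columns; differently named columns get an
        // explicit source/destination pair, and destination columns absent
        // from the source are left to their defaults.
        bulk.ColumnMappings.Add("PatientID", "PatientID");
        bulk.ColumnMappings.Add("OldName", "NewName");  // placeholder names
        // ... one mapping per remaining source column ...

        bulk.WriteToServer(reader);   // streams rows as the reader produces them
    }
}
```

With this shape there is no need to batch the SELECT at all: one query, one reader, and SqlBulkCopy pulls rows as fast as the destination accepts them, so memory stays flat regardless of row count.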