Bulk insert & Bulk update

1. Bulk Copy Operation

Introduction

Bulk copying of data from one data source to another is a new feature in ADO.NET 2.0. The bulk copy classes provide the fastest way to transfer a set of data from one source to another.

Each ADO.NET data provider includes bulk copy classes. For example, in the SQL Server data provider (System.Data.SqlClient), the bulk copy operation is handled by the SqlBulkCopy class, which is illustrated in Figure 1. As Figure 1 shows, the data to be copied can be supplied in one of four forms - a DataReader, a DataSet, a DataTable, or XML.

[Image: BulkCopyIm1.gif]
Figure 1. Bulk copy operation in ADO.NET 2.0.

Using the bulk copy operation, you can transfer data between two tables on the same SQL Server, between two different SQL Servers, or even between two different types of database servers.
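For example, copying between two different servers only requires two different connection strings. The following is a minimal sketch; the server names, connection strings, and the ProductsCopy table are placeholder assumptions, not values from this article:

// Requires: using System.Data.SqlClient;
// Hypothetical connection strings pointing at two different servers
string sourceConnStr = "Data Source=ServerA;Initial Catalog=Northwind;Integrated Security=True";
string destConnStr = "Data Source=ServerB;Initial Catalog=Archive;Integrated Security=True";

using (SqlConnection src = new SqlConnection(sourceConnStr))
using (SqlConnection dst = new SqlConnection(destConnStr))
{
    src.Open();
    dst.Open();

    // Read rows from the source server
    SqlCommand cmd = new SqlCommand("SELECT * FROM Products", src);
    using (SqlDataReader reader = cmd.ExecuteReader())
    using (SqlBulkCopy bulk = new SqlBulkCopy(dst))
    {
        // Write them to a table on the destination server
        bulk.DestinationTableName = "ProductsCopy";
        bulk.WriteToServer(reader);
    }
}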

Filling Data from the Source

The first step in copying bulk data from one data source to another is to read the data from the source database. This source data can be held in a DataSet, a DataTable, or a DataReader.

// Select data from the Products table
cmd = new SqlCommand("SELECT * FROM Products", source);

// Execute reader
SqlDataReader reader = cmd.ExecuteReader();

Creating SqlBulkCopy Object

In ADO.NET 2.0, each data provider has a bulk copy class that provides the bulk copy-related functionality. For example, the SQL Server data provider has the SqlBulkCopy class.

The SqlBulkCopy constructor takes a connection string or a SqlConnection object as its first parameter, which identifies the destination data source. After creating the object, set the DestinationTableName property to the name of the table you want to copy data into.

// Create SqlBulkCopy
SqlBulkCopy bulkData = new SqlBulkCopy(destination);

// Set destination table name
bulkData.DestinationTableName = "BulkDataTable";

Copying Data to the Destination

The SqlBulkCopy class provides the WriteToServer method, which writes data from a DataReader, a DataTable, or an array of DataRows to the destination data source.

bulkData.WriteToServer(reader);

In this code, the source data is read through a DataReader. You can instead fill a DataTable (for example, by loading a DataSet from a database or from an XML document) and pass that table to WriteToServer.
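As a sketch of the DataTable route (the sourceAdapter and productsTable variables are introduced here only for illustration):

// Fill a DataTable from the source instead of using a DataReader
SqlDataAdapter sourceAdapter =
    new SqlDataAdapter("SELECT * FROM Products", source);
DataTable productsTable = new DataTable();
sourceAdapter.Fill(productsTable);

// WriteToServer also accepts a DataTable
bulkData.WriteToServer(productsTable);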

Closing SqlBulkCopy Object

The Close method of SqlBulkCopy closes the bulk copy operation.

bulkData.Close();

Complete Source Code

The following listing shows the complete source code.

// Create source connection
SqlConnection source = new SqlConnection(connectionString);

// Create destination connection
SqlConnection destination = new SqlConnection(connectionString);

// Clean up the destination table. The destination database must already
// contain a table named BulkDataTable with the schema of the data you are
// copying, so create BulkDataTable before executing this code.
SqlCommand cmd = new SqlCommand("DELETE FROM BulkDataTable", destination);

// Open source and destination connections.
source.Open();
destination.Open();
cmd.ExecuteNonQuery();

// Select data from the Products table
cmd = new SqlCommand("SELECT * FROM Products", source);

// Execute reader
SqlDataReader reader = cmd.ExecuteReader();

// Create SqlBulkCopy
SqlBulkCopy bulkData = new SqlBulkCopy(destination);

// Set destination table name
bulkData.DestinationTableName = "BulkDataTable";

// Write data
bulkData.WriteToServer(reader);

// Close objects
bulkData.Close();
destination.Close();
source.Close();

Note

Before executing this code, make sure your database has a table named BulkDataTable with the same schema as the Products table.
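One way to create such a table is to copy the schema of Products with a SELECT ... INTO that returns no rows. This is only a sketch; it assumes Products is also visible from the destination connection (true when, as in the listing above, both connections use the same connection string), and it does not copy constraints or indexes:

// Create an empty BulkDataTable with the same columns as Products
SqlCommand createCmd = new SqlCommand(
    "SELECT * INTO BulkDataTable FROM Products WHERE 1 = 0", destination);

destination.Open();
createCmd.ExecuteNonQuery();
destination.Close();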

2. Batch Update

Batch updates can provide a large performance improvement by sending a group of commands to the server in a single round trip, instead of one round trip per command, provided the data provider supports batching. The UpdateBatchSize property controls how many rows are sent in each batch; it accepts any non-negative integer value.

Batch Updates in ADO.NET 2.0 for Improved Performance

When you updated a database using the DataAdapter in .NET 1.1, each command was sent to the database one at a time. This caused a lot of roundtrips to the database.

ADO.NET 2.0 has introduced the concept of Batch Updates, which allows you to designate the number of commands sent to the database at a given time. If used correctly, this can increase the performance of your data access layer by reducing the number of roundtrips to the database.

DataAdapter.UpdateBatchSize Property

The DataAdapter has an UpdateBatchSize property that lets you set the number of commands sent to the database with each request; the short sketch after the list below illustrates the three settings.

  • UpdateBatchSize = 1 disables batch updates
  • UpdateBatchSize = X, where X > 1, sends X statements to the database at a time
  • UpdateBatchSize = 0 sends as many statements at a time as the server allows
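As a quick illustration of the three settings (adapter here stands for any configured SqlDataAdapter, such as the one built in the tutorial below):

// One round trip per row - the ADO.NET 1.1 behavior
adapter.UpdateBatchSize = 1;

// Up to 50 statements per round trip
adapter.UpdateBatchSize = 50;

// Let the provider use the largest batch size the server allows
adapter.UpdateBatchSize = 0;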

Command.UpdatedRowSource Property

When using batch mode, the UpdatedRowSource property of the command can only be set to UpdateRowSource.None or UpdateRowSource.OutputParameters.

Batch Updates Tutorial Using Northwind

You can test out batch updates on the Northwind Database by simulating an update to the Categories Table.

First, get the data from the Categories Table. The code below gets the information and places it in an untyped DataSet:

SqlConnection connection = new SqlConnection("...");
SqlDataAdapter adapter =
    new SqlDataAdapter("SELECT * FROM Categories", connection);
DataSet ds = new DataSet();
adapter.Fill(ds);

Simulate modification of the CategoryName of each category so there is something to update:

foreach (DataRow dr in ds.Tables[0].Rows)
{
    // Reassign the value so the row is marked as modified
    string categoryName = dr["CategoryName"].ToString();
    dr["CategoryName"] = categoryName;
}

Construct an update command for the SqlDataAdapter to update the data and assign it to the adapter's UpdateCommand property:

SqlCommand command = new SqlCommand();
command.CommandText = "UPDATE Categories SET CategoryName = @CategoryName " +
                      "WHERE CategoryID = @CategoryID";

command.Parameters.Add(new SqlParameter("@CategoryID",
    SqlDbType.Int)).SourceColumn = "CategoryID";
command.Parameters.Add(new SqlParameter("@CategoryName",
    SqlDbType.NVarChar, 15)).SourceColumn = "CategoryName";

adapter.UpdateCommand = command;

Set the UpdateBatchSize and UpdatedRowSource properties to appropriate values. The Categories table contains only eight records and we have changed them all, so I will set UpdateBatchSize to 2 for the sake of testing.

adapter.UpdateBatchSize = 2;
command.UpdatedRowSource = UpdateRowSource.None;

Execute the Update Process

adapter.Update(ds);

Hooking into the RowUpdating and RowUpdated Events of the DataAdapter

I hook into the RowUpdating and RowUpdated events of the DataAdapter as shown below:

adapter.RowUpdating +=
    new SqlRowUpdatingEventHandler(adapter_RowUpdating);
adapter.RowUpdated +=
    new SqlRowUpdatedEventHandler(adapter_RowUpdated);

// Counters (fields on the containing class) used to track how many
// times each event fires
private int _countUpdating;
private int _countUpdated;

private void adapter_RowUpdated(object sender, SqlRowUpdatedEventArgs e)
{
    _countUpdated++;
}

private void adapter_RowUpdating(object sender, SqlRowUpdatingEventArgs e)
{
    _countUpdating++;
}

Without Batch Updates, both events will fire eight times, one for each row being updated.

However, with Batch Updates, RowUpdated is only called once per batch (8 rows / 2 rows per batch = 4 times). RowUpdating is still called the usual eight times.
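A rough way to check this is to reset the counters, run the update, and print them afterwards (a sketch using the _countUpdating and _countUpdated fields from the handlers above):

_countUpdating = 0;
_countUpdated = 0;

adapter.UpdateBatchSize = 2;
adapter.Update(ds);

// With 8 modified rows and a batch size of 2:
Console.WriteLine(_countUpdating);   // 8 - once per row
Console.WriteLine(_countUpdated);    // 4 - once per batch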

Conclusion

Batch updates can improve the performance of your data access layer by reducing the number of roundtrips to the database.

 

from: http://www.codeproject.com/KB/database/ADONET2Features.aspx
