For inserting a huge amount of data into a database, I used to collect all of the information to insert into a list and convert that list into a DataTable. I then insert the data into the database via SqlBulkCopy.
I send my generated list, LiMyList, which contains all of the bulk data I want to insert into the database, and pass it to my bulk insertion operation:
InsertData(LiMyList, "MyTable");
where InsertData is:
public static void InsertData<T>(List<T> list, string TableName)
{
    DataTable dt = new DataTable("MyTable");
    clsBulkOperation blk = new clsBulkOperation();
    dt = ConvertToDataTable(list);
    ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.PerUserRoamingAndLocal);
    using (SqlBulkCopy bulkcopy = new SqlBulkCopy(ConfigurationManager.ConnectionStrings["SchoolSoulDataEntitiesForReport"].ConnectionString))
    {
        bulkcopy.BulkCopyTimeout = 660;
        bulkcopy.DestinationTableName = TableName;
        bulkcopy.WriteToServer(dt);
    }
}
public static DataTable ConvertToDataTable<T>(IList<T> data)
{
    PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
    DataTable table = new DataTable();
    foreach (PropertyDescriptor prop in properties)
        table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);
    foreach (T item in data)
    {
        DataRow row = table.NewRow();
        foreach (PropertyDescriptor prop in properties)
            row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
        table.Rows.Add(row);
    }
    return table;
}
Now I want to do an update operation. Is there any way to bulk-update data in the database from C#.NET, the way SqlBulkCopy handles inserts?
9 Answers
ocebsuys1#
What I've done before is perform a bulk insert from the data into a temp table, and then use a command or stored procedure to update the data by joining the temp table with the destination table. The temp table is an extra step, but if the number of rows is large you can gain performance from the bulk insert and the set-based update, compared to updating the data row by row.
Example:
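A minimal sketch of this pattern, reusing the ConvertToDataTable helper from the question; the destination table is assumed to have an Id key and a SomeColumn value column, and those names (and #TmpTable) are placeholders:
public static void UpdateData<T>(List<T> list, string tableName)
{
    // Convert the list to a DataTable, just like the insert path
    DataTable dt = ConvertToDataTable(list);

    using (SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["SchoolSoulDataEntitiesForReport"].ConnectionString))
    {
        conn.Open();

        using (SqlCommand command = conn.CreateCommand())
        {
            // 1. Create a temp table with the same schema as the destination table
            //    (tableName is assumed to be trusted here)
            command.CommandText = "SELECT TOP 0 * INTO #TmpTable FROM " + tableName + ";";
            command.ExecuteNonQuery();

            // 2. Bulk insert the new values into the temp table
            using (SqlBulkCopy bulkcopy = new SqlBulkCopy(conn))
            {
                bulkcopy.BulkCopyTimeout = 660;
                bulkcopy.DestinationTableName = "#TmpTable";
                foreach (DataColumn col in dt.Columns)
                    bulkcopy.ColumnMappings.Add(col.ColumnName, col.ColumnName); // map by name, not ordinal
                bulkcopy.WriteToServer(dt);
            }

            // 3. Update the destination table from the temp table, then drop it
            command.CommandTimeout = 300;
            command.CommandText =
                "UPDATE T SET T.SomeColumn = Temp.SomeColumn " +
                "FROM " + tableName + " T " +
                "INNER JOIN #TmpTable Temp ON T.Id = Temp.Id; " +
                "DROP TABLE #TmpTable;";
            command.ExecuteNonQuery();
        }
    }
}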
Notice that a single connection is used to perform the whole operation, in order to be able to use the temp table in each step, because the scope of the temp table is per connection.
93ze6v8z2#
In my personal experience, the best way to handle this situation is to use a Stored Procedure with a Table-Valued Parameter and a User-Defined Table Type. Just set up the type with the columns of the data table, and pass in said data table as a parameter in the SQL command.
Within the Stored Procedure, you can either join directly on some unique key (if all rows you are updating exist), or - if you might run into a situation where you have to do both updates and inserts - use the SQL MERGE command within the stored procedure to handle both the updates and inserts as applicable. Microsoft has both a syntax reference and an article with examples for MERGE.
For the .NET piece, it's a simple matter of setting the parameter type to SqlDbType.Structured and setting the value of said parameter to the DataTable that contains the records you want to update.
This method provides the benefit of both clarity and ease of maintenance. While there may be ways that offer performance improvements (such as dropping the data into a temporary table and then iterating over that table), I think they're outweighed by the simplicity of letting .NET and SQL handle transferring the table and updating the records. K.I.S.S.
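A sketch of this approach, assuming a user-defined table type dbo.MyTableType and a stored procedure dbo.UpdateMyTable already exist on the server (all placeholder names; their assumed T-SQL is shown in the comment):
/*
-- Assumed to exist on the server:
-- CREATE TYPE dbo.MyTableType AS TABLE (Id INT NOT NULL, SomeColumn NVARCHAR(100));
-- CREATE PROCEDURE dbo.UpdateMyTable @Rows dbo.MyTableType READONLY AS
-- BEGIN
--     MERGE dbo.MyTable AS target
--     USING @Rows AS source ON target.Id = source.Id
--     WHEN MATCHED THEN UPDATE SET target.SomeColumn = source.SomeColumn
--     WHEN NOT MATCHED THEN INSERT (Id, SomeColumn) VALUES (source.Id, source.SomeColumn);
-- END
*/
public static void UpdateViaTvp(DataTable rows, string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("dbo.UpdateMyTable", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // Pass the DataTable as a table-valued parameter
        SqlParameter tvp = cmd.Parameters.AddWithValue("@Rows", rows);
        tvp.SqlDbType = SqlDbType.Structured;
        tvp.TypeName = "dbo.MyTableType";

        conn.Open();
        cmd.ExecuteNonQuery();
    }
}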
vsnjm48y3#
Bulk Update:
Step 1: Put the data you want to update, together with the primary key, in a list.
Step 2: Pass this list and the connection string to the BulkUpdate method, as shown below.
Example:
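A sketch of a BulkUpdate method along these lines (bulk copy into a temp table, then join on the primary key to update); the [SquareBracket] names are placeholders, as described in the notes below:
public void BulkUpdate<T>(List<T> list, string connectionString)
{
    DataTable dt = ConvertToDataTable(list);

    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();

        using (SqlCommand command = conn.CreateCommand())
        {
            // Temp table with the same schema as the destination table
            command.CommandText = "SELECT TOP 0 * INTO #TmpTable FROM [YourTableName];";
            command.ExecuteNonQuery();

            // Bulk copy the changed rows (data + primary key) into the temp table
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
            {
                bulkCopy.DestinationTableName = "#TmpTable";
                foreach (DataColumn col in dt.Columns)
                    bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName); // map by name, not ordinal
                bulkCopy.WriteToServer(dt);
            }

            // Join on the primary key, update the destination table, then clean up
            command.CommandText =
                "UPDATE T SET T.[ColumnToUpdate] = Temp.[ColumnToUpdate] " +
                "FROM [YourTableName] T " +
                "INNER JOIN #TmpTable Temp ON T.[PrimaryKeyColumn] = Temp.[PrimaryKeyColumn]; " +
                "DROP TABLE #TmpTable;";
            command.ExecuteNonQuery();
        }
    }
}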
Step 3: Add the ConvertToDataTable method, as shown below.
Example:
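The ConvertToDataTable helper can be the same one used for the insert path in the question:
public static DataTable ConvertToDataTable<T>(IList<T> data)
{
    PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
    DataTable table = new DataTable();
    foreach (PropertyDescriptor prop in properties)
        table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);
    foreach (T item in data)
    {
        DataRow row = table.NewRow();
        foreach (PropertyDescriptor prop in properties)
            row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
        table.Rows.Add(row);
    }
    return table;
}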
Notes: Wherever a [SquareBracket] appears, put your own value.
zf9nrax14#
Try out SqlBulkTools available on Nuget.
Disclaimer: I'm the author of this library.
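A sketch of the library's fluent API as documented around the time of this answer; Book, Books, ISBN, and the connection string name are placeholders, and exact method names may vary between versions:
public class Book
{
    public string ISBN { get; set; }
    public string SomeColumn1 { get; set; }
    public string SomeColumn2 { get; set; }
}

var bulk = new BulkOperations();
var books = GetBooksToUpdate(); // placeholder: load the changed entities

using (TransactionScope trans = new TransactionScope())
{
    using (SqlConnection conn = new SqlConnection(
        ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString))
    {
        bulk.Setup<Book>()
            .ForCollection(books)
            .WithTable("Books")
            .AddColumn(x => x.SomeColumn1)
            .AddColumn(x => x.SomeColumn2)
            .BulkUpdate()
            .MatchTargetOn(x => x.ISBN)
            .Commit(conn);
    }

    trans.Complete();
}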
Only 'SomeColumn1' and 'SomeColumn2' will be updated. More examples can be found here
2wnc66cl5#
I would insert new values in a temporary table and then do a merge against the destination table, something like this:
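A sketch of that idea, filling a temp table with SqlBulkCopy and then running a MERGE; dbo.MyTable, Id, and SomeColumn are placeholder names, and dt is the DataTable holding the new values:
public static void MergeFromTempTable(DataTable dt, string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();

        using (SqlCommand cmd = conn.CreateCommand())
        {
            // Temp table with the same schema as the destination table
            cmd.CommandText = "SELECT TOP 0 * INTO #Source FROM dbo.MyTable;";
            cmd.ExecuteNonQuery();

            // Fill the temp table with the new values
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
            {
                bulkCopy.DestinationTableName = "#Source";
                foreach (DataColumn col in dt.Columns)
                    bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName); // map by name, not ordinal
                bulkCopy.WriteToServer(dt);
            }

            // Update matching rows, insert missing ones, then clean up
            cmd.CommandText = @"
                MERGE dbo.MyTable AS target
                USING #Source AS source ON target.Id = source.Id
                WHEN MATCHED THEN
                    UPDATE SET target.SomeColumn = source.SomeColumn
                WHEN NOT MATCHED THEN
                    INSERT (Id, SomeColumn) VALUES (source.Id, source.SomeColumn);
                DROP TABLE #Source;";
            cmd.ExecuteNonQuery();
        }
    }
}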
siv3szwd6#
You could try to build a single query that contains all of the data, using a CASE expression. It could look like this:
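A sketch of how such a statement could be generated from C#; MyTable, Id, and SomeColumn are placeholder names, and the values are passed as parameters:
public static void UpdateWithCase(Dictionary<int, string> updates, string connectionString)
{
    // Builds a single statement of the form:
    // UPDATE MyTable SET SomeColumn = CASE Id WHEN @id0 THEN @value0 WHEN @id1 THEN @value1 ... END
    // WHERE Id IN (@id0, @id1, ...)
    if (updates.Count == 0) return;

    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = conn.CreateCommand())
    {
        var whenClauses = new StringBuilder();
        var ids = new List<string>();
        int i = 0;
        foreach (var kvp in updates)
        {
            whenClauses.AppendFormat(" WHEN @id{0} THEN @value{0}", i);
            cmd.Parameters.AddWithValue("@id" + i, kvp.Key);
            cmd.Parameters.AddWithValue("@value" + i, kvp.Value);
            ids.Add("@id" + i);
            i++;
        }

        cmd.CommandText =
            "UPDATE MyTable SET SomeColumn = CASE Id" + whenClauses.ToString() +
            " END WHERE Id IN (" + string.Join(", ", ids) + ")";

        conn.Open();
        cmd.ExecuteNonQuery();
    }
}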
mrphzbgm7#
I'd go for a temp-table approach, because that way you aren't locking anything. But if your logic needs to stay in the front end and you need to use bulk copy, I'd try a Delete/Insert approach, but within the same SqlTransaction to ensure integrity. It would be something like this:
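A sketch of that Delete/Insert pattern; MyTable, BatchId, and the delete criteria are placeholders, and dt is the DataTable holding the replacement rows:
public static void DeleteAndBulkInsert(DataTable dt, int batchId, string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();

        using (SqlTransaction tran = conn.BeginTransaction())
        {
            try
            {
                // 1. Delete the rows that are being replaced (the WHERE clause is a placeholder)
                using (SqlCommand delete = new SqlCommand(
                    "DELETE FROM MyTable WHERE BatchId = @batchId", conn, tran))
                {
                    delete.Parameters.AddWithValue("@batchId", batchId);
                    delete.ExecuteNonQuery();
                }

                // 2. Re-insert the new versions with SqlBulkCopy, inside the same transaction
                using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tran))
                {
                    bulkCopy.DestinationTableName = "MyTable";
                    foreach (DataColumn col in dt.Columns)
                        bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName); // map by name, not ordinal
                    bulkCopy.WriteToServer(dt);
                }

                tran.Commit();
            }
            catch
            {
                tran.Rollback();
                throw;
            }
        }
    }
}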
2eafrhcq8#
Complete answer. Disclaimer: arrow code; this is mine, built from research, and published in SqlRapper. It uses custom attributes on properties to determine whether a key is primary. Yes, super complicated. Yes, super reusable. Yes, it needs to be refactored. Yes, it is a NuGet package. No, the documentation isn't great on GitHub, but it exists. Will it work for everything? Probably not. Will it work for simple stuff? Oh yeah.
How easy is it to use after setup?
Here's how it works:
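Purely as an illustration of that idea (a custom attribute marking the primary key, read via reflection to build the UPDATE statement) - this sketch is not SqlRapper's actual code, and every name in it is hypothetical:
// Not SqlRapper's code; an illustrative sketch of attribute-driven updates only.
[AttributeUsage(AttributeTargets.Property)]
public class PrimaryKeyAttribute : Attribute { }

public class Log
{
    [PrimaryKey]
    public int LogId { get; set; }
    public string Message { get; set; }
}

public static void UpdateRows<T>(IEnumerable<T> items, string tableName, SqlConnection conn)
{
    var props = typeof(T).GetProperties();
    var key = props.First(p => p.GetCustomAttributes(typeof(PrimaryKeyAttribute), true).Length > 0);
    var others = props.Where(p => p != key).ToList();

    // UPDATE tableName SET Col1 = @Col1, ... WHERE KeyCol = @KeyCol
    string sql = "UPDATE " + tableName + " SET " +
                 string.Join(", ", others.Select(p => p.Name + " = @" + p.Name)) +
                 " WHERE " + key.Name + " = @" + key.Name;

    foreach (T item in items)   // row by row; a production version would batch this
    {
        using (var cmd = new SqlCommand(sql, conn))
        {
            foreach (var p in props)
                cmd.Parameters.AddWithValue("@" + p.Name, p.GetValue(item, null) ?? DBNull.Value);
            cmd.ExecuteNonQuery();
        }
    }
}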
daolsyd09#
I made this generic solution with the same idea as the accepted answer (create a temp table, fill it with a bulk insert, and then update the target table). It uses reflection to read the properties, so you don't have to write a lengthy UPDATE ... SET command:
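A sketch of such a generic helper, assuming the entity's property names match the target table's column names; BulkUpdate and keyColumn are names chosen for this sketch, not from any library:
public static void BulkUpdate<T>(SqlConnection connection, IEnumerable<T> items, string tableName, string keyColumn)
{
    // Column list comes from the entity's properties via reflection
    var columns = typeof(T).GetProperties().Select(p => p.Name).ToList();
    var setClause = string.Join(", ", columns.Where(c => c != keyColumn)
                                              .Select(c => "T." + c + " = Temp." + c));

    using (var command = connection.CreateCommand())
    {
        // 1. Temp table with the same schema as the target table
        command.CommandText = "SELECT TOP 0 * INTO #TmpTable FROM " + tableName + ";";
        command.ExecuteNonQuery();

        // 2. Bulk insert the objects into the temp table
        //    (AsDataReader() is the ObjectDataReader extension mentioned in the note below)
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "#TmpTable";
            foreach (var c in columns)
                bulkCopy.ColumnMappings.Add(c, c); // map by name, not ordinal
            bulkCopy.WriteToServer(items.AsDataReader());
        }

        // 3. Update the target table by joining on the key column, then clean up
        command.CommandText =
            "UPDATE T SET " + setClause + " " +
            "FROM " + tableName + " T " +
            "INNER JOIN #TmpTable Temp ON T." + keyColumn + " = Temp." + keyColumn + "; " +
            "DROP TABLE #TmpTable;";
        command.ExecuteNonQuery();
    }
}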
NOTE: AsDataReader() is an extension method from Microsoft's ObjectDataReader sample, which can be found here: https://github.com/microsoftarchive/msdn-code-gallery-community-m-r/tree/master/ObjectDataReader
You can use this solution like this:
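For example, assuming the BulkUpdate sketch above (User, dbo.Users, and connectionString are placeholders):
// User is a placeholder POCO whose property names match the columns of dbo.Users
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
}

var changedUsers = new List<User>
{
    new User { Id = 1, Name = "Alice" },
    new User { Id = 2, Name = "Bob" }
};

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    BulkUpdate(connection, changedUsers, "dbo.Users", "Id");
}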