SQL Server SqlBulkCopy failing when trying to handle more than 1 row

Asked by tv6aics1 on 2023-03-28

I'm trying to perform a bulk upload operation to a SQL Server database. I'm working in .NET 7, and I'm using a SqlBulkCopy instance to upload my data as a DataTable.

I've had all sorts of issues getting it to work at all, mostly relating to the types of the fields I'm sending, but I'm at a point now where data will upload, but only if there's a single row in the DataTable. I can skip around in my enumerable of data, and no matter what I do, it works with a single row but fails with 2 or more.

The error message from the operation simply says
The given ColumnMapping does not match up with any column in the source or destination

but without telling me which ColumnMapping.

I've tried changing the types multiple times to make sure they match the DB types. The column mappings are a new addition after I found an article online, but they don't seem to make a difference to this issue. I've tried inserting different rows from my dataset, but the behaviour is consistent - 1 row works, multiple rows fail. I'm honestly not sure what else to try at this point!

This is my code to convert to a DataTable:

public static DataTable ToDataTable<T>(IEnumerable<T> items)
{
    var tb = new DataTable(typeof(T).Name);

    var props = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);
    var columns = props.Select(x =>
            x switch
            {
                { PropertyType.Name: "Nullable`1" } pi => Nullable.GetUnderlyingType(x.PropertyType).Map(y =>
                {
                    var c = new DataColumn(pi.Name, y);
                    c.AllowDBNull = true;
                    return c;
                }),
                not null => new DataColumn(x.Name, x.PropertyType)
            }).ToArray();

    tb.Columns.AddRange(columns);
        
    foreach (var item in items)
    {
        var r = tb.NewRow();

        foreach (var c in columns)
        {
            var value = (typeof(T).GetProperty(c.ColumnName)?.GetValue(item)) ?? DBNull.Value;
            r[c.ColumnName] = value;
        }

        tb.Rows.Add(r);
    }

    return tb;
}

And this is my code for performing the bulk insert after it:

var sb = new StringBuilder();

using var bulkCopy = new SqlBulkCopy(this.dbConnectionString);
bulkCopy.DestinationTableName = table;

foreach (DataRow r in data.Rows)
{
    foreach (DataColumn c in data.Columns)
    {
        // N.B - the column names of my class are the same as the Db columns
        bulkCopy.ColumnMappings.Add(c.ColumnName, c.ColumnName);
    }
}

bulkCopy.WriteToServer(data);

I'd be extremely grateful if someone could point me to where it is I've gone wrong. This issue has taken up far more time than it really has any right to!

Thanks,

Simon

jw5wzhpr

Here's your problem:

foreach (DataRow r in data.Rows)
{
    foreach (DataColumn c in data.Columns)
    {
        // N.B - the column names of my class are the same as the Db columns
        bulkCopy.ColumnMappings.Add(c.ColumnName, c.ColumnName);
    }
}

Your code is adding new ColumnMapping entries for every column, for every row in the table - so if your DataTable has 10 columns and 500 rows, you're instructing SqlBulkCopy that you've got 5,000 columns to map, with 99.9% of them sharing identical names - hence why SqlBulkCopy complained that "The given ColumnMapping does not match up with any column in the source or destination" (though that error is likely caused by an initial c.ColumnName not matching up exactly with your source or destination tables).

Change your code to this:

public static DataTable ToDataTable<T>( IEnumerable<T> items )
{
    ( DataColumn c, PropertyInfo pi )[] columns = typeof(T)
        .GetProperties( BindingFlags.Public | BindingFlags.Instance )
        .Select( pi =>
        {
            if( Nullable.GetUnderlyingType( pi.PropertyType ) is Type nullableUnderlyingType )
            {
                return ( c: new DataColumn( pi.Name, nullableUnderlyingType ) { AllowDBNull = true }, pi );
            }
            else
            {
                return ( c: new DataColumn( pi.Name, pi.PropertyType ), pi );
            }
        } )
        .ToArray();

    DataTable tb = new DataTable( typeof(T).Name );
    foreach( ( DataColumn c, _ ) in columns )
    {
        tb.Columns.Add( c );
    }

    foreach( T item in items )
    {
        DataRow row = tb.NewRow();

        foreach( ( DataColumn c, PropertyInfo pi ) in columns )
        {
            Object value = pi.GetValue( item ) ?? DBNull.Value;
            row[c] = value;
        }

        tb.Rows.Add( row );
    }

    return tb;
}

public static async Task ImportAsync<T>( IEnumerable<T> items, SqlConnection connection, String destTableName, CancellationToken cancellationToken = default )
{
    DataTable table = ToDataTable( items );

    using SqlBulkCopy bulkCopy = new SqlBulkCopy( connection );
    bulkCopy.DestinationTableName = destTableName;

    // Add each ColumnMapping exactly once - outside of any per-row loop.
    foreach( DataColumn column in table.Columns )
    {
        bulkCopy.ColumnMappings.Add( column.ColumnName, column.ColumnName );
    }

    await bulkCopy.WriteToServerAsync( table, cancellationToken ).ConfigureAwait(false);
}
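
For reference, here's a minimal sketch of how you might call it. The Order DTO, connection string, and dbo.Orders table name are all hypothetical placeholders - the only real requirement is that the DTO's public property names match the destination table's column names, since that's what the mappings are built from:

// Hypothetical DTO - property names must match the destination columns.
public sealed class Order
{
    public Int32 Id { get; set; }
    public String Name { get; set; } = "";
    public DateTime? ShippedOn { get; set; }   // nullable -> AllowDBNull column
}

// ...

var orders = new List<Order>
{
    new Order { Id = 1, Name = "First" },
    new Order { Id = 2, Name = "Second", ShippedOn = DateTime.UtcNow },
};

using var connection = new SqlConnection( connectionString );
await connection.OpenAsync( cancellationToken );
await ImportAsync( orders, connection, "dbo.Orders", cancellationToken );

Note this sketch can't run without a live SQL Server instance, so treat it as an illustration of the call shape rather than a tested program.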
