How to bulk insert records into SQL Server

I'm trying to read a text file and split each line according to its significance; each line forms a record in my database table. I store these records in a list and then bulk insert them from the list into the database. The file I'm reading is ~18 MB and has around 15,000 to 18,000 lines. Below is the code:

StringBuilder logInsertCommand = new StringBuilder();
List<string> bulkLogInsert = new List<string>();
string line;
using (FileStream fs = File.Open(FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (BufferedStream bs = new BufferedStream(fs))
using (StreamReader sr = new StreamReader(bs, Encoding.GetEncoding("iso-8859-1")))
{
    while ((line = sr.ReadLine()) != null)
    {
        // Perform some logic with `line` and get all the column values required for
        // inserting a new record in the database table. Values like FirstColumnValue,
        // SecondColumnValue are obtained from the logic performed on `line`.
        logInsertCommand.Clear(); // reset the builder so earlier records don't accumulate
        logInsertCommand.Append(FirstColumnValue).Append(';')
                        .Append(SecondColumnValue).Append(';')
                        .Append(ThirdColumnValue).Append(';')
                        .Append(FourthColumnValue).Append(';')
                        .Append(FifthColumnValue);
        bulkLogInsert.Add(logInsertCommand.ToString());
    }
}

public void InsertBulkLog(List<string> records)
{
    try
    {
        String connectionString = ConfigurationManager.AppSettings["DBConString"];
        DataTable table = new DataTable("TORNADO_LOGS");
        table.Columns.Add(new DataColumn("FILENAME", typeof(string)));
        table.Columns.Add(new DataColumn("PROJ_CODE", typeof(string)));
        table.Columns.Add(new DataColumn("IS_RECORD_PROCESSED", typeof(string)));
        table.Columns.Add(new DataColumn("FILE_LAST_MODIFIED_DATE", typeof(string)));
        table.Columns.Add(new DataColumn("MP3_FILE", typeof(string)));

        foreach (string record in records)
        {
            string[] rowParameters = record.Split(new char[] { ';' }, StringSplitOptions.RemoveEmptyEntries);
            table.Rows.Add(rowParameters);
        }
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.BulkCopyTimeout = 600;
            bulkCopy.DestinationTableName = table.TableName;
            bulkCopy.WriteToServer(table);
        }
    }
    catch (Exception ex)
    {
        //Write to log
    }
}

My question: I'm storing the records (15k to 18k) in a container like a List<string> and then bulk inserting the data into SQL Server. I suspect this is not a good approach, so how can I insert this data into the database more efficiently? Any approach would be helpful.

Answer (tp5buhyn):

To fully stream the data from the file into SQL Server, you need to create an IDataReader.

There are many ways to do this, but the easiest is to use the FastMember NuGet library, which provides ObjectReader.Create. This accepts an IEnumerable<SomeType> and returns an IDataReader that you can pass directly to WriteToServer. This means each line is streamed into the bulk copy as it is read, and you never hold the whole file in memory at once.
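
For example, assuming a simple POCO whose properties line up with the five columns of the TORNADO_LOGS table (the class and property names below are hypothetical; adjust them to your actual schema):

public class RecordLine
{
    public string FileName { get; set; }
    public string ProjCode { get; set; }
    public string IsRecordProcessed { get; set; }
    public string FileLastModifiedDate { get; set; }
    public string Mp3File { get; set; }
}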

private IEnumerable<RecordLine> GetRecords()
{
    using (FileStream fs = File.Open(FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    using (StreamReader sr = new StreamReader(fs, Encoding.GetEncoding("iso-8859-1")))
    {
        string line;
        while ((line = sr.ReadLine()) != null)
        {
            var rec = new RecordLine();
            // use logic to populate the RecordLine object from `line` here
            yield return rec;
        }
    }
}

public void InsertBulkLog()
{
    try
    {
        var connectionString = ConfigurationManager.AppSettings["DBConString"];
        // Pass the member names in the same order as the destination columns:
        // SqlBulkCopy maps by ordinal position when no column mappings are given.
        using (var reader = ObjectReader.Create(GetRecords(),
            "FileName", "ProjCode", "IsRecordProcessed", "FileLastModifiedDate", "Mp3File"))
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.BulkCopyTimeout = 600;
            bulkCopy.DestinationTableName = "TORNADO_LOGS";
            bulkCopy.WriteToServer(reader);
        }
    }
    catch (Exception ex)
    {
        //Write to log
    }
}
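
A couple of optional SqlBulkCopy settings may also be worth considering (the batch size of 5000 below is just an illustrative value):

bulkCopy.BatchSize = 5000;        // send rows to the server in batches rather than one large batch
bulkCopy.EnableStreaming = true;  // stream large-value data from the IDataReader instead of buffering it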
