I have a table in SQL Server, something like below:
CREATE TABLE dbo.PersonDetails
(
    ID int IDENTITY(1,1),      -- identity column
    Age int,
    FirstName varchar(100),    -- lengths not given in the original sketch, 100 is illustrative
    LastName varchar(100),
    CreatedDateTime datetime
);
This table currently has around 100 million rows. I need to grow it to about 1 billion rows to test how long it would take to create some indexes on the table. I can reuse the data from the same table, inserting it again and again. What is the fastest way to achieve this?
I made a simple WHILE loop in SQL, something like below:
DECLARE @maxrows bigint = 900000000,
        @currentrows bigint,
        @batchsize bigint = 10000000;

SELECT @currentrows = COUNT(*) FROM [dbo].[PersonDetails] WITH (NOLOCK);

WHILE @currentrows < @maxrows
BEGIN
    INSERT INTO [dbo].[PersonDetails] ([Age], [FirstName], [LastName], [CreatedDateTime])
    SELECT TOP (@batchsize)
           [Age]
          ,[FirstName]
          ,[LastName]
          ,[CreatedDateTime]
    FROM [dbo].[PersonDetails];

    SELECT @currentrows = COUNT(*) FROM [dbo].[PersonDetails] WITH (NOLOCK);
END
But the problem is that this query fails with the error below after inserting some of the data:

The transaction log for database 'DBNAME' is full due to 'LOG_BACKUP'.
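For reference, you can confirm what is preventing log truncation by checking log_reuse_wait_desc in sys.databases, for example:

SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'DBNAME';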
I can either add some delay after each insert, or reduce the batch size.
What is the best way to move forward here?
2 Answers
z5btuh9x1#
If this database lives in a testing environment, change the recovery model to SIMPLE:
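For example, assuming the database name 'DBNAME' from the error message:

ALTER DATABASE [DBNAME] SET RECOVERY SIMPLE;  -- under SIMPLE recovery the log is truncated at checkpoints instead of waiting for a log backup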
If not (I am guessing you are not able to change the database settings), I would advise performing the operation in small batches and committing each iteration. It would be something like the following:
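A sketch along those lines, reusing the PersonDetails columns from the question; the smaller batch size is just an example:

DECLARE @maxrows bigint = 900000000,
        @currentrows bigint,
        @batchsize bigint = 1000000;  -- smaller batches keep each transaction short

SELECT @currentrows = COUNT(*) FROM [dbo].[PersonDetails] WITH (NOLOCK);

WHILE @currentrows < @maxrows
BEGIN
    BEGIN TRANSACTION;

    INSERT INTO [dbo].[PersonDetails] ([Age], [FirstName], [LastName], [CreatedDateTime])
    SELECT TOP (@batchsize)
           [Age], [FirstName], [LastName], [CreatedDateTime]
    FROM [dbo].[PersonDetails];

    COMMIT TRANSACTION;  -- commit every batch so log backups can reclaim space between iterations

    -- WAITFOR DELAY '00:00:10';  -- optional pause between batches

    SET @currentrows = @currentrows + @batchsize;
END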
You can also use WAITFOR DELAY between batches, so that the load does not take up too many resources and block other queries.
az31mfrm2#
I would create a temp table and then transfer the data to the actual table.
Something like this:
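A sketch of that idea, assuming the same PersonDetails columns; looping 9 times over a staging copy of the ~100 million existing rows adds roughly the 900 million rows needed:

-- Stage a copy of the current rows in tempdb
SELECT [Age], [FirstName], [LastName], [CreatedDateTime]
INTO #PersonDetailsStage
FROM [dbo].[PersonDetails];

-- Transfer the staged rows back into the actual table, one pass per extra copy
DECLARE @i int = 0;
WHILE @i < 9
BEGIN
    INSERT INTO [dbo].[PersonDetails] ([Age], [FirstName], [LastName], [CreatedDateTime])
    SELECT [Age], [FirstName], [LastName], [CreatedDateTime]
    FROM #PersonDetailsStage;

    SET @i += 1;
END

DROP TABLE #PersonDetailsStage;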