Saturday, January 1, 2011

Insert large batches of data


More times than I can count, I've had to simulate data flow. Whether it's testing new code, checking performance, or simply debugging the application layer, I've had to create test tables and load them up with tons of data. If you're in the DBA seat, you'll need to do the same...I can assure you.


Check this out: a quick method for writing data in bulk.


--drop/recreate the test table
IF OBJECT_ID('TableName', 'U') IS NOT NULL
    DROP TABLE TableName
GO

CREATE TABLE TableName (
  a2 VARCHAR(10),
  c2 INT
)
GO


--insert 100,000 rows in a single transaction
SET NOCOUNT ON
BEGIN TRAN
DECLARE @i INT
SET @i = 0
WHILE @i < 100000
BEGIN
    INSERT TableName VALUES ('abcdefghij', @i)
    SET @i = @i + 1
END
COMMIT TRAN
GO
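
If you'd rather skip the loop, the same load works as a single set-based INSERT. Here's a sketch that manufactures row numbers by cross joining a catalog view; it assumes sys.all_columns on your instance has enough rows to cover the 100,000 you need (it usually does, but adjust if not):

--set-based alternative: one INSERT, no loop
INSERT TableName (a2, c2)
SELECT TOP (100000)
    'abcdefghij',
    ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1
FROM sys.all_columns AS ac1
CROSS JOIN sys.all_columns AS ac2
GO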


--update stats (a good idea after large inserts or deletions)
UPDATE STATISTICS TableName WITH FULLSCAN
GO
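
To confirm the statistics actually refreshed, you can check the stats date. A minimal sketch using sys.stats and STATS_DATE(), available in SQL Server 2005 and later:

--check when each statistics object on the table was last updated
SELECT s.name,
       STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM sys.stats AS s
WHERE s.object_id = OBJECT_ID('TableName')
GO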


1 comment:

  1. This is helpful information. I would add that you should be careful to account for tran logging, especially if your database is in "full" recovery mode. You can easily blow the tran log with testing like this ;-)

    Also, I've found that running update stats with fullscan is not always better than letting SQL Server pick the sampling. Moreover, update statistics isn't always the answer to maintaining good query performance, as suggested in this recent post: http://www.sqlsoldier.com/wp/sqlserver/misunderstoodandabusedupdatestatisticscommand
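
To the commenter's point about the transaction log: one way to keep it in check during a test load is to commit in smaller batches rather than one big transaction. A minimal sketch of the same load with a 10,000-row batch size (an arbitrary number; tune it for your environment):

--same 100,000 rows, committed in batches of 10,000
SET NOCOUNT ON
DECLARE @i INT, @stop INT
SET @i = 0
WHILE @i < 100000
BEGIN
    BEGIN TRAN
    SET @stop = @i + 10000
    WHILE @i < @stop
    BEGIN
        INSERT TableName VALUES ('abcdefghij', @i)
        SET @i = @i + 1
    END
    COMMIT TRAN  --each commit frees the log to truncate (on checkpoint in simple recovery, after a log backup in full)
END
GO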
