Thursday, March 29, 2012

Datatype change INT to BIGINT on a large table

Hi folks,
I have a table that is about 500 GB in size. I need to change the datatype
of a column from INT to BIGINT.
When I tried making this change from Enterprise Manager, it failed with a
"log space is full" error. I truncated the log and tried again, but even
then I hit the same problem.
I also have limited space available on the data drive. After some research
I found that SQL Server internally creates a temporary table with the new
datatype, populates it with the original table's data, drops the original
table, and then renames the temporary table.
So I would need at least 500 GB of additional free space on the data drive,
and I do not have that much.
I am thinking of the following alternative approach (a rough sketch of the
commands follows the list):
1. BCP out the data to a temporary mapped network drive which has 500 GB
free space.
2. Drop the table.
3. Recreate the table with BIGINT datatype on the required column.
4. BCP in the data.
5. Recreate the Keys and constraints.
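
A rough sketch of those steps; the database, table, server, and path names
are just placeholders for my real ones:

    REM 1. export the data in native format to the network drive
    bcp MyDatabase.dbo.MyBigTable out \\backupsrv\share\MyBigTable.dat -n -S MYSERVER -T

    -- 2. and 3. drop the table and recreate it with BIGINT
    --    (hypothetical two-column schema; the real table keeps all its columns)
    DROP TABLE dbo.MyBigTable
    CREATE TABLE dbo.MyBigTable
    (
        OrderId BIGINT NOT NULL,  -- was INT
        Amount  MONEY  NOT NULL
    )

    REM 4. import the data back in batches
    bcp MyDatabase.dbo.MyBigTable in \\backupsrv\share\MyBigTable.dat -n -S MYSERVER -T -b 100000

    -- 5. recreate the keys and constraints
    ALTER TABLE dbo.MyBigTable ADD CONSTRAINT PK_MyBigTable PRIMARY KEY (OrderId)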
Can someone tell me whether this is the best way, and whether there is any
possibility of losing data if I follow it? Please suggest a better approach
if there is one.
Thanks in advance.

Another option is to use ALTER TABLE ... ALTER COLUMN ...
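
For example, something like this (table and column names are placeholders;
keep the column's original NULL/NOT NULL setting):

    -- any index or constraint that uses the column must be dropped first
    ALTER TABLE dbo.MyBigTable
        ALTER COLUMN OrderId BIGINT NOT NULL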
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
"Venkat" <nospam_venkat_asp@.yahoo.co.uk> wrote in message
news:u38r6IDyFHA.3588@.tk2msftngp13.phx.gbl...
> Hi folks,
> I have a table which is of 500 GB in size. I need to change the datatype
> of a column from INT to BIGINT.
> When I tried making this change from Enterprise Manager, it was throwing
> log space is full. I also truncated the log and tried again, eventhen I
> face the same problem.
> Also I have limited space available on data drive. After some research I
> found that SQL Server interally creates a Tmp table with the new
> datatype, populates that table with orginal table data, drops the
> original table and then renames the Tmp table.
> So I must need atleast 500 GB additional freespace on data drive, but I
> do not have 500 GB free space on data drive.
> I am just thinking the below alternate way to do this task.
> 1. BCP out the data to a temporary mapped network drive which has 500 GB
> free space.
> 2. Drop the table.
> 3. Recreate the table with BIGINT datatype on the required column.
> 4. BCP in the data.
> 5. Recreate the Keys and constraints.
> Can someone suggest me whether this is the best way, any possibility of
> loosing the data if I follow this way. Please suggest me if there is a
> better approach.
> Thanks in advance.

Thanks Tibor,
If I do ALTER TABLE ... ALTER COLUMN, will it be logged to the transaction
log file?

You need to test first. Create a similar table in a smaller database, copy
over a subset of the rows, and do a test. Sometimes these changes can go
through without touching the data (immediately); sometimes all the data is
changed immediately, and the changes have to be logged. I haven't seen any
document describing the exact rules for when a change is immediate or not.
So do a test first to be certain.
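
For instance, something along these lines (the scratch database name and
row count are arbitrary):

    -- copy a subset of the rows into a scratch database
    SELECT TOP 100000 *
    INTO ScratchDb.dbo.MyBigTableCopy
    FROM MyDatabase.dbo.MyBigTable

    -- check log usage before and after the change
    DBCC SQLPERF(LOGSPACE)

    ALTER TABLE ScratchDb.dbo.MyBigTableCopy
        ALTER COLUMN OrderId BIGINT NOT NULL

    DBCC SQLPERF(LOGSPACE)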
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
"Venkat" <nospam_venkat_asp@.yahoo.co.uk> wrote in message
news:OkPRGnDyFHA.3312@.TK2MSFTNGP09.phx.gbl...
> Thanks Tibor,
> If I do ALTER TABLE ...ALTER COLUMN, will it log to transaction log
> file?
> --

I think your bulk copy solution is the best way to go. To reduce the size
of the exported file, use native format instead of fixed-width or
tab-delimited text columns. To reduce transaction logging while importing
the data, set the database recovery model to bulk-logged or simple, set the
database to single-user / dbo-use-only mode, and re-create indexes only
after the import has completed.
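
Something along these lines (database, table, and index names are
placeholders):

    ALTER DATABASE MyDatabase SET RECOVERY BULK_LOGGED
    ALTER DATABASE MyDatabase SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE

    -- ... run the bcp import here ...

    -- rebuild indexes only after the load completes
    CREATE INDEX IX_MyBigTable_Amount ON dbo.MyBigTable (Amount)

    -- then restore the original settings
    ALTER DATABASE MyDatabase SET MULTI_USER
    ALTER DATABASE MyDatabase SET RECOVERY FULL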
"Venkat" <nospam_venkat_asp@.yahoo.co.uk> wrote in message
news:u38r6IDyFHA.3588@.tk2msftngp13.phx.gbl...
> Hi folks,
> I have a table which is of 500 GB in size. I need to change the datatype
> of a column from INT to BIGINT.
> When I tried making this change from Enterprise Manager, it was throwing
> log space is full. I also truncated the log and tried again, eventhen I
> face the same problem.
> Also I have limited space available on data drive. After some research I
> found that SQL Server interally creates a Tmp table with the new
> datatype, populates that table with orginal table data, drops the
> original table and then renames the Tmp table.
> So I must need atleast 500 GB additional freespace on data drive, but I
> do not have 500 GB free space on data drive.
> I am just thinking the below alternate way to do this task.
> 1. BCP out the data to a temporary mapped network drive which has 500 GB
> free space.
> 2. Drop the table.
> 3. Recreate the table with BIGINT datatype on the required column.
> 4. BCP in the data.
> 5. Recreate the Keys and constraints.
> Can someone suggest me whether this is the best way, any possibility of
> loosing the data if I follow this way. Please suggest me if there is a
> better approach.
> Thanks in advance.

Thanks JT for your excellent suggestions/comments.
I tried Tibor's suggestion (ALTER TABLE ... ALTER COLUMN); it logs to the
transaction log even when the recovery model is set to simple, and
internally it updates all the rows. I do not have enough space on the log
drive, so I will have to go with the BCP option.

I tried BCP with both the native format and the char format options. It
seems that the file unloaded using char format is smaller than the one
created with native format. Any ideas?

I don't recall the specifics offhand, but there are cases (perhaps with
decimal data types) where exporting to char format and then re-importing
will cause a loss of data resolution. If you have space for exporting to
native format, then go ahead and use that.
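
For reference, the two formats come from bcp's -n and -c switches (server
and path names are placeholders):

    REM native format: binary file, exact datatype round-trip
    bcp MyDatabase.dbo.MyBigTable out D:\dump\MyBigTable_n.dat -n -S MYSERVER -T

    REM char format: text file, values converted to and from strings
    bcp MyDatabase.dbo.MyBigTable out D:\dump\MyBigTable_c.dat -c -S MYSERVER -T

Small integer values can take fewer bytes as text than as fixed-width
binary, which may be why your char file came out smaller.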
"Venkat" <nospam_venkat_asp@.yahoo.co.uk> wrote in message
news:%235R%23ikNyFHA.3420@.TK2MSFTNGP10.phx.gbl...
>I tried BCPing with Native format and Char format option. It seems that
> the the file unloaded using Char format is smaller than the one created
> with Native format. Any ideas..
> --
> *** Sent via Developersdex http://www.examnotes.net ***sql