Thursday, March 8, 2012

22 GB database copy between servers

We have a process that takes a production backup, copies the .bak
to the dev environment, then restores it.
Once we crossed the 21 GB .bak size, the copy process
started failing because of the size of the file. I found
the knowledge base article where this was explained.
How do you get around this? BCP the database out by table?
DTS package to export data? What works for you?
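The bcp approach would be table by table, something like this (server, database, and table names here are just placeholders):

    rem Export one table in native format from production, then load it on dev
    bcp ProdDb.dbo.Orders out D:\export\Orders.dat -n -S PRODSRV -T
    bcp ProdDb.dbo.Orders in D:\export\Orders.dat -n -S DEVSRV -T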
|||It sounds as though your network may not be up to par. I have copied
40 GB across our network with ease.
You mention that you found a KB article. What article are you talking
about?
--
Keith
"Jay" <anonymous@.discussions.microsoft.com> wrote in message =
news:1cee901c422e6$6f7445d0$a401280a@.phx.gbl...
> We have a process that does a production backup, copy .bak=20
> to the dev environment, then restore.
>=20
> Once we crossed the 21 GB .bak size, the copy process=20
> started failing because of the size of the file. I found=20
> the knowledge base article where this was explained.
>=20
> How do you get around this? BCP the database out by table?=20
> DTS package to export data? What works for you?
|||Hi,
If the normal copy doesn't work, you can always try to FTP it over your
network. In my experience, that is more stable than a normal copy.
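A scripted transfer can also run unattended, roughly like this (the host name, login, and paths are only examples):

    rem Run the FTP commands stored in ftpcopy.txt
    ftp -s:ftpcopy.txt

    rem ftpcopy.txt contents; binary mode matters for a .bak file
    open devserver
    ftpuser
    ftppassword
    binary
    put D:\backups\proddb.bak
    bye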
Greets,
Jo Segers.
"Keith Kratochvil" <sqlguy.back2u@.comcast.net> schreef in bericht
news:%23x250huIEHA.2556@.TK2MSFTNGP12.phx.gbl...
It sounds as though your network may not be up to par. I have copied 40GB
across our network with ease.
You mention that you found a KB article. What article are you talking
about?
Keith
"Jay" <anonymous@.discussions.microsoft.com> wrote in message
news:1cee901c422e6$6f7445d0$a401280a@.phx.gbl...
> We have a process that does a production backup, copy .bak
> to the dev environment, then restore.
> Once we crossed the 21 GB .bak size, the copy process
> started failing because of the size of the file. I found
> the knowledge base article where this was explained.
> How do you get around this? BCP the database out by table?
> DTS package to export data? What works for you?
|||What was the copy process that you used? Robocopy?
http://support.microsoft.com/default.aspx?scid=kb;en-us;259837
MORE INFORMATION
The CopyFileEx and CopyFile functions are used by all of
the standard file-copy utilities that are included with
Windows NT 4.0 and Windows 2000. These utilities include
COPY, XCOPY, and Explorer. The resource kit utility,
Robocopy.exe, also uses the CopyFileEx function;
therefore, all of these tools have the same limitation.
Because these functions use buffered I/O, the maximum size
of a file that you can copy depends on the amount of
paged-pool memory that is available at the start of the
copy procedure and on whether the Physical Address
Extensions (PAE) kernel (which allows access to memory
above 4 gigabytes [GB]) is in use.
With the standard kernel, 1 kilobyte (KB) of paged pool is
required for each megabyte (MB) of file size that is
opened for buffered I/O. Because there are two files
involved in a copy procedure, 2 KB of paged pool are
required for each MB in the source file. When the PAE
kernel is used, the overhead is doubled and 4 KB of paged
pool is required for each MB in the source file.
Service Pack 2 (SP2) for Windows 2000 contains a fix for
this limitation. The kernel memory manager has been
altered with respect to management of paged pool memory
when files are opened for buffered I/O. This change
permits buffered operations on arbitrarily large files and
therefore permits arbitrarily large files to be copied by
any utility that uses the CopyFileEx function.
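To put numbers on the 21 GB file in question (figures are approximate):

    21 GB source file                         ~= 21,500 MB
    Standard kernel: 21,500 MB x 2 KB per MB  ~= 42 MB of paged pool
    PAE kernel:      21,500 MB x 4 KB per MB  ~= 84 MB of paged pool

Paged pool is a limited kernel resource, so if that much is not free when the copy starts, the copy fails, which would explain why the copies began failing at about that size.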
|||Hi,
BCP OUT / BCP IN or DTS will take a long time to run. Instead, you can
zip the file after the backup, copy the zip file to the development
server, then unzip it and restore the database. This will be the easiest
approach.
Steps (a rough batch sketch follows the list):
1. Back up the database
2. Zip the file
3. Copy it to the destination
4. Unzip the file
5. Restore it
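Something along these lines (server names, paths, and the zip tool are placeholders; substitute whatever archiver you use):

    rem 1-2. On the production server: back up the database, then compress the .bak
    osql -S PRODSRV -E -Q "BACKUP DATABASE ProdDb TO DISK = 'D:\bak\ProdDb.bak' WITH INIT"
    zip D:\bak\ProdDb.zip D:\bak\ProdDb.bak

    rem 3. Copy the much smaller archive to the dev server
    copy D:\bak\ProdDb.zip \\DEVSRV\bak\

    rem 4-5. On the dev server: extract and restore
    unzip D:\bak\ProdDb.zip -d D:\bak
    osql -S DEVSRV -E -Q "RESTORE DATABASE ProdDb FROM DISK = 'D:\bak\ProdDb.bak' WITH REPLACE"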
Thanks
Hari
MCDBA
"Jay" <anonymous@.discussions.microsoft.com> wrote in message
news:1cee901c422e6$6f7445d0$a401280a@.phx.gbl...
> We have a process that does a production backup, copy .bak
> to the dev environment, then restore.
> Once we crossed the 21 GB .bak size, the copy process
> started failing because of the size of the file. I found
> the knowledge base article where this was explained.
> How do you get around this? BCP the database out by table?
> DTS package to export data? What works for you?
|||I have used the method Hari mentioned, using FTP and WinRAR. WinRAR
handles large files well, and FTP is faster than a regular copy. You can
also set up a DTS package that handles it all.
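If you go the DTS route, the saved package can be run on a schedule or from the command line with dtsrun (the package name here is made up):

    rem Run a saved DTS package using a trusted connection
    dtsrun /S DEVSRV /E /N CopyProdToDev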
Jeff
|||Or WinRAR, which performs much faster than WinZip.
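WinRAR's command-line tool also makes it easy to script, roughly (paths are examples):

    rem Compress the backup on the source side...
    rar a D:\bak\ProdDb.rar D:\bak\ProdDb.bak
    rem ...and extract it on the destination
    rar x D:\bak\ProdDb.rar D:\bak\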
"Hari" <hari_prasad_k@.hotmail.com> wrote in message
news:OgxYbouIEHA.2924@.TK2MSFTNGP09.phx.gbl...
> Hi,
> BCP OUT / BCP IN , DTS will take long execution time. I feel that you can
> zip the file after backup and then copy the
> zip file to development server and then unzip and restore the database.
This
> will be the easy and best approach.
>
> Steps:
> 1. Backup the database
> 2. Zip the file
> 3. copy to destination
> 4. Unzip the file
> 5. Restore it
>
> Thanks
> Hari
> MCDBA
> "Jay" <anonymous@.discussions.microsoft.com> wrote in message
> news:1cee901c422e6$6f7445d0$a401280a@.phx.gbl...
>
