
Thursday, March 22, 2012

3 tables to join

hi,
I know how to join 2 tables, but now I have a process where I need to select from 3 tables. I have sample tables and fields below. I want to join Parts & Orders using the field Prt_no, and Supplier & Parts using the field Sup_code.

Parts table      Orders table     Supplier table
-----------      ------------     --------------
Prt_no           Prt_no           Sup_code
Prt_name         Oh_qt            Sup_name
Re_Level         Or_no
Pri_amnt
Sup_code

Thanks...|||SELECT
    p.Prt_no,
    o.Prt_no,
    s.Sup_code,
    p.Sup_code
FROM Parts p
INNER JOIN Orders o ON p.Prt_no = o.Prt_no
INNER JOIN Supplier s ON p.Sup_code = s.Sup_code
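(If you want descriptive columns rather than just the join keys, something like this should work against the fields listed above; the column choice is only an example:)
SELECT
    p.Prt_no,
    p.Prt_name,
    s.Sup_name,
    o.Or_no,
    o.Oh_qt
FROM Parts p
INNER JOIN Orders o ON p.Prt_no = o.Prt_no
INNER JOIN Supplier s ON p.Sup_code = s.Sup_code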

--I would invest in a good T-SQL book if I were you. Teach Yourself Transact-SQL in 21 Days by Sams Publishing is really good for beginners.|||Where can I get that book?

Thursday, March 8, 2012

22 GB database copy between servers

We have a process that does a production backup, copy .bak
to the dev environment, then restore.
Once we crossed the 21 GB .bak size, the copy process
started failing because of the size of the file. I found
the knowledge base article where this was explained.
How do you get around this? BCP the database out by table?
DTS package to export data? What works for you?|||It sounds as though your network may not be up to par. I have copied 40GB across our network with ease.
You mention that you found a KB article. What article are you talking about?
-- Keith
"Jay" <anonymous@.discussions.microsoft.com> wrote in message news:1cee901c422e6$6f7445d0$a401280a@.phx.gbl...
> We have a process that does a production backup, copy .bak
> to the dev environment, then restore.
>
> Once we crossed the 21 GB .bak size, the copy process
> started failing because of the size of the file. I found
> the knowledge base article where this was explained.
>
> How do you get around this? BCP the database out by table?
> DTS package to export data? What works for you?|||Hi,
If the normal copy doesn't work you can always try to FTP it over your
network. This is more stable than the normal copy. (In my experience.)
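(One way to script that from the SQL Server box, as a rough sketch only: it assumes xp_cmdshell is enabled and uses a one-shot command file for the Windows ftp client; the server name, login, and paths are made up.)
-- Hypothetical: drive the Windows command-line ftp client from T-SQL.
-- C:\scripts\put_backup.txt would contain lines like:
--   open devserver
--   backupuser
--   secret
--   binary
--   put D:\Backups\MyDb.bak
--   bye
EXEC master..xp_cmdshell 'ftp -s:C:\scripts\put_backup.txt'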
Greets,
Jo Segers.
"Keith Kratochvil" <sqlguy.back2u@.comcast.net> schreef in bericht
news:%23x250huIEHA.2556@.TK2MSFTNGP12.phx.gbl...
It sounds as though your network may not be up to par. I have copied 40GB
across our network with ease.
You mention that you found a KB article. What article are you talking
about?
--
Keith
"Jay" <anonymous@.discussions.microsoft.com> wrote in message
news:1cee901c422e6$6f7445d0$a401280a@.phx.gbl...
> We have a process that does a production backup, copy .bak
> to the dev environment, then restore.
> Once we crossed the 21 GB .bak size, the copy process
> started failing because of the size of the file. I found
> the knowledge base article where this was explained.
> How do you get around this? BCP the database out by table?
> DTS package to export data? What works for you?|||What was the copy process that you used? Robocopy?
http://support.microsoft.com/default.aspx?scid=kb;en-us;259837
MORE INFORMATION
The CopyFileEx and CopyFile functions are used by all of
the standard file-copy utilities that are included with
Windows NT 4.0 and Windows 2000. These utilities include
COPY, XCOPY, and Explorer. The resource kit utility,
Robocopy.exe, also uses the CopyFileEx function;
therefore, all of these tools have the same limitation.
Because these functions use buffered I/O, the maximum size
of a file that you can copy depends on the amount of paged-
pool memory that is available at the start of the copy
procedure and on whether the Physical Address Extensions
(PAE) kernel (which allows access to memory above 4
gigabytes [GB]) is in use.
With the standard kernel, 1 kilobyte (KB) of paged pool is
required for each megabyte (MB) of file size that is
opened for buffered I/O. Because there are two files
involved in a copy procedure, 2 KB of paged pool are
required for each MB in the source file. When the PAE
kernel is used, the overhead is doubled and 4 KB of paged
pool is required for each MB in the source file.
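[In concrete terms: the poster's 21 GB .bak is roughly 21,500 MB, so a standard-kernel copy would need about 21,500 x 2 KB = 42 MB of paged pool, and about 84 MB under the PAE kernel.]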
Service Pack 2 (SP2) for Windows 2000 contains a fix for
this limitation. The kernel memory manager has been
altered with respect to management of paged pool memory
when files are opened for buffered I/O. This change
permits buffered operations on arbitrarily large files and
therefore permits arbitrarily large files to be copied by
any utility that uses the CopyFileEx function.|||Hi,
BCP OUT / BCP IN and DTS will both take a long time to run. I feel you can
zip the file after the backup, copy the zip file to the development server,
then unzip and restore the database. This will be the easiest and best
approach.
Steps:
1. Backup the database
2. Zip the file
3. copy to destination
4. Unzip the file
5. Restore it
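(A minimal T-SQL sketch of steps 1-3 on the production side; the paths, database name, and WinRAR command line are hypothetical, and xp_cmdshell must be enabled:)
-- 1. Back up the database (name and path are made up).
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb.bak' WITH INIT
-- 2. Compress the backup (assumes WinRAR's rar.exe is on the PATH).
EXEC master..xp_cmdshell 'rar a D:\Backups\MyDb.rar D:\Backups\MyDb.bak'
-- 3. Copy the archive to the dev server (hypothetical share name).
EXEC master..xp_cmdshell 'copy D:\Backups\MyDb.rar \\DEVSRV\Backups\'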
Thanks
Hari
MCDBA
"Jay" <anonymous@.discussions.microsoft.com> wrote in message
news:1cee901c422e6$6f7445d0$a401280a@.phx.gbl...
> We have a process that does a production backup, copy .bak
> to the dev environment, then restore.
> Once we crossed the 21 GB .bak size, the copy process
> started failing because of the size of the file. I found
> the knowledge base article where this was explained.
> How do you get around this? BCP the database out by table?
> DTS package to export data? What works for you?|||I have used the method Hari mentioned, using FTP and WinRAR. WinRAR handles the large files well, and FTP is faster than a regular copy. You can set up a DTS package that handles it all, too.
Jeff|||Or WINRAR, which performs much faster than WINZIP
"Hari" <hari_prasad_k@.hotmail.com> wrote in message
news:OgxYbouIEHA.2924@.TK2MSFTNGP09.phx.gbl...
> Hi,
> BCP OUT / BCP IN , DTS will take long execution time. I feel that you can
> zip the file after backup and then copy the
> zip file to development server and then unzip and restore the database.
This
> will be the easy and best approach.
>
> Steps:
> 1. Backup the database
> 2. Zip the file
> 3. copy to destination
> 4. Unzip the file
> 5. Restore it
>
> Thanks
> Hari
> MCDBA
> "Jay" <anonymous@.discussions.microsoft.com> wrote in message
> news:1cee901c422e6$6f7445d0$a401280a@.phx.gbl...
> > We have a process that does a production backup, copy .bak
> > to the dev environment, then restore.
> >
> > Once we crossed the 21 GB .bak size, the copy process
> > started failing because of the size of the file. I found
> > the knowledge base article where this was explained.
> >
> > How do you get around this? BCP the database out by table?
> > DTS package to export data? What works for you?
>

Friday, February 24, 2012

2005 restored msdb, cant see SSIS packages ?

We recently rebuilt a dev server, and as part of that process, I restored
msdb (successfully) from backup.
But now I can't see any SSIS packages from Management Studio !
If I run this query ->
use msdb
select *
from sysdtspackages90
then I can see them all !
How do I get Management Studio to see them as well ?
TIA|||Hi Jim
My guess this is possibly related to ownership. Did you restore the master
database as well?
John
"Jim Trowbridge" wrote:
> We recently rebuilt a dev server, and as part of that process, I restored
> msdb (successfully) from backup.
> But now I can't see any SSIS packages from Management Studio !
> If I run this query ->
> use msdb
> select *
> from sysdtspackages90
> then I can see them all !
> How do I get Management Studio to see them as well ?
> TIA|||John,
I didn't restore master. It turned out to be a problem with the registry
setting that points to the (cluster resource copy of the) MsDtsSrvr.ini.xml
file: it had the wrong path. That's why the tree didn't even have the usual
File System and MSDB sub-tree entries.
Once that registry setting was fixed and the Integration Services service
was bounced, everything behaved properly.
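(For anyone who hits the same thing: if memory serves, the 2005 Integration Services service reads the path to MsDtsSrvr.ini.xml from HKLM\SOFTWARE\Microsoft\MSDTS\ServiceConfigFile. A quick way to check it, assuming xp_cmdshell is enabled, is something like:)
-- Hypothetical check: print the SSIS service config path from the registry.
EXEC master..xp_cmdshell 'reg query "HKLM\SOFTWARE\Microsoft\MSDTS" /v ServiceConfigFile'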
"John Bell" wrote:
> Hi Jim
> My guess this is possibly related to ownership. Did you restore the master
> database as well?
> John
> "Jim Trowbridge" wrote:
> > We recently rebuilt a dev server, and as part of that process, I restored
> > msdb (successfully) from backup.
> > But now I can't see any SSIS packages from Management Studio !
> > If I run this query ->
> > use msdb
> > select *
> > from sysdtspackages90
> > then I can see them all !
> >
> > How do I get Management Studio to see them as well ?
> > TIA|||Hi Jim
Thanks for the information, it's good to know you found the solution.
John
"Jim Trowbridge" wrote:
> John,
> I didn't restore master. It turned out to be a problem with the registry
> setting that points to the (cluster resource copy of the) MsDtsSrvr.ini.xml
> file: it had the wrong path. That's why the tree didn't even have the usual
> File System and MSDB sub-tree entries.
> Once that registry setting was fixed and the Integration Services service
> was bounced, everything behaved properly.
> "John Bell" wrote:
> > Hi Jim
> >
> > My guess this is possibly related to ownership. Did you restore the master
> > database as well?
> >
> > John
> >
> > "Jim Trowbridge" wrote:
> >
> > > We recently rebuilt a dev server, and as part of that process, I restored
> > > msdb (successfully) from backup.
> > > But now I can't see any SSIS packages from Management Studio !
> > > If I run this query ->
> > > use msdb
> > > select *
> > > from sysdtspackages90
> > > then I can see them all !
> > >
> > > How do I get Management Studio to see them as well ?
> > > TIA

Saturday, February 11, 2012

2005 DB backwards-compatible to 2000?

I've attached an SQL2000 db to an SQL2005 Std installation and am curious if
that process can go the other way. Can you detach from 2005 and attach to
2000 without any issues?
Earl wrote:
> Can you detach from 2005 and attach to
> 2000 without any issues?
No. One cannot attach a higher version database to a lower version SQL
Server. The 2005 file format is incompatible with SQL Server 2000.
If you have to migrate backwards, use SSIS to copy the objects and data from
SQL Server 2005 to SQL Server 2000.
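(If SSIS is awkward to set up, a rough alternative after scripting the schema across: push the data table by table over a linked server. The linked server, database, and table names below are made up.)
-- Hypothetical: run on the 2005 server, pushing rows to the 2000 server
-- through a linked server named SQL2000SRV, one table at a time.
INSERT INTO SQL2000SRV.MyDb.dbo.Orders (Or_no, Prt_no, Oh_qt)
SELECT Or_no, Prt_no, Oh_qt
FROM MyDb.dbo.Orders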
Message posted via droptable.com
http://www.droptable.com/Uwe/Forums.aspx/sql-server/200612/1
|||No, You can not do that because of architectural changes.
Thanks
Hari
"Earl" <brikshoe@.newsgroups.nospam> wrote in message
news:%23ye9a5kHHHA.3540@.TK2MSFTNGP02.phx.gbl...
> I've attached an SQL2000 db to an SQL2005 Std installation and am curious
> if that process can go the other way. Can you detach from 2005 and attach
> to 2000 without any issues?
>
|||Ahhh, thanks Chris and Hari. 'Tis the answer I need.
"Chris O'C via droptable.com" <u29189@.uwe> wrote in message
news:6aacc6ece6f0b@.uwe...
> Earl wrote:
>> Can you detach from 2005 and attach to
>> 2000 without any issues?
> No. One cannot attach a higher version database to a lower version SQL
> Server. The 2005 file format is incompatible with SQL Server 2000.
> If you have to migrate backwards, use SSIS to copy the objects and data
> from
> SQL Server 2005 to SQL Server 2000.
> --
> Message posted via droptable.com
> http://www.droptable.com/Uwe/Forums.aspx/sql-server/200612/1
>
