
Tuesday, March 27, 2012

32GB log file

Recently, I discovered that the log file of a testing database server has grown
a lot. The log file is around 32GB.
Any idea on how to shrink/purge the log file?
Thanks
Alan
Hi,
It seems you have set the recovery model for this database to FULL. In this
recovery model all the activity inside the database is logged, so you have to
schedule a transaction log backup to clear out the transaction logs. This
backup can be used when a recovery is needed. Since this database is a test
environment, I recommend you set the recovery model to SIMPLE.
How to set it to SIMPLE:
ALTER DATABASE <dbname> SET RECOVERY SIMPLE
How to clear the existing logs and shrink the file:
Since the file is really huge, I recommend you set the database to single
user mode before doing the steps below.
-- Set the database to single user
ALTER DATABASE <dbname> SET SINGLE_USER WITH ROLLBACK IMMEDIATE
-- Truncate the transaction log
BACKUP LOG <dbname> WITH TRUNCATE_ONLY
-- Shrink the transaction log file
DBCC SHRINKFILE('logical_ldf_name', TRUNCATEONLY)
After doing the above steps, execute the command below to see the transaction
log size and usage.
DBCC SQLPERF(LOGSPACE)
-- Now set the database back to multi user
ALTER DATABASE <dbname> SET MULTI_USER
Thanks
Hari
MCDBA
"izumi" <test@.test.com> wrote in message
news:#AzZUnZWEHA.2844@.TK2MSFTNGP11.phx.gbl...
> Recently, I discovered the log file of testing database server which grow up
> a lot. The log file is around 32GB.
> any idea on shrink/ Purge the log file?
> Thanks
> Alan
>
|||see
http://www.nigelrivett.net/TransactionLogFileGrows_1.html
|||Thanks all.
Actually my company backs up the database 2 times a day,
but we don't shrink (truncate) the log, so it grows a lot...
Will we have any problems after I delete the whole log?
"Nigel Rivett" <NigelRivett@.discussions.microsoft.com> wrote in message
news:4E79A30A-2807-4652-921A-7BCC48B6199E@.microsoft.com...
> see
> http://www.nigelrivett.net/TransactionLogFileGrows_1.html
>
|||No real problems deleting the log, except that you won't be able to recover
transactions that are potentially lost.
If you are not using the logs for your recovery model, I would just
recommend setting the database recovery model to "SIMPLE". This will turn
logging off, so you won't have to worry about the files at all.
Greg Jackson
PDX, Oregon
|||Yes, if we can restore the mdf without the log,
I think it is OK...
"Jaxon" <GregoryAJackson@.hotmail.com> wrote in message
news:%23kaIa5fWEHA.1048@.tk2msftngp13.phx.gbl...
> no real Problems deleting the log except that you wont be able to recover
> transactions potentially lost.
> If you are not using the logs for your recovery model, I would just
> recommend setting the database recovery model to "SIMPLE". This will turn
> logging off then you wont have to worry about the files at all.
>
> Greg Jackson
> PDX, Oregon
>
|||Just wanted to make a distinction between shrinking and truncating the file. Truncation is a TLog-specific activity and it deletes old log records
in the file that are no longer necessary. This activity does not result in any reduction in the physical size of the LDF file, and no change in file size
is noticed at the file level. This is an internal operation and only makes room inside the file for re-use.
On the other hand, shrinking is a physical operation intended for reducing the size of the LDF file.
To address your problem, I would not recommend a flat deletion of the TLog file. Rather, you should schedule a periodic BACKUP LOG
operation, which will ensure that the TLog is truncated. Then, on an as-needed basis, you can perform a manual DBCC SHRINKFILE operation
to reclaim this unused (truncated) space from the ldf and release it back to the OS.
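A minimal T-SQL sketch of that approach (the database name, logical log file name, backup path, and target size are placeholders):
-- Scheduled: back up the transaction log, which truncates the inactive log records
BACKUP LOG <dbname> TO DISK = 'D:\Backups\<dbname>_log.trn'
-- As needed: shrink the LDF to a target size (in MB), releasing the truncated space back to the OS
DBCC SHRINKFILE('logical_ldf_name', 1024)
-- Check log size and space used afterwards
DBCC SQLPERF(LOGSPACE)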
Thanks
Ananth Padmanabham
Microsoft SQL Server support
Please reply only to the newsgroup so that others can benefit. When posting, please state the version of SQL Server being used and the error
number/exact error message text received, if any.
This posting is provided "AS IS" with no warranties, and confers no rights.

Thursday, March 22, 2012

3 XML Bulk Load questions

1) Is it possible to pass the schema file as a stream, rather than a
pathname to a file on disk, as the first parameter of the Bulk Load Execute
method?
2) I've built a .Net wrapper for the Bulk Load application and while I'm
considering expanding on its functionality, I'm pretty confident that
Microsoft will eventually expand their own native .Net versions of the
SQLXML suite to include Bulk Load. Any idea if this is coming soon, or are
we likely looking at a SQL 2005 timeframe?
3) I am encountering the following error: "No data was provided for column
'COLUMN1' on table 'TABLE1', and this column cannot contain NULL values."
According to the documentation, the KeepNulls property, "Specifies what
value to use for a column that is missing a corresponding attribute or
subelement in the XML document. This is a Boolean property. When the
property is set to TRUE, XML Bulk Load assigns a null value to the column.
It does not assign the column's default value, if any, as set on the
server." I have tried setting the KeepNulls property to both True and
False and the same error message is returned.
It seems that if the schema file defines a table element but the data file
doesn't include a value for that element, then bulk load attempts to insert
a NULL value for this field. This obviously becomes a problem in tables
which have fields defined as NOT NULL. That said, I believe you can define
a default value (via attributes) in the case where no value is supplied,
however this design decision seems somewhat flawed to me from a
functionality and performance standpoint.
For example, you might have a table with 50 fields but in some cases you
might only be inserting 2 field values. Therefore, if you create a schema
file with all 50 fields and their corresponding defaults, I imagine Bulk
Load will have to dynamically build and run the following query when only 2
fields values are supplied in a data file for a record...
INSERT INTO TABLE1 (Field1,Field2...Field50)
VALUES (Value1,Value2,...Value50)
rather than dynamically building a shorter Insert statement and running the
following more efficient query...
INSERT INTO TABLE1 (Field1,Field2)
VALUES (Value1,Value2)
Is there a particular reason this design approach was taken or am I missing
a different approach to this particular problem?
Thank you.
P.S. I'm using the latest version from the website - SQLXML 3.0 SP2.
Since there have been no replies to my original post...
1) It looks like there isn't an overloaded method to pass a stream as the
first parameter so you must pass the path to a file. That said, if
Microsoft is listening, this would be nice for those cases where you are
dynamically building the schema in memory and would like to prevent having
to force a physical read from the disk.
2) I've read somewhere that Microsoft will be supplying a fully .Net enabled
version - probably sometime during the Longhorn/SQL 2005 timeframe.
3) While it would be nice to get some feedback on the design decision I
hinted at in my third question I was more concerned with the implementation
of Bulk Load given a specific scenario. Here are my findings given three
different approaches with my original question reflected in the third
scenario...
Example 1: Mapped field with value:
Schema file - contains field DocumentID
Data file - contains data for field DocumentID
Result - Successfully integrates value supplied in the data file into the
DocumentID field
Example 2: Unmapped field no value:
Schema file - doesn't contain field DocumentID
Data file - no data for field DocumentID
Result - If a default value has been set up for this field in the destination
table then that value is used; otherwise, if no default is defined, a NULL
value is inserted. In addition, you can force a NULL to be inserted for
unmapped columns by setting the KeepNulls property to TRUE. In both
situations where a NULL might be inserted, if the field in the destination
table does not allow NULLs then an error is returned.
Example 3: Mapped field no value:
Schema file - contains field DocumentID
Data file - no data for field DocumentID
Result - Integration fails and an error message is returned - "No data was
provided for column 'DocumentID' on table 'Table1', and this column cannot
contain NULL values." (The KeepNulls property has no bearing on the results in
this case, as it only relates to unmapped columns.) You can get around this
by defining a default for the column using attributes; however, in my
opinion this is a flawed solution, as it just increases the size of your
schema file and requires you to keep your destination table and schema file
default values always in sync.
So why doesn't Bulk Load use the field default instead of trying to insert a
NULL value in scenario 3? I guess the Bulk Load program is creating the
column_list for its INSERT statement based on ALL the values from the
schema file. And where there is no corresponding data value, it inserts a
NULL value.
INSERT INTO TABLE1 (Field1,DocumentID...Field50)
VALUES (Value1,NULL,...NULL)
From a performance perspective I can understand the decision not to go out
and retrieve the default field values from the table, but wouldn't it make
more sense to just ignore missing data values that have been mapped and let
SQL Server handle the defaults? And in the situation where the intended
value is NULL, either have the data file contain an explicit NULL value or
add another Boolean property to the Bulk Load interface. As I listed
below, I believe the following makes more sense and would increase
performance given the conditions outlined in scenario 3:
INSERT INTO TABLE1 (Field1,DocumentID)
VALUES (Value1,Value2)
Anyway, if anyone has insight into this or upcoming changes in functionality
I would be extremely grateful as we are building a fairly large program
around the Bulk Load interface and ideally we would like to prevent having
to rewrite portions of our solution as new versions of SQLXML are released.
Thanks again.
"Cipher" <c@.c.com> wrote in message
news:OaSGJSFTEHA.1732@.TK2MSFTNGP09.phx.gbl...
> 1) Is it possible to pass the schema file as a stream, rather than a
> pathname to a file on disk, as the first parameter of the Bulk Load
Execute
> method?
> 2) I've built a .Net wrapper for the Bulk Load application and while I'm
> considering expanding on it's functionality I'm pretty confident that
> Microsoft will eventually expand their own native .Net versions of the
> SQLXML suite to include Bulk Load. Any ideas if this is coming soon or
are
> we likely looking at a SQL 2005 timeframe.
> 3) I am encountering the following error: "No data was provided for
column
> 'COLUMN1' on table 'TABLE1', and this column cannot contain NULL values."
> According to the documentation, the KeepNulls property, "Specifies what
> value to use for a column that is missing a corresponding attribute or
> subelement in the XML document. This is a Boolean property. When the
> property is set to TRUE, XML Bulk Load assigns a null value to the column.
> It does not assign the column's default value, if any, as set on the
> server." I have tried setting the KeepNulls property to both True and
> False and the same error message is returned.
> It seems that if the schema file defines a table element but the data file
> doesn't include a value for that element, then bulk load attempts to
insert
> a NULL value for this field . This obviously becomes a problem in tables
> which have fields defined as NOT NULL. That said, I believe you can
define
> a default value (via attributes) in the case where no value is supplied,
> however this design decision seems somewhat flawed to me from a
> functionality and performance standpoint.
> For example, you might have a table with 50 fields but in some cases you
> might only be inserting 2 field values. Therefore, if you create a schema
> file with all 50 fields and their corresponding defaults, I imagine Bulk
> Load will have to dynamically build and run the following query when only
2
> fields values are supplied in a data file for a record...
> INSERT INTO TABLE1 (Field1,Field2...Field50)
> VALUES (Value1,Value2,...Value50)
> rather than dynamically building a shorter Insert statement and running
the
> following more efficient query...
> INSERT INTO TABLE1 (Field1,Field2)
> VALUES (Value1,Value2)
> Is there a particular reason this design approach was taken or am I
missing
> a different approach to this particular problem?
> Thank you.
> ps . I'm using the latest version from the website - SQLXML 3.0 SP2.
>
|||Sorry for the delay in answering.
1) You are correct. It's something we're committed to doing for your topic
in #2
2) We have been working on one that was to be delivered in Whidbey, but that
has since been pushed back. We're now looking at it for the Longhorn
timeframe with the .Net Framework
3) If the column is not null, you need something to put in there. Have you
declared a default value in your database? If so, setting KeepNulls=False
should mean that that value is generated and stored in the table.
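For illustration, a minimal T-SQL sketch of declaring such a server-side default (the table, column, and constraint names are hypothetical):
ALTER TABLE Table1 ADD CONSTRAINT DF_Table1_DocumentID DEFAULT (0) FOR DocumentID
With a default like this in place and KeepNulls=False, rows that omit DocumentID should pick up the default instead of failing the NOT NULL check.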
Irwin Dolobowsky
Program Manager - SqlXml
http://blogs.msdn.com/irwando
This posting is provided "AS IS" with no warranties, and confers no rights.
"Cipher" <c@.c.com> wrote in message
news:%23BRINDzTEHA.3976@.TK2MSFTNGP09.phx.gbl...
> Since there have been no replies to my original post...
> 1) It looks like there isn't an overloaded method to pass a stream as the
> first parameter so you must pass the path to a file. That said, if
> Microsoft is listening, this would be nice for those cases where you are
> dynamically building the schema in memory and would like to prevent having
> to force a physical read from the disk.
> 2) I've read somewhere that Microsoft will be supplying a fully .Net
> enabled
> version - probably sometime during the Longhorn/SQL 2005 timeframe.
> 3) While it would be nice to get some feedback on the design decision I
> hinted at in my third question I was more concerned with the
> implementation
> of Bulk Load given a specific scenario. Here's my findings given three
> different approaches with my original question reflected in the third
> scenario...
>
> Example 1: Mapped field with value:
> Schema file - contains field DocumentID
> Data file - contains data for field DocumentID
> Result - Successfully integrates value supplied in the data file into the
> DocumentID field
>
> Example 2: Unmapped field no value:
> Schema file - doesn't contain field DocumentID
> Data file - no data for field DocumentID
> Result - If a default value has been setup for this field in the
> destination
> table then that value is used, otherwise if no default is defined, a NULL
> value is inserted. In addition, you can force a NULL to be inserted for
> unmapped columns by setting the KeepNulls property to TRUE. In both
> situations where a NULL might be inserted, if no nulls are defined for the
> field in the destination table then an error is returned.
>
> Example 3: Mapped field no value:
> Schema file - contains field DocumentID
> Data file - no data for field DocumentID
> Result - Integration fails and error message is returned - "No data was
> provided for column 'DocumentID' on table 'Table1', and this column cannot
> contain NULL values (the KeepNulls property has no bearing on the results
> in
> this case as it only relates to unmapped columns). You can get around
> this
> by defining a default for the column using attributes, however, in my
> opinion this is a flawed solution as it just increases the size of your
> schema file and requires you to keep your destination table and schema
> file
> default values always in sync.
>
> So why doesn't Bulk Load use the field default instead of trying to insert
> a
> NULL value in scenario 3? I guess the Bulk Load program is creating the
> column_list for its INSERT statement based on ALL the values from the
> schema file. And where there is no corresponding data value, it inserts a
> NULL value.
> INSERT INTO TABLE1 (Field1,DocumentID...Field50)
> VALUES (Value1,NULL,...NULL)
> From a performance perspective I can understand the decision not to go out
> and retrieve the default field values from the table, but wouldn't it make
> more sense to just ignore missing data values that have been mapped and
> let
> SQL Server handle the defaults? And in the situation where the intended
> value is NULL, either have the data file contain an explicit NULL value or
> add another Boolean property to the Bulk Load interface. As I listed
> below, I believe the following make more sense and would increase
> performance given the conditions outlined in scenario 3:
> INSERT INTO TABLE1 (Field1,DocumentID)
> VALUES (Value1,Value2)
> Anyway, if anyone has insight into this or upcoming changes in
> functionality
> I would be extremely grateful as we are building a fairly large program
> around the Bulk Load interface and ideally we would like to prevent having
> to rewrite portions of our solution as new versions of XMLSQL are
> released.
> Thanks again.
>
> "Cipher" <c@.c.com> wrote in message
> news:OaSGJSFTEHA.1732@.TK2MSFTNGP09.phx.gbl...
> Execute
> are
> column
> insert
> define
> 2
> the
> missing
>

Tuesday, March 20, 2012

3 Questions

1. The printing orientation of crystal report can be set from File --> Printer Setup

My report is set as landscape. I indicate the print mode to be activeX in the Jsp code.

However, when the activeX prompts out, I have to choose a printer and also set the page to be landscape again.

Can I skip the step telling the printer to print landscape?



2. When the activeX prompts out, it is set to Page 1 of N by default. How can I change the default setting to print all pages?



3. The language of a browser can be set from Internet Option --> Languages (in Internet Explorer)

I've found that the activeX displays nothing if the language of the browser is not set to English.

We don't want the user to change the language setting in the browser, but how can we display the
activeX properly if the language is originally Chinese?

(1) CR.EnablePopupMenu = False

Thursday, March 8, 2012

22GB .TMP file

Hi,

We have a SQL Server which has developed a mysterious .TMP file. We have no idea what it is. The size of it keeps growing and has now reached 22GB! We have no idea what its function is, and on trying to delete it we get the message that access is denied as there is a sharing violation. We can't find the process that is associated with it either. Its name is _BEVspCacheFile_57.TMP. From this it seems it may belong to a stored procedure somewhere?!

Does anyone know how to find out what this file is and what's using it, and if it would be safe to delete it? It may be in use by a db that is no longer needed, but we have no way of finding this out yet.

Any help on this would be greatly appreciated.|||I think it could be a file from Backup Exec from Veritas.|||Thanks for the quick reply.

That does make sense. We do use Veritas to back up. Do you know what this file is used for and how we can stop the sharing violation so we can delete it?

Cheers,|||Can't say for sure. I did a Google search on '_BEVspCacheFile' and found the following: (...) I have been running around like a dog chasing its tail. We are having the exact same problem with this issue. Ever since we upgraded our version of Backup Exec to 9.0 from 8.6 we have been having all these troubles with users not being able to save their files to their home directories. I have a trouble ticket open with Veritas and the engineer recommended I try this option. They had mentioned to us to remove the Advanced Open File option first and just run the Accelerator Agent without the OFO. They pointed out that there is a TMP file that is not being properly removed when you do your daily backups. The file in question is _bevspcachefile.tmp. The file is a hidden one, so if you do a search just run *.tmp to see if it's still in your system. I can't believe that this could be the problem.
If you're running Veritas only to back up the files/database on that server, I would stop all Veritas services, move the file to another location, restart the Veritas services and see if the backup and restore functions are still operational. Then delete the old file. Perhaps contact Veritas itself also.|||Oh, by the way: Veritas has a knowledge base I didn't consider: http://support.veritas.com/ . Select 'get support now' on the Backup Exec picture, then select 'view more' on the error messages, after that select 'search knowledge base'. I did a search on 'BEVspCacheFile' which led to an article you may wish to read.

Saturday, February 25, 2012

2005 SQL EXPRESS DATABASE BACKUP

I am using Veritas Backup Exec 10.d with the SQL agent. I am able to see the SQL Server 2005 Express database files, but there is a difference in file size between the original database file and the one shown in the SQL agent backup selection. The backup completes 100% successfully, but the file sizes differ, e.g. original file: 350MB, backup file: 28MB.

Any ideas...?

Thanks in advance..

Have you talked to the folks at Veritas/Symantec about this? They can probably give you better information about their software.

Regards,

Mike Wachal
SQL Express team

-
Check out my tips for getting your answer faster and how to ask a good question: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=307712&SiteID=1

|||

The SQL API that BE plugs into will only send extents that are currently allocated by the database, so if you have created a database with an initial size and not filled it, we'd expect this delta.

Beyond this, I'd check if compression is enabled.

Of course, the ultimate test of whether a backup is valid or not is to attempt to do a restore. This should be a periodic task for any backup setup.
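For reference, a rough way to see how much of the 350MB file is actually allocated (which is close to what the agent ends up sending) is something like the following, run against the database in question; the database name is a placeholder:
USE MyDatabase
EXEC sp_spaceused
-- 'database_size' reports the full file sizes; 'unallocated space' is the part that holds no data and does not need to be sent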

Friday, February 24, 2012

2005 SP 1 Install Failure Solution - Unable to install Windows Installer MSP file

This is posted for others who have this error.
If this error occurs, here is what worked for me. I hope it may work for you,
but I am not sure.
In my case I had the SQL Express 2005 installer on my system as well, as I run
both (one for development, one for production).
Shut down all SQL related services manually.
I set registry permissions on Software\Policies\Microsoft\Windows\Installer
to include the account that is doing the install with Full Control. Click
Advanced and make sure everything is selected.
I downloaded and ran the SQL Install Cleanup and removed the SQL 2005 Setup
Files Installer and the SQL Express Installer. I then reinstalled the SQL
Setup Files (ONLY!!!!!) from my SQL Server 2005 CD. If you don't do
this you won't be able to use the SP.
Q290301 OFFXP: Windows Installer CleanUp Utility
http://support.microsoft.com/support/kb/articles/q290/3/01.asp
I then reinstalled the SP and it worked.
Hope this helps somebody.
Special thanks to Peter Yang of Microsoft for his invaluable assistance on
this issue.
--
Get a powerful web, database, application, and email hosting with KJM
Solutions
http://www.kjmsolutions.com|||Forgot to mention this is on a Windows Server 2003 SP1 platform.
--
Get a powerful web, database, application, and email hosting with KJM
Solutions
http://www.kjmsolutions.com
"vbnetdev" <vbnetdev@.community.nospam> wrote in message
news:OdGUsVgaGHA.3304@.TK2MSFTNGP04.phx.gbl...
> This is posted for others who have
> If this error occurs here is what worked for me. I hope it may work for
> you but am not sure.
> In my case I had SQL Express 2005 Installer on my system as well as I run
> both (one for development, one for production).
> Shut down all SQL related services manually.
> I set registry permissions on
> Software\Policies\Microsoft\Windows\Installer to include the account that
> is doing the install with Full Control. Click Advanced and make sure
> everything is selected.
> I downloaded and ran the SQL Install Cleanup and removed the SQL 2005
> Setup Files Installer and the SQL Express Installer. I then reinstalled
> the SQL Setup Files (ONLY!!!!!) from my SQL Server 2005 CD. If you don't
> don't do this you won't be able to use the SP.
> Q290301 OFFXP: Windows Installer CleanUp Utility
> http://support.microsoft.com/support/kb/articles/q290/3/01.asp
> I then reinstalled the SP and it worked.
> Hope this helps somebody.
> Special thanks to Peter Yang of Microsoft for his invaluable assistance on
> this issue.
> --
> Get a powerful web, database, application, and email hosting with KJM
> Solutions
> http://www.kjmsolutions.com
>
>|||Hi,
Thanks for your feedback. This information has been added to Microsoft's
database. Your solution will benefit many other users, and we really value
having you as a Microsoft customer.
If you have any other questions or concerns, please do not hesitate to
contact us. It is always our pleasure to be of assistance.
Have a nice day!
Sincerely,
Wei Lu
Microsoft Online Community Support
==================================================
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
==================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
--
>From: "vbnetdev" <vbnetdev@.community.nospam>
>References: <OdGUsVgaGHA.3304@.TK2MSFTNGP04.phx.gbl>
>Subject: Re: 2005 SP 1 Install Failure Solution - Unable to install
Windows Installer MSP file
>Date: Thu, 27 Apr 2006 10:15:48 -0500
>Lines: 50
>X-Priority: 3
>X-MSMail-Priority: Normal
>X-Newsreader: Microsoft Outlook Express 6.00.2900.2869
>X-MimeOLE: Produced By Microsoft MimeOLE V6.00.2900.2869
>X-RFC2646: Format=Flowed; Response
>Message-ID: <uQdG71gaGHA.3740@.TK2MSFTNGP03.phx.gbl>
>Newsgroups: microsoft.public.sqlserver.server
>NNTP-Posting-Host: 63-252-11-73.ip.mcleodusa.net 63.252.11.73
>Path: TK2MSFTNGXA01.phx.gbl!TK2MSFTNGP01.phx.gbl!TK2MSFTNGP03.phx.gbl
>Xref: TK2MSFTNGXA01.phx.gbl microsoft.public.sqlserver.server:429460
>X-Tomcat-NG: microsoft.public.sqlserver.server
>Forgot to mention this is on a Windows Server 2003 SP1 platform.
>--
>Get a powerful web, database, application, and email hosting with KJM
>Solutions
>http://www.kjmsolutions.com
>
>"vbnetdev" <vbnetdev@.community.nospam> wrote in message
>news:OdGUsVgaGHA.3304@.TK2MSFTNGP04.phx.gbl...
>> This is posted for others who have
>> If this error occurs here is what worked for me. I hope it may work for
>> you but am not sure.
>> In my case I had SQL Express 2005 Installer on my system as well as I
run
>> both (one for development, one for production).
>> Shut down all SQL related services manually.
>> I set registry permissions on
>> Software\Policies\Microsoft\Windows\Installer to include the account
that
>> is doing the install with Full Control. Click Advanced and make sure
>> everything is selected.
>> I downloaded and ran the SQL Install Cleanup and removed the SQL 2005
>> Setup Files Installer and the SQL Express Installer. I then reinstalled
>> the SQL Setup Files (ONLY!!!!!) from my SQL Server 2005 CD. If you don't
>> don't do this you won't be able to use the SP.
>> Q290301 OFFXP: Windows Installer CleanUp Utility
>> http://support.microsoft.com/support/kb/articles/q290/3/01.asp
>> I then reinstalled the SP and it worked.
>> Hope this helps somebody.
>> Special thanks to Peter Yang of Microsoft for his invaluable assistance
on
>> this issue.
>> --
>> Get a powerful web, database, application, and email hosting with KJM
>> Solutions
>> http://www.kjmsolutions.com
>>
>>
>
>

Thursday, February 16, 2012

2005 Licensing file based db's

I noticed it seems like 2k5 supports file based dbs... and I had a few
quick questions...
1) does it require me to have sql2k server installed on machines I
deploy my apps to, to use the file based dbs?
2) what kind of licensing surrounds the file based dbs?
Thanks in advance
Weston Weems
hi Weston,
Weston Weems wrote:
> I noticed it seems like 2k5 supports file based dbs... and I had a few
> quick questions...
> 1) does it require me to have sql2k server installed on machines I
> deploy my apps to, to use the file based dbs?
> 2) what kind of licensing surrounds the file based dbs?
> Thanks in advance
> Weston Weems
if you mean "User Instances" (AKA RANU) please have a look at
http://tinyurl.com/9obd8
Andrea Montanari (Microsoft MVP - SQL Server)
http://www.asql.biz/DbaMgr.shtm http://italy.mvps.org
DbaMgr2k ver 0.15.0 - DbaMgr ver 0.60.0
(my vb6+sql-dmo little try to provide MS MSDE 1.0 and MSDE 2000 a visual
interface)
-- remove DMO to reply

2005 keeps telling me the database is restoring

Hello all,

I have been seeing problems in restoring a database to SQL Server 2005. The source is a .bak file backed up from SQL 2000. Everything in the process went well, and at the end the restore window told me the restore completed successfully. But in Management Studio, it keeps telling me the database is restoring; I cannot access it or open the property page. I have tried the process many times already.

This occurred several times on my Sep CTP platform, so I removed it and installed the released version, but no good.

My environment: win xp sp2, SQL 2005

Any ideas?

When you restore the db, make sure you specify WITH RECOVERY on the final restore. Restoring WITH NORECOVERY keeps the system from rolling back uncommitted transactions so that further backups can be applied, and it leaves the database in the "Restoring..." state you are seeing.
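For reference, a minimal T-SQL sketch of the two options (database and path names are placeholders); WITH NORECOVERY leaves the database waiting for further backups, while WITH RECOVERY brings it online and clears the "Restoring..." state:
-- Restore and bring the database online in one step
RESTORE DATABASE MyDb FROM DISK = 'D:\Backups\MyDb.bak' WITH RECOVERY
-- Or, if the database is already stuck in the "Restoring..." state after a NORECOVERY restore
RESTORE DATABASE MyDb WITH RECOVERY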

2005 install - Can you specify a directory for the data and one for the logs?

Hi,
I am setting up a template.ini file that I am going to use for some
unattended installs of SQL Server 2005. I believe using the
'INSTALLSQLDATADIR' I can get the system databases setup with their
data files and log in the directory I specify. Is there a way to
specify a location for the logs and a location for the data files?|||I don't think so. There is really not much of a need for that either.
Linchi
"Paul T." wrote:
> Hi,
> I am setting up a template.ini file that I am going to use for some
> unattended installs of SQL Server 2005. I believe using the
> 'INSTALLSQLDATADIR' I can get the system databases setup with their
> data files and log in the directory I specify. Is there a way to
> specify a location for the logs and a location for the data files?
>|||Most people split their user databases so that the data is on one
physical drive and the logs are on another, so that I/O is distributed
and so that they do not lose both the log and data file if a drive
goes. Why not do that with the system databases too?|||Paul T. wrote:
> Most people split their user databases so that the log is on one
> physical drive and the logs are on another so that I/O is distributed
> and so that if they do not lose both the log and data file if a drive
> goes, why not do that with the system databases too?
>
Your system tables should have very little I/O against them, and good
backups will protect you from losing them...
Tracy McKibben
MCDBA
http://www.realsqlguy.com
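As a side note, if the goal is simply to get system database files onto separate drives after an unattended install, tempdb (for example) can be relocated afterwards with something like the sketch below; the paths are placeholders and the change only takes effect after the SQL Server service is restarted:
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'D:\SQLData\tempdb.mdf')
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'L:\SQLLogs\templog.ldf')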

Monday, February 13, 2012

2005 Express Edition Capacity

I am looking for an inexpensive (less than $200 in quantity 100 per month)
database that will meet the following requirements:
Max database file size 2 Gb
Max number of records 2 million
Record insertion rate: 10 per second
Queries while inserting: Max time to complete a basic query over the 2
million records: 5 sec
The database must be robust, i.e. no lost data when inserting and querying.
Does it seem that Express Edition of SQL 2005 would meet these requirements,
especially the number of records and timing requirements? The application is
limited to one computer.
"Rahim" <Rahim@.discussions.microsoft.com> wrote in message
news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>I am looking for an inexpensive (less than $200 in quantity 100 per month)
> database that will meet the following requirements:
> Max database file size 2 Gb
> Max number of records 2 million
> Record insertion rate: 10 per second
> Queries while inserting: Max time to complete a basic query over the 2
> million records: 5 sec
> The database must be robust, i.e. no lost data when inserting and
> querying.
> Does it seem that Express Edition of SQL 2005 would meet these
> requirements,
> especially the number of records and timing requirements? The application
> is
> limited to one computer.
|||SQLExpress will have no problems with those rates given a proper design and
appropriate hardware. 2 million rows is pretty small for a SQL DB these
days, and as long as you have a proper index and are not returning enough
rows to force a scan, that should not be a problem. It is free, but you should
look at the licensing to make sure you fit the requirements.
Andrew J. Kelly SQL MVP
"Rahim" <Rahim@.discussions.microsoft.com> wrote in message
news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>I am looking for an inexpensive (less than $200 in quantity 100 per month)
> database that will meet the following requirements:
> Max database file size 2 Gb
> Max number of records 2 million
> Record insertion rate: 10 per second
> Queries while inserting: Max time to complete a basic query over the 2
> million records: 5 sec
> The database must be robust, i.e. no lost data when inserting and
> querying.
> Does it seem that Express Edition of SQL 2005 would meet these
> requirements,
> especially the number of records and timing requirements? The application
> is
> limited to one computer.
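As a minimal illustration of "a proper design and a proper index" for this kind of workload (the table and column names are hypothetical): a narrow clustered key for the steady inserts, plus a nonclustered index on whatever column the basic query filters on.
CREATE TABLE ProductionRecord (
    RecordID   INT IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    SerialNo   VARCHAR(40) NOT NULL,
    RecordedAt DATETIME NOT NULL DEFAULT GETDATE(),
    Detail     VARCHAR(400) NULL
)
CREATE NONCLUSTERED INDEX IX_ProductionRecord_SerialNo ON ProductionRecord (SerialNo)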
|||"Rahim" <Rahim@.discussions.microsoft.com> wrote in message
news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>I am looking for an inexpensive (less than $200 in quantity 100 per month)
> database that will meet the following requirements:
> Max database file size 2 Gb
> Max number of records 2 million
> Record insertion rate: 10 per second
> Queries while inserting: Max time to complete a basic query over the 2
> million records: 5 sec
> The database must be robust, i.e. no lost data when inserting and
> querying.
> Does it seem that Express Edition of SQL 2005 would meet these
> requirements,
> especially the number of records and timing requirements? The application
> is
> limited to one computer.
The capacities you mentioned aren't a problem. The only question is over the
rate of updates and the time to complete your queries. These will be
determined by your processor, storage and network performance rather than by
the database software.
10 rows per second looks a little bit odd next to your 2 million row metric.
If 10 rows per second is an average then apparently your database only
retains about 55 hours worth of data.
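For reference, the arithmetic behind that estimate: 2,000,000 rows / 10 rows per second = 200,000 seconds, which is roughly 55.5 hours.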
In the case of your query performance, that's obviously entirely dependent
on the nature of the query. You'll have to test it out.
Hope this helps.
David Portas
SQL Server MVP
|||"Andrew J. Kelly" wrote:

> SQLExpress will have no problems with those rates give a proper design and
> appropriate hardware. 2 million rows is pretty small for a SQL DB these
> days and as long as you have a proper index and are not returning enough
> rows to force a scan that should no be a problem. It is free but you should
> look at the licensing to make sure you fit the requirements.
> --
> Andrew J. Kelly SQL MVP
>
> "Rahim" <Rahim@.discussions.microsoft.com> wrote in message
> news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>
> Thank you very much for your reply.
|||"David Portas" wrote:

> "Rahim" <Rahim@.discussions.microsoft.com> wrote in message
> news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
> The capacities you mentioned aren't a problem. The only question is over the
> rate of updates and the time to complete your queries. These will be
> determined by your processor, storage and network performance rather than by
> the database software.
> 10 rows per second looks a little bit odd next to your 2 million row metric.
> If 10 rows per second is an average then apparently your database only
> retains about 55 hours worth of data.
> In the case of your query performance, that's obviously entirely dependent
> on the nature of the query. You'll have to test it out.
> Hope this helps.
> --
> David Portas
> SQL Server MVP
> --
>
> Thank you for your reply.
The Database will serve to temporarily retain detailed production
information until the product is shipped, after which the information is
discarded, or in a future incarnation, archived. Thus the apparent 55 hour
capacity.

2005 Express Edition Capacity

I am looking for an inexpensive (less than $200 in quantity 100 per month)
database that will meet the following requirements:
Max database file size 2 Gb
Max number of records 2 million
Record insertion rate: 10 per second
Queries while inserting: Max time to complete a basic query over the 2
million records: 5 sec
The database must be robust, i.e. no lost data when inserting and querying.
Does it seem that Express Edition of SQL 2005 would meet these requirements,
especially the number of records and timing requirements? The application is
limited to one computer."Rahim" <Rahim@.discussions.microsoft.com> wrote in message
news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>I am looking for an inexpensive (less than $200 in quantity 100 per month)
> database that will meet the following requirements:
> Max database file size 2 Gb
> Max number of records 2 million
> Record insertion rate: 10 per second
> Queries while inserting: Max time to complete a basic query over the 2
> million records: 5 sec
> The database must be robust, i.e. no lost data when inserting and
> querying.
> Does it seem that Express Edition of SQL 2005 would meet these
> requirements,
> especially the number of records and timing requirements? The application
> is
> limited to one computer.|||SQLExpress will have no problems with those rates give a proper design and
appropriate hardware. 2 million rows is pretty small for a SQL DB these
days and as long as you have a proper index and are not returning enough
rows to force a scan that should no be a problem. It is free but you should
look at the licensing to make sure you fit the requirements.
Andrew J. Kelly SQL MVP
"Rahim" <Rahim@.discussions.microsoft.com> wrote in message
news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>I am looking for an inexpensive (less than $200 in quantity 100 per month)
> database that will meet the following requirements:
> Max database file size 2 Gb
> Max number of records 2 million
> Record insertion rate: 10 per second
> Queries while inserting: Max time to complete a basic query over the 2
> million records: 5 sec
> The database must be robust, i.e. no lost data when inserting and
> querying.
> Does it seem that Express Edition of SQL 2005 would meet these
> requirements,
> especially the number of records and timing requirements? The application
> is
> limited to one computer.|||"Rahim" <Rahim@.discussions.microsoft.com> wrote in message
news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>I am looking for an inexpensive (less than $200 in quantity 100 per month)
> database that will meet the following requirements:
> Max database file size 2 Gb
> Max number of records 2 million
> Record insertion rate: 10 per second
> Queries while inserting: Max time to complete a basic query over the 2
> million records: 5 sec
> The database must be robust, i.e. no lost data when inserting and
> querying.
> Does it seem that Express Edition of SQL 2005 would meet these
> requirements,
> especially the number of records and timing requirements? The application
> is
> limited to one computer.
The capacities you mentioned aren't a problem. The only question is over the
rate of updates and the time to complete your queries. These will be
determined by your processor, storage and network performance rather than by
the database software.
10 rows per second looks a little bit odd next to your 2 million row metric.
If 10 rows per second is an average then apparently your database only
retains about 55 hours worth of data.
In the case of your query performance, that's obviously entirely dependent
on the nature of the query. You'll have to test it out.
Hope this helps.
David Portas
SQL Server MVP
--|||"Andrew J. Kelly" wrote:

> SQLExpress will have no problems with those rates give a proper design and
> appropriate hardware. 2 million rows is pretty small for a SQL DB these
> days and as long as you have a proper index and are not returning enough
> rows to force a scan that should no be a problem. It is free but you shou
ld
> look at the licensing to make sure you fit the requirements.
> --
> Andrew J. Kelly SQL MVP
>
> "Rahim" <Rahim@.discussions.microsoft.com> wrote in message
> news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>
> Thank you very much for your reply.|||"David Portas" wrote:

> "Rahim" <Rahim@.discussions.microsoft.com> wrote in message
> news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
> The capacities you mentioned aren't a problem. The only question is over t
he
> rate of updates and the time to complete your queries. These will be
> determined by your processor, storage and network performance rather than
by
> the database software.
> 10 rows per second looks a little bit odd next to your 2 million row metri
c.
> If 10 rows per second is an average then apparently your database only
> retains about 55 hours worth of data.
> In the case of your query performance, that's obviously entirely dependent
> on the nature of the query. You'll have to test it out.
> Hope this helps.
> --
> David Portas
> SQL Server MVP
> --
>
> Thank you for your reply.
The Database will serve to temporarily retain detailed production
information until the product is shipped, after which the information is
discarded, or in a future incarnation, archived. Thus the apparent 55 hour
capacity.

2005 Express Edition Capacity

I am looking for an inexpensive (less than $200 in quantity 100 per month)
database that will meet the following requirements:
Max database file size 2 Gb
Max number of records 2 million
Record insertion rate: 10 per second
Queries while inserting: Max time to complete a basic query over the 2
million records: 5 sec
The database must be robust, i.e. no lost data when inserting and querying.
Does it seem that Express Edition of SQL 2005 would meet these requirements,
especially the number of records and timing requirements? The application is
limited to one computer.
SQLExpress will have no problems with those rates, given a proper design and
appropriate hardware. 2 million rows is pretty small for a SQL DB these
days, and as long as you have a proper index and are not returning enough
rows to force a scan, that should not be a problem. It is free, but you should
look at the licensing to make sure you fit the requirements.
--
Andrew J. Kelly SQL MVP
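As a rough sketch of the "proper index" point above, something like the following T-SQL might fit; every table, column and index name here is invented for illustration and is not taken from the original question:
-- Hypothetical table for the production records described in the question
CREATE TABLE dbo.ProductionRecord
(
    RecordID      int IDENTITY(1,1) NOT NULL,
    ProductSerial varchar(50)       NOT NULL,
    RecordedAt    datetime          NOT NULL DEFAULT (GETDATE()),
    Detail        varchar(500)      NULL,
    CONSTRAINT PK_ProductionRecord PRIMARY KEY CLUSTERED (RecordID)
);
-- Supporting index so a basic lookup does not have to scan 2 million rows
CREATE NONCLUSTERED INDEX IX_ProductionRecord_ProductSerial
    ON dbo.ProductionRecord (ProductSerial);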
"Rahim" <Rahim@.discussions.microsoft.com> wrote in message
news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>I am looking for an inexpensive (less than $200 in quantity 100 per month)
> database that will meet the following requirements:
> Max database file size 2 Gb
> Max number of records 2 million
> Record insertion rate: 10 per second
> Queries while inserting: Max time to complete a basic query over the 2
> million records: 5 sec
> The database must be robust, i.e. no lost data when inserting and
> querying.
> Does it seem that Express Edition of SQL 2005 would meet these
> requirements,
> especially the number of records and timing requirements? The application
> is
> limited to one computer.|||"Rahim" <Rahim@.discussions.microsoft.com> wrote in message
news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
>I am looking for an inexpensive (less than $200 in quantity 100 per month)
> database that will meet the following requirements:
> Max database file size 2 Gb
> Max number of records 2 million
> Record insertion rate: 10 per second
> Queries while inserting: Max time to complete a basic query over the 2
> million records: 5 sec
> The database must be robust, i.e. no lost data when inserting and
> querying.
> Does it seem that Express Edition of SQL 2005 would meet these
> requirements,
> especially the number of records and timing requirements? The application
> is
> limited to one computer.
The capacities you mentioned aren't a problem. The only question is over the
rate of updates and the time to complete your queries. These will be
determined by your processor, storage and network performance rather than by
the database software.
10 rows per second looks a little bit odd next to your 2 million row metric.
If 10 rows per second is an average, then apparently your database only
retains about 55 hours' worth of data.
In the case of your query performance, that's obviously entirely dependent
on the nature of the query. You'll have to test it out.
Hope this helps.
--
David Portas
SQL Server MVP
--|||"Andrew J. Kelly" wrote:
> SQLExpress will have no problems with those rates give a proper design and
> appropriate hardware. 2 million rows is pretty small for a SQL DB these
> days and as long as you have a proper index and are not returning enough
> rows to force a scan that should no be a problem. It is free but you should
> look at the licensing to make sure you fit the requirements.
> --
> Andrew J. Kelly SQL MVP
>
> "Rahim" <Rahim@.discussions.microsoft.com> wrote in message
> news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
> >I am looking for an inexpensive (less than $200 in quantity 100 per month)
> > database that will meet the following requirements:
> > Max database file size 2 Gb
> > Max number of records 2 million
> > Record insertion rate: 10 per second
> > Queries while inserting: Max time to complete a basic query over the 2
> > million records: 5 sec
> > The database must be robust, i.e. no lost data when inserting and
> > querying.
> > Does it seem that Express Edition of SQL 2005 would meet these
> > requirements,
> > especially the number of records and timing requirements? The application
> > is
> > limited to one computer.
>
Thank you very much for your reply.|||"David Portas" wrote:
> "Rahim" <Rahim@.discussions.microsoft.com> wrote in message
> news:5C2C04DA-2CF3-4F85-983A-80F42ABD2CC1@.microsoft.com...
> >I am looking for an inexpensive (less than $200 in quantity 100 per month)
> > database that will meet the following requirements:
> > Max database file size 2 Gb
> > Max number of records 2 million
> > Record insertion rate: 10 per second
> > Queries while inserting: Max time to complete a basic query over the 2
> > million records: 5 sec
> > The database must be robust, i.e. no lost data when inserting and
> > querying.
> > Does it seem that Express Edition of SQL 2005 would meet these
> > requirements,
> > especially the number of records and timing requirements? The application
> > is
> > limited to one computer.
> The capacities you mentioned aren't a problem. The only question is over the
> rate of updates and the time to complete your queries. These will be
> determined by your processor, storage and network performance rather than by
> the database software.
> 10 rows per second looks a little bit odd next to your 2 million row metric.
> If 10 rows per second is an average then apparently your database only
> retains about 55 hours worth of data.
> In the case of your query performance, that's obviously entirely dependent
> on the nature of the query. You'll have to test it out.
> Hope this helps.
> --
> David Portas
> SQL Server MVP
> --
>
Thank you for your reply.
The Database will serve to temporarily retain detailed production
information until the product is shipped, after which the information is
discarded, or in a future incarnation, archived. Thus the apparent 55 hour
capacity.
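To act on the advice above to test the query yourself, one simple check is SET STATISTICS TIME. This is only a sketch: it assumes the hypothetical ProductionRecord table shown earlier and a made-up serial number value.
SET STATISTICS TIME ON;
SELECT RecordID, ProductSerial, RecordedAt, Detail
FROM dbo.ProductionRecord
WHERE ProductSerial = 'ABC-12345';   -- made-up value
SET STATISTICS TIME OFF;
-- The "SQL Server Execution Times" message reports elapsed time, which can be
-- compared against the 5 second requirement while the 10 inserts per second are running.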

Sunday, February 12, 2012

2005 Directory Tree

I am looking to maintain a directory tree table in SQL Server 2005. The table
will contain details of file and folder paths, titles, size and a few other
additional fields not related to the file system.
Before I begin and start writing code to maintain the table, is there
anything in 2005 which could do most of the work for me? For example, could the
file system be watched for changes and the table updated automatically, i.e.
when files or folders are added, changed or deleted?
I keep hearing about all the fantastic new features in 2005; I would be
really impressed if it means I don't need to do any work for this.
Any help would be greatly appreciated.
David
Hi
There are no new functions in SQL 2k5 that can do this. Unfortunately SQL
Server is not built for file handling. I guess that's the only functionality
that Microsoft has missed out on :)
--
"Davie" wrote:

> I am look to maintain a directory tree table in SQL Server 2005. The table
> will contain details of file and folder paths, titles, size and a few other
> additional fields not related to the file system.
> Before I begin, and start writing code to maintain the table is there
> anything in 2005 which could do most of the work for me? For example, the
> file system be watched for changes and update the table automatically, i.e.
> files or folders added, changed or deleted.
> I keep hearing about all the fantastic new features in 2005, I would be
> really impressed if it means I don't need to do any work for this.
> Any help would be greatly appreciated.
> David
>
I am sure I read a while ago, I think it was prior to release, that Notification
Services can have a file system watch event?
"Omnibuzz" <Omnibuzz@.discussions.microsoft.com> wrote in message
news:7EDC42CB-A6C5-4B25-960F-9065284E6C8E@.microsoft.com...
> Hi
> There are no new functions in SQL 2k5 that can do this. Unfortunately
> SQL
> Server is not build for file handling. I guess thats the only
> functionality
> that Microsoft has missed out from that :)
> --
>
>
> "Davie" wrote:
I am not sure about that. Sorry about the wrong info, if such an event does exist.
--
"Davie" wrote:

> I am sure I read a while ago, think it was prior to release, Notification
> Services can have a file system watch event?
>
> "Omnibuzz" <Omnibuzz@.discussions.microsoft.com> wrote in message
> news:7EDC42CB-A6C5-4B25-960F-9065284E6C8E@.microsoft.com...
>
>
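Since nothing in SQL Server 2005 watches the file system for you, the table itself still has to be designed and maintained by hand, for example by application code that monitors the file system and issues INSERT/UPDATE/DELETE statements. A minimal sketch of one possible directory tree table follows; every name and column choice below is an assumption for illustration, not something from the original post.
-- Hypothetical directory tree table with a self-referencing parent link
CREATE TABLE dbo.FileSystemEntry
(
    EntryID    int IDENTITY(1,1) NOT NULL,
    ParentID   int               NULL,      -- NULL for a root folder
    IsFolder   bit               NOT NULL,
    FullPath   nvarchar(1024)    NOT NULL,
    Title      nvarchar(255)     NOT NULL,
    SizeBytes  bigint            NULL,      -- NULL for folders
    LastSeenAt datetime          NOT NULL DEFAULT (GETDATE()),
    CONSTRAINT PK_FileSystemEntry PRIMARY KEY CLUSTERED (EntryID),
    CONSTRAINT FK_FileSystemEntry_Parent
        FOREIGN KEY (ParentID) REFERENCES dbo.FileSystemEntry (EntryID)
);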

Thursday, February 9, 2012

2005 - Need to Shrink Log File

Is there a way to shrink the log file below its Initial Size?
Somehow the initial size for a log file was accidentally set to 25 Gig. This
extra large log file is keeping the backups from working (no disk space).
I've tried Shrinking the log file, but I'm not allowed to shrink it below
its Initial Size.
I've tried changing its Initial Size through its properties, but that
doesn't seem to work either. No message, no indication that it failed.
Is there some way to perhaps replicate the database to another database with
a smaller initial log file setting and then copy it back or something?
Is there some way that I can get the Initial Size back down to a more
manageable size?
Thanks for any help you can offer.
Terry
Did you try DBCC SHRINKFILE? That command should be able to get it below initial size. Also, see
http://www.karaszi.com/SQLServer/info_dont_shrink.asp for general tips when shrink doesn't seem to
work.
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://sqlblog.com/blogs/tibor_karaszi
"Terry Carnes" <wbcarnes3@.yahoo.com> wrote in message news:uKxilsqWHHA.4624@.TK2MSFTNGP03.phx.gbl...
> Is there a way to shrink the log file below it's Initial Size?
> Somehow the initial size for a log file was accidentally set to 25 Gig. This extra large log file
> is keeping the backups from working (no disk space).
> I've tried Shrinking the log file, but I'm not allowed to shrink it below it's Intial Size.
> I've tried changing it's Initial Size through it's properties, but that doesn't seem to work
> either. No message, no indication that it failed.
> Is there some way to perhaps replicate the database to another database with a smaller initial log
> file setting and then copy it back or something?
> Is there some way that I can get the Initial Size back down to a more manageable size?
> Thanks for any help you can offer.
> Terry
>|||Thank you, Tibor, for your quick reply.
DBCC SHRINKFILE didn't work. I received the message: "Cannot shrink log file
2 (PAIR_log) because all logical log files are in use."
When I use SSMS to select the Shrink Files option, I can see that the log
file is 99% empty.
I will need to read over your suggested website more carefully to see if
that will help.
Terry
"Tibor Karaszi" <tibor_please.no.email_karaszi@.hotmail.nomail.com> wrote in
message news:%233MNF0qWHHA.1180@.TK2MSFTNGP05.phx.gbl...
> Did you try DBCC SHRINKFILE? That command should be able to get it below
> initial size. Also, see
> http://www.karaszi.com/SQLServer/info_dont_shrink.asp for general tips
> when shrink don't seem to work.
> --
> Tibor Karaszi, SQL Server MVP
> http://www.karaszi.com/sqlserver/default.asp
> http://sqlblog.com/blogs/tibor_karaszi|||Thanks again, Tibor.
Following your suggestions on your website:
I only had to do the Backup Log once and that freed the last page that had
the status of 2. Now everything is back to being a much more manageable
size!
Thanks again!
Terry
> "Tibor Karaszi" <tibor_please.no.email_karaszi@.hotmail.nomail.com> wrote
> in message news:%233MNF0qWHHA.1180@.TK2MSFTNGP05.phx.gbl...
>> Did you try DBCC SHRINKFILE? That command should be able to get it below
>> initial size. Also, see
>> http://www.karaszi.com/SQLServer/info_dont_shrink.asp for general tips
>> when shrink don't seem to work.
>> --
>> Tibor Karaszi, SQL Server MVP
>> http://www.karaszi.com/sqlserver/default.asp
>> http://sqlblog.com/blogs/tibor_karaszi
>
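For reference, the sequence Terry ended up with can be sketched roughly as follows. The database name (PAIR, guessed only from the PAIR_log file name), the backup path and the 500 MB target size are assumptions for illustration:
USE PAIR;
GO
DBCC SQLPERF(LOGSPACE);   -- confirm the log really is ~99% empty
DBCC LOGINFO;             -- a Status = 2 row at the end of the file blocks the shrink
BACKUP LOG PAIR TO DISK = N'D:\Backups\PAIR_log.trn';   -- frees that last active portion
DBCC SHRINKFILE (N'PAIR_log', 500);   -- now the file can go below its 25 GB initial size
DBCC SQLPERF(LOGSPACE);   -- verify the new size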