Saturday, February 25, 2012
Managing Disk Space
Is there a tool or trick/tip for managing disk space on SQL Server
(database/tran log size vs. free disk space)?
Thanks.
No tricks really. You have to choose the best recovery model for your
database, and an appropriate backup plan, to keep the transaction log files
in check. Also, defragmenting your tables will help avoid wasted disk
space. Please read up on recovery models, BACKUP/RESTORE, DBCC DBREINDEX,
and DBCC INDEXDEFRAG.
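To make those suggestions concrete, here is a minimal T-SQL sketch of the commands named above. The database, backup path, and table names (MyDB, dbo.Orders) are made up for illustration, and the syntax is the SQL Server 2000-era form:

```sql
-- Pick a recovery model deliberately: SIMPLE truncates the log at
-- checkpoints; FULL keeps everything until you back the log up.
ALTER DATABASE MyDB SET RECOVERY FULL

-- Under FULL recovery, scheduled log backups are what keep the
-- transaction log file from growing without bound.
BACKUP LOG MyDB TO DISK = 'D:\Backups\MyDB_log.bak'

-- Rebuild all indexes on a table (offline, but thorough)...
DBCC DBREINDEX ('MyDB.dbo.Orders')

-- ...or defragment one index in place (database, table, index ID).
DBCC INDEXDEFRAG (MyDB, 'dbo.Orders', 1)
```

Putting the log backup in a SQL Agent job, rather than running it by hand, is what actually keeps the file size stable over time.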
Vyas, MVP (SQL Server)
http://vyaskn.tripod.com/
"SQL" <nospam@.adfadfadf.com> wrote in message
news:eWiPESrYEHA.3512@.TK2MSFTNGP12.phx.gbl...
> Is there a tool or trick/tip for managing disk space on SQL Server
> (database/tran log size vs. free disk space)?
> Thanks.
>
Hi,
For managing growth you can use Performance Monitor. You can create alerts
in Performance Monitor that raise a network message or write to the
application log when the threshold set for each counter is crossed.
You can also create your own alerts using SQL Server Agent from SQL
Enterprise Manager.
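As a sketch of that approach, the alert below uses msdb's sp_add_alert with a performance condition; the alert name, database name (MyDB), and operator name are hypothetical and would need to exist in your environment:

```sql
-- Fire when MyDB's transaction log passes 80 percent full.
-- The condition string is 'object|counter|instance|comparator|value'.
EXEC msdb.dbo.sp_add_alert
    @name = N'MyDB log nearly full',
    @enabled = 1,
    @performance_condition = N'SQLServer:Databases|Percent Log Used|MyDB|>|80'

-- Notify an existing operator by e-mail when the alert fires.
EXEC msdb.dbo.sp_add_notification
    @alert_name = N'MyDB log nearly full',
    @operator_name = N'DBA on call',
    @notification_method = 1
```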
To monitor disk space, see the link below:
http://www.databasejournal.com/features/mssql/article.php/1475741
Apart from this, you can use a third-party tool:
http://www.bmcpatrol.com
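Before reaching for a third-party tool, a few built-in commands give a quick read on log and disk usage; this is a sketch you can run as-is from Query Analyzer:

```sql
-- Log size and percent used for every database on the server.
DBCC SQLPERF (LOGSPACE)

-- Free megabytes on each fixed drive, as seen by the SQL Server service.
EXEC master.dbo.xp_fixeddrives

-- Allocated vs. unallocated space inside the current database.
EXEC sp_spaceused
```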
Thanks
Hari
MCDBA
"SQL" <nospam@.adfadfadf.com> wrote in message
news:eWiPESrYEHA.3512@.TK2MSFTNGP12.phx.gbl...
> Is there a tool or trick/tip for managing disk space on SQL Server
> (database/tran log size vs. free disk space)?
> Thanks.
>
Managing dev, test, and production environments
I would like to find some best practice documentation on how to manage a
development, test, and production configuration. For example, I have an SRS
development server that runs on a development SQL Server. Developers write
the SRS reports. Then it comes time to deploy the reports to user testing.
I have an SRS test server that may run on a test SQL server. When it comes
time to deploy the reports into production, I have an SRS production server.
I want to find out how best to accomplish this.
Thanks.
That sounds like a good plan. The next step would be to identify the actual
way you push the RDL files around. Depending on whether you are using forms
auth or Windows auth -- as well as how you handle configuration management
in your company -- the best solution may vary.
--
Cheers,
'(' Jeff A. Stucker
\
Business Intelligence
www.criadvantage.com
---
"Lee Forst" <Lee Forst@.discussions.microsoft.com> wrote in message
news:7830DAD0-25C8-4133-B5DE-DD5F348F8886@.microsoft.com...
>I would like to find some best practice documentation on how to manage a
> development, test, and production configuration. For example, I have an
> SRS
> development server that runs on a development SQL Server. Developers
> write
> the SRS reports. Then it comes time to deploy the reports to user
> testing.
> I have a SRS test server that may run on a test SQL server. When it comes
> time to deploy the reports into production, I have an SRS production
> server.
> I want to find out how to best accomplish this.
> Thanks
You hit the nail on the head. How do you push the RDLs around? The only
thought I have on this now is that the developer or the configuration
management person would have to open the report in VS.NET and publish it to
a different SRS server. Sounds messy. Maybe a better approach would be a
function in Report Manager to push RDLs around. But then, I have to wonder
about versioning of the RDL.
Surely someone has thought about this. Does everyone on the planet code and
test on a production server?
"Jeff A. Stucker" wrote:
> That sounds like a good plan. The next step would be to identify the actual
> way you push the RDL files around. Depending on whether you are using forms
> auth or windows auth -- as well as how you handle configuration management
> in your company -- the best solution may vary.
> --
> Cheers,
> '(' Jeff A. Stucker
> \
> Business Intelligence
> www.criadvantage.com
> ---
You can create a solution using rs.exe, that is, scripting the deployment.
Take a look here (default installation location):
C:\Program Files\Microsoft SQL Server\MSSQL\Reporting
Services\Samples\Scripts
--
Cheers,
'(' Jeff A. Stucker
\
Business Intelligence
www.criadvantage.com
---
"Lee Forst" <LeeForst@.discussions.microsoft.com> wrote in message
news:A944148D-6829-471D-ADA7-61C69AEB5C00@.microsoft.com...
> You hit the nail on the head. How do you push the RDLs around? The only
> thought I have on this now is the developer or the configuration
> management
> person would have to open the report in VS.NET and publish it to a
> different
> SRS server. Sounds messy. Maybe a better approach would be for a
> function
> in Report Manager to push RDLs around. But then, I have to wonder about
> versions of the RDL.
> Surely someone has thought about this. Does everyone on the planet code
> and
> test on a production server?
> "Jeff A. Stucker" wrote:
>> That sounds like a good plan. The next step would be to identify the
>> actual
>> way you push the RDL files around. Depending on whether you are using
>> forms
>> auth or windows auth -- as well as how you handle configuration
>> management
>> in your company -- the best solution may vary.
>> --
>> Cheers,
>> '(' Jeff A. Stucker
>> \
>> Business Intelligence
>> www.criadvantage.com
>> ---
>> "Lee Forst" <Lee Forst@.discussions.microsoft.com> wrote in message
>> news:7830DAD0-25C8-4133-B5DE-DD5F348F8886@.microsoft.com...
>> >I would like to find some best practice documentation on how to manage a
>> > development, test, and production configuration. For example, I have
>> > an
>> > SRS
>> > development server that runs on a development SQL Server. Developers
>> > write
>> > the SRS reports. Then it comes time to deploy the reports to user
>> > testing.
>> > I have a SRS test server that may run on a test SQL server. When it
>> > comes
>> > time to deploy the reports into production, I have an SRS production
>> > server.
>> > I want to find out how to best accomplish this.
>> >
>> > Thanks
>> >
This was just posted today. You might want to take a look at it.
>>>>>>>>>>>>>
"Jasper Smith" <jasper_smith9@.hotmail.com> wrote in message
news:<#zNZyc0HFHA.2976@.TK2MSFTNGP15.phx.gbl>...
Download Reporting Services Scripter
http://www.sqldbatips.com/showarticle.asp?ID=62
Reporting Services Scripter is a .NET Windows Forms application that enables
scripting of all Microsoft SQL Server Reporting Services catalog items to
aid in transferring them from one server to another. It can also be used to
easily move items en masse from one Reporting Services folder to another on
the same server. Depending on the scripting options chosen, Reporting
Services Scripter can also transfer all catalog item properties such as
Descriptions, History options, Execution options (including report-specific
and shared schedules) and server-side report parameters.
HTH
Jasper Smith (SQL Server MVP)
http://www.sqldbatips.com
I support PASS - the definitive, global
community for SQL Server professionals -
http://www.sqlpass.org
>>>>>>>>>>>>>>>>
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Lee Forst" <LeeForst@.discussions.microsoft.com> wrote in message
news:A944148D-6829-471D-ADA7-61C69AEB5C00@.microsoft.com...
> You hit the nail on the head. How do you push the RDLs around? The only
> thought I have on this now is the developer or the configuration management
> person would have to open the report in VS.NET and publish it to a different
> SRS server. Sounds messy. Maybe a better approach would be for a function
> in Report Manager to push RDLs around. But then, I have to wonder about
> versions of the RDL.
> Surely someone has thought about this. Does everyone on the planet code and
> test on a production server?
Very good Bruce and Jeff. This is what I needed to know. We have yet to
move to SRS, but before we do, I want some questions like this answered. I
assume you have used this tool and it works as advertised?
"Bruce L-C [MVP]" wrote:
> This was just posted today. You might want to take a look at it.
> >>>>>>>>>>>>>
> "Jasper Smith" <jasper_smith9@.hotmail.com> wrote in message
> news:<#zNZyc0HFHA.2976@.TK2MSFTNGP15.phx.gbl>...
> Download Reporting Services Scripter
> http://www.sqldbatips.com/showarticle.asp?ID=62
>
> Reporting Services Scripter is a .NET Windows Forms application that enables
> scripting of all Microsoft SQL Server Reporting Services catalog items to
> aid in transferring them from one server to another. It can also be used to
> easily move items on mass from one Reporting Services folder to another on
> the same server. Depending on the scripting options chosen, Reporting
> Services Scripter can also transfer all catalog item properties such as
> Descriptions, History options, Execution options (including report specific
> and shared schedules) and server side report parameters.
>
> --
> HTH
>
> Jasper Smith (SQL Server MVP)
> http://www.sqldbatips.com
> I support PASS - the definitive, global
> community for SQL Server professionals -
> http://www.sqlpass.org
> >>>>>>>>>>>>>>>>
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
I have not used it. It is a tool developed by a SQL Server MVP. He just
finished it, so I don't know what it is like. Remember, you can create your
own scripts too, so you have lots of options. I would kick the tires on it
some before I depend on it.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Lee Forst" <LeeForst@.discussions.microsoft.com> wrote in message
news:E2BB9DD3-011C-4648-B9E1-857950FC7227@.microsoft.com...
> Very good Bruce and Jeff. This is what I needed to know. We have yet to
> move to SRS, but before we do, I want some questions like this answered. I
> assume you have used this tool and it works as advertised?
Yep. This brings up more questions though. Once you have the script(s), how
do you run them on the new server? And which server do you run them on, IIS
or SQL? I would think it would be SQL, and it would be putting data into
the ReportingDB?
"Bruce L-C [MVP]" wrote:
> I have not used it. It is a tool development by a SQL Server MVP. He just
> finished it so I don't know what it is like. Remember you can create your
> own scripts to so you have lots of options. I would kick the tires on it
> some before I depend on it.
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
No, the scripting uses the published Web service API. You should never go
directly against the RS database. That is unsupported and dangerous. The
Web service API should be used. Here is info from BOL about scripting:
____________________________________________
Script for Publishing Sample Reports
This sample script uses a script file (.rss) and the script environment to
run Web service operations on a specified report server. The script creates
a folder that you specify as a command-prompt variable using the -v switch,
and then publishes the sample reports that ship with Reporting Services to a
report server.
You can use script files with the Reporting Services Simple Object Access
Protocol (SOAP) API to access most of the management and administration
operations available on your report server.
Requirements
Before you run this sample, the following conditions must be met:
a. The sample reports and the script file must first be installed to your
hard drive using Reporting Services Setup or the separate sample installer.
b. You must have permission to run the rs utility from the computer on
which a report server instance is installed.
c. You must have access to the report server that you are accessing with
your script.
d. You must have Create Folder and Create Report permissions on the root
folder of the report server you are accessing. For more information about
permissions and user roles, see Reporting Services Security Model.
_________________________________________________________
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Lee Forst" <LeeForst@.discussions.microsoft.com> wrote in message
news:9C9FAADA-9D26-4E6A-8061-EEF654F68E75@.microsoft.com...
> Yep. This brings up more questions though. Once you have the script(s), how
> do you run them on the new server? And which server do you run them on, IIS
> or SQL? I would think it would be SQL and it would be putting data into the
> ReportingDB?
> >|||I see. Thank you again for your help Bruce
"Bruce L-C [MVP]" wrote:
> No, the scripting is using the published web services api. You should never
> go directly against RS database. This is unsupported and dangerous. The web
> services api should be used. Here is info from BOL about scripting:
> ____________________________________________
> Script for Publishing Sample Reports
> This sample script uses a script file (.rss) and the script environment to
> run Web service operations on a specified report server. The script creates
> a folder that you specify as a command-prompt variable using the -v switch,
> and then publishes the sample reports that ship with Reporting Services to a
> report server.
> You can use script files with the Reporting Services Simple Object Access
> Protocol (SOAP) API to access most of the management and administration
> operations available on your report server.
> Requirements
> Before you run this sample, the following conditions must be met:
> a.. The sample reports and the script file must first be installed to your
> hard drive using Reporting Services Setup or the separate sample installer.
> b.. You must have permission to run the rs utility from the computer on
> which a report server instance is installed.
> c.. You must have access to the report server that you are accessing with
> your script.
> d.. You must have Create Folder and Create Report permissions on the root
> folder of the report server you are accessing. For more information about
> permissions and user roles, see Reporting Services Security Model.
> _________________________________________________________
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> "Lee Forst" <LeeForst@.discussions.microsoft.com> wrote in message
> news:9C9FAADA-9D26-4E6A-8061-EEF654F68E75@.microsoft.com...
> > Yep. This brings up more questions though. Once you have the script(s),
> how
> > do you run them on the new server? And which server do you run them on,
> IIS
> > or SQL? I would think it would be SQL and it would be putting data into
> the
> > ReportingDB?
> >
> > "Bruce L-C [MVP]" wrote:
> >
> > > I have not used it. It is a tool developed by a SQL Server MVP. He
> just
> > > finished it so I don't know what it is like. Remember you can create
> your
> > > own scripts too, so you have lots of options. I would kick the tires on it
> > > some before I depend on it.
> > >
> > >
> > > --
> > > Bruce Loehle-Conger
> > > MVP SQL Server Reporting Services
> > >
> > > "Lee Forst" <LeeForst@.discussions.microsoft.com> wrote in message
> > > news:E2BB9DD3-011C-4648-B9E1-857950FC7227@.microsoft.com...
> > > > Very good Bruce and Jeff. This is what I needed to know. We have yet
> to
> > > > move to SRS, but before we do, I want some questions like this
> answered.
> > > I
> > > > assume you have used this tool and it works as advertised?
> > > >
> > > > "Bruce L-C [MVP]" wrote:
> > > >
> > > > > This was just posted today. You might want to take a look at it.
> > > > > >>>>>>>>>>>>>
> > > > > "Jasper Smith" <jasper_smith9@.hotmail.com> wrote in message
> > > > > news:<#zNZyc0HFHA.2976@.TK2MSFTNGP15.phx.gbl>...
> > > > >
> > > > > Download Reporting Services Scripter
> > > > >
> > > > > http://www.sqldbatips.com/showarticle.asp?ID=62
> > > > >
> > > > >
> > > > > Reporting Services Scripter is a .NET Windows Forms application that
> > > enables
> > > > > scripting of all Microsoft SQL Server Reporting Services catalog
> items
> > > to
> > > > > aid in transferring them from one server to another. It can also be
> used
> > > to
> > > > > easily move items en masse from one Reporting Services folder to
> another
> > > on
> > > > > the same server. Depending on the scripting options chosen,
> Reporting
> > > > > Services Scripter can also transfer all catalog item properties such
> as
> > > > > Descriptions, History options, Execution options (including report
> > > specific
> > > > > and shared schedules) and server side report parameters.
> > > > >
> > > > >
> > > > > --
> > > > >
> > > > > HTH
> > > > >
> > > > >
> > > > > Jasper Smith (SQL Server MVP)
> > > > >
> > > > > http://www.sqldbatips.com
> > > > >
> > > > > I support PASS - the definitive, global
> > > > >
> > > > > community for SQL Server professionals -
> > > > >
> > > > > http://www.sqlpass.org
> > > > >
> > > > > >>>>>>>>>>>>>>>>
> > > > >
> > > > >
> > > > > --
> > > > > Bruce Loehle-Conger
> > > > > MVP SQL Server Reporting Services
> > > > >
> > > > >
> > > > > "Lee Forst" <LeeForst@.discussions.microsoft.com> wrote in message
> > > > > news:A944148D-6829-471D-ADA7-61C69AEB5C00@.microsoft.com...
> > > > > > You hit the nail on the head. How do you push the RDLs around?
> The
> > > only
> > > > > > thought I have on this now is the developer or the configuration
> > > > > management
> > > > > > person would have to open the report in VS.NET and publish it to a
> > > > > different
> > > > > > SRS server. Sounds messy. Maybe a better approach would be for a
> > > > > function
> > > > > > in Report Manager to push RDLs around. But then, I have to wonder
> > > about
> > > > > > versions of the RDL.
> > > > > >
> > > > > > Surely someone has thought about this. Does everyone on the
> planet
> > > code
> > > > > and
> > > > > > test on a production server?
> > > > > >
> > > > > > "Jeff A. Stucker" wrote:
> > > > > >
> > > > > > > That sounds like a good plan. The next step would be to
> identify
> > > the
> > > > > actual
> > > > > > > way you push the RDL files around. Depending on whether you are
> > > using
> > > > > forms
> > > > > > > auth or windows auth -- as well as how you handle configuration
> > > > > management
> > > > > > > in your company -- the best solution may vary.
> > > > > > >
> > > > > > > --
> > > > > > > Cheers,
> > > > > > >
> > > > > > > '(' Jeff A. Stucker
> > > > > > > \
> > > > > > >
> > > > > > > Business Intelligence
> > > > > > > www.criadvantage.com
> > > > > > > ---
> > > > > > > "Lee Forst" <Lee Forst@.discussions.microsoft.com> wrote in
> message
> > > > > > > news:7830DAD0-25C8-4133-B5DE-DD5F348F8886@.microsoft.com...
> > > > > > > >I would like to find some best practice documentation on how to
> > > manage
> > > > > a
> > > > > > > > development, test, and production configuration. For example,
> I
> > > have
> > > > > an
> > > > > > > > SRS
> > > > > > > > development server that runs on a development SQL Server.
> > > Developers
> > > > > > > > write
> > > > > > > > the SRS reports. Then it comes time to deploy the reports to
> user
> > > > > > > > testing.
> > > > > > > > I have a SRS test server that may run on a test SQL server.
> When
> > > it
> > > > > comes
> > > > > > > > time to deploy the reports into production, I have an SRS
> > > production
> > > > > > > > server.
> > > > > > > > I want to find out how to best accomplish this.
> > > > > > > >
> > > > > > > > Thanks
> > > > > > > >
> > > > > > >
> > > > > > >
> > > > > > >
> > > > >
> > > > >
> > > > >
> > >
> > >
> > >
>
>
Managing DB updates to client DB servers
I'm looking for ideas on how to write SQL scripts for updates that are
pushed out to clients for product updates. Obviously, we could just
keep track of the changes on a pad, or write a database that requires
us to input those changes, and eventually hand-write the update
scripts. I was wondering if anybody has any solutions that may help
automate this process.
Is there a way to write an application that will compare a current
(updated) database structure against the last release and give
us the fields that need to be changed?
As far as creating the scripts for the initial install, that's easy. We
can do that right from SQL Enterprise Manager.
Call me lazy! Any ideas?
Thanks|||Duncan (duncan.loxton@.gmail.com) writes:
> I'm looking for ideas on how to write SQL scripts for updates that are
> pushed out to clients for product updates. Obviously, We could just
> keep track of the changes on a pad or write a database that requires
> us to input those changes and eventually hand write the update
> scripts. I was wondering if anybody has any solutions that may help
> automate this process.
> Is there a way to write an application that will compare a current
> (updated) database structure against the last release that will give
> us the fields that need to be changed?
> As far as creating the scripts for the initial install, that's easy. We
> can do that right from the SQL Enterprise Manager.
There are a couple of products on the market. SQLCompare from Red Gate does
indeed compare two databases. DBGhost likes to tout itself as being
good for this. I have not used any of them.
Whatever the method, you should keep all your code under version control,
and all your update scripts should have their foundation in the
version-control system. Basically, a shipment is all changes between
the label for the previous shipment and this one, with some files added
in full, like triggers or indexes for changed tables.
In fact, once you have a good version control system up and running,
composing your scripts manually is not a daunting task - but admittedly
it becomes boring after a while. The flip side is that you learn to
understand your process.
In fact, I started in our shop with something like this many years
ago. This has now evolved to a versatile toolset that we use. It is
available as freeware for anyone who wants to try it; see
http://www.abaris.se/abaperls/index.html.
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp
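An upgrade script of the version-controlled kind discussed in this thread is usually written so it can be re-run safely against any older schema. A minimal sketch, with hypothetical table and column names:

```sql
-- Add a column only if an earlier run of this script has not already added it.
IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
               WHERE TABLE_NAME = 'Customers' AND COLUMN_NAME = 'Email')
BEGIN
    ALTER TABLE dbo.Customers ADD Email varchar(128) NULL
END
```

Guarding each change this way lets the same script bring different customer schemas forward without failing on steps that were already applied.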
Managing DATE ?
I have a page that shows some problem indexes (cards ? I don't know the word :/ ) and I want to show only those that are from the current day.
I'm using SQL server 2000. I have a date field inside my table (datetime type). So I tried to put another condition in my WHERE clause. This is:
WHERE something = something else AND mydate = DATEPART('dd', getdate())
or
WHERE something = something else AND mydate = DAY(getdate())
Both don't work.
I wonder if I can really use this in a WHERE clause... or if I'm using them correctly.|||try:
WHERE mydate BETWEEN CONVERT(datetime,CONVERT(nvarchar(20),GetDate(),101)) AND
CONVERT(datetime,CONVERT(nvarchar(20),DATEADD(day,1,GetDate()),101))
This will get any date/time between midnight last night and midnight tonight.|||That's it, thanks a lot.
But if it is not too much, can I ask why do I have to convert the whole thing (I thought DATE functions could handle datetime type ?) and what does 101 stand for ?|||i'll let douglas answer your question, but I'd be more comfortable with..
" ... WHERE something = something else AND mydate BETWEEN @.StartDate AND @.EndDate"
commandobject.Parameters.Add("@.StartDate", SqlDbType.DateTime);
commandobject.Parameters.Add("@.EndDate", SqlDbType.DateTime);
commandobject.Parameters["@.StartDate"].Value = thestartdatevar;
commandobject.Parameters["@.EndDate"].Value = theenddatevar;|||I presumed that this particular bit of coding was all on the SQL Server box itself. The only reason for using the two dates is so that any entry <b>on a particular day</a> is selected. In the example, he wanted records for today. Thus, no parameters would need to be sent (unless this needs to be generalized).
101 means that the date is formatted mm/dd/yyyy. In thinking about it, 112 (yyyymmdd) would have been better. A DateTime type has both a date AND a time. You are looking to get just a particular date (today in your example).|||Ok.
But is there a way to write a DateTime (like 2004-01-01) and use it with the code you gave me. Because if I want to do an archive of all entries, I have to use specific date (something like clickable month).
I think I should use KraGie's code, but how do I initialise a DateTime variable?
For instance, if I want thestartdatevar to be June 1st 2004 (for the moment, because I'll eventually pass the value through a DropDownList or something).
thestartdatevar = ?
PS: Is this C# (the ; ?) ? Because I don't know C#.
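For completeness: because BETWEEN is inclusive at both ends, the answer above also matches a row stamped exactly at midnight tonight. A half-open range avoids that edge case; a sketch using style 112 (as suggested later in the thread) against a hypothetical table:

```sql
-- Rows from today only: >= midnight this morning, < midnight tonight.
SELECT *
FROM dbo.ProblemCards   -- hypothetical table name
WHERE mydate >= CONVERT(datetime, CONVERT(nvarchar(20), GETDATE(), 112))
  AND mydate <  CONVERT(datetime, CONVERT(nvarchar(20), DATEADD(day, 1, GETDATE()), 112))
```

The same shape works with @StartDate/@EndDate parameters when the range is chosen by the user rather than being "today".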
Managing Database Schema Changes
I hope this is the correct newsgroup to post this to; if not, could someone please advise me as to a better choice. We are currently releasing V1 of
our product which will include a SQL Server DB. After this point when we
want to provide Service Packs/Rollups/Upgrades/whatever to our clients we
are going to need to manage database changes in some fashion to ensure we
keep the database in sync with the App. I have seen people use source safe
for this, just keeping a script for each change and labeling them so they
get pulled out during the build process. This solution seemed to work well
enough, but I was wondering if anyone in this group had any other
suggestions. As always any advice that can be provided would be greatly
appreciated.
Thank you,
Prescott
"J Prescott Sanders" <nomorespam@.spam.com> schrieb im Newsbeitrag
news:uIemf2GVEHA.3656@.TK2MSFTNGP11.phx.gbl...
> I hope this is the correct news group to post this to, if not could
someone
> please advice me as to a better choice. We are currently releasing V1
of
> our product which will include a SQL Server DB. After this point when
we
> want to provide Service Packs/Rollups/Upgrades/whatever to our clients
we
> are going to need to manage database changes in some fashion to ensure
we
> keep the database in sync with the App. I have seen people use source
safe
> for this, just keeping a script for each change and labeling them so
they
> get pulled out during the build process. This solution seemed to work
well
> enough, but I was wondering if anyone in this group had any other
> suggestions. As always any advice that can be provided would be greatly
> appreciated.
This is a difficult topic, especially if you have to create or change
indexes on an existing database with a significant amount of data. This can
easily take very long, effectively stopping your application.
Since schema changes (DDL) do not happen within a transaction, it can be a
problem to deal with failure of schema-changing statements. This is
especially true if you not only want to change the schema but also modify
data as part of a schema migration from one version to the next.
IMHO source code control is not sufficient since you'll have to take into
consideration that customers might be upgrading from different schema
versions to the current one. They might have skipped a product version
you released and the next release has to take care of both schema
migration steps.
The ideal solution would be a target schema description and some kind of
tool that compares the actual schema with this target schema and performs
all changes needed for them to be in synch again. Unfortunately I don't
know such a tool, but if anybody does know of one I'd be glad to get
to know it.
The single golden rule I can give you: plan schema migration carefully and
try to imagine which kind of schema changes you might be doing in the
future. Since you're at the beginning of your migration history, you have
a good chance to get it right. Otherwise the whole topic might easily get
out of hand. That of course depends on the nature of your application and
the expected schema changes.
Good luck!
Kind regards
robert
|||Hi Robert,
Try dbMaestro. It's a product that allows comparison, migration and archiving of database schema and data.
This tool can compare the actual schema with a target schema and perform
all changes needed for them to be in synch again.
You can find it here:
http://www.extreme.co.il
"Robert Klemme" wrote:
> "J Prescott Sanders" <nomorespam@.spam.com> schrieb im Newsbeitrag
> news:uIemf2GVEHA.3656@.TK2MSFTNGP11.phx.gbl...
> This is a difficult topic: especially if you have to create or change
> indexes on an existing database with significant amount of data. This can
> easily take very long effectively stopping your application.
> Since schema changes (DDL) do not happen within transaction it can be a
> problem to deal with failure of schema changing statements. This is
> especially true if you not only want to change the schema but also modify
> data as part of a schema migration from one version to the next.
> IMHO source code control is not sufficient since you'll have to take into
> consideration that customers might be upgrading from different schema
> versions to the current one. They might have skipped a product version
> you released and the next release has to take care of both schema
> migration steps.
> The ideal solution would be a target schema description and some kind of
> tool that compares the actual schema with this target schema and performs
> all changes needed for them to be in synch again. Unfortunately I don't
> know such a tool but if anybody does know such a tool I'd be glad to get
> it to know.
> The single golden rule I can give you: plan schema migration carefully and
> try to imagine which kind of schema changes you might be doing in the
> future. Since you're at the beginning of your migration history, you have
> a good change to get it right. Otherwise the whole topic might easily get
> out of hand. That of course depends on the nature of your application and
> the expected schema changes.
> Good luck!
> Kind regards
> robert
>
|||Robert,
Thanks for the quick response. It seems you have quite a bit of experience
on this subject. Can you comment at all on the processes you have in place
to manage database changes within your own system. Any information will
certainly be appreciated as I work on a solution for our system.
Thanks in advance.
Prescott
"Robert Klemme" <bob.news@.gmx.net> wrote in message
news:2jdkn2F10ja1kU1@.uni-berlin.de...
> "J Prescott Sanders" <nomorespam@.spam.com> schrieb im Newsbeitrag
> news:uIemf2GVEHA.3656@.TK2MSFTNGP11.phx.gbl...
> This is a difficult topic: especially if you have to create or change
> indexes on an existing database with significant amount of data. This can
> easily take very long effectively stopping your application.
> Since schema changes (DDL) do not happen within transaction it can be a
> problem to deal with failure of schema changing statements. This is
> especially true if you not only want to change the schema but also modify
> data as part of a schema migration from one version to the next.
> IMHO source code control is not sufficient since you'll have to take into
> consideration that customers might be upgrading from different schema
> versions to the current one. They might have skipped a product version
> you released and the next release has to take care of both schema
> migration steps.
> The ideal solution would be a target schema description and some kind of
> tool that compares the actual schema with this target schema and performs
> all changes needed for them to be in synch again. Unfortunately I don't
> know such a tool but if anybody does know such a tool I'd be glad to get
> it to know.
> The single golden rule I can give you: plan schema migration carefully and
> try to imagine which kind of schema changes you might be doing in the
> future. Since you're at the beginning of your migration history, you have
> a good change to get it right. Otherwise the whole topic might easily get
> out of hand. That of course depends on the nature of your application and
> the expected schema changes.
> Good luck!
> Kind regards
> robert
>
|||"J Prescott Sanders" <nomorespam@.spam.com> schrieb im Newsbeitrag
news:OmmcrXHVEHA.2692@.TK2MSFTNGP09.phx.gbl...
> Robert,
> Thanks for the quick response. It seems you have quite a bit of
experience
> on this subject. Can you comment at all on the processes you have in
place
> to manage database changes within your own system. Any information will
> certainly be appreciated as I work on a solution for our system.
Well, what we do is basically store a numeric schema version number in
some table and check on startup whether the schema version of the
software is more recent than the schema version found. If it is, we run
all the single upgrade steps that we defined for each version hop.
Since a lot of legacy code is involved, things are a bit difficult
sometimes. I'd say a crucial thing to do is have several phases for
schema upgrade: a read-only phase that determines whether preconditions
are satisfied (i.e. whether the schema at hand is in fact the one that it
claims to be). Only if the verification passes successfully is the next
phase started, which modifies the database. Alternatively you can do
a full backup and restore it if modifications fail.
The other factor that makes things hard for us is that we don't support a
single database.
And there's another gotcha: if you provide error-fixing patches for older
versions of the software that included schema changes, then tracking of
schema versions and changes becomes really hard, because you need to
incorporate the change in the newest head version as well, but you have to
make sure it's not applied twice. Example:
1.0 version of software is released
2.0 version of software is released
3.0 version of software is released
1.1 version of software is released that contains a schema bug fix
Now you'll have to make sure that 3.0 checks on installation whether the
1.1 schema fix was incorporated into the db or not. You get the picture.
Kind regards
robert
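The startup check Robert describes might be sketched in T-SQL roughly like this (table name and version numbers are hypothetical):

```sql
-- One-row table recording the installed schema version.
IF NOT EXISTS (SELECT * FROM sysobjects WHERE name = 'SchemaVersion' AND xtype = 'U')
BEGIN
    CREATE TABLE dbo.SchemaVersion (Version int NOT NULL)
    INSERT INTO dbo.SchemaVersion (Version) VALUES (1)
END

DECLARE @CurrentVersion int
SELECT @CurrentVersion = Version FROM dbo.SchemaVersion

-- Apply each upgrade step exactly once, bumping the version as we go.
IF @CurrentVersion < 2
BEGIN
    -- ... version 1 -> 2 schema changes go here ...
    UPDATE dbo.SchemaVersion SET Version = 2
END
```

Because each step is guarded by the recorded version, a customer who skipped a release simply falls through every intermediate step in order.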
> Thanks in advance.
> Prescott
> "Robert Klemme" <bob.news@.gmx.net> wrote in message
> news:2jdkn2F10ja1kU1@.uni-berlin.de...
Managing Database Mail using Microsoft.SqlServer.Management.Smo.Mail managed classes
I'm trying (or, more precisely, failing) to configure Database Mail from managed code using the classes in the Microsoft.SqlServer.Management.Smo.Mail namespace. I can easily retrieve the SqlMail object and create a new MailAccount instance but I can't figure out how to set the name of the mail server. Once the MailAccount is created, the MailAccount.MailServers property returns a single MailServer instance and this is always configured as the local server. In my case the SMTP server used to send email is not on the local server but is located elsewhere in the network. I figure I need to change the name of the MailServer instance, but whenever I try this I always get a FailedOperationException.
I've noticed that the dbo.sysmail_add_account_sp stored procedure allows you to specify the mail server name when creating an account, but the managed MailAccount class doesn't seem to provide this option. Can anyone tell me how to set the mail server name with the MailAccount class? If this isn't possible, then it seems to me that the SMO features for managing Database Mail are basically useless.
David
Ok, I'm officially stupid
After spending the best part of an afternoon reading the documentation I finally figured out the answer. When a new MailAccount instance is created, a default MailServer instance is created and added to the MailServers collection of the MailAccount instance. The MailServer class has a Rename method which allows you to change the name of the MailServer to the name of the SMTP server you want to use, but it's not actually possible to set this directly when you create the MailAccount. Quite why you can't construct an instance of the MailServer class and add it to the MailAccount.MailServers collection is beyond me.
I have to say that the SMO documentation in Sql Server 2005 Books Online / MSDN Library really sucks. As far as examples of configuring Database Mail goes, there's a single VB sample that illustrates creating a basic MailAccount. Good old Google isn't much help either as there appears to be very little documentation out there other than what's in MSDN Library (or if there is, my afternoon of Googling didn't find it!)
Anyhow, I finally got it working, so all I need to do now is figure out how to programmatically enable Database Mail on the Sql Server instance that our product is being installed on. By default Database Mail is not enabled and has to be turned on using the Sql Server Surface Area Configuration Tool or the wizard in Sql Server Management Studio. That's not a painless installation experience for my users so I want something a bit more automatic. It must be possible because Team Foundation Server seems to automatically enable and configure Database Mail when you install it, I just haven't figured out how yet. I'll keep looking.
|||It appears that you might want to execute the following code:
sp_configure 'Database Mail XPs', 1
go
reconfigure
go
You can do this within SMO using
objDB.ExecuteNonQuery(strSQL), where objDB is a defined database object on your server. (See my blog entry at http://sqljunkies.com/WebLog/marathonsqlguy/archive/2006/05/17/21039.aspx for details on mixing SQL and SMO in an application.)
Managing database growth
I've a DB in SQL Server 2000 which allows the
database to grow by a percentage (default = 10%).
Now, after any action (for example inserting a row),
does it calculate the DB size and allocate the
accurate amount of space by the defined percentage?
Does it take a long time and have overhead?
Is it a good way to let the database grow by
percentage?
Thanks for any help.
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/|||> Now, after any action (for example inserting a row)
> does it calculate the DB size and allocate the
> accurrate size of space by defined percentage?
Yes
> does it take long time and has overhead?
No
> is it a good way to define database grow by
> percentage?
Yes
Otherwise, when the database grows and somebody forgets to give it more
space, some ugly error will happen.
I think if the database is sizable, you should let the database grow by a fixed
size; otherwise grow by percentage|||SQL Server doesn't do the grow after an insert. It happens whenever SQL Server needs space for an
insert, update etc. and the file is full. The user is blocked by the grow operation during the grow.
Tibor Karaszi, SQL Server MVP
Archive at: http://groups.google.com/groups?oi=djq&as_ugroup=microsoft.public.sqlserver
"RM" <m_r1824@.yahoo.co.uk> wrote in message news:opr2g3ovl8hqligo@.msnews.microsoft.com...
|||RM,
> Hi
> I've a DB in SQL Server 2000 which allows the
> database to grow by a percentage (defualt=10%).
> Now,anfter any action(for example insert ing a row)
> does it calculate the DB size and allocate the
> accurrate size of space by defined percentage?
> does it take long time and has overhead?
> is it a good way to define database grow by
> percentage?
> Thanks for any help.
> --
> Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
My personal preference is for a fixed growth unit.
Say, blocks of 1024MB for data and 512MB for log.
For some databases, excessive numbers of small growths will give you
fragmentation on the physical files.
I have seen some databases with the primary file in 300+ fragments. This
cannot be good.
Much better is a scheme something like this:
Initial size: 30GB Data / 3GB Log
Data growth: 3072MB
Log growth: 1024MB
This will minimize the number of 'expands' that need to run.
Planning is key here.
What you want to avoid is the 'ever-larger' growth syndrome.
I.e. Initial Size 30GB Data / 3GB Log.
Data growth: 10%.
Log growth 10%.
This growth increment gets ever larger each time the DB expands.
Again, this is just personal preference.
James Hokes
"RM" <m_r1824@.yahoo.co.uk> wrote in message
news:opr2g3ovl8hqligo@.msnews.microsoft.com...
> Hi
> I've a DB in SQL Server 2000 which allows the
> database to grow by a percentage (defualt=10%).
> Now,anfter any action(for example insert ing a row)
> does it calculate the DB size and allocate the
> accurrate size of space by defined percentage?
> does it take long time and has overhead?
> is it a good way to define database grow by
> percentage?
> Thanks for any help.
> --
> Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
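The fixed-growth settings recommended in this thread can be applied with ALTER DATABASE; a sketch with hypothetical database and logical file names:

```sql
-- Grow the data file in fixed 3072 MB steps and the log in 1024 MB steps.
ALTER DATABASE MyDatabase
    MODIFY FILE (NAME = MyDatabase_Data, FILEGROWTH = 3072MB)
ALTER DATABASE MyDatabase
    MODIFY FILE (NAME = MyDatabase_Log, FILEGROWTH = 1024MB)
```

The logical NAME values come from sp_helpfile; fixed increments keep the number and size of grow operations predictable, unlike a percentage.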
Managing Customer Email Communications via SQL Server
correspondence with all its customers, given that all customer data
including their email addresses is stored in a SQL Server 2005 table.
Is there a good way for Outlook (for example) to keep an address book
synchronized with a field in the database?
You can connect to SQL Server from Outlook via VBA. So, it's just a matter of querying the data and updating Outlook.
Here is the Outlook VBA reference.
http://www.microsoft.com/downloads/details.aspx?familyid=A1CEAD80-23E2-456F-8618-4DA50CC2C38C&displaylang=en
Managing Connections for optimal performance question, switch from Oracle to SQL
To combat that problem I was advised to use a Singleton Factory pattern and to have that Factory instantiate a connection and open it, then pass references to that connection for all of the objects that it created. All of those objects passed the connection reference to the objects they created and so on. Basically that meant that I only ever had one connection open at any one time for my entire application. And I was able to implement this solution at my previous job where I was developing in Oracle. I primarily used OracleCommands and OracleDataReaders to get the information into and out of the database. I thought this was a very nice solution. Having this many DataReaders accessing a single connection was not a problem because OracleConnections don't get locked from having more than one DataReader open at once.
At my current job, however, I use SQL Server. I am concerned that the single connection will not work in my new environment as the SQLDataReaders lock up the connection while they are using it. If the information that I received about opening connections being the real bottleneck is true, then I am hesitant to have a connection instantiated and opened for each method, but I am concerned that a whole lot of errors will be generated if I use the single connection method. Also, how do DataAdapters affect my decision of which approach to use.
Any advice would be most helpful. If you have any questions that would help answer just ask. Thanks.No! Definitely not the way to do it. Just use ADO.NET's connection pooling to manage it. Out of the box it's pretty efficient, but you can tweak it if necessary. There's no way you'll write anything that will be as efficient as what's already there.
You're right that a data reader hogs a connection.
And the decision is the same whether you use data adapters or readers.
Don|||How do I use ADO.NET's connection pooling? Is there an article or something that you know of that can teach me to manage this problem better?
Thanks again for your help.|||That connection pooling stuff is pretty cool. My systems class was based on Java, so apparently that type of behind-the-scenes work didn't take place to efficiently manage the connections. I didn't realize that when I was instantiating a connection it was already managing a pool of connections for me. Slick stuff.|||So you figured out how to use it? Cool. Yeah, it definitely is an area where all of Microsoft's hard work is paying off.
Don
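For the archives: pooling is on by default for SqlConnection and is controlled through the connection string. A minimal sketch, with hypothetical server and database names:

```
Server=MyServer;Database=MyDb;Integrated Security=SSPI;Pooling=true;Min Pool Size=5;Max Pool Size=100
```

Connections opened with identical strings share one pool, and closing a SqlConnection returns it to the pool rather than tearing it down, which is why an open/use/close pattern per method is cheap.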
Managing concurrency in Stored Proc with Timestamp
Hello all,
I am trying to find the right way to get the correct error reported in a stored proc where the table contains a timestamp column. I gather TSEQUAL is not the way to go (I'm using SQL Express, which uses SQL Server 2005 syntax, and TSEQUAL is deprecated).
Within an update procedure, given an ID, a TimeStamp, and a new value, I would like to specifically know if the update fails because the record was not found or because the timestamp indicated that the record was previously updated.
I can't figure out how to specifically raise the 532 error if the timestamps don't match, since I can't use RAISERROR.
Any thoughts would be much appreciated
Regards,
Flavelle
Will this do?
Since you are passing the timestamp to the procedure, follow these steps:
Execute an UPDATE statement and check the value of @@ROWCOUNT afterwards. If the value is 0, execute a SELECT statement to check whether a row exists for the given ID, and check @@ROWCOUNT once again: if it is 1, the data was not updated because somebody else updated the record; if it is 0, no such record exists.
Sample Code:
DECLARE @TS TIMESTAMP
SELECT @TS = @@DBTS   -- in practice, use the timestamp value the client originally read
DECLARE @CustomerID INT
SET @CustomerID = 9

UPDATE Customers
SET CustomerName = 'Nith'
WHERE CustomerID = @CustomerID AND LastUpdate = @TS

IF @@ROWCOUNT = 0
BEGIN
    SELECT CustomerName FROM Customers WHERE CustomerID = @CustomerID
    IF @@ROWCOUNT = 0
        PRINT 'No record present for CustomerID'
    ELSE
        PRINT 'Somebody else updated the data'
END
ELSE
    PRINT 'Data updated'
|||An elegant and simple solution - many thanks. Now if only I could raise the 532 error directly using RAISERROR - but it looks like it is going the way of the dodo bird. Pity.
Regards,
Flavelle
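For the archives, since system error 532 can't be raised directly: a sketch of signalling the same condition with a user-defined error from the branch where the timestamp check fails (the message text here is made up; severity 16 makes it trappable by the client):

```sql
IF @@ROWCOUNT = 0
BEGIN
    -- Row exists but the timestamp no longer matches: raise a custom
    -- concurrency error the client can trap, instead of system error 532.
    RAISERROR ('The timestamp supplied is not current; the row was modified by another user.', 16, 1)
    RETURN
END
```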
|||I think this is a good example of handling concurrency in a stored procedure.
Does SQL Server 2000 contain some variable that indicates which records have been updated?
Or is there any other method to do the same thing without any programming or an extra column in the tables?
Please kindly advise.
Thanks & regards,
Clara
Managing Concurrency
In my application, when a ticket is being issued to a tourist, an SQL query finds MAX(ticketno) for the current month from the main table and reserves this number for the current ticket. When the operator has finished entering information and clicks the SAVE button, the record is saved to the main table with this ticket no. Previously there was no issue, since the database was standalone.
I want to know how to lock the new ticket no so that other concurrent users don't use the same number when saving a record. How can I better tune the database for thousands of concurrent users at the same time? I also want the other user not to get an error message when he attempts to save a record; it should be handled automatically by the database.
I would NOT try to lock a process that first gets a ticket number, then waits until someone clicks the 'SAVE' button, and then updates the master table and then unlocks the process.
This will not scale and will be a MAJOR headache.
I recommend that you use an IDENTITY field for the 'ticketno', and let the 'system' take care of the updating/incrementation.
|||Can you please illustrate?|||Arnie's suggestion is a good one... there doesn't seem to be a strong justification for pulling in the complexity of some kind of key-management system, and there doesn't appear to be a need for the client application to have any knowledge of the ticketno value prior to saving the information. If you need to maintain referential integrity between tables, you can use the @@IDENTITY system function to get the last identity value that was entered in the master table. For example:
CREATE TABLE Tickets (TicketID INT IDENTITY, PurchaseDate DATETIME, CustomerID INT)
-- NOTE: I'm leaving out the key relationship from the identity column to a customer table
DECLARE @TicketID INT
INSERT INTO Tickets (PurchaseDate, CustomerID) VALUES (GETDATE(), @SomeValue)
SET @TicketID = @@IDENTITY
INSERT INTO Table2 VALUES (@SomeInfo, @SomeInfo2, @TicketID)
|||Which part - how to define a 'TicketNo' column in a table as IDENTITY?
Look in Books Online, topics: IDENTITY, CREATE TABLE.
Or the problems with scalability?
If it is the latter, consider that you have a choke point that every user must wait in line to access. And if some user takes a bit more time before clicking the [SAVE] button, well, everyone else just has to wait, because the number-incrementing process is locked in the scenario you described.
|||I recommend using SCOPE_IDENTITY() instead of @@IDENTITY.
Under some situations, @@IDENTITY may return the wrong value (for example, when a trigger on the table inserts into another table with an identity column).
|||Ticket No in our case is not a sequence number but is in this format: 000123-0507-0101.
The first part is the MAX(TicketNo) of the current month, the second part contains the month and year, and the third part contains the code of the place. After each month end, the ticket no starts at 1 again.
Here I also want to know whether making a string like the above the primary key affects performance, as opposed to a numeric primary key.
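For what it's worth, here is a sketch (hypothetical table and column names) of deriving the formatted number from an IDENTITY value, so the engine handles the concurrency. Note that an IDENTITY does not reset each month, so the monthly-reset requirement would still need its own counter table or a per-month computation:

```sql
DECLARE @TicketID INT, @CustomerID INT, @PlaceCode CHAR(4)
SET @CustomerID = 9            -- example customer
SET @PlaceCode = '0101'        -- assumed place code

INSERT INTO Tickets (PurchaseDate, CustomerID) VALUES (GETDATE(), @CustomerID)
SET @TicketID = SCOPE_IDENTITY()   -- engine-generated, safe under concurrency

-- Format: 000123-0507-0101 (zero-padded sequence, MMYY, place code)
SELECT RIGHT('000000' + CAST(@TicketID AS VARCHAR(6)), 6)
     + '-' + RIGHT('0' + CAST(MONTH(GETDATE()) AS VARCHAR(2)), 2)
     + RIGHT(CAST(YEAR(GETDATE()) AS VARCHAR(4)), 2)
     + '-' + @PlaceCode AS TicketNo
```

As for the primary-key question: a narrow numeric key is generally cheaper to index and join on than a 16-character string, so one common design is to keep the INT identity as the primary key and store or compute the formatted string for display.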
Managing Big Database
I have a database over 50 GB. It increases by 1 GB per day!
Today I detached the current database and moved the data & log files to the E drive. Then I used DTS to transfer the data to a new database on the D drive. The new database is only 18 GB!!!! I checked the tables; everything is there.
How did this happen? I thought it must have something to do with fragmentation, but I am not sure. Can anyone tell me?
And is there any other way to defrag the database without taking it off-line?
Thank you very much.|||DBCC SHRINKDATABASE?
Most of the suggestions in http://www.aspfaq.com/2446 apply to any database.
Also see http://www.aspfaq.com/2471
--
Aaron Bertrand
SQL Server MVP
http://www.aspfaq.com/
"I.O" <anonymous@.discussions.microsoft.com> wrote in message
news:795D47D7-4A98-4907-8B2D-452A172EC1CD@.microsoft.com...
> Hi
> I have a Database over 50G. It increases 1G per day!
> Today I detach the current database and move the data&log file to E drive.
Then I use DTS transfering those data to a new database on D drive. The
database becomes 18G!!!! I check the tables, everything is there.
> How did this happen? I thought there must be something about fragments.
But I am not sure. Can anyone tell me?
> And is there any other way to defrag the database without taking off-line?
>
> thank you very much|||I have found the cause! It's because I didn't build the indexes.|||I shrink the database every day. The log file is quite small.|||And make sure you take log backups of the database in order to be able to shrink the transaction log under the full recovery model.
Check Books Online on DBCC SHRINKDATABASE, DBCC SHRINKFILE, and shrinking the transaction log.
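As a sketch of what Books Online describes (substitute your own database name, logical log file name, and backup path):

```sql
-- Back up the log first (required under the FULL recovery model),
-- then shrink the log file and, if needed, the whole database.
BACKUP LOG MyBigDB TO DISK = 'E:\Backups\MyBigDB_log.bak'
DBCC SHRINKFILE (MyBigDB_Log, 500)    -- target the log file at ~500 MB
DBCC SHRINKDATABASE (MyBigDB, 10)     -- leave 10% free space in the files
```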
--
Regards,
Dandy Weyn
MCSE, MCSA, MCDBA, MCT
www.dandyman.net
"I.O" <anonymous@.discussions.microsoft.com> wrote in message
news:795D47D7-4A98-4907-8B2D-452A172EC1CD@.microsoft.com...
> Hi
> I have a Database over 50G. It increases 1G per day!
> Today I detach the current database and move the data&log file to E drive.
Then I use DTS transfering those data to a new database on D drive. The
database becomes 18G!!!! I check the tables, everything is there.
> How did this happen? I thought there must be something about fragments.
But I am not sure. Can anyone tell me?
> And is there any other way to defrag the database without taking off-line?
>
> thank you very much
Managing backup files using SQL Agent
My client has about 3 servers with several databases on each. These databases are backed up to disk using maintenance plans. This client wants to copy the SQL backup files on the disks of these servers to an additional server with a tape drive to do a consolidated backup to tape.
Is the best way to copy backup files already on disk to a network share to
use a DOS copy command through SQL Agent?
If so, is it best to run SQL Agent using a domain account with substantial
rights to ensure that SQL agent has appropriate file permissions both locally
and on network paths? When SQL agent runs under the local system account,
file copy operations fail with "Access denied" messages.
Does anyone have suggestions on how to do this task?
Larry Menzin
American Techsystems Corp.
I run my SQL Agent using a Domain User account that is local administrator
on the server.
In your case, just add a step to the jobs that the maintenance plans created that calls xp_cmdshell to do a DOS copy of the files after the backup is complete.
Kevin Hill
President
3NF Consulting
www.3nf-inc.com/NewsGroups.htm
www.DallasDBAs.com/forum - new DB forum for Dallas/Ft. Worth area DBAs.
www.experts-exchange.com - experts compete for points to answer your
questions
"Larry Menzin" <LarryMenzin@.discussions.microsoft.com> wrote in message
news:B52A63D9-A817-430C-9B0F-17139F121FBC@.microsoft.com...
> My client has about 3 servers with several databases on each. These
> databases
> are backed up to disk using maintenance plans. This client wants to copy
> SQL
> backup files on the disks of these servers to an additional server with a
> tape drive to do consolidated backup to tape.
> Is the best way to copy backup files already on disk to a network share to
> use a DOS copy command through SQL Agent?
> If so, is it best to run SQL Agent using a domain account with substantial
> rights to ensure that SQL agent has appropriate file permissions both
> locally
> and on network paths? When SQL agent runs under the local system account,
> file copy operations fail with "Access denied" messages.
> Does anyone have suggestions on how to do this task?
> --
> Larry Menzin
> American Techsystems Corp.
|||Hi
Do it as an Operating System Command (CmdExec) step in SQL Agent.
Something like:
xcopy *.bak \\servername\sharename\*.* /E /Y
SQL Server Agent has to run under a domain account; otherwise it does not have a way to authenticate itself with the other server.
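If you'd rather script the step than click through Enterprise Manager, here is a sketch using msdb's sp_add_jobstep (the job name, backup path, and share name are hypothetical):

```sql
-- Add a CmdExec step to an existing SQL Agent job that copies the
-- backup files to the tape server's share after the backup completes.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'DB Backup Job',                  -- assumed existing job
    @step_name = N'Copy backups to tape server',
    @subsystem = N'CmdExec',
    @command   = N'xcopy D:\MSSQL\BACKUP\*.bak \\TapeServer\SQLBackups\ /E /Y'
```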
--
Mike Epprecht, Microsoft SQL Server MVP
Zurich, Switzerland
IM: mike@.epprecht.net
MVP Program: http://www.microsoft.com/mvp
Blog: http://www.msmvps.com/epprecht/
"Larry Menzin" <LarryMenzin@.discussions.microsoft.com> wrote in message
news:B52A63D9-A817-430C-9B0F-17139F121FBC@.microsoft.com...
> My client has about 3 servers with several databases on each. These
> databases
> are backed up to disk using maintenance plans. This client wants to copy
> SQL
> backup files on the disks of these servers to an additional server with a
> tape drive to do consolidated backup to tape.
> Is the best way to copy backup files already on disk to a network share to
> use a DOS copy command through SQL Agent?
> If so, is it best to run SQL Agent using a domain account with substantial
> rights to ensure that SQL agent has appropriate file permissions both
> locally
> and on network paths? When SQL agent runs under the local system account,
> file copy operations fail with "Access denied" messages.
> Does anyone have suggestions on how to do this task?
> --
> Larry Menzin
> American Techsystems Corp.