
Monday, March 19, 2012

Manually Insert a Primary Key Value

I have a colleague who mysteriously lost his record in our Employee table.
The "employee ID" field serves as the primary key on the table.
How do I manually insert his record, including the old primary key value,
back into the table? That is, how do I bypass the primary-key constraint?
Thanks in advance,
Mark Holahan|||What is the definition of the table?
AMB
"Mark Holahan" wrote:

> I have a colleague who mysteriously lost his record in our Employee table.
> The "employee ID" field serves as the primary key on the table.
> How do I manually insert his record, including the old primary key value,
> back into the table? That is, how do I bypass the primary-key constraint?
> Thanks in advance,
> Mark Holahan
>
>|||Is this an identity field? If so, use:
SET IDENTITY_INSERT table_name ON
-- execute insert statement here
SET IDENTITY_INSERT table_name OFF|||You can't "bypass" a primary key constraint unless you drop it. I
assume you are actually referring to the IDENTITY property on this
column. The IDENTITY property is quite distinct from a PRIMARY KEY
constraint. If you want to insert an explicit IDENTITY value then use
the SET IDENTITY_INSERT table_name ON option.
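For example, with the Employee table whose definition is posted later in
this thread, a minimal sketch would look like this (the id value and the
name values are placeholders for the lost row's data):
SET IDENTITY_INSERT dbo.Employee ON
-- An explicit column list is required when supplying an identity value
INSERT INTO dbo.Employee
    (id, FName, MI, LName, ResetPW, PendingInfoUpdate, IsSalesRep)
VALUES
    (123, 'John', 'Q', 'Doe', 0, 0, 0)
SET IDENTITY_INSERT dbo.Employee OFF
Note that only one table in a session can have IDENTITY_INSERT set ON at a
time, so remember to turn it back OFF when you are done.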
Why does it matter to you if the row gets inserted with a different
IDENTITY value to the one it originally had? It shouldn't have been
possible for the accidental delete to cause "orphan" rows in a
referencing table - That's assuming you have correctly declared foreign
key constraints against the employee ID column. If you don't have
foreign keys then that's something you really ought to fix.
David Portas
SQL Server MVP
--|||AMB,
The table definition follows:
CREATE TABLE [dbo].[Employee] (
[id] [int] IDENTITY (1, 1) NOT NULL ,
[FName] [varchar] (25) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[MI] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LName] [varchar] (25) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[BranchId] [int] NULL ,
[SalesRepId] [int] NULL ,
[Email] [varchar] (100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[Title] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[NetworkId] [char] (20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[UserName] [varchar] (20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[Password] [char] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[Deactivated] [datetime] NULL ,
[ResetPW] [bit] NOT NULL ,
[Tries] [tinyint] NULL ,
[LastLoginDtm] [datetime] NULL ,
[PendingInfoUpdate] [bit] NOT NULL ,
[IsSalesRep] [bit] NOT NULL
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[Employee] WITH NOCHECK ADD
CONSTRAINT [PK_Employee] PRIMARY KEY CLUSTERED
(
[id]
) WITH FILLFACTOR = 90 ON [PRIMARY]
GO
ALTER TABLE [dbo].[Employee] ADD
CONSTRAINT [DF_Employee_ResetPW] DEFAULT (0) FOR [ResetPW],
CONSTRAINT [DF_Employee_PendingInfoUpdate] DEFAULT (0) FOR
[PendingInfoUpdate],
CONSTRAINT [DF_Employee_IsSalesRep] DEFAULT (0) FOR [IsSalesRep]
GO
"Alejandro Mesa" <AlejandroMesa@.discussions.microsoft.com> wrote in message
news:3E956B3B-CF85-4FBA-B885-41BE4C9A96FD@.microsoft.com...
> What is the definition of the table?
>
> AMB
> "Mark Holahan" wrote:
>|||Read David's post.
AMB
"Mark Holahan" wrote:

> AMB,
> The table definition follows:
> CREATE TABLE [dbo].[Employee] (
> [id] [int] IDENTITY (1, 1) NOT NULL ,
> [FName] [varchar] (25) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [MI] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [LName] [varchar] (25) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [BranchId] [int] NULL ,
> [SalesRepId] [int] NULL ,
> [Email] [varchar] (100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [Title] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [NetworkId] [char] (20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [UserName] [varchar] (20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [Password] [char] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [Deactivated] [datetime] NULL ,
> [ResetPW] [bit] NOT NULL ,
> [Tries] [tinyint] NULL ,
> [LastLoginDtm] [datetime] NULL ,
> [PendingInfoUpdate] [bit] NOT NULL ,
> [IsSalesRep] [bit] NOT NULL
> ) ON [PRIMARY]
> GO
> ALTER TABLE [dbo].[Employee] WITH NOCHECK ADD
> CONSTRAINT [PK_Employee] PRIMARY KEY CLUSTERED
> (
> [id]
> ) WITH FILLFACTOR = 90 ON [PRIMARY]
> GO
> ALTER TABLE [dbo].[Employee] ADD
> CONSTRAINT [DF_Employee_ResetPW] DEFAULT (0) FOR [ResetPW],
> CONSTRAINT [DF_Employee_PendingInfoUpdate] DEFAULT (0) FOR
> [PendingInfoUpdate],
> CONSTRAINT [DF_Employee_IsSalesRep] DEFAULT (0) FOR [IsSalesRep]
> GO
>
> "Alejandro Mesa" <AlejandroMesa@.discussions.microsoft.com> wrote in messag
e
> news:3E956B3B-CF85-4FBA-B885-41BE4C9A96FD@.microsoft.com...
>
>|||Distinction noted.
The CIO of our company claims RI puts an unneeded burden on SQL Server. Therefore we
handle RI on the front end. I don't necessarily agree, especially when I
read in BOL that, "The query optimizer also uses constraint definitions to
build high-performance query execution plans." But I've never done the
homework to disprove his theory. So I abide.
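For what it's worth, the declarative RI being discussed is a one-line
constraint. A sketch against this Employee table, assuming a hypothetical
dbo.Branch parent table keyed on BranchId:
-- dbo.Branch is a hypothetical parent table, shown for illustration
ALTER TABLE dbo.Employee ADD
    CONSTRAINT FK_Employee_Branch
    FOREIGN KEY (BranchId) REFERENCES dbo.Branch (BranchId)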
"David Portas" <REMOVE_BEFORE_REPLYING_dportas@.acm.org> wrote in message
news:1107450710.189817.206210@.g14g2000cwa.googlegroups.com...
> You can't "bypass" a primary key constraint unless you drop it. I
> assume you are actually referring to the IDENTITY property on this
> column. The IDENTITY property is quite distinct from a PRIMARY KEY
> constraint. If you want to insert an explicit IDENTITY value then use
> the SET IDENTITY_INSERT table_name ON option.
> Why does it matter to you if the row gets inserted with a different
> IDENTITY value to the one it originally had? It shouldn't have been
> possible for the accidental delete to cause "orphan" rows in a
> referencing table - That's assuming you have correctly declared foreign
> key constraints against the employee ID column. If you don't have
> foreign keys then that's something you really ought to fix.
> --
> David Portas
> SQL Server MVP
> --
>|||The CIO is wrong. If he wants to design databases he should take a course
first ;-)
Obviously handling RI on the front end isn't working; otherwise you wouldn't
have this problem. No surprises there.
David Portas
SQL Server MVP
--

Manual log shipping

Hello:
I've implemented a stand-by server solution, where the
tran log backup from the primary server gets restored to
the secondary server at every 15-min interval.
I understand that there are some limitations with this
approach (could not implement MS SLS as our business unit
could not afford to purchase the Ent. Ed.), and was
wondering if anyone has encountered any other issues or
observations when implementing a similar manual log
shipping process, other than my own observations listed
below:
Log Shipping will fail if...
- ...there are any open connections to the database where
the transaction log files are restored to; though querying
tables using the fully qualified name is possible from
another database connection or via a linked server
connection.
I've also had one incident where my log shipping process
failed due to an LSN out-of-sync issue. This happened when
I ran a BCP IN operation. Other times, both BCP and BULK
INSERT operations ran successfully, funnelling changes to
the secondary server's database as expected.
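For reference, the primary-server half of such a process is just a
scheduled transaction log backup; a minimal sketch, where the database
name and path are placeholders:
-- Run from a SQL Agent job every 15 minutes on the primary
BACKUP LOG database_name
    TO DISK = 'c:\logship\database_name_log.bak'
    WITH INIT -- overwrite the previous backup file each cycle
The secondary side then restores each backup, as discussed below.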
Thank you for all your responses.
Regards,
- Rob.|||Your observations are correct. Log shipping will fail if there are users connected to the database. I am wondering if perhaps someone changed the db options when you had the log shipping fail after a BCP import.
And no, I have not experienced any other issues. The custom log shipping approach works very well.
You can find a script that will kill any connections to the specified =database here:
http://sqlguy.home.comcast.net/logship.htm
-- Keith
"Rob" <anonymous@.discussions.microsoft.com> wrote in message =news:1332101c3f7bf$64a958e0$a301280a@.phx.gbl...
> Hello:
> > I've implemented a stand-by server solution, where the > tran log backup from the primary server gets restored to > the secondary server at every 15-min interval.
> > I understand that there are some limitations with this > approach (could not implement MS SLS as our business unit > could not afford to purchase the Ent. Ed.), and was > wondering if anyone has encountered any other issues or > observations when implementing a similar manual log > shipping process, other than my own observations listed > below:
> > Log Shipping will fail if...
> > - ...there are any open connections to the database where > the transaction log files are restored to; though querying > tables using the fully qualified name is possible from > another database connection or via a linked server > connection.
> > I've also had one incident where my log shipping process > failed due to a LSN out of sync issue. This happened when > I ran a BCP IN operation. Other times, both BCP and BULK > INSERT operations ran successfully, funnelling changes to > the secondary server's database as expected.
> > Thank you for all your responses.
> > Regards,
> > - Rob.|||Rather than doing a KILL command on each SPID, a cleaner
way to do it is to put the database in single user mode
with rollback immediate for the duration of the log
restore and then put it back in multi user mode. This
works very well. Here's an example:
alter database database_name set SINGLE_USER with rollback immediate

restore log database_name from disk = 'c:\database_name_log.bak'
with standby = 'c:\standby\database_name.bak'

alter database database_name set MULTI_USER
|||Agreed. I need to update the web page.
-- Keith
"Van Jones" <anonymous@.discussions.microsoft.com> wrote in message =news:13bbf01c3f7d4$804fb2a0$a001280a@.phx.gbl...
> Rather than doing a KILL command on each SPID, a cleaner > way to do it is to put the database in single user mode > with rollback immediate for the duration of the log > restore and then put it back in multi user mode. This > works very well. Here's an example:
> > alter database database_name set SINGLE_USER with rollback > immediate
> > restore log database_name from disk > =3D 'c:\database_name_log.bak' with standby > =3D 'c:\standby\database_name.bak'
> > alter database database_name set MULTI_USER
> > >--Original Message--
> >Your observations are correct. Log shipping will fail if > there are users connected to the database. I am wondering > if perhaps someone changed the dboptions when you had the > log shipping fail after a BCP import. > >
> >And no, I have not experienced any other issues. The > custom log shipping approach works very well.
> >
> >You can find a script that will kill any connections to > the specified database here:
> >http://sqlguy.home.comcast.net/logship.htm
> >
> >-- > >Keith
> >
> >
> >"Rob" <anonymous@.discussions.microsoft.com> wrote in > message news:1332101c3f7bf$64a958e0$a301280a@.phx.gbl...
> >> Hello:
> >> > >> I've implemented a stand-by server solution, where the > >> tran log backup from the primary server gets restored > to > >> the secondary server at every 15-min interval.
> >> > >> I understand that there are some limitations with this > >> approach (could not implement MS SLS as our business > unit > >> could not afford to purchase the Ent. Ed.), and was > >> wondering if anyone has encountered any other issues or > >> observations when implementing a similar manual log > >> shipping process, other than my own observations listed > >> below:
> >> > >> Log Shipping will fail if...
> >> > >> - ...there are any open connections to the database > where > >> the transaction log files are restored to; though > querying > >> tables using the fully qualified name is possible from > >> another database connection or via a linked server > >> connection.
> >> > >> I've also had one incident where my log shipping > process > >> failed due to a LSN out of sync issue. This happened > when > >> I ran a BCP IN operation. Other times, both BCP and > BULK > >> INSERT operations ran successfully, funnelling changes > to > >> the secondary server's database as expected.
> >> > >> Thank you for all your responses.
> >> > >> Regards,
> >> > >> - Rob.
> >.
> >|||But even in single user mode, there could be multiple
connections to the database, which can cause manual log
shipping failures. In this case, I find killing all user
connections more effective, to ensure no connections
exist prior to restoring either the full backup and/or
the tran log.
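A rough sketch of that kill-everything approach, assuming SQL Server
2000's master.dbo.sysprocesses and a placeholder database name:
-- Kill every connection to the target database before restoring
DECLARE @spid smallint, @sql varchar(20)
DECLARE spids CURSOR LOCAL FAST_FORWARD FOR
    SELECT spid FROM master.dbo.sysprocesses
    WHERE dbid = DB_ID('database_name')
      AND spid > 50 -- spids 50 and below are reserved for the system
OPEN spids
FETCH NEXT FROM spids INTO @spid
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'KILL ' + CONVERT(varchar(6), @spid)
    EXEC (@sql) -- KILL does not accept a variable directly
    FETCH NEXT FROM spids INTO @spid
END
CLOSE spids
DEALLOCATE spids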
Thanks.
|||Putting it in 'single user mode with rollback immediate'
will disconnect any current connections to the db and then
put it in single user mode for the process to restore the
log. Since it's in single user mode, only the process
that is restoring the log can connect. Once the restore
is done, just put it back into multi user mode.

Wednesday, March 7, 2012

Managing Multiple Excel incoming files?

Managing Multiple Excel incoming files?
I have this situation. I have a client which has several locations; they
work primarily with Excel forms and they are thinking of doing reports with
them. What they want is to import those Excel files into SQL Server for
later use with Crystal Reports or maybe ASP.NET for reporting. Those files
will come every week from different locations, then go to SQL Server, and then
they should be stored in a folder on the server for backup. So this means
that I have to deal with the importing but also with the file naming. This
is because they want the reports to be date based. So I have to ask them to
include the date as part of the file name, or change the file name after
processing the file if I want to store them all in the same folder after
processing.
So the real question is how do I deal with multiple files on a regular basis:
by changing the name of the file, or by enforcing that they send me the file with
the date already included? If the recommendation is the second one, how can I
tell the system to change the file names after the processing?
Is there any easy way or tool to achieve this? Is there any way I can
configure DTS to do that?
Eliezer|||The easiest way would probably be to use a script task in DTS and use the
File System Object to manipulate the files. It can be done from T-SQL but it's
pretty kludgy. Using FSO from DTS will allow you a lot more flexibility and
control.
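If you do go the kludgy T-SQL route, it boils down to shelling out for the
rename. A rough sketch, where the path and file name are hypothetical:
-- Rename a processed file with a YYYYMMDD date suffix via xp_cmdshell
DECLARE @cmd varchar(255)
SET @cmd = 'ren "c:\incoming\branch01.xls" branch01_'
    + CONVERT(varchar(8), GETDATE(), 112) + '.xls'
EXEC master.dbo.xp_cmdshell @cmd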
HTH
Jasper Smith (SQL Server MVP)
I support PASS - the definitive, global
community for SQL Server professionals -
http://www.sqlpass.org
"Eliezer Figueroa" <efigueroa@.propoint.com.do> wrote in message
news:Xns9479AA48BF55Aefigueroapropointco
m@.216.196.97.132...
