Showing posts with label package. Show all posts

Friday, March 30, 2012

Mapping Package Variables to a SQL Query in an OLEDB Source Component

Learning how to use SSIS...

I have a data flow that uses an OLEDB Source Component to read data from a table. The data access mode is SQL Command. The SQL Command is:

select lpartid, iCallNum, sql_uid_stamp
from call where sql_uid_stamp not in (select sql_uid_stamp from import_callcompare)

I wanted to add additional clauses to the where clause.

The problem is that I want to add to this SQL Command the ability to have it use a package variable that at the time of the package execution uses the variable value.

The package variable is called [User::Date_BeginningYesterday]

select lpartid, iCallNum, sql_uid_stamp
from call where sql_uid_stamp not in (select sql_uid_stamp from import_callcompare) and record_modified < [User::Date_BeginningYesterday]

I have looked at various forum messages and been through BOL, but I seem to be missing something to make this work properly.

http://msdn2.microsoft.com/en-us/library/ms139904.aspx

That article is the closest I believe I have come to finding a solution. I am sure the solution is so easy that it is staring me in the face and I just don't see it. Thank you for your assistance.

...cordell...

Not sure what your problem is, but the solution is to create a variable to hold your query, say [User::SQLStatement], and use your query as its value. Then set the EvaluateAsExpression property of the variable to true. In the Expression property, create an expression that will be evaluated at run time, where you concatenate your query with the [User::Date_BeginningYesterday] variable. Back in your OLE DB Source component, you need to choose 'SQL command from variable' and then pick [User::SQLStatement] from the list.

Notice that you need to cast the value of [User::Date_BeginningYesterday] to a string in the expression builder before concatenating it to the SQL statement.
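For instance, the expression on [User::SQLStatement] might look like the following (a sketch only: the DT_WSTR cast length and the single-quoting of the date literal are assumptions, so adjust them for your server's date format):

Code Snippet

"select lpartid, iCallNum, sql_uid_stamp from call where sql_uid_stamp not in (select sql_uid_stamp from import_callcompare) and record_modified < '" + (DT_WSTR, 30)@[User::Date_BeginningYesterday] + "'"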

Rafael Salas


|||

The question that I have is: can one embed a package variable into a SQL statement while "SQL Command" is selected as the data access mode? If so, how would one go about that?

...cordell...

|||

The short answer is no: you cannot reference an SSIS variable directly in your SQL statement. You need to use '?' placeholders and the parameter mapping in your OLE DB Source, or concatenate it into a second variable as I described in the previous post.
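With the parameter approach, the query from above would become the following (in the Set Query Parameters dialog, Parameter0 would then be mapped to [User::Date_BeginningYesterday]):

Code Snippet

select lpartid, iCallNum, sql_uid_stamp
from call
where sql_uid_stamp not in (select sql_uid_stamp from import_callcompare)
and record_modified < ?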

Rafael Salas


Wednesday, March 28, 2012

Mapping Output Parameter to a variable!

Hi there,

I am working on an SSIS package that gets data from a SQL 2005 database and writes it to a flat file. But I need to write the count of records as part of the header.

Here is what I am trying:

    The OLE DB Source is calling a stored procedure and returning two things: a result set and an output parameter. The data access mode is SQL Command.

    Code Snippet

    EXEC [Get_logins] ?, ?, ? OUTPUT

    In the Set Query Parameters dialog box, all three parameters are mapped to three different user variables.

What is happening is that the user variable mapped to the output parameter is never updated. The header property expression is written as follows:

Code Snippet

RIGHT("0000000000" + (DT_STR, 10, 1252)@[User::LoginCount], 10)

I tried to watch the variable in the watch window, but to no avail. Any guidance on whether this is a bug or I am missing something? Any thoughts on how I can accomplish this? I have also tried adding a Row Count transformation, but its variable shows the same behaviour. If I set the LoginCount variable to some value, that initially set value is successfully written to the file header.

Thanks

Paraclete

No bug; the OLE DB Source just doesn't support output parameters from stored procedures. The Execute SQL Task does, though. You could execute that one in your control flow and put your result set in a variable. A Script source component can then shred the result set into rows in your data flow.

Or you could issue two queries: one to count the rows and put that value in a variable in the control flow, and another one in the data flow to produce the rows.
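The counting query in the Execute SQL Task could be as simple as the following (a sketch: the logins table name is hypothetical, since the post only names the [Get_logins] procedure; the single-row result would be mapped to the LoginCount variable on the Result Set tab):

Code Snippet

select count(*) as LoginCount
from logins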
|||

Hi,

Thanks for your response. Yes, I can calculate the number of rows in a separate query, but some of the rows may have bad data. In that case those rows will be ignored or sent to the error output, i.e. they will not be written to the Flat File Destination. So the count taken in a separate query will be incorrect: CountAtStart minus the errors, not CountAtStart. This may create a problem because the header holds the count of records in the file. Any guidance/thoughts are welcomed.

Thanks,

Paraclete

|||You were probably on the right track with the Row Count transformation, but it won't write to the variable until all the rows have been received, at which point you've already written your header. Try putting a Sort component after the Row Count. This will queue up the rows between the Row Count and the destination and should allow the Row Count to set the variable before the header gets created. By the way, how are you writing the header?

Mapping Columns Automatically?

New to SSIS...

I created a new package with a source and destination and manually created the output columns with data types, etc. It works. The issue is, say the table has 200 columns to export: I don't want to create these by hand. How can I just export them all to CSV format without having to specify and map each and every column?

Use the Export Data Wizard in SSMS.

-Jamie

|||That's fine and dandy when starting from scratch. But if you have spent a lot of time building scripts and other actions in an existing package, it seems it should be simple to add all columns to an existing text export. This seems like such a common issue that there has to be a solution.|||

You could replace your existing source adapter with a new one. The default behaviour is to select all columns, which by the sound of it is what you want.

The new columns will automatically appear in the metadata of downstream components.

-Jamie

|||Thanks.. I will try that and see how it goes.

Monday, March 19, 2012

Manual validation of external metadata

Hi,

is there a way to start validation of external metadata manually?

My problem is this:

The package uses a variable as the connection string for a flat file source, and another variable for the destination table. Running the package gives a warning about external metadata that needs to be updated. Normally I update this by just opening the data flow and answering the update prompt with yes. This time that doesn't work, I think because the variable is not set, so there cannot be any conflict with the external metadata.

I don't want to disable validation, but just validate one time and then save the package.

Any ideas?

Regards,

Jan

Not sure I got the problem... but have you looked at the DelayValidation property of the flat file connection manager and the destination component? Also make sure you give those variables some values at design time.|||

Hi Jan,

I don't want to disable validation, but just validate one time and then save the package.

I'm not sure this is an option without setting DelayValidation to True for the package. Validation occurs at different times for different objects. For instance, when the package is opened in the development environment, some validation occurs - things like Connection Manager connections are tested, etc. When the package is executed, some other validation is performed.

There's a great description of this here.

Hope this helps,
Andy

|||

Hi,

I tried setting the DelayValidation option to true, but this didn't help.

My package still logs warning messages because the metadata is not updated, but I can't update it without running the package, and while running the package I don't see a way to update the metadata. That's my dilemma.

Regards,

Jan

|||

Hi Jan,

Is it possible to set the variables to a Development instance of SQL Server? The values will still be updated at runtime and if they're not, the worst that will happen is they'll execute against Development.

Hope this helps,
Andy

|||

Hi Andy,

unfortunately, setting the variables to a development instance is not an option for me, because that would not help.

My variables are filled dynamically with the right values at runtime, and because this is done in a For Each container, a lot of values are set. After each iteration the data flow task is started, and this results in warnings about the metadata. I would need to set each possible value in every variable, disable my script task, and run the package "manually" step by step. This is my problem with SSIS: sometimes it seems SSIS wasn't developed for dynamic use, because every time you use it this way you get stopped by strange problems like these.

What I don't get is why SSIS doesn't give me an option to update metadata at runtime, instead of warning me that it needs to be updated without giving me any possibility to do so.

Thanks for all your help and ideas, I think I will just ignore the warnings.

Regards,

Jan

|||

By any chance, are you changing column names or data types in the source/destination components? If so, I am afraid that would break the data flow. Changing the path or file name of a flat file connection manager, or table names, would be fine for SSIS, though.

Monday, March 12, 2012

manipulating binary data

To all,
I have a binary data type in my database. It is an array of doubles. I would like to create a DTS package that can change this binary data type into an array so I can retrieve one of the double values at a specific index.

Is there any way to do this? How can I cast the binary data type in the database as an array of doubles?
Thanks in advance,
Gloria

There is no such thing in SQL as an array. So, to understand this clearly: the data you have in the database, even though it was an array of doubles in your client code before you sent it there, is in the database just a byte stream, a long string of bytes.

Converting it to individual values will most easily be done using client-side code in some programming language.

If the "array" was a delimited list of doubles represented as text, then it might be possible to do this parsing and separating using some SQL code, but even this is medium to hard. If the data in your binary column is the actual binary byte stream generated by some client-side language to represent an array of IEEE double-precision floats, then there is probably no hope of parsing those individual values out in SQL code.

|||There are no arrays in SQL, nor do we use it for bit-level manipulations. Indexes in SQL are not exposed to the programmer; in fact they are an implementation method that not all products use (Teradata is based on hashing; Nucleus uses bit vectors, etc.).

You have missed the point of SQL and have returned to a C-style file system in your mental model. And there is nothing wrong with a file or with programming at the machine level for certain problems. Databases are not one of those problems.
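The client-side decoding the first reply recommends could be sketched like this (a Python sketch for illustration; the little-endian "<" byte order is an assumption about how the producing application wrote the column):

```python
import struct

def bytes_to_doubles(blob):
    # Each IEEE 754 double occupies 8 bytes; "<" assumes little-endian,
    # which is an assumption about the client that wrote the column.
    count = len(blob) // 8
    return list(struct.unpack("<%dd" % count, blob))

# Round-trip: pack three doubles the way a client might have stored them,
# then pull back the value at a specific index.
blob = struct.pack("<3d", 1.5, 2.25, 3.125)
values = bytes_to_doubles(blob)
print(values[1])  # 2.25
```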

Wednesday, March 7, 2012

Managing Excel execution through DTS

We have a convoluted DTS package. The package is stored and scheduled to run on database server1. Within the package we:
Make a connection to database server2
Execute a Process Task to execute EXCEL.EXE that is installed on server3
Pass parameters through the Process Task to direct EXCEL.EXE to open a .xls file on server4
When the Excel file opens, an auto_exec macro in the Excel file being opened imports a text file local to server4 AND directs Excel to save it with another name on server4.

The questions are:
1. Excel is not installed on server1 so how do we direct Excel to execute on server3 rather than server1 where the DTS package is being executed?

2. And how do we control the security context that executes Excel through this use of automation?

3. Other than potential CPU competition, are there any significant issues with having Excel installed on a dedicated database server?

Yikes! That is convoluted. Is there any particular reason you are doing things this way?|||Dude -

Are you into masochism? Enjoy sleepless nights?

Seriously - just pull the Excel spreadsheet from the server you are reading from, open it on the machine with the DTS package, do the manipulation, and then copy the final result to wherever it's going.

Don't make it any more complicated than it has to be - K.I.S.S.

Simple = Robust.

Cheers,|||Why do you have 4 servers in this scenario when you just want to modify some data on a server other than where the DTS package is located?

What kind of data does this file contain, and why does it have to be updated via Excel?

Managing DTS Programmatically

Does anyone out there have any tips on managing DTS packages programmatically?

Specifically, I want to install and run a DTS package using T-SQL through Query Analyzer. I know how to run it using xp_cmdshell, but I don't know how to install it programmatically from a Structured Storage File.

|||You can 'install' the structured storage file like you would any other file needed by your application. DTS doesn't really care where the file resides. You might consider creating a folder on the SQL Server specifically for DTS package files. You could then launch the packages using DTSRUN via xp_cmdshell from that well-known location. Folder/file creation can be done manually or as part of a Windows Installer package, depending on your deployment requirements.

Hope this helps.

Dan Guzman
SQL Server MVP
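Launching a package from that well-known location could look like the following (a sketch: the folder and package names are hypothetical; /F points DTSRUN at the structured storage file and /N names the package inside it):

Code Snippet

EXEC master.dbo.xp_cmdshell 'DTSRUN /F "D:\DTSPackages\LoadSales.dts" /N "LoadSales"'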