
eClerx Most Frequently Asked Latest SSIS Interview Questions Answers

What Are Variables And What Is Variable Scope?

A variable is used to store values. There are basically two types of variables: System variables (such as ErrorCode, ErrorDescription, PackageName, etc.), whose values you can use but cannot change, and User variables, which you create, assign values to, and read as needed. A variable can hold a value of the data type you chose when you defined it.

Variables can have different scopes depending on where they are defined. For example, you can have package-level variables, which are accessible to all tasks in the package, and container-level variables, which are accessible only to the tasks within that container.
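
For example, assuming hypothetical user variables named SourceFolder and FileName defined at package scope, any task in the package could build a full path with an expression like:
@[User::SourceFolder] + "\\" + @[User::FileName]

System variables are referenced the same way (for example @[System::PackageName]) but are read-only.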

What Is An SSIS Proxy Account And Why Would You Create It?

When you try to execute an SSIS package from a SQL Server Agent job, it can fail with the message "Non-SysAdmins have been denied permission to run DTS Execution job steps without a proxy account". This error is generated when the job owner is not a sysadmin on the instance and the job step is not set to run under a proxy account associated with the SSIS subsystem. You create a proxy account, mapped to a credential for a Windows account with the required permissions, so that jobs owned by non-sysadmins can run SSIS job steps under that account.
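
A minimal T-SQL sketch of creating one (the credential, proxy, and login names below are hypothetical placeholders):

-- Credential that maps to a Windows account with the permissions the package needs
CREATE CREDENTIAL SSISProxyCredential
    WITH IDENTITY = 'DOMAIN\SSISExecAccount', SECRET = 'StrongPasswordHere';
GO
-- Proxy in msdb tied to that credential
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'SSISPackageProxy',
    @credential_name = N'SSISProxyCredential',
    @enabled = 1;
-- Allow the proxy to run SSIS job steps
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SSISPackageProxy',
    @subsystem_name = N'SSIS';
-- Let the non-sysadmin job owner use the proxy
EXEC msdb.dbo.sp_grant_login_to_proxy
    @login_name = N'DOMAIN\JobOwner',
    @proxy_name = N'SSISPackageProxy';

The SSIS job step's "Run as" option can then be set to the proxy instead of the SQL Server Agent Service Account.
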
Conditionally executing tasks

Double-click the precedence constraint and set the Evaluation operation to Expression and Constraint.
Type the condition that you want to evaluate in the Expression box.
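
For example, to let the next task run only when a hypothetical User::RowCount variable is greater than zero, the expression would be:
@[User::RowCount] > 0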

Pass in variables when scheduling or running a package

Use the /SET option on the DTExec command line, or use the Set Values page in the Execute Package Utility (DTExecUI), with a property path like:
\Package.Variables[User::VariableName].Properties[Value]
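
A sketch of a full DTExec call using that property path (the package path and value are placeholders):
dtexec /FILE "C:\Packages\MyPackage.dtsx" /SET \Package.Variables[User::VariableName].Properties[Value];SomeNewValue

If the value contains spaces, wrap the whole path;value pair in quotes.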

Perform a conditional update & insert

Components Required: Data Flow Task, Conditional Split, Lookup Transform or Merge Join, OLE DB Command Transform

Use the Lookup Transform or Merge Join to determine whether the row already exists in the destination, configuring the Lookup to ignore failures when no match is found. If the lookup key comes back blank, the row does not exist and should be inserted into the target; route it there with the Conditional Split. Otherwise, the row is either a duplicate or an update. Determine whether it is an update by comparing the source values to the target values in the Conditional Split. The update itself can be done with an OLE DB Command Transform or by loading the changed rows into a staging table.
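
If you take the staging-table route instead of the OLE DB Command, the update after the Data Flow can be a single set-based statement. A sketch, with hypothetical table and column names:

UPDATE t
SET    t.CustomerName = s.CustomerName,
       t.ModifiedDate = s.ModifiedDate
FROM   dbo.DimCustomer AS t
INNER JOIN dbo.StagingCustomer AS s
        ON s.CustomerID = t.CustomerID;

This usually performs much better than the OLE DB Command, which fires once per row.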

Perform an incremental load of data

Tasks Required: 2 Execute SQL Tasks, Data Flow Task

Have the first Execute SQL Task retrieve from a control table the date the target table was last loaded, and place it into a variable. In the Data Flow Task, use that variable to build a date range on your source query. Finally, update the control table with a second Execute SQL Task to record when the table was last loaded.
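
A sketch of the three steps, assuming a hypothetical control table dbo.ETLControl and a package variable User::LastLoadDate:

-- Execute SQL Task 1: read the last load date into the variable
SELECT LastLoadDate FROM dbo.ETLControl WHERE TableName = 'dbo.Orders';

-- Data Flow source query: the ? parameter is mapped to User::LastLoadDate
SELECT OrderID, CustomerID, OrderDate, Amount
FROM   dbo.Orders
WHERE  ModifiedDate > ?;

-- Execute SQL Task 2: record the new high-water mark
UPDATE dbo.ETLControl SET LastLoadDate = GETDATE() WHERE TableName = 'dbo.Orders';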

What is data flow task in SSIS?

The Data Flow Task is the pumping mechanism that moves data from source to destination. In SSIS, however, you have much more control over what happens from start to finish: you get a set of out-of-the-box transformation components that you snap together to clean and manipulate the data while it is in the data pipeline.

Just as the Control Flow handles the main workflow of the package, the Data Flow handles the transformation of data. Almost anything that manipulates data falls into the Data Flow category.

What is data Profiling task?

Data profiling is the process of analyzing the source data to better understand what condition it is in: cleanliness, patterns, numbers of nulls, and so on. The Data Profiling task is usually used at the beginning of the development cycle to support the design of the destination database schema; it is not typically used in the normal recurring ETL packages you develop.

What is the multicast Transformation in SSIS?

The Multicast transformation, as the name implies, sends a single data input to multiple output paths. You might use it to send the same rows to multiple destinations, sliced in different ways. It is similar to the Conditional Split transformation in that both send data to multiple outputs, but with Multicast you cannot specify conditions for which part of the data goes to which output; every output receives all of the rows.

Move and rename the file at the same time

Tasks Required: File System Task

Set the File System Task's operation to Rename File, and point the destination path at the folder you'd like to move the file to. This renames and moves the file in the same step.
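
If the destination is built at run time, you can set IsDestinationPathVariable to True and drive the destination variable with a property expression such as the following (the folder and file name variables are hypothetical):
@[User::ArchiveFolder] + "\\" + @[User::NewFileName]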

Loop over an array of data in a table & perform a set of tasks for each row

Tasks Required: Execute SQL Task, Foreach Loop

Use an Execute SQL Task to load the result set and store it in an Object-typed variable. Then loop over that variable in a Foreach Loop container using the Foreach ADO Enumerator.
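
A sketch, assuming a hypothetical dbo.Customer table and an Object variable named User::CustomerList:

-- Execute SQL Task: ResultSet = Full result set, result mapped to User::CustomerList
SELECT CustomerID, CustomerName FROM dbo.Customer;

In the Foreach Loop, choose the Foreach ADO Enumerator, point it at User::CustomerList, and on the Variable Mappings page map each column to a scalar variable (for example User::CustomerID) so the tasks inside the loop can use the current row's values.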

Difference between Merge and Union All?

The Merge transformation merges data from two paths into a single output. It is useful when you wish to break out your Data Flow into a path that handles certain errors and then merge that path back into the main Data Flow downstream after the errors have been handled. It is also useful when you wish to merge data from two data sources.

Note that the data must be sorted before it reaches the Merge transformation. You can do this with a Sort transformation prior to the Merge, or by specifying an ORDER BY clause in the source query and marking the source output as sorted. The metadata must also be the same for both paths; for example, the CustomerID column cannot be a numeric column in one path and a character column in the other.

The Union All Transformation works much the same way as the Merge Transformation, but it does not require the data to be sorted. It takes the outputs from multiple sources or transforms and combines them into a single result set.
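
For example, the source query feeding a Merge could sort on the join key (the table and key column are hypothetical); the source output must then be flagged as sorted in the Advanced Editor (IsSorted = True, SortKeyPosition = 1 on the key column):

SELECT CustomerID, CustomerName
FROM   dbo.Customer
ORDER BY CustomerID;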

OLE DB Command Transform?

The OLE DB Command Transform executes a SQL statement for each row in the input stream. It is analogous to an ADO Command object being created, prepared, and executed once per row of a result set. The input stream supplies the values for the parameters in the SQL statement, which can be either an in-line statement or a stored procedure call.
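
A typical statement for the transform, with ? placeholders mapped to input columns on the transform's Column Mappings tab (the table and column names are hypothetical):

UPDATE dbo.DimCustomer
SET    CustomerName = ?,
       ModifiedDate = ?
WHERE  CustomerID = ?;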

