Viewing or Saving the Compiled Biml

Some time ago a BIML enthusiast asked me for the location of the temporary files that BIDSHelper creates when you generate SSIS packages. I couldn't help him and explained that the Biml engine first compiles the Biml documents, applying any BimlScript, in memory and then creates the SSIS packages; the intermediate step isn't persisted to disk.

Debugging Biml Files Made Easier

Obviously he needed this intermediate result to better debug his Biml files and improve his development efforts. Recently I learned a simple trick to create the intermediate result he wanted, and I'd like to share it with you in this blog post:

  • Add a new Biml File to your project and name it SaveCompiledBiml.biml
  • Replace the contents of this file with the next code block
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<#
string xBiml = RootNode.GetBiml();
XmlDocument xmlDoc = new XmlDocument();
xmlDoc.LoadXml(xBiml);
string sFile = @"C:\temp\myBimlFile.xml";
xmlDoc.Save(sFile);
#>
</Biml>
<!--Directives:-->
<#@ template tier="5" #>
<#@ import namespace="Varigence.Hadron.CoreLowerer.SchemaManagement" #>
<#@ import namespace="System.Xml" #>
<!--BIML code to save the compiled BIML to file: SaveCompiledBiml.biml-->

Some remarks:

  • Change the directory and file name in the sFile variable to match your environment.
  • In this example I used tier 5. If you have files with a higher tier, change the template tier directive in the SaveCompiledBiml file.
(Biml files are compiled in the order of their "tier": the files with the lowest tier first and those with the highest last. The default tier is 0 for files without BimlScript and 1 for files with BimlScript.)
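
For instance, a connections file that contains BimlScript could be given a lower tier so it is compiled before SaveCompiledBiml.biml. A minimal sketch with a hypothetical file name:

<!-- Connections.biml (hypothetical): compiled before SaveCompiledBiml.biml because of its lower tier -->
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="DWH" ConnectionString="Data Source=.;Initial Catalog=MyDwh;Provider=SQLNCLI10.1;Integrated Security=SSPI;" />
  </Connections>
</Biml>
<#@ template tier="2" #>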

Do you want to improve the quality and speed of your ETL-development?

Biml is quickly gaining popularity as the way to automate the creation of SSIS packages, saving time and money while improving quality. Are you interested in automating SSIS packages with Biml? Consider my Biml Workshop to get a head start.

Building a data warehouse while the source systems are still in development

Some years ago the client of a contractor I worked for made some major investments in their IT landscape. They replaced their ERP and CRM systems, and of course BI/Reporting was part of the acceptance criteria for the new system. It was a big project and a disaster! ERP, CRM and BI consultants from the contractor were running in and out, discussing requirement details with the client and adapting the systems to these wishes. For the BI team it was especially hard: when we built something on Monday, chances were slim that it still worked on Thursday. We depended on the ERP and CRM teams to communicate their changes and provide us with correct test data, and there was no love lost between the teams. I was very glad when the opportunity arose to leave this war zone and move on. And I did.

Trending

Nevertheless it seems to have become a trend to build or adapt the BI system before a new source system has been launched, and to make it part of the acceptance criteria for that new system. This offers the client an easy way to verify the new system by comparing the reports they already work with. In my previous three projects (part of) the source system was still in development, and I would probably have gone crazy keeping up with all the changes if I hadn't been able to automate the work.

Agility is the name of the game


In these situations I have found that you (as in an agility drill) need three things:

  • Light feet:
    An alerting/reporting system that gives you insight into the changes made in the source systems. I typically import metadata about the source systems' tables, views and columns into a database on a daily basis (see the sketch after this list).
  • Run on balls of feet:
    A data warehouse automation framework that automates the creation of the data warehouse objects (tables, views, primary keys, foreign key relations and indexes) and the ETL procedures (stored procedures and SSIS packages) to load the data into the data warehouse.
    I have written extensively about this earlier.
  • Use arms (to keep your balance):
    A testing framework that daily checks the results against the results the business users expect and alerts you to any differences.
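
A minimal sketch of such a daily metadata import, assuming a hypothetical snapshot table meta.SourceColumns and a source database whose INFORMATION_SCHEMA views are reachable:

-- Hypothetical snapshot table and daily load; adjust names to your environment.
INSERT INTO meta.SourceColumns (SnapshotDate, TableSchema, TableName, ColumnName, DataType, MaxLength)
SELECT CAST(GETDATE() AS date)
     , c.TABLE_SCHEMA
     , c.TABLE_NAME
     , c.COLUMN_NAME
     , c.DATA_TYPE
     , c.CHARACTER_MAXIMUM_LENGTH
FROM SourceDb.INFORMATION_SCHEMA.COLUMNS AS c;
-- Comparing today's snapshot with yesterday's reveals added, dropped or changed columns.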

Testing Framework

A testing framework has a complicated ring to it. But if you keep it simple and focus on the results the users expect to see, it is easy to build and the advantages are great:

  • You'll notice any errors early in the process.
  • You can correct the errors before the users start testing.
  • You can pinpoint any changes made in the source system that ruin your reports.
  • You'll gain confidence in your solution and sleep better.

I'll describe a method that has worked for me. Key to success here is starting early!

First get hard copies and digital versions of the reports the users are using now. Ask them to highlight the important measures. Start entering the tests in an Excel spreadsheet and use the digital versions of the reports to copy any relevant data. I use the following format, where I keep the values in the testName column unique:

testName            | testType | criterion1 | criterion2 | expectedValue | expression
InternetSales201401 | DWH      | 201404     | Internet   | 125.035       |

Now the difficult/laborious part: start entering SQL statements in the expression column that will return the expected value. Use [criterion1] and/or [criterion2] as placeholders in these statements. The values in the criterion1 and criterion2 columns will replace these placeholders at execution time. You will then be able to copy the expression to similar tests with different values for the criteria. Example expression:
SELECT CAST(SUM(f.sales) AS int)
FROM factSales f
INNER JOIN dimDate d
ON f.DateKey = d.DateKey
INNER JOIN dimChannel ch
ON f.ChannelKey = ch.ChannelKey
WHERE d.YearMonth = [criterion1]
AND ch.ChannelName = '[criterion2]'
/* SQL code of example test expression */

Testing Framework: Tables

Import your Excel spreadsheet into a SQL Server table with the following definition:
CREATE TABLE DataTests (
testName nvarchar(64)
, testType nvarchar(20)
, criterion1 nvarchar(128)
, criterion2 nvarchar(128)
, expectedValue int
, expression nvarchar(1024)
)
/* SQL code to create the DataTests table */
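
As an illustration (an assumption on my part), the sample row from the spreadsheet above could be entered like this, using the example expression from the previous paragraph and writing the expected value 125.035 as the integer 125035, assuming the period is a thousands separator:

INSERT INTO DataTests (testName, testType, criterion1, criterion2, expectedValue, expression)
VALUES ('InternetSales201401', 'DWH', '201404', 'Internet', 125035,
    'SELECT CAST(SUM(f.sales) AS int)
     FROM factSales f
     INNER JOIN dimDate d ON f.DateKey = d.DateKey
     INNER JOIN dimChannel ch ON f.ChannelKey = ch.ChannelKey
     WHERE d.YearMonth = [criterion1]
     AND ch.ChannelName = ''[criterion2]''');
/* SQL code to add the sample test (hypothetical values) */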

Create an additional table to store the result of the tests that were performed. Use the following definition:
CREATE TABLE DataTestResults (
testName nvarchar(64)
, testDate datetime default getdate()
, actualValue int
, expectedValue int
)
/* SQL code to create the DataTestResults table */

Testing Framework: SSIS Package

Now create an SSIS Package that will perform the tests. It uses:

  • An Execute SQL Task to get a list of tests
  • A For Each Loop Container that loops through the list, containing:
    • An Execute SQL Task to execute every test
    • An Execute SQL Task to log the result of every test

This package can be described with the following BIML (Business Intelligence Markup Language) snippet:

<Packages>
<Package Name="PKG TestData" ConstraintMode="Linear">
<Variables>
<Variable Name="TestList" DataType="Object" />
<Variable Name="testQuery" DataType="String" />
<Variable Name="testName" DataType="String" />
<Variable Name="expectedValue" DataType="Int32"> 0 </Variable>
<Variable Name="actualValue" DataType="Int32"> 0 </Variable>
</Variables>
<Tasks>
<!--Get list with tests to be performed-->
<ExecuteSQL Name="SQL GetTestList" ConnectionName="META" ResultSet="Full">
<DirectInput>
SELECT testName
, testQuery = REPLACE(REPLACE(expression,'[criterion1]',criterion1),'[criterion2]',criterion2) , expectedValue
FROM DataTests
WHERE testType='DWH'
</DirectInput>
<Results>
<Result Name="0" VariableName="User.TestList" />
</Results>
</ExecuteSQL>
<!--Loop through tests and perform them-->
<ForEachAdoLoop Name="FELC Test" SourceVariableName="User.TestList" ConstraintMode="Linear" >
<VariableMappings>
<VariableMapping Name="0" VariableName="User.testName" />
<VariableMapping Name="1" VariableName="User.testQuery" />
<VariableMapping Name="2" VariableName="User.expectedValue" />
</VariableMappings>
<Tasks>
<!--Perform test-->
<ExecuteSQL Name="SQL Execute Test" ConnectionName="DWH" ResultSet="SingleRow" >
<VariableInput VariableName="User.testQuery" />
<Results>
<Result Name="0" VariableName="User.actualValue" />
</Results>
</ExecuteSQL>
<!--Log test result-->
<ExecuteSQL Name="SQL Log Test Result" ConnectionName="META" ResultSet="None">
<DirectInput>
INSERT INTO DataTestResults (testName,actualValue,expectedValue) VALUES (?,?,?)
</DirectInput>
<Parameters>
<Parameter Name="0" Direction="Input" DataType="String" VariableName="User.testName" />
<Parameter Name="1" Direction="Input" DataType="Int32" VariableName="User.actualValue" />
<Parameter Name="2" Direction="Input" DataType="Int32" VariableName="User.expectedValue" />
</Parameters>
</ExecuteSQL>
</Tasks>
</ForEachAdoLoop>
</Tasks>
</Package>
</Packages>
/* BIML code to create the testing SSIS package */

Conclusion

In this blog post I discussed the components of an agile setup that you need when you build a data warehouse while the source systems are still in development:

  • An Alert System to inform you of changes in the source systems.
  • A Data Warehouse Automation Framework.
  • A Testing Framework.

I ended with a simple implementation of a testing framework that has worked for me. Does this help you, or did you implement something similar? I'd love to hear from you!

Making your Biml files less complex

The combination of XML and C# code in a Biml file can make Biml files very complex. In web development Microsoft overcame a similar problem by introducing the code-behind model, with one file for the static (HTML) markup and a separate file for the dynamic code. Unfortunately this is not possible with Biml. There are, however, some ways to make your files less complex:

  1. Separate files
  2. Move code logic to T-SQL

In this post I’ll briefly describe the first option and use some examples to explain the second option.


Separate files

You can use the include directive to insert a piece of static XML from another file into your Biml file. John Minkjan has a nice example on his website.
If you need more dynamic content you can opt for the CallBimlScript function, which allows you to pass properties. Of course you will have to handle these properties in the callee file.
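
A minimal sketch of the CallBimlScript approach, with hypothetical file and property names: the caller passes a connection name and a query, and the callee declares them as properties:

<!-- Caller (fragment): inserts the Biml emitted by OleDbSrc.biml -->
<Transformations>
    <#=CallBimlScript("OleDbSrc.biml", "Northwind", "SELECT CustomerID, CompanyName FROM dbo.Customers")#>
</Transformations>

<!-- Callee: OleDbSrc.biml -->
<#@ property name="connectionName" type="string" #>
<#@ property name="query" type="string" #>
<OleDbSource Name="ODS <#=connectionName#>" ConnectionName="<#=connectionName#>">
    <DirectInput><#=query#></DirectInput>
</OleDbSource>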


Move some code logic to T-SQL

A typical Biml solution consists not only of Biml files but also of metadata that describes the packages you want to build. I typically use a table mapping table and a column mapping table in a SQL Server database. This allows me to create a stored procedure that combines the information from both tables into one dataset, with all relevant information for a package in one row.

For the next examples I will use the following information in this source to target mapping table:

srcColumn | tgtColumn  | tgtDataType | tgtLength
Id        | CustomerID | Int32       |
f_name    | FirstName  | AnsiString  | 40
l_name    | LastName   | AnsiString  | 40
sex       | Gender     | AnsiString  | 6


Using FOR XML PATH('') in T-SQL

With the FOR XML PATH('') syntax you can transform rows into a single string. The T-SQL statement:

SELECT srcColumnList = (SELECT ', ' + srcColumn
FROM meta.ColumnMapping WHERE srcColumn IS NOT NULL
FOR XML PATH(''))

returns: , Id, f_name, l_name, sex. That is a string you can almost use as the column list of a source component in the SELECT <column list> FROM <source table> statement. Almost, because you'll have to remove the leading comma and handle strange characters in column names. So you use STUFF to remove this comma and QUOTENAME to bracket the column names:

SELECT srcColumnList = STUFF(( SELECT ', ' + QUOTENAME(srcColumn) FROM meta.ColumnMapping WHERE srcColumn is not null FOR XML PATH('')),1,2,'')

which returns the string we need: [Id], [f_name], [l_name], [sex]


Creating the data conversion task

Expanding on the previous statement, you can create additional dynamic content for your Biml solution, in this case the column definitions in a Data Conversion task. In Biml you would write:

<DataConversion Name="DC">
  <Columns>
    <Column SourceColumn="f_name" TargetColumn="FirstName"
            DataType="AnsiString" Length="40" CodePage="1252" />
    <!-- ... more columns ... -->
  </Columns>
</DataConversion>

To create the column list for this data conversion use the statement:

SELECT DCColumns = REPLACE(REPLACE(STUFF(
    (SELECT  char(10) + '<Column SourceColumn="' + srcColumn 
        + '" TargetColumn="' + tgtColumn
        + '" DataType="' + tgtDataType
        + CASE WHEN tgtDataType='AnsiString'
        THEN '" Length="' + CAST(tgtLength AS varchar(10))
             + '" CodePage="1252" />'
        ELSE '" />' END
     FROM meta.ColumnMapping
     FOR XML PATH('')),1,1,''),'&lt;','<'),'&gt;','>')

Which returns the Biml string we need:

<Column SourceColumn="Id" TargetColumn="CustomerID"
     DataType="Int32" />
<Column SourceColumn="f_name" TargetColumn="FirstName"
     DataType="AnsiString" Length="40" CodePage="1252" />
<Column SourceColumn="l_name" TargetColumn="LastName"
     DataType="AnsiString" Length="40" CodePage="1252" />
<Column SourceColumn="sex" TargetColumn="Gender"
     DataType="AnsiString" Length="6" CodePage="1252" />

Some remarks on the SQL statement:

  • We don’t really need the char(10), but the line break is convenient when we look at the results.
  • Since this query uses the FOR XML syntax, and XML can't handle the opening and closing tag signs, SQL Server replaces them with their escape codes. We have to use the REPLACE function to change these escape codes back into the opening and closing tag signs.

Creating the stored procedure

For this example I created the following stored procedure, which combines the two metadata tables:

CREATE PROCEDURE meta.getPackageDetails (@MappingType varchar(50)) AS
SELECT
      PackageName = t.MappingName
    , TargetTable = t.tgtSchema + '.' + t.tgtTable
    , TargetConnection = t.tgtConnection
    , SourceTable = t.srcSchema + '.' + t.srcTable
    , SourceConnection = t.srcConnection
    , srcColumnList = STUFF((SELECT  ', ' + QUOTENAME(srcColumn)
        FROM meta.ColumnMapping
        WHERE srcColumn is not null
        AND TableMappingID=t.TableMappingID
        FOR XML PATH('')),1,2,'')
    , DCColumns = REPLACE(REPLACE(STUFF(
        (SELECT  char(10) + '<Column SourceColumn="' + srcColumn
        + '" TargetColumn="' + tgtColumn
        + '" DataType="' + tgtDataType
        + CASE WHEN tgtDataType='AnsiString'
        THEN '" Length="' + CAST(tgtLength AS varchar(10))
            + '" CodePage="1252" />'
        ELSE '" />' END
        FROM meta.ColumnMapping
        WHERE TableMappingID=t.TableMappingID
        FOR XML PATH('')),1,1,''),'&lt;','<'),'&gt;','>')
FROM meta.TableMapping t
WHERE t.MappingType = @MappingType

Less complex Biml file

The Biml file that you need to create the packages:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <#@ include file="Connection.biml" #>

    <Packages>
    <# string sConn = "Provider=SQLNCLI10;Server=.\\SQL2012;Initial Catalog=BimlSamples;Integrated Security=SSPI;"; #>
    <# string sSQL = string.Format("Exec meta.getPackageDetails {0}", "dim"); #>
    <# DataTable tblPackages = ExternalDataAccess.GetDataTable(sConn, sSQL); #>
    <# foreach (DataRow pkg in tblPackages.Rows){ #>
       
        <Package Name="<#=pkg["PackageName"]#>"
                ConstraintMode="Linear">
            <Tasks>
        <Dataflow Name="DFT <#=pkg["PackageName"]#>">
          <Transformations>
            <OleDbSource Name="ODS Source"
              ConnectionName="<#=pkg["SourceConnection"]#>" >
              <DirectInput>
                SELECT <#=pkg["srcColumnList"]#>
                FROM <#=pkg["SourceTable"]#>
              </DirectInput>
            </OleDbSource>
           
            <DataConversion Name="DC">
              <Columns>
                <#=pkg["DCColumns"]#>
              </Columns>
            </DataConversion>

            <OleDbDestination Name="ODD Target"
                ConnectionName="<#=pkg["TargetConnection"]#>">
              <ExternalTableOutput
                 Table="<#=pkg["TargetTable"]#>"/>
            </OleDbDestination>

          </Transformations>
        </Dataflow>
      </Tasks>
        </Package>
        <#}#>
    </Packages>
</Biml>
<#@ import namespace="System.Data" #>

Conclusion

In this blog post I discussed some ways to make your Biml files less complex. You can move (semi-)static content to other files and import it with the include directive or the CallBimlScript function. And you can move parts of complicated Biml code to T-SQL. How much you move to T-SQL will largely depend on your proficiency in C# versus T-SQL.

8 Practical BIML Tips

You can leverage Business Intelligence Markup Language (BIML) to automate the creation of Microsoft SQL Server Integration Services (SSIS) packages. The popular open source BIDSHelper project includes Biml functionality, enabling anyone to write and execute BIML code for free. Among professional SSIS developers BIML is quickly gaining popularity.
In this post I'll share some practical tips.

1. Better copy and paste experience

The irritating behavior of the XML editor in Visual Studio when copying BIML script, and how to overcome it, has been documented on several blogs. In this overview of tips I couldn't leave it out.
In Visual Studio / SSDT select Options from the Tools menu.
In the tree view of the Options dialog expand Text Editor, expand XML and choose Formatting.
Next uncheck both options under Auto Reformat.


2. Put directive tags <#@ .. #> at the bottom of your BIML file

In many examples (including mine) these directives are placed at the top of the BIML file, which makes sense because this is the default location in software development. However, when Visual Studio opens a file with a directive before the <Biml> XML tag it doesn't use the XML editor, and we lose the formatting and IntelliSense features in the GUI.
So instead place the directives at the bottom of the file, after the closing </Biml> tag. This has no effect on the creation of the packages.
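
A minimal sketch of that layout (the "Less complex Biml file" example later in this post uses the same pattern):

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <!-- packages and BimlScript here -->
    </Packages>
</Biml>
<!-- Directives after the closing tag -->
<#@ template language="C#" hostspecific="true" #>
<#@ import namespace="System.Data" #>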

3. Vanilla Biml file

Before using script in a BIML file, create a working BIML file that can create a package with more than 80% of the expected functionality.
Copy this file, use it as a base and then start scripting.
Why? The combination of XML and code in one document makes it more complicated to select the correct BIML elements and attributes. Also use a small dataset, so that when you test your work only a small number of packages is created.

4. Test and save often

During development, regularly check your work. Use the options "Check Biml for Errors" or "Generate SSIS Packages" from the context menu. This way you not only test your work but save it as well.
Why? Debugging BIML files is mostly a pain. Error messages are limited and often refer to the wrong row, and small typos can have a huge impact. So you'd better find your errors early in development.

5. Special XML characters

XML has some special characters that you need to enclose in a CDATA section or replace with their escape codes:

  • double quote " ( &quot; )
  • single quote ' ( &apos; )
  • less than sign < ( &lt; )
  • greater than sign > ( &gt; )
  • ampersand & ( &amp; )

As an example suppose you have the following BIML:

<DirectInput>
SELECT Name FROM dbo.People WHERE Age > 25
</DirectInput>

then the XML processor will fail at Age > 25. As a remedy change your BIML into:

<DirectInput>
<![CDATA[SELECT Name FROM dbo.People WHERE Age > 25]]>
</DirectInput>

or

<DirectInput>
SELECT Name FROM dbo.People WHERE Age &gt; 25
</DirectInput>

6. Special C# characters

C# also has some special characters that you will need to escape with a backslash. Most notably:

  • the backslash itself \ ( \\ )
  • single quote ' ( \' )
  • double quote " ( \" )

As an example, escaping the backslashes in a file location:
string FileName = "C:\\Dir\\File.txt";
or use the verbatim string construction:
string FileName = @"C:\Dir\File.txt";

7. Learn some basic C#

C# is the principal language of the .NET framework and is widely used for all sorts of programs: Web Services, Web Applications, Windows Form applications, SSIS Scripts, SQL CLR Stored Procedures etc. An investment in learning some C# will pay off. There is an abundant supply of websites and books with relevant information.

To get you started, read the chapters Basics and Flow control of this tutorial: http://zetcode.com/lang/csharp/

8. Learn from the samples

Steal / use the samples on:

Creating a Meta Data Driven SSIS Solution with Biml – 5. Creating the Master Package

In this last post in a series of 5 on creating a meta data driven SSIS solution I'm going to create the master package. The master package will start the other table packages in the right order.

10. Deploy the Table Packages

Before you create the master package, deploy the table packages to SQL Server:

  • In the Solution Explorer (BIDS) right click the project and select Properties from the context menu.
  • Select Deploy under Configuration Properties in the treeview.
  • Change the DeploymentType to SqlServerDestination and the DestinationServer to localhost.
  • In the Solution Explorer (BIDS) right click the project and select Deploy from the context menu.
The Master Package

I will group the Execute Package Tasks in Sequence Containers that correspond to the PackageGroup values in the SsisPackages table.

The final result will look like this:

(Image: the master package)
The Biml script

The Biml script uses two connections: one to the meta database and one to the msdb database where the packages are stored. Before we loop through the rows in the SsisPackages table and create the tasks, I use Package Configurations to handle future deployment issues. To create this package I'll use the following script:

<#@ template language="C#" hostspecific="true" #>
<#@ import namespace="System.Data" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="metaDB"
                         ConnectionString="Data Source=.;Initial Catalog=MyDwh_meta;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
        <OleDbConnection Name="msdb"
                         ConnectionString="Data Source=.;Initial Catalog=msdb;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
    </Connections>
    <Packages>
        <Package Name="MasterPackage MyDwh" ConstraintMode="Linear" AutoCreateConfigurationsType="None">
            <!--Package Configurations-->
            <PackageConfigurations>
                <PackageConfiguration Name="MetaDB">
                    <EnvironmentVariableInput EnvironmentVariable="Northwind_Config"></EnvironmentVariableInput>
                    <ConfigurationValues>
                        <ConfigurationValue DataType="String"
                                            PropertyPath="\Package.Connections[metaDB].Properties[ConnectionString]"
                                            Name="metaDB"
                                            Value="Data Source=.;Initial Catalog=ROCWB_MIS_config;Provider=SQLNCLI10.1;Integrated Security=SSPI;">
                        </ConfigurationValue>
                    </ConfigurationValues>
                </PackageConfiguration>
                <PackageConfiguration Name="msdb" ConnectionName="metaDB">
                    <ExternalTableInput Table="[dbo].[SsisConfiguration]" />
                </PackageConfiguration>
            </PackageConfigurations>
            <!--Tasks-->
            <Tasks>
                <# string sPackageGroup=""; #>
                <# string IsFirstRow="True"; #>
                <# string sConn = "Provider=SQLNCLI10;Server=.;Initial Catalog=MyDwh_meta;Integrated Security=SSPI;"; #>
                <# string sSQL = "SELECT PackageGroup, PackageName FROM SsisPackages"; #>
                <# DataTable tblPackages = ExternalDataAccess.GetDataTable(sConn,sSQL); #>
                <# foreach (DataRow row in tblPackages.Rows){ #>
                <# if(sPackageGroup!=row["PackageGroup"].ToString()) { #>
                    <# sPackageGroup=row["PackageGroup"].ToString();#>
                    <# if(IsFirstRow=="True") { #>
                        <# IsFirstRow="False"; #>
                        <!--Start new container-->
                        <Container Name="SEQC - <#=row["PackageGroup"]#>" ConstraintMode="Parallel">
                            <Tasks>
                    <# }else { #>
                        <!--Close container and start new container-->
                            </Tasks>
                        </Container>
                        <Container Name="SEQC - <#=row["PackageGroup"]#>" ConstraintMode="Parallel">
                            <Tasks>
                <# }}#>
                            <ExecutePackage Name="EPT - <#=row["PackageName"]#>">
                                <SqlServer ConnectionName="msdb" PackagePath="\<#=row["PackageName"]#>" />
                            </ExecutePackage>
                <# }#>
                        <!--Close container-->
                            </Tasks>
                        </Container>
            </Tasks>
        </Package>
    </Packages>
</Biml>

11. Final steps: Create the master package
  • In Solution Explorer (BIDS) right click the project and select Add New Biml File
  • Rename the Biml File to Create master package.biml
  • Copy the biml script above to the new biml file
  • In Solution Explorer (BIDS) right click the Create master package.biml file and select Generate SSIS Packages.
Final thoughts

In this series we created a complete SSIS Solution with

  • a logging and lineage method
  • package configurations to easily port the package to another environment (development, test, production)
  • an enforced standard way to import data (try a bulk load, retry any error rows row by row, and write any remaining error rows to an error file)
  • a master package that can start the packages in the correct order

You could easily change this to fit your needs…

Creating a Meta Data Driven SSIS Solution with Biml – 4. Creating the Table Packages

In part 3 of this series I created a Biml file and used it to create the desired SSIS package. In this part I am going to take it one step further: I will use Biml script to loop through a number of rows in the SsisPackages table and create a package for every row.

Before we start creating the Biml script, let's examine the SsisPackages table we created in part 2 and add some rows to it. The following columns are used in this table (a sketch of a possible definition follows the list):

  • PackageID, a unique number identifying the package.
  • PackageType, we will use 1 for this example. If you want to use another template you can use this column. If you have made manual changes to a package, change this field to 0 and the package won't be recreated.
  • PackageName, the name of the package.
  • PackageGroup, used to group related packages, e.g. 1 Staging, 2 Dimensions, etc. This will play an important role when we create the master package in part 5.
  • SourceInput, a SQL statement that selects the rows from the source database.
  • SourceConnection, the name used as the source connection.
  • DestinationName, the name of the destination table.
  • DestinationConnection, the name used as the destination connection.
  • ErrorFile, the full path and name of the raw error file.
  • TruncateDestination, a flag to indicate whether you want to truncate the destination table.
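
A minimal sketch of what this table could look like; this is an assumption on my part, the actual definition from part 2 may differ in data types or extra columns:

CREATE TABLE [dbo].[SsisPackages] (
     PackageID int IDENTITY(1,1) NOT NULL
   , PackageType int NOT NULL
   , PackageName nvarchar(128) NOT NULL
   , PackageGroup nvarchar(50) NOT NULL
   , SourceInput nvarchar(max) NOT NULL
   , SourceConnection nvarchar(128) NOT NULL
   , DestinationName nvarchar(128) NOT NULL
   , DestinationConnection nvarchar(128) NOT NULL
   , ErrorFile nvarchar(260) NOT NULL
   , TruncateDestination bit NOT NULL);
/* Hypothetical SQL code for the SsisPackages table */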

Now, let’s add some rows to this table…

8. Add rows to SsisPackages Table

Execute the following SQL query in Management Studio:

USE [MyDwh_meta]
GO
TRUNCATE TABLE [dbo].[SsisPackages];
INSERT INTO [dbo].[SsisPackages] ([PackageType], [PackageName]
      , [PackageGroup], [SourceInput], [SourceConnection], [DestinationName]
      , [DestinationConnection], [ErrorFile], [TruncateDestination])
SELECT 1, N'Import_Northwind_Categories', N'1 Staging'
      , N'SELECT CategoryID, CategoryName FROM [dbo].[Categories]'
      , N'Northwind', N'[dbo].[Northwind_Categories]', N'stagingDB'
      , N'c:\SSISErrors\Categories.raw', 1
UNION ALL
SELECT 1, N'Import_Northwind_Customers', N'1 Staging'
      , N'SELECT CustomerID, CompanyName, City, Country FROM [dbo].[Customers]'
      , N'Northwind', N'[dbo].[Northwind_Customers]', N'stagingDB'
      , N'c:\SSISErrors\Customers.raw', 1
UNION ALL
SELECT 1, N'Import_Northwind_Employees', N'1 Staging'
      , N'SELECT EmployeeID, LastName, FirstName, Title, ReportsTo FROM [dbo].[Employees]'
      , N'Northwind', N'[dbo].[Northwind_Employees]', N'stagingDB'
      , N'c:\SSISErrors\Employees.raw', 1
UNION ALL
SELECT 1, N'Import_Northwind_OrderDetails', N'1 Staging'
      , N'SELECT OrderID, ProductID, UnitPrice, Quantity, Discount FROM [dbo].[Order Details]'
      , N'Northwind', N'[dbo].[Northwind_OrderDetails]', N'stagingDB'
      , N'c:\SSISErrors\OrderDetails.raw', 1
UNION ALL
SELECT 1, N'Import_Northwind_Orders', N'1 Staging'
      , N'SELECT OrderID, CustomerID, EmployeeID, OrderDate FROM [dbo].[Orders]'
      , N'Northwind', N'[dbo].[Northwind_Orders]', N'stagingDB'
      , N'c:\SSISErrors\Orders.raw', 1
UNION ALL
SELECT 1, N'Import_Northwind_Products', N'1 Staging'
      , N'SELECT ProductID, ProductName, SupplierID, CategoryID FROM [dbo].[Products]'
      , N'Northwind', N'[dbo].[Northwind_Products]', N'stagingDB'
      , N'c:\SSISErrors\Products.raw', 1
UNION ALL
SELECT 1, N'Import_Northwind_Suppliers', N'1 Staging'
      , N'SELECT SupplierID, CompanyName FROM [dbo].[Suppliers]'
      , N'Northwind', N'[dbo].[Northwind_Suppliers]', N'stagingDB'
      , N'c:\SSISErrors\Suppliers.raw', 1
UNION ALL
SELECT 1, N'Import_DimCustomer', N'2 Dimensions'
      , N'SELECT * FROM [dbo].[vwDimCustomer]'
      , N'DWH', N'[dbo].[dimCustomer]', N'DWH'
      , N'c:\SSISErrors\dimCustomer.raw', 1
UNION ALL
SELECT 1, N'Import_DimEmployee', N'2 Dimensions'
      , N'SELECT * FROM [dbo].[vwDimEmployee]'
      , N'DWH', N'[dbo].[dimEmployee]', N'DWH'
      , N'c:\SSISErrors\dimEmployee.raw', 1
UNION ALL
SELECT 1, N'Import_DimProduct', N'2 Dimensions'
      , N'SELECT * FROM [dbo].[vwDimProduct]'
      , N'DWH', N'[dbo].[dimProduct]', N'DWH'
      , N'c:\SSISErrors\dimProduct.raw', 1
UNION ALL
SELECT 1, N'Import_FactOrder', N'3 Fact tables'
      , N'SELECT * FROM [dbo].[vwFactOrder]'
      , N'DWH', N'[dbo].[factOrder]', N'DWH'
      , N'c:\SSISErrors\factOrder.raw', 1;

SELECT * FROM [dbo].[SsisPackages];

You will see the following results:

(Image: the rows in the SsisPackages table)
The Biml code to Create the packages

Next I will implement BimlScript to loop through these rows. BimlScript is enclosed in the following tags: <# script #>
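
Three kinds of tags appear in the script below: <# ... #> encloses control logic, <#= ... #> writes the value of an expression into the Biml, and <#@ ... #> holds directives. A trivial sketch:

<# for (int i = 1; i < 3; i++) { #>
    <Variable Name="Var<#=i#>" DataType="Int32" Namespace="User">0</Variable>
<# } #>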

I used the xml code created in part 3 and changed it into:

<#@ template language="C#" hostspecific="true" #>
<#@ import namespace="System.Data" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="metaDB"
                         ConnectionString="Data Source=.;Initial Catalog=MyDwh_meta;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
        <OleDbConnection Name="Northwind"
                         ConnectionString="Data Source=.;Initial Catalog=Northwind;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
        <OleDbConnection Name="stagingDB"
                         ConnectionString="Data Source=.;Initial Catalog=MyDwh_st;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
        <OleDbConnection Name="DWH"
                         ConnectionString="Data Source=.;Initial Catalog=MyDwh;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
    </Connections>
    <Packages>
        <# string sConn = "Provider=SQLNCLI10;Server=.;Initial Catalog=MyDwh_meta;Integrated Security=SSPI;"; #>
        <# string sSQL = "SELECT * FROM SsisPackages WHERE PackageType=1"; #>
        <# DataTable tblPackages = ExternalDataAccess.GetDataTable(sConn,sSQL); #>
        <# foreach (DataRow row in tblPackages.Rows){ #>
        <Package Name="<#=row["PackageName"]#>" ConstraintMode="Linear" AutoCreateConfigurationsType="None">
            <!--Variables-->
            <Variables>
                <Variable Name="AuditID" DataType="Int32" Namespace="User">0</Variable>
                <Variable Name="PackageID" DataType="Int32" Namespace="User"><#=row["PackageID"]#></Variable>
                <Variable Name="RcStart" DataType="Int32" Namespace="User">0</Variable>
                <Variable Name="RcExtract" DataType="Int32" Namespace="User">0</Variable>
                <Variable Name="RcInsert" DataType="Int32" Namespace="User">0</Variable>
                <Variable Name="RcUpdate" DataType="Int32" Namespace="User">0</Variable>
                <Variable Name="RcError" DataType="Int32" Namespace="User">0</Variable>
                <Variable Name="RcEnd" DataType="Int32" Namespace="User">0</Variable>
                <Variable Name="ErrorFile" DataType="String" Namespace="User"><#=row["ErrorFile"]#></Variable>
            </Variables>
            <!--Package Configurations-->
            <PackageConfigurations>
                <PackageConfiguration Name="MetaDB">
                    <EnvironmentVariableInput EnvironmentVariable="Northwind_Config"></EnvironmentVariableInput>
                    <ConfigurationValues>
                        <ConfigurationValue DataType="String"
                                            PropertyPath="\Package.Connections[metaDB].Properties[ConnectionString]"
                                            Name="metaDB"
                                            Value="Data Source=.;Initial Catalog=MyDwh_meta;Provider=SQLNCLI10.1;Integrated Security=SSPI;">
                        </ConfigurationValue>
                    </ConfigurationValues>
                </PackageConfiguration>
                <PackageConfiguration Name="<#=row["SourceConnection"]#>" ConnectionName="metaDB">
                    <ExternalTableInput Table="[dbo].[SsisConfiguration]" />
                </PackageConfiguration>
<#if (row["SourceConnection"].ToString()!=row["DestinationConnection"].ToString()) {#>
                <PackageConfiguration Name="<#=row["DestinationConnection"]#>" ConnectionName="metaDB">
                    <ExternalTableInput Table="[dbo].[SsisConfiguration]" />
                </PackageConfiguration>
<#}#>
            </PackageConfigurations>
            <!--Tasks-->
            <Tasks>
                <!--Get Initial RowCount-->
                <ExecuteSQL Name="SQL - Get Initial RowCount" ConnectionName="<#=row["DestinationConnection"]#>" ResultSet="SingleRow">
                    <DirectInput>SELECT COUNT(*) AS RcStart FROM <#=row["DestinationName"]#></DirectInput>
                    <Results>
                        <Result Name="RcStart" VariableName="User.RcStart" />
                    </Results>
                </ExecuteSQL>
<# if((bool)row["TruncateDestination"]){ #>
                <!--Truncate Table-->
                <ExecuteSQL Name="SQL - Truncate Destination Table" ConnectionName="<#=row["DestinationConnection"]#>" ResultSet="None">
                    <DirectInput>TRUNCATE TABLE <#=row["DestinationName"]#></DirectInput>
                </ExecuteSQL>
<# } #>
                <!--Get auditID-->
                <ExecuteSQL Name="SQL - Create AuditRow and Get AuditID" ConnectionName="metaDB" ResultSet="SingleRow">
                    <DirectInput>EXEC [dbo].[uspNewAuditRow] ?,?,?,?,?,?</DirectInput>
                    <Parameters>
                        <Parameter Name="0" Direction="Input" DataType="Int32" VariableName="User.PackageID" />
                        <Parameter Name="1" Direction="Input" DataType="Guid" VariableName="System.PackageID" />
                        <Parameter Name="2" Direction="Input" DataType="Guid" VariableName="System.VersionGUID" />
                        <Parameter Name="3" Direction="Input" DataType="Guid" VariableName="System.ExecutionInstanceGUID" />
                        <Parameter Name="4" Direction="Input" DataType="DateTime" VariableName="System.StartTime" />
                        <Parameter Name="5" Direction="Input" DataType="Int32" VariableName="User.RcStart" />
                    </Parameters>
                    <Results>
                        <Result Name="0" VariableName="User.AuditID" />
                    </Results>
                </ExecuteSQL>
                <!--Import Table-->
                <Dataflow Name="DFT - <#=row["PackageName"]#>">
                    <Transformations>
                        <OleDbSource Name="ODS - <#=row["SourceConnection"]#>" ConnectionName="<#=row["SourceConnection"]#>">
                            <DirectInput><#=row["SourceInput"]#></DirectInput>
                        </OleDbSource>
                        <RowCount Name="RC - Extracted Rows" VariableName="User.RcExtract" />
                        <DerivedColumns Name="DC - Add Metadata">
                            <Columns>
                                <Column Name="InsertAuditID" DataType="Int32" ReplaceExisting="false">@AuditID</Column>
                                <Column Name="UpdateAuditID" DataType="Int32" ReplaceExisting="false">@AuditID</Column>
                                <Column Name="RowStartDate" DataType="DateTime" ReplaceExisting="false">@StartTime</Column>
                                <Column Name="RowEndDate" DataType="DateTime" ReplaceExisting="false">(DT_dbDATE)"12/31/9999"</Column>
                            </Columns>
                        </DerivedColumns>
                        <OleDbDestination Name="ODD-BulkLoad <#=row["DestinationConnection"]#>" ConnectionName="<#=row["DestinationConnection"]#>">
                            <ExternalTableOutput Table="<#=row["DestinationName"]#>"/>
                            <ErrorHandling ErrorRowDisposition="RedirectRow" TruncationRowDisposition="RedirectRow" />
                        </OleDbDestination>
                        <OleDbDestination Name="ODD-RowbyRow <#=row["DestinationConnection"]#>" ConnectionName="<#=row["DestinationConnection"]#>" UseFastLoadIfAvailable="false" >
                            <InputPath OutputPathName="ODD-BulkLoad <#=row["DestinationConnection"]#>.Error" />
                            <ExternalTableOutput Table="<#=row["DestinationName"]#>"/>
                            <ErrorHandling ErrorRowDisposition="RedirectRow" TruncationRowDisposition="RedirectRow" />
                        </OleDbDestination>
                        <RowCount Name="RC - Error Rows" VariableName="User.RcError">
                            <InputPath OutputPathName="ODD-RowbyRow <#=row["DestinationConnection"]#>.Error" />
                        </RowCount>
                        <RawFileDestination Name="RTF - Error File">
                            <FileFromVariableOutput VariableName="User.ErrorFile"/>
                        </RawFileDestination>
                    </Transformations>
                </Dataflow>
                <!--Get Final RowCount-->
                <ExecuteSQL Name="SQL - Get Final RowCount" ConnectionName="<#=row["DestinationConnection"]#>" ResultSet="SingleRow">
                    <DirectInput>SELECT COUNT(*) AS RcEnd FROM <#=row["DestinationName"]#></DirectInput>
                    <Results>
                        <Result Name="RcEnd" VariableName="User.RcEnd" />
                    </Results>
                </ExecuteSQL>
                <!--Update Audit Row-->
                <ExecuteSQL Name="SQL - Update Audit Row" ConnectionName="metaDB" ResultSet="None">
                    <DirectInput>EXEC [dbo].[uspUpdateAuditRow] ?,?,?,?,?,?</DirectInput>
                    <Parameters>
                        <Parameter Name="0" Direction="Input" DataType="Int32" VariableName="User.AuditID" />
                        <Parameter Name="1" Direction="Input" DataType="Int32" VariableName="User.RcExtract" />
                        <Parameter Name="2" Direction="Input" DataType="Int32" VariableName="User.RcInsert" />
                        <Parameter Name="3" Direction="Input" DataType="Int32" VariableName="User.RcUpdate" />
                        <Parameter Name="4" Direction="Input" DataType="Int32" VariableName="User.RcError" />
                        <Parameter Name="5" Direction="Input" DataType="Int32" VariableName="User.RcEnd" />
                    </Parameters>
                </ExecuteSQL>
            </Tasks>
        </Package>
<# } #>
    </Packages>
</Biml>

9. Create the packages – and test them.
  • In BIDS right click the project in solution explorer and select Add New Biml File.
  • Rename the file to: Create the packages.biml
  • Copy the Biml code on this page to the newly created Biml file.
  • Right click the Create the packages.biml in Solution Explorer and select Generate SSIS Packages.
  • Open one of the newly created packages, Import_Northwind_Categories.dtsx, and examine its contents.
  • Test the package by executing it.

Creating a Meta Data Driven SSIS Solution with Biml – 3. Creating a Table Package

In this third blogpost on creating a meta data driven SSIS solution I'll create a Biml document with which you can create a package that imports the result of a query on the source database into a table in the destination database. In blogpost 4 I will enhance this document with Biml script that loops through the entries in the metadata table SsisPackages and creates a package for every entry in this table.

The design of the table package is loosely based on the design used by Joy Mundy and Warren Thornthwaite in their book
The Microsoft Data Warehouse Toolkit With SQL Server 2008 R2. So it includes a system for logging results, adds lineage to the imported rows in the data warehouse and creates error files where appropriate.

The Control Flow

The control flow consists of:

  1. An Execute SQL Task, retrieving the number of rows in the destination table at the start of the package.
  2. (Optional, depends on a setting in metadata table) An Execute SQL Task, Truncating the destination table.
  3. An Execute SQL Task against the meta database, creating a new row in the SsisAudit table and returning the AuditID of the created row, which is used for lineage in the Data Flow.
  4. The Data Flow, see next paragraph
  5. An Execute SQL Task, retrieving the number of rows in the destination table at the end of the package.
  6. An Execute SQL Task to update the audit row with the end time and several row counters.
The Data Flow

The data flow first tries to import the data with a bulk-load setting (step 4). Any error rows are then loaded using a row-by-row setting (step 5), and any rows that still fail are written to a raw error file (step 7). In the data flow we also count the extracted rows (step 2) and any error rows (step 6). The package also adds the lineage metadata (AuditID and timestamps) to the data flow (step 3).

The Biml code

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="metaDB"
                         ConnectionString="Data Source=.;Initial Catalog=MyDwh_meta;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
        <OleDbConnection Name="Northwind"
                         ConnectionString="Data Source=.;Initial Catalog=Northwind;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
        <OleDbConnection Name="stagingDB"
                         ConnectionString="Data Source=.;Initial Catalog=MyDwh_st;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
        <OleDbConnection Name="DWH"
                         ConnectionString="Data Source=.;Initial Catalog=MyDwh;Provider=SQLNCLI10.1;Integrated Security=SSPI;"
                         CreatePackageConfiguration="true"></OleDbConnection>
    </Connections>
    <Packages>
        <Package Name="Import_Northwind_Categories" ConstraintMode="Linear" AutoCreateConfigurationsType="None">
            <!--Variables-->
            <Variables>
                <Variable Name="AuditID" DataType="Int32" Namespace="User" >0</Variable>
                <Variable Name="PackageID" DataType="Int32" Namespace="User" >99</Variable>
                <Variable Name="RcStart" DataType="Int32" Namespace="User" >0</Variable>
                <Variable Name="RcExtract" DataType="Int32" Namespace="User" >0</Variable>
                <Variable Name="RcInsert" DataType="Int32" Namespace="User" >0</Variable>
                <Variable Name="RcUpdate" DataType="Int32" Namespace="User" >0</Variable>
                <Variable Name="RcError" DataType="Int32" Namespace="User">0</Variable>
                <Variable Name="RcEnd" DataType="Int32" Namespace="User">0</Variable>
                <Variable Name="ErrorFile" DataType="String" Namespace="User">c:\myErrorFile.raw</Variable>
            </Variables>
            <!--Package Configurations-->
            <PackageConfigurations>
                <PackageConfiguration Name="MetaDB" >
                    <EnvironmentVariableInput EnvironmentVariable="Northwind_Config"></EnvironmentVariableInput>
                    <ConfigurationValues>
                        <ConfigurationValue DataType="String"
                                            PropertyPath="\Package.Connections[metaDB].Properties[ConnectionString]"
                                            Name="metaDB"
                                            Value="Data Source=.;Initial Catalog=ROCWB_MIS_config;Provider=SQLNCLI10.1;Integrated Security=SSPI;">
                        </ConfigurationValue>
                    </ConfigurationValues>
                </PackageConfiguration>
                <PackageConfiguration Name="Northwind" ConnectionName="metaDB">
                    <ExternalTableInput Table="[dbo].[SsisConfiguration]" />
                </PackageConfiguration>
                <PackageConfiguration Name="stagingDB" ConnectionName="metaDB">
                    <ExternalTableInput Table="[dbo].[SsisConfiguration]" />
                </PackageConfiguration>
                <PackageConfiguration Name="DWH" ConnectionName="metaDB">
                    <ExternalTableInput Table="[dbo].[SsisConfiguration]" />
                </PackageConfiguration>
            </PackageConfigurations>
            <!--Tasks-->
            <Tasks>
                <!--Get Initial RowCount-->
                <ExecuteSQL Name="SQL - Get Initial RowCount" ConnectionName="stagingDB" ResultSet="SingleRow">
                    <DirectInput>SELECT COUNT(*) AS RcStart FROM [dbo].[Northwind_Categories]</DirectInput>
                    <Results>
                        <Result Name="RcStart" VariableName="User.RcStart" />
                    </Results>
                </ExecuteSQL>
                <!--Truncate Table-->
                <ExecuteSQL Name="SQL - Truncate Destination Table" ConnectionName="stagingDB" ResultSet="None">
                    <DirectInput>TRUNCATE TABLE [dbo].[Northwind_Categories]</DirectInput>
                </ExecuteSQL>
                <!--Get auditID-->
                <ExecuteSQL Name="SQL - Create AuditRow and Get AuditID" ConnectionName="metaDB" ResultSet="SingleRow">
                    <DirectInput>EXEC [dbo].[uspNewAuditRow] ?,?,?,?,?,?</DirectInput>
                    <Parameters>
                        <Parameter Name="0" Direction="Input" DataType="Int32" VariableName="User.PackageID" />
                        <Parameter Name="1" Direction="Input" DataType="Guid" VariableName="System.PackageID" />
                        <Parameter Name="2" Direction="Input" DataType="Guid" VariableName="System.VersionGUID" />
                        <Parameter Name="3" Direction="Input" DataType="Guid" VariableName="System.ExecutionInstanceGUID" />
                        <Parameter Name="4" Direction="Input" DataType="DateTime" VariableName="System.StartTime" />
                        <Parameter Name="5" Direction="Input" DataType="Int32" VariableName="User.RcStart" />
                    </Parameters>
                    <Results>
                        <Result Name="0" VariableName="User.AuditID" />
                    </Results>
                </ExecuteSQL>
                <!--Import Northwind_Categories-->
                <Dataflow Name="DFT - Import_Northwind_Categories">
                    <Transformations>
                        <OleDbSource Name="ODS - Northwind" ConnectionName="Northwind">
                            <DirectInput>SELECT CategoryID, CategoryName FROM [dbo].[Categories]</DirectInput>
                        </OleDbSource>
                        <RowCount Name="RC - Extracted Rows"  VariableName="User.RcExtract" />
                        <DerivedColumns Name="DC - Add Metadata">
                            <Columns>
                                <Column Name="InsertAuditID" DataType="Int32" ReplaceExisting="false">@AuditID</Column>
                                <Column Name="UpdateAuditID" DataType="Int32" ReplaceExisting="false">@AuditID</Column>
                                <Column Name="RowStartDate" DataType="DateTime" ReplaceExisting="false">@StartTime</Column>
                                <Column Name="RowEndDate" DataType="DateTime" ReplaceExisting="false">(DT_dbDATE)"12/31/9999"</Column>
                            </Columns>
                        </DerivedColumns>
                        <OleDbDestination Name="ODD-BulkLoad stagingDB" ConnectionName="stagingDB">
                            <ExternalTableOutput Table="[dbo].[Northwind_Categories]"/>
                            <ErrorHandling ErrorRowDisposition="RedirectRow" TruncationRowDisposition="RedirectRow" />
                        </OleDbDestination>
                        <OleDbDestination Name="ODD-RowbyRow stagingDB" ConnectionName="stagingDB" UseFastLoadIfAvailable="false" >
                            <InputPath OutputPathName="ODD-BulkLoad stagingDB.Error" />
                            <ExternalTableOutput Table="[dbo].[Northwind_Categories]"/>
                            <ErrorHandling ErrorRowDisposition="RedirectRow" TruncationRowDisposition="RedirectRow" />
                        </OleDbDestination>
                        <RowCount Name="RC - Error Rows" VariableName="User.RcError">
                            <InputPath OutputPathName="ODD-RowbyRow stagingDB.Error" />
                        </RowCount>
                        <RawFileDestination Name="RTF - Error File">
                            <FileFromVariableOutput VariableName="User.ErrorFile"/>
                        </RawFileDestination>
                    </Transformations>
                </Dataflow>
                <!--Get Final RowCount-->
                <ExecuteSQL Name="SQL - Get Final RowCount" ConnectionName="stagingDB" ResultSet="SingleRow">
                    <DirectInput>SELECT COUNT(*) AS RcEnd FROM [dbo].[Northwind_Categories]</DirectInput>
                    <Results>
                        <Result Name="RcEnd" VariableName="User.RcEnd" />
                    </Results>
                </ExecuteSQL>
                <!--Update Audit Row-->
                <ExecuteSQL Name="SQL - Update Audit Row" ConnectionName="metaDB" ResultSet="None">
                    <DirectInput>EXEC [dbo].[uspUpdateAuditRow] ?,?,?,?,?,?</DirectInput>
                    <Parameters>
                        <Parameter Name="0" Direction="Input" DataType="Int32" VariableName="User.AuditID" />
                        <Parameter Name="1" Direction="Input" DataType="Int32" VariableName="User.RcExtract" />
                        <Parameter Name="2" Direction="Input" DataType="Int32" VariableName="User.RcInsert" />
                        <Parameter Name="3" Direction="Input" DataType="Int32" VariableName="User.RcUpdate" />
                        <Parameter Name="4" Direction="Input" DataType="Int32" VariableName="User.RcError" />
                        <Parameter Name="5" Direction="Input" DataType="Int32" VariableName="User.RcEnd" />
                    </Parameters>
                </ExecuteSQL>
            </Tasks>
        </Package>
    </Packages>
</Biml>

7. Create a package – and test it.
  • Open Business Intelligence Development Studio and create a new Integration Services Project.
  • Right click  the project in solution explorer and select Add New Biml File.
  • Rename the file to: Create a package.biml
  • Copy the Biml code on this page to the newly created Biml file.
  • Right click the Create a package.biml in Solution Explorer and select Generate SSIS Packages.
  • Open the newly created package Import_Northwind_Categories.dtsx and examine the contents.
  • Test the package by executing it.

Creating a Meta Data Driven SSIS Solution with Biml – 2. Setting the stage

This is the second blogpost on the creation of a Meta Data Driven SSIS Solution using Biml. In this post we are setting the stage: we are going to create a simple Kimball-based data warehouse with 3 dimensions:

  • dimCustomer
  • dimEmployee
  • dimProduct

and one fact table: factOrder.

Furthermore we’ll implement a staging area and a meta database.

1. Create the databases:

USE [master]
GO
CREATE DATABASE [MyDwh]
GO
CREATE DATABASE [MyDwh_st]
GO
CREATE DATABASE [MyDwh_meta]
GO

2. Create the staging tables:

USE [MyDwh_st]
GO
CREATE TABLE [dbo].[Northwind_Suppliers](
   [SupplierID] [int] NOT NULL,
   [CompanyName] [nvarchar](40) NOT NULL,
   [InsertAuditID] [int] NULL,
   [UpdateAuditID] [int] NULL,
   [RowStartDate] [datetime] NULL,
   [RowEndDate] [datetime] NULL);
CREATE TABLE [dbo].[Northwind_Products](
   [ProductID] [int] NOT NULL,
   [ProductName] [nvarchar](40) NOT NULL,
   [SupplierID] [int] NULL,
   [CategoryID] [int] NULL,
   [InsertAuditID] [int] NULL,
   [UpdateAuditID] [int] NULL,
   [RowStartDate] [datetime] NULL,
   [RowEndDate] [datetime] NULL);
CREATE TABLE [dbo].[Northwind_Orders](
   [OrderID] [int] NOT NULL,
   [CustomerID] [nchar](5) NULL,
   [EmployeeID] [int] NULL,
   [OrderDate] [datetime] NULL,
   [InsertAuditID] [int] NULL,
   [UpdateAuditID] [int] NULL,
   [RowStartDate] [datetime] NULL,
   [RowEndDate] [datetime] NULL);
CREATE TABLE [dbo].[Northwind_OrderDetails](
   [OrderID] [int] NOT NULL,
   [ProductID] [int] NOT NULL,
   [UnitPrice] [money] NOT NULL,
   [Quantity] [smallint] NOT NULL,
   [Discount] [real] NOT NULL,
   [InsertAuditID] [int] NULL,
   [UpdateAuditID] [int] NULL,
   [RowStartDate] [datetime] NULL,
   [RowEndDate] [datetime] NULL);
CREATE TABLE [dbo].[Northwind_Employees](
   [EmployeeID] [int] NOT NULL,
   [LastName] [nvarchar](20) NOT NULL,
   [FirstName] [nvarchar](10) NOT NULL,
   [Title] [nvarchar](30) NULL,
   [ReportsTo] [int] NULL,
   [InsertAuditID] [int] NULL,
   [UpdateAuditID] [int] NULL,
   [RowStartDate] [datetime] NULL,
   [RowEndDate] [datetime] NULL);
CREATE TABLE [dbo].[Northwind_Customers](
   [CustomerID] [nchar](5) NOT NULL,
   [CompanyName] [nvarchar](40) NOT NULL,
   [City] [nvarchar](15) NULL,
   [Country] [nvarchar](15) NULL,
   [InsertAuditID] [int] NULL,
   [UpdateAuditID] [int] NULL,
   [RowStartDate] [datetime] NULL,
   [RowEndDate] [datetime] NULL);
CREATE TABLE [dbo].[Northwind_Categories](
   [CategoryID] [int] NOT NULL,
   [CategoryName] [nvarchar](15) NOT NULL,
   [InsertAuditID] [int] NULL,
   [UpdateAuditID] [int] NULL,
   [RowStartDate] [datetime] NULL,
   [RowEndDate] [datetime] NULL);

3. Create the dimension and fact tables

USE [MyDwh]
GO
CREATE TABLE dimCustomer (
   CustomerKey INT IDENTITY(1,1) NOT NULL
   ,CustomerID NCHAR(5) NOT NULL
   ,CustomerName NVARCHAR(40)
   ,City NVARCHAR(15)
   ,Country NVARCHAR(15)
   ,InsertAuditID INT
   ,UpdateAuditID INT
   ,RowStartDate DATETIME
   ,RowEndDate DATETIME);
GO
CREATE TABLE dimEmployee (
   EmployeeKey INT IDENTITY(1,1) NOT NULL
   ,EmployeeID INT NOT NULL
   ,EmployeeName NVARCHAR(50)
   ,Title NVARCHAR(30)
   ,Manager NVARCHAR(50)
   ,InsertAuditID INT
   ,UpdateAuditID INT
   ,RowStartDate DATETIME
   ,RowEndDate DATETIME);
GO
CREATE TABLE dimProduct (
   ProductKey INT IDENTITY(1,1) NOT NULL
   ,ProductID INT NOT NULL
   ,ProductName NVARCHAR(40)
   ,Category NVARCHAR(15)
   ,Supplier NVARCHAR(40)
   ,InsertAuditID INT
   ,UpdateAuditID INT
   ,RowStartDate DATETIME
   ,RowEndDate DATETIME);
GO
CREATE TABLE factOrder (
   OrderDateKey INT NOT NULL
   ,CustomerKey INT NOT NULL
   ,EmployeeKey INT NOT NULL
   ,ProductKey INT NOT NULL
   ,OrderQuantity INT
   ,OrderAmount money
   ,DiscountAmount money
   ,InsertAuditID INT
   ,UpdateAuditID INT
   ,RowStartDate DATETIME
   ,RowEndDate DATETIME);
GO

4. Create the views for the dimensions and fact tables

USE [MyDwh]
GO
CREATE VIEW [dbo].[vwDimCustomer] AS
SELECT [CustomerID]
      ,[CompanyName] AS CustomerName
      ,[City]
      ,[Country]
FROM  MyDwh_st.dbo.Northwind_Customers;
GO

CREATE VIEW vwDimProduct AS
SELECT pr.ProductID
   , pr.ProductName
   , ca.CategoryName AS Category
   , su.CompanyName AS Supplier
FROM MyDwh_st.dbo.Northwind_Categories AS ca
INNER JOIN MyDwh_st.dbo.Northwind_Products AS pr ON ca.CategoryID = pr.CategoryID
INNER JOIN MyDwh_st.dbo.Northwind_Suppliers AS su ON pr.SupplierID = su.SupplierID;
GO

CREATE VIEW vwDimEmployee AS
SELECT em.[EmployeeID]
      ,em.[FirstName] + ' ' + em.[LastName] AS EmployeeName
      ,em.Title
      ,ma.[FirstName] + ' ' + ma.[LastName] AS Manager
FROM MyDwh_st.dbo.Northwind_Employees em
LEFT OUTER JOIN  MyDwh_st.dbo.Northwind_Employees ma ON ma.[EmployeeID]=em.[ReportsTo];
GO

CREATE VIEW vwFactOrders AS
SELECT OrderDateKey=YEAR(o.OrderDate)*10000+MONTH(o.OrderDate)*100+DAY(o.OrderDate)
   , CustomerKey=ISNULL(cu.CustomerKey,-1)
   , EmployeeKey=ISNULL(em.EmployeeKey,-1)
   , ProductKey=ISNULL(pr.ProductKey,-1)
   , OrderQuantity=od.Quantity
   , OrderAmount=od.Quantity * od.UnitPrice
   , DiscountAmount=CONVERT(money, od.Quantity * od.UnitPrice * od.Discount)
FROM  MyDwh_st.dbo.[Northwind_OrderDetails] od
INNER JOIN MyDwh_st.dbo.Northwind_Orders o ON o.OrderID = od.OrderID
LEFT OUTER JOIN dbo.dimCustomer cu ON cu.CustomerID=o.CustomerID AND cu.RowEndDate>o.OrderDate
LEFT OUTER JOIN dbo.dimEmployee em ON em.EmployeeID=o.EmployeeID AND em.RowEndDate>o.OrderDate
LEFT OUTER JOIN dbo.dimProduct pr ON pr.ProductID=od.ProductID AND pr.RowEndDate>o.OrderDate;
GO

5. Create the objects for the meta database

USE [MyDwh_meta]
GO

CREATE TABLE [dbo].[SsisPackages](
   [PackageID] [int] IDENTITY(1,1) NOT NULL,
   [PackageType] [int] NULL,
   [PackageName] [nvarchar](50) NULL,
   [PackageGroup] [nvarchar](50) NULL,
   [SourceInput] [nvarchar](MAX) NULL,
   [SourceConnection] [nvarchar](50) NULL,
   [DestinationName] [nvarchar](50) NULL,
   [DestinationConnection] [nvarchar](50) NULL,
   [ErrorFile] [nvarchar](MAX) NULL,
   [TruncateDestination] [bit] NULL,
CONSTRAINT [PK_ssisPackages] PRIMARY KEY CLUSTERED
(  [PackageID] ASC));
GO
CREATE TABLE [dbo].[SsisConfiguration](
   [ConfigurationFilter] [nvarchar](255) NOT NULL,
   [ConfiguredValue] [nvarchar](1000) NULL,
   [PackagePath] [nvarchar](255) NOT NULL,
   [ConfiguredValueType] [nvarchar](20) NOT NULL);
GO
CREATE TABLE [dbo].[SsisAudit](
   [AuditID] [int] IDENTITY(1,1) NOT NULL,
   [PackageID] [int] NOT NULL,
   [PackageGUID] [uniqueidentifier] NULL,
   [PackageVersionGUID] [uniqueidentifier] NULL,
   [ExecutionID] [uniqueidentifier] NULL,
   [PackageStartDate] [datetime] NULL,
   [PackageEnddate] [datetime] NULL,
   [ExtractRowCount] [int] NULL,
   [InsertRowCount] [int] NULL,
   [UpdateRowCount] [int] NULL,
   [ErrorRowCount] [int] NULL,
   [TableInitialRowCount] [int] NULL,
   [TableFinalRowCount] [int] NULL,
   [IsProcessed] [bit] NULL,
CONSTRAINT [PK_SsisAudit] PRIMARY KEY CLUSTERED
(  [AuditID] ASC));
GO

CREATE PROCEDURE [dbo].[uspUpdateAuditRow]
    @AuditID INT
    ,@ExtractRowCount INT
    ,@InsertRowCount INT
    ,@UpdateRowCount INT
    ,@ErrorRowCount INT
    ,@TableFinalRowCount INT
AS
UPDATE [dbo].[SsisAudit]
SET [PackageEnddate]=SYSDATETIME()
    ,[ExtractRowCount]=@ExtractRowCount
    ,[InsertRowCount]=@InsertRowCount
    ,[UpdateRowCount]=@UpdateRowCount
    ,[ErrorRowCount]=@ErrorRowCount
    ,[TableFinalRowCount]=@TableFinalRowCount
    ,[IsProcessed]=1
WHERE [AuditID]=@AuditID;
GO

CREATE PROCEDURE [dbo].[uspNewAuditRow]
    @PackageID INT
    ,@PackageGUID UNIQUEIDENTIFIER
    ,@PackageVersionGUID UNIQUEIDENTIFIER
    ,@ExecutionID UNIQUEIDENTIFIER
    ,@PackageStartDate DATETIME
    ,@TableInitialRowCount INT
AS
INSERT INTO [dbo].[SsisAudit]
           ([PackageID]
           ,[PackageGUID]
           ,[PackageVersionGUID]
           ,[ExecutionID]
           ,[PackageStartDate]
           ,[TableInitialRowCount]
           ,[IsProcessed])
VALUES
           (@PackageID
           ,@PackageGUID
           ,@PackageVersionGUID
           ,@ExecutionID
           ,@PackageStartDate
           ,@TableInitialRowCount
           ,0);
     
SELECT CAST(SCOPE_IDENTITY() AS INT) AS AuditID
GO

6. Create an environment variable to store the connection

I’ll use package configurations to make it easier to deploy the solution to different servers (production, test, development). These package configurations will hold the information for the various connections. The connection to the meta database is stored in an environment variable; the other package configurations will be stored in the SsisConfiguration table in the meta database.

So create an environment variable with the name Northwind_Config and value Data Source=.;Initial Catalog=MyDwh_meta;Provider=SQLNCLI10.1;Integrated Security=SSPI;

In Windows Explorer, right click the (My) Computer node and select Properties; on the Advanced tab, click the Environment Variables button and, under System variables, click New.
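
The other connections end up as rows in the SsisConfiguration table created in step 5. As an illustration only (the filter names, package paths and connection strings below are placeholders you would adapt to your own connection managers and servers), such rows could look like this:

-- Illustrative sample rows; filter names, package paths and connection
-- strings are placeholders for your own environment.
USE [MyDwh_meta]
GO
INSERT INTO dbo.SsisConfiguration (ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType)
VALUES
  (N'Staging', N'Data Source=.;Initial Catalog=MyDwh_st;Provider=SQLNCLI10.1;Integrated Security=SSPI;',
   N'\Package.Connections[Staging].Properties[ConnectionString]', N'String'),
  (N'DataWarehouse', N'Data Source=.;Initial Catalog=MyDwh;Provider=SQLNCLI10.1;Integrated Security=SSPI;',
   N'\Package.Connections[DataWarehouse].Properties[ConnectionString]', N'String');
GO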

Creating a Meta Data Driven SSIS Solution with Biml – 1. Overview

In this series of blog posts I’ll describe and walk you through the creation of a simple meta data driven SSIS solution using Biml.

Biml (Business Intelligence Markup Language) is an XML-based language that allows you to describe your BI Solution in a declarative fashion.

The Biml Package Generator is a new feature of BIDSHelper, a free add-in for the Microsoft Business Intelligence Development Studio (BIDS). It allows you to create packages based on Biml documents. Furthermore, you can use BimlScript to automate the creation of these packages.

I will use the Northwind sample database as the source for a simple Kimball-based data warehouse (MyDwh). I will also use a staging database (MyDwh_st) and a meta database (MyDwh_meta) that includes a table with a source-to-target mapping, which drives the creation of the SSIS packages, as illustrated below.
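
To make that mapping concrete, here is one illustrative row for the SsisPackages table described in part 2. The values (package type, group, source query, connection names and error file path) are placeholders, not part of the final solution:

-- Illustrative only: a mapping row that could drive the creation of a
-- package loading Northwind Categories into the staging table.
-- All values are placeholders; adapt them to your own setup.
USE [MyDwh_meta]
GO
INSERT INTO dbo.SsisPackages
    (PackageType, PackageName, PackageGroup, SourceInput, SourceConnection,
     DestinationName, DestinationConnection, ErrorFile, TruncateDestination)
VALUES
    (1, N'Import_Northwind_Categories', N'Staging',
     N'SELECT CategoryID, CategoryName FROM dbo.Categories',
     N'Northwind', N'[dbo].[Northwind_Categories]', N'MyDwh_st',
     N'C:\temp\Import_Northwind_Categories_errors.txt', 1);
GO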

The series consists of the following posts:

  1. Overview (this post)
  2. Setting the stage: primarily a description of the SQL Server databases and the SQL statements to create the environment.
  3. Creating a TablePackage: creating a Biml document to create a package that imports one table.
  4. Creating the TablePackages: using BimlScript to automate the creation of packages.
  5. Creating the MasterPackage