Thursday, March 23, 2006

SS2K DTS: Write File Task, Dynamic Properties

In this post I will explain why using a 'Dynamic Properties Task' in a SQL Server 2000 DTS package helps you centralize the definition and management of parameters needed by several tasks and connections (data stores) in the package. The alternative to using dynamic properties is to "hard code" property values all over the package, which makes maintaining the package harder and more error prone. Most articles about dynamic properties show how to use an external INI file as the source of the parameters, but I prefer DTS global variables, as they make the package self-contained.

I will use a 'Transform Data Task' of type 'Write File' as an example, as this task has an output FilePath property that is likely to change when the package is deployed to another server. In addition, any package that writes files will most likely require some mechanism for cleaning up existing files. Deleting existing files is important to ensure that the package can be rerun in the event of a failure and still produce the same results over and over (reliability).

The 'write file' task type is a special transformation that requires two source columns in the input: one for the target filename and one for the actual data to write to the file. In addition, the properties of the 'write file' task must be edited to specify the output path of the transformation, the file type, and options such as overwriting or appending to existing files.

If you use the property dialog of the transformation, you are effectively hard coding the property values deep inside each task. As the package grows bigger, it becomes harder and harder to remember all the places that contain values that need to be re-configured when deploying to another server or changing the server setup. The documentation of how to configure the package will also become a real mess, as you need to describe how to configure every single element of the package.

Whenever you are about to change a default property value of a DTS element (object), consider whether the value might require re-configuration depending on the deployment environment. Such values are good candidates for becoming configuration parameters of the DTS package. There are several mechanisms in DTS for storing common configuration values; I prefer global variables, as they are an integral part of the package and easy to administer using the DTS designer.

Use the 'Global Variables' folder in the 'DTS Package Properties' dialog to define the set of configuration parameters and their initial values. You must define the variables before using them as dynamic property values.
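
The global variables are also reachable from any ActiveX Script Task at runtime through the DTSGlobalVariables collection. A minimal sketch, reusing the 'ExportFilePathCCS' variable from the cleanup script later in this post ('AnotherParam' is made up purely for illustration):

Dim exportPath
exportPath = DTSGlobalVariables("ExportFilePathCCS").Value

' Variables can also be created from script if needed;
' DTSGlobalVariables.Parent is the running package object.
DTSGlobalVariables.Parent.GlobalVariables.AddGlobalVariable "AnotherParam", "initial value"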

Add a 'Dynamic Properties Task' as the first task in your package. Then add an 'On Success' workflow link to the rest of the package steps to ensure that the package gets correctly configured before execution. Open the properties dialog of the dynamic properties object, then click 'Add' to open the window showing all package elements and their settable properties (see screenshot below).


Browse to find the applicable 'write file' task and select the transformation properties node in the treeview. The right panel of the window shows the same properties that can be set directly on the task object, except that setting the values from this window makes them configuration parameters instead of hard coded values. Select the property that you want to convert into a dynamic property, click 'Set', and then connect the property to the applicable global variable. Repeat for all properties that should be parameter based.
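
When the Dynamic Properties Task executes, it simply assigns the global variable values to the properties you selected. Purely for illustration, the same assignment could be scripted by hand against the DTS object model; this is a rough sketch, and the task and transformation names are assumptions you would have to look up in your own package:

Option Explicit

Function Main()
    Dim pkg, pump, xform

    ' DTSGlobalVariables.Parent is the running package.
    Set pkg = DTSGlobalVariables.Parent

    ' "DTSTask_DTSDataPumpTask_1" and "DTSTransformation__1" are
    ' hypothetical names; check the actual names in the DTS designer.
    Set pump = pkg.Tasks("DTSTask_DTSDataPumpTask_1").CustomTask
    Set xform = pump.Transformations("DTSTransformation__1")

    xform.TransformServerProperties("FilePath").Value = _
        DTSGlobalVariables("ExportFilePathCCS").Value

    Main = DTSTaskExecResult_Success
End Function

The point of the Dynamic Properties Task is precisely that you do not have to write and maintain this kind of plumbing code yourself.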

The configured output folder of the 'write file' task is also used in an 'ActiveX Script Task' that deletes all existing output files to make the DTS package process repeatable. The script uses the same global variable that was used in the 'write file' task:

Option Explicit

Function Main()
    Dim path, filemask, fso

    path = DTSGlobalVariables("ExportFilePathCCS").Value
    filemask = path & "export_file_mask_here"

    Set fso = CreateObject("Scripting.FileSystemObject")

    ' NOTE: .DeleteFile throws an error if no files match the mask,
    ' so suppress errors around the call instead of probing for one
    ' hard-coded extension (.000) that may not be the only one present.
    On Error Resume Next
    fso.DeleteFile filemask & ".*"
    On Error GoTo 0

    Main = DTSTaskExecResult_Success

End Function

Read more about DTS best practices at Vyas' site.

You will find a good introduction to SQL Server 2000 DTS and to SQL Server 2005 SSIS at Database Journal.

Thursday, March 09, 2006

Archiving Office documents and e-mails to MSCRM team-sites

In a previous post, I described how we integrate SharePoint team-sites into MSCRM to provide collaboration features and document archive functionality to our MSCRM customers. We typically give the users the ability to create WSS team-sites for each account, each opportunity, or each custom entity such as 'project'. If the customer also uses SharePoint Portal Server (SPS), we register the team-sites in the site directory and include them in SPS search scopes to ensure that e.g. documents are indexed by the SPS search engine.

In addition, we provide an Office add-in that allows users to archive Word documents and Outlook e-mail messages directly into the MSCRM team-sites (much like the "snap"). MSCRM has poor support for storing documents (only as attachments to notes), and the support for creating new documents from templates and merging in data from the database is not much better. The rather lackluster support for document production in MSCRM is something potential customers always react negatively to in sales meetings.

This screenshot collage shows how the add-in is used in Word and Outlook:
E-mails are archived using the .MSG format. The SPS search engine can be configured to index .MSG files, thus making it easier to find archived messages. Documents are readily indexed by SPS out-of-the-box.

The templates and the datasets they use are configured using a 'document production admin' team-site, and the add-in uses the standard WSS web-services to get the list of templates:
The templates support merging in data from the datasets into placeholders in the document (bookmarks, document variables, [xml element name] literal text). The merge engine is simple and requires no scripting of each template: it just loops through the dataset and looks for all placeholders with the same name as the dataset column names:
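
The add-in itself is .NET code, but the heart of such a merge loop is small. A rough sketch in script form, under the assumption that columnValues is a Scripting.Dictionary mapping dataset column names to values, and doc is the open Word document:

Dim key, v

For Each key In columnValues.Keys
    ' Bookmark placeholders: put the value into the bookmark range.
    If doc.Bookmarks.Exists(key) Then
        doc.Bookmarks(key).Range.Text = columnValues(key)
    End If

    ' Literal [column name] placeholders: a plain find-and-replace
    ' over the document body (the last argument, 2, is wdReplaceAll).
    doc.Content.Find.Execute "[" & key & "]", False, False, False, _
        False, False, True, 1, False, columnValues(key), 2
Next

' Document variable placeholders: fill the ones the template defines.
For Each v In doc.Variables
    If columnValues.Exists(v.Name) Then v.Value = columnValues(v.Name)
Next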
The produced document is of course automatically archived to the correct team-site when the merging has finished (screenshot). Optionally, a post-merge Word macro can be run should a template require some scripting, e.g. to create a table listing the details of a quote.

The merge data used in the templates is provided through custom web-services. Stored procedures using FOR XML AUTO, ELEMENTS are used to return the datasets to the add-in as XML. A stored procedure takes some MSCRM entity IDs as input, and then retrieves data from the MSCRM database, and possibly other data sources, to provide one or more resulting datasets. This allows for returning data about e.g. both an account, its primary contact, a quote, and the quote details in one go.
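
To illustrate the shape of the data, a procedure that selects an account joined with its primary contact using FOR XML AUTO, ELEMENTS returns element-centric XML nested per table alias, along these lines (the element names and values here are made up):

<account>
  <name>Contoso Ltd</name>
  <accountnumber>A-1001</accountnumber>
  <contact>
    <fullname>Jane Doe</fullname>
  </contact>
</account>

The column elements correspond to the placeholder names that the merge engine looks for in the template.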

The Office add-in (.NET 1.1, non-VSTO) currently supports Office 2000, XP, and 2003. It might be made available to other Microsoft partners this spring; keep an eye on this blog for further details.

BaseCompareValidator.Compare bug

In my use of the Noogen.Validator component for doing WinForms validation in a VS2005 project, I could not get validation of required 'currency' (decimal) fields to work properly. The Noogen datatype-based validation rules utilize the BaseCompareValidator of the System.Web.UI.WebControls namespace to do the .CanConvert() and .Compare() tests. This allows Noogen to do culture-aware validation of dates and numbers by reusing .NET Framework components.

BaseCompareValidator.CanConvert(String, ValidationDataType)

BaseCompareValidator.Compare(String, String, ValidationCompareOperator, ValidationDataType)

Validation of required fields uses ValidationCompareOperator set to Equal to compare the entered value against the specified initial value of the field (typically ""), and the required check returns true when these values differ. But this mechanism did not work as expected.

After a lot of frustration and a bit of debugging, I got these strange results using the 'Immediate' window in VS2005 (vco being ValidationCompareOperator.Equal and vdt being ValidationDataType.Currency):

? ValidationUtil.Compare("123", "", vco, vdt)
true
? ValidationUtil.Compare("123", "0", vco, vdt)
false

Thus, the BaseCompareValidator considers the two 'currency' values "123" and "" to be equal, which is clearly not true. The 'culture-neutral format' compare has the same bug.

The Noogen.Validator discussion thread suggests that this error applies to 'double' as well. In fact, it might apply to all datatypes other than 'string'.

Sunday, March 05, 2006

A snap while waiting for VSTO MSCRM

Microsoft has released an Office Word 2003 add-in that lets you insert data from MSCRM into documents and archive documents as MSCRM note attachments. Read more at Johnz blog.

The source code is available at GotDotNet in this workspace.