Saturday, September 29, 2007

An Overview of Workflow

Are you looking for a quick overview of what workflow is? Here is an article I wrote some time ago that will give you some basic background on it - An Overview of Workflow Technology

Friday, September 28, 2007

InfoPath Common Trap and SQL Event

Common Trap
I had a past client reach out to me today with a question about K2.net. He was using the SQL Event to do some operations within an InfoPath process. He fell into one of the traps I discussed in my blog (Item 13 - Process and Activity Level XML and Multi-Destination Considerations). Basically, remember that if you do something like the following, where you have an event (in this case a SQL event) after an InfoPath client event in the same activity, you must use the activity XML field. Do not use the process XML field, as it will not be updated until the succeeding rule for the activity is called.

I personally do not like how this code is hidden down inside of the succeeding rule. The code to move data from an activity level to a process level XML field conceptually has nothing to do with the success of an activity. An OnActivityComplete event handler should handle this. I digress…

A quick and dirty solution, if you want to use the process level XML fields, would be to do something like the following. I would think this is a better solution, as there could be multiple destination users who each have their own version of the XML. If there is only one slot, doing this guarantees you use the XML of the destination user who finished the activity.

Now this may not be the best solution, because the code within the SQL Event handler could be very similar between SQL Events. You can refactor the code into something reusable if you wish. The point is that I am not a fan of using the SQL Event unless you are working in a very rapid delivery environment. The reason is that all of the SQL statements are injected directly into .NET code as strings. If you want to be a minimalist and not create a large data layer, just create stored procedures and write a little code yourself to call them. You will have more maintainable code over the long run.
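To make the point concrete, here is a minimal sketch of a thin, hand-rolled data layer. Python and sqlite3 are used purely for illustration; in the environment discussed here this would be .NET code calling SQL Server stored procedures. All SQL lives in one reusable, parameterized place instead of being scattered through generated event code as string literals:

```python
import sqlite3

# A thin, hand-rolled data layer: every query lives in one place and is
# parameterized, instead of being embedded as strings in generated event
# code. (Python/sqlite3 stand in for .NET/SQL Server stored procedures.)

def create_schema(conn):
    conn.execute("CREATE TABLE Requests (Id INTEGER PRIMARY KEY, Status TEXT)")

def update_request_status(conn, request_id, status):
    # One reusable call site shared by every event that needs it
    conn.execute("UPDATE Requests SET Status = ? WHERE Id = ?", (status, request_id))

def get_request_status(conn, request_id):
    row = conn.execute("SELECT Status FROM Requests WHERE Id = ?",
                       (request_id,)).fetchone()
    return row[0] if row else None

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    create_schema(conn)
    conn.execute("INSERT INTO Requests (Id, Status) VALUES (1, 'Submitted')")
    update_request_status(conn, 1, "Approved")
    print(get_request_status(conn, 1))  # Approved
```

The table and function names here are hypothetical; the design point is simply that each SQL statement has exactly one home.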

SQL 2005 Not Working with the Event
As well, he was running into some issues with SQL 2005 not working with the SQL Event Wizard, although it would work with SQL 2000. I checked around and did not find an explanation for why that was occurring. Unfortunately I do not have an environment to test this in at the moment, but I was not surprised there was a problem. The client said that if he took code from an event he had generated with SQL 2000, the code would work with SQL 2005. I explained that is because the code generated underneath the hood of the SQL Event is agnostic to the version of SQL. This leads me to believe that the SQL Event Wizard must be using something specific to SQL 2000.

Friday, September 14, 2007

InfoPath Document Attachment Solution

1. Introduction
This article will provide a solution for the document attachment issue with K2.net, as attaching documents into data fields can cause the K2 database to grow at an uncontrolled rate. This issue commonly arises with InfoPath but also occurs with web or Win Form workflows. Many workflows have a requirement for attaching documents. An example would be a user making a purchase request where some legal document must be signed and then attached to the submission form.

2. Problem Background
If InfoPath is being used, developers will want to use the document attachment functionality to attach the file to the InfoPath form. The problem is that InfoPath will take the attached document and serialize it into the XML of the InfoPath form. This is a problem because the size of the InfoPath XML file will grow as each file is attached. This becomes an even bigger problem when the XML is attached to a K2.net process, as the database will grow significantly with each process instance.

The database will continue to grow because the XML of the InfoPath form will be stored numerous times; this is by design. When an InfoPath form is attached to a K2 process the XML for the InfoPath form is stored at the process level as well as at the activity level. The reason is that each destination user (each of whom could have their own slot) may require their own copy of the XML. In the preceding rule of the activity where the InfoPath client event resides, some code is generated that will copy the XML from the process level into each activity instance for each and every destination user. Then in the succeeding rule, the XML from the activity instance of the user who finishes the activity is copied back to the process level.

With that understanding, consider what happens if a document is attached to the InfoPath form. Suppose there is a process with a three-step approval: the first approval has ten possible destination users, the second has five and the third has two. The originator creates an InfoPath form and adds three attachments, each 1 MB. Because the documents are serialized into the InfoPath XML, 54 MB is the minimum amount of space that would be consumed in the database per process instance (3 MB for the process and 3 MB for each of the seventeen activity instances). In reality most business documents are larger than 1 MB. Note this does not include space consumed by the other data fields in the process. This number will go up if there are any rejection paths, as new activity instances are created. The amount of space can grow even more if an audit trail is turned on for the XML.
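The arithmetic behind that estimate can be sketched as follows:

```python
# Back-of-the-envelope storage estimate from the example above:
# a 3 MB InfoPath form (three 1 MB attachments serialized into the XML)
# stored once at the process level plus once per activity instance.
form_mb = 3                      # 3 attachments x 1 MB each
destination_users = [10, 5, 2]   # three approval steps

process_copy = form_mb
activity_copies = form_mb * sum(destination_users)  # one XML copy per destination user
total_mb = process_copy + activity_copies
print(total_mb)  # 54
```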

This is not a fault or a shortcoming of the K2 database. It is a database that has been specifically normalized for workflow state management, not for managing unique data points. The issue is not limited to InfoPath forms; it would be present if any file attachment is serialized into process or activity data fields.

3. Solution
A solution for this problem is to store the attached documents externally, in this case in WSS. To accomplish this, the WSS web services provided by K2.net will be reused. The solution below was initially developed for K2.net 2003, InfoPath 2003 and WSS 2.0 but was reused successfully with K2.net 2003, InfoPath 2007 and WSS 3.0. This solution would also apply to a BlackPearl implementation.

4. Web Services
The following web service is written in VB.NET and can be translated to C# if needed. There are three web methods that will be created: UploadFileToFolder, UploadInfoPathAttachmentToFolder and GetFilesFromFolder.

The web service will require a reference to http://[servername]/_vti_bin/K2SPSList.asmx. This is the web service that is used by K2.net.

4.1 UploadFileToFolder
This web method's purpose is to upload a file to a folder in SharePoint. It requires a folder name and a document name. In this case, the folder name should be a unique name like a requisition number. The web method will create the folder if it does not already exist. If a document with the same name already exists, the web method will overwrite it (that could be changed).

Note that the file will be uploaded under whatever account is used to connect to the web service. This will not be the user who is actually uploading the file (such as the destination user).

Public Sub UploadFileToFolder(ByVal strWssServerUrl As String, ByVal strWssSite As String, _
ByVal strWssSiteDocLib As String, ByVal strFolderName As String, ByVal strFileName As String, _
ByVal bytes As [Byte]())

Dim objK2Wss As K2WssWebService.K2SPSList

Try
If strFolderName Is Nothing OrElse strFolderName = "" Then
Throw New Exception("A Unique ID is required.")
End If

objK2Wss = New K2WssWebService.K2SPSList
objK2Wss.Url = GetK2WssWebServiceURL()
'connect with default IE account
'objK2Wss.Credentials = System.Net.CredentialCache.DefaultCredentials
'connect with a service account
objK2Wss.Credentials = New System.Net.NetworkCredential(GetWSSUserString(), _
GetWSSUserPasswordString(), GetWSSUserDomainString())

Dim strFullFolderName As String = strWssSiteDocLib & "/" & strFolderName

' Check if the folder exists
If Not objK2Wss.FolderExist(strWssSite, strFullFolderName) Then
' Create the folder
Dim strCreateFolderErrorMsg As String
objK2Wss.CreateFolder(strWssSite, _
strFullFolderName, strCreateFolderErrorMsg)

' Check if there was an error creating the folder
If strCreateFolderErrorMsg <> "" Then
Throw New Exception(strCreateFolderErrorMsg)
End If
End If

' Upload the document (True overwrites an existing file)
Dim strUploadFileErrorMsg As String
objK2Wss.UploadDocument(strWssServerUrl, strWssSite, _
strFullFolderName, strFileName, _
bytes, True, strUploadFileErrorMsg)

' Check if there was an error uploading the file
If strUploadFileErrorMsg <> "" Then
Throw New Exception(strUploadFileErrorMsg)
End If
Catch ex As Exception
Throw
Finally
objK2Wss = Nothing
End Try
End Sub

4.2 UploadInfoPathAttachmentToFolder
This method calls UploadFileToFolder, but its specific purpose is to accept a document that has been attached to an InfoPath form. Documents attached to an InfoPath form have header information added in front of the bytes of the document. For instance, the file name needs to be stripped out of the bytes before the raw document can be uploaded.

Public Sub UploadInfoPathAttachmentToFolder(ByVal strWssServerUrl As String, ByVal strWssSite As String, _
ByVal strWssSiteDocLib As String, ByVal strFolderName As String, ByVal byteIPFileAttachment As [Byte]())

Try
Dim i As Integer

' Get the length of the file name from the InfoPath attachment header.
' Byte 20 holds the character count (including the null terminator);
' multiply by two because the name is UTF-16 encoded.
Dim iNameBufferLen As Integer = byteIPFileAttachment(20) * 2

' Create a byte array for the file name
Dim byteFileName(iNameBufferLen - 1) As Byte

' Get the file name, which starts at byte 24
For i = 0 To iNameBufferLen - 1
byteFileName(i) = byteIPFileAttachment(24 + i)
Next

' Translate the file name to a string and strip the trailing null
Dim strFileName As New String(System.Text.Encoding.Unicode.GetChars(byteFileName))
strFileName = strFileName.Substring(0, strFileName.Length - 1)

' Create a byte array for the file content. This is the total
' attachment length minus the header and the file name length.
Dim iContentOffset As Integer = 24 + iNameBufferLen
Dim byteFileContent(byteIPFileAttachment.Length - iContentOffset - 1) As Byte

' Get the file bytes
For i = 0 To byteFileContent.Length - 1
byteFileContent(i) = byteIPFileAttachment(iContentOffset + i)
Next

' Upload the file to WSS
UploadFileToFolder(strWssServerUrl, strWssSite, strWssSiteDocLib, strFolderName, strFileName, byteFileContent)
Catch ex As Exception
Throw
End Try

End Sub
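For illustration, the same header-parsing logic can be sketched in Python, using the layout described above (file name character count at byte 20, UTF-16 file name starting at byte 24, followed by the raw content). The build_attachment helper is purely a synthetic test fixture, not part of the InfoPath format:

```python
# Sketch of the attachment parsing above, assuming the header layout the
# article describes: byte 20 = file name length in characters (including
# a null terminator), UTF-16 file name at byte 24, then the file content.

def build_attachment(file_name: str, content: bytes) -> bytes:
    """Build a synthetic attachment blob for exercising the parser."""
    name_utf16 = (file_name + "\x00").encode("utf-16-le")
    header = bytearray(24)
    header[20] = len(file_name) + 1  # character count, including the null
    return bytes(header) + name_utf16 + content

def parse_attachment(blob: bytes):
    name_buffer_len = blob[20] * 2            # two bytes per UTF-16 character
    name_bytes = blob[24:24 + name_buffer_len]
    file_name = name_bytes.decode("utf-16-le").rstrip("\x00")
    content = blob[24 + name_buffer_len:]     # everything after the name
    return file_name, content

if __name__ == "__main__":
    blob = build_attachment("contract.pdf", b"%PDF-1.4 ...")
    print(parse_attachment(blob)[0])  # contract.pdf
```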

4.3 GetFilesFromFolder
This web method will retrieve all of the file names from a specific folder in SharePoint and will return an XML document.

Public Function GetFilesFromFolder(ByVal strWssServerUrl As String, ByVal strWssSite As String, _
ByVal strWssSiteDocLib As String, ByVal strFolderName As String) As Xml.XmlDocument

Dim objK2Wss As K2WssWebService.K2SPSList

'XML writer
Dim sw As New System.IO.StringWriter
Dim xtw As New System.Xml.XmlTextWriter(sw)

Try
If strFolderName Is Nothing OrElse strFolderName = "" Then
Throw New Exception("A Unique ID is required.")
End If

objK2Wss = New K2WssWebService.K2SPSList
objK2Wss.Url = GetK2WssWebServiceURL()
objK2Wss.Credentials = New System.Net.NetworkCredential(GetWSSUserString(), _
GetWSSUserPasswordString(), GetWSSUserDomainString())

Dim strFullFolderName As String = strWssSiteDocLib & "/" & strFolderName

xtw.WriteStartElement("Files")

' Check if the folder exists. It is valid for this folder to not exist
' for a request, so this web method returns empty XML rather than an
' error. Folders for a request are only created when a document is uploaded.
If objK2Wss.FolderExist(strWssSite, strFullFolderName) Then
' Get all of the files from the web service
Dim strErrorMsg As String
Dim strFiles As String() = objK2Wss.GetFolderFiles(strWssServerUrl, _
strWssSite, strFullFolderName, strErrorMsg)

If strErrorMsg <> "" Then
Throw New Exception(strErrorMsg)
End If

' Loop over the files and build a list of files
' for the unique id
Dim i As Integer
For i = 0 To strFiles.Length - 1
If Not (strFiles(i) Is Nothing) Then
xtw.WriteStartElement("File")
xtw.WriteElementString("FileName", strFiles(i))
xtw.WriteElementString("FileUrl", strWssServerUrl & "/" & _
strWssSite & "/" & strFullFolderName & "/" & strFiles(i))
xtw.WriteEndElement() 'File
End If
Next
End If

xtw.WriteEndElement() 'Files

Dim xmlDoc As New System.Xml.XmlDocument
xmlDoc.LoadXml(sw.ToString())
Return xmlDoc
Catch ex As Exception
Throw
Finally
objK2Wss = Nothing
End Try
End Function

5. How to Connect to InfoPath
Getting this hooked up to InfoPath is easy and requires no .NET-enabled code.

5.1 Add a Binary Data Field
Add a base64Binary data field to the InfoPath form. This value will only be set temporarily and it will be submitted to the web service.

This violates one of my best practices of not polluting the XSD schema of your InfoPath form with UI-specific nodes. However, with InfoPath it is not possible to have a secondary data source with a base64Binary field.

5.2 Add Submit Data Connection
Make a data connection to the UploadInfoPathAttachmentToFolder web method using the Data Connection Wizard. Do the following:

  • Use the “Submit Data” option
  • Select a web service
  • Enter the url to the web service
  • Select the UploadInfoPathAttachmentToFolder web method
  • Enter a value for all of the parameters of the web method. For strFolderName make sure you use a unique value, and for byteIPFileAttachment set the data field from the first step.
  • Then finish the wizard.

5.3 Add Get Files Data Connection
Now create a data connection to the web method to return all of the files using the Data Connection Wizard. Do the following:
  • Prior to this you will need to modify the web service to return a hard-coded set of dummy values, otherwise you will receive an error while making the data connection. A quick solution is to temporarily change GetFilesFromFolder to return hard-coded sample XML in the correct format. Make sure that there is more than one File node returned in the XML. If not, InfoPath will not infer that the web method can return more than one file.
  • Use the “Receive Data” option
  • Select a web service
  • Enter the url to the web service
  • Select the GetFilesFromFolder web method
  • In the final step select the checkbox to run this when the form is opened to retrieve any files that may already be uploaded. This will only work if the unique number for the folder name is generated before the InfoPath form is opened. If that is not possible, modify the code in the web method to not throw an exception when strFolderName is null. Instead return back an empty string of XML.
  • Finish the wizard
  • Remove the hard coded XML.
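For reference, the hard-coded sample only needs to match the shape that GetFilesFromFolder produces. Something like the following would work (the file names and URLs are placeholders); note it contains two File nodes so InfoPath infers a repeating group:

```xml
<Files>
  <File>
    <FileName>Contract.pdf</FileName>
    <FileUrl>http://server/site/DocLib/REQ-1001/Contract.pdf</FileUrl>
  </File>
  <File>
    <FileName>Quote.xls</FileName>
    <FileUrl>http://server/site/DocLib/REQ-1001/Quote.xls</FileUrl>
  </File>
</Files>
```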

5.4 Add Controls to Form
First move the base64Binary field that was added in the first step onto the form as a document attachment control. Then drag and drop a button onto the form. Finally go to the Data Source task pane and drag and drop the File collection for the GetFiles data connection as a repeating table.

Something similar to the following can be created.

For the repeating table change the control within it to be a hyperlink control and use the following configuration.

5.5 Create Rules for the Button
Now we need to add some rules to the Upload File button. Double-click the button and the properties window should appear. Press the Rules button and then press the Add button to create a new rule. First add a condition to make sure that the File (base64Binary) field is not blank. Then add an action to upload the attachment, followed by an action to clear out the File (base64Binary) field. Finally add another action to retrieve all the files that are currently available.

5.6 User Experience
The user experience is that users add a file to the form using the InfoPath file attachment control. When they press the button, the file name appears in the repeating list immediately below. When they click on the link, the file opens in a new window.

6. Using this Solution
Now you have the ability to start attaching documents to your processes without adversely affecting the size of the InfoPath form. As well, the services can be used outside of InfoPath and are re-useable for all processes you create in the future like SmartForms, Custom pages, WinForms, etc.

Saturday, September 8, 2007

K2.net, SharePoint, and InfoPath Best Practices

This is the fourth of a series of best practices I consider when starting a new workflow. Note that many of these best practices are K2.net 2003 specific. They will be re-evaluated with the new BlackPearl release.

1) Store InfoPath Data Externally
Commonly much of the data within an InfoPath form needs to be reported on. It is recommended that this data be stored externally for robust data reporting requirements.

When using InfoPath processes, a good practice is to keep the data in the InfoPath form while it moves through the process. Then at the end of the process, shred the XML apart and insert it into a normalized database. As well, at various points in the process, push the XML document out to a different InfoPath form library where users can get access to the latest and greatest data.

2) Create Reusable XML Web Services
In many cases both the InfoPath form and the process will require access to the same data. Use XML Web Services to allow both to gain access to the same data.

3) Required Fields with InfoPath
In InfoPath, setting a field as required makes the field required for all users who submit the form. Consider a simple process with two InfoPath activities, the first for a submitter and the second for an approver. If there is a field called “IsApproved” that is defined as required in InfoPath, the submitter would have to fill it in even though this is something only the approver should do. There are two ways to get around this:
  • Introduce a field into the InfoPath form that will be set by K2.net to indicate the state of the workflow (very similar to switching the view). This can be evaluated in the validation rule of the field to defer validation to specific points in time.
  • Break the InfoPath form apart into two separate workflows.

4) Remove Views and Create a Single View
If you have many views (let's say more than three) and you commonly have to make changes in each and every view every time there is a change, consider collapsing down to one view. Again, introduce a state field into the InfoPath form that will be set by K2.net. Then InfoPath can use conditional formatting to hide fields based upon the state of the workflow.

5) Multiform Processes
BlackPearl now provides the ability to create multiple InfoPath forms and use them in the same process. In K2.net 2003 it was possible to create workflows with multiple InfoPath forms by creating a master process and then creating child processes for each form. Another approach is to just chain several InfoPath form processes together. The advantage of having a master process is that you can map out how all of the InfoPath child processes interact with each other as well as have global events.

6) Understand Limitations of InfoPath
Do not let InfoPath become a substitute for creating Win Forms applications. In the end InfoPath should be used to capture data from a user. Robust functionality to view databases, provide heavy user interaction, etc. should be done in other mediums. One indication this is occurring is heavy use of script or managed code in the InfoPath form.

Note that an InfoPath process does NOT have to be initiated by an InfoPath form. External systems and Win Forms applications can start a process that later uses an InfoPath form to capture some data.

7) Solidify XSD First
It is very important to have your XSD schema fairly solid prior to K2.net-enabling your InfoPath form. Once the form is enabled, it must remain in sync with the cached XSD schema within K2.net.

8) Do not Pollute XSD
Over time the InfoPath form will become polluted with various nodes that are used to support the presentation of the InfoPath form (two of the recommendations above violate this rule). It is recommended to keep this to a minimum. If it cannot be avoided, place the nodes related to the InfoPath UI in a separate group (titled as such) so that it is immediately known which nodes contain valid business data and which ones do not.

9) InfoPath Version Control
For K2.net 2003, changes to an InfoPath form must be carefully considered when publishing it to SharePoint, as running process instances will be affected. InfoPath will try to gracefully handle the situation when opening an old XML file with a newer version of the InfoPath form. This will not always end well. It is suggested that a different form library be created for each version of the form. Running processes are configured to use the old form in the old form library while new process instances use the new form in the new form library.

10) Do not Use Attachments in InfoPath
Do not put attachments into the InfoPath form, as this can cause significant performance problems. Large attachments will be stored in the InfoPath XML which is then stored in the K2 database. The XML for an InfoPath form is used at both the process and activity levels. This creates the possibility of the same attachment being stored hundreds of times in the database, wasting space. It is recommended that the InfoPath form store all documents directly in something like SharePoint and then provide links to those documents inside the InfoPath form.

11) Securing InfoPath Data
Switching InfoPath views is NOT a valid way to secure data, as the user can simply open the XML directly and modify the data. One way to enforce security is to use multiple document libraries with permissions. There is no way to secure an InfoPath form at the document instance level with WSS 2.0. This can possibly be resolved by using WSS 3.0 with BlackPearl, as permissions can be applied at the document item level. If data must be totally secured, store all of the XML data in an external database.

12) Viewing Forms in Process
A common request by users is to be able to review the InfoPath form after they have submitted it or when the process is complete. The InfoPath form in SharePoint is dynamically added and removed by K2.net when an InfoPath client event is completed. It is suggested that the form be published at specific points in the process to various archive document libraries for all users to access.

13) Process and Activity Level XML and Multi-Destination Considerations
Understand that when an InfoPath form is attached to a K2.net 2003 process, the wizard will go into the InfoPath form and retrieve the XSD. The XSD will be copied into the process definition as a process level XML data field. Each InfoPath activity within that process will have an activity level XML data field created for the InfoPath client event. This is done because multiple destination users can be added to the activity. This is important because each destination user should have their own copy of the InfoPath XML to modify, as succeeding rules will need to evaluate the data set by each approving user (remember that rules can be created requiring more than one user to approve before an activity can be completed). It would not be possible to create succeeding rules requiring multiple user approval if all the users shared the same XML document.

If there are events after the InfoPath client event in the same activity make sure to use the activity level XML and not the process level XML. The process level XML will be updated in the succeeding rule of the activity which is at the end of the activity life-cycle.

If there are many destination users able to approve an activity, place custom data aggregation code into the succeeding rule, as this is where data is re-synchronized with the process level XML data field. For example, if there are three users and each adds three items to the InfoPath form, you will need to add code to the succeeding rule to ensure there are nine items in the process level XML data field. Otherwise only the last three will be moved up.

It is highly suggested that if there are multiple destination users for an InfoPath client event, a unique name be used for the InfoPath form for each destination user. If multiple destination users modify their own specific XML and this needs to be aggregated at the end of the activity, modify the succeeding rule code to merge this data back into the process level XML data field.
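A sketch of that merge step follows. Python's xml.etree stands in for the .NET XML handling that would actually run in the succeeding rule, and the <Items>/<Item> element names are hypothetical:

```python
import xml.etree.ElementTree as ET

# Succeeding-rule aggregation sketch: merge the <Item> nodes from each
# destination user's activity-level XML into the process-level XML,
# instead of letting the last user's copy overwrite everyone else's.

def merge_activity_xml(process_xml: str, activity_xmls: list) -> str:
    process_root = ET.fromstring(process_xml)
    items = process_root.find("Items")
    for activity_xml in activity_xmls:
        # Each destination user has their own copy of the form XML
        for item in ET.fromstring(activity_xml).findall("Items/Item"):
            items.append(item)
    return ET.tostring(process_root, encoding="unicode")

if __name__ == "__main__":
    process = "<Form><Items/></Form>"
    # Three destination users, each adding three items in their own copy
    users = ["<Form><Items>" + "".join(f"<Item>u{u}-{i}</Item>" for i in range(3))
             + "</Items></Form>" for u in range(3)]
    merged = merge_activity_xml(process, users)
    print(len(ET.fromstring(merged).findall("Items/Item")))  # 9
```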

14) Reuse K2.net WSS Services
K2.net provides several useful ready-to-go services to do various things within SharePoint like uploading/deleting files, creating/deleting folders, creating/updating document metadata, etc. These services can be easily reused within InfoPath and Win Forms.

K2.net Process Sizing and Performance Best Practices

This is the third of a series of best practices I consider when starting a new workflow. Note that many of these best practices are K2.net 2003 specific. They will be re-evaluated with the new BlackPearl release.

1) Process Sizing Considerations
1.1) Use the Estimator When Designing a Process
K2.net has some spreadsheets that will help you understand how much database storage will be required for process instances. Specifically, things that must be considered are:
  • How many process and activity level fields are there, and what are their data types and sizes?
  • Are audit trails used for data fields, and how many times could the data be updated?
  • How many destination users will be assigned for each activity?

1.2) Understand Process and Activity Level Fields
From a general programming standpoint, process level fields should be thought of as global fields defined at the top of a class, while activity level fields are local fields in a method. Process and activity level fields are not limited to primitive data types. Custom objects can be used if they are serialized into a data field. Important note: activity level fields can have multiple instances if many slots have been created. A unique activity instance will always be created for an activity slot. This results in the data value being created and stored for each individual activity instance. Use activity data fields when you need to pass data between activities and events or when each destination user requires their own unique value. This has to be balanced against having too many process level fields that could be similar to one another.

1.3) Number of Destination Users
“A high number of destination users for activities with client events can cause severe performance issues on the Server. This comes into play when a large number of users access the Worklist simultaneously (or calls are made from the K2.net ROM API to user worklists) and a high volume of data has to be returned to the Server from the database. In conjunction with "Data on Demand" this can be alleviated. With the arrival of K2.net 2003 Service Pack 3 (SP3), there is now an alternative way to handle this. SP3 introduced the option to create a single activity instance for an activity when a Destination Queue is used as the activity destination. The new feature creates a single Activity Instance for the Destination Queue. The Server will create only one activity instance which is visible to all users within the Destination Queue. When one of the users opens the item it will no longer be visible in the other users' task lists but will have been assigned to the user who opened it. The implication however is that only one slot is available. The advantage is that only one activity instance is created in the database, significantly reducing database overhead as well.” [1]

1.4) Keep Audit Trail
For data fields, checking the keep audit trail checkbox will save every value every time the data field changes. If the data field is big (like XML) this will fill up the K2 database very quickly.

1.5) Data on Demand is Lazy Loading
“Data on Demand is a feature which minimizes the load placed on server resources when a large volume of data from the database is requested by the server and worklist. By default, when a process instance is loaded by the server all the data fields are loaded, regardless of whether they are needed at that time or not. This creates a resource drain as not all the data fields are required at the same time to perform a step within the process. However, since all the data has been loaded, the system must manage the data contained within memory even though only a small portion of the data may be affected at that time. To make use of the "Data on Demand" feature, it must be enabled on a field-by-field basis from within Studio. When "Data on Demand" is active, only the fields required at that time will be returned by the server when the request for data is sent through to the server. In other words, the data must be "demanded" as an explicit call for it to be loaded into server memory and passed to the client application. “ [1]
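Conceptually this is plain lazy loading, which can be sketched in a few lines (the loader function here stands in for a database read):

```python
# "Data on Demand" is essentially lazy loading: a field's value is only
# fetched from the store when something actually asks for it.

class LazyField:
    def __init__(self, loader):
        self._loader = loader   # stand-in for a database read
        self._loaded = False
        self._value = None

    @property
    def value(self):
        if not self._loaded:    # fetch on first access only
            self._value = self._loader()
            self._loaded = True
        return self._value

if __name__ == "__main__":
    loads = []
    field = LazyField(lambda: loads.append(1) or "big xml blob")
    print(len(loads))   # 0 -- nothing fetched yet
    field.value
    field.value
    print(len(loads))   # 1 -- loaded exactly once, on demand
```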

2) Externalize Large Data Requests for Open Worklist Items
When using the K2.net ROM, be careful not to make unfiltered requests for worklist items to the server, as this can be resource intensive. For the OpenWorklist method, create a WorklistCriteria object with a filter.

“For large volume scenarios where process/activity data fields need to be utilized to filter or sort a worklist, it is recommended that the specific data fields required in the search be stored in a database table external to K2.net, in addition to the native data field storage. The external table should contain a field for the "Process Instance ID" if querying process datafields or a "Process Instance ID" and an "Event Instance ID" if tracking activity level data. A query can then be constructed that joins the external table data with the internal _Worklist table, or more appropriately a read-only view on the _Worklist table (see below for a sample view). Please note the read-only view of the worklist table is the recommended interface to querying the Worklist. This query should then become the basis for the custom worklist application, ensuring that the fields necessary to process a worklist item in an application (generally the platform, serial number and URL for the event which is stored in the "Data" field), are accessible.” [1]
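The join described above can be illustrated with simplified, assumed schemas. sqlite3 stands in for SQL Server here, and both table layouts are invented for the sketch; they are not the real K2.net tables:

```python
import sqlite3

# Externalized-filter sketch: data fields needed for worklist filtering are
# copied to an external table keyed by process instance ID, then joined
# against a (simplified, assumed) worklist view to filter without loading
# every process instance's data fields through the server API.

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Worklist (ProcInstID INTEGER, SerialNumber TEXT, Data TEXT);
CREATE TABLE ProcessSearchFields (ProcInstID INTEGER, Region TEXT, Amount REAL);
""")
conn.executemany("INSERT INTO Worklist VALUES (?, ?, ?)",
                 [(1, "1_2", "url1"), (2, "2_2", "url2"), (3, "3_2", "url3")])
conn.executemany("INSERT INTO ProcessSearchFields VALUES (?, ?, ?)",
                 [(1, "East", 500.0), (2, "West", 9000.0), (3, "East", 12000.0)])

# Filter the worklist on an external data field via a plain SQL join
rows = conn.execute("""
    SELECT w.SerialNumber, f.Amount
    FROM Worklist w
    JOIN ProcessSearchFields f ON f.ProcInstID = w.ProcInstID
    WHERE f.Region = 'East' AND f.Amount > 1000
""").fetchall()
print(rows)  # [('3_2', 12000.0)]
```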

3) Server Events in Multi-Destination Activities
Be careful not to place a server event into an activity that has multiple slots for destination users unless the code in the event must be executed for each destination user. This can become a bottleneck by flooding external applications with hundreds of simultaneous calls, or it could even introduce errors like inserting duplicate records into a database.

4) Use Destination Queues for AD Groups
When Active Directory (AD) groups are used as destination users for activities, destination queues should be used. K2.net will poll Active Directory to check for changes in group membership. This allows K2.net to remove a destination user who has been removed from the group, creating more security around who can access the process instance. As well, if a user is added, a new activity instance for that destination user will be created. The rate at which K2.net polls Active Directory is configurable. [1]

5) Understand Exception Handling
Exceptions will be raised in the following order from within K2.net: [1]

  • Code Level Try-Catch Block
  • Event Exceptions Code Block
  • Activity Exceptions Code Block
  • Process Exceptions Code Block

Exceptions that are handled in a try/catch block will not be logged to the K2 database. The error may need to be handled locally, but a custom exception can be thrown which will continue up the stack and be caught at the process level. There it can be determined whether the error should stop the process instance or let it continue. This is a good location to do global logging of exceptions.
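The pattern can be sketched generically (plain Python stands in here; in K2.net the handlers would be the event- and process-level Exceptions code blocks, and the event name is hypothetical):

```python
# Handle the error locally where needed, then throw a custom exception that
# bubbles up to a process-level handler, where global logging and the
# stop/continue decision happen.

class ProcessError(Exception):
    pass

def server_event():
    try:
        raise ValueError("external call failed")
    except ValueError as ex:
        # local cleanup could happen here, then escalate with context
        raise ProcessError(f"OrderApproval event failed: {ex}")

def run_process(log):
    try:
        server_event()
    except ProcessError as ex:
        log.append(str(ex))   # global logging at the process level
        return "stopped"      # or resume, depending on the error

if __name__ == "__main__":
    log = []
    print(run_process(log))  # stopped
    print(len(log))          # 1
```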

Reference [1] - K2.net 2003 Process Design Best Practices

Architecture Best Practices

This is the second of a series of best practices I considered when starting a new K2.net workflow. Note that many of these best practices are K2.net 2003 specific. They will be re-evaluated with the new BlackPearl release.

1) Understand When to Use K2.net
K2.net provides the layer of human-to-human and human-to-system process definitions. K2.net typically complements the middle-layer business logic in a traditional three-tiered architecture. BizTalk workflow is used for system-to-system business workflow definition and specializes as a data broker; the native pattern of BizTalk is to serve as an enterprise bus for data between line-of-business applications. Integration Services (SSIS) for SQL Server 2005 can be used as the "poor man's" BizTalk but should be limited in its usage as an ETL facility. K2.net can be the hub for human-related workflows that start BizTalk or SSIS data workflows.

2) Decouple External Data Request with XML
Using XML to pass data between K2.net and external actors will ensure a more configurable and manageable process over its lifetime.

3) Understand Integration with Line of Business Systems
Versioning should be accounted for whenever interacting with anything external to K2.net. When integrating with external applications, the introduction of a version number should be considered, especially when process instances may have a long life-cycle.

4) Design for Long Running Processes
When designing long-running and large processes it is highly recommended that the process be broken up into several smaller processes. This allows for flexibility and manageability of the process definitions if things change while a process instance is in flight. Use the IPC server event and create a master-child or a daisy chain of processes that as a whole make up one large process. Note this is not a recommendation to use processes as subroutines.

5) Design with Maintainability in Mind
A reality is that processes change over time due to evolving business rules. Locate all the places where there is a possibility of change and consider what types of changes will be made and how they can be handled. Understand the service level agreement of the process; for instance, can a process instance be stopped and started over if there is a change request? If not, understand where a configuration can be introduced. A configuration is not just the use of the string table or a configuration file; a web service, database, sub-process, etc. could be introduced to allow for more flexibility in handling changes.

6) Handling Volatile Business Rules
Line rules are no more than standard if statements in structured programming: if this, do that. If line rules are going to be volatile and it is required that they be up to date at all times, it is recommended that those rules be externalized. Use web services, the BizTalk Business Rule Engine, the BlackPearl Business Rule Engine (TBD), a database, etc.

General Best Practices

This is the first of a series of best practices I considered when starting a new K2.net workflow. Note that many of these best practices are K2.net 2003 specific. They will be re-evaluated with the new BlackPearl release.

1) Start with Use Cases & Process Flow Diagram
Prior to building your workflow, start with the creation of use cases, as they can be used to design the workflow. Use tools like Visio or K2 Studio itself to create process flows to show to the business users. With BlackPearl, Visio diagrams can be imported into K2.

2) Plan on Reporting
With K2.net 2003, the Workspace website provides several out-of-the-box reports on the health of all processes. They can be used to support development and administrative activities. They sometimes do not serve business users well, as low-level implementation details are exposed, which can become confusing. Consider storing key business data externally in a reporting database and using your reporting engine of choice; the K2 database is optimized for managing a state machine, not for business reporting. BlackPearl provides a wizard to design reports, but I would still recommend reporting externally.

3) Maximize Use of Code Accelerators (Wizards)
Use the K2 Event Wizards to their fullest extent. If you do not know the code but need to write something custom, use an event wizard to generate the code and then re-use it. It is not a shortcoming of K2.net that a developer still needs to go into the generated code; it is an advantage that much of this code is already built and is completely extendable. For instance, there are events for email, SQL, data manipulation, InfoPath, SharePoint, web services, BizTalk, Exchange, etc. Good .NET developers will want to take the generated code and generalize it in a code library. That is a fine approach but should be balanced with maximizing K2.net to create things quickly.

4) Code Modules
Code modules provide a quick way to centralize re-usable code that can be used in all processes defined in a solution. They are best used for creating utility methods and functions. Singleton classes tend to work well with code modules. If full object-oriented libraries are needed, it is better to create an external class library in Visual Studio.

5) Using External Libraries and Versioning
An external code library should be considered if there is a need for complex classes. If the classes can be used in contexts outside of K2.net, it is recommended that the class definitions be externalized. Note that referenced libraries will be exported with the process definition to the K2.net server (the DLL is serialized into the database).

There are considerations that must be thought through before placing an external DLL that K2.net will use in the GAC. Even though the DLL is versioned in the GAC, there is no version mapping to the process definition that has been exported to the K2.net server. If the code computes volatile business rules for long-running processes, creating an external service to access that code library is the suggested best practice.

Typically it is just best to create an external library in Visual Studio and reference it directly in the process, but there are still some considerations. Note that configuration files will have to be manually pushed to the production server, and there can only be one config file for the K2 Server, requiring all of the external DLLs to share the same config.

Finally, errors in external libraries can be particularly hard to debug without logging. They are even harder, or impossible, to repair using the K2.net Service Manager without going through a lot of effort.

6) Use String Table for Configuration Values
Use the string table to hold all configuration values for your processes. Complex configurations (repeating data) cannot be managed inside the string table. A configuration file can be created in the bin directory, but it is recommended that an external database be used to retrieve configuration or process metadata if it is particularly complex.

7) Provide Users with Multiple Ways to Access Assigned Tasks
Provide users with multiple ways to complete the tasks they have been assigned. Relying completely on email as the way to distribute links to an InfoPath or web form is not good, as the email can be lost or deleted.

8) Using ROM in Server Event
The ROM should never be used in a server event to access the current process instance. Attempting to use the ROM to operate on the current process instance from within the execution of that same instance can result in inconsistent behavior and/or crashing of the K2.net Server service. This is because the server locks the process that is currently executing and cannot connect back to itself through the external ROM as described. [1] It is alright to use the ROM to create new process instances (similar to using an IPC server event) or to finish an event of a different process. The golden rule is to never use the ROM on the current process instance, only on other process instances.
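
A sketch of the acceptable pattern, using the ROM from within a server event to start a different (child) process; the process name, data field, and use of the K2 context object are illustrative:

```csharp
// Safe: start a *new* process instance from a server event.
// Never open the ROM against the currently executing instance.
Connection conn = new Connection();
try {
    conn.Open("SERVER NAME", "CONNECTION STRING");

    ProcessInstance child = conn.CreateProcessInstance("Orders\\FulfillmentChild");

    // Hand data from the parent to the child via data fields
    child.DataFields["ParentFolio"].Value = K2.ProcessInstance.Folio;

    conn.StartProcessInstance(child);
}
finally {
    conn.Close();
}
```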

9) Design with Testing in Mind
Having development and QA servers should be highly considered. It is possible to create unit tests using Visual Studio 2005 test projects that can unit test an entire workflow and all of its permutations. This is done by using the ROM to create process instances and mimic the user events that push the workflow through. [2]

10) Use Source Control and Define File Hierarchy
It is highly recommended that version control be used to manage K2.net solutions. With BlackPearl integrated into Visual Studio, working with source control becomes much easier!

11) Activity Logic Considerations
11.1) Line Rule Must be True to Continue
A common mistake when designing a process is to have activities that can dead-end a process instance. If an activity has no lines extending from it, or none that will evaluate to true, the process instance will finish and cannot be restarted. It is suggested to pseudo-code your line rules on paper to avoid this issue.

11.2) Line Rule Custom Code
Custom code in line rules should only determine whether something is true or false; do not embed something like a write operation. All line rules are executed no matter what, and other line rules could be affected. In general, whenever you put custom code into an event handler (line rules, events, preceding and succeeding rules, etc.), make sure the code placed there is for that operation only.
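
As a sketch, a line rule's custom code should read state and produce a boolean, nothing more. The data field name and the commented-out helper are illustrative:

```csharp
// Good: pure decision logic, no side effects
bool approved =
    K2.ProcessInstance.DataFields["Approve"].Value.ToString() == "Approve";

// Bad: never embed a write operation here. All line rules are
// evaluated regardless of outcome, so this would execute even when
// the rule is false and could affect the other line rules.
// LogDecisionToDatabase(approved);  // do NOT do this in a line rule
```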

11.3) Graceful Cancel out of a Workflow
Most workflows will require paths to gracefully cancel or stop a workflow. Make sure this is incorporated into the design of the process definition from the beginning.

11.4) Design for Delegation
Make sure there is functionality to delegate or escalate so process instances can be re-assigned by the users themselves (even though this can also be done through the K2 Service Manager). When delegating, events that are placed before the client event will be re-executed. Ensure that there are no problems with running the same code more than once; otherwise, move the event.

12) Logging
Ensure logging is turned on in the K2.net server and that logging has been implemented within your code.

13) Unique Name in Folio
Processes allow for a Folio name, which is a readable name for a process instance; use it.

14) Do not Let K2 Spam
Escalations with email can quickly become a spam machine. Ensure that the escalation duration is configurable through the string table or a database. As well, create a configuration flag in the string table to turn escalations off and use it in the event handler for the escalation.
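
A sketch of guarding the escalation event handler with such a flag; the key name, the string-table accessor, and the email helper are assumptions for illustration:

```csharp
// Read an on/off flag from the K2 string table (accessor is illustrative)
bool escalationsEnabled =
    Convert.ToBoolean(K2.StringTable["EscalationsEnabled"]);

if (escalationsEnabled) {
    SendEscalationEmail(); // hypothetical helper that notifies the manager
}
// When the flag is false the escalation silently does nothing, which
// prevents K2.net from spamming users during testing or migrations.
```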

Monday, September 3, 2007

Automated testing and simulation with K2.net 2003

I initially published this article elsewhere, but I figured it would be a good starting point for my blog.

This article will show how to create automated unit tests which can be used to exercise workflow processes in both K2.net and InfoPath. Creating automated tests or simulations with K2.net 2003 may seem difficult, but it is easy using Visual Studio 2005 test projects. Creating unit tests with Visual Studio is no different than creating a custom web page that uses the K2ROM to finish a worklist item that has been assigned to a user; in this case, instead of embedding code into an .aspx code-behind, we are going to put the code into a test method.

2. Create a Process
The following process is a standard approve/deny process. This particular screenshot shows an approval process built with K2.net. This process would be initiated by a custom page or a K2.net 2003 SmartForm PlanPage. There would then be a second web page which the manager would use to approve the process instance. Finally, an email would be sent based on the manager's decision.

An assumption is made that the reader knows how to create a basic workflow. The specific details of setting destination users, configuring line rules, configuring email, etc. will not be covered.

2.1 Adding a Test ID Data Field
The Visual Studio test project will require an identifier to correlate the process instance with the test instance. K2.net generates an identifier called a Serial Number that uniquely identifies every process, activity, and event instance. For this test we will add a Data Field to the process by going to the Properties of the process, clicking Data Fields, and adding a string field called AutomatedTestID. The value will be generated by the unit test and set through the K2ROM, which is discussed later.

2.2 Add Approval Data Field
For the purposes of this article we will add an Approve Data Field to the Manager Approval activity. This will be used by the manager to approve or deny the workflow.

3. Create Visual Studio Test Project
Start Visual Studio 2005 and select File, then New Project. In the New Project window, select the programming language of choice (in this case C#) and then select the Test option. Within it, select Test Project and choose a location for the project.

3.1 Configuring the Test Project
The following will be created for you by default.

First rename the default class UnitTest1 to SimpleWorkflowTest by right clicking the filename in the Solution Explorer and selecting rename. After renaming the file you will be prompted to rename all references; select yes.

Next add a reference to the K2ROM by right clicking the References node in the Solution Explorer and selecting Add Reference. In the Add Reference window select the Browse tab, go to \Program Files\K2.net 2003\Bin, and select K2ROM.dll.

The resulting project should look like the following.

3.2 Create Test Methods
To simulate this workflow we need to create two test methods. Note that methods must be decorated with the [TestMethod] attribute and the class with [TestClass]; if these are not present, neither the class nor its methods will be used when the unit test is executed. First rename TestMethod1() to SimpleApprovalTest(). Next add two method stubs, one called StartSimpleWorkflow() and the other called ManagerApproval(). Notice that neither of these stub methods is decorated with the [TestMethod] attribute; they will still be executed because SimpleApprovalTest(), the entry point, calls them.
3.3 SimpleApprovalTest Method
Once we have our stubs set up, a unique identifier needs to be generated for the unit test instance, which will be used to correlate it to the process instance. Modify SimpleApprovalTest() to generate a GUID and pass that value into both StartSimpleWorkflow() and ManagerApproval().

[TestMethod]
public void SimpleApprovalTest() {
    //Create the test instance GUID
    //(note: Guid.NewGuid(), not new Guid(), which yields all zeros)
    Guid testInstance = Guid.NewGuid();

    //Start the process
    StartSimpleWorkflow(testInstance);

    //Give the K2 server time to create the process instance
    System.Threading.Thread.Sleep(2000);

    //Approve the process
    ManagerApproval(testInstance);
}

Both of these methods will need to have their signatures modified to accept the GUID.

private void StartSimpleWorkflow(Guid testInstance) {
}

private void ManagerApproval(Guid testInstance) {
}

Note that System.Threading.Thread.Sleep(2000) was added to give the K2 server a little time between activities. Depending on your test server's performance, this value may need to be modified, or the line could be removed completely.

3.4 StartSimpleWorkflow Test Method
Add the statement using SourceCode.K2ROM; to the top of the class file.

Add the following code to the StartSimpleWorkflow(Guid testInstance) method, which will create a new process instance using the K2ROM. This method opens a connection to the K2.net server, sets the GUID on the process instance and folio name, and finally starts the process.

private void StartSimpleWorkflow(Guid testInstance) {
    string connectionString;
    string k2Server;
    string processName;
    Connection conn = new Connection();

    try {
        //Recommend moving these values to an app.config file
        connectionString = "CONNECTION STRING";
        k2Server = "SERVER NAME";
        processName = "SimpleApproval\\SimpleApproval";

        //Open the K2 connection
        conn.Open(k2Server, connectionString);

        //Create the process instance
        ProcessInstance process = conn.CreateProcessInstance(processName);

        //Set the test instance id
        process.DataFields["AutomatedTestID"].Value = testInstance.ToString();

        //Set the K2 folio name
        process.Folio = "Test Simple Approval " + testInstance.ToString();

        //Start the process
        conn.StartProcessInstance(process);
    }
    catch (Exception ex) {
        //Failed validation
        Assert.Fail("The process failed: " + ex.Message);
    }
    finally {
        conn.Close();
    }
}

3.5 ManagerApproval Test Method
In this method a connection is made to the K2.net server and the worklist for the manager is opened, using a WorklistCriteria object with a filter to query for the WorklistItem that has the test instance GUID in its process data field. When the WorklistItem is returned, the activity instance data field is set to "Approve" and the WorklistItem is then finished.

private void ManagerApproval(Guid testInstance) {
    string connectionString;
    string k2Server;
    Connection conn = new Connection();
    Worklist workList;
    WorklistCriteria workListCriteria = new WorklistCriteria();

    try {
        //Recommend moving these values to an app.config file
        connectionString = "CONNECTION STRING";
        k2Server = "SERVER NAME";

        //Open the K2 connection
        conn.Open(k2Server, connectionString);

        //Retrieve the test manager's worklist items
        workListCriteria.AddFilterField(WCLogical.And, WCField.ProcessData, "AutomatedTestID", WCCompare.Equal, testInstance.ToString());
        workList = conn.OpenWorklist(workListCriteria);

        //Find the worklist item with the test identifier in it
        foreach (WorklistItem workListItem in workList) {

            //Set the process for approval
            workListItem.ActivityInstanceDestination.DataFields["Approve"].Value = "Approve";

            //Finish the worklist item
            workListItem.Finish();
        }
    }
    catch (Exception ex) {
        //Failed validation
        Assert.Fail("The process failed: " + ex.Message);
    }
    finally {
        conn.Close();
    }
}

3.6 More Test Methods
More test methods could be created in a similar fashion. For instance, a similar test method could be created for the denial path. As well, if the workflow had multiple steps, a method could be created for every step and chained together to fully test the entire workflow and every possible path it could take.
4. Deployment
It is highly recommended that this not be done on a production server; a testing server should be used to run the unit tests. Otherwise the test process instances will be intermingled with production instances, which will throw off or dilute the reports provided in the K2.net Workspace.

Deployment of the SimpleWorkflow process is the same as for any other process, using the Export functionality in K2.net Studio.

5. InfoPath Testing
It is possible to test InfoPath processes in Visual Studio as well. In the Visual Studio test project, add a web reference to the InfoPathService web service.

Change the StartSimpleWorkflow(Guid testInstance) method and remove all of the code associated with the K2ROM. Instead, call the SubmitInfoPathData method of the InfoPathService web service. Generate the XML for the InfoPath form and pass it into the SubmitInfoPathData webmethod. To get a sample of this XML, open an InfoPath form that has been K2.net-enabled and save it locally. Then open the .xml file in a text editor and copy the XML into Visual Studio. That XML string can be passed into SubmitInfoPathData.

Next in ManagerApproval(Guid testInstance) remove the following line of code.

workListItem.ActivityInstanceDestination.DataFields["Approve"].Value = "Approve";

Instead use this line of code to retrieve the XML for the InfoPath Form.

string infoPathXml = workListItem.ActivityInstanceDestination.XmlFields["K2InfoPathSchema"].Value;

Once the XML has been retrieved, modify it (setting the Approve field) and call the SubmitInfoPathData webmethod again, passing in the XML. Do not set the XML back into workListItem.ActivityInstanceDestination.XmlFields["K2InfoPathSchema"], because that would not accurately test the process; InfoPath will always call the web service.
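
A sketch of modifying the retrieved XML and resubmitting it; the XPath, the namespace URI, and the generated web-service proxy name are form-specific assumptions:

```csharp
// Load the InfoPath XML retrieved from the activity instance
System.Xml.XmlDocument doc = new System.Xml.XmlDocument();
doc.LoadXml(infoPathXml);

// Set the approval field (namespace URI and XPath are assumptions
// that depend on how the InfoPath form template was designed)
System.Xml.XmlNamespaceManager ns = new System.Xml.XmlNamespaceManager(doc.NameTable);
ns.AddNamespace("my", "http://schemas.microsoft.com/office/infopath/2003/myXSD");
doc.SelectSingleNode("//my:Approve", ns).InnerText = "Approve";

// Resubmit through the web service, exactly as InfoPath would
InfoPathService service = new InfoPathService();
service.SubmitInfoPathData(doc.OuterXml);
```
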
6. Conclusions
Very sophisticated testing can be done in this manner, covering multiple steps and every permutation of the workflow. Load testing can be done as well by creating multiple unit test instances to identify bottlenecks in the workflow process. Bottlenecks can sometimes be hardware specific: a connection to an external system may be too slow, tables in a database may require indexes, etc. Using this methodology it is possible to completely test your processes without writing any web or InfoPath front-ends. This creates a highly decoupled solution and is considered a best practice.