Monday, December 31, 2007

Visual Studio 2008 Free MS eBooks

A colleague of mine sent out this link to get free MS Press e-books on LINQ, ASP.NET AJAX Extensions, and Silverlight 1.0.

Introducing Microsoft LINQ by Paolo Pialorsi and Marco Russo (ISBN: 9780735623910)
This practical guide covers Language Integrated Query (LINQ) syntax fundamentals, LINQ to ADO.NET, and LINQ to XML. The e-book includes the entire contents of this printed book!

Introducing Microsoft ASP.NET AJAX by Dino Esposito (ISBN: 9780735624139)
Learn about the February 2007 release of ASP.NET AJAX Extensions 1.0, including an overview and the control toolkit.

Introducing Microsoft Silverlight 1.0 by Laurence Moroney (ISBN: 9780735625396)
Learn how to use Silverlight to simplify the way you implement compelling user experiences for the Web. Discover how to support an object-oriented program model with JavaScript.

Sunday, December 16, 2007

K2 BlackPearl SP1 Released

Service Pack 1 Release

Service Pack 1 for K2 BlackPearl has been released and I am very excited about getting it. Over the past two months I have been working closely with a client to build some pretty basic workflows in WSS 3.0 with client-based InfoPath forms. We knew we would be early adopters using BlackPearl. We knew that building the workflows in K2 2003 and then migrating them to BlackPearl would add additional work, plus that migration path is still under development at K2. Knowing this, we jumped in with BlackPearl. We ran into several small problems which required me to open several tickets with K2. The problem was that the number of small problems and their workarounds added up to one large problem. K2's support worked hand-in-hand with us to identify the issues, and as of right now almost all of the problems have been resolved!

Congratulations to the K2 team for getting this out the door. I expect great things from SP2. :)

Service Pack 1 Issues Resolved for this Project

  • K2 Visual Studio memory leak issues were resolved - Basically, as the processes got larger with more content, and the longer you left Visual Studio open, the slower things got. It is not speedy in SP1, but it is a significant improvement and bearable. In the past it would take minutes to deploy a K2 process, open a process, or even open a wizard within a disconnected VPC.
  • SmartObject Invalid Archive Type Error – Well, I hope this issue will go away; they mention it in the SP1 release notes. Basically, when using SmartObjects with the SmartBox I would get this error out of the blue. No code changes and no reasons; just because. We would find weird ways to resolve the problem, like changing the parameter values on insert or changing the order in which properties were set on insert. K2 Labs had seen the problem and believes it has gone away (I suspect they refactored a bunch of code under the hood which just took care of the problem). I cannot verify this has been fixed as I do not have any of the process instances with the error sitting around anymore.
  • Escalations – There were several issues associated with escalations that were resolved for me. InfoPath forms are now properly cleaned up when Redirecting, using GoTo, and Expiring. The problem was that the file name for the InfoPath form is dynamically generated using the event serial number in the name. So, it is impossible to guess what the InfoPath file name is without going in and modifying K2 wizard code directly to use a custom file naming convention. Going down this path was unrealistic with the number of InfoPath client events that I had. The filename is stored in the activity destination instance, and it looks like they now loop over this in the activity succeeding event and delete all of the InfoPath file instances.
  • Working Hours – Along with escalations, Working Hours were not working and now work in SP1. They have changed since the K2 2003 product. The great thing is you can now configure them in the K2 Workspace outside of the process definition. Now you do not have to redeploy your K2 process when the working hours change, since they are configurable in the K2 Workspace. Plus they are now time zone based, and you can share working hour configurations across processes. I still need to do some research this week on the proper configuration of them, specifically what the right mix is for associating a working hour zone with a process and a user group/role.
  • Dynamic Roles - Roles were not working correctly with Redirect Escalations. The problem was that even if you changed the users who were registered in a role, the redirect escalations would still go to the old users who had been removed from the role. Then I found out the same behavior was occurring when setting a Role as a destination user. Roles are the replacement for Destination Queues in K2 2003. It is a best practice to ALWAYS use a Role inside of a process. You then associate users or groups to the Role, making your processes configurable through the K2 Workspace so, again, you do not have to redeploy your K2 process when there is a change. So the issue was that Roles were not being refreshed within the K2 processes; this has been fixed and all is well now.
  • Environments – Simply stated, they were not working correctly. Basically, Environment templates are a class-like definition for the StringTable (a key-value pair config). You can create reusable configurations that can be used across processes. Now, I have had to go down a specific path with my current processes, but I know they have done a lot of work with this. Hopefully it will be better. Plus I need to do some investigation into whether you can create project-dedicated environments that are not shared. This was a big problem with the release prior to SP1.
  • Reporting Services – Many of the out-of-the-box reports were not available. All of the reports we had in K2 2003 are now there, and all of the issues associated with navigating through the Reporting Services reports have been resolved.
  • Process, Activity and Event Error Handlers are Back – In K2 2003 there was the ability to put in global exception blocks to log, send an email, etc. on error. They were not present in the initial release. They have been added back to BlackPearl, allowing us to send emails to an administrator when an error occurs anywhere in the K2 process (I would still like to see this as part of the notifications framework).
  • Special Characters in K2 Workspace – It was frustrating, but you could not use underscores, dashes, etc. when setting values in the Workspace. So if you had a URL you needed to put into a configuration, you could not. You would have to go into the configuration database and manually fix the problem. Resolved.

Service Pack 1 Issues Still Open for this Project

  • Notification Limitations – Notifications are a new piece that allows administrators and users to hook into all events that are being raised throughout the workflows. The problem is that you cannot get access to XML Fields and that notifications are not re-deployable. The second issue is more problematic because you may set up sophisticated emails in your development environment and then have to manually recreate all of them in your production environment. Plus the notification registration screen is not business-user friendly. It is exactly what a K2 Administrator, Developer, or very strong IT Business Analyst can use. Knowing this, we do not plan to take advantage of this functionality until it gets farther along.
  • Redeployment to Production Environment Limitations – This is still an open issue with redeploying from one environment to another. Specifically, notifications, roles, and environments must be manually recreated in the production environment. I am not sure yet if custom reports will fall into this issue as they are Reporting Services definitions, but there is no push-button way to move them either.
  • InfoPath Client File Locking – This is not in their release notes, but it is an open bug that is still in SP1. Basically, if you are using client-based InfoPath forms and the user does not press the OK button quickly enough (within about 3 seconds) after the successful-submit message is shown within InfoPath, the process will fail with a file locking issue between K2, SharePoint, and the client desktop. A temporary solution is to change the Submit Options to not show the prompt in the advanced options. The problem is that if there is an error on submit the user will not be notified there is an error. Second, every time you deploy the K2 process, K2 will publish the InfoPath form with the confirmation prompt option turned back on. After deploying the process you will have to open the InfoPath form, make the changes, and publish it again.
  • Sending Roles an Email (WORKAROUND) – In the release notes, it is mentioned that a known bug is that you cannot send a Role an email; you will get a compile error. My simple workaround is to make the Role the destination user for the activity, resolve the users in the Role in the destination rule, and then send the email to the destination user. Done.

Saturday, December 15, 2007

SharePoint Migration Level Of Effort

I am posting this blog a little late. A common question is: what is the level of effort to migrate an out-of-the-box SharePoint 2003 implementation, and are there any tools out there that can help me determine this? I have been asked this several times by current and prospective clients. There is a lot of information already out there on the migration from SharePoint 2003 to SharePoint 2007, but this is what I have seen.

Note I am going to refer to SharePoint 2003 generically instead of specifying WSS 2.0 or SharePoint Portal Server. The same goes for SharePoint 2007 instead of calling it WSS 3.0/MOSS.

If I had to give a level of effort: the size of the SharePoint content database can be a factor, but it really does not influence the complexity of a particular SharePoint migration. Running the Prescan Tool on your content databases will provide the best information to understand what you are up against when migrating to SharePoint 2007.

To run the Prescan Tool you must install SharePoint 2003 SP2. The Prescan Tool can be downloaded here. The Prescan Tool will find:

  • Sites that are based on languages or that use controls that are not installed
  • Custom Web Parts
  • Orphaned data objects
  • Customized site templates

The first one I have not run into yet, and the second one is nothing to be concerned about either, because most custom web parts "should" be compatible with SharePoint 2007.

The third one could pose some problems, but many of the issues can be resolved by applying the SharePoint 2003 post-SP2 hotfix. A good resource for understanding the errors that are thrown during the migration process is Bill Baer's blog on the subject. Many of these issues can and will be resolved pretty quickly using this hotfix. You will run the hotfix's new database repair command and then re-run the Prescan Tool to determine if the errors have been resolved. If orphaned data objects still remain, you will have to evaluate each one and determine if there are critical pieces of data that must be migrated.
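As a point of reference, the repair-and-rescan loop can be driven from a command prompt. The URL and database name below are placeholders for illustration, and the exact switches depend on the hotfix version, so verify them against the hotfix's KB article before running anything:

```
REM Report orphaned objects found in a content database (placeholder URL/database)
stsadm.exe -o databaserepair -url http://portal -databasename SPS01_Content

REM After reviewing the report, delete the orphaned objects
stsadm.exe -o databaserepair -url http://portal -databasename SPS01_Content -deletecorruption

REM Re-run the Prescan Tool to verify the errors are gone
prescan.exe /all
```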

The fourth one is probably the most difficult issue you will come across when doing your migration. I have seen several clients say, "we have not done any customizations and we are using SharePoint 2003 out-of-the-box." In many of those cases we found these statements to be untrue, with 30% to 80% of their SharePoint implementation having been un-ghosted (customized). This issue is extremely problematic because if a site/page has been customized, the presentation of that page will not take on the new SharePoint 2007 presentation, which is what many want. A common way a site becomes un-ghosted (or customized) is when FrontPage is used or when a developer has gone in and directly modified the templates in the hive. Note that changing the templates in the hive is an unrecommended practice for both SharePoint 2003 and 2007.

In the end, for un-ghosted (customized) site templates you will have to evaluate each one individually and determine what the migration path is for it. If the specific page has been edited with FrontPage, then you could simply revert the page back to the original template. In many cases this is a sufficient solution, as many of the changes made with FrontPage are presentational in nature. If a SharePoint 2003 site template was purchased or created, you will need to migrate the site definition using an XML mapping file that maps the template definition to the SharePoint 2007 definition. Creating this XML mapping file requires more effort depending on the complexity. I would ensure that you allow enough time to do several test runs of this. This can be done by simply backing up the content database and bringing it back up on a different test server.

Using the Prescan tool will help you determine all of this and assist you with understanding the effort for migrating to SharePoint 2007.

Thursday, November 15, 2007

Create Custom BlackPearl SmartObject Service

1) Introduction

The purpose of this article is to present the basic steps that are needed to create a custom service for BlackPearl SmartObjects. This article will give the details of how to quickly create and update a SmartObject Service. Then I will show you how to create a SmartObject that uses the custom service you have authored.

This article will not dive into the details of best practices on how to build SOA-compliant services. I highly recommend going to the Microsoft Patterns & Practices website and reviewing their patterns for web services. SmartObject Services are not web services; however, the architecture used to build one is comparable.

I would also like to thank Codi at K2 for providing me with an early version of the 201 training materials (currently under construction), which allowed me to spin up on this. Much of the content of this article is derived from that training module.

2) When Do You Need to Write a Custom SmartObject Service

You will want to create a custom SmartObject Service when you want to read in data from an existing custom or vendor database. K2 provides some SmartObject Services out of the box:

  • SmartBox (cannot be used for custom SQL Databases)
  • SharePoint
  • Active Directory
  • K2 [blackpearl]
  • K2 201 Training Materials and SDK show how to build a DynamicSQL service that can hook into any SQL Database. It is built to dynamically define its interface by reading the table schema and building an interface around it.

K2 plans to build more SmartObject Services to hook into other enterprise products

3) Summary Steps to Create

These are the summary steps to create a SmartObject Service:

  • Create a Service Broker class.
  • Override all of the required methods and define the SmartObject Service interface in this class.
  • Build the service and deploy the dll.
  • Register the SmartObject service.
  • Create a Service Instance of the SmartObject service in the K2 Workspace.
  • Use a method of the SmartObject Service in a SmartObject.

4) Summary Steps to Update

These are the summary steps to update a SmartObject Service:

  • Update the service.
  • Stop the K2 BlackPearl Service on the server.
  • Replace the dll.
  • Restart the K2 BlackPearl Service on the server.
  • Refresh all service instances created within the K2 Workspace.

5) Detailed Steps to Create

5.1) Create Class Library

Create a standard class library project. You will need to add a reference to SourceCode.SmartObjects.Services.ServiceSDK, which is located at \\Program Files\K2 blackpearl\Host Server\Bin\. It is suggested that you change the Copy Local property of the reference to False.

Add the following using statements (System.Data and System.Diagnostics are needed by the code later in this article):

using System.Data;
using System.Diagnostics;
using SourceCode.SmartObjects.Services.ServiceSDK;
using SourceCode.SmartObjects.Services.ServiceSDK.Objects;
using SourceCode.SmartObjects.Services.ServiceSDK.Types;

5.2) Create SmartObject Service

Create a class that inherits from ServiceAssemblyBase. Note that I tried creating more than one class that inherits from ServiceAssemblyBase in the same class library, and the second one would never be recognized. If you want to create another SmartObject Service class, you will have to create a new class library project. I created a class called MyCustServiceBroker.

public class MyCustServiceBroker : ServiceAssemblyBase
{
    public MyCustServiceBroker()
    {
    }
}


You need to override the following methods: GetConfigSection(), DescribeSchema(), Execute() and Extend().

5.2.1) GetConfigSection()

GetConfigSection() is a method used to define configuration values for the service instance. If your service is going to make a database connection or needs a URL to connect to another web service, you will add a configuration value here. An administrator will enter the value within the K2 Workspace.

All you need to do is add values in a key/value pair fashion, similar to creating a config file.

public override string GetConfigSection()
{
    this.Service.ServiceConfiguration.Add("Connection", true, "Default Value");

    return base.GetConfigSection();
}

5.2.2) DescribeSchema()

DescribeSchema() is the method that is used to define the interface for the SmartObject service. The values that you set within this interface will dictate how developers will hook their SmartObjects into your SmartObject Service.

First, you will define the service. This information will be presented to the developer when they are selecting a service to use with a SmartObject. Second, you will create a ServiceObject. This broker class can have one to many ServiceObjects; I tend to think of a ServiceObject as a class definition. Third, each ServiceObject will have properties, which are used to pass values in and out of methods. Fourth, you will need to add methods to your service; in this example I created a Load method. There is a finite set of method types you can create: Create, Delete, Execute, List, Read and Update. Take special note of how I add the properties to the Validation, Input and Output collections. A developer will map the fields of their SmartObject to the method properties of a ServiceObject.

Side note: you could possibly use the Execute type for custom method(s) you want to write for the service. You could create an enumeration property where each enumeration value maps to a custom method that you have. Then you could have an XML property to pass custom parameter(s) for the specific enumeration. I would need to spend a little time testing this out, but it should work.

public override string DescribeSchema()
{
    //set base info
    this.Service.Name = "MyCustomService";
    this.Service.MetaData.DisplayName = "My Custom Service";
    this.Service.MetaData.Description = "The simple little service.";

    //Create the service object (one to many per broker)
    ServiceObject so = new ServiceObject();
    so.Name = "MyCustomServiceObject";
    so.MetaData.DisplayName = "My Test Service";
    so.MetaData.Description = "Use for my test service.";
    so.Active = true;

    //Create field definition
    Property property1 = new Property();
    property1.Name = "MyField1";
    property1.MetaData.DisplayName = "My Field 1";
    property1.MetaData.Description = "My Field 1";
    property1.Type = "System.String";
    property1.SoType = SoType.Text;

    //Create field definition
    Property property2 = new Property();
    property2.Name = "MyField2";
    property2.MetaData.DisplayName = "My Field 2";
    property2.MetaData.Description = "My Field 2";
    property2.Type = "System.Int32";
    property2.SoType = SoType.Number;

    //Add the properties to the service object
    so.Properties.Add(property1);
    so.Properties.Add(property2);

    //Create method
    Method method = new Method();
    method.Name = "Load";
    method.MetaData.DisplayName = "Load";
    method.MetaData.Description = "Load custom service data";
    method.Type = MethodType.Read;

    //Add the properties to the Validation, Input and Return collections
    method.Validation.RequiredProperties.Add(property1);
    method.InputProperties.Add(property1);
    method.ReturnProperties.Add(property1);
    method.ReturnProperties.Add(property2);

    //Register the method and the service object
    so.Methods.Add(method);
    this.Service.ServiceObjects.Add(so);

    return base.DescribeSchema();
}

5.2.3) Execute()

Execute() is the method that is used to persist and retrieve data.

Note that you should add errors to the ServicePackage object. This will surface error messages in a standard way to all callers who are executing the custom ServiceObject method through a SmartObject.

public override void Execute()
{
    //Optional: log to the event log on the K2 server
    EventLog log = new EventLog("Application", "localhost", "K2 BlackPearl Server");

    foreach (ServiceObject so in Service.ServiceObjects)
    {
        try
        {
            switch (so.Name)
            {
                case "MyCustomServiceObject":
                    ExecuteCustomService(so);
                    break;
                default:
                    throw new Exception("Service Object Not Found");
            }
        }
        catch (Exception ex)
        {
            string errorMsg = Service.MetaData.DisplayName + " Error >> " + ex.Message;
            ServicePackage.ServiceMessages.Add(errorMsg, MessageSeverity.Error);
            ServicePackage.IsSuccessful = false;
        }
    }
}

private void ExecuteCustomService(ServiceObject so)
{
    foreach (Method method in so.Methods)
    {
        switch (method.Type)
        {
            case MethodType.Read:
                ReadCustomService(so, method);
                break;
            default:
                throw new Exception("Service method undefined");
        }
    }
}
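Following up on the side note earlier about using the Execute method type for custom operations, a hypothetical dispatch might look like the following. The "Operation" property and the routine names are assumptions for illustration, not part of the service built in this article:

```csharp
//Hypothetical: an enumeration-style "Operation" property on the service object
//selects which custom routine an Execute method runs.
private void ExecuteCustomOperation(ServiceObject so, Method method)
{
    string operation = Convert.ToString(so.Properties["Operation"].Value);

    switch (operation)
    {
        case "Archive":
            //ArchiveRecords(so);       //custom routine, not shown
            break;
        case "Recalculate":
            //RecalculateTotals(so);    //custom routine, not shown
            break;
        default:
            throw new Exception("Unknown custom operation: " + operation);
    }
}
```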

//This method shows how to return a single row of data; this would
//be used for Read and Create methods (when calling Create you will
//want to return the primary key of the new record)
private void ReadCustomService(ServiceObject so, Method method)
{
    //Add in code here to retrieve data from any external data source and
    //load it into the result set for this method.
    so.Properties["MyField1"].Value = "Value 1";
    so.Properties["MyField2"].Value = "2";
}

//This method shows how to return a collection of data using a DataTable.
//This would be used for a List method.
private void ReadCollectionCustomService(ServiceObject so, Method method)
{
    //Add in code here to retrieve data from any external data source and
    //load it into the result set for this method.
    DataTable resultTable = this.ServicePackage.ResultTable;
}
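To make the List pattern concrete, here is a self-contained sketch of loading rows into a result table. It uses plain ADO.NET with no K2 dependencies; in a real broker the table comes from this.ServicePackage.ResultTable, and the column names here simply mirror the MyField1/MyField2 properties defined above:

```csharp
using System.Data;

class ResultTableSketch
{
    static DataTable BuildResultTable()
    {
        //In a real broker this DataTable is this.ServicePackage.ResultTable;
        //here we build one locally just to show the row-loading pattern.
        DataTable resultTable = new DataTable("Results");
        resultTable.Columns.Add("MyField1", typeof(string));
        resultTable.Columns.Add("MyField2", typeof(int));

        //Each record from your external data source becomes one row.
        resultTable.Rows.Add("Value 1", 1);
        resultTable.Rows.Add("Value 2", 2);

        return resultTable;
    }
}
```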

5.2.4) Extend()

Unfortunately I do not have much information on Extend(), but it is not needed for this service.

public override void Extend()
{
    //throw new Exception("The method or operation is not implemented.");
}

Note that there are several other methods that you may need to override later. For instance, there is a Rollback method that should be used when the SmartObject method transaction has been configured to roll back changes if the transaction fails.

5.3) Build and Deploy It

Build the library. Then get the dll and place it in \\Program Files\K2 blackpearl\ServiceBroker.

5.4) Register the SmartObject Service

There is an executable in \\Program Files\K2 blackpearl\ServiceBroker called BrokerManagement.exe. Double-click it, click Configure Services, then right-click the Services node and select Register New Service Type. Fill in all of the required information and find the class library dll that was placed in \\Program Files\K2 blackpearl\ServiceBroker. Press OK and your SmartObject Service is now available for use.

5.5) Create a Service Instance

Now you need to create an instance of the SmartObject Service you have defined. Instances of your SmartObject Service are created through the K2 BlackPearl Workspace. If you wrote a very generic service, you can create multiple service instances and use your configurations to make connections to different data sources.

Open the K2 BlackPearl Workspace >> go the Management Console >> select the BlackPearl Server >> SmartObjects >> Services >> My Custom Service

On this screen, press the Add button and fill in all of the configuration information that was defined in the GetConfigSection() method.

5.6) Use the Service Method in a SmartObject

The next step is to start using the methods of your SmartObject Service in a SmartObject, just like you would with a service that comes with BlackPearl (e.g. SmartBox, Active Directory).

6) Detailed Steps to Update

6.1) Update the Service

Make any modifications you need to your service.
6.2) Stop the K2 BlackPearl Service

Go to Start >> Administrative Tools >> Services >> K2 [blackpearl] Server. Stop the service.
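If you prefer a command prompt, the service can also be cycled with net stop/net start. The service name below is an assumption; use the exact name shown in the Services console for your installation:

```
net stop "K2 blackpearl Server"
REM ...replace the dll in \Program Files\K2 blackpearl\ServiceBroker...
net start "K2 blackpearl Server"
```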
6.3) Replace the SmartObject Service dll

Drop in the new dll, overwriting the old dll in \\Program Files\K2 blackpearl\ServiceBroker.

6.4) Restart the K2 BlackPearl Service

If you have not closed the Services console, simply restart the service.

6.5) Refresh Service Instances

Start the BrokerManagement.exe executable in \\Program Files\K2 blackpearl\ServiceBroker. Right click the service node and select Refresh All Service Instances.

7) Create a SmartObject using the Custom SmartObject Service

Create a SmartObject project or add a new SmartObject to an existing project. Click the Advanced Mode button at the top of the screen. This is important because this SmartObject will have fields that are not in the SmartBox database. When adding new fields, make sure to uncheck the SmartBox column. Simply add two new fields called MyField1 (string) and MyField2 (integer) and uncheck SmartBox for each. Now remove all methods from the bottom except for Load. Select it and press the Edit button. Run this wizard in advanced mode and select Next.

On the Method Details page make sure the type is Read and the Transaction is Continue.

Click Next twice, and on the Service Object Methods screen remove the SmartBox service; we will not use it. Press the Add button. Press the ellipsis button and in the tree select ServiceObject Server(s) >> Service Object Server >> My Custom Service >> My Test Service >> Load. That will load in all of the Input and Return properties that were defined earlier. Now assign those properties to fields on the SmartObject and you are done!

Now you can start using your SmartObject that is Loading from a custom data source.
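As a quick sanity check, you can also call the finished SmartObject from code using the SourceCode.SmartObjects.Client API. This is a hedged sketch: the SmartObject name "MyCustomSmartObject" and the connection string are assumptions for illustration, so adjust them to match your environment:

```csharp
using SourceCode.SmartObjects.Client;

class SmartObjectSmokeTest
{
    static void Main()
    {
        SmartObjectClientServer server = new SmartObjectClientServer();
        server.CreateConnection();

        //Hypothetical connection string; point this at your K2 host server.
        server.Connection.Open("Integrated=True;IsPrimaryLogin=True;Host=localhost;Port=5555");

        try
        {
            //"MyCustomSmartObject" is the SmartObject created in this section.
            SmartObject smo = server.GetSmartObject("MyCustomSmartObject");
            smo.MethodToExecute = "Load";
            smo = server.ExecuteScalar(smo);

            System.Console.WriteLine(smo.Properties["MyField1"].Value);
            System.Console.WriteLine(smo.Properties["MyField2"].Value);
        }
        finally
        {
            server.Connection.Close();
        }
    }
}
```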

Wednesday, October 31, 2007

November K2 User Group Presentation

Update 4/16/2009 - Here is the recorded presentation

I will be presenting virtually at the K2.Net User Group Meeting on Tuesday, November 13, 11am-1pm Central Time. The actual presentation will be between 11:45-12:45 Central Time.

I plan to demonstrate how to build a simple custom SmartObject Service.

For information on how to attend, please go to the following -

Sunday, October 28, 2007

K2 BlackPearl Initial Impression and 101 Training


A few weeks ago K2 sent out an early version of their training VPC for BlackPearl. The training VPCs for 2003 were invaluable for building early iterations of processes, building up demos and proofs of concept, etc.

These notes are not comprehensive and will require some more research on my part as I get more acclimated with all the new features and functionality.

Notes and Improvements

The following are notes that I gathered while going through the initial set of materials.

  • SmartObjects are the new big thing they have released, and they are pretty impressive. SmartObjects provide a way to access business data from your line-of-business databases. With 2003 you had to build your own service layers and then integrate them into the processes. Now BlackPearl provides the services layer to integrate business data across the enterprise. There are two types of SmartObjects. From my initial research, the first are SmartObjects for which the developer defines the data structure and the data is stored in the SmartBox server. You can now very rapidly create data structures to persist your workflow data to. The second are SmartObjects which you can code up using their SDK, and which can load from any disparate data source available, given you have a service of some type to work with. The data is not cached within the SmartBox server; you are just building a proxy based on a standard interface to retrieve data and surface it up through K2. This deserves a deeper dive, and some best practices will develop with time.
  • BlackPearl has tried to make the creation of line rules easier with the addition of Actions. When building events you define actions that an end-user can select from. When the wizard is done, BlackPearl will generate all of the lines which can be used to connect to other activities. This assumes the conditional logic is somewhat simple, which is not always the case. In many cases K2 will create a dropdown for users to select these actions, making your development of processes very quick. So if I had a process where a manager should be able to select approve, deny, or resubmit, I would create those three actions, and within InfoPath a dropdown will be added by K2 where those three values would appear (do not worry, it is not hard coded into the InfoPath form). These actions are also available everywhere throughout the Workspace, SharePoint web parts, etc. as context menu items, allowing the user to take quick action on an event instance without having to open a form.
  • There is a neat new event called the Forms Generation Event, which is a client event to create very basic forms within the new Workspace. Although, given how simple it was, I still need to test the boundaries of it to see how feasible it is to use when real heavy-duty business requirements are thrown at it. Nothing replaces writing a custom page.
  • BlackPearl now provides a reporting solution, which was not available in 2003. This was a major problem with K2 2003, and the solution was to persist all of your data externally to support reporting. K2 is still not the best place to store your business data, and I still say it is best to keep that data external (the new SmartObjects help with this best practice). Regardless, all of the reporting within BlackPearl is now based completely on Reporting Services. They now provide several wizards to generate your own reports within the K2 Workspace. What is really exciting is that you can write your own custom reports in Reporting Services and then host them within the Workspace. Reports associated with workflow may not always be workflow-centric, and there may be multiple data sources from which you need to retrieve data. Plus they have written their own data provider to access their data, so you have immediate access to all data within K2 to create sophisticated reporting using Reporting Services. This is a huge improvement.
  • Users can now sign up for Notifications for processes, events, activities, destination rules, line rules, start rules, succeeding rules and escalations. This is great because with 2003 I continually had to design around not making K2 a spam machine. Now end-users are empowered with the ability to subscribe to events.
  • BlackPearl provides a web-enabled process design tool that is embedded within SharePoint. The tool is better than SharePoint Designer for creating standard business workflows, as you can build workflows with InfoPath forms, the workflows are re-usable, and it is easier to create custom activities. Still, this tool is scoped toward creating workflows within SharePoint only, as it is only meant for designing workflows for a list or a library. I still need to test the boundaries of this tool. Given how easy it is to create workflows in Visual Studio using K2, I am more inclined to work with Visual Studio at the moment.
  • BlackPearl now provides a way to create processes with Visio. To be honest, when I first heard about this I was excited but skeptical it would provide a way to create good workflows. During my initial testing it turned out to work well and I was pleased with it. I will still need to test the boundaries of this. The great thing about using Visio is that when defining the business process with business users during the elaboration phase, I always use Visio to define the process. Although there is not always a one-to-one mapping with the activities and events that I want to expose the business user to. So now if I create the workflows using Visio, I can mark them up and then move them over to Visual Studio.
  • The integration with InfoPath 2007 has improved. Things to be excited about are the ability to support web-enabled InfoPath forms, having multiple InfoPath forms in a single process, the InfoPath forms being integrated with Visual Studio along with your K2 process, and the deployment of the InfoPath form with the K2 process (you no longer have to publish your InfoPath form, K2 will publish it for you). Although adding .NET managed code to your InfoPath form has some problems due to the new MOSS requirement that any form with .NET managed code must be deployed through Central Administration. I will be doing a deeper dive into this soon.
  • BlackPearl provides a ton of integration with WSS 3.0 and MOSS. There are administrative event handlers for provisioning new sites and for managing user permissions, groups and lists. These are very important, as too often I go to companies where their SharePoint implementations have grown out of control because of no governance. Sometimes, if the governance is too tight, this can become a barrier to usage, as business users cannot get what they need on a timely basis. These event handlers can be used to define lightweight business processes to automate the management of an enterprise MOSS implementation.
  • For lists and document libraries in SharePoint, BlackPearl provides event handlers for almost all of the operations you could need: create, update, delete, check in/out, manage permissions, move, copy, update metadata, download/upload documents, etc.
  • BlackPearl provides search event handlers that allow the developer to build up criteria to find items in SharePoint and then take action on them. I did not use Reflector to go underneath the hood to see what is being done, but it looks like these events are basically creating CAML queries that go out and find items.
  • BlackPearl provides some new Records Management event handlers (create and delete records, as well as put a hold on a record). In my initial research into MOSS Records Management a few months ago, I found that there is no workflow around the actual movement of documents into the records center. We have to rely on business users to go out, find a document and then move it to the records center. In many situations it is not that simple, as there will be some sort of approval process before the document should be moved over. BlackPearl can be leveraged to build that business process.
  • BlackPearl provides several event handlers around managing web content. As you may know, CMS has been rolled into MOSS, which means new content management templates, publishing pages, etc. BlackPearl provides event handlers that can create, copy, move, delete and check in/out publishing pages. As well, you can get and update publishing page content, create pages using converters, and create and update reusable content. I can see how this can be used extensively to create workflow around the management of content.
  • BlackPearl provides integration with the BDC to expose SmartObject data. As you may know, it can become difficult to create the XML configuration for the BDC to read data. If you are already loading line-of-business data into your SmartObjects, it is almost no effort to expose any SmartObject through the BDC into SharePoint. It was really simple to hook a SmartObject up to the BDC, as K2 has created some custom screens within SharePoint Central Administration to facilitate this. So if you have some SmartObjects with Oracle, SAP and SQL data, all you need to do is use K2's BDC integration and you are done.

Miscellaneous Notes

  • Within Visual Studio they have introduced what they call environment configurations. They are similar to the K2 2003 StringTable but are reusable between processes.
  • Improved presentation and integration with Visual Studio. Their implementation of WPF for their wizards is really clean, and it is exciting to see a real implementation of WPF. They also implemented this neat little thing where the developer can draw letters on the Visual Studio canvas to create a new activity. Is it necessary? Not really; but it is neat.
  • The BizTalk events were not present on the VPC. I found out they have not been released yet. This is not terrible, as it is still possible to invoke BizTalk through other means, but I have used the K2 BizTalk adapters on two projects.
  • Getting into the properties of an event or activity is much easier now.
  • Workflows can be initiated directly from the workspace. In the past, we would have to write little stubs to kick off workflow.
  • The K2 Service Manager is now part of the K2 workspace. The K2 workspace is still geared towards the power user as it can still expose too much information to the common everyday business user.
  • They have created a new concept called "Roles" which seem to be similar to the Destination Queues which were available in 2003.
  • K2 deprecated a couple of event handlers; the SQL and Web Service event handlers are gone. Good riddance :)

Tuesday, October 16, 2007

MOSS Power User References and Training

Many clients have asked for references for not just technical training but also power user training and references for MOSS. This should complement an earlier recommendation for technical references and training. I recommend picking up a copy of SharePoint 2007 User's Guide: Learning Microsoft's Collaboration and Productivity Platform by Seth Bates and Tony Smith.

This series so far has been really good for SharePoint 2007. The book lays out step by step instructions on how to do things. If you would like to read an introductory chapter please go here.

I would also highly recommend using the Office SharePoint 2007 Help and How to Site, as there is a ton of information there with steps on how to do everyday sorts of things.

As for training I am going to again refer to MindSharp as they offer a SharePoint 2007 Power Users Course.

Monday, October 15, 2007

BlackPearl and MOSS Workflow Options

1. Background
A common question is whether you should use Windows Workflow Foundation (WF) or SharePoint Designer. The following are some quick things that should be considered.

1.1 General WF
WF architecturally has the look and feel of K2 and we know the K2 BlackPearl release is built completely on WF. Here is a high level context diagram of it.

A few things to keep in mind:

  • Have to build low level interfaces to move data between applications and the workflow definitions.
  • Have to build a hosting application to manage transactions (especially long-running transactions).
  • Have to create a transaction management and logging database.
  • Have to build ACID transaction management using WF activity library.
  • Must create graceful and generic exception handling.
  • No "out of the box" integration with any applications in the Microsoft stack.

This is not a comprehensive list; it is based on observations of other projects I have seen try to do complete WF solutions from scratch. WF is a great foundation to build solutions upon, but it could be equated to "plumbing". If you have done work with BizTalk, you may know it is a powerful Enterprise Service Bus solution. BizTalk has orchestrations, which would consistently confuse stakeholders into thinking that once a simple diagram is drawn and configured, the project is done. We know this is not the case, and the same analogy can be drawn for WF.

WF is similar in nature for the workflow world. It is very interface driven and requires lots of custom code and a high level of effort. Some return on investment is gained with its activity library abstractions, but not enough. As well, adoption of pure WF solutions has not gotten off the ground when it comes to building custom applications.

1.2 SharePoint 2007 Workflow

There are two options when doing SharePoint workflow: the first is building workflows in SharePoint Designer 2007 and the second is building workflows in Visual Studio.

SharePoint Designer 2007 is a rebuild of MS FrontPage made specifically to work with SharePoint. Some disadvantages of building workflow in SharePoint Designer 2007 are:

  • Meant for business users as a code free workflow solution for managing items within SharePoint only.
  • When using SharePoint Designer 2007 your site can become customized (unghosted). Allowing business users to have elevated privileges to use SharePoint Designer 2007 is not good either.
  • Workflows are not reusable and are bound to the list (there are some known ways of doing it but it is not natural; the official answer is it is "NOT supported" - Porting SharePoint Designer Workflows to Visual Studio). Note this makes it difficult to move them between development, QA and production environments.
  • Limited conditional logic.
  • Cannot add custom code to workflow from the designer (read the whitepaper called "Adding Activities to SharePoint Designer 2007" in the ECM Starter Kit).
  • Form integration is not robust (ASP.NET or InfoPath).
  • Centers around workflows that manage documents, send emails and create tasks only.
  • Supports sequential flows only.
  • No ability to debug.

The second option is to create custom WFs in Visual Studio using the architecture discussed in the first section. From information I have gathered from colleagues who have done pure MOSS WF solutions, they were difficult. Some disadvantages of doing MOSS workflow in Visual Studio are:

  • Provide basic activities and events for MOSS only.
  • Difficult to deploy.
  • Robust audit and metric data must be built up.
  • Creating large multi-step processes difficult.
  • Integration with other platforms must be built from scratch.

1.3 References to Learn About Both
To try your hand at both SharePoint Designer and Visual Studio Workflow try this virtual lab.

As well check out the book I recommended here.

Also recommend reviewing these two links Workflow Development in Office SharePoint Designer and Workflow Development Tools Comparison.

2. BlackPearl and MOSS

BlackPearl simply becomes an obvious choice once you start peeling back the layers. It is built completely on top of WF and removes every one of the issues stated above. Out of the box:

  • Comes with event handlers to do every operation you need with a SharePoint List or Document Item: updating metadata, moving, deleting, creating, batch operations, modifying document permissions, checking in/out, etc. They basically provide wizards to generate and customize Features, requiring the developer to write less code.
  • They provide several events for publishing content supporting the Content Management Server (CMS) features that were rolled into MOSS.
  • They have some out of the box events to build workflow around Records Management.
  • Creating complex multi-stage InfoPath 2007 processes is supported (this includes web enabled forms).
  • Provides several events for SharePoint site administration: events for managing users and permissions, managing lists/libraries, provisioning sites and workspaces, etc.
  • They provide an Ajax tool to build workflows on the SharePoint site itself.
  • Workflows can be authored inside of Visio.

There is really so much more; this is just the tip of the iceberg for BlackPearl.

Saturday, October 13, 2007

Copy SPListItem.Version History with BlackPearl Part 2

1. Background

Copy Version History with SPListItemVersion to a new SPListItem Part 1 -

In a prior entry we were researching how to move the history from one SPListItem to another SPListItem that will be archived to another location. While doing research on this, we decided to check BlackPearl to see if it supported copying the version information. Using Reflector we were able to see that BlackPearl does not support the copying of version history when an SPListItem is copied from one list to another.

2. Resolution
The resolution is simple and still done the same way as it would be in 2003. Basically we need to go modify the code; the only difference is that we are now writing code in Windows Workflow Foundation (WF) instead of in a standard event handler.

Now go to the Event, right click on it and select to view the Event Item.

This will take you to a screen where you will see the underlying WF code which BlackPearl is built on. Just seeing this makes me giddy; even the underlying code is presented in a graphical format, allowing developers to zero in on the exact thing they need to modify. Writing code in this manner forces developers to write more modular code which is easier to understand and maintain.
There are two ways you can make this modification.

2.1 Option 1
One option would be to double click on the CopyListItem code event handler; you will be taken to a code screen that you can start modifying.

2.2 Option 2
The second option would be to add an if statement and then add a new code event to the workflow. This is the best solution, as it segregates the custom code from the generated code. Second, all of the K2 SharePoint events use K2's custom web services, which we will not use. Instead we would insert the code which was discussed in the first part of this article series.

Copy SPListItem.Version (SPListItemVersion) Part 1

7/13/2008 - Going through and cleaning up code samples in popular posts. The old tool I used stunk. At least this is better. I apologize for the inconvenience.

10/14/2009 - I have created a new blog posting with updates as there were some issues with this original posting. Please go here.

1. Introduction

In MOSS you may have a requirement to take list items and move them to another location. A simple scenario would be that you have an item in a list and when a specific value is set you want to move that item to another location. One problem you will run into when copying the item from one list to another is ensuring that the version history of the item is moved with the item. This article will discuss how you should go about fixing this problem, as well as a quick blurb on how this would be implemented in BlackPearl.

2. Available Methods and Approach

There are several methods on SPListItem that you may be tempted to use. Specifically, there are the CopyTo() and CopyFrom() methods. After some research online and using Reflector to dive into Microsoft.SharePoint, neither of these methods can be used, because the file name must be passed in as part of the URL parameters for the methods. With a standard list item object the file will be null, making these methods useless unless you are working with a document object. As well, these methods do not support the copying of the version history.

The only viable solution is to loop over all of the values in the SPListItemVersionCollection and set them into a new SPListItem. The Restore() method of SPListItemVersion cannot be used, as it will take a specific version and restore it as a new version within the current SPListItem object. While looping over each SPListItemVersion, Update() will be called on the new SPListItem object to rebuild the version history in the new object.

3. Solution Outline

An outline of the solution is as follows:

  • Create an Event Handler for the ItemUpdated event.
  • Create a new SPListItem object that will be the item that will be archived.
  • Loop over the Versions property of the source SPListItem. When looping over the item you must loop going backwards as the last item in the SPListItemVersionCollection is the first version saved.
  • Set each field from SPListItemVersion and call update on the new SPListItem (if there are 50 versions, then update will be called 50 times).
    • Note that index zero is always the latest version that a user would edit through SharePoint.
  • Then move the file attachments to the new SPListItem.
    • Note that attached documents to a SPListItem are not versioned. If a document needs to be versioned, it is best to put that document in a document library that is versioned controlled and then link the document to the SPListItem.
  • Deploy the Event Handler as a Feature.
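To make the replay order concrete, here is a small illustrative sketch in plain Python (dicts stand in for SPListItem field values; `replay_versions` and the sample field names are made up for illustration, not part of the SharePoint API):

```python
# Sketch of the version replay described above. SharePoint hands back
# versions newest-first (index 0 is the latest), so the copy walks the
# collection backwards and "updates" once per version, rebuilding the
# history oldest-to-newest in the new item.

def replay_versions(source_versions):
    """source_versions: list of {field: value} dicts, newest first."""
    archive_item = {}
    archive_history = []          # what each Update() call would snapshot

    # The last element is the first version ever saved, so walk backwards
    for version in reversed(source_versions):
        archive_item.update(version)
        archive_history.append(dict(archive_item))

    return archive_item, archive_history

versions = [
    {"Title": "Req-001", "Status": "Closed"},  # index 0: latest version
    {"Title": "Req-001", "Status": "Open"},    # index 1: first version saved
]
item, history = replay_versions(versions)
print(item["Status"])  # Closed
```

The point of the sketch is the direction of the loop: replaying oldest-to-newest means the final state of the archive item matches index zero, the version a user would see in SharePoint.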

4. Code

4.1 Feature

This article will not discuss the particulars of creating Features, but the files are shown here for reference. NOTE – Change the GUID.

<Feature Id="[GUID]"
  Title="Item Archive Event"
  Description="This event will archive an item when it is closed"
  Scope="Web"
  xmlns="http://schemas.microsoft.com/sharepoint/">
  <ElementManifests>
    <ElementManifest Location="Elements.xml" />
  </ElementManifests>
</Feature>

4.2 Elements of Feature

<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Receivers ListTemplateId="100">
    <Receiver>
      <Name>ArchiveItemEvent</Name>
      <Type>ItemUpdated</Type>
      <Assembly>...Events, Version=..., Culture=neutral, PublicKeyToken=...</Assembly>
      <Class>ArchiveItemEventHandler</Class>
      <SequenceNumber>10000</SequenceNumber>
    </Receiver>
  </Receivers>
</Elements>

4.3 EventHandler Class

public class ArchiveItemEventHandler : SPItemEventReceiver {

public override void ItemUpdated(SPItemEventProperties properties) {

4.4 Create Folder

This solution is based on moving items to an archive folder which is dynamically generated every month. The code below creates a new folder in SharePoint based on the current month.

private SPListItem GetArchiveFolder(SPList list) {
//Get the folder for the current month
//(the year suffix is reconstructed; the original line was truncated)
string folderName = System.DateTime.Now.Month.ToString() +
"-" + System.DateTime.Now.Year.ToString();

SPListItem destinationFolder = null;

foreach (SPListItem folder in list.Folders) {
if (folder.Name == folderName) {
destinationFolder = folder;
break;
}
}

if (destinationFolder == null) {

//Create new folder
//(the first Add() argument is reconstructed; the original was truncated)
destinationFolder = list.Items.Add(
String.Empty, SPFileSystemObjectType.Folder, null);

if (destinationFolder != null) {
destinationFolder["Name"] =
"Archive Closed Records (" + folderName + ")";
destinationFolder.Update();
}
}

return destinationFolder;
}
4.5 Copy the Versions into a new Item

This code is the meat of the solution. A couple more notes:

  • A new archive item SPListItem is created in the archive folder location.
  • The DisableEventFiring() and EnableEventFiring() methods wrap the code. This is needed because there will be an Update() call for every version.
  • Note again that the looping of the versions starts with the last item in the collection and works backwards.
  • There is code to accommodate Created, Created By, Modified and Modified By. They are read only fields. If they are not set, the values will be lost and replaced with the system account and the time at which the code executed.
  • After all of the versions have been updated into the new item, the attachments are moved into the new item.
  • Finally the source item is deleted.

That is essentially it. The biggest thing to walk away with is that properties.ListItem.Versions[x][y] is two-dimensional, and that is what gets you access to all of the field values in the version collection.

private void MoveItem(SPListItem sourceItem,
SPListItem destinationFolder,
string newItemLocation) {

//Create a new item in the archive folder
//(the Add() arguments are reconstructed; the original was truncated)
SPListItem archiveItem = destinationFolder.ListItems.Add(
newItemLocation, SPFileSystemObjectType.File);

//Stop events from firing while the versions are rebuilt
DisableEventFiring();

//loop over the source item versions and restore them
for (int i = sourceItem.Versions.Count - 1; i >= 0; i--) {
SPListItemVersion version = sourceItem.Versions[i];

//set the values into the archive item
foreach (SPField sourceField in sourceItem.Fields) {
if ((!sourceField.ReadOnlyField) && (sourceField.Type != SPFieldType.Attachments)) {
archiveItem[sourceField.Title] = version[sourceField.Title];
}
else if (sourceField.Title == "Created" ||
sourceField.Title == "Created By" ||
sourceField.Title == "Modified" ||
sourceField.Title == "Modified By") {
archiveItem[sourceField.Title] = version[sourceField.Title];
}
}

//update the archive item and
//loop over the next version
archiveItem.Update();
}

//now get the attachments, they are not versioned
foreach (string attachmentName in sourceItem.Attachments) {
SPFile file = sourceItem.ParentList.ParentWeb.GetFile(
sourceItem.Attachments.UrlPrefix + attachmentName);

archiveItem.Attachments.Add(attachmentName, file.OpenBinary());
}
archiveItem.Update();

//Now delete the source item.
sourceItem.Delete();

EnableEventFiring();
}

Sunday, October 7, 2007

MOSS Book and Training Recommendation

1) Books
There are a ton of MOSS books being published right now. If you are a developer looking for one, hands down get Microsoft SharePoint: Building Office 2007 Solutions in C# 2005 by Scot Hillier. Scot's book discusses all of the things you will need to do day one as a developer on a SharePoint project. He provides step by step instructions on how to create site templates, web parts, features, SSO, object model stuff, some administration, fundamentals, and he actually provides an example of how to create a custom workflow that works. I had messed around with writing WF flows and throwing them into MOSS during the beta timeframe and failed miserably due to the complete lack of documentation (I would still not go off the deep end and create core workflows in MOSS). The workflow he provides is very similar to a free virtual lab I did some time ago. Try it out.

If you want an administration reference, get Microsoft Office SharePoint Server 2007 Administrator's Companion by Bill English. This is a fine book, but most of the information you can get for free on MSDN.

Also if you want to get good training on SharePoint I highly recommend attending a training session with MindSharp.

Wednesday, October 3, 2007

BlackPearl Training VPC

Yes - I have the BlackPearl training VPC and I am very excited. I have been remiss about making blog entries on my initial experiences playing around with the first release of BlackPearl. I had built my own VPC and done some playing around with it. I will be working with it over the next two weekends. Right now I am a weekend warrior, as I have been busy with MOSS related projects.

Saturday, September 29, 2007

An Overview of Workflow

Are you looking for a quick overview of what workflow is? Here is an article I wrote some time ago that will give you some basic background on it - An Overview of Workflow Technology

Friday, September 28, 2007

InfoPath Common Trap and SQL Event

Common Trap
I had a past client reach out to me today with a question. He was using the SQL Event to do some operations within an InfoPath process. He fell into one of the traps I discussed in my blog (Item 13 - Process and Activity Level XML and Multi-Destination Considerations). Basically, remember that if you do something like the following, where you have an event (in this case a SQL event) after an InfoPath client event in the same activity, you must use the activity XML field. Do not use the process XML field, as it will not be updated until the succeeding rule for the activity is called.

I personally do not like how this code is hidden down inside of the succeeding rule. The code that moves data from an activity to a process level XML field conceptually has nothing to do with the success of an activity. There should be an OnActivityComplete event handler to handle this. I digress…

A quick and dirty solution would be to do something like below if you want to use the process level XML fields. I would think this is a better solution, as there could be multiple destination users who each have their own version of the XML. If there is only one slot, doing this guarantees you use the XML of the destination user who finished the activity.

Now this may not be the best solution, because the code within the SQL Event handler could be very similar between SQL Events. You can refactor the code into something reusable if you wish. Point is, I am not such a fan of using the SQL Event unless you are in a very rapid delivery environment. The reason is that all of the SQL statements are injected directly into .Net code as strings. If you want to be a minimalist and not create a large data layer, just create stored procedures and write a little code yourself to call them. You will have more maintainable code over the long run.

SQL 2005 Not Working with the Event
As well, he was running into some issues with SQL 2005 not working with the SQL Event Wizard, although it would work with SQL 2000. I checked around and did not find an explanation for why that was occurring. Unfortunately I do not have an environment to test this out at the moment, but I was not surprised there was a problem. The client said that if he took code from an event that he had generated with SQL 2000, the code would work with SQL 2005. That is because the generated code underneath the hood of the SQL Event is agnostic to the version of SQL. This leads me to believe that the SQL Event Wizard must be using something specific to SQL 2000.

Friday, September 14, 2007

InfoPath Document Attachment Solution

1. Introduction
This article will provide a solution for the document attachment issue, as attaching documents into data fields can cause the database to grow at an uncontrolled rate. This issue commonly arises with InfoPath but also occurs with an ASP.NET or Win Form workflow. Many workflows have a requirement for attaching documents. An example would be a user making a purchase request where some legal document must be signed and then attached to the submission form.

2. Problem Background
If InfoPath is being used, developers will want to use the document attachment functionality to attach the file to the InfoPath form. The problem is that InfoPath will take the attached document and serialize it into the XML of the InfoPath form. This is a problem because the size of the InfoPath XML file will grow as each file is attached. This becomes an even bigger problem when the XML is attached to a process, as the database will grow significantly with each process instance.

The database will continue to grow because the XML of the InfoPath form will be stored numerous times; this is how it works. When an InfoPath form is attached to a K2 process the XML for the InfoPath form is stored at the process level as well as at the activity level. The reason why is each destination user (which could have their own slot) may require their own copy of the XML. In the preceding rule of the activity where the InfoPath client event resides, some code is generated that will copy the XML from process level into each activity instance for each and every destination user. Then on the succeeding rule, the XML from the activity instance of the user who finishes the activity will have their XML copied back into the process level.

Understanding this, let's look at what will happen if a document is attached to the InfoPath form. Say there is a process with a three step approval: for the first approval there are ten possible destination users, for the second there are five and for the third there are two. In this example the originator creates an InfoPath form and adds three attachments, each 1MB. Because the documents are serialized into the XML of the InfoPath form, 54MB is the minimum total amount of space that would be consumed in the database per process instance (3MB for the process and 3MB for each activity instance). In reality most business documents are not 1MB. Note this does not include the space consumed by the other data fields in the process. This number will go up if there are any rejection paths, as new activity instances are created. The amount of space can grow even more if an audit trail is turned on for the XML.
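The arithmetic above can be checked with a quick back-of-the-envelope sketch (plain Python; the sizes and user counts are the ones from the example):

```python
# 3 x 1MB attachments serialized into the form XML = 3MB per XML copy.
# One copy is stored at the process level, plus one copy per possible
# destination user at each of the three approval activities.

xml_size_mb = 3
destination_users = [10, 5, 2]   # users per approval activity

process_copy = xml_size_mb
activity_copies = sum(users * xml_size_mb for users in destination_users)

total_mb = process_copy + activity_copies
print(total_mb)  # 3 + (10 + 5 + 2) * 3 = 54
```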

This is not a fault or a shortcoming of the database. The database has been specifically normalized for workflow state management, not for managing unique data points. This issue is applicable not just to InfoPath forms; it would be present if any file attachment is serialized into either process or activity data fields.

3. Solution
A solution for this problem is to store the attached documents externally in WSS. To accomplish this, the WSS web services provided by K2 will be reused. The solution below was initially developed for K2.net 2003, InfoPath 2003 and WSS 2.0, but was reused successfully with K2.net 2003, InfoPath 2007 and WSS 3.0. This solution would also apply to a BlackPearl implementation.

4. Web Services
The following web service is written in VB.NET and can be translated to C# if needed. There are three web methods that will be created: UploadFileToFolder, UploadInfoPathAttachmentToFolder and GetFilesFromFolder.

The web service will require a reference to http://[servername]/_vti_bin/K2SPSList.asmx. This is the web service used by K2's SharePoint integration.

4.1 UploadFileToFolder
This web method's purpose is to upload a file to a folder in SharePoint. It requires a folder name and a document name. In this case, the folder name should be a unique name like a requisition number. The web method will create the folder if it does not already exist. If a document with the same name already exists, the web method will overwrite it (that could be changed).

Note that whatever account is used to connect to the web service will upload the file. This will not be the user who is actually uploading the file (like the destination user).

Public Sub UploadFileToFolder(ByVal strWssServerUrl As String, ByVal strWssSite As String, _
ByVal strWssSiteDocLib As String, ByVal strFolderName As String, ByVal strFileName As String, _
ByVal bytes As [Byte]())

Dim objK2Wss As K2WssWebService.K2SPSList

Try

If strFolderName Is Nothing Or strFolderName = "" Or strFolderName.Length = 0 Then
Throw New Exception("A Unique ID is required.")
End If

objK2Wss = New K2WssWebService.K2SPSList
objK2Wss.Url = GetK2WssWebServiceURL()
'connect with default IE account
'objK2Wss.Credentials = System.Net.CredentialCache.DefaultCredentials
'connect with a service account
objK2Wss.Credentials = New System.Net.NetworkCredential(GetWSSUserString(), _
GetWSSUserPasswordString(), GetWSSUserDomainString())

Dim strFullFolderName As String = strWssSiteDocLib & "/" & strFolderName

' Check if folder exists
If Not objK2Wss.FolderExist(strWssSite, strFullFolderName) Then
' Create the folder
Dim strCreateFolderErrorMsg As String
objK2Wss.CreateFolder(strWssSite, _
strFullFolderName, strCreateFolderErrorMsg)

' Check if error creating the folder
If strCreateFolderErrorMsg <> "" Then
Throw New Exception(strCreateFolderErrorMsg)
End If
End If

' upload the document
Dim strUploadFileErrorMsg As String
objK2Wss.UploadDocument(strWssServerUrl, strWssSite, _
strFullFolderName, strFileName, _
bytes, True, strUploadFileErrorMsg)

' Check if there was an error uploading the file
If strUploadFileErrorMsg <> "" Then
Throw New Exception(strUploadFileErrorMsg)
End If
Catch ex As Exception
Throw ex
Finally
objK2Wss = Nothing
End Try
End Sub

4.2 UploadInfoPathAttachmentToFolder
This method calls UploadFileToFolder, but its specific purpose is to accept a document that has been attached into an InfoPath form. Documents that have been attached into an InfoPath form have some header information added to the bits of the document. For instance, the file name needs to be stripped from the bits.
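For illustration, the header layout the VB code below relies on can be sketched in Python (the offsets mirror the VB code; the file name and content bytes are made-up sample values, and this is a sketch of the observed layout, not an official specification):

```python
# InfoPath attachment blobs carry a small header: byte 20 holds the file
# name length in characters, the UTF-16 file name starts at offset 24,
# and the raw file content follows the name.

def decode_infopath_attachment(blob):
    name_chars = blob[20]              # file name length in characters
    name_bytes = name_chars * 2        # UTF-16 = 2 bytes per character
    raw_name = blob[24:24 + name_bytes]
    # the stored name is null-terminated, so drop the trailing NUL
    file_name = raw_name.decode("utf-16-le").rstrip("\x00")
    file_content = blob[24 + name_bytes:]
    return file_name, file_content

# Build a synthetic attachment blob to exercise the decoder
name = "report.pdf\x00"
header = bytes(20) + bytes([len(name)]) + bytes(3)   # pad header to 24 bytes
blob = header + name.encode("utf-16-le") + b"%PDF-sample-bytes"

decoded_name, content = decode_infopath_attachment(blob)
print(decoded_name)  # report.pdf
```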

Public Sub UploadInfoPathAttachmentToFolder(ByVal strWssServerUrl As String, ByVal strWssSite As String, _
ByVal strWssSiteDocLib As String, ByVal strFolderName As String, ByVal byteIPFileAttachment As [Byte]())

Dim i As Integer

Try

' Get the length of the file name from the IP file attachment header
Dim iNameBufferLen As Integer = byteIPFileAttachment(20) * 2

' Create binary array for the file name
Dim byteFileName(iNameBufferLen) As Byte

' Get the file name
For i = 0 To iNameBufferLen
byteFileName(i) = byteIPFileAttachment(24 + i)
Next

' Translate file name to a string variable
Dim asciiChars() As Char = System.Text.UnicodeEncoding.Unicode.GetChars(byteFileName)
Dim strFileName As New String(asciiChars)
strFileName = strFileName.Substring(0, strFileName.Length - 1)

' Create binary array for the file. This is
' the total file length minus the header and the file name length
Dim byteFileContent(byteIPFileAttachment.Length - (24 + byteFileName.Length)) As Byte

' Get the file bytes
i = 0
For i = 0 To byteFileContent.Length - 1
byteFileContent(i) = byteIPFileAttachment(24 + (byteFileName.Length - 1) + i)
Next

' Upload the file to WSS
UploadFileToFolder(strWssServerUrl, strWssSite, strWssSiteDocLib, strFolderName, strFileName, byteFileContent)
Catch ex As Exception
Throw ex
End Try

End Sub

4.3 GetFilesFromFolder
This web method will retrieve all of the file names from a specific folder in SharePoint and will return an XML document.

Public Function GetFilesFromFolder(ByVal strWssServerUrl As String, ByVal strWssSite As String, _
ByVal strWssSiteDocLib As String, ByVal strFolderName As String) As Xml.XmlDocument

Dim objK2Wss As K2WssWebService.K2SPSList

'XML writer
Dim sw As New System.IO.StringWriter
Dim xtw As New System.Xml.XmlTextWriter(sw)

Try

If strFolderName Is Nothing Or strFolderName = "" Or strFolderName.Length = 0 Then
Throw New Exception("A Unique ID is required.")
End If

objK2Wss = New K2WssWebService.K2SPSList
objK2Wss.Url = GetK2WssWebServiceURL()
objK2Wss.Credentials = New System.Net.NetworkCredential(GetWSSUserString(), _
GetWSSUserPasswordString(), GetWSSUserDomainString())

Dim strFullFolderName As String = strWssSiteDocLib & "/" & strFolderName

' Write the root Files element
xtw.WriteStartElement("Files")

' Check if folder exists
If Not objK2Wss.FolderExist(strWssSite, strFullFolderName) Then
' It is valid for this folder to not exist for a request,
' thus this web service should not return an error but empty XML...
' Folders for a request are only created when a document is uploaded.
xtw.WriteEndElement() 'Files
Else
' get all of the files from the web service
Dim strErrorMsg As String
Dim strFiles As String() = objK2Wss.GetFolderFiles(strWssServerUrl, _
strWssSite, strFullFolderName, strErrorMsg)

If strErrorMsg <> "" Then
Throw New Exception(strErrorMsg)
End If


' Loop over files and build a list of files
' for the unique id
Dim i As Integer
For i = 0 To strFiles.Length - 1
If Not (strFiles(i) Is Nothing) Then
xtw.WriteElementString("FileName", strFiles(i))
xtw.WriteElementString("FileUrl", strWssServerUrl & "/" & _
strWssSite & "/" & strFullFolderName & "/" & _
xtw.WriteEndElement() 'File
End If

xtw.WriteEndElement() 'Files
End If

Dim xmlDoc As New System.Xml.XmlDocument

Return xmlDoc
Catch ex As Exception
Throw ex
End Try
End Function
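For reference, a folder containing two uploaded files would produce XML shaped like the following (the server, site, and file names are made-up sample values):

```xml
<Files>
  <File>
    <FileName>Quote.doc</FileName>
    <FileUrl>http://moss/sites/requests/Documents/1001/Quote.doc</FileUrl>
  </File>
  <File>
    <FileName>Contract.doc</FileName>
    <FileUrl>http://moss/sites/requests/Documents/1001/Contract.doc</FileUrl>
  </File>
</Files>
```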

5. How to Connect to InfoPath
Hooking this up to InfoPath is easy and requires no .NET-enabled code.

5.1 Add a Binary Data Field
Add a base64Binary data field to the InfoPath form. This value will only be set temporarily, and it will be submitted to the web service.

This violates one of my best practices of not polluting the XSD schema of your InfoPath form with UI-specific nodes. However, InfoPath does not allow a secondary data source to contain a base64Binary field.

5.2 Add Submit Data Connection
Make a data connection to the UploadInfoPathAttachmentToFolder web method using the Data Connection Wizard. Do the following:

  • Use the “Submit Data” option
  • Select a web service
  • Enter the url to the web service
  • Select the UploadInfoPathAttachmentToFolder web method
  • Enter a value for all of the parameters of the web method. For strFolderName make sure you use a unique value, and for byteIPFileAttachment set the data field from the first step.
  • Then finish the wizard.

5.3 Add Get Files Data Connection
Now create a data connection to the web method to return all of the files using the Data Connection Wizard. Do the following:
  • Prior to this you will need to modify the web service to return a hard-coded set of dummy values; otherwise you will receive an error while making the data connection. A quick solution is to temporarily change GetFilesFromFolder to return hard-coded sample XML in the correct format. Make sure that more than one File node is returned in the XML; otherwise InfoPath will not infer that the web method can return more than one file.
  • Use the “Receive Data” option
  • Select a web service
  • Enter the url to the web service
  • Select the GetFilesFromFolder web method
  • In the final step, select the checkbox to run this when the form is opened, to retrieve any files that may already be uploaded. This will only work if the unique number for the folder name is generated before the InfoPath form is opened. If that is not possible, modify the code in the web method to not throw an exception when strFolderName is null and instead return empty XML.
  • Finish the wizard
  • Remove the hard coded XML.
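A minimal sketch of the temporary stub described in the first bullet above (the file names and URLs are made-up sample values):

```vbnet
' Temporary body for GetFilesFromFolder, used only while the InfoPath
' data connection is being created. It returns two <File> nodes so that
' InfoPath infers a repeating structure; remove it once the wizard is done.
Dim xmlDoc As New System.Xml.XmlDocument
xmlDoc.LoadXml( _
    "<Files>" & _
    "<File><FileName>Sample1.doc</FileName>" & _
    "<FileUrl>http://server/site/doclib/1/Sample1.doc</FileUrl></File>" & _
    "<File><FileName>Sample2.doc</FileName>" & _
    "<FileUrl>http://server/site/doclib/1/Sample2.doc</FileUrl></File>" & _
    "</Files>")
Return xmlDoc
```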

5.4 Add Controls to Form
First, drag the base64Binary field that was added in the first step onto the form as a file attachment control. Then drag and drop a button onto the form. Finally, go to the Data Source task pane and drag and drop the File collection from the GetFiles data connection onto the form as a repeating table.

Something similar to the following can be created.

For the repeating table change the control within it to be a hyperlink control and use the following configuration.

5.5 Create Rules for the Button
Now we need to add some rules to the Upload File button. Double click on the button and the properties window should appear. Press the Rules button and then press the Add button to create a new rule. First add a condition to make sure that the File (base64Binary) field is not blank. Then add an action to upload the attachment, followed by an action to clear out the File (base64Binary) field. Finally add another action to return all of the files that are currently available.

5.6 User Experience
The user experience will be that they will add a file to the form using the InfoPath file attachment control. They will then press the button and then the file name will appear in the repeating list immediately below. When they click on the link the file will be opened in a new window.

6. Using this Solution
Now you have the ability to start attaching documents to your processes without adversely affecting the size of the InfoPath form. As well, the services can be used outside of InfoPath and are reusable for all processes you create in the future, such as SmartForms, custom pages, Win Forms, etc.

Saturday, September 8, 2007

K2, SharePoint, and InfoPath Best Practices

This is the fourth in a series of best practices I consider when starting a new workflow. Note that many of these best practices are K2 2003 specific; they will be re-evaluated with the new BlackPearl release.

1) Store InfoPath Data Externally
Commonly much of the data within an InfoPath form needs to be reported on. It is recommended that this data be stored externally for robust data reporting requirements.

When using InfoPath processes, a good practice is to keep the data in the InfoPath form while it moves through the process. Then at the end of the process, shred the XML apart and insert it into a normalized database. As well, at various points in the process, push the XML document out to a different InfoPath form library where users can get access to the latest and greatest data.

2) Create Reusable XML Web Services
In many cases both the InfoPath form and the process will require access to the same data. Use XML Web Services to allow both to gain access to the same data.

3) Required Fields with InfoPath
In InfoPath, setting a field as required makes the field required for every user who submits the form. Consider a simple process with two InfoPath activities, the first for a submitter and the second for an approver. If there is a field called “IsApproved” that is defined as required in InfoPath, the submitter would have to fill it in even though this is something only the approver should do. There are two ways to get around this:
  • Introduce a field into the InfoPath form that will be set by K2 to indicate the state of the workflow (very similar to switching the view). This can be evaluated in the validation rule of the field to defer the validation of that field to specific points in time.
  • Break the InfoPath form apart into two separate workflows.

4) Remove Views and Create a Single View
If you have many views (let’s say more than three) and you commonly have to make changes in each and every view every time there is a change, consider collapsing down to one view. Again, introduce a state field into the InfoPath form that will be set by K2. Then InfoPath can use conditional formatting to hide fields based upon the state of the workflow.

5) Multiform Processes
BlackPearl now provides the ability to create multiple InfoPath forms and use them in the same process. In 2003 it was possible to create workflows with multiple InfoPath forms by creating a master process and then creating child processes for each form. Another approach is to just chain several InfoPath form processes together. The advantage of having a master process is that you can map out how all of the InfoPath child processes interact with each other as well as have global events.

6) Understand Limitations of InfoPath
Do not let InfoPath become a substitute for creating Win Forms applications. In the end, InfoPath should be used to capture data from a user. Robust functionality to view databases, provide heavy user interaction, etc. should be done in other mediums. One indication this is occurring is heavy use of script or managed code in the InfoPath form.

Note that an InfoPath process does NOT have to be initiated by an InfoPath form. External systems or Win Forms applications can start a process that later uses an InfoPath form to capture some data.

7) Solidify XSD First
It is very important to have your XSD schema fairly solid prior to enabling your InfoPath form for K2. Once the form is enabled, it must remain in sync with the cached XSD schema within K2.

8) Do not Pollute XSD
Over time the InfoPath form will become polluted with various nodes that are used to support the presentation of the InfoPath form (two of the recommendations above violate this rule). It is recommended to keep this to a minimum. If it cannot be avoided, place the nodes related to the InfoPath UI in a separate group (titled as such) so that it is immediately known which nodes contain valid business data and which ones do not.

9) InfoPath Version Control
For K2 2003, publishing a changed InfoPath form to SharePoint must be carefully considered, as running process instances will be affected. InfoPath will try to gracefully handle the situation when opening an old XML file with a newer version of the InfoPath form, but this will not always end well. It is suggested that a different form library be created for each version of the form: the running processes are configured to use the old form in the old form library, while new process instances use the new form in the new form library.

10) Do not Use Attachments in InfoPath
Do not put attachments into the InfoPath form as this can cause significant performance problems. Large attachments will be stored in the InfoPath XML which is then stored in the database. The XML for an InfoPath form is used at both the process and activity levels. This creates the possibility the same attachment being stored hundreds of times in the database wasting space. It is recommended that the InfoPath form store all of the documents directly into something like SharePoint and then provide links to those documents inside the InfoPath form.

11) Securing InfoPath Data
Switching InfoPath views is NOT a valid way to secure data, as the user can simply open the XML directly and modify the data. One way to enforce security is to use multiple document libraries with permissions. There is no way to secure an InfoPath form at the document instance level with WSS 2.0. This can possibly be resolved by using WSS 3.0 with BlackPearl, as permissions can be applied at the document item level. If data must be totally secured, store all of the XML data in an external database.

12) Viewing Forms in Process
A common request from users is to be able to review the InfoPath form after they have submitted it or when the process is complete. The InfoPath form in SharePoint is dynamically added and removed by K2 when an InfoPath client event is completed. It is suggested that the form be published at specific points in the process to various archive document libraries for all users to access.

13) Process and Activity Level XML and Multi-Destination Considerations
Understand that when an InfoPath form is attached to a 2003 process the wizard will go into the InfoPath form and retrieve the XSD. The XSD will be copied into the process definition as a process level XML data field. For InfoPath activities within that process each will have an activity level XML data field created for the InfoPath client event. This is done because there is a possibility that multiple destination users can be added to the activity. This is important because each destination user should have their own copy of the InfoPath XML to modify as succeeding rules will need to evaluate the data set by each approving user (remember that rules can be created that more than one user must approve before an activity can be completed). It would not be possible to create succeeding rules requiring multiple user approval if all the users shared the same XML document.

If there are events after the InfoPath client event in the same activity make sure to use the activity level XML and not the process level XML. The process level XML will be updated in the succeeding rule of the activity which is at the end of the activity life-cycle.

If there are many destination users that are able to approve an activity, place custom data aggregation code into the succeeding rule, as this is where data is re-synchronized with the process level XML data field. For example, if there are three users and each adds three items to the InfoPath form, you will need to add code to the succeeding rule to ensure there will be nine items in the process level XML data field. Otherwise only the last three will be moved up.

If there are multiple destination users for an InfoPath client event, it is highly suggested that a unique name be used for the InfoPath form of each destination user. If multiple destination users modified their own specific XML and this needs to be aggregated at the end of the activity, modify the succeeding rule code to merge this data back into the process level XML data field.

14) Reuse K2 WSS Services
K2 provides several useful ready-to-go services to do various things within SharePoint, like uploading/deleting files, creating/deleting folders, creating/updating document metadata, etc. These services can be easily reused within InfoPath, K2 processes, and Win Forms.