Monday, March 24, 2008

Update SPListItem Created By and Modified By Fields

Well, I thought I had a simple task, but it was not so simple. I had a requirement to allow users to make anonymous submissions, which I wanted to store in a SharePoint list. Simple enough, right?

I needed to create a web part, and in that web part I needed to create an SPListItem in a designated list. In a previous blog I had created an archive method that set the Modified By and Last Updated fields, but apparently I had made a mistake. The code works absolutely fine if I am logged in as an administrator; however, if I am logged in as a user who is part of a members group with contributor rights, I cannot set the Created By or Modified By fields. This made sense for what I was doing at the time, but now I needed to allow normal, everyday collaboration users to do this operation. As well, the users in the members group should not have rights to edit or delete list items; just add them. It is simple enough to break the security inheritance on the list and associate members of that group to a custom permission level, but I wanted to avoid doing that.
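For context, the approach I wanted to avoid would look something like the following: break role inheritance on the list and bind the members group to a custom add-only permission level. This is a sketch only; the site URL, list name, and group name are made up:

```csharp
using (SPSite site = new SPSite("http://server/sites/demo"))
{
    using (SPWeb web = site.OpenWeb())
    {
        // Create a custom permission level (on the root web) that only
        // allows adding and viewing list items.
        SPRoleDefinition addOnly = new SPRoleDefinition();
        addOnly.Name = "Add Only";
        addOnly.Description = "Can add and view list items but not edit or delete them.";
        addOnly.BasePermissions = SPBasePermissions.AddListItems |
                                  SPBasePermissions.ViewListItems |
                                  SPBasePermissions.ViewPages |
                                  SPBasePermissions.Open;
        web.RoleDefinitions.Add(addOnly);

        // Break inheritance on the list and bind the members group
        // to the new permission level.
        SPList list = web.Lists["Submissions"];
        list.BreakRoleInheritance(true);
        SPRoleAssignment assignment =
            new SPRoleAssignment(web.SiteGroups["Demo Members"]);
        assignment.RoleDefinitionBindings.Add(web.RoleDefinitions["Add Only"]);
        list.RoleAssignments.Add(assignment);
    }
}
```

This works, but it leaves you maintaining broken inheritance on every list that needs the behavior, which is why I kept looking for another option.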

I did some searching around and found the following:

These show you how to update the created and modified fields but did not help me in my quest.

I found some stuff that a guy wrote on creating an impersonation utility:

However, that was going really overboard. In WSS 3.0 you now have the ability to use SPSecurity.RunWithElevatedPrivileges. I had tried this in my web part, but it was not going so well. So I moved the logic to a custom ItemAdded event handler on the list, and that did the trick. Plus, it is better to put the logic for changing the Created By and Modified By fields on the list itself: if direct submissions are made to the list, the logic is still centrally located.

The event handler is below:


public class InsertEventHandler : SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        base.ItemAdded(properties);
        ChangeSubmittedBy(properties);
    }

    private void ChangeSubmittedBy(SPItemEventProperties properties)
    {
        // Capture the IDs first; the objects used inside
        // RunWithElevatedPrivileges must be re-opened under the elevated context.
        Guid siteID = properties.ListItem.ParentList.ParentWeb.Site.ID;
        Guid webID = properties.ListItem.ParentList.ParentWeb.ID;
        Guid listID = properties.ListItem.ParentList.ID;
        Guid newItemID = properties.ListItem.UniqueId;

        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            using (SPSite site = new SPSite(siteID))
            {
                using (SPWeb web = site.OpenWeb(webID))
                {
                    SPList questionList = web.Lists[listID];
                    SPListItem listItem = questionList.Items[newItemID];

                    SPFieldUserValue oUser = new SPFieldUserValue(
                        web, web.CurrentUser.ID, web.CurrentUser.LoginName);

                    // Author = Created By, Editor = Modified By
                    listItem["Author"] = oUser;
                    listItem["Editor"] = oUser;
                    listItem.Update();
                }
            }
        });
    }
}
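The receiver then has to be bound to the list. I deployed mine through a Feature, but the binding itself comes down to adding an event receiver definition, roughly like this. A sketch only, assuming the class above is compiled into a strong-named, GAC-deployed assembly; the URL, list name, assembly name, and token are placeholders:

```csharp
using (SPSite site = new SPSite("http://server/sites/demo"))
{
    using (SPWeb web = site.OpenWeb())
    {
        SPList list = web.Lists["Submissions"];

        // Bind the ItemAdded receiver to the list. The second argument is the
        // four-part assembly name; the third is the full class name.
        list.EventReceivers.Add(
            SPEventReceiverType.ItemAdded,
            "MyEventHandlers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef",
            "MyEventHandlers.InsertEventHandler");
        list.Update();
    }
}
```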

Tuesday, March 18, 2008

Some BlackPearl Install Notes 1

I just finished another install of K2 BlackPearl and ran into a couple of things I had not run into in the past that you should know about.

1 – Visual Studio 2005 Extensions for .NET Framework 3.0 (WCF & WPF)

A quick note: if you are trying to install the Visual Studio 2005 extensions for .NET Framework 3.0 (WCF & WPF) and you have .NET 3.0 SP1 installed, you will run into some issues. See the following for more information - http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2912680&SiteID=1

One option is to uninstall .NET 3.0 SP1, install the extensions, and then upgrade back to .NET 3.0 SP1.

If you do not want to do this, the link above has a command line which should resolve the problem: msiexec /i vsextwfx.msi WRC_INSTALLED_OVERRIDE=1

2 – Installing Microsoft Report Viewer Redistributable 2005

This was weird. We installed the Microsoft Report Viewer Redistributable 2005 and then got an error in the K2 Workspace when trying to access Reporting Services, saying we needed to install Microsoft Report Viewer Redistributable 2005 SP1. We downloaded Microsoft Report Viewer Redistributable 2005 SP1 (Full Installation), but that would never work.

The solution was to install Microsoft Report Viewer Redistributable 2005 and then install Microsoft Report Viewer Redistributable 2005 SP1 (Upgrade). The upgrade is referenced in the Getting Started document, but the original redistributable is not.

3 – DTC Configuration

The Getting Started document mentions this, but do not forget to configure the DTC security configuration on SQL Server, the K2 server, and possibly on the MOSS server if MOSS and K2 are on different machines. You will run into deployment issues if it is not configured on each machine.

Thursday, February 28, 2008

K2 InfoPath Parallel Activity Lesson Learned

I have a small lesson learned, as I did not listen to myself and use my own best practices for InfoPath integration with K2. Basically, I had some fairly simple processes using InfoPath. At certain points in the K2 process I would push data out to SQL using SmartObjects. A best practice is to make sure you push that InfoPath data out and do not let it sit as an XML document in a form library or in the K2 database, because there the data is not very accessible for enterprise reporting.

I had a change request to add some parallel processing to a process that was in production. The process had Security, VP, IT and Help Desk approvals that occurred in a serial manner. Now I needed to change the process by adding a Contracts approval that would occur in parallel with the Security approval. It was simple enough to add a new Contracts approval activity and then add a new preceding rule to VP to make sure that both Contracts and Security finish before the VP approval can begin. I ran into some issues with accessing the Outcome and Action Result activity-level fields for both the Security and Contracts approvals. I would have liked to specify in the preceding rule that those values had to equal "approved", but they were not available in the wizard. The only option I had was to add some process-level fields, add new events to the Security and Contracts activities to set those fields appropriately, and then use the new fields in the VP preceding rule.

The real issue I ran into was around working with the InfoPath data. The use case basically stated that if both Security and Contracts rejected, I would need to resubmit back to each one independently. So if Contracts rejects, Security should still be allowed to approve. There is no guarantee that this would be done in a coordinated fashion, and the Security and Contracts approvers may be working with different portions of the data.

The challenge was that if both the Security and Contracts approvers rejected, I would have some issues with the resubmission of data and with ensuring the data in the InfoPath forms was managed correctly. Let's say both Security and Contracts rejected: an InfoPath form would be generated for each resubmission activity for the originator. My InfoPath form was NOT loading off the database when it was opened. So the originator would open up the Security rejection, change some data, and submit. Subsequently the originator would resubmit the Contracts rejection with different updated data. The Contracts resubmission would then overwrite the edits from the Security resubmission, because I was not loading the InfoPath form from the database. It was clear as day; I knew this was going to happen.

The solution: modify the InfoPath form to load from my SmartObjects. Luckily, it is extremely simple to hook SmartObjects into an InfoPath form and resolve this issue.

The lesson learned is to treat InfoPath forms just like I would treat an ASP.NET form. When the form opens, use rules to populate it, and then use submit rules to save the form data using SmartObjects.

Tuesday, February 19, 2008

MOSS Cross Site Lookup

Need a free MOSS 2007 cross site look up?

This one comes up on most searches and seems to have tons of issues.

However, after Googling a little harder, I found this cross site look up, which has worked pretty well; I have seen three different projects (including one of mine) get some real mileage out of it. The author gives a fantastic tutorial on how it was composed, too. The only things we found were:

  • The cross site look-up is only a dropdown, so it will not support multi-selects. Well, you have the code.
  • Another person at RDA noticed it does not return results in alphabetical order; however, that was easily fixed by adding the following helper methods and calling SortByText() from CrossSiteLookupFieldControl.FillDropDownList().
public static void SortByValue(ListControl combo) {
    SortCombo(combo, new ComboValueComparer());
}

public static void SortByText(ListControl combo) {
    SortCombo(combo, new ComboTextComparer());
}

private static void SortCombo(ListControl combo, IComparer comparer) {
    int i;

    if (combo.Items.Count <= 1)
        return;

    ArrayList arrItems = new ArrayList();

    for (i = 0; i < combo.Items.Count; i++) {
        ListItem item = combo.Items[i];
        arrItems.Add(item);
    }

    arrItems.Sort(comparer);
    combo.Items.Clear();

    for (i = 0; i < arrItems.Count; i++) {
        combo.Items.Add((ListItem) arrItems[i]);
    }
}

// Compare list items by their value.
private class ComboValueComparer : IComparer {
    public enum SortOrder {
        Ascending = 1,
        Descending = -1
    }

    private int _modifier;

    // Sort ascending by default.
    public ComboValueComparer() {
        _modifier = (int) SortOrder.Ascending;
    }

    public ComboValueComparer(SortOrder order) {
        _modifier = (int) order;
    }

    // Sort by value.
    public int Compare(Object o1, Object o2) {
        ListItem cb1 = (ListItem) o1;
        ListItem cb2 = (ListItem) o2;
        return cb1.Value.CompareTo(cb2.Value) * _modifier;
    }
} // end class ComboValueComparer

// Compare list items by their text.
private class ComboTextComparer : IComparer {
    public enum SortOrder {
        Ascending = 1,
        Descending = -1
    }

    private int _modifier;

    // Sort ascending by default.
    public ComboTextComparer() {
        _modifier = (int) SortOrder.Ascending;
    }

    public ComboTextComparer(SortOrder order) {
        _modifier = (int) order;
    }

    // Sort by text.
    public int Compare(Object o1, Object o2) {
        ListItem cb1 = (ListItem) o1;
        ListItem cb2 = (ListItem) o2;
        return cb1.Text.CompareTo(cb2.Text) * _modifier;
    }
} // end class ComboTextComparer
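With the helpers in place, the hook into the control is a one-line call after the dropdown has been populated. The exact shape of FillDropDownList() and the member name of the dropdown depend on the version of the control you downloaded, so treat this as a sketch:

```csharp
private void FillDropDownList()
{
    // ... existing code that adds the cross-site lookup items
    //     to the control's dropdown (_dropDownList is an assumed name) ...

    // Sort the items alphabetically by their display text.
    SortByText(_dropDownList);
}
```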

Sunday, February 17, 2008

BlackPearl MOSS Governance Case Study

1) Introduction


One of the biggest things we run into when talking with a new MOSS client is the question of how governance is enforced within MOSS. Some of their experiences may be:

  • With SharePoint 2003 and WSS 2.0 we stood up sites everywhere and no one could navigate or search for content.
  • Team sites, workspaces, etc. are disorganized and have sprawled everywhere, making it impossible to find content.
  • Documents and content have no classifications.
  • Document libraries are full of disorganized content.
  • Documents and content have no business life-cycle management.
  • Etc.

MOSS 2007 has given us tons of new features and functionality to assist with governance of content. The following site is a good resource for becoming familiar with MOSS governance. However, the out of the box functionality can fall short, specifically in terms of workflow. These shortcomings have been documented before (here), but they become even more compelling when a use case is put against them.

2) Use Case

The following is a simple use case for managing an opportunity/sales process.

  1. Account Executive identifies a business opportunity.
  2. Account Executive goes to screen and enters in required data.
  3. A site is created for the company.
  4. A site is created for the opportunity.
  5. Account Executive fills out Opportunity Assessment and SOW Templates.
  6. Account Executive initiates workflow that notifies participants (Business Development Manager, Program Director and Project Manager).
  7. All participants review the document and mark as Approved.
    1. A single Deny requires Account Executive resubmission.
  8. When complete, all participants are notified of the end status.

3) End Solution Vision

The goal of this implementation is to create functionality in MOSS that will manage both the topology and the taxonomy of content. The first thing that comes to mind is using Content Types and Workflows to manage this solution. Content Types would be used to classify content, store associated metadata, and carry the workflows that manage the life-cycle of each document. To name a few benefits, doing this through Content Types means the content metadata can be centrally defined, is re-useable across the entire site, and allows for more targeted searching with Enterprise Search.

We will see that this vision cannot be easily achieved with SharePoint Designer or Visual Studio and the case for using K2 BlackPearl will be overwhelming.

4) MOSS WF Shortcomings

The first problem you will encounter is that you cannot use SharePoint Designer. SharePoint Designer is a great tool for managing the workflow for documents, but the workflow is stuck on the list you created it for and cannot be associated to a Content Type, which is a major shortcoming. In this use case, new sites are created all the time; creating workflows on the document library itself will not work because there will be a new document library for each site.

Changing the use case to not use sub sites (or Document Workspaces) would not solve the problem, as the single document library would become a dumping ground for documents. There are documents used to support the creation of Opportunity or SOW documents; for instance, there are project plans, budgets, requirements documents, whitepaper references, etc. It is important to keep this content close together in an organized fashion, and creating a Document Workspace fits the bill. If you did not use a Document Workspace, you would not be able to take advantage of other collaboration tools like task lists, calendars, custom lists, discussion boards, etc., because these tools would be shared among all the documents in the single library, creating a rat's nest of content.

Second, SharePoint Designer does not support much beyond modifying the metadata of a document, sending emails, and a few other things. You cannot use SharePoint Designer to provision sites.

Third, there will probably be a need to access CRM data, which has company names, past opportunities, contact information, etc., and none of this is accessible using SharePoint Designer, as it cannot connect to external data sources.

Fourth, some may try to retrieve external data or provision sites by creating a custom WF Activity Library for SharePoint Designer. Creating the custom code to do these operations is nothing out of the ordinary; however, getting the activities to hook into SharePoint Designer and passing values around between the activities is a whole different issue you will run into.

At this point we can pretty much say SharePoint Designer 2007 will not be useful here, and its position as a tool for workflow will fall short over the long run. Next we would look into using Visual Studio to create a SharePoint Workflow Foundation (WF) workflow. Simply put, creating workflow in Visual Studio is a complete custom coding effort. All of the things needed for this solution can be done with Visual Studio; however, the level of effort increases and there is zero infrastructure around it (other than that you do not have to build a host server for the workflow definition, as it is hosted by SharePoint as a Feature).

WF workflows in SharePoint revolve around managing tasks in a task list; that is it. Tasks are created in a task list, which leads the user to an ASP.NET or InfoPath form where the user can take an action. From what we have seen with custom WF projects, the effort increases even for the simplest of things. For instance, when writing WF workflows there is no destination user and rules infrastructure, so it is not out of the box to define rules like:

  • All destination users must approve but the first to deny will move the workflow to another state.
  • All destination users must approve task in a serial manner one after another.
  • If a certain amount of destination users approve, then the task is completed and moves to the next state.
  • Destination user assignment is determined based on process instance state data and consequently the user profile must be retrieved from an external system.
  • Etc.

All of this logic would have to be built from scratch.

As well, there is no framework around the actions that users can take. You would have to code the workflow to look for a specific value in a custom column of the task list and then move the workflow from one state to another. There is no security around the completion of the task to ensure that the right person is completing it, either.

As well, building multistage processes with different forms is cumbersome. There are things like initiation, association and task forms. When I have played with them in the past using InfoPath, there was no way to get at the XML data easily (except on the initiation form). What you will quickly start doing is not storing any data in the workflow at all, externalizing all of it, which completely defeats some of the purpose of using a tool like InfoPath. Since you will start externalizing all of your data, you will need to create custom libraries to persist the data and abstract it in such a way that your workflow and your forms (whether ASP.NET or InfoPath) can both use it.

There is no reporting around the WF workflows either (SharePoint Designer too). You can go to the properties of an item and see all the log entries that were created for the item, but that is it. There is no way to aggregate the data across items to create business contextual reports. As well, since the data is associated to the SharePoint item, if that item were ever deleted, all of its workflow history would be deleted with it. Knowing this, you will be forced to externalize all of your business reporting data and make sure you persist this data during workflow execution.

As well, there is no central administration or infrastructure for the workflows at all. There is no out-of-the-box way to handle security, no way to search for workflow instances globally to manage them, no infrastructure for managing configuration parameters, and the list goes on and on…

5) BlackPearl SharePoint Workflow Solution

I must first stress that BlackPearl workflow is so much more than SharePoint workflow. It is a workflow server built completely on top of Windows Workflow Foundation that can be used to create workflows for SharePoint, WinForms, ASP.net, around COTS, etc.

While trying to figure out how I wanted to do this solution, I had two options. I could either use BlackPearl's SharePoint Integration, which deploys a workflow as a Feature in the same fashion as creating a workflow in Visual Studio, or I could use the SharePoint Events Integration, which listens for events at a location (like a list) to initiate a workflow. Given that I wanted to create a workflow that would be associated to a Content Type, and I wanted a workflow a user could action either from within SharePoint or Office, using the BlackPearl SharePoint Integration was a no-brainer.

Knowing the Use Case stated above these are the projects in my solution:

  • SharePoint Project:
    • Feature with Content Types (with standard document templates) and custom Fields.
    • Feature with custom document library with custom Content Types Associated to them.
    • Simple Custom Web Part to display list of Sub Sites (not out of the box with WSS 3 but extremely simple).
    • Custom Site Definition for Company level sites (Sub Site Web part included in definition).
    • Custom Site Definition for Opportunity level sites (had custom document library and Forms library with XML configuration added).
  • K2 BlackPearl SmartObject Project:
    • Two SmartObjects using the Dynamic SQL service to retrieve data from CRM.
    • One SmartObject to retrieve data from Active Directory.
  • K2 BlackPearl Process Project:
    • InfoPath Process to initiate site provisioning.
    • SharePoint Integration process for Opportunity Content Types.
    • SharePoint Integration process for SOW Content Types.
  • SQL Reporting Services Project:
    • Reports that used the SmartObject Data Provider to retrieve data from custom SmartObjects and out of the box SmartObjects to display status of all workflows.

That, in a nutshell, is the solution; let's dive a little deeper into each piece.

The SharePoint Project basically encompasses all of the standard development needed for SharePoint. I made it a point to create the custom content types, document libraries, and site definitions myself, as I believe in making solutions highly re-deployable. I am not a fan of creating templates or doing stuff in SharePoint Designer unless I have to. I look at CAML code the same way I look at DDL code I would write for defining a database. It must be repeatable and able to be built from scratch, on demand; otherwise you will encounter maintainability issues down the road.

The SmartObject Project allows me to harvest data from disparate sources and provide the data to all levels of the solution using a uniform interface, API, and deployment methodology. A whole separate discussion is required for the SmartObject framework; however, it provided me the ability to work with data from different locations without having to write a single line of code. As you will see, I was able to use this data in my InfoPath form through the out of the box web services, and I was able to use the drag and drop environment within the K2 process to access this data. If I needed to write any custom code, I would just use the API to make a connection to the server, set some data on a SmartObject instance, and call a method like Save.
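To give a feel for that API, a save through a SmartObject is only a few lines. This is a sketch from memory against the SourceCode.SmartObjects.Client assembly; the SmartObject, method, and property names are invented:

```csharp
using SourceCode.SmartObjects.Client;

SmartObjectClientServer server = new SmartObjectClientServer();
server.CreateConnection();
server.Connection.Open(connectionString); // connection string to the K2 server

// Fetch the SmartObject definition, set properties, and execute the Save method.
SmartObject opportunity = server.GetSmartObject("Opportunity");
opportunity.Properties["CompanyID"].Value = "1234";
opportunity.Properties["Status"].Value = "Submitted";
opportunity.MethodToExecute = "Save";
server.ExecuteScalar(opportunity);

server.Connection.Close();
```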

The first process looks like the following. This process is started by a user going to an InfoPath form. The InfoPath form uses SmartObjects to display past opportunities for the company and allows for the selection of participants for future Opportunity and SOW workflows (Account Execs, Directors and Project Managers). There are events to create sites based on the custom definitions created earlier. As well, there are other events to save the XML from the InfoPath form into the provisioned site.


The following screenshots show the two workflows for the Opportunity and SOW workflows. They are both very similar in nature. Notice the first thing each one does is load in the XML from the InfoPath form used during the site provisioning process. From the XML we retrieve primary key information like the company and opportunity ID to link the workflow to external data. I am not concerned about the other data as the InfoPath form is hooked into SmartObjects which I read from dynamically in the process configuration. Otherwise the rest is simple.

  • Metadata of the document is updated from the CRM via a SmartObject.
  • Users are configured dynamically into the process to complete tasks.
  • Email notifications are sent.
  • Environment variables are used to configure all parameters such as library names, sites, etc.
  • Only about five lines of custom code are written to set some process level data.
  • The workflows are deployed to the custom Content Types.
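For reference, the roughly five lines of custom code mentioned above are just assignments to process-level data fields inside a server event. A sketch, with invented field names:

```csharp
// Inside a K2 server event; "K2" is the context object the process provides.
// companyId and opportunityId were parsed from the InfoPath XML earlier.
K2.ProcessInstance.DataFields["CompanyID"].Value = companyId;
K2.ProcessInstance.DataFields["OpportunityID"].Value = opportunityId;
```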



Finally, the Reporting Services project uses the SmartObject Data Provider to query data from the CRM, the K2 database, and Active Directory. I was able to write a single SQL statement that joined all of these data sources together to create reports that show all of the active process instances and whom they are assigned to. The K2 Workspace provides several out of the box reports that do the same, and I could have used its report generation tool. However, I wanted to create a more customized report with more tactical data that business decision makers would want to view. Then I simply added the Reporting Services viewer web part into the SharePoint site, and the reports were available to everyone. I could even import this custom report into the K2 Workspace if I wanted to.


6) Conclusions

The creation and refinement of the workflows only took me a day once I had laid out the pattern that I wanted. Laying out that pattern took about two days, so the level of effort to create this solution is around three days (creating the custom document library definition was a whole different headache associated with poor MOSS documentation). Now that we have this nailed down, future document workflows can be laid out in this fashion in significantly less time.

As you can see, I was able to create processes that can be used to manage both the topology and taxonomy of content within MOSS with almost no custom code. Yes, almost no custom code, and I have a solution that is hooked into external data sources, has custom UIs, calls WSS services, etc. K2 BlackPearl provides tons of events and wizards that you can use to drive all sorts of variations of workflows in MOSS. At your fingertips you have event wizards to:

  • Create, Delete or Modify Sites and Workspaces
  • Create, Delete or Modify lists and libraries.
  • Create and delete both SharePoint Users and Groups.
  • Manage Permissions for sites, lists, folders, items.
  • Manage documents (upload, download, check in/out/undo, move/copy/delete, and modify metadata).
  • Create, copy, delete, retrieve and update any list items.
  • SharePoint search for list or library data.
  • Records management sending documents, creating and releasing holds.
  • Publishing wizards to create, copy, move, delete, check in, update/copy page content, and create/update/delete reusable content.
  • Out of the box BDC integration with several of the SmartObjects.
  • Ability to create complex InfoPath workflows.
  • Etc.

If that is not enough, you can create your own event wizards using the API. This list does not even include the entire infrastructure and management tools that come for free when using BlackPearl; I just cannot cover it all here.

As you can see, an overwhelming case can be made to use K2 BlackPearl when doing workflow automation within SharePoint. I classify BlackPearl as a framework which I can use to build all my SharePoint solutions.

Thursday, February 14, 2008

Measuring MOSS Performance with SCCP and MOM

1) Introduction
This is the second part of a MOSS Architecture and Performance article (Part 1), taking a deeper look into tools that can be used to estimate, manage, and forecast capacity for MOSS 2007. There are some performance studies on the Microsoft website, but many of the variables may not apply, or a more granular set of assumptions may need to be baked into the model.

There are two tools out there that you can use to assist with architecting and managing a MOSS production environment. First, there is the System Center Capacity Planner 2007, which can help with the planning of a production environment. The second is the MOSS 2007 Management Pack for MOM 2005, which can be used to monitor the actual performance of your production environment.


2) System Center Capacity Planner 2007 (SCCP)


All SharePoint architects should take a good look at this tool and use it first to estimate what would be required to support a new production environment, and second to assist with forecasting changes to an existing production environment. SCCP provides a simple wizard that asks various things like:

  • Is it an intranet or an extranet?
  • How many users will use the site? How will the users use the site (collaboration or publishing) and to what degree?
  • It will ask about branch offices and request information about branch office access to the network.
  • It will ask about the hardware and network configuration of the environment.
  • What sort of availability needs to be supported?
  • What sort of SQL server environment is available?

These are the same questions usually asked when doing a MOSS assessment. With this information, the tool will provide you a recommended topology. From there you can run simulations, and it will produce detailed reports on the performance of the planned topology.


What is really interesting about the tool is that:

  • User usage profiles can be created and modified to more accurately capture how the users will use the environment.
  • The ability to add more servers to the recommended model to test various different scenarios.
  • The ability to add multiple user roles to the model. For instance, you may have 50 highly collaborative users and 2000 publishing users.
  • The ability to attach new networks and understand how they affect overall performance.

2.1) In/Out of Scope


This information is pulled directly from the System Documentation for the System Center Capacity Planner 2007 Tool.


In-scope capabilities. The tool is designed to assist you in planning the following elements of a WSS/MOSS installation:


  • Deployment of WSS 3.0 on servers running Windows Server 2003 SP2.
  • Deployment of MOSS 2007 on servers running Windows Server 2003 SP2.
  • Determining storage requirements for MOSS.
  • Ensuring high-availability needs are met.
  • Planning for scalability and expansion of existing installations.

Out-of-scope capabilities. Several areas that are beyond the scope of the tool include:

  • Modeling memory usage. SCCP does not model memory usage.
  • Upgrade scenarios for WSS and MOSS. The tool models only the latest version of WSS 3.0 and MOSS 2007.
  • Self discovery of existing WSS/MOSS installations. The tool models only green-field and previously saved models.
  • Deployment migration from competitor products. The tool models only WSS and MOSS.
  • Real-time customer usage profiling. SCCP does not deploy agents into your server farm to monitor usage patterns in existing SharePoint installations. This may be addressed in future versions.
  • Virtualized WSS/MOSS deployment. The tool assumes that you are doing capacity planning for a production environment and are using physical server boxes to deploy your SharePoint farm.
  • Disaster recovery scenarios. Disaster recovery scenarios introduce levels of complexity that make efficient modeling prohibitive.
  • Side by side installations. The tool models only new installations of the latest WSS and MOSS releases. Incremental upgrades involving server farms with multiple versions of WSS or MOSS are not handled.
  • Extranet installation. Authentication complexity precludes implementing extranet modeling.
  • E-mail integration: Exchange Server integrated with SharePoint. E-mail integration may be included in a future release.
  • Microsoft Excel® services. Excel services may be included in a future release.
  • High-end scenarios. The tool does not model high-end scenarios such as multi-terabyte Web applications or multiple Web applications.
  • Authentication methods other than NTLM and Anonymous.

Even with all these limitations, this tool can still provide significant insight into the most important variables when designing a physical topology for MOSS.


2.2) Installing

There are two downloads you will need. First, download the System Center Capacity Planner itself; by default it only includes the Exchange capacity planner. Then download and install the SharePoint Capacity Planner template, which will allow you to create SharePoint models.

2.3) Building an Initial Model

The following is a quick view into this tool. First, you open the SCCP tool and select what type of capacity model you want to create. I have heard Microsoft intends to build more of these templates for the server products in their stack.

Next, you select what type of site you intend to create and select an initial user profile.

Next, you enter the branch offices.

If branch offices were created, you will need to provide more detailed information about their connection to the network.

Next, you identify the configuration of the servers that will be part of the recommended topology.

Next, you make selections around the availability of the web front end servers and the SQL Server database.

Finally, a screen is presented with a recommended topology.

After the wizard has completed, you are presented with a graphical tool that shows you all the servers. It provides a drag and drop GUI where you can add or remove elements from the model.

The following is the result of a simulation. Notice that this topology has pretty good results; however, there are some warnings on the amount of time it takes to create or delete a site. Knowing what permissions the majority of the users have will help dictate whether this is important or not. You may even go into the model, add a different user profile that has permission to do this activity, and then rerun the simulation.

Compare that with the result of this simulation, which shows some potential performance problems. The SCCP comes with a very detailed reference document explaining each one of the performance indicators.
3) MOSS 2007 Management Pack for MOM 2005
Now, the SCCP tool does a good job of modeling, but it will not assist with monitoring a SharePoint environment. The SharePoint Administrator's Companion does provide a good chapter on monitoring hardware performance; however, many SharePoint administrators require a tool that automatically notifies them when something interesting has occurred in the production environment. The Microsoft Office SharePoint Server 2007 MP for MOM 2005 watches for failures or configuration problems which affect the availability and performance of Office SharePoint Server 2007. The following things are tracked:
  • Shared Services Provider (SSP) provisioning failed
  • Site Directory scan job failed
  • Enabling features failed on some sites
  • Administration site for the SSP is missing
  • Enabling features on existing sites failed
  • The Office SharePoint Server Search service is not running
  • The Microsoft Single Sign-On service is not running
  • The Office Document Conversions Launcher service is not running
  • Failed to connect to parent server farm
  • SSP synchronization failed
  • The Office Document Conversions Load Balancer service is not running
  • Failures in content deployment jobs
  • Poor cache performance
  • Error during document copy or move operations
  • Errors with the Information Rights Management (IRM) features
  • Failures in the Document Conversion feature
  • Out of Memory exceptions coming from form business logic
  • Denial of Service scenarios
  • Failures during form processing or while loading business logic assemblies

The management pack can be downloaded here.


4) Conclusions
With both of these tools in place, SharePoint architects and administrators will be better able to plan their production environments, watch for performance issues, and model new configurations to compensate.

Wednesday, February 13, 2008

MOSS 2007 Architecture and Performance Considerations

1) Introduction
A very common problem that many clients encounter with their MOSS implementation is whether their production environment can handle the number of users. This can be a complex question whose answer is driven by many factors, most of all understanding how the site will be used. Because this is a broad question, this article tries to answer it at a high level and provide direction on areas where deeper investigation is warranted on a case-by-case basis. Hopefully decision makers can walk away after reading this with a good understanding of the considerations involved in a MOSS implementation.

2) Knowing Your MOSS Implementation
First, many need to decide whether to use MOSS or vanilla WSS 3.0. Here are two articles that spell out the feature differences clearly:

There are several things that are going to drive performance when setting up your MOSS environment. Typically, when breaking this down and tuning out all the noise, it really comes down to understanding how MOSS will be used. There are several different types of sites that may be implemented within a MOSS deployment:

  • Publishing Site (Intranet) – This could be characterized as content that would be read a lot by internal employees. It would have a medium level of transactions most of which would be reads with some insert, update and delete transactions associated with the management of the content. Typically this content is fairly static as the most dynamic content on any Intranet site is news, calendars, etc.
  • Team Collaboration – This can be a highly transactional environment where document libraries, lists, forums, discussion boards, task/issue lists, calendars, workflows, etc. are being updated daily, hourly, or even minute to minute. Knowing both how many users there are and their planned usage of the environment will greatly dictate the performance of this type of site.
  • My Sites – This again is similar to a Team Collaboration environment except that each user of the SharePoint environment has their own site to do the same things, which may or may not be used heavily. The number of transactions per user may not be that high (unless there is strong adoption); what can be compelling is the number of users making transactions.
  • Extranets – Can be similar to Team Collaboration environments except that they are externally accessible to partners so that they can collaborate on the same information. SharePoint can be configured to allow multiple authentication protocols to access the Team Collaboration content that is sitting behind a firewall.
  • Public Internet Site – This is a public site that is available to the entire world using anonymous access. This site would require the ability to support a high level of read transactions and make sure that the SharePoint environment is secure.

This list is not exhaustive, but it covers the most common site types. Please also note that the characterizations of low, medium, and high above describe transaction levels relative to each other.

It is important to note that the questions are the same as for building any application. We always need to know how many users there will be, what the users will be doing, what events may cause a high volume of usage, etc.


3) Physical Architecture
First, let us get an understanding of what the physical architecture of a standard MOSS environment may look like. Keeping this simple, there are three types of servers that will be part of your topology:

  • Web Front End Server(s) (WFEs): What users will directly access.
  • Application Server(s): Search Index, Excel Services, Form Services, etc.
    • InfoPath Forms Services Note: A common misconception is that Forms Services does not load the WFEs; it actually runs on the WFEs because it uses ASP.NET and IIS to serve up the content. If there will be heavy web-enabled InfoPath usage, it must be evaluated (InfoPath Forms Services Best Practices and InfoPath Forms Services Architecture).
    • Excel Services Note: It is better known that Excel Services heavily uses the WFEs, and it is the only service that supports load balancing on the WFEs (Excel Services Architecture).
    • Search Notes: There are two roles with Search, the Query Role and Index Role. To simplify the discussion the Index Role's job is to build indexes of content which are given to the Query Role. The Query Role is accessed by the user when they perform a search. The Query Role is typically installed on the WFE while the Index Role will run on the Application Server. This decouples the two from one another to support redundancy if the search index needs to be rebuilt, more server power needs to be allocated to building the index, etc.
  • Database Server(s): This hosts the Configuration Database for the MOSS server farm and all of the Content Database(s). There could be performance ramifications in how the Content Database(s) are partitioned logically with respect to the content stored within them, but it would have to be something extremely large. The size of the Content Database itself should not be an issue; please review (Demystify MOSS Content Database Sizing). It boils down to how agile you are with the administration of your databases (backing them up and restoring them).

This article, Plan for availability (Office SharePoint Server), spells out the most common physical architectures to be considered. The one most often recommended is the five-server farm depicted below, from that same article.


More WFEs can always be added and load balanced to help support the number of users on the MOSS farm. More application servers can be added to provide dedicated resources for services; however, something like the search index service cannot be scaled out across application servers. Probably the most effective configuration is the six-server configuration with two application servers: one dedicated to search, while the other supports less intensive services.

3.1) Licensing
To be honest, a five-server farm is not always required; it really comes down to what service level agreement (SLA) has been established, which dictates the level of redundancy that must be supported. A four-server farm configuration with one WFE, one application server, and a DB cluster is not uncommon when the client already has that infrastructure in place. Any decision maker should challenge themselves with the following question: if you are already willing to make an investment to create a highly available data tier, why not do the same for the web tier? Depending on the licensing agreement with Microsoft or software vendors, the cost of the server software itself is nominal compared to the cost of CALs for MOSS. The following article has a short write-up on the various licensing agreements out there for MOSS (Logical architecture model: Corporate Deployment).

3.2) Reality Check
It is important to give a quick reality check on some of the services and components that many want to use with MOSS.

3.2.1) Excel Services
To be blunt, Excel Services is not a robust tool. After reading the list of Unsupported Features in Excel Services, you will find it may not provide what is needed for Excel power users (for example, macros are not supported). It is not recommended to use Excel Services as a replacement for an enterprise reporting platform when tools such as PerformancePoint Server are built for that purpose. Understand its limitations before using it.


3.2.2) InfoPath Services
Before going off and making a huge investment in InfoPath Form Services and web enabled Forms, please review the following two articles:

Web-enabled InfoPath forms are fantastic tools for quickly building forms to capture information. However, the golden rule with them is that if you have complex user interfaces requiring multiple views, lots of custom .NET code, etc., you may have eclipsed InfoPath and be better off using traditional ASP.NET forms.


3.2.3) Workflow
Business users can use SharePoint Designer and developers can use Visual Studio to author workflows that can be hosted within SharePoint. However, SharePoint Designer has many limitations, and Visual Studio can be cumbersome, requiring you to build up a lot of supporting pieces. For complex scenarios, a workflow engine like K2 BlackPearl should be considered. The following article lays out workflow options and considerations for the various tools.

3.3) Extranet Topology
It is worth discussing what an extranet topology may look like, as many clients have requirements to support access to the MOSS environment from the outside. With SharePoint 2003, setting up external access was not a trivial, natural thing; this is not the case with MOSS 2007. Regardless of whether external access uses Active Directory (AD) credentials over SSL, Forms Based Authentication (FBA) with SQL Server, ADAM, etc., all that is being set up is a new zone with a specific authentication protocol associated with it. Users can all have access to the same content, and SharePoint security is used for authorization (security trimming) of content within MOSS.
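As a rough sketch of what an FBA zone involves: configuring FBA with SQL Server ultimately comes down to registering an ASP.NET membership provider in that zone's web.config. The connection string, provider name, and database name below are hypothetical placeholders, not a definitive configuration:

```xml
<!-- Hypothetical web.config fragment for an FBA zone; names are placeholders -->
<connectionStrings>
  <add name="FbaSqlConnection"
       connectionString="Server=sqlserver;Database=aspnetdb;Integrated Security=True" />
</connectionStrings>
<system.web>
  <membership defaultProvider="FbaMembershipProvider">
    <providers>
      <add name="FbaMembershipProvider"
           type="System.Web.Security.SqlMembershipProvider, System.Web,
                 Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
           connectionStringName="FbaSqlConnection" />
    </providers>
  </membership>
</system.web>
```

The same provider name would then be referenced in the zone's authentication provider settings in Central Administration so SharePoint knows which membership store to authenticate against.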

Design extranet farm topology (Office SharePoint Server) is an article written by Microsoft that lays out the details of the various configurations for an extranet. There are various perimeter and publishing models. The following diagram from that article shows a very common configuration.


In this model, placing the application servers on the corporate network can be beneficial, as efficiencies are gained when building the search index because the server doing the indexing is closer to the data sources it is crawling. Plan Security Hardening for Extranet Environments provides detailed insight into the configuration of all the ports between the DMZ and the corporate network.

3.4) Topology Performance Testing Results
In an effort to provide some easy performance numbers, Microsoft completed tests of various topologies. If an environment requires the ability to support more users, it can always be scaled by adding more WFEs or further segregating the application server services onto their own dedicated machines. Here are some high-level statistics summarized in Plan for performance and capacity (Office SharePoint Server). Please note that the hardware configurations may not be identical and there are many factors that could affect these statistics.

It is interesting to note that the Microsoft study found diminishing returns after four web servers had been added for a single database server. Case-by-case research would need to be completed to gain a full understanding of how many users will be using the farm concurrently and in what capacity.
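To ground the "how many concurrent users" question, a back-of-the-envelope estimate of peak requests per second can be derived from the user count and usage profile before comparing against published topology numbers. This is a generic capacity-planning sketch, not part of the Microsoft study; the user counts, concurrency rate, and peak factor below are illustrative assumptions only:

```python
def peak_rps(total_users, concurrency, requests_per_user_per_hour, peak_factor=2.0):
    """Estimate the peak requests/sec a farm must sustain.

    concurrency: fraction of users active at any one time (e.g. 0.10 for 10%)
    peak_factor: multiplier to cover bursts above the steady-state average
    """
    # Steady-state load: active users times their hourly request rate,
    # converted from requests/hour to requests/second.
    steady_rps = total_users * concurrency * requests_per_user_per_hour / 3600.0
    return steady_rps * peak_factor

# Illustrative numbers: 10,000 users, 10% concurrent, 36 requests/user/hour
print(peak_rps(10_000, 0.10, 36))  # 20.0 rps at a 2x peak factor
```

An estimate like this can then be checked against the throughput numbers published for each topology to see whether, say, two WFEs are plausibly enough or whether the farm is approaching the range where adding web servers stops helping.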


4) Logical Architecture
Finally, the logical architecture is probably one of the more complex topics that needs to be discussed. The article Logical architecture model: Corporate Deployment is the most comprehensive in this regard; it has detailed information on how an enterprise logical deployment should be configured. The following diagram was developed from that article and appears in the following blog posting (Investing in Logical Architecture Design Samples).


Some important things to walk away with are:

  • My Sites are partitioned from team sites into their own Site Collection. This is extremely important and highly recommended, and it should be done from the beginning, as it is difficult to move them later (Design My Sites Architecture).
  • Internet and Intranet\Team Sites are likewise partitioned into their own Site Collections, as collaborative environments tend to consume more resources than publishing sites. The logical architecture even goes as far as partitioning the Intranet and the Team Sites from each other. Site Collections can have their own dedicated Content Databases and can easily be moved to their own dedicated WFEs if there are future performance problems. Even if the entire implementation were physically installed on one box, logically separating it like this gives the solution much more ability to scale.
  • Dedicated application servers (Shared Services Providers) should be set up to support the various different types of sites. For example the external web site would use a different SSP than the intranet for security reasons.
  • Various Application Pools are recommended to achieve better isolation for security, performance, redundancy, etc. Zones are then used to logically control the authentication protocol.
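As a sketch of the dedicated-content-database idea above: in MOSS, a site collection can be created directly into its own new content database with stsadm. The URL, owner account, and database name here are hypothetical placeholders:

```
REM Hypothetical example: create a site collection in a dedicated content database
stsadm -o createsiteinnewdb -url http://intranet/sites/teams ^
    -owneremail admin@contoso.com -ownerlogin CONTOSO\admin ^
    -sitetemplate STS#0 -databasename WSS_Content_Teams
```

Putting a heavily used site collection in its own content database from day one keeps the later option open of moving that database to dedicated hardware without restructuring the sites themselves.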

5) Tools for Understanding the Topology
The next part of this article goes into a discussion of the various tools that can be used to help determine what the architecture should be (System Center Capacity Planner 2007) and how it should be monitored after it has gone into production (MOSS 2007 Management Pack for MOM 2005).