Thursday, February 28, 2008

K2 InfoPath Parallel Activity Lesson Learned

I have a small lesson learned because I did not follow my own best practices for InfoPath integration with K2. Basically, I had some fairly simple processes using InfoPath. At certain points in the K2 process I would push data out to SQL using SmartObjects. A best practice is to make sure you push that InfoPath data out rather than letting it sit as an XML document in a form library or in the K2 database, where the data is not very accessible for enterprise reporting.

I had a change request to add some parallel processing to a process that was in production. The process had Security, VP, IT and Help Desk approvals that occurred in a serial manner. Now I needed to change the process by adding a Contracts approval that would occur in parallel with the Security approval. It was simple enough to add a new Contracts approval activity and then add a new preceding rule to the VP activity to make sure that both Contracts and Security finish before VP approval can begin. I ran into some issues accessing the Outcome and Action Result activity-level fields for both the Security and Contracts approvals. I would have liked the preceding rule to require that those values equal "approved", but they were not available in the wizard. The only option I had was to add some process-level fields, add new events to the Security and Contracts activities to set those fields appropriately, and then use the new fields in the VP preceding rule.
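Since the wizard would not expose those activity-level fields, the workaround amounts to a one-line server event in each approval activity. A hedged sketch of what that K2 server event code might look like (the data field names here are assumptions, not the actual ones from this process):

```csharp
// Server event inside the Security approval activity (Contracts gets the
// equivalent line). Copies the activity-level result into a process-level
// field so the VP activity's preceding rule can evaluate it.
// "SecurityOutcome" and "ActionResult" are illustrative field names.
K2.ProcessInstance.DataFields["SecurityOutcome"].Value =
    K2.ActivityInstanceDestination.DataFields["ActionResult"].Value;
```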

The real issue I ran into was working with the InfoPath data. The use case stated that if both Security and Contracts rejected, I would need to resubmit back to each one independently. So if Contracts rejects, Security should still be allowed to approve. There is no guarantee this would be done in a coordinated fashion, and the Security and Contracts approvers may be working with different portions of the data.

The challenge was that if both the Security and Contracts approvers rejected, I would have issues with the resubmission of data and with ensuring the data in the InfoPath forms was managed correctly. Say both Security and Contracts rejected: an InfoPath form would be generated for each resubmission activity for the originator. My InfoPath form was NOT loading from the database when it was opened. So the originator could open the Security rejection, change some data, and submit; subsequently the originator would resubmit the Contracts rejection with different updated data. The Contracts resubmission would then overwrite the Security resubmission's edits, because I was not loading the InfoPath form from the database. It was clear as day; I knew this was going to happen.

The solution: modify the InfoPath form to load from my SmartObjects. Luckily it is extremely simple to hook SmartObjects into an InfoPath form and resolve this issue.

The lesson learned is to treat InfoPath forms just like I would an ASP.NET form: when the form opens, use rules to populate it, and then use submit rules to save the form data using SmartObjects.

Tuesday, February 19, 2008

MOSS Cross Site Lookup

Need a free MOSS 2007 cross site look up?

The one that comes up on most searches seems to have tons of issues.

However, after Googling a little harder, I found this cross-site lookup, which has worked pretty well; I have seen three different projects (including one of mine) get some real mileage out of it. The author gives a fantastic tutorial on how it was composed, too. The only things we found were:

  • The cross-site lookup is only a dropdown, so it will not support multi-selects. Well, you have the code.
  • Another person at RDA noticed it does not return results in alphabetical order; however, that was easily fixed by adding the following to CrossSiteLookupFieldControl.FillDropDownList().
// Requires: using System.Collections; using System.Web.UI.WebControls;

public static void SortByValue(ListControl combo)
{
    SortCombo(combo, new ComboValueComparer());
}

public static void SortByText(ListControl combo)
{
    SortCombo(combo, new ComboTextComparer());
}

private static void SortCombo(ListControl combo, IComparer comparer)
{
    if (combo.Items.Count <= 1)
        return;

    // Copy the items out, sort them, then rebuild the list.
    ArrayList arrItems = new ArrayList();

    for (int i = 0; i < combo.Items.Count; i++)
    {
        arrItems.Add(combo.Items[i]);
    }

    arrItems.Sort(comparer);
    combo.Items.Clear();

    for (int i = 0; i < arrItems.Count; i++)
    {
        combo.Items.Add((ListItem) arrItems[i]);
    }
}

// Compares list items by their Value property.
private class ComboValueComparer : IComparer
{
    public enum SortOrder
    {
        Ascending = 1,
        Descending = -1
    }

    private int _modifier;

    public ComboValueComparer()
    {
        _modifier = (int) SortOrder.Ascending;
    }

    public ComboValueComparer(SortOrder order)
    {
        _modifier = (int) order;
    }

    public int Compare(Object o1, Object o2)
    {
        ListItem cb1 = (ListItem) o1;
        ListItem cb2 = (ListItem) o2;
        return cb1.Value.CompareTo(cb2.Value) * _modifier;
    }
} // end class ComboValueComparer

// Compares list items by their Text property.
private class ComboTextComparer : IComparer
{
    public enum SortOrder
    {
        Ascending = 1,
        Descending = -1
    }

    private int _modifier;

    public ComboTextComparer()
    {
        _modifier = (int) SortOrder.Ascending;
    }

    public ComboTextComparer(SortOrder order)
    {
        _modifier = (int) order;
    }

    public int Compare(Object o1, Object o2)
    {
        ListItem cb1 = (ListItem) o1;
        ListItem cb2 = (ListItem) o2;
        return cb1.Text.CompareTo(cb2.Text) * _modifier;
    }
} // end class ComboTextComparer
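With those helpers in place, the fix itself is a one-line call at the end of the population routine. A sketch of where it would go (the exact signature and body of FillDropDownList in the original control are assumptions, not reproduced from its source):

```csharp
// Hypothetical shape of the method in CrossSiteLookupFieldControl.
protected void FillDropDownList(DropDownList dropDownList)
{
    // ... existing code that populates dropDownList from the source list ...

    // Sort the populated items alphabetically by their display text.
    SortByText(dropDownList);
}
```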

Sunday, February 17, 2008

BlackPearl MOSS Governance Case Study

1) Introduction


One of the biggest things we run into when talking with a new MOSS client is how governance is enforced within MOSS. Some of their experiences may be:

  • With SharePoint 2003 and WSS 2.0 we stood up sites everywhere and no one could navigate or search for content.
  • Team sites, workspaces, etc. are not organized and have sprawled everywhere, making it impossible to find content.
  • Documents and content have no classifications.
  • Document libraries are full of disorganized content.
  • Documents and content have no business life-cycle management.
  • Etc.

MOSS 2007 has given us tons of new features and functionality to assist with governance of content. The following site is a good resource for becoming familiar with MOSS governance. However, the out-of-the-box functionality can fall short, specifically in terms of workflow. These shortcomings have been documented before (here), but they become even more compelling when a use case is put against them.

2) Use Case

The following is a simple use case for managing an opportunity/sales process.

  1. Account Executive identifies a business opportunity.
  2. Account Executive goes to screen and enters in required data.
  3. A site is created for the company.
  4. A site is created for the opportunity.
  5. Account Executive fills out Opportunity Assessment and SOW Templates.
  6. Account Executive initiates workflow that notifies participants (Business Development Manager, Program Director and Project Manager).
  7. All participants review the document and mark as Approved.
    1. A single Deny requires Account Executive resubmission.
  8. When complete, all participants are notified of the end status.

3) End Solution Vision

The goal of this implementation is to create functionality in MOSS that will manage both the topology and the taxonomy of content. The first thing that comes to mind is using Content Types and Workflows to manage this solution. Content Types would be used to classify documents, store associated metadata, and attach workflows that manage the document life-cycle. Among the benefits of doing this through Content Types: the content data can be centrally defined, re-used across the entire site, and targeted more precisely with Enterprise Search.

We will see that this vision cannot be easily achieved with SharePoint Designer or Visual Studio and the case for using K2 BlackPearl will be overwhelming.

4) MOSS WF Shortcomings

The first problem you will encounter is that you cannot use SharePoint Designer. SharePoint Designer is a great tool for managing document workflow, but the workflow is stuck on the list you created it for and cannot be associated with a Content Type, which is a major shortcoming. In this use case new sites are created continually; creating workflows on the document library itself will not work, because there will be a new document library for each site.

Changing the use case to not use sub-sites (or Document Workspaces) would not solve the problem, as the single document library would become a dumping ground for documents. There are documents that support the creation of the Opportunity or SOW documents, for instance project plans, budgets, requirements documents, whitepaper references, etc. It is important to keep this content close together in an organized fashion, and creating a Document Workspace fits the bill. If you did not use a Document Workspace, you would not be able to take advantage of other collaboration tools like task lists, calendars, custom lists, discussion boards, etc., because these tools would be shared among all the documents in the single library, creating a rat's nest of content.

Second, SharePoint Designer does not support much beyond modifying the metadata of a document, sending emails, and a few other things. You cannot use SharePoint Designer to provision sites.

Third, there will probably be a need to access CRM data (company names, past opportunities, contact information, etc.), and none of this is accessible using SharePoint Designer, as it cannot connect to external data sources.

Fourth, some may try to retrieve external data or provision sites by creating a custom WF Activity Library for SharePoint Designer. Writing the custom code for these operations is nothing out of the ordinary; however, getting the activities to hook into SharePoint Designer and passing values between the activities is a whole different issue you will run into.
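For reference, "hooking in" a custom activity means exposing its inputs as dependency properties so that values can be bound and passed between activities in the designer. A minimal WF 3.0 sketch (the class and property names are illustrative, not from a real library):

```csharp
using System.Workflow.ComponentModel;

// Illustrative custom activity; names are made up for this sketch.
public class ProvisionSiteActivity : Activity
{
    // Registering a DependencyProperty is what lets the designer bind and
    // pass values between activities -- the painful part referred to above.
    public static readonly DependencyProperty SiteUrlProperty =
        DependencyProperty.Register("SiteUrl", typeof(string), typeof(ProvisionSiteActivity));

    public string SiteUrl
    {
        get { return (string)GetValue(SiteUrlProperty); }
        set { SetValue(SiteUrlProperty, value); }
    }

    protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
    {
        // Custom provisioning or data-retrieval code would go here.
        return ActivityExecutionStatus.Closed;
    }
}
```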

At this point we can pretty much say SharePoint Designer 2007 will not be useful, and as a workflow tool it will fall short over the long run. Next we would look into using Visual Studio to create a SharePoint Workflow Foundation (WF) workflow. Simply put, creating workflow in Visual Studio is a complete custom coding effort. All of the things needed for this solution can be done with Visual Studio; however, the level of effort increases and there is zero infrastructure around it (other than not having to build a host server for the workflow definition, since it is hosted by SharePoint as a Feature).

WF workflows in SharePoint revolve around managing tasks in a Task List; that is it. Tasks are created in a task list, which leads the user to an ASP.NET or InfoPath form where the user can take an action. From what we have seen on custom WF projects, the effort increases even for the simplest things. When writing WF workflows there is no destination-user or rules infrastructure; for instance, it is not out of the box to define rules like:

  • All destination users must approve but the first to deny will move the workflow to another state.
  • All destination users must approve the task serially, one after another.
  • If a certain number of destination users approve, then the task is completed and moves to the next state.
  • Destination user assignment is determined based on process instance state data and consequently the user profile must be retrieved from an external system.
  • Etc.

All of this logic would have to be built from scratch.

As well, there is no framework around the actions that users can take. You would code the workflow to look for a custom column in the task list, check for a specific value, and then move the workflow from one state to another. Nor is there any security around the completion of the task to ensure the right person is completing it.

As well, building multi-stage processes with different forms is cumbersome. There are initiation, association and task forms, and when I have played with them in the past using InfoPath, there was no way to get at the XML data easily (except on the initiation form). What you will quickly start doing is not storing any data in the workflow at all, externalizing all of it, which completely defeats some of the purpose of using a tool like InfoPath. Since you will externalize all of your data, you will need to create custom libraries to persist it, abstracted in such a way that your workflow and your forms (whether ASP.NET or InfoPath) can both use them.

There is no reporting around WF workflows either (SharePoint Designer ones included). You can go to the properties of an item and see all the log entries created for it, but that is it. There is no way to aggregate the data across items to create business-contextual reports. As well, since the data is associated with the SharePoint item, if that item is ever deleted, all of its workflow history is deleted with it. Knowing this, you will be forced to externalize all of your business reporting data and make sure you persist it during workflow execution.

As well, there is no central administration or infrastructure for the workflows at all: no out-of-the-box way to handle security, no way to search for workflow instances globally to manage them, no infrastructure for managing configuration parameters, and the list goes on and on…

5) BlackPearl SharePoint Workflow Solution

I must first stress that BlackPearl workflow is much more than SharePoint workflow. It is a workflow server built completely on top of Windows Workflow Foundation that can be used to create workflows for SharePoint, WinForms, ASP.NET, around COTS products, etc.

While figuring out how to build this solution I had two options. I could use BlackPearl's SharePoint Integration, which deploys a workflow as a Feature in the same fashion as a workflow created in Visual Studio, or I could use the SharePoint Events Integration, which listens for events at a location (like a list) to initiate a workflow. Given that I wanted a workflow associated with a Content Type, and one a user could action either from within SharePoint or Office, using the BlackPearl SharePoint Integration was a no-brainer.

Given the use case stated above, these are the projects in my solution:

  • SharePoint Project:
    • Feature with Content Types (with standard document templates) and custom Fields.
    • Feature with a custom document library with the custom Content Types associated to it.
    • Simple Custom Web Part to display list of Sub Sites (not out of the box with WSS 3 but extremely simple).
    • Custom Site Definition for Company level sites (Sub Site Web part included in definition).
    • Custom Site Definition for Opportunity level sites (had custom document library and Forms library with XML configuration added).
  • K2 BlackPearl SmartObject Project:
    • Two SmartObjects using the Dynamic SQL service to retrieve data from CRM.
    • One SmartObject to retrieve data from Active Directory.
  • K2 BlackPearl Process Project:
    • InfoPath Process to initiate site provisioning.
    • SharePoint Integration process for Opportunity Content Types.
    • SharePoint Integration process for SOW Content Types.
  • SQL Reporting Services Project:
    • Reports that used the SmartObject Data Provider to retrieve data from custom SmartObjects and out of the box SmartObjects to display status of all workflows.

That, in a nutshell, is the solution; let's dive a little deeper into each piece.

The SharePoint Project encompasses all of the standard development needed for SharePoint. I made it a point to create the custom content types, document libraries, and site definitions myself because I believe in making solutions highly re-deployable. I am not a fan of creating templates or doing things in SharePoint Designer unless I have to. I look at CAML the same way I look at the DDL I would write to define a database: it must be repeatable and able to be built from scratch on demand, otherwise you will encounter maintainability issues down the road.

The SmartObject Project allows me to harvest data from disparate sources and provide it to all levels of the solution through a uniform interface, API, and deployment methodology. The SmartObject framework deserves a whole separate discussion, but here it gave me the ability to work with data from different locations without writing a single line of code. As you will see, I was able to use this data in my InfoPath form through the out-of-the-box web services, and I was able to use the drag-and-drop environment within the K2 process to access this data. If I needed to write any custom code, I would just use the API to make a connection to the server, set some data into a SmartObject instance, and call a method like Save.

The first process looks like the following. This process is started by a user opening an InfoPath form. The form uses SmartObjects to display past opportunities for the company and allows selection of the participants for the future Opportunity and SOW workflows (Account Execs, Directors and Project Managers). There are events to create sites based on the custom definitions created earlier, as well as events to save the XML from the InfoPath form into the provisioned site.


The following screenshots show the Opportunity and SOW workflows, which are very similar in nature. Notice that the first thing each one does is load the XML from the InfoPath form used during the site-provisioning process. From the XML we retrieve primary-key information, like the company and opportunity IDs, to link the workflow to external data. I am not concerned about the other data, as the InfoPath form is hooked into SmartObjects, which I read dynamically in the process configuration. Otherwise the rest is simple:

  • Metadata of the document is updated from the CRM via a SmartObject.
  • Users are configured dynamically into the process to complete tasks.
  • Email notifications are sent.
  • Environment variables are used to configure all parameters such as library names, sites, etc.
  • Only about five lines of custom code are written to set some process level data.
  • The workflows are deployed to the custom Content Types.



Finally, the Reporting Services project uses the SmartObject Data Provider to query data from the CRM, the K2 database, and Active Directory. I was able to write a single SQL statement that joined all of these data sources together to create reports showing all of the active process instances and whom they are assigned to. The K2 Workspace provides several out-of-the-box reports that do the same, and I could have used its report-generation tool; however, I wanted a more customized report with the tactical data business decision makers would want to view. Then I simply added the Reporting Services viewer web part to the SharePoint site, and the reports were available to everyone. I could even import this custom report into the K2 Workspace if I wanted to.


6) Conclusions

The creation and refinement of the workflows took me only a day once I had laid out the pattern I wanted, and the pattern itself took about two days, so the level of effort to create this solution was around three days (creating the custom document library definition was a whole different headache, thanks to poor MOSS documentation). Now that we have this nailed down, future document workflows can be laid out in this fashion in significantly less time.

As you can see, I was able to create processes that manage both the topology and taxonomy of content within MOSS with almost no custom code. Yes, almost no custom code, and I have a solution that hooks into external data sources, has custom UIs, calls WSS services, etc. K2 BlackPearl provides tons of events and wizards that you can use to drive all sorts of variations of workflows in MOSS. At your fingertips you have event wizards to:

  • Create, Delete or Modify Sites and Workspaces
  • Create, Delete or Modify lists and libraries.
  • Create and delete both SharePoint Users and Groups.
  • Manage Permissions for sites, lists, folders, items.
  • Manage documents (upload, download, check in/out/undo, move/copy/delete, and modify metadata).
  • Create, copy, delete, retrieve and update any list items.
  • SharePoint search for list or library data.
  • Records management sending documents, creating and releasing holds.
  • Publishing wizards to create, copy, move, delete, check in, and update/copy page content, and to create/update/delete reusable content.
  • Out of the box BDC integration with several of the SmartObjects.
  • Ability to create complex InfoPath workflows.
  • Etc.

If that is not enough, you can create your own event wizards using the API. This list does not even include the infrastructure and management tools that come for free with BlackPearl; I just cannot cover it all here.

As you can see, an overwhelming case can be made for using K2 BlackPearl when doing workflow automation within SharePoint. I classify BlackPearl as a framework on which I can build all my SharePoint solutions.

Thursday, February 14, 2008

Measuring MOSS Performance with SCCP and MOM

1) Introduction
This is the second part of a MOSS Architecture and Performance article (Part 1), taking a deeper look at tools that can be used to estimate, manage, and forecast capacity for MOSS 2007. There are some performance studies on the Microsoft website, but many of the variables may not apply, or a more granular set of assumptions may need to be baked into a model.

There are two tools you can use to assist with architecting and managing a MOSS production environment. First, the System Center Capacity Planner 2007 can help with planning a production environment. Second, the MOSS 2007 Management Pack for MOM 2005 can be used to monitor the actual performance of your production environment.


2) System Center Capacity Planner 2007 (SCCP)


All SharePoint architects should take a good look at this tool and use it first to estimate what would be required to support a new production environment, and second to assist with forecasting changes to that environment. SCCP provides a simple wizard that asks various things, like:

  • Is it an Intranet or an Extranet?
  • How many users will use the site? How will the users use the site (collaboration or publishing) and to what degree?
  • It will ask about branch offices and request information about Branch office access to the network.
  • It will ask about the hardware and network configuration of the environment.
  • What sort of availability needs to be supported?
  • What sort of SQL server environment is available?

This information matches the questions usually asked when doing a MOSS assessment. With it, the tool will provide a recommended topology. From there you can run simulations, and it will produce detailed reports on the performance of the planned topology.


What is really interesting about the tool is that:

  • User usage profiles can be created and modified to more accurately capture how the users will use the environment.
  • Ability to add more servers to the recommended model to test various scenarios.
  • Ability to add multiple user roles to model. For instance you may have 50 high collaborative users while there are 2000 publishing users.
  • Ability to attach new networks and understand how they affect overall performance.

2.1) In/Out of Scope


This information is pulled directly from the System Documentation for the System Center Capacity Planner 2007 Tool.


In-scope capabilities. The tool is designed to assist you in planning the following elements of a WSS/MOSS installation:


  • Deployment of WSS 3.0 on servers running Windows Server 2003 SP2.
  • Deployment of MOSS 2007 on servers running Windows Server 2003 SP2.
  • Determining storage requirements for MOSS.
  • Ensuring high-availability needs are met.
  • Planning for scalability and expansion of existing installations.

Out-of-scope capabilities. Several areas that are beyond the scope of the tool include:

  • Modeling memory usage. SCCP does not model memory usage.
  • Upgrade scenarios for WSS and MOSS. The tool models only the latest version of WSS 3.0 and MOSS 2007.
  • Self discovery of existing WSS/MOSS installations. The tool models only green-field and previously saved models.
  • Deployment migration from competitor products. The tool models only WSS and MOSS.
  • Real-time customer usage profiling. SCCP does not deploy agents into your server farm to monitor usage patterns in existing SharePoint installations. This may be addressed in future versions.
  • Virtualized WSS/MOSS deployment. The tool assumes that you are doing capacity planning for a production environment and are using physical server boxes to deploy your SharePoint farm.
  • Disaster recovery scenarios. Disaster recovery scenarios introduce levels of complexity that make efficient modeling prohibitive.
  • Side by side installations. The tool models only new installations of the latest WSS and MOSS releases. Incremental upgrades involving server farms with multiple versions of WSS or MOSS are not handled.
  • Extranet installation. Authentication complexity precludes implementing extranet modeling.
  • E-mail integration: Exchange Server integrated with SharePoint. E-mail integration may be included in a future release.
  • Microsoft Excel® services. Excel services may be included in a future release.
  • High-end scenarios. The tool does not model high-end scenarios such as multi-terabyte Web applications or multiple Web applications.
  • Authentication methods other than NTLM and Anonymous.

Even with all these limitations, this tool can still provide significant insight into the most important variables when designing a physical topology for MOSS.


2.2) Installing
There are two downloads you will need. First, download and install the System Center Capacity Planner itself; by default it includes only the Exchange capacity planner. Then download and install the SharePoint Capacity Planner template, which will allow you to create SharePoint models.
2.3) Building an Initial Model
The following is a quick walkthrough of the tool. I have heard Microsoft intends to build more of these planning tools for the server products in their stack.

  1. Open the SCCP tool and select the type of capacity model you want to create.
  2. Select the type of site you intend to create and an initial user profile.
  3. Enter the branch offices.
  4. If branch offices were created, provide more detailed information about their connection to the network.
  5. Identify the configuration of the servers that will be part of the recommended topology.
  6. Make selections around the availability of the web front-end servers and the SQL Server database.
  7. Finally, a screen is presented with a recommended topology.

After the wizard completes, you are presented with a graphical view of all the servers, with a drag-and-drop GUI where you can add or remove elements from the model.

The first simulation result shows a topology with pretty good results; however, there are some warnings about the amount of time it takes to create or delete a site. Knowing the permissions held by the majority of users will help dictate whether this is important. You may even go into the model, add a user profile that has permission to do this activity, and rerun the simulation.

Contrast that with the second simulation result, which shows some potential performance problems. The SCCP comes with a very detailed reference document explaining each of the performance indicators.
3) MOSS 2007 Management Pack for MOM 2005

The SCCP tool does a good job of modeling, but it will not assist with monitoring a SharePoint environment. The SharePoint Administrator's Companion provides a good chapter on monitoring hardware performance; however, many SharePoint administrators require a tool that automatically notifies them when something interesting has occurred in the production environment. The Microsoft Office SharePoint Server 2007 MP for MOM 2005 watches for failures or configuration problems that affect the availability and performance of Office SharePoint Server 2007. The following things are tracked:
  • Shared Services Provider (SSP) provisioning failed
  • Site Directory scan job failed
  • Enabling features failed on some sites
  • Administration site for the SSP is missing
  • Enabling features on existing sites failed
  • The Office SharePoint Server Search service is not running
  • The Microsoft Single Sign-On service is not running
  • The Office Document Conversions Launcher service is not running
  • Failed to connect to parent server farm
  • SSP synchronization failed
  • The Office Document Conversions Load Balancer service is not running
  • Failures in content deployment jobs
  • Poor cache performance
  • Error during document copy or move operations
  • Errors with the Information Rights Management (IRM) features
  • Failures in the Document Conversion feature
  • Out of Memory exceptions coming from form business logic
  • Denial of Service scenarios
  • Failures during form processing or while loading business logic assemblies

The management pack can be downloaded here.


4) Conclusions
With both of these tools in place, SharePoint architects and administrators will be better able to plan their production environments, watch for performance issues, and model new configurations to compensate.

Wednesday, February 13, 2008

MOSS 2007 Architecture and Performance Considerations

1) Introduction
A very common question many clients have about their MOSS implementation is whether their production environment can handle the number of users. This can be a complex question, driven by many things (often an understanding of how the site is used). Since it is a broad question, this article will answer it at a high level and point out areas where deeper investigation is warranted on a case-by-case basis. Hopefully decision makers can walk away with a good understanding of the considerations involved in a MOSS implementation.

2) Knowing Your MOSS Implementation
First, many need to decide whether to use MOSS or vanilla WSS 3.0. Here are two articles that spell out the feature differences clearly:

Several things will drive performance when setting up your MOSS environment. Typically, when you break this down and tune out all the noise, it really comes down to understanding how MOSS will be used. There are several different types of sites that may be implemented within a MOSS deployment:

  • Publishing Site (Intranet) – Content that is read a lot by internal employees. It would have a medium level of transactions, most of which would be reads, with some insert, update, and delete transactions associated with managing the content. Typically this content is fairly static; the most dynamic content on any Intranet site is news, calendars, etc.
  • Team Collaboration – This can be a highly transactional environment where document libraries, lists, forums, discussion boards, task/issue lists, calendars, workflows, etc. are being updated daily, hourly, or minute to minute. Knowing both how many users there are and their planned usage will greatly dictate the performance of this type of site.
  • My Sites – This is again similar to a Team Collaboration environment, except that each user of the SharePoint environment has their own site to do the same things, which may or may not be used heavily. The number of transactions per user may not be that high (unless there is strong adoption); however, what can be compelling is the number of users making transactions.
  • Extranets – These can be similar to Team Collaboration environments except that they are externally accessible so that partners can collaborate on the same information. SharePoint can be configured to allow multiple authentication protocols to access the Team Collaboration content sitting behind a firewall.
  • Public Internet Site – This is a public site that is available to the entire world using anonymous access. This site would require the ability to support a high level of read transactions and make sure that the SharePoint environment is secure.

This is not an exhaustive list, just the most common types. Please also note that the characterizations of low, medium and high in the explanations above describe the level of transactions relative to each other.

It is important to note that the questions do not change from those asked when building any application. We always need to know how many users there will be, what the users will be doing, what events may cause a high volume of usage, etc.


3) Physical Architecture
First let us get an understanding of what the physical architecture of a standard MOSS environment may look like. Keeping this simple there are three types of servers that will be part of your topology:

  • Web Front End Server(s) (WFEs): What users will directly access.
  • Application Server(s): Search Index, Excel Services, Form Services, etc.
    • InfoPath Form Services Note: A common misconception exists here: Form Services actually runs on the WFEs because it uses ASP.NET and IIS to serve up the content. If there will be heavy web-enabled InfoPath usage, this must be evaluated (InfoPath Forms Services Best Practices and InfoPath Forms Services Architecture).
    • Excel Services Note: It is better known that Excel Services makes heavy use of the WFEs, and it is the only service that supports load balancing on the WFEs (Excel Services Architecture).
    • Search Notes: There are two roles with Search, the Query Role and Index Role. To simplify the discussion the Index Role's job is to build indexes of content which are given to the Query Role. The Query Role is accessed by the user when they perform a search. The Query Role is typically installed on the WFE while the Index Role will run on the Application Server. This decouples the two from one another to support redundancy if the search index needs to be rebuilt, more server power needs to be allocated to building the index, etc.
  • Database Server(s): This hosts the Configuration Database for the MOSS server farm and all of the Content Database(s). There could be performance ramifications from how the Content Database(s) are partitioned logically with respect to the content stored within them, but it would have to be something really, really large. The size of the Content Database itself should not be an issue; please review (Demystify MOSS Content Database Sizing). It boils down to how agile you are with the administration of your databases (backing them up and restoring them).

This article, Plan for availability (Office SharePoint Server), spells out the most common physical architectures to be considered. The one most often recommended is the five server farm depicted below, from the Plan for availability (Office SharePoint Server) article.


More WFEs can always be added and load balanced to help support the number of users on the MOSS farm. More application servers can be added to provide dedicated resources for services; however, something like the search index service cannot be scaled out across application servers. Probably the most effective configuration is the six server configuration with two application servers: one dedicated to search while the other supports less intensive services.

3.1) Licensing
To be honest, a five server farm is not always required; it really comes down to what service level agreement (SLA) has been set up, which will dictate the level of redundancy that needs to be supported. It is not uncommon to see a four server farm configuration with one WFE, one Application Server and a DB cluster because the client already has this infrastructure in place. Any decision maker should challenge themselves with the following question: if you are already willing to make an investment to create a highly-available data tier, why not do the same for the web tier? Depending on the licensing agreement with Microsoft or software vendors, the cost of the server software itself is nominal compared to the cost of CALs for MOSS. The following article has a short write-up on the various licensing agreements out there for MOSS (Logical architecture model: Corporate Deployment).

3.2) Reality Check
It is important to give a quick reality check on some of the services and components that many want to use with MOSS.

3.2.1) Excel Services
This is a quick overview of Excel Services: it is not as robust a tool as many expect. After reading the list of Unsupported Features in Excel Services, you will find it may not provide what Excel power users need (macros are not supported). It is not recommended to use Excel Services as a replacement for an enterprise reporting platform; tools such as PerformancePoint Server are built for that. Understand its limitations before using it.


3.2.2) InfoPath Services
Before going off and making a huge investment in InfoPath Forms Services and web-enabled forms, please review the following two articles:

Web-enabled InfoPath forms are fantastic tools for quickly building forms to capture information. However, the golden rule with them is that if you have complex user interfaces requiring multiple views, lots of custom .NET code, etc., you may have eclipsed InfoPath and be better off using traditional ASP.NET forms.


3.2.3) Workflow
Business users can use SharePoint Designer and developers can use Visual Studio to author workflows hosted within SharePoint. However, SharePoint Designer has many limitations, and Visual Studio can be cumbersome, requiring you to build up a lot of plumbing yourself. In those cases a workflow engine like K2 BlackPearl should be used. The following article lays out workflow options and considerations for the various tools.

3.3) Extranet Topology
It is worth discussing what an extranet topology may look like, as many clients have requirements to support access to the MOSS environment from the outside. With SharePoint 2003, setting up external access was not a trivial, natural thing; this is not the case with MOSS 2007. Regardless of whether the access is external using Active Directory (AD) credentials over SSL, Forms Based Authentication (FBA) with SQL Server, ADUM, etc., all that is being set up is a new Zone which has a specific authentication protocol associated with it. Users can all have access to the same content, and SharePoint security is used to authorize (security trim) content within MOSS.

Design extranet farm topology (Office SharePoint Server) is an article written by Microsoft that lays out the details of the various configurations for an extranet. There are various perimeter and publishing models. The following diagram from Design extranet farm topology (Office SharePoint Server) is a very common configuration.


In this model, placing the application servers on the corporate network can be beneficial, as efficiencies are gained when building the search index because the server doing the indexing is closer to the data sources it is querying. Plan Security Hardening for Extranet Environments provides detailed insight into the configuration of all the ports between the DMZ and the Corporate Network.

3.4) Topology Performance Testing Results
In an effort to provide some easy performance numbers, Microsoft completed tests of the various topologies. If an environment requires the ability to support more users, it can always be scaled by adding more WFEs or further segregating the application server services onto their own dedicated machines. Here are some high level statistics summarized in Plan for performance and capacity (Office SharePoint Server). Please note that the hardware configurations may not be identical and there are many factors that could affect these statistics.

It is interesting to note that the study by Microsoft found that after four web servers had been added for a single database server there were diminishing returns. Research on a case-by-case basis would need to be completed to gain a full understanding of how many users will be using the server concurrently and in what capacity.


4) Logical Architecture
Finally, the logical architecture is probably one of the more complex topics that needs to be discussed. The following article, Logical architecture model: Corporate Deployment, is the most comprehensive in this regard. It has detailed information on how an enterprise logical deployment should be configured. The following diagram was developed from that article and appears in the following blog posting (Investing in Logical Architecture Design Samples).


Some important things to walk away with are:

  • MySites are partitioned from team sites into their own Site Collection. This is extremely important and highly recommended, and should be done from the beginning, as it is difficult to move them later (Design My Sites Architecture).
  • Internet and Intranet\Team Sites are likewise partitioned into their own Site Collections, as collaborative environments tend to consume more resources than publishing sites. The logical architecture even goes as far as partitioning the Intranet and the Team Sites from each other. Site Collections can have their own dedicated Content Databases, and they can easily be moved to dedicated WFEs if there are future performance problems. Even if the entire implementation were physically installed on one box but logically separated like this, the solution will have much more ability to scale.
  • Dedicated application servers (Shared Services Providers) should be set up to support the various different types of sites. For example the external web site would use a different SSP than the intranet for security reasons.
  • Various Application Pools are recommended to achieve better isolation for security, performance, redundancy, etc. Zones are then used to logically control the authentication protocol.

5) Tools for Understanding the Topology
The next part of this article will discuss the various tools that can be used to help determine what the architecture should be (System Center Capacity Planner 2007) and how it should be monitored after it has gone into production (MOSS 2007 Management Pack for MOM 2005).

Sunday, February 10, 2008

Content Type Feature with Document Template

July 25, 2008 - cleaned up code snippets...

May 5, 2009 - added ContentTypes.xml which was missing

I was presented with the challenge of figuring out how to deploy document content types and then attach them to document libraries. This should have been simple, given that I have deployed content types as features, created custom list and site definitions, etc. However, I had only worked with item types, and introducing a document template was more complex and not well documented.

In this scenario I had a requirement to create a new site definition to which I needed to attach the document library content types. I knew there were several different approaches and I wanted to investigate each. The goal was to create content types that I would later associate K2 BlackPearl processes with.

I also believe that I should be able to do everything through CAML. I take the same approach with SharePoint as I do with a database: I treat the CAML as a database schema. In my opinion, a best practice is that if you cannot script out the deployment of any customization you make to SharePoint and make it repeatable, you will have long-term issues moving solutions from environment to environment. Knowing that, I wanted to make sure I could do everything through Features and Site Definitions.

Creating the Content Types

Creating Content Types through a feature is not terribly difficult and is pretty well documented. If you go to C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE\FEATURES\ctypes\ctypeswss.xml you will find all of the out-of-the-box Content Types that come with SharePoint 2007. All you need to do is take an ID such as Document ("0x0101"), concatenate "00", then generate a new GUID and concatenate it to the end, resulting in something like 0x010100B334687E3D2F41f98A38B2FAA0B99088005CC82C39718847d8BEAC042213B66BEF.
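The ID scheme described above can be sketched in a few lines of Python (the helper name is mine, not part of any SharePoint API):

```python
import uuid

def child_content_type_id(parent_id: str) -> str:
    """Build a SharePoint content type ID that inherits from parent_id
    by appending "00" plus a fresh GUID (32 hex characters, no dashes)."""
    return parent_id + "00" + uuid.uuid4().hex.upper()

# Derive an ID that inherits from the out-of-the-box Document type (0x0101).
print(child_content_type_id("0x0101"))
```

Each additional level of inheritance repeats the same "00" + GUID concatenation, which is how the longer IDs in the ContentTypes.xml are built up.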

Simple enough, but now I needed to deploy a document template with the content types. The information on doing this was pretty sparse; I did find the following blog entry, which seemed to be the only really viable thing out there. I found some information about managing publishing site templates, but that did not apply.

This blog entry explained how to deploy a content type with a document template. You would then have the ability to go to a document library through the SharePoint UI, add it as a content type, and it would work. However, it would not work if you associated the content type with a custom document library definition and then used that definition in a custom site definition. I spent hours messing around with this: reading the MSDN documentation, creating templates, opening up the manifest in the .stp, and even using the SharePoint Site Generator tool. Nothing helped, but I finally got it to work. In the end I found this blog which pointed me in the right direction.

First, here is the feature. Note the ElementFile entries, which reference the documents that will be used as the document templates.

<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
Id="07E36FBD-08F6-4370-B1E0-42194CA76BA0"
Title="RDA Opportunity Content Types"
Description="These are the content types that will be used by K2 BlackPearl for Opportunity Workflows."
Version="1.0.0.0"
Scope="Site">
<ElementManifests>
<ElementManifest Location="Columns.xml"/>
<ElementManifest Location="ContentTypes.xml"/>
<ElementFile Location="OpportunityQual.docx"/>
<ElementFile Location="StatementOfWork.docx"/>
</ElementManifests>
</Feature>

Next are the contents of Columns.xml, again pretty standard stuff.


<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
<Field ID="{3370B9EC-BC03-42bd-9A1E-B87D3FEF5C7E}"
Name="CompanyName"
Group="RDA Workflow Opportunity Columns"
Type="Text"
DisplayName="Company Name"
StaticName="CompanyName"
ReadOnly="FALSE"
Hidden="FALSE"
Required="FALSE"
/>
<Field ID="{0CB803C5-FFA9-40b8-8BC3-3D2A7BC71C93}"
Name="CompanyID"
Group="RDA Workflow Opportunity Columns"
Type="Text"
DisplayName="Company ID"
StaticName="CompanyID"
ReadOnly="FALSE"
Hidden="FALSE"
Required="FALSE"
/>
<Field ID="{6AE2C1E0-2F0E-4adc-857E-8EF8BD720A6D}"
Name="OpportunityName"
Group="RDA Workflow Opportunity Columns"
Type="Text"
DisplayName="Opportunity Name"
StaticName="OpportunityName"
ReadOnly="FALSE"
Hidden="FALSE"
Required="FALSE"
/>
<Field ID="{6268BDA9-87DE-4997-B902-77C75F868BB9}"
Name="OpportunityID"
Group="RDA Workflow Opportunity Columns"
Type="Text"
DisplayName="Opportunity ID"
StaticName="OpportunityID"
ReadOnly="FALSE"
Hidden="FALSE"
Required="FALSE"
/>
</Elements>

May 5, 2009

The following is the ContentTypes.xml.

<?xml version="1.0" encoding="utf-8" ?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
<ContentType ID="0x010100B334687E3D2F41f98A38B2FAA0B99088"
Name="RDA Opportunity Document"
Group="RDA Workflow Opportunity Types"
Description="Content Types to be used Opportunity Workflow"
Version="0">
<FieldRefs>
<FieldRef ID="{3370B9EC-BC03-42bd-9A1E-B87D3FEF5C7E}" Name="CompanyName"/>
<FieldRef ID="{0CB803C5-FFA9-40b8-8BC3-3D2A7BC71C93}" Name="CompanyID"/>
<FieldRef ID="{6AE2C1E0-2F0E-4adc-857E-8EF8BD720A6D}" Name="OpportunityName"/>
<FieldRef ID="{6268BDA9-87DE-4997-B902-77C75F868BB9}" Name="OpportunityID"/>
</FieldRefs>
</ContentType>
<ContentType ID="0x010100B334687E3D2F41f98A38B2FAA0B99088005CC82C39718847d8BEAC042213B66BEF"
Name="Opportunity Qual"
Group="RDA Workflow Opportunity Types"
Version="0">
<DocumentTemplate TargetName="OpportunityQual.docx" />
<FieldRefs />
</ContentType>
<Module Name="Opportunity Qual"
SetupPath="Features\RDA.WSS.Opportunity.ContentTypes"
Url="_cts/Opportunity Qual"
Path=""
RootWebOnly="TRUE">
<File Url="OpportunityQual.docx" />
</Module>
<ContentType ID="0x010100B334687E3D2F41f98A38B2FAA0B990880015019A49E1C8471c952F6209251DC30E"
Name="Statement of Work"
Group="RDA Workflow Opportunity Types"
Version="0">
<DocumentTemplate TargetName="StatementOfWork.docx" />
<FieldRefs />
</ContentType>
<Module Name="Statement of Work"
SetupPath="Features\RDA.WSS.Opportunity.ContentTypes"
Url="_cts/Statement of Work"
Path=""
RootWebOnly="TRUE">
<File Url="StatementOfWork.docx" />
</Module>
</Elements>

So here are some notes and little pitfalls I ran into along the way:

  • You must enter a <FieldRefs /> node even if there are no fields added to the content type. Without it, when you open the content type from the document library, none of the fields appear in MS Word.
  • For the DocumentTemplate TargetName, only put in the name of the file. All of the blogs out there have you enter a full path to the template. Once I took that out, everything started to work. I lost hours on this point and would not have guessed it was the solution given the error I was getting. What would happen is that if the content type was associated with a custom document library definition, then after the site was provisioned, when you clicked to open the document and save it, the content type of the document would get lost and the document would revert to the Document type.

Create a New Document Library

Now the next step is figuring out how to deploy the content type with a site definition. The simplest and best-known practice is to create your own document library definition. To do this, go to the C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE\FEATURES\DocumentLibrary folder and copy all of the files into your own folder. I modified the ID to be a new GUID and made some other small changes.




<Feature Id="FDE23094-968A-4184-9EBC-1E9830C1E7CD"
Title="RDA Opportunity Document Library"
Description="Document library with the opportunity content types attached."
Version="1.0.0.0"
Scope="Site"
Hidden="FALSE"
DefaultResourceFile="core"
xmlns="http://schemas.microsoft.com/sharepoint/">
<ElementManifests>
<ElementManifest Location="ListTemplates\DocumentLibrary.xml" />
</ElementManifests>
</Feature>

Then I modified DocumentLibrary.xml, changing the Name and Type attributes. Note that when I changed the Name attribute I also had to change the folder name "DocLib" to "OpportunityDocLib".




<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
<ListTemplate
Name="OpportunityDocLib"
Type="4001"
BaseType="1"
OnQuickLaunch="TRUE"
SecurityBits="11"
Sequence="110"
DisplayName="RDA Opportunity Document Library"
Description="RDA Opportunity Document Library"
Image="/_layouts/images/itdl.gif"
DocumentTemplate="101"/>
</Elements>

Then I made the following modifications in the Schema.xml file. First, I added EnableContentTypes and AllowMultipleContentTypes.




<List xmlns="http://schemas.microsoft.com/sharepoint/"
Title="Opportunity Docs"
Direction="$Resources:Direction;"
Url="OpportunityDocs"
BaseType="1"
EnableContentTypes="TRUE"
AllowMultipleContentTypes="TRUE">
<MetaData>
<ContentTypes>
<ContentTypeRef ID="0x010100B334687E3D2F41f98A38B2FAA0B99088005CC82C39718847d8BEAC042213B66BEF">
<Folder TargetName="Forms/Opportunity Qual" />
</ContentTypeRef>
<ContentTypeRef ID="0x010100B334687E3D2F41f98A38B2FAA0B990880015019A49E1C8471c952F6209251DC30E">
<Folder TargetName="Forms/Statement of Work" />
</ContentTypeRef>
</ContentTypes>
...

Some pitfalls I ran into:

  • The AllowMultipleContentTypes attribute does not seem to be part of the WSS XSD schema and will not show up in IntelliSense.
  • Note there are no spaces in the Url attribute of the List. With a space, the library would work if provisioned through the UI, but provisioning through the Site Definition would fail. Once I took the space out, everything worked.

Second, to make the custom columns visible you need to do the following. First, add the columns to the Fields node. Forgetting this is a well-known mistake, and it is a rather perplexing thing you must do.




<Field ID="{3370B9EC-BC03-42bd-9A1E-B87D3FEF5C7E}"
Name="CompanyName"
DisplayName="Company Name"
Type="Text"
SourceID="http://schemas.microsoft.com/sharepoint/v3"
StaticName="CompanyName"/>
<Field ID="{0CB803C5-FFA9-40b8-8BC3-3D2A7BC71C93}"
Name="CompanyID"
DisplayName="Company ID"
Type="Text"
SourceID="http://schemas.microsoft.com/sharepoint/v3"
StaticName="CompanyID"/>
<Field ID="{6AE2C1E0-2F0E-4adc-857E-8EF8BD720A6D}"
Name="OpportunityName"
DisplayName="Opportunity Name"
Type="Text"
SourceID="http://schemas.microsoft.com/sharepoint/v3"
StaticName="OpportunityName"/>
<Field ID="{6268BDA9-87DE-4997-B902-77C75F868BB9}"
Name="OpportunityID"
DisplayName="Opportunity ID"
Type="Text"
SourceID="http://schemas.microsoft.com/sharepoint/v3"
StaticName="OpportunityID"/>



Next you need to dig down in the Views Node and find the appropriate ViewFields node and add in the custom columns. This will make them visible in the document library view within the SharePoint UI.




<ViewFields>
<FieldRef Name="DocIcon">
</FieldRef>
<FieldRef Name="LinkFilename">
</FieldRef>
<FieldRef Name="Modified">
</FieldRef>
<FieldRef Name="Editor">
</FieldRef>
<FieldRef Name="CompanyName">
</FieldRef>
<FieldRef Name="CompanyID">
</FieldRef>
<FieldRef Name="OpportunityName">
</FieldRef>
<FieldRef Name="OpportunityID">
</FieldRef>
</ViewFields>

Create Custom Site Definition

The next thing to do is create a custom site definition. Again, this is pretty well documented. You should never modify the Microsoft site definitions; instead, create your own. I will outline the steps you must take, but this is well documented:

  • Go to C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE\1033\XML and create a new WEBTEMP_XXX.XML file for your site definition.
  • Go to C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE\SiteTemplates and select an existing template to start from. Copy all of its contents into your new folder. Among other things, make sure the name of the new folder corresponds with the template name defined in the previous step.

I will stop here and suggest you read up on it. Many books out there show how to do this as well.
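As a minimal sketch of the first step (the names and the ID here are illustrative, not from a real deployment), a WEBTEMP_XXX.XML entry for a site definition folder named OPPORTUNITY might look like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- The Template Name must match the folder name created under SiteTemplates;
     custom template IDs should be 10000 or higher to avoid collisions. -->
<Templates xmlns:ows="Microsoft SharePoint">
  <Template Name="OPPORTUNITY" ID="10001">
    <Configuration ID="0"
      Title="Opportunity Site"
      Description="Site definition with the opportunity document library."
      Hidden="FALSE"
      ImageUrl="/_layouts/images/stsprev.png"
      DisplayCategory="Custom" />
  </Template>
</Templates>
```

The TemplateName value "OPPORTUNITY#0" used in a FeatureSiteTemplateAssociation is the Template Name plus the Configuration ID.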

I did make the following modifications to my custom template to show the new document library that I created.

First, I added this under the appropriate Configuration node.




<List FeatureId="FDE23094-968A-4184-9EBC-1E9830C1E7CD"
Type="4001"
Title="Opportunity Documents"
Url="OpportunityDocuments"
QuickLaunchUrl="OpportunityDocuments/Forms/AllItems.aspx"
EnableContentTypes="TRUE" />

Pitfall - EnableContentTypes needs to be set to TRUE here even though the list definition already has it set to TRUE. I ran into issues where, when the document library was created with the site definition, the content type would again be wrong when I saved; it would only select the first content type by default. Setting it to TRUE here as well resolved the problem.

Second, I added the following in the corresponding Module node for the Configuration node.




<View List="OpportunityDocuments"
BaseViewID="6"
WebPartZoneID="Left"
WebPartOrder="2" />

Then I was done. Everything worked, so now when a new site is provisioned it has an opportunity document library with the custom content types (with standard document templates) ready to go.

Other things I Tried Worth Noting

The following are things that I tried which did not completely work. The solution above was what I wanted to do, but I was just getting weird errors with non-obvious solutions. Since everyone learns from the things they did wrong, I figured I would post the things I tried.

List Instance Feature

Instead of trying to create a custom document library definition with custom content types, I decided to create a feature using a ListInstance. I would then activate the feature in my custom site definition. The following is the feature.xml.




<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
Id="3D4BDB9A-EB6A-40ec-B0C9-367C9EB5D44C"
Title="RDA Opportunity Library Instance"
Description="Instance of a doc lib with opportunity content types."
Version="1.0.0.0"
Scope="Web">
<ElementManifests>
<ElementManifest Location="LibraryInstance.xml"/>
</ElementManifests>
</Feature>

The following is the LibraryInstance.xml, which inherits from the out-of-the-box document library; note the FeatureId that is used.




<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
<ListInstance Id="400005"
FeatureId="00BFEA71-E717-4E80-AA17-D0C71B360101"
Description="Opportunity Documents"
TemplateType="101"
Title="Opportunity Documents"
OnQuickLaunch="TRUE"
QuickLaunchUrl="OpportunityDocuments/Forms/AllItems.aspx"
Url="OpportunityDocuments">
<Data/>
</ListInstance>
<ContentTypeBinding ListUrl="OpportunityDocuments"
ContentTypeId="0x010100B334687E3D2F41f98A38B2FAA0B99088005CC82C39718847d8BEAC042213B66BEF"/>
<ContentTypeBinding ListUrl="OpportunityDocuments"
ContentTypeId="0x010100B334687E3D2F41f98A38B2FAA0B990880015019A49E1C8471c952F6209251DC30E"/>
</Elements>

I would then add the following into the WebFeatures of the Onet.xml of the custom site definition.




<!--List instance feature-->
<Feature ID="3D4BDB9A-EB6A-40ec-B0C9-367C9EB5D44C" />

Notes:

  • First, I actually tried putting the ContentTypeBinding into its own feature, but I kept getting a File Not Found error every time.
  • I also would get File Not Found errors if there was a space in the URL.
  • This feature worked fine until I corrected the way I associated a content type with a document template through a feature. If I deployed a content type feature without a document template and then added the document template manually through Site Settings > Content Type Gallery, it would work.

Long term, this was not the solution I wanted.

  • First, adding the document library as a feature in this manner would not allow me to add it to the Module View in the onet.xml of the custom site definition. I wanted users to see the document library on the top page of the newly provisioned site, as there would not be a lot of documents in the library.
  • Second, I really wanted to keep the custom document library definition, as this would be consistent with the way other content types are made available. For instance, the tasks, links, documents, issues, etc. content types each have their own list definition.

Content Binding and Site Stapling Features

Another solution I tried was feature stapling. This is a very important new capability in WSS 3.0 as it allows you to create features which will be provisioned to existing site definitions without requiring you to create a new site definition.

In my situation this did not make a whole lot of sense, as I needed to associate these content types with a specific site definition; however, I figured I would give it a try.

I first created the following feature to bind the document content types to a document library.




<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
Id="7BC32E74-6CA9-4775-9C19-400DF718E3E2"
Title="RDA Opportunity Bind Document Library"
Description="This feature will staple opportunity documents to a site."
Version="1.0.0.0"
Scope="Web">
<ElementManifests>
<ElementManifest Location="BindDocs.xml"/>
</ElementManifests>
</Feature>

This is the BindDocs.xml for the feature. The ListUrl needs to correspond to the name of the list the content type is being associated with. Note below that I have OpportunityDocuments as the ListUrl. There is no space, and there is an assumption that there is already a document library on the site with a URL of OpportunityDocuments. This is a poor assumption and should not be made, but in my case I had already added a document library with this name to my custom site definition. If this were a more general solution, you would put something like Shared Documents as the ListUrl.




<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
<ContentTypeBinding ListUrl="OpportunityDocuments"
ContentTypeId="0x010100B334687E3D2F41f98A38B2FAA0B99088005CC82C39718847d8BEAC042213B66BEF"/>
<ContentTypeBinding ListUrl="OpportunityDocuments"
ContentTypeId="0x010100B334687E3D2F41f98A38B2FAA0B990880015019A49E1C8471c952F6209251DC30E"/>
</Elements>

Then I created the following feature to activate the ContentTypeBinding feature when my custom site definition was provisioned.




<Feature Id="998ACA04-668C-4491-94B1-DFFB83F68665"
Title="RDA Opportunity Site Stapling"
Description="Associates features to the RDA Opportunity."
Version="1.0.0.0"
Scope="Site"
xmlns="http://schemas.microsoft.com/sharepoint/">
<ElementManifests>
<ElementManifest Location="Elements.xml" />
</ElementManifests>
</Feature>

Here is a good blog on FeatureSiteTemplateAssociation. As you can see, I associate the ContentTypeBinding feature with the custom site definition I have, called Opportunity.




<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
<FeatureSiteTemplateAssociation
Id="7BC32E74-6CA9-4775-9C19-400DF718E3E2"
TemplateName="OPPORTUNITY#0" />
</Elements>

I have used FeatureSiteTemplateAssociation successfully in the past to associate a CustomAction feature with a site definition, but in this case I got never-ending File Not Found errors. I would go into the SharePoint logs and could not find any information that would solve the problem. I Googled forever and could not find anything either. I found this particularly frustrating, as I could make a good case that someday I will have a use case where I need to associate content types with existing site definitions, but for right now this would not work for me. Regardless, the real solution I wanted was my own document library definition with my custom content types, which I was finally able to achieve.