Saturday, December 26, 2009

Call WCF from K2 blackpearl event

Introduction

I recently had the task of integrating some enterprise WCF services with a K2 blackpearl process for a company. If you need an introduction to WCF, please read this blog I wrote on how to write a WCF service.

There was not much material on K2 Underground, so I figured I would write a short blog on how to do it. Thank you to Bob Maggio at K2 for providing some pointers.

There are really two ways you can do this.

  1. Call WCF endpoint through server side event handler in K2.
  2. Use the Dynamic WCF SmartObject Service.

Server Side Code Event

Calling a WCF endpoint from a server side code event is pretty simple, but there are a few things you should know. First, you may be used to right clicking on references in a class library project and selecting "Add Service Reference". What this does is use the Svcutil tool to generate a client service class for you to call. That equivalent is not available in a K2 blackpearl project. You can try to add a reference by going to the K2 blackpearl process properties and then selecting references. In there you have the ability to add a "web reference", but that is for a web service. Even if the WCF service is deployed in IIS, you cannot use that methodology, as it will give you errors. Just try doing it in a command line application outside of K2 and you will see what I mean.

The only way to call the WCF service in a clean manner is to create a class library project and call it from your server side code event. The steps are:

  1. Add a class library project.
  2. In the K2 process properties, add a reference to that new project.
  3. Add the client service interface to that project and create a helper class that calls the WCF endpoints. Follow my instructions that I wrote about in this blog.
  4. In the K2 process server side event, add code to call your helper class.
  5. You will need to take the WCF client endpoint configuration (discussed in this blog) and add it to the C:\Program Files\K2 blackpearl\Host Server\Bin\K2HostServer.config (see the sketch after this list).
  6. You will need to restart the K2 service for the change to take effect.
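
To make step five concrete, here is a minimal sketch of the client section that would go inside system.serviceModel in K2HostServer.config. The address, binding and contract here are placeholders based on the example service later in this post, so substitute your own values.

<system.serviceModel>
  <client>
    <!-- Placeholder address and contract; replace with your service's values -->
    <endpoint address="http://yourserver/RoleService.svc"
              binding="wsHttpBinding"
              contract="K2Distillery.WCF.IRoleService"
              name="WSHttpBinding_IRoleService" />
  </client>
</system.serviceModel>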

Notes

  • In step five I discuss adding the WCF client endpoint configuration to the K2HostServer.config. This configuration is global to all K2 processes running on that server.
  • To test your helper class outside of K2, just create a command line project that references the helper project. Place the client side endpoint configuration in the app.config of that project.

Dynamic WCF SmartObject Service

In my solution I elected not to use this even though it is a pretty good approach. Why? We really were not using SmartObjects as part of the overall architecture. First, I really did not want to have to pass through the SmartObject layer just to make a WCF call. Second, SmartObjects are great for doing visual development inside your K2 blackpearl processes; however, in this solution we were already doing lots of server side code events, so writing code into the process was no big deal and really quick to do anyway.

If we had decided to use the Dynamic WCF SmartObject Service, I would have liked the deployment better than the first option we went over. The configuration of the WCF service endpoints would have been in the SmartObject service on each K2 server, rather than having to modify the K2 service config file and restart the K2 server every time there is a change.

Considerations

The one consideration I always mention is that K2 processes are version controlled. That means your WCF services MUST be version controlled too. What can happen is:

  • You deploy process version 1.
  • You create 10 process instances on version 1.
  • You make a change to the process and the WCF interface and deploy version 2 of the K2 process.
  • BOOM, process instances on version 1 start bombing because the WCF interface changed.

So if you were to:

  • Remove a WCF service method.
  • Change the parameters, data or contract of an existing WCF service method.
  • Drastically change the internals of an existing WCF service method.

There is a high chance that you will run into this issue, so you need to account for it when making changes to WCF services. This is really nothing new; you will run into the same issue when calling external databases or web services. If you are practicing good SOA methodologies you will be able to handle service versioning.

My WCF Introduction

1.0 Simple WCF Introduction

Windows Communication Foundation (WCF) was released as part of .NET 3.0 and has been out for some time now. It has been heavily adopted up to this point and will be around for a long time. What WCF did was unify SOAP based web service messaging, .NET Remoting, asynchronous communications and distributed transactions into a single service oriented framework for Microsoft. It also provides a nice pluggable environment such that messages can be sent over multiple communication protocols (HTTP, TCP, MSMQ, etc.).

This description probably does not do WCF justice, but this is what I think of when I think about WCF (I also have some opinions below).

2.0 Your First WCF Service

I needed to quickly learn how to build a WCF service and I wanted to host it through IIS. I found two articles that showed me how to do this and decided to consolidate them into one blog posting. Here are the quick steps to set up that first simple WCF service.

2.1 Create the Project

You can use the .NET project templates, but it is really simple to create this using a class library. Sometimes it is just good to know how it works.

  • Create a Class Library project; in my example it is called K2Distillery.WCF.
  • Add folders called App_Code and Bin to the K2Distillery.WCF project.
  • Add a Web.config file to the K2Distillery.WCF project.
  • Add references to System.Runtime.Serialization and System.ServiceModel.
  • Here is where I deviate from most articles that introduce WCF.
  • Add another Class Library project that will contain the actual code for the service; in my case I called it K2Distillery.Service.
  • Add a reference to System.Runtime.Serialization to the K2Distillery.Service project.
  • In the K2Distillery.WCF project add a reference to the K2Distillery.Service project.

I do this because I personally believe that WCF should be a pass-through to facilitate distributed computing, interoperability, etc. You should build your layered service architecture as separate DLLs. If you do not, your service methods will only be available through WCF, and you will want to expose those services through other means in the future.

2.2 Building the WCF Service Classes

In this example, a good simple service that may be needed is one that gets users by a business role. This data could be stored in many places (AD, HR databases, etc.), and a nice generic method that returns a complete employee profile is very useful for all applications across the enterprise.

In the K2Distillery.Service project, I will add a method called GetEmployeesByRole(string roleName) to the RoleService.cs class. I will also create an Employee class and decorate it with attributes that make its properties serializable.

Here is the employee class:

[DataContract]
public class Employee
{
    string _employeeNumber;
    string _firstName;
    string _lastName;
    string _email;
    string _activeDirectoryID;

    public Employee(string employeeNumber, string firstName, string lastName, string email, string activeDirectoryID)
    {
        EmployeeNumber = employeeNumber;
        FirstName = firstName;
        LastName = lastName;
        Email = email;
        ActiveDirectoryID = activeDirectoryID;
    }

    [DataMember]
    public string EmployeeNumber
    {
        get { return _employeeNumber; }
        set { _employeeNumber = value; }
    }

    [DataMember]
    public string FirstName
    {
        get { return _firstName; }
        set { _firstName = value; }
    }

    [DataMember]
    public string LastName
    {
        get { return _lastName; }
        set { _lastName = value; }
    }

    [DataMember]
    public string Email
    {
        get { return _email; }
        set { _email = value; }
    }

    [DataMember]
    public string ActiveDirectoryID
    {
        get { return _activeDirectoryID; }
        set { _activeDirectoryID = value; }
    }
}

Here is the service method:

public class RoleService
{
    public static List<Employee> GetEmployeesByRole(string roleName)
    {
        List<Employee> users = new List<Employee>();
        users.Add(new Employee("111", "Jason", "Apergis", "japergis@foo.com", "foo\\japergis"));
        users.Add(new Employee("222", "Ethan", "Apergis", "eapergis@foo.com", "foo\\eapergis"));

        return users;
    }
}

Then in the K2Distillery.WCF project I will add two files to the App_Code folder called IRoleService.cs and RoleService.cs. I am not going to go into the details of WCF (read the references I have provided), but you need to create an interface that is decorated with attributes and then implement that interface.

In IRoleService.cs I define the interface for the service being exposed through WCF.

[ServiceContract()]
public interface IRoleService
{
    [OperationContract]
    Employee[] GetEmployeesByRole(string roleName);
}

Then in RoleService.cs in the K2Distillery.WCF project I will add the following code to implement IRoleService. This basically just calls my service method.

public class RoleService : IRoleService
{
    public Employee[] GetEmployeesByRole(string roleName)
    {
        List<Employee> employees = K2Distillery.Service.RoleService.GetEmployeesByRole(roleName);
        return employees.ToArray();
    }
}

Notice the only thing the WCF method does is transform the results from List<Employee> to an Employee[]. This is required because .NET generic lists cannot go across platform boundaries and must be transformed into a simple array. SOA zealots could argue that my service layer should have returned the values in a generic fashion, but these days almost everything I work with is .NET and it has never really been a requirement.

2.3 Preparing Service Configuration

The next thing you will need to do is create the configuration files for the service.

First thing you need to create is a .svc file which can be placed in the root of the K2Distillery.WCF project. The service attribute just points to the fully qualified name of the service class.

<%@ ServiceHost Language="C#" Debug="true" Service="K2Distillery.WCF.RoleService" %>

In my example we are going to deploy this WCF service through IIS, so we will need to add a web.config file to the K2Distillery.WCF project. Below is the web.config file that is needed for this service.

<?xml version="1.0"?>
<configuration>

  <appSettings />

  <system.serviceModel>
    <behaviors>
      <serviceBehaviors>
        <behavior name="mexBehavior">
          <serviceMetadata httpGetEnabled="true" />
        </behavior>
      </serviceBehaviors>
    </behaviors>

    <services>
      <service name="K2Distillery.WCF.RoleService"
               behaviorConfiguration="mexBehavior">
        <endpoint address=""
                  binding="wsHttpBinding"
                  contract="K2Distillery.WCF.IRoleService" />
        <endpoint address="mex"
                  binding="mexHttpBinding"
                  contract="IMetadataExchange" />
      </service>
    </services>
  </system.serviceModel>

</configuration>

2.4 Final Solution

The final solution should look like the following:

[Screenshot: final solution structure in Visual Studio]

2.5 Deployment

There are multiple ways to deploy the service to IIS. The two I will call your attention to are:

  1. Deploy the service as DLL into the bin directory or GAC.
  2. Deploy the .cs file in the App_Code directory into IIS and have the file dynamically compiled.

In my opinion the best practice is to deploy it as a DLL in the bin directory of a web site. The reasoning is similar to the security issues we run into with SharePoint.

To deploy this service:

  1. Create a service account to run the service under.
  2. Create an application pool in IIS to use that service account.
  3. Create a new virtual directory in IIS to host these files.
  4. Place the web.config and RoleService.svc files in the root folder of the new virtual directory.
  5. Add a bin folder and place the K2Distillery.WCF.dll and K2Distillery.Service.dll in it.

I would recommend creating a bat file that will automate this deployment for you, but that is all you need to do.

To quickly test if everything worked, just open IE and navigate to the svc file (http://XXX/RoleService.svc?wsdl) and see if there are any errors.

2.6 Generating Client Classes

To allow a client to access the service there are two simple ways of doing this:

  1. Use Visual Studio, right click the project references, select Add Service reference and provide the url to the RoleService.svc.
  2. Use the Svcutil tool, which gives you more granular control over the client interface creation. For detailed information read this: ServiceModel Metadata Utility Tool (Svcutil.exe)

I personally like using Svcutil because it gives more control, and it will also generate the app configuration you will need for the client application. You will need to add the generated config settings to the client application's web.config or app.config file.

In this case the command would be:

cd C:\Program Files\Microsoft SDKs\Windows\v6.0\Bin

svcutil.exe http://XXX/RoleService.svc?wsdl
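
If you want control over the generated file names, Svcutil supports output switches; a variant like the following (the file names here are just examples) writes the proxy class and client configuration to specific files:

svcutil.exe http://XXX/RoleService.svc?wsdl /out:RoleServiceClient.cs /config:app.config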

2.7 Calling Service from Client

In this example I am going to create a simple console application to call out to the service.

  • You will need to add the generated class from the previous step.
  • You will need to create an app.config file and add the generated configurations from the previous step.

Here is an example of the code calling the generated client class.

try
{
    using (RoleServiceClient client = new RoleServiceClient())
    {
        Employee[] employees = client.GetEmployeesByRole("");

        foreach (Employee employee in employees)
        {
            Console.WriteLine(employee.EmployeeNumber);
        }
    }
}
catch (Exception ex)
{
    throw new Exception("Error Calling WCF GetEmployeesByRole: " + ex.Message);
}

You must always make sure that you properly dispose of your service client.
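
One caveat worth knowing: Dispose() on a WCF client calls Close(), which can itself throw if the channel has faulted, masking the original exception inside the using block. A minimal sketch of the safer close/abort pattern, using the same generated client, looks like this:

RoleServiceClient client = new RoleServiceClient();
try
{
    Employee[] employees = client.GetEmployeesByRole("");
    client.Close();
}
catch (CommunicationException)
{
    // The channel is faulted and unusable; Abort() releases it without throwing
    client.Abort();
    throw;
}
catch (TimeoutException)
{
    client.Abort();
    throw;
}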

That is how easy it is to get started on creating some simple WCF services.

4.0 WCF Final Thoughts

On MSDN, Microsoft states that “Windows Communication Foundation (WCF) is designed to offer a manageable approach to distributed computing, broad interoperability, and direct support for service orientation”. I have had several healthy discussions with some smart colleagues of mine whom I trust. Some say I take it too literally, but I base my opinion on the experiences I have had. I have a couple of issues with using WCF because I have seen many well intentioned developers and companies incorrectly architect SOA and web services in general over the past couple of years. In my opinion there is no need for WCF unless you have:

  • Interoperability requirements.
  • A need to support distributed transactions for the enterprise.
  • A need to provide common functionality to many systems.

Far too often:

  • People put in web services to create a physical boundary in the layered architecture to force decoupling. A good OO developer who knows Gang of Four patterns does not need to do that.
  • People create services to host utility classes which should be directly referenced.
  • People create DAL services that are primarily for a single application.

As a developer I really like WCF because it is strongly typed and you have the ability to scale out your services using different communication protocols. However, when I put on my architect and manager hat I see a different story. I will challenge developers on the need for WCF. Very often in the environments that I work in, SOA is not needed. A couple of years ago developers thought web services were the hottest thing out there and baked them into their applications. The result was performance issues and code-coupling maintenance nightmares.

Why do SOA projects fail? Governance. Typically when you build an application there are business owners, technical management, assigned development teams and maintenance personnel. We do not see that level of support with services because services are too far behind the scenes. What typically happens is a service is implemented and used in one or two applications. Then new business requirements are received which require application specific modifications. The service is then modified for the specific application, and things just start becoming harder to manage. My point is we are usually in application management mode and not service management mode. Successful SOA implementations are applications unto themselves. They are developed from the beginning without much knowledge of the consumers, and the consumers must call them based on the public interface contract.

Services work great for exposing functionality from ERP systems like SAP and PeopleSoft, or for exposing enterprise functionality for SharePoint (search, content management, etc.). But nine times out of ten they are not needed for an internal custom database serving a single custom application.

My conclusion is that good design will push you to create decoupled application layers (UI, business, persistence, data, etc.). WCF should just wrap these layers and expose functionality; nothing more. You should write good methods that expose enterprise data and functionality as an “endpoint” as needed. Please know your requirements and make sure you ask tough questions. Just because you can create a WCF service does not mean you should.


Monday, November 2, 2009

What is a FAST Enterprise Search Project Part 2


Introduction

In my previous blog (Why is FAST Enterprise Search Important) I discussed why an Enterprise Search project is important. In this blog posting I will discuss what is needed for a successful Enterprise Search project. This should hopefully give you enough information to anticipate what will be needed in an Enterprise Search project.

What is an Enterprise Search Project?

A few years ago I had to make the transition from a custom application developer to an application server consultant with Microsoft products. Project plans for implementing SharePoint, K2 or BizTalk were really not much different, other than having several new tasks associated with the configuration, integration, sustainability and maintenance of the new application server. Still, with application server projects you have lots of custom artifacts and components that have to be developed. This too is the case with FAST.

When posed the question of what an Enterprise Search project is, I at first did not know where to start. I wanted to draw from my past experience. I also knew that Enterprise Search projects can be complex, but I did not understand what a search project would entail.

Content Processing and Transformation

Enterprise Search within an organization has many complexities. First, we have to be able to index content wherever it may be (in a custom database, a 3rd party enterprise application server, a file share, a mainframe, etc.). Custom code may have to be written to facilitate bringing this content over to FAST so that it can be indexed. Knowing this, a comprehensive analysis must be completed to understand all the content/data that is spread across the organization. A common mistake is that a company may index bad data, and they get the old "garbage in; garbage out" issue. There must be plans for indexing both good and bad data, formatting unstructured data, making data relevant, normalizing data (removing duplicates), etc. We will need to understand the entire life-cycle of that data and how it can be effectively pulled or pushed into the FAST Search index. This is very similar to a data warehouse project, however the context is a little different.

An Enterprise Search project is also very similar to a complex ETL project because you will have to create several transformation processes/workflows. The processes must transform the content into a document that can be recognized by the FAST Index. FAST refers to anything in the index as a document, even if the index item comes from a database. A document for FAST is a unique piece of data with metadata which gives it relevancy. FAST provides several out of the box connectors that do this transformation, and they provide an API to write custom ones. In many cases you may have to build or extend connectors. Just as important as the ETL pre-processing, there are post-processing routines that must be executed before the search results are passed back to the user interface layer. Again, more relevancy rules or aggregation of search results may be incorporated here. I was happy to hear that the FAST team also draws comparisons to an ETL project when discussing what an Enterprise Search project is.

User Interface

Most Enterprise Search platforms like FAST do not have a traditional GUI; FAST is an Enterprise Search engine that can be plugged into new or existing platforms. FAST does provide several controls that can be integrated into any UI platform, but in many cases you will be extending them or building completely new controls. FAST provides a rich API that is accessible in such languages as .NET, Java and C++.

User Profile

An important element of the FAST Enterprise Search project is to understand the profile of the user performing the search. Things such as their current location, where they are within the organization, what sort of specialties they have, what types of past searches they have done, who they have worked for, and past or future projects, tasks or initiatives they have supported can all be used to give a more relevant search result. This requires integration with systems that can infer these relationships and pass this information along with the query to the FAST Query and Results server, which will return a relevant result.

Security

The profile is also important for incorporating security. FAST has numerous ways in which documents can be securely exposed to the end user. For instance, there is an Access Control List (ACL) which is part of the document instance in the search index. The ACL is populated during the indexing of content, and this may require customizations to set the ACL appropriately. As well, more customizations may be added to do real-time authorization to ensure that documents being returned from the index have not been removed from the user's visibility. Another consideration is to partition indexes based on boundaries such as internet, extranet and intranet. There are several more considerations that must be accounted for, so time must be allotted in the plan to ensure that content is managed properly.

Installation and Configuration

A major portion of the project plan needs to be devoted to the installation and configuration of the FAST server. There are several important things that need to be accounted for when doing this. For instance: how many queries will be executed concurrently, what are the peak usage scenarios, how much content will be indexed, what sort of complexities/exceptions are there in the indexing process, what is the anticipated growth, etc. All of this must be known to properly scale the FAST server and design the custom components.

Testing

With all of the custom transformation and GUI components to support the Enterprise Search implementation, there will need to be a focus on system integration testing, system application testing, and user acceptance testing. There will be specific tests for search to ensure that indexing, query performance and result relevancy are accurate and within acceptable ranges. This is nothing new, but we need to be sure that a proportionate amount of time is incorporated into the plan to ensure that a quality solution is put in place.

Sustainment and Governance

Sustainment needs to be part of the plan next, and it is commonly neglected. Too often the plan is focused on the short-term end result while the long-term management is not incorporated into the solution. What sort of organizational management changes are required to support and maintain the search implementation? What sort of configuration management business processes will need to be introduced to continually tune the index and relevancy model based on usage? What sort of new roles and responsibilities need to be incorporated into employee performance plans (from both a systems and business user perspective)? How is the enterprise taxonomy going to be maintained? What sort of key performance metrics and reporting are needed to consistently evaluate the success of the project? What is the process for incorporating change back into the solution (which is extremely important for Enterprise Search)? If questions like these are not incorporated into the early design of the project, there will be long-term challenges with the adoption and integration of the Enterprise Search investment.

Closing

As you can see the key to a successful Enterprise Search project is to understand the needs of the business and how the solution will be supported. Many of the tasks that were discussed are very standard; we just needed to put them in context.

Why is FAST Enterprise Search Important Part 1


Introduction

The first thing that many will ask before beginning a major Enterprise Search initiative with a product like FAST is: why is Enterprise Search important? Secondly, what is an Enterprise Search project? My approach is to understand these questions not from a sales perspective but from a technology management and consulting perspective.

Why is Enterprise Search important?

Users have to work with mass amounts of data that is stored either internally or externally. Search can mean lots of things to different industries, however the goal is simple: display the right information to the right person at the right time without distraction. At the same time we must have a flexible and configurable search platform that will surface the most relevant information to the business user from wherever it is stored.

Information workers have to search for and then utilize data. How do they do this? They typically have to log into an application and perform a search. Or when they enter an application, there may be some data contextually rolled up to them based upon who they are. There is a demand by business users to make search easier. We have heard many times: "how can I search my enterprise data in the same way I Google something on the internet?" Users want the ability to go to a single place, run a search query and receive results from across the entire enterprise. This is very different from performing a public internet search or a search function contained within the scope of a single application. Public internet searching has its own complexities, however it typically indexes content on websites. Enterprise Search becomes complex because the data being indexed can come in numerous formats (document file, database, mainframe, etc.). From the user perspective this complexity must be transparent. They must be given a single result set that will allow them to research a problem, complete a task or even initiate a business process.

Organizations are challenged with providing comprehensive search solutions that can access content no matter where the data resides. Public search engines have also created demand for highly relevant search experiences. Relevancy is the key to success for a search solution. To have accurate relevancy it is important to know as much as we can about the user entering the query. Profile relevancy can be determined by a number of things. For example: where the person is located, what their job function is, and what past searches they or their colleagues have done. Relevancy can also be determined by the attributes associated with a piece of content. For example: is the author considered trusted, is the content itself fresh, or is the content highly recommended by other users. The search platform must have an adaptive relevancy model. It must be able to change based on business demands and subsequently learn how to provide better results utilizing factors that are incorporated into the relevancy model. An Enterprise Search platform like FAST can provide this advanced capability.

The vision of going to a single place to find data is not really a new concept. We have seen a major push for data warehouses to create a single location to facilitate enterprise reporting. We have seen enterprise portals created which give users a single user interface that provides contextual data from disparate systems. We have seen SOA trying to consolidate business services, and now we are seeing cloud services gaining traction in the market. The reality is that enterprise architecture at large will be disparate. Companies have made significant investments into many technologies at one time or another, and consolidating them onto a single platform is not always realistic. This is why we are constantly trying to find new solutions to work with data in a uniform manner. This is an important justification for an Enterprise Search solution such as FAST.

To restate, the goal is to have an Enterprise Search platform that can create a single result set using disparate data from across the enterprise. Where a lot of organizations fall short is that they do not have the tools to navigate this data. Business users are required to have deep domain knowledge of the organization, the format of the data, and the business processes. The domain expert must know what is good or bad based upon experience, which is not transferrable, making continuity of operations challenging. This is yet another reason why an Enterprise Search platform provides significant value to an organization.

Here are some examples of how organizations have used Enterprise Search.

  • Several major ecommerce sites like Best Buy and Autotrader.com used FAST to better advertise to their customers, expose products to the customer significantly quicker, provide better navigation of search results and provide integration with OEM partners.
  • A business data brokerage firm was able to provide more relevant results, increase user satisfaction, provide data from multiple disparate locations, improve customer retention, create a collaborative data rating system and allow for communication between subject matter experts.
  • A community facilitator for the natural resource industry was able to create a B2B solution that provided dynamic drilling/navigation of industry data, created automated extraction policies to mine for important data, regionalized their search results, created a pay model for more high-end results, and improved their sales model by using relevancy.
  • A major computer manufacturer used FAST to improve economies of scale for support personnel. They significantly lowered call-center costs by directing users to search first, provided customers with more up to date support information and allowed their worldwide staff of engineers to use their native languages when performing a search.
  • A global law firm used FAST to create a knowledge management solution that allowed them to reduce research personnel and create a consolidated search experience. They significantly reduced the ramp-up time of new lawyers, greatly improved the relevancy of results with advanced content navigation, and provided better communication of best practices.
  • A law enforcement agency was able to allow investigators to electronically research mass amounts of data across the government which they normally did not have access to. This subsequently increased productivity, shortened the length of investigations and helped them comply with government regulations.
  • Another government agency created a solution using FAST which searches the public domain for information on persons who are potentially breaking laws and initiates business processes to bring them to justice.

All these examples provide strong justification for the value of an Enterprise Search solution. With FAST, costs were reduced, regulations were met, organizations performed more efficiently, and more revenue was generated for goods and services.

What is an Enterprise Search Project?

This will be discussed in my next blog, What is a FAST Enterprise Search Project.

Saturday, October 24, 2009

FAST Search Whitepapers

Here are some great whitepapers you should read if you want to start learning about FAST. I know there is a lot of buzz around it with its integration into SharePoint 2010, finally providing SharePoint with a robust search engine. This is a great starting point for understanding what Enterprise Search is and how it can be strategically introduced and aligned with your Enterprise Architecture.

http://www.microsoft.com/enterprisesearch/en/us/FAST-technical.aspx

Tuesday, October 20, 2009

FAST Introduction and SharePoint Search Evolution

There is a lot of information coming out of the SharePoint 2010 conference, and one of the biggest items is the integration of FAST into SharePoint 2010. What is FAST? FAST is an enterprise search engine that Microsoft acquired and has placed a significant investment into. The most important thing you should know right off the bat is that FAST does not equal SharePoint. FAST is an enterprise search platform which can be used as the search engine for SharePoint. Up to this point Microsoft has not provided a way to search for content across the enterprise. What we have done to compensate is build custom applications or purchase products like FAST and Google Appliances to do enterprise search.

This is what I have seen with the evolution of search solutions in the context of SharePoint. With SharePoint 2001 there is nothing to really discuss, but with SharePoint 2003 we started to get a taste of what we wanted from search. We found that search did not really work well in SharePoint 2003 (cross site searching did not work), and many customers who were using SharePoint 2003 said it simply did not work. It did basic text searching of content within SharePoint, but it was missing key things like relevancy. This created a small market of third party vendors who created search solutions for SharePoint. Remember, at this time Google had become the search engine of choice; every day business users would just say go Google something and get the answer. The problem was we did not have the same kind of search engine that we could use internally within a company, organization or enterprise. As a result FAST, Google, Autonomy, etc. created enterprise search solutions that could be used within a company enterprise and that had many of the features required by the business user.

Then SharePoint 2007 came out with Enterprise Search. It was a significant improvement over what we had with SharePoint 2003, but it was still far off from being an enterprise search solution. They improved the user interface, allowed for targeted content taxonomy searching, and added a relevancy model, best bets, synonyms, administrative features, reporting, an API we can build customizations with, security using an access control list (ACL), and business data search using the Business Data Catalog (BDC). All the stuff needed when creating an enterprise search platform. We now had the ability to search for data inside and outside of SharePoint, we could rank the search results based on who you were, we could analyze searches to improve the user experience, etc.; however, it still seemed to fall short. The core problem I go back to is that users are expecting that Google experience, not just text searching. SharePoint tried to solve some of that, but in the end it fell short.

The thing that had always been the most interesting to me is the Business Data Catalog (BDC), which provides a single result set of data from multiple disparate data sources. This was the most interesting search feature for me when SharePoint 2007 came out. This is where they tried to become an enterprise search engine, because you go to one place, you enter something to search on, and you query against many different places but get back a single result set. I personally was able to use it successfully to index custom SQL databases of HR related data for several clients. So when they searched for a person, they were able to get more information about that person than just the information stored in Active Directory. Now, the BDC had lots of limitations, including only being able to call databases, stored procedures and web services, no ability to do data transformation, an API that was very hard to develop with, and limited scalability.

With the introduction of FAST as part of the Microsoft stack, they really have a true enterprise search engine. FAST has a significant amount of features and functionality which I have not even touched upon. In my next blog, I intend to write about some of these core features and capabilities that are needed for an enterprise search solution and how they are used to meet your business users' needs to find data.

For more information on the value proposition of FAST, I have written two other blogs in this series.

Friday, October 16, 2009

SharePoint GB 2057 Localization

I was recently asked to dig around into an issue with an international SharePoint site we are setting up. I personally have little experience with globalization other than having to read about it to pass a MS certification test.

There are language packs for SharePoint which are used to support configurable text for globalization. The issue was: how is LCID 2057 for England handled? The English language pack supports 1033, which is US English. LCID 2057 is considered a sub-language of 1033. So, would it be possible to create a unique resx file for GB that maps to 2057? After digging and stumbling around, the answer is that it is not possible.

The only resolution is to set the web application's regional settings to LCID 2057 (GB), and then modify the resx for US English (1033) in that specific web application.

This is what I was able to find out:

  • There is only a language pack for English (1033).
  • It is possible to have text, like dates, formatted for 2057 by changing the locale through SPWeb.Locale (see the sketch after this list). You can try to change the locale through the SharePoint Regional Settings screen in Site Settings, but you will not see a GB option, only US. Another way to change the locale is to go to the Webs table in the site collection database; HOWEVER, that is not supported by Microsoft.
  • In the Webs table you will see another column called Language. What I was able to find out is that the value in this column MUST correspond to a language pack that has been installed; otherwise SharePoint will bomb. So setting Language = 2057 and Locale = 2057 will not work, but Language = 1033 and Locale = 2057 will. This will make sure that things like dates are formatted correctly. The reason the first combination fails is that in several places, including the 12 hive, SharePoint builds relative paths to resources installed with the language pack. You will see 1033 folders throughout the 12 hive, so if the Language is set to 2057, SharePoint will start looking for 2057 folders and things will start breaking. At this point I concluded it would not be possible to create a dedicated resx file for GB. Bummer.
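
For reference, here is a minimal sketch of changing the locale through the object model (the site URL is a placeholder):

using (SPSite site = new SPSite("http://server/sites/uk"))
using (SPWeb web = site.OpenWeb())
{
    // 2057 = English (United Kingdom); this changes formatting only,
    // the Language stays 1033 since that is the installed language pack
    web.Locale = new System.Globalization.CultureInfo(2057);
    web.Update();
}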


Wednesday, October 14, 2009

Copy SPListItem.Version (SPListItemVersion) Part 3

Background and Considerations

A while back I wrote a blog that discussed the issues with copying SPListItems from one list to another. I recently needed to create a utility and thought my old blog would solve the problem; I am unhappy to say it did not. It definitely unlocks the issue of copying SPListItems with versions, however I found a couple of shortcomings in what I wrote. Let's try again.

Here are some considerations I had to understand before starting to build this.

  • After doing some research with Reflector, I found that the SPListItem CopyTo() and CopyFrom() methods do not work.
  • You will need to loop over the versions backwards and add the versions of the list items to the destination list.
  • Moving documents is different than moving list items.
  • Recursively looping over items within an SPList or SPDocumentLibrary is not straightforward. You usually want to maintain the folder structure when moving items from one list to another. You cannot simply loop over all items in the SPList, nor does an SPFolder object have a collection of the items within it. The only easy way of achieving this is to use a CAML query to get all the items for a specific folder.
  • If you need to preserve the Created and Modified time stamps on the version items, you need to set the times correctly because they are stored as GMT in the SharePoint database.
  • If you want to move items cleanly into a new or existing list, I recommend writing code that will first remove all the items from the destination list, then remove all the content types from the destination list, and finally add the needed content types back into the destination list. There are a number of reasons to do this. It is possible to write a routine to reconcile the content types from the source list to the destination list, however that can become complicated. The important thing to know is that if a column is missing in the destination list, the movement of the SPListItem or document item will fail. The code I have written is not dependent on the content type ID, which is a good thing. This is because if the content types are defined within the SharePoint UI, a unique GUID is created for that content type. If you are moving items across SharePoint servers, you cannot be guaranteed that the content type ID will be the same, but the column names and types should be the same.

Create Copy Folders Structure

I created a method called MoveFolderItems which will recreate the folder structure in a new library. All you need to do to initiate it is something like the following.

MoveFolderItems(sourceList, sourceList.RootFolder, destList, destList.RootFolder);
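
For context, a minimal sketch of kicking this off from a console utility (the site URL and list names are placeholders):

using (SPSite site = new SPSite("http://server/sites/source"))
using (SPWeb web = site.OpenWeb())
{
    SPList sourceList = web.Lists["Source Documents"];
    SPList destList = web.Lists["Destination Documents"];

    // Recreates the folder structure and moves every item with its versions
    MoveFolderItems(sourceList, sourceList.RootFolder, destList, destList.RootFolder);
}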

As you can see, this method first gets all the items for a specified folder. Then it checks whether each item is a folder. If so, it creates a new folder in the destination and recurses into it; otherwise it moves the item over, handling documents and list items differently.

private static void MoveFolderItems(SPList sourceList, SPFolder sourceFolder, SPList destList, SPFolder destFolder)
{
    //Query for items in the source folder
    SPQuery query = new SPQuery();
    query.Folder = sourceFolder;
    SPListItemCollection queryResults = sourceList.GetItems(query);

    foreach (SPListItem existingItem in queryResults)
    {
        if (existingItem.FileSystemObjectType == SPFileSystemObjectType.Folder)
        {
            Console.WriteLine(existingItem.Name);

            //Create new folder item
            SPListItem newSubFolderItem = destList.Items.Add(destFolder.ServerRelativeUrl,
                SPFileSystemObjectType.Folder, null);

            //Set folder fields
            foreach (SPField sourceField in existingItem.Fields)
            {
                if ((!sourceField.ReadOnlyField) && (sourceField.Type != SPFieldType.Attachments))
                {
                    newSubFolderItem[sourceField.Title] = existingItem[sourceField.Title];
                }
            }

            //Save the new folder
            newSubFolderItem.Update();

            if (newSubFolderItem.ModerationInformation != null)
            {
                //Update folder approval status
                newSubFolderItem.ModerationInformation.Status = SPModerationStatusType.Approved;
                newSubFolderItem.Update();
            }

            //Get the source folder and the new folder created
            SPFolder nextFolder = sourceList.ParentWeb.GetFolder(existingItem.UniqueId);
            SPFolder newSubFolder = destList.ParentWeb.GetFolder(newSubFolderItem.UniqueId);

            //Recursive call
            MoveFolderItems(sourceList, nextFolder, destList, newSubFolder);
        }
        else
        {
            //Move the item
            Console.WriteLine(existingItem.Name);

            if (sourceList.BaseTemplate == SPListTemplateType.DocumentLibrary)
            {
                MoveDocumentItem(existingItem, destFolder);
            }
            else
            {
                MoveItem(existingItem, destFolder);
            }
        }
    }
}

Move SPListItem

Here is the code for moving the SPListItem with its history. First we create the list item. Then we loop over the versions backwards and add each version into the destination list.

private static void MoveItem(SPListItem sourceItem, SPFolder destinationFolder)
{
    //Create a new item
    SPListItem newItem;

    if (destinationFolder.Item != null)
    {
        newItem = destinationFolder.Item.ListItems.Add(
            destinationFolder.ServerRelativeUrl,
            sourceItem.FileSystemObjectType);
    }
    else
    {
        SPList destinationList = destinationFolder.ParentWeb.Lists[destinationFolder.ParentListId];
        newItem = destinationList.Items.Add(
            destinationFolder.ServerRelativeUrl,
            sourceItem.FileSystemObjectType);
    }

    //Loop over the source item versions backwards and restore each one
    for (int i = sourceItem.Versions.Count - 1; i >= 0; i--)
    {
        //Set the values into the new item
        foreach (SPField sourceField in sourceItem.Fields)
        {
            SPListItemVersion version = sourceItem.Versions[i];

            if ((!sourceField.ReadOnlyField) && (sourceField.Type != SPFieldType.Attachments))
            {
                newItem[sourceField.Title] = version[sourceField.Title];
            }
            else if (sourceField.Title == "Created" || sourceField.Title == "Modified")
            {
                //Stored as GMT in the database; convert back to local time
                DateTime date = Convert.ToDateTime(version[sourceField.Title]);
                newItem[sourceField.Title] = sourceItem.Web.RegionalSettings.TimeZone.UTCToLocalTime(date);
            }
            else if (sourceField.Title == "Created By" || sourceField.Title == "Modified By")
            {
                newItem[sourceField.Title] = version[sourceField.Title];
            }
        }

        //Update the new item with the version data
        newItem.Update();
    }

    //Get the new item again
    SPList list = destinationFolder.ParentWeb.Lists[destinationFolder.ParentListId];
    newItem = list.GetItemByUniqueId(newItem.UniqueId);
    newItem["Title"] = sourceItem["Title"];
    newItem.SystemUpdate(false);

    if (sourceItem.Attachments.Count > 0)
    {
        //Now get the attachments; they are not versioned
        foreach (string attachmentName in sourceItem.Attachments)
        {
            SPFile file = sourceItem.ParentList.ParentWeb.GetFile(
                sourceItem.Attachments.UrlPrefix + attachmentName);

            newItem.Attachments.Add(attachmentName, file.OpenBinary());
        }

        newItem.Update();
    }
}

Move Document

As I mentioned earlier, moving a document is a little bit different. Here is the code that will copy a document, its metadata and its versions over to a new library.

private static void MoveDocumentItem(SPListItem sourceItem, SPFolder destinationFolder)
{
    //Loop over the source item versions backwards and restore each one
    for (int i = sourceItem.Versions.Count - 1; i >= 0; i--)
    {
        Hashtable htProperties = new Hashtable();

        //Set the values into the new item
        foreach (SPField sourceField in sourceItem.Fields)
        {
            SPListItemVersion version = sourceItem.Versions[i];

            if (version[sourceField.Title] != null)
            {
                if ((!sourceField.ReadOnlyField) && (sourceField.Type != SPFieldType.Attachments))
                {
                    htProperties[sourceField.Title] = Convert.ToString(version[sourceField.Title]);
                }
                else if (sourceField.Title == "Created" || sourceField.Title == "Modified")
                {
                    //Stored as GMT in the database; convert back to local time
                    DateTime date = Convert.ToDateTime(version[sourceField.Title]);
                    htProperties[sourceField.Title] = sourceItem.Web.RegionalSettings.TimeZone.UTCToLocalTime(date);
                }
                else if (sourceField.Title == "Created By" || sourceField.Title == "Modified By")
                {
                    htProperties[sourceField.Title] = Convert.ToString(version[sourceField.Title]);
                }
            }
        }

        //Get the version of the document
        byte[] document;
        if (i == 0)
        {
            document = sourceItem.File.OpenBinary();
        }
        else
        {
            document = sourceItem.File.Versions.GetVersionFromLabel(
                sourceItem.Versions[i].VersionLabel).OpenBinary();
        }

        //Create the new item. Overwriting it will treat it as a new item.
        SPFile newFile = destinationFolder.Files.Add(
            destinationFolder.Url + "/" + sourceItem.File.Name,
            document,
            htProperties,
            true);

        newFile.Item["Created"] = htProperties["Created"];
        newFile.Item["Modified"] = htProperties["Modified"];
        newFile.Item.UpdateOverwriteVersion();
    }
}

Wednesday, October 7, 2009

.NET 4.0 WF Initial Impressions

A couple of months ago I was asked some very direct questions about the viability of K2 and other such tools with .NET 4.0 and Dublin. I personally have just not had lots of time to go off and research this. However, I attended a quick one hour virtual session put on by Microsoft about WF in .NET 4.0.

The big thing I found out is that the State Machine workflow will not be available in the initial release of .NET 4.0. That was a big surprise to me. All you will have are Sequential and Flow Chart workflows. The presenter said that you can achieve something similar to a State Machine workflow by doing a Flow Chart workflow. This leads me to believe that many of the workflow challenges we had with WF in MOSS 2007 have not been resolved.

They talked a little about Workflow Services, and I found out that you cannot do as much with Workflow Services as you can with WF. I did not get any details on the specifics.

A lot of the discussion was about how ISVs can use WF to augment their frameworks and even provide the ability to allow customizations into their products using visual tools. This is what I have been preaching for a while now. You cannot adopt WF as the business process automation platform for a company. It does not come anywhere close. It is a framework to build business process automation frameworks.

I have had conversations where companies believe that since they have SharePoint to host their WF workflows, that is all they need. In the long run your costs will be significantly higher to maintain, extend and manage. I have a personal issue with WF in SharePoint because I do not like the fact that the workflows can only be tied to a piece of content. If a company wanted to do finance or accounting process automation (spanning enterprise systems), the workflow instance would have to be tied to a SharePoint list item which is not even an actor in the process itself. So ask: why do we need this SharePoint list item? It serves no real purpose in the process. Plus, if someone deletes the item or the associated task, the process will just end. There is no reporting, and the list goes on.

The point is that WF in MOSS should be used just to manage content in SharePoint. It is not a good platform for human workflow; you really need to look at other tools if you need human workflow. Plus, it really does not look like Microsoft is chasing after companies like K2 and Nintex, so they should have a healthy future.

Monday, October 5, 2009

IIS 7 Kerberos Configuration

I have seen several questions come up on projects in the past three weeks where teams are trying to configure Kerberos with IIS 7. With IIS 6 we were used to just setting up the SPNs. Now with IIS 7 we also have to configure the <windowsAuthentication> node in the applicationHost.config file. If we do not, it will seem as if Kerberos is just flat out not working.
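
For reference, here is a minimal sketch of what that node typically ends up looking like under system.webServer/security/authentication; useAppPoolCredentials="true" is the important setting when the application pool runs under a domain service account with registered SPNs:

<windowsAuthentication enabled="true" useKernelMode="true" useAppPoolCredentials="true">
  <providers>
    <add value="Negotiate" />
    <add value="NTLM" />
  </providers>
</windowsAuthentication>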

I have sent this blog to a couple of co-workers (http://sharepointspot.blogspot.com/2008/12/sharepoint-kerberos-on-windows-2008.html) and this got them up and running immediately.

If you want a little background on Kerberos configuration in general, read this blog I wrote: http://www.k2distillery.com/2009/04/k2-blackpearl-kerberos-configuration.html. Most of the content is slanted towards K2 configuration with Kerberos, however it will help you if you have never done it before.

This blog (http://blogs.msdn.com/martinkearn/archive/2007/04/23/configuring-kerberos-for-sharepoint-2007-part-1-base-configuration-for-sharepoint.aspx) is probably the most well known blog on Kerberos for MOSS. This guy basically shows you all of the Kerberos commands that you need to run for all the SharePoint service accounts that you may create for your SharePoint farm.

As well, Kerberos configuration comes up a lot with the configuration of SSRS and MOSS. Here is a good article that explains it (http://msdn.microsoft.com/en-us/library/bb283324.aspx).

Saturday, October 3, 2009

Embed and Deploy User Control in SharePoint Web Part

1.0 Introduction

Several months ago I had some colleagues mention to me that it is possible to load an ASP.net user control into a SharePoint web part. You may be asking why you would want to consider doing that. There are some important reasons.

  • Your company may already have a large investment in standard ASP.net user controls and you do not want to have to rewrite them as a SharePoint web part.
  • ASP.net user controls can be easily embedded into other custom web applications.
  • SharePoint web part development can be challenging at times when building up a rich user interface. Using an ASP.net user control, you can build and test that code outside of the SharePoint context. I believe that most of this has to do with shortcomings of Visual Studio as an Integrated Development Environment (IDE) for SharePoint. We expect great things soon…

A popular community project called SmartPart is out there, which many people have used to load user controls into a web part. Greg Galipeau referred me to a blog which discussed many shortcomings of that project. Upon reading it, I knew I would never use it for a client, and that it is really not that hard to create your own SmartPart web parts.

As I have discussed in the past, I am a huge proponent of:

  1. Creating SharePoint deployment projects that deploy everything in a solution and Feature.
  2. Anything that is deployed to SharePoint runs under minimal trust.

In this article I plan to show you how to create a .NET project that builds a .NET user control and web part, how to deploy the solution, and the best practices I learned along the way.

2.0 Creating the SharePoint Projects

There are basically two projects we need to create. The first is for the .NET user control and the second is for the ASP.net web part which will load the user control. The process I am going to take you through is:

  1. Build the ASP.net Project by itself.
  2. Create a Web Part Project.
  3. Then show you the modifications to integrate the user control into the web part.

2.1 Creating the ASP.net Project

First create the project for the ASP.net control. This is as simple as creating an ASP.net project. Here is a screen shot of the project that I created.

[Screenshot: ASP.net user control project structure]

There is really nothing special about it:

  • I left the Default.aspx so that I can use it for testing the SmartControl.
  • I placed the SmartControl in a UserControls folder. No specific reason other than common practice.

Here is the code from SmartControl.ascx. Note I only have a simple label we will use for testing purposes.

<%@ Control Language="C#" AutoEventWireup="true" CodeBehind="SmartControl.ascx.cs" Inherits="MOSSDistillery.SmartControl.UserControl.UserControls.SmartControl" %>
<asp:Label ID="lblHelloWorld" runat="server" Text=""></asp:Label>

Here is the code behind for SmartControl.ascx.cs. The only interesting thing I have done here is set the text and the color of the label. In the example later on, I will show how these can be set from the web part's configuration. The point of this is to show how to pass data into the user control.

public partial class SmartControl : System.Web.UI.UserControl
{
    protected void Page_Load(object sender, EventArgs e)
    {
        lblHelloWorld.Text = "Hello World";
        lblHelloWorld.ForeColor = System.Drawing.ColorTranslator.FromHtml("#FFFF00");
    }
}

2.2 Creating the Web Part Project

Second, create the project for the web part. I know that I could use WSPBuilder to make my life easier, however I have found that it is really not that hard to build a Feature and web part by hand. In my blog on how to create a web part, I provided the exact steps on how to set up a web part project. Please go there and complete the instructions, as that is what I have done.

Here is my resulting project as well as the code for my web part project using the instructions I have in this blog.

[Screenshot: web part project structure]

SmartWebPart.cs

public class SmartWebPart : System.Web.UI.WebControls.WebParts.WebPart
{
    protected override void Render(HtmlTextWriter writer)
    {
        base.Render(writer);
        writer.WriteLine("Hello World");
    }
}

MOSSDistillery.SmartControl.WebPart.SimpleWebPart.webpart

<webParts>
  <webPart xmlns="http://schemas.microsoft.com/WebPart/v3">
    <metaData>
      <type name="MOSSDistillery.SmartControl.WebPart.SmartWebPart, MOSSDistillery.SmartControl.WebPart, Version=1.0.0.0, Culture=neutral, PublicKeyToken=e652952dcf5e6363" />
      <importErrorMessage>Cannot import MOSSDistillery.SmartControl.WebPart.SmartWebPart</importErrorMessage>
    </metaData>
    <data>
      <properties>
        <property name="Title" type="string">My Smart Web Part</property>
        <property name="Description" type="string">My Smart Web Part Demonstration.</property>
      </properties>
    </data>
  </webPart>
</webParts>

Feature.xml

<?xml version="1.0" encoding="utf-8" ?>
<Feature Id="21E3F9D4-6DC2-4042-A873-C3440127476F"
         Title="My Smart Web Part"
         Description="My Smart Web Part Demonstration."
         Version="1.0.0.0"
         Scope="Site"
         Hidden="FALSE"
         DefaultResourceFile="core"
         xmlns="http://schemas.microsoft.com/sharepoint/">
  <ElementManifests>
    <ElementManifest Location="elements.xml" />
    <ElementFile Location="MOSSDistillery.SmartControl.WebPart.SimpleWebPart.webpart"/>
  </ElementManifests>
</Feature>

elements.xml

<?xml version="1.0" encoding="utf-8" ?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="WebParts" List="113" Url="_catalogs/wp">
    <File Url="MOSSDistillery.SmartControl.WebPart.SimpleWebPart.webpart" Type="GhostableInLibrary" />
  </Module>
</Elements>

manifest.xml

<Solution xmlns="http://schemas.microsoft.com/sharepoint/" SolutionId="1B69F425-BCC3-4de0-BC2F-A84B1168F84A">
  <FeatureManifests>
    <FeatureManifest Location="MOSSDistillery.SmartControl.WebPart\Feature.xml"/>
  </FeatureManifests>
  <Assemblies>
    <Assembly Location="MOSSDistillery.SmartControl.WebPart\MOSSDistillery.SmartControl.WebPart.dll" DeploymentTarget="GlobalAssemblyCache">
      <SafeControls>
        <SafeControl Assembly="MOSSDistillery.SmartControl.WebPart, Version=1.0.0.0, Culture=neutral, PublicKeyToken=e652952dcf5e6363"
                     Namespace="MOSSDistillery.SmartControl.WebPart"
                     Safe="True"
                     TypeName="*"/>
      </SafeControls>
    </Assembly>
  </Assemblies>
  <CodeAccessSecurity>
    <PolicyItem>
      <Assemblies>
        <Assembly PublicKeyBlob="..." />
      </Assemblies>
      <PermissionSet class="NamedPermissionSet" Name="MOSSDistillery.SmartControl.WebPart" version="1" Description="MOSSDistillery.UserControl.WebPart">
        <IPermission class="AspNetHostingPermission" version="1" Level="Minimal" />
        <IPermission class="SecurityPermission" version="1" Unrestricted="true" />
        <IPermission class="WebPartPermission" version="1" Connections="True" />
        <IPermission class="Microsoft.SharePoint.Security.SharePointPermission, Microsoft.SharePoint.Security, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" Unrestricted="true" />
      </PermissionSet>
    </PolicyItem>
  </CodeAccessSecurity>
</Solution>

WSP.ddf

.OPTION Explicit
.Set CabinetNameTemplate="MOSSDistillery.SmartControl.WebPart.wsp"
.Set DiskDirectory1="C:\MOSSDistillery\SmartControl\MOSSDistillery.SmartControl\MOSSDistillery.SmartControl.WebPart\Deployment"

manifest.xml

.Set DestinationDir="MOSSDistillery.SmartControl.WebPart"
%outputDir%MOSSDistillery.SmartControl.WebPart.dll
TEMPLATE\FEATURES\MOSSDistillery.SmartControl.WebPart\elements.xml
TEMPLATE\FEATURES\MOSSDistillery.SmartControl.WebPart\Feature.xml
TEMPLATE\FEATURES\MOSSDistillery.SmartControl.WebPart\MOSSDistillery.SmartControl.WebPart.SimpleWebPart.webpart

.Delete outputDir

2.3 Preparing for Project Deployment

Now that we have both projects created, we need to complete the following steps to integrate them so that we can display the user control within the web part.

2.3.1 Sign the ASP.net Project

First we need to sign the ASP.net project containing the user control, because its assembly will be deployed to the GAC.
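
If you have not strong-named an assembly before, here is a sketch of one way to do it (the key file name is only an example): generate a key pair with the Strong Name tool, then select that file on the Signing tab of the project properties.

rem Generate a key pair used to strong-name the user control assembly
sn -k MOSSDistillery.SmartControl.UserControl.snk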

2.3.2 Get the Public Key Token

Then you will need to run the following command, like we did for the web part project, to get the public key token.

sn -Tp "C:\MOSSDistillery\SmartControl\MOSSDistillery.SmartControl\MOSSDistillery.SmartControl.UserControl\bin\MOSSDistillery.SmartControl.UserControl.dll"

2.3.3 Modify SmartControl.ascx

First, add the Assembly tag to the user control. This is so the user control can reference the dll that will be deployed to the GAC. Notice we used the public key token we retrieved in the previous step. Second, I removed the CodeBehind attribute from the Control element.


<%@ Assembly Name="MOSSDistillery.SmartControl.UserControl, Version=1.0.0.0, Culture=neutral, PublicKeyToken=367c68a97c663918"%>
<%@ Control Language="C#" AutoEventWireup="true" Inherits="MOSSDistillery.SmartControl.UserControl.SmartControl" %>
<asp:Label ID="lblHelloWorld" runat="server" Text=""></asp:Label>

2.3.4 Change the User Control Code

All I did was create properties for the text and the color of the label. We will set these from the SharePoint web part.

public partial class SmartControl : System.Web.UI.UserControl
{
    private string _text = string.Empty;
    private string _color = string.Empty;

    public string Color
    {
        get { return _color; }
        set { _color = value; }
    }

    public string Text
    {
        get { return _text; }
        set { _text = value; }
    }

    protected void Page_Load(object sender, EventArgs e)
    {
        // The hosting web part assigns these values before Page_Load runs.
        lblHelloWorld.Text = _text;
        lblHelloWorld.ForeColor = System.Drawing.ColorTranslator.FromHtml(_color);
    }
}



2.3.5 Modify the Web Part

Now I have to modify the web part to set the properties of my user control. The core of this solution is the code I put in the CreateChildControls() method. Note that the properties are assigned right after Page.LoadControl() and before the control is added to the Controls collection, so the values are in place by the time the user control's Page_Load runs.

public class SmartWebPart : System.Web.UI.WebControls.WebParts.WebPart
{
    private string _error = string.Empty;
    private string _color = string.Empty;
    private string _text = string.Empty;

    [Personalizable(PersonalizationScope.Shared),
     WebBrowsable(true),
     WebDisplayName("Text"),
     WebDescription("Text"),
     Category("Custom")]
    public string Text
    {
        get { return _text; }
        set { _text = value; }
    }

    [Personalizable(PersonalizationScope.Shared),
     WebBrowsable(true),
     WebDisplayName("Color"),
     WebDescription("Color"),
     Category("Custom")]
    public string Color
    {
        get { return _color; }
        set { _color = value; }
    }

    protected override void Render(HtmlTextWriter writer)
    {
        base.Render(writer);
        writer.WriteLine(_error);
    }

    protected override void CreateChildControls()
    {
        try
        {
            base.CreateChildControls();

            if (string.IsNullOrEmpty(_text))
            {
                throw new Exception("Text has not been set");
            }

            if (string.IsNullOrEmpty(_color))
            {
                throw new Exception("Color has not been set");
            }

            // Load the user control from the 12 hive CONTROLTEMPLATES folder
            // and push the web part configuration values into it.
            string path = "~/_controltemplates/MOSSDistillery.SmartControl.UserControl/SmartControl.ascx";
            MOSSDistillery.SmartControl.UserControl.SmartControl control =
                (MOSSDistillery.SmartControl.UserControl.SmartControl)Page.LoadControl(path);

            control.Text = _text;
            control.Color = _color;

            Controls.Add(control);
        }
        catch (Exception ex)
        {
            _error = ex.Message + " " + ex.InnerException;
        }
    }
}

One important note: notice that I hard coded the path to the user control. If I wanted to make this a generic web part that could load any ASP.net user control, I would instead make the changes shown below. The problem with that approach is that configuration values from the web part can no longer be set on the user control, because the web part only knows it as a generic UserControl. It is possible to fall back on web.config settings, but that did not make sense here either: a web part like this could be used in lots of places, and in each of those places the color or text may be different, so a single web.config setting would not work. It would also be possible to write code that uses reflection to set the properties of the ASP.net user control (a sketch of that idea follows the code below), but I really did not want to create anything that complex yet.

string _path = "";

[Personalizable(PersonalizationScope.Shared),
 WebBrowsable(true),
 WebDisplayName("User Control Path"),
 WebDescription("User Control Path"),
 Category("Custom")]
public string UserControlPath
{
    get { return _path; }
    set { _path = value; }
}

protected override void CreateChildControls()
{
    try
    {
        base.CreateChildControls();

        if (string.IsNullOrEmpty(_path))
        {
            throw new Exception("Path has not been set");
        }

        // Load whatever control the path points to; there is no
        // type-specific cast, so no properties can be set on it directly.
        System.Web.UI.UserControl control = (System.Web.UI.UserControl)Page.LoadControl(_path);
        Controls.Add(control);
    }
    catch (Exception ex)
    {
        _error = ex.Message + " " + ex.InnerException;
    }
}
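
For completeness, here is a minimal sketch of the reflection idea mentioned above. This is not part of the solution I deployed; the SetControlProperties helper and its semicolon-delimited name=value format are purely hypothetical, just to show how web part configuration could be pushed into an arbitrary user control.

// Hypothetical helper: propertyValues would come from another web part
// property, e.g. "Text=Hello World;Color=#FF0000".
private void SetControlProperties(System.Web.UI.UserControl control, string propertyValues)
{
    foreach (string pair in propertyValues.Split(';'))
    {
        string[] parts = pair.Split('=');
        if (parts.Length != 2)
        {
            continue;
        }

        // Find a public instance property with a matching name.
        System.Reflection.PropertyInfo property = control.GetType().GetProperty(parts[0].Trim());
        if (property != null && property.CanWrite && property.PropertyType == typeof(string))
        {
            property.SetValue(control, parts[1].Trim(), null);
        }
    }
}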

2.3.6 Change Manifest.xml

Next we need to make the following changes to the manifest.xml. Basically we need to incorporate the user control into the deployment.

  1. We add the UserControl as a new Assembly element. It is important to add the SafeControls element for the UserControl so that it will be marked as a safe control in the web.config.
  2. We add a TemplateFiles element, which will place the ascx control in the CONTROLTEMPLATES folder in the 12 hive. Notice that the location includes “MOSSDistillery.SmartControl.UserControl”. This will create a folder with that name and place the ascx control in it, which is important so your user controls do not get intermingled with the out of the box MOSS user controls.

<Solution xmlns="http://schemas.microsoft.com/sharepoint/" SolutionId="1B69F425-BCC3-4de0-BC2F-A84B1168F84A">
  <FeatureManifests>
    <FeatureManifest Location="MOSSDistillery.SmartControl.WebPart\Feature.xml"/>
  </FeatureManifests>
  <Assemblies>
    <Assembly Location="MOSSDistillery.SmartControl.WebPart\MOSSDistillery.SmartControl.WebPart.dll" DeploymentTarget="GlobalAssemblyCache">
      <SafeControls>
        <SafeControl Assembly="MOSSDistillery.SmartControl.WebPart, Version=1.0.0.0, Culture=neutral, PublicKeyToken=e652952dcf5e6363"
                     Namespace="MOSSDistillery.SmartControl.WebPart"
                     Safe="True"
                     TypeName="*"/>
      </SafeControls>
    </Assembly>
    <Assembly Location="MOSSDistillery.SmartControl.UserControl\MOSSDistillery.SmartControl.UserControl.dll" DeploymentTarget="GlobalAssemblyCache">
      <SafeControls>
        <SafeControl Assembly="MOSSDistillery.SmartControl.UserControl, Version=1.0.0.0, Culture=neutral, PublicKeyToken=367c68a97c663918"
                     Namespace="MOSSDistillery.SmartControl.UserControl"
                     Safe="True"
                     TypeName="*"/>
      </SafeControls>
    </Assembly>
  </Assemblies>
  <TemplateFiles>
    <TemplateFile Location="CONTROLTEMPLATES\MOSSDistillery.SmartControl.UserControl\SmartControl.ascx"/>
  </TemplateFiles>
  <CodeAccessSecurity>
    <PolicyItem>
      <Assemblies>
        <Assembly PublicKeyBlob="..." />
      </Assemblies>
      <PermissionSet class="NamedPermissionSet" Name="MOSSDistillery.SmartControl.WebPart" version="1" Description="MOSSDistillery.UserControl.WebPart">
        <IPermission class="AspNetHostingPermission" version="1" Level="Minimal" />
        <IPermission class="SecurityPermission" version="1" Unrestricted="true" />
        <IPermission class="WebPartPermission" version="1" Connections="True" />
        <IPermission class="Microsoft.SharePoint.Security.SharePointPermission, Microsoft.SharePoint.Security, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" Unrestricted="true" />
      </PermissionSet>
    </PolicyItem>
  </CodeAccessSecurity>
</Solution>


2.3.7 Modify the WSP.ddf

Finally, we need to make a couple of modifications to bring in the ASP.net user control. You can see that I have pulled in both MOSSDistillery.SmartControl.UserControl.dll and SmartControl.ascx. The paths to those files match the paths specified in the manifest.xml.

.OPTION Explicit
.Set CabinetNameTemplate="MOSSDistillery.SmartControl.WebPart.wsp"
.Set DiskDirectory1="C:\MOSSDistillery\SmartControl\MOSSDistillery.SmartControl\MOSSDistillery.SmartControl.WebPart\Deployment"

manifest.xml

.Set DestinationDir="MOSSDistillery.SmartControl.WebPart"
%outputDir%MOSSDistillery.SmartControl.WebPart.dll
TEMPLATE\FEATURES\MOSSDistillery.SmartControl.WebPart\elements.xml
TEMPLATE\FEATURES\MOSSDistillery.SmartControl.WebPart\Feature.xml
TEMPLATE\FEATURES\MOSSDistillery.SmartControl.WebPart\MOSSDistillery.SmartControl.WebPart.SimpleWebPart.webpart

.Set DestinationDir="MOSSDistillery.SmartControl.UserControl"
..\MOSSDistillery.SmartControl.UserControl\bin\MOSSDistillery.SmartControl.UserControl.dll

.Set DestinationDir="CONTROLTEMPLATES\MOSSDistillery.SmartControl.UserControl"
..\MOSSDistillery.SmartControl.UserControl\UserControls\SmartControl.ascx

.Delete outputDir
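
For reference, here is a sketch of how the WSP is typically built from the DDF and then deployed with stsadm. The outputDir value and the site URL are placeholders; adjust the names, paths, and target to your environment.

rem outputDir is referenced inside the DDF, so define it on the command line
makecab /D outputDir=bin\Debug\ /f WSP.ddf

rem Add and deploy the package (the .wsp lands in the Deployment folder set
rem by DiskDirectory1; the URL below is a placeholder)
stsadm -o addsolution -filename Deployment\MOSSDistillery.SmartControl.WebPart.wsp
stsadm -o deploysolution -name MOSSDistillery.SmartControl.WebPart.wsp -immediate -allowGacDeployment -url http://yourserver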


3.0 Conclusions

That is it; you can now see how easy it is to deploy a user control to SharePoint and load it into a web part. The great thing about this approach is that it lets you develop web parts significantly more quickly.

4.0 References