Saturday, December 26, 2009

Call WCF from K2 blackpearl event

Introduction

I recently had the task of integrating some enterprise WCF services with a K2 blackpearl process for a company. If you need an introduction to WCF, please read the WCF introduction I wrote, which is included below (My WCF Introduction).

There was not much material on the K2 Underground, so I figured I would write a short blog on how to do it. Thank you to Bob Maggio at K2 for giving some pointers.

There are really two ways you can do this.

  1. Call WCF endpoint through server side event handler in K2.
  2. Use the Dynamic WCF SmartObject Service.

Server Side Code Event

Calling a WCF endpoint from a server-side code event is pretty simple, but there are a few things you should know. First, you may be used to right-clicking References in a class library project and selecting "Add Service Reference", which uses the Svcutil tool to generate a client service class for you to call. That equivalent is not available in a K2 blackpearl project. You can try to add a reference by going to the K2 blackpearl process properties and then selecting References. There you have the ability to add a "web reference", but that is for a classic web service. Even if the WCF service is deployed in IIS, you cannot use that methodology, as it will give you errors. Just try it in a command-line application outside of K2 and you will see what I mean.

The only way to call the WCF service in a clean manner is to create a class library project and call it from your server-side code event. The steps are:

  1. Add a class library project.
  2. In the K2 process properties, add a reference to that new project.
  3. Add the client service interface to that project and create a helper class that calls the WCF endpoints (a sketch follows this list). Follow the instructions in the My WCF Introduction section below.
  4. In the K2 process server-side code event, add code to call your helper class.
  5. You will need to take the WCF client endpoint configuration (also discussed below) and add it to C:\Program Files\K2 blackpearl\Host Server\Bin\K2HostServer.config.
  6. You will need to restart the K2 service for the change to take effect.
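
To make steps 3 and 4 concrete, here is a minimal sketch of what such a helper class might look like. All the names here (RoleServiceHelper, RoleServiceClient, GetEmployeesByRole, Employee) are illustrative, and RoleServiceClient stands in for the Svcutil-generated proxy class for your own service:

    // Hypothetical helper class in the referenced class library project.
    public static class RoleServiceHelper
    {
        public static Employee[] GetEmployeesByRole(string roleName)
        {
            // The proxy reads its endpoint configuration from
            // K2HostServer.config when running inside the K2 host server.
            using (RoleServiceClient client = new RoleServiceClient())
            {
                return client.GetEmployeesByRole(roleName);
            }
        }
    }

The server-side code event then reduces to a single call such as Employee[] employees = RoleServiceHelper.GetEmployeesByRole("Managers");.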

Notes

  • In step five I discussed adding the WCF client endpoint configuration to K2HostServer.config. This configuration is global to all K2 processes running on that server (an example snippet follows these notes).
  • To test your helper class outside of K2, just create a command-line project that references the helper project. Place the client-side endpoint configuration in the app.config of that project.
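
For illustration, the client endpoint configuration added to K2HostServer.config is the standard WCF <system.serviceModel> client block that Svcutil generates. A minimal sketch follows; the address, contract, and endpoint name are placeholders that must match your own service:

    <system.serviceModel>
      <client>
        <endpoint address="http://XXX/RoleService.svc"
                  binding="wsHttpBinding"
                  contract="K2Distillery.WCF.IRoleService"
                  name="WSHttpBinding_IRoleService" />
      </client>
    </system.serviceModel>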

Dynamic WCF SmartObject Service

In my solution I elected not to use this, even though it is a pretty good approach. Why? We really were not using SmartObjects as part of the overall architecture. First, I did not want to pass through the SmartObject layer just to make a WCF call. Second, SmartObjects are great for doing visual development inside your K2 blackpearl processes; however, in this solution we were already doing lots of server-side code events, so writing code into the process was not a big deal and was quick to do anyway.

If we had decided to use the Dynamic WCF SmartObject Service, I would have liked the deployment better than in the first option we went over. The configuration of the WCF service endpoints would have lived in the SmartObject service on each K2 server, rather than having to modify the K2 service config file and restart the K2 server whenever there is a change.

Considerations

The one consideration I always mention is that K2 processes are versioned. That means your WCF services MUST be versioned as well. What can happen is:

  • You deploy process version 1.
  • You create 10 process instances on version 1.
  • You make a change to the process and the WCF interface and deploy version 2 of the K2 process.
  • BOOM, process instances on version 1 start bombing because you changed the WCF interface.

So if you were to:

  • Remove a WCF service method.
  • Change the parameters, data contract, or service contract of an existing WCF service method.
  • Drastically change the internals of an existing WCF service method.

There will be a high chance that you will run into this issue. You need to account for this when making changes to WCF services. This is really nothing new; you will run into the same issue when calling external databases or web services. If you are practicing good SOA methodologies, you will be able to handle service versioning.

My WCF Introduction

1.0 Simple WCF Introduction

Windows Communication Foundation (WCF) was released as part of .NET 3.0 and has been out for some time now. It has been heavily adopted up to this point and will be around for a long time. What WCF did was unify SOAP-based web service messaging, .NET Remoting, asynchronous communications, and distributed transactions into a single service-oriented framework for Microsoft. It also provides a nice pluggable environment such that messages can be sent over multiple communication protocols (HTTP, TCP, MSMQ, etc.).

This description probably does not do WCF justice, but it is what I think of when I think about WCF (I also share some opinions below).

2.0 Your First WCF Service

I needed to quickly learn how to build a WCF service and I wanted to host it through IIS. I found two articles that showed me how to do this and I decided to consolidate this into one blog posting. Here are the quick steps to set up that first simple WCF service.

2.1 Create the Project

You can use the .NET project templates, but it is really simple to create this using a class library. Sometimes it is just good to know how it works.

  • Create a Class Library project; in my example, K2Distillery.WCF.
  • Add folders called App_Code and Bin to the K2Distillery.WCF project.
  • Add a Web.config file to K2Distillery.WCF project.
  • Add references to System.Runtime.Serialization and System.ServiceModel.
  • Here is where I deviate from most articles that introduce WCF.
  • Add another Class Library project that will contain the actual code for the service; in my case I called it K2Distillery.Service.
  • Add a reference to System.Runtime.Serialization to the K2Distillery.Service project.
  • In the K2Distillery.WCF project add a reference to the K2Distillery.Service project.

I do this because I personally believe that WCF should be a pass-through that facilitates distributed computing, interoperability, etc. You should build your layered service architecture as separate DLLs. If you do not, your service methods will only be available through WCF, and you will likely want to expose those services through other means in the future.

2.2 Building the WCF Service Classes

In this example, a good simple service to build is one that gets users by business role. This data could be stored in many places (AD, HR databases, etc.), and a nice generic method that returns a complete employee profile is very useful for applications across the enterprise.

In the K2Distillery.Service project, I will add a method called GetEmployeesByRole(string roleName) to the RoleService.cs class. I will also create an Employee class and decorate it with attributes that make its properties serializable.

Here is the employee class:

    // Requires a reference to System.Runtime.Serialization and
    // a "using System.Runtime.Serialization;" directive.
    [DataContract]
    public class Employee
    {
        string _employeeNumber;
        string _firstName;
        string _lastName;
        string _email;
        string _activeDirectoryID;

        public Employee(string employeeNumber, string firstName, string lastName, string email, string activeDirectoryID)
        {
            EmployeeNumber = employeeNumber;
            FirstName = firstName;
            LastName = lastName;
            Email = email;
            ActiveDirectoryID = activeDirectoryID;
        }

        [DataMember]
        public string EmployeeNumber
        {
            get { return _employeeNumber; }
            set { _employeeNumber = value; }
        }

        [DataMember]
        public string FirstName
        {
            get { return _firstName; }
            set { _firstName = value; }
        }

        [DataMember]
        public string LastName
        {
            get { return _lastName; }
            set { _lastName = value; }
        }

        [DataMember]
        public string Email
        {
            get { return _email; }
            set { _email = value; }
        }

        [DataMember]
        public string ActiveDirectoryID
        {
            get { return _activeDirectoryID; }
            set { _activeDirectoryID = value; }
        }
    }

Here is the service method:

    // Business-layer service in the K2Distillery.Service project.
    // Requires "using System.Collections.Generic;".
    public class RoleService
    {
        public static List<Employee> GetEmployeesByRole(string roleName)
        {
            List<Employee> users = new List<Employee>();
            users.Add(new Employee("111", "Jason", "Apergis", "japergis@foo.com", "foo\\japergis"));
            users.Add(new Employee("222", "Ethan", "Apergis", "eapergis@foo.com", "foo\\eapergis"));

            return users;
        }
    }

Then, in the K2Distillery.WCF project, I will add two files to the App_Code folder called IRoleService.cs and RoleService.cs. I am not going to go into the details of WCF (read the references I have provided), but you need to create an interface that is decorated with attributes and then implement that interface.

In IRoleService.cs I define the interface for the service being exposed through WCF.

    [ServiceContract]
    public interface IRoleService
    {
        [OperationContract]
        Employee[] GetEmployeesByRole(string roleName);
    }

Then in RoleService.cs in the K2Distillery.WCF project I will add the following code to implement IRoleService. This basically just calls my service method.

    public class RoleService : IRoleService
    {
        public Employee[] GetEmployeesByRole(string roleName)
        {
            List<Employee> employees = K2Distillery.Service.RoleService.GetEmployeesByRole(roleName);
            return employees.ToArray();
        }
    }

Notice the only thing the WCF method does is transform the result from a List<Employee> to an Employee[]. I do this because .NET generic lists are platform-specific and do not translate cleanly across platform boundaries, so they are exposed as a simple array in the service contract. SOA zealots could argue that my service layer should have returned the values in a generic fashion to begin with, but these days almost everything I work with is .NET, so that has never really been a requirement.

2.3 Preparing Service Configuration

The next thing you will need to do is create the configuration files for the service.

The first thing you need to create is a .svc file, which can be placed in the root of the K2Distillery.WCF project. The Service attribute just points to the fully qualified name of the service class.

<%@ ServiceHost Language="C#" Debug="true" Service="K2Distillery.WCF.RoleService" %>

In my example we are going to deploy this WCF service through IIS, so we will need to add a web.config file to the K2Distillery.WCF project. Below is the web.config file needed for this service.

<?xml version="1.0"?>
<configuration>

  <appSettings />

  <system.serviceModel>
    <behaviors>
      <serviceBehaviors>
        <behavior name="mexBehavior">
          <serviceMetadata httpGetEnabled="true" />
        </behavior>
      </serviceBehaviors>
    </behaviors>

    <services>
      <service name="K2Distillery.WCF.RoleService"
               behaviorConfiguration="mexBehavior">
        <endpoint address=""
                  binding="wsHttpBinding"
                  contract="K2Distillery.WCF.IRoleService" />
        <endpoint address="mex"
                  binding="mexHttpBinding"
                  contract="IMetadataExchange" />
      </service>
    </services>
  </system.serviceModel>

</configuration>

2.4 Final Solution

The final solution should look like the following:

(Screenshot: the final Visual Studio solution showing the K2Distillery.Service and K2Distillery.WCF projects.)

2.5 Deployment

There are multiple ways to deploy the service to IIS. Here are the two I will call to your attention:

  1. Deploy the service as a DLL into the bin directory or the GAC.
  2. Deploy the .cs file in the App_Code directory into IIS and have the file dynamically compiled.

In my opinion the best practice is to deploy it as a DLL in the bin directory of a web site. The reasoning is similar to the security issues we run into with SharePoint.

To deploy this service:

  1. Create a service account to run the service under.
  2. Create an application pool in IIS to use that service account.
  3. Create a new virtual directory in IIS to host these files.
  4. Place the web.config and RoleService.svc files in the root folder of the new virtual directory.
  5. Add a bin folder and place the K2Distillery.WCF.dll and K2Distillery.Service.dll in it.

I would recommend creating a bat file to automate this deployment for you. But that is all you need to do.

To quickly test whether everything worked, just open IE and navigate to the .svc file (http://XXX/RoleService.svc?wsdl) and see if there are any errors.

2.6 Generating Client Classes

To allow a client to access the service, there are two simple ways to generate the client classes:

  1. Use Visual Studio: right-click the project references, select "Add Service Reference", and provide the URL to RoleService.svc.
  2. Use the Svcutil tool, which gives more granular control over the client interface creation. For detailed information, read ServiceModel Metadata Utility Tool (Svcutil.exe).

I personally like using Svcutil because it gives more control, and it will also generate the app configuration you will need for the client application. You will need to add the generated config settings to the client application's web.config or app.config file.

In this case the command would be:

cd C:\Program Files\Microsoft SDKs\Windows\v6.0\Bin

svcutil.exe http://XXX/RoleService.svc?wsdl

2.7 Calling Service from Client

In this example I am going to create a simple console application to call out to the service.

  • You will need to add the generated client class from the previous step to the project.
  • You will need to create an app.config file and add the generated configurations from the previous step.

Here is an example of code calling the generated client class.

try
{
    using (RoleServiceClient client = new RoleServiceClient())
    {
        Employee[] employees = client.GetEmployeesByRole("");

        foreach (Employee employee in employees)
        {
            Console.WriteLine(employee.EmployeeNumber);
        }
    }
}
catch (Exception ex)
{
    throw new Exception("Error Calling WCF GetEmployeesByRole: " + ex.Message);
}

You must always make sure that you properly dispose of your service client.
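
One caveat on disposal: with WCF clients, a using block can actually hide the original error, because Dispose calls Close, and Close itself throws if the channel has faulted. A commonly recommended alternative is to call Close on success and Abort on failure; here is a sketch of that pattern applied to this client:

    // CommunicationException comes from System.ServiceModel.
    RoleServiceClient client = new RoleServiceClient();
    try
    {
        Employee[] employees = client.GetEmployeesByRole("");
        client.Close();
    }
    catch (CommunicationException)
    {
        // The channel is faulted; Abort tears it down without throwing.
        client.Abort();
        throw;
    }
    catch (TimeoutException)
    {
        client.Abort();
        throw;
    }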

That is how easy it is to get started on creating some simple WCF services.

4.0 WCF Final Thoughts

On MSDN, Microsoft states that “Windows Communication Foundation (WCF) is designed to offer a manageable approach to distributed computing, broad interoperability, and direct support for service orientation”. I have had several healthy discussions about this with smart colleagues I trust. Some say I take it too literally, but I base my opinion on my experience. I have a couple of issues with using WCF, because I have seen many well-intentioned developers and companies incorrectly architect SOA and web services in general over the past couple of years. In my opinion there is no need for WCF unless you have:

  • Interoperability requirements.
  • A need to support distributed transactions across the enterprise.
  • A need to provide common functionality to many systems.

Far too often:

  • People put in web services to create a physical boundary between layers to force a decoupled architecture. A good OO developer who knows Gang of Four design patterns does not need to do that.
  • People create a service to host utility classes that should be directly referenced.
  • People create DAL services that are primarily for a single application.

As a developer I really like WCF because it is strongly typed and gives you the ability to scale out your services over different communication protocols. However, when I put on my architect and manager hat I see a different story. I will challenge developers on the need for WCF. Very often, in the environments I work in, SOA is not needed. A couple of years ago developers thought web services were the hottest thing out there and baked them into their applications. The result was performance issues and code-coupling maintenance nightmares.

Why do SOA projects fail? Governance. Typically when you build an application there are business owners, technical management, assigned development teams, and maintenance personnel. We do not see that level of support with services because services are too far behind the scenes. What typically happens is a service is implemented and used in one or two applications. Then new business requirements arrive which require application-specific modifications. The service is then modified for the specific application, and things just start becoming harder to manage. My point is we are usually in application management mode and not service management mode. Successful SOA implementations are applications unto themselves: they are developed from the beginning without much knowledge of their consumers, and the consumers must call them based on the public interface contract.

Services work great for exposing functionality from ERP systems like SAP and PeopleSoft, or for exposing enterprise functionality to SharePoint (search, content management, etc.). But nine times out of ten a service is not needed for an internal custom database backing a single custom application.

My conclusion is good design will push you to create decoupled application layers (UI, business, persistence, data, etc). WCF should just wrap these layers and expose functionality; nothing more. You should write good methods that expose enterprise data and functionality as an “endpoint” as needed. Please know your requirements and make sure you ask tough questions. Just because you can create a WCF service does not mean you should.

5.0 References

Monday, November 2, 2009

What is a FAST Enterprise Search Project Part 2

Introduction

In my previous blog (Why is FAST Enterprise Search Important) I discussed why an Enterprise Search project is important. In this blog posting I will discuss what is needed for a successful Enterprise Search project. This should hopefully give you enough information to anticipate what will be needed in an Enterprise Search project.

What is an Enterprise Search Project?

A few years ago I had to make the transition from custom application developer to application server consultant with Microsoft products. Project plans for implementing SharePoint, K2, or BizTalk were really not much different, other than having several new tasks associated with the configuration, integration, sustainability, and maintenance of the new application server. Still, with application server projects you have lots of custom artifacts and components that have to be developed. This too is the case with FAST.

When posed the question of what an Enterprise Search project is, I at first did not know where to start. I wanted to draw from my past experience. I also knew that Enterprise Search projects can be complex, but I did not understand what a search project would entail.

Content Processing and Transformation

Enterprise Search within an organization has many complexities. First, we have to be able to index content wherever it may be (in a custom database, third-party enterprise application server, file share, mainframe, etc.). Custom code may have to be written to bring this content over to FAST so that it can be indexed. Knowing this, a comprehensive analysis must be completed to understand all the content/data that is spread across the organization. A common mistake is that a company indexes bad data and gets the old "garbage in; garbage out" problem. There must be plans for handling both good and bad data, formatting unstructured data, making data relevant, normalizing data (removing duplicates), etc. We need to understand the entire life-cycle of that data and how it can be effectively pulled or pushed into the FAST search index. This is very similar to a data warehouse project, although the context is a little different.

An Enterprise Search project is also very similar to a complex ETL project, because you will have to create several transformation processes/workflows. These processes must transform the content into a document that can be recognized by the FAST index. FAST refers to anything in the index as a document, even if the indexed item comes from a database; a document in FAST is a unique piece of data with metadata that gives it relevancy. FAST provides several out-of-the-box connectors that do this transformation, and it provides an API for writing custom ones. In many cases you may have to build or extend connectors. Just as important as the ETL pre-processing, there are post-processing routines that must be executed before the search results are passed back to the user interface layer; more relevancy rules or aggregation of search results may be incorporated here. I was happy to hear that the FAST team also draws comparisons to an ETL project when discussing what an Enterprise Search project is.

User Interface

Most Enterprise Search platforms like FAST do not have a traditional GUI; FAST is an Enterprise Search engine that can be plugged into new or existing platforms. FAST does provide several controls that can be integrated into any UI platform, but in many cases you will be extending them or building completely new controls. FAST provides a rich API that is accessible in languages such as .NET, Java, and C++.

User Profile

An important element of a FAST Enterprise Search project is to understand the profile of the user performing the search. Things such as their current location, where they sit within the organization, what specialties they have, what types of past searches they have done, who they have worked for, and the past or future projects, tasks, or initiatives they have supported can all be used to give a more relevant search result. This requires integration with systems that can infer these relationships and pass this information along with the query to the FAST Query and Results server, which will return a relevant result.

Security

The profile is also important for incorporating security. FAST has numerous ways in which documents can be securely exposed to the end user. For instance, there is an Access Control List (ACL) on each document instance in the search index. The ACL is populated during the indexing of content, and this may require customizations to set the ACL appropriately. As well, more customizations may be added to do real-time authorization to ensure that documents being returned from the index have not been removed from the user's visibility. Another consideration is to partition indexes based on boundaries such as internet, extranet, and intranet. There are several more considerations, so time must be built into the plan to ensure that content is secured properly.

Installation and Configuration

A major portion of the project plan needs to be devoted to the installation and configuration of the FAST server. There are several important things to account for when doing this: how many queries will be executed concurrently, what the peak usage scenarios are, how much content will be indexed, what complexities/exceptions exist in the indexing process, what the anticipated growth is, etc. All of this must be known to properly scale the FAST server and design the custom components.

Testing

With all of the custom transformation and GUI components supporting the Enterprise Search implementation, there will need to be a focus on system integration testing, system application testing, and user acceptance testing. There will be search-specific tests to ensure that indexing, query performance, and result relevancy are accurate and within acceptable ranges. This is nothing new, but we need to be sure that a proportionate amount of time is incorporated into the plan to ensure that a quality solution is put in place.

Sustainment and Governance

Sustainment, which is commonly neglected, next needs to be part of the plan. Too often the plan is focused on the short-term end result, while the long-term management is not incorporated into the solution. What organizational management changes are required to support and maintain the search implementation? What configuration management business processes will need to be introduced to continually tune the index and relevancy model based on usage? What new roles and responsibilities need to be incorporated into employee performance plans (from both a systems and a business user perspective)? How is the enterprise taxonomy going to be maintained? What key performance metrics and reporting are needed to consistently evaluate the success of the project? What is the process for incorporating change back into the solution (which is extremely important for Enterprise Search)? If questions like these are not incorporated into the early design of the project, there will be long-term challenges with the adoption and integration of the Enterprise Search investment.

Closing

As you can see the key to a successful Enterprise Search project is to understand the needs of the business and how the solution will be supported. Many of the tasks that were discussed are very standard; we just needed to put them in context.

Why is FAST Enterprise Search Important Part 1

Introduction

The first thing many will ask before beginning a major Enterprise Search initiative with a product like FAST is: why is Enterprise Search important? Secondly, what is an Enterprise Search project? My approach is to answer these questions not from a sales perspective but from a technology management and consulting perspective.

Why is Enterprise Search important?

Users have to work with massive amounts of data that is stored either internally or externally. Search can mean lots of things to different industries, but the goal is simple: display the right information to the right person at the right time without distraction. At the same time, we must have a flexible and configurable search platform that will surface the most relevant information to the business user from wherever it is stored.

Information workers have to search for and then utilize data. How do they do this? They typically have to log into an application and perform a search, or when they enter an application, some data may be contextually rolled up to them based on who they are. There is a demand from business users to make search easier. We have heard many times, "How can I search my enterprise data the same way I Google something on the internet?" Users want the ability to go to a single place, run a search query, and receive results from across the entire enterprise. This is very different from performing a public internet search or a search function contained within the scope of a single application. Public internet searching has its own complexities, but it typically indexes content on websites. Enterprise Search becomes complex because the data being indexed can come in numerous formats (document files, databases, mainframes, etc.). From the user's perspective this complexity must be transparent: they must be given a single result set that allows them to research a problem, complete a task, or even initiate a business process.

Organizations are challenged with providing comprehensive search solutions that can access content no matter where the data resides. Public search engines have also created demand for highly relevant search experiences. Relevancy is the key to success for a search solution. To achieve accurate relevancy it is important to know as much as we can about the user entering the query. Profile relevancy can be determined by a number of things: for example, where the person is located, what their job function is, and what past searches they or their colleagues have done. Relevancy can also be determined by the attributes associated with a piece of content: for example, whether the author is considered trusted, whether the content itself is fresh, or whether the content is highly recommended by other users. The search platform must have an adaptive relevancy model. It must be able to change based on business demands and subsequently learn how to provide better results using the factors incorporated into the relevancy model. An Enterprise Search platform like FAST can provide this advanced capability.

The vision of going to a single place to find data is not really a new concept. We have seen a major push for data warehouses to create a single location to facilitate enterprise reporting. We have seen enterprise portals created to give users a single user interface that provides contextual data from disparate systems. We have seen SOA trying to consolidate business services, and now we are seeing cloud services gaining traction in the market. The reality is that enterprise architecture at large will remain disparate. Companies have made significant investments in many technologies at one time or another, and consolidating them onto a single platform is not always realistic. This is why we are constantly trying to find new solutions to work with data in a uniform manner, and it is an important justification for an Enterprise Search solution such as FAST.

To restate, the goal is to have an Enterprise Search platform that can create a single result set using disparate data from across the enterprise. Where a lot of organizations fall short is that they do not have the tools to navigate this data. Business users are required to have deep domain knowledge of the organization, the format of the data, and the business processes. The domain expert must know what is good or bad based on experience, which is not transferable, making continuity of operations challenging. This is yet another reason why an Enterprise Search platform provides significant value to an organization.

Here are some examples of how organizations have used Enterprise Search.

  • Several major ecommerce sites like Best Buy and Autotrader.com used FAST to better advertise to their customers, expose products to the customer significantly quicker, provide better navigation of search results, and provide integration with OEM partners.
  • A business data brokerage firm was able to provide more relevant results, increase user satisfaction, provide data from multiple disparate locations, improve customer retention, create a collaborative data rating system, and enable communication between subject matter experts.
  • A community facilitator for the natural resource industry created a B2B solution that provided dynamic drill-down/navigation of industry data, created automated extraction policies to mine for important data, regionalized their search results, created a pay model for more high-end results, and improved their sales model by using relevancy.
  • A major computer manufacturer used FAST to improve economies of scale for support personnel. They significantly lowered call-center costs by directing users to search first, provided customers with more up-to-date support information, and allowed their worldwide staff of engineers to use their native languages when performing a search.
  • A global law firm used FAST to create a knowledge management solution that allowed them to reduce research personnel and create a consolidated search experience. They significantly reduced the ramp-up time of new lawyers, greatly improved result relevancy with advanced content navigation, and provided better communication of best practices.
  • A law enforcement agency was able to let investigators electronically research massive amounts of data across the government which they normally did not have access to. This subsequently increased productivity, shortened the length of investigations, and helped them comply with government regulations.
  • Another government agency created a solution using FAST to search the public domain for information on persons potentially breaking laws and to initiate business processes to bring them to justice.

All these examples provide strong justification for the value of an Enterprise Search solution. With FAST, costs were reduced, regulations were met, organizations performed more efficiently, and more revenue was generated for goods and services.

What is an Enterprise Search Project?

This will be discussed in my next blog, What is a FAST Enterprise Search Project.

Saturday, October 24, 2009

FAST Search Whitepapers

Here are some great whitepapers you should read if you want to start learning about FAST. I know there is a lot of buzz around it, with its integration into SharePoint 2010 finally providing SharePoint with a robust search engine. This is a great starting point for understanding what Enterprise Search is and how it can be strategically introduced and aligned with your Enterprise Architecture.

http://www.microsoft.com/enterprisesearch/en/us/FAST-technical.aspx

Tuesday, October 20, 2009

FAST Introduction and SharePoint Search Evolution

There is a lot of information coming out of the SharePoint 2010 conference, and one of the biggest items is the integration of FAST into SharePoint 2010. What is FAST? FAST is an enterprise search engine that Microsoft acquired and has placed a significant investment into. The most important thing you should know right off the bat is that FAST does not equal SharePoint. FAST is an enterprise search platform which can be used as the search engine for SharePoint. Up to this point Microsoft has not provided a way to search for content across the enterprise; what we have done to compensate is build custom applications or purchase products like FAST and Google appliances to do enterprise search.

This is what I have seen with the evolution of search solutions in the context of SharePoint. With SharePoint 2001 there is nothing to really discuss, but with SharePoint 2003 we started to get a taste of what we wanted from search. We found that search did not really work well in SharePoint 2003 (cross-site searching did not work), and many customers who were using SharePoint 2003 said it simply did not work. It did basic text searching of content within SharePoint, but it was missing key things like relevancy. This created a small market of third-party vendors who created search solutions for SharePoint. Remember, at this time Google had become the search engine of choice; everyday business users would just say "go Google something" and get the answer. The problem was we did not have the same kind of search engine that we could use internally within a company, organization, or enterprise. As a result, FAST, Google, Autonomy, etc. created enterprise search solutions that could be used within a company's enterprise and that had many of the features required by business users.

Then SharePoint 2007 came out with Enterprise Search. It was a significant improvement over what we had with SharePoint 2003, but it was still far from being an enterprise search solution. It improved the user interface, allowed for targeted content taxonomy searching, and added a relevancy model, best bets, synonyms, administrative features, reporting, an API we can build customizations against, security using an access control list (ACL), and business data search using the Business Data Catalog (BDC); all the stuff needed when creating an enterprise search platform. We now had the ability to search for data inside and outside of SharePoint, we could rank the search results based on who you were, we could analyze searches to improve the user experience, etc.; however, it still seemed to fall short. The core problem I keep coming back to is that users expect that Google experience, not just text searching. SharePoint tried to solve some of that, but in the end it fell short.

The thing that had always been the most interesting to me is the introduction of the Business Data Catalog (BDC) to provide a single result set of data from multiple disparate data sources. This was the most interesting search feature for me when SharePoint 2007 came out. This is where they tried to become an enterprise search engine: you go to one place, you enter something to search on, and you query against many different places but get back a single result set. I personally used it successfully to index custom SQL databases of HR-related data for several clients, so when users searched for a person, they got more information about that person than just what is stored in Active Directory. Now, the BDC had lots of limitations, including only being able to call databases, stored procedures, and web services, no ability to do data transformation, an API that was very hard to develop with, and limited scalability.

With the introduction of FAST as part of the Microsoft stack, they really have a true enterprise search engine. FAST has a significant amount of features and functionality which I have not even touched upon. In my next blog, I intend to write about some of the core features and capabilities that are needed for an enterprise search solution and how they are used to meet your business users' need to find data.

For more information on the value proposition of FAST, I have written two blogs in this series: Why is FAST Enterprise Search Important (Part 1) and What is a FAST Enterprise Search Project (Part 2).

Friday, October 16, 2009

SharePoint GB 2057 Localization

I was recently asked to dig into an issue with an international SharePoint site we are setting up. I personally have little experience with globalization other than having to read about it to pass a MS certification test.

There are language packs for SharePoint which are used to support configurable text for globalization. The issue was: how is LCID 2057 for Great Britain handled? The English language pack supports 1033, which is US English. LCID 2057 is considered a sub-language of 1033. So, would it be possible to create a unique resx file for GB that maps to 2057? After digging and stumbling around, the answer is that it is not possible.

The only resolution would be to set the regional settings of the web application to LCID 2057 (GB) and then modify the resx files for US English (1033) in that specific web application.

This is what I was able to find out:

  • There is only a language pack for English (1033).
  • It is possible to have formatted text, like dates, formatted for 2057. You can change the locale to 2057 by setting SPWeb.Locale (see the sketch after this list). You can try to change the locale through the SharePoint Regional Settings screen in Site Settings, but you will not see a GB option, only US. Another way to change the locale is to go to the Webs table in the site collection database; HOWEVER, that is not supported by Microsoft.
  • In the Webs table you will see another column called Language. What I was able to find out is that the value in this column MUST correspond to a language pack that has been installed; otherwise SharePoint will bomb. So setting Language = 2057 and Locale = 2057 will not work, but Language = 1033 and Locale = 2057 will work, and it will make sure that things like dates are formatted correctly. The reason Language = 2057 fails is that in several places, including the 12 hive, SharePoint builds relative paths to resources installed with the language pack. You will see 1033 folders throughout the 12 hive, so if the Language is set to 2057, SharePoint will start looking for 2057 folders and things will start breaking. At this point I concluded it would not be possible to create a dedicated resx file for GB. Bummer.
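
For reference, here is a minimal sketch of setting the locale through the object model (the site URL is a placeholder). SPWeb.Locale is a System.Globalization.CultureInfo, which can be constructed directly from the LCID:

    using (SPSite site = new SPSite("http://XXX"))
    using (SPWeb web = site.OpenWeb())
    {
        // 2057 is the LCID for English (United Kingdom).
        web.Locale = new System.Globalization.CultureInfo(2057);
        web.Update();
    }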

Here are some references: