Wednesday, April 29, 2009

Publishing Site Provisioning

I recently ran into an issue with creating a site template where I wanted to use the Publishing Feature. My end goal was to create a real site definition, but I first wanted to try creating the template directly in SharePoint. I quickly found out that if you have the "Office SharePoint Server Publishing" Feature turned on, the "Save site as template" option will not be available. The most common solution that many suggested was to simply turn off the Publishing Feature, create the site template, and then manually turn the Publishing Feature back on.


However this would not work well because I am building an automated site provisioning process using K2 blackpearl. Basically, in this K2 blackpearl process I use an InfoPath form, get approval on the site request, and then dynamically generate the site with custom SharePoint groups. We are trying to drive SharePoint Governance with K2 blackpearl, which will ensure that the site topology is organized well, that SharePoint groups and users are managed in a repeatable process, and that system administrators can be less involved with creating sites. As well, I want to use customized site templates to ensure that all sites are presented in the same manner instead of being a hodge-podge mess of content. We are even going as far as adding standardized content types into the site templates with K2 workflows mapped to the content types to ensure that publication of the content always goes through a standard process.


Back to the original problem at hand: knowing that I am creating an automated site provisioning process, I cannot expect users to go in and manually turn on the Publishing Feature on the site. The options I came up with were the following:

  1. Create my own site definition and in the ONET.xml add a dependency to turn on the Publishing Feature.
  2. Create a stapling Feature that would turn on the Publishing Feature.
  3. Write some code that would turn on the Publishing Feature.
  4. There are more, but I will stick to these for now.

Option 1 – Did not work as intended. I wanted to use the STS template. I followed best practices, created my own site definition, and then added <Feature FeatureId="94C94CA6-B32F-4da9-A9E3-1F3D343D7ECB" /> to the <WebFeatures> element in the onet.xml file. I also modified the <Modules> element to have several custom web parts displayed on default.aspx. Doing this made sure that the Publishing Feature was turned on when the site was provisioned by SharePoint. However, the Publishing Feature would completely wipe out my home page (default.aspx), and all the changes I made to default.aspx in the onet.xml file were gone!
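
For reference, here is a minimal sketch of what that portion of the onet.xml looked like (abbreviated; only the Feature reference mentioned above is shown, and the rest of the site definition is omitted):

<WebFeatures>
  <!-- Office SharePoint Server Publishing (web-scoped Feature) -->
  <Feature FeatureId="94C94CA6-B32F-4da9-A9E3-1F3D343D7ECB" />
</WebFeatures>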


Now, if I manually create my site template and then manually turn on the Publishing Feature, the default.aspx will not get wiped out. Since the Publishing Feature is being turned on during the actual site creation process within SharePoint, SharePoint is allowing the Publishing Feature to take over the home page. So this would not work for me.


Option 2 – My next solution was to create the following Site Stapling Feature:

<Feature Id="13F62CC1-22DE-4719-AA44-1BCACD9E2D50"
         Title="ML Demo KB Site Staple"
         Description="Associates publishing and content type binding to Site Template."
         Version="1.0.0.0"
         Scope="Site"
         Hidden="FALSE"
         xmlns="http://schemas.microsoft.com/sharepoint/">
  <ElementManifests>
    <ElementManifest Location="SiteStaple.xml" />
  </ElementManifests>
</Feature>


<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Web Publishing -->
  <FeatureSiteTemplateAssociation Id="94C94CA6-B32F-4da9-A9E3-1F3D343D7ECB" TemplateName="KB#0" />
  <!-- KB Content Type Binding -->
  <FeatureSiteTemplateAssociation Id="AD644A91-BA8B-45ff-89FD-F96BCBEDC3BD" TemplateName="KB#0" />
</Elements>

As you can see, the FeatureSiteTemplateAssociation is used to turn on the custom Features I need, including the Publishing Feature. The result was exactly the same as Option 1. Again, because the Publishing Feature is being activated during SharePoint's site provisioning process, the default.aspx page is being overridden.

Option 3 – My ultimate solution was to add some code into my K2 blackpearl process to activate the Features. Up to that point I was pretty happy to say I had no code in my process; however, some days that is just not possible, and that is why K2 blackpearl is so great. I basically added the following lines of code into my site provisioning process and was able to completely replicate what I could do manually as a user within SharePoint.


The K2 Process






Code from "Activate Site Features"

//Activate the Features...
using (SPSite site = new SPSite(K2.StringTable["SharePoint Site Collection URL"]))
{
    string webURL = K2.StringTable["KB Collaboration Site Logical Path"] + "/KB" +
        K2.ProcessInstance.DataFields["New KB Number"].Value;

    using (SPWeb web = site.OpenWeb(webURL))
    {
        SPFeatureCollection features = web.Features;

        features.Add(new Guid(K2.StringTable["Publishing Feature"]), true);
        features.Add(new Guid(K2.StringTable["Web Publishing Feature"]), true);
        features.Add(new Guid(K2.StringTable["KB Content Type Binding Feature"]), true);
    }
}
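
As a side note, if the process might ever be re-run against a site where some of these Features are already active, it can be worth checking the Feature collection first. Here is a small defensive sketch (the helper below is hypothetical and not part of the original process; SPFeatureCollection's indexer returns null when a Feature is not active on the web):

//Hypothetical helper: activate a Feature only if it is not already active on the web.
private static void EnsureFeatureActive(SPWeb web, Guid featureId)
{
    if (web.Features[featureId] == null)
    {
        //The second argument forces activation.
        web.Features.Add(featureId, true);
    }
}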

Wednesday, April 15, 2009

K2 blackpearl Developer Resources

I was just given some links to a really great resource for K2 Developers. Chris Geier at K2 has pulled this together and is managing it; this is great!

In it are good blog postings, K2 whitepapers, KB articles, MSDN articles, videos, etc. I still recommend getting the Professional K2 blackpearl book and using this resource to supplement your learning of the K2 blackpearl platform.

Sunday, April 12, 2009

K2 blackpearl Kerberos Configuration

1 - Introduction

Recently I went through another Kerberos configuration, and I promised myself that after going through it I would write something about it. I have had to do Kerberos configurations several times, and it seems that every time I do it something changes. Luckily it is not just me who has this experience, so I do not feel too bad.


I am not a security expert nor am I a network engineer. There are tons of blogs and articles that explain Kerberos; however, I am going to try to make this simple. When installing K2 or MOSS you may need to configure Kerberos to get around authentication issues associated with an account's credentials being passed between applications that reside on different physical servers. For example:

  • Application A is on Server A.
  • Application B is on Server B.
  • A user logs into Application A on Server A and needs to use Application B services that are on Server B.
  • Application A needs to pass the user's credentials from Server A to Server B so that Application B can authenticate against that account without requiring another login.

Kerberos is required to resolve what is commonly referred to as the "double hop" issue. If you ever see errors like "NT AUTHORITY\ANONYMOUS LOGON" or "401 - Access Denied", you are experiencing an issue that can be resolved by configuring Kerberos authentication. When receiving an error like this, what is happening is that Application A does not have permission to delegate user credentials to Application B, which resides on Server B. Because there is no permission to delegate credentials from one machine to another, Application B subsequently defaults to using an anonymous login, which is not sufficient for most applications.

To resolve this issue a level of trust needs to be created such that service accounts are trusted to pass a user's credentials from one machine to another.

  • First, a combination of service type, machine, and service account is registered with the domain controller (this registration is a Service Principal Name, or SPN).
  • Second, a service account is then given permission to pass user credentials to a specified registration (service type, machine, and service account).
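
In practice, the registration step is done with the setspn utility. The general form of the command (placeholders in angle brackets) is:

setspn -A <ServiceType>/<MachineName>[:port] <Domain>\<Service Account>

The concrete commands for each K2 component are listed in the sections below.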

In the case of K2, Kerberos authentication is always required once the K2 topology becomes dispersed across multiple servers. A common scenario is when SharePoint and K2 are not installed on the same physical servers.

Setting up Kerberos authentication should not be confused with Single Sign-On (SSO). SSO is used to store multiple usernames and passwords and then allow the user to log in only once. Since all of the credential information is stored centrally, a broker uses those stored credentials to log the user into another application without having them type in a different username and password. This is typically used so that the user does not have to remember multiple usernames and passwords to access systems.

In the following sections I will discuss setting up Kerberos for K2 blackpearl. In part two of this series I will discuss Kerberos considerations for SharePoint. Setting up Kerberos authentication in a SharePoint environment is considered a best practice and provides better security than NTLM. It is also common to have to configure Kerberos with SharePoint when working with SQL Reporting Services (SSRS).

2 - K2 and Kerberos


As mentioned before, you will need to set up Kerberos authentication for K2 when it is distributed across many servers. The major components that can be distributed are:

  • The K2 Host Server
  • SQL Server
  • SharePoint
  • SQL Reporting Services
  • K2 Workspace web site

The K2 Getting Started documentation is really good. A good portion of the following comes from it; however, I have re-organized it a little and identified a few gaps.


2.1 - K2 and SQL Server


It is possible to have K2 blackpearl and SQL Server sitting on different machines and not require Kerberos authentication. The K2 Service account needs to have access to SQL Server so it can reach the various K2 databases; that is it.

2.2 - K2 Host Server


In the K2 Getting Started instructions, it states that you will need to run the following commands:

  • setspn -A K2Server/MachineName:5252 domain\K2 Service Account
  • setspn -A K2Server/MachineName.FQDN:5252 domain\K2 Service Account
  • setspn -A K2HostServer/MachineName:5555 domain\K2 Service Account
  • setspn -A K2HostServer/MachineName.FQDN:5555 domain\K2 Service Account

You will need to run these commands regardless of how you plan to distribute your K2 environment.


To verify the SPNs created by the commands you ran:


  • setspn -L domain\K2 Service Account

This command lists the SPNs that have been registered for the K2 Service Account. It is good to run it periodically so you do not get lost while doing this.

The following picture shows the Kerberos delegations that could be required based on the distribution of the K2 environment. The numbers correspond to the sections below.


2.3 - K2 and SharePoint

A common scenario is that the K2 Host Server and the SharePoint server will not reside on the same machine. If that is the case, the following commands need to be executed.

  • setspn -A HTTP/MachineName domain\SharePoint Service Account
  • setspn -A HTTP/MachineName.FQDN domain\SharePoint Service Account

Notes:


  • The MachineName is the name of the server where SharePoint resides. If there is an alias for the SharePoint site the MachineName will be the DNS entry. If you have multiple web front end servers (WFEs) in the SharePoint farm that are load balanced it is not required to run the above commands for each physical machine. For example if the SharePoint DNS entry is http://mymossfarm/, then the setspn command would be HTTP/mymossfarm. Make sure that this name is configured in DNS as an A Record and NOT an alias (CName).
  • The SharePoint Service Account refers to the service account for the application pool that SharePoint is running under.

Now that the spn has been set up for the SharePoint WFE we need to:

  • Go to Administrative Tools >> Active Directory Users and Computers.
  • Search for the SharePoint Service Account, open properties, go to the Delegation tab, and select the "Trust this user for delegation to specified services only" option. Then select the "Use Kerberos only" option.

  • Press the add button, search for the K2 Service Account and select both the K2Server and K2HostServer Service Types. These were created in section 2.2.
  • Click ok and ok again.
  • Finally you will need to go to Central Admin of SharePoint >> Application Management >> Authentication Providers. Then in the default zone you will set the IIS Authentication Settings to use "Negotiate (Kerberos)". You will need to do an IIS reset for the changes to take effect.

In the Delegation tab, what we basically did was say that the SharePoint Service Account is allowed to delegate credentials to the K2 Service Account only. Hopefully that simplifies your understanding of what is going on.

You may be wondering why you had to do all of this. If you do not, K2 commands that are generated from SharePoint will not be trusted by the K2 Host Server. For instance, the K2 task list web part, a web service call from a web-enabled InfoPath form, etc. will not be trusted by the K2 Host Server. If you were to run the K2 Service from the command line without doing any of the configurations above, you would see a bunch of "NT AUTHORITY\ANONYMOUS LOGON" errors.

Debugging Note - There can be no duplicate SPN entries. Uniqueness is defined by the combination of service and machine name. Examples of a service to this point are K2Server, K2HostServer and HTTP. A common mistake is to set up SPNs for the same service and machine with multiple service accounts. You will not receive an error when using the setspn command; however, Kerberos will not work if this is done. Some tools that will help uncover these sorts of issues are discussed later.

Unusual Error – I had a painful experience recently when configuring a K2 environment. We had followed all of the instructions correctly but we were still getting "NT AUTHORITY\ANONYMOUS LOGON" errors when watching the K2 Host Server via the command line. Our solution was to select the "Use any authentication protocol" option instead of the "Use Kerberos only" option in the Delegation tab. We lost days trying to figure out this issue, unknowingly thinking that the only valid option was to select "Use Kerberos only".

When doing some research on this option I found that the "Use any authentication protocol" option means that "the account can use the protocol transition extension to obtain a service ticket", which enables it to obtain service tickets to a pre-configured subset of kerberized services. I needed some more clarification and a colleague of mine (Jason Montgomery) sent me the following: "For Kerberos to function, a user's computer needs to be able to contact the Key Distribution Center (KDC) directly in order to get a Ticket to pass along to the Web Site they would like to authenticate with. If the user is outside the network and doesn't have network access to the KDC (Domain Controller), Protocol Transition allows the user to authenticate using any windows authentication protocol (Client Certificates, Forms Auth, Token, proprietary, etc). Once the user has authenticated, the Protocol Transition Extension allows the Service to retrieve a Ticket from the KDC on behalf of a user to itself. From this point on, the Service will be able to properly pass the user's ticket along to other tiers where required, allowing the system to function as designed." He further explained, "In your case having Protocol Transition setup didn't work because the Service needs to first authenticate the user then call LsaLogonUser or use WindowsIdentity to obtain the token using the S4USelf extension (I suspect they are using the normal Kerberos proxy delegation or S4UProxy extension). If the K2 Service doesn't use the S4USelf extension during Auth then configuring K2 to use Protocol Transition will always fail." That was much lower than I had expected to go; however, my tip is that if Kerberos is still not working, try changing the Kerberos delegation to use the "Use any authentication protocol" option.


2.4 - K2 and K2 Workspace

When K2 and the K2 Workspace are placed on different servers, Kerberos authentication will need to be configured between the K2 Workspace and the K2 Host Server. In many cases I personally tend to keep the K2 Workspace on the same machine as the K2 Host Server. The reason is that the K2 Workspace is typically only opened up to a few power users and administrators. It is full of administrative functionality, and I typically equate it to Central Administration for SharePoint.

Before continuing, note that on many occasions I have seen the same account used for both the K2 Service and the K2 Workspace Service. It is still required that you set up an SPN, because delegation is going across physical servers, and that is why Kerberos is required.

  • If you have not done section 2.2, complete that first.
  • setspn -A HTTP/MachineName domain\K2 Workspace Service Account
  • setspn -A HTTP/MachineName.FQDN domain\K2 Workspace Service Account
  • Go to Administrative Tools >> Active Directory Users and Computers.
  • Now search for the K2 Workspace Service Account, open properties, go to the Delegation tab, and select the "Trust this user for delegation to specified services only" option. Then select the "Use Kerberos only" option.
  • Press the add button, search for the K2 Service Account and select both the K2Server and K2HostServer Service Types.
  • Click ok and ok again.

We have now given the K2 Workspace Service Account permission to pass user credentials to the K2 Service Account. Without this, you will get a bunch of 401 errors in the K2 Workspace web site.

2.4.1 K2 Workspace IIS Metabase


Next you need to update the IIS Metabase. The changes will be made to IIS in order to allow Kerberos authentication for the K2 Workspace web site.

  • Open IIS Manager.
  • Right click the top node and select properties.
  • Check the Enable Direct Metabase Edit checkbox.
  • Click ok a few times and finish.
  • Get the K2 Workspace site identifier from IIS Manager. Click on the top node, and in the right main window there will be a site identifier number. That number will be used in the following commands.
  • Open a command line window and cd C:\Inetpub\AdminScripts
  • Next you need to force IIS to use Kerberos instead of NTLM. Run the following commands:
    • cscript adsutil.vbs set w3svc/NTAuthenticationProviders "Negotiate,NTLM"
    • cscript adsutil.vbs set w3svc/Site Identifier/NTAuthenticationProviders "Negotiate,NTLM"
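
For example, if the site identifier turned out to be 1234 (a made-up value for illustration), the second command would be run as:

cscript adsutil.vbs set w3svc/1234/NTAuthenticationProviders "Negotiate,NTLM"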

2.5 – SharePoint and K2 Workspace


If SharePoint and the K2 Workspace are not on the same server, Kerberos authentication needs to be set up such that the SharePoint Service Account can delegate to the RuntimeServices web services that are hosted within the K2 Workspace.

  • You must set up the SPNs that were described in section 2.3.
  • At a minimum, make sure to complete the first three bullets in section 2.4 (even if the K2 Host Server and the K2 Workspace are on the same machine).
  • Go to Administrative Tools >> Active Directory Users and Computers.
  • Now search for the SharePoint Service Account, open properties, go to the Delegation tab, and select the "Trust this user for delegation to specified services only" option. Then select the "Use Kerberos only" option.
  • Press the add button, search for the K2 Workspace Service Account and select the HTTP Service Type.
  • Click ok and ok again.

2.6 - K2 and SSRS


Another common scenario is that SSRS will already be installed somewhere. The following configurations are required.

Note - This can be avoided by installing SSRS on the machine where the K2 Host Server resides. This is possible because SSRS is a web site and can be installed anywhere. Be warned that if you do this, you will need to pay for another SQL Server license, because Microsoft deems this another install of SQL Server.

2.6.1 SSRS to K2

The Reporting Services Service Account is the account which the SSRS web site runs under in the IIS application pool. This configuration is needed to ensure that the account being used to access a report is verified against the K2 Host Server. For instance, if the K2 SmartObject Data Provider is used in an SSRS report, the account needs to be passed to the K2 Host Server.

Complete the following:

  • If you have not done section 2.2, complete that first.
  • setspn -A HTTP/MachineName domain\Reporting Services Service Account
  • setspn -A HTTP/MachineName.FQDN domain\Reporting Services Service Account
  • Go to Administrative Tools >> Active Directory Users and Computers.
  • Now search for the SSRS Service Account, open properties, go to the Delegation tab, and select the "Trust this user for delegation to specified services only" option. Then select the "Use Kerberos only" option.
  • Press the add button, search for the K2 Service Account and select only the K2HostServer Service Type.
  • Click ok and ok again.

2.6.2 K2 to SSRS


Another thing you will need to do is allow the K2 Host Server service account to schedule SSRS reports. Now the K2 Host Server needs to be able to delegate to the SSRS Service Account. You will not need to run any setspn commands because you have them all done at this point. You need to do the following:

  • Search for the K2 Service Account, open properties, go to the Delegation tab, and select the "Trust this user for delegation to specified services only" option. Then select the "Use Kerberos only" option.
  • Press the add button, search for the SSRS Service Account and select the HTTP Service Type.
  • Click ok and ok again.

2.6.3 - K2 Workspace to SSRS

To add one more wrinkle to this, if you have the K2 Workspace and SSRS sitting on different servers, you need to allow the K2 Workspace to delegate user credentials to SSRS. This is because SSRS reports are embedded directly into the K2 Workspace.

  • If you have not done section 2.4, complete that first.
  • At a minimum, complete the first three bullets of 2.6.1.
  • Go to Administrative Tools >> Active Directory Users and Computers.
  • Search for the K2 Workspace Service Account and open properties. On the Delegation tab, both the "Trust this user for delegation to specified services only" and "Use Kerberos only" options should already be selected.
  • Press the add button, search for the SSRS Service Account and select only the HTTP Service Type.
  • Click ok and ok again.

2.6.4 SSRS IIS Metabase


Next you need to update the IIS Metabase. The changes will be made to IIS in order to allow Kerberos authentication for the SSRS web site.

  • Open IIS Manager.
  • Right click the top node and select properties.
  • Check the Enable Direct Metabase Edit checkbox.
  • Click ok a few times and finish.
  • Get the SSRS site identifier from IIS Manager. Click on the top node, and in the right main window there will be a site identifier number. That number will be used in the following commands.
  • Open a command line window and cd C:\Inetpub\AdminScripts
  • Next you need to force IIS to use Kerberos instead of NTLM. Run the following commands:
    • cscript adsutil.vbs set w3svc/NTAuthenticationProviders "Negotiate,NTLM"
    • cscript adsutil.vbs set w3svc/Site Identifier/NTAuthenticationProviders "Negotiate,NTLM"

5 - Still Having Issues - Kerberos Configuration Tool


Download this tool; it helped me out tremendously after going through the Kerberos configuration, by validating whether I had set up everything correctly. It took a little bit to get the hang of it, but it works great.

You only need to install it on one machine in the farm. What you can do is modify the parameters and it will tell you if there are any issues with Kerberos authentication between machines.


Some other tools that were recommended to me which I have not used are ldifde.exe and spnquery.vbs.


6 - Recommendations

My recommendation is to place the K2 Host Server, K2 Workspace and SSRS on the same machine, with SharePoint installed in its own dedicated environment. The SSRS install is only there to support K2 SSRS reports. This will only require you to complete sections 2.2, 2.3 and 2.5.


7 - References


8 - Credits


I also had some reviewers who helped me out.

Saturday, April 4, 2009

April 2009 K2 User Group

Update 4/16/2009 - here is the recorded presentation.

All,

I will be making a presentation to the K2 User Group on Tuesday April 14th from 11am to 1pm central US time. Below is information for attending via LiveMeeting.

I will be making a presentation on how to do document management with SharePoint and K2 with no code. This is a demo that I have given a few times, and it really demonstrates how the wizards within K2 can be used to build some really effective business processes. The information below says that the demo is on blackpoint; however, it is all built on blackpearl. Since it is a no-code implementation, it can be done completely on blackpoint as well.

---------------------------------------

Phillip Knight from Merit Energy will be hosting the K2 user group meetings at Merit Energy, located at 13727 Noel Road, 2nd Floor Conference room, Tower 2, Dallas, Texas 75240. Parking information is included in the linked map below. Remote attendance information is included at the bottom of this message.

Link to map: http://www.meritenergy.com/content/MeritMap.pdf. Reminder: Merit Energy is on the 5th floor, but the meeting will be held in a 2nd floor conference room. Once off the elevator, go to the reception area and we will bring you back to the conference room.

Please RSVP to me via email
whether you are attending via live meeting or if you will be attending in person (so that we can plan for the number of people to order food for).

Check out the K2 Underground site and our user group at http://k2underground.com/k2/InterestGroupHome.aspx?IntGroupID=11. We are posting webexes/live meetings from our meetings at this site.

5/12/2009 11am – 1pm
06/9/2009 11am – 1pm
07/14/2009 11am – 1pm
08/11/2009 11am – 1pm
09/8/2009 11am – 1pm

Meeting Agenda:
11-11:15 Networking/Refreshments
11:15-11:30 Announcements/Intros of New people
11:30-11:45 Tips & Tricks
11:45-12:45 Technical Presentation
12:45-1:00 Meeting Wrap up

The Announcements section of the meeting will include any information regarding K2 upcoming events and user group events as well as brief introductions of our presenter and refreshment provider.

The Tips & Tricks Presentation is when we as members can pose questions to each other on projects that we are working on and having difficulty with. It is also a time when if we have learned something that we feel will be helpful to others, we can share it with the group. Bring yours to share/ask.


Meeting Presentation & Company:

We thank Jason Apergis from MicroLink for presenting at our April K2 user group meeting. Jason will be demonstrating a K2 blackpoint document management workflow with no code that contains InfoPath, site creation, permission management, topology management, site templates, content types, SharePoint-integrated workflow for Word documents, document metadata updating, emails, etc.


Founded in 1998, MicroLink provides Business Intelligence, Information Discovery, Portals, and Collaboration solutions. MicroLink has a history of providing reliable, high quality, customer-driven solutions that focus on improving productivity, collaboration, and teamwork throughout our customers' enterprise. With a reputation for consistent, superior performance and outstanding work in public sector and commercial organizations, MicroLink has earned the respect of its clients, partners, and employees. In recognition of this dedication, MicroLink has received the following awards: 2007 Autonomy Global Partner of the Year, 2008 Microsoft Federal Repeatable Solutions Award, 2007 Microsoft Federal Partner of the Year, the 2007 and 2006 Microsoft DoD Partner of the Year, and the IBM Cognos client award for Excellence in the Public Sector 2008.


Meeting Presenters:


Jason Apergis is a coauthor of the recently released WROX Professional K2 blackpearl book; he authored the chapter on InfoPath, SmartObjects and deployments. Jason is also a K2 Insider. He currently works for MicroLink LLC as a Solution Architect focusing on MOSS and Business Process Automation solutions. Over the past four years Jason has done a lot of work with K2.net, integrating it with BizTalk, SharePoint 2003/2007, InfoPath, ASP.NET, SSIS, mainframes and other non-Microsoft technologies. One of those solutions was nominated for a Microsoft Partner of the Year award in 2006. Jason is a Virginia Tech alumnus, having completed both his undergraduate and master's degrees in Information Technology. He plays ice hockey on a regular basis and is a huge Washington Capitals fan.


Meeting Sponsor:

We thank Jason Moseley from Hitachi Consulting for sponsoring our refreshments at our April meeting. Hitachi is a Microsoft Gold Certified Partner with a full range of business solutions and services for strategy and organization effectiveness, business intelligence and performance management, customer and channel (including CRM) and strategic technologies (including IT architecture and SOA).

Hitachi Consulting is a widely recognized leader in delivering practical, value-based business strategies and technology solutions. From business strategy development through application deployment, we are committed to helping clients quickly realize measurable business value and achieve sustainable ROI.

For more information please contact Jason Moseley, Senior Manager Hitachi Consulting (amoseley@hitachiconsulting.com, 972-768-2789)


For Virtual Attendees:

Note: please keep your phone on mute until you are ready to speak.

Audio Information

Telephone conferencing
Choose one of the following:

Start Live Meeting client, and then in Voice & Video pane under Join Audio options, click Call Me. The conferencing service will call you at the number you specify. (Recommended)

Use the information below to connect:
Toll: +1 (719) 867-1571

Toll-free: +1 (877) 860-3058

Participant code: 914421

First Time Users:

To save time before the meeting, check your system to make sure it is ready to use Microsoft Office Live Meeting.
Troubleshooting
Unable to join the meeting? Follow these steps:

Copy this address and paste it into your web browser:

1. https://www.livemeeting.com/cc/scna1/join?id=KZ3QJJ&role=attend&pw=7%21j%27mJ%28%7BP

2. Copy and paste the required information:

Meeting ID: KZ3QJJ

Entry Code: 7!j'mJ({P
Location: https://www119.livemeeting.com/cc/scna


If you would like to provide refreshments at an upcoming meeting or present at an upcoming meeting, please contact me.

Our next meeting announcement will be sent out next Tuesday.

Let me know if you have any questions prior to the meeting.

Sunday, March 1, 2009

What is SharePoint Governance?

1 Introduction

One of the most challenging things with SharePoint is not user adoption; far from it. What we have seen is that SharePoint enjoys extremely high adoption; however, most implementations of SharePoint tend to grow at an uncontrolled pace. From an IT perspective, it is reminiscent of a file share: tons of sites, with sub sites upon sub sites upon sub sites of content. Clients come back every time saying, "We love SharePoint, but we have issues." In most cases the issue is not having governance and program management to support SharePoint. SharePoint is not like other server products out there that IT departments purchase or support. It is not like PeopleSoft, SAP, SQL Server, or whatever else is out there. SharePoint allows everyday users to freely create and store content. If SharePoint is not managed or configured so that it aligns with the goals and objectives of the organization, it will grow at an uncontrolled rate.


The following are reasons why governance should be established:

  • Improve information reliability, availability, and security.
  • Address how information is shared, used and analyzed internally and externally.

2 What is Governance?

Governance is the combination of people, policies and processes that an organization leverages to achieve a desired outcome. There needs to be a measurable outcome that the organization expects to achieve through governance. To accomplish this there must be adoption by both stakeholders and users. To have high adoption, the right people must be selected to craft policies, and then SharePoint stakeholders must be given a clear understanding of how to meet these policies.

To create these policies, people from executive, financial, IT, department leadership, compliance, development and information-worker roles need to be selected. It is extremely important to pick people who share the common vision and will work together to achieve a common goal. Once policies have been created, heavy-handed enforcement is not needed. Instead, policies can be enforced with education, training and communications plans. Training needs to be geared specifically to targeted audiences and cannot be a one-time push of information. Still, even with training, a minimal level of enforcement will be required, and processes will have to be created to enforce policies. Processes can be executed through system automation or manually. Finally, a governing body is needed to measure performance and continually update policy based on the evolving workplace. What should be put in place is a governance process that is scalable and flexible enough to meet the demands of the business.



3 SharePoint Governance


The following is Microsoft's definition of how governance should be implemented with SharePoint (http://technet.microsoft.com/en-us/library/cc263356.aspx). Every enterprise is unique and should determine the best way to implement its own governance plan. The following are suggested stages of a governance implementation:

  • Determine initial principles and goals: The governance body should initially develop a governance vision, policies, and standards that can be measured to track compliance and to quantify the benefit to the enterprise. For example, at this stage, the initial corporate metadata taxonomy could be determined along with the initial IT service offerings. The initial principles, goals, and standards should be published and publicized.
  • Develop an education strategy: The governance policies that you determine must be publicized to your enterprise, and you should have ongoing education and training plans. Note that this includes training in the use of Office SharePoint Server and training in the governance standards and practices. For example, your IT department could maintain a frequently asked questions (FAQ) page on its Web site to respond to questions about its Office SharePoint Server service offerings. Your business division could provide online training that describes the implementation and use of the document management system in the enterprise.
  • Develop an ongoing plan: Because successful governance should be ongoing, the governance body should meet regularly. Ongoing activities include incorporating new requirements in the governance plan or reevaluating and adjusting governing principles or standards. Conflicts may need to be resolved as competing needs arise, such as between your IT department and one or more business divisions. Your governance body should report regularly to its executive sponsors to promote accountability and to help enforce compliance across the enterprise. Keep in mind that, although this sounds laborious, the goal is to increase the return on your investment in Office SharePoint Server, maximize the usefulness of your Office SharePoint Server solution, and increase the productivity of your enterprise.

The last sentence above is extremely important. What you should remember is that this is not easy, and organizations should plan on understanding SharePoint governance before they start.

4 What does a SharePoint Environment without Governance Look Like?


Here are just a few things I have seen:

  • Site administrators and contributors are not sufficiently trained, nor do many have strong information architecture backgrounds.
  • No governing body that understands how to use SharePoint to solve business problems.
  • No consistency in the way content is presented across SharePoint.
  • Roles and responsibilities have not been officially defined.
  • Management and staff do not always understand the level of effort involved in building and managing a SharePoint site.
  • Site topology is not actively managed.
  • There is no clear line between what is considered intranet, extranet, and public content.
  • Islands of information are created in the form of SharePoint lists and are not managed as enterprise business data. This is analogous to having Excel and Access manage enterprise business data.
  • Custom solution development, deployment and maintenance standards were never created.
  • Content database growth is uncontrolled.
  • No content development and integration.
  • No document discovery and impossible to find a document. Search is not always the solution.
  • Myriad of web documents stored all over the place.
  • No document retention and retirement.
  • Content security is uncontrolled and impossible to manage accounts.

The challenge that we have seen with the purchase of SharePoint is that it is sometimes positioned as an out-of-the-box silver bullet solution, when in fact SharePoint is a platform for creating solutions. Much of the out-of-the-box functionality of SharePoint can be used to solve many business needs. SharePoint empowers everyday business users with the ability to create web-based solutions that can be highly integrated with Microsoft Office. This is why we have seen such high adoption of SharePoint and why Microsoft is making significant investments in the technology. However, SharePoint commonly runs into problems where business users are not properly trained to create solutions nor given parameters within which they should work. This is why governance has become a predominant issue with many organizations that have implemented SharePoint.

5 SharePoint with Governance


Creating a governance structure could possibly bring:

  • Reliable and available content.
  • Business processes that are built around the web content management.
  • Information management focusing on identifying owners of high-valued data.
  • Secure information.
  • Meets business objectives.
  • Complies with policies and regulations.
  • Carbon footprint reduced.
  • Useful taxonomy and metadata management.
  • Site topology matches the organization.
  • Managed expectations (audience, staff, and executives).
  • Roles and responsibilities identified.
  • Site continuity.
  • Processes to review site statistics to improve user experience and make most popular content readily available.
  • Continuous improvement of Search by reviewing search statistics.
  • Codified policies and procedures (management, planning, design).
  • Process within the PMO to identify opportunities where SharePoint should be used in the enterprise architecture.
  • Configuration Management policies for SharePoint.
  • Change management business processes.

As you can see, SharePoint governance is not just about managing SharePoint from a system standpoint; it also provides guidance to the users of SharePoint.

All of these things can be prioritized and driven by the SharePoint Governance team. To support this team you need:

  • Business users and power users who are supportive of SharePoint.
  • Systems analysts who are devoted to supporting SharePoint.
  • Developers who create custom functionality.
  • Systems administrators who manage the SharePoint farms.
  • Technical leadership who understand how to translate business requests into SharePoint solutions.
  • Business directors who can position SharePoint.

Once a governance team is put together, it can continually manage and improve SharePoint as it evolves with the organization.

6 References

Monday, February 16, 2009

Custom SharePoint Web Service – Upload InfoPath Attachment

1.0 Creating the Utility


1.1 Introduction

This blog will hopefully help when trying to get attachments out of InfoPath forms and into SharePoint document libraries. If you are doing standard MOSS and InfoPath development, you are going to want to eventually get the attachments out of the InfoPath form. For instance when you search for attached documents, if they are still in the InfoPath form, search cannot find them because they are a binary string inside the InfoPath instance.

For K2 developers, many of you know of the storage ramifications if the InfoPath form stores the attachment binary. I wrote a blog back in the K2.net 2003 days where I created a custom web service that used the K2 API to save the attachment off into SharePoint 2003. Others came up with different solutions, like using K2 SmartObjects or extracting the binary out of the XML before the process instance starts. I just figured that there must be a better, more generic way of doing this.

1.2 Solution

The solution is to create a custom SharePoint web service that runs under the context of SharePoint. Why did I decide to ultimately use a custom SharePoint service to implement this solution?

  • I wanted to use the WSS 3.0 SharePoint API to implement this. I wanted to make this a reusable solution whether I am doing K2 development or not.
  • I wanted to use the SharePoint context to upload the document. When a document is uploaded in this fashion, the user must have permission to the document library. Now I get security for free.
  • SharePoint will upload the file under the current user account. If I had done this as an external web service, the username that uploaded the file would be the system account that the web service runs under, instead of the actual user who uploaded the document.

I will provide a detailed solution on how to get InfoPath attachments out of the forms and into SharePoint document libraries. This solution is geared towards anyone doing MOSS and InfoPath development. The solution is straightforward:

  • First, I will create a utility class that will handle saving an attachment binary into SharePoint.
  • Second, I will create a custom SharePoint service to wrap these methods.
  • Third, I will show how to deploy the custom service in a WSP solution.
  • Fourth, I will show how to wire up the custom service in an InfoPath form.

2.0 Upload Utilities


Before going off and creating this web service, let's create the utilities that will upload the files.

  • I will create a utility class that will be responsible for uploading the file.
  • I will also create a different class to assist with the creation of folders for uploaded documents. The reason I provide this is that when building a workflow for the InfoPath form, you will probably have a unique key for the process instance (WF, K2, etc.). You will want to store attached items by process instance so that attachment names do not conflict.
  • I will also write another class that will get the files so that you can display links to all the files in the InfoPath form.

The ultimate user experience is that users will never know they are uploading the document to a SharePoint library; it is completely transparent.

I will create a C# class library called WSSDistillery.Utilities. The reason I am doing this is that all of these methods can be reused in a list event handler, a WF workflow, a web part, etc. All the custom service is going to do is wrap the method calls.

In this project add references to Microsoft.SharePoint and System.Web.
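
For reference, the classes in the following sections assume roughly these using directives (a sketch based on the types used below; the namespace simply mirrors the project name):

using System;
using System.IO;
using System.Text;
using Microsoft.SharePoint;

namespace WSSDistillery.Utilities
{
    //The InfoPathForm, Folder and Items classes from the sections below go here.
}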

2.1 Upload Attachments


The trick to getting an attachment out of an InfoPath form is removing the header. When an attachment is pulled into an InfoPath form, InfoPath adds a header to the binary of the attachment that captures information like the name and size of the file. We need to strip that out and send only the binary of the attachment itself into the document library. Plus, we want to make sure the name of the file is set in SharePoint too.

The following class shows how to do it. My old blog showed one way of doing this, but I wanted to rewrite it in a better way. I found a blog that made it much simpler and have repurposed that code here.


public class InfoPathForm
{
    /// <summary>
    /// Uploads an InfoPath attachment to a list and folders. Method
    /// only supports saving items up to two folders deep.
    /// </summary>
    /// <param name="siteURL"></param>
    /// <param name="webUrl"></param>
    /// <param name="docLibName"></param>
    /// <param name="parentFolderName">Pass NULL if there is no folder.</param>
    /// <param name="subFolderName">Pass NULL if there is no sub folder.</param>
    /// <param name="infopathAttachment"></param>
    /// <param name="overwrite"></param>
    public static void UploadAttachmentFile(string siteURL, string webUrl,
        string docLibName, string parentFolderName, string subFolderName,
        byte[] infopathAttachment, bool overwrite)
    {
        try
        {
            MemoryStream stream = new MemoryStream(infopathAttachment);
            BinaryReader reader = new BinaryReader(stream);

            //Read header
            reader.ReadBytes(16); // Skip the header data.
            int fileSize = (int)reader.ReadUInt32();
            int attachmentNameLength = (int)reader.ReadUInt32() * 2;
            byte[] fileNameBytes = reader.ReadBytes(attachmentNameLength);

            //Get the attachment name
            Encoding enc = Encoding.Unicode;
            string attachmentName = enc.GetString(fileNameBytes, 0, attachmentNameLength - 2);

            //Get the real attachment without InfoPath header
            byte[] attachment = reader.ReadBytes(fileSize);

            using (SPSite site = new SPSite(siteURL))
            {
                using (SPWeb web = site.OpenWeb(webUrl))
                {
                    SPList list = web.Lists[docLibName];
                    if (String.IsNullOrEmpty(parentFolderName))
                    {
                        //Save document directly into list.
                        list.RootFolder.Files.Add(attachmentName, attachment, overwrite);
                    }
                    else
                    {
                        if (string.IsNullOrEmpty(subFolderName))
                        {
                            //Save document in folder in root of list.
                            list.RootFolder.SubFolders[parentFolderName].Files.Add(
                                attachmentName, attachment, overwrite);
                        }
                        else
                        {
                            //Save document in a sub folder.
                            list.RootFolder.SubFolders[parentFolderName].SubFolders[subFolderName].Files.Add(
                                attachmentName, attachment, overwrite);
                        }
                    }
                }
            }
        }
        catch (Exception ex)
        {
            throw new Exception("Error saving attachment >> " + ex.Message.ToString());
        }
    }
}
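
As a quick illustration, a call to this method from a workflow or event handler might look like the following (the URLs, library name, folder name and helper names are all made up for the example):

//All values below are illustrative only.
byte[] infopathAttachmentBytes = GetAttachmentBytesFromForm(); //hypothetical helper

InfoPathForm.UploadAttachmentFile(
    "http://moss/sites/kb",   //site collection URL
    "kbweb",                  //web (relative URL)
    "Attachments",            //document library name
    "KB1234",                 //parent folder (pass null for the library root)
    null,                     //sub folder (pass null if not used)
    infopathAttachmentBytes,  //the base64-decoded attachment from the InfoPath form
    true);                    //overwrite existing files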

2.2 Folder Methods

The following are the folder methods that I will use to create a folder in the root of an SPList. I also have a method for creating a sub folder. I did not try to make this handle creating folders deeper than two levels. I also know it is possible to write something a little more generic; however, I just want to allow external callers to create a parent folder and a sub folder only.

public class Folder
{
    /// <summary>
    /// Create a folder in the root of a list.
    /// </summary>
    /// <param name="siteURL"></param>
    /// <param name="webUrl"></param>
    /// <param name="docLibName"></param>
    /// <param name="folderName"></param>
    /// <returns></returns>
    public static SPListItem CreateTopFolder(string siteURL, string webUrl,
        string docLibName, string folderName)
    {
        //Keep the SPSite and SPWeb open while the folder is created.
        using (SPSite site = new SPSite(siteURL))
        {
            using (SPWeb web = site.OpenWeb(webUrl))
            {
                SPList list = web.Lists[docLibName];

                return CreateFolder(list,
                    folderName,
                    list.RootFolder.ServerRelativeUrl,
                    null);
            }
        }
    }

    /// <summary>
    /// Creates a sub folder using a reference to a parent folder.
    /// </summary>
    /// <param name="siteURL"></param>
    /// <param name="webUrl"></param>
    /// <param name="docLibName"></param>
    /// <param name="parentFolderName"></param>
    /// <param name="subFolderName"></param>
    /// <returns></returns>
    public static SPListItem CreateSubFolder(string siteURL, string webUrl,
        string docLibName, string parentFolderName, string subFolderName)
    {
        //Keep the SPSite and SPWeb open while the folder is created.
        using (SPSite site = new SPSite(siteURL))
        {
            using (SPWeb web = site.OpenWeb(webUrl))
            {
                SPList list = web.Lists[docLibName];

                return CreateFolder(list,
                    subFolderName,
                    list.RootFolder.ServerRelativeUrl + "/" + parentFolderName,
                    parentFolderName);
            }
        }
    }

    /// <summary>
    /// Creates a folder if it does not already exist.
    /// </summary>
    /// <param name="list"></param>
    /// <param name="folderName"></param>
    /// <param name="folderUrl"></param>
    /// <param name="parentFolder"></param>
    /// <returns></returns>
    public static SPListItem CreateFolder(SPList list, string folderName,
        string folderUrl, string parentFolder)
    {
        //Check whether the folder already exists.
        SPListItem folder = null;
        foreach (SPListItem f in list.Folders)
        {
            if (f.Name == folderName)
            {
                folder = f;
                break;
            }
        }

        if (folder == null)
        {
            folder = list.Items.Add(folderUrl,
                SPFileSystemObjectType.Folder, parentFolder);

            if (folder != null)
            {
                folder["Name"] = folderName;
                folder["Title"] = folderName;
                folder.Update();
            }
        }

        return folder;
    }
}

2.3 Get Files
This method will return all of the files for a specified SPList or folder. Right now it is only geared towards going two layers deep. I could write something recursive that gets all files from a specified folder at a specific level; however, right now that is not needed.

public class Items
{
    /// <summary>
    /// Method will return all of the files in a specified folder. It is
    /// limited from only getting two levels deep.
    /// </summary>
    /// <param name="siteURL"></param>
    /// <param name="webUrl"></param>
    /// <param name="docLibName"></param>
    /// <param name="parentFolderName">optional</param>
    /// <param name="subFolderName">optional</param>
    /// <returns></returns>
    public static SPListItemCollection GetItems(string siteURL, string webUrl,
        string docLibName, string parentFolderName, string subFolderName)
    {
        try
        {
            SPListItemCollection items = null;

            using (SPSite site = new SPSite(siteURL))
            {
                using (SPWeb web = site.OpenWeb(webUrl))
                {
                    SPList list = web.Lists[docLibName];
                    SPFolder folder = null;

                    if (String.IsNullOrEmpty(parentFolderName))
                    {
                        //Get from root
                        folder = list.RootFolder;
                    }
                    else
                    {
                        if (string.IsNullOrEmpty(subFolderName))
                        {
                            //Get from first level of folders
                            folder = GetFolder(list.RootFolder.SubFolders, parentFolderName);
                        }
                        else
                        {
                            //Get from second level of folders
                            folder = GetFolder(list.RootFolder.SubFolders[parentFolderName].SubFolders,
                                subFolderName);
                        }
                    }

                    //Get the items
                    if (folder != null)
                    {
                        SPQuery query = new SPQuery();
                        query.Folder = folder;

                        items = web.Lists[docLibName].GetItems(query);
                    }

                    return items;
                }
            }
        }
        catch (Exception ex)
        {
            throw new Exception("Error creating list of items >> " + ex.Message.ToString());
        }
    }

    private static SPFolder GetFolder(SPFolderCollection folders, string folderName)
    {
        foreach (SPFolder folder in folders)
        {
            if (folder.Name == folderName)
            {
                return folder;
            }
        }

        return null;
    }
}

3.0 Creating the Custom SharePoint Web Service

The next step is to create the custom SharePoint web service. There is an MSDN article and an SDK article, but here I have streamlined the steps. There is another blog I found, but it has some issues associated with the deployment.

3.1 Create a web service project


Create a web service project called WSSDistillery.Utilities.Service.InfoPath with a web service called InfoPathAttachment.asmx.

3.2 Create Web Methods


I then added a reference to the WSSDistillery.Utilities project I created in the previous section. You will also need to add a reference to Microsoft.SharePoint. All my web service does is wrap and expose the methods from my utility classes.

public class InfoPathAttachment : System.Web.Services.WebService
{
    [WebMethod]
    public void UploadInfoPathAttachment(string siteURL, string webUrl,
        string docLibName, string parentFolderName, string subFolderName,
        byte[] attachment, bool overwrite)
    {
        InfoPathForm.UploadAttachmentFile(siteURL, webUrl, docLibName,
            parentFolderName, subFolderName, attachment, overwrite);
    }

    [WebMethod]
    public void CreateFolder(string siteURL, string webUrl,
        string docLibName, string folderName)
    {
        Folder.CreateTopFolder(siteURL, webUrl, docLibName, folderName);
    }

    [WebMethod]
    public void CreateSubFolder(string siteURL, string webUrl,
        string docLibName, string parentFolderName, string subFolderName)
    {
        Folder.CreateSubFolder(siteURL, webUrl, docLibName, parentFolderName, subFolderName);
    }

    [WebMethod]
    public XmlDocument GetAttachments(string siteURL, string webUrl,
        string docLibName, string parentFolderName, string subFolderName)
    {
        StringBuilder xml = new StringBuilder();
        xml.Append("<items>");

        SPListItemCollection items = WSSDistillery.Utilities.Items.GetItems(
            siteURL, webUrl, docLibName, parentFolderName, subFolderName);

        foreach (SPListItem item in items)
        {
            //Do not include folder items
            if (item.Folder == null)
            {
                xml.AppendFormat("<item><name>{0}</name><url>{1}</url></item>",
                    item.Name, HttpUtility.UrlPathEncode(item.Web.Url + "/" + item.Url));
            }
        }

        xml.Append("</items>");

        //InfoPath can consume an XML Document easily
        XmlDocument xmlDoc = new XmlDocument();
        xmlDoc.LoadXml(xml.ToString());
        return xmlDoc;
    }
}
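
For reference, the XML returned by GetAttachments looks roughly like this (the file name and URL are made up):

<items>
  <item>
    <name>Specification.docx</name>
    <url>http://moss/sites/kb/Attachments/KB1234/Specification.docx</url>
  </item>
</items>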

3.2.1 Remove Code Behind Reference



After creating the service, right click InfoPathAttachment.asmx, click on View Markup, and remove the code-behind attribute, leaving just:

<%@ WebService Language="C#" Class="WSSDistillery.Utilities.Service.InfoPath.InfoPathAttachment" %>

3.3 Create the Discovery and WSDL files

Next we need to create the discovery and WSDL files. These steps are a little strange and may not make complete sense at first. Basically, we have to make the service discoverable inside of SharePoint.

First, I created a virtual directory in IIS that pointed to my web service project. My project is located at C:\WSSDistillery\Utilities\WSSDistillery.Utilities.Service.InfoPathAttachment and I pointed the virtual directory at those files. The name of the virtual directory is TestInfoPathAttachment. Then I opened a Visual Studio command prompt and ran the following command.

disco http://localhost/TestInfoPathAttachment/InfoPathAttachment.asmx

Second, this will generate two files (InfoPathAttachment.wsdl and InfoPathAttachment.disco) that are located at C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin. Go get those files, copy them into your web service project directory and add them into your Visual Studio project by adding existing items. Rename the files to be the following:

  • Change InfoPathAttachment.disco to InfoPathAttachmentdisco.aspx
  • Change InfoPathAttachment.wsdl to InfoPathAttachmentwsdl.aspx

Third, in both files do the following. Remove the XML tag line:


<?xml version="1.0" encoding="utf-8"?>

And replace it with this:

<%@ Page Language="C#" Inherits="System.Web.UI.Page" %>
<%@ Assembly Name="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
<%@ Import Namespace="Microsoft.SharePoint.Utilities" %>
<%@ Import Namespace="Microsoft.SharePoint" %>
<% Response.ContentType = "text/xml"; %>

Be really careful when doing this because Visual Studio will sometimes try to add double quote marks. If that happens, get rid of them, because InfoPath will break when making the web service call. Someone actually mentioned this in the MSDN article, and I would have been scratching my head for weeks if that was not there.

Fourth, go into the disco file and replace the contractRef and both soap nodes with the following. Pay special attention to the parts that must change based on the name of the web service – in this case InfoPathAttachment.

<contractRef ref=<% SPHttpUtility.AddQuote(SPHttpUtility.HtmlEncode(SPWeb.OriginalBaseUrl(Request) + "?wsdl"),Response.Output); %>
docRef=<% SPHttpUtility.AddQuote(SPHttpUtility.HtmlEncode(SPWeb.OriginalBaseUrl(Request)),Response.Output); %>
xmlns="http://schemas.xmlsoap.org/disco/scl/" />
<soap address=<% SPHttpUtility.AddQuote(SPHttpUtility.HtmlEncode(SPWeb.OriginalBaseUrl(Request)),Response.Output); %>
xmlns:q1="http://tempuri.org/" binding="q1:InfoPathAttachmentSoap" xmlns="http://schemas.xmlsoap.org/disco/soap/" />
<soap address=<% SPHttpUtility.AddQuote(SPHttpUtility.HtmlEncode(SPWeb.OriginalBaseUrl(Request)),Response.Output); %>
xmlns:q2="http://tempuri.org/" binding="q2:InfoPathAttachmentSoap12" xmlns="http://schemas.xmlsoap.org/disco/soap/" />

Fifth, in the WSDL file, go to the very end of the file and you will find the following soap nodes.

<soap:address location="http://localhost/foo/InfoPathAttachment.asmx" />

<soap12:address location="http://localhost/foo/InfoPathAttachment.asmx" />

Change them to be:

<soap:address location=<% SPHttpUtility.AddQuote(SPHttpUtility.HtmlEncode(SPWeb.OriginalBaseUrl(Request)),Response.Output); %> />

<soap12:address location=<% SPHttpUtility.AddQuote(SPHttpUtility.HtmlEncode(SPWeb.OriginalBaseUrl(Request)),Response.Output); %> />

DONE - That is really as simple as it gets!

4.0 Deploying the Service

Next we need to deploy this web service. I was disappointed in the MSDN article because the instructions had you go directly into the file system of the server and drop files in manually. In the other blog, the author had you putting DLLs in both the GAC and the web bin directory. Nuts!

As I have said before, if you cannot deploy using a SharePoint solution, you should probably reconsider your deployment plan. Plus, I have to consider the fact that in a SharePoint farm I would have to manually put the files on each server, which is a drag. In this section I will show you how to create a WSP solution that you can use to deploy your custom SharePoint web service. Note, this was not simple and required me to refresh myself on CAS (Code Access Security).

It is a well-known best practice to deploy your DLLs to the web application's bin directory (i.e. C:\Inetpub\wwwroot\wss\VirtualDirectories\80\bin). The reasons are:

  • Deploying to the GAC makes the DLL fully trusted and accessible to every web application on the server. If you have a real SharePoint farm and you are trying to respect security, some DLLs should not be that broadly accessible.
  • As well, many people get lazy and say to just drop the DLL in the GAC or, worse, raise the trust level to medium or full. If anyone tells you that, they are not thinking straight. No production environment should run above minimal trust unless a CAS policy has been specifically applied to a DLL.

4.1 Sign both Projects


You will need to sign both the web service and the utilities libraries. Even though they may not be deployed to the GAC, writing a CAS policy requires them to be signed (strong named) so that they are uniquely identified.
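
If you have not signed assemblies before, a minimal approach (the key file name WSSDistillery.snk is just my own example, not something the projects require) is to generate a key pair from a Visual Studio command prompt and then select that file on the Signing tab of each project's properties:

sn -k C:\WSSDistillery\WSSDistillery.snk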

4.2 APTC the WSSDistillery.Utilities Project

When I was going through the deployment steps farther down, I started to get errors, specifically when making the web service call. The errors said that partially trusted callers are not permitted. What was happening was that the strong-named WSSDistillery.Utilities assembly would not accept calls from the partially trusted web service. This occurs because the web service DLL sits in the SharePoint web site bin directory and not in the GAC. Even if I put WSSDistillery.Utilities in the GAC, I would still get this error. The solution is to add the following to the AssemblyInfo.cs file of the WSSDistillery.Utilities project.

[assembly: System.Security.AllowPartiallyTrustedCallers]

I did some reading and found out that Microsoft.SharePoint allows partially trusted callers too, so I did not feel too bad. The caveat is to make sure you are not susceptible to injection attacks if you are going to allow this.

4.3 Manifest File


Next I create the following manifest.xml file, which will deploy the DLLs and the .asmx and .aspx files to the ISAPI folder. You do not need to create any sort of SharePoint Feature because all we are doing is pushing files onto the web servers. I am going to provide you with two options, and both will work in a production environment that is running under minimal trust.

Before continuing, it would be good to open a Visual Studio command prompt and run the following commands, as you will need the resulting values when setting up the CAS policies:

sn -Tp C:\WSSDistillery\Utilities\WSSDistillery.Utilities.Service.InfoPathAttachment\bin\WSSDistillery.Utilities.Service.InfoPath.dll

sn -Tp C:\WSSDistillery\Utilities\WSSDistillery.Utilities.Service.InfoPathAttachment\bin\WSSDistillery.Utilities.dll
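
The output of sn -Tp looks roughly like the following (the hex values here are placeholders, not real keys). The long Public key value is what goes into the PublicKeyBlob attribute in the CodeAccessSecurity sections below, and the shorter Public key token is the value already used in the SafeControl assembly names:

Public key is
0024000004800000940000000602000000240000525341...

Public key token is 5695a828a53b1107
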
4.3.1 Option 1

In this option I deploy one DLL to the GAC and one to the web bin. The utilities DLL runs fully trusted from the GAC, while the web service DLL runs under the CAS policy defined below.

<Solution xmlns="http://schemas.microsoft.com/sharepoint/"
SolutionId="2583E0DD-2B0E-41b4-BFF3-4D4100B3D11B">
<Assemblies>
<Assembly Location="WSSDistillery.Utilities.dll" DeploymentTarget="GlobalAssemblyCache" />
<Assembly Location="WSSDistillery.Utilities.Service.InfoPath.dll" DeploymentTarget="WebApplication">
<SafeControls>
<SafeControl Assembly="WSSDistillery.Utilities.Service.InfoPath.InfoPathAttachment, WSSDistillery.Utilities.Service.InfoPath, Version=1.0.0.0, Culture=neutral, PublicKeyToken=7e600b50acc43694"
Namespace="WSSDistillery.Utilities.Service.InfoPath"
Safe="True"
TypeName="*"/>
</SafeControls>
</Assembly>
</Assemblies>
<RootFiles>
<RootFile Location="ISAPI\InfoPathAttachment.asmx"/>
<RootFile Location="ISAPI\InfoPathAttachmentdisco.aspx"/>
<RootFile Location="ISAPI\InfoPathAttachmentwsdl.aspx"/>
</RootFiles>
<CodeAccessSecurity>
<PolicyItem>
<Assemblies>
<Assembly PublicKeyBlob="[ADD VALUE]"/>
</Assemblies>
<PermissionSet class="NamedPermissionSet" Name="WSSDistillery.Utilities.Service.InfoPath" version="1" Description="Permission for WSSDistillery.Utilities.Service.InfoPath">
<IPermission class="AspNetHostingPermission" version="1" Level="Minimal" />
<IPermission class="SecurityPermission" version="1" Flags="Execution" />
<IPermission class="Microsoft.SharePoint.Security.SharePointPermission, Microsoft.SharePoint.Security, version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" Unrestricted="True" />
</PermissionSet>
</PolicyItem>
</CodeAccessSecurity>
</Solution>

4.3.2 Option 2


In this option I deploy both DLLs to the web site bin directory.

<Solution xmlns="http://schemas.microsoft.com/sharepoint/"
SolutionId="2583E0DD-2B0E-41b4-BFF3-4D4100B3D11B">
<Assemblies>
<Assembly Location="WSSDistillery.Utilities.dll" DeploymentTarget="WebApplication">
<SafeControls>
<SafeControl Assembly="WSSDistillery.Utilities.Folder, WSSDistillery.Utilities, Version=1.0.0.0, Culture=neutral, PublicKeyToken=5695a828a53b1107"
Namespace="WSSDistillery.Utilities"
Safe="True"
TypeName="*"/>
<SafeControl Assembly="WSSDistillery.Utilities.InfoPathForm, WSSDistillery.Utilities, Version=1.0.0.0, Culture=neutral, PublicKeyToken=5695a828a53b1107"
Namespace="WSSDistillery.Utilities"
Safe="True"
TypeName="*"/>
<SafeControl Assembly="WSSDistillery.Utilities.Items, WSSDistillery.Utilities, Version=1.0.0.0, Culture=neutral, PublicKeyToken=5695a828a53b1107"
Namespace="WSSDistillery.Utilities"
Safe="True"
TypeName="*"/>
</SafeControls>
</Assembly>
<Assembly Location="WSSDistillery.Utilities.Service.InfoPath.dll" DeploymentTarget="WebApplication">
<SafeControls>
<SafeControl Assembly="WSSDistillery.Utilities.Service.InfoPath.InfoPathAttachment, WSSDistillery.Utilities.Service.InfoPath, Version=1.0.0.0, Culture=neutral, PublicKeyToken=7e600b50acc43694"
Namespace="WSSDistillery.Utilities.Service.InfoPath"
Safe="True"
TypeName="*"/>
</SafeControls>
</Assembly>
</Assemblies>
<RootFiles>
<RootFile Location="ISAPI\InfoPathAttachment.asmx"/>
<RootFile Location="ISAPI\InfoPathAttachmentdisco.aspx"/>
<RootFile Location="ISAPI\InfoPathAttachmentwsdl.aspx"/>
</RootFiles>
<CodeAccessSecurity>
<PolicyItem>
<Assemblies>
<Assembly PublicKeyBlob="[ADD VALUE]"/>
</Assemblies>
<PermissionSet class="NamedPermissionSet" Name="WSSDistillery.Utilities" version="1" Description="Permission for WSSDistillery.Utilities.Service.InfoPath">
<IPermission class="AspNetHostingPermission" version="1" Level="Minimal" />
<IPermission class="SecurityPermission" version="1" Flags="Execution" />
<IPermission class="Microsoft.SharePoint.Security.SharePointPermission, Microsoft.SharePoint.Security, version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" Unrestricted="True" />
<IPermission class="System.Security.Permissions.FileIOPermission, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" version="1" Read="C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\TEMPLATE\LAYOUTS" />
</PermissionSet>
</PolicyItem>
<PolicyItem>
<Assemblies>
<Assembly PublicKeyBlob="[ADD VALUE]"/>
</Assemblies>
<PermissionSet class="NamedPermissionSet" Name="WSSDistillery.Utilities.Service.InfoPath" version="1" Description="Permission for WSSDistillery.Utilities.Service.InfoPath">
<IPermission class="AspNetHostingPermission" version="1" Level="Minimal" />
<IPermission class="SecurityPermission" version="1" Flags="Execution" />
<IPermission class="Microsoft.SharePoint.Security.SharePointPermission, Microsoft.SharePoint.Security, version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" version="1" Unrestricted="True" />
</PermissionSet>
</PolicyItem>
</CodeAccessSecurity>
</Solution>

4.4 Create a ddf File


Next you will need to create a .ddf file with the following contents. I usually just call this WSP.ddf and add it to my web service project. This will package up the two DLLs and the three web service files.


.OPTION Explicit
.Set CabinetNameTemplate="WSSDistillery.Utilities.Service.InfoPath.wsp"
.Set DiskDirectory1="C:\ "

manifest.xml
%outputDir%WSSDistillery.Utilities.dll
%outputDir%WSSDistillery.Utilities.Service.InfoPath.dll

.Set DestinationDir="ISAPI"
InfoPathAttachment.asmx
InfoPathAttachmentdisco.aspx
InfoPathAttachmentwsdl.aspx

; if we don't delete this variable, we get an error.
.Delete outputDir

4.5 Project Post Build Events


Then I add the following to the post-build events of the web service project, which will build the WSP every time I have a successful build of the project.

cd $(ProjectDir)
MakeCAB /D outputDir=$(OutDir) /f "WSP.ddf"
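
If you want to sanity-check the package before deploying it, remember that a WSP is just a CAB file. Assuming the default output path from the ddf above, you can copy it with a .cab extension and open it in Windows Explorer to confirm that manifest.xml, the two DLLs, and the three ISAPI files all landed where you expect:

copy C:\WSSDistillery.Utilities.Service.InfoPath.wsp C:\WSSDistillery.Check.cab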

4.6 Deployment Scripts

Now all you need to do is run the following STSADM commands to push the solution out. Note that you have to provide the URL of the web application because the DLLs are deployed to the web site bin and not just the GAC.

stsadm.exe -o addsolution -filename C:\WSSDistillery.Utilities.Service.InfoPath.wsp
stsadm.exe -o deploysolution -url http://MyServer/ -name WSSDistillery.Utilities.Service.InfoPath.wsp -allowGacDeployment -allowCasPolicies -immediate -force
stsadm.exe -o execadmsvcjobs

To test it, simply go to http://MyServer/_vti_bin/InfoPathAttachment.asmx and you should see the new web service.

4.7 Changes to Custom Web Service

If you make changes to the custom web service, like adding a new method or a new parameter to a WebMethod, you will need to regenerate the WSDL and disco files as described earlier. Then repackage everything, including the recompiled DLLs, retract and delete the old solution with the commands below, and push the changes back out by re-running the deployment commands from section 4.6.


stsadm.exe -o retractsolution -url http://MyServer/ -name WSSDistillery.Utilities.Service.InfoPath.wsp -immediate
stsadm.exe -o execadmsvcjobs
stsadm.exe -o deletesolution -name WSSDistillery.Utilities.Service.InfoPath.wsp
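
Alternatively, if nothing about the solution's structure has changed (same file list and assembly names), you may be able to skip the retract and delete cycle and upgrade in place. A sketch of that, which I have not used in this walkthrough, would be:

stsadm.exe -o upgradesolution -name WSSDistillery.Utilities.Service.InfoPath.wsp -filename C:\WSSDistillery.Utilities.Service.InfoPath.wsp -immediate -allowGacDeployment -allowCasPolicies
stsadm.exe -o execadmsvcjobs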

4.8 Making Web Service Discoverable in Visual Studio

This is completely optional; however, the MSDN Article mentions that to make the web service discoverable in Visual Studio alongside the default Windows SharePoint Services web services, you need to add the following nodes to the spdisco.aspx file located in the ISAPI folder of the 12 hive. You will need to make this change on every server in the farm.


<contractRef ref=<% SPHttpUtility.AddQuote(SPHttpUtility.HtmlEncode(spWeb.Url + "/_vti_bin/InfoPathAttachment.asmx?wsdl"), Response.Output); %>
docRef=<% SPHttpUtility.AddQuote(SPHttpUtility.HtmlEncode(spWeb.Url + "/_vti_bin/InfoPathAttachment.asmx"), Response.Output); %>
xmlns=" http://schemas.xmlsoap.org/disco/scl/ " />
<discoveryRef ref=<% SPHttpUtility.AddQuote(SPHttpUtility.HtmlEncode(spWeb.Url + "/_vti_bin/InfoPathAttachment.asmx?disco"),Response.Output); %>
xmlns="http://schemas.xmlsoap.org/disco/" />

5.0 Using in InfoPath

I am not going to dive much further into InfoPath development at this point. If you can get these methods created, you should be good to go from here. Below are a few notes to get you going.

5.1 Web Service Data Connection

Hooking this into your InfoPath form is really straightforward. All you need to do is create a web service data connection to http://MyServer/_vti_bin/InfoPathAttachment.asmx. When creating that web service data connection, do not select the submit option; make sure you select the retrieve (receive data) option. I am going to skip the rest of the configuration of the web service data connection as that should be pretty simple.


5.2 Create an Upload Button

The only other little trick you will need to know is that on your InfoPath form you will need to add a binary data field (base64) to the main data source. The reason is that if you try to bind the attachment control to the binary field in the web service data connection, InfoPath will not allow it. I suspect that if you crack open the manifest.xsf file you could probably work around this.

Below is a screen shot of a rule I created on a button that will upload a file from the InfoPath form to the SharePoint server. Then all you need to do is create a second rule to clear out the attachment field in the main data source and a third rule that calls the GetAttachments web method.

With all of these web methods you could satisfy a use case like:

  • User opens a Purchase Request InfoPath form.
  • A Purchase Request number is generated in the InfoPath form (out of scope of this posting).
  • An attachment folder is created for the Purchase Request.
  • User uploads a file associated with the Purchase Request.
  • User sees links to the Purchase Request attachments on the form.

5.3 SharePoint Farm?

If your SharePoint farm has several load-balanced front-end web servers, you may run into an issue when making a connection to the SharePoint web service while the InfoPath form is running in web-enabled mode. I ran into this issue a long time ago when trying to use the SharePoint Profile web service to get the current user's information from a web-enabled InfoPath form (read this blog). You will need to use the URL of a web server directly instead of the load-balanced URL. I have never dug any deeper into why this happens; if you scroll down through the responses on that blog you will see the issue that everyone runs into.

6.0 References


I did get some help along the way: