May 2008 - Posts
If you are building a custom document library solution, there may be a time when you don't want users to use the Explorer View of the document library. I am talking about the view itself, not the Open with Windows Explorer action. The correct way to do this, of course, is to create your own custom document library template (I'll be covering that in a future post) and then modify its schema.xml file. If you want to go crazy and unsupported, you can also modify the schema.xml of the built-in DocumentLibrary feature and accomplish the same result for the out-of-the-box document library. Removing the view is just a matter of deleting the appropriate View element in the schema.xml file. In this case, you are looking for an element similar to the one below.
<View BaseViewID="3" Type="HTML" WebPartZoneID="Main" DisplayName="$Resources:core,Explorer_View;" Url="Forms/WebFldr.aspx" SetupPath="pages\webfldr.aspx" RequiresClientIntegration="TRUE" ReadOnly="TRUE">
It will have a BaseViewID of 3 and a DisplayName indicating that it is the Explorer View. Just delete this entire View element and reactivate the feature, and the view will not be present on any new document libraries you create. Again, I recommend creating your own document library template instead of modifying the built-in one. I'll cover this soon, since I haven't found many good sets of instructions out there on how to do it.
When implementing an ECM solution, it is often necessary to come up with a way for documents to inherit metadata from a parent folder. The main reason for doing this is so that users can search on any of the document's properties. Other ECM solutions do this for you (yes, there are ECM solutions other than SharePoint), but WSS requires a bit of code to make it happen. The way you do it is by creating an item event receiver and then setting properties in the event handling methods of various events. To start out, I am assuming you have created your own content types for a custom folder and document. We'll call these Custom Folder Type and Custom Document Type respectively. I am also assuming you are deploying these via a feature. The reason is that you need to set up an item event receiver on the Custom Document Type, and it is much easier to configure via a feature.
The first thing we need to do is create the class to contain the event handling methods for the ItemAdded, ItemUpdated, and ItemCheckedIn events (there are more event types, but these are the only ones we need). I recommend putting this class in its own library (or in a library with other item event handling methods). My reason for this is that I have only been able to get an item event handling method to fire when the assembly is in the GAC. If you have read my blog in the past, you know I almost always recommend against this, but in this case, I don't know of any other choice. To implement the class, start by having your custom class inherit from SPItemEventReceiver. You can then override various event handling methods. In this case, I want to enforce the metadata inheritance whenever a document is added, updated, or checked in. To do this, we override the ItemAdded, ItemUpdated, and ItemCheckedIn methods. The contents of all three methods are typically the same. They call a method to do the inheritance, and if it fails, we attempt to cancel the event (although I have never had it cancel anything when there is an exception).
public override void ItemAdded(SPItemEventProperties properties)
{
    base.ItemAdded(properties);
    try
    {
        InheritCustomProperties(properties.ListItem);
    }
    catch (Exception)
    {
        properties.Cancel = true;
        properties.ErrorMessage = "Error inheriting metadata.";
        properties.Status = SPEventReceiverStatus.CancelWithError;
    }
}
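Since the other two overrides follow the same pattern, here is a sketch of what they might look like, assuming the same InheritCustomProperties helper shown below.

public override void ItemUpdated(SPItemEventProperties properties)
{
    base.ItemUpdated(properties);
    try
    {
        InheritCustomProperties(properties.ListItem);
    }
    catch (Exception)
    {
        properties.Cancel = true;
        properties.ErrorMessage = "Error inheriting metadata.";
        properties.Status = SPEventReceiverStatus.CancelWithError;
    }
}

public override void ItemCheckedIn(SPItemEventProperties properties)
{
    base.ItemCheckedIn(properties);
    try
    {
        InheritCustomProperties(properties.ListItem);
    }
    catch (Exception)
    {
        properties.Cancel = true;
        properties.ErrorMessage = "Error inheriting metadata.";
        properties.Status = SPEventReceiverStatus.CancelWithError;
    }
}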
The SPItemEventProperties object gives you access to the list item through its ListItem property. We then pass the item to our custom method, which does the inheritance. In this example, I am assuming that the parent folder has two properties, Product Id and Color, that we want to copy into the newly added (or updated) item.
protected void InheritCustomProperties(SPListItem childItem)
{
    // make sure that the parent folder and its item are not null
    if ((childItem.File.ParentFolder != null) && (childItem.File.ParentFolder.Item != null))
    {
        SPItem parentItem = childItem.File.ParentFolder.Item;
        // copy properties from parent to child
        childItem["Product Id"] = parentItem["Product Id"];
        childItem["Color"] = parentItem["Color"];
        // event firing must be disabled; otherwise this update will cause another event to fire
        DisableEventFiring();
        childItem.SystemUpdate(false);
        EnableEventFiring();
    }
}
One thing to make sure of is that a parent item exists for you to inherit from (obviously, you would need to do something different if you were uploading documents to the root of a library). After that, it is just a matter of getting a reference to the parent item and copying each property. Before updating the child item, you need to call the base class's DisableEventFiring method to keep it from firing another event when you save the item. Then call SystemUpdate instead of Update, which saves the item without creating a new version.
That is all the code that is involved. However, you still need to modify the XML of your content type feature to tell it to use your event receiver. In your content type's elements.xml file, you'll need to add some new entries to the XmlDocuments element (add one if you don't have one already). Then add a section similar to the one below to your file. You will see one Receiver element for each event type (ItemAdded, ItemUpdated, ItemCheckedIn).
<XmlDocument NamespaceURI="http://schemas.microsoft.com/sharepoint/events">
  <spe:Receivers xmlns:spe="http://schemas.microsoft.com/sharepoint/events">
    <Receiver>
      <Name>Custom Document Added Event Handler</Name>
      <Type>ItemAdded</Type>
      <SequenceNumber>10001</SequenceNumber>
      <Assembly>CustomEventReceivers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=f2d79125eb887b9e</Assembly>
      <Class>CustomEventReceivers.CustomDocumentLibraryItemEventReceiver</Class>
      <Data />
      <Filter />
    </Receiver>
    <Receiver>
      <Name>Custom Document Updated Event Handler</Name>
      <Type>ItemUpdated</Type>
      <SequenceNumber>10001</SequenceNumber>
      <Assembly>CustomEventReceivers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=f2d79125eb887b9e</Assembly>
      <Class>CustomEventReceivers.CustomDocumentLibraryItemEventReceiver</Class>
      <Data />
      <Filter />
    </Receiver>
    <Receiver>
      <Name>Custom Document Check In Event Handler</Name>
      <Type>ItemCheckedIn</Type>
      <SequenceNumber>10001</SequenceNumber>
      <Assembly>CustomEventReceivers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=f2d79125eb887b9e</Assembly>
      <Class>CustomEventReceivers.CustomDocumentLibraryItemEventReceiver</Class>
      <Data />
      <Filter />
    </Receiver>
  </spe:Receivers>
</XmlDocument>
The SequenceNumber is the order in which your event handling method fires. Microsoft typically recommends a value of 10,000 or above to avoid conflicts with its own event handlers. The Assembly element takes a standard reference to a strong-named assembly in the GAC. The Class element specifies the namespace and class that contain the event handling methods.
Again, I recommend keeping your content type feature and your event receiver assembly separate. Install your assemblies first with a solution package. Then you can install the feature containing your updated content type. Once you have it installed, add some metadata to a parent item, save it, and then try uploading a new document to verify that the metadata was copied to the new document. If the event handler throws an exception, most likely it will not display a visible error. Unfortunately, this means you'll have to do some GAC debugging. It sounds like quite a bit of work, but it really isn't bad.
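As a rough sketch of that deployment, assuming the event receiver assembly is packaged in a solution called CustomEventReceivers.wsp and the content type feature is called CustomContentTypes (both names made up for illustration), the stsadm steps might look like this.

stsadm -o addsolution -filename CustomEventReceivers.wsp
stsadm -o deploysolution -name CustomEventReceivers.wsp -immediate -allowgacdeployment
stsadm -o installfeature -name CustomContentTypes
stsadm -o activatefeature -name CustomContentTypes -url http://myserver/mysite

The -allowgacdeployment switch is what lets the solution package push the assembly into the GAC.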
When attempting to configure the Business Data Catalog for use with MOSS Enterprise Search, it is quite common to get this message when you try crawling for the first time. The cause is actually pretty simple, but I figured I would give some troubleshooting advice today. When you get this error, your Enterprise Search crawl account cannot access some portion of the item in the BDC. When crawling, permission has to be given to the crawl account in three different places: the BDC itself, the Application Definition you are crawling, and the individual entities inside the Application Definition.
When troubleshooting this error, the first thing you need to do is identify your crawl account. The easiest way to do this is to go to the Search Settings page on your SSP and see what the default content access account is. When looking for the source of this error, you can try looking in the Event Log, but typically the only time you see an error there is when you are indexing a SQL Server data source and the crawl account doesn't have access to the SQL Server database. This is a good time to check and make sure that the crawl account does have access to the database. Your next best bet is to check the latest log file in your 12 hive's LOGS folder. Search for something like "Access Denied" and see what you find. Here is an example of what I found.
Closing chunk on Exception: Access Denied for User 'MOSSTEST\MOSS_Search'. Securable MethodInstance with Name 'GetGetContactIdsResultsInstance' has ACL that contains: User 'MOSSTEST\administrator' with Rights 'Execute, Edit, SetPermissions, UseInBusinessDataInLists, SelectableInClients'
In this case, my error is indicating that my crawl account did not have access to the entity itself. Let's assume, though, that we don't know what permission is needed. First, start off by going to the Business Data Catalog Permissions link on your SSP. Make sure your crawl account is listed and that it has Execute and SelectableInClients permissions. You can try clicking the Copy all permissions to descendants link and hope it copies the same permissions down to the application definition and entity itself, but more than likely it won't. So, the next step is to go to the Application Definition and make sure your crawl account has the same permissions there. Again, you can try the Copy all permissions to descendants link, but I recommend also going to each entity and verifying that your crawl account has permission.
At this point, try crawling again and hopefully everything will work. To alleviate some of the pain in the process, you can also set your permissions in the Application Definition itself. One thing to note is that I have tried just granting Domain Users (or some other group) the needed permission to crawl, but it does not seem to work for some reason. It seems that you must explicitly grant access to the crawl account (and not some group that it is a member of) for the crawl to succeed. I have set up numerous BDC crawls, and it seems like I still run into this error on a regular basis. Hopefully, with these steps, you can get your crawl to run. If you can't get it to work, try checking your log files.
I have run into a few people who didn't even know what the property bag is on the SPWeb object, so I figured that would be a good topic for a post. The property bag, exposed by the Properties and AllProperties collections on the SPWeb object, gives the developer a way to cache information on a site. It can be compared to ASP.NET's cache object, but the values are shared across the farm and persist through server reboots. Unfortunately, using the property bag isn't as simple as you would think. In fact, getting it working in all situations requires a number of steps. It's probably enough to scare a lot of developers away from it.
Before we start, let me talk about the differences between Properties and AllProperties. Properties is a collection of type SPPropertyBag and is sort of considered deprecated. You can use it to add new items to the property bag, but it will only return a subset of everything in the property bag when retrieving data. Why is that? I have no idea. So Microsoft later added the AllProperties Hashtable, which contains everything in the property bag.
Here is what the start of our code block looks like to write to the property bag. This assumes it is inside a using block to get access to an SPWeb.
// unsafe updates are required to be able to write to the property bag
currentWeb.AllowUnsafeUpdates = true;
// you must check to see if the collection already has a value in the assigned key
if (!currentWeb.AllProperties.ContainsKey(key))
    currentWeb.Properties.Add(key, myValue);
else
    currentWeb.AllProperties[key] = myValue;
// update the properties
currentWeb.Properties.Update();
currentWeb.AllowUnsafeUpdates = false;
There are a number of things to note here. First, it requires AllowUnsafeUpdates to be set to true. Secondly, notice that the AllProperties object actually has a method to check whether a key exists. This is probably the only collection in SharePoint that lets you check whether something exists without throwing an exception. That is, of course, because the type is Hashtable and not a custom SharePoint collection. If the key does not already exist, you add it to the Properties object (not AllProperties). If a value already exists, you store the value using an indexer on the AllProperties object. You might be able to get this to work other ways, but after lots of attempts, this is what works for me. After the value is stored, you have to call Update on the Properties object if you want it to actually store the values. If you have multiple values to store, you only need to call this once. Lastly, be sure to remember to turn off AllowUnsafeUpdates.
The above code works great...as long as the account running it is an administrator. If you need this code to work when executed by a regular user, there is an additional step. You have to use SPSecurity.RunWithElevatedPrivileges. I am assuming you are already familiar with how this method works.
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    using (SPSite currentSiteCollection = new SPSite("http://myserver/mysite"))
    {
        using (SPWeb currentWeb = currentSiteCollection.OpenWeb())
        {
            // unsafe updates are required to be able to write to the property bag
            currentWeb.AllowUnsafeUpdates = true;
            // you must check to see if the collection already has a value in the assigned key
            if (!currentWeb.AllProperties.ContainsKey(key))
                currentWeb.Properties.Add(key, myValue);
            else
                currentWeb.AllProperties[key] = myValue;
            // update the properties
            currentWeb.Properties.Update();
            currentWeb.AllowUnsafeUpdates = false;
        }
    }
});
That is a complete example. One thing to remember is that the method inside RunWithElevatedPrivileges requires that you create a new SPSite object for it to work. Do not pass it a reference to one that already exists; create a new one by passing it a URL. Since we need the SPWeb object in this case, I have things nested as I have mentioned in a past post.
Luckily, reading from the property bag is not as difficult. Just check to see if the key exists and read from it using AllProperties.
if (currentSite.AllProperties.ContainsKey("MyItem"))
    myItem = currentSite.AllProperties["MyItem"].ToString();
This may seem like a lot of work to store and retrieve values in the property bag, but it's pretty easy to wrap this code up into a class.
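As a sketch of what such a wrapper might look like (the class and method names here are my own invention), something like this keeps the AllowUnsafeUpdates and Properties/AllProperties quirks in one place:

public static class WebPropertyBag
{
    // writes a value, using Properties.Add for new keys and the
    // AllProperties indexer for existing ones, as described above
    public static void SetValue(SPWeb web, string key, string value)
    {
        web.AllowUnsafeUpdates = true;
        if (!web.AllProperties.ContainsKey(key))
            web.Properties.Add(key, value);
        else
            web.AllProperties[key] = value;
        web.Properties.Update();
        web.AllowUnsafeUpdates = false;
    }

    // returns null instead of throwing when the key is not present
    public static string GetValue(SPWeb web, string key)
    {
        if (web.AllProperties.ContainsKey(key))
            return web.AllProperties[key].ToString();
        return null;
    }
}

The caller would still need to wrap SetValue in RunWithElevatedPrivileges when running as a regular user, as shown earlier.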
I had a good time at School of Dev this weekend. It had a fairly decent crowd considering this was its first time and it was on Mother's Day weekend. There was some good info there, and I think just about everyone got a prize or two. I had the opportunity to give my updated talk on Searching Business Data with MOSS 2007 Enterprise Search. In the talk, I walked through how to search line of business data exposed through a web service. My slides are attached to this post.
I also had the opportunity to catch a couple of other SharePoint sessions. I made it to Dennis Bottjer's session on Developing a Public Facing Internet Portal with MOSS 2007. I also caught Becky "MOSS Lover" Isserman's session on using AJAX in SharePoint. All in all I had a good time and I enjoyed the opportunity to be able to talk with other SharePoint developers.
I am looking forward to TechFest 2008. I am currently working on a new talk on using Code Access Security with SharePoint which hopefully I will be giving at a user group meeting fairly soon.
I am looking forward to giving my updated talk on the Business Data Catalog and Enterprise Search tomorrow at School of Dev. For the two of you that read my blog, feel free to stop by and see the session.
If you're like me, you may have noticed this service on your installation of SharePoint and often wondered what it was. As a developer, when you see VSS, the first thing you might think of is Visual SourceSafe. I thought the same, but I knew that couldn't be quite right. I thought to myself, why on Earth would Microsoft spend time writing integration between SharePoint and SourceSafe? I decided to look it up.
In this case, VSS is not Visual SourceSafe but is in fact the Volume Shadow Copy Service. Why they didn't call it VSCS, I don't know. If you are curious as to what the process actually does, it integrates SharePoint with the Volume Shadow Copy Service to provide a rich backup experience. Basically, it describes the data that needs to be backed up to third-party backup software vendors. If you really want to know more, here is the info from Microsoft. So now you know what it is, and this will help prevent you from saying something very noobish such as "SharePoint integrates with Visual SourceSafe out of the box." :)
I am astonished by how many public facing SharePoint sites are out there without this problem fixed. You know the one I am talking about.
The Web site wants to run the following add-on: 'Name ActiveX Control' from 'Microsoft Corporation'. If you trust the Web site and the add-on and want to allow it to run, click here...
As you may know, this issue only occurs when visiting a SharePoint site in the Internet zone with Internet Explorer 7. Microsoft describes the issue here in KB931509. Here is an easy solution to fix it. I just find this issue so annoying, and it really detracts from the quality of a site when it is this easy to take care of. Does it bother you? Maybe it's just me and there aren't that many people using IE7. I thought the adoption rate on IE7 was increasing, though. Either that or people just don't know it's occurring.
I was working on an Enterprise Search deployment the other day and I encountered this lovely error. I found the answer buried in some forum posts, but I wanted to tell you what side effects I found when implementing the solution (and I wanted to make sure I could find the answer again someday when I forget it). In my case I configured my Content Source to crawl a remote SharePoint farm (although I don't think being on a remote farm has anything to do with it). When I started the crawl, I saw the following error in my crawl log.
HRESULT E_FAIL has been returned from a call to a COM component
The reason this error occurs is that I did not have impersonation enabled in my web.config. I needed a line that looked something like this.
<identity impersonate="true" />
However, before you do that, you need to know that this really changes the way security works on the site. When impersonation is enabled, the Application Pool account will no longer be used to execute ASP.NET page requests. It also means that if you are using Integrated Security in your connection strings, you will most likely end up with an error message such as the following.
Login failed for user 'NT AUTHORITY\NETWORK SERVICE'
Additionally, if you are using the SqlDataSource, it is most likely mapped to the SPSqlDataSource (unless you removed the tag mapping), and you will get a message like the following.
This control does not allow connection strings with the following keywords: ‘Integrated Security’, ‘Trusted_Connection’.
I talked about the above message in the past. On a related note, I have discovered (assuming impersonation is off) that you can enable Integrated Security with the SPSqlDataSource control by setting the AllowIntegratedSecurity property to true.
So what is the bottom line? Well, from my personal attempts as well as from what I have read in forums, it is not possible to have impersonation enabled and also use Integrated Security in your connection strings. In theory, I believe it could be possible if you were to give all of your SharePoint users the same permissions as your original application account on your SQL Server; what I mean by that is giving DOMAIN\Domain Users permission directly on the database. I don't really like that idea, though. This, unfortunately, left me breaking a best practice (come on, it's not like SharePoint hasn't made you do that before) and using an internal SQL account for authentication.
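For reference, the resulting web.config entry looked something like this (the server, database, and account names here are placeholders, not my real values):

<connectionStrings>
  <add name="MyAppConnection"
       connectionString="Data Source=MyServer;Initial Catalog=MyDatabase;User ID=MyAppUser;Password=MyPassword"
       providerName="System.Data.SqlClient" />
</connectionStrings>

Note that there is no Integrated Security or Trusted_Connection keyword, which is what keeps it compatible with impersonation.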
I am out of ideas on this one, so currently this seems to be my only answer. Once I changed my connection string and turned impersonation off, I recrawled and everything worked great.
I'll be talking about the BDC and Enterprise Search at Tulsa's School of Dev next Saturday, May 10th. This is the same talk I was going to give at the SharePoint Users Group a while back, but it never happened due to logistics. This updated session will walk you through using the latest tools to connect the Business Data Catalog and Enterprise Search to your line of business systems. It is sure to be an exciting talk (well, as exciting as a talk on MOSS can be on a Saturday morning).