December 2008 - Posts

The topic of debugging came up in my talk SharePoint Development for ASP.NET Developers earlier this month.  One technique I mentioned is the process of determining the application pool ID using the often-used iisapp.vbs file.  I went to demo this on my Windows Server 2008 virtual machine only to find that this file no longer existed.  After doing a bit of research, I discovered that this is because the file was removed and replaced with appcmd.exe (which is a lot more powerful).  Appcmd.exe is located in c:\windows\system32\inetsrv.

Here is the command to list the worker processes:

appcmd list wp

I found this information on TechNet.  From the article, I also learned you can get this information through the IIS Manager UI if you so choose.  Just click on the server, and look for the Worker Processes icon.
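If the server is hosting several application pools, appcmd can also filter its output by attribute.  Something like the following should list only the worker process for a particular application pool (the pool name here is just a placeholder):

appcmd list wp /apppool.name:"SharePoint - 80"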


I have been reconfiguring a search index and running into this issue a lot, so I thought I would post on it.  In my case, I am crawling a content source that comes from a BDC application definition.  The first time I crawl, I get the following error.

The parameter is incorrect. (LobSystemInstance could not be found using criteria 'Name=MyBDCInstance'.)

This is obviously a cryptic error.  As I have said in the past though, if you get an error crawling, always try crawling again.  In this case, that is exactly the fix.  Run a full crawl again and everything works just fine.  I am sure there is a good reason this happens, but I have never bothered to look into it.  If you get the error, just crawl again.
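If you find yourself doing this a lot, you can also kick off the second crawl from code using the search administration object model instead of going through Central Administration.  Here is a minimal sketch; the SSP name and content source name are placeholders for your own.

using Microsoft.Office.Server.Search.Administration;

// get the search context for the SSP (the name here is a placeholder)
SearchContext context = SearchContext.GetContext("SharedServices1");

// find the content source by name and start a full crawl
Content content = new Content(context);
foreach (ContentSource contentSource in content.ContentSources)
{
    if (contentSource.Name == "My BDC Content Source")
        contentSource.StartFullCrawl();
}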

After building a workflow recently, I noticed I was getting the following error whenever I opened the task form.

An error occurred accessing a data source.

I have seen this error in the past when trying to connect to an external web service that was down, but this form was much simpler than that.  The only data source it had was ItemMetadata.xml.  The weird thing was that everything on the form was working.  At this point, I decided to hit the LOGS folder and see if I could figure out what was going on.  The exact error wasn't in there, but I found entries for my form along with an error stating that the ItemMetadata.xml file could not be read from the file system.

At this point, the answer was obvious to me.  When I configured the secondary data source, I somehow didn't set the option "Include this file as a resource in the form template or template part".  I selected this option, republished my form, and then reinstalled the workflow.  To no surprise, the error went away.

I have posted in the past about how to use the KeywordQuery class to execute an Enterprise Search query.  This makes it easy to get a DataTable containing search results.  However, one caveat is that some columns get returned as a string array (string[]) instead of a string.  This can make data binding a pain.  If you want to bind your DataTable to a GridView, it simply will not display these columns.  I used to think it only happened on custom managed properties that you create, but that is not actually the case.  It turns out this is related to yesterday's post, because it seems to happen for the same reason.  If you have a managed property that has more than one crawled property mapped to it, you get a string[] instead of a string.  I am sure there is a good reason for this, but I don't know what it is.

So how do we deal with this?  Option 1, of course, is to only map one crawled property to each managed property.  This is probably not going to be an ideal solution.  Another option is to use LINQ to DataSets to coerce that string[] into something usable.  First, we execute the query and load the results into a DataTable as usual.

// execute the query and put the results in a DataTable
ResultTableCollection queryResults = keywordQuery.Execute();
ResultTable queryResultsTable = queryResults[ResultType.RelevantResults];
DataTable queryDataTable = new DataTable();
queryDataTable.Load(queryResultsTable, LoadOption.OverwriteChanges);

In this case, I want to bind the built-in Title managed property as well as two of my own managed properties.  I use LINQ to DataSets to project these into a new anonymous type.  The Field<T> extension method is generic, so you can specify the type that you want: in this case, string[].  Notice that the Title property doesn't require the use of the string array.

var formattedResults = from row in queryDataTable.AsEnumerable()
                       select new
                       {
                           Title = row.Field<string>("Title"),
                           MyManagedProperty1 = row.Field<string[]>("MyManagedProperty1")[0],
                           MyManagedProperty2 = row.Field<string[]>("MyManagedProperty2")[0]
                       };

We can then bind this to a GridView as usual (if that's what you need it for).

SearchGridView.DataSource = formattedResults;
SearchGridView.DataBind();

Again, I am sure there is a good reason that these fields get returned as a string[].  If you don't like either of these options, you can always call the search web service and get the results as XML.
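For completeness, here is a rough sketch of what that web service call might look like.  It assumes you have added a web reference to the SSP's search.asmx (the QueryService proxy class comes from that web reference, and the URL and query packet below are just illustrative placeholders).

using System.Net;

// QueryService comes from a web reference to /_vti_bin/search.asmx
QueryService queryService = new QueryService();
queryService.Url = "http://myserver/_vti_bin/search.asmx";
queryService.Credentials = CredentialCache.DefaultCredentials;

// a minimal keyword query packet
string queryPacket =
    "<QueryPacket xmlns='urn:Microsoft.Search.Query'>" +
    "<Query><Context><QueryText type='STRING'>SharePoint</QueryText>" +
    "</Context></Query></QueryPacket>";

// Query returns the results as an XML string
string resultsXml = queryService.Query(queryPacket);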

The other day, I was trying to work some XSLT magic with my CoreResultsWebPart.  What I was trying to do was pass an integer value from a search query (that went against a BDC application definition) in a link to a page.  This should have been a simple task, but when I added the field to my template, it displayed the value System.Int64[] instead of the value of the integer.  That obviously wouldn't work for the page on the other side of the link, which was expecting the integer value.

I first went back to my BDC application definition and confirmed that the field in the TypeDescriptor element was in fact set to System.Int32.  I then decided to issue my query with some code against the web service.  I discovered that other built-in fields were returning Int64 as the type, but mine was returning the type as Object (I am sure there is a reason it uses Int64 instead of Int32, but I don't know it).  This led me back to the managed property itself.  I took a look and saw that I had two crawled properties (from two different BDC application definitions) mapped to it.  On a hunch, I removed one of them and recrawled my content source.  I ran my query again, and to my surprise it was now returning the field as an Int64 as intended.

So the lesson I have learned here is that Enterprise Search uses an Object type when you map multiple crawled properties to one managed property.  I am not really sure why this is, but I assume it thinks it might receive a different type from one of the crawled properties (even though it wouldn't let you map one of a different type).  Now my only issue is what to do if I do want multiple crawled properties mapped to a single managed property in a case like this.  It may also be possible to somehow cast the object back into an integer in the XSL and get the actual value, but my XSL skills aren't that good.  If you know how to do it, please let me know.
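If you want to check whether a managed property has this issue before fighting with the XSLT, the mappings are easy enough to inspect with the search administration object model.  A minimal sketch, assuming an SSP named SharedServices1 and a managed property named MyManagedProperty (both placeholders):

using System;
using Microsoft.Office.Server.Search.Administration;

SearchContext context = SearchContext.GetContext("SharedServices1");
Schema schema = new Schema(context);

foreach (ManagedProperty managedProperty in schema.AllManagedProperties)
{
    if (managedProperty.Name == "MyManagedProperty")
    {
        // more than one mapping here means the value comes back as Object
        foreach (Mapping mapping in managedProperty.GetMappings())
            Console.WriteLine(mapping.CrawledPropertyName);
    }
}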

I am excited to announce that the Wildcard Search Web Part and the Document Link Handler for Enterprise Search have been added to the Search Community Toolkit.  I think both of these tools are a great fit for the toolkit since they are so useful in extending the functionality of Enterprise Search.  If you are not familiar with them, the Wildcard Search Web Part is a drop-in replacement for the CoreResultsWebPart that allows wildcard searching.  The Document Link Handler gives you the ability to link directly to a document's folder, document library, or properties, as well as a link to edit the document.  I am glad these tools have been added and look forward to developing more tools to give back to the SharePoint community.  If you haven't checked out the SCT yet, give it a look.  There are some great tools there that really allow you to enhance a user's search experience.

I've seen a few posts in the forums lately on this, so I thought I would post how I have implemented it in the past.  The issue is that you have multiple SharePoint farms, but you want one index server to handle both of them.  This is actually pretty easy to set up if you don't mind your farms sharing the same Shared Services Provider (SSP).  The process below describes how to share the services of one farm's SSP with other farms.

The first thing you need to do is decide which farm will host your index server and then configure it as normal.  We'll refer to this as the parent farm.  On the parent farm, go to Central Administration -> Application Management -> Manage Shared Services between Farms.  Click on the option "This farm will provide shared services to other farms".  You will then be prompted for which SSP to use.  Most likely you only have one.  If you have more than one, pick the one that has your search index on it.  You will then need to specify a user account.  This account should already have permissions on the child farm.  Typically, I would use the account that is being used for the application pool in Central Administration.  Click OK, and you should see a screen giving you information on how to configure a child farm.  Specifically, it will give you the name of the parent farm's database server and configuration database.

Now you will want to go to your child farm.  Go to the same page to manage shared services between farms, but this time click on "This farm will consume shared services from another farm".  Enter the database server and name in the fields provided.  I recommend using Windows authentication, but keep in mind that the account will need permission to access that configuration database.  This shouldn't be an issue if both farms are using the same accounts for Central Administration though.  The last option to specify is for local Excel Services (which cannot be shared across farms).  For that, you will have to specify an SSP that is local to the child farm.

It may not be entirely obvious what to do on the next screen at first.  The goal is to associate any web applications you have created on the child farm with your new parent SSP.  So the first thing you need to do is select the SSP of your parent from the list.  It will have the word parent next to it in parentheses.  You then need to check the box next to any web application you have created and click OK.  Your services from the parent SSP have now been shared with the child.  This means any use of the Business Data Catalog, Enterprise Search, User Profile Application, or Session State will now be configured on the parent farm.  If you want to add additional child farms, just repeat these steps on each farm.

At this point, you can execute a search query on a web application on either farm and get the same results.  However, you still need to configure a content source to crawl the child farm.  You will need to go to the Content Sources page of Search Administration on the parent SSP.  You can index the child farm in one of two ways.  First, you can simply modify the existing Local Office SharePoint Server Sites content source and add the URL of the child farm's web application to the list of start URLs.  Alternatively, you can create a new content source and specify the URL of the child farm there.  I tend to prefer the latter approach since I can put the crawls on different schedules that way.  Either way you go, perform a full crawl when you are done.  Once the crawl is complete, you will be able to search for results on both farms from either farm.
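If you would rather script that last step than click through the UI, something like the following should create and crawl the new content source on the parent SSP (the SSP name, content source name, and child farm URL are all placeholders for your environment):

using System;
using Microsoft.Office.Server.Search.Administration;

SearchContext context = SearchContext.GetContext("SharedServices1");
Content content = new Content(context);

// create a new SharePoint content source pointed at the child farm
SharePointContentSource childFarmSource = (SharePointContentSource)
    content.ContentSources.Create(typeof(SharePointContentSource), "Child Farm");
childFarmSource.StartAddresses.Add(new Uri("http://childfarm"));
childFarmSource.Update();

// kick off the initial full crawl
childFarmSource.StartFullCrawl();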

Yesterday, I had the privilege of speaking at the Oklahoma City Developer Group about SharePoint Development for ASP.NET Developers.  This talk was aimed at showing developers familiar with ASP.NET how they can get started with SharePoint.  In my talk I mentioned a number of posts from my blog, so I wanted to link them here.

I know I referenced my blog a lot in my talk, but when starting out with SharePoint development, there really is too much to cover in just a short one-hour talk.  Hopefully, these posts will be a good starting point for anyone who wants to get started with SharePoint.  My slides and code samples from the talk are attached.  Feel free to leave a comment or shoot me an e-mail if you have any questions.  I want to thank the OKC Developer Group once again for having me.
