
Search

Search configuration always seems to be a manual process. In fact, across a majority of the SharePoint projects I've led, we're lucky to even have a development environment with search configured identically to production. And it's even more challenging on your local SharePoint environment, where it's hard to allocate enough juice for SharePoint itself, Active Directory, and search indexers.

But even if we can't precisely mirror the production farm (with multiple front ends, a load balancer, a bazillion gigs of RAM, etc.) I'd still urge you to get your local environment (whether it's a VM or bare metal) as close as possible. Don't let "I can't get search working on my machine..." be an excuse to wait until your code gets into the client's environment to smoke test your search functionality.

There's not a lot to say about search from a deployment perspective beyond automating the mundane tasks required to configure it. But since these configurations can be very tedious and time consuming, All Code can make your life a lot easier. We'll look at provisioning a search center, configuring our site collection to use it, and ensuring we have our own content source, result source, and crawled and managed properties.

The Feature

First, let's create a new "Search" feature in DDD.Web with the settings shown below. Do all the typical feature stuff we've done for the rest: pretty up the name and description, set the scope, update Constants with the feature guid, add a feature receiver, etc. To support the code in the receiver, add a reference to Microsoft.Office.Server.Search to the project (by way of a file reference from DDD.Dependencies).

Configuring the Search feature

Next, let's pick apart the code that goes into the feature receiver.

Code Listing 86: Search.EventReceiver.cs

  1. public override void FeatureActivated(SPFeatureReceiverProperties properties)
  2. {
  3. //initialization
  4. SPWebApplication webApp = properties.Feature.Parent as SPWebApplication;
  5. SPSite site = webApp.Sites.Cast<SPSite>().FirstOrDefault(s => s.ServerRelativeUrl.Equals("/"));
  6. //get root site
  7. if (site == null)
  8. site = webApp.Sites.Cast<SPSite>().SingleOrDefault();
  9. if (site == null)
  10. throw new Exception("There are no site collections in the web application.");
  11. //get service app
  12. SearchServiceApplication app = site.GetSearchApplication();
  13. if (app == null)
  14. throw new Exception("The search service application was not found.");
  15. //reset search index
  16. app.Reset(false, true);
  17. //get content object
  18. Content content = new Content(app);
  19. if (content == null)
  20. throw new Exception("The content object was not found.");
  21. //get schema object
  22. Schema schema = new Schema(app);
  23. if (schema == null)
  24. throw new Exception("The schema object was not found.");
  25. //get search center
  26. string login = string.Format("{0}\\{1}", Environment.UserDomainName, Constants.Search.Admin.Username);
  27. SPSite search = webApp.EnsureSearchCenter(
  28. Constants.Search.SearchCenter.Url,
  29. Constants.Search.SearchCenter.Title,
  30. login,
  31. Constants.Search.Admin.Username,
  32. Constants.Search.Admin.Email);
  33. //set content account
  34. SecureString password = new SecureString();
  35. Constants.Search.Admin.Password.ToList().ForEach(x => password.AppendChar(x));
  36. content.SetDefaultGatheringAccount(login, password);
  37. //set contact email
  38. SearchService.Service.ContactEmail = Constants.Search.Admin.Email;
  39. SearchService.Service.Update(true);
  40. //set search center url
  41. string searchUrl = string.Concat(search.ServerRelativeUrl, "/pages");
  42. app.SearchCenterUrl = searchUrl;
  43. //update site collection search urls
  44. site.RootWeb.AllProperties[Constants.Search.System.SearchCenterWebURLKey] = searchUrl;
  45. site.RootWeb.AllProperties[Constants.Search.System.SearchCenterSiteURLKey] = searchUrl;
  46. site.RootWeb.SetProperty(Constants.Search.System.SearchCenterSettingsKey, new SharedSearchBoxSettings(false, string.Concat(searchUrl, "/results.aspx"), false).Serialize());
  47. site.RootWeb.Update();
  48. //create content type managed property
  49. schema.EnsureSearchProperty(
  50. Constants.Search.System.CrawledPropertyContentTypeId,
  51. Constants.Search.ManagedProperties.DDDContentTypeId,
  52. ManagedDataType.Text);
  53. //create abstract managed property
  54. schema.EnsureSearchProperty(
  55. Constants.Search.CrawledProperties.Abstract,
  56. Constants.Search.ManagedProperties.Abstract,
  57. ManagedDataType.Text);
  58. //create category lookup managed property
  59. schema.EnsureSearchProperty(
  60. Constants.Search.CrawledProperties.CategoryLookup,
  61. Constants.Search.ManagedProperties.CategoryLookup,
  62. ManagedDataType.Text);
  63. //build query template
  64. SPContentTypeId articles = ContentTypeId.Page.GetContentTypeId(Constants.ContentTypes.RollupArticle.Id);
  65. string queryTemplate = string.Format(
  66. Constants.Search.Source.QueryTemplateFormat,
  67. Constants.Search.ManagedProperties.DDDContentTypeId,
  68. articles);
  69. //create result source
  70. app.EnsureResultSource(
  71. Constants.Search.Source.Name,
  72. queryTemplate);
  74. //create content source
  75. app.EnsureContentSource(site.Url, Constants.Search.Source.ContentSource);
  76. //save
  77. app.Update(true);
  78. }

There is a lot going on here. First we'll talk about the high level things the code is doing, and jump into the Utilities methods that support each task. Then we'll take a look at the substantial number of constants that come into play. Finally, we'll add this to the data creator deployment script so that as soon as we create the content, we can spin up search and get it crawled.

Notice that this code follows the "Create if not already there" paradigm since, like taxonomy, coding against service applications is global, and therefore doesn't support simply "blowing it away" and starting afresh. This is why the receiver will throw exceptions if something needed in the search service application isn't found. Provisioning service apps themselves is out of scope for this book, since we usually get them for free when SharePoint is installed.

We'll start with Line #'s 4-10, where we get the root site collection from the web application the feature is scoped to. We need a reference to it because there are really two parts to this deployment: provisioning search itself and configuring our site to consume it. There'd be no point in automating a search deployment if you still had to manually touch your site collection.

In the next few stanzas, we get references to the high-level search objects we need to start provisioning. We get the search service app itself on Line #12, the Content object on Line #18, and then, on Line #22, the Schema. "Content" is a wrapper around the content sources and their crawling logic. "Schema" is our entry point for configuring the crawled and managed properties. None of these have changed much since 2010.

Whenever we make an update to the search service application, I like to forcibly reset the search index. This clears out everything from the content sources and provides us with a clean slate to test our new configuration. It also helps alleviate any false positives we might get from these tests, in case a result is coming from an unexpected content or result source. Line #16 does this directly against the service app object. The Boolean values indicate that we aren't disabling search alerts and are ignoring unreachable servers.

The Search Center

In Line #'s 26-32, we create our search center, which is a separate site collection built off of the "SRCHCEN#0" enterprise template. The code to do this in the Utilities' EnsureSearchCenter method is not very interesting: create a site collection with this metadata at this URL if there's not already one there. To support this, Line #26 sucks in a user name from the Constants file and builds a login from it. You can move this to the web.config (backed by SPWebConfigModifications of course) if you'd like.
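If you want a starting point, here's a minimal sketch of what EnsureSearchCenter could boil down to; the SRCHCEN#0 template name comes from the paragraph above, the signature mirrors the call in Code Listing 86, and everything else (the LCID, the empty description, skipping disposal like the receiver does) is an assumption rather than the book's actual Utilities code:

//assumes using System, System.Linq, Microsoft.SharePoint, and Microsoft.SharePoint.Administration
public static SPSite EnsureSearchCenter(this SPWebApplication webApp, string url, string title, string ownerLogin, string ownerName, string ownerEmail)
{
  //look for an existing site collection at this server-relative url
  SPSite search = webApp.Sites.Cast<SPSite>().FirstOrDefault(s => s.ServerRelativeUrl.Equals(url, StringComparison.OrdinalIgnoreCase));
  if (search != null)
    return search;
  //otherwise, create an enterprise search center with the supplied owner metadata
  return webApp.Sites.Add(url, title, string.Empty, 1033, "SRCHCEN#0", ownerLogin, ownerName, ownerEmail);
}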

Now things start to get interesting. Line #'s 34-36 pull more user info from Constants, build a SecureString password, and set the "default gathering account" for the Content object. This is the impersonation context the crawlers will use. The next two lines of code set the "Contact email" for the service app. The two after that configure the "Global Search Center URL" to point to the search center site collection we just provisioned. The following screenshot shows which settings these correspond to in central administration:

Viewing the Search service application

Configure The Search Settings

For some reason, as you can see in Line #41, sample relative URLs in search settings screens around SharePoint 2013 end in "/pages," which sort of bothers me. This isn't a valid URL, as it points to a pages library, not a particular ASPX file or the default landing page of a site. Perhaps it just wants to know where the pages of a search center are located? This wasn't worth investigating; it just works. Don't question SharePoint's default behaviors too heavily or you'll drive yourself mad.

The next step is to tell our "content" site collection how to talk to its search center. This isn't in the API; it's a cryptic collection of settings that are strangely enough in the root web's property bag, not in the SPSite's. I had to sell my soul to the ILSpy devil and crack open the code behind of the following search settings page to find this code.

Viewing the site collection search settings

Above is the site collection search settings page, where we wire up the Search Center URL. Below is the site (SPWeb) settings page, where we explicitly point to our search results destination. Both of these screenshots were taken after the provisioning code ran, so we can see not only how confusing this is, but also what it looked like when I (finally) got it right.

Viewing the site search settings

Provision The Properties

Now we're going to start getting into the nitty-gritty of the search provisioning. First up is the EnsureSearchProperty extension method for the Schema object. This ensures (creates if not there) crawled properties and managed properties, and maps the two together. Although our sample search environment doesn't use them, it's important to include these in your search design so that any custom queries can grab certain site columns in their "select" clauses.
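The book doesn't list EnsureSearchProperty itself, but here's a rough sketch of the idea: create the managed property and the crawled property if they're missing, then map them together. The "SharePoint" category lookup, the isNameEnum flag, and the variant type (31, the text variant) are assumptions, and the real Utilities method almost certainly differs in the details; the mapping step at the end is also essentially what the EnsureMapping helper described in the next paragraph does for existing properties.

//assumes using System.Linq and Microsoft.Office.Server.Search.Administration
public static ManagedProperty EnsureSearchProperty(this Schema schema, string crawledName, string managedName, ManagedDataType type)
{
  //get or create the managed property
  ManagedProperty managed = schema.AllManagedProperties.Cast<ManagedProperty>().FirstOrDefault(p => p.Name.Equals(managedName));
  if (managed == null)
    managed = schema.AllManagedProperties.Create(managedName, type);
  //get or create the crawled property in the "SharePoint" category (31 = VT_LPWSTR, a text variant)
  Category category = schema.AllCategories.Cast<Category>().First(c => c.Name.Equals("SharePoint"));
  CrawledProperty crawled = category.GetAllCrawledProperties().Cast<CrawledProperty>().FirstOrDefault(p => p.Name.Equals(crawledName));
  if (crawled == null)
    crawled = category.CreateCrawledProperty(crawledName, false, Constants.Search.System.SharePointCategory, 31);
  //map the crawled property to the managed property if it isn't already
  MappingCollection mappings = managed.GetMappings();
  if (!mappings.Cast<Mapping>().Any(m => m.CrawledPropertyName.Equals(crawledName)))
  {
    mappings.Add(new Mapping(crawled.Propset, crawled.Name, crawled.VariantType, managed.PID));
    managed.SetMappings(mappings);
    managed.Update();
  }
  //return in case the caller needs further configuration
  return managed;
}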

Another method in Utilities that doesn't even appear in this code is EnsureMapping, which maps an existing managed property to an existing crawled property. I use this guy to wire my custom site columns to out-of-the-box crawled properties. You'll see our schema extensions in full force on Line #'s 49-62. It was performing the manual process of mapping these managed and crawled properties (and screwing it up) way back when that inspired me to include search in All Code.

The beauty of provisioning these properties is that you can't create crawled ones via the UI. When I first rolled this code, I assumed you couldn't create them at all; crawled properties, by definition, are only brought into existence when they are plucked from newly-indexed content. However, the API allows it! This way, you don't have to do two full crawls like you would if you were creating and mapping these properties manually: one to get the crawled properties and then one to populate the mapped-to managed ones.

The Result Source

Next up is the result source. Result sources are new in SharePoint 2013, and replace search scopes. [Note: search scopes are still available, and editable, but you can't create new ones through the UI.] They are the same in nature, acting as a layer of abstraction and organization between queries and content sources. Think of result sources as database views covering underlying tables (that are the content sources).

Just as a view in SQL is really a query, so is a result source in SharePoint search. These are implemented as "Query Templates" and provide more flexibility than scopes did. The downside is that there isn't a clear mapping between result sources and the drop down in the search box like there was in 2010. To customize this dropdown in 2013, you need to configure "Search Navigation" in the site settings for a particular web; this is out of scope (pardon the pun) for this book.

Configuring the search results

So in order to use a result source and tie it into the search box (in the content site collection) and the search results (in the search center site collection), we need to create one and set it as the default. This removes the search dropdown and forces the user's query to go against the rules specified in the default result source's query template.

In Line #'s 64-68, we first create the query template. A detailed discussion of the syntax (called "FAST Query Language" or "FQL," appropriately, since FAST search is built into SharePoint 2013) that builds this query is out of scope for this book, but not for MSDN: http://msdn.microsoft.com/en-us/library/ff394606.aspx. Instead, we'll build a simple one that takes the results from the user's query and only returns items that are modeled with our Rollup Article content type (or one that derives from it).

The ability to build this query dynamically at deployment time is quintessential DDD. Line #64 makes use of the previously-mentioned GetContentTypeId extension method, which gives us an SPContentTypeId from our parent (the built-in publishing "page" content type) and the guid for this content type in Constants. We use this newly Frankenstein-ed value and shove it into a format string (also in Constants) to build out our FQL. Here's what it looks like after the format in Line #66:

{searchTerms} DDDContentTypeId:0x010100C568DB52D9D0A14D9B2FDCC96666E9F2007948130EC3D
B064584E219954237AF3900B7851C16669A4296880F04D10E833F11*
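Before dissecting that string, a quick note on how the content type id portion gets built. SharePoint's convention for derived content type ids is the parent's id, followed by "00," followed by the child's guid with the dashes stripped, so a GetContentTypeId-style helper (the real extension method was introduced earlier in the book and may differ) is essentially just:

public static SPContentTypeId GetContentTypeId(this SPContentTypeId parent, Guid id)
{
  //parent id + "00" + the 32-digit guid is SharePoint's convention for child content type ids
  return new SPContentTypeId(string.Concat(parent.ToString(), "00", id.ToString("N")));
}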

"{searchTerms}" is an FQL token that stands for the user's query after any preliminary transformations have been applied. After a space, we tell it to include items in the index whose content type ids start with Rollup Article's. This is what the trailing asterisk does. With FQL, you can do all kinds of complex queries that include Boolean logic against our managed properties.

Next, Line #70 creates the result source. Let's take a gander at the underlying method:

Code Listing 87: Utilities.cs

  1. public static Source EnsureResultSource(this SearchServiceApplication app, string name, string queryTemplate)
  2. {
  3. //initialization
  4. Source source = null;
  5. FederationManager fm = new FederationManager(app);
  6. SearchObjectOwner owner = new SearchObjectOwner(SearchObjectLevel.Ssa);
  7. //get source
  8. source = fm.GetSourceByName(name, owner);
  9. if (source == null)
  10. {
  11. //create new source
  12. source = fm.CreateSource(owner);
  13. source.Name = name;
  14. source.CreateQueryTransform(new QueryTransformProperties(), queryTemplate);
  15. source.ProviderId = fm.ListProviders()[Constants.Search.System.LocalSharePointSearchProvider].Id;
  16. }
  17. else
  18. {
  19. //update existing source
  20. source.QueryTransform.QueryTemplate = queryTemplate;
  21. }
  22. //save
  23. source.Activate();
  24. source.Commit();
  25. //set as default
  26. fm.UpdateDefaultSource(source.Id, owner);
  27. //return
  28. return source;
  29. }

This logic is actually pretty straightforward; the hard part was figuring out which API to use. The key is Line #'s 5 and 6, where we get a FederationManager and a SearchObjectOwner. The latter is just a wrapper around the SearchObjectLevel enumeration, which determines if the search object is, well, owned by a web, a site collection, or the search service application itself. I scoped this at the service app, so that it's available globally.

The FederationManager is the thing that manages the result sources. Notice that we're operating against a "Source" object rather than anything from the old scopes API; the FederationManager hands us a Source, which, fortunately, is all we need. Still following "Create if not there," the "create" branch (starting at Line #11) builds the source and configures its required metadata.

We use the "CreateQueryTransform" method to set our lovely query in Line #14. Since we are building the entire thing in FQL, we only need to seed it with a new instance of the "QueryTransformProperties" class; "{searchTerms}" takes care of the rest. If the result source already exists, we simply update the query in Line #20.

The only cryptic bit here is Line #15, which sets the result source's "ProviderId" to the "Local SharePoint Provider" begotten from the FederationManager. This corresponds to the "Local SharePoint" option in the result source settings screen's "Protocol" section. This means that it's going to talk to the local farm, versus a remote one or something crazy like Exchange.

Editing the result source

The rest just falls out: Line #'s 23, 24, and 26 activate the source (you can turn these guys on and off), commit it (obviously the save operation), and set it as the default for the FederationManager. Finally, like the rest of the All Code extension methods (where it makes sense, at least), we return the Source object in case our consumer needs to perform additional configuration against it.

The Content Source

Our last step back in the DDD search feature receiver is to create the content source and kick off a full crawl (both done in EnsureContentSource on Line #75), and then save the changes to the service app. You can of course elect not to do the crawl if your requirements don't warrant it. I had to tip-toe around a few booby-traps in the SharePoint search API, which is why the logic in the EnsureContentSource method is a bit clunky. I'll point these out in the final search code listing below:

Code Listing 88: Utilities.cs

  1. public static void EnsureContentSource(this SearchServiceApplication app, string url, string name)
  2. {
  3. //initialization
  4. Uri uri = new Uri(url);
  5. Content content = new Content(app);
  6. //if there's already a content source by this name, there's no updates to make
  7. if (content.ContentSources.Exists(name))
  8. {
  9. //just kick off a full crawl
  10. content.ContentSources[name].StartFullCrawl();
  11. return;
  12. }
  13. //look for this url in the local content source
  14. ContentSource localSource = content.ContentSources[Constants.Search.System.SourceLocalSharePointSites];
  15. Uri startAddress = localSource.StartAddresses.Cast<Uri>().FirstOrDefault(u => u.Equals(uri));
  16. if (startAddress != null)
  17. {
  18. //remove
  19. localSource.StartAddresses.Remove(uri);
  20. localSource.Update();
  21. }
  22. //create content source
  23. ContentSource source = content.ContentSources.Create(typeof(SharePointContentSource), name);
  24. source.StartAddresses.Add(new Uri(url));
  25. source.Update();
  26. //start full crawl
  27. source.StartFullCrawl();
  28. }

Line #7 implements the "Create if not already there" pattern, since, as you'll see, there isn't a lot of metadata to configure for a content source. The aforementioned "booby-traps" are the fact that two content sources can't both index the same URL, and that you can't delete the out-of-the-box "Local SharePoint sites" content source that's currently crawling our site.

That's why the logic on Line #15 checks the "local" content source's URLs (StartAddresses) for the one passed to the method. If it's found, it needs to be removed (Line #'s 19-20) from the default source before being added to the new one (Line #'s 23-25). Finally (and, like I said, optionally), we start a full crawl of our new content source on Line #27.
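Keep in mind that StartFullCrawl just queues the crawl and returns; it doesn't block until the index is built. If a later step genuinely needs to wait for the crawl, something like the following (a sketch, with an arbitrary polling interval, against the content source name from Constants) will do it:

//assumes using System.Threading and Microsoft.Office.Server.Search.Administration,
//and an SPSite reference like the one in the receiver
SearchServiceApplication app = site.GetSearchApplication();
while (true)
{
  //re-fetch the content source each pass so we see the latest crawl status
  ContentSource source = new Content(app).ContentSources[Constants.Search.Source.ContentSource];
  if (source.CrawlStatus == CrawlStatus.Idle)
    break;
  Thread.Sleep(5000);
}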

That's the search code! Phew. These APIs are among the most intense I've ever worked with. And not because they are overly complicated or sexy; it's the fact that they don't map very cleanly to the corresponding UI screens in central administration that makes them hard to work with. Like I said, I needed to use ILSpy (http://ilspy.net) to figure out what to do. If you're ever at a loss as to why the UI is performing something seemingly magical that's not in the API, do the following:

  1. Locate the page you're looking at in 15\TEMPLATE\LAYOUTS (or elsewhere in 15).
  2. Open it up, and determine which namespace and class the page itself (or a user control on the page) is in.
  3. Drag the corresponding DLL from 15\ISAPI into ILSpy.
  4. Start digging.

In very very very few instances has the page's code done something truly magical like disappearing into COM or calling non-public methods that I couldn't somehow replicate in my own logic. If you find yourself in such a scenario, perhaps you're too far down a dead end path and need to come up with an alternate approach. Or just System.Reflection.HACK it. No, don't do that.

The Constants

Now let's take a quick look at the constants. You'll notice below that I have them all neatly organized into subclasses to keep things tidy. One, however, stands out: Search.System. These are "system" strings used by out-of-the-box SharePoint for various search operations around the API that we need to perform. Also, make sure to update the password in the Constants below. If you'd rather move this to the web.config as a new mod, go right ahead.

Since they aren't specific to the current application's logic, these are reused in almost all of my SharePoint projects. Of course, only put items in Constants that aren't available via an enumeration in the API (such as Microsoft.SharePoint.SPBuiltInContentTypeId). In fact, the two guids in this class are the only ones I haven't found...yet.

Code Listing 89: Constants.cs

  1. public static class Search
  2. {
  3. public class System
  4. {
  5. public const string SearchCenterWebURLKey = "SRCH_ENH_FTR_URL";
  6. public const string SearchCenterSettingsKey = "SRCH_SB_SET_WEB";
  7. public const string SearchCenterSiteURLKey = "SRCH_ENH_FTR_URL_SITE";
  8. public const string CrawledPropertyContentTypeId = "ows_ContentTypeId";
  9. public const string SourceLocalSharePointSites = "Local SharePoint sites";
  10. public const string LocalSharePointSearchProvider = "Local SharePoint Provider";
  11. public static readonly Guid BasicCategory = new Guid("0B63E343-9CCC-11D0-BCDB-00805FCCCE04");
  12. public static readonly Guid SharePointCategory = new Guid("00130329-0000-0130-c000-000000131346");
  13. }
  14. public class SearchCenter
  15. {
  16. public const string Title = "DDD Search";
  17. public const string Url = "/sites/search";
  18. }
  19. public class Source
  20. {
  21. public const string Name = "DDD Result Source";
  22. public const string ContentSource = "DDD Content Source";
  23. public const string QueryTemplateFormat = "{{searchTerms}} {0}:{1}*";
  24. }
  25. public class ManagedProperties
  26. {
  27. public const string Abstract = "Abstract";
  28. public const string CategoryLookup = "CategoryLookup";
  29. public const string DDDContentTypeId = "DDDContentTypeId";
  30. }
  31. public class CrawledProperties
  32. {
  33. public const string Abstract = "ows_Abstract";
  34. public const string CategoryLookup = "ows_CategoryLookup";
  35. }
  36. public class Admin
  37. {
  38. public const string Password = "***";
  39. public const string Username = "Administrator";
  40. public const string Email = "chris@chrisdomino.com";
  41. }
  42. }
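As an example of that last point about preferring the API's enumerations, there's no reason to keep a constant for, say, the built-in Document content type id when SPBuiltInContentTypeId already exposes it (the URL below is just a placeholder):

using (SPSite site = new SPSite("http://localhost"))
{
  //prefer the API's own identifiers over home-grown constants when they exist...
  SPContentType document = site.RootWeb.ContentTypes[SPBuiltInContentTypeId.Document];
  //...and only fall back to Constants for values SharePoint doesn't expose, like the two category guids above
}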

The Script

Finally, let's wrap this pretty search deployment package (more puns) up with a pretty PowerShell bow. Recall that our feature ends with kicking off a full crawl of our content source. However, it doesn't make sense to do so against an empty site. So what I like to do is create a new wrapper script in DDD.Common\Deployment called "Searcher" that first kicks off the data creator, and then provisions search. So add a new script (Searcher.ps1) that looks like so:

Code Listing 90: Searcher.ps1

  1. #initialization
  2. param($url = $(Read-Host -prompt "Url"), $path = $(Split-Path -Parent $MyInvocation.MyCommand.Path))
  3. #ensure sharepoint
  4. if((Get-PSSnapin Microsoft.Sharepoint.Powershell -ErrorAction SilentlyContinue) -eq $null)
  5. {
  6. #load snapin
  7. Add-PSSnapin Microsoft.SharePoint.Powershell;
  8. }
  9. #call data creator
  10. $script = Join-Path $path "\DataCreator.ps1";
  11. .$script -Url $url;
  12. #get search feature guid
  13. [System.Reflection.Assembly]::LoadWithPartialName("DDD.Common");
  14. Write-Host;
  15. $search = [DDD.Common.Constants+Features]::Search;
  16. #activate search (on the web application)
  17. Write-Host;
  18. Write-Host ("Activating Search (on the web application)...") -ForegroundColor Magenta;
  19. $script = Join-Path $path "\FeatureEnsureer.ps1";
  20. .$script -Url $url -Id $search -Scope "webapplication";

This isn't doing anything new; as a "wrapper" it simply aggregates calls to more functional PowerShell scripts. After the standard initialization and SharePoint snap-in insurance, we kick off the data creator on Line #'s 10 and 11 in a different process to ensure that the new commandlets are registered. Once our site has its baseline data, Line #'s 19 and 20 activate the search feature against it. Finally, let's call this script from the end of DoEverythinger.ps1:

Code Listing 91: DoEverythinger.ps1

  1. ...
  2. #search
  3. Write-Host;
  4. Write-Host ("Deploying Search") -ForegroundColor Magenta;
  5. Write-Host;
  6. $script = Join-Path $path "\Searcher.ps1";
  7. .$script -url $siteUrl;

One thing to note is that the script will more likely than not finish before the crawl completes; make sure you wait until the index is fully built before testing the search functionality against a freshly-DDD'd site. I've spent more than a few minutes troubleshooting search issues caused simply by crawls that weren't yet done crawling. But when it's done, with a single PowerShell command, we have a fully-configured, search-enabled SharePoint site:

Searching the site
