

SharePoint Development Environments: Rock Your Search Performance in 10 Steps

This article draws heavily on Microsoft's Best practices for crawling in SharePoint Server 2013 and Best practices for search in SharePoint Server 2010.

Step 1: Use the default "Local SharePoint Sites" content source

When you configure the Search Service Application, SharePoint creates and configures one content source for you, named "Local SharePoint Sites". Try to use it for all your SharePoint sites, even sites from SharePoint Server 2007 or SharePoint Server 2010. The start addresses of all web applications in the server farm are automatically included in this default content source. By default, however, it is not crawled: to index its content, you have to either manually start or schedule crawls for it.

You don't need to create a new content source for each site or web application. So when do you need to create other content sources?

  • Performance: if your sites are very heavy and continuous crawling is not feasible. Crawling content can significantly decrease the performance of the servers that host it.
  • Priorities: not all sites are equally important, and some can wait until midnight to be crawled.

However, to keep administration as simple as possible, Microsoft recommends that you limit the number of content sources you create and use.
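As a minimal sketch, you can start a full crawl of the default content source from PowerShell (the content source name below is the out-of-the-box default):

    # Start a full crawl of the default content source
    $ssa = Get-SPEnterpriseSearchServiceApplication
    $cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint Sites"
    $cs.StartFullCrawl()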

[Screenshot: http://gokanx.files.wordpress.com/2014/05/21.png?w=1200&h=162]

 

Step 2: Avoid continuous crawls outside production

Do not use continuous crawls in development, QA, or acceptance environments, or on SharePoint servers whose RAM and CPU do not meet the sizing guidelines or best practices. Use continuous crawls only in production, and only if the business asks for them.

[Screenshot: http://gokanx.files.wordpress.com/2014/05/41.png?w=1200&h=366]

The main advantage of continuous crawls is that they keep the search index fresh, even for SharePoint content that is updated frequently. A continuous crawl starts every 15 minutes by default (the interval can be changed with PowerShell), but even so, if your development environment is underpowered, performance can suffer dramatically.
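For example, a minimal sketch of changing that interval (the 5-minute value is only an illustration):

    # Change the continuous crawl interval (default: 15 minutes)
    $ssa = Get-SPEnterpriseSearchServiceApplication
    $ssa.SetProperty("ContinuousCrawlInterval", 5)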

A continuous crawl has essentially the same effect as an incremental crawl. Continuous crawls run in parallel, but they stay within the limits defined by the crawler impact rule, which controls the maximum number of simultaneous requests the crawler can make against a server (8 by default).
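If you need a tighter limit for a specific host, you can create a crawler impact rule from PowerShell; a sketch, where the host name and the rate of 4 are assumptions for illustration:

    # Allow the crawler at most 4 simultaneous requests against this host
    New-SPEnterpriseSearchSiteHitRule -Name "intranet.contoso.com" -Behavior "SimultaneousRequests" -HitRate 4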

[Screenshot: http://gokanx.files.wordpress.com/2014/05/51.png?w=1200&h=330]

Step 3: Reduce crawl impact on your crawl targets

You can reduce the effect of crawling on SharePoint crawl targets (that is, SharePoint front-end web servers) by doing the following:

  • For a small SharePoint environment, redirect all crawl traffic to a single SharePoint front-end web server. For a large environment, redirect all crawl traffic to a specific group of front-end web servers (see the sketch after this list). This prevents the crawler from consuming the same resources that are used to render and serve pages and content to active users.
  • Limit search database usage in Microsoft SQL Server to prevent the crawler from using shared SQL Server disk and processor resources during a crawl.
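Here is a sketch of the first point, redirecting crawl traffic for one web application to a dedicated front-end (both URLs are placeholders for your environment):

    # Point the crawler at a dedicated crawl target for this web application
    $listOfUri = New-Object "System.Collections.Generic.List[System.Uri]"
    $listOfUri.Add("http://crawl-wfe")
    $zone = [Microsoft.SharePoint.Administration.SPUrlZone]::Default
    $wa = Get-SPWebApplication "http://intranet.contoso.com"
    $wa.SiteDataServers.Remove($zone)          # clear any existing mapping
    $wa.SiteDataServers.Add($zone, $listOfUri) # crawl via http://crawl-wfe only
    $wa.Update()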

Step 4: Know what a search index reset really does

When you reset the search index, all content is immediately removed from the search index and users will not be able to retrieve search results. After you reset the search index, you must perform a full crawl of one or more content sources to create a new search index. Users will be able to retrieve search results again when the full crawl is finished and the new search index is created.

This can be good and bad at the same time. It is good because your index stays clean and consistently built (a development environment usually doesn't hold much data), which helps you avoid index issues. But it is extremely bad in a production environment during business hours, because search results will be unavailable until the full crawl finishes.
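If you do need to reset, a minimal sketch of the reset-then-recrawl sequence (the two Reset arguments disable alerts during the reset and ignore unreachable servers, respectively):

    # Reset the index, then rebuild it with a full crawl of every content source
    $ssa = Get-SPEnterpriseSearchServiceApplication
    $ssa.Reset($true, $true)
    Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa |
        ForEach-Object { $_.StartFullCrawl() }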

http://technet.microsoft.com/en-us/library/jj219652(v=office.15).aspx

Step 5: Defragment the search database

Defragment the search database. A common SQL Server practice is to change the index schema or rebuild indexes to compact and defragment the data. Because SharePoint's index schema cannot be changed, the only option is to rebuild the indexes of the tables that need it. The search database contains the metadata and ACLs of crawled content, and over a series of crawls it can become fragmented. To improve crawl and query performance, defragment the search database periodically. For more information, see Database maintenance for SharePoint 2010 Products.
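To check whether defragmentation is actually needed, you can query the standard SQL Server fragmentation DMV; a sketch, assuming the SQL Server PowerShell module for Invoke-Sqlcmd, with placeholder instance and database names:

    # Report index fragmentation in the (placeholder) search crawl database
    $query = "SELECT OBJECT_NAME(object_id) AS table_name, index_id, avg_fragmentation_in_percent " +
             "FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') " +
             "ORDER BY avg_fragmentation_in_percent DESC;"
    Invoke-Sqlcmd -ServerInstance "SQL01" -Database "Search_Service_CrawlStore" -Query $query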

For more information about database maintenance for Microsoft SharePoint 2010 Products: http://www.microsoft.com/en-us/download/details.aspx?id=24282

For maintenance plans (http://technet.microsoft.com/en-us/library/ms189036.aspx), avoid the Update Statistics task as well as the Reindex and Rebuild Index tasks, since SharePoint handles those through timer jobs.

Step 6: Lower the search service performance level

The Search Service Application shouldn't be allowed to damage the whole SharePoint farm, yet because of the amount of memory it demands on some servers, overall performance can become dramatic. On SharePoint 2010 you could already configure the search service performance level with PowerShell.

PowerShell cmdlet:

  • Get-SPEnterpriseSearchService | Set-SPEnterpriseSearchService -PerformanceLevel "PartlyReduced"

[Screenshot: http://gokanx.files.wordpress.com/2014/05/13.png?w=1200&h=346]

Performance Level Explained

  • Reduced: total number of threads = number of processors; max threads/host = number of processors
  • PartlyReduced: total number of threads = 4 times the number of processors; max threads/host = 16 times the number of processors
  • Maximum: total number of threads = 4 times the number of processors; max threads/host = 16 times the number of processors (threads are created at HIGH priority)

SharePoint 2013 behaves the same way. After some searching I found a TechNet article that confirms this: http://technet.microsoft.com/en-us/library/ff608126.aspx

Open PowerShell with a domain account that has farm administrator rights and run Get-SPEnterpriseSearchService; as you might expect, PerformanceLevel is set to Maximum.

[Screenshot: http://gokanx.files.wordpress.com/2014/05/22.png?w=1200&h=594]

To change this to Reduced or PartlyReduced, run Set-SPEnterpriseSearchService -PerformanceLevel Reduced and restart the SharePoint Search service (assuming your virtual machine is still responding). After that, your SharePoint server will be able to breathe again and you can get back to work.
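Put together, a minimal sketch for SharePoint 2013, where OSearch15 is the Windows service name of "SharePoint Server Search 15":

    # Lower the crawler's performance level and restart search so it applies
    Get-SPEnterpriseSearchService | Set-SPEnterpriseSearchService -PerformanceLevel Reduced
    Restart-Service OSearch15
    # Verify the new setting
    Get-SPEnterpriseSearchService | Select-Object PerformanceLevel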

Step 7: Install the March 2013 Public Update

Updating SharePoint Server 2013 with the March 2013 Public Update fixes several search-related performance problems, so please install it!

For servers that include search components, you have to follow specific steps to ensure that you install the PU correctly. For a high-availability search topology, you use Windows PowerShell cmdlets to patch the Search service application (detailed instructions here).
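As a sketch of the PowerShell side of that procedure (these cmdlets exist in SharePoint 2013, but the exact sequence for your topology is in the detailed instructions):

    # Pause the Search service application before patching
    $ssa = Get-SPEnterpriseSearchServiceApplication
    Suspend-SPEnterpriseSearchServiceApplication -Identity $ssa
    # ... install the update on each server in the farm, then resume:
    Resume-SPEnterpriseSearchServiceApplication -Identity $ssa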

Step 8: Limit noderunner.exe memory usage (development only)

 

This one is strictly unsupported and not recommended by Microsoft. Once again: this is for development environments only! As you know, several search components run as noderunner.exe processes on your servers, and they use a lot of memory. You can, however, limit that memory usage.

Navigate to C:\Program Files\Microsoft Office Servers\15.0\Search\Runtime\1.0, open noderunner.exe.config, and look for a line like this:

<nodeRunnerSettings memoryLimitMegabytes="0" />

The zero means "unlimited". All you have to do is set it to the amount of RAM (in MB) you want as the limit for each noderunner.exe process.
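A minimal sketch that makes the edit with PowerShell; the 250 MB cap is an arbitrary example, and remember this is unsupported and for development only:

    # Development only: cap each noderunner.exe at 250 MB
    $path = "C:\Program Files\Microsoft Office Servers\15.0\Search\Runtime\1.0\noderunner.exe.config"
    [xml]$config = Get-Content $path
    $config.configuration.nodeRunnerSettings.memoryLimitMegabytes = "250"
    $config.Save($path)
    # Restart the Search Host Controller so the new limit takes effect
    Restart-Service SPSearchHostController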

[Screenshot: http://gokanx.files.wordpress.com/2014/05/61.png?w=1200&h=668]

 

Step 9: Stop the Search Host Controller service (development only)

Windows Task Manager may well show you multiple resource-hungry instances of NodeRunner.exe. You can apply the tweaks above, such as setting the performance level to Reduced or editing the noderunner.exe.config file, but sometimes even that is not enough.

So, if you have this problem and you aren't working on a feature that depends on the search service, my recommendation is to stop the SharePoint Search Host Controller service.
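A minimal sketch, assuming the default SharePoint 2013 Windows service name (SPSearchHostController):

    # Development only: stop the Search Host Controller and keep it stopped
    Stop-Service SPSearchHostController
    Set-Service SPSearchHostController -StartupType Disabled
    # Re-enable it later with:
    # Set-Service SPSearchHostController -StartupType Automatic; Start-Service SPSearchHostController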

[Screenshot: http://gokanx.files.wordpress.com/2014/05/72.png?w=1200&h=148]

Step 10: Read Microsoft's best practices

Read the best practices and recommendations published by Microsoft, along with the references used in this blog post: