Approaches to optimising SharePoint client side communication

The new SharePoint app model provides a great framework for creating rich SharePoint solutions that run remotely from the SharePoint server itself, but this move has brought into play a new set of performance concerns that the traditional SharePoint developer has not really needed to think much about. In the world of server side code we ran right next to the database server along with SharePoint, and the only real consideration was querying only what we needed so we didn't put a big load on the SQL server. Now that we are running in a remote model, we need to give consideration to the potential performance impacts on our applications.

There are a number of things that can be done to optimise remote calls to SharePoint and to be smart about how you make them, and I will walk through some of them in this post. This isn't designed to be an exhaustive list, and you may find your environment to be a little different - this is more of a "food for thought" post designed to get you thinking about the concepts.

Understanding the impact of API choice

In SharePoint 2013 there are a number of options open to us as developers to interact with SharePoint: the managed .NET client side object model, the JavaScript object model, the REST/OData endpoints and the Silverlight object model. From a communication standpoint most of these options do similar things - they all communicate with the client.svc endpoint (which is mapped out as the _api path in the REST URL structure). Generally speaking, the skill set you have as a developer or the nature of what you are trying to build will play a key role in the selection of an approach here, but it is worth considering the difference between the REST based querying model and the other object models. When you compare how the two work (and Andrew Connell has a great blog post explaining his opinions on this) it is apparent that the REST based approach is definitely chattier than the CSOM based APIs. Now I personally line up on the same side of the fence as AC does here and I prefer the REST model, because as a developer I'm way more productive and the code I write is cleaner and easier to maintain (my opinion there, you may disagree and you're welcome to), but if you are trying to optimise every last part of the communication between your app and SharePoint, it is worth considering. Let's talk through some scenarios:

Scenario 1: You need to load a large result set of data from a SharePoint list

In this scenario say you have a list that contains 4,000 or so items. As a general rule of thumb you aren't going to request all 4,000 items to come back at once, as the loading time for that request would be enough to leave your users wondering why the app is loading so slowly - so you decide to page through the data and load it in small chunks. Here you will be making a single request for each chunk, and so the REST API may prove to be no chattier than the CSOM based approach (depending on the specifics of what's being loaded).
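
As a rough illustration, here is a minimal sketch of loading one chunk at a time over REST using $top and the next link SharePoint returns. The list name "Tasks" and the page size of 100 are placeholders, and it assumes the code runs in a page that already has an authenticated session with the site.

```typescript
// A minimal sketch of loading one page of list items at a time over REST.
// "Tasks" and the page size are illustrative only. The __next link that
// SharePoint returns is kept so the following chunk can be requested when
// the user pages forward.
interface Page {
    items: unknown[];
    nextUrl?: string; // pass back in to fetch the following chunk
}

async function loadPage(siteUrl: string, nextUrl?: string): Promise<Page> {
    const url =
        nextUrl ??
        siteUrl + "/_api/web/lists/getbytitle('Tasks')/items?$top=100";

    const response = await fetch(url, {
        headers: { Accept: "application/json;odata=verbose" },
    });
    const data = await response.json();

    return {
        items: data.d.results,
        nextUrl: data.d.__next, // undefined once the last chunk is reached
    };
}
```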

Scenario 2: You need to load data from multiple lists or objects

In this scenario you will find that you can batch multiple requests for data into a single request to SharePoint when you use the CSOM based APIs, whereas the REST based approach will have you sending a separate request over the wire for each list to get to the same data. In a scenario like this you will generally see that the REST option performs a bit worse when compared to the CSOM approach.
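
For example, here is a minimal sketch of batching two list queries into one round trip with the JavaScript object model (JSOM). It assumes sp.js is already loaded on the page, and the list names are placeholders.

```typescript
// A minimal sketch of JSOM batching: both list loads are queued on the same
// client context and sent as one request when executeQueryAsync is called.
declare const SP: any; // JSOM globals provided by SharePoint's sp.js

const ctx = SP.ClientContext.get_current();
const tasks = ctx.get_web().get_lists().getByTitle("Tasks");
const announcements = ctx.get_web().get_lists().getByTitle("Announcements");

const taskItems = tasks.getItems(SP.CamlQuery.createAllItemsQuery());
const annItems = announcements.getItems(SP.CamlQuery.createAllItemsQuery());

// Both loads are only queued locally at this point; nothing goes over the wire yet.
ctx.load(taskItems, "Include(Title)");
ctx.load(annItems, "Include(Title)");

// One round trip to client.svc services both queries.
ctx.executeQueryAsync(
    () => { console.log("Both lists returned in a single request"); },
    (_sender: unknown, args: any) => { console.error(args.get_message()); }
);
```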

 

Query only for what you need

This might sound immediately obvious, but I felt it was worth calling out here and highlighting a couple of approaches to help with this. The simple fact is that if you request more data, you will receive more data - end of story. The more data you receive, the longer it will take to come down the wire to your application. There is some great documentation on MSDN that covers the concepts of data retrieval in the CSOM APIs (at https://msdn.microsoft.com/en-us/library/ee539350(v=office.14).aspx), which discusses different approaches to querying, such as how to write LINQ queries, how to filter down to just the properties you need, and the difference between the in place load and queryable load methods.
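
To show the same idea in REST terms, here is a hypothetical sketch that trims a query down with $select, $filter and $top so that only the fields and rows you actually use come down the wire. The site URL, list name and field names are placeholders.

```typescript
// A hypothetical sketch of a trimmed REST query: only the columns we need,
// filtered server side, with a cap on the number of rows returned.
const siteUrl = "https://contoso.sharepoint.com/sites/sales"; // assumed site
const query =
    "/_api/web/lists/getbytitle('Orders')/items" +
    "?$select=Id,Title,OrderTotal" +      // only the columns we need
    "&$filter=OrderTotal gt 1000" +       // filter on the server, not in the browser
    "&$top=50";                           // cap the result size

const response = await fetch(siteUrl + query, {
    headers: { Accept: "application/json;odata=verbose" },
});
const items = (await response.json()).d.results; // odata=verbose wraps results in d.results
```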

 

Consider where the query is going to run

This is another thing that you need to consider when creating apps - where will your query actually run? Let me give you a few examples to outline why this is important.

Scenario 1: SharePoint hosted app

This is a fairly straightforward scenario - you have a SharePoint hosted app that uses JavaScript to query a SharePoint list or object. In this case the query runs on the client's computer and it goes through a round trip similar to the request made to load the page itself. You don't really have any options to control how this works, but it is worth understanding that the loading time for the query will be subject to the same network conditions as the page load.

Scenario 2: Provider hosted app with on-prem SharePoint and on-prem IIS server

Now let's look at the provider hosted app approach in an on-premises model. Here you have your SharePoint servers, one or more IIS servers and the client computer. In this case you have options - you can use the managed .NET API or make REST calls from your IIS server to load data and send the result to the browser, or you can still use JavaScript to call for the data from SharePoint directly. The decision about where to put the code can have an impact depending on what the network structure is like, but also on the type of querying you are doing. If you only need to make a single query there might not be a huge difference between the approaches, but if there are a lot of separate requests to make, it may be better to run them from the IIS box, do whatever transformation or processing you need there, and send the finished result to the client once that's done.
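
As a rough illustration of that last point, here is a hypothetical sketch of the remote web server doing the fan-out instead of the browser. It is written as a small Node/TypeScript handler purely for illustration (a provider hosted app will often be ASP.NET instead), and getAppOnlyToken, the site URL and the list names are placeholders for whatever authentication and data your app already has.

```typescript
// A hypothetical sketch only: the server makes the several SharePoint REST
// calls, combines the results, and the browser receives one shaped payload.
import express from "express";

const siteUrl = "https://sharepoint.contoso.local/sites/app"; // assumed site

// Placeholder for your app's real OAuth/token plumbing.
async function getAppOnlyToken(): Promise<string> {
    return "APP-ONLY-ACCESS-TOKEN";
}

const app = express();

app.get("/api/dashboard", async (_req, res) => {
    const token = await getAppOnlyToken();
    const headers = {
        Accept: "application/json;odata=verbose",
        Authorization: `Bearer ${token}`,
    };

    // The separate requests run server to server, close to SharePoint...
    const [tasks, docs] = await Promise.all([
        fetch(`${siteUrl}/_api/web/lists/getbytitle('Tasks')/items?$select=Title`, { headers }),
        fetch(`${siteUrl}/_api/web/lists/getbytitle('Documents')/items?$select=Title`, { headers }),
    ]);

    // ...and the client gets a single combined response.
    res.json({
        tasks: (await tasks.json()).d.results,
        documents: (await docs.json()).d.results,
    });
});

app.listen(3000);
```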

Scenario 3: Provider hosted app with Office 365 and an Azure hosted web site

The third scenario is the same as above, but swaps the on-premises world for the cloud. Here you might find that the link between the Office 365 data center and the Azure data center is quicker than the link your user has between their browser and SharePoint and/or your app. This again might lead you to want to run the querying code in the Azure web site component rather than on the client side, as this could result in better performance overall.

 

So you can see from the above three scenarios that having a good understanding of where your queries will run can make a big difference to the potential performance of your app.

 

What data will you cache?

When working with data that is remote, an obvious way to improve performance is to cache it. This isn't a new concept and there are a lot of ways to cache different bits of data. If you are using jQuery to make requests to web services it can cache the results for you, or you can look at using the HTML5 storage APIs to store data locally if you are working with JavaScript. If you are writing a provider hosted app with code running remotely, your options for caching are almost endless, so it's really up to you to decide what will work and give you the best performance for the data that can be cached. This will likely tie in to your business requirements and how you want your app to run, but if you don't need to bring every bit of data down on every request, caching could be a great win for you.
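
As one concrete example of the HTML5 storage option, here is a minimal sketch that caches a REST response in localStorage with a simple age check. The cache key, list name and five minute lifetime are all placeholder choices, and it assumes the data is safe to serve slightly stale.

```typescript
// A minimal sketch of caching a REST response in HTML5 localStorage.
const CACHE_KEY = "orders-cache";           // hypothetical cache key
const MAX_AGE_MS = 5 * 60 * 1000;           // five minutes, illustrative only

async function getOrders(siteUrl: string): Promise<unknown[]> {
    const cached = localStorage.getItem(CACHE_KEY);
    if (cached) {
        const { savedAt, items } = JSON.parse(cached);
        if (Date.now() - savedAt < MAX_AGE_MS) {
            return items;                   // serve from cache, no network call
        }
    }

    const response = await fetch(
        siteUrl + "/_api/web/lists/getbytitle('Orders')/items?$select=Id,Title",
        { headers: { Accept: "application/json;odata=verbose" } }
    );
    const items = (await response.json()).d.results;

    // Refresh the cache for subsequent requests.
    localStorage.setItem(CACHE_KEY, JSON.stringify({ savedAt: Date.now(), items }));
    return items;
}
```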

 

Test, test, test!

The concepts I talk about here are just things to consider when you plan your client side based solutions, but the reality is that there is no single magic bullet that will improve performance in every situation, just the same as there is no magic formula for getting the best performance out of SharePoint overall. So what do you need to do to ensure you are getting the results you want? Test! Load testing is something you should be doing to ensure that you are getting appropriate performance out of your environment when it is under peak load. A good load testing strategy can help identify the potential bottlenecks in your application, so you know where to look at making optimisations if and when problems occur.

 

 

So there is my collection of points to consider when you want to optimise the traffic between your users and your SharePoint apps. As I mentioned through the post, your mileage may vary, but make sure you spend the time to think through your architecture and do the appropriate testing to make sure you are making the right decisions about getting the most from your apps!

Comments

  • Anonymous
    March 11, 2014
    Nice post Brian. The "where the query will run" aspect is definitely something I've been thinking of more and more with the various flavours of apps/client APIs etc. Good stuff. Chris. P.S. Good to meet you last week!
  • Anonymous
    October 21, 2015
    Is there caching server side for anonymous websites that can be configured for a "SharePoint App" using client side calls? The first query per user takes a long time. If we could have the query take long only for the first user, and then all other users receive the same cache from the server (instead of having to do all of the SQL join commands that make the query slow for lookup lists), it would be a much better experience for users. We are using a Content Deployment publishing scenario, so the cache can be on the server for an hour, rather than in each user's browser at the time the request is made.