SQL Performance Tuning – Part 2: Accessing Data
One approach to speeding up application response time is to take advantage of caching. In computing, a cache is a hardware or software component that stores data so that future requests can be served more quickly. Why don't we use the faster component as permanent storage instead of only as temporary storage? Sometimes it is the cost difference, and sometimes the technology is simply not mature enough yet. The tradeoff of using a cache is that we have to keep the multiple copies of the same data in sync.
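To make that tradeoff concrete, here is a minimal sketch (in Python, not from the original post) of a time-based cache. `load_products` is a hypothetical stand-in for a real database query, and the TTL is exactly the window during which users may see stale data:

```python
import time

class TtlCache:
    """Tiny in-process cache: keeps one copy of the data plus a timestamp."""

    def __init__(self, ttl_seconds, loader):
        self._ttl = ttl_seconds
        self._loader = loader        # function that fetches fresh data
        self._value = None
        self._loaded_at = 0.0

    def get(self):
        # Serve the cached copy while it is still fresh enough;
        # otherwise reload and remember when we did so.
        if self._value is None or time.time() - self._loaded_at > self._ttl:
            self._value = self._loader()
            self._loaded_at = time.time()
        return self._value

# Hypothetical example: cache a query result for 60 seconds.
def load_products():
    return [("sku-1", 9.99), ("sku-2", 19.99)]   # stands in for a real database call

products_cache = TtlCache(ttl_seconds=60, loader=load_products)
print(products_cache.get())   # first call hits the "database"
print(products_cache.get())   # second call is served from the cache
```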
Another approach to speeding up application response time is to reduce contention on shared resources. I learned this in my Operations Management class: the best way to reduce waiting time is resource pooling.
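Applied to database access, the pooling idea becomes the connection pool: a fixed set of connections is opened once and shared, so a request waits briefly for a free connection instead of paying the full cost of opening a new one. The sketch below is a bare-bones illustration using Python's standard library, with `sqlite3` only as a stand-in driver; in practice you would rely on the pooling built into your driver or data-access layer:

```python
import queue
import sqlite3   # stand-in for any real database driver

class ConnectionPool:
    """Minimal pool: open N connections up front, hand them out and take them back."""

    def __init__(self, size, connect):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(connect())

    def acquire(self):
        # Blocks until a connection is free instead of opening a new one.
        return self._pool.get()

    def release(self, conn):
        self._pool.put(conn)

# Usage: five shared connections instead of one new connection per request.
pool = ConnectionPool(size=5, connect=lambda: sqlite3.connect(":memory:"))
conn = pool.acquire()
try:
    conn.execute("SELECT 1")
finally:
    pool.release(conn)
```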
Combining these two simple principles, I developed a quick check flow that DBAs and developers can use to decide how a typical three-tier web application should access its data:
| # | Question | Answer | Recommendation |
|---|----------|--------|----------------|
| 1 | Does the web application only read data? | Yes | Go to Question 2. |
|   | | No | Go to Question 5. |
| 2 | Does the end user need real-time data? | Yes | Go to Question 3. |
|   | | No | Have the back-end application publish the data as static HTML files; the web application never needs to query the database in real time. |
| 3 | How frequently does the data change? | Daily | Have the back-end application publish the data as static HTML files once a day; no real-time database access from the web application (see the publishing sketch after this table). |
|   | | Hourly | Cache the data in the application cache and refresh the cache every hour. |
|   | | Minutely | Go to Question 4. |
| 4 | How many concurrent users must the web application support? | Few* | One web application connects to one database. |
|   | | Many* | Consider creating multiple read-only data sources with replication or a readable AlwaysOn Availability Group (see the connection sketch after this table). |
| 5 | How many concurrent users must the web application support? | Few* | One web application connects to one database. |
|   | | Many* | Two design methods should be added to the web application design to reduce table blocking and increase concurrency. The SQL Server 2014/2016 In-Memory OLTP engine can also raise the number of concurrent users supported. |
* Regarding the threshold between "few" and "many": it depends on the hardware and the programming language; measure your application with a stress-testing tool to find the capacity of each node.
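For the "publish as static HTML" recommendation in Questions 2 and 3, a minimal sketch of the back-end job might look like the following (Python with `sqlite3` as a stand-in driver; the table, column, and file names are made up for the example). A scheduler such as cron or a SQL Agent job would run it daily or hourly, and the web tier would then serve `report.html` without ever touching the database:

```python
import sqlite3            # stand-in for the real SQL Server driver
from datetime import datetime, timezone

def publish_report(conn, path="report.html"):
    """Back-end job: query once, render a static file the web tier can serve as-is."""
    rows = conn.execute("SELECT name, total FROM daily_sales").fetchall()
    cells = [f"<tr><td>{name}</td><td>{total}</td></tr>" for name, total in rows]
    html = (
        "<html><body><h1>Daily Sales</h1><table>"
        + "".join(cells)
        + f"</table><p>Generated {datetime.now(timezone.utc):%Y-%m-%d %H:%M} UTC</p>"
        + "</body></html>"
    )
    with open(path, "w") as f:
        f.write(html)

# Demo setup with an in-memory database so the sketch runs end to end.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_sales (name TEXT, total REAL)")
conn.executemany("INSERT INTO daily_sales VALUES (?, ?)",
                 [("east", 120.0), ("west", 95.5)])
publish_report(conn)
```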
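For the "readable AlwaysOn AG" recommendation in Question 4, read-intent connections can be routed to a readable secondary by adding `ApplicationIntent=ReadOnly` to the connection string. The sketch below uses `pyodbc`; the listener name, database name, and driver version are placeholders, and read-only routing must already be configured on the availability group for the redirection to take effect:

```python
import pyodbc   # requires the Microsoft ODBC driver; names below are placeholders

# Writes go to the primary replica through the AG listener as usual.
write_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=ag-listener.example.com;DATABASE=SalesDB;Trusted_Connection=yes;"
)

# Reads declare their intent, so the listener's read-only routing list
# can send them to a readable secondary instead of the primary.
read_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=ag-listener.example.com;DATABASE=SalesDB;Trusted_Connection=yes;"
    "ApplicationIntent=ReadOnly;"
)
```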
--Posted by Shiyang Qiu, July 11, 2016