Improved application speed
Reduced resource burden of time-consuming database queries
Reduced response latency
Data that requires a slow and expensive query to acquire
It’s always slower and more expensive to acquire data from a database than from a cache. Some database queries are inherently slower and more expensive than others.
For example, queries that perform joins on multiple tables are significantly slower and more expensive than simple, single-table queries. If acquiring the interesting data requires a slow and expensive query, it’s a candidate for caching.
If acquiring the data requires a relatively quick and simple query, it might still be a candidate for caching, depending on other factors.
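The payoff of caching a slow query can be sketched with a minimal in-process cache keyed by query text. This is an illustrative sketch, not a real database client: the hypothetical `run_expensive_query` function simulates a slow multi-table join with a sleep.

```python
import time

# Hypothetical in-process cache keyed by query text (a sketch only).
_cache = {}

def run_expensive_query(sql):
    """Stand-in for a slow multi-table join against a real database."""
    time.sleep(0.05)  # simulates query latency
    return f"rows for: {sql}"

def cached_query(sql):
    # Serve from the cache when possible; fall back to the database on a miss.
    if sql not in _cache:
        _cache[sql] = run_expensive_query(sql)
    return _cache[sql]

first = cached_query("SELECT ... JOIN ...")   # miss: pays the query cost
second = cached_query("SELECT ... JOIN ...")  # hit: returns immediately
```

Every repeated request after the first avoids the expensive query entirely, which is why slow, frequently repeated queries are the strongest caching candidates.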
Relatively static and frequently accessed data
Determining what to cache also involves understanding the data itself and its access patterns.
For example, it doesn’t make sense to cache data that is rapidly changing or is seldom accessed. For caching to provide a meaningful benefit, the data should be relatively static and frequently accessed, such as a personal profile on a social media site.
Conversely, you don’t want to cache data if caching it provides no speed or cost advantage.
For example, it doesn’t make sense to cache webpages that return the results of a search because such queries and results are almost always unique.
Information that can be stale for an extended period of time
By definition, cached data is stale data—even if in certain circumstances it isn’t stale, it should always be considered and treated as stale. In determining whether your data is a candidate for caching, you need to determine your application’s tolerance for stale data. Your application might be able to tolerate stale data in one context but not in another. For example, when serving a publicly traded stock price on a website, staleness might be acceptable, with a disclaimer that prices might be up to n minutes delayed. But when serving the price of the same stock to a broker making a sale or purchase, you need real-time data.
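A common way to express an application’s tolerance for staleness is a time-to-live (TTL) on each cache entry. The following is a minimal sketch, not a production cache; the TTL value encodes how stale the data is allowed to be before the application must refetch it.

```python
import time

class TTLCache:
    """Sketch of a cache whose entries expire after a fixed TTL."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry deadline)

    def get(self, key):
        entry = self.store.get(key)
        if entry and entry[1] > time.monotonic():
            return entry[0]
        return None  # missing or stale; the caller must refetch

    def put(self, key, value):
        self.store[key] = (value, time.monotonic() + self.ttl)

# Hypothetical stock-quote cache: the website tolerates brief staleness.
quotes = TTLCache(ttl_seconds=0.1)
quotes.put("ACME", 42.5)
fresh = quotes.get("ACME")   # 42.5 while within the TTL
time.sleep(0.3)
expired = quotes.get("ACME")  # None once expired; refetch the live price
```

A short TTL suits the broker’s near-real-time view; a long TTL suits the delayed public website. The data is the same, only the tolerance differs.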
With client-side caching, the browser stores data locally instead of making repeated queries to a web server. HTTP cache headers tell the browser how long it can fulfill future requests for the stored web content from the cache.
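The headers a server might attach to a response to enable client-side caching can be sketched as follows; the directive values here are illustrative, not recommendations.

```python
# Hypothetical HTTP response headers for client-side caching.
# max-age tells the browser how many seconds it may reuse the stored copy;
# the ETag lets it revalidate cheaply once that window expires.
headers = {
    "Cache-Control": "public, max-age=3600",  # reuse for up to one hour
    "ETag": '"v1-abc123"',                    # opaque version identifier
}
```

While `max-age` has not elapsed, the browser serves the content from its cache without contacting the server at all; afterward it can send the ETag in an `If-None-Match` request and skip the download if the content is unchanged.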
With server-side caching, various web caching techniques can be used to improve the performance of a website by placing a cache between the data’s source and the client.
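A common server-side pattern is a read-through cache sitting between clients and the data source. In this sketch a plain dict stands in for a shared cache such as Redis or Memcached, and `load_from_source` is a hypothetical stand-in for the backing database.

```python
cache = {}     # stand-in for a shared server-side cache
db_calls = 0   # counts trips to the backing data source

def load_from_source(key):
    """Stand-in for fetching from the database or origin server."""
    global db_calls
    db_calls += 1
    return f"value-for-{key}"

def read_through(key):
    # On a miss, fetch from the source and populate the cache;
    # subsequent reads are served without touching the source.
    if key not in cache:
        cache[key] = load_from_source(key)
    return cache[key]

read_through("page:/home")  # miss: fetched from the source
read_through("page:/home")  # hit: db_calls stays at 1
```

The client never needs to know the cache exists; it simply sees faster responses, while the data source sees far fewer queries.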