Chapter 24: HTTP Caching
Caching is a long-established technique for improving the performance of web applications. Because HTTP was designed as a protocol for distributed information systems, it defines semantics for caching server responses, which can significantly reduce both latency and server load. An HTTP cache is a storage layer that keeps copies of previously retrieved responses. The role of the cache can be played by the client (typically a web browser), by intermediaries such as proxy servers and CDNs, or by the servers themselves.
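As a minimal sketch of how a server opts a response into caching, the following Go program sets the Cache-Control header so that any cache along the way (browser, proxy, or CDN) may store and reuse the response. The /report path, the port, and the one-hour lifetime are illustrative choices, not values prescribed by the protocol:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/report", func(w http.ResponseWriter, r *http.Request) {
		// Allow any cache to store this response and reuse it
		// for up to one hour without contacting the server again.
		w.Header().Set("Cache-Control", "public, max-age=3600")
		fmt.Fprintln(w, "monthly report data")
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Once a cache has stored such a response, subsequent requests for the same URL can be answered from the copy until max-age expires, without reaching the origin server.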
Caches are divided into private and shared (public) caches. A private cache is dedicated to a single user and may store responses that are specific to that user's session. A shared (public) cache, on the other hand, stores responses that are not tied to a particular user and can therefore serve many clients. Shared caches typically sit between the client and the origin server; common examples include reverse proxies and CDNs (Content Delivery Networks).
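The distinction is expressed through the Cache-Control directives private and public. The sketch below, again using Go's net/http, contrasts a user-specific response that only a private cache (the browser) may store with a response that shared caches are free to keep; the /account and /logo.png paths and the max-age values are hypothetical:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Response is specific to the signed-in user: only a private cache
	// (the user's browser) may store it, never a shared proxy or CDN.
	http.HandleFunc("/account", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Cache-Control", "private, max-age=300")
		fmt.Fprintln(w, "account details for the current user")
	})

	// Response is identical for every user, so shared caches
	// (reverse proxies, CDNs) may store and reuse it as well.
	http.HandleFunc("/logo.png", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Cache-Control", "public, max-age=86400")
		fmt.Fprintln(w, "placeholder for the image bytes")
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```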