Caching is a method of temporarily keeping data in RAM. It is useful for storing intermediate results when you are performing heavy computations, so they do not have to be recomputed every time they are needed.
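As a concrete illustration of caching a heavy computation in memory, here is a minimal Python sketch using in-process memoization; the function and its workload are purely illustrative and not tied to any particular framework.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def expensive_computation(n: int) -> int:
    # Stand-in for a heavy computation: naive recursive Fibonacci.
    if n < 2:
        return n
    return expensive_computation(n - 1) + expensive_computation(n - 2)

print(expensive_computation(35))  # first call computes and caches results in RAM
print(expensive_computation(35))  # repeat call is answered straight from the cache
```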
Types of caching
Caching Strategies
Advantages
- Makes slow queries much faster, which reduces latency
- Reduces load on the database by serving common queries from the cache instead of the database (see the cache-aside sketch after this list)
- Helps reduce network cost, since fewer round trips are needed to reach the data
- Frequently accessed data can be kept closer to the user or in faster memory. This is common in, for example, a social media application where certain profiles or posts are accessed far more often than others.
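The first two advantages come from the cache-aside (lazy loading) pattern: check the cache first and only hit the database on a miss. The Python sketch below is a minimal illustration; the plain dict, the fetch_user_from_db function, and the simulated latency are stand-ins, not a reference to any specific library.

```python
import time

cache = {}  # a plain dict standing in for a cache such as Redis or Memcached

def fetch_user_from_db(user_id: str) -> dict:
    time.sleep(0.5)  # simulate database / network latency
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id: str) -> dict:
    # Cache-aside: try the cache first, fall back to the database on a miss,
    # then populate the cache so the next request is served from memory.
    if user_id in cache:
        return cache[user_id]
    user = fetch_user_from_db(user_id)
    cache[user_id] = user
    return user

get_user("42")  # miss: pays the slow database call
get_user("42")  # hit: served from the cache, no database call
```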
Disadvantages
- When adding a cache, you need a cache invalidation strategy in place; if it is not implemented correctly, you may get unexpected or stale results. A good practice is to set a Time To Live (TTL) on cached entries so the cache cleans itself up automatically and does not keep serving stale data.
- The cache lives in RAM, which is usually far smaller than the database, so you need an eviction policy (for example LRU) to remove old entries and make room for new ones (both TTL expiry and LRU eviction are sketched after this list)
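Both points are usually handled by combining a TTL with an eviction policy. The class below is a toy Python sketch of those two mechanisms; the class name, default sizes, and TTL are arbitrary, and in production a store like Redis or Memcached would normally handle this for you.

```python
import time
from collections import OrderedDict

class SmallCache:
    """Toy in-memory cache combining TTL expiry with LRU eviction."""

    def __init__(self, max_entries: int = 128, ttl_seconds: float = 60.0):
        self._data = OrderedDict()   # key -> (value, expiry timestamp)
        self._max_entries = max_entries
        self._ttl = ttl_seconds

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.time() >= expires_at:    # TTL has passed: drop the stale entry
            del self._data[key]
            return None
        self._data.move_to_end(key)      # mark as most recently used
        return value

    def set(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = (value, time.time() + self._ttl)
        if len(self._data) > self._max_entries:
            self._data.popitem(last=False)  # evict the least recently used entry
```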
AWS Cache Architecture
Below is a diagram of a typical caching architecture on AWS using AWS ElastiCache.

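To complement the diagram, here is a hedged sketch of how an application tier might talk to an ElastiCache for Redis endpoint using the redis-py client. The endpoint name, key scheme, TTL value, and the query_database helper are illustrative placeholders only.

```python
import json
import redis  # assumes the redis-py client is installed (pip install redis)

# Placeholder endpoint: replace with your cluster's primary endpoint, which the
# application servers reach from inside the same VPC.
CACHE_ENDPOINT = "my-cache.example.use1.cache.amazonaws.com"
r = redis.Redis(host=CACHE_ENDPOINT, port=6379, decode_responses=True)

def query_database(post_id: str) -> dict:
    # Hypothetical stand-in for a real database query.
    return {"id": post_id, "body": "hello"}

def get_post(post_id: str) -> dict:
    # Cache-aside against ElastiCache: check Redis first, fall back to the
    # database on a miss, then write the result back with a 5-minute TTL.
    cached = r.get(f"post:{post_id}")
    if cached is not None:
        return json.loads(cached)
    post = query_database(post_id)
    r.setex(f"post:{post_id}", 300, json.dumps(post))
    return post
```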