Scaling Services with an In-Memory Distributed Cache
May 2018 • Presentation
Software Engineering Institute
This talk describes the problems we faced in scaling a high-throughput API at GO-JEK and how using the constructs of Golang to build a distributed in-memory cache eventually solved the problem. Using code as an example, we will discuss the choice of data structure, its time complexity based on the Golang spec, and concerns around thread safety that we encountered.
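As a taste of the data-structure and thread-safety discussion, here is a minimal sketch of a concurrent in-memory cache in Go. The names and structure are illustrative, not the implementation presented in the talk: a plain Go map gives average O(1) lookups per the spec, but maps are not safe for concurrent writes, so access is guarded with a `sync.RWMutex`.

```go
package main

import (
	"fmt"
	"sync"
)

// Cache is an illustrative thread-safe in-memory cache.
// A sync.RWMutex lets many readers proceed concurrently while
// giving writers exclusive access to the underlying map.
type Cache struct {
	mu    sync.RWMutex
	items map[string]string
}

func NewCache() *Cache {
	return &Cache{items: make(map[string]string)}
}

// Get takes a read lock, so concurrent reads do not block each other.
func (c *Cache) Get(key string) (string, bool) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	v, ok := c.items[key]
	return v, ok
}

// Set takes the write lock for exclusive access.
func (c *Cache) Set(key, value string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.items[key] = value
}

func main() {
	c := NewCache()
	c.Set("driver:42", "available")
	if v, ok := c.Get("driver:42"); ok {
		fmt.Println(v) // prints "available"
	}
}
```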
We will present a few “war stories” the team encountered while implementing it and how we solved them. By learning why an eviction policy matters and how to implement one, the audience will understand the significance of cache invalidation and be able to build an eviction policy themselves using Golang's concurrency constructs.
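One common eviction policy is least-recently-used (LRU). The sketch below is illustrative rather than the talk's implementation: a doubly linked list (`container/list`) keeps entries in recency order, a map provides O(1) lookup into the list, and a mutex makes the pair safe to share across goroutines.

```go
package main

import (
	"container/list"
	"fmt"
	"sync"
)

type entry struct {
	key, value string
}

// lruCache is an illustrative bounded cache with LRU eviction.
// The list's front holds the most recently used entry; the back
// holds the eviction candidate.
type lruCache struct {
	mu       sync.Mutex
	capacity int
	order    *list.List
	index    map[string]*list.Element
}

func newLRU(capacity int) *lruCache {
	return &lruCache{
		capacity: capacity,
		order:    list.New(),
		index:    make(map[string]*list.Element),
	}
}

func (c *lruCache) Get(key string) (string, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	el, ok := c.index[key]
	if !ok {
		return "", false
	}
	c.order.MoveToFront(el) // mark as recently used
	return el.Value.(*entry).value, true
}

func (c *lruCache) Set(key, value string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	if el, ok := c.index[key]; ok {
		el.Value.(*entry).value = value
		c.order.MoveToFront(el)
		return
	}
	c.index[key] = c.order.PushFront(&entry{key, value})
	if c.order.Len() > c.capacity {
		// Evict the least recently used entry.
		oldest := c.order.Back()
		c.order.Remove(oldest)
		delete(c.index, oldest.Value.(*entry).key)
	}
}

func main() {
	c := newLRU(2)
	c.Set("a", "1")
	c.Set("b", "2")
	c.Get("a")      // "a" becomes most recently used
	c.Set("c", "3") // capacity exceeded: evicts "b"
	_, ok := c.Get("b")
	fmt.Println(ok) // prints "false"
}
```

In production, eviction is often driven by time-to-live as well as capacity; a background goroutine on a `time.Ticker` is a common way to sweep expired entries.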
To keep up with ever-increasing data volumes, we will also discuss how we implemented sharding to minimize the memory footprint, which enabled us to scale up approximately 10-fold in a short time. By applying the practices described in this talk, the audience will be able to make an informed decision about whether a distributed in-memory caching approach makes sense for their problem and, if so, implement one themselves.
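The sharding idea can be sketched as follows, again as an assumption-laden illustration rather than the talk's code: the keyspace is split across N independent shards, each with its own map and lock, and a hash of the key selects the shard. This spreads lock contention and keeps each shard's map small.

```go
package main

import (
	"fmt"
	"hash/fnv"
	"sync"
)

// shard is one independent partition of the cache, with its own lock.
type shard struct {
	mu    sync.RWMutex
	items map[string]string
}

// shardedCache is an illustrative sharded cache: hashing the key
// picks a shard, so goroutines working on different keys rarely
// contend for the same lock.
type shardedCache struct {
	shards []*shard
}

func newShardedCache(n int) *shardedCache {
	s := make([]*shard, n)
	for i := range s {
		s[i] = &shard{items: make(map[string]string)}
	}
	return &shardedCache{shards: s}
}

// shardFor hashes the key with FNV-1a and maps it to a shard index.
func (c *shardedCache) shardFor(key string) *shard {
	h := fnv.New32a()
	h.Write([]byte(key))
	return c.shards[h.Sum32()%uint32(len(c.shards))]
}

func (c *shardedCache) Set(key, value string) {
	s := c.shardFor(key)
	s.mu.Lock()
	defer s.mu.Unlock()
	s.items[key] = value
}

func (c *shardedCache) Get(key string) (string, bool) {
	s := c.shardFor(key)
	s.mu.RLock()
	defer s.mu.RUnlock()
	v, ok := s.items[key]
	return v, ok
}

func main() {
	c := newShardedCache(8)
	c.Set("order:99", "paid")
	v, _ := c.Get("order:99")
	fmt.Println(v) // prints "paid"
}
```

The same hashing scheme generalizes from in-process shards to multiple cache instances, which is how a sharded design supports a distributed deployment.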