LRU Cache in Java

Caches are often used as a cheap way to reduce latency and improve application performance. We frequently reach for libraries like Guava because they are rock solid and packed with functionality, but if every library pulls in Guava, you can easily end up in jar hell and a distributed monolith if you are not careful about your shared libraries. If you are building a single service, it might be completely fine to use more libraries; as you build shared libraries, though, heavy dependencies can hurt you badly. It pays to be precise and lean about your dependencies.

So today I want to show that algorithms do not bite (much) and that it's not rocket science to create a simple LRU Cache in Java using ZERO libraries. I recorded a video going through the code and explaining what I did and how the algorithm works. You might be thinking "I prefer to use what's already there", and sure, that's fine. At the same time, bloated applications and shared libraries are a real nightmare at scale, so sometimes it is better to copy a bit of code or just write a simple implementation yourself. Don't be afraid of algorithms; if you write enough tests you should be fine. Let's get started!

The Video

The Code

https://github.com/diegopacheco/java-pocs/tree/master/pocs/simple-LRU-cache
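If you just want the gist of the algorithm, here is a minimal sketch of an LRU cache built on a HashMap plus a doubly linked list, so that get, put, and eviction are all O(1). This is an illustrative sketch, not the exact code from the repo above; names and details are my own.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of an LRU cache (not the repo's implementation):
// a HashMap gives O(1) lookups; a doubly linked list keeps entries ordered
// from most recently used (head) to least recently used (tail).
public class LRUCache<K, V> {

    private static final class Node<K, V> {
        K key;
        V value;
        Node<K, V> prev;
        Node<K, V> next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> map = new HashMap<>();
    private Node<K, V> head; // most recently used
    private Node<K, V> tail; // least recently used

    public LRUCache(int capacity) {
        if (capacity <= 0) throw new IllegalArgumentException("capacity must be > 0");
        this.capacity = capacity;
    }

    public V get(K key) {
        Node<K, V> node = map.get(key);
        if (node == null) return null;
        moveToHead(node); // a hit makes this entry the most recently used
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = map.get(key);
        if (node != null) {           // key exists: update value and refresh recency
            node.value = value;
            moveToHead(node);
            return;
        }
        if (map.size() == capacity) { // full: evict the least recently used entry
            map.remove(tail.key);
            removeNode(tail);
        }
        node = new Node<>(key, value);
        map.put(key, node);
        addToHead(node);
    }

    private void addToHead(Node<K, V> node) {
        node.prev = null;
        node.next = head;
        if (head != null) head.prev = node;
        head = node;
        if (tail == null) tail = node;
    }

    private void removeNode(Node<K, V> node) {
        if (node.prev != null) node.prev.next = node.next; else head = node.next;
        if (node.next != null) node.next.prev = node.prev; else tail = node.prev;
    }

    private void moveToHead(Node<K, V> node) {
        removeNode(node);
        addToHead(node);
    }
}
```

Using it looks like this:

```java
LRUCache<String, String> cache = new LRUCache<>(2);
cache.put("a", "1");
cache.put("b", "2");
cache.get("a");      // "a" becomes the most recently used
cache.put("c", "3"); // evicts "b", the least recently used
```

If you want an even shorter path, java.util.LinkedHashMap with accessOrder=true and an overridden removeEldestEntry gives you the same behavior in a handful of lines, still with zero external libraries.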

Cheers,

Diego Pacheco
