Is Redis slowing down your application?

Surprised? It’s counter-intuitive that Redis, a cache usually introduced to improve performance, can actually slow down an application. I used to think that a cache should always be fast, until I found that my APIs using Redis had several hundred milliseconds of latency. Today, I would like to show you how I found and fixed that performance bottleneck.


Getting started with Kubernetes

So far I’ve been deploying all my applications directly on VMs, using provisioning tools (Ansible) or, for simpler applications, deployment scripts (like Python’s Fabric). For applications running on AWS, I leveraged AMI images for EC2 instances to speed up new instance creation within Auto Scaling groups. Although that way of deploying apps proved very reliable, running in production for years with no issues, I was looking for a way to make deployments even more unified across all projects in the organization. Docker turned out to be a good candidate: it defines a single deployment unit, the container, which is built and launched the same way regardless of the app inside. Whether it’s a Python, Node.js, or Java application, the process of running the container doesn’t change.
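To make that uniformity concrete, here is a minimal sketch of what such a container definition might look like (the image name, port, and file names are hypothetical, not taken from a real project):

```dockerfile
# Hypothetical example: a minimal image for a Python web service.
# The build and run commands are identical for any containerized app:
#   docker build -t my-api .
#   docker run -d -p 8000:8000 my-api
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]
```

Only the contents of the Dockerfile differ between a Python, Node.js, or Java project; the `docker build` and `docker run` workflow stays the same.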