7 Design Aspects for DevOps Engineering

Some time ago I posted about lessons learned doing DevOps engineering, and at the beginning of the year I shared some ideas and experiences (good and bad ones) around design and DevOps engineering. I do lots of different things on a daily basis: architecture, software engineering, DevOps engineering, chaos / stress testing, reliability engineering, consultancy, and management / coaching. I'm lucky to run a team of great architects and also to code every single day. This gives me great insight, since I'm both executing things myself and reviewing things other people do. There are lots of things I think about every day, but more and more I think about design and DevOps engineering: the right designs, and how things get bad and complicated through the lack of good design. There are several elements of good design. I already covered some aspects in other posts, and today I will cover others such as assumptions, the right language for the job, multi- vs. single-language stacks, troubleshooting, …

Reactive Programming with Akka Streams

Akka Streams is one of the many interesting and very useful Akka modules. Akka is a powerful actor / reactive framework for the JVM. It is an extremely high-performance library: up to 50 million messages/sec on a single machine, with a small memory footprint of roughly 2.5 million actors per GB of heap. Akka is also resilient by design and follows the principles of the Reactive Manifesto, building on the philosophy and ideas of Erlang's actor model. Many big and successful companies use Akka in production, like Blizzard, Intel, Walmart, PayPal, Amazon, Zalando, Netflix, IGN, VMware, UBS, and many others.
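To give a feel for the module, here is a minimal Akka Streams sketch (assuming Akka 2.6.x, where the `ActorSystem` itself materializes streams; names like `StreamDemo` are just illustrative):

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}

import scala.concurrent.Await
import scala.concurrent.duration._

object StreamDemo {
  // Build a pipeline: Source -> transformation -> Sink.
  // Double each number from 1 to 10, then fold them into a sum.
  def doubledSum(): Int = {
    implicit val system: ActorSystem = ActorSystem("stream-demo")
    val result = Await.result(
      Source(1 to 10).map(_ * 2).runWith(Sink.fold(0)(_ + _)),
      3.seconds)
    system.terminate()
    result
  }

  def main(args: Array[String]): Unit =
    println(doubledSum()) // prints 110
}
```

The interesting part is that the pipeline is just a description: nothing runs until `runWith` materializes it, which is what lets Akka apply back-pressure end to end.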

Deploy & Setup a Cassandra 3.x Cluster on EC2

Cassandra is a rock-solid AP (available and partition-tolerant, in CAP terms) NoSQL database. In this blog post, I will share a simple recipe to deploy a Cassandra cluster on AWS/EC2.

For production workloads, you should consider the DataStax AMI or DSE. The recipe I'm sharing is for Amazon Linux (CentOS-based), but you can adapt it for Ubuntu or even run it in Docker if you want to.

Keep in mind this is for development / experimentation purposes. I'm not covering proper tuning for your workload, compaction strategy, or keyspace design here, and you should also be doing this with multiple ASGs, ideally one per AZ. So let's get started.

Getting Started with Dyno Queues

Dyno-queues is an interesting queue solution on top of Dynomite. It uses the Dynomite Java driver, a.k.a. Dyno. Dyno-queues gets all the benefits of Dynomite, like strong consistency, high availability, stability, high throughput, and low latency, and extends them with queue semantics.

I highly recommend you read Netflix's post about dyno-queues.

Running Multi-Nodes Akka Cluster on Docker

Akka is a great actor framework for the JVM. It is high performance, with high throughput and low latency, and resilient by design. It's possible to use Akka with Java, however I've always used it with Scala and recommend you do the same.

Creating actor systems is pretty easy, however creating the multiple nodes of an Akka cluster locally can be boring and error-prone. So I want to show how easy it is to create an Akka cluster using Scala and Docker (Engine & Docker Compose). We will create a very simple actor system that just logs events on the cluster, but you can use it as a base for more complex code.
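The event-logging actor can be sketched like this, using the classic Akka Cluster API (the class name is illustrative; cluster formation itself, i.e. seed-node addresses, comes from configuration that Docker Compose would supply per node):

```scala
import akka.actor.{Actor, ActorLogging, Props}
import akka.cluster.Cluster
import akka.cluster.ClusterEvent._

// Logs membership changes of the cluster this node has joined.
class ClusterListener extends Actor with ActorLogging {
  private val cluster = Cluster(context.system)

  // Subscribe to membership events. InitialStateAsEvents replays the
  // current cluster state as events, so members that joined before us
  // are not missed.
  override def preStart(): Unit =
    cluster.subscribe(self, initialStateMode = InitialStateAsEvents,
      classOf[MemberEvent], classOf[UnreachableMember])

  override def postStop(): Unit = cluster.unsubscribe(self)

  def receive: Receive = {
    case MemberUp(member)          => log.info("Member up: {}", member.address)
    case UnreachableMember(member) => log.info("Member unreachable: {}", member.address)
    case MemberRemoved(member, _)  => log.info("Member removed: {}", member.address)
    case _: MemberEvent            => // ignore other membership transitions
  }
}

object ClusterListener {
  val props: Props = Props[ClusterListener]
}
```

Running the same image under Docker Compose with different node names then gives you a multi-node cluster where each container logs the others joining and leaving.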

The power of Scala Type Classes

The Scala language was heavily inspired by Haskell. Type classes are another example of this, among its many Haskell inspirations.

There are always several ways and styles of thinking to address problems. Type classes are another way of thinking about and dealing with problems. For some problems, you might just use pattern matching -- but what if you have to add a type? Right, I could stick with good old OO since Scala is a hybrid, right? -- But what if you need to add an operation? Type classes can help you out.
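To make that concrete, here is a minimal sketch of the pattern (the `Show` name is illustrative, loosely echoing the Haskell class of the same name): the operation lives in a trait parameterized by type, and instances are supplied as implicits, so you can extend types you don't own without touching them.

```scala
// The type class: an operation defined separately from the types it works on.
trait Show[A] {
  def show(a: A): String
}

object Show {
  // Summon the instance for A and apply it.
  def show[A](a: A)(implicit s: Show[A]): String = s.show(a)

  // Instances for existing types -- no inheritance, no wrapper classes.
  implicit val intShow: Show[Int]       = (a: Int) => s"Int($a)"
  implicit val stringShow: Show[String] = (a: String) => s"String($a)"

  // Instances can be derived: a Show[A] gives us a Show[List[A]].
  implicit def listShow[A](implicit s: Show[A]): Show[List[A]] =
    (as: List[A]) => as.map(s.show).mkString("[", ", ", "]")
}

// Usage:
// Show.show(42)          => "Int(42)"
// Show.show(List(1, 2))  => "[Int(1), Int(2)]"
```

Adding a new type means writing one new implicit; adding a new operation means writing one new trait -- neither requires editing existing code, which is exactly the flexibility pattern matching alone doesn't give you.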

Running Dynomite on AWS with Docker in multi-host network Overlay

Dynomite is a kick-ass project. Basically, it allows you to have strong consistency on top of NoSQL databases. I've been using Dynomite for a while in production (on AWS) and I can say the core is rock solid and it just works.

Lots of developers use Windows or Mac, for instance, but Dynomite is built in C and is really meant for Linux (like all good things). So some time ago I made two simple projects to get started quickly with Dynomite. Basically, each project creates a simple three-node Dynomite cluster that you can run on your local machine with Docker.

There are two projects -- one creates a Dynomite cluster with Redis, the other with Facebook's RocksDB (experimental). So you can use them on your local machine to debug, and it works just fine. So why not go one step further and run Dynomite on AWS using Docker? There are cool benefits to this approach.
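One of those benefits is that, since Dynomite speaks the Redis protocol on its client port, any plain Redis client can talk to a node across the overlay network. A minimal Scala smoke test using Jedis might look like this (the hostname is a placeholder, and 8102 is Dynomite's usual default client port -- adjust both to your setup):

```scala
import redis.clients.jedis.Jedis

object DynomiteSmokeTest {
  def main(args: Array[String]): Unit = {
    // Dynomite exposes the Redis protocol on its client port, so a plain
    // Redis client works; point it at any node reachable on the overlay.
    val client = new Jedis("dynomite-node-1.example.com", 8102)
    try {
      client.set("greeting", "hello from the overlay")
      println(client.get("greeting"))
    } finally client.close()
  }
}
```

The write lands on one node, and Dynomite's replication makes it readable from the other nodes in the ring -- which is an easy way to verify the multi-host network is actually wired up.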