Kube Cloud Pt5 | Create a Consumer Service

Kube Cloud Pt5 | Streaming Events with Kafka

full course
  1. Kube Cloud Pt5 | Create a Consumer Service
  2. Kube Cloud Pt5 | Create an Event Publisher
  3. Kube Cloud Pt5 | Create an Event Consumer

In this course we’re going to migrate the repository storage function out of cloud-application and into a new service called message-repository. Although this is a relatively minor change in behavior (in exchange for a significant effort), this example will demonstrate some real-world benefits:

  • cloud-application service performance will improve significantly. If you review the traces we implemented last session, most of the time spent in this application goes to writing to MongoDB and waiting for its response.
  • By emitting an event, we can have multiple listeners consume that event, which makes our application design more flexible. If a new feature were added (logging to a reporting database, for example), we could simply pipe the event to that system.

More importantly, events are just fun (IMHO) and enable many flexible design patterns, such as an event-processing stream that can turn something that might otherwise be implemented as a multi-stage batch process into a scalable, real-time processing workflow.

There are some downsides to pay attention to:

  • Event-driven architectures come with a cost in terms of complexity
  • They can take more time to build
  • There are more failure points
  • They can be slower

These can all be mitigated with good design. In fact, event-driven systems can be faster than traditional synchronous systems when used properly.

Spring Cloud Stream

Spring Cloud Stream is an event streaming library that lets you chain together event producers, processors, and consumers via functions. Although I’m not very proficient with functional design, the concept maps onto event streaming well. It also lets you build on the great wealth of enterprise integration patterns that have been developed over time. Essentially, you create functions that encapsulate single-responsibility concerns and chain them together through those integration patterns. It’s very powerful, but we’re not going to cover much of it here.
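
To make the functional model concrete, here is a minimal sketch of what those functions can look like. The bean names (emit, enrich, persist) are placeholders for this example, not names we’ll use in our services; each bean is a plain java.util.function type that the binder maps to input and/or output destinations.

import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Supplier;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class StreamFunctions {

    // Producer: the binder polls this Supplier and publishes each value to an output destination
    @Bean
    public Supplier<String> emit() {
        return () -> "hello";
    }

    // Processor: reads from an input destination and writes its result to an output destination
    @Bean
    public Function<String, String> enrich() {
        return payload -> payload.toUpperCase();
    }

    // Consumer: the terminal step, invoked once per message on its input destination
    @Bean
    public Consumer<String> persist() {
        return payload -> System.out.println("storing: " + payload);
    }
}

Functions like these can be composed through configuration (for example, spring.cloud.function.definition=emit|enrich;persist), and the binder maps each resulting binding to a destination such as a Kafka topic.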

Kafka

Everyone seems to love Kafka these days (for good reason), and my goal here is not to get into the details of Kafka, but to build a simple messaging system that barely scratches the surface of what Kafka can do. For this, we’ll be using Confluent Cloud as our cloud-based Kafka provider. You’ll need to set up an account there to get started.

Setting up a Confluent Account

Click “Get started free”.

I don’t have screenshots for the whole login process, but you can figure it out. You will have to provide a credit card, but we won’t come close to being charged in this example. Additionally, I believe there are several ways to get additional credits if necessary.

Once we get to the point where we can create a cluster, we’ll follow this process: create a Basic cluster.

Create an AWS cluster in us-east-1 with single-zone availability (we don’t need failover).

Name your cluster and launch it.

Now we’ve got a cluster, and we’ll have to update our services to publish to it and consume from it.
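
As a preview of the next two parts, here’s a rough sketch of what the publishing side could look like in cloud-application. The binding name message-out-0 and the class name are placeholders for this example; the Confluent bootstrap server, API key, and secret from the cluster we just created would be supplied through the Kafka binder’s configuration rather than in code.

import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.stereotype.Service;

@Service
public class MessageEventPublisher {

    private final StreamBridge streamBridge;

    public MessageEventPublisher(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    // Instead of writing to MongoDB directly, cloud-application emits the message
    // and lets a downstream consumer (the message-repository service) persist it.
    public void publish(String payload) {
        streamBridge.send("message-out-0", payload);
    }
}

We’ll build the real publisher and consumer, along with their configuration, in the next two sections.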
