
Spring Cloud Stream Using Kafka




Recently I've been digging into event-driven architectures for service communication. For this demo I followed the broker approach rather than the feed approach, since the broker approach has a dedicated broker responsible for distributing events to any number of subscribers.

For the demo we will use Kafka (https://kafka.apache.org), the most popular open-source message broker.

As Spring (http://spring.io) always makes life easy 😎, the Spring Cloud Stream project makes Kafka integration in your application very easy 👍 with just a few annotations.

Spring Cloud Stream provides Binder implementations for Kafka, RabbitMQ, Redis, and Gemfire.

Note: Before we proceed, make sure you have installed ZooKeeper and Kafka, and that both are running.



Dependencies: Add the following dependencies to your POM to use Kafka.
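The original snippet did not survive; a minimal sketch of the POM additions, assuming the Spring Cloud Stream Kafka binder starter with versions managed by the Spring Cloud BOM:

```xml
<!-- Spring Cloud Stream with the Kafka binder -->
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-stream-kafka</artifactId>
</dependency>
<!-- Web starter for the REST endpoint used later -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
```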




Use Case: We will have one producer (the demo-producer application) and two consumers (the demo-consumer and demo-consumer1 applications).

The producer publishes events to the message broker on the topic "demochannel". Both consumers subscribe to the same topic "demochannel" to consume events. This means that as soon as an event is published, the broker notifies the respective subscribers to take action.



Producing Events: Create a Spring Boot project (demo-producer) with the above dependencies, and create the main application class DemoProducerApplication.java as below.
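The original class did not survive; a minimal sketch, assuming the package name and the annotation-based Spring Cloud Stream model (the SourceChannel binding interface and scheduling support are introduced in the next steps):

```java
// DemoProducerApplication.java -- Spring Boot entry point for the producer (sketch)
package com.example.demoproducer;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.scheduling.annotation.EnableScheduling;

@SpringBootApplication
@EnableBinding(SourceChannel.class) // binds the output channel to the broker
@EnableScheduling                   // enables the @Scheduled publisher below
public class DemoProducerApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoProducerApplication.class, args);
    }
}
```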


Create the SourceChannel interface.
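A sketch of the binding interface, assuming the interface and channel names; the channel name must match the binding configured in application.properties:

```java
// SourceChannel.java -- declares the output binding for topic "demochannel" (sketch)
package com.example.demoproducer;

import org.springframework.cloud.stream.annotation.Output;
import org.springframework.messaging.MessageChannel;

public interface SourceChannel {

    String CHANNEL = "demochannel";

    @Output(CHANNEL)
    MessageChannel output();
}
```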

The @Output annotation identifies an output channel, through which published messages leave the application.

Create a MessageProducer, which will publish a message every second.
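A sketch of the scheduled publisher, assuming the class name and payload format; @Scheduled requires @EnableScheduling somewhere on a configuration class:

```java
// MessageProducer.java -- publishes a message to the output channel every second (sketch)
package com.example.demoproducer;

import java.util.Date;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class MessageProducer {

    @Autowired
    private SourceChannel sourceChannel;

    // Fires once per second and sends a timestamped payload to "demochannel"
    @Scheduled(fixedRate = 1000)
    public void publish() {
        String payload = "Event raised at " + new Date();
        sourceChannel.output().send(MessageBuilder.withPayload(payload).build());
    }
}
```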


Add the @EnableBinding annotation to your application to get immediate connectivity to a message broker. 

Apart from publishing an event every second, we will also publish events via REST, so define a REST controller.
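A sketch of the controller; the endpoint path, parameter name, and response text are assumptions:

```java
// ProducerController.java -- publishes one event per REST request (sketch)
package com.example.demoproducer;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ProducerController {

    @Autowired
    private SourceChannel sourceChannel;

    // e.g. GET /publish?message=hello -> sends "hello" to "demochannel"
    @GetMapping("/publish")
    public String publish(@RequestParam String message) {
        sourceChannel.output().send(MessageBuilder.withPayload(message).build());
        return "Event published: " + message;
    }
}
```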


application.properties looks like this.
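The original file did not survive; a minimal sketch, assuming a local single-broker Kafka and the default binding property names of the Kafka binder:

```properties
server.port=8080

# Map the "demochannel" output channel to a Kafka topic of the same name
spring.cloud.stream.bindings.demochannel.destination=demochannel

# Local Kafka broker
spring.cloud.stream.kafka.binder.brokers=localhost
spring.cloud.stream.kafka.binder.defaultBrokerPort=9092
```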




Consuming Events: Create a Spring Boot project (demo-consumer) with the above dependencies, and create the main application class DemoConsumerApplication.java.
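A sketch of the consumer application, assuming the class names and that the listener simply logs each payload (the SinkChannel binding interface is created in the next step):

```java
// DemoConsumerApplication.java -- consumer entry point with a stream listener (sketch)
package com.example.democonsumer;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;

@SpringBootApplication
@EnableBinding(SinkChannel.class) // binds the input channel to the broker
public class DemoConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoConsumerApplication.class, args);
    }

    // Invoked for every event arriving on "demochannel"
    @StreamListener(SinkChannel.CHANNEL)
    public void consume(String message) {
        System.out.println("Received : " + message);
    }
}
```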

Create the Sink interface.
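A sketch of the input binding interface, assuming the interface name; the channel name mirrors the producer's output:

```java
// SinkChannel.java -- declares the input binding for topic "demochannel" (sketch)
package com.example.democonsumer;

import org.springframework.cloud.stream.annotation.Input;
import org.springframework.messaging.SubscribableChannel;

public interface SinkChannel {

    String CHANNEL = "demochannel";

    @Input(CHANNEL)
    SubscribableChannel input();
}
```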


The @Input annotation identifies an input channel, through which received messages enter the application.

application.properties looks like this.
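The original file did not survive; a minimal sketch, assuming a distinct consumer group per application so that each application receives every event (the pub-sub behavior this demo relies on):

```properties
server.port=8081

spring.cloud.stream.bindings.demochannel.destination=demochannel
# A group unique to this application; each group gets its own copy of every event
spring.cloud.stream.bindings.demochannel.group=demo-consumer

spring.cloud.stream.kafka.binder.brokers=localhost
spring.cloud.stream.kafka.binder.defaultBrokerPort=9092
```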



Follow exactly the same steps as for demo-consumer (defined above) to create the demo-consumer1 project. Once all required files are created, change application.properties as below (only the server port needs to change).
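A sketch of the second consumer's properties, assuming port 8082 and a distinct group name so it also receives every event:

```properties
server.port=8082

spring.cloud.stream.bindings.demochannel.destination=demochannel
spring.cloud.stream.bindings.demochannel.group=demo-consumer1

spring.cloud.stream.kafka.binder.brokers=localhost
spring.cloud.stream.kafka.binder.defaultBrokerPort=9092
```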






Steps to Execute

Start the installed ZooKeeper: sh zkServer.sh start

Start Kafka Server: sh bin/kafka-server-start.sh config/server.properties

Start the producer application (demo-producer)

Start both consumer applications (demo-consumer & demo-consumer1)

Check the console output of both consumers.


Hit the REST endpoint using curl: once the request is served, an event is published and consumed by both subscribers.
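A sketch of the curl call, assuming the hypothetical /publish endpoint and port 8080 from the producer configuration above; adjust to your actual controller mapping:

```shell
# Publish an event via the producer's REST endpoint; both consumers
# should then print the payload on their consoles.
curl "http://localhost:8080/publish?message=hello-kafka"
```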

Cheers – Happy learning 🙂

Ketan Gote
