What you'll learn:
- Introduction to AVRO and the advantages of using it to share messages between applications
- Publish AVRO records using Kafka Producer
- Introduction to Schema Registry
- Consume AVRO records using Kafka Consumer
- Use Schema Registry to register the AVRO Schema
- Learn how Kafka Producer and Consumer interact with the Schema Registry
- Enforce Data Contracts using Schema Registry
- Evolve AVRO schema using Schema Registry
- Build Spring Boot Kafka Producer and Consumer applications that use AVRO as a serialization format and interact with Schema Registry
This course is structured to give you a theoretical and coding experience of Building Kafka Applications using AVRO and Schema Registry.
If you are looking forward to learning the things listed below:
Techniques available to evolve data between applications that use Kafka as a streaming platform
Using a compact data format like AVRO to exchange data between applications
Using Schema Registry and its benefits
Enforcing data contracts between applications that use Kafka as a streaming platform
Handling data evolution gracefully using Schema Registry
Then this is the right course for you. This is a hands-on course where you will learn the concepts through code.
By the end of this course, you will have a complete understanding of these concepts:
Use AVRO as a data serialization format
Evolution of the data using Schema Registry
Getting Started with Kafka
In this section, I will give you an introduction to the course and what to expect from it.
Data Contract & Serialization in Kafka
Learn how serialization fits into Kafka and how it benefits the overall Kafka architecture.
We will look into different serialization formats and the support for schemas in AVRO, Protobuf, and Thrift.
Introduction to AVRO - A data serialization system
An introduction to AVRO and why it is a popular choice for working with Kafka and Schema Registry.
Learn to build a simple AVRO schema.
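As a taste of what this looks like, here is a minimal sketch of an AVRO schema for a greeting record (the field and namespace names are illustrative, not necessarily the exact ones used in the course):

```json
{
  "type": "record",
  "name": "Greeting",
  "namespace": "com.example.greeting",
  "fields": [
    { "name": "greeting", "type": "string" }
  ]
}
```

The schema is plain JSON, and the Gradle or Maven AVRO plugin generates a Java class from it.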
Kafka Setup & Demo in Local Using Docker
In this section, we will set up Kafka locally and produce and consume messages using the Kafka Console Producer and Consumer.
Greeting App - Base AVRO Project SetUp - Gradle
We will set up the base project for the greeting app, which we can use to generate the Java classes from the Greetings schema using the Gradle build tool.
Greeting App - Base AVRO Project SetUp - Maven
We will set up the base project for the greeting app, which we can use to generate the Java classes from the Greetings schema using the Maven build tool.
Build AVRO Producer and Consumer in Java
We will learn to build a Kafka Producer to publish AVRO records into the Kafka topic.
We will learn to build a Kafka Consumer to consume AVRO records from the Kafka topic.
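As a rough sketch, the essential producer and consumer settings for AVRO with Schema Registry look like this (broker and registry URLs are placeholders for a local setup):

```properties
# Producer side (URLs are assumed local defaults)
bootstrap.servers=localhost:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
schema.registry.url=http://localhost:8081

# Consumer side
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
# Deserialize into the generated Java class instead of GenericRecord
specific.avro.reader=true
```

The Confluent serializer registers the schema with the registry on first use and embeds only a schema ID in each record, which keeps the messages compact.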
CoffeeShop Order Service Using AVRO - A Real-Time Use Case
We will build an AVRO schema for a real-time use case and build Kafka producers and consumers for it.
Logical Types in AVRO
I will cover the different logical types in AVRO and how to use them.
TimeStamp
Decimal
UUID
Date
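A logical type annotates an underlying primitive type with extra meaning. As a sketch, the four types above could appear in a coffee-order schema like this (record and field names are illustrative):

```json
{
  "type": "record",
  "name": "CoffeeOrder",
  "namespace": "com.example.coffee",
  "fields": [
    { "name": "orderId",   "type": { "type": "string", "logicalType": "uuid" } },
    { "name": "orderedAt", "type": { "type": "long",   "logicalType": "timestamp-millis" } },
    { "name": "orderDate", "type": { "type": "int",    "logicalType": "date" } },
    { "name": "total",     "type": { "type": "bytes",  "logicalType": "decimal",
                                     "precision": 10,  "scale": 2 } }
  ]
}
```

On the wire the data is still a long, int, string, or bytes; the logical type tells the generated Java code to expose it as an `Instant`, `LocalDate`, `UUID`, or `BigDecimal`.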
AVRO Record- Under the Hood
Anatomy of an AVRO record as data is published to and consumed from Kafka.
Schema Changes in AVRO
Demonstration of how the consumer breaks when the schema changes with evolving business requirements.
Data Evolution using Schema Registry
We will cover the different techniques for evolving a schema with changing business requirements.
I will cover the different compatibility modes for sharing data between producer and consumer applications:
Backward Compatibility
Forward Compatibility
Full Compatibility
None Compatibility
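For example, under backward compatibility a common safe change is adding a field with a default. A consumer using this V2 schema (the added `language` field is illustrative) can still read records written with the original single-field V1 schema, because the default fills in the missing field:

```json
{
  "type": "record",
  "name": "Greeting",
  "namespace": "com.example.greeting",
  "fields": [
    { "name": "greeting", "type": "string" },
    { "name": "language", "type": "string", "default": "en" }
  ]
}
```

Removing that default, or adding a required field without one, would make the change incompatible under backward mode, and the Schema Registry would reject the new schema version.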
Schema Naming Strategies
I will cover the different naming strategies for schemas and how they impact the application's events.
TopicName Strategy
RecordName Strategy
TopicRecordName Strategy
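As a sketch of how these strategies map to Schema Registry subject names (assumed logic mirroring the documented behavior, not the actual registry client code):

```java
// Illustrative sketch of how each naming strategy derives the registry
// subject under which a schema is registered. Topic and record names
// below are hypothetical examples.
public class SubjectNames {

    // TopicNameStrategy (the default): subject = "<topic>-key" or "<topic>-value"
    static String topicName(String topic, boolean isKey) {
        return topic + (isKey ? "-key" : "-value");
    }

    // RecordNameStrategy: subject = fully qualified record name
    static String recordName(String recordFullName) {
        return recordFullName;
    }

    // TopicRecordNameStrategy: subject = "<topic>-<fully qualified record name>"
    static String topicRecordName(String topic, String recordFullName) {
        return topic + "-" + recordFullName;
    }

    public static void main(String[] args) {
        System.out.println(topicName("coffee-orders", false));
        System.out.println(recordName("com.example.CoffeeOrder"));
        System.out.println(topicRecordName("coffee-orders", "com.example.CoffeeOrder"));
    }
}
```

The practical difference: TopicNameStrategy ties one schema to one topic, while the record-based strategies allow multiple record types on the same topic, each evolving under its own subject.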
Build a Coffee Order Service using SpringBoot & Schema Registry
In this section, we will code and build a Spring Boot Kafka application that exchanges data in AVRO format and interacts with Schema Registry for data evolution.
We will build a RESTful service that receives events through a REST interface and publishes them to Kafka.