Apache Kafka
Introduction:
What is Event Streaming?
An event is basically any change in the state or logs of your app. When we share those events with others live, that is known as event streaming.
i.e.: Think of YouTube live streaming. Our stream is available to the public, so anyone can watch it to see what we are doing. If we are sharing our gameplay with the public, any change in the stream is an event, and sharing every event with the public as it happens is event streaming.
Technically speaking, event streaming is the practice of capturing data in real-time from event sources like databases, sensors, mobile devices, cloud services, and software applications in the form of streams of events; storing these event streams durably for later retrieval; manipulating, processing, and reacting to the event streams in real-time as well as retrospectively; and routing the event streams to different destination technologies as needed. Event streaming thus ensures a continuous flow and interpretation of data so that the right information is at the right place, at the right time.
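The two core ideas above, durable storage and independent replay, can be sketched in plain Python. This is a toy in-memory model for intuition only, not a Kafka client; the class and method names are illustrative assumptions:

```python
# Toy event stream: an append-only log that many consumers can
# read independently -- a simplified model of a Kafka topic.
class EventStream:
    def __init__(self):
        self._log = []  # events are kept in order and never removed

    def append(self, event):
        self._log.append(event)

    def read_from(self, offset=0):
        # Reading does not consume events; any consumer can replay
        # the stream from any offset, as many times as it wants.
        return self._log[offset:]

stream = EventStream()
stream.append({"type": "page_view", "user": "alice"})
stream.append({"type": "click", "user": "bob"})

consumer_a = stream.read_from(0)  # full history
consumer_b = stream.read_from(1)  # only the second event onward
```

A real broker adds partitioning, replication, and persistence on disk, but the mental model of an append-only, replayable log is the same.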
What is EDA (Event-Driven Architecture)?
Advantages of EDA
Disadvantages of EDA
EDA is a powerful approach for building scalable and responsive apps.
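A minimal sketch of the event-driven style, with hypothetical names rather than any specific framework: producers emit events, and any number of handlers react without the producer knowing who is listening.

```python
from collections import defaultdict

# Handlers subscribe to event types; the emitter dispatches events
# without knowing (or caring) who listens -- loose coupling.
handlers = defaultdict(list)
audit_log = []

def subscribe(event_type, handler):
    handlers[event_type].append(handler)

def emit(event_type, payload):
    for handler in handlers[event_type]:
        handler(payload)

# Two independent "services" react to the same event.
subscribe("order_placed", lambda p: audit_log.append(f"audit: {p['id']}"))
subscribe("order_placed", lambda p: audit_log.append(f"email: {p['id']}"))

emit("order_placed", {"id": 42})
```

Adding a third reaction later means adding one `subscribe` call; the emitting code never changes, which is where the scalability of EDA comes from.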
Synchronous Inter-Service Messaging between Microservices
There are two main approaches to inter-service communication in a microservices architecture:
Both synchronous and asynchronous communication have their advantages and disadvantages:
The best choice for your microservices communication will depend on the specific needs of your application. Consider factors like latency requirements, message volume, and desired level of coupling between services when making your decision.
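The trade-off can be sketched without any framework: a synchronous call blocks until the callee replies, while an asynchronous message is dropped on a queue and processed later. This is a toy in-process model with hypothetical service names:

```python
import queue

# Synchronous: the caller waits for the reply before continuing.
def inventory_service(item):
    return {"item": item, "in_stock": True}

def order_service_sync(item):
    reply = inventory_service(item)  # blocks until the reply arrives
    return f"ordered {item}" if reply["in_stock"] else "backordered"

# Asynchronous: the caller enqueues a message and moves on;
# a consumer (here drained manually) processes it later.
inbox = queue.Queue()

def order_service_async(item):
    inbox.put({"event": "order_requested", "item": item})
    return "request accepted"  # no waiting on the other service

result_sync = order_service_sync("book")
result_async = order_service_async("book")
pending = inbox.get_nowait()
```

The synchronous path gives an immediate, consistent answer but couples the caller's latency to the callee; the asynchronous path decouples them but means the work is only eventually done, which is exactly the trade-off described above.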
Synchronous Communication with FastAPI, Docker, and Poetry
Code Examples: FastAPI, Docker, and Poetry
Here are some code snippets to get you started:
FastAPI - Simple Hello World Endpoint:
Python
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def root():
    return {"message": "Hello World!"}
This code defines a basic FastAPI application with a single endpoint (/). The async def root function is an asynchronous handler, but it works for synchronous request/response communication as well.
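Stripped of FastAPI, the handler is just an ordinary coroutine: per request, the framework awaits it and serializes the returned dict as JSON. A minimal standalone illustration of that (driving the coroutine directly, without a server):

```python
import asyncio

async def root():
    return {"message": "Hello World!"}

# FastAPI would await this once per request; here we run it directly.
result = asyncio.run(root())
print(result)  # {'message': 'Hello World!'}
```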
Dockerfile (Basic):
FROM python:3.9
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
This Dockerfile defines a container based on the Python 3.9 image. It copies the requirements.txt file (which would list your FastAPI dependencies) and installs them. Then, it copies your application code and finally runs the uvicorn command to start the FastAPI app.
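Since the section title mentions Poetry, a minimal pyproject.toml for such a service might look like the sketch below. The package name and version constraints are illustrative assumptions, not a prescribed setup:

```toml
[tool.poetry]
name = "fastapi-service"        # illustrative name
version = "0.1.0"
description = "Example FastAPI microservice"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.9"
fastapi = "*"
uvicorn = "*"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

With Poetry you would then run poetry install instead of pip install -r requirements.txt, or export a requirements file for the Dockerfile above.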
SQLModel - Basic User Model:
Python
from typing import Optional

from sqlmodel import Field, SQLModel

class User(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str = Field(max_length=255)
This code defines a basic User model for a database table. The id field is the primary key (left unset so the database can assign it), and the name field is a string limited to 255 characters.
A Challenge: FastAPI Event Driven Microservices Development With Kafka, KRaft, Docker Compose, and Poetry
Kafka 3.7 Docker Images
Follow this Quick Start with Docker and KRaft: https://kafka.apache.org/quickstart
Get the docker image
docker pull apache/kafka:3.7.0
Start the kafka docker container
docker run -p 9092:9092 apache/kafka:3.7.0
Open another console and check whether the container is running:
docker ps
Copy the container name, and give the following command to attach:
docker exec -it <container_name> /bin/bash
Note: you can also use the first four digits of the container ID.
After getting into interactive mode, check your position:
ls
Note: the Kafka command-line tools live in this directory inside the container:
cd /opt/kafka/bin
then:
ls
Create a Topic to store your Events
Ready to store your events in Kafka? Here's what you need to do first:
So before you can write your first events, you must create a topic. Run:
/opt/kafka/bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092
All of Kafka's command-line tools have additional options: run the kafka-topics.sh command without any arguments to display usage information. For example, it can also show you details such as the partition count of the new topic:
/opt/kafka/bin/kafka-topics.sh --describe --topic quickstart-events --bootstrap-server localhost:9092
Write events to a Kafka topic
A Kafka client communicates with the Kafka brokers via the network for writing (or reading) events. Once received, the brokers will store the events in a durable and fault-tolerant manner for as long as you need—even forever.
Run the console producer client to write a few events into your topic. By default, each line you enter will result in a separate event being written to the topic.
/opt/kafka/bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092
Generate events:
Hello this is my event
I'm Hamza Waheed
Read the Events
Open another terminal session and run the console consumer client to read the events you just created:
/opt/kafka/bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092
You should see the events you just created:
Hello this is my event
I'm Hamza Waheed
Because events are durably stored in Kafka, they can be read as many times and by as many consumers as you want. You can easily verify this by opening yet another terminal session and re-running the previous command again.