Kafka Zero to Hero

Intro + Hands-On for DevOps & Developers

Main Speaker

Learning Tracks

Course ID

52048

Date

22-07-2025

Time

Daily seminar
9:00-16:30

Location

Daniel Hotel, 60 Ramat Yam st. Herzliya

Overview

Who Should Attend

  • DevOps engineers & developers with no Kafka experience
 

Prerequisites

  • Basic comfort with the command line and Docker is required
  • Python knowledge is preferred, but all code is guided step by step

Course Contents

  • Introduction to Kafka & Its Ecosystem
    • What is Apache Kafka? Why use it?
    • Kafka vs. traditional message queues (RabbitMQ, etc.)
    • Overview of ecosystem: Kafka Core, Kafka Connect, Schema Registry, Kafka Streams, KRaft
   
  • Kafka Architecture Deep Dive (KRaft Mode)
    • Brokers, Topics, Partitions, Producers, Consumers
    • How Kafka achieves durability and fault tolerance
    • KRaft architecture (Controller + Brokers)
    • Leader election and replication under KRaft
    • Log-based storage and retention
    • 🔧 Hands-On:
      • Deploy Kafka in KRaft mode using Docker Compose
      • Explore broker logs and controller election
      • Verify topic creation and inspect internal topics
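The deployment step above can start from a very small Compose file. This is a minimal sketch: it assumes the official apache/kafka image (3.7 or later), which boots a single-node KRaft cluster (combined broker + controller) with its default configuration, advertising localhost:9092 to host clients. The image tag and container name are placeholders to adjust for the workshop.

```yaml
services:
  kafka:
    # apache/kafka 3.7+ defaults to single-node KRaft (no ZooKeeper)
    image: apache/kafka:3.7.0
    container_name: kafka
    ports:
      - "9092:9092"   # PLAINTEXT listener for host-side producers/consumers
```

After `docker compose up -d`, broker and controller logs (`docker logs kafka`) show the controller election, and the CLI tools inside the container can list internal topics.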
   
  • Topics and Producers
    • What are topics and partitions?
    • Keys and partitioning
    • Delivery semantics (at most once, at least once, exactly once – intro only)
    • Producing data to topics
    • 🔧 Hands-On:
      • Create topics via CLI
      • Produce messages using Kafka CLI and kafka-console-producer.sh
      • Test partitioning using keys
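The key-partitioning exercise above rests on one idea: the client hashes the message key and takes it modulo the partition count, so the same key always lands on the same partition. The real clients use murmur2 over the key bytes; the sketch below substitutes CRC-32 purely to illustrate the deterministic mapping, and the partition count is a made-up example value.

```python
import zlib

NUM_PARTITIONS = 3  # example value; real topics declare this at creation time

def pick_partition(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Illustrative stand-in for Kafka's default partitioner.

    Kafka clients actually hash the key bytes with murmur2; crc32 is
    used here only to show the property that matters: the mapping is
    deterministic, so all messages with one key share a partition and
    keep their relative order.
    """
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Same key -> same partition; keyless messages would instead be
# distributed round-robin / sticky by the client.
for key in ["user-1", "user-2", "user-1"]:
    print(key, "->", pick_partition(key))
```

In the hands-on, the same behavior is observed by producing keyed messages with `kafka-console-producer.sh` and checking which partition each consumer reads them from.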
   
  • Consumers and Consumer Groups
    • Pull-based consumption model
    • Offset tracking, auto-commit vs manual commit
    • Consumer groups and parallelism
    • Rebalancing mechanics
    • 🔧 Hands-On:
      • Consume from topic using kafka-console-consumer.sh
      • Demonstrate group-based parallel consumption
      • Observe offset behavior using CLI tools
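The group-based parallelism demonstrated above comes from partition assignment: each partition is owned by exactly one consumer in the group, and adding or removing a consumer triggers a rebalance. This sketch mimics the range-style assignor for a single topic; it is a simplified model of the idea, not the client library's implementation.

```python
def range_assign(partitions: list[int], consumers: list[str]) -> dict[str, list[int]]:
    """Simplified range-style assignment for one topic.

    Partitions are split into contiguous chunks, one chunk per consumer
    (sorted by member id); when the division is uneven, the first few
    consumers each take one extra partition.
    """
    consumers = sorted(consumers)
    per, extra = divmod(len(partitions), len(consumers))
    assignment, start = {}, 0
    for i, member in enumerate(consumers):
        count = per + (1 if i < extra else 0)
        assignment[member] = partitions[start:start + count]
        start += count
    return assignment

# Six partitions over two consumers: three each. A third consumer
# joining forces a rebalance into a 2/2/2 split.
print(range_assign(list(range(6)), ["c1", "c2"]))
print(range_assign(list(range(6)), ["c1", "c2", "c3"]))
```

Running two `kafka-console-consumer.sh` instances with the same `--group` shows the same effect live: each instance receives only its assigned partitions.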
   
  • Python Producers and Consumers (Generated with ChatGPT)
    • Install & use confluent-kafka or kafka-python
    • Structure of a Kafka producer and consumer
    • GPT-assisted: Generate a full Python producer/consumer pair in real-time
    • 🔧 Hands-On:
      • Use ChatGPT to generate a Kafka producer that sends JSON messages
      • Generate a Kafka consumer that listens and prints formatted output
      • Test sending logs, user events, or mock sensor data
      • Add basic error handling and retries
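One possible shape of the producer this module asks participants to generate: JSON serialization, a delivery callback, and basic retry on a full local queue. It assumes the confluent-kafka package and a broker at localhost:9092; the topic name and event fields are illustrative. The Kafka import is kept local to `send_events` so the serialization helper can be exercised without the dependency installed.

```python
import json
import time

def build_event(user: str, action: str) -> bytes:
    """Serialize one event as UTF-8 JSON, the wire format the consumer expects."""
    return json.dumps({"user": user, "action": action, "ts": int(time.time())}).encode("utf-8")

def send_events(events, bootstrap="localhost:9092", topic="user-events", retries=3):
    """Produce JSON events with basic retry; assumes a live broker at `bootstrap`."""
    from confluent_kafka import Producer  # local import: sketch loads without the lib

    producer = Producer({"bootstrap.servers": bootstrap, "acks": "all"})

    def on_delivery(err, msg):
        if err is not None:
            print(f"delivery failed: {err}")

    for user, action in events:
        for _ in range(retries):
            try:
                producer.produce(topic, key=user,
                                 value=build_event(user, action),
                                 on_delivery=on_delivery)
                break
            except BufferError:   # local queue full: drain, then retry
                producer.poll(1)
        producer.poll(0)          # serve pending delivery callbacks
    producer.flush(10)            # wait (up to 10 s) for queued messages

# Against a running broker:
# send_events([("alice", "login"), ("bob", "click")])
```

The matching consumer subscribes to the same topic, decodes each message value with `json.loads`, and prints a formatted line per event.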
   
  • Kafka Connect: Basic Ingestion Pipeline
    • Kafka Connect overview
    • Connectors: source vs sink
    • Common use cases (file, database, S3, Elasticsearch)
    • 🔧 Hands-On:
      • Deploy Kafka Connect with FileStream connector
      • Send a log file into Kafka
      • Consume the ingested messages via Python or CLI
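The FileStream exercise above boils down to one connector definition posted to Kafka Connect's REST API (port 8083 by default). A minimal sketch, with the file path and topic name as placeholders:

```json
{
  "name": "file-source-demo",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "/tmp/app.log",
    "topic": "file-log-events"
  }
}
```

Posting this JSON to `http://localhost:8083/connectors` starts a task that tails the file and writes each new line into the topic, where it can be read back with the console consumer or the Python consumer from the previous module.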
   
  • Kafka Use Cases and DevOps Patterns
    • Kafka as a message bus for microservices
    • Event-driven pipelines
    • Kafka in CI/CD and infrastructure workflows
    • Logging, tracing, metrics pipelines (mention only)
    • 🔧 Hands-On Discussion:
      • Small brainstorming of where Kafka fits in real systems
      • GPT-assisted example: building a “log aggregator” or “event bus”
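To seed the "log aggregator" discussion, here is the kind of aggregation logic such a consumer might apply. It is a pure-Python sketch: the `raw_messages` list stands in for the values a Kafka consumer would poll from a hypothetical `logs` topic, and the JSON shape is an assumption.

```python
import json
from collections import Counter

def aggregate_levels(raw_messages):
    """Count log records per severity level.

    `raw_messages` stands in for values polled from a logs topic:
    UTF-8 JSON like {"level": "ERROR", "msg": "..."}.
    """
    counts = Counter()
    for raw in raw_messages:
        record = json.loads(raw)
        counts[record.get("level", "UNKNOWN")] += 1
    return dict(counts)

sample = [
    b'{"level": "INFO", "msg": "started"}',
    b'{"level": "ERROR", "msg": "db timeout"}',
    b'{"level": "INFO", "msg": "request ok"}',
]
print(aggregate_levels(sample))
```

Wired to a real consumer loop, the same function runs over each polled batch, turning a raw log stream into per-level metrics.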
   
  • Mini Project / Wrap-Up
    • Build a small Kafka-powered workflow:
      • Python producer (sends event logs or mock orders)
      • Kafka topic
      • Python consumer (writes to file or logs)
    • Cleanup, review CLI commands, Docker teardown
    • Tips on scaling and what to explore next (Streams, ksqlDB, Schema Registry)
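The mini-project flow above (producer → topic → consumer → file) can be dry-run before touching a broker. In this sketch an in-memory queue stands in for the Kafka topic, and the mock-order fields are invented for illustration; swapping the queue for real produce/consume calls yields the actual project.

```python
import json
import queue

# In-memory stand-in for the Kafka topic, so the workflow can be
# rehearsed end-to-end without a running broker.
topic = queue.Queue()

def produce_orders(orders):
    """Producer side: serialize mock orders as JSON, as the real producer would."""
    for order in orders:
        topic.put(json.dumps(order).encode("utf-8"))

def consume_to_lines():
    """Consumer side: drain the 'topic' and build the lines a file sink would write."""
    lines = []
    while not topic.empty():
        order = json.loads(topic.get())
        lines.append(f"order {order['id']}: {order['item']} x{order['qty']}")
    return lines

produce_orders([
    {"id": 1, "item": "widget", "qty": 2},
    {"id": 2, "item": "gadget", "qty": 1},
])
print("\n".join(consume_to_lines()))
```

The design point worth discussing in the wrap-up: because both ends only agree on the topic and the JSON schema, producer and consumer stay decoupled and can be scaled or replaced independently.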
 
