Built to scratch a personal itch: understanding how log pipelines work end to end, not a faithful reproduction of how it is done in production
Mock services (auth, order, payment) publish logs to Kafka; a log processor consumes them, indexes them into Elasticsearch, and republishes them to a live topic for streaming
Real-time log streaming to a React dashboard over WebSocket, with filtering by log level and service
Historical log search routed HTTP → gRPC → Elasticsearch, with cursor-based pagination via search_after
Backend health polling from the frontend to track gateway reachability
Backend in Go using Gin, gRPC, Sarama (Kafka client), and Gorilla WebSocket; multiple independent services, each containerized
Frontend scaffolded with AI assistance in React 19, TypeScript, Vite, and Tailwind CSS, with a chip-based query builder and a paginated results table
Full stack spun up locally via Docker Compose with Kafka, Zookeeper, Elasticsearch, Kibana and Grafana
Personal learning project exploring distributed logging patterns with modern tech stack
Hands-on experience with event-driven architectures and real-time data streaming