kafka-ops
Kafka Operations
Expert in Apache Kafka deployment, monitoring, and operational tooling.
⚠️ Chunking Rule
A full Kafka infrastructure setup can run to 800+ lines. Generate ONE component per response:
1. Deployment → 2. Monitoring → 3. CLI Tools → 4. Automation
Core Capabilities
Kubernetes Deployment
- Strimzi Operator: Open-source Kafka on K8s
- Confluent for Kubernetes: Enterprise Kafka
- MSK/Confluent Cloud: Managed services
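Before applying any Kafka custom resources, the Strimzi operator itself has to be installed. A minimal sketch following the Strimzi quickstart (the `kafka` namespace and the manifest filename are assumptions):

```shell
# Install the Strimzi operator into the 'kafka' namespace
kubectl create namespace kafka
kubectl create -f 'https://strimzi.io/install/latest?namespace=kafka' -n kafka

# Apply the Kafka CR (see the cluster example below) and wait for readiness
kubectl apply -f kafka-cluster.yaml -n kafka
kubectl wait kafka/my-cluster --for=condition=Ready --timeout=300s -n kafka
```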
Infrastructure as Code
- Terraform modules for Kafka clusters
- AWS MSK, Confluent Cloud, Aiven provisioning
- Network and security configuration
Observability
- Prometheus metrics (JMX exporter)
- Grafana dashboards for Kafka
- Consumer lag monitoring
- Alert configuration
CLI Tools
- kcat: Swiss army knife for Kafka
- kafkactl: Modern CLI for Kafka
- kafka-console-*: Built-in tools
Kubernetes Deployment
# Strimzi Kafka Cluster
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster
spec:
  kafka:
    replicas: 3
    listeners:
      - name: plain
        port: 9092
        type: internal
        tls: false
    storage:
      type: persistent-claim
      size: 100Gi
  zookeeper:
    replicas: 3
    storage:
      type: persistent-claim
      size: 50Gi
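Topics can be managed declaratively alongside the cluster via Strimzi's Topic Operator. A minimal sketch (the topic name, partition count, and config values are illustrative):

```yaml
# Declarative topic managed by the Strimzi Topic Operator
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaTopic
metadata:
  name: orders                      # illustrative topic name
  labels:
    strimzi.io/cluster: my-cluster  # must match the Kafka CR name above
spec:
  partitions: 6
  replicas: 3
  config:
    retention.ms: 604800000         # 7 days
    cleanup.policy: delete
```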
Terraform
# AWS MSK Cluster
resource "aws_msk_cluster" "kafka" {
  cluster_name           = "my-kafka-cluster"
  kafka_version          = "3.5.1"
  number_of_broker_nodes = 3

  broker_node_group_info {
    instance_type   = "kafka.m5.large"
    client_subnets  = var.private_subnets
    security_groups = [aws_security_group.kafka.id]

    storage_info {
      ebs_storage_info {
        volume_size = 100
      }
    }
  }
}
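Broker-level settings for MSK are supplied through a separate `aws_msk_configuration` resource referenced from the cluster. A sketch under the same naming assumptions as above (the `server_properties` values are illustrative, not recommendations):

```hcl
# Custom broker configuration for the MSK cluster
resource "aws_msk_configuration" "kafka" {
  name           = "my-kafka-cluster-config"
  kafka_versions = ["3.5.1"]

  server_properties = <<PROPERTIES
auto.create.topics.enable = false
default.replication.factor = 3
min.insync.replicas = 2
PROPERTIES
}

# Wire it into the aws_msk_cluster resource:
#   configuration_info {
#     arn      = aws_msk_configuration.kafka.arn
#     revision = aws_msk_configuration.kafka.latest_revision
#   }
```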
Monitoring
# Prometheus scrape config
- job_name: 'kafka'
  static_configs:
    - targets: ['kafka-1:9404', 'kafka-2:9404', 'kafka-3:9404']
  relabel_configs:
    - source_labels: [__address__]
      target_label: instance
Key metrics to monitor:
- kafka_server_brokertopicmetrics_messagesin_total
- kafka_consumer_consumer_fetch_manager_metrics_records_lag
- kafka_server_replicamanager_underreplicatedpartitions
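These metrics feed directly into alerting. A minimal Prometheus alerting-rule sketch; the thresholds are illustrative, and the exact metric names depend on your JMX exporter mapping, so treat both as assumptions:

```yaml
groups:
  - name: kafka-alerts
    rules:
      - alert: KafkaUnderReplicatedPartitions
        expr: kafka_server_replicamanager_underreplicatedpartitions > 0
        for: 5m
        labels:
          severity: critical
        annotations:
          summary: "Broker {{ $labels.instance }} has under-replicated partitions"
      - alert: KafkaConsumerLagHigh
        expr: kafka_consumer_consumer_fetch_manager_metrics_records_lag > 10000
        for: 10m
        labels:
          severity: warning
        annotations:
          summary: "Consumer lag above 10k records for {{ $labels.instance }}"
```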
CLI Examples
# kcat - produce message
echo '{"event":"order.created"}' | kcat -P -b localhost:9092 -t orders
# kcat - consume messages
kcat -C -b localhost:9092 -t orders -o beginning
# kafkactl - describe topic
kafkactl describe topic orders
# kafkactl - consumer groups
kafkactl get consumer-groups
kafkactl describe consumer-group order-processor
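The built-in kafka-console-* tools listed earlier cover the same tasks without extra installs. A sketch mirroring the examples above (broker address, topic, and group names are carried over from those examples):

```shell
# Produce a message with the built-in console producer
echo '{"event":"order.created"}' | kafka-console-producer.sh \
  --bootstrap-server localhost:9092 --topic orders

# Consume from the beginning of the topic
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic orders --from-beginning

# Inspect consumer group membership and lag
kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --describe --group order-processor
```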
When to Use
- Deploying Kafka on Kubernetes
- Setting up Kafka with Terraform
- Configuring monitoring and alerts
- Operational tasks with CLI tools
- Troubleshooting Kafka issues