CRUD API on Kafka
Get started with Zilla by deploying our Docker Compose stack. Before proceeding, you should have Docker Compose installed.
Running this Zilla sample creates a simple API that uses the Zilla REST Kafka proxy to expose common entity CRUD endpoints, with the entity data stored on Kafka topics. Leveraging Kafka's cleanup.policy=compact feature, Zilla enables a standard REST backend architecture with Kafka as the storage layer. Adding an Idempotency-Key header during creation sets the message key, which acts as the ID for the record. A UUID is generated if no key is sent.
- GET - Fetch all items on the topic, or fetch one item by its key using /:key.
- POST - Create a new item, with the Idempotency-Key header setting the key.
- PUT - Update an item by its key using /:key.
- DELETE - Delete an item by its key using /:key.
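The CRUD mapping works because log compaction keeps only the latest record per key: every write produces a record keyed by the item's ID, a DELETE produces a tombstone (null value), and a fetch returns the latest surviving value per key. A minimal sketch of that model in Python, with a plain dict standing in for the compacted topic (all names here are illustrative, not Zilla or Kafka APIs):

```python
import uuid

# A dict stands in for a compacted Kafka topic: compaction keeps only
# the latest record per key, which is exactly what a dict does.
topic = {}

def post(item, idempotency_key=None):
    # POST: the Idempotency-Key header becomes the message key;
    # a UUID is generated when no key is sent.
    key = idempotency_key or str(uuid.uuid4())
    topic[key] = item
    return key

def put(key, item):
    topic[key] = item   # PUT: produce a new value for the key

def delete(key):
    topic[key] = None   # DELETE: produce a tombstone (null value)

def get(key=None):
    # Fetches skip tombstoned keys, so deleted items disappear.
    live = {k: v for k, v in topic.items() if v is not None}
    if key is None:
        return list(live.values())   # GET /items
    return live.get(key)             # GET /items/{id}

post({"greeting": "Hello, world"}, idempotency_key="1234")
print(get())   # [{'greeting': 'Hello, world'}]
delete("1234")
print(get())   # []
```

This is why no separate database is needed: the compacted topic itself is the latest-state store.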
Setup
Create these two files, zilla.yaml and docker-compose.yaml, in the same directory.

zilla.yaml
name: REST-example
bindings:
# Proxy service entrypoint
north_tcp_server:
type: tcp
kind: server
options:
host: 0.0.0.0
port: 7114
exit: north_http_server
north_http_server:
type: http
kind: server
routes:
- when:
- headers:
:scheme: http
:authority: localhost:7114
exit: north_http_kafka_mapping
# Proxy REST endpoints to a Kafka topic
north_http_kafka_mapping:
type: http-kafka
kind: proxy
routes:
- when:
- method: POST
path: /items
exit: north_kafka_cache_client
with:
capability: produce
topic: items-snapshots
key: ${idempotencyKey}
- when:
- method: GET
path: /items
exit: north_kafka_cache_client
with:
capability: fetch
topic: items-snapshots
merge:
content-type: application/json
- when:
- method: GET
path: /items/{id}
exit: north_kafka_cache_client
with:
capability: fetch
topic: items-snapshots
filters:
- key: ${params.id}
- when:
- method: PUT
path: /items/{id}
exit: north_kafka_cache_client
with:
capability: produce
topic: items-snapshots
key: ${params.id}
- when:
- method: DELETE
path: /items/{id}
exit: north_kafka_cache_client
with:
capability: produce
topic: items-snapshots
key: ${params.id}
# Kafka sync layer
north_kafka_cache_client:
type: kafka
kind: cache_client
exit: south_kafka_cache_server
south_kafka_cache_server:
type: kafka
kind: cache_server
options:
bootstrap:
- items-snapshots
exit: south_kafka_client
# Connect to local Kafka
south_kafka_client:
type: kafka
kind: client
options:
servers:
- ${{env.KAFKA_BOOTSTRAP_SERVER}}
exit: south_kafka_tcp_client
south_kafka_tcp_client:
type: tcp
kind: client
telemetry:
exporters:
stdout_logs_exporter:
type: stdout
docker-compose.yaml
version: '3'
services:
zilla:
image: ghcr.io/aklivity/zilla:latest
pull_policy: always
depends_on:
- kafka
ports:
- 7114:7114
environment:
KAFKA_BOOTSTRAP_SERVER: "kafka:29092"
volumes:
- ./zilla.yaml:/etc/zilla/zilla.yaml
command: start -v -e
kafka:
image: bitnami/kafka:3.5
hostname: kafka
ports:
- 9092:9092
- 29092:29092
environment:
ALLOW_PLAINTEXT_LISTENER: "yes"
KAFKA_CFG_NODE_ID: "1"
KAFKA_CFG_BROKER_ID: "1"
KAFKA_CFG_CONTROLLER_QUORUM_VOTERS: "1@127.0.0.1:9093"
KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP: "CLIENT:PLAINTEXT,INTERNAL:PLAINTEXT,CONTROLLER:PLAINTEXT"
KAFKA_CFG_CONTROLLER_LISTENER_NAMES: "CONTROLLER"
KAFKA_CFG_LOG_DIRS: "/tmp/logs"
KAFKA_CFG_PROCESS_ROLES: "broker,controller"
KAFKA_CFG_LISTENERS: "CLIENT://:9092,INTERNAL://:29092,CONTROLLER://:9093"
KAFKA_CFG_INTER_BROKER_LISTENER_NAME: "INTERNAL"
KAFKA_CFG_ADVERTISED_LISTENERS: "CLIENT://localhost:9092,INTERNAL://kafka:29092"
KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE: "true"
kafka-init:
image: bitnami/kafka:3.5
command:
- "/bin/bash"
- "-c"
- |
/opt/bitnami/kafka/bin/kafka-topics.sh --bootstrap-server kafka:29092 --create --if-not-exists --topic items-snapshots --config cleanup.policy=compact
depends_on:
- kafka
init: true
Run Zilla and Kafka
docker-compose up --detach
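Kafka and Zilla can take a few seconds to start. One way to wait until the API is ready before sending requests (a convenience loop, not part of the sample itself):

```shell
# Poll the items endpoint until Zilla responds with a success status
until curl -sf http://localhost:7114/items > /dev/null; do sleep 1; done
```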
Use curl to send a greeting
curl -X POST http://localhost:7114/items -H 'Content-Type: application/json' -H 'Idempotency-Key: 1234' -d '{"greeting":"Hello, world"}'
Use curl to list all of the greetings
curl http://localhost:7114/items
[{"greeting":"Hello, world"}]
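With the stack still running, the remaining endpoints from the list above can be exercised the same way; the key 1234 matches the Idempotency-Key sent earlier:

```shell
# Fetch one item by its key
curl http://localhost:7114/items/1234

# Update the item; the {id} path parameter becomes the message key
curl -X PUT http://localhost:7114/items/1234 -H 'Content-Type: application/json' -d '{"greeting":"Hello, again"}'

# Delete the item, producing a tombstone on the compacted topic
curl -X DELETE http://localhost:7114/items/1234
```

After the DELETE, listing /items again returns an empty array.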
Remove the running containers
docker-compose down
See more of what Zilla can do
Go deeper into this concept with the http.kafka.crud example.
Going Deeper
Try out more HTTP Kafka examples: