Docker
Docker-compose
Postgres
Debezium
Zookeeper
Kafka
- Kafka must produce topics with the change events captured from Postgres
- Consume created topics
- Installing Docker infrastructure
- Configure the Dockerfile for the database image (a sketch follows)
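A minimal sketch, assuming the official postgres:13 base image and an init script named init.sql (the file name is an assumption) copied into the image's auto-run init directory:

# Dockerfile (sketch)
FROM postgres:13
# scripts placed in /docker-entrypoint-initdb.d/ run automatically on first startup
COPY init.sql /docker-entrypoint-initdb.d/init.sql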
- Creating an SQL initialization script that creates and seeds the test table (example below)
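A possible init.sql, assuming a student table with an integer id and a text name column; the column types and seed rows are assumptions, while the column names follow the INSERT/UPDATE statements used later:

-- init.sql (sketch): create and seed the test table
CREATE TABLE student (
    id   INTEGER PRIMARY KEY,
    name VARCHAR(100) NOT NULL
);
INSERT INTO student VALUES (1, 'Batman'), (2, 'Superman'), (3, 'Thor');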
- Configure the Docker-compose services (a condensed sketch follows the list):
- PostgreSQL 13
- zookeeper
- kafka
- debezium
- Creating the schema-registry configuration for Debezium Connect (included in the sketch below)
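A condensed docker-compose.yaml sketch, assuming Confluent images for ZooKeeper, Kafka and the schema registry and the debezium/connect image for Kafka Connect; service names, versions, ports and credentials are assumptions and must stay consistent with the Dockerfile above and the debezium.json used later:

version: '3'
services:
  postgres:
    build: .                                  # image built from the Dockerfile above
    environment:
      POSTGRES_USER: docker
      POSTGRES_PASSWORD: docker
      POSTGRES_DB: exampledb
    command: postgres -c wal_level=logical    # logical decoding is required by Debezium
    ports: ["5432:5432"]
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.3.0
    depends_on: [zookeeper]
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  schema-registry:
    image: confluentinc/cp-schema-registry:7.3.0
    depends_on: [kafka]
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: kafka:9092
    ports: ["8081:8081"]
  debezium:
    image: debezium/connect:1.9
    depends_on: [kafka, postgres, schema-registry]
    environment:
      BOOTSTRAP_SERVERS: kafka:9092
      GROUP_ID: 1
      CONFIG_STORAGE_TOPIC: connect_configs
      OFFSET_STORAGE_TOPIC: connect_offsets
      STATUS_STORAGE_TOPIC: connect_statuses
      # Avro converters pointing at the schema registry; this assumes the
      # Connect image has the Confluent Avro converter available
      KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: http://schema-registry:8081
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: http://schema-registry:8081
    ports: ["8083:8083"]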
- Initiating the stack from docker-compose.yaml:
$ docker-compose up -d
- Registering the Debezium connector for Postgres (an example debezium.json follows the command):
curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" 127.0.0.1:8083/connectors/ --data "@debezium.json"
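A possible debezium.json, written against the Debezium 1.x Postgres connector properties; the hostname, credentials and server name are assumptions that must match the compose file, and database.server.name is what yields the postgres.public.student topic name used below:

{
  "name": "exampledb-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "plugin.name": "pgoutput",
    "database.hostname": "postgres",
    "database.port": "5432",
    "database.user": "docker",
    "database.password": "docker",
    "database.dbname": "exampledb",
    "database.server.name": "postgres",
    "table.include.list": "public.student"
  }
}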
- Test the created connector:
curl -i http://127.0.0.1:8083/connectors/exampledb-connector
- Get default container network name:
docker network list
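With Docker Compose the default network is normally named after the project directory with a _default suffix (e.g. myproject_default); that is the name to substitute below.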
- change "kafka-net" to container network name and initiate the kafka topics monitoring
docker run --tty --network kafka-net confluentinc/cp-kafkacat kafkacat -b kafka:9092 -C -s key=s -s value=avro -r http://schema-registry:8081 -t postgres.public.student
- Insert SQL data to see the change events appear in the Kafka topic:
- Open an additional terminal window
- List the running Docker containers:
$ docker ps
- Enter the container running PostgreSQL:
$ docker exec -it postgres_container_id /bin/sh
- Open a psql session to the exampledb database:
$ psql -U docker -d exampledb -W
- Insert new data into the student table:
$ INSERT INTO student VALUES (04, 'Ironman');
- Kafka will deliver a new change event with the values (4, 'Ironman') in the host terminal (Debezium writes a new message to the existing topic rather than creating a new topic)
- Update data in the student table:
$ UPDATE student SET name='Hulk' WHERE id=04;
- Kafka will deliver a new change event with the values (4, 'Hulk') in the host terminal
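For reference, each consumed value is a Debezium change-event envelope. Simplified (source metadata and timestamps omitted), the INSERT above decodes to roughly:

{"before": null, "after": {"id": 4, "name": "Ironman"}, "op": "c"}

The op field is "c" for inserts and "u" for updates; for updates, before carries the full previous row only if the table's REPLICA IDENTITY is set to FULL.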