Kafka producer for recording HTTP traffic



Tools for writing ts-http-types to Kafka in Node.js, powered by kafkajs.

The library is a wrapper around kafkajs, so kafkajs must be installed as a peer dependency.

$ yarn add kafkajs http-types-kafka
# or
$ npm i kafkajs http-types-kafka

Quick start

First create the topic you're writing to:

$ kafka-topics.sh --bootstrap-server localhost:9092 --topic express_recordings --create --partitions 3 --replication-factor 1

Note that you may need to change the script name depending on how you installed Kafka.

Create an HttpTypesKafkaProducer and connect to Kafka:

import { CompressionTypes, KafkaConfig, ProducerConfig } from "kafkajs";
import { HttpTypesKafkaProducer } from "http-types-kafka";

// Create a `KafkaConfig` instance (from kafkajs)
const brokers = ["localhost:9092"];
const kafkaConfig: KafkaConfig = {
  clientId: "client-id",
  brokers,
};

const producerConfig: ProducerConfig = { idempotent: false };

// Specify the topic
const kafkaTopic = "express_recordings";

// Create the producer, passing in the configurations defined above
const producer = HttpTypesKafkaProducer.create({
  compressionType: CompressionTypes.GZIP,
  kafkaConfig,
  producerConfig,
  topic: kafkaTopic,
});

// Connect to Kafka
await producer.connect();

Send a single HttpExchange to Kafka:

const exchange: HttpExchange = ...;
await producer.send(exchange);
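
The `HttpExchange` type pairs one request with its response. As a rough sketch (field names here are illustrative and should be checked against the http-types schema):

```typescript
// A rough sketch of one exchange, following the HTTP Types JSON format.
// Field names are illustrative; check the http-types schema for the
// authoritative shape.
const exchange = {
  request: {
    method: "get",
    protocol: "http",
    host: "example.com",
    pathname: "/v1/users",
    query: {},
    headers: {},
    body: "",
  },
  response: {
    statusCode: 200,
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ ok: true }),
  },
};

// On the wire, each exchange is one JSON-encoded line.
const line = JSON.stringify(exchange);
```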

Send multiple HttpExchanges:

const exchanges: HttpExchange[] = ...;
await producer.sendMany(exchanges);

Send recordings from a JSON lines file, where every line is a JSON-encoded HttpExchange:

await producer.sendFromFile("recordings.jsonl");
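
Such a file can be produced with Node's fs module; a minimal sketch (again with an illustrative exchange shape, not the authoritative http-types schema):

```typescript
import { writeFileSync, readFileSync } from "fs";
import { tmpdir } from "os";
import { join } from "path";

// Write two illustrative exchanges as JSON lines:
// one JSON-encoded exchange per line.
const file = join(tmpdir(), "recordings.jsonl");
const exchanges = [
  { request: { method: "get", host: "example.com", pathname: "/a" },
    response: { statusCode: 200, body: "" } },
  { request: { method: "post", host: "example.com", pathname: "/b" },
    response: { statusCode: 201, body: "" } },
];
writeFileSync(file, exchanges.map((e) => JSON.stringify(e)).join("\n") + "\n");

// Reading the file back, every non-empty line parses to one exchange.
const lines = readFileSync(file, "utf8").trim().split("\n");
```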

Finally, disconnect:

await producer.disconnect();

Delete the topic if you're done:

$ kafka-topics.sh --bootstrap-server localhost:9092 --topic express_recordings --delete

Command-line interface

See available commands:

$ http-types-kafka


First create the destination topic in Kafka.

To send recordings from recordings.jsonl to Kafka, run:

$ http-types-kafka producer --file=recordings.jsonl --topic=my_recordings


Development

Install dependencies:

$ yarn

Build the package into lib:

$ yarn compile

Run tests:

$ ./docker-start.sh  # Start Kafka and ZooKeeper
$ yarn test
$ ./docker-stop.sh  # Once you're done

Package for npm:

$ npm pack

Publish to npm:

$ yarn publish --access public

Push git tags:

$ TAG=v`cat package.json | grep version | awk 'BEGIN { FS = "\"" } { print $4 }'`
# Tagging done by `yarn publish`
# git tag -a $TAG -m $TAG
$ git push origin $TAG
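
The awk pipeline above just pulls the version field out of package.json and prefixes it with "v"; a hypothetical Node equivalent of that extraction:

```typescript
// Hypothetical helper mirroring the shell pipeline above:
// parse package.json contents and prefix the version with "v".
function versionTag(pkgJson: string): string {
  const { version } = JSON.parse(pkgJson);
  return `v${version}`;
}

console.log(versionTag('{"name":"pkg","version":"0.0.2"}')); // → v0.0.2
```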

Working with local Kafka

First start Kafka and ZooKeeper:

# See `docker-compose.yml`
docker-compose up

Create a topic called http_types_kafka_test:

docker exec kafka1 kafka-topics --bootstrap-server kafka1:9092 --topic http_types_kafka_test --create --partitions 3 --replication-factor 1

Check the topic exists:

docker exec kafka1 kafka-topics --bootstrap-server localhost:9092 --list

Describe the topic:

docker exec kafka1 kafka-topics --bootstrap-server localhost:9092 --describe --topic http_types_kafka_test

Using kafkacat

List topics:

kafkacat -b localhost:9092 -L

Push data to the topic from a file with snappy compression:

tail -f tests/resources/recordings.jsonl | kafkacat -b localhost:9092 -t http_types_kafka_test -z snappy

Consume messages from the topic to the console:

kafkacat -b localhost:9092 -t http_types_kafka_test -C
