FS2 Kafka

Tiny library providing an FS2 wrapper around the official Kafka Java client.
The API is inspired by Alpakka Kafka, and migration should be relatively easy.

This is a new project under active development. Feedback and contributions are welcome.

Getting Started

To get started with sbt, add the following line to your build.sbt file.

libraryDependencies += "com.ovoenergy" %% "fs2-kafka" % "0.17.1"

The library is published for Scala 2.11 and 2.12.
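If you build for both supported Scala versions, the relevant build.sbt fragment might look as follows. This is a sketch: the specific Scala patch versions shown are illustrative assumptions, not taken from the text.

```scala
// build.sbt -- cross-building for the two supported Scala versions.
// The exact Scala patch versions below are illustrative assumptions.
crossScalaVersions := Seq("2.11.12", "2.12.8")

libraryDependencies += "com.ovoenergy" %% "fs2-kafka" % "0.17.1"
```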

Backwards binary compatibility for the library is guaranteed between patch versions.
For example, 0.17.x is backwards binary compatible with 0.17.y for any x > y.


Start with import fs2.kafka._, then use consumerStream and producerStream to create a consumer and a producer by providing a ConsumerSettings and a ProducerSettings, respectively. The consumer is similar to committableSource in Alpakka Kafka: records are wrapped in CommittableMessage. The producer accepts records wrapped in ProducerMessage, which lets offsets and other elements be carried through as passthrough values.

import cats.Id
import cats.data.NonEmptyList
import cats.effect.{ExitCode, IO, IOApp}
import cats.syntax.functor._
import fs2.kafka._
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

import scala.concurrent.ExecutionContext
import scala.concurrent.duration._

object Main extends IOApp {
  override def run(args: List[String]): IO[ExitCode] = {
    val consumerSettings = (executionContext: ExecutionContext) =>
      ConsumerSettings(
        keyDeserializer = new StringDeserializer,
        valueDeserializer = new StringDeserializer,
        executionContext = executionContext
      ).withAutoOffsetReset(AutoOffsetReset.Earliest)
        .withBootstrapServers("localhost")
        .withGroupId("group")

    val producerSettings =
      ProducerSettings(
        keySerializer = new StringSerializer,
        valueSerializer = new StringSerializer
      ).withBootstrapServers("localhost")

    val topics = NonEmptyList.one("topic")

    def processRecord(record: ConsumerRecord[String, String]): IO[(String, String)] =
      IO.pure(record.key -> record.value)

    val stream =
      for {
        executionContext <- consumerExecutionContextStream[IO]
        consumer <- consumerStream[IO].using(consumerSettings(executionContext))
        producer <- producerStream[IO].using(producerSettings)
        _ <- consumer.subscribe(topics)
        _ <- consumer.stream
          .mapAsync(25)(message =>
            processRecord(message.record)
              .map {
                case (key, value) =>
                  val record = new ProducerRecord("topic", key, value)
                  ProducerMessage.single[Id].of(record, message.committableOffset)
              })
          .evalMap(producer.produceBatched)
          .map(_.map(_.passthrough))
          .to(commitBatchWithinF(500, 15.seconds))
      } yield ()

    stream.compile.drain.as(ExitCode.Success)
  }
}
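When you only need to consume records and commit offsets, without producing, the example above reduces to a smaller program. The following is a sketch under the same assumed 0.17.x API (consumerStream, commitBatchWithin); the bootstrap server, group id, and topic name are placeholders, and a running Kafka broker is required to actually execute it.

```scala
import cats.data.NonEmptyList
import cats.effect.{ExitCode, IO, IOApp}
import cats.syntax.functor._
import fs2.kafka._
import org.apache.kafka.common.serialization.StringDeserializer

import scala.concurrent.ExecutionContext
import scala.concurrent.duration._

object ConsumeOnly extends IOApp {
  override def run(args: List[String]): IO[ExitCode] = {
    val consumerSettings = (executionContext: ExecutionContext) =>
      ConsumerSettings(
        keyDeserializer = new StringDeserializer,
        valueDeserializer = new StringDeserializer,
        executionContext = executionContext
      ).withBootstrapServers("localhost")
        .withGroupId("group")

    val stream =
      for {
        executionContext <- consumerExecutionContextStream[IO]
        consumer <- consumerStream[IO].using(consumerSettings(executionContext))
        _ <- consumer.subscribe(NonEmptyList.one("topic"))
        _ <- consumer.stream
          .evalTap(message => IO(println(message.record.value))) // process the record
          .map(_.committableOffset)
          .to(commitBatchWithin(500, 15.seconds)) // commit every 500 offsets or 15 seconds
      } yield ()

    stream.compile.drain.as(ExitCode.Success)
  }
}
```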