kafka-connect-spooldir

A Kafka Connect connector reading delimited files from the file system.

License

Apache License 2.0

Group

com.github.jcustenborder.kafka.connect

Artifact

kafka-connect-spooldir

Latest Version

1.0.31

Date

Type

tar.gz

Description

kafka-connect-spooldir
A Kafka Connect connector reading delimited files from the file system.

Homepage

https://github.com/jcustenborder/kafka-connect-spooldir

Source Control

https://github.com/jcustenborder/kafka-connect-spooldir

Download kafka-connect-spooldir

Dependencies

compile (7)

Library Id  Type  Version
com.opencsv : opencsv jar 3.10
org.apache.commons : commons-compress jar 1.16.1
com.github.jcustenborder.parsers : extended-log-format jar [0.0.1.2, 0.0.1.1000)
com.fasterxml.jackson.core : jackson-databind jar 2.8.5
org.reflections : reflections jar 0.9.10
com.google.guava : guava jar 18.0
com.github.jcustenborder.kafka.connect : connect-utils jar [0.3.33,0.3.1000)

provided (2)

Library Id  Type  Version
net.sourceforge.argparse4j : argparse4j jar 0.7.0
org.apache.kafka : connect-api jar 1.0.0

test (5)

Library Id  Type  Version
com.github.jcustenborder.kafka.connect : connect-utils-testing jar [0.3.33,0.3.1000)
org.junit.jupiter : junit-jupiter-engine jar 5.0.0
org.junit.jupiter : junit-jupiter-api jar 5.0.0
org.mockito : mockito-core jar 2.6.3
ch.qos.logback : logback-classic jar 1.1.8

Project Modules

This project has no modules.

Introduction

This Kafka Connect connector provides the capability to watch a directory for files and read the data as new files are written to the input directory. Each record in an input file is converted based on the user-supplied schema.

Installation through the Confluent Hub Client
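The packaged connector can be pulled with the Confluent Hub client; a sketch, assuming the client is already on your PATH and using the version listed on this page:

```shell
# Install the connector into the Connect worker's plugin path.
confluent-hub install jcustenborder/kafka-connect-spooldir:1.0.31
```

Restart the Connect worker afterwards so the new plugin is picked up.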

The CSVRecordProcessor supports reading CSV or TSV files. It can convert a CSV file on the fly to the strongly typed Kafka Connect data types, and it currently supports all of the schema types and logical types available in Kafka Connect. If you couple this with the Avro converter and Confluent Schema Registry, you can process CSV, JSON, or TSV files into strongly typed Avro data in real time.

Schema Less Json Source Connector

This connector is used to stream JSON files (https://en.wikipedia.org/wiki/JSON_Streaming) from a directory, writing each record to Kafka without schema conversion.

CSV Source Connector

The SpoolDirCsvSourceConnector will monitor the directory specified in input.path for files and read them as CSV, converting each record to the strongly typed equivalent specified in key.schema and value.schema.
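A minimal worker configuration sketch for this connector. Only input.path, key.schema, and value.schema are named above; the topic, path, and file-pattern keys below are assumptions drawn from the connector's configuration conventions, and all values are placeholders:

```properties
name=csv-source
connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector
tasks.max=1
topic=csv-data
# Directories are placeholders; finished.path and error.path receive processed/failed files.
input.path=/data/input
finished.path=/data/finished
error.path=/data/error
input.file.pattern=^.*\.csv$
csv.first.row.as.header=true
# key.schema and value.schema take JSON schema definitions (omitted here).
```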

Json Source Connector

This connector is used to stream JSON files (https://en.wikipedia.org/wiki/JSON_Streaming) from a directory while converting the data based on the schema supplied in the configuration.
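A comparable configuration sketch; the class name and keys below are assumptions based on the project's naming conventions, and all values are placeholders:

```properties
name=json-source
connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirJsonSourceConnector
tasks.max=1
topic=json-data
input.path=/data/input
finished.path=/data/finished
error.path=/data/error
input.file.pattern=^.*\.json$
# key.schema and value.schema take JSON schema definitions (omitted here).
```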

Line Delimited Source Connector

This connector is used to read a file line by line and write the data to Kafka.

Extended Log File Format Source Connector

This connector is used to stream Extended Log File Format (https://www.w3.org/TR/WD-logfile.html) files from a directory while converting the data to a strongly typed schema.

Development

Building the source

mvn clean package

Library Versions

Version
1.0.31
1.0.30
1.0.29
1.0.28
1.0.27
1.0.26
1.0.24
1.0.23
1.0.22
1.0.21
1.0.17
1.0.16
1.0.14
1.0.13