Import Data
This page lists options for importing data into Apache Pinot™, with links to detailed instructions and examples.
There are multiple options for importing data into Apache Pinot™. The pages in this section provide step-by-step instructions for importing records into Pinot, supported by our plugin architecture. The intent is to get you up and running with imported data as quickly as possible.
Pinot supports multiple file input formats without needing to change anything other than the file name. Each example imports a ready-made dataset so you can see how things work without needing to find or create your own dataset.
Pinot Batch Ingestion
These guides show you how to import data from popular big data platforms.
Spark
Hadoop
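The Spark and Hadoop guides drive segment generation through an ingestion job spec. Below is a minimal sketch for the Spark flavor; the paths, table name, and controller URI are placeholders, and the exact class names and options should be checked against the Spark guide above.

```yaml
# Sketch of a batch ingestion job spec for Spark; all URIs and names are placeholders.
executionFrameworkSpec:
  name: 'spark'
  segmentGenerationJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.spark.SparkSegmentGenerationJobRunner'
  segmentTarPushJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.spark.SparkSegmentTarPushJobRunner'
jobType: SegmentCreationAndTarPush
inputDirURI: 'hdfs:///path/to/input'        # raw input files to import
includeFileNamePattern: 'glob:**/*.csv'
outputDirURI: 'hdfs:///path/to/segments'    # where generated segments are written
overwriteOutput: true
pinotFSSpecs:
  - scheme: hdfs
    className: org.apache.pinot.plugin.filesystem.HadoopPinotFS
recordReaderSpec:
  dataFormat: 'csv'                         # switching input formats is largely a matter of changing this and the file pattern
  className: 'org.apache.pinot.plugin.inputformat.csv.CSVRecordReader'
tableSpec:
  tableName: 'myTable'
pinotClusterSpecs:
  - controllerURI: 'http://localhost:9000'
```

The Hadoop guide uses the same job spec layout with the Hadoop job runner classes in executionFrameworkSpec.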
Pinot Stream Ingestion
This guide shows you how to import data using stream ingestion from Apache Kafka topics (a minimal table config sketch follows this list of guides).
Ingest streaming data from Apache Kafka
This guide shows you how to import data using stream ingestion with upsert.
Stream ingestion with Upsert
This guide shows you how to import data using stream ingestion with deduplication.
Stream ingestion with Dedup
This guide shows you how to import data using stream ingestion with CLP.
Stream ingestion with CLP
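As a rough sketch of what the Kafka guide sets up, stream ingestion is driven by the streamConfigs section of a REALTIME table config. The table name, schema, topic, and broker below are placeholders; verify the consumer factory and decoder class names against the guide.

```json
{
  "tableName": "transcript",
  "tableType": "REALTIME",
  "segmentsConfig": {
    "timeColumnName": "timestamp",
    "schemaName": "transcript",
    "replicasPerPartition": "1"
  },
  "tableIndexConfig": {
    "streamConfigs": {
      "streamType": "kafka",
      "stream.kafka.topic.name": "transcript-topic",
      "stream.kafka.broker.list": "localhost:9092",
      "stream.kafka.consumer.type": "lowlevel",
      "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
      "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder"
    }
  },
  "tenants": {},
  "metadata": {}
}
```

The upsert and dedup guides build on the same streamConfigs, adding an upsertConfig or dedupConfig block to the table config and a primary key to the schema.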
Pinot file systems
By default, Pinot does not come with a storage layer, so any data sent to Pinot will be lost if the system crashes. To persistently store the generated segments, you need to change the controller and server configs to add a deep store. See File systems for details and related configs.
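As an illustration of the kind of config change involved, here is a sketch of controller and server properties for an S3 deep store; the bucket, region, and paths are placeholders, and the authoritative list of properties is in the file system guides below.

```properties
# controller.conf (sketch, assuming the S3 file system plugin)
controller.data.dir=s3://my-pinot-bucket/controller-data
controller.local.temp.dir=/tmp/pinot-controller-tmp
pinot.controller.storage.factory.class.s3=org.apache.pinot.plugin.filesystem.S3PinotFS
pinot.controller.storage.factory.s3.region=us-west-2
pinot.controller.segment.fetcher.protocols=file,http,s3
pinot.controller.segment.fetcher.s3.class=org.apache.pinot.common.utils.fetcher.PinotFSSegmentFetcher

# server.conf (sketch)
pinot.server.storage.factory.class.s3=org.apache.pinot.plugin.filesystem.S3PinotFS
pinot.server.storage.factory.s3.region=us-west-2
pinot.server.segment.fetcher.protocols=file,http,s3
pinot.server.segment.fetcher.s3.class=org.apache.pinot.common.utils.fetcher.PinotFSSegmentFetcher
```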
These guides show you how to import data and persist it in these file systems.
Amazon S3
Azure Data Lake Storage
Google Cloud Storage
HDFS
Pinot input formats
This guide shows you how to import data in the various input formats that Pinot supports.
Input formats
This guide shows you how to handle complex types in ingested data, such as maps and arrays (a minimal config sketch follows this list of guides).
Complex Type (Array, Map) Handling
This guide shows additional examples of how to work with complex types.
Complex Type Examples
This guide shows you how to handle records with dynamic schemas, like JSON log events.
Ingest records with dynamic schemas
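As a rough illustration of the complex-type handling mentioned above, unnesting and flattening of maps and arrays is configured under ingestionConfig in the table config; the field name here is a placeholder, and the guides above cover the full set of options.

```json
{
  "ingestionConfig": {
    "complexTypeConfig": {
      "fieldsToUnnest": ["purchases"],
      "delimiter": ".",
      "collectionNotUnnestedToJson": "NON_PRIMITIVE"
    }
  }
}
```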
Reloading and uploading existing Pinot segments
This guide shows you how to reload Pinot segments from your deep store (a curl sketch follows at the end of this page).
Reload a table segment
This guide shows you how to upload Pinot segments from an old, closed Pinot instance.
Upload a table segment
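For quick reference, a segment reload is typically triggered through the controller REST API. A minimal sketch, assuming a controller at localhost:9000 and an offline table named myTable:

```bash
# Reload every segment of the table (sketch; host, table name, and type are placeholders)
curl -X POST "http://localhost:9000/segments/myTable/reload?type=OFFLINE"
```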