Delta Lake extension
Delta Lake is an open source storage framework that enables building a Lakehouse architecture with various compute engines. DeltaLakeInputSource lets you ingest data stored in a Delta Lake table into Apache Druid. To use the Delta Lake extension, add druid-deltalake-extensions to the list of loaded extensions. See Loading extensions for more information.
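For example, assuming you manage extensions through the common.runtime.properties file, the load list entry is a minimal sketch along these lines; keep any extensions your cluster already loads in the same array:

druid.extensions.loadList=["druid-deltalake-extensions"]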
The Delta input source reads the configured Delta Lake table and extracts the underlying Delta files from the table's latest snapshot, optionally pruned by a Delta filter. These Delta Lake files are versioned Parquet files.
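As a rough sketch of how this fits into a native batch ingestion spec, the ioConfig below assumes the input source type is delta, that tablePath points to the root directory of the Delta table, and that the optional filter uses an equality form; the path and column name are placeholders, so check the Delta Lake input source reference for the exact property names and filter syntax:

"ioConfig": {
  "type": "index_parallel",
  "inputSource": {
    "type": "delta",
    "tablePath": "/path/to/delta/table",
    "filter": {
      "type": "=",
      "column": "country",
      "value": "US"
    }
  }
}

When a filter is supplied, it prunes the set of Delta files extracted from the snapshot, which can reduce the amount of data read during ingestion.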
Version support
The Delta Lake extension uses the Delta Kernel introduced in Delta Lake 3.0.0, which is compatible with Apache Spark 3.5.x. Older versions of Delta Lake are unsupported, so upgrade to Delta Lake 3.0.x or later to use this extension.
Downloading Delta Lake extension
To download druid-deltalake-extensions, run the following command after replacing <VERSION> with the desired Druid version:
java \
-cp "lib/*" \
-Ddruid.extensions.directory="extensions" \
-Ddruid.extensions.hadoopDependenciesDir="hadoop-dependencies" \
org.apache.druid.cli.Main tools pull-deps \
--no-default-hadoop \
-c "org.apache.druid.extensions.contrib:druid-deltalake-extensions:<VERSION>"
See Loading community extensions for more information.
Known limitations
This extension relies on the Delta Kernel API and can only read from the latest snapshot of a Delta table. The ability to read from arbitrary snapshots is tracked here.