rtdl makes it easy to build and maintain a real-time data lake
rtdl's initial feature set is built and working. You can use the API on port 80 to configure streams that ingest JSON from an rtdl endpoint on port 8080, process it into Parquet, and save the files to a destination configured in your stream. rtdl can write files locally, to HDFS, AWS S3, GCP Cloud Storage, and Azure Blob Storage, and you can query your data via Dremio's web UI at http://localhost:9047 (log in with username rtdl and password rtdl1234). rtdl supports writing in the Delta Lake table format as well as integration with the AWS Glue and Snowflake External Tables metadata catalogs.
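For example, a stream can be registered with the configuration API and then fed events through the ingest endpoint. This is a minimal sketch: aside from createStream (named later in this document) and the ports above, the endpoint paths and payload field names here are assumptions and may differ in your rtdl version.

```shell
# Sketch: define a stream that writes GZIP-compressed Parquet files.
# Field names in this payload are illustrative assumptions.
cat > stream.json <<'EOF'
{
  "name": "orders",
  "compression_type_id": 2,
  "folder_name": "orders"
}
EOF

# Register the stream with the configuration API on port 80.
curl -s -X POST http://localhost:80/createStream \
     -H 'Content-Type: application/json' -d @stream.json

# Send a JSON event to the ingest endpoint on port 8080
# (the /ingest path is an assumption).
curl -s -X POST http://localhost:8080/ingest \
     -H 'Content-Type: application/json' \
     -d '{"stream_id": "orders", "payload": {"order_id": 1, "total": 9.99}}'
```

Once events land, the resulting Parquet files can be browsed and queried from the Dremio UI described above.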
rtdl now stores stream configurations as JSON files instead of in SQL.
You can specify compression_type_id as 2 for GZIP or 3 for LZO.
You can configure rtdl's ingester endpoint as a webhook in Segment. You will need to create a stream with the stream_alt_id set to either the Source ID or the Write Key from the API Keys tab of Settings for the Source connected to the Webhook Destination.
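A createStream payload for Segment ingestion, per the note above, might be sketched as follows. stream_alt_id carries the Segment Source ID or Write Key; the other field names are illustrative assumptions.

```shell
# Sketch of a createStream payload for a Segment webhook source.
# stream_alt_id holds the Segment Source ID or Write Key; the other
# fields are assumptions for illustration.
cat > segment-stream.json <<'EOF'
{
  "name": "segment-events",
  "stream_alt_id": "YOUR_SEGMENT_WRITE_KEY",
  "compression_type_id": 3,
  "folder_name": "segment"
}
EOF

curl -s -X POST http://localhost:80/createStream \
     -H 'Content-Type: application/json' -d @segment-stream.json
```

In Segment, point the Webhook Destination at the rtdl ingester endpoint on port 8080 so events for that Source are routed into this stream.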
You can now pass properly formatted JSON in the gcp_json_credentials field in the createStream API call. Previously, you had to double-quote everything and flatten it to a single line.
Sign off your commits with git commit -s -m "..."