jamesshocking / Spark-REST-API-UDF
Example of how to leverage Apache Spark's distributed capabilities to call a REST API using a UDF
☆50 · Updated 2 years ago
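The pattern this repository demonstrates, calling a REST API from inside a Spark UDF so requests are distributed across executors, can be sketched roughly as below. This is a minimal illustration, not the repository's actual code: the endpoint URL, the `fetch_json` helper, and the response shape are all hypothetical, and the `opener` parameter exists only so the helper can be exercised without a network.

```python
# Hedged sketch: calling a REST API per row from a PySpark UDF.
# The endpoint and helper names here are illustrative assumptions.
import json
from urllib import request as urlrequest


def fetch_json(url, opener=urlrequest.urlopen):
    """Fetch `url` and parse the JSON body; return None on any failure.

    `opener` is injectable so the function can be tested without a network.
    Swallowing per-row errors keeps one bad call from failing the whole job.
    """
    try:
        with opener(url, timeout=10) as resp:
            return json.loads(resp.read().decode("utf-8"))
    except Exception:
        return None


# Wiring it into Spark (runs on the executors, one HTTP call per row):
#
# from pyspark.sql import SparkSession
# from pyspark.sql.functions import udf
# from pyspark.sql.types import StringType
#
# spark = SparkSession.builder.getOrCreate()
# call_api = udf(
#     lambda item_id: json.dumps(
#         fetch_json(f"https://api.example.com/items/{item_id}")  # hypothetical endpoint
#     ),
#     StringType(),
# )
# df = spark.createDataFrame([(1,), (2,)], ["item_id"])
# df.withColumn("api_json", call_api("item_id")).show()
```

Returning the payload as a JSON string and parsing it afterwards (e.g. with `from_json`) keeps the UDF's return type simple; for high request volumes, a `mapPartitions`-based approach that reuses one HTTP connection per partition is usually more efficient than a per-row UDF.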
Alternatives and similar repositories for Spark-REST-API-UDF:
Users interested in Spark-REST-API-UDF are comparing it to the libraries listed below.
- Delta Lake examples ☆217 · Updated 4 months ago
- Delta Lake helper methods in PySpark ☆315 · Updated 5 months ago
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆173 · Updated this week
- The DBSQL SME repo contains demos, tutorials, blog code, advanced production helper functions, and more! ☆47 · Updated this week
- A collection of tools to deploy, manage, and operate a Databricks-based Lakehouse. ☆44 · Updated 3 weeks ago
- Trino dbt demo project that mixes BigQuery data with, and loads it into, a local PostgreSQL database ☆72 · Updated 3 years ago
- Code snippets for the Data Engineering Design Patterns book ☆69 · Updated 2 weeks ago
- Spark style guide ☆257 · Updated 4 months ago
- Custom PySpark Data Sources ☆40 · Updated last month
- Code samples, etc. for Databricks ☆63 · Updated last month
- Delta Lake Documentation ☆48 · Updated 8 months ago
- Pythonic Programming Framework to orchestrate jobs in Databricks Workflow ☆199 · Updated last week
- Yet Another (Spark) ETL Framework ☆18 · Updated last year
- Delta-Lake, ETL, Spark, Airflow ☆46 · Updated 2 years ago
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, Minio, Trino and a Hive Metastore. Can be used for local testin… ☆60 · Updated last year
- A workspace to experiment with Apache Spark, Livy, and Airflow in a Docker environment. ☆39 · Updated 3 years ago
- Materials of the Official Helm Chart Webinar ☆27 · Updated 3 years ago
- Learn how to add data validation and documentation to a data pipeline built with dbt and Airflow. ☆166 · Updated last year
- Demonstration of using Files in Repos with Databricks Delta Live Tables ☆30 · Updated 7 months ago
- PyJaws: A Pythonic Way to Define Databricks Jobs and Workflows ☆41 · Updated 7 months ago
- Code for a dbt tutorial ☆151 · Updated 8 months ago
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆234 · Updated 2 weeks ago
- How to execute a REST API call in Apache Spark the right way, using Scala ☆17 · Updated 2 years ago
- The Trino (https://trino.io/) adapter plugin for dbt (https://getdbt.com) ☆226 · Updated 2 months ago
- Code for my "Efficient Data Processing in SQL" book. ☆56 · Updated 6 months ago
- How to unit test your PySpark code ☆28 · Updated 3 years ago
- Execution of dbt models using Apache Airflow through Docker Compose ☆114 · Updated 2 years ago
- Step-by-step tutorial on building a Kimball dimensional model with dbt ☆126 · Updated 7 months ago
- Don't Panic. This guide will help you when it feels like the end of the world. ☆23 · Updated 8 months ago
- The source code for the book Modern Data Engineering with Apache Spark ☆35 · Updated 2 years ago