ognis1205 / unitycatalog-explorer-legacy
Unity Catalog Explorer is a TypeScript + Next.js-based web UI for the Unity Catalog OSS.
☆13 · Updated 10 months ago
Alternatives and similar repositories for unitycatalog-explorer-legacy
Users interested in unitycatalog-explorer-legacy are comparing it to the repositories listed below.
- Delta Lake helper methods in PySpark ☆323 · Updated 8 months ago
- Custom PySpark Data Sources ☆50 · Updated 2 weeks ago
- The Lakehouse Engine is a configuration-driven Spark framework, written in Python, serving as a scalable and distributed engine for sever… ☆246 · Updated 3 months ago
- Pythonic programming framework to orchestrate jobs in Databricks Workflows ☆215 · Updated last week
- Never sift through endless dbt™ logs again. dbt Command Center is a free, open-source, local web application that provides a user-friendl… ☆28 · Updated 3 weeks ago
- This repo is a collection of tools to deploy, manage, and operate a Databricks-based Lakehouse. ☆45 · Updated 3 months ago
- Unity Catalog UI ☆40 · Updated 8 months ago
- Notebooks to learn the Databricks Lakehouse Platform ☆24 · Updated last week
- Delta Lake examples ☆224 · Updated 7 months ago
- A lightweight Python-based tool for extracting and analyzing data column lineage for dbt projects ☆159 · Updated last month
- SQL queries & alerts for Databricks System Tables access.audit logs ☆28 · Updated 7 months ago
- This is the main repository for SDF documentation found at docs.sdf.com, as well as public schemas, benchmarks, and examples ☆119 · Updated 3 months ago
- Code samples, etc. for Databricks ☆64 · Updated last month
- Data product portal created by Dataminded ☆185 · Updated this week
- Demo DAGs that show how to run dbt Core in Airflow using Cosmos ☆59 · Updated this week
- A Microsoft Power BI custom connector allowing you to import Trino data into Power BI ☆68 · Updated 4 months ago
- Library to convert dbt manifest metadata to Airflow tasks ☆48 · Updated last year
- A Python library to support running data quality rules while the Spark job is running ⚡ ☆188 · Updated last week
- This project demonstrates many of dbt's features when used with the Snowflake Data Cloud ☆17 · Updated last week
- Sample Data Lakehouse deployed in Docker containers using Apache Iceberg, MinIO, Trino, and a Hive Metastore. Can be used for local testin… ☆69 · Updated last year
- Databricks implementation of the TPC-DI specification using traditional notebooks and/or Delta Live Tables ☆83 · Updated this week
- The Trino (https://trino.io/) adapter plugin for dbt (https://getdbt.com) ☆235 · Updated last month
- Turning PySpark into a universal DataFrame API ☆393 · Updated this week
- ☆117 · Updated 9 months ago
- A lightweight helper utility which allows developers to do interactive pipeline development by having a unified source code for both DLT … ☆49 · Updated 2 years ago
- PyJaws: A Pythonic Way to Define Databricks Jobs and Workflows ☆43 · Updated 10 months ago
- Modern serverless lakehouse implementing HOOK methodology, Unified Star Schema (USS), and Analytical Data Storage System (ADSS) principle… ☆114 · Updated last month
- Run, mock, and test fake Snowflake databases locally ☆131 · Updated 2 weeks ago
- Proof-of-concept extension combining the delta extension with Unity Catalog ☆84 · Updated 2 weeks ago
- CLI tool for dbt users to simplify creation of staging model files (yml and sql) ☆263 · Updated this week