san99tiago / fastapi-docker-github-actions
GitHub Actions Pipeline with a FastAPI Application built, tested and deployed to DockerHub.
☆16 · Updated last year
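As a rough illustration of what such a pipeline exercises, here is a minimal sketch of a FastAPI app plus the kind of pytest test a GitHub Actions job would run before building the Docker image and pushing it to DockerHub. The endpoint, names, and file layout are illustrative assumptions, not taken from the repository.

```python
# Minimal sketch (illustrative, not the repository's actual code):
# a FastAPI app and a pytest-style test of the kind a GitHub Actions job
# would run with `pytest` before building and pushing the Docker image.
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI(title="fastapi-docker-github-actions demo")


@app.get("/health")
def health() -> dict:
    """Health-check endpoint a CI step or container probe can hit."""
    return {"status": "ok"}


# In a real project this test would live in its own file (e.g. tests/test_main.py)
# and import `app` from the application module.
client = TestClient(app)


def test_health() -> None:
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}
```

The workflow itself (a YAML file under .github/workflows/, not shown here) would typically run these tests, build the Docker image, and push it to DockerHub only when the tests pass.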
Alternatives and similar repositories for fastapi-docker-github-actions:
Users interested in fastapi-docker-github-actions are comparing it to the repositories listed below:
- Deploy Python FastAPI serverless application on Azure Functions ☆21 · Updated 8 months ago
- Template for building FastAPI asynchronous applications with PostgreSQL ☆23 · Updated 7 months ago
- An example application implementing unit and integration tests using a test database ☆20 · Updated 2 years ago
- Code snippets for the Data Engineering Design Patterns book ☆62 · Updated 3 weeks ago
- Testing a FastAPI CRUD API using Pytest ☆29 · Updated 8 months ago
- End-to-end data engineering project ☆53 · Updated 2 years ago
- Example repo to create end-to-end tests for a data pipeline. ☆21 · Updated 7 months ago
- Build a data warehouse with dbt ☆35 · Updated 3 months ago
- ☆24 · Updated last year
- An extendable async API using FastAPI, SQLModel, PostgreSQL and Redis. ☆157 · Updated 3 months ago
- ☆38 · Updated last month
- Produce Kafka messages, consume them, and load them into Cassandra and MongoDB. ☆39 · Updated last year
- Glue ETL job or EMR Spark job that reads from the Data Catalog, transforms the data, and writes it back to S3 and the Data Catalog ☆11 · Updated last year
- FastAPI SSO example with various providers ☆76 · Updated this week
- This repository contains FastAPI learning stuff ☆35 · Updated last year
- Project for "Data pipeline design patterns" blog. ☆43 · Updated 5 months ago
- Code for my "Efficient Data Processing in SQL" book. ☆56 · Updated 5 months ago
- ☆45 · Updated last year
- FastAPI & Kubernetes Tutorial with PyCharm ☆106 · Updated 2 years ago
- A FastAPI Boilerplate for Production ☆52 · Updated last year
- Delta Lake, ETL, Spark, Airflow ☆46 · Updated 2 years ago
- This project contains the source code for the Medium article ☆56 · Updated 5 months ago
- Building Python Web APIs with FastAPI, published by Packt ☆209 · Updated 6 months ago
- This repository is for FastAPI projects ☆31 · Updated 3 weeks ago
- Get data from an API, run a scheduled script with Airflow, send the data to Kafka, consume it with Spark, then write to Cassandra ☆131 · Updated last year
- Create streaming data, transfer it to Kafka, transform it with PySpark, and load it into Elasticsearch and MinIO ☆59 · Updated last year
- Medium article "Clean Architecture with Python" ☆45 · Updated this week
- Supporting repository for the blog post at https://www.firasesbai.com/articles/2022/01/09/logging-with-elasticsearch.html ☆26 · Updated 2 years ago
- End-to-end data platform leveraging the Modern data stack ☆45 · Updated 9 months ago
- ☆76 · Updated 8 months ago