Orchestrating Data Pipelines With Snowpark dbt Python Models And Airflow

Published: 28 September 2024
on the channel: Snowflake Developers

A common Airflow use case is orchestrating Snowflake queries as part of a data pipeline. However, such pipelines are typically SQL-based, and many data engineers would prefer to write Python models for their data engineering work. With the release of dbt version 1.3, you can now create both SQL- and Python-based models in dbt; these Python-based dbt models are made possible by Snowflake's native Python support and the Snowpark API for Python. With Airflow, you can harness the power of Snowpark dbt Python models and orchestrate these workflows to transform and manage your data. Adrian Lee, Solutions Engineer at Snowflake, does a technical deep dive into how you can build more fluid data pipelines with Snowpark for Python, dbt, and Airflow.
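To illustrate the idea, here is a minimal sketch of what a dbt Python model file might look like. The `model(dbt, session)` signature is dbt's standard entry point for Python models (dbt >= 1.3); the upstream model name `raw_orders` and the `AMOUNT` column are hypothetical placeholders, not from the video.

```python
# Hypothetical dbt Python model, e.g. models/orders_cleaned.py.
# dbt calls model(dbt, session), passing a Snowpark session when the
# target warehouse is Snowflake.
def model(dbt, session):
    # Materialize the result as a table in Snowflake.
    dbt.config(materialized="table")

    # dbt.ref() resolves the upstream model and returns a Snowpark DataFrame.
    orders = dbt.ref("raw_orders")

    # Transformations are expressed as Snowpark operations and pushed
    # down to run inside Snowflake.
    return orders.filter("AMOUNT > 0")
```

In an Airflow DAG, a model like this runs the same way as a SQL model: the task simply invokes `dbt run` (for example via a BashOperator), and dbt executes the Python model through Snowpark inside Snowflake.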

Try Snowflake for Python for free
Test drive the Snowflake platform with our 30-day free trial.
→ https://signup.snowflake.com/?utm_cta...

Learn how to build applications with Python in 10 minutes!
→ https://tinyurl.com/mr38texp