
How to Integrate Snowflake with Apache Airflow?

With changing business trends, organizations continuously look to build modern information systems on platforms that facilitate growth, reduce operational costs, enhance performance, and improve agility. Many organizations are also modernizing their data platforms, moving from traditional data warehouse systems to cloud-based platforms to meet their business needs.

The Snowflake Data Cloud is one such platform, built on an entirely new SQL query engine. It is fully managed, meaning that users don’t need to worry about back-end components like servers and data storage, or about services such as maintenance and installation. Its architecture is a hybrid of the traditional shared-disk and shared-nothing database architectures, which helps it support a wide range of data workloads.

Snowflake helps simplify data pipelines so that organizations can focus on harnessing the power of data and analytics rather than on infrastructure management. Snowflake can also be integrated with several other tools to extend the power of this data cloud. One such platform that can be used with Snowflake is Apache Airflow.

What is Apache Airflow?

Apache Airflow is an open-source workflow management platform that can orchestrate complex computational workflows, data processing pipelines, and ETL processes. Airflow lets you visualize data pipeline dependencies, progress, logs, code, trigger tasks, and success status, and it offers a simple, feature-rich UI. This makes completing complex tasks much easier for the user.

Setting Up Airflow Snowflake Integration

Here is an outline of the steps you’ll follow to set up the Airflow Snowflake Integration:

Step 1: Connection to Snowflake

In this step of Airflow Snowflake Integration, to connect with Snowflake you need to create a connection in Airflow. On the Admin page of Apache Airflow, click on Connections, and in the dialogue box fill in the connection details as described below. (This assumes Snowflake uses AWS as its cloud provider.)
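If you prefer to script this instead of using the Admin UI, the same connection can be registered programmatically. The snippet below is a minimal sketch; the connection id, account, warehouse, database, region, and role values are placeholders you would replace with your own:

```python
# Minimal sketch: register a Snowflake connection in Airflow's metadata database.
# All identifiers below (connection id, account, warehouse, etc.) are placeholders.
import json

from airflow import settings
from airflow.models import Connection

snowflake_conn = Connection(
    conn_id="snowflake_conn",
    conn_type="snowflake",
    login="AIRFLOW_USER",
    password="********",
    schema="PUBLIC",
    extra=json.dumps(
        {
            "account": "xy12345",        # Snowflake account identifier
            "warehouse": "COMPUTE_WH",   # virtual warehouse that runs the queries
            "database": "DEMO_DB",
            "region": "us-east-1",       # AWS region hosting the Snowflake account
            "role": "SYSADMIN",
        }
    ),
)

session = settings.Session()
session.add(snowflake_conn)
session.commit()
```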

Step 2: Creation of DAG

DAG stands for Directed Acyclic Graph and represents the collection of tasks you want to run. Each task runs on a different worker at a different point in time. A DAG contains several operators that carry out the work of those tasks, such as the PythonOperator for Python tasks, the BashOperator for Bash tasks, and so on.

To create a DAG for Snowflake Integration with Airflow that performs operations on Snowflake, you’ll use the Snowflake operator and Snowflake hook provided by Airflow:

The Snowflake operator is used when you need to perform a task without expecting a returned result. It can execute CREATE, INSERT, MERGE, UPDATE, DELETE, COPY INTO, and TRUNCATE statements, none of which require output to be returned.

The Snowflake hook is used when you expect a result from a query. Hooks are mainly used with SELECT queries, as they fetch the results from Snowflake and pass them to Python for further processing.

Let us create a sample DAG to automate these tasks for Airflow Snowflake Integration:

1 – To generate a DAG for Airflow Snowflake Integration, you must first set up the Python imports using the following code.
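A typical set of imports for such a DAG might look like this (assuming Airflow 2.x with the apache-airflow-providers-snowflake package installed):

```python
# Imports for the example DAG; SnowflakeOperator and SnowflakeHook come from
# the apache-airflow-providers-snowflake package.
import logging
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

logger = logging.getLogger(__name__)
```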

2 – Preparing a DAG object for Airflow Snowflake Integration is easy: it needs a DAG id and the default parameters with a schedule interval, as shown below. Airflow provides many other parameters for added functionality; you can refer to the full list of parameters in the Airflow documentation.
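A minimal DAG object could be declared as follows; the DAG id, start date, and schedule are illustrative values:

```python
# Declare the DAG; the id, default arguments, and schedule are placeholders.
default_args = {
    "owner": "airflow",
    "retries": 1,
}

dag = DAG(
    dag_id="snowflake_airflow_demo",
    default_args=default_args,
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # run once per day
    catchup=False,               # do not backfill past runs
)
```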

3 – Write Python functions to create a table, insert several records, and obtain the row count from Snowflake for Snowflake Integration with Airflow, as sketched below.
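A sketch of what this could look like follows; the table name, sample rows, and connection id ("snowflake_conn") are illustrative:

```python
# SQL for the SnowflakeOperator tasks and a Python callable that uses
# SnowflakeHook to read back the row count. Table and values are examples.
SNOWFLAKE_CONN_ID = "snowflake_conn"
TABLE = "demo_orders"

CREATE_TABLE_SQL = f"""
    CREATE TABLE IF NOT EXISTS {TABLE} (
        order_id INT,
        amount NUMBER(10, 2)
    );
"""

INSERT_ROWS_SQL = f"""
    INSERT INTO {TABLE} (order_id, amount)
    VALUES (1, 19.99), (2, 45.50), (3, 7.25);
"""


def get_row_count(**context):
    """Query the table through SnowflakeHook and log the row count."""
    hook = SnowflakeHook(snowflake_conn_id=SNOWFLAKE_CONN_ID)
    count = hook.get_first(f"SELECT COUNT(*) FROM {TABLE}")[0]
    logger.info("Row count in %s: %s", TABLE, count)
    return count
```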

 

4 – Create the DAG tasks with the PythonOperator and SnowflakeOperator to incorporate the functions defined above for Airflow Snowflake Integration.
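Continuing the sketch above, the tasks could be declared like this (task ids are illustrative):

```python
# Wrap the SQL statements in SnowflakeOperator tasks and the row-count
# function in a PythonOperator task.
create_table = SnowflakeOperator(
    task_id="create_table",
    snowflake_conn_id=SNOWFLAKE_CONN_ID,
    sql=CREATE_TABLE_SQL,
    dag=dag,
)

insert_rows = SnowflakeOperator(
    task_id="insert_rows",
    snowflake_conn_id=SNOWFLAKE_CONN_ID,
    sql=INSERT_ROWS_SQL,
    dag=dag,
)

row_count = PythonOperator(
    task_id="get_row_count",
    python_callable=get_row_count,
    dag=dag,
)
```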

 

5 – Now that you have created the tasks, you need to connect them with the (>>) operator to build the pipeline for Airflow Snowflake Integration.
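With the task names used in the sketch above, the chaining looks like this:

```python
# Run the tasks in order: create the table, insert rows, then read the count.
create_table >> insert_rows >> row_count
```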

Once the above script is complete, you need to place it in the Airflow DAGs folder for the Snowflake Integration with Airflow. After a refresh, the DAG will show up in the Airflow UI.

The following is the complete example of the DAG for the Airflow Snowflake Integration:
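A runnable sketch that assembles the snippets above into a single DAG file might look like this (connection id, DAG id, table name, and sample rows are all placeholders):

```python
# Complete example DAG: create a table, insert rows, and log the row count.
# Assumes Airflow 2.x with apache-airflow-providers-snowflake installed and a
# Snowflake connection named "snowflake_conn".
import logging
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

logger = logging.getLogger(__name__)

SNOWFLAKE_CONN_ID = "snowflake_conn"
TABLE = "demo_orders"

CREATE_TABLE_SQL = f"""
    CREATE TABLE IF NOT EXISTS {TABLE} (
        order_id INT,
        amount NUMBER(10, 2)
    );
"""

INSERT_ROWS_SQL = f"""
    INSERT INTO {TABLE} (order_id, amount)
    VALUES (1, 19.99), (2, 45.50), (3, 7.25);
"""


def get_row_count(**context):
    """Query the table through SnowflakeHook and log the row count."""
    hook = SnowflakeHook(snowflake_conn_id=SNOWFLAKE_CONN_ID)
    count = hook.get_first(f"SELECT COUNT(*) FROM {TABLE}")[0]
    logger.info("Row count in %s: %s", TABLE, count)
    return count


dag = DAG(
    dag_id="snowflake_airflow_demo",
    default_args={"owner": "airflow", "retries": 1},
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
)

create_table = SnowflakeOperator(
    task_id="create_table",
    snowflake_conn_id=SNOWFLAKE_CONN_ID,
    sql=CREATE_TABLE_SQL,
    dag=dag,
)

insert_rows = SnowflakeOperator(
    task_id="insert_rows",
    snowflake_conn_id=SNOWFLAKE_CONN_ID,
    sql=INSERT_ROWS_SQL,
    dag=dag,
)

row_count = PythonOperator(
    task_id="get_row_count",
    python_callable=get_row_count,
    dag=dag,
)

create_table >> insert_rows >> row_count
```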

 

In the above DAG, the Snowflake operator creates a table and inserts data into it. The Snowflake hook is then used to query the table created by the operator and return the result to the Python operator, which logs the output to the console, completing the Airflow Snowflake Integration.

Conclusion

Snowflake maintains a list of tools it integrates with; simply access its tools page and choose the platform you want. Airflow is a good data tool to pair with Snowflake, as it helps you build reliable datasets and turn your data into meaningful, actionable insights. However, the manual approach to connecting Airflow with Snowflake can be complicated and tedious. Moreover, it is error-prone, and considerable technical expertise is required to implement it successfully.
