ETL With Airflow


Airflow is a platform to programmatically author, schedule, and monitor workflows. We use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks.

The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. We had heard about Airflow but never knew how to get it working for Talend jobs; with its UI for scheduling and monitoring, orchestrating Talend jobs becomes very easy.
Let’s check our to-do list to achieve the goal:
Launching the EC2 instance: We will launch an Ubuntu server on which to install Airflow and to which we will copy the Talend jobs.
Installing Airflow on the EC2 instance: We will follow the installation steps and get the Airflow webserver working.
Adding the Talend job and creating the DAG file.

Launching the EC2 instance

Here are the general steps to launch an EC2 instance:

  1.  Log in to the AWS Management Console and go to the EC2 service dashboard.
  2. Click on the “Launch Instance” button.
  3. Choose an Amazon Machine Image (AMI) that suits your needs. An AMI is a pre-configured virtual machine image that serves as the basic template for your instance.
  4. Select the instance type that you want to use. Instance types determine the amount of compute, memory, and networking capacity that your instance will have.
  5. Configure the instance details, including the number of instances to launch, the network settings, and other options.
  6. Add storage to your instance by selecting the appropriate type and size of storage volumes.
  7. Configure any additional details, such as security groups and key pairs.
  8. Review your selections and launch the instance.
Once you have launched your instance, you will be able to connect to it using SSH or other remote access protocols, and you can start using it to run your applications or perform other tasks as needed.
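The console steps above can also be scripted. As a minimal sketch, the following builds the keyword arguments you would pass to boto3’s ec2.run_instances call; the AMI ID, key pair name, and security group ID shown are placeholders, not real values from the article:

```python
# Sketch of launching an EC2 instance programmatically.
# All IDs below are placeholders; replace them with values from your own account.
def build_run_instances_params(ami_id, instance_type="t2.micro",
                               key_name=None, security_group_ids=None):
    """Build the keyword arguments for boto3's ec2.run_instances call."""
    params = {
        "ImageId": ami_id,          # step 3: the AMI
        "InstanceType": instance_type,  # step 4: compute/memory/network capacity
        "MinCount": 1,
        "MaxCount": 1,              # step 5: number of instances
    }
    if key_name:
        params["KeyName"] = key_name              # step 7: key pair
    if security_group_ids:
        params["SecurityGroupIds"] = security_group_ids  # step 7: security groups
    return params

params = build_run_instances_params("ami-0123456789abcdef0",
                                    key_name="my-key",
                                    security_group_ids=["sg-0abc"])
# With AWS credentials configured you would then run:
#   import boto3
#   boto3.client("ec2").run_instances(**params)
print(params["InstanceType"])
```

Separating parameter construction from the API call makes the launch configuration easy to review (or unit-test) before anything is created in your account.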

Installing Airflow on the EC2 instance

Here are the general steps to install Airflow on an EC2 instance:

  1. Log in to your EC2 instance using SSH.
  2. Update the instance packages and dependencies by running the command ‘sudo apt-get update’.
  3. Install the required dependencies for Airflow by running the command ‘sudo apt-get install python3-dev libmysqlclient-dev’.
  4. Install pip, the package installer for Python, by running the command ‘sudo apt-get install python3-pip’.
  5. Install Airflow using pip by running the command ‘pip3 install apache-airflow’.
  6. Initialize the Airflow database by running the command ‘airflow db init’.
  7. Create an Airflow user by running the command ‘airflow users create --username <USERNAME> --firstname <FIRSTNAME> --lastname <LASTNAME> --role Admin --email <EMAIL>’. Replace <USERNAME>, <FIRSTNAME>, <LASTNAME>, and <EMAIL> with the actual values; the command will prompt for a password.
  8. Start the Airflow web server and scheduler by running the commands ‘airflow webserver -p 8080’ and ‘airflow scheduler’ (in separate terminals).
  9. Access the Airflow web UI by navigating to the public DNS or public IP address of your EC2 instance on port 8080 in your web browser.

That’s it! You should now have Airflow installed and running on your EC2 instance. You can use it to create and manage workflows for your data pipelines.
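The install steps above can be collected into a small script so they run in order and stop on the first failure. This is only a sketch of the commands listed in the walkthrough (the ‘-y’ flags are an addition so apt-get does not prompt); review it before running it on a real instance:

```python
import subprocess

# The install steps from the walkthrough, in order. The -y flags are an
# addition so apt-get runs non-interactively.
INSTALL_STEPS = [
    "sudo apt-get update",
    "sudo apt-get install -y python3-dev libmysqlclient-dev",
    "sudo apt-get install -y python3-pip",
    "pip3 install apache-airflow",
    "airflow db init",
]

def run_all(dry_run=True):
    """Print the commands (dry_run=True) or execute them, stopping on failure."""
    for cmd in INSTALL_STEPS:
        if dry_run:
            print(cmd)
        else:
            subprocess.run(cmd, shell=True, check=True)

run_all(dry_run=True)
```

Running with dry_run=True first lets you confirm the exact command sequence before executing it with run_all(dry_run=False) on the instance.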

Adding the Talend job and creating the DAG file

To add a Talend job to Airflow and create a DAG file, you can follow these general steps:

  1. Create your Talend job in Talend Studio and export it as a standalone Job Archive (.zip file).
  2. Transfer the Job Archive file to your EC2 instance where Airflow is installed.
  3. Create a new directory in your Airflow home directory (e.g. /home/ubuntu/airflow/dags/talend) to store the Talend job and any related files.
  4. Extract the contents of the Job Archive file to the directory you just created. Make sure to keep the directory structure intact.
  5. Create a new Python file in the same directory; its filename will serve as the name of your DAG.
  6. In this Python file, import the necessary Airflow libraries and define your DAG.
  7. Define your DAG tasks, using the BashOperator or PythonOperator as appropriate to execute your Talend job.
  8. Add the DAG to your Airflow scheduler by copying the Python file to the dags folder (e.g. /home/ubuntu/airflow/dags/).
  9. Start the Airflow webserver and scheduler if they’re not already running, using the ‘airflow webserver’ and ‘airflow scheduler’ commands.
  10. Check the Airflow web UI to confirm that your Talend job is running as expected.

Here’s an example Python code snippet for a DAG that runs a Talend job using a BashOperator:

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.dummy_operator import DummyOperator
from datetime import datetime
# Define the DAG
dag = DAG(
    dag_id='run_my_talend_job',
    description='DAG to run my Talend job',
    start_date=datetime(2023, 2, 21),
    schedule_interval='@daily',
)
# Define the BashOperator task to run the Talend job
run_talend_job = BashOperator(
    task_id='run_talend_job',
    bash_command='bash /home/ubuntu/airflow/dags/talend/my_job/my_job_run.sh ',
    dag=dag,
)
# Marker task for the end of the DAG
end_of_dag = DummyOperator(task_id='end_of_dag', dag=dag)
# Set the order of the tasks in the DAG
run_talend_job >> end_of_dag

This example assumes that your Talend job includes a run script that can be executed from the command line. You would need to modify the path and filename to match the location of your own Talend job files.
With these steps, you should be able to create a DAG that runs your Talend job in Airflow.
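The command handed to the BashOperator is just a shell string, so it helps to build it with proper quoting. The helper below is a sketch; the directory layout and script name are assumptions based on how Talend exports standalone jobs, so adjust them to your own archive:

```python
import os
import shlex

def talend_bash_command(job_dir, run_script):
    """Build a BashOperator bash_command that runs a Talend launcher script.

    shlex.quote protects paths containing spaces; the trailing space stops
    Airflow's BashOperator from treating a command ending in '.sh' as a
    Jinja template file to load.
    """
    script_path = os.path.join(job_dir, run_script)
    return "bash " + shlex.quote(script_path) + " "

# Hypothetical job location matching the layout used earlier in this article:
cmd = talend_bash_command("/home/ubuntu/airflow/dags/talend/my_job",
                          "my_job_run.sh")
print(cmd)
```

You would then pass the returned string as the bash_command argument of the BashOperator, keeping the path logic in one place if you schedule several Talend jobs.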

Benefits of cross-checking and validating data in your business

We know that inaccurate data costs the business time, money, and resources. Therefore, having high-quality data is essential for accuracy and dependability. The benefits of data validation in your business are listed below: 

  • Data validation ensures that the data in your system is accurate. Your business benefits from accurate data in many ways, especially when it comes to sales. 
  • Sales teams unquestionably rely on reliable data to create and maintain accurate sales lead lists. If you keep using disconnected phone lines or expired email addresses, your sales funnel will not stay full. 
  • By validating data, businesses save time and open up many potential opportunities. 
  • Data validation ensures that you work with accurate data for your current clients, organizational structures, executive directories, and financial information. 


Airflow is a powerful platform for building ETL pipelines. Its ability to define, schedule, and monitor complex workflows makes it ideal for processing large volumes of data. By following best practices, organizations can build reliable and efficient ETL pipelines that can scale to meet their data processing needs. Leveraging Airflow’s capabilities for efficient and reliable data processing is crucial in the age of big data.

Let us handle the heavy lifting and ensure your data is safe and secure throughout the process.

Complere can help

Complere combines the most advanced automated technologies with skilled, experienced personnel to give you the best data validation services available. 

We understand that it is not possible to have your personnel manually validate information every day. We can swiftly and accurately authenticate data using industry-leading procedures, giving your employees access to the most recent, accurate, and comprehensive information whenever they need it. 

Call the Complere team at 7042675588 today to learn more about our data validation services and how we can help you.


Puneet Taneja
CPO (Chief Planning Officer)
