Remote - Airflow data engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of software development or data engineering experience in Python, Spark, Scala, or equivalent technologies.
  • Experience with Airflow, Snowflake, AWS, and API consumption.
  • Knowledge of designing and building scalable data pipelines and working with large datasets (PB-scale).
  • Familiarity with modern software development practices, including CI/CD and containerization (Docker).

Key responsibilities:

  • Design, build, and support scalable data pipelines, systems, and APIs for the AdTech and MarTech Data Platform.
  • Utilize distributed computing frameworks to facilitate data ingestion at scale.
  • Produce high-quality, robust, efficient, and maintainable code.
  • Collaborate with a distributed team of skilled professionals to implement solutions.

Resource Informatics Group, Inc SME https://www.rigusinc.com/
51 - 200 Employees

Job description

Role: Airflow data engineer
Job Location: Remote
Duration: 12+ Months
Interview: Video

Job Description:
We are looking for self-motivated, data-driven engineers, architects, and designers who want to find solutions and make an impact at scale while collaborating with a distributed team of like-minded, highly skilled professionals.
  • Must have experience with Airflow, Snowflake, Python, AWS, and API consumption
  • 5+ years of software development or data engineering experience in Python, Spark, Scala, or equivalent technologies
  • Experience designing and building highly scalable data pipelines (e.g., with Airflow)
  • Knowledge of and experience working with large datasets (PB-scale)
  • Proven track record of working with cloud technologies (AWS)
  • Experience developing or consuming web interfaces (REST APIs)
  • Experience with modern software development practices, including CI/CD and containerization such as Docker
Roles & Responsibilities:
  • Design, build, and support scalable data pipelines, systems, and APIs for the AdTech and MarTech Data Platform
  • Use distributed computing frameworks and other cutting-edge technologies to support data ingestion at scale
  • Produce high-quality code that is robust, efficient, testable, and easy to maintain

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Collaboration
  • Self-Motivation
  • Problem Solving
