
Data Engineer

Benefits: unlimited holidays - extra holidays - extra parental leave - long remote period allowed
Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • 2+ years of experience in data engineering
  • Extensive experience with SQL, preferably Postgres
  • Hands-on experience with GCP BigQuery or Snowflake
  • Experience using Terraform for cloud infrastructure

Key responsibilities:

  • Design and develop data pipelines and models
  • Prepare and maintain documentation
  • Lead data migration and modeling from GCP
  • Contribute to ETL framework and data model implementation
TASQ Staffing Solutions
Human Resources, Staffing & Recruiting
https://www.tasq.work
11 - 50 Employees

Job description

About the Role

We have partnered with a dynamic Australian start-up that is making a huge impact in the auto industry. As a Data Engineer, you will report directly to the Head of Data and Analytics and will be responsible for the design and development of data models and data pipelines supporting the company's platform and apps, as well as for providing data and reporting support for business end users.

What you'll be working on:

  • Design and development of reporting data model and data transformation jobs, including the modelling of very large data sets.
  • Identify and implement the most efficient ways of performing data transformation tasks using best practice methods and tooling.
  • Prepare and maintain documentation such as business requirements documents, design specifications and test cases.
  • Work with stakeholders (including data team, software engineers and product team) to understand business requirements and translate these into technical specifications.
  • Lead the data migration and modelling process from GCP to the data warehouse.
  • Administer the data warehouse, including user access and security.
  • Contribute to the design and implementation of our data model and ETL framework.

What we're looking for:

  • 2+ years of experience in a data engineering environment, with hands-on experience building and maintaining complex data environments in the cloud (preferably GCP BigQuery and/or Snowflake).
  • Extensive experience with SQL (Postgres preferred), with a core focus on analyzing and validating complex and disparate data sets to find gaps between datasets, requirements, and source systems.
  • Demonstrated understanding of and experience with the following data engineering competencies:

-Data warehousing principles, including data architecture, modelling, database design, and performance optimization best practices.

-Building group data assets and pipelines from scratch, by integrating large quantities of data from disparate internal and external sources.

-Supporting analytics solutions to be productionized, including deployment, automation, orchestration, monitoring, and logging. Preferably with an ETL tool such as Matillion, DBT, or equivalent.

  • Experience in deploying cloud infrastructure as code (IaC) using Terraform or similar.
  • Experience using Python to develop scripts and small programs for job orchestration and/or data manipulation.
  • Ability to interact with business end users to draw out and distil business requirements into data pipeline designs and reporting solutions.
  • Ability to prioritize on the fly and work in a high-performing, outcomes-focused environment with multiple competing and ambiguous deliverables.
  • Experience working in an Agile development environment.

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Industry: Human Resources, Staffing & Recruiting
Spoken language(s): English

Other Skills

  • Adaptability
  • Teamwork
  • Communication
  • Problem Solving
