
717 - Ssr/Sr Data Engineer

extra holidays - extra parental leave
Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Data Science, Information Science, or a related field, or equivalent work experience.
  • 2+ years of experience with SQL on multiple database platforms.
  • Strong programming background in data science-focused languages such as Python, Scala, or R.
  • Experience working with cloud-based enterprise analytics platforms and/or data warehouse projects, preferably Snowflake.

Key responsibilities:

  • Design and maintain data pipelines for analytics and reporting on the Enterprise Data Platform.
  • Build and deploy scalable data pipelines and recommend tools for efficient data movement and storage.
  • Collaborate with cross-functional teams to define and deliver reports based on business requirements.
  • Monitor system performance, troubleshoot data issues, and ensure data security and privacy best practices.

Darwoft SME https://darwoft.com/
51 - 200 Employees

Job description

Data Engineer
What You'll Bring to the Team:

We are seeking an experienced Data Engineer to design and maintain data pipelines that power analytics and advanced reporting on our client's Enterprise Data Platform. As a member of the R&D organization, you will manage critical business data assets, leveraging cloud-based big data tools to enable data-driven decisions. You will bring a strong focus on data quality and collaborate with security and compliance experts, understanding the unique data sovereignty requirements of a global customer base.

Responsibilities:
  • Craft and build reusable components, frameworks, and libraries at scale to support analytics products.
  • Recommend tools and techniques for efficient data movement, transformation, and storage to facilitate a high-performance data warehouse environment.
  • Build and deploy scalable data pipelines to power analytics and reporting across multiple source systems.
  • Deliver data platform infrastructure as code, managing deployment and configuration requirements as well as internal release documentation.
  • Identify and address data management issues to improve data quality.
  • Contribute to our culture, propose innovative solutions to industry challenges, provide constructive feedback, and help create a company that drives meaningful change.
  • Implement redundant systems, policies, and procedures for disaster recovery and data archiving to ensure data availability, protection, and integrity.
  • Plan for capacity and resource expansion to support data warehouse scalability.
  • Collaborate with cross-functional teams to define and deliver reports based on business requirements.
  • Participate in data warehouse improvement and growth projects.
  • Monitor system performance, optimize stored procedures, and improve query execution efficiency.
  • Ensure data security and privacy best practices are applied appropriately.
  • Troubleshoot data issues and perform root cause analysis to proactively resolve operational challenges.
  • Create, update, and maintain system documentation.
Ideal Candidate Profile:
  • Bachelor's or Master's degree in Computer Science, Data Science, Information Science, or a related field, or equivalent work experience.
  • 2+ years of experience with SQL on multiple database platforms.
  • 1+ years of experience working with cloud-based enterprise analytics platforms and/or data warehouse projects (Snowflake preferred).
  • Strong programming background in data science-focused languages such as Python, Scala, or R.
  • Solid understanding of both relational and NoSQL database modeling and schema design principles.
  • Experience working with large datasets and developing high-performance queries.
  • Hands-on experience with large-scale data migrations.
  • Strong knowledge of data security best practices.
  • Proficiency with Git and commitment to documentation best practices.
  • Ability to thrive in a hybrid work environment.
  • Positive and action-oriented mindset.
  • Strong interpersonal and communication skills, with the ability to ask the right questions.
  • Self-motivated and self-managing, with excellent task organization skills.
  • Ability to clearly and concisely communicate technical requirements and recommendations.
  • Proficiency in SQL & NoSQL databases (Snowflake, MongoDB), particularly in development or reporting.
  • Strong understanding of relational data structures, theories, and principles.
  • Experience mentoring and training other developers and engineers on data engineering best practices.
  • Strong knowledge of applicable data privacy regulations and best practices.
  • Strong SQL and schema comprehension skills, including many-to-many relationships.
  • Deep understanding of data structures and their implementation.
Bonus Points If You Have:
  • Experience with Snowflake, dbt, Fivetran, and Tableau.
  • Prior exposure to distributed data frameworks.
  • Proficiency in building modular applications.
  • Experience with Microservices and/or Service-Oriented Architecture.
  • Experience with database management and data operations.

How to Apply:

Interested candidates are encouraged to submit their resumes and a cover letter outlining their relevant experience and qualifications to talento@darwoft.com

Questions? Follow the recruiter: https://www.linkedin.com/in/hernanvietto/

Required profile


Spoken language(s): English

Other Skills

  • Social Skills
  • Self-Motivation
  • Communication
