
Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • Bachelor’s degree in Computer Science or a related field
  • Minimum of 5 years in data engineering roles
  • Proficiency in SQL and programming languages such as Python, Java, or Scala
  • Familiarity with big data tools such as Hadoop or Spark

Key responsibilities:

  • Design, develop, and maintain scalable ETL pipelines
  • Collaborate with stakeholders to align data solutions with business goals
TUTEN
51 - 200 Employees

Job description

Mission

The Data Engineer's mission is to design, develop, and optimize the data infrastructure that powers analytics and business decisions. This role involves building robust ETL pipelines, ensuring data quality and integrity, and enabling scalable data storage and processing. The Data Engineer collaborates with cross-functional teams to implement efficient data solutions that support organizational objectives and drive innovation.

Responsibilities 🙌

  • Data Pipeline Development:
      - Design, develop, and maintain scalable ETL pipelines.
      - Automate data ingestion, transformation, and storage processes.
  • Data Architecture and Modeling:
      - Design and implement data architectures to support analytics and business intelligence.
      - Build and maintain efficient and reliable data models.
  • Performance Optimization:
      - Optimize database and query performance for large-scale data processing.
      - Ensure low-latency access to critical data assets.
  • Data Quality and Integrity:
      - Implement data validation and quality checks to ensure accuracy and reliability.
      - Monitor data pipelines and resolve any data-related issues promptly.
  • Collaboration with Stakeholders:
      - Work closely with data scientists, analysts, and business teams to understand data requirements.
      - Align data engineering solutions with business goals and strategies.
  • Cloud and Big Data Solutions:
      - Leverage cloud-based platforms for data storage and processing (e.g., AWS, Azure, GCP).
      - Implement big data solutions using tools like Hadoop, Spark, or equivalent technologies.
  • Documentation and Best Practices:
      - Document data pipelines, architectures, and processes comprehensively.
      - Promote and adhere to data engineering best practices.

Minimum Requirements

It will be a good match if you have:

  • Education: Bachelor’s degree, or equivalent experience, in Computer Science, Data Engineering, or a related field. Relevant certifications such as AWS Certified Data Analytics or Google Cloud Professional Data Engineer are a plus.
  • Language proficiency: native Spanish and advanced English (C1).
  • Experience: minimum of 5 years in data engineering or related roles.
      - Proven experience building and optimizing data pipelines and architectures.
      - Familiarity with big data tools and technologies (e.g., Hadoop, Spark).
      - Experience with CI/CD pipelines for data workflows.
  • Technical knowledge:
      - Proficiency in SQL and database management systems.
      - Strong programming skills in Python, Java, or Scala.
      - Experience with cloud-based data platforms (AWS, Azure, GCP).
      - Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes).

Benefits Package

💻 100% remote work.

⏰ Flexible schedule.

🍰 Day off on your birthday.

😁 Reduced hours on Fridays.

🌴 Superior vacation days.

💊 Sick leave days.

🏠 Moving day off.

🎓 Support for studies, training, and languages.

👥 Referral program.

We are proud to be a team that celebrates, supports, and promotes all kinds of diversity; we are committed to equal opportunities regardless of nationality, ethnicity, skin color, gender, gender expression, disability, or religion.

If everything you've read makes sense to you, apply now so we can get to know you! 💙

Required profile

Experience

Spoken language(s):
Spanish, English

Other Skills

  • Problem Solving
  • Collaboration
