Senior Data Engineer - Databricks (PST working hours)

Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of experience in data engineering with a focus on big data technologies.
  • Strong expertise in Databricks, including Delta Lake and Unity Catalog.
  • Proficiency in Python or Java for building scalable data solutions.
  • Experience with cloud platforms such as AWS, Azure, or GCP.

Key responsibilities:

  • Collaborate with business and engineering teams to define data requirements and develop scalable solutions.
  • Design, develop, and maintain high-performance ETL pipelines using Databricks and PySpark.
  • Optimize big data processing workflows for efficiency and reliability.
  • Implement data governance and quality best practices while troubleshooting performance issues.

Bridgenext (https://www.bridgenext.com/)
1001 - 5000 Employees

Job description

Company Overview:

Bridgenext is a digital consulting services leader that helps clients innovate with intention and realize their digital aspirations by creating digital products, experiences, and solutions around what real people need. Our global consulting and delivery teams facilitate highly strategic digital initiatives through digital product engineering, automation, data engineering, and infrastructure modernization services, while elevating brands through digital experience, creative content, and customer data analytics services.

Don't just work, thrive. At Bridgenext, you have an opportunity to make a real difference - driving tangible business value for clients, while simultaneously propelling your own career growth. Our flexible and inclusive work culture provides you with the autonomy, resources, and opportunities to succeed. 

Position Description:

We’re looking for a Senior Data Engineer with strong expertise in Databricks to design and build high-performance, scalable data pipelines. In this role, you’ll work closely with stakeholders, architects, and engineers to define data requirements, develop optimized solutions, and ensure seamless execution across our global teams.

What You’ll Do:

  • Collaborate with business and engineering teams to understand data needs and translate them into scalable solutions
  • Design, develop, and maintain high-performance ETL pipelines using Databricks and PySpark
  • Optimize big data processing workflows to ensure efficiency, reliability, and scalability
  • Work with AWS cloud services to manage, store, and process large-scale datasets
  • Implement data governance, security, and quality best practices
  • Troubleshoot and optimize performance issues in distributed data environments
  • Stay ahead of industry trends and drive innovation in data engineering best practices

Technical Stack / Skills Required:

  • Databricks – Strong experience in Delta Lake, Unity Catalog, and Workflows.
  • PySpark / Spark SQL – Deep understanding of distributed data processing and optimization techniques.
  • Cloud Platforms (AWS / Azure / GCP) – Basics of AWS Glue, S3, Lambda, Athena, Redshift, or similar cloud-native services.
  • ETL & Data Orchestration – Hands-on experience with Apache Airflow or similar orchestration tools.
  • Data Warehousing & Modeling – Experience with Snowflake, Redshift, BigQuery, or similar; solid understanding of dimensional modeling and Star Schema.
  • Programming Languages – Proficiency in Python (preferred), Scala, or Java for data engineering and automation.

Workplace: Remote from anywhere in the USA, working PST office hours

Must Have Skills:

  • 5+ years of experience in data engineering, specializing in big data technologies (e.g., Hadoop, Spark, Kafka)
  • Hands-on expertise in Databricks and distributed data processing
  • Strong knowledge of data architecture, ETL frameworks, and data warehousing concepts
  • Proficiency in Python or Java for building scalable data solutions
  • Experience working with cloud platforms (AWS, Azure, GCP) and modern data ecosystems
  • Strong analytical mindset with excellent problem-solving and debugging skills

Professional Skills:

  • Outstanding communication and leadership skills, with the ability to work in fast-paced, cross-functional teams
  • Solid English written, verbal, and presentation communication skills
  • Works effectively both as part of a team and independently
  • Maintains composure in all types of situations and is collaborative by nature
  • High standards of professionalism, consistently producing high-quality results
  • Self-sufficient and independent, requiring very little supervision or intervention
  • Demonstrates flexibility and openness in bringing creative solutions to issues

Bridgenext is an Equal Opportunity Employer

US citizens and those authorized to work in the US are encouraged to apply

#LI-CP1

#LI-REMOTE

Required profile

Spoken language(s):
English

Other Skills

  • Problem Solving
  • Collaboration
  • Communication
  • Leadership
