Senior Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 5+ years of experience in a similar role with Snowflake and cloud data platforms.
  • Deep understanding of Snowflake architecture and data warehousing concepts.
  • Strong SQL proficiency and experience with modern ETL/ELT tools.
  • Proficiency in at least one programming language for data automation.

Key responsibilities:

  • Integrate data from multiple sources into Snowflake while ensuring quality and consistency.
  • Design and maintain efficient ETL/ELT processes using Snowflake's capabilities.
  • Develop and optimize data models to support business intelligence and analytics needs.
  • Collaborate with teams to translate business requirements into technical solutions.

Ludi, Inc. - Provider Compensation Management
11 - 50 Employees

Job description

Ludi is seeking a skilled Senior Data Engineer to join our data architecture and engineering team. The ideal candidate will have deep expertise in Snowflake architecture, data warehousing, and cloud technologies. As a Senior Data Engineer, you will be responsible for building and optimizing our data pipelines, creating efficient data models, and ensuring high performance in our Snowflake data warehouse. This role involves working closely with software developers, product teams, analysts, and internal/external stakeholders to build scalable and efficient data solutions.

Key Responsibilities:
 
  • Data Integration: Integrate data from multiple sources (APIs, Cloud Data Sharing, flat files) into Snowflake, ensuring data quality and consistency.
  • ETL/ELT Development: Design, build, and maintain efficient ETL/ELT processes using Snowflake’s native capabilities (SQL, Python) and integration tools (e.g., Workato, Snowpipe, Streams, Tasks); a brief sketch follows this list.
  • Data Modeling and Design: Develop and optimize data models (logical, physical, and conceptual) within Snowflake to support business intelligence, analytics, and reporting needs.
  • Performance Tuning: Optimize query performance, data storage, and cost efficiency within Snowflake by leveraging clustering, caching, partitioning, and other performance-enhancing features.
  • Data Security: Implement best practices for data security, including role-based access control, encryption, SSO, MFA, and monitoring for the Snowflake environment.
  • Automation: Automate repetitive tasks related to data ingestion, transformation, and data quality checks using Snowflake's native tools and external frameworks (e.g., Python).
  • Collaboration: Work with data architects, analysts, product teams, and internal/external business stakeholders to translate business requirements into technical solutions and support various data-related projects.
  • Documentation: Develop and maintain detailed technical documentation and data diagrams/workflows, including ETL/ELT processes, data models, and performance metrics.
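
For context on the ETL/ELT bullet above, here is a minimal illustrative sketch of how Snowpipe, Streams, and Tasks can fit together in Snowflake SQL. Every object name (orders_pipe, orders_raw, orders_stream, transform_wh, fct_orders) is hypothetical, not drawn from Ludi's actual environment.

  -- A sketch under assumed, hypothetical object names; not Ludi's implementation.

  -- Continuous ingestion: Snowpipe copies new files from a stage as they arrive.
  CREATE OR REPLACE PIPE raw_db.public.orders_pipe
    AUTO_INGEST = TRUE
  AS
    COPY INTO raw_db.public.orders_raw
    FROM @raw_db.public.orders_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

  -- Change capture: a stream records new rows landing in the raw table.
  CREATE OR REPLACE STREAM raw_db.public.orders_stream
    ON TABLE raw_db.public.orders_raw;

  -- Transformation: a task fires on a schedule, but only when the stream
  -- has data, and merges cleaned rows into the analytics model.
  CREATE OR REPLACE TASK raw_db.public.orders_elt_task
    WAREHOUSE = transform_wh
    SCHEDULE = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_db.public.orders_stream')
  AS
    MERGE INTO analytics.public.fct_orders AS t
    USING (
      SELECT order_id, customer_id, TRY_TO_NUMBER(amount) AS amount, order_ts
      FROM raw_db.public.orders_stream
      WHERE METADATA$ACTION = 'INSERT'
    ) AS s
    ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.order_ts = s.order_ts
    WHEN NOT MATCHED THEN INSERT (order_id, customer_id, amount, order_ts)
      VALUES (s.order_id, s.customer_id, s.amount, s.order_ts);

  -- Tasks are created suspended; resume to start the schedule.
  ALTER TASK raw_db.public.orders_elt_task RESUME;

Because the MERGE consumes the stream, its offset advances on each successful run, so every execution processes only rows that arrived since the previous one.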

Required Skills & Qualifications:
 
  • Experience: 5+ years of experience in a similar role, working with Snowflake and other cloud data platforms.
  • Expertise in Snowflake: Deep understanding of Snowflake architecture, including virtual warehouses, clustering, data sharing, time travel, zero-copy cloning, and micro-partitioning (several of these are illustrated in the sketch after this list).
  • SQL Proficiency: Strong experience in SQL for querying and manipulating large datasets, building complex stored procedures, and optimizing performance.
  • ETL/ELT Tools: Experience with modern ETL/ELT tools (e.g., Workato, AWS DMS, Matillion, Talend, Fivetran, dbt) and integrating data from/to multiple sources (e.g., REST and SOAP APIs, AWS S3, flat files).
  • Cloud Platforms: Hands-on experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and integrating Snowflake within these environments.
  • Scripting Knowledge: Proficiency in at least one programming or scripting language (e.g., Python, JavaScript) for data automation and orchestration.
  • Performance Optimization: Proven ability to monitor and optimize performance and costs in a cloud-based data warehouse environment.
  • Data Warehousing: Solid knowledge of data warehousing concepts (e.g., star schema, snowflake schema, normalization, denormalization).
  • Data Security: Experience with data security principles, including encryption, user roles, and secure views.
  • Version Control: Experience with version control systems (e.g., Git) to manage code and pipelines.
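
To make the Snowflake-specific items above concrete, the short sketch below touches clustering, time travel, zero-copy cloning, and role-based access through a secure view. Object and role names (fct_orders, analyst_role, v_orders) are hypothetical; this is an illustration under those assumptions, not a prescribed setup.

  -- Hypothetical names; illustration only.

  -- Clustering: aid micro-partition pruning on a large, frequently filtered table.
  ALTER TABLE analytics.public.fct_orders CLUSTER BY (order_ts);

  -- Time travel: query the table as it looked one hour (3600 seconds) ago.
  SELECT COUNT(*) FROM analytics.public.fct_orders AT(OFFSET => -3600);

  -- Zero-copy cloning: an instant copy that shares storage until data diverges.
  CREATE TABLE analytics.public.fct_orders_dev CLONE analytics.public.fct_orders;

  -- Role-based access control: a secure view exposes only non-sensitive columns.
  CREATE ROLE IF NOT EXISTS analyst_role;
  CREATE OR REPLACE SECURE VIEW analytics.public.v_orders AS
    SELECT order_id, order_ts, amount FROM analytics.public.fct_orders;
  GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
  GRANT USAGE ON SCHEMA analytics.public TO ROLE analyst_role;
  GRANT SELECT ON VIEW analytics.public.v_orders TO ROLE analyst_role;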

Preferred Qualifications:
 
  • Certifications: Snowflake certification (SnowPro Core, SnowPro Advanced: Architect) or relevant cloud certifications.
  • Orchestration: Familiarity with workflow orchestration tools like Workato.
  • Big Data Experience: Experience working with big data technologies (e.g., Hadoop, Spark, Kafka).
  • Analytics Tools: Experience with BI tools (e.g., Power BI, Tableau, Looker).

Personal Attributes:
 
  • Strong problem-solving skills and attention to detail.
  • Ability to work collaboratively in a team-oriented environment.
  • Effective communication skills to translate technical details to non-technical stakeholders.
  • Self-motivated with a drive for continuous learning and improvement.

This is a remote position, but qualified candidates MUST be located in the United States.

Required profile

Spoken language(s): English

Other Skills

  • Detail Oriented
  • Teamwork
  • Communication
  • Problem Solving
