About Ludi, Inc. - Provider Compensation Management
Ludi is the leader in provider compensation solutions. We partner with healthcare organizations to streamline provider compensation management and performance, reducing costs, promoting efficient care delivery, and fostering more collaborative provider relationships.
Ludi offers DocTime®, the most innovative and fully integrated compensation management solution in the industry. DocTime simplifies the complexities of provider compensation, keeping systems aligned with industry evolutions and sharpening the focus on exceptional care delivery. Our approach boosts efficiency, lowers costs, and enhances provider satisfaction.
In the evolving healthcare landscape, where cost management and operational efficiency are paramount, Ludi stands as the true innovation for health system executives. We understand the pressure to navigate complex payment compensation models, technology fatigue, and the need to maintain provider morale.
Ludi is seeking a skilled Senior Data Engineer to join our data architecture and engineering team. The ideal candidate will have deep expertise in Snowflake architecture, data warehousing, and cloud technologies. As a Senior Data Engineer, you will be responsible for building and optimizing our data pipelines, creating efficient data models, and ensuring high performance in our Snowflake data warehouse. This role involves working closely with software developers, product teams, analysts, and internal/external stakeholders to build scalable and efficient data solutions.
Key Responsibilities:
Data Integration: Integrate data from multiple sources (APIs, cloud data sharing, flat files) into Snowflake, ensuring data quality and consistency.
ETL/ELT Development: Design, build, and maintain efficient ETL/ELT processes using Snowflake's native capabilities (SQL, Python, Snowpipe, Streams, Tasks) and integration tools (e.g., Workato); see the sketch following this list.
Data Modeling and Design: Develop and optimize data models (logical, physical, and conceptual) within Snowflake to support business intelligence, analytics, and reporting needs.
Performance Tuning: Optimize query performance, data storage, and cost efficiency within Snowflake by leveraging clustering, caching, partitioning, and other performance-enhancing features.
Data Security: Implement best practices for data security, including role-based access control, encryption, SSO, MFA, and monitoring for the Snowflake environment.
Automation: Automate repetitive tasks related to data ingestion, transformation, and data quality checks using Snowflake's native tools and external frameworks (e.g., Python).
Collaboration: Work with the data architect, analysts, product teams, and internal/external business stakeholders to translate business requirements into technical solutions and support various data-related projects.
Documentation: Develop and maintain detailed technical documentation, including data diagrams and workflows, ETL/ELT processes, data models, and performance metrics.
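To make the pipeline responsibilities above concrete, here is a minimal Snowflake SQL sketch of the stage/Snowpipe/Stream/Task pattern this posting refers to. It is illustrative only: all names (raw_db, analytics_db, my_s3_stage, the orders_* objects, transform_wh) are hypothetical, and the landing and target tables are assumed to already exist.

```sql
-- A minimal sketch; every object name here is a placeholder, and the
-- landing table orders_raw and target table dim_orders are assumed to exist.

-- 1. Data integration: external stage over an S3 bucket of flat files.
CREATE STAGE raw_db.public.my_s3_stage
  URL = 's3://example-bucket/landing/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- 2. Snowpipe: auto-ingest new files from the stage into the landing table.
--    (AUTO_INGEST additionally requires S3 event notifications; omitted here.)
CREATE PIPE raw_db.public.orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_db.public.orders_raw
  FROM @raw_db.public.my_s3_stage;

-- 3. Stream: change capture over the landing table, so downstream steps
--    see only the rows that arrived since the last consumption.
CREATE STREAM raw_db.public.orders_stream
  ON TABLE raw_db.public.orders_raw;

-- 4. Task: scheduled transformation that merges new rows into the model,
--    gated on the stream actually having data.
CREATE TASK raw_db.public.load_orders
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_db.public.orders_stream')
AS
  MERGE INTO analytics_db.public.dim_orders AS t
  USING raw_db.public.orders_stream AS s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status
  WHEN NOT MATCHED THEN INSERT (order_id, status)
    VALUES (s.order_id, s.status);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK raw_db.public.load_orders RESUME;
```

The stream-plus-task pattern keeps the MERGE incremental: each scheduled run consumes only the rows that arrived since the previous run, which is what makes this style of ELT cheap to run continuously.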
Required Skills & Qualifications:
Experience: 5+ years of experience in a similar role, working with Snowflake and other cloud data platforms.
Expertise in Snowflake: Deep understanding of Snowflake architecture, including virtual warehouses, clustering, data sharing, time travel, zero-copy cloning, and micro-partitioning (several of these features are illustrated in the sketch after this list).
SQL Proficiency: Strong experience in SQL for querying and manipulating large datasets, building complex stored procedures, and optimizing performance.
ETL/ELT Tools: Experience with modern ETL/ELT tools (e.g., Workato, AWS DMS, Matillion, Talend, Fivetran, dbt) and integrating data from/to multiple sources (e.g., REST and SOAP APIs, AWS S3, flat files).
Cloud Platforms: Hands-on experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and integrating Snowflake within these environments.
Scripting Knowledge: Proficiency in at least one programming or scripting language (e.g., Python, JavaScript) for data automation and orchestration.
Performance Optimization: Proven ability to monitor and optimize performance and costs in a cloud-based data warehouse environment.
Data Warehousing: Solid knowledge of data warehousing concepts (e.g., star schema, snowflake schema, normalization, denormalization).
Data Security: Experience with data security principles, including encryption, user roles, and secure views.
Version Control: Experience with version control systems (e.g., Git) to manage code and pipelines.
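For reference, a brief sketch of several Snowflake features named in the qualifications above (time travel, zero-copy cloning, clustering, role-based access control, and secure views). All database, schema, table, and role names are hypothetical.

```sql
-- Illustrative only; analytics_db, dim_orders, and analyst_role are placeholders.

-- Time travel: query the table as it looked one hour ago.
SELECT *
FROM analytics_db.public.dim_orders AT (OFFSET => -3600);

-- Zero-copy cloning: an instant dev copy that shares micro-partitions
-- with the source until either side changes.
CREATE TABLE analytics_db.dev.dim_orders_clone
  CLONE analytics_db.public.dim_orders;

-- Clustering: declare a clustering key so partition pruning helps
-- queries that filter on a common column.
ALTER TABLE analytics_db.public.dim_orders CLUSTER BY (order_date);

-- Role-based access control: read-only access for an analyst role.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE analytics_db TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics_db.public TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.public TO ROLE analyst_role;

-- Secure view: hides the view definition and limits the exposed columns.
CREATE SECURE VIEW analytics_db.public.orders_summary AS
  SELECT order_id, order_date, status
  FROM analytics_db.public.dim_orders;
```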