Enterprise Data Platform Lead Consultant (AWS, Python, Snowflake)

Remote: Full Remote
Contract role

Offer summary

Qualifications:

  • 5+ years of experience designing Enterprise Data Platforms.
  • Proficiency in AWS, Python (Pandas, PySpark), and Snowflake.
  • Experience with data integration tools and SQL/NoSQL databases.
  • AWS Certified Solutions Architect or similar certification preferred.

Key responsibilities:

  • Lead the development of Enterprise Data Platform reporting capabilities.
  • Architect and deliver cloud-based analytical solutions using AWS, Python, and Snowflake.
  • Design and implement data integration and curation pipelines for analytics.
  • Collaborate with teams to capture requirements and ensure best practices in data platform architecture.

Meta Resources Group (Startup) | https://metaresourcesgroup.com/
11 - 50 Employees

Job description

Our client, a global healthcare company, is seeking an Enterprise Data Platform Lead Consultant, highly proficient in AWS, Python, and Snowflake, to spearhead the design and implementation of their Enterprise Data Platform (EDP) solutions for cross-domain reporting and analytics. You will drive cloud-based data integration, storage, and curation using AWS, Python, and Snowflake, ensuring alignment with strategic initiatives of client programs, including but not limited to the Spectra to Quest Lab transition. This role demands technical leadership in scenarios where data gravity necessitates EDP-based reporting outside SAP Datasphere.



This is a remote contract role with minimal travel. The contract runs through the end of 2025, with a strong likelihood of renewal well into 2026.


Job Responsibilities:
  • Lead the development of Enterprise Data Platform reporting capabilities for cross-domain reporting and analytical needs.
  • Architect and deliver cloud-based analytical solutions leveraging AWS, Python, and Snowflake.
  • Design and implement end-to-end data integration, storage, and curation pipelines for high-performance analytical use cases.
  • Serve as the technical leader in implementing EDP solutions that support data-intensive initiatives within the client's program, especially where reporting must occur outside of SAP Datasphere due to data gravity considerations.
  • Collaborate with data engineers, analysts, and business units to capture requirements and translate them into effective data models and pipelines.
  • Ensure scalability, governance, and security are core to the EDP solution design.
  • Support and guide project teams, enforcing data platform architecture best practices and performance optimization strategies.

Requirements
  • 5+ years in designing Enterprise Data Platforms, with expertise in AWS (certifications preferred), Python (Pandas, PySpark), and Snowflake.
  • Proficiency in data integration tools (e.g., Apache Airflow, dbt, Fivetran) and SQL/NoSQL databases.
  • Hands-on experience with data lakehouses, real-time analytics, and cloud security frameworks.
  • Experience leading large-scale migrations (e.g., legacy to cloud) and multi-domain data curation.

Preferred Qualifications:

  • AWS Certified Solutions Architect, Snowflake SnowPro Core/Advanced, or Python certifications.
  • Familiarity with Databricks, Tableau, or Power BI is a plus.
  • Fluent in English; ability to collaborate with global teams across EU time zones.
  • Strong problem-solving skills and stakeholder management for technical and non-technical audiences.

