
Senior Big Data Hadoop ML Engineer (GCP) - Canada

extra holidays - extra parental leave - fully flexible
Remote: Full Remote
Experience: Expert & Leadership (>10 years)

Offer summary

Qualifications:

  • Proficiency in Hadoop ecosystem tools
  • Strong programming skills in Java and Python
  • Experience with public cloud services, especially GCP
  • Prior experience in batch processing systems

Key responsibilities:

  • Develop code for large-scale batch processing using Hadoop, Spark
  • Create and manage batch pipelines for Machine Learning workloads on GCP
  • Implement CI/CD practices and Infrastructure as Code (IaC)
  • Tackle complex challenges and propose innovative solutions
Rackspace Technology (5001 - 10000 Employees)
https://www.rackspace.com/

Job description

About the Role:

We are seeking a highly skilled and experienced Senior Big Data Engineer to join our dynamic team. The ideal candidate will have a strong background in developing batch processing systems, with extensive experience in the Apache Hadoop ecosystem (MapReduce, Oozie, Hive, Pig, HBase, Storm). This role involves working in Java and building Machine Learning pipelines for data collection and batch inference. This is a remote position, requiring excellent communication skills and the ability to solve complex problems independently and creatively.

Work Location: US-Remote


What you will be doing:
  • Develop scalable and robust code for large-scale batch processing systems using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase (see the illustrative sketch after this list)
  • Develop, manage, and maintain batch pipelines supporting Machine Learning workloads
  • Leverage GCP for scalable big data processing and storage solutions
  • Implement automation and DevOps best practices for CI/CD, IaC, etc.
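
To make the scope of this work concrete, below is a minimal, illustrative sketch of a Spark (Java) batch job of the kind described in this list. The bucket paths, column names, and class name are hypothetical examples rather than part of the role definition, and reading from gs:// paths assumes the GCS connector is available on the classpath.

    // Illustrative only: a small Spark batch job that turns raw events into
    // per-user aggregate features for a downstream batch-inference pipeline.
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    import static org.apache.spark.sql.functions.col;
    import static org.apache.spark.sql.functions.count;

    public class EventBatchJob {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("event-batch-aggregation")
                    .getOrCreate();

            // Read one day of raw events (hypothetical GCS path and schema).
            Dataset<Row> events = spark.read()
                    .parquet("gs://example-bucket/events/dt=2024-01-01/");

            // Drop malformed rows and aggregate per user.
            Dataset<Row> features = events
                    .filter(col("user_id").isNotNull())
                    .groupBy(col("user_id"))
                    .agg(count("*").alias("event_count"));

            // Write features for the Machine Learning batch pipeline to pick up.
            features.write().mode("overwrite")
                    .parquet("gs://example-bucket/features/dt=2024-01-01/");

            spark.stop();
        }
    }

A job like this would typically be packaged as a JAR and scheduled by an orchestrator such as Oozie or a managed GCP service, consistent with the stack listed above.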

Requirements:
  • Proficiency in the Hadoop ecosystem, including MapReduce, Oozie, Hive, Pig, HBase, and Storm
  • Strong programming skills with Java, Python, and Spark
  • Knowledge of public cloud services, particularly GCP.
  • Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes.
  • Ability to tackle complex challenges and devise effective solutions, using critical thinking to approach problems from various angles and propose innovative solutions.
  • Ability to work effectively in a remote setting, maintaining strong written and verbal communication skills and collaborating with team members and stakeholders to ensure a clear understanding of technical requirements and project goals.
  • Proven experience in engineering batch processing systems at scale.
  • Hands-on experience in public cloud platforms, particularly GCP. Additional experience with other cloud technologies is advantageous.

Must Have:
  • Experience with batch pipelines supporting Machine Learning workloads
  • Strong experience with a programming language such as Java
  • Strong experience in the Apache Hadoop ecosystem
  • 10+ years of experience in customer-facing software/technology or consulting
  • 5+ years of experience with “on-premises to cloud” migrations or IT transformations
  • Technical degree in Computer Science, Software Engineering, or a related field

Good to Have:
  • Familiarity with Terraform
  • Familiarity with Python
  • 5+ years of experience building and operating solutions on GCP


#LI-VM1 #Rackspace #LI-Rackspace #LI-USA #LI-Canada #LI-Remote


About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.

Required profile

Experience

Level of experience: Expert & Leadership (>10 years)
Industry: Information Technology & Services
Spoken language(s): English

Other Skills

• Communication
• Problem Solving
