Offer summary
Qualifications:
- Proficiency in Hadoop ecosystem tools
- Strong programming skills in Java and Python
- Experience with public cloud services, especially GCP
- Prior experience with batch processing systems

Key responsibilities:
- Develop code for large-scale batch processing using Hadoop and Spark
- Create and manage batch pipelines for Machine Learning workloads on GCP
- Implement CI/CD practices and Infrastructure as Code (IaC)
- Tackle complex challenges and propose innovative solutions