Offer summary
Qualifications:
- Minimum 3 years' experience as a Data Engineer
- Proficient in Java, Scala, or Python
- Expertise in Spark and/or Hadoop frameworks
- Experience with Microsoft Azure, GCP, or AWS
- Knowledge of services like Azure Data Lake
Key responsibilities:
- Analyze and comprehend business needs
- Participate in architecture design
- Manage data lifecycle; prepare and validate data
- Develop jobs (e.g., Spark) and automate data flows
- Conduct load tests and maintain Big Data/Cloud solutions