Offer summary
Qualifications:
- Experience in software development with Scala
- Knowledge of Apache Spark for processing large data sets
- Familiarity with Java and microservices
- Understanding of the Parquet data storage format
Key responsibilities:
- Write and maintain efficient, scalable code in Scala
- Develop large-scale data processing solutions using Apache Spark
- Implement and optimize data storage in Parquet format (illustrated in the sketch after this list)
- Collaborate with multidisciplinary teams
- Analyze and solve complex data processing issues
- Design architectures for data processing
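
As a rough illustration of the stack described above, here is a minimal sketch of a Spark job in Scala that aggregates a dataset and writes the result as Parquet. The input path, column name, and output location are hypothetical placeholders, not part of the offer.

```scala
import org.apache.spark.sql.SparkSession

object ParquetExample {
  def main(args: Array[String]): Unit = {
    // Entry point for a Spark application
    val spark = SparkSession.builder()
      .appName("parquet-example")
      .getOrCreate()

    // Read raw events (CSV is assumed here purely for illustration)
    val events = spark.read
      .option("header", "true")
      .csv("data/events.csv")

    // Simple aggregation: count events per user_id (hypothetical column)
    val counts = events.groupBy("user_id").count()

    // Persist the result in Parquet, a columnar format suited to large-scale analytics
    counts.write
      .mode("overwrite")
      .parquet("output/event_counts")

    spark.stop()
  }
}
```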