Requirements:
Bachelor's degree in Computer Science or related field.
Proficiency in Python and SQL for data manipulation and analysis.
Experience with data warehousing and ETL processes.
Familiarity with cloud platforms like AWS or Azure.
Key responsibilities:
Design and implement data pipelines for efficient data processing.
Collaborate with data scientists to understand data requirements.
Monitor and optimize data systems for performance and reliability.
Prepare and maintain documentation for data processes and workflows.