5+ years of experience in system engineering or software development.
3+ years of experience in ETL work with databases and Hadoop platforms.
Proficient in Python and SQL, with knowledge of REST APIs and data management tasks.
Familiarity with AWS, DevOps principles, and ITIL processes.
Key responsibilities:
Develop and maintain ETL processes for data integration from various RDBMS systems.
Launch and manage Spark jobs in both client and cluster modes.
Collaborate with DevOps teams to ensure compliance with SDLC and change control processes.
Utilize source code control systems like Git for version management and code collaboration.
We provide end-to-end recruitment services. Our outstanding longevity as a company has come through understanding our clients' needs and focusing on delivering value. We do not believe in being responsive to our clients' needs as much as we believe in anticipating those needs. We know the rapidly shifting and compelling world of recruitment better than most. That has come through long years in the field, understanding shifting trends, and being flexible in adapting to change.
Launched in 2012 in Hyderabad, India.
Opened the Bangalore office in 2019.
Reached $11.5 million USD in 2021.
Inaugurated offices in the US, UK, and the Middle East in 2022-2023.
A global competitor to top recruiting firms in India.
Vision: With quality and ethics, Sureminds aims to be one of the top 25 staffing agencies by 2025.
Mission: Institutionalizing and orchestrating the standards of the recruitment process with unparalleled strategies and teamwork.
Values: Customer experience and care, employee satisfaction, training, health and safety, being part of stakeholder success and in turn being triumphant.
Work culture: Healthy work relationships, increased productivity, gender equality ratio, and a sense of safety and well-being.
5+ years of experience in system engineering or software development.
3+ years of engineering experience, including ETL-type work with databases and Hadoop platforms.
Skills
Hadoop (general): Deep knowledge of distributed file system concepts, MapReduce principles, and distributed computing. Knowledge of Spark and the differences between Spark and MapReduce. Familiarity with encryption and security in a Hadoop cluster.
Data management / data structures: Must be proficient in technical data management tasks, i.e. writing code to read, transform, and store data.
XML/JSON knowledge
Experience working with REST APIs
Spark: Experience launching Spark jobs in client mode and cluster mode. Familiarity with the property settings of Spark jobs and their implications for performance (see the sketch following this skills list).
Application Development: Familiarity with HTML, CSS, and JavaScript, and basic design/visual competency.
SCC/Git: Must be experienced in the use of source code control systems such as Git.
ETL: Experience developing ELT/ETL processes, including loading data from enterprise-sized RDBMS systems such as Oracle, DB2, MySQL, etc. (see the ETL sketch following this skills list).
Authorization: Basic understanding of user authorization (Apache Ranger preferred).
Programming: Must be able to code in Python or be an expert in at least one high-level language such as Java, C, or Scala. Must have experience using REST APIs.
SQL: Must be an expert in manipulating database data using SQL. Familiarity with views, functions, stored procedures, and exception handling.
AWS: General knowledge of the AWS stack (EC2, S3, EBS, …).
IT Process Compliance: SDLC experience and formalized change controls.
Working in DevOps teams based on Agile principles (e.g. Scrum).
ITIL knowledge (especially incident, problem and change management)
Languages: Fluent English skills.
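For illustration of the Spark item above: client versus cluster mode is normally chosen when the job is submitted, while performance-related properties can be set on the command line or in code. The following is a minimal, hypothetical PySpark sketch; the application name and property values are placeholders, not tuning recommendations.

# Minimal, hypothetical sketch of Spark property settings; values are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("etl-example")                          # hypothetical application name
    .config("spark.executor.memory", "4g")           # executor heap size affects spills and OOM behaviour
    .config("spark.sql.shuffle.partitions", "200")   # shuffle parallelism affects join/aggregation speed
    .getOrCreate()
)

# Client vs cluster mode is usually selected at submission time, e.g.:
#   spark-submit --master yarn --deploy-mode client  etl_job.py   (driver runs on the submitting host)
#   spark-submit --master yarn --deploy-mode cluster etl_job.py   (driver runs inside the cluster)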
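And for the ETL item: a sketch of the extract-transform-load pattern against an RDBMS, reading a table over JDBC with Spark, applying a simple transformation, and writing the result to a curated location. The connection URL, credentials, table, column names, and output path are all hypothetical, and the appropriate JDBC driver would need to be on the classpath.

# Hypothetical ETL sketch: URL, credentials, table, columns, and output path are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("rdbms-etl-example").getOrCreate()

# Extract: read a table from an RDBMS over JDBC (MySQL shown; Oracle/DB2 differ mainly in URL and driver).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://db-host:3306/sales")
    .option("dbtable", "orders")
    .option("user", "etl_user")
    .option("password", "change-me")   # in practice, read credentials from a secret store
    .load()
)

# Transform: filter rows and derive a column.
recent = (
    orders
    .filter(F.col("order_date") >= "2024-01-01")
    .withColumn("total_with_tax", F.col("total") * 1.1)
)

# Load: write to the target store (Parquet on HDFS/S3 here).
recent.write.mode("overwrite").parquet("/data/curated/orders_recent")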
Required profile
Experience
Industry: Human Resources, Staffing & Recruiting
Spoken language(s): English