The purpose of this job is to provide technical expertise for the research, development, and modification of extract, transform, load (ETL) processes and jobs in support of our client's Big Data infrastructure.
Health Insurance/HMO
Enjoy unlimited MadMax Coffee
Diverse learning & growth opportunities
Accessible Cloud HR platform (Sprout)
Above-standard leave benefits
Research, develop, document, and modify Big Data Lake processes and jobs per data architecture and modeling requirements; collaborate with Data and Analytics data strategists and data scientists
Collaborate with business stakeholders to understand data needs including data velocity, veracity, and access patterns
Provide technical expertise to implement Data and Analytics specifications
Serve on cross-functional project teams and provide the data and big data perspective on executing key deliverables
Troubleshoot complex, escalated issues including connection, failed jobs, application errors, server alerts and space thresholds within predefined service level agreements (SLAs)
Proactively maintain and tune all code according to Big Data and EDW best practices to prevent issues
Review and ensure appropriate documentation for all new development and modifications of the Big Data Lake processes and jobs
Perform code and process reviews and oversee testing for solutions developed, and ensure integrity and security of institutional data
Educate business stakeholders on the usage and benefits of the EDW, Big Data Lake and related technologies
Mentor and guide less experienced team members and provide feedback on project work
Model behaviors that support the company’s common purpose; ensure guests and team members are supported at the highest level
Ensure all activities are in compliance with rules, regulations, policies, and procedures
Complete other duties as assigned
Bachelor’s degree in computer science, engineering, information technology, or a related field required
Minimum five years of technology operations experience required.
Strong SQL knowledge and skills required
Strong knowledge of relational databases such as Oracle, Postgres, or SQL Server required
Strong knowledge of relational modeling and features including triggers, stored procedures, and constraints required
Experience with Apache Spark or Spark Streaming, message queue technologies, and Python required
Strong knowledge of enterprise data warehouse (EDW) data models with a focus on Star Schema data modeling techniques required
Strong knowledge of Amazon Web Services (AWS) or similar Cloud Big Data platform preferred
Excellent analytical skills and the ability to identify solutions to complex data problems
Ability to provide excellent customer service
Excellent written and verbal communication skills
Willingness to learn and embrace new technologies
Ability to mentor and motivate a diverse team; ensure team and individual accountability and performance standards are met
Ability to prioritize, multitask, and manage multiple projects successfully in a fast-paced and dynamic environment
Strong organizational skills with attention to detail
Ability to communicate and interact effectively with different levels of the organization to negotiate, problem solve, complete projects, and influence decision making
Self-motivated with the ability to work both independently and within teams in order to establish and meet deadlines, goals, and objectives
Required profile
Industry: Real Estate Management & Development
Spoken language(s): English