
Remote Azure Data Integration Expert

Remote: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • 6+ years' experience in data pipelines.
  • Proficient in ETL tools like Informatica.
  • Experience with OLAP cube design and development.
  • Deep knowledge of data warehouse architecture.
  • Bachelor's degree in Computer Science or related field.

Key responsibilities:

  • Lead data extraction, transformation, and load processes.
  • Design and produce complex data models.
  • Implement data integration, storage, and migration solutions.
  • Review and cleanse data for quality assurance.
  • Integrate and ingest data from various sources.
Sequoia Global Services (startup, 11-50 employees)
https://www.sequoia-connect.com/

Job description

Description

Our client is a fast-growing automation-led next-generation service provider delivering excellence in IT, BPO, and consulting services. They are driven by a combination of robust strategies, passionate teams, and a global culture rooted in innovation and automation.

Our client’s digital offerings have helped clients achieve operational excellence and customer delight. Their focus lies in taking a leadership position in helping clients attain customer intimacy as their competitive advantage. They are now on a journey of transforming the experiences of their customers’ customers by leveraging their industry-leading delivery and execution model, built around the strategy: Automate Everything™, Cloudify Everything™, Transform Customer Experiences™.

Powering our client’s complex technology solutions and services is Bottom-Up Disruption, a crowdsourcing initiative that brings innovation and improvement to everyday complexities and, ultimately, grows the client’s business. The client’s digitally empowered workforce represents various nationalities, comprises 19,833 employees, and lives the company’s philosophy of ‘customer success, first and always’. The company reported 2020 global revenue of USD 845.04 million.

We are currently searching for an Azure Data Integration Expert:

Responsibilities

  • Leads the delivery of data extraction, transformation, and load processes that turn data from disparate sources into a form consumable by analytics, for projects of moderate complexity, drawing on strong technical capabilities and a sound sense of database performance.
  • Designs, develops, and produces data models of relatively high complexity, leveraging a sound understanding of data modeling standards to recommend the right model for each requirement.
  • Batch Processing - Capability to design an efficient way of processing high volumes of data, where a group of transactions is collected over a period of time.
  • Data Integration (Sourcing, Storage, and Migration) - Capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured data, data archiving principles, data warehousing, data sourcing, etc.). This includes data models, storage requirements, and the migration of data from one system to another.
  • Data Quality, Profiling, and Cleansing - Capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required.
  • Stream Systems - Capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format, and at any quality.
  • Excellent interpersonal skills to build a network with a variety of departments across the business, understand the data, and deliver business value; interfaces and communicates with program teams, management, and stakeholders as required to deliver small to medium-sized projects.
  • Understands the difference between on-premises and cloud-based data integration technologies.

Requirements

  •  6+ years’ experience in developing large-scale data pipelines in a cloud/on-prem environment.
  • Highly proficient in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend.
  • Experience with OLAP cubes, including design, development, and optimization.
  • Analysis Integration
  • Deep knowledge of data warehouse/data mart architecture and modeling.
  • Ability to define and develop data ingestion, validation, and transformation pipelines.
  • Deep knowledge of distributed data processing and storage.
  • Deep knowledge of working with structured, unstructured, and semi-structured data.
  • Hands-on working experience with ETL/ELT patterns.
  • Extensive experience in the application of analytics, insights, and data mining to commercial “real-world” problems.
  • Technical experience in at least one programming language, preferably Java, .NET, or Python.
  • BE/BTech in Computer Science, Engineering, or a relevant field.

Languages

  • Advanced Oral English.
  • Native Spanish.

Note:

  • Fully remote

If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings | Sequoia Careers Page: https://www.sequoiags.com/careers/.


Required profile

Experience

Level of experience: Senior (5-10 years)
Industry:
Information Technology & Services
Spoken language(s):
English, Spanish

Other Skills

  • Problem Solving
  • Analytical Thinking
  • Social Skills
