
Data Engineer - Remote

Remote: Full Remote

Offer summary

Qualifications:

Bachelor's degree in Computer Science or related field; experience as a machine learning engineer; strong programming skills in Python, R, or similar; familiarity with cloud platforms like AWS or Azure.

Key responsibilities:

  • Develop and optimize machine learning models
  • Collaborate with teams to drive projects to completion
Pozent Corporation http://pozent.com/
51 - 200 Employees

Job description

Skills

Must have

  • Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
  • Proven experience as a machine learning engineer, working on complex machine learning projects
  • Strong programming skills in languages like Python, R, or similar
  • Solid understanding of machine learning algorithms, deep learning frameworks, and statistical modeling techniques
  • Hands-on experience with machine learning libraries such as TensorFlow, PyTorch, or scikit-learn
  • Proficiency in data preprocessing, feature engineering, and data visualization
  • Experience with cloud platforms such as AWS, Azure, or GCP for deploying machine learning models
  • Familiarity with version control systems (e.g., Git) and collaborative development practices
  • Strong problem-solving skills and ability to troubleshoot and optimize machine learning models
  • Excellent communication skills to convey technical concepts to both technical and non-technical stakeholders
  • Proven ability to work in a collaborative team environment and drive projects to completion

Ideal Candidate Qualifications

  • Working proficiency with Python/Scala, Spark (including job tuning), SQL, and Hadoop platforms to build Big Data products & platforms
  • Good programming skills in Java, Spring Boot, and JUnit
  • Knowledge of software development testing approaches & frameworks
  • Familiarity with RESTful APIs and microservices architectures
  • Experience working with CI/CD pipelines
  • Experience working with SQL databases such as Postgres or Oracle
  • Preferably with hands-on experience with Hadoop big data tools (Hive, Impala, Spark)
  • Experience with data pipeline and workflow management tools such as NiFi and Airflow
  • Comfortable developing shell scripts for automation
  • Good troubleshooting and debugging skills
  • Proficient in standard software development practices, such as version control, testing, and deployment
  • Demonstrated basic knowledge of statistical analytical techniques, coding, and data engineering
  • Ability to quickly learn and implement new technologies
  • Ability to solve complex problems with multi-layered data sets
  • Ability to innovate and determine new approaches & technologies to solve business problems and generate business insights & recommendations.
  • Ability to multitask, with strong attention to detail
  • Flexibility to work as a member of matrix-based, diverse, and geographically distributed project teams
  • Good communication skills, both verbal and written; strong relationship, collaboration, and organizational skills

Nice to have

  • Experience with performance tuning of database schemas, databases, SQL, ETL jobs, and related scripts
  • Experience in working with Cloud APIs (e.g., Azure, AWS)
  • Experience participating in complex engineering projects in an Agile setting (e.g., Scrum)

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Detail Oriented
  • Collaboration
  • Communication
  • Problem Solving
