Middleware Engineer

Remote: Full Remote
Contract: 1-year

Offer summary

Qualifications:

  • Bachelor of Science degree from an accredited university.
  • Hands-on experience with Java, Python, and Spring Boot.
  • Proven expertise in Kafka, JMS, or Azure Service Bus for scalable applications.
  • Strong knowledge of data security best practices, especially concerning PII and sensitive data.

Key responsibilities:

  • Design and deploy middleware services using Java Spring Boot and integrate with REST APIs and data lakes.
  • Develop real-time and batch data pipelines for data extraction and transformation into systems like Snowflake.
  • Collaborate with architects and DevOps teams to ensure CI/CD readiness and monitoring of data flows.
  • Implement robust security measures for middleware systems processing sensitive information.

Job description

About Fusemachines
Fusemachines is a 10+ year old AI company, dedicated to delivering state-of-the-art AI products and solutions to a diverse range of industries. Founded by Sameer Maskey, Ph.D., an Adjunct Associate Professor at Columbia University, our company is on a steadfast mission to democratize AI and harness the power of global AI talent from underserved communities. With a robust presence in four countries and a dedicated team of over 400 full-time employees, we are committed to fostering AI transformation journeys for businesses worldwide. At Fusemachines, we not only bridge the gap between AI advancement and its global impact but also strive to deliver the most advanced technology solutions to the world.

This is a 1-year contract role.

About the role:

We are seeking a Senior Middleware & Data Integration Engineer to design and build secure, scalable, and real-time data pipelines and middleware services across hybrid cloud environments. The ideal candidate will have strong proficiency in Java (Spring Boot), experience with data lakes (Azure, AWS), messaging systems (Kafka, Azure Service Bus, JMS), and an understanding of real-time and batch-based processing. You will work across full-stack components and integrate structured/unstructured data from upstream systems into platforms like Snowflake, while ensuring compliance and performance at scale.

Roles and Responsibilities:

  • Design, build, and deploy middleware services using Java Spring Boot with integrations across REST APIs, data lakes, and messaging systems.
  • Develop and manage real-time and batch data pipelines that extract, enrich, and transform data from upstream sources into systems like Snowflake.
  • Build resilient integrations with Kafka, Azure Service Bus, or JMS, including retry handling, dead-letter queues, and throttling strategies (an illustrative sketch of this pattern follows this list).
  • Leverage data spine architecture for metadata exchange, data standardization, and integration logic across systems.
  • Integrate RESTful services (e.g., Spring Boot APIs) to facilitate ingestion and distribution of data across the platform.
  • Build and optimize workflows for data ingestion, event processing, and API interaction.
  • Implement crosswalk and data enrichment logic within data pipelines using technologies like PySpark or Java Streams.
  • Collaborate with architects and DevOps teams to ensure CI/CD readiness, monitoring, and alerting of data flows.
  • Install, configure, and maintain middleware technologies (experience with any of these: WebSphere, WebLogic, Tomcat, JBoss, Kafka, RabbitMQ, or similar).
  • Ensure high availability, scalability and reliability of middleware systems.
  • Design and implement solutions for system and application integration.
  • Optimize middleware performance and recommend improvements.
  • Design and develop middleware components.
  • Design and implement APIs necessary for integration and/or data consumption.
  • Work independently and collaboratively on a multidisciplinary project team in an Agile development environment.
  • Be actively involved in design, development, and testing activities for big data products.
  • Provide feedback to development teams on code/architecture optimization.
  • Design and implement secure data processing pipelines, including concepts like data spines, for handling sensitive information.
  • Architect and differentiate between event-driven and batch-based data pipelines, making informed decisions on their application.
  • Design and implement robust security measures for middleware systems processing PII or customer-sensitive data.
  • Design and develop middleware systems to process and enrich messages from multiple upstream sources, integrating with data warehouses like Snowflake.
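
For illustration only: a minimal sketch of the resilient Kafka consumption pattern referenced above (retries, dead-letter queues), written with Spring for Apache Kafka. The topic name, consumer group, and payload handling are assumptions made for the example, not requirements taken from this posting.

// Illustrative sketch only (not part of the posting): a Spring Boot Kafka listener
// with retry and dead-letter handling, the kind of resilient integration described
// above. Topic name, group ID, and payload handling are hypothetical assumptions.
import org.springframework.kafka.annotation.DltHandler;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.annotation.RetryableTopic;
import org.springframework.retry.annotation.Backoff;
import org.springframework.stereotype.Component;

@Component
public class UpstreamEventListener {

    // Retry failed records with exponential backoff before routing them to an
    // automatically created dead-letter topic.
    @RetryableTopic(attempts = "4", backoff = @Backoff(delay = 1000, multiplier = 2.0))
    @KafkaListener(topics = "upstream.events", groupId = "middleware-enrichment")
    public void onMessage(String payload) {
        // Enrich/transform the record here and hand it to the downstream sink
        // (for example, a Snowflake staging layer).
    }

    // Records that exhaust all retries land here for alerting and manual review.
    @DltHandler
    public void onDeadLetter(String payload) {
        // Log or persist the failed record so it is not silently dropped.
    }
}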

Required Skills and Qualifications:

  • Hands-on experience developing in Java and Python.
  • Hands-on experience with Spring Boot, Spring Boot OAuth, Spring Security, Spring Data JPA, and Spring Batch.
  • Familiarity with Azure services.
  • Proven expertise in Kafka, JMS, or Azure Service Bus, including designing fault-tolerant, scalable message-driven applications.
  • Experience with data enrichment and transformation processes, preferably using PySpark or Java Streams.
  • Experience integrating with Snowflake, Redshift, BigQuery, or similar data platforms.
  • Deep understanding of event-driven architectures and batch-based workflows, including tradeoffs and ideal use cases.
  • Experience working with data enrichment, schema alignment, and crosswalk logic in enterprise-scale pipelines.
  • Proven experience with CI/CD pipelines and tools such as Jenkins, Ansible, Docker, and Kubernetes.
  • In-depth understanding of event-driven and batch-based data pipeline architectures.
  • Experience with application servers like IBM WebSphere, Oracle WebLogic Server, Apache Tomcat, JBoss/WildFly.
  • Understanding of relational databases such as Oracle, SQL Server, MySQL, PostgreSQL, or similar.
  • Experience using software project tracking tools such as Jira.
  • Proven experience with version control (GitHub, Bitbucket).
  • Familiarity with Linux operating systems and concepts.
  • Strong knowledge of data security best practices, especially concerning PII and sensitive data.
  • Strong written and verbal communication skills.
  • Self-motivated, with the ability to work well in a team.

Education
Bachelor of Science degree from an accredited university

Fusemachines is an Equal Opportunities Employer, committed to diversity and inclusion. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristic protected by applicable federal, state, or local laws.

Required profile

Experience

Spoken language(s):
English

Other Skills

  • Self-Motivation
  • Teamwork
  • Communication
