
Software Development Engineer 3

Remote: Full Remote
Experience: Mid-level (2-5 years)

WEX — http://www.wexinc.com
5001 - 10000 Employees

Job description

Company Overview

WEX is an innovative global commerce platform and payments technology company forging the way in a rapidly changing environment. Our goal is to simplify the business of doing business for customers, freeing them to spend more time, with less worry, on the things they love and care about. We are building a consistent, world-class user experience across our products and services, and leveraging customer-focused innovation across all our strategic initiatives, including big data, AI, and risk.

Team Overview

We are seeking a Level 3 Software Engineer to join our dynamic team. In this role, you will collaborate across disciplines—combining software development, data engineering, and operations—to deliver comprehensive solutions. This approach ensures involvement in all stages of the product life cycle, fostering a deeper understanding of our systems and enhancing innovation. You will lead the design, development, and optimization of complex data products and platforms, delivering high-quality and scalable solutions. Additionally, you will play a pivotal role in mentoring team members, driving technical discussions, and contributing to strategic initiatives that shape the future of our organization. Your ability to work seamlessly across different engineering domains will be crucial in delivering robust and efficient solutions that align with our business objectives.

Responsibilities:

  • Collaborate with stakeholders to understand customer challenges and business requirements, translating them into effective technical solutions that align with organizational goals.

  • Design, develop, test, and optimize data products, systems, and platforms, focusing on small to medium complexity tasks. Ensure the solutions are high-quality, reliable, and scalable to meet the needs of the business.

  • Build and maintain scalable data pipelines and ETL processes to handle large volumes of data efficiently, ensuring data integrity, performance, and reliability throughout the entire data flow.

  • Develop and manage CI/CD pipelines using tools like GitHub Actions to streamline the integration and deployment process. Implement Infrastructure as Code (IaC) using Terraform to ensure efficient and automated infrastructure management.

  • Implement software development best practices, including Test-Driven Development (TDD) and Behavior-Driven Development (BDD), while leveraging Microservices and Vertical Slice Architectures for modular, maintainable codebases.

  • Support live data products and platforms by promoting proactive monitoring, rapid incident response, and continuous improvement processes to minimize downtime and enhance system performance.

  • Analyze and optimize existing systems and processes, identifying bottlenecks and opportunities for improvement. Address performance issues in data pipelines, storage systems, and data processing flows.

  • Mentor peers and foster continuous learning within the team by providing guidance, constructive feedback, and technical expertise. Engage in code reviews, share best practices, and encourage collaboration to improve team performance.

  • Engage in continuous learning of new technologies, frameworks, and tools, applying this knowledge to enhance workflows, system performance, and overall team productivity. Stay current with industry trends and best practices to drive innovation within the team.

  • Ensure adherence to team processes and best practices, independently completing tasks of small to medium complexity, and proactively seeking feedback from senior engineers to ensure high-quality results.

  • Lead and participate in technical discussions, ensuring clarity in objectives and solutions. Collaborate with peers to complete tasks and projects efficiently, supporting team goals and ensuring alignment with broader business objectives.

Required Qualifications:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field, or equivalent practical experience demonstrating technical capabilities and deep understanding.

  • 2+ years of experience in software engineering with a focus on data engineering, designing and implementing data pipelines and data systems for efficient data processing and storage.

  • Proficiency in programming languages such as Java, C#, Go, or Python, with strong skills in coding, automated testing, debugging, and performance monitoring of data-driven applications.

  • Experience with building scalable data pipelines and data extraction from diverse sources (APIs, flat files, and NoSQL databases), and implementing ETL/ELT processes to ensure data is transformed and loaded accurately and efficiently.

  • Strong understanding of data modeling techniques, including dimensional modeling and schema design for relational databases, with experience optimizing SQL queries for performance and scalability.

  • Hands-on experience with big data technologies like Apache Spark or cloud-based data processing platforms such as AWS Glue, Azure Data Factory, or Google Dataflow for handling large-scale data processing and analytics workloads.

  • Proficiency in developing and maintaining CI/CD pipelines using tools such as GitHub Actions, Jenkins, or similar, ensuring seamless integration, testing, and deployment of data systems.

  • Experience implementing Infrastructure as Code (IaC) using tools like Terraform or CloudFormation to automate the provisioning and management of infrastructure in cloud environments.

  • Experience optimizing the performance of data pipelines and queries, identifying bottlenecks, improving data throughput, and fine-tuning storage and compute resources for efficient processing.

  • Familiarity with data governance and quality standards, including implementing data quality checks, data lineage, and data validation processes to ensure data integrity, consistency, and compliance.

  • Passionate about keeping up with modern technologies and design.

  • Strong willingness and capability to learn new technology and tools quickly when needed.

  • Passionate about understanding and solving customer/business problems.

Preferred Qualifications:

  • Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP), including familiarity with cloud services related to computing, storage, and data processing.

  • Experience with cloud-based data warehousing applications, such as Snowflake, Amazon Redshift, or similar technologies.

  • Experience in building data pipelines with cloud-native ingestion, orchestration, and transformation applications, leveraging tools and services such as Airflow, dbt, AWS Glue, Kafka, and AWS Kinesis.

  • Knowledge of AI and machine learning concepts, with experience in leveraging data-driven technologies and tools to improve system capabilities, automate processes, or enhance product features.

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English

Other Skills

  • Mentorship
  • Collaboration
  • Problem Solving
