Overview:
Join our dynamic ITility team and put your skills and passion to work! We are seeking a talented Data Engineer for our Data aNd Analytics (DNA) team who will architect, develop, and maintain a multi-system data pipeline within the AWS ecosystem. This position supports a new development activity under a client Chief Data Officer initiative to build an enterprise data exchange that brokers internal and external application data through the pipeline. Your skills with AWS tools and environments will be invaluable to this team. This is a remote position with an expectation of occasional client site visits.
You’ll be working on our prime contract supporting the USMEPCOM, a major command within the U.S. Department of Defense (DoD), responsible for screening and processing applicants into the U.S. Armed Forces. USMEPCOM operates 65 Military Entrance Processing Stations (MEPS) across the U.S., serving as the critical link between recruitment and training for the armed forces.
At ITility, we help our customers command the future by thinking beyond perceived limits to create new, unexpected ways to protect and defend our nation. We inspire and empower people to create significant solutions that secure what matters to our customers and communities, here and around the globe.
We Value:
- The Drive to Perform Beyond Perceived Limits.
- The Desire to Find Significance in All We Do.
- The Passion and Compassion That Powers Both.
Responsibilities:
- API Development & Management:
  - Design, build, and maintain scalable RESTful and event-driven APIs using AWS API Gateway.
  - Develop and optimize integrations between APIs and services such as AWS Lambda, RDS, S3, and Step Functions.
  - Implement API Gateway security features, including API keys, Secrets Manager, IAM roles, and resource policies.
  - Create detailed API documentation and manage versioning, stages, and deployments.
- Data Engineering:
  - Design, build, and maintain ETL/ELT pipelines that ingest and process data from multiple sources via APIs.
  - Work with structured and unstructured data in cloud storage and data services such as S3, Athena, RDS, and Redshift.
  - Create and manage data models, schemas, and database designs to support analytics and applications.
- Development Skills:
  - Write clean, efficient, and maintainable code in programming languages such as Python.
  - Develop serverless applications using AWS Lambda functions integrated with API Gateway.
  - Collaborate with Product Owners and application teams via agile scrums to identify source data targets.
  - Collaborate with data analysts and data scientists to ensure data readiness for analytics use cases.
  - Prepare code for integration into a CI/CD pipeline for automated deployment of APIs and data workflows.
  - Apply Infrastructure as Code (IaC) tools such as AWS CloudFormation and Terraform so the DevSecOps team can deploy applications.
Qualifications:
- Education: Bachelor’s degree in Computer Science, Data Engineering, or a related field.
- U.S. citizenship is required in order to be processed for a government background investigation.
- 8+ years of professional experience as a Data Engineer or similar role.
- AWS certifications (e.g., AWS Certified Developer, AWS Certified Data Engineer).
- Development Skills:
  - Proficiency in programming languages such as Python and SQL.
  - Experience with Python API development frameworks and libraries (e.g., Flask, FastAPI).
  - Version Control: Proficiency with Git and code collaboration tools like GitHub or Bitbucket.
  - Knowledge of OAuth, the OpenAPI Specification (Swagger), and API lifecycle management.
- Data Skills:
  - Strong SQL skills for querying and transforming data in RDS or similar databases.
  - Hands-on experience with data modeling, schema design, and query optimization.
- AWS Expertise:
  - Advanced knowledge of AWS API Gateway, Lambda, RDS, and S3.
  - Familiarity with AWS services such as Step Functions, Glue, and Redshift.
  - Experience with serverless architecture and microservices.
  - Familiarity with asynchronous data-processing platforms such as Apache Kafka or Amazon Kinesis.
  - Experience with big data frameworks such as PySpark or EMR.
Soft Skills:
- Strong problem-solving and debugging capabilities.
- Excellent collaboration and communication skills to work in cross-functional teams.
- Proactive and adaptable mindset with a focus on continuous learning.
Physical Requirements:
- Prolonged periods of sitting at a desk and working on a computer.
- Must be able to lift up to 15 pounds.
ITility is an Equal Opportunity Employer:
ITility is committed to providing a work environment that is non-discriminatory, harassment-free, fair, ethical, and inclusive.
ITility is committed to the principle of equal employment opportunity and complies with all applicable laws which prohibit discrimination and harassment in the workplace. ITility strictly prohibits discrimination or harassment based on race, color, religion, national origin, sex, age, disability or any other characteristic protected by law in all terms, conditions and privileges of employment, including without limitation, recruiting, hiring, assignment, compensation, promotion, discipline and termination. This policy covers conduct occurring at ITility’s offices, client sites, other locations where ITility is providing services, and to all work-related activities.