FinOps Data Engineer

Remote: Full Remote

Offer summary

Qualifications:

  • 2-5 years of experience in data engineering and building ELT data pipelines
  • Strong understanding of cloud platforms (AWS, GCP, Azure) and their billing/cost management
  • Proficiency in Python for data engineering and automation
  • Experience with advanced data visualization tools and cost management platforms like Cloudability

Key responsibilities:

  • Build and maintain data pipelines to analyze cloud and SaaS tool consumption.
  • Create dashboards and reports using visualization tools to provide actionable insights.
  • Collaborate with cross-functional teams to identify and implement cost-saving opportunities.
  • Advocate for FinOps principles and manage access to Cloud and SaaS tools.

Precisely (https://www.precisely.com/)
1001 - 5000 Employees

Job description

Precisely is the leader in data integrity. We empower businesses to make more confident decisions based on trusted data through a unique combination of software, data enrichment products, and strategic services. What does this mean to you? For starters, it means joining a company focused on delivering outstanding innovation and support that helps customers increase revenue, lower costs, and reduce risk. In fact, Precisely powers better decisions for more than 12,000 global organizations, including 93 of the Fortune 100. Precisely's 2,500 employees are unified by four company core values that are central to who we are and how we operate: Openness, Determination, Individuality, and Collaboration. We are committed to career development for our employees and offer opportunities for growth, learning, and building community. With a "work from anywhere" culture, we celebrate diversity in a distributed environment with a presence in 30 countries and 20 offices across 5 continents. Learn more about why it's an exciting time to join Precisely!

Overview: The FinOps Data Engineer provides governance and reporting capabilities to the business for our Cloud and 3rd Party SaaS tool consumption. The role builds and maintains data pipelines and derives new data sets from programmatically obtained data. From these pipelines, the engineer builds dashboards, analyzes and audits data and trends, and reports to the business on our Cloud and 3rd Party tool consumption, spend, and risks. The role analyzes consumption and spend data to identify trends and anomalies in support of maximizing operational efficiency, identifying cost-saving opportunities, and enforcing best practices. This position also assists with administering and managing access to Cloud and 3rd Party SaaS tools.

What you will do:

  • Leverage tools like Cloudability to gather, process, and analyze cloud cost data
  • Create dashboards and reports using visualization tools (e.g. Tableau) to provide actionable insights
  • Build models for budgeting, forecasting, and anomaly detection in cloud spend
  • Understand drivers of cloud costs and identify solutions or approaches for cost optimization implementation
  • Collaborate with DevOps, Cloud Engineering, and Finance teams to identify and implement cost-saving opportunities
  • Advocate for FinOps principles across teams, promoting cost visibility and accountability
  • Design, build and maintain scalable ELT pipelines to process and analyze large volumes of cloud cost and usage data
  • Programmatically integrate with cloud providers (AWS, Azure, GCP) and SaaS tool providers (e.g. Datadog, MongoDB) to collect detailed billing and usage information (see the sketch after this list)
  • Utilize Snowflake to create and manage data warehouses for efficient querying and reporting
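
For illustration, here is a minimal sketch of the kind of pipeline these responsibilities describe, assuming AWS Cost Explorer as the billing source; the Snowflake table name (CLOUD_COSTS) and the connection parameters are hypothetical placeholders, not anything specified by this role. It pulls 30 days of per-service spend with boto3, applies a naive z-score anomaly flag, and loads the results for dashboarding:

    import statistics
    from datetime import date, timedelta

    import boto3
    import snowflake.connector  # pip install snowflake-connector-python

    # Pull the last 30 days of per-service spend from AWS Cost Explorer.
    # Uses default AWS credentials/region from the environment; pagination
    # via NextPageToken is omitted for brevity.
    ce = boto3.client("ce")
    end = date.today()
    start = end - timedelta(days=30)
    resp = ce.get_cost_and_usage(
        TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
        Granularity="DAILY",
        Metrics=["UnblendedCost"],
        GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
    )

    # Flatten the response into (day, service, cost_usd) rows.
    rows = []
    for day in resp["ResultsByTime"]:
        for group in day["Groups"]:
            rows.append((
                day["TimePeriod"]["Start"],
                group["Keys"][0],
                float(group["Metrics"]["UnblendedCost"]["Amount"]),
            ))

    # Naive anomaly flag: cost more than 3 standard deviations above that
    # service's 30-day mean. A production model would be more robust.
    by_service = {}
    for _, service, usd in rows:
        by_service.setdefault(service, []).append(usd)
    flagged = []
    for day_str, service, usd in rows:
        mean = statistics.mean(by_service[service])
        stdev = statistics.pstdev(by_service[service])
        flagged.append((day_str, service, usd, stdev > 0 and usd > mean + 3 * stdev))

    # Load into a hypothetical Snowflake table for reporting and dashboards.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="...",  # placeholders
        warehouse="FINOPS_WH", database="FINOPS", schema="PUBLIC",
    )
    conn.cursor().executemany(
        "INSERT INTO CLOUD_COSTS (usage_day, service, cost_usd, is_anomaly) "
        "VALUES (%s, %s, %s, %s)",
        flagged,
    )
    conn.close()

In practice, the ELT approach mentioned above would more likely land raw billing exports in staging tables and run the transformations in-warehouse, but the same collect-flag-load shape applies.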

What we are looking for:

  • 2-5 years of experience with data engineering, building ELT data pipelines, creating data visualizations, and analyzing data
  • Experience in data engineering, cloud cost management, or FinOps related roles
  • Experience building with APIs and integrating them into data-driven workflows
  • Excellent problem solving and analytical skills
  • Strong communication and collaboration abilities across non-co-located, cross-functional teams
  • Proactive approach to identifying and implementing improvements
  • Experience with advanced data visualization tools
  • Strong understanding of cloud platforms (AWS, GCP, Azure) and their billing/cost management
  • Knowledge of FinOps principles, including cost allocation, budgeting, and optimization strategies
  • Proven ability to process and analyze large-scale cloud usage data
  • Programming: Strong experience with Python for data engineering and automation
  • Database Management: Hands-on experience with Snowflake
  • ETL and Data Pipelines: Proven expertise in designing and managing ETL workflows and event-driven processes
  • Cloud Cost Tools: Experience with cost management platforms like Cloudability, Cost Explorer, or similar tools

#LI-ZB1

The personal data that you provide as a part of this job application will be handled in accordance with relevant laws. For more information about how Precisely handles the personal data of job applicants, please see the Precisely Global Applicant and Candidate Privacy Notice.

Required profile

Experience

Spoken language(s): English

Other Skills

  • Collaboration
  • Analytical Skills
  • Problem Solving
  • Communication
