FinOps Cloud Data Engineer

Optum

9 months ago

3 - 5 years

Work From Office

Noida, Uttar Pradesh, India

  • SQL

    Excel

    Data visualization tools

    Power BI

    Cloud platforms

    Python

    Jupyter Notebook

    Data pipelines

    Azure Data Factory

    Databricks

    Data modeling

    dbt

    Job description & requirements

    Responsibilities:


    • Develop and maintain data pipelines using tools like Azure Data Factory and Databricks to ingest, process, and transform cloud cost and usage data.
    • Design, develop, and maintain data models and dbt workflows to support reporting and analysis needs.
    • Conduct in-depth data analysis using SQL, Python (in Jupyter notebooks), and Excel to identify cost optimization opportunities, trends, and anomalies.
    • Develop and automate reporting and dashboards using tools like Power BI to provide insights to stakeholders.
    • Leverage AI and bot technologies to automate data analysis tasks and improve efficiency.
    • Collaborate with data scientists and engineers to implement machine learning models for forecasting and anomaly detection.
    • Support data governance and ensure data quality across the cloud cost management platform.
    • Contribute to the development and maintenance of data documentation and training materials.
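    To illustrate the kind of SQL/Python analysis described above, a minimal anomaly check on daily cloud spend might look like the following sketch. The data, column meaning, and z-score threshold are illustrative assumptions, not part of this role's actual tooling:

    ```python
    import statistics

    # Daily cloud spend in USD (illustrative data; a real pipeline would
    # ingest this from cloud cost-and-usage reports).
    daily_cost = [102.0, 98.5, 101.2, 99.8, 100.4, 310.0, 97.9]

    def flag_cost_anomalies(costs, z_threshold=2.0):
        """Return indices of days whose spend deviates from the mean
        by more than z_threshold population standard deviations."""
        mean = statistics.mean(costs)
        stdev = statistics.pstdev(costs)
        if stdev == 0:
            return []
        return [i for i, c in enumerate(costs)
                if abs(c - mean) / stdev > z_threshold]

    anomalies = flag_cost_anomalies(daily_cost)
    # The day-6 spike (310.0) is flagged; ordinary days are not.
    ```

    In practice a threshold-based check like this would run downstream of the ingestion pipeline, with flagged days surfaced in reporting dashboards.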


    Skills Required:

    • Bachelor’s degree in Computer Science, Data Science, or a related field.
    • 3-5 years of experience in data engineering or data analysis.
    • Strong proficiency in SQL and Excel.
    • Experience with reporting and visualization tools like Power BI.
    • Excellent communication, problem-solving, and analytical skills.
    • Familiarity with major cloud service providers (GCP, AWS, Azure).
    • Hands-on experience with a high-level programming language (e.g., Python).
    • Experience working with Jupyter Notebooks or similar tools.
    • Experience writing code with the assistance of AI coding tools.
    • Experience with cloud cost management tools.
    • Experience building and maintaining data pipelines using Azure Data Factory and Databricks (or similar tools on other cloud platforms).
    • Experience with data modeling and dbt.


    Experience :

    3 - 5 years

    Job Domain/Function :

    Data Engineering

    Job Type :

    Work From Office

    Employment Type :

    Full Time

    Number Of Position(s) :

    1

    Educational Qualifications :

    Bachelor's Degree

    Location :

    Noida, Uttar Pradesh, India
