
Senior Data Engineer

Optum

6 months ago

10+ years

Work From Office

Bengaluru, Karnataka, India

    Skills: ETL, Data Warehouse, Snowflake, SQL, Hive, Airflow, PowerShell, Cloud Platforms, CI/CD pipelines, Software Development Life Cycle (SDLC), Big Data Platforms

    Job description & requirements

    Primary Responsibilities:


    1. Understand the Caredata architecture/domain and start contributing to new and existing business requests
    2. Apply experience in Data Integration, Data Warehousing, and Cloud
    3. Develop efficient and high-performing ETL solutions
    4. Design, develop, test, and performance-tune complex ETLs
    5. Develop and maintain scalable data pipelines in response to customer requirements
    6. Perform the data analysis required to troubleshoot data-related issues and assist in their resolution
    7. Design, develop, implement, and run data solutions that improve data efficiency, reliability, and quality
    8. Participate in design review sessions and peer ETL reviews
    9. Assess and profile source and target data (data structure, quality, completeness, schema, nulls, etc.) and requested business use cases
    10. Summarize testing and validation results, and communicate and make recommendations/decisions on the best course of action to remediate
    11. Devise solutions resourcefully using existing or available resources, based on knowledge of the organization and the level of execution effort
    12. Participate in an agile work environment, attend daily scrums, and complete sprint deliverables on time
    13. Support practices, policies, and operating procedures, and ensure alignment to departmental objectives and strategy
    14. Ensure the code meets the desired quality checks using Sonar
    15. Make sure all cloud infrastructure is intact and resolve any issues encountered
    16. Schedule the deployed pipelines using Airflow, following the proper dependency hierarchy of jobs
    17. Move the code/application from non-prod to higher environments (stage/prod) for go-live
    18. Build and maintain pipelines and automation through GitOps (GitHub Actions, Jenkins, etc.)
    19. Identify solutions to non-standard, complex requests and problems, and create solutions using available technologies
    20. Build solid relationships with IT operational leaders to ensure connectivity to the business
    21. Support a work environment in which people can perform to the best of their abilities; hold yourself accountable for technical abilities, productive results, and leadership attributes
    22. Work with less structured, more complex issues
    23. Serve as a resource to others
    24. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so


    Required Qualifications:

    1. Bachelor's or Master's degree in Computer Science or Information Technology, or equivalent
    2. 10+ years of experience designing and developing ETL solutions
    3. 10+ years of working knowledge in a Data Warehouse/BI environment
    4. Solid DBMS experience: Snowflake, SQL, Hive
    5. Good experience with job schedulers such as Airflow
    6. Windows batch/PowerShell scripting and/or UNIX shell scripting, plus Python experience
    7. Good experience working in a cloud environment, preferably Azure
    8. Experience with CI/CD pipelines using Jenkins and GitHub Actions
    9. Experience with contemporary SDLC methodologies such as Agile and Scrum
    10. Expertise in Big Data frameworks like Spark and a good understanding of Hadoop concepts
    11. Hands-on experience with Rally
    12. Solid ETL skills using Big Data tools (Databricks, Spark, Scala, Python), Kafka, and Azure Cloud/AWS
    13. Solid SQL skills, including complex SQL constructs and DDL generation
    14. Proven organizational, analytical, writing, problem-solving, and interpersonal skills


    Experience:

    10+ years

    Job Domain/Function:

    Data Engineering

    Job Type:

    Work From Office

    Employment Type:

    Full Time

    Number of Positions:

    1

    Educational Qualifications:

    Bachelor's Degree

    Location:

    Bengaluru, Karnataka, India
