
Data Engineer

Barclays

6 months ago

5 - 7 years

Work From Office

Chennai, Tamil Nadu, India

  • Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
  • Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
  • Development of processing and analysis algorithms fit for the intended data complexity and volumes.
  • PySpark

    SparkSQL

    AWS Cloud

    Snowflake

    SQL

    PLSQL

    Apache Airflow

    ETL Tools

    Data Warehousing

    Job description & requirements

    To be successful as a Data Engineer, you should have experience with:


    1. Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs, and SparkSQL.
    2. Hands-on experience developing, testing, and maintaining applications on the AWS Cloud.
    3. Strong command of the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena).
    4. Ability to design and implement scalable and efficient data transformation/storage solutions using Snowflake.
    5. Experience ingesting data into Snowflake from different storage formats such as Parquet, Iceberg, JSON, and CSV.
    6. Experience using dbt (Data Build Tool) with Snowflake for ELT pipeline development.
    7. Experience writing advanced SQL and PL/SQL programs.
    8. Hands-on experience building reusable components using Snowflake and AWS tools and technologies.
    9. Should have worked on at least two major project implementations.
    10. Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage.
    11. Experience using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage.
    12. Knowledge of the Ab Initio ETL tool is a plus.
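    As a toy illustration of the kind of advanced SQL analytics called for above (window functions for running totals and period-over-period change), the sketch below runs such a query over a small in-memory table. It uses Python's built-in sqlite3 as a stand-in engine purely for illustration; the table name, columns, and data are invented, and on the job the same query pattern would run in Snowflake or SparkSQL instead.

    ```python
    import sqlite3

    # Invented sample data: daily trade volumes per desk (illustrative only).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE trades (desk TEXT, trade_date TEXT, volume INTEGER)")
    conn.executemany(
        "INSERT INTO trades VALUES (?, ?, ?)",
        [
            ("rates", "2024-01-01", 100),
            ("rates", "2024-01-02", 150),
            ("rates", "2024-01-03", 120),
            ("fx",    "2024-01-01", 200),
            ("fx",    "2024-01-02", 180),
        ],
    )

    # Window functions: running total and day-over-day change within each desk.
    rows = conn.execute(
        """
        SELECT desk,
               trade_date,
               volume,
               SUM(volume) OVER (PARTITION BY desk ORDER BY trade_date) AS running_total,
               volume - LAG(volume) OVER (PARTITION BY desk ORDER BY trade_date) AS day_change
        FROM trades
        ORDER BY desk, trade_date
        """
    ).fetchall()

    for r in rows:
        print(r)
    ```

    The PARTITION BY / ORDER BY window pattern carries over unchanged to Snowflake and SparkSQL; only the connection setup differs.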


    Some other highly valued skills may include:


    1. Ability to engage with stakeholders, elicit requirements/user stories, and translate them into ETL components.
    2. Ability to understand the infrastructure setup and provide solutions, either individually or working with teams.
    3. Good knowledge of data marts and data warehousing concepts.
    4. Good analytical and interpersonal skills.
    5. Experience implementing a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build a data movement strategy.


    You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology skills, as well as job-specific technical skills.


    The role is based out of Chennai.


    Purpose of the role

    To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes to ensure that all data is accurate, accessible, and secure.


    Accountabilities

    1. Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
    2. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
    3. Develop processing and analysis algorithms fit for the intended data complexity and volumes.
    4. Collaborate with data scientists to build and deploy machine learning models.


    Experience :

    5 - 7 years

    Job Domain/Function :

    Data Engineering

    Job Type :

    Work From Office

    Employment Type :

    Full Time

    Number Of Position(s) :

    1

    Educational Qualifications :

    Bachelor's Degree

    Location :

    Chennai, Tamil Nadu, India
