Roles & Responsibilities: You will be expected to perform the following tasks and responsibilities in a manner consistent with CBA's Values and People Capabilities.
- Design and develop highly reliable and scalable data pipelines and data platforms with comprehensive test coverage.
- Collaborate with stakeholders to analyse requirements and translate them into technical implementations.
- Identify and drive opportunities for continuous improvement within the team and in delivery of products.
- Provide mentoring and technical assistance to other members of the team, including more junior Data Engineers.
- Produce high-quality, sustainable solutions that meet business requirements, leveraging approved delivery frameworks and applying industry best practice.
- Provide technical governance of product delivery to ensure successful delivery and adoption.
- Deliver data engineering solutions aligned to core concepts of data design, preparation, transformation, and load.
- Build and implement data pipelines in distributed data platforms, including warehouses, databases, data lakes and cloud lakehouses, to enable predictive models, reporting and visualisation via data integration tools and frameworks.
- Contribute to domain planning, providing guidance to ensure that technical deliveries are aligned to engineering direction and strategy.
- Risk: Operate within the CBA Group risk appetite and effectively manage strategic and operational risk related to data.
- Adhere to the Code of Conduct. The Code of Conduct sets the standards of behaviour, actions and decisions we expect from our people.
Essential Skills:
- 5-7 years’ experience in design, architecture or development in Analytics and Data Warehousing.
- Experience leading data capability building projects and driving decisions across groups of stakeholders.
- Extensive experience with Teradata data warehouses and Cloudera Hadoop. Proficient across Enterprise Analytics/BI/DW/ETL technologies such as Teradata Control Framework, Tableau, OBIEE, SAS, Apache Spark, Hive.
- Experience in development projects using Ab Initio, Snowflake, and AWS/cloud services.
- Nice to have: data analysis knowledge and banking domain experience.
- Analytics & BI Architecture appreciation and broad experience across all technology disciplines.
- Experience in working within a Data Delivery Life Cycle framework & Agile Methodology.
- Extensive experience in large enterprise environments handling large volumes of data with high SLAs and creating reusable group data assets.
Education Qualification: Bachelor's or Master's degree in Engineering, Computer Science, or Information Technology.