Roles & Responsibilities:
You will be expected to perform the following tasks and responsibilities in a manner consistent with CBA's Code of Conduct.
Core Responsibilities:
- Understand our customers' core business objectives and build end-to-end data centric solutions to address them.
- Lead or participate in the definition, execution, and communication of ETL strategy and technical solutions.
- Develop data staging layers and architecture, and consume internal and external application interfaces.
- Cooperate with business and operational users, other data management staff and infrastructure teams to ensure adherence to client and company guidelines, restrictions and requirements as well as industry standards.
- Participate in the full lifecycle of agile projects (ideation through implementation)
- Design technical solutions and consumption patterns for business requirements of increasing complexity.
- Adhere to the Code of Conduct. The Code of Conduct sets the standards of behaviour, actions and decisions we expect from our people.
- Produce high quality, sustainable solutions to meet business requirements, leveraging approved delivery frameworks
- Influence and guide stakeholders to ensure that solutions are robust, secure and highly available
- Uplift the quality of technical solutions by applying industry best practice
- Role model continuous improvement mindset in team and project interactions
- Contribute to delivery planning, providing guidance to ensure that technical deliveries are aligned with engineering direction and strategy.
Essential Skills:
- 7-10 years’ experience in design, architecture or development in Analytics and Data Warehousing.
- 5 years of solid experience building ETL pipelines with the Spark or Scala programming frameworks, with knowledge of UNIX shell scripting and Oracle SQL/PL-SQL development.
- Experience with big data platforms for ETL development on the AWS cloud.
- Proficiency in AWS cloud services, specifically EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, Secrets Manager, Step Functions, SQS, SNS, and CloudWatch.
- Excellent skills in Python-based framework development are mandatory.
- Experience with Oracle SQL database programming, SQL performance tuning, and relational model analysis.
- Experience with Ab Initio ETL tools such as GDE, Express>It, Control Centre, and metaprogramming is a plus.
- Extensive experience with Teradata data warehouses and Cloudera Hadoop; proficient across enterprise Analytics/BI/DW/ETL technologies such as Teradata Control Framework, Tableau, OBIEE, SAS, Apache Spark, and Hive.
- Experience working within a Data Delivery Life Cycle framework and Agile methodology.
- Good knowledge of UNIX scripting, Oracle SQL/PL-SQL, and AutoSys JIL scripts.
- Experience in solution design is desirable.
- Self-starter capable of working autonomously.
- Strong analytical, interpersonal, and communication skills.
- Problem solving – advanced conceptual, analytical, and problem-solving skills to analyse complex information for key insights and present them as meaningful information to senior management.
Education/Qualifications:
- Bachelor's degree in Engineering (Computer Science/Information Technology).