Roles & Responsibilities:
- Work with the team to understand our customers' core business objectives and deliver quality data-centric solutions within committed timeframes
- Contribute to thought leadership, enabling analytics teams to deliver world-class data-centric solutions and analytics by championing sustainable and reusable data assets
- Design and build group data assets by integrating diverse data from internal and external sources
- Help to promote best-in-class coding standards and practices to ensure high quality and minimal risk
- Collaborate and communicate with business and delivery stakeholders, and work without supervision
- Identify and escalate Technical Debt through simple processes to support effective planning and risk management.
- Risk Mindset – All Bankwest employees are expected to proactively identify and understand, openly discuss and act on current and future risks.
- Able to build solutions that are fit for purpose, perform well with large data volumes and complex data transformation rules, and are reliable to operate
- Understanding of the development and release cycle, following Change Management processes
- Creative problem solver, with open thinking to generate and support new or better ways of doing things
- Strong capability and experience with modern engineering practices and techniques
- Experience with Agile working practices is beneficial
- Prior experience working in the Financial Services industry would be highly regarded, but is not essential.
- Working experience with AWS (Amazon Web Services) Cloud, including some of the core services (EMR, EC2, S3, Auto Scaling, etc.), is desirable.
- Experience with large migration programs and with automation to scale up migrations from on-premises to cloud
- Hands-on design and development of exceptionally reliable and scalable data pipelines and data platforms with comprehensive test coverage
- Deliver data engineering solutions aligned to the core concepts of data design, preparation, transformation, and load
- Build and implement data pipelines in distributed data platforms, including warehouses, databases, data lakes, and cloud lakehouses, to enable predictive models, reporting, and visualization analysis via data integration tools and frameworks
Essential Skills:
- 8+ years of experience with good working knowledge of the following:
- ETL processes and practices
- Experience working with the following Tools:
- Source Control systems such as Git
- Build & Deployment tools such as Jenkins, TeamCity, Octopus Deploy
- Databases – SQL, PL/SQL, Hadoop
- Strong experience in SAS / R / Python
- Monitoring tools such as Splunk, AppDynamics
- AWS Architecture
- AWS Data Engineer certification required
- Secure Coding practices
- CI/CD DevOps
Education Qualifications:
- Bachelor’s degree in Engineering or Information Technology.