Accountabilities
- Lead the design, development, and enhancement of scalable ETL pipelines and Data Products as part of a Data Mesh-inspired strategy.
- Demonstrate your expertise in ETL/ELT solutions (Python, PySpark, AWS Glue) and the AWS ecosystem to deliver exceptional solutions.
- Collaborate with global and diverse Agile teams to overcome technical data challenges.
- Integrate the latest industry trends and innovations, such as GenAI, into your work.
Essential Skills/Experience
- 7 to 9 years of experience in data engineering.
- A proactive mindset and enthusiasm for Agile environments.
- Strong hands-on experience with cloud platforms and services, particularly AWS.
- Experience in performance tuning SQL and ETL pipelines.
- Extensive experience in troubleshooting data issues, analyzing end-to-end data pipelines, and working with users to resolve issues.
- Strong debugging and testing skills to ensure quality in execution.
- Clear communication skills that strengthen team collaboration.
- Experience handling structured, semi-structured (XML, JSON), and unstructured data including extraction and ingestion via web-scraping and FTP/SFTP.
- Production experience delivering CI/CD pipelines (GitHub, Jenkins, GitHub Actions).
- Strong cloud DevOps skills: able to develop, test, and maintain CI/CD pipelines using Terraform and CloudFormation.
- Stay up to date with the latest technologies, such as GenAI/AI platforms and FAIR scoring, to improve outcomes.