Your key responsibilities
- Work as a Data Engineer within software development teams to deliver fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence.
- Partner with Service/Backend Engineers to integrate data from legacy IT solutions into the databases you design and make it accessible to the services that consume it.
- Focus on the design and setup of databases, data models, and data transformations (ETL) for critical online banking business processes in the context of Customer Intelligence, Financial Reporting, and performance controlling.
- Contribute to data harmonization as well as data cleansing.
- Bring a passion for continually learning and applying new technologies and programming languages in a constantly evolving environment.
- Build solutions that are highly scalable and operate flawlessly under high-load scenarios.
- Together with your team, you will run and develop your application self-sufficiently.
- You'll collaborate with Product Owners and team members on the design and implementation of data analytics solutions, and provide support during the conception of products and solutions.
- When you see a process that requires a lot of manual effort, you'll automate it, optimizing our operating model and freeing up more of your time for development.
Your skills and experience
Mandatory Skills
- Hands-on experience building scalable data pipelines and doing other data engineering/modelling work using Java/Python (a minimal pipeline sketch follows this list).
- Excellent knowledge of SQL and NoSQL databases.
- Experience working in a fast-paced and Agile work environment.
- Working knowledge of a public cloud environment.
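
For orientation, here is a minimal sketch of the kind of pipeline work this role involves. It is an illustration only, not part of our stack; the file, table, and column names are hypothetical assumptions, and sqlite3 stands in for any SQL database.

    # Minimal ETL sketch (Python): extract raw records, cleanse them, load into SQL.
    # All file, table, and column names here are hypothetical.
    import csv
    import sqlite3  # stand-in for any SQL database

    def extract(path):
        """Read raw transaction records from a CSV export."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        """Cleanse and harmonize: normalize currency codes, drop empty amounts."""
        return [(r["id"], r["currency"].upper(), float(r["amount"]))
                for r in rows if r.get("amount")]

    def load(rows, db_path):
        """Load into a reporting table (idempotent upsert on the primary key)."""
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS txn "
                    "(id TEXT PRIMARY KEY, currency TEXT, amount REAL)")
        con.executemany("INSERT OR REPLACE INTO txn VALUES (?, ?, ?)", rows)
        con.commit()
        con.close()

    if __name__ == "__main__":
        load(transform(extract("transactions.csv")), "reporting.db")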
Preferred Skills
- Experience with Dataflow (Apache Beam), Cloud Functions, and Cloud Run (see the Beam sketch after this list).
- Knowledge of workflow management tools such as Apache Airflow/Cloud Composer.
- Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub).
- Knowledge of GCS buckets, Google Pub/Sub, and BigQuery.
- Knowledge of ETL processes in Data Warehouse/Data Lake environments and how to automate them.
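
For illustration only, a minimal Dataflow (Apache Beam) sketch that reads JSON events from a GCS bucket and appends them to a BigQuery table; the project, bucket, table, and schema names are hypothetical assumptions, not our actual resources.

    # Hypothetical Beam pipeline: GCS JSON lines -> BigQuery (names are placeholders).
    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # On Dataflow you would also pass options such as --runner=DataflowRunner,
        # --project, --region, and --temp_location.
        with beam.Pipeline(options=PipelineOptions()) as p:
            (
                p
                | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
                | "Parse" >> beam.Map(json.loads)
                | "Write" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    schema="user_id:STRING,amount:FLOAT,ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )

    if __name__ == "__main__":
        run()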
Nice to have
- Knowledge of provisioning cloud resources using Terraform.
- Knowledge of Shell Scripting.
- Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
- Knowledge of Google Cloud Monitoring & Alerting.
- Knowledge of Cloud Run, Dataform, and Cloud Spanner.
- Knowledge of the Data Vault 2.0 Data Warehouse modelling approach (see the sketch after this list).
- Knowledge of New Relic.
- Excellent analytical and conceptual thinking.
- Excellent communication skills, strong independence and initiative, and the ability to work in agile delivery teams.
- Experience working with distributed teams (especially across Germany and India).
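
As a small illustration of Data Vault 2.0 modelling (hash-keyed Hubs with descriptive Satellites), here is a hypothetical sketch; the entity, table, and source names are assumptions, and sqlite3 again stands in for a real warehouse.

    # Data Vault 2.0 sketch: a Hub and a Satellite for a hypothetical "customer"
    # business key. All table, column, and source names are illustrative only.
    import hashlib
    import sqlite3
    from datetime import datetime, timezone

    DDL = """
    CREATE TABLE IF NOT EXISTS hub_customer (
        customer_hk   TEXT PRIMARY KEY,   -- hash of the business key
        customer_id   TEXT NOT NULL,      -- business key
        load_dts      TEXT NOT NULL,
        record_source TEXT NOT NULL
    );
    CREATE TABLE IF NOT EXISTS sat_customer_details (
        customer_hk   TEXT NOT NULL REFERENCES hub_customer(customer_hk),
        load_dts      TEXT NOT NULL,
        name          TEXT,
        city          TEXT,
        record_source TEXT NOT NULL,
        PRIMARY KEY (customer_hk, load_dts)
    );
    """

    def hash_key(business_key):
        """Deterministic hash key, as Data Vault 2.0 derives keys from business keys."""
        return hashlib.sha1(business_key.encode()).hexdigest()

    con = sqlite3.connect("vault.db")
    con.executescript(DDL)
    now = datetime.now(timezone.utc).isoformat()
    hk = hash_key("CUST-001")
    con.execute("INSERT OR IGNORE INTO hub_customer VALUES (?, ?, ?, ?)",
                (hk, "CUST-001", now, "crm_export"))
    con.execute("INSERT OR IGNORE INTO sat_customer_details VALUES (?, ?, ?, ?, ?)",
                (hk, now, "Jane Doe", "Frankfurt", "crm_export"))
    con.commit()
    con.close()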