Role purpose:
In this role, you will interface with key stakeholders and apply your technical proficiency across all stages of the Software Development Life Cycle, including requirements elicitation, application architecture definition, design and implementation, deployment, and live support. You will be expected to deliver ETL applications on GCP using services such as Cloud Data Fusion, BigQuery, Cloud Storage, Pub/Sub, Composer, and Cloud Functions.
You will also ensure high-quality code deliverables for shared or common modules and support developers in resolving technical issues. Since this is a strategic greenfield project, you will be expected to help lay the foundations: building frameworks, setting up best-practice guidelines, and helping organize the project as a whole.
Core competencies, knowledge and experience:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 3 to 5 years of experience in data engineering, with a strong focus on GCP.
- Proficiency in GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions, and Cloud Run.
- Strong programming skills in Python and PL/SQL.
- Experience with SQL and NoSQL databases.
- Knowledge of data warehousing concepts and best practices.
- Familiarity with data integration tools and frameworks.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work in a fast-paced, dynamic environment.