What you’ll do
Build and operationalize data processing systems as per the low-level design
Apply strong knowledge of Spark, Python, and SQL frameworks, with hands-on experience on Dataproc; Dataflow knowledge is good to have
Use understanding and experience of GCP Data Fusion, BigQuery, and Airflow for an optimized development approach; Docker and Terraform knowledge is required
Apply knowledge of cloud data pipeline patterns and the ability to think outside the box to find ways around product limitations
Design, test, and maintain data pipelines that meet business requirements and industry practices through the application of data modelling, data warehousing, and data manipulation techniques and standards
Design, develop, and maintain programmes (e.g. written in Python, Spark, Scala, or Java)
Who you are
8–10 years of overall experience
Minimum 5 years of relevant experience with cloud platforms (GCP/AWS/Azure)
Google Cloud Certified Professional Data Engineer
Must have technical/professional qualifications: B.E./B.Tech, BCA/MCA, or BSc/MSc (Computer Science)