Responsibilities
- Work effectively with fellow software engineers, product owners, and other technical experts to deliver curated data products.
- Demonstrate technical knowledge and communication skills, with the ability to advocate for well-designed solutions.
- Design, develop, and deploy data pipelines using Dataflow, Dataproc, and other relevant GCP services.
- Build and maintain data warehouses using BigQuery.
- Implement data quality and validation processes.
- Develop and maintain data schemas and models.
- Collaborate with software engineers and other stakeholders to meet business requirements.
- Manage and monitor GCP infrastructure using tools like Cloud Monitoring and Cloud Logging.
- Implement security best practices for cloud-based applications.
- Mentor and guide junior developers.
Qualifications
- Bachelor’s degree in computer science or a related field.
- 8+ years of experience in software development, with at least 5 years on Google Cloud technologies.
- Expertise in data ingestion and storage, data transformation and processing, data analysis and visualization, and performance optimization and cost management.
- Deep understanding of data modeling, data integration, data warehousing, and managing large-scale data storage systems using services such as BigQuery, Cloud Run, Cloud Functions, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Data Studio, Dataproc, and Cloud Build.
- Strong experience with infrastructure-as-code (IaC) tools such as Terraform or Cloud Deployment Manager, and expertise in Tekton pipelines.
- Deep understanding of GCP services, including Compute Engine, Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, and Cloud Networking.
- Extensive experience with DevOps practices and tools.
- Expertise in data encryption, Identity and Access Management (IAM), and cloud security.