Your role:
- Design, create, test and maintain data pipelines and storage (inbound, in-flight, etc.); create the end-to-end vision for how data flows through the organization
- Maintain the infrastructure required for extracting, transforming and loading (ETL/ELT) data from a wide variety of sources
- Own database design, data warehousing, data lake setup, and the design and implementation of data protection
- Address non-functional infrastructure requirements with a strong focus on end users: quality (QMS/regulatory standards), performance, safety, security, privacy, scalability, reliability, durability, availability and backups
- Establish and follow quality measures for data sets
- Keep technical knowledge current by studying and applying state-of-the-art programming techniques and development tools, pursuing educational opportunities, participating in communities of practice, reading professional publications and maintaining personal networks
Key Responsibilities:
- Develop scalable data pipelines using AWS Glue, Lambda and other AWS services
- Define workflow for data ingestion, cleansing, transformation, and storage using S3, Athena, Glue and other AWS services
- Ensure data security, compliance, and governance
- Implement robust monitoring and alerting mechanisms using CloudWatch and custom metrics for pipeline health and data quality
- Contribute to cloud optimization, cost control, and architectural design reviews
You're the right fit if:
- Strong hands-on experience in AWS data engineering: S3, Lambda, Athena, API Gateway, CloudFront, ECS and Glue
- Solid understanding of data modeling, partitioning strategies, and performance tuning in large datasets
- Familiarity with tools like CloudFormation for infrastructure automation
- Proficiency in Python and SQL, with strong data warehousing skills
- Awareness of the latest data lake architectures (Apache Iceberg, S3 Tables, DuckDB)
- Knowledge of serviceability domain use cases such as diagnostics, telemetry, and predictive service operations is a plus
- Strong communication and stakeholder management skills
Minimum required Education:
- Bachelor's / Master's Degree in Computer Science, Software Engineering, Information Technology or equivalent.
Minimum required Experience:
- 6-8 years of experience with a Bachelor's degree, in areas such as software development, testing and quality assurance, or equivalent