Roles & Responsibilities:
- Model Scoring & Scaling: Develop scalable model scoring functions that handle high-volume, low-latency AI requests, ensuring models perform effectively across batch and real-time use cases.
- Scalable Data Access Patterns: Design and implement robust data access patterns for both batch and real-time AI processes, enabling optimal performance for traditional machine learning and generative AI models.
- Cloud Integration: Deploy AI models on cloud platforms, ensuring they are scalable, secure, and cost-efficient, supporting both batch and real-time workloads.
- CI/CD Pipeline Implementation: Build and maintain CI/CD pipelines for the continuous integration, testing, and deployment of AI models and APIs, ensuring a streamlined and automated development process.
- Problem Solving & Optimization: Collaborate with cross-functional teams to solve complex business challenges through the integration of scalable AI solutions, improving business outcomes.
- Collaboration & Coordination: Work with data scientists, engineers, and business stakeholders to ensure AI models are successfully integrated and aligned with business goals.
- Security & Compliance: Ensure that all AI models and solutions meet internal security requirements and comply with industry regulations, particularly in the banking sector.
Essential Skills:
- 3 to 7 years of relevant experience.
- Proficiency in Python, R, Spark ETL, Teradata, AWS tools (SageMaker, EMR, Redshift), and Snowflake, with experience maintaining scalable ML models.
- Hands-on experience with cloud platforms (AWS, GCP, Azure) for AI deployment and integration, supporting both batch and real-time data processing.
- Expertise in scalable data access patterns for both batch and real-time AI models.
- Experience designing, building, and maintaining data engineering solutions.
- Experience setting up CI/CD pipelines to automate the deployment and testing of AI models.
- Strong problem-solving skills, with a focus on creating scalable solutions that deliver business impact.
Preferred Qualifications:
- Experience in the banking or financial services industry, particularly with AI integration projects.
- Familiarity with data pipeline and processing tools like Apache Kafka, Spark, or similar.
- Experience with containerization technologies such as Docker and Kubernetes for scalable, secure model deployment and orchestration in cloud environments.
- Knowledge of AI governance, risk management, and compliance frameworks in regulated environments.
Education Qualifications:
- B.E., B.Tech., or equivalent, or a postgraduate degree in Computer Applications, IT, or Business Administration, with 10+ years of relevant experience.