Role
Data Engineer
Required Technical Skill Set
Python, PySpark, Azure Databricks, Snowflake, Amazon Redshift
Desired Experience Range
3-10 Years
Location of Requirement
Bhubaneswar
Desired Competencies (Technical/Behavioral Competency)
Must-Have
Python + PySpark – hands-on experience, including performance tuning and optimization.
Azure Databricks – cluster management, notebooks, DBFS, and job orchestration.
Data Warehousing – Snowflake or Redshift: schema design, performance tuning, and query design.
Data Modeling – Dimensional and normalized modeling for data marts/lakes.
Pipeline Orchestration – Experience with Azure Data Factory, Airflow, or Databricks Workflows.
Version Control – Git/GitHub workflow, pull request reviews, branching strategy.
Monitoring – Monitor job executions, resolve failures, and establish observability.