Accountabilities:
- Lead the design, development, and deployment of high-performance, scalable data warehouses and data pipelines.
- Collaborate closely with cross-functional teams to understand business requirements and translate them into technical solutions.
- Oversee and optimize the use of Snowflake for data storage and analytics.
- Develop and maintain SQL-based ETL processes.
- Implement data workflows and orchestrations using Airflow.
- Utilize DBT for data transformation and modeling tasks.
- Mentor and guide junior data engineers, fostering a culture of learning and innovation within the team.
- Conduct performance tuning and optimization for both ongoing and new data projects.
Requirements:
- Proven ability to handle large, complex data sets and develop data-centric solutions.
- Strong problem-solving skills and a keen analytical mindset.
- Excellent communication and leadership skills, with the ability to work effectively in a team-oriented environment.
- 8-12 years of relevant experience in data engineering roles, with a focus on data warehousing, data integration, and data product development.
Essential Skills/Experience:
- Snowflake
- SQL
- Airflow
- DBT
Desirable Skills/Experience:
- SnapLogic
- Python
Education:
Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.