Minimum qualifications:
- Bachelor’s degree or equivalent practical experience.
- 3 years of experience coding in one or more programming languages.
- 3 years of experience working with data infrastructure and data models by performing exploratory queries and scripts.
- Experience in data analysis, database querying (e.g., SQL), and data visualization (dashboards/reports).
- Experience in defining and implementing data governance policies, procedures, and standards, as well as data privacy regulations and compliance requirements.
Preferred qualifications:
- Experience working in a business tooling or operations organization.
- Experience with data analysis at scale, including statistics, and machine learning model development (data preparation, model selection, evaluation, tuning).
- Experience with a wide range of data engineering and data governance tools and technologies, including cloud platforms (e.g., GCP, Looker), data warehousing solutions, data quality tools, and metadata management systems.
- Experience in building prototypes or proof-of-concepts using generative AI models (e.g., LLMs, diffusion models) or agentic workflow frameworks.
Responsibilities:
- Lead the design, development, and maintenance of reliable data pipelines.
- Write queries (e.g., SQL) and data manipulation scripts in languages such as Python and R. Conduct advanced quantitative data analysis.
- Partner with technical and non-technical stakeholders to translate data needs into technical designs. Lead technical design reviews and collaborate with cross-functional teams.
- Develop dashboards and reports in partnership with User Experience (UX) team and establish data visualization standards.
- Drive ownership of the Data Governance charter, defining and implementing policies and ensuring compliance.