KEY ACCOUNTABILITIES
70% of Time - Excellent Technical Work
· Design, develop, and optimize data pipelines and ETL/ELT workflows using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.); a minimal pipeline sketch follows this list
· Build and maintain data architecture that supports structured and unstructured data from multiple sources
· Work closely with statisticians and data scientists to provision clean, transformed datasets for advanced modeling and analytics
· Enable self-service BI through efficient data modeling and provisioning in tools like Looker, Power BI, or Tableau
· Implement data quality checks, monitoring, and documentation to ensure high data reliability and accuracy
· Collaborate with DevOps/Cloud teams to ensure data infrastructure is secure, scalable, and cost-effective
· Support and optimize workflows for data exploration, experimentation, and productization of models
· Participate in data governance efforts, including metadata management, data cataloging, and access controls
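For illustration only, a minimal sketch of the kind of ETL/ELT workflow named in the first bullet, assuming Apache Beam running on Dataflow with Pub/Sub and BigQuery; every project, topic, table, and field name below is a hypothetical placeholder, not part of this role's actual systems:

```python
# Minimal sketch (not a production pipeline): a streaming Apache Beam job,
# runnable on Dataflow, that reads events from Pub/Sub, applies a basic data
# quality gate, and appends valid rows to BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_and_validate(message: bytes):
    """Parse a Pub/Sub message; drop records failing basic quality checks."""
    record = json.loads(message.decode("utf-8"))
    # Data quality rule: required fields must be present and non-empty.
    if record.get("event_id") and record.get("event_ts"):
        yield record  # only valid rows flow on to the sink


def run():
    # streaming=True is required for unbounded Pub/Sub sources; the
    # DataflowRunner, project, and region are supplied via CLI flags.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")   # hypothetical topic
            | "ParseAndValidate" >> beam.FlatMap(parse_and_validate)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",         # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```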
15% of Time - Client Consultation and Business Partnering
· Work effectively with clients to identify needs and success criteria, and translate them into clear project objectives, timelines, and plans.
· Be responsive and timely in sharing project updates, responding to client queries, and delivering on project commitments.
· Clearly communicate analyses, insights, and conclusions to clients through written reports and real-time meetings.
10% of Time - Innovation, Continuous Improvement (CI), and Personal Development
· Learn and apply a CI mindset to work, seeking opportunities for improvements in efficiency and client value.
· Identify new resources, develop new methods, and seek external inspiration to drive innovations in our work processes.
· Continually build skills and knowledge in the fields of statistics and the relevant sciences.
5% of Time - Administration
· Participate in all required training (Safety, HR, Finance, CI, other) and actively engage in GKS and ITQ meetings, events, and activities.
· Complete other administrative tasks as required.
MINIMUM QUALIFICATIONS
- Minimum Degree Requirements: Master's degree from an accredited university
- Minimum 6 years of related experience required
Specific Job Experience or Skills Needed
- 6+ years of experience in data engineering roles, including strong hands-on GCP experience
- Proficiency in GCP services such as BigQuery, Cloud Storage, Cloud Composer (Airflow), Dataflow, and Pub/Sub; see the illustrative DAG sketch after this list
- Strong SQL skills and experience working with large-scale data warehouses
- Solid programming skills in Python and/or Java/Scala
- Experience with data modeling, schema design, and performance tuning
- Familiarity with CI/CD, Git, and infrastructure-as-code principles (Terraform preferred)
- Strong communication and collaboration skills across cross-functional teams
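As a concrete illustration of the Cloud Composer and BigQuery skills listed above, a minimal Airflow DAG sketch, assuming Airflow 2.4+ with the Google provider package installed; the DAG, project, dataset, and table names are hypothetical placeholders:

```python
# Minimal, illustrative Cloud Composer (Airflow) DAG of the shape this role
# maintains: a daily BigQuery rollup transform. Not an actual production job.
import pendulum

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",              # hypothetical DAG name
    schedule="0 6 * * *",                     # run daily at 06:00 UTC
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    SELECT sale_date, SUM(amount) AS total_amount
                    FROM `my-project.analytics.sales`   -- hypothetical table
                    GROUP BY sale_date
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_sales_rollup",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )
```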
For Global Knowledge Services:
- Ability to effectively work cross-functionally with internal/global team members.
- High self-motivation, with the ability to work both independently and in teams.
- Excels at driving projects to completion, with attention to detail.
- Ability to exercise judgment in handling confidential and proprietary information.
- Ability to effectively prioritize, multi-task, and execute tasks according to plan while working on multiple priorities and projects simultaneously.
- Demonstrated creative problem-solving abilities and the ability to “think outside the box.”
PREFERRED QUALIFICATIONS
- Preferred Major Area of Study: Computer Science, Engineering, Data Science, or a related field
- Preferred Professional Certifications: GCP (e.g., Google Cloud Professional Data Engineer)
- Preferred 6 years of related experience