Basic Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field of study
- 5+ Years - Ability to work effectively across organizations, product teams and business partners.
- 5+ Years - Knowledge of Agile (Scrum) methodology, experience in writing user stories
- 5+ Years - Strong understanding of database concepts and experience with multiple database technologies – optimizing query and data processing performance.
- 5+ Years - Full stack data engineering competency in a public cloud – Google Cloud, Microsoft Azure, or AWS
- Critical thinking skills to propose data solutions, test, and make them a reality.
- 5+ Years - Highly proficient in SQL plus Python, Java, Scala, or Go (or similar); experience programming data engineering transformations in Python or a similar language.
- 5+ Years - Demonstrated ability to lead data engineering projects, design sessions, and deliverables to successful completion.
- Cloud native technologist
- Deep understanding of data service ecosystems including data warehousing, lakes, metadata, meshes, fabrics and AI/ML use cases.
- User experience advocacy through empathetic stakeholder relationships.
- Effective communication both internally (with team members) and externally (with stakeholders).
- Knowledge of Data Warehouse concepts – experience with Data Warehouse/ETL processes
- Strong process discipline and thorough understanding of IT processes (ISP, Data Security).
Responsibilities:
- Interact with GDIA product lines and business partners to understand data engineering opportunities, tooling and needs.
- Collaborate with Data Engineering and Data Architecture to design and build templates, pipelines and data products including automation, transformation and curation using best practices
- Develop custom cloud solutions and pipelines with GCP native tools – Dataprep, Data Fusion, Dataflow, dbt, and BigQuery
- Operationalize and automate data best practices: quality, auditability, timeliness, and completeness
- Participate in design reviews to accelerate the business and ensure scalability
- Work with Data Engineering and Architecture and Data Platform Engineering to implement strategic solutions
- Advise and direct team members and business partners on Ford standards and processes.
Preferred Qualifications:
- Excellent communication, collaboration and influence skills; ability to energize a team.
- Knowledge of data, software and architecture operations, data engineering and data management standards, governance and quality
- Hands on experience in Python using libraries like NumPy, Pandas, etc.
- Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Data Fusion, Pub/Sub / Kafka, Looker Studio, Vertex AI
- Experience with Teradata, Hadoop, Hive, Spark, and other parts of legacy data platforms
- Experience with recoding, re-developing and optimizing data operations, data science and analytical workflows and products.
- Data governance concepts including GDPR (General Data Protection Regulation), CCPA (California Consumer Privacy Act), and PoLP (principle of least privilege), and how these can impact technical architecture