Responsibilities
• Analyzing customer requirements based on an understanding of conformed data models, and delivering solutions that are implementable and maintainable.
• Assessing data accuracy in large datasets of diverse formats.
• Researching, designing, and documenting data specifications at all points in the production life cycle.
• Understanding software development and having proficiency in SQL and a data-oriented programming language.
• Value Realization: Knowledge of value realization methods; ability to plan, execute, monitor and manage business activities and resources to determine and achieve the actual value from a business initiative as estimated in an associated business case.
• Communicating Complex Concepts: Knowledge of effective presentation tools and techniques to ensure clear understanding; ability to use summarization and simplification techniques to explain complex technical concepts in simple, clear language appropriate to the audience.
• Agile Development: Knowledge of agile methodologies and the agile development lifecycle; ability to utilize formal agile methodologies, disciplines, practices and techniques for the delivery of new and enhanced applications.
• Cloud Computing: Knowledge of the concepts, technologies and services of cloud computing; ability to design, deploy and implement cloud computing solutions in various business environments.
• Carrying out tasks, under supervision, to increase capacity or add capabilities through cloud computing.
• ETL Process: Knowledge of the extraction, transformation and loading (ETL) process; ability to develop a database through the ETL process.
• Information Management: Knowledge of an organization's existing and planned Information Architecture and Information Management (IM) methodology; ability to collect and manage information from different sources, and distribute this information to enhance operational efficiency.
• Modeling: Data, Process, Events, Objects: Knowledge of data, process and events; ability to use tools and techniques for analyzing and documenting logical relationships among data, processes or events.
Basic Qualifications
• Master's or Bachelor's degree in computer science or a related field
• At least 5 years of experience
• Experience in data analysis: performing statistical analysis, data visualization, and predictive modeling to identify trends and patterns.
• Strong skills in SQL
• At least 2 years of recent programming experience, preferably in Python
• Experience with relational databases such as Snowflake, MySQL or PostgreSQL
• Capable of thriving in high-pressure situations and delivering results within tight time constraints.
• Demonstrated passion for technology coupled with an eagerness to contribute to a collaborative team environment.
Nice to have
- Working knowledge of statistical methodologies and data management.
- Proficient in working with diverse datastores, including Snowflake, Elasticsearch, MySQL, and Oracle.
- Well-versed in developing Snowflake procedures, tasks, and other Snowflake components.
- Proficient in utilizing batch or stream processing systems, including Apache Spark and AWS Glue.
- Familiarity with scheduling tools like Apache Airflow.
- Skilled in developing and working with RESTful APIs.
- Hands-on experience with API tools like Swagger, Postman, and Assertible.
- Advocate of Test-Driven Development (TDD) and Behaviour-Driven Development (BDD).
- Extensive hands-on experience with testing tools like Selenium and Cucumber, with expertise in seamlessly integrating them into CI/CD pipelines.