Overview
Yassir is the leading super app in the Maghreb region, set to change the way daily services are provided. It currently operates in 45 cities across Algeria, Morocco and Tunisia, with recent expansions into France, Canada and Sub-Saharan Africa. It is backed by ~$200M in funding from VCs in Silicon Valley, Europe and other parts of the world.
We offer on-demand services such as ride-hailing and last-mile delivery. Building on this infrastructure, we are now introducing financial services to help our users pay, save and borrow digitally, helping usher the continent into a digital economy era. We’re not just about serving people – we’re about creating a marketplace that brings people what they need while infusing social values.
Job Position: Senior Data Engineer
Job Location: Lagos (Remote)
Job Responsibilities
- Build a centralized data lake on GCP data services by integrating diverse data sources from across the enterprise.
- Develop, maintain, and optimize Spark-powered batch and streaming data processing pipelines. Leverage GCP data services for complex data engineering tasks and ensure smooth integration with other platform components.
- Design and implement data validation and quality checks to ensure the accuracy, completeness, and consistency of data as it flows through the pipelines.
- Work with the Data Science and Machine Learning teams to support advanced analytics.
- Collaborate with cross-functional teams, including data analysts, business users, operational and marketing teams, to extract insights and value from data.
- Collaborate with the product team to design, implement, and maintain the data models for analytical use cases.
- Design, develop, and maintain data dashboards for various teams using Looker Studio.
- Engage in technology exploration, research and development, and POCs; conduct deep investigations and troubleshooting.
- Design and manage ETL/ELT processes, ensuring data integrity, availability, and performance.
- Troubleshoot data issues and conduct root cause analysis when reporting data is in question.
Job Requirements
Required Technical Skills:
- PySpark
- GCP – BigQuery, Dataproc, Dataflow, Dataplex, Pub/Sub and Cloud Storage
- Advanced SQL knowledge
- NoSQL (preferably MongoDB)
- Programming languages – Scala and/or Python
- Great Expectations or a similar data quality (DQ) framework
- Familiarity with workflow management tools such as Airflow, Prefect or Luigi.
- Understanding of data governance, data warehousing (DWH) and data modelling.
Good-to-have skills:
- Infrastructure as Code – Terraform
- Docker and Kubernetes
- Looker Studio
- AI and ML engineering knowledge
How to Apply
Interested and qualified candidates should:
Click here to apply online