Overview
Flutterwave was founded on the principle that every African must be able to participate and thrive in the global economy. To achieve this objective, we have built a trusted payment infrastructure that allows consumers and businesses (African and international) to make and receive payments in a convenient, borderless manner.
Job Position: Data Engineer
Job Location: Lekki, Lagos (Hybrid)
Job Description
- Are you an avid learner, constantly looking to improve and innovate? We are seeking a highly skilled and experienced Data Engineer to partner with us in driving our data strategy. You’ll play a key role in our journey towards becoming a fully data-driven organization and embedding a data-first culture across the business.
Job Responsibilities
- Design and build high-performance, secure, and scalable data pipelines to support data science projects following software engineering best practices.
- Curate, wrangle, and prepare data, and engineer features to be used in machine learning models
- Design and develop the data and analytics platform, selecting the right technologies for each problem at hand (big-data stack, SQL, NoSQL, etc.)
- Build a modular pipeline to construct features and modeling tables.
- Build a sense of trust and rapport that creates a comfortable and effective workplace while working as part of an agile squad
- Work with the Data Analytics and Science team to understand business needs and build impactful analytics solutions.
- Coordinate and collaborate with the Data Analytics team and other engineering teams to align on our roadmap
Job Requirements
- Bachelor’s Degree in Computer Engineering or related field
- 4+ years’ experience in a data role (Data Engineer, Data Analyst, Analytics Engineer, etc.)
- 2+ years of hands-on experience building data pipelines in production and the ability to work across structured, semi-structured, and unstructured data.
- Hands-on experience implementing ETL (or ELT) best practices at scale.
- 2+ years of experience building ML pipelines for streaming/batch workflows.
- Hands-on knowledge and experience working with the modern data stack and cloud data platforms (Snowflake, BigQuery, Redshift, dbt)
- Hands-on knowledge and experience with orchestration tools (Airflow, Prefect, or Dagster)
- Professional experience using Python for data processing, SQL, Git (for source control and CI/CD), and Apache Kafka.
- Good understanding of data modeling and data architecture.
- Working experience with data governance tools.
- Exceptional communication skills and ability to support both technical and non-technical stakeholders.
- Natural ability to manage multiple initiatives and stakeholders.
- Authorization to work in the country without sponsorship.
How to Apply
Interested and qualified candidates should:
Click here to apply online