Overview
ENGIE Energy Access is one of the leading Pay-As-You-Go (PAYGo) and mini-grid solutions providers in Africa, with a mission to deliver affordable, reliable and sustainable energy solutions and life-changing services with an exceptional customer experience. The company is the result of the integration of Fenix International, ENGIE Mobisol and ENGIE PowerCorner, and develops innovative, off-grid solar solutions for homes, public services and businesses, enabling customers and distribution partners to access clean, affordable energy.
The PAYGo solar home systems are financed through affordable installments from $0.19 per day, and the mini-grids foster economic development by enabling productive use of electricity and triggering business opportunities for entrepreneurs in rural communities. With over 1,700 employees, operations in 9 countries across Africa (Benin, Côte d'Ivoire, Kenya, Mozambique, Nigeria, Rwanda, Tanzania, Uganda and Zambia), over 1.2 million customers and more than 6 million lives impacted so far, ENGIE Energy Access aims to remain the leading clean energy company, serving millions of customers across Africa by 2025.
Job Position: Data Engineer
Job Location: Nigeria
Job Grade: HL15
Job Description
- This position will be part of the Global Data team that is based across Germany, Uganda, Kenya, and Nigeria.
- You will report to the Head of Data and work closely with data scientists, the DevOps team, and software engineers.
- This is an incredible opportunity for a talented individual to join a high-performing team that is passionate about pioneering expanded financial services to off-grid customers at the base of the pyramid.
- Key responsibilities will include building, maintaining, and ensuring the scalability of data pipelines between the MySQL databases that serve our in-house applications, IoT data delivered from devices, our PBX, our in-house ticketing system, and the data lake used for analytics across the company (see the extraction sketch after this list).
- You will also be responsible for building and optimizing pipelines that deliver data in real time to our field team mobile application, enabling data-informed decisions in the field, and for working with members of the data team to ensure high code quality and sound database design.
- Your work will make a meaningful impact by enabling EEA to continuously innovate on how we support our customers in their solar kit experience and repayment journey.
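For illustration only, here is a minimal sketch of one such extract-and-load step: pulling a table from an application MySQL database and landing it in an S3 data lake as CSV. The table name, bucket, key layout, and environment variables are hypothetical placeholders, not EEA's actual schema or infrastructure.

```python
"""Sketch: export one MySQL table to an S3 data lake (illustrative only)."""
import csv
import io
import os

import boto3
import pymysql


def export_table_to_s3(table: str, bucket: str, key: str) -> None:
    # Connect to the application database; credentials come from the environment.
    conn = pymysql.connect(
        host=os.environ["MYSQL_HOST"],
        user=os.environ["MYSQL_USER"],
        password=os.environ["MYSQL_PASSWORD"],
        database=os.environ["MYSQL_DB"],
    )
    try:
        with conn.cursor() as cur:
            # `table` is a trusted constant in this sketch, not user input.
            cur.execute(f"SELECT * FROM {table}")
            columns = [col[0] for col in cur.description]
            rows = cur.fetchall()
    finally:
        conn.close()

    # Serialize to CSV in memory and upload to the data lake bucket.
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(columns)
    writer.writerows(rows)

    boto3.client("s3").put_object(
        Bucket=bucket, Key=key, Body=buffer.getvalue().encode("utf-8")
    )


if __name__ == "__main__":
    # Hypothetical table and lake layout.
    export_table_to_s3("payments", "example-data-lake", "raw/payments/payments.csv")
```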
Job Responsibilities
- Work with data and software teams to design, build, and support critical data pipelines using Airflow for modeling, ETL, and analytics (see the DAG sketch after this list).
- Optimize data storage between Redshift, S3, and other storage solutions to support data analytics, modeling, archiving, and data integrity.
- Develop logging and KPI visualization dashboards with Grafana or a similar tool to measure the efficiency of business processes.
- Containerize models with Docker and Kubernetes to serve real-time financial information to field teams.
- Work with software engineers and the DevOps team to optimize the performance of in-house applications that communicate data through APIs and other means.
- Maintain and develop tools for unit testing, streamlined ETL, and other processes for the Data Team.
- Mentor data scientists and analysts on best coding practices through code reviews and discussions.
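For illustration only, a minimal Airflow 2.x DAG sketch of the kind of batch ETL pipeline described above; the DAG id, task names, and callables are hypothetical placeholders rather than an actual EEA pipeline.

```python
"""Sketch: a two-step daily ETL DAG in Airflow 2.x (illustrative only)."""
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # Placeholder: pull data from a source system (e.g. a MySQL replica).
    print("extracting source data")


def load_to_warehouse() -> None:
    # Placeholder: copy the extracted data into the analytics warehouse.
    print("loading data into the warehouse")


default_args = {
    "owner": "data-team",          # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_daily_etl",    # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # daily batch run
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(
        task_id="load_to_warehouse", python_callable=load_to_warehouse
    )

    # Run the load only after the extract succeeds.
    extract_task >> load_task
```

In practice, tasks like these would typically call out to warehouse load statements, Spark jobs, or an extraction step such as the one sketched earlier, rather than simply printing.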
Job Requirements
Qualifications:
- Degree in Computer Science or related field
Experience:
- 3+ years of industry experience
- Experience building infrastructure to support streaming or offline data
- Extensive programming experience in Python/Scala/Java
- Experience with SQL in addition to one or more of Spark/Hadoop/Hive/HDFS
- Experience with implementing unit and integration testing
- Ability to gather requirements and communicate with stakeholders across data, software, and platform teams
- Ability to develop a strategic vision for data pipelining and infrastructure
- AWS Certification is a plus
- Strong communication with data, DevOps, and software team members
- Sense of adventure and willingness to dive in, think big, and execute with a team
Language(s):
- English
- French, Portuguese, or German is a plus
Technology:
- Linux-based systems
- Knowledge of Amazon Web Services (AWS) and its services, including but not limited to CloudWatch, RDS, Redshift, Lambda, EMR, S3, SQS, and EC2
- Python and Jupyter notebooks
- Airflow (or other workflow management systems, such as Luigi)
- Docker, Kubernetes, or other containerization tools
- Streaming tools such as Kafka, Kinesis
- Knowledge of Hetzner is a plus
How to Apply
Interested and qualified candidates should:
Click here to apply online