Overview
eHealth4everyone is a digital health enterprise based in Nigeria that contributes to health service delivery using data science and information technology. At eHealth4everyone, our goal is to save lives, and our approach is information and technology. We believe that if health is a right, then proven digital health solutions and expertise such as ours should not be a privilege.
Job Position: Data Engineer
Job Locations: Abuja (FCT), Lagos, and Oyo
Job Description
- As a Data Engineer specializing in data lakes, data warehouses, and ETL, you will be responsible for designing, implementing, and maintaining our data infrastructure.
- You will work closely with data scientists, analysts, and other stakeholders to ensure seamless data flow, high-quality data, and accessibility for analytical and operational use cases.
Job Responsibilities
- Design, build, and maintain scalable data lake and data warehouse architectures to store structured and unstructured data.
- Develop and manage ETL (Extract, Transform, Load) processes to ingest data from various sources into the data lake and data warehouse.
- Ensure data quality, data governance, and data security practices are implemented and maintained.
- Collaborate with data scientists and analysts to understand data requirements and provide solutions for data access and analysis.
- Optimize data storage and retrieval performance.
- Monitor and troubleshoot data infrastructure issues, ensuring high availability and reliability.
- Implement and maintain data catalog and metadata management tools.
- Stay updated with the latest trends and technologies in data engineering, data lakes, and data warehouses.
Job Requirements
- Bachelor’s Degree in Computer Science, Information Technology, or a related field.
- 7+ years of experience in data engineering or a similar role.
- Strong experience with data lake technologies such as AWS S3, Azure Data Lake, Google Cloud Storage, or similar.
- Proficiency in ETL tools and processes (e.g., AWS Glue, Apache NiFi, Talend).
- Experience with big data processing frameworks like Apache Spark or Hadoop.
- Knowledge of data warehousing concepts and technologies (e.g., Amazon Redshift, Google BigQuery, Snowflake).
- Experience with SQL and NoSQL databases.
- Familiarity with data governance and data security best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Qualifications
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Knowledge of data catalog and metadata management tools (e.g., AWS Glue Data Catalog, Apache Atlas).
- Experience with data visualization tools and techniques.
- Relevant certifications in data engineering or cloud platforms.
How to Apply
Interested and qualified candidates should click here to apply online.