Carbon Nigeria Recruitment for Data Engineering – Carbon is a pan-African digital bank with a mission to provide friction-free finance to its customers. Carbon promises to play a fundamental role in its customers’ lives wherever they are, with flexible solutions.
We pride ourselves on our efficiency: with just $10 million of equity raised in 2015, we have disbursed over $100 million in loans and earned more than $30 million in revenue over the last two years. Carbon has operations in Ghana, Kenya, and Nigeria, supported by a talented team spread between Lagos, Nairobi, London, Argentina, and Palo Alto, so we operate with a remote-first mindset.
We are recruiting to fill the position below:
Job Title: Data Engineer
Location: Lagos
Employment Type: Full-Time
Department: Business Intelligence
Job Description
What are we looking for?
- We are looking for someone who can transform data into a format that can be easily analyzed, and who can create the connections that enable company units to consume the transformed data.
- Candidates should be able to transform raw data into structured, usable formats through programming and database queries.
- The Data Engineer will be part of the Business Intelligence team.
- This role will work very closely with data scientists and business intelligence analysts to build solutions that enable Data Science and Business Intelligence teams to create robust data products, and other Carbon departments to consume these services.
Roles/Responsibilities
- Developing, maintaining and testing the infrastructure to transform the data that feeds our dashboards and machine learning models.
- Running complex queries over data.
- Manipulating database management systems (DBMS).
- Creating and maintaining optimal data pipeline architecture.
- Implementing feature-generation and machine learning model scripts.
- Coding, testing, and troubleshooting APIs/endpoints that enable other company assets to integrate with and consume data science and business intelligence products, such as credit risk or fraud scores.
- Creating the necessary connections to expose transformed data to other systems or tools.
Qualifications and Requirements
We are looking for candidates who can meet the following criteria. We want to emphasize that we don't expect you to meet all of the points below, but we would love you to have experience in most of these areas:
Required:
- Candidates should possess a Bachelor's or Master's Degree in a technical field such as Computer Science or Engineering, with 3+ years of professional experience
- Ability to understand simple business problems
- Experience with object-oriented and functional scripting languages: Python, Java, C++, Scala, etc.
- Advanced SQL knowledge, including query authoring and working familiarity with a variety of databases
- Experience with both relational SQL and NoSQL databases
- Experience designing, building, and testing API connections
- Experience with data pipelines
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience working with container technology (Dockerfiles, Docker images) and version control (e.g., GitHub repositories)
- Experience with Talend (or any other ETL tool such as SSIS or Pentaho).
Desirable:
- Experience with Google Cloud Platform
- Experience with Tableau, Power BI, or other BI tools.
Benefits
- A great and upbeat work environment populated by a multinational team.
- Potential to work in different geographies.
- Health Insurance.
- Life Insurance
- Career development and growth.
- We offer a remote working option.
How to Apply
Interested and qualified candidates should:
Click here to apply online
Application Closing Date
Not Specified.
Recruitment Process
- Call with People team
- Case Study (Assessment)
- Interview