Data Engineering


As the world’s leading marketplace for technologists, Andela is in the business of changing lives, matching the brightest talent to roles at innovative technology companies across the globe. By joining Andela, you can experience the joy of long-term work with vetted companies and competitive compensation, ensuring you grow your career while becoming part of a vibrant community. Andela’s Talent Network connects you to a diverse global network of like-minded technologists, so you can develop your expertise and access exciting employment opportunities. We are always looking for Senior Data Engineers to join Andela and gain access to high-quality, long-term remote roles.

  • 5+ years hands-on experience using data technologies

  • Strong analytical skills for working with both structured and unstructured datasets

  • Experience building ETL/ELT data pipelines in Big Data deployments and the processes supporting them: data structures, metadata, dependency and workload management

  • Experience writing complex queries and stored procedures in SQL

  • Strong data modeling/architecture experience, along with relational database analysis and optimization

  • Coding experience with Python/Java/Scala/Golang

  • Experience deploying data pipelines in cloud environments

  • Knowledge of and experience with modern Git workflows (pull requests, CI, code reviews)

  • Knowledge of and experience with software development methodologies such as Scrum

  • Excellent verbal and written communication skills and the ability to work with others at all levels

  • Toolset:

    • Analytics: Spark, Databricks

    • Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.

    • Stream-processing systems: Kafka, Storm, Spark-Streaming, etc.

    • Distributed data storage file systems/solutions (HDFS, S3)

    • AWS (or equivalent GCP or Azure) cloud services for data storage and processing: EC2, EMR, RDS, Redshift

Nice to have:

  • Working experience in data analytics: data wrangling, integration, analysis, visualization, data modeling, and reporting using Business Intelligence (BI) tools

  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores

  • Experience with architecture of data warehouses and data lakes

  • Experience designing and deploying Data Science and ML models is a big plus

  • Experience with database replication, administration, and performance tuning
