Intermediate Data Engineer
JUMO
- South Africa
- Permanent
- Full-time
What you'll be doing:
- Be responsible for creating robust, mission-critical batch and streaming data processing capabilities
- Design, implement, and maintain the data pipelines that constitute JUMO's data platform, enabling effective use of data across the organisation
- Provide feedback on team members' output, encouraging skills development within the team
- Work closely with Portfolio Managers and Decision Scientists to understand the real-world problems we're trying to solve
- Mentor junior engineers, providing technical leadership
- Be supported by senior leaders as you drive your own development
What you'll need:
- BSc in Computer Science, Electrical Engineering, or an equivalent tertiary degree
- Real-world understanding and experience of data pipeline design and development, as well as data processing and storage (i.e. experience in designing systems to process and curate large data sets)
- Experience in streaming technologies, specifically Spark and Kafka
- Experience with cloud technologies, AWS preferred
- Solid, proven experience working with big-data technologies such as Apache Spark, Flink, Hadoop, Kafka or Kinesis, Cassandra, DynamoDB, InfluxDB, MongoDB, Presto, or Beam
- Command of productionising and monitoring data pipeline workflows, and a working knowledge of the Data Product Lifecycle
- Experience in application design and development with at least one of the following languages: Python (preferred), Scala, Java
- RDBMS experience with any relevant technology, such as MySQL, PostgreSQL, Redshift, or SQL Server, including relational database administration, technical architectures, and infrastructure components
- Understanding of CI/CD practices
- Productive within a Linux command line environment
- Proven ability to contribute software as part of a team; we are looking for someone who can communicate technical concepts effectively and apply critical thinking under pressure
- Experience working with messaging systems (RabbitMQ, SNS)
- Experience working with data pipeline orchestration (Airflow, Nifi, Streamsets)
- Experience working with production BI environments and tools (Tableau, Superset, Looker)
What you'll enjoy:
- Collaborating with smart, engaging people in an inspiring work environment
- Working for impact
- Growing and learning continuously, with loads of encouragement and support
- Boldly taking risks as we navigate new challenges
- Flexible work practices enabling your best delivery
- Being autonomous and empowered to lead
- A stack of leading-edge technologies