Big Data Engineer
This position is based in Tel Aviv, Israel.
About the Role
Big data analysis is at the core of our technology, and we need a system that can quickly and efficiently process data at scale.
Our Data Engineering Team manages the infrastructure that houses our sophisticated risk engine and directly supports Acolto’s growth. This team plays a crucial role in scaling and expanding our operations, and we’re looking for a multi-disciplinary Big Data Engineer to build and manage the backbone of our risk engine.
As a Big Data Engineer, you will solve complex problems that require a varied and multi-disciplinary skill set. You’ll be required to understand the bigger picture, design system architecture, build highly complex data flows, and manage multiple, multi-faceted projects at once.
What You’ll Be Doing
- Play a significant part in our data operation: support our model-training pipeline, ingest terabytes of data, and provide near-real-time BI and analytics to our customers
- Architect highly scalable data solutions for diversified and complex data flows
- Administer Acolto’s data infrastructure, including PostgreSQL, Redshift, Elasticsearch, DynamoDB, Redis, and more
- Perform benchmarks for new technologies
Requirements
- BSc in Computer Science/Data Management, or a graduate of Mamram (IDF)
- At least 4 years of experience in server-side development in a large-scale deployment
- Experience working with NoSQL databases such as Elasticsearch, Cassandra, DynamoDB, or Redis
- Familiarity with relational databases and SQL
- Experience with additional big data technologies such as Spark, Kafka, or Hadoop – strong advantage
- DevOps skills such as Docker, Kubernetes, or CloudFormation – strong advantage
- Experience with the AWS cloud – strong advantage