A leading international provider of sports betting and casino games is looking for a Data Engineer to join their growing team.
Key Skills
- Experience with Elasticsearch
- Experience with Apache NiFi
- Experience with real-time data processing
The Company
My client is the biggest sports-betting operator in Germany, driven by their core values of passion, progress, and trust to deliver the best possible product for their users.

The Role
- You create and maintain both batch (ETL) and real-time data pipelines and architecture
- You assemble large and complex data sets that meet functional and non-functional business requirements
- You identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- You build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
- You build analytics tools that utilize the data pipelines to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- You collaborate with stakeholders, including data architects and Executive, Product, Data, and IT team members, from project kickoff through delivery
The Ideal Candidate
- You have at least 3 years of relevant work experience
- You have strong analytical skills and the ability to innovate and think outside the box
- You are able to learn new and complex concepts quickly and you are relentlessly resourceful and scrappy
- You are collaborative, able to engage in interactive discussions with the rest of the team and able to communicate technical concepts clearly and concisely
- You are familiar with cloud data services such as AWS S3, Athena, EC2, Redshift, EMR, and Lambda
- You have experience with large-scale production databases and SQL
- You have experience with time-series/analytics databases such as Elasticsearch
- You are experienced with ETL development (extraction, loading, aggregation, Talend, etc.)
- You have worked with or are familiar with big data technologies such as Hadoop, NiFi, Kafka, Spark, and Logstash
- You have familiarity with containerization and orchestration technologies such as Docker and Kubernetes