The Data Engineering team is looking for skilled and passionate Data Engineers. As part of the team, you will work on the application where all threads come together: the backend of the ad exchange. Our ultra-efficient exchange is capable of processing more than 30 billion ad requests daily. Consequently, every line of code you write matters, as it is likely to be executed several billion times a day. We are one of the biggest AWS users, with a presence in four different regions. If you want your code to make an impact, this is the place to be.
A Data Engineer works on our large-scale analytical databases and the surrounding ingestion pipelines. The job involves continuous feature development, performance improvements, and ensuring platform stability. The mission of our analytics team is “data-driven decisions at your fingertips”. You own and provide the system that all business decisions are based on. Precision and high-quality results are essential in this role.
Responsibilities:
- Design, develop, deliver and operate scalable, high-performance, real-time data processing software
- Take responsibility for handling multiple terabytes of data at any given time
- Maintain the current platform and shape its evolution by evaluating and selecting new technologies
- Closely collaborate with stakeholders to gather requirements and design new product features in a short feedback loop
- Interact with UI/UX Engineers to visualize terabytes of data
Mandatory Requirements:
- Experience with big-data platforms and a deep understanding of Apache Kafka and/or Apache Druid (preferably a committer or PMC member on any of these Apache big-data open-source projects)
- Experience in building and maintaining any distributed computing data platforms
- Good understanding of cloud technologies such as storage and compute
- Good knowledge of container orchestration: automating deployment, management, scaling, and networking using Kubernetes
Job Requirements:
- Overall experience: 5–10 years; Bachelor’s degree in computer science or equivalent
- Demonstrated experience owning products and driving them end to end, from gathering requirements through development, testing, and deployment to ensuring high availability post-deployment
- Experience with, and a passion for, OLAP/MPP database systems
- Experience in Java programming
- Hands-on experience with some of the following technologies: Druid, Kubernetes, Spark, Kinesis, Kafka, AWS-Stack, JVM based languages
- Hands-on experience in big data technologies and distributed systems
- You enjoy operating your applications in production and strive to make on-call obsolete: debug issues in live, production, and test clusters, implement solutions that restore service with minimal disruption to the business, and perform root cause analysis