Big Data Engineer IV

Job details

Kraków, Małopolskie, Poland
OpenX
03.05.2024

We operate at scale:

100% Cloud-based (GCP) platform

Over 400 billion Ad requests every day

Over 120,000 CPUs daily

Over 140 TB RAM daily

Over 50 PB of data per week

Over 1200 production deployments a month

You will work on all aspects of agile application development, including our enterprise platform, which interfaces with a multitude of services it depends on to deliver billions of requests per day. Your opinions will matter in every phase of product development, from requirements through validation and deployment.

Working on the enterprise platform, you will collaborate with multiple distributed teams to architect, create, and deliver new features and functionality that provide the best possible advertising experience in the market. Scalability, performance, and rock-solid reliability are factors to consider with every line of code you write.

The Team and Project:

You will be part of the core data development team. Our exchange handles billions of ad requests daily, connecting thousands of publishers with demand partners. Each request produces data events that must be processed to extract business value. Our applications produce more than 1 PB of data daily.

Please note: all interview stages are run remotely!

Key responsibilities:
  • Design large-scale data processing systems.
  • Work with Product to drive the requirements, and own the project end-to-end.
  • Analyse and improve efficiency, scalability, and stability of applications.
  • Think long-term and be unsatisfied with band-aids.
  • Identify unnecessary complexity and remove it.
Required Qualifications:
  • 8+ years of experience in Java, Scala, or Python development
  • 3+ years of experience in Big Data systems development
  • Experience building complex data pipelines using SQL and data transformation tools such as dbt or Dataform.
  • Expertise with BigQuery or other data warehouses, including performance tuning and monitoring.
  • Understanding of data orchestration tools such as Airflow, including extending them with new data pipelines and troubleshooting operational issues.
  • Cloud experience with Google Cloud Platform or AWS, Kubernetes, and Docker.
  • Understanding of CI/CD, automated testing, monitoring and other DevOps practices.
  • Fluency in algorithms, data structures, and platforms.
  • Experience with database system design, RDBMS and/or NoSQL.
  • Bachelor’s degree in computer science, physics, or a related field. A Master’s degree is a plus.
  • Communicative Polish and English.
Desired qualifications:
  • Comfort using the right tools and languages for the job, even brand-new ones.
  • Ability to develop scalable, modular applications.
  • SCRUM / Agile environment experience.
  • Experience working in digital media, marketing technology or advertising technology is a big plus.
  • Good written & oral communication skills.
  • Good sense of humor.
  • Team player.
  • Self-starter with the ability to independently identify and act on areas of improvement.
What we offer:
  • Employment contract (22 100 - 26 000 PLN gross/monthly) OR B2B (180 - 207 PLN net/hourly)
  • Working with the newest technologies such as Cloud Computing (GCP)
  • Experienced Team (50% of the company are senior developers!)
  • Challenges at work that are difficult to find anywhere else!
  • Solving important problems at scale
  • Joining a company that is growing and scaling
  • Flexible working hours & remote work option (4 days a week)
