
Senior Data Engineer

Overview
Skills
  • Python ꞏ 5y
  • SQL
  • Kafka
  • AWS ꞏ 3y
  • GCP ꞏ 3y
  • Terraform
  • BigQuery

Tango is a successful market leader: a live-streaming platform with 450+ million registered users, in an industry projected to reach $240 billion within the next few years.

The B2C platform, based on the best-quality global video technology, allows millions of talented people around the world to create their own live content, engage with their fans, and monetize their talents.


Tango was founded in 2018 and is powered by 450+ global employees operating in a culture of growth, learning, and success!

The Tango team is a vigorous cocktail of hard workers, creative brains, energizers, geeks, overachievers, athletes, and more. We push the limits to take our app from “one of the top” to “the leader”.


The best way to describe Tango's work style is to never use the word “impossible”. We believe that success is a thorny path paved with sleepless nights, corporate parties, tough releases, and of course our users' smiles (and as we are a LIVE app, we truly get to see our users all around the world smiling right in front of us in real time!).


Do you want to join the party?


Responsibilities:

  • Design and architect data systems from scratch to production to support Tango’s platform (scalable, reliable, and high-performance data processing)
  • Design, develop, and maintain ETL workflows for different data sources
  • Troubleshoot and resolve issues related to data integration
  • Optimize performance and query execution in large-scale environments
  • Analyze raw data and maintain datasets tailored to business requirements
  • Collaborate with external and internal stakeholders on data-related issues (data quality, data accessibility, etc.)
  • Stay up to date with best practices to continuously improve our technology stack and development processes for backend engineering and big data processing


Requirements:

  • 5+ years of experience as a data engineer, or as a software engineer in the data field
  • 5+ years of experience with Python and its data libraries
  • 3+ years of experience working with cloud providers (GCP and/or AWS)
  • 5+ years of experience with data modeling, ETL processes, data warehousing, and BI
  • Practical knowledge of SQL scripting
  • Experience with big data technologies and data streaming platforms (e.g. Kafka)
  • Proven experience working with microservices and a GitOps approach
  • Experience with monitoring procedures and data quality tools
  • An enthusiastic self-learner who can confidently embrace new technologies, research innovative solutions, and deliver results
  • Strong communication skills and the ability to collaborate effectively with cross-functional teams
  • Ability to adapt quickly and effectively manage context switching


Nice to have:

  • Experience with BigQuery
  • Experience developing large-scale infrastructure or distributed systems
  • Practical experience building CI/CD data pipelines with Terraform or similar tools
  • Strong understanding of functional programming


#LI-Onsite
