Senior Data Engineer
We usually respond within two weeks
We are bsport. The place to be!
bsport is an all-in-one platform combining boutique fitness and advanced technology. Our platform helps our partners manage their bookings, payroll, marketing and more, to streamline operations and boost their commercial success.
Since we launched in 2019, we have already achieved the following:
Built a community of over 10 million users 🤝
Finalised our €30 million Series B in December 2024 🏆
Grown our team to more than 200 employees 🚀
We’re continuing to grow to become the #1 tech partner for boutique studios in Europe and the rest of the world!
Join us to write the next chapter of your career!
🎯 What you'll own & the role
The Data Platform Team builds and operates the infrastructure that powers every data use case at bsport: from raw ingestion to orchestration, quality monitoring, and the foundation for analytics and AI/ML. Your primary stakeholders are the full Data team (Analytics Engineers, Data Analysts, and Data Scientists), who develop on the platform, leveraging the ingested data layer and compute capacity to create the reporting and enriched data layers. You will also work closely with the SRE team to keep the platform secure, reliable, and scalable.
As a Senior Data Engineer at bsport, you will:
Build and operate the data platform (Databricks, Airflow, AWS): compute configurations, Unity Catalog, access controls, and cost management to keep it performant and reliable.
Design and ship the ingestion layer bringing data from our operational systems (PostgreSQL, Kafka, SaaS tools) into the lakehouse data layer (with schema evolution, deduplication, and lineage tracking).
Standardise ingestion patterns across batch and CDC use cases, so every new source follows the same playbook for reliability and observability.
Build and maintain the observability framework covering platform health, pipeline SLAs, and data quality.
Drive developer experience across the Data team: shared libraries, CI/CD for data assets, self-service tooling, and contribution standards that make everyone faster.
🌟 You would be a great fit if you have:
5+ years of experience as a Data Engineer, with meaningful time spent on platform and infrastructure work.
Deep hands-on experience with Databricks (Unity Catalog, Delta Lake, job compute) and Apache Airflow in production.
Strong Python skills with a software engineering mindset: you write testable, maintainable code and hold others to the same bar.
You've built or significantly improved ingestion systems: you know the failure modes and tradeoffs between batch and streaming approaches.
Comfortable with AWS and Terraform for infrastructure provisioning.
You care about developer experience: you've introduced tooling or processes that made your team meaningfully more productive.
A proactive, problem-solving mindset
Strong communication skills and fluency in English to collaborate across teams.
🧰 Stack
Cloud: AWS (S3, RDS, MSK)
Data platform: Databricks, Delta Lake, Unity Catalog
Orchestration: Apache Airflow
Ingestion: dlthub, Debezium + Kafka (CDC)
Transformation: dbt, PySpark
Infrastructure as Code: Terraform, Helm
Monitoring: Grafana
CI/CD: GitLab CI
Container orchestration: Kubernetes, Docker
💬 Our Hiring Process:
👋🏼 Interview with Talent
🎙️ Tech Talk
🧑🏽‍💻 Live Coding
📐 System Design
🤝 Final Interview with Founder: cultural fit
🌈🌍 Diversity is one of our most valuable assets, and we are committed to fostering an inclusive environment where everyone can contribute their best work. We welcome applicants from all backgrounds, identities, and experiences to help us build a more inclusive, equitable team.
If you’re excited about this role but don’t meet every qualification, we encourage you to apply: curiosity, adaptability, and a willingness to learn are just as important to us as specific skills.
What We Offer
We believe great work comes from happy, supported people. That’s why we offer meaningful perks designed to promote balance, growth, and connection.
💵 Attractive compensation package
Competitive salary packages based on your experience and role.
💻 Work-Life harmony
Hybrid model with remote days to support balance and flexibility.
🌎 Work from anywhere
Enjoy up to 15 days of remote work from abroad each year.
💪🏽 Exclusive perks
helloCSE: a discount platform for fitness, shopping, culture, cinema, and live events.
🌍 Diverse, fun-loving team
Multicultural colleagues, after-work events, team-building & more.
- Department: Data
- Location: Paris
- Remote status: Hybrid