
LiveKit

Senior Data Engineer

Reposted 17 Days Ago
Remote
Hiring Remotely in Canada
Senior level
As a Senior Data Engineer at LiveKit, you will own the analytics infrastructure, develop scalable GCP-based data pipelines, and ensure effective data movement and transformation, while collaborating with the Analytics and engineering teams.

LiveKit is building the infrastructure layer for the voice-driven era of computing. Our platform gives developers everything they need to build, test, deploy, scale, and observe agents in production. Founded in 2021, LiveKit powers voice AI applications for OpenAI, xAI, Salesforce, Coursera, Spotify, and thousands of others, collectively facilitating billions of calls each year.

You'll thrive at LiveKit if you:
  • obsess over crafting code that is fast, reliable, and practical for the problem at hand

  • are known as the go-to person for tackling tough technical problems

  • work hard and can build and ship fast

  • can clearly explain complex technical concepts to others

  • are a fast learner, frequently picking up new languages and tools

The best way to impress us is with thoughtful Issues and/or PRs on our GitHub repos 😊

About This Role:

As a Senior Data Engineer at LiveKit, you'll own the analytics infrastructure that powers our business intelligence and data analysis capabilities. Working closely with the Head of Data and analytics peers, you'll design and implement scalable GCP-based data pipelines — from ingestion through transformation to delivery — maximizing the GCP ecosystem for cost-effective solutions while integrating additional services or homegrown tooling where appropriate. While analytics infrastructure is the core focus, you'll also engage with the broader application data infrastructure, contributing your data pipeline expertise to support product and engineering needs. This is a foundational IC role with significant ownership over the architecture and direction of our analytics stack as the team grows.

What You’ll Do:

Own the Analytics Infrastructure: You are the end-to-end owner of our GCP-based data infrastructure — including ingestion, movement, storage, security, and availability. You build and operate reliable, scalable pipelines that power analytics, and partner closely with the Analytics team on downstream transformation and BI.

Maximize the Cloud Ecosystem: Build cost-effective solutions primarily within GCP-native services, while bringing transferable cloud infrastructure expertise. Know when to extend with third-party tooling or homegrown solutions, and make pragmatic tradeoffs.

Contribute Across Data Infrastructure: While analytics is the primary focus, you'll bring broad data pipeline expertise to application data needs in collaboration with the product engineering team.

Managed Services First: Favor managed solutions over self-hosting. Evaluate build vs. buy with cost and operational burden in mind.

Engineering Standards: This role reports to the Head of Data within the Engineering org. Expect PR reviews, automated testing, proper change management, and production-grade standards.

AI-First Development: Work extensively with AI coding assistants and contribute to evolving our AI development workflows and infrastructure.

Startup Pace: Priorities shift quickly. Balance long-term architectural thinking with the tactical execution the moment requires.

Who You Are:
  • 8+ years of experience in data engineering with strong Python and SQL expertise. You've built analytics data infrastructure from scratch — ideally more than once — and owned the architecture end-to-end

  • Experience with cloud-native data infrastructure (GCP preferred; strong AWS builders who can translate cloud concepts welcome). Familiarity with BigQuery, Dataflow, Cloud Storage, or equivalent services

  • Proven ability to design and implement production-grade data pipelines and aggregation layers for BI and analysis

  • AI-first development mindset with hands-on experience building AI-driven workflows and effectively using AI coding assistants

  • Strong understanding of data modeling, transformation patterns, and working with dbt

  • Experience with data movement tools (Estuary, Airbyte, Fivetran, or similar)

  • Solid infrastructure and DevOps fundamentals: Terraform or similar IaC, CI/CD, Git workflows, and change management

  • Experience implementing observability and monitoring for data systems (DataDog, Grafana, or similar)

  • Strong communication skills and ability to work cross-functionally with engineering and business stakeholders

  • Self-directed and comfortable with ambiguity in a fast-paced startup environment

  • Located in the US or Canada

Bonus:
  • Experience collaborating with analytics engineering teams on dbt projects

  • Background with AI workflow tools (n8n or similar)

  • Background with AI coding assistants

  • Prior experience as an early infrastructure hire building from the ground up

  • Prior experience building on GCP/BigQuery in production

Our Commitment to You:
  • An opportunity to build something truly impactful to the world

  • Contribute to open source alongside world-class engineers

  • Competitive salary and equity package

  • Health, dental, and vision benefits

  • Flexible vacation policy

LiveKit is an equal opportunity employer and does not discriminate on the basis of any characteristic protected by applicable law. If you require a reasonable accommodation during the application or interview process, please contact [email protected].


