As a Senior Data Engineer at Wave, you will design and deploy components of a modern data platform, enhance AWS infrastructure, build data pipelines, and ensure data reliability and accessibility across teams, all while maintaining engagement with cross-functional stakeholders.
At Wave, we help small businesses thrive so the heart of our communities beats stronger. We work in an environment buzzing with creative energy and inspiration. No matter where you are or how you get the job done, you have what you need to be successful and connected. The mark of true success at Wave is the ability to be bold, learn quickly and share your knowledge generously.
Reporting to the Manager, Data Engineering, you will build tools and infrastructure to support the efforts of the Data Products and Insights & Innovation teams, and the business as a whole.
We’re looking for a talented, curious self-starter who is driven to solve complex problems and can juggle multiple domains and stakeholders. This highly technical individual will collaborate with all levels of the Data and AI team as well as various engineering teams to develop data solutions, scale our data infrastructure, and advance Wave to the next stage in our transformation as a data-centric organization.
This role is for someone with proven experience in complex product environments. Strong communication skills are a must to bridge the gap between technical and non-technical audiences across a spectrum of data maturity. At Wave, you’ll have the chance to grow and thrive by building scalable data infrastructure, enhancing a modern data stack, and contributing to high-impact projects that empower insights and innovation across the company.
Here's How You Make an Impact:
- You’re a builder. You will design, build, and deploy components of a modern data platform, including CDC-based ingestion using Debezium and Kafka, a centralized Hudi-based data lake, and a mix of batch, incremental, and streaming data pipelines.
- You ensure continuity while driving modernization. You will maintain and enhance the existing Amazon Redshift data warehouse and legacy Python ELT pipelines, ensuring stability and reliability, while accelerating the transition to a brand-new Databricks-based analytics and processing environment. This platform, integrated with dbt, will progressively replace the existing data environment.
- You balance innovation with operational excellence. You enjoy building fault-tolerant, scalable, and cost-efficient data systems, and you continuously improve observability, performance, and reliability across both legacy and modern platforms.
- You collaborate to deliver impact. You will work closely with cross-functional partners to plan and roll out data infrastructure and processing pipelines that support analytics, machine learning, and GenAI use cases. You enjoy enabling teams across Wave by ensuring data and insights are delivered accurately and on time.
- You thrive in ambiguity and take ownership. You are self-motivated and comfortable working autonomously, identifying opportunities to optimize pipelines and improve data workflows, even under tight timelines and evolving requirements.
- You keep the platform reliable. You will respond to PagerDuty alerts, troubleshoot incidents, and proactively implement monitoring and alerting to minimize incidents and maintain high availability.
- You’re a strong communicator. Colleagues rely on you for technical guidance. Your ability to clearly explain complex concepts and actively listen helps build trust and resolve issues efficiently.
- You’re customer-minded. You will assess existing systems, improve data accessibility, and deliver practical solutions that enable internal teams to generate actionable insights and enhance our external customers' experience.
You Thrive Here by Possessing the Following:
- Data Engineering Expertise: Bring 6+ years of experience in building data pipelines and managing a secure, modern data stack. This includes CDC streaming ingestion using tools like Debezium into a data warehouse that supports AI/ML workloads.
- AWS Cloud Proficiency: At least 3 years of experience working with AWS cloud infrastructure, including Kafka (MSK), Spark / AWS Glue, and infrastructure as code (IaC) using Terraform.
- Data Modelling and SQL: Fluency in SQL and a strong understanding of data modelling principles and data storage structures for both OLTP and OLAP.
- Databricks Experience: Experience developing or maintaining a production data system on Databricks.
- Strong Coding Skills: Write and review high-quality, maintainable code that enhances the reliability and scalability of our data platform. We use Python, SQL, and dbt extensively, and you should be comfortable leveraging third-party frameworks to accelerate development.
- Data Lake Development: Prior experience building data lakes on S3 using Apache Hudi with Parquet, Avro, JSON, and CSV file formats.
- CI/CD Best Practices: Experience developing and deploying data pipeline solutions using CI/CD best practices to ensure reliability and scalability.
- Data Governance Knowledge: Familiarity with data governance practices, including data quality, lineage, and privacy, as well as experience using cataloging tools to enhance discoverability and compliance.
- Data Integration Tools: Working knowledge of tools such as Stitch and Segment CDP for integrating diverse data sources into a cohesive ecosystem is a plus.
- Analytical and ML Tools Expertise: Knowledge and practical experience with Looker, Power BI, Athena, Redshift, or SageMaker Feature Store to support analytical and machine learning workflows is a definite bonus!
At Wave, we value diversity of perspective. Your unique experience enriches our organization. We welcome applicants from all backgrounds. Let’s talk about how you can thrive here!
Wave is committed to providing an inclusive and accessible candidate experience. If you require accommodations during the recruitment process, please let us know by emailing [email protected]. We will work with you to meet your needs.
Please note that we use AI-assisted note-taking in interviews for transcription purposes only. This helps ensure interviewers can remain fully present and engaged throughout the discussion.
This advertised posting is a current vacancy.
Top Skills
Amazon Redshift
Apache Hudi
AWS Glue
Databricks
dbt
Debezium
Kafka
Python
S3
SQL
Terraform