
Knak

Senior Data Engineer

Remote
Hiring Remotely in Canada
Senior level

Knak is a mission-driven company

Why? Because our time is limited, our competition is fierce, and our margin for error is small. For us to have the greatest impact on the world, we need to be laser focused on our core mission, which is...

Empowering people to be creative.

That’s why Knak exists.

We are a world-class enterprise email and landing page creation platform, focused on making customers successful and happy by providing an incredibly powerful yet easy-to-use creation experience.

Our industry-leading SaaS solution is built by Marketers, for Marketers. We know that it’s the small things that make the biggest impact, and that emails and landing pages are where the rubber hits the road when it comes to Marketing Automation. We change the way Marketers work by making them more efficient, improving the conversion rates of their campaigns, and helping them stay on brand.

Oh, and we have a bit of fun while doing it, too!

About Knak

Knak is the no-code email and landing page creation platform built for enterprise marketing teams. Our customers are some of the largest, most demanding brands in the world, and they trust us to power campaigns that reach millions. Behind the product is a small, high-leverage data team building the analytical foundation that every department, including Product, Sales, Marketing, Customer Success, Finance, and Security, depends on to make decisions.

About the role

This is the first Data Engineer hire at Knak, and the role is foundational for Knak’s Data Infrastructure and AI enablement strategy. Snowflake holds the company's product usage, customer, pipeline, revenue, and behavioral data, but most of the company can't get to it without routing through one or two people. Our goal is that anyone at Knak can access the data they need from wherever they work through internal AI agents and champion-built tools. None of that is possible today because the data layer underneath doesn't exist.

You will architect and own the governed query layer that fixes this: the per-department Snowflake views, the PII scrubbing logic, the schema documentation, and the connection standards that let analysts, AI agents, and downstream tools across the company query data safely and self-serve. This is a senior IC role with architectural ownership, not a maintenance role. The work is greenfield, cross-functional, and directly tied to a 90-day implementation plan with executive visibility.

You will set technical direction in partnership with the Director of Data & Analytics, work directly with department leaders (VP Customer Success, CMO, VP Sales, CFO, Head of Security) to scope and prioritize view requirements, and partner with DevOps on connection patterns. You will be expected to make the hard architectural calls (what to expose, what to scrub, what to deprecate) and to defend those decisions to engineering and executive stakeholders. The technical foundation you build in your first six months will shape how every future agent, dashboard, and analytical product at Knak gets built, and you will help shape the data team as it grows.

What you will do
  • Power the AI Champion Program. The data layer you build directly determines whether the company's AI ambitions land or stall. Customer Success's automated EBR generation, Marketing's outbound qualification engine and lifecycle Slack bot, Sales' lookalike scoring, Finance's research agent, and Security's compliance matrix are all real projects, with real owners, currently blocked on governed Snowflake access. You will be the person who unblocks them, and the technical partner those champions work with as their tools mature from prototype to production.
  • Architect and own the governed Snowflake layer. Design, build, and maintain per-department read-only views in Snowflake covering product usage, customer accounts, pipeline, revenue, and behavioral signals. Make the architectural calls on materialization strategy, refresh cadence, and access boundaries. Each view is scoped to a department's use case, respects contractual restrictions on customer data, and excludes PII at the view layer.
  • Define connection standards for AI agents and applications. Partner with DevOps to publish connection patterns (authentication, rate limiting, error handling, approved deployment paths) so that AI agents and champion-built tools can consume governed data without ad-hoc Engineering tickets.
  • Encode PII scrubbing and compliance as code. Translate policy decisions on customer data, contract restrictions, and PII handling into view logic. View-level scrubbing is the primary control, ahead of any browser-level or tool-level redaction. You will work with Security to validate that views respect contractual constraints.
  • Document the warehouse. Publish schema documentation for every view: what fields exist, what they mean, how they connect to Salesforce, Tableau, Gong, and other sources, and what is excluded and why. This documentation is the contract between the data team and every downstream consumer.
  • Modernize the analytics codebase. Convert ad-hoc SQL, brittle stored procedures, and one-off pipelines into version-controlled, tested, modular models. Establish dbt (or equivalent) as the transformation layer. Introduce CI for SQL changes. Replace silent-failure stored procedures with observable, alerted pipelines.
  • Support the Tableau and Mixpanel layers. Several of our most-used dashboards (the Knak Product Analytics workbook, the AI OKR Tracker, the MAA/MAU rolling time-series) sit on top of Snowflake. You will own the data contracts those dashboards depend on, including the upstream extract pipelines, type-casting rules, and the daily snapshot architecture for behavioral metrics.
  • Set technical direction and raise the bar. Establish patterns, conventions, and standards that the next two to three data hires will inherit. Lead code review for SQL and transformation work. Mentor analysts across the company on warehouse design. Represent the data team in architectural discussions with Engineering and DevOps.
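To give a flavor of the view-level scrubbing work described above, here is a minimal sketch of generating a per-department read-only view that exposes only an allowlist of non-PII columns. All table, column, and department names are hypothetical; in practice these views would live in dbt models or versioned migrations, not a script.

```python
# Sketch: build per-department view DDL that excludes PII at the
# view layer. Names are illustrative, not Knak's actual schema.

PII_COLUMNS = {"email", "full_name", "ip_address"}

# Columns each department has requested (some include PII, which
# gets filtered out before the view is created).
DEPARTMENT_COLUMNS = {
    "sales": ["account_id", "pipeline_stage", "arr", "email"],
    "marketing": ["account_id", "campaign_id", "full_name"],
}

def build_view_ddl(department: str, source_table: str) -> str:
    """Return CREATE VIEW DDL exposing only non-PII columns."""
    requested = DEPARTMENT_COLUMNS[department]
    allowed = [c for c in requested if c not in PII_COLUMNS]
    cols = ", ".join(allowed)
    return (
        f"CREATE OR REPLACE SECURE VIEW analytics.{department}_accounts AS\n"
        f"SELECT {cols} FROM {source_table};"
    )

print(build_view_ddl("sales", "raw.salesforce_accounts"))
```

The point of generating DDL from a declarative allowlist is that the PII policy lives in one reviewable place, rather than being re-implemented by hand in every view.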
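The connection standards mentioned above typically include rate limiting for AI agents hitting the warehouse. A toy sliding-window limiter, written with injected timestamps so the behavior is easy to verify (the class and parameter names are ours, not an existing API):

```python
from collections import deque

class RateLimiter:
    """Allow at most max_calls per period seconds (sliding window)."""

    def __init__(self, max_calls: int, period: float):
        self.max_calls = max_calls
        self.period = period
        self.calls = deque()  # timestamps of recent allowed calls

    def allow(self, now: float) -> bool:
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.period:
            self.calls.popleft()
        if len(self.calls) < self.max_calls:
            self.calls.append(now)
            return True
        return False
```

A real standard would pair this with authentication, retries with backoff, and error-handling conventions, but the shape of the control is the same.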
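And for the codebase-modernization bullet, a sketch of the kind of CI check that catches silent-failure patterns in SQL changes before they merge. The rules below are illustrative examples, not an actual ruleset:

```python
import re

# Each rule pairs a pattern with a human-readable message. These two
# are examples: SELECT * leaks columns past governance reviews, and
# broad exception handlers are how stored procedures fail silently.
RULES = [
    (re.compile(r"select\s+\*", re.IGNORECASE),
     "avoid SELECT * in governed views; list columns explicitly"),
    (re.compile(r"when\s+other\s+then", re.IGNORECASE),
     "broad exception handler; make failures observable, not silent"),
]

def lint_sql(sql: str) -> list[str]:
    """Return the list of issues a SQL change triggers."""
    return [msg for pattern, msg in RULES if pattern.search(sql)]
```

Wired into CI, a non-empty result fails the pull request, which is how conventions stop being tribal knowledge.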
What we are looking for

Required
  • 5+ years building production analytics systems on a cloud data warehouse (Snowflake strongly preferred; BigQuery, Redshift, or Databricks experience is transferable).
  • Deep SQL fluency. You can read a 200-line stored procedure and tell us what is wrong with it. You write CTEs and window functions without thinking. You have used INFORMATION_SCHEMA, EXECUTE IMMEDIATE, and exception handling in production. 
  • Demonstrated architectural ownership of a data governance layer. You have designed role-based access, view-level scrubbing, or row-level security for a company at meaningful scale, and you can walk us through the tradeoffs you made and what you would do differently.
  • Production experience with a transformation framework (dbt, SQLMesh, or equivalent), version control discipline (Git, code review, CI for SQL), and a track record of introducing or significantly improving these practices on a team.
  • Working understanding of the BI and product analytics layer (Tableau, Looker, Mode, Mixpanel, or similar) and the judgment to anticipate how upstream warehouse decisions will affect downstream dashboards before they break.
  • Track record of mentoring and influencing peers. You have been the person other analysts and engineers go to for warehouse design questions, and you can point to specific examples of decisions you led that others adopted.
  • Strong written communication. The schema documentation, design docs, and decision records you produce are read by people who do not work with you every day, and by executives who need to understand the tradeoffs.
Nice to have
  • Experience with Fivetran or similar ELT tooling, especially connectors for AWS, MySQL, Salesforce, LinkedIn Ads, and product event sources.
  • Experience supporting AI agents or LLM applications as data consumers, including connection patterns, rate limits, and prompt-safe data shaping.
  • Python for data engineering (programmatic Tableau workbook edits, ad-hoc ETL, openpyxl, pandas).
  • Familiarity with PII handling under enterprise SaaS contracts.
  • Prior experience as the second or third data hire at a growing company.

What We Offer

At Knak we have four foundational pillars: culture, customers, product, and growth. Culture is our number one pillar because we know it is at the core of building a strong company that can build amazing products and delight our customers. We do this with a laser focus on hiring the right people: people who are smart, positive, and who want more than the typical nine-to-five.

We offer an extremely rewarding, second-to-none work environment, as recognized by Ottawa’s Best Places to Work 2025! We show our investment in our people through competitive salaries, equity in the company, great benefits, paid vacation, Life leave days (because life happens), team lunches and off-sites, and most importantly our commitment to YOUR career growth.

If this sounds like something you’re looking for, then we’d love to hear from you! 

If you don’t see yourself fully reflected in every job requirement listed on the posting above, we still encourage you to reach out and apply. Research has shown that women and underrepresented groups often only apply when they feel 100% qualified. We strongly encourage applicants of all genders, ages, ethnicities, cultures, abilities, sexual orientations, and life experiences to apply. Knak believes in creating an inclusive, barrier-free working environment. If you require ANY accommodation to the interview process please contact [email protected].

At Knak, our recruitment process includes AI screening for keywords and minimum qualifications, as well as video interviews which are transcribed with AI. Humans are still at the core of our decision making!



