The Data Engineer designs and optimizes data infrastructure, builds ETL pipelines, and collaborates with teams to ensure data accessibility and reliability.
Location - Toronto Hub
Summary
We are seeking a talented and experienced Data Engineer to join our growing team. The Data Engineer will play a key role in designing, building, and maintaining our data infrastructure, ensuring scalability, reliability, and performance.
Who You Are:
- An engineer at heart who takes pride in a clean and powerful code base.
- Deeply knowledgeable about data architecture and data engineering best practices, and passionate about making data accessible.
- Enjoys sharing knowledge and working in a collaborative team environment.
- Ready to take ownership and make decisions.
- Data governance, security, and compliance are always top of mind.
- Experience in ETL/ELT architecture: expertise in designing and optimizing ETL/ELT pipelines that handle various data formats (CSV, JSON, Parquet, etc.) and integrate data from multiple sources (e.g., APIs, cloud storage, client Snowflake shares).
- Strong understanding of REST API principles, experience with high-volume API requests, and ability to optimize API calls and data ingestion strategies.
- Proficiency with orchestration tools such as Apache Airflow (or similar) to automate and manage data workflows.
- Expertise in building efficient ETL/ELT workflows to enable scalable feature engineering.
- Experience in performance testing and optimization (data load testing, performance tuning, monitoring) for various databases and ETL pipelines.
- Experience building and testing resilient infrastructure using IaC and cloud-specific features for disaster recovery.
- Experience working in an Agile environment.
- Experience building data products in large-scale distributed systems.
- Knowledge of industry best practices and compliance standards such as DAMA, CCPA, and PIPEDA.
What You Will Do:
- Work with business partners and stakeholders to understand data requirements and support data-related projects.
- Work with engineering teams, product teams, and third parties to collect required data.
- Drive data modeling and warehousing initiatives to enable and empower data consumers.
- Develop ETL/ELT pipelines to ingest, prepare and store data for the product, analysis, and data science.
- Develop and implement data quality checks, conduct QA, and establish monitoring routines.
- Improve the reliability and scalability of our ETL processes.
- Develop and manage backup and recovery strategies to ensure data integrity and availability.
- Ensure our system is architected to balance cost and latency.
- Collaborate with partners to execute compliance, security, and privacy reviews/audits.
- Deploy data infrastructure to support product, analysis, and data science.
Qualifications:
- Education: Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Experience: Minimum of 4 years working with databases, preferably in a platform-tooling and automation environment.
- Technical Skills: Programming languages and tools (PySpark, Scala, Python, SnowSQL, Snowpipe, SQL, Terraform); data orchestration and automation (Airflow, Kubernetes, or similar); cloud infrastructure and data management systems (MongoDB, Snowflake, Databricks, Azure, or similar).
- Problem-Solving: Strong analytical and problem-solving skills.
- Communication: Excellent verbal and written communication skills.
- Team Player: Ability to work effectively in a collaborative team environment.
- Knowledge of DevOps and mobile software development practices and tools.
Top Skills
Airflow
Azure
Databricks
Kubernetes
MongoDB
PySpark
Python
Scala
SnowSQL
Snowflake
Snowpipe
SQL
Terraform