
Alpaca

Analytics Engineer

Posted 6 Days Ago
Remote
2 Locations
Mid level

Who We Are:

Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure for stocks, ETFs, options, crypto, fixed income, 24/5 trading, and more. Our recent Series C funding round brought our total investment to over $170 million, fueling our ambitious vision.

Through its subsidiaries, Alpaca is a licensed financial services company, serving hundreds of financial institutions across 40 countries with our institutional-grade APIs. These include broker-dealers, investment advisors, wealth managers, hedge funds, and crypto exchanges, totaling over 6 million brokerage accounts.

Our global team is a diverse group of experienced engineers, traders, and brokerage professionals who are working to achieve our mission of opening financial services to everyone on the planet. We're deeply committed to open-source contributions and fostering a vibrant community, continuously enhancing our award-winning, developer-friendly API and the robust infrastructure behind it.

Alpaca is proudly backed by top-tier global investors, including Portage Ventures, Spark Capital, Tribe Capital, Social Leverage, Horizons Ventures, Unbound, SBI Group, Derayah Financial, Elefund, and Y Combinator.


Our Team Members:

We're a dynamic team of 230+ globally distributed members who thrive working from our favorite places around the world, with teammates spanning the USA, Canada, Japan, Hungary, Nigeria, Brazil, the UK, and beyond!
We're searching for passionate individuals eager to contribute to Alpaca's rapid growth. If you align with our core values—Stay Curious, Have Empathy, and Be Accountable—and are ready to make a significant impact, we encourage you to apply.

Your Role:

We are seeking an Analytics Engineer to own and execute the vision for our data transformation layer. You will be at the heart of our data platform, which processes hundreds of millions of events daily from a wide array of sources, including transactional databases, API logs, CRMs, payment systems, and marketing platforms.

You will join our 100% remote team and work closely with Data Engineers (who manage data ingestion) and Data Scientists and Business Users (who consume your data models). Your primary responsibility will be to use dbt and Trino on our GCP-based, open-source data infrastructure to build robust, scalable data models. These models are critical for stakeholders across the company—from finance and operations to the executive team—and are delivered via BI tools, reports, and reverse ETL systems.

Our team is 100% distributed and remote.
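
For a concrete picture of the work, here is a minimal sketch of the kind of dbt model this transformation layer is built from. Everything in it is illustrative: the model, source, and column names are assumptions rather than Alpaca's actual schema. It shows an incremental dbt model that aggregates raw trade events into a daily metric using Trino-compatible SQL.

    -- models/marts/fct_daily_trades.sql
    -- Hypothetical example: model, source, and column names are assumptions,
    -- not Alpaca's actual schema.
    {{ config(materialized='incremental', unique_key='trade_date') }}

    with trades as (

        -- upstream staging model maintained by the transformation layer
        select * from {{ ref('stg_trades') }}

    ),

    daily as (

        select
            cast(executed_at as date) as trade_date,
            count(*)                  as trade_count,
            sum(quantity * price)     as gross_notional_usd
        from trades
        {% if is_incremental() %}
        -- on incremental runs, only reprocess days at or after the latest loaded day
        where cast(executed_at as date) >= (select max(trade_date) from {{ this }})
        {% endif %}
        group by 1

    )

    select * from daily

In practice, a model like this would sit alongside staging models, documented columns, and scheduled runs, and would feed the BI tools and reverse ETL jobs mentioned above.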

Responsibilities:

  • Own the Transformation Layer: Design, build, and maintain scalable data models using dbt and SQL to support diverse business needs, from monthly financial reporting to near-real-time operational metrics.
  • Set Technical Standards: Establish and enforce best practices for data modeling, development, testing, and monitoring to ensure data quality, integrity (up to cent-level precision), and discoverability.
  • Enable Stakeholders: Collaborate directly with finance, operations, customer success, and marketing teams to understand their requirements and deliver reliable data products.
  • Integrate and Deliver: Create repeatable patterns for integrating our data models with BI tools and reverse ETL processes, enabling consistent metric reporting across the business.
  • Ensure Quality: Champion high standards for development, including robust change management, source control, code reviews, and data monitoring as our products and data evolve (a minimal test sketch follows this list).
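
As noted in the "Ensure Quality" item above, one lightweight way to enforce cent-level integrity is a singular dbt test. The sketch below is illustrative only, with assumed model and column names: it returns any account whose ledger entries do not reconcile with its reported balance to within one cent, and dbt flags any returned rows as test failures.

    -- tests/assert_ledger_balances_reconcile.sql
    -- Hypothetical singular test: model and column names are assumptions,
    -- not Alpaca's actual schema. Any rows returned are reported as failures.
    select
        l.account_id,
        round(sum(l.amount_usd), 2) as ledger_total_usd,
        max(b.balance_usd)          as reported_balance_usd
    from {{ ref('fct_ledger_entries') }} l
    join {{ ref('dim_account_balances') }} b
      on l.account_id = b.account_id
    group by l.account_id
    having abs(round(sum(l.amount_usd), 2) - max(b.balance_usd)) > 0.01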

Must-Haves:

  • Core Experience: 3+ years of experience in data analytics or data engineering with a strong focus on the "T" (transformation) in ELT.
  • Expert SQL Skills: High fluency in SQL for complex queries and data manipulation on large datasets.
  • Analytics Engineering Fundamentals: Deep understanding of data modeling, transformation principles, and data engineering best practices (e.g., source control, code reviews, testing).
  • dbt Experience: Proven experience building scalable transformation layers using formalized SQL modeling tools, preferably dbt.
  • Technical Versatility:
  • Work Ethic: Comfortable with ambiguity, able to take ownership with minimal oversight, and adaptable in a fast-paced environment.

Nice to Haves:

  • Experience with data ingestion tools (e.g., Airbyte) and orchestration tools (e.g., Airflow).
  • Experience with semantic layer modeling (e.g., Cube, dbt Semantic Layer).

How We Take Care of You:
  • Competitive Salary & Stock Options
  • Health Benefits
  • New Hire Home-Office Setup: One-time USD $500
  • Monthly Stipend: USD $150 per month via a Brex Card

Alpaca is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.

Recruitment Privacy Policy

Top Skills

BI Tools
dbt
Google Cloud Platform
SQL
Trino
