Information & Communication Technology

Senior Data Engineer – Snowflake

About the role

Our client is at a pivotal moment, consolidating and transforming their data infrastructure into a unified enterprise architecture, and needs a senior data engineer with expert-level Snowflake skills to drive the transformation in the right direction.

You will be the most senior and experienced data engineer on the team, setting standards, defining best practices, and playing a central role in shaping how we build and operate data systems going forward. You’ll work closely with our data architect to plan and execute a migration from our current state to a modern, scalable future.

It’s hands-on, strategic, and high-impact, with a view to promoting this hire to Lead Data Engineer within 12 months.

**NB: YOU MUST BE LOCATED IN SYDNEY AND HAVE AUSTRALIAN PERMANENT RESIDENCY OR CITIZENSHIP**

The responsibilities

  • Lead the planning and execution of our cloud data warehouse consolidation (Redshift to Snowflake), working hand-in-hand with the data architect and wider team
  • Define and enforce engineering standards, coding best practices, and ways of working across the data engineering team
  • Maintain and improve existing Airflow ETL pipelines while technical debt is addressed and future-state systems are built out
  • Design and build robust, scalable data pipelines ingesting from REST APIs, Graph APIs, S3, SharePoint, and a variety of file formats
  • Administer Snowflake environments, user access, data sharing, roles, warehouses, and schemas across global teams
  • Provide technical guidance and oversight to onshore and offshore data engineers
  • Collaborate with BI, DevOps, IT, and occasionally senior leadership to translate technical decisions for non-technical stakeholders

The requirements

Essential:

  • Strong Snowflake expertise – including admin, data sharing, role-based access, multi-warehouse environments
  • Strong AWS experience (Redshift/S3), along with Python and SQL expertise
  • Deep understanding of ETL/ELT design and pipeline architecture
  • Airflow experience (or equivalent orchestration tool – Prefect, Dagster, etc.)
  • Proven ability to consume REST and Graph APIs, including authentication (OAuth, tokens, refresh flows) and rate limiting
  • Strong data system design skills, ideally with a software engineering background (version control, code review, CI/CD mindset)
  • Comfortable operating autonomously and setting standards in a small, high-trust team

Desirable:

  • Cloud-to-cloud data warehouse migration experience
  • Kubernetes familiarity
  • Snowflake certification, or exposure to Snowflake Cortex or AI/ML tooling

The interview process

  1. Culture & behavioural interview: Situational questions, ways of working, and mutual fit
  2. Technical take-home + live review: A practical task followed by a deeper technical conversation covering system design thinking and broader experience
  3. In-office presentation: Meet the wider team and present to a mixed technical and non-technical audience
  4. Offer