Job Description
- Permanent Role
- Location: Sydney (must have full working rights)
- 3 days WFO, 2 days WFH
- Salary: $170,000 - $180,000 + super + share options
Principal Data Engineer – Own the Technical Vision
As a Principal Data Engineer, you’ll take full ownership of the technical direction across the data engineering function — selecting the right tools, designing resilient systems, and making informed trade-offs where performance, cost, usability, and scale intersect. You’ll build with purpose — not just clean code, but clear, valuable outcomes.
We’re looking for someone with a strong software engineering background who now specialises in data engineering to help shape and scale the data strategy of a high-growth, VC-backed tech company reshaping the insurance experience.
You understand modern data platforms, data modelling best practices, and how to design ETL/ELT pipelines that are scalable, maintainable, and production-ready. You think beyond pipelines — you think about data as a product.
What sets you apart:
- You've worked in small to medium-sized businesses where you’ve had to lead technical decision-making, take ownership of data platforms end-to-end, and stay hands-on.
- You apply software engineering principles to data platform design, from architecture to automation to testing, and you've worked closely with Product and Engineering teams.
- You’ve built or scaled data platforms and understand the system holistically: storage, orchestration, transformation, data quality, security and access layers.
- You're experienced in working through technical trade-offs (batch vs real-time, cost vs complexity, performance vs flexibility) and you know how to explain and justify your choices.
- You don’t just default to a single solution; you approach problems by exploring multiple paths and designing fit-for-purpose outcomes.
- You work well in ambiguous environments, translating evolving business needs into robust technical solutions through strong stakeholder engagement.
- You bring a data-as-a-product mindset, designing pipelines and models that are reusable, observable, and built to scale.
What You Bring:
- 6–8+ years building and scaling data platforms, including warehouse design, data governance, and managing large-scale data environments.
- Expert in SQL and Python, writing production-ready code for both batch and real-time use cases.
- Strong experience with ETL/ELT and orchestration tools, with a sharp eye for performance and optimisation.
- Cloud-native mindset, confident working with Snowflake (or similar) and the wider AWS (or comparable cloud) toolbox: S3, Lambda, Step Functions, and Glue.
- Bonus: experience with dbt and modular SQL transformations.
If this sounds like you, reach out to ailbhe@theonset.com.au.
#LI-AL1