## Job Description
### About Smart Working
At Smart Working, we believe your job should not only look right on paper but also feel right every day. This isn’t just another remote opportunity; it’s about finding where you truly belong, no matter where you are. From day one, you’re welcomed into a genuine community that values your growth and well-being. Our mission is simple: to break down geographic barriers and connect skilled professionals with outstanding global teams and products for full-time, long-term roles. We help you discover meaningful work with teams that invest in your success, where you’re empowered to grow personally and professionally. Join one of the highest-rated workplaces on Glassdoor and experience what it means to thrive in a truly remote-first world.
### About the Role
This is a long-term, strategic role, not a short sprint. You'll be embedded in a collaborative engineering and analytics team, working across the full data lifecycle: ingestion, transformation, modelling, and surfacing insights through Looker. You'll work closely with stakeholders across commercial, product, and marketing to ensure data is reliable, scalable, and meaningful. You'll be given real ownership. This is a role for someone who wants to shape standards, improve the architecture, and grow with a brand that takes its data seriously.
### Responsibilities
- Design, build, and maintain robust ETL/ELT pipelines that move data from source systems into Google BigQuery, ensuring reliability, scalability, and observability at every stage.
- Develop and enforce data models and schema standards using best-practice SQL and dimensional modelling principles, focusing on clarity, reuse, and performance.
- Own the Google BigQuery environment, optimising queries, managing costs, enforcing data governance, and ensuring the platform scales alongside the business.
- Build and maintain Looker explores, LookML models, and dashboards that translate complex datasets into clear, actionable business intelligence for non-technical stakeholders.
- Work across the full Google Cloud Platform stack, including Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and Composer, to architect end-to-end data solutions.
- Partner with analytics, engineering, and commercial teams to understand data requirements and translate business problems into scalable technical solutions.
- Champion data quality and testing frameworks, implementing monitoring and alerting so that issues are caught early and resolved quickly.
- Contribute to documentation, coding standards, and architectural decision records so the team can move fast with confidence.
- Mentor junior data team members and set the bar for engineering rigor across the data function.
- Stay current with developments in the modern data stack and proactively recommend tooling or process improvements where appropriate.
### Requirements
- 5+ years of experience in SQL and data modelling, with strong command of dimensional modelling, star schemas, and performance optimisation.