A Perth-based SaaS tech company is seeking a GCP Data Engineer to join their team on a permanent basis.
Your new company
This company is a well-known SaaS provider that helps businesses verify and validate their advertising traffic.
Your new role
In your new role you will focus on designing, building and maintaining data architecture on GCP – specifically, building a data warehouse using BigQuery.
You will work with implementation teams, providing deep expertise to deploy large-scale enterprise data solutions using modern cloud data analytics services on GCP.
Duties will involve:
- Migrating real-time data from third-party databases into one centralised data warehouse – BigQuery
- Integrating massive datasets from multiple data sources to complete data modelling work
- Automating processes within the predictive pipeline to free up team time
- Ensuring data quality throughout the pipeline and build process
- Designing mature software using DevOps principles, with tools including Cloud Composer, dbt (Data Build Tool) and Great Expectations
- Querying datasets, visualising results and creating detailed reports
- Designing usable, privacy-aware AI that accurately detects data and selection bias, using systems such as BigQuery ML, Google AI Pipeline and AutoML
What you’ll need to succeed
The successful candidate will have proven experience working as a Data Engineer on large-scale data warehouse projects. You will bring at least 2-3 years of project experience to the role and a genuine passion for big data and AI.
You MUST have the following key skills:
- Proven experience with Google BigQuery
- Experience designing and building data pipelines, from data ingestion through to consumption/visualisation, within a “big data” architecture using Java, Python or Scala
- Knowledge or previous use of the following solutions: Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, Cloud Pub/Sub, Cloud Functions, Cloud Run, Airflow and Cloud Composer
- Experience designing, building and deploying data solutions using Google Cloud, AWS or Azure
- Strong understanding of data governance principles and security requirements on GCP
- Extensive experience writing complex SQL queries and stored procedures
- Experience using SQL and Python to build and monitor data pipelines and detect data quality issues
- Experience using data visualisation tools such as Microsoft Power BI, Tableau, Python visualisation libraries or Data Studio
- Proven experience working with semi-structured data
What you’ll get in return
This role offers an excellent salary package, extremely flexible work arrangements (including 100% remote work for the right applicant) and great opportunities for career growth and development. Working for this company will also give the successful applicant the chance to work with major big data companies such as Google, AWS and Imply.
What you need to do now
If you’re interested in this role, click ‘apply now’ to forward an up-to-date copy of your CV, or call Hannah Becker now.
If this job isn’t quite right for you but you are looking for a new position, please contact us for a confidential discussion on your career.
SA Licence number: LHS 297508