Data Engineer (Google Cloud) - Opportunity to Work on Scalable Data Systems
A well-established SaaS organisation is seeking an experienced Data Engineer with a background in cloud infrastructure. This role involves working with large datasets and building efficient, scalable data systems using Google Cloud technologies to support the company's data-driven objectives.
Key Responsibilities:
Build and maintain efficient data pipelines on Google Cloud Platform (GCP), ensuring scalability and reliability.
Utilise tools such as Google BigQuery, Apache Spark, Apache Beam, Airflow, and Cloud Composer to manage and process large datasets.
Collaborate with engineering, product, and data teams to create insightful reporting and visualisation tools for internal teams and clients.
Maintain and improve datasets and models to meet business requirements.
Own and continuously enhance the internal data engineering stack, focusing on best practices and scalability.
What We're Looking For:
3-5 years of experience as a Data Engineer, with hands-on work on cloud platforms (Google Cloud experience is a plus).
Strong knowledge of data warehousing (e.g., Google BigQuery), data processing (Apache Spark, Beam), and pipeline orchestration (Airflow, Cloud Composer).
Proficiency with SQL and NoSQL databases (e.g., Cloud Datastore, MongoDB) and storage systems (e.g., Google Cloud Storage, S3).
Strong experience with Python or Java (8+) and object-oriented programming.
Familiarity with development tools such as GitHub, Jira, Docker, and Kubernetes.
What's on Offer:
Hybrid working options to support a balanced work-life environment.
A comprehensive benefits package, including health and wellness programmes, an electric car scheme, and childcare support.
Opportunities to engage in company working groups focused on initiatives like CSR, DE&I, and mental health support.
Enhanced family leave policies and career development opportunities.