
Principal GCP Data Engineer

6/19/2025

Remote

Job Summary

A company is looking for a Principal GCP Data Engineer to support data initiatives and build data pipelines.

Key Responsibilities
  • Design and build data ingestion and ETL pipelines from scratch using SnapLogic, Python, SQL, Dataflow, and Spark
  • Migrate data to GCP and build out the GCP BigQuery warehouse while sunsetting legacy ETL processes
  • Support data modeling and orchestration, focusing on the development of new DAGs using Airflow or Cloud Composer
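The responsibilities above center on building extract-transform-load pipelines. As a rough illustration of that pattern, here is a minimal plain-Python sketch; all record shapes and function names are hypothetical, and the actual role would use SnapLogic, Dataflow, or Spark with BigQuery as the warehouse sink:

```python
# Minimal ETL sketch (hypothetical data; illustrative only).
# Real pipelines for this role would run on SnapLogic, Dataflow,
# or Spark and load into a BigQuery warehouse.

def extract(raw_rows):
    """Simulate ingestion: yield raw records from a source system."""
    yield from raw_rows

def transform(rows):
    """Normalize fields and drop records missing the required key."""
    for row in rows:
        if "id" not in row:
            continue  # reject malformed source records
        yield {"id": row["id"], "amount": float(row.get("amount", 0))}

def load(rows):
    """Simulate a warehouse load: materialize rows into a target table."""
    return list(rows)

if __name__ == "__main__":
    source = [{"id": 1, "amount": "9.50"}, {"amount": "3"}, {"id": 2}]
    table = load(transform(extract(source)))
    print(table)
```

The three-stage chain mirrors how an orchestrator such as Airflow or Cloud Composer would sequence ingestion, transformation, and warehouse-load tasks as separate DAG steps.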
Required Qualifications
  • 5–6 years of data engineering experience, with a focus on building data pipelines
  • Experience with GCP technologies, particularly BigQuery and SnapLogic
  • Proficiency in orchestration tools such as Airflow or Cloud Composer
  • Familiarity with data warehousing fundamentals and data modeling
  • Experience with additional technologies such as Kafka, Java, Apache Beam, or Alteryx is a plus