Job Summary
A company is looking for a Data Engineer (100% remote, PST hours).
Key Responsibilities
- Design and implement an open-format data lake with Databricks, PySpark, Snowflake, and BigQuery
- Build and maintain a unified audience store with robust contracts, pipelines, and replication strategies
- Migrate foundational datasets and pipelines to a cloud-based Databricks environment
Required Qualifications
- 5+ years of data engineering experience with large-scale data lakes and cloud platforms
- Hands-on expertise with Databricks, PySpark, and Snowflake (on AWS/S3 or GCP)
- Strong experience with Google BigQuery and data pipeline development
- Proven track record in data migration and building scalable, resilient pipelines
- Solid understanding of data contracts, metadata management, and replication strategies