Job Summary
A company is looking for a Data Engineer to design, build, and optimize data pipelines and analytics platforms.
Key Responsibilities:
- Design and implement scalable data architectures leveraging BigQuery, Iceberg, Starburst, and Trino
- Develop robust ETL/ELT pipelines and optimize SQL queries for efficient analytics
- Collaborate with data scientists and stakeholders to integrate data from multiple sources
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field
- 4+ years of experience in data engineering, particularly with SQL, BigQuery, and GCP
- Strong experience with Apache Iceberg, Starburst, and Trino for large-scale data processing
- Proficiency in SQL, including query optimization and performance tuning
- Hands-on experience with Terraform, Kubernetes, or Airflow for automating data pipelines and their supporting infrastructure