Job Summary
A company is looking for a Data Engineer to design, build, and optimize data pipelines and analytics platforms.
Key Responsibilities:
- Design and implement scalable data architectures using BigQuery, Iceberg, Starburst, and Trino
- Develop and optimize robust ETL/ELT pipelines for structured and unstructured data
- Collaborate with data scientists and analysts to integrate data from multiple sources and ensure data integrity
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field
- 4+ years of experience in data engineering, with expertise in BigQuery and Google Cloud Platform (GCP)
- Strong experience with Apache Iceberg, Starburst, and Trino for large-scale data processing
- Proficiency in SQL, including query optimization and performance tuning
- Hands-on experience with pipeline orchestration and infrastructure automation tools such as Airflow, Terraform, or Kubernetes