Job Summary
A company is looking for a Data Engineer to build and maintain its data infrastructure.
Key Responsibilities
- Develop and maintain data pipelines using Airflow for data ingestion and transformation (an illustrative sketch follows this list)
- Manage data models in the data warehouse using dbt and optimize queries in BigQuery
- Monitor data pipelines for performance issues and collaborate with data analysts and data scientists to meet their data needs
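For illustration only, a minimal sketch of the kind of Airflow pipeline described above might look like the following. The DAG id, task names, schedule, and callables are hypothetical placeholders rather than details from the posting, and it assumes Airflow 2.4 or later:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def ingest_source_data():
        # Hypothetical placeholder: pull raw records from a source system.
        pass


    def transform_for_warehouse():
        # Hypothetical placeholder: clean and reshape records before loading.
        pass


    with DAG(
        dag_id="example_ingestion_pipeline",  # illustrative name, not from the posting
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # assumes Airflow 2.4+, where `schedule` replaces `schedule_interval`
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest", python_callable=ingest_source_data)
        transform = PythonOperator(task_id="transform", python_callable=transform_for_warehouse)

        # Ingestion runs before transformation.
        ingest >> transform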
Required Qualifications
- Basic understanding of data warehousing, ETL/ELT principles, and data modeling
- Proficiency in SQL and Python
- Familiarity with Apache Airflow, dbt, and Google BigQuery
- Experience with version control using Git and GitHub
- Familiarity with Google Cloud Platform (GCP) or other major cloud providers