Job Summary
A company is looking for a Big Data Lead.
Key Responsibilities
- Build and maintain ELT pipelines using Apache Airflow, dbt, and Snowflake in an AWS environment (see the pipeline sketch after this list)
- Develop modular Python code and deploy containerized services on AWS
- Ensure data quality and governance while integrating and processing data from REST APIs
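The Airflow, dbt, and Snowflake stack named above might be wired together along these lines. This is a minimal illustrative sketch, not a detail from the posting: the DAG id, schedule, script path, and dbt project directory are all assumptions.

```python
# Minimal Airflow DAG sketch: land REST API data, then transform it with dbt.
# All names and paths below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_pipeline",               # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Extract/load step: assumes a project script that pulls REST API data and stages it in S3
    extract_load = BashOperator(
        task_id="extract_load",
        bash_command="python /opt/pipeline/extract_load.py",  # hypothetical script path
    )

    # Transform step: runs dbt models against Snowflake (connection profile configured separately)
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt_project",  # hypothetical project dir
    )

    extract_load >> dbt_run
```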
Required Qualifications
- 10+ years of experience in data engineering
- Strong expertise in SQL and Snowflake, including performance tuning and data modeling
- Proficient in Python for scripting and automation (a brief example follows this list)
- Experience with AWS services such as S3, Lambda, and Redshift
- Familiarity with version control systems, particularly git, and CI/CD processes
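The Python and S3 qualifications above describe routine scripting work of roughly this shape. The sketch below is illustrative only; the bucket name, object key, and helper function are hypothetical.

```python
# Illustrative sketch: serialize extracted records and write them to S3 with boto3.
# Bucket, key, and sample data are hypothetical placeholders.
import json

import boto3


def upload_extract(records: list[dict],
                   bucket: str = "example-data-lake",
                   key: str = "extracts/latest.json") -> None:
    """Serialize records as JSON and upload them to the given S3 location."""
    s3 = boto3.client("s3")
    s3.put_object(Bucket=bucket, Key=key, Body=json.dumps(records).encode("utf-8"))


if __name__ == "__main__":
    upload_extract([{"id": 1, "value": "example"}])
```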