Job Summary
A company is looking for a Big Data Lead.
Key Responsibilities
- Build and maintain ELT pipelines using Apache Airflow, dbt, and Snowflake in an AWS cloud environment (see the orchestration sketch after this list)
- Develop modular Python code and deploy containerized services in AWS with appropriate monitoring
- Integrate and process data from REST APIs while ensuring data quality and governance
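The sketch below illustrates the kind of orchestration the responsibilities describe: an Airflow DAG that runs dbt transformations targeting Snowflake, followed by a data quality check. The DAG name, project path, and schedule are hypothetical placeholders, and the `schedule` argument assumes Airflow 2.4+; this is a minimal illustration, not the company's actual pipeline.

```python
# Minimal sketch: an Airflow DAG orchestrating dbt runs against Snowflake.
# DAG id, project path, and schedule are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",                  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    # Build Snowflake models; assumes dbt is installed on the worker
    # and a dbt profile targeting Snowflake is configured.
    run_dbt = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",   # hypothetical path
    )

    # Basic data quality gate after the transformations.
    test_dbt = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    run_dbt >> test_dbt
```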
Required Qualifications
- Strong expertise in SQL and Snowflake, including performance tuning and data modeling
- Proficient in Python for scripting, automation, and working with REST APIs (see the ingestion sketch after this list)
- Experience with Apache Airflow for orchestration and workflow monitoring
- Hands-on experience with dbt for modular, version-controlled data transformations
- Solid experience with AWS services such as S3, Lambda, IAM, and CloudWatch
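To make the Python, REST API, and AWS qualifications concrete, here is a minimal ingestion sketch: fetch JSON records from an API with `requests` and land them in S3 with `boto3` for downstream loading into Snowflake. The endpoint URL, bucket name, and object key layout are hypothetical assumptions, not details from the posting.

```python
# Minimal sketch: pull JSON from a REST API and land it in S3.
# API_URL and BUCKET are hypothetical placeholders.
import json
from datetime import datetime, timezone

import boto3
import requests

API_URL = "https://api.example.com/v1/orders"   # hypothetical endpoint
BUCKET = "raw-ingest-bucket"                    # hypothetical S3 bucket

def extract_to_s3() -> str:
    """Fetch one batch of records and write them to S3 as a JSON object."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()                 # fail fast on HTTP errors (basic quality check)
    records = response.json()

    key = f"orders/{datetime.now(timezone.utc):%Y-%m-%d}/orders.json"
    s3 = boto3.client("s3")                     # credentials resolved via IAM role or environment
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(records).encode("utf-8"))
    return key

if __name__ == "__main__":
    print(f"wrote s3://{BUCKET}/{extract_to_s3()}")
```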