Job Summary
A company is looking for a Senior Data Engineer to lead the design and implementation of scalable data pipelines.
Key Responsibilities
- Architect end-to-end data processing pipelines covering ingestion, transformation, and delivery to the Delta Lakehouse
- Integrate with external systems to automate ingestion of diverse data sources
- Develop robust data workflows using Python and Databricks Workflows
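The ingestion → transformation → delivery pattern named in the responsibilities above can be sketched in plain Python. This is only an illustrative outline: the function names (`ingest`, `transform`, `deliver`) are hypothetical, and a real Databricks pipeline would operate on Spark DataFrames and Delta tables rather than Python lists.

```python
# Minimal sketch of an ingest -> transform -> deliver pipeline.
# All names here are illustrative stand-ins, not a Databricks API.

def ingest(raw_records):
    """Simulate ingestion: accept raw records from an external source,
    dropping empty payloads."""
    return [r for r in raw_records if r]

def transform(records):
    """Normalize field names and types into a consistent schema."""
    return [
        {"id": int(r["id"]), "name": r["name"].strip().lower()}
        for r in records
    ]

def deliver(records, sink):
    """Append transformed records to the sink (a stand-in for a
    Delta table write) and report the row count."""
    sink.extend(records)
    return len(records)

if __name__ == "__main__":
    sink = []
    raw = [{"id": "1", "name": " Alice "}, {}, {"id": "2", "name": "Bob"}]
    rows_written = deliver(transform(ingest(raw)), sink)
    print(rows_written)
```

In a production Databricks Workflows job, each stage would typically be a separate task so failures can be retried independently.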
Required Qualifications
- Master's degree in Computer Science, Engineering, Physics, or a related technical field, or equivalent work experience
- 3+ years of experience building and maintaining production-grade data pipelines
- Proven expertise in Python and SQL for data engineering tasks
- Strong understanding of lakehouse architecture and data modeling concepts
- Hands-on experience with AWS cloud infrastructure