Job Summary
A company is looking for a remote Senior Data Engineer to design, develop, and maintain data pipelines and infrastructure.
Key Responsibilities
- Operate and own production data pipelines, ensuring data quality, reliability, and rapid incident resolution
- Design, build, and optimize batch and streaming pipelines using Airflow and manage data in a Redshift-based warehouse
- Collaborate with business stakeholders to support reporting and analytics, while championing best practices in data engineering
Required Qualifications
- 5+ years of experience in designing and operating production data pipelines and warehouses
- Bachelor's or Master's degree in Computer Science
- Expert-level proficiency in Python and advanced SQL for data processing and analytical workloads
- Experience with AWS services and core data-engineering technologies such as Airflow and Apache Spark
- Foundational knowledge of NoSQL technologies such as MongoDB and DynamoDB