Job Summary
A company is looking for a Senior Data Engineer to help transform the insurance industry by solving complex data challenges at scale.
Key Responsibilities
- Architect and implement modern, scalable ETL/ELT pipelines using AWS-native services to process insurance claims data
- Build resilient, high-throughput data pipelines with quality and reliability safeguards that keep enterprise data accurate
- Collaborate with cross-functional teams to deliver business-aligned solutions and streamline operations through automation
Required Qualifications
- 6+ years of experience building production data pipelines with Spark, PySpark, and orchestration tools
- 4+ years of deep AWS data engineering expertise, including services such as Glue and EMR
- 3+ years of experience architecting data solutions using AWS services, from S3 to Redshift
- Expert-level SQL skills for large-scale data processing and advanced analytics
- Strong engineering practices, including GitOps workflows and infrastructure-as-code methodologies