Job Summary
A company is looking for a Data Engineer to design, build, and maintain scalable data pipelines and architectures.
Key Responsibilities
- Design, develop, test, and deploy data ingestion methods and pipelines using various technologies
- Ensure high-throughput and fault-tolerant data pipelines by applying best practices in data mapping and automation
- Collaborate with business partners to productionize and optimize enterprise analytics
Required Qualifications
- Bachelor's Degree or equivalent in Computer Science, Information Technology, or Engineering required
- 3-5 years of experience in data engineering with ETL or ELT patterns required
- 1-2 years of experience with cloud platforms (AWS, Azure, GCP) and with Python or SQL programming required
- Understanding of dimensional data modeling for data warehouses preferred
- Experience with big data streaming technologies such as Kafka preferred