Job Summary
A company is looking for a Data Engineer to join their Utility Data Ingest team.
Key Responsibilities
- Design, build, and maintain data pipelines using Databricks and Python
- Develop and manage data orchestration workflows using Airflow or similar tools
- Improve reliability and fault tolerance of ingest pipelines handling large volumes of utility data
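The fault-tolerance responsibility above can be illustrated with a minimal, stdlib-only sketch of a retry-with-backoff wrapper, the kind of pattern an ingest task might use around a flaky upstream call. All names here (`with_retries`, `flaky_ingest`) are hypothetical, not part of any tool mentioned in this posting.

```python
import time

def with_retries(fn, max_attempts=3, base_delay=0.0):
    """Call fn, retrying with exponential backoff on failure.

    A minimal fault-tolerance sketch; names and defaults are illustrative,
    not from a specific framework.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            # back off: 1x, 2x, 4x, ... the base delay
            time.sleep(base_delay * (2 ** (attempt - 1)))

# Example: a flaky "ingest" step that succeeds on the third call.
calls = {"n": 0}

def flaky_ingest():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = with_retries(flaky_ingest, max_attempts=5)
```

In practice an orchestrator such as Airflow provides retries and backoff as task-level settings, so a wrapper like this would only be needed for finer-grained control inside a task.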
Required Qualifications
- 2-5 years of experience in data engineering or backend engineering roles
- Strong working knowledge of Databricks, Spark, or similar distributed data systems
- Experience with Airflow or other workflow orchestration tools
- Proficient in Python and SQL, with an understanding of scalable data design
- Familiarity with cloud platforms (GCP or AWS)