Job Summary
A company is looking for a Big Data Lead to support the migration of legacy ETL pipelines to modern cloud-based solutions.
Key Responsibilities
- Analyze and document existing Pentaho ETL jobs, transformations, and data flows
- Translate Pentaho logic into Python scripts and/or Azure Data Factory (ADF) pipeline components
- Develop and maintain scalable Python-based data processing solutions
Required Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5-7 years of experience in data engineering or ETL development
- Prior experience in migration projects is highly desirable
- Strong proficiency in Python for data manipulation and automation
- Hands-on experience with Pentaho Data Integration (PDI) and SQL