Job Summary
A company is looking for a DataOps/Cloud Data Engineer - Intermediate.
Key Responsibilities
- Design and develop scalable, efficient data pipelines using Azure Data Factory (ADF) and Databricks Workflows
- Optimize pipelines for scalability, throughput, reliability, and low latency
- Implement robust data quality, validation, and cleansing processes to ensure data integrity
Required Qualifications
- 5+ years of experience in data engineering, with strong proficiency in Python and familiarity with Azure services
- Expertise with Azure Data Services: Azure SQL Database, Azure Data Lake, Azure Databricks
- Experience with data pipeline development, orchestration, deployment, and automation using ADF and Databricks
- Solid understanding of data warehousing and ETL concepts
- Familiarity with DataOps principles and Agile methodologies