Job Summary
A company is looking for a Big Data Engineer.
Key Responsibilities
- Design and implement ETL processes and manage data warehousing solutions
- Utilize Databricks and PySpark for data processing and analysis (see the ETL sketch after this list)
- Provide support for data migration and production environments
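To give a flavour of the day-to-day work described above, here is a minimal PySpark ETL sketch of the kind a Databricks notebook might contain. The source path, column names, and target table are hypothetical and only illustrate the read-cleanse-write pattern.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is already provided as `spark`;
# getOrCreate() simply returns it when run there.
spark = SparkSession.builder.getOrCreate()

# Hypothetical source: raw orders landed in Azure Data Lake as Parquet
raw_orders = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Basic cleansing and typing (illustrative column names)
clean_orders = (
    raw_orders
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("amount") > 0)
)

# Write to a Delta table (hypothetical name) for downstream consumption
clean_orders.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_clean")
```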
Required Qualifications
- Strong experience in ETL and data warehousing
- Proficiency in Databricks, Unity Catalog, and the Medallion Architecture (a Medallion-style sketch follows this list)
- Experience with SQL queries, stored procedures, and user-defined functions
- Familiarity with Azure Cloud services, including ADF and Azure Data Lake
- Experience with SQL Server and data migration processes
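For candidates less familiar with the Medallion Architecture mentioned above, the sketch below shows the bronze/silver/gold layering in PySpark with Unity Catalog-style three-level table names. The catalog, schema, table, and column names are assumptions for illustration only.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: ingest raw files as-is (hypothetical lake path and catalog names)
bronze = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/events/")
bronze.write.format("delta").mode("append").saveAsTable("main.bronze.events_raw")

# Silver: cleanse and conform the bronze data
silver = (
    spark.table("main.bronze.events_raw")
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
)
silver.write.format("delta").mode("overwrite").saveAsTable("main.silver.events")

# Gold: aggregate into a reporting-ready table
gold = (
    spark.table("main.silver.events")
    .groupBy(F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("event_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("main.gold.daily_event_counts")
```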