Job Summary
A company is looking for a Databricks Engineer.
Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Databricks, Apache Spark, and Delta Lake
- Integrate and manage data flows between Salesforce and enterprise data platforms to support analytics and regulatory reporting
- Collaborate with cross-functional teams to deliver high-impact data solutions while ensuring data quality and governance
Required Qualifications
- 5+ years of experience in data engineering or a related field
- Strong hands-on experience with Databricks, Apache Spark, and Delta Lake
- Proficiency in Python, SQL, and orchestration tools
- Solid understanding of Salesforce data models and integration tools
- Experience working in financial services with knowledge of industry-specific data and compliance requirements