Job Summary
A company is looking for a Security Data Engineer to build and optimize data ingestion pipelines for security analytics.
Key Responsibilities
- Design and build scalable batch and streaming data pipelines for ingesting telemetry, log, and event data
- Develop and maintain orchestration workflows using tools such as Apache Airflow (a minimal sketch follows this list)
- Onboard new data sources and normalize security-related datasets while managing schema drift
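To illustrate the kind of orchestration workflow named above, here is a minimal sketch assuming Apache Airflow 2.4+ with the TaskFlow API; the DAG name, schedule, and extract/normalize/load steps are hypothetical placeholders, not the company's actual pipeline.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def security_log_ingestion():
    @task
    def extract_events():
        # Placeholder: pull raw events from an upstream security source
        return [{"source": "firewall", "ts": "2024-01-01T00:00:00Z", "action": "deny"}]

    @task
    def normalize(events):
        # Placeholder: map source-specific fields onto a common schema
        return [
            {"event_time": e["ts"], "vendor": e["source"], "outcome": e["action"]}
            for e in events
        ]

    @task
    def load(events):
        # Placeholder: write normalized records to the analytics store
        print(f"loaded {len(events)} events")

    load(normalize(extract_events()))


security_log_ingestion()
```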
Required Qualifications
- 8+ years of experience in data engineering or infrastructure roles focused on pipeline development
- Strong experience with Python and distributed data processing tools such as Apache Spark (see the sketch after this list)
- Hands-on experience with orchestration frameworks like Apache Airflow
- Deep understanding of ingestion best practices and schema evolution
- Experience working with cloud environments (AWS, Azure, or GCP)
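To illustrate the Python-plus-Spark skill set listed above, here is a minimal PySpark sketch assuming Spark 3.x; the schema, the ./raw_events and ./normalized_events paths, and the column names are hypothetical, and the null-field filter shown is only one simple way to guard against schema drift.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("security-event-normalize").getOrCreate()

# Declaring the schema up front (rather than inferring it) makes unexpected
# shape changes in the source data visible as null required fields.
schema = StructType([
    StructField("event_time", TimestampType()),
    StructField("source", StringType()),
    StructField("action", StringType()),
])

events = spark.read.schema(schema).json("./raw_events")

# Simple drift guard: rows that no longer match the declared schema surface
# with null required fields and are routed out of the normalized output.
valid = events.filter(F.col("event_time").isNotNull() & F.col("source").isNotNull())

normalized = valid.withColumn("vendor", F.lower(F.col("source"))).drop("source")

normalized.write.mode("overwrite").parquet("./normalized_events")
```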