Job Summary
A company is looking for an LLM Safety Analyst to support the secure launch of LLM-based features.
Key Responsibilities
- Create and execute safety testing plans for LLM-based features
- Evaluate test results to assess risk and provide actionable recommendations
- Document testing plans, feedback, and vulnerabilities, and communicate findings to internal stakeholders
Required Qualifications
- Bachelor's degree in Engineering, Computer Science, Data Science, or Information Systems
- 5 years of experience with LLM safety risks and conducting risk assessments for AI systems
- 5 years of experience working directly with Machine Learning Engineers or Data Scientists
- Knowledge of multi-turn prompt engineering and multi-modal AI systems
- Master's degree or PhD preferred