Job Summary
A company is looking for an LLM Safety Analyst to help securely launch LLM-based features.
Key Responsibilities
- Create and execute safety testing plans for LLM-based features
- Evaluate testing results for risk assessment and provide actionable recommendations
- Document testing plans and communicate observations to internal stakeholders
Required Qualifications
- Bachelor's degree required; master's or PhD preferred
- Background in engineering, computer science, data science, or information systems
- Knowledge of LLM safety risks and experience with risk assessment for AI systems
- Familiarity with multi-turn prompt engineering and multi-modal AI systems
- Experience in Trust & Safety, product QA, or adversarial testing is a plus