The LavaCon Content Strategy Conference | 27–30 October 2024 | Portland, OR
Marla Azriel

Marla Azriel is a documentation strategy lead at Meta. She co-leads the Docs@Scale programs, which provide scalable, self-service content solutions for teams across Meta. Marla created Meta’s Doc Training program to increase engineering contributions to documentation, and the Doc Consult program, which helps teams identify strategic areas for improvement and provides actionable recommendations.

Before Meta, Marla managed the Cloud User Assistance team at Oracle, delivering comprehensive content sets for multiple product suites. Prior to that role, she worked as a Senior Principal Technical Writer in the Content Management and Business Intelligence groups. She started her career at Fujitsu Network Communications doing technical writing, project management, UI and Web design, marketing writing, and information design and strategy.

Marla holds a BA in English from the University of California, Berkeley and studied abroad at the University of Kent in Canterbury.

On the personal side, Marla loves nature photography and travel. She is currently a black belt candidate in Tae Kwon Do.

From Overwhelmed to Empowered: Unleashing the Benefits of GenAI in Content Management and User Support

Co-presented with: Lingxia Song

Are you struggling to manage thousands of user questions and define an effective taxonomy? Do you find it time-consuming to manually triage user questions to the right engineering group for support? Have you tried using GenAI bots for support questions, only to discover that they hallucinate and struggle to answer domain-specific questions accurately?
GenAI holds immense potential, but it requires strategic guidance to fully unleash its capabilities. In this presentation, we will share our insights and best practices for leveraging GenAI to optimize content management and streamline the technical support process.
Our discussion will cover:

  • Enhancing Answer Quality: Retrieval-Augmented Generation (RAG) methods we employ to improve response quality for domain-specific questions, with case studies (a minimal illustrative sketch follows this list).
  • Creating GenAI-Friendly Documentation: Guidelines for crafting high-quality documentation that makes Large Language Model (LLM) training more efficient and produces higher-quality answers.
  • Implementing GenAI Solutions: Case studies of GenAI solutions that help teams auto-analyze their support load and deliver custom remediations, such as auto-categorization and triaging, to reduce support burden. For example:
      • Auto-categorization cut the engineering effort to tag three months’ worth of customer questions from 12 days to 2 days.
      • A gold-standard set of 700 Q&A pairs drawn from high-volume customer question groups improved LLM performance by 22% over the production model.
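For readers new to RAG, the sketch below shows the basic idea in plain Python: retrieve the most relevant documentation snippets for a question and ground the model’s answer in them. The snippets, relevance scoring, and prompt wording are illustrative assumptions, not the presenters’ production pipeline, and the final LLM call is intentionally left out.

```python
# Minimal retrieval-augmented prompt assembly (illustrative only; not Meta's
# internal pipeline). Snippets, scoring, and prompt wording are placeholders.

from collections import Counter

# Hypothetical documentation snippets standing in for a real document index.
DOCS = [
    "To request access to the data pipeline, file a ticket with the Data Infra group.",
    "Build failures with error E1234 are usually caused by a stale cache; run the clean task first.",
    "On-call rotations are configured in the team settings page under Support.",
]

def score(question: str, doc: str) -> int:
    """Crude relevance score: count of shared lowercase word tokens."""
    q_words = Counter(question.lower().split())
    d_words = Counter(doc.lower().split())
    return sum((q_words & d_words).values())

def build_prompt(question: str, top_k: int = 2) -> str:
    """Retrieve the top-k snippets and ground the model's answer in them."""
    ranked = sorted(DOCS, key=lambda d: score(question, d), reverse=True)
    context = "\n".join(f"- {d}" for d in ranked[:top_k])
    return (
        "Answer the support question using ONLY the context below. "
        "If the context does not contain the answer, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

if __name__ == "__main__":
    print(build_prompt("Why does my build fail with error E1234?"))
    # The assembled prompt would then go to whatever LLM endpoint your
    # organization uses; that call is deliberately omitted here.
```

In practice the keyword-overlap scorer above would be replaced by embedding-based retrieval over a real documentation corpus, but the grounding pattern is the same.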

Join us as we delve into the world of GenAI and turn the jargon into reality to create efficient and effective content for user support.

In this session, attendees will learn:

  • Guidelines on creating GenAI-friendly docs
  • How to use GenAI to generate effective taxonomies
  • Useful prompts to auto-categorize user support questions (see the illustrative prompt sketch after this list)
  • Best practices to improve GenAI answer quality and avoid hallucinations
  • How to integrate a series of GenAI solutions into your support process to reduce on-call burden
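
As a taste of prompt-based auto-categorization, here is a minimal, hypothetical Python example; the category names and prompt wording are placeholders, not the prompts used in the presenters’ workflow.

```python
# Illustrative auto-categorization prompt. Categories and wording are
# hypothetical placeholders, not a production prompt.

CATEGORIES = ["Access & Permissions", "Build & Release", "Data Quality", "Other"]

def categorization_prompt(question: str) -> str:
    """Ask an LLM to tag a support question with exactly one category."""
    return (
        "You are triaging developer support questions.\n"
        f"Categories: {', '.join(CATEGORIES)}\n"
        "Reply with exactly one category name and nothing else.\n\n"
        f"Question: {question}"
    )

# Example usage: the returned string would be sent to your LLM of choice,
# and the single-category reply used to route the question to the right team.
print(categorization_prompt("I can't access the metrics dashboard for my team."))
```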