The LavaCon Content Strategy Conference | 27–30 October 2024 | Portland, OR
Lingxia Song

Lingxia Song co-leads the Docs@Scale program at Meta, which provides scalable content solutions that help engineering teams increase their productivity and efficiency. The program offers doc consulting, docathon, and doc training services to support these efforts. She initiated the GenAI for Docs work stream as a key component of Docs@Scale and has since led Meta-wide initiatives, projects, and talks on creating GenAI-friendly docs.

Lingxia holds a BA in Chinese Literature and an MFA in Writing for the Screen and Stage. Before becoming a technical writer, she was a journalist and sitcom writer in Shanghai. Her diverse background in journalism, technical writing, and content strategy, and her experience creating engaging content for varied audiences, have given her a deep understanding of how strategic content can drive productivity and efficiency.

From Overwhelmed to Empowered: Unleashing the Benefits of GenAI in Content Management and User Support

Co-presented with: Marla Azriel

Are you struggling to manage thousands of user questions and define an effective taxonomy? Do you find it time-consuming to manually triage user questions to the right engineering group for support? Have you tried GenAI bots for support questions, only to discover they hallucinate and struggle to answer domain-specific questions accurately?
GenAI holds immense potential, but it requires strategic guidance to fully unleash its capabilities. In this presentation, we will share our insights and best practices for leveraging GenAI to optimize content management and streamline the technical support process.
Our discussion will cover:

Enhancing Answer Quality: Retrieval Augmented Generation (RAG) methods we employ to improve response quality for domain-specific questions, with case studies.
Creating GenAI-Friendly Documentation: Guidelines on crafting high-quality documentation that makes Large Language Model (LLM) training more efficient, and that produces higher-quality answers.
Implementing GenAI Solutions: Case studies of GenAI solutions that help teams auto-analyze their support load and apply custom remediations, such as auto-categorization and triaging, to reduce support burden.

Auto-categorization reduced the engineering effort of tagging three months of customer questions from 12 days to 2 days.
A gold-standard set of 700 Q&A pairs from high-volume customer question groups improved LLM performance by 22% over the production model.
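To make the RAG approach above concrete, here is a minimal sketch of the pattern: retrieve the doc chunks most relevant to a question, then ground the LLM prompt in that context so domain-specific answers stay accurate. The doc snippets, the keyword-overlap scoring, and the prompt wording are illustrative placeholders, not Meta's actual system, which would use embedding-based retrieval and a real LLM call.

```python
import re

# Hypothetical internal doc chunks (placeholders for a real document index).
DOC_CHUNKS = [
    "To request GPU quota, file a ticket with the Capacity team.",
    "Build failures with error E1234 usually mean a stale cache; run a clean build.",
    "Oncall rotations are configured in the team's escalation policy.",
]

def tokens(text: str) -> set[str]:
    """Lowercase word tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question: str, k: int = 2) -> list[str]:
    """Crude retrieval: rank chunks by shared-word count with the question.
    A production system would use embedding similarity instead."""
    return sorted(
        DOC_CHUNKS,
        key=lambda chunk: len(tokens(question) & tokens(chunk)),
        reverse=True,
    )[:k]

def build_prompt(question: str) -> str:
    """Ground the model in retrieved context to curb hallucination:
    instruct it to answer only from the context or admit it doesn't know."""
    context = "\n".join(retrieve(question))
    return (
        "Answer using only the context below. If the answer is not in the "
        f"context, say you don't know.\n\nContext:\n{context}\n\n"
        f"Question: {question}"
    )

print(build_prompt("How do I fix build error E1234?"))
```

The key design choice is the grounding instruction in `build_prompt`: by constraining the model to the retrieved context and giving it an explicit "don't know" escape hatch, domain-specific answers improve and hallucinations drop.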

Join us as we delve into the world of GenAI and turn the jargon into reality to create efficient and effective content for user support.

In this session, attendees will learn:

  • Guidelines on creating GenAI-friendly docs
  • How to use GenAI to generate effective taxonomies
  • Useful prompts to auto-categorize user support questions
  • Best practices to improve GenAI answer quality and avoid hallucinations
  • How to integrate a series of GenAI solutions into your support process to reduce oncall burden
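As a taste of the auto-categorization takeaway, here is a sketch of a classification prompt template. The category names and the template wording are hypothetical examples, not the prompts used at Meta; the filled-in template would be sent to an LLM, whose one-word response becomes the routing tag.

```python
# Hypothetical support-question categories (placeholders for a real taxonomy).
CATEGORIES = ["build", "deployment", "permissions", "documentation", "other"]

PROMPT_TEMPLATE = """Classify the support question into exactly one category.
Categories: {categories}
Question: {question}
Respond with the category name only."""

def categorize_prompt(question: str) -> str:
    """Fill the template; an LLM call on the result yields the category tag,
    which can then drive auto-triage to the owning engineering group."""
    return PROMPT_TEMPLATE.format(
        categories=", ".join(CATEGORIES), question=question
    )

print(categorize_prompt("Why does my deploy fail with a permissions error?"))
```

Constraining the model to a fixed category list and a name-only response keeps the output machine-parseable, which is what makes downstream auto-triage reliable.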