A 28.6% reduction in median per-issue resolution time
Introduction
Customer service is critical to the success of any product, directly influencing customer satisfaction and loyalty. One of the major challenges in this area is retrieving relevant past issues quickly and accurately so that customer inquiries can be resolved efficiently. Traditional retrieval-augmented generation (RAG) methods treat the large corpus of past issue-tracking tickets as plain text, which limits both retrieval accuracy and answer quality.
In a paper presented at SIGIR 2024, researchers from LinkedIn propose an innovative approach that combines RAG with knowledge graphs (KGs) to improve customer service question-answering systems, reducing the median per-issue resolution time by 28.6%.
Inspired by this research, ChatBees is combining ticket knowledge graphs with your internal knowledge base to further reduce ticket resolution time! We are also developing an AI Agent to automate actions. Stay tuned for more updates! Now, let’s dive deeper into the paper.
The Challenge
Typical customer service systems treat historical issue tickets as plain text and segment them into smaller chunks to fit the context-length constraints of embedding models (a minimal sketch of this chunking baseline follows the list below). This method, while straightforward, faces two major limitations:
- Compromised Retrieval Accuracy: Ignoring the inherent structure and interconnections within issue tickets leads to a loss of crucial information.
- Reduced Answer Quality: Segmenting tickets can disconnect related content, resulting in incomplete answers.
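To make the baseline concrete, here is a minimal sketch of plain-text chunking, assuming a flat ticket string; the field names, chunk size, and overlap are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of the plain-text chunking baseline (illustrative, not the paper's code).
# Chunk size, overlap, and ticket fields are assumptions for demonstration.

def chunk_ticket(ticket_text: str, chunk_size: int = 512, overlap: int = 64) -> list[str]:
    """Split a ticket, treated as one flat string, into fixed-size overlapping chunks."""
    chunks = []
    start = 0
    while start < len(ticket_text):
        chunks.append(ticket_text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

# Structured fields (summary, description, comments) are flattened into one string,
# so the ticket's internal structure is invisible to the retriever.
ticket = {
    "summary": "Login fails after password reset",
    "description": "User reports a 403 error when signing in ...",
    "comments": ["Cleared session cache", "Escalated to auth team"],
}
flat_text = " ".join([ticket["summary"], ticket["description"], *ticket["comments"]])
chunks = chunk_ticket(flat_text)
```

Because related sections of the same ticket can land in different chunks, the retriever sees fragments with no link back to one another, which is exactly the failure mode described in the two bullets above.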
The Solution: RAG with Knowledge Graphs
The authors introduce a method that constructs a knowledge graph from historical issue tickets and uses it at retrieval time, preserving both intra-issue structure and inter-issue relations. The method has two main phases:
- Knowledge Graph Construction: Each issue ticket is parsed into a tree structure, and individual tickets are then connected to form an interconnected graph. This dual-level architecture preserves the intrinsic relationships among entities, enabling high retrieval performance (see the construction sketch after this list).
- Retrieval and Question Answering: At question-answering time, the system parses the consumer query, retrieves related sub-graphs from the KG, and generates the answer from them. This integration ensures logical coherence and delivers complete, high-quality responses (see the retrieval sketch below).
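Here is a minimal sketch of the construction phase, assuming networkx for the graph; the ticket fields and the inter-ticket linking rule are illustrative stand-ins, not the paper's actual parsing and linking logic.

```python
# Minimal sketch of the dual-level graph: a tree per ticket (intra-issue structure)
# plus edges between related tickets (inter-issue relations). Field names and the
# explicit "related" pairs are illustrative assumptions, not the paper's exact method.
import networkx as nx

def add_ticket_tree(graph: nx.DiGraph, ticket: dict) -> str:
    """Parse one ticket into a small tree rooted at the ticket node."""
    root = f"ticket:{ticket['id']}"
    graph.add_node(root, kind="ticket", summary=ticket["summary"])
    for section in ("description", "steps_to_reproduce", "resolution"):
        if ticket.get(section):
            node = f"{root}/{section}"
            graph.add_node(node, kind="section", text=ticket[section])
            graph.add_edge(root, node, relation="has_section")
    return root

def link_related_tickets(graph: nx.DiGraph, roots: list[str], related: set[tuple[str, str]]):
    """Connect individual ticket trees into one interconnected graph."""
    for a, b in related:
        if a in roots and b in roots:
            graph.add_edge(a, b, relation="related_to")

kg = nx.DiGraph()
tickets = [
    {"id": "1001", "summary": "Login fails after password reset",
     "description": "403 on sign-in", "resolution": "Invalidate stale sessions"},
    {"id": "1002", "summary": "403 error on mobile login",
     "description": "Same auth backend", "resolution": None},
]
roots = [add_ticket_tree(kg, t) for t in tickets]
link_related_tickets(kg, roots, {("ticket:1001", "ticket:1002")})
```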
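And a similarly simplified sketch of the question-answering phase, where the query is mapped to ticket nodes, a related sub-graph is pulled, and its text is handed to a generator; `parse_query` and `generate_answer` are hypothetical placeholders for the paper's query-parsing and LLM components.

```python
# Simplified retrieval-and-answer flow over the knowledge graph built above.
# parse_query and generate_answer are hypothetical placeholders; the paper describes
# its own query parsing and sub-graph retrieval in more detail.
import networkx as nx

def retrieve_subgraph(kg: nx.DiGraph, seed_nodes: list[str], hops: int = 1) -> nx.DiGraph:
    """Collect all nodes within `hops` edges of the seed tickets and return that sub-graph."""
    keep = set(seed_nodes)
    frontier = set(seed_nodes)
    for _ in range(hops):
        nxt = set()
        for node in frontier:
            nxt.update(kg.successors(node))
            nxt.update(kg.predecessors(node))
        keep |= nxt
        frontier = nxt
    return kg.subgraph(keep)

def answer_question(kg: nx.DiGraph, question: str, parse_query, generate_answer) -> str:
    seeds = parse_query(question, kg)       # map the query to matching ticket nodes
    sub = retrieve_subgraph(kg, seeds)      # pull intra- and inter-issue context together
    context = "\n".join(
        str(data.get("text") or data.get("summary") or "")
        for _, data in sub.nodes(data=True)
    )
    return generate_answer(question, context)  # e.g. prompt an LLM with the sub-graph text
```

Because the retrieved unit is a connected sub-graph rather than isolated chunks, related sections of a ticket (and related tickets) reach the generator together.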
Empirical Results
Empirical assessments on benchmark datasets, using key retrieval and text-generation metrics, show that the proposed method significantly outperforms the baseline. Key improvements include:
- 77.6% increase in Mean Reciprocal Rank (MRR; defined after this list)
- 0.32 increase in BLEU score
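For reference, MRR averages the reciprocal rank of the first relevant ticket over the query set, so the 77.6% gain means relevant tickets appear much higher in the ranked results:

```latex
% rank_i is the position of the first relevant retrieved ticket for query i in query set Q
\mathrm{MRR} = \frac{1}{|Q|} \sum_{i=1}^{|Q|} \frac{1}{\mathrm{rank}_i}
```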
Deployment and Impact
The method has been deployed within LinkedIn’s customer service team for approximately six months and has reduced the median per-issue resolution time by 28.6%. This deployment demonstrates the practical benefits and efficiency of integrating KGs with RAG in real-world applications.
Conclusion
The integration of retrieval-augmented generation with knowledge graphs represents a significant advancement in the field of customer service. By preserving the structure and interconnections within issue tickets, this approach not only improves retrieval accuracy but also enhances the quality of generated answers. The promising empirical results and successful deployment at LinkedIn underscore the potential of this innovative method to transform customer service operations, paving the way for more efficient and effective solutions in the industry.