$\texttt{MixGR}$: Enhancing Retriever Generalization for Scientific Domain through Complementary Granularity, by Fengyu Cai and 6 other authors
Abstract: Recent studies highlight the growing importance of document retrieval for retrieval-augmented generation (RAG) with LLMs in the scientific domain, where it helps bridge their knowledge gaps. However, dense retrievers often struggle with domain-specific retrieval and complex query-document relationships, particularly when query segments correspond to different parts of a document. To alleviate these prevalent challenges, this paper introduces $\texttt{MixGR}$, which improves dense retrievers' awareness of query-document matching across multiple levels of granularity in queries and documents using a zero-shot approach. $\texttt{MixGR}$ fuses metrics computed at these granularities into a unified score that reflects comprehensive query-document similarity. Our experiments demonstrate that $\texttt{MixGR}$ outperforms previous document retrieval methods by 24.7%, 9.8%, and 6.9% on nDCG@5 with unsupervised, supervised, and LLM-based retrievers, respectively, averaged over queries containing multiple subqueries from five scientific retrieval datasets. Moreover, results on two downstream scientific question-answering tasks highlight the advantage of $\texttt{MixGR}$ in boosting the application of LLMs in the scientific domain. The code and experimental datasets are available.
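The abstract states that $\texttt{MixGR}$ fuses granularity-level similarity metrics into a unified query-document score but does not spell out the fusion rule. The sketch below illustrates one common zero-shot way to combine such ranked signals, reciprocal rank fusion (RRF); the granularity names, function, and data are hypothetical and are only meant to make the idea concrete, not to reproduce the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): fuse per-granularity rankings
# with reciprocal rank fusion (RRF) into one score per document.
from collections import defaultdict
from typing import Dict, List


def fuse_granularity_rankings(
    rankings: Dict[str, List[str]],  # granularity name -> ranked list of doc IDs
    k: int = 60,                     # standard RRF smoothing constant
) -> Dict[str, float]:
    """Combine rankings from different query/document granularities into one score."""
    fused: Dict[str, float] = defaultdict(float)
    for ranked_docs in rankings.values():
        for rank, doc_id in enumerate(ranked_docs, start=1):
            fused[doc_id] += 1.0 / (k + rank)
    return dict(fused)


# Example: two assumed granularity combinations produce slightly different rankings.
rankings = {
    "query-document": ["d1", "d2", "d3"],
    "subquery-proposition": ["d1", "d3", "d2"],
}
scores = fuse_granularity_rankings(rankings)
# d1 ranks first under both granularities, so it leads the fused ranking.
print(sorted(scores, key=scores.get, reverse=True))
```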
Submission history
From: Fengyu Cai [view email]
[v1] Mon, 15 Jul 2024 13:04:09 UTC (1,008 KB)
[v2] Fri, 1 Nov 2024 14:08:31 UTC (2,282 KB)