Deeper Text Understanding for IR with Contextual Neural Language Modeling

Zhuyun Dai, Jamie Callan

Abstract

Neural networks provide new possibilities to automatically learn complex language patterns and query-document relations. Neural IR models have achieved promising results in learning query-document relevance patterns, but few explorations have been done on understanding the text content of a query or a document. This paper studies leveraging a recently proposed contextual neural language model, BERT, to provide deeper text understanding for IR. Experimental results demonstrate that the contextual text representations from BERT are more effective than traditional word embeddings. Compared to bag-of-words retrieval models, the contextual language model can better leverage language structures, bringing large improvements on queries written in natural language. Combining this text understanding ability with search knowledge leads to an enhanced pre-trained BERT model that can benefit related search tasks where training data are limited.
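
The setup the abstract describes can be pictured as a BERT cross-encoder: the query and a document passage are encoded together as one sentence pair, and the model emits a relevance score. Below is a minimal sketch using the Hugging Face `transformers` API; it assumes a checkpoint already fine-tuned for binary relevance ("bert-base-uncased" is only a placeholder, and the `score` helper is illustrative, not the authors' code).

```python
# Minimal sketch: BERT as a query-passage relevance scorer (cross-encoder).
# Assumes a checkpoint fine-tuned for binary relevance; the model name
# below is a placeholder with a randomly initialized classifier head.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
model.eval()

def score(query: str, passage: str) -> float:
    # Encode as a sentence pair: [CLS] query [SEP] passage [SEP]
    inputs = tokenizer(query, passage, truncation=True, max_length=512,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability of the "relevant" class serves as the relevance score
    return torch.softmax(logits, dim=-1)[0, 1].item()

print(score("contextual language models for IR",
            "BERT provides deep contextual text representations."))
```

At ranking time, each candidate document retrieved by a first-stage model is scored this way and the candidates are re-sorted by the BERT score.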

Code Repositories

AdeDZY/SIGIR19-BERT-IR (official, TensorFlow)

Benchmarks

Benchmark | Methodology | Metric
Ad-hoc Information Retrieval on TREC Robust04 | BERT-MaxP | nDCG@20: 0.469
Ad-hoc Information Retrieval on TREC Robust04 | BERT-SumP | nDCG@20: 0.467
Ad-hoc Information Retrieval on TREC Robust04 | BERT-FirstP | nDCG@20: 0.444
