IBM has been granted a patent for a system that uses a Bidirectional Encoder Representations from Transformers (BERT) model to improve the ranking of search results. The system fine-tunes the BERT model using paraphrases generated from a frequently asked question (FAQ) dataset and then re-ranks the search results based on this fine-tuned model. This technology aims to enhance the accuracy and relevance of search results for users. GlobalData’s report on International Business Machines gives a 360-degree view of the company including its patenting strategy. Buy the report here.

According to GlobalData’s company profile on International Business Machines, M2M communication interfaces were identified as a key innovation area from its patents. International Business Machines’ grant share as of September 2023 was 71%. Grant share is the ratio of the number of granted patents to the total number of patents filed.

The patent has been granted for a system that re-ranks search results using a BERT model trained on FAQ data.

Source: United States Patent and Trademark Office (USPTO). Credit: International Business Machines Corp

A recently granted patent (Publication Number: US11775839B2) describes a system and method for improving the ranking of query candidates in response to user queries. The system utilizes a combination of techniques, including fine-tuning a generative pretrained transformer and a Bidirectional Encoder Representations from Transformers (BERT) model.

The system works by receiving a query and retrieving ranked candidates from an index based on the query. It then fine-tunes a generative pretrained transformer using question-answer pairs from a frequently asked question (FAQ) dataset. The fine-tuned transformer is used to automatically generate paraphrases for questions in the FAQ dataset, based on input answers from the dataset. The system filters the generated paraphrases to match the same FAQ as their original questions using the index.
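The filtering step described above can be sketched in a few lines. This is an illustrative toy, not IBM's implementation: a generated paraphrase is kept only if querying the index with it retrieves the same FAQ entry as its original question did. The token-overlap retrieval and the `toy_index` below are assumptions made purely for demonstration; the patent's index could be any search index.

```python
def tokenize(text):
    return set(text.lower().split())

def retrieve_top_faq(index, query):
    """Return the FAQ id whose question shares the most tokens with the query."""
    q_tokens = tokenize(query)
    return max(index, key=lambda faq_id: len(q_tokens & tokenize(index[faq_id])))

def filter_paraphrases(index, originals_to_paraphrases):
    """Keep only paraphrases that retrieve the same FAQ as their original question."""
    kept = {}
    for faq_id, paraphrases in originals_to_paraphrases.items():
        kept[faq_id] = [p for p in paraphrases
                        if retrieve_top_faq(index, p) == faq_id]
    return kept

toy_index = {
    "faq1": "How do I reset my password?",
    "faq2": "How do I cancel my subscription?",
}
generated = {
    "faq1": ["How can I reset my password?", "How do I cancel billing?"],
}
# The off-topic paraphrase retrieves faq2, so it is dropped.
print(filter_paraphrases(toy_index, generated))
```

The drifted paraphrase ("How do I cancel billing?") matches faq2 more strongly than faq1, so the filter discards it, which is the point of the step: only paraphrases that stay faithful to their source question become training data.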

Next, the system fine-tunes a BERT query-question (Q-q) model using the filtered generated paraphrases to match queries to questions in the FAQ dataset. The ranked candidates are then re-ranked using the fine-tuned BERT Q-q model. Finally, the system returns the re-ranked candidates as a response to the user's query.
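A minimal sketch of the re-ranking step, assuming a trained query-question scorer is already available. `score_query_question` here is a simple word-overlap stand-in for the fine-tuned BERT Q-q model, which in practice would be a cross-encoder producing a relevance score for each (query, candidate question) pair.

```python
def score_query_question(query, question):
    """Stand-in scorer: fraction of query words appearing in the question."""
    q_words = query.lower().split()
    return sum(w in question.lower() for w in q_words) / len(q_words)

def rerank(query, ranked_candidates):
    """Re-order the index's candidates by the Q-q score, highest first."""
    return sorted(ranked_candidates,
                  key=lambda question: score_query_question(query, question),
                  reverse=True)

# Candidates as originally ranked by the index.
candidates = [
    "How do I cancel my subscription?",
    "How do I reset my password?",
]
print(rerank("forgot password reset", candidates))
```

Note that re-ranking leaves retrieval untouched: the index still supplies the candidate pool, and the fine-tuned model only re-orders it, which keeps the expensive model off the full corpus.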

The patent also describes additional features of the system, such as the use of triplets comprising a question, a positive paraphrase, and a negative paraphrase to train the BERT Q-q model. It also mentions a final re-ranking step that combines multiple re-rankers, including the fine-tuned BERT Q-q model, a BERT query-answer (Q-a) model, and a passage-based re-ranker, using an unsupervised late-fusion technique.
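The late-fusion step can be illustrated as follows, under the assumption (a common unsupervised fusion recipe in information retrieval, not spelled out in the patent text) that each re-ranker's scores are min-max normalized per query and then summed, so no labeled data is needed to combine them. The score values are hypothetical.

```python
def min_max_normalize(scores):
    """Scale one re-ranker's scores into [0, 1] for this query."""
    lo, hi = min(scores.values()), max(scores.values())
    if hi == lo:
        return {cand: 0.0 for cand in scores}
    return {cand: (s - lo) / (hi - lo) for cand, s in scores.items()}

def late_fusion(per_reranker_scores):
    """Sum normalized scores across re-rankers; return candidates best-first."""
    fused = {}
    for scores in per_reranker_scores:
        for cand, s in min_max_normalize(scores).items():
            fused[cand] = fused.get(cand, 0.0) + s
    return sorted(fused, key=fused.get, reverse=True)

# Hypothetical per-candidate scores from the three re-rankers the patent
# names: BERT Q-q, BERT Q-a, and a passage-based re-ranker.
qq  = {"cand_a": 0.9, "cand_b": 0.4, "cand_c": 0.1}
qa  = {"cand_a": 0.2, "cand_b": 0.8, "cand_c": 0.3}
psg = {"cand_a": 0.6, "cand_b": 0.5, "cand_c": 0.4}
print(late_fusion([qq, qa, psg]))
```

Normalizing before summing keeps one re-ranker's raw score scale from dominating the others, which is what makes the combination usable without supervision.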

The patent further includes a computer program product embodiment, which provides program code for implementing the system and method described above.

Overall, this patent presents a system and method for improving the ranking of query candidates by leveraging generative pretrained transformers, BERT models, and advanced re-ranking techniques. The system aims to enhance the accuracy and relevance of search results in response to user queries.

To know more about GlobalData’s detailed insights on International Business Machines, buy the report here.


GlobalData, the leading provider of industry intelligence, provided the underlying data, research, and analysis used to produce this article.

GlobalData Patent Analytics tracks bibliographic data, legal events data, point in time patent ownerships, and backward and forward citations from global patenting offices. Textual analysis and official patent classifications are used to group patents into key thematic areas and link them to specific companies across the world’s largest industries.