This system was created to help Google understand how queries relate to pages and how they connect to concepts that are difficult to express with exact keywords.
Let's say you enter the query "Tie my shoelaces" into Google Search. This query can be phrased in many different ways. Thanks to neural matching, Google is able to "understand" that the word "laces" refers to shoelaces and will show the searcher results about ways to tie them. (Google has been using this system since 2018!)
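To make the idea concrete, here is a minimal sketch of meaning-based matching as opposed to keyword matching. It uses the open-source sentence-transformers library and a small public embedding model purely as an illustration; the library, model name, and example pages are assumptions, not Google's actual (proprietary) neural matching system.

```python
# Illustrative sketch: instead of requiring the exact query words to appear on a
# page, both the query and the page text are turned into vectors and compared by
# meaning. Library and model are illustrative choices, not Google's system.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedding model

query = "Tie my shoelaces"
pages = [
    "How to tie your laces so they never come undone",   # no word "shoelaces", same concept
    "History of the shoe industry in the 20th century",   # shares a related word, different concept
]

query_vec = model.encode(query, convert_to_tensor=True)
page_vecs = model.encode(pages, convert_to_tensor=True)

# Cosine similarity: the "laces" page scores higher despite the missing keyword.
scores = util.cos_sim(query_vec, page_vecs)[0]
for page, score in zip(pages, scores):
    print(f"{score.item():.2f}  {page}")
```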
BERT (2019)
The BERT (Bidirectional Encoder Representations from Transformers) system is considered a major breakthrough in understanding how the words in a query relate to one another and to the words on a page, and what meaning lies behind them.
This system is very important for entity recognition. It helps Google better understand what a brand name refers to, who a person is, and perhaps even what their expertise is in a particular topic.
BERT is a type of AI model that enables generative AI and AI insights. Google has been actively using this system since 2019!
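As a rough illustration of what entity recognition with a BERT-style model looks like in practice, the sketch below runs a publicly available BERT checkpoint fine-tuned for named entity recognition. The library, model checkpoint, and example sentence are assumptions chosen for demonstration only; they are not Google's internal system.

```python
# Hedged illustration of BERT-style named entity recognition using the
# Hugging Face transformers library and a public community checkpoint.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="dslim/bert-base-NER",      # BERT fine-tuned for named entity recognition
    aggregation_strategy="simple",    # merge word pieces back into whole entities
)

text = "Sundar Pichai announced new AI features for Google Search in Mountain View."
for entity in ner(text):
    # Prints entity type (PER, ORG, LOC, ...), the matched text, and a confidence score.
    print(f"{entity['entity_group']:>4}  {entity['word']}  ({entity['score']:.2f})")
```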
Closely related to BERT is DeepRank, a deep-learning system: it is essentially BERT applied to ranking.
Over time, DeepRank replaced much of RankBrain.