BERT, which stands for Bidirectional Encoder Representations from Transformers, is a neural network-based technique for natural language processing (NLP) pre-training. In simple terms, it helps Google better discern the context of words in search queries.
The models before this update worked on single words, in the exact order in which they were written: they were designed to be unidirectional, meaning that context in a sentence could flow in only one direction, from left to right or from right to left, but never both at once. BERT, by contrast, observes all the words of a sentence at the same time, bidirectionally, and can therefore understand how each individual word influences all the others, much as a human mind does.
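As an illustration of this context-dependent behavior, here is a minimal sketch using the Hugging Face transformers library and the public bert-base-uncased checkpoint (both are assumptions for illustration; the article does not reference any specific toolkit). It shows that BERT assigns the same word different vectors depending on the words around it:

```python
# Minimal sketch (assumes the "transformers" and "torch" packages are
# installed); illustrative only, not Google's production setup.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def contextual_vector(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual embedding of `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    position = tokens.index(word)  # assumes `word` survives as one token
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
    return hidden[0, position]

# The same word "bank" gets different vectors in different sentences,
# because BERT reads the words on both sides of it at once.
river = contextual_vector("He sat on the bank of the river.", "bank")
money = contextual_vector("She deposited cash at the bank.", "bank")
print(torch.cosine_similarity(river, money, dim=0).item())  # well below 1.0
```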
For example, before the algorithm update, a user typing the query “2019 brazil traveler to USA need a visa” would get results useful for US citizens intending to travel to Brazil. The actual search intent, however, is the opposite: to find out whether Brazilian travelers to the US need a visa. BERT captures the exact intent thanks to its ability to understand the relationships between the different terms in the sentence.
To do this, BERT relies on two different pre-training tasks:
The first is called Masked Language Modeling (MLM), and it is used to predict hidden words and thereby verify that the model actually understands what a sentence is about. It works like this: engineers mask random words within the training sentences and the algorithm tries to guess them; as training proceeds, the model is optimized to make fewer and fewer mistakes on the training data.
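A rough sketch of what masked-word prediction looks like in practice, again assuming the Hugging Face transformers library and a publicly released BERT checkpoint (names not taken from the article):

```python
# Minimal masked-word prediction sketch (assumes "transformers" is installed).
from transformers import pipeline

# bert-base-uncased is a public checkpoint pre-trained with the MLM task.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Hide one word and let the model guess it from the words on BOTH sides.
for guess in unmasker("The traveler needs a [MASK] to enter the country."):
    print(f"{guess['token_str']:>10}  score={guess['score']:.3f}")
# Plausible completions such as "visa" or "passport" should rank highly.
```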
The second is called Next Sentence Prediction (NSP), and it helps BERT relate sentences to each other. During training, the model receives pairs of sentences as input and learns to predict whether the second sentence of the pair actually follows the first in the original text or is just a random sentence.
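As a sketch, the same public checkpoint also exposes an NSP head, so this behavior can be probed directly (library, model, and example sentences are assumptions, not from the article):

```python
# Minimal next-sentence-prediction sketch (assumes "transformers" and "torch").
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

first = "The man went to the store."
coherent = "He bought a carton of milk."
unrelated = "Penguins live in the Southern Hemisphere."

for second in (coherent, unrelated):
    inputs = tokenizer(first, second, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # shape (1, 2)
    # In this head, index 0 means "B follows A", index 1 means "B is random".
    p_follows = torch.softmax(logits, dim=1)[0, 0].item()
    print(f"P(follows) = {p_follows:.3f}  |  {second}")
```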