In case you haven’t heard about it, on Oct. 25th Google announced that it had integrated BERT into its search engine. BERT helps the search engine better understand natural language, affects about 10% of search queries right from the start, and was described by Google as its biggest update in the last five years (and one of the most significant ones to come).
As my clients started asking me about it over the last few days, I decided to come up with a non-tech explanation to grasp, in very general terms, the importance of BERT and how it might affect SEO in the future.
I’d like to thank Dawn Anderson who, with her amazing article at searchengineland.com, managed to explain many in-depth technical aspects that come with BERT – definitely head over to her article afterwards if you want to know about all the different linguistic problems BERT solves, and how (be ready to have your mind blown).
For the moment, here’s the extra light and condensed version of it, created by GERT about BERT 😉
What’s BERT and why is it important?
BERT stands for Bidirectional Encoder Representations from Transformers. It is a framework that’s now applied to Google Search with the aim of better understanding a search query’s context and clearing up the ambiguities that arise because many words have multiple meanings.

SEOs, better be prepared for Google understanding the whole context of every phrase now. Focus on quality – now more than ever! #SEO #BERT
At a time when voice search is getting more popular, BERT essentially prepares the field for better language understanding. With the BERT framework, Google has made a huge jump towards better NLP (natural language processing).
Language is full of ambiguities and differences in meaning. Just take this very simple sentence, for example, which might give you an idea of how hard it must be for machines to understand a phrase (in reality, there are hundreds of additional complications explained in detail at searchengineland.com, which we leave out here for the sake of simplicity):
He did not understand why Frank had to tell him about his daughter.
- the first “he” refers to someone not even mentioned in the sentence
- the word “him” must be resolved to be referring to the same, yet unknown person
- the word “his” must be resolved back to Frank
Machines need to understand context, the meaning of each word, and how context might affect the real meaning, to correctly process what is being said.

“You shall know a word by the company it keeps.” (Firth, J.R. 1957) #BERT #SEO
How does BERT help Google to understand language?
- BERT learns from unlabeled text corpora (previously, algorithms needed prepared/labeled material to “learn”)
- BERT is bi-directional, meaning it takes into account the whole sentence to figure out the words’ meaning and relation to each other.
- BERT can zoom in on individual words’ meanings while always taking into account the other words’ context in the sentence and their impact on the one being focused on.
- BERT can predict missing words in a sentence to improve the training process
- BERT can hypothesize what the next sentence might be
- The BERT framework was open-sourced as “Vanilla BERT” by Google and is being further developed by companies like Microsoft and Facebook, each fine-tuning it for their own purposes
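To make the “bi-directional” idea above a bit more concrete, here is a toy sketch in Python – emphatically not real BERT, just an illustration with a made-up four-sentence corpus – of how looking at the words on both sides of a gap helps pick the missing word:

```python
# Toy illustration of bidirectional masked-word prediction (NOT real BERT):
# rank candidate words by how often they appear between the given
# left-hand and right-hand neighbors in a tiny example corpus.
from collections import Counter

corpus = [
    "the bank raised interest rates",
    "she sat on the river bank",
    "the bank approved the loan",
    "he fished from the river bank",
]

def predict_masked(left, right, sentences):
    """Return candidate fillers for 'left [MASK] right', most frequent first."""
    counts = Counter()
    for sentence in sentences:
        words = sentence.split()
        for i in range(1, len(words) - 1):
            if words[i - 1] == left and words[i + 1] == right:
                counts[words[i]] += 1
    return [word for word, _ in counts.most_common()]

# "the [MASK] approved" -> only "bank" fits both neighbors in this corpus
print(predict_masked("the", "approved", corpus))  # ['bank']
```

Real BERT does this with learned vector representations over huge corpora rather than raw counts, and it weighs every word in the sentence, not just the immediate neighbors – but the principle is the same: context on both sides disambiguates the word in the middle.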
How does the BERT update affect SEO?
In my opinion, the application of BERT to search will, in the near future, strongly impact content quality: the better Google gets at understanding words and phrases in their respective contexts, the better it can judge quality and match content to search intent.
Google has just leapt light-years ahead in determining content quality.
P.S.: See our video SEO audit and start improving your site’s SEO!