SEO STARTER GUIDE: THE BASICS


If there’s one thing I’ve learned over the 15 years working on Google Search, it’s that people’s curiosity is endless. We see billions of searches every day, and 15 percent of those queries are ones we haven’t seen before--so we’ve built ways to return results for queries we can’t anticipate.


When people like you or I come to Search, we aren’t always quite sure about the best way to formulate a query. We might not know the right words to use, or how to spell something, because oftentimes, we come to Search looking to learn--we don’t necessarily have the knowledge to begin with.

At its core, Search is about understanding language. It’s our job to figure out what you’re searching for and surface helpful information from the web, no matter how you spell or combine the words in your query. While we’ve continued to improve our language understanding capabilities over the years, we sometimes still don’t quite get it right, particularly with complex or conversational queries. In fact, that’s one of the reasons why people often use “keyword-ese,” typing strings of words that they think we’ll understand, but aren’t actually how they’d naturally ask a question.

With the latest advancements from our research team in the science of language understanding--made possible by machine learning--we’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.

Applying BERT models to Search

Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or as we call it--BERT, for short. This technology enables anyone to train their own state-of-the-art question answering system.
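As a concrete illustration of that last point, here’s a minimal sketch of a BERT-based question answering system. It assumes the Hugging Face transformers library and a publicly available BERT checkpoint fine-tuned on the SQuAD dataset; the post itself refers to Google’s open-sourced BERT code, not this library.

```python
# Minimal sketch of BERT-based question answering.
# Assumes the Hugging Face "transformers" library (pip install transformers),
# not Google's original BERT repository.
from transformers import pipeline

# A BERT model fine-tuned on the SQuAD question answering dataset.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT is a neural network-based technique for natural language "
    "processing pre-training, open-sourced by Google in 2018."
)
result = qa(question="Who open-sourced BERT?", context=context)
print(result["answer"])  # expected: "Google"
```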

This breakthrough was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one-by-one in order. BERT models can therefore consider the full context of a word by looking at the words that come before and after it--particularly useful for understanding the intent behind search queries.
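To make “full context” concrete, here’s a small sketch, again assuming PyTorch and the Hugging Face transformers library, that extracts the contextual vector for the same word in two different sentences. Because a bidirectional model reads the words on both sides of the target word, the two vectors come out noticeably different.

```python
# Sketch: the same word gets different vectors in different contexts,
# because BERT attends to the words before and after it.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return outputs.last_hidden_state[0, tokens.index(word)]

# "bank" is the same string in both sentences, but its embedding differs.
v1 = word_vector("he sat on the river bank", "bank")
v2 = word_vector("he deposited cash at the bank", "bank")
print(torch.cosine_similarity(v1, v2, dim=0).item())  # well below 1.0
```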

But it’s not just advancements in software that can make this possible: we needed new hardware too. Some of the models we can build with BERT are so complex that they push the limits of what we can do using traditional hardware, so for the first time we’re using the latest Cloud TPUs to serve search results and get you more relevant information quickly.

Cracking your queries

So that’s a lot of technical details, but what does it all mean for you? Well, by applying BERT models to both ranking and featured snippets in Search, we’re able to do a much better job helping you find useful information. In fact, when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English, and we’ll bring this to more languages and locales over time.


Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.

To launch these improvements, we did a lot of testing to ensure that the changes actually are more helpful. Here are some of the examples that showed up in our evaluation process and that demonstrate BERT’s ability to understand the intent behind your search.

Here’s a search for “2019 brazil traveler to usa need a visa.” The word “to” and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil. With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query.
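Search’s ranking internals aren’t public, but the flavor of the improvement can be sketched with an off-the-shelf BERT-family sentence encoder: embed the query and each candidate passage, then rank passages by similarity. The sentence-transformers library and the model name below are assumptions for illustration, not what Search runs in production.

```python
# Illustrative sketch (not Search's ranking system): a BERT-family
# sentence encoder can tell which direction of travel the query means.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed example model

query = "2019 brazil traveler to usa need a visa"
passages = [
    "Brazilian citizens need a visa to travel to the United States.",
    "U.S. citizens traveling to Brazil can apply for a tourist visa.",
]

scores = util.cos_sim(model.encode(query), model.encode(passages))
# The Brazil-to-U.S. passage should score higher than the reverse.
for passage, score in zip(passages, scores[0]):
    print(f"{float(score):.3f}  {passage}")
```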