How Google Search gives you better results
When you come to Google Search, our goal is to connect you with useful information as quickly as possible. That information can take many forms, and over the years the search results page has evolved to include not only a list of blue links to pages across the web, but also useful features that help you find what you’re looking for even faster. Some examples include featured snippets, which highlight results that are likely to contain what you’re looking for; Knowledge Panels, which can help you find key facts about people, places and other topics in the world; and predictive features like Autocomplete that help you navigate Search more quickly.
Left to right: examples of a featured snippet, a Knowledge Panel, and an Autocomplete prediction
Because these features are highlighted in a unique way on the page, or may show up when you haven’t explicitly asked for them, we have policies around what should and should not appear in those spaces. This means that in some cases, we may correct the information or remove those features from a page.
This is quite different from how we approach our web and image search listings and how those results rank in Search, and we thought it would be helpful to explain why, using a few examples.
Featured snippets

One helpful information format is featured snippets, which highlight sites that our systems determine are especially likely to contain what you’re looking for. Because their unique formatting and positioning can be interpreted as a signal of quality or credibility, we’ve published standards for what can appear as a featured snippet.
We don’t allow the display of any snippets that violate our policies by being sexually explicit, hateful, violent, harmful or lacking expert consensus on public interest topics.
Our automated systems are designed to avoid showing snippets that violate these policies. However, if our systems don’t work as intended and a violating snippet appears, we’ll remove it. In such cases, the page isn’t removed as a web search listing; it’s simply not highlighted as a featured snippet.
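The distinction above can be sketched in a few lines. This is a toy illustration only, not Google's actual system; the policy labels and data shapes are invented. The key point it captures: a page that trips a snippet policy loses the featured treatment but stays in the ordinary listings.

```python
# Hypothetical policy labels for illustration; not a real taxonomy.
SNIPPET_POLICY_LABELS = {"sexually_explicit", "hateful", "violent", "harmful"}

def select_featured_snippet(ranked_results):
    """Return (featured, listings): the first policy-clean result may be
    featured, but every result stays in the ordinary listings either way."""
    featured = None
    for result in ranked_results:
        if not (result["labels"] & SNIPPET_POLICY_LABELS):
            featured = result
            break
    return featured, ranked_results

results = [
    {"url": "a.example", "labels": {"violent"}},   # blocked from the snippet slot
    {"url": "b.example", "labels": set()},         # eligible
]
featured, listings = select_featured_snippet(results)
print(featured["url"])  # b.example is featured
print(len(listings))    # both pages still appear as ordinary listings
```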
The Knowledge Graph
The Knowledge Graph in Google Search reflects our algorithmic understanding of facts about people, places and things in the world. The Knowledge Graph automatically maps the attributes and relationships of these real-world entities from information gathered from the web, structured databases, licensed data and other sources. This collection of facts allows us to respond to queries like “Bessie Coleman” with a Knowledge Panel of facts about the famous aviator.
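As a minimal stand-in for what "mapping attributes of real-world entities" means, here is a toy lookup table (the structure and field names are invented for illustration; the Bessie Coleman facts themselves are real):

```python
# Toy knowledge graph: entities mapped to their attributes.
# A real knowledge graph would also store typed relationships
# between entities, sources, and confidence data.
knowledge_graph = {
    "Bessie Coleman": {
        "type": "Person",
        "occupation": "Aviator",
        "born": "1892",
        "known_for": "First African-American woman to hold a pilot license",
    },
}

def knowledge_panel(query):
    """Return the entity's facts if the query names a known entity, else None."""
    return knowledge_graph.get(query)

panel = knowledge_panel("Bessie Coleman")
print(panel["occupation"])  # Aviator
```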
Information from the Knowledge Graph is intended to be factual and is presented as such. However, while we aim to be as accurate as possible, our systems aren’t perfect, nor is every source of information. So we collect user feedback and may manually verify and update information if we learn something is wrong and our systems haven’t self-corrected. We’ve developed tools and processes to provide these corrections back to sources like Wikipedia, with the goal of improving the knowledge ecosystem more broadly.
Furthermore, we give people and organizations the ability to claim their Knowledge Panels and provide us with authoritative feedback on facts about themselves, and if we otherwise become aware of misinformation, we work to fix those errors. If an image or a Google Images results preview shown in a Knowledge Panel doesn’t accurately represent the person, place or thing, we’ll also fix the error.
Predictive features

There are other features that are “predictive,” like Autocomplete and Related Searches, which are tools to help you navigate Search more quickly. As you type each character into the search bar, Autocomplete matches what you’re typing to common searches to help you save time. On the search results page, you’ll see a section of searches labeled “People also search for,” which is meant to help you navigate to related topics if you didn’t find what you were looking for, or to explore a different dimension of a topic.
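The matching step described above can be sketched as simple prefix matching against a popularity-ranked list of past searches. This is an illustrative toy, not Google's implementation; the query strings and counts are invented:

```python
# Hypothetical table of common searches and how often each is issued.
common_searches = {
    "weather today": 9800,
    "weather tomorrow": 7200,
    "web search": 4100,
}

def autocomplete(prefix, limit=3):
    """Match the typed prefix against common searches, most popular first."""
    matches = [q for q in common_searches if q.startswith(prefix)]
    matches.sort(key=lambda q: common_searches[q], reverse=True)
    return matches[:limit]

print(autocomplete("wea"))  # ['weather today', 'weather tomorrow']
```

A production system would use a trie or similar index so matching stays fast over billions of candidate queries, and would run policy filters over the candidates before showing them.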
Because you haven’t asked to see these searches, we’re careful about not showing predictions that may be shocking or offensive, or that could have a negative impact on groups or individuals. Read more about our policies.
You can still issue any search you’d like, but we won’t necessarily show all possible predictions for common searches. If no predictions appear, or if you’re expecting to see a related search and it’s not there, it may be that our algorithms have detected that it contains potentially policy-violating content, that the prediction has been reported and found to violate our policies, or that the search simply isn’t particularly popular.
While we do our best to prevent inappropriate predictions, we don’t always get it right. If you think a prediction violates one of our policies, you can report it.
Across all of these features, we don’t want to shock or offend anyone with content they didn’t explicitly seek out, so we work to prevent things like violence or profanity from appearing in these special formats.
Organic Google search results
While we’ve talked mostly about helpful features that appear on the search results page, the results that probably come to mind most are our organic listings: the familiar “blue links” of website results, thumbnails displayed in a grid in Google Images, or videos from the web in video mode.
In these cases, the ranking of the results is determined algorithmically. We don’t use human curation to collect or arrange the results on a page. Rather, we have automated systems that are able to quickly find content in our index (from the many billions of pages we’ve indexed by crawling the web) that is relevant to the words in your search.
To rank these, our systems take into account a number of factors to determine which pages are likely to be the most helpful for what you’re looking for. You can learn more about this on our How Search Works site.
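"Taking into account a number of factors" can be pictured as scoring each candidate page on several signals and sorting by the combined score. The factor names and weights below are invented for illustration; Google's real signals and their combination are far more complex and not public:

```python
# Hypothetical signals and weights, purely for illustration.
WEIGHTS = {"relevance": 0.6, "freshness": 0.2, "authority": 0.2}

def score(page):
    """Weighted combination of per-page signal scores (each in [0, 1])."""
    return sum(WEIGHTS[factor] * page[factor] for factor in WEIGHTS)

def rank(pages):
    return sorted(pages, key=score, reverse=True)

pages = [
    {"url": "authoritative.example", "relevance": 0.9, "freshness": 0.2, "authority": 0.9},
    {"url": "fresh-but-thin.example", "relevance": 0.5, "freshness": 0.9, "authority": 0.3},
]
print([p["url"] for p in rank(pages)])  # authoritative.example ranks first
```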
While we aim to provide relevant results and prioritize the most reliable sources on a given topic, like any automated system, our search algorithms aren’t perfect. You might see sites that aren’t particularly relevant to your search term rising to the top, or perhaps a page that doesn’t contain trustworthy information ranking above a more official website.
When these problems arise, people often notice and ask us whether we will “fix” the issue. Often what they have in mind is that we would manually re-order or remove a particular result from a page. As we’ve said repeatedly in the past, we don’t take the approach of manually intervening on a particular search result to address ranking challenges.
This is for a variety of reasons. We receive trillions of searches every year, so “fixing” one query doesn’t address the issue for the many other variations of the same query, or help improve Search overall.
So what do we do instead? We approach all changes to our Search systems in the same way: we learn from these examples to identify areas of improvement. We come up with solutions that we believe could help not just those queries, but a broad range of similar searches. We then rigorously test the change using insights from live experiments and data from human search rater evaluations. If we determine that the change provides an overall positive benefit (making a large number of search results more helpful while preventing significant losses elsewhere), we launch that change.
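The launch decision described above amounts to weighing broad gains against significant losses. A toy sketch of that trade-off, with invented metric names and thresholds (Google's real launch criteria are not public):

```python
# Hypothetical thresholds: ship only if enough queries improve
# and very few regress. Numbers are invented for illustration.
MIN_IMPROVED_RATE = 0.05
MAX_DEGRADED_RATE = 0.01

def should_launch(experiment):
    """Decide whether an experimental ranking change ships."""
    total = experiment["queries_sampled"]
    improved_rate = experiment["queries_improved"] / total
    degraded_rate = experiment["queries_degraded"] / total
    return improved_rate >= MIN_IMPROVED_RATE and degraded_rate <= MAX_DEGRADED_RATE

result = should_launch({
    "queries_improved": 800,   # 8% of sampled queries got better results
    "queries_degraded": 40,    # 0.4% got worse results
    "queries_sampled": 10000,
})
print(result)  # True: broad gains, minimal losses
```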
Our search algorithms are complex math equations that rely on many variables, and last year alone, we made more than 3,200 changes to our search systems. Some of these were visible launches of new features, while many others were regular updates meant to keep our results relevant as content on the web changes. And some are improvements based on issues we identified, either via public reports or our own ongoing quality evaluations. Unlike with Search features, where we are able to quickly correct issues that violate our policies, identifying the root cause of a ranking issue can sometimes take time, and improvements may not happen immediately. But as we have been for more than 20 years, we remain committed to identifying these challenges and working to make Search better.
This is not to say that there aren’t policies and guidelines that apply to our organic listings. Content there has to meet our long-standing webmaster guidelines, which protect users against things like spam, malware and deceptive sites. Our spam protection systems automatically work to prevent our ranking systems from rewarding such content.
In cases where our spam systems don’t work, we’ve long taken manual actions against pages or sites. We report these actions through the Manual Actions report in our Search Console tool, in hopes that site owners will curb such behavior. These actions aren’t linked to any particular search results or query. They’re taken against affected content generally.
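The key property of a manual action, as described above, is that it applies to a site's content generally rather than to one query's results. A toy sketch of that distinction (site names and data shapes are invented):

```python
# Hypothetical set of sites currently under a manual action for spam.
manual_actions = {"spammy.example"}

def apply_manual_actions(results):
    """Filter a result list by site-level actions. Because the action is
    keyed on the site, the same demotion applies for every query where
    that site's pages would otherwise have appeared."""
    return [r for r in results if r["site"] not in manual_actions]

results = [
    {"site": "spammy.example", "url": "spammy.example/page1"},
    {"site": "honest.example", "url": "honest.example/page2"},
]
print([r["site"] for r in apply_manual_actions(results)])  # ['honest.example']
```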
Legal and Policy-based Removals
As our mission is to provide broad access to information, we remove pages from our results only in limited circumstances: when required by law, such as child abuse imagery and valid copyright infringement claims, and in narrow cases where we’ve developed policies to protect people, such as content that contains sensitive personal information.
Some of these legal and policy actions do remove results for particular searches, like someone’s name. However, none of these removals happen because Google Search has chosen to “fix” poor results. Rather, they are acts of legal compliance and applications of publicly documented policies to help keep people safe.
Source: Google blog