Search engine filters: what they are and how they work

Search engines carry a great responsibility: to provide Internet users with information that fully satisfies their queries. Over time, however, the number of sites began to grow exponentially. Not all of them offer high-quality, relevant content, and some resources are created solely for profit. For this reason, search engines introduced filters that lower a site's position in the results or block the resource altogether.

Features of the filters

First of all, you need to understand how search engine filters work. In general, search engines impose sanctions in several cases:

  • Texts and micro-markup. Search engine filters are triggered if a large amount of spam is detected on the site (text spam can be checked with text-uniqueness tools). Over-optimization, as well as an excess of key queries in the text structure (headings, subheadings, etc.), negatively affects the quality of the resource.
  • Link mass. A resource may fall under the filters for an unnaturally fast build-up of link mass (external links in particular). Spam in anchors, excessive interlinking and link manipulation, for example link runs or link rings, also hurt a site's standing in the results.
  • Behavioral factors. Search engine filtering also covers the faking of behavioral factors and the placement of invisible page elements that the user interacts with without knowing it. Redirecting users to a site that does not match their query is considered unforgivable.
  • Advertising. The advertising content of a site plays no small role in attracting sanctions. Ads should not be intrusive or redundant. Moreover, search engines can demote a site whose mobile version carries aggressive ad formats, such as forced subscriptions or clicks. There are also filters that restrict sites on whose pages 18+ content has been spotted.
  • Identical snippets. If the same information appears in the search results for different links of one site, this can serve as grounds for sanctions. This often happens with online stores that carry an extensive range of products of the same type.
  • Affiliates. Search engine filters treat very negatively groups of sites that belong to the same company and are created specifically to push the main resource into the TOP.
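The first point above, an excess of key queries in the text, can be roughly checked before publication. A minimal sketch (the 3% threshold is an illustrative assumption of mine, not a limit published by any search engine):

```python
import re


def keyword_density(text: str, phrase: str) -> float:
    """Share of the words in `text` taken up by occurrences of `phrase`."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(phrase.lower()), " ".join(words)))
    return hits * len(phrase.split()) / len(words)


def looks_overoptimized(text: str, phrase: str, limit: float = 0.03) -> bool:
    # 3% is an arbitrary illustrative threshold, not an official value
    return keyword_density(text, phrase) > limit
```

Running this over each page's body text for its main query gives a quick signal of which pages are stuffed with the key phrase.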


For each of these cases, special filter programs or search algorithms have been created that carefully monitor the purity of optimization, keeping low-quality content out of the search results. Each search engine has its own limiters that successfully cope with this task.

Main Yandex filters

In the vast expanse of the Runet, Yandex was and remains the main search engine. Every year it tightens its requirements for sites and creates more and more new filters. Novice webmasters are afraid of falling under the harsh sanctions of this search engine, so to promote a resource successfully you need to know the enemy by sight, that is, have an idea of the main filters of the Yandex search engine.

  • According to Internet marketers, Yandex's AGS is considered the toughest filter: in its time it blocked more than a hundred resources. Its main activity is aimed at combating sites that sell links and artificially inflate the positions of other resources while carrying no useful information themselves.
  • The Nepot filter was modeled after the AGS. However, its task is narrower: to block only those resources that sell links. The filter identifies a site engaged in such activity and imposes restrictions that nullify the link weight passed to the promoted resource.
  • "You are the last." This plainly named filter is aimed solely at lowering the positions of sites with non-unique content. That is no reason to ignore it, however. Sanctions are often imposed on texts whose uniqueness falls below 92%.
  • "New domain." This is not so much a filter as a kind of restriction that keeps a newly created resource from occupying leading positions.
  • "You're spammy." The spam filter carefully monitors the quality of content on a site, and if a text is over-optimized, the resource's position in the results falls. It is important to understand that an abundance of key queries is no guarantee of a good place at the top. Any information should first of all be useful to the user, and only then meet the requirements of search robots. A user is unlikely to want to wade through a wall of text where, by some absurd "accident," almost every sentence contains a key query while the needed information is missing.
  • "Affiliate." Yandex closely monitors companies that try to promote their core resource by creating additional sites and interlinking them. If such sites contain unique and useful information, they have a right to exist, but if the content is created only for content's sake, the resource can easily be blocked.
  • "Redirect." This filter actively fights doorways, that is, sites that redirect users to a resource they were not looking for.
  • "Adult." Imposes restrictions on resources with "adult" content.
  • "Behavioral factors." The filter monitors the behavior of users who follow a link to the site. The bounce rate, the "depth" of browsing and the time spent viewing content, which are considered the main indicators of usefulness, are strictly recorded.
  • "Aggressive advertising." In the past few years, Yandex has begun to actively fight intrusive and aggressive advertising that interferes with viewing content.


Basic Google Filters

After Yandex, the Google search engine ranks second in Russia. Accordingly, its filters do not escape webmasters' attention. The most common are:

  • "Duplicate Content." Designed specifically to penalize duplicated content. The filter lowers the positions of resources whose pages carry non-unique content that is repeated more than once.
  • "Sandbox." This filter regulates how sites get into the top for high-frequency queries and combats spam and questionable content.
  • "Link Growth." If a resource starts building up link mass too quickly, it drops in the results.
  • "Google Bombing." This filter imposes sanctions on a resource with an overabundance of key queries in link anchors.
  • "-30." Such a cool, one might even say Siberian, name went to a filter that fights massive spam. Its competence, beyond standard unwanted mailings, extends to blocking redirects and doorways.
  • "Co-citation." For Google's search engine in Russia, it matters that sites with similar topics link to a resource. If a site improves its position but is linked to by resources far from its main topic, this filter kicks in.
  • "Page speed." Unlike Yandex, Google monitors loading speed. If, after clicking through from the search engine, a resource loads slowly, and this repeats several times, the site loses its position.
  • "Supplemental Results." The search engine can impose a filter on individual pages of a site that carry no unique information or have no internal links pointing to them. At the same time, the resource's main content does not lose its position.
  • "Link Exchange." When the era of site building began, resources for mutual link exchange were popular. Now, if a project suddenly begins to actively gain link mass, the filter lowers its position.
  • "TrustRank." Or, more simply, the "level of trust." The main purpose of the filter is to determine how authoritative a project is: the more authoritative it is, the higher its position. Several factors influence this: the quality of links and their total mass, the amount of content, the age of the domain name, and the interlinking between the resource's pages.
  • "Result+." Or the filter for supplemental results. If the system suddenly decides that a project is similar to hundreds of others and brings nothing new, the site's pages land in the supplemental index rather than the main one.


AGS

The most relevant search engine filters are listed above, but some of them deserve a closer look. To start, attention should be paid to the most fearsome "dictator" of the Runet: Yandex's AGS. This filter was created specifically to combat low-quality content on the Web. Its name may appear together with the numbers "17," "30" or "40," which indicate versions of the program.

This filter sanctions sites that carry outgoing SEO links. Uninteresting, non-unique content hurts a site's position and leads to a large number of user bounces, and duplicates, aggressive advertising, doorways, redirects and cloaking attract sanctions as well. In addition, the filter closely monitors the functionality of projects: if a website has poor design or loading problems, this too can significantly lower its position.

To determine whether a project has fallen under the AGS filter, carefully monitor the indexing process and the level of traffic. When the filter sets about its dirty work, new pages are not indexed by the search engine, and old pages that fall under sanctions drop out of the index entirely.

To restore former positions, the resource must be turned into a real masterpiece designed to meet users' needs. To do this, bring the technical details in line with the search engine's requirements: ban indexing of duplicate pages, remove SEO links, simplify navigation, and review advertising, deleting all aggressive media. Users should quickly find information that interests them, and that information should be high-quality, fresh, useful and easy to understand. The site's content should be continually updated with new articles. Only when the project meets the search engines' requirements should you contact Yandex support to have the filters removed.
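The first of those steps, banning the indexing of duplicate pages, is usually done in robots.txt or with a per-page robots meta tag. A minimal illustrative fragment (the paths and the domain are hypothetical examples, not taken from the article):

```
# robots.txt — keep duplicate listings out of the index
User-agent: *
Disallow: /search/       # internal search result pages
Disallow: /*?sort=       # sorted duplicates of category pages

# Alternative, per page, inside the HTML <head>:
#   <meta name="robots" content="noindex, follow">
#   <link rel="canonical" href="https://example.com/original-page/">
```

A canonical link tells the search engine which of several similar URLs should represent the content in the results, so link weight is not split between duplicates.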

"Penguin"

But Yandex's sanctions are not the only ones to fear; the Google search engine's filters also deserve attention. While the AGS monitors the basic characteristics that determine a project's quality, "Penguin" penalizes manipulative link building during website optimization: the filter was created specifically to exclude artificial link schemes.


Placing spam links on forums and blogs, buying backlinks, an overabundance of SEO anchors, creating invisible links on a project's pages: all of this is grounds for sanctions. Once a site falls under the "Penguin" filter, it starts losing positions sharply, its traffic drops, and indexing proceeds much more slowly than usual.

To avoid a sad fate, you need to remove all low-quality links and change the promotion strategy.

Very often a site falls under the sanctions of more than one filter. Together with "Penguin," the webmaster may also be visited by "Panda," another Google search engine filter. If a site publishes material with uniqueness below 90%, you can be sure the resource is in the grip of both of these "tamers."

"Panda"

The "Panda" filter is responsible for the quality of a website as a whole. Low-quality projects lose a significant share of traffic, and in some cases drop out of the search entirely. The quality of sites is evaluated by the following characteristics:

  • Content. The quality of content is judged not only by robots that scan it for uniqueness and usefulness, but also by assessors, reviewers who give independent ratings to the content. User satisfaction with the information also matters, as evidenced by browsing depth and time spent on the site. Information must not be duplicated or openly stolen (i.e., have low uniqueness).
  • Owners and users. Everything on the site, and the project itself, must first of all be oriented toward the user: good design, a convenient and intuitive interface, a minimum of incorrect or intrusive advertising. A site can be penalized if ad units sit at the top of the page while the content itself is at the bottom. Unreadable texts are also punishable.


As in other cases, if a site starts losing positions and traffic sharply and its pages drop out of the index, the project has most likely fallen under the "Panda" filter. To get out from under the search engines' restrictions, change the optimization strategy and improve the quality of the materials and the site's functionality. The following steps will help:

  • Technical audit. Make sure the website functions properly, loads quickly, and has no broken links.
  • Quality. Check the resource's materials once more for uniqueness and quality. Any material that raises questions should be replaced.
  • Cleanliness is the key to success. Pages without content, or with only small fragments of text, should be removed, expanded, or replaced. In other words, carry out a cleanup, getting rid of useless fragments.
  • Regularity. The site should be updated constantly: add new publications at least once a week.
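The "Quality" step above can be roughed out locally before turning to a paid uniqueness service. A sketch of the classic shingle comparison between two texts (the shingle length of 3 words is an arbitrary choice for illustration):

```python
def shingles(text: str, k: int = 3) -> set:
    """Set of k-word shingles of a normalized text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}


def similarity(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity of the two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Comparing a page's text against a suspected source this way flags near-duplicates: identical texts score 1.0, unrelated texts score near 0.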

"Baden Baden"

In 2017, Yandex released the Baden-Baden filter update, which caused an unprecedented stir among webmasters. Some claimed this search algorithm would apply harsh sanctions to any resource showing even a hint of spam, while others calmly repeated that it would not become a revolutionary push in SEO optimization.


The developers themselves say they have significantly improved the algorithm that "punishes" over-optimized pages. It is still unclear, though, whether the new Baden-Baden replaces the familiar filters or is part of a general algorithm. If the latter is true, it will be difficult for webmasters to determine which sanctions their resource has fallen under.

Baden-Baden fights poor-quality content by excluding from the search texts stuffed with key queries. However, it acts not on the entire project but on an individual document (page). Simply put, if a site has one low-quality text, only the page carrying it will drop out of the search, while the resource keeps its position and the bulk of its traffic. Still, do not relax: one filter can be followed by many others.

Therefore, work on the resource's visibility, review all SEO texts for over-optimization, and remove techniques that hide SEO content from search engines.

On the pulse

Search engines' requirements are the same for all sites, and so is the punishment for failing to meet them. If a project has suddenly begun losing traffic and dropping in the SERPs, it has almost certainly fallen under search engine sanctions. To keep a finger on the pulse, constantly check incoming traffic, not only its volume but also its sources.
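Checking traffic by source, not just by volume, can be sketched over the referrer field of an access log. A minimal example (the referrer URLs below are illustrative; real data would come from a log file or an analytics export):

```python
from collections import Counter
from urllib.parse import urlparse


def traffic_by_source(referrers):
    """Group raw referrer URLs by domain; empty or '-' entries count as direct."""
    counts = Counter()
    for ref in referrers:
        domain = urlparse(ref).netloc if ref and ref != "-" else "(direct)"
        counts[domain or "(direct)"] += 1
    return counts
```

A sudden collapse in the share of one search engine's domain in this breakdown, with other sources unchanged, is exactly the pattern that points at a filter.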

Therefore, every webmaster should learn to manage search filters. How do you enable a search filter? It is a fairly simple task. When registering a site with a search engine, the webmaster receives a personal account where he can follow the changes happening to his project (Yandex.Webmaster or Google Analytics). On that page, open the administrative panel, select the site of interest and open the "Filters" section. In most cases it is enough to activate the built-in limiters, but sometimes individual user-defined restrictions must be created.


Why is this needed? Suppose a site belongs to a large company; then it is very likely that employees visit it daily. And if the site has a chat or forum where they can talk to each other, it is logical to assume the employees will happily spend time on the company's web resource. They are not the target audience, however, yet the search engine records all their visits. To think the optimization strategy through logically, visits made over the company's local network must be excluded from the reports. Custom filters exist for exactly this purpose.

Simply put, the filters configured by the webmaster make it possible to receive only the information needed to promote the resource. This applies not only to specific IP addresses but also to visits from specific directories and sites and to behavioral characteristics.
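The IP-exclusion case from the paragraph above is configured in the analytics interface itself, but the underlying logic is simple enough to sketch. Here the office network range and the visit record shape are hypothetical:

```python
import ipaddress

# Hypothetical office range; a real filter would use the company's actual subnet
OFFICE_NETWORK = ipaddress.ip_network("203.0.113.0/24")


def external_visits(visits):
    """Drop visit records originating from the company's own network."""
    return [v for v in visits
            if ipaddress.ip_address(v["ip"]) not in OFFICE_NETWORK]
```

After filtering, reports reflect only outside visitors, which is the audience the optimization strategy actually targets.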

What is left over?


Naturally, this is a long road: it will take about six months to bring a website to the top for its main key queries. But once it has gotten there, its position will be far more stable than that of sites which took the spot dishonestly.

Only at the dawn of the Internet era was it possible to dilute meager fragments of text with key queries with impunity and advance quickly in the search results. Now everything has changed: search engines focus on high-quality content. Priority goes to original, authorial content that can give an exhaustive answer to a user's question. So if you have no desire to become a victim of sanctions, concentrate on solving exactly this problem, and only after that start building link mass and attracting visitors.

Source: https://habr.com/ru/post/C33444/

