Bing announces penalty for inorganic site structures

Bing has announced a new penalty to address 'inorganic site structure' violations. The new penalty will apply to malicious attempts to obfuscate website boundaries, which covers some old attack vectors (such as doorways) and new ones (such as subdomain leasing).

Why has Bing introduced this new penalty?

Search engine algorithms differentiate between URLs that belong to the same website and URLs that don't. For example, internal links (same site) and external links (cross-site) have different values. Search engine algorithms also use site-level signals (among many others) to infer the relevance and quality of content.

That's why pages on a very trustworthy, high-quality website tend to rank more reliably and higher than others, even if those pages are new and haven't yet accumulated many page-level signals.
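As a rough illustration of the internal/external distinction (a minimal sketch, not how any search engine actually delimits sites), here is how one might classify links in Python using a naive same-hostname rule:

```python
from urllib.parse import urljoin, urlparse

def classify_link(page_url: str, href: str) -> str:
    """Label a link 'internal' (same site) or 'external' (cross-site).

    This naive rule compares hostnames only; real search engines use
    far richer signals to decide where a website's boundary lies.
    """
    page_host = urlparse(page_url).hostname or ""
    link_host = urlparse(urljoin(page_url, href)).hostname or ""
    return "internal" if link_host == page_host else "external"

print(classify_link("https://example.com/blog/", "/about"))
# -> internal
print(classify_link("https://example.com/blog/", "https://other-site.net/"))
# -> external
```

Note that this naive rule treats leased.example.com and example.com as different hosts, while a site-level signal system might group subdomains together; deciding exactly where a 'site' ends is the boundary question that subdomain leasing tries to exploit.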

Many webmasters have tried to cheat these algorithms with private blog networks, doorway pages, duplicate content, and subdomain or subfolder leasing. Bing penalizes all of these techniques.

Bing decided to consider 'subdomain leasing' a violation of their 'inorganic site structure' policy when it is clearly used to bring a completely unrelated third-party service into the website boundary, for the sole purpose of leaking site-level signals to that service. In most cases, the penalties issued for that violation would apply only to the leased subdomain, not the root domain.

There are manual penalties and algorithmic penalties

Bing's Frédéric Dubut said that Bing manually reviews websites and that algorithms targeting these behaviors will roll out over time:

The policy is effective immediately for manual reviews. Algorithms targeting these behaviors will roll out progressively over time.
— Frédéric Dubut (@CoperniX) November 5, 2019

The penalty does not apply to the whole website:

These penalties are meant to nullify undeserved signals, not to obliterate a site. If the site is considered among the most relevant for our users and deserves to rank high on its own merits, then it will.
— Frédéric Dubut (@CoperniX) November 5, 2019

You are responsible for the content on your website

Bing reminds website owners that they are responsible for the content that is hosted on their domain. This is particularly true when subdomains or subfolders are operated by different entities.

According to Bing, a domain's overall reputation will be affected if a disproportionate number of the websites hosted on it violate Bing's webmaster guidelines. Taking an extreme case: if you offer free hosting on your subdomains and 95% of those subdomains are flagged as spam, Bing will expand penalties to the entire domain, even if the root website itself is not spam.

Another unfortunate case is hacked sites. Once a website is compromised, hackers typically create subdomains or subdirectories containing spam content, sometimes unbeknownst to the legitimate owner. When Bing detects this, it generally penalizes the entire website until it is cleaned up.

Don't try to cheat search engine algorithms

It might be tempting to exploit loopholes in search engine algorithms. Sooner or later, however, search engines close those loopholes, and the websites that exploited them end up penalized.


Google: we primarily use spam reports to improve our algorithms

John Mueller said on Twitter that Google primarily uses spam reports to improve its algorithms. For example, Google won't remove a page from search just because of inappropriate structured data on the page.

John Mueller on spam reports:

We wouldn't remove a page from search just because of inappropriate structured data. Also, while I get that it would be cool to have all reports manually taken action on, we try to use these to improve our algorithms primarily, which helps to improve more search results.
— John (@JohnMu) November 7, 2019


Google announces ‘the biggest leap forward in the past five years’

Google has announced a major change to the ranking algorithm. According to Google, it’s “a significant improvement to how [Google understands] queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.”

What is new?

Google is applying BERT models to Search. BERT stands for ‘Bidirectional Encoder Representations from Transformers’; it is an open-source, neural network-based technique for natural language processing that helps computers understand language a bit more like humans do.

BERT models consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.
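Because BERT is open source, its bidirectional behavior is easy to try out. The sketch below uses the Hugging Face transformers library and the public bert-base-uncased checkpoint (chosen here for illustration; Google has not said which model variants power Search) to predict a masked word from the context on both sides:

```python
# pip install transformers torch
from transformers import pipeline

# BERT reads the words on *both* sides of the [MASK] token to predict it.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("The traveler went to the [MASK] to catch a flight."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```

The words after the mask ('to catch a flight') should steer the prediction toward 'airport'; a purely left-to-right model could not use that right-hand context.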

What’s the impact of the change?

BERT will impact one in 10 searches in the U.S. in English, and Google will bring this to more languages and locales over time.

Particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, Search will be able to understand the context of the words.

Google is also using a BERT model to improve featured snippets in the two dozen countries where this feature is available.

Google: we don’t index JavaScript content that requires user interaction

Google’s John Mueller said in a webmaster hangout on YouTube that Google won’t index dynamically created content if it requires an interaction from the website visitor.

“You’re saying this is behind JavaScript functions. There’s a difference from our point of view between functions or functionality that you have on the page that is loaded by default and functionality that has to be loaded on interaction.

Functionality that is visible in the page by default which is in the HTML – maybe it’s not visible by default but it’s in the HTML by default – that’s everything that we can pick up and index right away.

If there's something on your page that needs some interaction to load […] for example if you use JavaScript to load something from your database or from your APIs and then show that to the user when they click on something, then that's something that we would not know about.

We would not know where to click on your pages to trigger this interaction with your server to load more content, so that's the aspect that, just purely from a technical point of view, we wouldn't know what to do to load that content.

We wouldn’t be able to index that content. However, like I mentioned, if it’s already in the HTML if you can do ‘view source’ or ‘inspect element’ and it’s in your HTML by default, it’s just not visible by default, then that would be fine.”
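A rough way to check your own pages against this distinction is to look at the raw HTML the server returns before any script runs. The Python sketch below is a simple diagnostic, not a reproduction of Googlebot (which, unlike this script, does execute JavaScript that runs on page load):

```python
# pip install requests
import requests

def in_initial_html(url: str, fragment: str) -> bool:
    """Return True if `fragment` appears in the HTML served for `url`.

    Content found here is indexable even if it is hidden by default.
    Content that is absent may still get indexed if default-loaded
    JavaScript injects it; only content gated behind a user interaction
    (a click, for example) is out of reach.
    """
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return fragment in response.text

# Hypothetical URL and text fragment, for illustration only:
print(in_initial_html("https://example.com/faq", "shipping policy"))
```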

Google: don’t break the rules even if others are successful with it

Google’s John Mueller said on Twitter that you should not break Google’s guidelines even if a competitor does it without getting penalized:

Is your question "others are breaking the guidelines, can I do it too?"? You're welcome to use our spam report forms, but purposely breaking the guidelines if you want something out of them seems like a bad long term strategy…
— John (@JohnMu) November 5, 2019


Google: automatically translated content can be fine (but it can also trigger a penalty)

Do you want to offer your website content in multiple languages? If you use an automated translation service, the translated content can rank well if the translation is good. If the translation is bad, then it’s bad content that might be penalized by Google.

We may take manual action, in particular if it ends up as low-quality machine-generated gibberish'y content. A lot of times, we just rank it the way we'd rank similar content, which ends up working reasonably well (at least, I don't hear a lot of complaints :-)). Why do you ask?
— John (@JohnMu) October 22, 2019

I don't think that would trigger manual actions, but if the translations are bad, then it's bad content in general :). However, machine translation is much much better than it used to be. Another option: noindex until reviewed by local users / speakers.
— John (@JohnMu) October 22, 2019
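One way to implement the 'noindex until reviewed' option from the second tweet is to serve a noindex directive for machine-translated pages until a native speaker has approved them. Below is a minimal Flask sketch; the route, the review table, and the paths are hypothetical:

```python
# pip install flask
from flask import Flask, make_response

app = Flask(__name__)

# Hypothetical review status for machine-translated pages.
REVIEWED = {
    "/de/pricing": True,     # approved by a native speaker
    "/de/blog/launch": False,
}

@app.route("/de/<path:page>")
def translated_page(page: str):
    path = f"/de/{page}"
    response = make_response(f"Machine-translated content for {path}")
    if not REVIEWED.get(path, False):
        # Keep unreviewed machine translations out of the search index.
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```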


Voice assistant study: Microsoft’s Cortana offers most answers, Google Assistant proves most accurate


Perficient Digital released the latest version of its now annual Digital Personal Assistants accuracy study. It compared responses to roughly 5,000 queries on seven devices including Amazon’s Alexa (Echo and Echo Show), Microsoft’s Cortana, Google Assistant (Home, smartphones), and Siri.

More answers, less accuracy. At the highest level, Google Assistant performed the best, but Cortana attempted to answer the most questions. Alexa also showed improvement in answer attempts. But accuracy declined on all devices, according to the study.

Three years of data show Cortana and Alexa have grown the most in answer attempts, with Cortana edging Google for most questions answered (though not always correctly).
Source: Perficient Digital 2019 DPA accuracy study

Alexa the second-most accurate assistant, after Google. The most accurate assistant was Google Assistant (on a smartphone), with Alexa coming in second. However, accuracy seems to have declined across the board, most of all for Cortana, which could be related to its attempt to answer more questions. Siri also suffered a meaningful decline in accuracy.
Source: Perficient Digital 2019 DPA accuracy study

Perficient Digital explored the use of featured snippets by the assistants. It defines snippets as “answers provided by a digital personal assistant or a search engine that have been sourced from a third party” (with attribution).

Decline in use of snippets by Google. Google served up the most snippets, with Google Home beating out the Assistant on smartphones. However, Google Assistant on the smartphone also saw a significant decline in its use of snippets; it was the only platform to see such a decline.
Source: Perficient Digital 2019 DPA accuracy study

Finally, Alexa and Siri tied for the most jokes offered in response to queries. Accordingly, they were deemed to be “the funniest” assistants.





Saturday, November 16, 2019




