Google Says E-A-T is not that important for e-commerce websites

 

Google’s John Mueller said in a webmaster hangout on YouTube that Google E-A-T (expertise, authoritativeness, trustworthiness) is not that important for e-commerce websites.

“E-A-T is something that we have in our quality rater guidelines and brought more focus on websites where the type of information is critical for the user – where they really need to know that they get the right information there.

So [this is] probably less the case for most e-commerce websites in general. What I would recommend doing is trying to find ways to improve or expand the content that you have otherwise on these pages so you will have this one block of text which is the same across lots of different websites from the product description […] give some additional context for the product.”

You can watch the full webmaster hangout video on YouTube.

Google explains two things you should know about robots.txt

Google’s John Mueller talked about robots.txt in a recent webmaster hangout. Here are two things that you should know about the robots.txt file on your website.

1. Your subdomains need their own robots.txt files

If you want to block directories and files on a subdomain, you have to add a robots.txt file to that subdomain. The robots.txt file of the main domain will not be used:

“Robots.txt is per hostname and protocol. If we’re looking at a web page that’s hosted on say www.example.com and that includes content from a different subdomain, or from a different domain, then we would use the primary robots.txt file on www.example.com for that page. […]

We check for that subdomain, for that hostname, whether we’re allowed to crawl it, so blocking something on the www version would not block it from being crawled from a different hostname or different subdomain.”
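
To see how this works in practice, you can test URLs against the robots.txt file of each hostname separately. Below is a minimal sketch using Python's built-in urllib.robotparser; the hostnames and paths are made-up examples:

# robots.txt is resolved per protocol and hostname; the hostnames and paths
# below are made-up examples.
from urllib.robotparser import RobotFileParser

def is_allowed(url_to_check, robots_url, user_agent="Googlebot"):
    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # downloads the robots.txt for that specific hostname
    return parser.can_fetch(user_agent, url_to_check)

# The main domain and the subdomain each need their own robots.txt file:
print(is_allowed("https://www.example.com/private/",
                 "https://www.example.com/robots.txt"))
print(is_allowed("https://shop.example.com/private/",
                 "https://shop.example.com/robots.txt"))  # rules on www do not apply here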

2. Your robots.txt file should not have a 503 HTTP status code

If your robots.txt file returns a temporary 503 HTTP status code, Google won’t crawl any pages on your website.

“503 is a temporary error that basically tells us we should try again later [..] By default, when we see a server error we say we don’t know what the robots.txt file is so therefore we will not crawl anything from this hostname.”

If the robots.txt file keeps returning a 503 HTTP status code for a long time, Google treats this as a permanent error and will eventually start crawling (and indexing) pages again.

“Sometimes we see that these server errors are more like a permanent thing […] If we see the 500 or 503 error we stop crawling completely and then after a certain period of time, I don’t know, maybe a couple of months or so, […] we think ‘well, this is a permanent error so […] we try to see what we crawl. […]

If there’s some technical reason that you really need to stop crawling of your website then you can return 503 for your robots.txt file and [Google] will stop crawling as soon as we reprocess that, which is usually within a day or so."
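
A quick way to see what Google gets when it requests your robots.txt is to check the HTTP status code yourself. Here is a minimal sketch that assumes the third-party requests library is installed; the hostname is a placeholder:

# Check which HTTP status code your robots.txt actually returns.
# Assumes the third-party "requests" library is installed.
import requests

def check_robots_status(hostname):
    response = requests.get(f"https://{hostname}/robots.txt", timeout=10)
    code = response.status_code
    if code == 200:
        print(f"{hostname}: robots.txt found (200), crawling follows the rules in the file")
    elif code in (404, 410):
        print(f"{hostname}: no robots.txt ({code}), Google treats the site as fully crawlable")
    elif 500 <= code < 600:
        print(f"{hostname}: server error ({code}), Google stops crawling this hostname")
    else:
        print(f"{hostname}: unexpected status code {code}")

check_robots_status("www.example.com")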

 

SEO: Google Says don’t hope, act

Google’s John Mueller said on Twitter that you shouldn’t hope for a particular outcome when optimizing your website. You should give Google the clearest ranking signals that you can.

How to give Google clear ranking signals

If you want to give Google clear ranking signals, optimize your web pages and remove all errors that can have a negative impact on your rankings.

Google says there is no indexing limit

A webmaster asked on Twitter whether there is a maximum number of static pages that Google can index for a single website. According to the webmaster, the limit used to be 250 static pages, and after that you had to break a website into smaller sites.

It’s unclear why the webmaster thought that there ever was a limit of 250 pages. Google’s John Mueller said that there is no such limit.

What is click-depth?

The click-depth of a page is the number of clicks that it takes to get to a page starting from the home page of your website. The home page of your website has a click-depth of 0. Pages that can be accessed by clicking a link on the home page have a click-depth of 1. Pages that are linked from pages that have a click-depth of 1 have a click-depth of 2, etc.
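
In other words, click-depth is the shortest path from the home page through your internal links. The following sketch computes it with a breadth-first search over a made-up link graph:

# Click-depth = shortest number of clicks from the home page, i.e. a
# breadth-first search over the internal link graph (made-up URLs).
from collections import deque

links = {
    "/": ["/products", "/blog"],
    "/products": ["/products/shoes"],
    "/blog": ["/blog/post-1"],
    "/products/shoes": ["/products/shoes/red-sneaker"],
}

def click_depths(home="/"):
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first time this page is reached
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(click_depths())
# e.g. "/products/shoes/red-sneaker" ends up with a click-depth of 3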

Does the click-depth of a page have an impact on its rankings?

Google's John Mueller said in a webmaster hangout that the number of clicks that are needed to reach a page shows Google how relevant the pages on your website are:

"From our point of view, we don't count the slashes in the URL. So if you put it into /stores and then /location and that's how you want to keep your website on your server, that's perfectly fine.

What does matter for us a little bit is how easy it is to actually find the content there. So especially, if your home page is generally the strongest page on your website, and from the home page it takes multiple clicks to actually get to one of these stores, then that makes it a lot harder for us to understand that these stores are actually pretty important.

On the other hand, if it's one click from the home page to one of these stores, then that tells us that these stores are probably pretty relevant and that, probably, we should be giving them a little bit of weight in the search results as well.

So it's more a matter of how many links you have to click through to actually get to that content rather than what the URL structure itself looks like."

Although the important pages of your website should be available with one click, you should not overdo it:

"But it's still something where you have to be careful not to overdo it.

So if you link to all of your pages from the home page, then they're all on there. So it's not something where you'd have much value from that.

So you still need some structure, some context around those pages. But if you have a handful of pages that are really important for you, then that will be perfect to link from the home page."

Google has limited crawl resources. The deeper a page is buried on your website, the less likely it is that Google will crawl it. In addition, the home page of a website is usually the page that has the most links from other websites, i.e. it has the highest PageRank in Google's algorithm.

Pages that are linked from pages with a high PageRank get better rankings than pages that are linked from pages with a low PageRank.
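
As a rough illustration of this idea, here is the simplified textbook version of the PageRank calculation over a tiny made-up link graph. Google's production algorithm is far more complex; this only shows how pages that are linked from strong pages inherit more weight:

# Simplified textbook PageRank over a tiny made-up link graph. This only
# illustrates the idea that pages linked from strong pages inherit more weight;
# it is not Google's actual algorithm.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            incoming = sum(rank[source] / len(links[source])
                           for source in pages if page in links[source])
            new_rank[page] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

links = {
    "home": ["stores", "blog"],
    "stores": ["home"],
    "blog": ["home", "stores"],
}
print(pagerank(links))  # "home" gets the highest score in this example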

What you should do now

The most important pages on your website should be one click away from the home page of your website. The fewer clicks it takes to get to a page, the easier it is to get high rankings for that page.

 

Google: we don’t index all pages in a sitemap

Google’s John Mueller said on Twitter that Google does not index all pages that are in a sitemap. He also said that it was impossible to index all pages on the web.

 

Which issue do you mean, "indexed, not in sitemap" or like Martin mentioned, the opposite? We really don't index all pages in sitemaps or on the web, it would be impossible to do so.

— John Mueller (@JohnMu)

Google: we do not make up URLs (but there might be an issue on your website)

On Twitter, a webmaster wondered why Google added an additional unnecessary variable at the end of an indexed URL of his website. John Mueller answered that Google does not make up URLs and that the URL probably can be found on the website.

In general, we wouldn't make up URLs like that, so I imagine we found them as links somewhere. With the normal canonicalization methods you can usually encourage our systems to focus on your preferred URLs.

— John Mueller (@JohnMu)

In another reply, Mueller added that Google already ignores links from sites that are unlikely to contain natural links, so there is no need to disavow them:

We already ignore links from sites like that, where there are unlikely to be natural links. No need to disavow :)
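
If Google picks up URL variants that you do not want indexed, the "normal canonicalization methods" Mueller mentions usually come down to a rel="canonical" link element on the page. Here is a small sketch, using only Python's standard library, that checks which canonical URL a page declares (the URL is a placeholder):

# Check which canonical URL a page declares in its <head>.
# Standard library only; the URL below is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

page_url = "https://www.example.com/"  # placeholder: use the URL variant you are checking
with urlopen(page_url) as response:
    html = response.read().decode("utf-8", errors="replace")

finder = CanonicalFinder()
finder.feed(html)
print("Declared canonical URL:", finder.canonical)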

 

Google: we do not index all URLs on a website

Google does not index all URLs on a website, that’s what Google’s John Mueller said on Twitter this week. Last month, John Mueller said that Google doesn’t index all pages in sitemaps or on the web, and that it would be impossible to do so.

It looks like a pretty long URL… we +/- never index all pages within a website, so that can be working as intended, even if you'd like to have more :). (Also, impossible to say without the URL)

— John Mueller (@JohnMu)


 

BERT is now rolling out to over 70 languages worldwide

Google has confirmed that BERT, their new way for Google Search to better understand language, is now rolling out to over 70 languages worldwide.

BERT, our new way for Google Search to better understand language, is now rolling out to over 70 languages worldwide. It initially launched in Oct. for US English. You can read more about BERT below & a full list of languages is in this thread…. https://t.co/NuKVdg6HYM

— Google SearchLiaison (@searchliaison)

BERT is rolling out for: Afrikaans, Albanian, Amharic, Arabic, Armenian, Azeri, Basque, Belarusian, Bulgarian, Catalan, Chinese (Simplified & Taiwan), Croatian, Czech, Danish, Dutch, English, Estonian, Farsi, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Hebrew, Hindi, Hungarian, Icelandic, Indonesian, Italian, Japanese, Javanese, Kannada, Kazakh, Khmer, Korean, Kurdish, Kyrgyz, Lao, Latvian, Lithuanian, Macedonian, Malay (Brunei Darussalam & Malaysia), Malayalam, Maltese, Marathi, Mongolian, Nepali, Norwegian, Polish, Portuguese, Punjabi, Romanian, Russian, Serbian, Sinhalese, Slovak, Slovenian, Spanish, Swahili, Swedish, Tagalog, Tajik, Tamil, Telugu, Thai, Turkish, Ukrainian, Urdu, Uzbek & Vietnamese.

The BERT update has an impact on roughly 10% of all queries:

It varies by language but is generally in line with the 1 of 10 figure we shared about US English.

— Danny Sullivan (@dannysullivan)

In general, very specific longtail searches show new results with the BERT algorithm.

 

What should you do now?

There is no particular BERT score that you can optimize for. Just ensure that your web pages have good content and good links. Google’s Danny Sullivan confirmed that the fundamentals remain unchanged.

 

Why you should stop using last-click attribution in Google Ads

When was the last time you searched for something, clicked an ad and purchased immediately?

Probably never. That’s why it’s time for marketers to stop using last-click attribution for measuring success in Google Ads.

Often, people are searching on multiple devices and do extensive browsing and research before making a purchase. Understanding the impact that higher-funnel keywords have on conversions can help better utilize spend, cut out waste and inform other digital marketing channels such as SEO and social media.

As Google continues to push automated bidding strategies like maximize conversions, maximize conversion value and target ROAS (just to name a few), using non-last click (NLC) attribution becomes even more important.

These algorithms are designed to optimize ad spend based on specific criteria, but if they’re only seeing a small slice of the pie, you could be missing out on valuable traffic and giving too much credit to lower-funnel searches, like brand terms.

The Model Comparison Tool report in Google Analytics looks at historical data and gives estimates for how many conversions you would have had if you leveraged a different model. This tool can help you decide which model (Position-Based, Linear, or Time Decay) aligns best with your user flow.
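
Under the hood, these attribution models simply split the credit for one conversion across the clicks in a path in different ways. The following sketch illustrates how these models are commonly described; the 40/20/40 position-based split and the 7-day time-decay half-life are assumptions, not Google's exact implementation:

# Simplified illustration of how common attribution models split the credit
# for one conversion across the clicks in a path. The weights reflect how
# these models are usually described, not Google's exact implementation.

def last_click(path):
    return [0.0] * (len(path) - 1) + [1.0]

def linear(path):
    return [1.0 / len(path)] * len(path)

def position_based(path):
    if len(path) <= 2:
        return [1.0 / len(path)] * len(path)
    middle = 0.2 / (len(path) - 2)          # 40% first, 40% last, 20% spread over the middle
    return [0.4] + [middle] * (len(path) - 2) + [0.4]

def time_decay(path, half_life_days=7.0):
    weights = [0.5 ** (days / half_life_days) for _, days in path]
    total = sum(weights)
    return [w / total for w in weights]

# A made-up path: (keyword, days before the conversion)
path = [("generic keyword", 14), ("category keyword", 6), ("brand keyword", 0)]
keywords = [keyword for keyword, _ in path]
print("last click     :", dict(zip(keywords, last_click(path))))
print("linear         :", dict(zip(keywords, linear(path))))
print("position-based :", dict(zip(keywords, position_based(path))))
print("time-decay     :", dict(zip(keywords, time_decay(path))))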

Making the change to NLC is simple:

  1. Log into Google Ads
  2. Navigate to the Conversions tab, then click on the conversion action you want to change
  3. Click “edit settings” and then choose whichever attribution model is right for you

If you have multiple conversions being counted, make sure to change all conversions you’re tracking in the account.

So, how much does NLC improve performance? For one lead generation client in a competitive industry, we saw conversions increase by 16% and CPA decrease by 12% just one month after switching to linear attribution. On the e-commerce side, one account saw a 5% increase in ROAS just two weeks after switching from last-click to position-based.

Note that if you’re using smart bidding strategies, moving to NLC could disrupt your campaigns for a few weeks while the algorithm adjusts, so be patient if things don’t improve immediately.

You want data that paints a full picture, and last-click attribution simply won’t cut it anymore. By opting for the right non-last click model instead, you can set yourself up for a huge performance spike.

Google confirms the November 2019 local search update

Google has confirmed the November 2019 local search update on Twitter. What has changed, and what do you have to do to keep your local rankings?

 

What exactly has changed?

Google now uses neural matching to go beyond the exact words in a business name or description. This helps Google understand conceptually how these names and descriptions might be related to the words that searchers use and their intents.

In early November, we began making use of neural matching as part of the process of generating local search results. Neural matching allows us to better understand how words are related to concepts, as explained more here: https://t.co/ShQm7g9CvN

— Google SearchLiaison (@searchliaison)

In other words, your business listing might be shown for search terms that do not appear in your business name or description.

The update has fully rolled out, rankings can change

Google confirmed that this was a global launch covering countries and languages worldwide. The update has fully rolled out:

Neural matching in local search -- which we call the Nov. 2019 Local Search Update -- has now fully rolled out. However, as with web search, results can change as there are smaller updates that happen all the time, as well as content itself that constantly changes.

— Google SearchLiaison (@searchliaison)

Just like regular search results, the new local search results can change over time.

What do you have to do now?

Google says that you don't have to change anything:

The use of neural matching in local search doesn’t require any changes on behalf of businesses. Those looking to succeed should continue to follow the fundamental advice we offer here: https://t.co/tPkyuyMjsP

— Google SearchLiaison (@searchliaison)

There are some things that you can do. For example, you should have a complete Google My Business profile, you should upload photos and videos, you should verify your locations, keep your hours accurate, and you should respond to reviews.

How does Google rank local results?

Google ranks local results based on relevance, distance, and prominence. These factors are combined to help find the best match for a search.

For example, Google algorithms might decide that a business that's farther away from your location is more likely to have what you're looking for than a business that's closer, and therefore rank it higher in local results.
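
Google does not publish how these three factors are weighted. Purely to illustrate the idea of combining them, here is a hypothetical score with made-up weights:

# Purely illustrative: Google does not publish how relevance, distance and
# prominence are combined. The weights below are made up to show the idea that
# a more relevant, more prominent business can outrank a closer one.
def local_score(relevance, distance_km, prominence,
                w_relevance=0.4, w_distance=0.3, w_prominence=0.3):
    distance_factor = 1.0 / (1.0 + distance_km)   # closer is better, never negative
    return (w_relevance * relevance
            + w_distance * distance_factor
            + w_prominence * prominence)

# A well-reviewed, highly relevant business 3 km away...
far_but_strong = local_score(relevance=0.9, distance_km=3.0, prominence=0.9)
# ...can outscore a mediocre match around the corner.
near_but_weak = local_score(relevance=0.4, distance_km=0.2, prominence=0.3)
print(far_but_strong > near_but_weak)  # True with these example weights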

Is your website listed in Google's local search results?

Use the local SEO tools in SEOprofiler to ensure that your web pages can be found in the right locations. For example, you get city-level ranking checks, and you can track the positions of businesses that do not have a website.

 


 

 







