Analyzing search results reveals a lot about Google’s view of useful content
Creating content is a process that has to get smarter all the time. Google is continually improving how it understands naturally expressed human language, as perfectly evidenced in its BERT update from last October.
Google has said – and webmaster trends analyst John Mueller has echoed – that there is really nothing drastically new to optimize for after the update, aside from ensuring that SEOs are writing naturally in their content rather than focusing too much on keywords.
The idea of creating content around topics rather than keywords is not particularly new, and so I am presenting an argument for making sure your content is addressing exactly what users want to see. In addition to all the content-research methods you know about already – performing keyword research, examining keyword intent, and using topic research tools – you should be mining the SERPs to see what Google has chosen to present, especially on the first page.
If there is anything to take from BERT, it is that, for how well Google understood query intent before, it now does it even better. So, the content Google sees as worthy of positions one and zero – as well as all the surrounding ancillary content on the page – is probably worth a closer look by SEOs who want to compete.
With all that said, let’s take a deep dive into analyzing search results for your own content creation, including looking at the various SERP features to see what they mean, discovering the apparent intent of the queries that led you to those particular results, and ultimately understanding and crafting more competitive content.
SERP features and intent
Search for anything on Google and you’ll get about 10 organic results in the form of those famous blue links. Those are the “money” parts of a SERP, of course, but nearly as important are all the images, graphs, boxes and news selections that appear alongside the organic results, depending on the query.
Discussing every possible feature that could appear is beyond the scope of this post, and you already know about meta tags, answer boxes and carousel lists. But since our goal is to analyze searcher intent, let’s look at a few SERP features that can be telling, given the right context.
Knowledge graphs, or panels, present users with basic information about the entity they have searched for, if applicable. Search for “Hyundai,” as you see below, and you get a knowledge panel showing the full name of the company, a blurb describing it, the customer service email and phone number, stock price and so on. That covers quite a bit of information in one concise box. And just to the left, as you would expect, is the top paid result, for the brand’s American division website.
So, what can we tell from the box? It is built for the consumer. From this box alone, you can call customer service, start thinking about buying Hyundai stock, or check out the latest Hyundai models. The panel is also a type of portal to a great range of related subjects, including Hyundai’s social media pages and other car manufacturers.
And the overall nugget from this particular query? It’s for users interested in learning about and buying Hyundais. While it would be basically futile to try to rank your Hyundai blog on page one of the SERPs for a seed term such as “hyundai,” at least you know the term is more of an informational query than anything else, and with the right kind of long-tail keywords and plenty of regular posts, you might be able to push your blog out there.
We all know that images frequently appear at the top of the SERPs for certain types of queries. That last part is important. If images don’t always appear, then we have to assume Google knows which types of queries call for image results and which do not. Google does this through its natural language processing, so you know that when you search “nutrition facts,” you get websites about nutrition, but when you search “nutrition chart,” the first results you get are images.
Even beginner SEOs know how to optimize images to rank higher. My point is that when you search "nutrition chart" or "pastel shirts men," you get image results. So, if you want your company to be more visible on page one for these and similarly worded queries, you had better start getting your images out there using all the known tactics for image optimization.
People Also Ask
The “People also ask” feature is one of the most valuable on the first SERP. The PAA box usually appears under the featured snippet or video or image results. The box shows you questions that are topically related to the question you actually asked, and you can expand each question to reveal an organic result. The answer to each question acts like a miniature featured snippet, but, of course, users will see it only if they click that question.
The PAA boxes appear in results when Google determines that a user’s query is informational in nature. The query does not necessarily have to be a question to get a PAA box. However, to increase the chances of your content getting featured or at least making it as a PAA answer, you should write informational content such as a how-to guide and consider marking it up with how-to structured data.
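As a sketch of what that markup can look like: how-to structured data is typically added as a JSON-LD block in the page head, per schema.org's HowTo type. The step names below are illustrative, not a requirement:

```html
<!-- Illustrative HowTo structured data (schema.org) in JSON-LD -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "HowTo",
  "name": "How to screw into concrete",
  "step": [
    { "@type": "HowToStep", "name": "Mark and drill a pilot hole" },
    { "@type": "HowToStep", "name": "Clear dust from the hole" },
    { "@type": "HowToStep", "name": "Drive the concrete screw" }
  ]
}
</script>
```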
Search an informational query related to your industry. If you are in general contracting and maintain a regularly updated blog, search “how to screw into concrete.”
If you find that you cannot compete with the featured snippet, try to write content that answers one of those PAA questions. Check out the current answers to see what they are doing well. Then, make your content better.
Writing better content
There are plenty of other features one could analyze, everything from stock market information to sports results to local packs and health features. By now you probably get the idea that by just reviewing the information right in front of your eyes, you can get clues to how to craft your own content, whether it be a blog post, image, local-pack result or “things-to-do-in” listicle.
By necessity, I have already covered how to interpret what you find on the SERPs to create ranking content. The three major types of searcher intent are:
Informational (“I want to know more.”)
Navigational (“I am looking for a specific website.”)
Transactional/Commercial (“I want to buy something.”)
Searcher intent has also been broken down into local, visual, branded, news, and video intent, among numerous other types. You can use various tools to dig deeper into specific SERP features, but my opinion is that there is no better instrument for figuring this stuff out than basic logic.
Search a query related to your field. Take an hour and really mine that first SERP for what it contains and what each part means. What is Google telling you by presenting this particular piece of content as the answer box? Why is that information in the knowledge panel? How is that People Also Ask question topically or semantically related to what you asked? What factors are common to the content in positions one, two, six, nine and so on? How can your website compete with all of them?
Keep a few things in mind when attempting to answer these questions. If Google has ranked something in position zero, it is likely for good reason, and it may not be the written words of the content alone. Maybe that result is formatted in just the right way, as a how-to or a type of encyclopedia of similar topics. Perhaps the content intersperses written words with optimized infographics and videos.
You know those ten results on page one have something useful for searchers. Your job is to do it better. Also, remember that you can capitalize on some of your older content by updating it and optimizing it to be better than what’s on the SERPs now. Make this a habit, and keep up with it, to build your web pages’ EAT score and stay competitive.
Google: there’s no need to use Dublin Core tags
Google’s John Mueller said on Twitter that it’s not necessary to use Dublin Core tags. Dublin Core meta tags can be used to describe digital resources (video, images, web pages, etc.), as well as physical resources such as books or CDs, and objects like artworks.
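For reference, Dublin Core metadata usually appears as ordinary meta tags with a "dc." prefix. The values below are illustrative; the point is that the markup is valid but Google simply ignores it:

```html
<!-- Illustrative Dublin Core meta tags: valid markup, but ignored by Google -->
<meta name="dc.title" content="Example Article Title">
<meta name="dc.creator" content="Jane Doe">
<meta name="dc.date" content="2020-02-09">
<meta name="dc.format" content="text/html">
```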
The state of tracking and data privacy in 2020
Here's where search marketers find themselves in the current entanglement of data and privacy and where we can expect it to go from here.
January 2020 felt like a turning point. CCPA went into effect, Google Chrome became the latest browser to commit to a cookie-less future and, after months of analytics folks sounding the alarm, digital marketers sobered to a vision of the future that looks quite different than today.
This article is not a complete history of consumer privacy nor a technical thesis on web tracking, although I link to a few good ones in the following paragraphs.
Instead, this is the state of affairs in our industry, an assessment of where search marketers find themselves in the current entanglement of data and privacy and where we can expect it to go from here.
This is also a call to action. It’s far from hyperbole to suggest that the future of digital and search marketing will be greatly defined by the actions and inactions of this current calendar year.
Why is 2020 so important? Let’s assume with some confidence that your company or clients find the following elements valuable, and review how they could be affected as the associated trends unfold this year.
Channel attribution will stumble as tracking limitations break measurability and show artificial performance fluctuations.
Campaign efficiency will lose clarity as retargeting efficacy diminishes and audience alignment blurs.
Customer experience will falter as marketers lose control of frequency capping and creative sequencing.
Despite the setbacks, it is not my intention to imply that improved regulation is a misstep for the consumers or companies we serve. Marketing is at its best when all of its stakeholders benefit and at its worst when an imbalance erodes mutual value and trust. But the inevitable path ahead, regardless of the destination, promises to be long and uncomfortable unless marketers are educated and contribute to the conversation.
That means the first step is understanding the basics.
A brief technical history of web tracking (for the generalist)
Search marketers know more than most about web tracking. We know enough to set people straight at dinner parties — “No, your Wear OS watch is not spying on you” — and follow along at conferences like SMX when a speaker references the potentially morbid future of data management platforms. Yet most of us would not feel confident in front of a whiteboard explaining how cookies store data or advising our board of directors on CCPA compliance.
That’s okay. We’ve got other superpowers, nice shiny ones that have their own merit. Yet the events unfolding in 2020 will define our role as marketers and our value to consumers. We find ourselves in the middle of a privacy debate, and we should feel equipped to participate in it with a grasp of the key concepts.
What is the cookie?
A cookie stores information that is passed between browser and server to provide consistency as users navigate pages and sites. Consistency is the operative word. That consistency can benefit consumers; the common shopping cart is a good example.
Online shoppers add a product to the cart and, as they navigate the site, the product stays in the shopping cart. They can even jump to a competitor site to compare prices and, when they return, the product is still in the shopping cart. That consistency makes it easier for them to shop, navigate an authenticated portion of a site, and exist in a modern multi-browser, multi-device digital world.
Consistency can also benefit marketers. Can you imagine what would happen to conversion rates if users had to authenticate several times per visit? The pace of online shopping would grind to a crawl, Amazon would self-combust, and Blockbuster Video would rise like a phoenix.
But that consistency can violate trust.
Some cookies are removed when you close your browser. Others can accrue data over months or years, aggregating information across many sites, sessions, purchases and content consumption. The differences between cookie types can be subtle while the implications are substantial.
Comparing first- and third-party cookies
It is important for marketers to understand that first- and third-party cookies are written, read and stored in the same way. Simo Ahava does a superb job expanding on this concept in his open-source project that is absolutely recommended reading. Here’s a snippet.
It’s common in the parlance of the web to talk about first-party cookies and third-party cookies. This is a bit of a misnomer. Cookies are pieces of information that are stored on the user’s computer. There is no distinction between first-party and third-party in how these cookies are classified and stored on the computer. What matters is the context of the access.
The difference is the top-level domain that the cookie references. A first-party cookie references the domain the user is currently visiting (and its subdomains); a third-party cookie references a different domain, typically one serving embedded content such as ads or analytics scripts.
Other important web tracking concepts
Persistent cookies and session cookies refer to duration. Session cookies expire at the end of the session when the browser closes. Persistent cookies do not. Data duration will prove to be an important concept in the regulation sections.
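In HTTP terms, that duration difference comes down to an expiry attribute on the Set-Cookie response header. A sketch, with hypothetical cookie names:

```
Session cookie (no Expires/Max-Age; discarded when the browser closes):
Set-Cookie: cart_id=abc123; Path=/; Secure; HttpOnly

Persistent cookie (kept until the stated expiry, here one year):
Set-Cookie: visitor_id=xyz789; Max-Age=31536000; Path=/; Secure
```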
Cookies are not the only way to track consumers online. Fingerprinting, which uses the dozens of browser and device settings as unique identifiers, has gotten a lot of attention from platform providers, including a foreshadowed assault in Google’s Privacy Sandbox announcement.
Privacy Sandbox is Google’s attempt at setting a new standard for targeted advertising with an emphasis on user privacy. In other words, Google’s ad products and Chrome browser hope to maintain agreeable levels of privacy without the aggressive first-party cookie limitations displayed by other leading browsers like Safari and Firefox.
Drivers: How we got here
It would be convenient if we could start this story with one event, like a first domino to fall, that changed the course of modern data privacy and contributed to the world we see in 2020. For example, if you ask a historian about WWI, many would point to a day in Sarajevo. One minute Ol’ Archduke Ferdinand was enjoying some sun in his convertible, the next minute his day took a turn for the worse. It is hard to find that with tracking and data privacy.
Facebook’s path to monetization certainly played a part. In the face of market skepticism about the social media business model, Facebook found a path to payday by opening the data floodgates.
While unfair to give Facebook all the credit or blame, the company certainly supported the narrative that data became the new oil. An iconic Economist article drew several parallels to oil, including the consolidated, oligopolistic tendencies of former oil giants.
“The giants’ surveillance systems span the entire economy: Google can see what people search for, Facebook what they share, Amazon what they buy,” the Economist wrote. “They own app stores and operating systems, and rent out computing power…”
That consolidation of data contributed to an increase in the frequency and impact of data leaks and breaches. Like fish in a bucket, nefarious actors knew where to look to reap the biggest rewards on their hacking efforts.
It was a matter of time until corporate entities attempted to walk the blurring line of legality, introducing a new weaponization of data that occurred outside of the deepest, darkest bowels of the internet.
Enter Cambridge Analytica. Two words that changed the way every web analyst introduced themselves to strangers. “I do analytics but, you know, not in, like, a creepy way.”
Cambridge Analytica, the defunct data-mining firm entwined in political scandal, shed a frightening light on the granularity and unchecked accessibility of platform data. Investigative reporting revealed to citizens around the world that their information could not only be used by advertising campaigns to sell widgets, but also by political campaigns to sell elections. For the first time in many homes, the effects of modern data privacy became tangible and personal.
Outcomes: Where we are today
The state of data privacy in 2020 can perhaps best be understood by framing it in terms of drivers and destinations. Consumer drivers, like those mentioned in the previous section, created reactions from stakeholders. Some micro-level outcomes, like actions taken by individual consumers, were predictable.
For example, the #deletefacebook hashtag first trended after the Cambridge Analytica story broke and surveys found that three-quarters of Americans tightened their Facebook privacy settings or deleted the app on their phone.
The largest outcomes are arguably happening at macro levels, where one (re-)action affects millions or hundreds of millions of people. We have seen some of that from consumers with the adoption of ad blockers. For publishers and companies that live and die with the ad impression, losing a quarter of your ad inventory due to ad blockers was, and still is, devastating.
Only weeks after Cambridge Analytica found its infamy in the headlines, the European Union adopted GDPR to enhance and defend privacy standards for its citizens, forcing digital privacy discussions into both living rooms and board rooms around the world.
Let’s use the following Google Trends chart for “data privacy” in the United States to dive deeper into five key outcomes.
Regulators have handed out more than €114 million in fines under the General Data Protection Regulation (GDPR) to companies doing business in the EU since it became enforceable in May 2018. It’s been called “protection + teeth” in that the law provides a variety of data protection and privacy rights to EU citizens while allowing fines of up to €20 million or 4 percent of annual revenue, whichever is greater.
Months later, the United States welcomed the California Consumer Privacy Act (CCPA), which went into effect in January 2020 — becoming enforceable in July. Similar to GDPR, a central theme is transparency, in that Californians have the right to understand which data is collected and how that data is shared or sold to third parties.
CCPA is interesting for a few reasons. California is material. The state represents a double-digit share of both the US population and gross domestic product. It is also not the first time that California’s novel digital privacy legislation has influenced a nationwide model. The state introduced the first data breach notification laws in 2003, and other states quickly followed.
California is not alone with CCPA, either. Two dozen US state governments have introduced bills around digital tracking and data privacy, at least a dozen of which are still pending. That includes Nevada’s SB 220, which was enacted and became enforceable within a matter of months in 2019.
Corporate responses have come in many forms, from the ad blockers I mentioned to platform privacy updates to the dissolution of ad-tech providers. I will address some of these stories and trends in the following section, but, for now, let’s focus on the actions of one technology that promises to trigger exponential effects on search marketing: web browsers.
The Safari browser introduced Intelligent Tracking Prevention (ITP) in 2017 to algorithmically limit cross-site tracking. Let’s pause to dissect the last few words in that sentence.
Algorithmically = automated decisions that prioritize scale over discernment
Limit = block immediately or after a short duration
Cross-site tracking = first- and third-party cookies
ITP 1.0 was only the beginning. From there, the following iterations tightened cookie duration, storage, and the role of first-party cookies for web analytics. Abigail Matchett explains the implications for users of Google Analytics.
“All client-side cookies (including first-party trusted cookies such as Google Analytics) were capped to seven days of storage. This may seem like a brief window as many users do not visit a website each week. However, with ITP 2.2 and ITP 2.3… all client-side cookies are now capped to 24-hours of storage for Safari users… This means that if a user visits your site on Monday, and returns on Wednesday, they will be granted a new _ga cookie by default.”
You are beginning to see why this is a big deal. Whether intended or not, these actions reinforce the use of quantitative metrics rather than quality measures by obstructing attribution. There is far more that can be said about ITP, so if you are ready for a weekend read, I recommend this thorough technical assessment of the ITP 2.1 effects on analytics.
If ITP got marketers’ attention, Google reinforced it by announcing that Chrome would stop supporting third-party cookies within two years, codifying for marketers that cookie loss was not a can to be kicked down the road.
“Cookies have always been unreliable,” Simo Ahava told me. “To be blind-sided by the recent changes in web browsers means you haven’t been looking at data critically before. We are entering a post-cookie world of web analytics.”
Where it goes from here
The state of tracking and data privacy can take several paths from here. I outline a few of the most plausible then ask others in the analytics and digital space to offer their insights and recommendations.
2020 Path A: Lack of clarity leads to little change from search marketers
This outcome seemed like a real possibility in the first week of January as California enacted CCPA while enforcement deadlines got delayed. It was not yet clear what enforcement would look like later in the year and it appeared, despite big promises, that tomorrow would look a lot like today.
This path looked less likely after the second week of January. That leads us to the next section.
2020 Path B: Compounding tracking limitations keep marketers on their heels
Already in 2020 we have seen CCPA take effect, Chrome put cookies on notice, stocks tumble for companies that rely on third-party cookies, and the dissolution of data providers that threatened consumer trust.
And that’s just January.
2020 Path C: Correction as consumer fear eases in response to industry action
The backlash to tracking and privacy is a reaction to imbalance. Consumers are protecting their data, politicians are protecting their constituents, and platforms are protecting their profits. As difficult as it is to see from our vantage point today, it’s most likely that these imbalances will normalize as stakeholders feel safe. The question is how long it will take and how many counter-adjustments are required in the wake of over- or under-correcting.
As digital marketers, we in some ways represent both the consumers with whom we identify and the platforms on which we depend, and we are in a unique position to expedite the correction and the return to balance.
Finally, I would like to congratulate Simon Poulton on the birth of his first child, Matthew. We started writing this article together then someone wonderful decided to show up early. We all look forward to seeing you again at SMX someday soon. Congrats, Simon.
1. Google says you can host your website anywhere in the world
Google’s John Mueller said that Google does not prioritize crawling of websites that are hosted in the USA. You can host your website anywhere you want.
John (@JohnMu)
Replying to @JohnMu and 3 others
Anyway, back to the original question, @visalvadayar , we don't prioritize crawling of sites in the US. Crawling internationally works fine, and the difference for search is minimal (the speed of light + network issues, essentially). Host your site where you want to host.
2. Google says most sites don’t have toxic links
Google’s John Mueller said that most websites do not have ‘toxic’ links. If a site does have them, the site owners usually created those links themselves:
Most sites don't have "toxic" links, or at least, created them on their own. IMO there are more important things to focus on, by our engineers, and definitely by site owners.
— John (@JohnMu) January 28, 2020
Toxic links are spam links that have a negative impact on your search engine rankings. If you built spam links in the past, use the link disinfection tool in SEOprofiler to get rid of them:
3. How You Can Manage Thin Content on Your Website
What is thin content?
Thin content pages are pages with fewer than 400-500 words that do not have a clear focus. Some people also consider unoptimized pages, duplicate pages and outdated pages thin content.
How to identify thin content on your web pages
Use a website audit tool to find thin content pages on your website. Just run a website audit. The audit report shows the thin content pages on your website.
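If you want a rough idea of what such an audit checks, here is a minimal sketch in Python. The 400-word threshold and the regex tag-stripping heuristic are assumptions for illustration, not what any particular audit tool actually does:

```python
import re

def word_count(html: str) -> int:
    """Rough word count after stripping HTML tags."""
    text = re.sub(r"<[^>]+>", " ", html)  # replace tags with spaces
    return len(re.findall(r"\b\w+\b", text))

def is_thin(html: str, threshold: int = 400) -> bool:
    """Flag a page as thin if it contains fewer words than the threshold."""
    return word_count(html) < threshold

# Example: a stub page is flagged as thin; a long article is not.
stub = "<p>Coming soon.</p>"
article = "<p>" + ("useful detail " * 300) + "</p>"
print(is_thin(stub), is_thin(article))  # True False
```

A real audit would also look at duplication and freshness, which this sketch ignores.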
What you should do with your thin content pages
There are four strategies for dealing with thin content pages: do nothing, update the pages, redirect the pages, or remove the pages.
1. Do nothing
You don't have to do anything if the page gets many website visitors and/or if the page is ranked well on Google and other search engines. Of course, the content of the page should be up-to-date.
2. Update the page
If the content of the page is relevant but not up to date, update the page to make sure that it is relevant to your current offerings.
If the content is relevant but not detailed enough, add more information about the topic on the page. Make sure that the web page plays a relevant role in converting visitors on your website.
Updating content is less work than creating new content. In addition, the old pages usually contain some backlinks. When updating a page, you can also add videos, infographics and other content that makes the page more linkworthy.
3. Redirect the page
If you have another page on your website that contains a better version of the same content, just redirect the thin content page to the rich-content page.
4. Remove the page
If the content of a page is outdated, or if it does not comply with current regulations, it might be a good idea to remove the page from your website.
This should only be done if it is not possible to redirect the old page to another page of your website that is up-to-date.
Other errors on your web pages
Your website can contain many more errors that have a negative impact on your search engine rankings. The website audit tool helps you find them:
Check your web pages regularly (with an automated tool)
Updating or replacing content that doesn't provide value for your customers is always a good idea. You should also check the redirects on your website. One bad redirect can lead to massive problems.
4. Evergreen Content Doesn't Need To Change
Google's John Mueller was asked how one can communicate to Google that a page with evergreen content is as valuable as it was when it was first published several years ago. John responded that "If it's evergreen, then by definition you don't need to change it. No need to do anything special."
The SEO wanted to know if the date should be removed, updated, or something else. John said do nothing, just keep it as is.
Here are those tweets:
Saswata Baksi @am_saswat
· Jan 5, 2020
Hey, @JohnMu Any recommendations for "EVERGREEN" content, which literally need no changes for several years! How to notify Google that content still has the same value as previous! Am I need to delete the date? Or something else.
John (@JohnMu)
If it's evergreen, then by definition you don't need to change it. No need to do anything special. Keep your dates, make it great.
The SEO then said it would be a disadvantage not to do anything. Having an old date show in the Google search results may deter a searcher from clicking on the article in search. John responded.
Vincent Malischewski @VMalischewski
· Jan 6, 2020
Replying to @JohnMu @am_saswat
Won't a 'great' article be disadvantaged by an old timestamp if another great article has been written more recently?
Users will probably be influenced by the earliest published date snippet in the serps
John (@JohnMu)
Why would an article be disadvantaged by a date?
So you should manually change the date, manipulate it, to be more recent? Either update the article or leave it alone?
5. Google updates mobile-first indexing best practices
Google has announced an update to their mobile-first indexing best practices on Twitter.
We made some significant updates to our developers documentation on mobile-first indexing. Whether your site has been moved over already or not, it's worth checking it out: https://t.co/yo4mGQZkqh
— Google Webmasters (@googlewmc) January 22, 2020
What is new?
There are a few changes compared to the previous version of the guidelines. Most of the recommendations are obvious:
The mobile version and the desktop version of your website should have the same content.
Your pages should be crawlable.
Invisible elements such as meta robots and structured data should be the same for desktop and mobile.
If you use structured data on mobile pages, the URLs should point to mobile pages.
Use high quality images and the same img alt texts on mobile and desktop.
Ensure that hreflang implementation and canonical attributes are correct.
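As a sketch of that last point: hreflang and canonical annotations are plain link elements in the page head. The domains below are placeholders:

```html
<!-- Illustrative canonical and hreflang tags; example.com is a placeholder -->
<link rel="canonical" href="https://www.example.com/page/">
<link rel="alternate" hreflang="en" href="https://www.example.com/page/">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/">
```

Whatever set of annotations you use, it should be identical on the mobile and desktop versions of each page.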
If you have a mobile website, Google will ignore your desktop website. Google's John Mueller confirmed this on Twitter:
No, we will ignore any & all content on the desktop site if we index the mobile version. We will only index the mobile version.
— John (@JohnMu) January 22, 2020
Optimize your website for mobile
Google's John Mueller keeps telling webmasters that they should use responsive web design to create their mobile websites.
Theoretically, you could have dedicated mobile pages (for example on an m.example.com subdomain). However, John Mueller strongly recommends avoiding dedicated mobile pages because they make things too complicated.
Having one responsive website that works with mobile and desktop is the best option.
6. Google says featured snippet URLs won’t be repeated in the regular results from now on
Google’s Danny Sullivan said on Twitter that URLs that are shown in featured snippets won’t be repeated in the regular results from now on. Until now, websites that owned the featured snippet often appeared twice on the first results page. The change has rolled out globally.
Danny Sullivan (@dannysullivan)
Replying to @mark_barrera and 4 others
If a web page listing is elevated into the featured snippet position, we no longer repeat the listing in the search results. This declutters the results & helps users locate relevant information more easily. Featured snippets count as one of the ten web page listings we show.
Is your website listed with a featured snippet?
Is your website listed with a featured snippet on Google’s result pages? Or is your website listed with sitelinks? Can your website be found in the image results? Knowing how your website is ranked on Google will help you to improve your web pages so that you get more conversions.
Google’s different result types
Google (and other search engines) have many different result types on their search result pages. In addition to the regular results (blue link with a short description) there are many different result types.
The Ranking Monitor in SEOprofiler shows you how your website is listed. Just enter the keywords that you want to monitor on the page ‘Monitored keywords’ in the Ranking Monitor. You can choose a country and even a particular location from which you want to check your rankings.
The Ranking Monitor automatically checks the rankings of your web pages for these keywords, and it also checks the result types. The result type is displayed next to the keywords that you monitor:
Image Search Ranking
"When it comes to images the number one thing to consider is how you'd want to be found in image search? What you expect users to search for? And how can your site be useful to them when they find you?"
7. Google says hidden text does not make or break a website
Google’s John Mueller said on Reddit that Google uses many different signals for ranking and that a website won’t outrank your site just because of hidden text.
“A site is not going to outrank your site just because of hidden text. We use many, many signals for ranking. Inversely, just having hidden text on a page won’t get the site banned from Google.
Lots of sites get things wrong, lots of sites have text accidentally hidden (or even purposely hidden until you interact with the UI) — sites aren’t perfect and so our algorithms work to deal with these imperfections in a reasonable way.
Sometimes that means the top ranking site – the one our algorithms currently think is a good match for a user’s query – is one that does a lot of things technically incorrect.”
Sunday, February 9, 2020