Google Outlines Vision for a More Private Web
Writing in The Keyword on 22 August, Justin Schuh, Director of Chrome Engineering, outlined a plan to "develop a set of standards" to enhance user privacy on the web.
Named the Privacy Sandbox, the plan proposes privacy budgets, wherein browsers allow websites to gather aggregated information through API calls without revealing the identity of individual users. Once a site's budget is exhausted, the browser ignores any further calls from that website.
Schuh states: "We want to find a solution that both really protects user privacy and also helps content remain freely accessible on the web."
The latter point is significant: according to Google, publishers lose an average of 52% of revenue when advertising is made less relevant by the blocking of cookies.
As Schuh admits in conversation with TechCrunch: "It’s going to be a multi-year journey".
In a separate post, published on The Chromium Blog, Google called for ideas from the web community to help develop these new standards.
Websites that receive income from onsite advertising should consider how the Privacy Sandbox could affect revenue. Read through the early proposals outlined by Google.
Google Updates Dynamic Rendering Document
Taking to Twitter on 22 August, Google’s Martin Splitt announced that he had updated Google’s support page on dynamic rendering.
"Yay! @LizziHarvey and I updated the dynamic rendering docs to clarify a very frequent question: Is dynamic rendering considered cloaking? The answer shouldn't surprise you 😂" — Martin Splitt (@g33konaut), 22 August 2019
The updated document now reads: "Googlebot generally doesn’t consider dynamic rendering as cloaking. As long as your dynamic rendering produces similar content, Googlebot won’t view dynamic rendering as cloaking."
Ensure that your site does not serve different information to users and search engines. You can read more about what is considered cloaking in this Search Console Help guide.
Chrome to Support Lazy Loading at Desktop Level
In an announcement on Web.dev, Google stated that Chrome would soon support lazy loading on desktop.
For webmasters without the skills of a developer, lazy loading can be implemented with just a basic knowledge of HTML, using the new `loading` attribute. Supported values include `lazy`, `eager`, and `auto`.
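As a minimal sketch (the file paths and dimensions below are placeholders), the `loading` attribute is added directly to `img` and `iframe` elements:

```html
<!-- "lazy" defers loading until the element nears the viewport -->
<img src="photo.jpg" loading="lazy" alt="A descriptive caption" width="400" height="300">

<!-- "eager" loads immediately; "auto" leaves the decision to the browser -->
<img src="hero.jpg" loading="eager" alt="Above-the-fold image" width="800" height="400">

<!-- The attribute also applies to iframes -->
<iframe src="embedded-widget.html" loading="lazy"></iframe>
```

Specifying `width` and `height` alongside the attribute helps the browser reserve space before the deferred asset arrives, avoiding layout shifts.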
Lazy loading decreases initial load time because off-screen images and iframes are only fetched when they are needed. This can be advantageous for users on slow connections. The feature may continue to change until the stable version of Chrome 76 is released.
Read through the announcement and consider whether your site’s performance could benefit from lazy loaded pages.
Testing Tools Updated
Writing on The Webmaster Central Blog on 7 August, Google announced that it had updated its testing tools to use the evergreen Chromium renderer.
Websites using ES6+, Web Components, and other modern platform features will now be rendered with the latest stable Chromium, both by Googlebot and by Google's other testing tools.
The changes will affect rendering within:
- Search Console's URL inspection tool
- Mobile-friendly test
- Rich results test
- AMP test
Webmasters are now able to view precisely what Googlebot sees during a crawl, so tests conducted with the updated tools accurately reflect how pages render in search.
Core Update Guidance Provided to Webmasters
On 1 August Google published a lengthy post about "what webmasters should know" regarding its core updates.
The search engine stresses that although some pages might not perform as well after a core update, it does not mean that they have violated the search engine’s guidelines.
Google states performance fluctuations are instead related to how content is assessed on an overall basis.
Sites that do experience drops should consider a variety of questions, including:
- Does the content provide a substantial, complete or comprehensive description of the topic?
- Do the headline and page title avoid being exaggerated or shocking in nature?
- If the content draws on other sources, does it avoid simply copying or rewriting those sources and instead provide substantial additional value and originality?
- Is the content free from spelling or stylistic issues?
- Does the content provide substantial value when compared to other pages in search results?
Read through Google’s Search Quality Rater Guidelines to make sure that your site is creating the best possible content for users, and consider whether your site demonstrates Expertise, Authoritativeness and Trustworthiness (E-A-T).
Changes Made to Image Results on Desktop
In an announcement in The Keyword on 6 August, Google stated that it was overhauling the way that image search results are presented on desktop devices.
When users select an image, it will now appear in a side panel next to the search results. Users will also be presented with added information sourced from the page where it was selected.
Images of products now produce essential information, such as price, availability, and user reviews.
A Reddit thread on 28 August also noted that users are no longer able to filter image results through size and colour options.
Ensure that your images provide as much information to users and search engines as possible and remember to include image titles, alt information, and appropriate descriptions. Read through Google’s image best practice guidelines for more details.
Added Structured Data Types Reporting in Search Console
Three more structured data types were added to the rich results report during the second week of August:
- Product markup, which allows for product descriptions to appear within SERPs.
- Sitelink search box markup, which allows Google to display a search box so that users can refine searches within a particular website.
- Unparsable types, which refers to structured data that is not correctly implemented.
Structured data types that were already in the rich results report include:
- Q&A pages
- Job Postings
The product markup report will be particularly useful for eCommerce sites. If you have recently added the data type to your products, check Search Console to see whether it has flagged any errors.
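To illustrate the kind of Product markup the new report covers, here is a sketch in JSON-LD. All names, URLs, and values below are placeholders, not a definitive implementation:

```html
<!-- Illustrative Product markup; every value here is a placeholder -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://www.example.com/images/widget.jpg",
  "description": "A placeholder product description.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```

The `offers` and `aggregateRating` properties correspond to the price, availability, and review details that can surface in SERPs.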
Featured Snippets Algorithm Change
On 1 August Google announced in The Keyword that it had changed its Featured Snippets algorithm to make the feature more useful to users.
According to Google, the algorithm now understands when information needs to be "fresh" or "new". This is important when, for example, a user searches for information tied to the current year.
Google gives three examples of featured snippets that require new information:
- Snippets discussing current events.
- Snippets featuring information that is regularly updated.
- Snippets featuring information that changes over time.
The algorithm is also able to ascertain what information might change quickly or suddenly.
Google states: "For queries where fresh information is important, our systems will try to find the most useful and up-to-date featured snippets."
The search engine also mentions that information that is evergreen does not need to be changed regularly for inclusion in featured snippets.
Publishers should consider which elements of their content are "evergreen" and which need regular updating to remain eligible for Featured Snippet inclusion.
Google provides clarification and examples for each of the above snippet types in The Keyword.
Study Finds That 33% of People in the US are Using Voice Search Regularly
A report by eMarketer states that just over one-third of people are now using Google’s Voice Search functionality at least once per month.
The report estimates that 111 million people will use voice search throughout 2019, up from 102 million in 2018.
The same study estimates that 77 million people in the US will use smart speakers this year, with another 11 million to be added by 2021.
Voice Search is becoming an increasingly popular type of search. If your website is based in the US, consider adding Speakable structured data so that your site's content can be distributed across different channels to reach a broader audience base.
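As a sketch of what Speakable markup looks like (the page name, URL, and CSS selectors below are hypothetical), the `speakable` property identifies sections of a page suitable for text-to-speech playback:

```html
<!-- Illustrative Speakable markup; names, URL, and selectors are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "name": "Example Article",
  "url": "https://www.example.com/example-article",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": [".headline", ".summary"]
  }
}
</script>
```

The `SpeakableSpecification` type accepts either `cssSelector` or `xpath` to point at the page elements that read well aloud, such as headlines and short summaries.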
Bing Makes Website Verification Easier for Webmasters
On 21 August Bing announced that it was adding support for website verification through Domain Connect.
Webmasters using IONOS, GoDaddy, Media Temple, 1&1, or Plesk will now be able to verify their websites within Bing Webmaster Tools.
This means that there are now four verification methods:
- XML file authentication
- Meta tag authentication
- Adding a CNAME record to DNS
- Domain Connect verification
Domain Connect is an open standard that allows webmasters to easily configure DNS for a domain hosted at one DNS provider so that it works with a service run by an independent service provider.
Read through Bing’s instructions on how to verify your site if you have not already done so. The new method will allow more webmasters to access Bing Webmaster Tools and take advantage of its advanced reporting suite.
Yelp Introduces Personalised Search Results
On 27 August Yelp announced that it was allowing its users to tailor personalised search results around their lifestyles, diet, and accessibility.
Search Console API Cleaned by Google
On 26 August Google announced that it was "cleaning up" its Search Console API and that it would no longer support a small set of Android app search appearance types.
Google Explains April's Indexing Issues
In light of April’s indexing issues, Google published a post on its Webmaster Central Blog explaining how the problems occurred and what solutions it put in place to resolve them.
Content Policies to be Simplified for Publishers
Writing in Google Ad Manager, Google announced that it was going to update its content policies for publishers sometime in September.
Over Half of Consumers Research Products Through Videos
As reported by the Search Engine Journal, new research by Google has found that 55% of consumers use videos to research products before making purchase decisions.