Consolidating your website traffic on canonical URLs

In Search Console, the Performance report currently credits all page metrics to the exact URL that the user is referred to by Google Search. Although this provides very specific data, it makes property management more difficult. For example, if your site has mobile and desktop versions on different properties, you must open multiple properties to see all your Search data for the same piece of content.

To help unify your data, Search Console will soon begin assigning search metrics to the (Google-selected) canonical URL, rather than the URL referred to by Google Search. This change has several benefits:

  • It unifies all search metrics for a single piece of content into a single URL: the canonical URL. This shows you the full picture about a specific piece of content in one property.
  • For users with separate mobile or AMP pages, it unifies all (or most, since some mobile URLs may end up as canonical) of your data to a single property (the “canonical” property).
  • It improves the usability of the AMP and Mobile-Friendly reports. These reports currently show issues in the canonical page property, but show the impression in the property that owns the actual URL referred to by Google Search. After this change, the impressions and issues will be shown in the same property.

When will this happen?

We plan to transition all performance data at the end of March. In order to provide continuity to your data, we will pre-populate your unified data beginning from January 2018. We will also enable you to view both old and new versions for a few weeks during the transition to see the impact and understand the differences.

API and Data Studio users: The Search Console API will change to canonical data at the end of March.

How will this affect my data?

  • At an individual URL level, you will see traffic shift from any non-canonical (duplicate) URLs to the canonical URL.
  • At the property level, you will see data from your alternate property (for example, your mobile site) shifted to your “canonical property”. Your alternate property traffic probably won’t drop to zero in Search Console because canonicalization is at the page, not the property level, and your mobile property might have some canonical pages. However, for most users, most property-level data will shift to one property. AMP property traffic will drop to zero in most cases (except for self-canonical pages).
  • You will still be able to filter data by device, search appearance (such as AMP), country, and other dimensions without losing important information about your traffic.

You can see some examples of these traffic changes below.

Preparing for the change

  • Consider whether you need to change user access to your various properties; for example, do you need to add new users to your canonical property, or do existing users still need access to the non-canonical properties?
  • Modify any custom traffic reports you might have created in order to adapt for this traffic shift.
  • If you need to learn the canonical URL for a given URL, you can use the URL Inspection tool.
  • If you want to save your traffic data as calculated under the current system, download it using either the Performance report’s Export Data button or the Search Console API (a short API sketch follows this list).
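
For example, here is a minimal Node.js sketch that pulls Performance data through the Search Analytics API using the googleapis client library. The property URL and date range are placeholders, and you must supply your own authorized OAuth2 client:

const {google} = require('googleapis');

async function exportSearchData(auth) {
  // 'webmasters' is the API surface that serves Search Console data.
  const webmasters = google.webmasters({version: 'v3', auth});
  const res = await webmasters.searchanalytics.query({
    siteUrl: 'https://example.com/',  // your verified property
    requestBody: {
      startDate: '2018-01-01',
      endDate: '2018-03-31',
      dimensions: ['page', 'device'],
      rowLimit: 5000
    }
  });
  // Each row carries clicks, impressions, CTR, and position.
  console.log(res.data.rows);
}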

Examples

Here are a few examples showing how data might change on your site. In these examples, you can see how your traffic numbers would change between a canonical site (example.com) and an alternate site (m.example.com).

Important: In these examples, the desktop site contains all the canonical pages and the mobile site contains all the alternate pages. In the real world, your desktop site might contain some alternate pages and your mobile site might contain some canonical pages. You can determine the canonical for a given URL using the URL Inspection tool.
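
For reference, a typical separate-mobile-URL setup declares the relationship between the two versions with link annotations like these (illustrative URLs):

<!-- On the desktop (canonical) page, http://example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page">

<!-- On the mobile (alternate) page, http://m.example.com/page -->
<link rel="canonical" href="http://example.com/page">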

Total traffic

In the current version, some of your traffic is attributed to the canonical property and some to the alternate property. The new version should attribute all of your traffic to the canonical property.

[Screenshots: Performance report data, "Current" vs. "New, based on canonical URLs"]

  • Canonical property (http://example.com): Change +0.7K / +3K
  • Alternate property (http://m.example.com): Change -0.7K / -3K

Individual page traffic

You can see traffic changes between the duplicate and canonical URLs for individual pages in the Pages view. The next example shows how traffic that used to be split between the canonical and alternate pages is now all attributed to the canonical URL:

[Screenshots: Pages view data, "Old" vs. "New"]

  • Canonical property (http://example.com): Change +150 / +800
  • Alternate property (http://m.example.com): Change -150 / -800

Mobile traffic

In the current version, all of your mobile traffic is attributed to your m. property. The new version attributes all traffic to your canonical property when you apply the “Device: Mobile” filter, as shown here:

[Screenshots: Performance report with the "Device: Mobile" filter, "Old" vs. "New"]

  • Canonical property (http://example.com): Change +0.7K / +3K
  • Alternate property (http://m.example.com): Change -0.7K / -3K

In conclusion

We know that this change might seem a little confusing at first, but we’re confident that it will simplify your job of tracking traffic data for your site. If you have any questions or concerns, please reach out on the Webmaster Help Forum.

Source: Google Webmaster Central Blog

Official Google Webmaster Central Blog: Dynamic Rendering with Rendertron

Many frontend frameworks rely on JavaScript to show content. This can mean Google might take some time to index your content or update the indexed content. 

Dynamic rendering is useful for content that changes often and needs JavaScript to display. As an example, consider a small web app that fetches cat pictures from an API and renders them with client-side JavaScript:


  
const apiUrl = 'https://api.thecatapi.com/v1/images/search?limit=50';

const tpl = document.querySelector('template').content;
const container = document.querySelector('ul');

function init() {
  fetch(apiUrl)
    .then(response => response.json())
    .then(cats => {
      container.innerHTML = '';  // clear the list before re-rendering
      cats
        .map(cat => {
          const li = document.importNode(tpl, true);  // clone the <template> content
          li.querySelector('img').src = cat.url;
          return li;
        })
        .forEach(li => container.appendChild(li));
    });
}

init();

document.querySelector('button').addEventListener('click', init);
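
For context, the script above assumes markup roughly like this (a hypothetical sketch, not the demo page's exact source):

<h1>Cat pictures</h1>
<button>Load more cats</button>
<ul></ul>
<template>
  <li><img alt="A cat picture"></li>
</template>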

The mobile-friendly test shows that the page is mobile-friendly, but the screenshot is missing all the cats! The headline and button appear but none of the cat pictures are there.

While this problem is simple to fix, it’s a good exercise to learn how to set up dynamic rendering. Dynamic rendering allows Googlebot to see the cat pictures without changes to the web app code.

Set up the server

To serve the web application, let’s use Express, a Node.js library for building web servers.

const express = require('express');

const app = express();
const DIST_FOLDER = process.cwd() + '/docs';
const PORT = process.env.PORT || 8080;

// Serve static assets (images, css, etc.)
app.get('*.*', express.static(DIST_FOLDER));

// Point all other URLs to index.html for our single page app
app.get('*', (req, res) => {
  res.sendFile(DIST_FOLDER + '/index.html');
});

// Start Express Server
app.listen(PORT, () => {
  console.log(`Node Express server listening on http://localhost:${PORT} from ${DIST_FOLDER}`);
});

You can try the live example here; if you are using a modern browser, you should see a bunch of cat pictures. To run the project on your own computer, you need Node.js; then run the following commands:

npm install --save express rendertron-middleware
node server.js

Deploy Rendertron

  1. Create a new project in the Google Cloud console. Take note of the “Project ID” below the input field.

     [Screenshot: the form to create a new Google Cloud Platform project]



  2. Clone the Rendertron repository from GitHub with:

    git clone https://github.com/GoogleChrome/rendertron.git 

    cd rendertron 

  3. Run the following commands to install dependencies and build Rendertron on your computer:

    npm install && npm run build

  4. Enable Rendertron’s cache by creating a new file called config.json in the rendertron directory with the following content:

    { "datastoreCache": true }

  5. Run the following command from the rendertron directory. Substitute YOUR_PROJECT_ID with your project ID from step 1.

    gcloud app deploy app.yaml --project YOUR_PROJECT_ID

  6. Select a region of your choice and confirm the deployment. Wait for it to finish.


  7. Enter the URL YOUR_PROJECT_ID.appspot.com (substitute YOUR_PROJECT_ID with your actual project ID from step 1) in your browser. You should see Rendertron’s interface with an input field and a few buttons.


[Screenshot: Rendertron’s UI after deploying to Google Cloud Platform]

When you see the Rendertron web interface, you have successfully deployed your own Rendertron instance. Take note of your project’s URL (YOUR_PROJECT_ID.appspot.com) as you will need it in the next part of the process.

Add Rendertron to the server

The web server uses express.js, and Rendertron provides an express.js middleware. Run the following command in the directory of the server.js file:

npm install --save rendertron-middleware

This command installs the rendertron-middleware from npm so we can add it to the server:

const express = require('express');
const rendertron = require('rendertron-middleware');

const app = express();

Configure the bot list

Rendertron uses the user-agent HTTP header to determine if a request comes from a bot or a user’s browser. It has a well-maintained list of bot user agents to compare with. By default this list does not include Googlebot, because Googlebot can execute JavaScript. To make Rendertron render Googlebot requests as well, add Googlebot to the list of user agents:

const BOTS = rendertron.botUserAgents.concat('googlebot');

const BOT_UA_PATTERN = new RegExp(BOTS.join('|'), 'i');

Rendertron compares the user-agent header against this regular expression later.
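
In other words, the decision the middleware makes is roughly equivalent to this sketch (simplified, not Rendertron's actual source):

// Returns true when the request comes from a known bot,
// in which case the middleware proxies it to Rendertron.
function isBotRequest(req) {
  const ua = req.headers['user-agent'] || '';
  return BOT_UA_PATTERN.test(ua);
}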

Add the middleware

To send bot requests to the Rendertron instance, we need to add the middleware to our express.js server. The middleware checks the requesting user agent and forwards requests from known bots to the Rendertron instance. Add the following code to server.js and don’t forget to substitute “YOUR_PROJECT_ID” with your Google Cloud Platform project ID:

app.use(rendertron.makeMiddleware({
  proxyUrl: 'https://YOUR_PROJECT_ID.appspot.com/render',
  userAgentPattern: BOT_UA_PATTERN
}));

Bots requesting the sample website receive the static HTML from Rendertron, so the bots don’t need to run JavaScript to display the content.

Testing our setup

To test if the Rendertron setup was successful, run the mobile-friendly test again.

Unlike the first test, the cat pictures are now visible. In the HTML tab we can see all the HTML that the JavaScript code generated, and that Rendertron has removed the need for JavaScript to display the content.

Conclusion

You created a dynamic rendering setup without making any changes to the web app. With these changes, you can serve a static HTML version of the web app to crawlers.


Source: Google Webmaster Central Blog

We updated our job posting guidelines

Last year, we launched job search on Google to connect more people with jobs. When you provide Job Posting structured data, it helps drive more relevant traffic to your page by connecting job seekers with your content. To ensure that job seekers are getting the best possible experience, it’s important to follow our Job Posting guidelines.

We’ve recently made some changes to our Job Posting guidelines to help improve the job seeker experience.

  • Remove expired jobs
  • Place structured data on the job’s detail page
  • Make sure all job details are present in the job description

Remove expired jobs

When job seekers put in effort to find a job and apply, it can be very discouraging to discover that the job that they wanted is no longer available. Sometimes, job seekers only discover that the job posting is expired after deciding to apply for the job. Removing expired jobs from your site may drive more traffic because job seekers are more confident when jobs that they visit on your site are still open for application. For more information on how to remove a job posting, see Remove a job posting.

Place structured data on the job’s detail page

Job seekers find it confusing when they land on a list of jobs instead of the specific job’s detail page. To fix this, put structured data on the most detailed leaf page possible. Don’t add structured data to pages intended to present a list of jobs (for example, search result pages) and only add it to the most specific page describing a single job with its relevant details.

Make sure all job details are present in the job description

We’ve also noticed that some sites include information in the JobPosting structured data that is not present anywhere in the job posting. Job seekers are confused when the job details they see in Google Search don’t match the job description page. Make sure that the information in the JobPosting structured data always matches what’s on the job posting page. Here are some examples, followed by a markup sketch:

  • If you add salary information to the structured data, then also add it to the job posting. Both salary figures should match.
  • The location in the structured data should match the location in the job posting.
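
As an illustration, a minimal JobPosting block with salary and location that match the visible posting might look like this (all values are hypothetical):

<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "JobPosting",
  "title": "Software Engineer",
  "description": "<p>The full job description, matching the visible page content.</p>",
  "datePosted": "2018-06-01",
  "validThrough": "2018-08-01T00:00",
  "hiringOrganization": { "@type": "Organization", "name": "Example Co" },
  "jobLocation": {
    "@type": "Place",
    "address": { "@type": "PostalAddress", "addressLocality": "Zurich", "addressCountry": "CH" }
  },
  "baseSalary": {
    "@type": "MonetaryAmount",
    "currency": "CHF",
    "value": { "@type": "QuantitativeValue", "value": 90000, "unitText": "YEAR" }
  }
}
</script>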

Providing structured data content that is consistent with the content of the job posting pages not only helps job seekers find the exact job that they were looking for, but may also drive more relevant traffic to your job postings and therefore increase the chances of finding the right candidates for your jobs.

If your site violates the Job Posting guidelines (including the guidelines in this blog post), we may take manual action against your site and it may not be eligible for display in the jobs experience on Google Search. You can submit a reconsideration request to let us know that you have fixed the problem(s) identified in the manual action notification. If your request is approved, the manual action will be removed from your site or page.

For more information, visit our Job Posting developer documentation and our JobPosting FAQ.

Source: Google Webmaster Central Blog

Send your recipes to the Google Assistant

Last year, we launched Google Home with recipe guidance, providing users with step-by-step instructions for cooking recipes. With more people using Google Home every day, we’re publishing new guidelines so your recipes can support this voice guided experience. You may receive traffic from more sources, since users can now discover your recipes through the Google Assistant on Google Home. The updated structured data properties provide users with more information about your recipe, resulting in higher quality traffic to your site.

Updated recipe properties to help users find your recipes

We updated our recipe developer documentation to help users find your recipes and experience them with Google Search and the Google Assistant on Google Home. This will enable more potential traffic to your site. To ensure that users can access your recipe in more ways, we need more information about your recipe. We now recommend the following properties:

  • Videos: Show users how to make the dish by adding a video array
  • Category: Tell users the type of meal or course of the dish (for example, “dinner”, “dessert”, “entree”)
  • Cuisine: Specify the region associated with your recipe (for example, “Mediterranean”, “American”, “Cantonese”)
  • Keywords: Add other terms for your recipe such as the season (“summer”), the holiday (“Halloween”, “Diwali”), the special event (“wedding”, “birthday”), or other descriptors (“quick”, “budget”, “authentic”)

We also added more guidance for recipeInstructions. You can specify each step of the recipe with the HowToStep property, and sections of steps with the HowToSection property.

Add recipe instructions and ingredients for the Google Assistant

We now require the recipeIngredient and recipeInstructions properties if you want to support the Google Assistant on Google Home. Adding these properties can make your recipe eligible for integration with the Google Assistant, enabling more users to discover your recipes. If your recipe doesn’t have these properties, it won’t be eligible for guidance with the Google Assistant, but it can still be eligible to appear in Search results.
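
As an illustration, a Recipe block using the recommended and required properties might look like this (all values are hypothetical):

<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Recipe",
  "name": "Classic Pancakes",
  "recipeCategory": "breakfast",
  "recipeCuisine": "American",
  "keywords": "quick, budget",
  "recipeIngredient": ["2 cups flour", "1 cup milk", "2 eggs"],
  "recipeInstructions": [
    { "@type": "HowToStep", "text": "Whisk the ingredients into a smooth batter." },
    { "@type": "HowToStep", "text": "Cook on a hot griddle until golden." }
  ]
}
</script>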

For more information, visit our Recipe developer documentation. If you have questions about the feature, please ask us in the Webmaster Help Forum.

Source: Google Webmaster Central Blog

Official Google Webmaster Central Blog: Google I/O 2018

Google I/O 2018 starts today in California, before an international audience of 7,000+ developers, and will run until Thursday night. It is our annual developer festival, where product announcements are made, new APIs and frameworks are introduced, and Product Managers present the latest from Google.

However, you don’t have to physically attend the event to take advantage of this once-a-year opportunity: many sessions and talks are live streamed on YouTube for anyone to watch. You will find the full-event schedule here.

Dozens upon dozens of talks will take place over the next 3 days. We have hand-picked the talks that we think will be the most interesting for webmasters and SEO professionals. Each link below brings you to a page with more details about the talk and how to tune in to the live stream. All times are Pacific Time (PT). We might add other sessions to this list.

Tuesday, May 8th

  • 3pm – Web Security post Spectre/Meltdown, with Emily Schechter and Chris Palmer – more info.
  • 5pm – Dru Knox and Stephan Somogyi talk about building a seamless web with Chrome – more info.

Wednesday, May 9th

  • 9.30am – Ewa Gasperowicz and Addy Osmani talk about Web Performance and increasing control over the loading experience – more info.
  • 10.30am – Alberto Medina and Thierry Muller will explain how to make a WordPress site progressive – more info.
  • 11.30am – Rob Dodson and Dominic Mazzoni will cover “What’s new in web accessibility” – more info.
  • 3.30pm – Michael Bleigh will introduce how to leverage AMP in Firebase for a blazing fast website – more info.
  • 4.30pm – Rick Viscomi and Vinamrata Singal will introduce the latest with Lighthouse and Chrome UX Report for Web Performance – more info.

Thursday, May 10th

  • 8.30am – John Mueller and Tom Greenaway will talk about building Search-friendly JavaScript websites – more info.
  • 9.30am – Build e-commerce sites for the modern web with AMP, PWA, and more, with Adam Greenberg and Rowan Merewood – more info.
  • 12.30pm – Session on “Building a successful web presence with Google Search” by John Mueller and Mariya Moeva – more info.

We hope you can make the time to watch the talks online and participate in the excitement of I/O! The videos will also be available on YouTube after the event, in case you can’t tune in live.

Posted by Vincent Courson, Search Outreach Specialist, and the Google Webmasters team



Source: Google Webmaster Central Blog

Helping webmasters and content creators

Great websites are the result of the hard work of website owners who make their content and services accessible to the world. Even though it’s simpler now to run a website than it was years ago, it can still feel like a complex undertaking. This is why we invest a lot of time and effort in improving Google Search so that website owners can spend more time focusing on building the most useful content for their users, while we take care of helping users find that content. 

Most website owners find they don’t have to worry much about what Google is doing—they post their content, and then Googlebot discovers, crawls, indexes and understands that content, to point users to relevant pages on those sites. However, sometimes the technical details still matter, and sometimes a great deal.

For those times when site owners would like a bit of help from someone at Google, or an explanation for why something works a particular way, or why things appear in a particular way, or how to fix what looks like a technical glitch, we have a global team dedicated to making sure there are many places for a website owner to get help from Google and knowledgeable members of the community.

The first place to start for help is Google Webmasters, a place where all of our support resources (many of which are available in 40 languages) are within easy reach.

Our second path to getting help is through our Google Webmaster Central Help Forums. We have forums in 16 languages—in English, Spanish, Hindi, French, Italian, Portuguese, Japanese, German, Russian, Turkish, Polish, Bahasa Indonesia, Thai, Vietnamese, Chinese and Korean. The forums are staffed with dedicated Googlers who are there to make sure your questions get answered. Aside from the Googlers who monitor the forums, there is an amazing group of Top Contributors who generously offer their time to help other members of the community—many times providing greater detail and analysis for a particular website’s content than we could. The forums allow for both a public discussion and, if the case requires it, for private follow-up replies in the forum.

A third path for support to website owners is our series of Online Webmaster Office Hours — in English, German, Japanese, Turkish, Hindi and French. Anyone who joins these is welcome to ask us questions about website appearance in Google Search, which we will answer to the best of our abilities. All of our team members think that one of the best parts of speaking at conferences and events is the opportunity to answer questions from the audience, and the online office hours format creates that opportunity for many more people who might not be able to travel to a specialized event. You can always check out the Google Webmaster calendar for upcoming webmaster office hours and live events.


Beyond all these resources, we also work hard to ensure that everyone who wants to understand Google Search can find relevant info on our frequently updated site How Search Works.


While how a website behaves on the web is openly visible to all who can see it, we know that some website owners prefer not to make it known their website has a problem in a public forum. There’s no shame in asking for support, but if you have an issue for your website that seems sensitive—for which you don’t think you can share all the details publicly—you can call out that you would prefer to share necessary details only with someone experienced and who is willing to help, using the forum’s “Private Reply” feature.

Are there other things you think we should be doing that would help your website get the most out of search? Please let us know — in our forums, our office hours, or via Twitter @googlewmc.

Posted by Juan Felipe Rincón from Google’s Webmaster Outreach & Support team



Source: Google Webmaster Central Blog

Google Search at I/O 2018

With the eleventh annual Google I/O wrapped up, it’s a great time to reflect on some of the highlights.

What we did at I/O

The event was a wonderful way to meet many great people from various communities across the globe, exchange ideas, and gather feedback. Besides many great web sessions, codelabs, and office hours, we shared a few things with the community in two sessions specific to Search.

The sessions included the launch of JavaScript error reporting in the Mobile-Friendly Test tool, dynamic rendering (we will discuss this in more detail in a future post), and an explanation of how CMSs can use the Indexing and Search Console APIs to provide users with insights. For example, Wix lets their users submit their homepage to the index and see it in Search results instantly, and Squarespace created a Google Search keywords report to help webmasters understand what prospective users search for.

During the event, we also presented the new Search Console in the Sandbox area for people to try and were happy to get a lot of positive feedback, from people being excited about the AMP Status report to others exploring how to improve their content for Search.

Hands-on codelabs, case studies and more

We presented the Structured Data Codelab that walks you through adding and testing structured data. We were really happy to see that it ended up being one of the top 20 codelabs by completions at I/O. If you want to learn more about the benefits of using Structured Data, check out our case studies.

During the in-person office hours we saw a lot of interest around HTTPS, mobile-first indexing, AMP, and many other topics. The in-person Office Hours were a wonderful addition to our monthly Webmaster Office Hours hangout. The questions and comments will help us adjust our documentation and tools by making them clearer and easier to use for everyone.

Highlights and key takeaways

We also repeated a few key points that web developers should have an eye on when building websites, such as:

  • Indexing and rendering don’t happen at the same time. We may defer the rendering to a later point in time.
  • Make sure the content you want in Search has metadata, correct HTTP statuses, and the intended canonical tag.
  • Hash-based routing (URLs with “#”) should be deprecated in favour of the JavaScript History API in Single Page Apps (see the sketch after this list).
  • Links should have an href attribute pointing to a URL, so Googlebot can follow the links properly.
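
As a minimal sketch of the History API approach (renderRoute is a hypothetical view-rendering function):

// Instead of hash-based routing such as: location.hash = '#/jobs';
function navigate(path) {
  history.pushState({}, '', path);  // update the address bar without a reload
  renderRoute(path);                // render the view for the new URL
}

// Re-render when the user presses back/forward:
window.addEventListener('popstate', () => renderRoute(location.pathname));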

Make sure to watch this talk for more on indexing, dynamic rendering and troubleshooting your site. If you want to learn more about what to do as a CMS developer or theme author, or about Structured Data, watch this talk.

We were excited to meet some of you at I/O as well as the global I/O extended events and share the latest developments in Search. To stay in touch, join the Webmaster Forum or follow us on Twitter, Google+, and YouTube.

Source: Google Webmaster Central Blog

New URL inspection tool & more in Search Console

A few months ago, we introduced the new Search Console. Here are some updates on how it’s progressing.

Welcome “URL inspection” tool

One of our most common user requests in Search Console is for more details on how Google Search sees a specific URL. We listened, and today we’ve started launching a new tool, “URL inspection,” to provide these details so Search becomes more transparent. The URL Inspection tool provides detailed crawl, index, and serving information about your pages, directly from the Google index.

Enter a URL that you own to learn the last crawl date and status, any crawling or indexing errors, and the canonical URL for that page. If the page was successfully indexed, you can see information and status about any enhancements we found on the page, such as a linked AMP version or rich results like Recipes and Jobs.



[Screenshot: URL is indexed with valid AMP enhancement]

If a page isn’t indexed, you can learn why. The new report includes information about noindex robots meta tags and Google’s canonical URL for the page.
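
For reference, the noindex directive appears in the page HTML as a standard robots meta tag:

<meta name="robots" content="noindex">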



[Screenshot: URL is not indexed due to ‘noindex’ meta tag in the HTML]

A single click can take you to the issue report showing all other pages affected by the same issue to help you track down and fix common bugs.

We hope that the URL Inspection tool will help you debug issues with new or existing pages in the Google Index. We began rolling it out today; it will become available to all users in the coming weeks.

More exciting updates

In addition to the launch of URL inspection, we recently launched a few more features and reports in the new Search Console.

Thank you for your feedback

We are constantly reading your feedback, conducting surveys, and monitoring usage statistics of the new Search Console. We are happy to see so many of you using the new issue validation flow in Index Coverage and the AMP report. We notice that issues tend to get fixed quicker when you use these tools. We also see that you appreciate the updates on the validation process that we provide by email or on the validation details page.

We want to thank everyone who provided feedback: it has helped us improve our flows and fix bugs on our side.

More to come

The new Search Console is still beta, but it’s adding features and reports every month. Please keep sharing your feedback through the various channels and let us know how we’re doing.

Source: Google Webmaster Central Blog

Introducing the Indexing API for job posting URLs

Last June we launched a job search experience that has since connected tens of millions of job seekers around the world with relevant job opportunities from third party providers across the web. Timely indexing of new job content is critical because many jobs are filled relatively quickly. Removal of expired postings is important because nothing’s worse than finding a great job only to discover it’s no longer accepting applications.

Today we’re releasing the Indexing API to address this problem. This API allows any site owner to directly notify Google when job posting pages are added or removed. This allows Google to schedule job postings for a fresh crawl, which can lead to higher quality user traffic and job applicant satisfaction. Currently, the Indexing API can only be used for job posting pages that include job posting structured data.

For websites with many short-lived pages like job postings, the Indexing API keeps job postings fresh in Search results because it allows updates to be pushed individually. This API can be integrated into your job posting flow, allowing high quality job postings to be searchable quickly after publication. In addition, you can check the last time Google received each kind of notification for a given URL.
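
For reference, a notification is a single authenticated POST to the API's publish endpoint; the job URL below is illustrative:

POST https://indexing.googleapis.com/v3/urlNotifications:publish

{
  "url": "https://example.com/jobs/1234",
  "type": "URL_UPDATED"
}

Use "URL_DELETED" as the type when a posting expires.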

Follow the Quickstart guide to see how the Indexing API works. If you have any questions, ask us in the Webmaster Help Forum. We look forward to hearing from you!

Source: Google Webmaster Central Blog

Official Google Webmaster Central Blog: How we fought webspam

We always want to make sure that when you use Google Search to find information, you get the highest quality results. But, we are aware of many bad actors who are trying to manipulate search ranking and profit from it, which is at odds with our core mission: to organize the world’s information and make it universally accessible and useful. Over the years, we’ve devoted a huge effort toward combating abuse and spam on Search. Here’s a look at how we fought abuse in 2017.

We call these various types of abuse that violate the webmaster guidelines “spam.” Our evaluation indicated that for many years, less than 1 percent of search results users visited are spammy. In the last couple of years, we’ve managed to further reduce this by half.

[Infographic: Google webspam trends and how we fought webspam in 2017]


As we continued to improve, spammers also evolved. One of the trends in 2017 was an increase in website hacking—both for spamming search ranking and for spreading malware. Hacked websites are serious threats to users because hackers can take complete control of a site, deface homepages, erase relevant content, or insert malware and harmful code. They may also record keystrokes, stealing login credentials for online banking or financial transactions. In 2017 we focused on reducing this threat, and were able to detect and remove from search results more than 80 percent of these sites. But hacking is not just a spam problem for search users—it affects the owners of websites as well. To help website owners keep their websites safe, we created a hands-on resource to help webmasters strengthen their websites’ security and revamped our help resources to help webmasters recover from a hacked website. The guides are available in 19 languages.

We also recognize the importance of robust content management systems (CMSs). A large percentage of websites run on one of several popular CMSs, and spammers have exploited this by finding ways to abuse their provisions for user-generated content, such as posting spam content in comment sections or forums. We’re working closely with many of the providers of popular content management systems like WordPress and Joomla to help them fight spammers that abuse their forums, comment sections and websites.

Another abuse vector is the manipulation of links, one of the foundational ranking signals for Search. In 2017 we doubled down on our efforts to remove unnatural links via ranking improvements and scalable manual actions. We have observed a year-over-year reduction of spam links by almost half.

Working with users and webmasters for a better web

We’re here to listen: our automated systems are constantly working to detect and block spam. Still, we always welcome hearing from you when something seems phishy. Last year, we were able to take action on nearly 90,000 user reports of search spam.

Reporting spam, malware and other issues you find helps us protect the site owner and other searchers from this abuse. You can file a spam report, a phishing report or a malware report. We very much appreciate these reports—a big THANK YOU to all of you who submitted them.


We also actively work with webmasters to maintain the health of the web ecosystem. Last year, we sent 45 million messages to registered website owners via Search Console letting them know about issues we identified with their websites. More than 6 million of these messages are related to manual actions, providing transparency to webmasters so they understand why their sites got manual actions and how to resolve the issue.

Last year, we released a beta version of a new Search Console to a limited number of users and afterwards, to all users of Search Console. We listened to what matters most to the users, and started with popular functionalities such as Search performance, Index Coverage and others. These can help webmasters optimize their websites’ Google Search presence more easily.

Through enhanced Safe Browsing protections, we continue to protect more users from bad actors online. In the last year, we have made significant improvements to our Safe Browsing protection, such as broadening our protection of macOS devices, enabling predictive phishing protection in Chrome, cracking down on mobile unwanted software, and launching significant improvements to our ability to protect users from deceptive Chrome extension installation.


We have a multitude of channels to engage directly with webmasters. We have dedicated team members who meet with webmasters regularly both online and in-person. We conducted more than 250 online office hours, online events and offline events around the world in more than 60 cities to audiences totaling over 220,000 website owners, webmasters and digital marketers. In addition, our official support forum has answered a high volume of questions in many languages. Last year, the forum had 63,000 threads generating over 280,000 contributing posts by 100+ Top Contributors globally. For more details, see this post. Apart from the forums, blogs and the SEO starter guide, the Google Webmaster YouTube channel is another channel to find more tips and insights. We launched a new SEO snippets video series to help with short and to-the-point answers to specific questions. Be sure to subscribe to the channel!


Despite all these improvements, we know we’re not yet done. We’re relentless in our pursuit of an abuse-free user experience, and will keep improving our collaboration with the ecosystem to make it happen.

Posted by Cody Kwok, Principal Engineer

Source: Google Webmaster Central Blog