Google Webmaster Central Blog

Official news on crawling and indexing sites for the Google index.

#NoHacked: A year in review

2017-03-20T08:03:47.732-07:00

We hope your year started out safe and secure! We wanted to share with you a summary of our 2016 work as we continue our #NoHacked campaign. Let's start with some trends on hacked sites from the past year.

State of Website Security in 2016

First off, some unfortunate news. We saw the number of hacked sites increase by approximately 32% in 2016 compared to 2015. We don't expect this trend to slow down. As hackers get more aggressive and more sites become outdated, hackers will continue to capitalize by infecting more sites.

On the bright side, 84% of webmasters who apply for reconsideration are successful in cleaning their sites. However, 61% of webmasters who were hacked never received a notification from Google that their site was infected because their sites weren't verified in Search Console. Remember to register for Search Console if you own or manage a site. It's the primary channel that Google uses to communicate site health alerts.

More Help for Hacked Webmasters

We've been listening to your feedback to better understand how we can help webmasters with security issues. One of the top requests was easier-to-understand documentation about hacked sites. As a result, we've been hard at work making our documentation more useful.

First, we created new documentation to give webmasters more context when their site has been compromised. Here is a list of the new help documentation:

  • Top ways websites get hacked by spammers
  • Glossary for Hacked Sites
  • FAQs for Hacked Sites
  • How do I know if my site is hacked?

Next, we created clean-up guides for sites affected by known hacks. We've noticed that sites are often affected in similar ways when hacked. By investigating the similarities, we were able to create clean-up guides for specific known types of hacks. Below is a short description of each of the guides we created:

Gibberish Hack: The gibberish hack automatically creates many pages with non-sensical sentences filled with keywords on the target site. Hackers do this so the hacked pages show up in Google Search. Then, when people try to visit these pages, they're redirected to an unrelated page, like a porn site. Learn more on how to fix this type of hack.

Japanese Keywords Hack: The Japanese keywords hack typically creates new pages with Japanese text on the target site, in randomly generated directory names. These pages are monetized using affiliate links to stores selling fake brand merchandise, and then shown in Google Search. Sometimes the accounts of the hackers get added in Search Console as site owners. Learn more on how to fix this type of hack.

Cloaked Keywords Hack: The cloaked keywords and link hack automatically creates many pages with non-sensical sentences, links, and images. These pages sometimes contain basic template elements from the original site, so at first glance they might look like normal parts of the target site, until you read the content. In this type of attack, hackers usually use cloaking techniques to hide the malicious content and make the injected page appear as part of the original site or a 404 error page. Learn more on how to fix this type of hack.

Prevention is Key

As always, it's best to take a preventative approach and secure your site rather than dealing with the aftermath. Remember, a chain is only as strong as its weakest link. You can read more about how to identify vulnerabilities on your site in our hacked help guide. We also recommend staying up-to-date on releases and announcements from your Content Management System (CMS) providers and software/hardware vendors.

Looking Forward

Hacking behavior is constantly evolving, and research allows us to stay up to date on and combat the latest trends. You can learn about our latest research publications on our information security research site. Highlighted below are a few studies specific to website compromises:

  • Cloak of Visibility: Detecting When Machines Browse a Different Web
  • Investigating Commercial Pay-Per-Install and the Distribution of Unwanted Software
  • Users Really Do Plug in USB Drives They Find
  • Ad Injection at S[...]



Closing down for a day

2017-03-07T11:04:29.745-08:00

Note: This post is specific to Google's organic web-search. For Google's other services, please check with the appropriate help center (e.g., for Google Shopping) or help forum.

Even in today's "always-on" world, sometimes businesses want to take a break. There are times when even their online presence needs to be paused. This blog post covers some of the available options so that a site's search presence isn't affected.

Option: Block cart functionality

If a site only needs to block users from buying things, the simplest approach is to disable that specific functionality. In most cases, shopping cart pages can either be blocked from crawling through the robots.txt file, or blocked from indexing with a robots meta tag. Since search engines either won't see or won't index that content, you can communicate this to users in an appropriate way. For example, you may disable the link to the cart, add a relevant message, or display an informational page instead of the cart.

Option: Always show an interstitial or pop-up

If you need to block the whole site from users, be it with a "temporarily unavailable" message, informational page, or popup, the server should return a 503 HTTP result code ("Service Unavailable"). The 503 result code makes sure that Google doesn't index the temporary content that's shown to users. Without the 503 result code, the interstitial would be indexed as your website's content.

Googlebot will retry pages that return 503 for up to about a week before treating it as a permanent error that can result in those pages being dropped from the search results. You can also include a "Retry-After" header to indicate how long the site will be unavailable. Blocking a site for longer than a week can have negative effects on the site's search results regardless of the method that you use.

Option: Switch the whole website off

Turning the server off completely is another option. You might also do this if you're physically moving your server to a different data center. For this, have a temporary server available to serve a 503 HTTP result code for all URLs (with an appropriate informational page for users), and switch your DNS to point to that server during that time.

  1. Set your DNS TTL to a low time (such as 5 minutes) a few days in advance.
  2. Change the DNS to the temporary server's IP address.
  3. Take your main server offline once all requests go to the temporary server.
  … your server is now offline …
  4. When ready, bring your main server online again.
  5. Switch DNS back to the main server's IP address.
  6. Change the DNS TTL back to normal.

We hope these options cover the common situations where you'd need to disable your website temporarily. If you have any questions, feel free to drop by our webmaster help forums!

P.S. If your business is active locally, make sure to reflect these closures in the opening hours for your local listings too!

Posted by John Mueller, Webmaster Trends Analyst, Switzerland
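To make the 503 option concrete, a temporary "closed for the day" response could look like the following sketch (the date and page body are placeholders; Retry-After accepts either an HTTP date or a number of seconds):

    HTTP/1.1 503 Service Unavailable
    Retry-After: Sat, 11 Mar 2017 08:00:00 GMT
    Content-Type: text/html

    <html><body><p>We're taking a short break. We'll be back tomorrow!</p></body></html>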



Introducing the Mobile-Friendly Test API

2017-01-31T04:53:37.944-08:00

With so many users on mobile devices, having a mobile-friendly web is important to us all. The Mobile-Friendly Test is a great way to check individual pages manually. We're happy to announce that this test is now available via API as well.

The Mobile-Friendly Test API lets you test URLs using automated tools. For example, you could use it to monitor important pages on your website in order to prevent accidental regressions in templates that you use. The API method runs all tests, and returns the same information - including a list of the blocked URLs - as the manual test. The documentation includes simple samples to help get you started quickly.
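For illustration, a request might look like the following (this assumes the endpoint documented for the Search Console API at launch; check the current documentation for the exact path, parameters, and authentication):

    POST https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run?key=YOUR_API_KEY
    Content-Type: application/json

    {
      "url": "https://www.example.com/"
    }

The JSON response contains a mobile-friendliness verdict for the page, along with any issues found, such as resources that could not be loaded.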


We hope this API makes it easier to check your pages for mobile-friendliness and to get any such issues resolved faster. We'd love to hear how you use the API -- leave us a comment here, and feel free to link to any code or implementation that you've set up! As always, if you have any questions, feel free to drop by our webmaster help forum.





Protect your site from user generated spam

2017-01-27T05:49:52.777-08:00

As a website owner, you might have come across some auto-generated content in comments sections or forum threads. When such content is created on your pages, not only does it disrupt those visiting your site, but it also shows Google and other search engines content that you may not want associated with your site. In this blog post, we will give you tips to help you deal with this type of spam on your site and forum.

Some spammers abuse sites owned by others by posting deceptive content and links, in an attempt to get more traffic to their sites. Comments and forum threads can be a really good source of information and an efficient way of engaging a site's users in discussions. This valuable content should not be buried by auto-generated keywords and links placed there by spammers.

There are many ways of securing your site's forums and comment threads and making them unattractive to spammers:

  • Keep your forum software updated and patched. Take the time to keep your software up-to-date and pay special attention to important security updates. Spammers take advantage of security issues in older versions of blogs, bulletin boards, and other content management systems.

  • Add a CAPTCHA. CAPTCHAs require users to prove that they're a human being and not an automated script. One way to do this is to use a service like reCAPTCHA, Securimage, or JCaptcha.

  • Block suspicious behavior. Many forums allow you to set time limits between posts, and you can often find plugins to look for excessive traffic from individual IP addresses or proxies and other activity more common to bots than human beings. For example, phpBB, Simple Machines, myBB, and many other forum platforms enable such configurations.

  • Check your forum's top posters on a daily basis. If a user joined recently and has an excessive amount of posts, you probably should review their profile and make sure that their posts and threads are not spammy.

  • Consider disabling some types of comments. For example, it's a good practice to close very old forum threads that are unlikely to get legitimate replies. If you don't plan on monitoring your forum going forward and users are no longer interacting with it, turning off posting completely may prevent spammers from abusing it.

  • Make good use of moderation capabilities. Consider enabling features that require users to have a certain reputation before links can be posted, or where comments with links require moderation. If possible, change your settings so that you disallow anonymous posting and make posts from new users require approval before they're publicly visible. Moderators, together with your friends/colleagues and some other trusted users, can help you review and approve posts while spreading the workload. Keep an eye on your forum's new users by looking at their posts and activities on your forum.

  • Consider blacklisting obviously spammy terms. Block obviously inappropriate comments with a blacklist of spammy terms (e.g., illegal streaming or pharma-related terms). Add inappropriate and off-topic terms that are only used by spammers, learning from the spam posts that you often see on your forum or other forums. Built-in features or plugins can delete or mark comments as spam for you.

  • Use the "nofollow" attribute for links in the comment field. This will deter spammers from targeting your site (see the example at the end of this post). By default, many blogging sites (such as Blogger) automatically add this attribute to any posted comments.
  • Use automated systems to defend your site. Comprehensive systems like Akismet, which has plugins for many blogs and forum systems, are easy to install and do most of the work for you.

For detailed information about these topics, check out our Help Center document on User Generated Spam and comment spam. You can also visit our Webmaster Central Help Forum if you need any help.

Posted by Anouar Bendahou, Search Quality Strategist, Google Ireland
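For instance, a comment link with the "nofollow" attribute applied might look like this (URL and anchor text are placeholders):

    <a href="https://example.com/some-page" rel="nofollow">commenter's link</a>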



What Crawl Budget Means for Googlebot

2017-01-27T05:50:18.679-08:00

Recently, we've heard a number of definitions for "crawl budget"; however, we don't have a single term externally that would describe everything "crawl budget" stands for. With this post we'll clarify what we actually have and what it means for Googlebot.

First, we'd like to emphasize that crawl budget, as described below, is not something most publishers have to worry about. If new pages tend to be crawled the same day they're published, crawl budget is not something webmasters need to focus on. Likewise, if a site has fewer than a few thousand URLs, most of the time it will be crawled efficiently. Prioritizing what to crawl, when, and how much resource the server hosting the site can allocate to crawling is more important for bigger sites, or those that auto-generate pages based on URL parameters, for example.

Crawl rate limit

Googlebot is designed to be a good citizen of the web. Crawling is its main priority, while making sure it doesn't degrade the experience of users visiting the site. We call this the "crawl rate limit," which limits the maximum fetching rate for a given site. Simply put, this represents the number of simultaneous parallel connections Googlebot may use to crawl the site, as well as the time it has to wait between fetches. The crawl rate can go up and down based on a couple of factors:

  • Crawl health: if the site responds really quickly for a while, the limit goes up, meaning more connections can be used to crawl. If the site slows down or responds with server errors, the limit goes down and Googlebot crawls less.
  • Limit set in Search Console: website owners can reduce Googlebot's crawling of their site. Note that setting higher limits doesn't automatically increase crawling.

Crawl demand

Even if the crawl rate limit isn't reached, if there's no demand from indexing, there will be low activity from Googlebot. The two factors that play a significant role in determining crawl demand are:

  • Popularity: URLs that are more popular on the Internet tend to be crawled more often to keep them fresher in our index.
  • Staleness: our systems attempt to prevent URLs from becoming stale in the index.

Additionally, site-wide events like site moves may trigger an increase in crawl demand in order to reindex the content under the new URLs.

Taking crawl rate and crawl demand together, we define crawl budget as the number of URLs Googlebot can and wants to crawl.

Factors affecting crawl budget

According to our analysis, having many low-value-add URLs can negatively affect a site's crawling and indexing. We found that the low-value-add URLs fall into these categories, in order of significance:

  • Faceted navigation and session identifiers
  • On-site duplicate content
  • Soft error pages
  • Hacked pages
  • Infinite spaces and proxies
  • Low quality and spam content

Wasting server resources on pages like these will drain crawl activity from pages that do actually have value, which may cause a significant delay in discovering great content on a site.
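As a hypothetical illustration of the first category above, a robots.txt rule set like the following could keep crawlers away from session-ID and sort-order URL variations (the parameter names are invented; adapt them to your own URL structure, and take care not to block URLs you want indexed):

    User-agent: *
    # Keep crawling away from session identifiers and faceted sort orders
    Disallow: /*?*sessionid=
    Disallow: /*?*sort=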
Top questions

Crawling is the entry point for sites into Google's search results. Efficient crawling of a website helps with its indexing in Google Search.

Q: Does site speed affect my crawl budget? How about errors?
A: Making a site faster improves the users' experience while also increasing crawl rate. For Googlebot, a speedy site is a sign of healthy servers, so it can get more content over the same number of connections. On the flip side, a significant number of 5xx errors or connection timeouts signal the opposite, and crawling slows down. We recommend paying attention to the Crawl Errors report in Search Console and keeping the number of server errors low.

Q: Is crawling a ranking factor?
A: An increased crawl rate will not necessarily lead to better positions in Search results. Google uses hundreds of signals to rank the results, and while crawling is necessary for being in the results, it's not a ranking signal.

Q: Do alternate URLs and embedded content count in the [...]



Enhancing property sets to cover more reports in Search Console

2016-12-12T01:36:18.572-08:00

Since initially announcing property sets earlier this year, one of the most popular requests has been to expand this functionality to more sections of Search Console. Thanks to your feedback, we're now expanding property sets to more features!


Property sets help show how your business is seen by Google across separate websites or apps. For example, if you have multiple international or brand-specific websites, and perhaps even an Android app, it can be useful to see changes for the whole set over time: Are things headed in the expected direction? Are there any outliers that you'd want to drill down into? Similarly, you could monitor your site's hreflang setup across different versions of the same website during a planned transition, such as when you move from HTTP to HTTPS, or change domains.

With Search Console's property sets, you can now just add any verified properties to a set, let the data collect, and then check out features like the mobile usability report, review your AMP implementation, double-check rich cards and hreflang / internationalization markup, and more.


We hope these changes make it easier to understand your properties in Search Console. Should you have any questions, or if you just want to help others, feel free to drop by our Webmaster Help Forums.





An update on Google's feature-phone crawling & indexing

2016-11-30T04:36:07.784-08:00

Limited mobile devices, "feature-phones", require a special form of markup or a transcoder for web content. Most websites don't provide feature-phone-compatible content in WAP/WML any more. Given these developments, we've made changes in how we crawl feature-phone content (note: these changes don't affect smartphone content):

1. We've retired the feature-phone Googlebot

We won't be using the feature-phone user-agents for crawling for search going forward.

2. Use "handheld" link annotations for dynamic serving of feature-phone content

Some sites provide content for feature-phones through dynamic serving, based on the user's user-agent. To understand this configuration, make sure your desktop and smartphone pages have a self-referential alternate URL link for handheld (feature-phone) devices:
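The original post illustrated this with a link element along these lines (a sketch; the href is a placeholder for the page's own URL, since dynamic serving uses a single URL for all devices):

    <link rel="alternate" media="handheld" href="https://www.example.com/page" />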

This is a change from our previous guidance of only using the "Vary: User-Agent" HTTP header. We've updated our documentation on making feature-phone pages accordingly. We hope adding this link element is possible on your side, and thank you for your help in this regard. We'll continue to show feature-phone URLs in search when we can recognize them, and when they're appropriate for users.

3. We're retiring feature-phone tools in Search Console

Without the feature-phone Googlebot, the special sitemaps extensions for feature-phones, the Fetch as Google feature-phone options, and feature-phone crawl errors are no longer needed. We continue to support sitemaps and other sitemaps extensions (such as for videos or Google News), as well as the other Fetch as Google options in Search Console.


We've worked to make these changes as minimal as possible. Most websites don't serve feature-phone content, and wouldn't be affected. If your site has been providing feature-phone content, we thank you for your help in bringing the Internet to feature-phone users worldwide!

For any questions, feel free to drop by our Webmaster Help Forums!




Saying goodbye to Content Keywords

2016-11-29T06:29:44.346-08:00

In the early days - back when Search Console was still called Webmaster Tools - the content keywords feature was the only way to see what Googlebot found when it crawled a website. It was useful to see that Google was able to crawl your pages at all, or if your site was hacked.

In the meantime, you can easily check any page on your website and see how Googlebot fetches it immediately, Search Analytics shows you the queries for which we've shown your site in Search, and Google informs you of many kinds of hacks automatically. Additionally, users were often confused about the keywords listed in content keywords. And so, the time has come to retire the Content Keywords feature in Search Console.

The words on your pages, the keywords if you will, are still important for Google's (and your users') understanding of your pages. While our systems have gotten better, they can't read your mind: be clear about what your site is about, and what you'd like to be found for. Tell visitors what makes your site, your products and services, special!

What was your most surprising, or favorite, keyword shown? Let us know in the comments!




Rich Cards expands to more verticals

2016-11-21T14:47:31.068-08:00

At Google I/O in May, we launched Rich Cards for Movies and Recipes, creating a new way for site owners to present previews of their content on the Search results page. Today, we're expanding to two new verticals for US-based sites: Local restaurants and Online courses.

(Caption: Evolution of search results for queries like [best New Orleans restaurants] and [leadership courses]: with rich cards, results are presented in new UIs, like carousels that are easy to browse by scrolling left and right, or a vertical three-pack that displays more individual courses.)

By building Rich Cards, you have a new opportunity to attract more engaged users to your page. Users can swipe through restaurant recommendations from sites like TripAdvisor, Thrillist, Time Out, Eater, and 10Best. In addition to food, users can browse through courses from sites like Coursera, LinkedIn Learning, EdX, Harvard, Udacity, FutureLearn, Edureka, Open University, Udemy, Canvas Network, and NPTEL.

If you have a site that contains local restaurant information or offers online courses, check out our developer docs to start building Rich Cards in the Local restaurant and Online courses verticals.

While AMP HTML is not required for Local restaurant pages and Online courses rich cards, AMP provides Google Search users with a consistently fast experience, so we recommend that you create AMP pages to further engage users. Users consuming AMP'd content will be able to swipe near instantly from restaurant to restaurant or from recipe to recipe within your site. Users who tap on your Rich Card will be taken near instantly to your AMP page, and will be able to swipe between pages within your site. Check out our developer site for implementation details.

To make it easier for you to create Rich Cards, we made some changes in our tools:

  • The Structured Data Testing Tool displays markup errors and a preview card for Local restaurant content as it might appear on Search.
  • The Rich Cards report in Search Console shows which cards across verticals contain errors, and which ones could be enhanced with more markup.
  • The AMP Test helps validate AMP pages as well as markup on the page.

What's next? We are actively experimenting with new verticals globally to provide more opportunities for you to display richer previews of your content. If you have questions, find us in the dedicated Structured data section of our forum, on Twitter or on Google+.

Post by Stacie Chan, Global Product Partnerships
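To give a flavour of the structured data involved, an online course page's markup might look roughly like this (all values are invented; see the developer docs for the exact properties each vertical requires):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Course",
      "name": "Introduction to Leadership",
      "description": "A six-week online course covering foundational leadership skills.",
      "provider": {
        "@type": "Organization",
        "name": "Example University",
        "sameAs": "https://www.example.edu/"
      }
    }
    </script>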



Building Indexable Progressive Web Apps

2016-11-09T04:42:17.074-08:00

Progressive Web Apps (PWAs) take advantage of new technologies to bring the best of mobile sites and native applications to users, and they're one of the most exciting new ideas on the web. But to truly have an impact, it's important that they're indexable and linkable. Every recommendation presented in this article is an existing best practice for indexability, regardless of whether you're building a Progressive Web App or a simple static website. Nonetheless, we have collated these best practices to provide a checklist to guide you.

Make Your Content Crawlable

Why? Historically, websites would always generate or render their HTML on the server, which is the simplest way to ensure your content is directly linkable. Web applications popularised the concept of client-side rendering, in which content is updated dynamically on the page as the user navigates, without requiring the page to be reloaded. The modern approach is hybrid rendering, in which server-side rendering is used when a user navigates directly to a URL, and client-side rendering is used after the initial page load for subsequent navigation and asynchronous requests. Our server-side PWA sample demonstrates pure server-side rendering, while our hybrid PWA sample demonstrates the combined approach. If you are unfamiliar with the server-side and client-side rendering terminology, check out these articles here and here.

Best Practice: Use server-side or hybrid rendering so users receive the content in the initial payload of their web request. Always ensure your URLs are independently accessible:

https://www.example.com/product/25/

The above should deep link to that particular resource. If you can't support server-side or hybrid rendering for your Progressive Web App and you decide to use client-side rendering, we recommend using the Google Search Console "Fetch as Google" tool to verify your content successfully renders for our search crawler.

Don't: Don't redirect users accessing deep links back to your web app's homepage. Additionally, serving an error page to users instead of deep linking should also be avoided.

Provide Clean URLs

Why? Fragment identifiers (#user/24601/ or #!user/24601/) were an effective workaround for browsers to AJAX new content from a server without reloading the page. This design is known as client-side rendering. However, the fragment identifier syntax isn't compatible with some web tools, frameworks and protocols such as Facebook's Open Graph protocol. The History API enables us to update the URL without fragment identifiers while still fetching resources asynchronously and therefore avoiding page reloads -- it's the best of both worlds. The AJAX crawling scheme (with its #! / escaped-fragment URLs) made sense at its time, but is now no longer recommended. Our hybrid PWA and client-side PWA samples demonstrate the History API.
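A minimal sketch of that History API pattern (loadProduct is a hypothetical client-side helper that fetches and renders a view):

    // Navigate within the app: render the new view, then update the
    // address bar to a clean URL without a fragment identifier.
    function showProduct(id) {
      loadProduct(id); // hypothetical fetch-and-render helper
      history.pushState({ productId: id }, '', '/product/' + id + '/');
    }

    // Handle the browser's back/forward buttons by re-rendering state.
    window.addEventListener('popstate', function (event) {
      if (event.state && event.state.productId) {
        loadProduct(event.state.productId);
      }
    });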
Best Practice: Provide clean URLs without fragment identifiers (# or #!) such as:

https://www.example.com/product/25/

If using client-side or hybrid rendering, be sure to support browser navigation with the History API.

Avoid: Using the #! URL structure to drive unique URLs is discouraged:

https://www.example.com/#!product/25/

It was introduced as a workaround before the advent of the History API. It is considered a separate pattern to the purely # URL structure.

Don't: Using the # URL structure without the accompanying ! symbol is unsupported:

https://www.example.com/#product/25/

This URL structur[...]



Mobile-first Indexing

2016-11-04T10:12:55.616-07:00

Today, most people are searching on Google using a mobile device. However, our ranking systems still typically look at the desktop version of a page's content to evaluate its relevance to the user. This can cause issues when the mobile page has less content than the desktop page, because our algorithms are not evaluating the actual page that is seen by a mobile searcher.

To make our results more useful, we've begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site's content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we're going to continue to build a great search experience for all users, whether they come from mobile or desktop devices.

We understand this is an important shift in our indexing and it's one we take seriously. We'll continue to carefully experiment over the coming months on a small scale and we'll ramp up this change when we're confident that we have a great user experience. Though we're only beginning this process, here are a few recommendations to help webmasters prepare as we move towards a more mobile-focused index:

  • If you have a responsive site or a dynamic serving site where the primary content and markup is equivalent across mobile and desktop, you shouldn't have to change anything.
  • If you have a site configuration where the primary content and markup is different across mobile and desktop, you should consider making some changes to your site:
    - Make sure to serve structured markup for both the desktop and mobile version. Sites can verify the equivalence of their structured markup across desktop and mobile by typing the URLs of both versions into the Structured Data Testing Tool and comparing the output. When adding structured data to a mobile site, avoid adding large amounts of markup that isn't relevant to the specific information content of each document.
    - Use the robots.txt testing tool to verify that your mobile version is accessible to Googlebot.
    - Sites do not have to make changes to their canonical links; we'll continue to use these links as guides to serve the appropriate results to a user searching on desktop or mobile.
  • If you are a site owner who has only verified their desktop site in Search Console, please add and verify your mobile version.
  • If you only have a desktop site, we'll continue to index your desktop site just fine, even if we're using a mobile user agent to view your site.
  • If you are building a mobile version of your site, keep in mind that a functional desktop-oriented site can be better than a broken or incomplete mobile version of the site. It's better for you to build up your mobile site and launch it when ready.

If you have any questions, feel free to contact us via the Webmaster forums or our public events. We anticipate this change will take some time and we'll update you as we make progress on migrating our systems.

Posted by Doantam Phan, Product Manager
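For reference, the canonical/alternate annotations mentioned above typically look like this on sites with separate mobile URLs (domains are placeholders; this pattern comes from Google's separate-URLs guidance rather than from this post):

    <!-- On the desktop page (https://www.example.com/page): -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page">

    <!-- On the corresponding mobile page (https://m.example.com/page): -->
    <link rel="canonical" href="https://www.example.com/page">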



Here’s to more HTTPS on the web!

2016-11-04T00:49:10.092-07:00

Cross-posted from the Google Security Blog.

Security has always been critical to the web, but challenges involved in site migration have inhibited HTTPS adoption for several years. In the interest of a safer web for all, at Google we've worked alongside many others across the online ecosystem to better understand and address these challenges, resulting in real change. A web with ubiquitous HTTPS is not the distant future. It's happening now, with secure browsing becoming standard for users of Chrome.

Today, we're adding a new section to the HTTPS Report Card in our Transparency Report that includes data about how HTTPS usage has been increasing over time. More than half of pages loaded and two-thirds of total time spent by Chrome desktop users occur via HTTPS, and we expect these metrics to continue their strong upward trajectory.

(Chart: Percentage of pages loaded over HTTPS in Chrome.)

As the remainder of the web transitions to HTTPS, we'll continue working to ensure that migrating to HTTPS is a no-brainer, providing business benefit beyond increased security. HTTPS currently enables the best performance the web offers and powerful features that benefit site conversions, including both new features such as service workers for offline support and web push notifications, and existing features such as credit card autofill and the HTML5 geolocation API that are too powerful to be used over non-secure HTTP.

As with all major site migrations, there are certain steps webmasters should take to ensure that search ranking transitions are smooth when moving to HTTPS. To help with this, we've posted two FAQs to help sites transition correctly, and will continue to improve our web fundamentals guidance.

We've seen many sites successfully transition with negligible effect on their search ranking and traffic. Brian Wood, Director of Marketing SEO at Wayfair, a large retail site, commented: "We were able to migrate Wayfair.com to HTTPS with no meaningful impact to Google rankings or Google organic search traffic. We are very pleased to say that all Wayfair sites are now fully HTTPS." CNET, a large tech news site, had a similar experience: "We successfully completed our move of CNET.com to HTTPS last month," said John Sherwood, Vice President of Engineering & Technology at CNET. "Since then, there has been no change in our Google rankings or Google organic search traffic."

Webmasters that include ads on their sites also should carefully monitor ad performance and revenue during large site migrations. The portion of Google ad traffic served over HTTPS has increased dramatically over the past 3 years. All ads that come from any Google source always support HTTPS, including AdWords, AdSense, and DoubleClick Ad Exchange; ads sold directly, such as those through DoubleClick for Publishers, still need to be designed to be HTTPS-friendly. This means there will be no change to the Google-sourced ads that appear on a site after migrating to HTTPS. Many publishing partners have seen this in practice after a successful HTTPS transition. Jason Tollestrup, Director of Programmatic Advertising for the Washington Post, "saw no material impact to AdX revenue with the transition to SSL."

As migrating to HTTPS becomes even easier, we'll continue working towards a web that's secure by default. Don't hesitate to start planning your HTTPS migration today!

Posted by Adrienne Porter Felt and Emily Schechter, Chrome Security Team



Using AMP? Try our new webpage tester

2016-10-13T11:59:39.885-07:00

Accelerated Mobile Pages (AMP) is a great way to make content on your website accessible in an extremely fast way. To help ensure that your AMP implementation is working as expected, Search Console now has an enhanced AMP testing tool.


This testing tool is mobile-friendly and uses Google's live web-search infrastructure to analyze the AMP page with the real Googlebot. The tool tests the validity of the AMP markup as well as any structured data on the page. If issues are found, click on them to see details, and to have the relevant line in the source code highlighted. For valid AMP pages, we may also provide a link to a live preview of how this page may appear in Google's search results.


With the share button on the bottom right, you can now share a snapshot of the results that you're currently seeing with others. This makes it easier to discuss issues with your team, whether they're regular occurrences or one-time quirks that you need to iron out. Just click the share button and pass on the URL for this test snapshot. This share feature is now also available in the mobile-friendly testing tool.

We hope this tool makes it easier to create great AMP’d content while finding and resolving issues that may appear on your AMP pages. For any questions, feel free to drop by our webmaster help forum.






Webmaster Forums Top AMP Questions

2016-09-30T02:13:17.427-07:00

It has been busy here at Google Webmaster Central over the last few weeks, covering a lot of details about Accelerated Mobile Pages that we hope you have found useful. The topics have included:

  • What is AMP?
  • How to get started with Accelerated Mobile Pages
  • How can Google Search Console help you AMPlify your site?
  • How to best evaluate issues with your Accelerated Mobile Pages
  • Top 8 things to consider when you AMPlify a site
  • How to set up Analytics on your AMP page
  • How to set up Ads on your AMP page

We've also been seeing a few AMP questions coming to the Webmaster forums about getting started using AMP on Google Search. To help, we've compiled some of the most common questions we've seen:

Q: I'm considering creating AMP pages for my website. What is the benefit? What types of sites and pages is AMP for?
A: Users love content that loads fast and without any fuss; using the AMP format may make it more compelling for people to consume and engage with your content on mobile devices. Research has shown that 40% of users abandon a site that takes more than three seconds to load. The Washington Post observed an 88% decrease in article loading time and a 23% increase in returning users from mobile search after adopting AMP. The AMP format is great for all types of static web content such as news, recipes, movie listings, product pages, reviews, videos, blogs and more.

Q: We are getting errors logged in Search Console for AMP pages; however, we already fixed these issues. Why are we still seeing errors?
A: The short answer is that changes to your AMP pages take about a week to be updated in Search Console. For a more in-depth answer on why, Google's Webmaster Trends Analyst John Mueller shared a detailed post on Search Console latency challenges.

Q: Our AMP pages are not showing up on Google Search. What should we do?
A: Only valid AMP pages will be eligible to show on Google Search. Check the validity of your AMP pages by using the AMP HTML Web Validator, the Chrome or Opera extension, or through a more automated process such as a cron job, to make sure all new content is valid (see the sketch below). While it's good practice overall to include schema.org structured data in your AMP pages (we recommend JSON-LD), it's especially important for news publishers. News content that includes valid markup properties is eligible to be shown within the Top Stories section in Google Search results. To test your structured data, try using the structured data testing tool.

If you have more questions that are not answered here, share your feedback in the comments below or on our Google Webmasters Google+ page. Or as usual, feel free to post in our Webmasters Help Forum.

Posted by Tomo Taylor, AMP Community Manager
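As one sketch of that kind of automation, the open-source amphtml-validator package provides a command-line validator that a cron job could run against newly published URLs (package name as of this writing; the URL is a placeholder):

    # Install the validator CLI once (requires Node.js):
    npm install -g amphtml-validator

    # Validate a page; the command exits with a non-zero status on failure,
    # which makes it easy to alert from a cron job:
    amphtml-validator https://www.example.com/latest-article.amp.html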



Penguin is now part of our core algorithm

2016-09-23T05:03:55.936-07:00

Google's algorithms rely on more than 200 unique signals or "clues" that make it possible to surface what you might be looking for. These signals include things like the specific words that appear on websites, the freshness of content, your region, and PageRank. One of these signals is Penguin, which was first launched in 2012 and is being updated today.

After a period of development and testing, we are now rolling out an update to the Penguin algorithm in all languages. Here are the key changes you'll see, which were also among webmasters' top requests to us:

  • Penguin is now real-time. Historically, the list of sites affected by Penguin was periodically refreshed at the same time. Once a webmaster considerably improved their site and its presence on the internet, many of Google's algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed. With this change, Penguin's data is refreshed in real time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page. It also means we're not going to comment on future refreshes.
  • Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site. 

The web has significantly changed over the years, but as we said in our original post, webmasters should be free to focus on creating amazing, compelling websites. It's also important to remember that updates like Penguin are just one of more than 200 signals we use to determine rank.

As always, if you have feedback, you can reach us on our forums, Twitter and Google+.




8 tips to AMPlify your clients

2016-09-21T02:17:40.347-07:00

Here is our list of the top 8 things to consider when helping your clients AMPlify their websites (and staying ahead of their curiosity!) after our announcement to expand support for Accelerated Mobile Pages.

1. Getting started can be simple
If a site uses a popular Content Management System (CMS), getting AMP pages up and running is as straightforward as installing a plug-in. Sites that use custom HTML or that are built from scratch will require additional development resources.

2. Not all types of sites are suitable
AMP is great for all types of static web content such as news, recipes, movie listings, product pages, reviews, videos, blogs and more. AMP is less useful for single-page apps that are heavy on dynamic or interactive features, such as route mapping, email or social networks.

3. You don't have to #AMPlify the whole site
Add AMP to a client's existing site progressively by starting with simple, static content pages like articles, products, or blog posts. These are the "leaf" pages that users access through platforms and search results, and could be simple changes that also bring the benefits of AMP to the website. This approach allows you to keep the homepage and other "browser" pages that might require advanced, non-AMP dynamic functionality. If you're creating a new, content-heavy website from scratch, consider building the whole site with AMP from the start. To begin with, check out the getting started guidelines.

4. The AMP Project is open source and still evolving
If a site's use case is not supported in the AMP format yet, consider filing a feature request on GitHub, or you could even design a component yourself.

5. AMP pages might need to meet additional requirements to show up in certain places
In order to appear in Google's search results, AMP pages need only be valid AMP HTML. Some products integrating AMP might have further requirements beyond AMP validation. For example, you'll need to mark up your AMP pages with Article structured data to make them eligible for the Google Top Stories section (a sketch follows this list).

6. There is no ranking change on Search
Whether a page or site has valid and eligible AMP pages has no bearing on the site's ranking on the Search results page. The difference is that web results that have AMP versions will be labeled with an icon.

7. AMP on Google is expanding globally
AMP search results on Google will be rolling out worldwide in the coming weeks. The Top Stories carousel, which shows newsy and fresh AMP content, is already available in a number of countries and languages.

8. Help is on hand
There's a whole host of useful resources that will help if you have any questions:
  • Webmasters Help Forum: ask questions about AMP and Google's implementation of AMP
  • Stack Overflow: ask technical questions about AMP
  • GitHub: submit a feature request or contribute

What are your top tips to #AMPlify pages? Let us know in the comments below or on our Google Webmasters Google+ page. Or as usual, if you have any questions or need help, feel free to post in our Webmasters Help Forum.

Posted by Tomo Taylor, AMP Community Manager
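As a rough sketch of the Article markup mentioned in tip 5 (every value here is invented; the Top Stories documentation lists the exact required properties):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "NewsArticle",
      "headline": "Example Headline",
      "image": ["https://www.example.com/photos/lead.jpg"],
      "datePublished": "2016-09-21T08:00:00+00:00",
      "author": { "@type": "Person", "name": "Jane Reporter" },
      "publisher": {
        "@type": "Organization",
        "name": "Example News",
        "logo": { "@type": "ImageObject", "url": "https://www.example.com/logo.png" }
      }
    }
    </script>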



How to best evaluate issues with your Accelerated Mobile Pages

2016-09-19T02:15:20.955-07:00

As you #AMPlify your site with Accelerated Mobile Pages, it's important to periodically keep an eye on the validation status of your pages, as only valid AMP pages are eligible to show on Google Search. When implementing AMP, sometimes pages will contain errors that cause them not to be indexed by Google Search. Pages may also contain warnings, which flag elements that are not best practice or that are going to become errors in the future.

Google Search Console is a free service that lets you check which of your AMP pages Google has identified as having errors. Once you know which URLs are running into issues, there are a few handy tools that can make checking the validation error details easier.

1. Browser Developer Tools

To use Developer Tools for validation:
  • Open your AMP page in your browser.
  • Append "#development=1" to the URL, for example, http://localhost:8000/released.amp.html#development=1.
  • Open the Chrome DevTools console and check for validation errors.

2. AMP Browser Extensions

With the AMP Browser Extensions (available for Chrome and Opera), you can quickly identify and debug invalid AMP pages. As you browse your site, the extension will evaluate each AMP page visited and give an indication of the validity of the page.

  • When there are errors within an AMP page, the extension's icon shows in a red color and displays the number of errors encountered.
  • When there are no errors within an AMP page, the icon shows in a green color and displays the number of warnings, if any exist.
  • When the page isn't AMP but the page indicates that an AMP version is available, the icon shows in a blue color with a link icon, and clicking on the extension will redirect the browser to the AMP version.

Using the extensions means you can see what errors or warnings the page has by clicking on the extension icon. Every issue will list the source line, source column, and a message indicating what is wrong. When a more detailed description of the issue exists, a "Learn more" link will take you to the relevant page on ampproject.org.

3. AMP Web Validator

The AMP Web Validator, available at validator.ampproject.org, provides a simple web UI to test the validity of your AMP pages. To use the tool, you enter an AMP URL, or copy/paste your source code, and the web validator displays error messages between the lines. You can make edits directly in the web validator, which will trigger revalidation, letting you know if your proposed tweaks fix the problem.

What's your favourite way to check the status of your AMP pages? Share your feedback in the comments below or on our Google Webmasters Google+ page. Or as usual, if you have any questions or need help, feel free to post in our Webmasters Help Forum.

Posted by Tomo Taylor, AMP Community Manager



How can Google Search Console help you AMPlify your site?

2016-10-19T03:56:19.755-07:00

If you have recently implemented Accelerated Mobile Pages on your site, it's a great time to check which of your AMP pages Google has found and indexed by using Search Console. Search Console is a free service that helps you monitor and maintain your site's presence in Google Search, including any Accelerated Mobile Pages. You don't have to sign up for Search Console for your AMP pages to be included in Google Search results, but doing so can help you understand which of your AMP pages are eligible to show in search results.

To get started with Search Console, create a free account or sign in here and validate the ownership of your sites. Once you have your site set up on Search Console, open the Accelerated Mobile Pages report under Search Appearance > Accelerated Mobile Pages to see which AMP pages Google has found and indexed on your site. The report lists AMP-related issues for AMP pages that are not indexed, so that you can identify and address them.

Search Console also lets you monitor the performance of your AMP pages on Google Search in the Search Analytics report. This report tells you which queries show your AMP pages in Search results, lets you compare how their metrics stack up against your other results, and shows how the visibility of your AMP pages has changed over time. To view your AMP page metrics, such as clicks or impressions, select Search Appearance > Search Analytics > Filter by AMP.

(Note: if you've only just created your Search Console account or set up your AMP pages and they have not been detected yet, remember that Google crawls pages only periodically. You can wait for the scheduled regular recrawl, or you can request a recrawl.)

Have you been using Search Console to monitor your AMP pages? Give us feedback in the comments below or on our Google Webmasters Google+ page. Or as usual, if you have any questions or need help, feel free to post in our Webmasters Help Forum.

UPDATE: To help ensure that your AMP implementation is working as expected, Search Console now has an enhanced AMP testing tool.

Posted by Tom Taylor, AMP Community Manager



How to get started with Accelerated Mobile Pages

2016-10-12T00:32:07.614-07:00


Interested in Accelerated Mobile Pages but not sure how to get started? AMPlifying your site for lightning speed might be easier than you think.

If you use a Content Management System (CMS) like WordPress, Drupal, or Hatena, getting set up on AMP is as simple as installing and activating a plug-in. Each CMS has a slightly different approach to AMPlifying pages, so it’s worth checking with your provider on how to get started.

On the other hand, if your site uses custom HTML, or you want to learn how AMP works under the hood, then check out the AMP Codelab for a guided, hands-on coding experience designed to take you through the process of developing your first pages. The Codelab covers the fundamentals:

  • How AMP improves the user experience of the mobile web
  • The foundations of an AMP page
  • AMP limitations
  • How AMP web components solve common problems
  • How to validate your AMP pages
  • How to prepare your AMP pages for Google Search

Once you are done with the basics, why not geek out with the Advanced Concepts Codelab?
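For a flavour of those foundations, here is a skeletal AMP page (an abridged sketch: the mandatory amp-boilerplate style and noscript blocks are omitted and must be copied verbatim from the AMP documentation):

    <!doctype html>
    <html amp lang="en">
      <head>
        <meta charset="utf-8">
        <!-- The AMP runtime: -->
        <script async src="https://cdn.ampproject.org/v0.js"></script>
        <title>Hello AMP</title>
        <link rel="canonical" href="https://www.example.com/hello.html">
        <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
        <!-- Required amp-boilerplate <style>/<noscript> blocks go here -->
      </head>
      <body>
        <h1>Hello, AMP world!</h1>
      </body>
    </html>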

Have you tried the Codelabs or added an AMP plugin to your site? Share your feedback in the comments below or on our Google Webmasters Google+ page. Or as usual, if you have any questions or need help, feel free to post in our Webmasters Help Forum.





What is AMP?

2016-09-12T02:18:28.235-07:00


Users today expect mobile websites to load super fast. The reality is that it can often take several seconds. It is no surprise that 40% of people abandon a website that takes more than 3 seconds to load. To reduce the time content takes to get to a user’s mobile device we started working on the Accelerated Mobile Pages Project, an open source initiative to improve the mobile web experience for everyone.


Accelerated Mobile Pages are HTML pages that take advantage of various technical approaches to prioritize speed and a faster experience for users by loading content almost instantaneously.

Later this year, all types of sites that create AMP pages, like e-commerce, entertainment, travel, recipe sites and many more, will have expanded exposure across the entire Google Mobile Search results page. Visit the "Who" page on AMPProject.org for a flavour of some of the sites already creating AMP content, and try the demo at g.co/ampdemo to see AMP versions of pages labeled with the AMP icon.


In advance of AMP expanding in Google Search, over the next few weeks we’ll be posting pointers to help you #AMPlify your site. Follow along with the #AMPlify hashtag on G+ and Twitter.


Have you already built AMP pages for your site? Share your feedback in the comments below or on our Google Webmasters Google+ page. Or as usual, if you have any questions or need help, feel free to post in our Webmasters Help Forum.




A reminder about widget links

2016-09-08T01:29:09.714-07:00

Google has long taken a strong stance against links that manipulate a site's PageRank. Today we would like to reiterate our policy on the creation of keyword-rich, hidden or low-quality links embedded in widgets that are distributed across various sites.

Widgets can help website owners enrich the experience of their site and engage users. However, some widgets add links to a site that a webmaster did not editorially place, and contain anchor text that the webmaster does not control. Because these links are not naturally placed, they're considered a violation of Google Webmaster Guidelines.

Google's webspam team may take manual actions on unnatural links. When a manual action is taken, Google will notify the site owners through Search Console. If you receive such a warning for unnatural links to your site and you use links in widgets to promote your site, we recommend resolving these issues and requesting reconsideration. You can resolve issues with unnatural links by making sure they don't pass PageRank. To do this, add a rel="nofollow" attribute on the widget links or remove the links entirely. After fixing or removing widget links and any other unnatural links to your site, let Google know about your change by submitting a reconsideration request in Search Console. Once the request has been reviewed, you'll get a notification about whether the reconsideration request was successful or not.

Also, we would like to remind webmasters who use widgets on their sites to check those widgets for any unnatural links. Add a rel="nofollow" attribute on those unnatural links or remove the links entirely from the widget.

For more information, please watch our video about widget links and refer to our Webmaster Guidelines on Link Schemes. Additionally, feel free to ask questions in our Webmaster Help Forums, where a community of webmasters can help with their experience.

Posted by Agnieszka Łata, Trust & Safety Search Team and Eric Kuan, Webmaster Relations Specialist
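To make this concrete, here is a hypothetical widget embed whose credit link carries the attribute so that it passes no PageRank (the widget and URLs are invented for illustration):

    <!-- Example embeddable widget snippet -->
    <div class="example-weather-widget" data-city="Zurich"></div>
    <script async src="https://widgets.example.com/weather.js"></script>
    <!-- Credit link: rel="nofollow" keeps it from passing PageRank -->
    <a href="https://widgets.example.com/" rel="nofollow">Weather widget by Example</a>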



Showcase your site’s reviews in Search

2016-09-07T06:53:22.204-07:00

Today, we're introducing Reviews from the web in local Knowledge Panels, to accompany our recently launched best-of lists and critic reviews features. Whether your site publishes editorial critic reviews, best-of places lists, or aggregates user ratings, this content can be featured in local Knowledge Panels when users are looking for places to go.

Reviews from the web

Available globally on mobile and desktop, Reviews from the web brings aggregated user ratings from up to three review sites to Knowledge Panels for local places across many verticals, including shops, restaurants, parks and more. By implementing review snippet markup and meeting our criteria, your site's user-generated composite ratings will be eligible for inclusion. Add the Local Business markup to help Google match reviews to the right review subject and help grow your site's coverage. For more information on the guidelines for the Reviews from the web, critic review and top places lists features, check out our developer site.

Critic reviews

In the U.S. on mobile and desktop, qualifying publishers can participate in the critic review feature in local Knowledge Panels. Critic reviews possess an editorial tone of voice and have an opinionated position on the local business, coming from an editor or on-the-ground expert. For more information on how to participate, see the details on our critic reviews page.

The local information across Google Search helps millions of people, every day, discover and share great places. If you have any questions, please visit our webmaster forums.

Posted by Ronnie Falcon, Product Manager
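As a hedged sketch of review snippet markup combined with Local Business data (all values are invented; the developer site defines the required fields and criteria):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Restaurant",
      "name": "Example Bistro",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressCountry": "US"
      },
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "127"
      }
    }
    </script>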



More Safe Browsing Help for Webmasters

2017-02-13T05:29:24.346-08:00

(Crossposted from the Google Security Blog.)

For more than nine years, Safe Browsing has helped webmasters via Search Console with information about how to fix security issues with their sites. This includes relevant Help Center articles, example URLs to assist in diagnosing the presence of harmful content, and a process for webmasters to request reviews of their site after security issues are addressed. Over time, Safe Browsing has expanded its protection to cover additional threats to user safety such as Deceptive Sites and Unwanted Software.

To help webmasters be even more successful in resolving issues, we're happy to announce that we've updated the information available in Search Console in the Security Issues report. The updated information provides more specific explanations of six different security issues detected by Safe Browsing, including malware, deceptive pages, harmful downloads, and uncommon downloads. These explanations give webmasters more context and detail about what Safe Browsing found. We also offer tailored recommendations for each type of issue, including sample URLs that webmasters can check to identify the source of the issue, as well as specific remediation actions webmasters can take to resolve the issue.

We on the Safe Browsing team definitely recommend registering your site in Search Console even if it is not currently experiencing a security issue. We send notifications through Search Console so webmasters can address any issues that appear as quickly as possible.

Our goal is to help webmasters provide a safe and secure browsing experience for their users. We welcome any questions or feedback about the new features on the Google Webmaster Help Forum, where Top Contributors and Google employees are available to help.

For more information about Safe Browsing's ongoing work to shine light on the state of web security and encourage safer web security practices, check out our summary of trends and findings on the Safe Browsing Transparency Report. If you're interested in the tools Google provides for webmasters and developers dealing with hacked sites, this video provides a great overview.

Posted by Kelly Hope Harrington, Safe Browsing Team



Helping users easily access content on mobile

2017-01-10T03:31:31.594-08:00

January 10, 2017 update: Starting today, pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as high. As we said, this new signal is just one of hundreds of signals that are used in ranking, and the intent of the search query is still a very strong signal, so a page may still rank highly if it has great, relevant content. Please head to the webmaster forums if you have any questions.

In Google Search, our goal is to help users quickly find the best answers to their questions, regardless of the device they're using. Today, we're announcing two upcoming changes to mobile search results that make finding content easier for users.

Simplifying mobile search results

Two years ago, we added a mobile-friendly label to help users find pages where the text and content was readable without zooming and the tap targets were appropriately spaced. Since then, we've seen the ecosystem evolve, and we recently found that 85% of all pages in the mobile search results now meet this criteria and show the mobile-friendly label. To keep search results uncluttered, we'll be removing the label, although the mobile-friendly criteria will continue to be a ranking signal. We'll continue providing the mobile usability report in Search Console and the mobile-friendly test to help webmasters evaluate the effect of the mobile-friendly signal on their pages.

Helping users find the content they're looking for

Although the majority of pages now have text and content on the page that is readable without zooming, we've recently seen many examples where these pages show intrusive interstitials to users. While the underlying content is present on the page and available to be indexed by Google, content may be visually obscured by an interstitial. This can frustrate users because they are unable to easily access the content that they were expecting when they tapped on the search result.

Pages that show intrusive interstitials provide a poorer experience to users than other pages where content is immediately accessible. This can be problematic on mobile devices where screens are often smaller. To improve the mobile search experience, after January 10, 2017, pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as highly.

Here are some examples of techniques that make content less accessible to a user:

  • Showing a popup that covers the main content, either immediately after the user navigates to a page from the search results, or while they are looking through the page.
  • Displaying a standalone interstitial that the user has to dismiss before accessing the main content.
  • Using a layout where the above-the-fold portion of the page appears similar to a standalone interstitial, but the original content has been inlined underneath the fold.

(The original post illustrated these with images: an intrusive popup and two intrusive standalone interstitials.)

By contrast, here are some examples of techniques that, used responsibly, would not be affected by the new signal:

  • Interstitials that appear to be in response to a legal obligation, such as for cookie usage or for age verification.
  • Login dialogs on sites where content is not publicly indexable. For example, this would include private content such as email or unindexable co[...]



Promote your local business reviews with schema.org markup

2016-08-12T09:00:35.796-07:00

Since the launch of critic reviews last year, we have been focused on supporting more types of reviews, like reviews of restaurants, cafes, or any other type of local business. Recently we announced the availability of critic reviews for local businesses. By incorporating structured data into their sites, publishers can promote their content on local Knowledge Graph cards, and users can enjoy a range of reviews and opinions.

Critic reviews are available across mobile, tablet and desktop, allowing publishers to increase the visibility of their reviews and expose their reviews to new audiences, whenever a local Knowledge Graph card is surfaced. English reviews for businesses in the US are already supported and we’ll very soon support many other languages and countries.

Publishers with critic reviews for local entities can get up and running by selecting snippets of reviews from their sites and annotating them and the associated business with schema.org markup. This process, detailed in our critic reviews markup instructions, allows publishers to communicate to Google which snippet they prefer, what URL is associated with the review and other metadata about the local business that allows us to ensure that we’re showing the right review for the right entity.

Google can understand a variety of markup formats, including the JSON-LD data format, which makes it easier than ever to incorporate structured data about reviews into webpages! Publishers can get started here. And as always, if you have any questions, please visit our webmaster forums.
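An individual critic review annotation might look roughly like this (an illustrative sketch with invented values; follow the critic reviews markup instructions linked above for the authoritative property list):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Review",
      "itemReviewed": {
        "@type": "Restaurant",
        "name": "Example Bistro"
      },
      "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
      "author": { "@type": "Person", "name": "Pat Critic" },
      "publisher": { "@type": "Organization", "name": "Example Food Journal" },
      "reviewBody": "A one- or two-sentence snippet of the critic's review."
    }
    </script>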
