
Become a Leading SEO Mechanic with Both Google & Bing Webmaster Tools

Webmaster Tools offerings from both Google and Bing can offer a wealth of insight to business owners. In order to get the whole spectrum of insights, marketers must learn just what they can do with both Google and Bing Webmaster tools. Using both together allows you greater insight into the factors contributing to the success—or lack thereof—of your SEO strategy.

Internet Marketing Ninjas COO Chris Boggs and Grant Simmons, director of SEO and social product at The Search Agency, shared their advice on better integrating data from Google Webmaster and Bing Webmaster Tools earlier this year at SES San Francisco.

Google Webmaster Tools: Proactively Monitor and Have a Plan in Place to React (P.R.E.P.A.R.E)

Internet Marketing Ninjas COO/CMO and SEMPO Chairman Chris Boggs started the presentation with the topic everyone really wanted to hear: Google Webmaster Tools (GWT). He started with SEO diagnostic principles and explained that you need to be both proactive and reactive when monitoring SEO. Marketers need to have a plan as well as the ability to manage from a reactive perspective, he said. If you come across something in your diagnoses, your analytics are going to be a good second opinion. Without tools, it’s just a guessing game.

Once you have this in mind, you can start digging into GWT by focusing on a few things first:

1. Quick Barometers
Boggs referred to the "Brand 7 Pack" as a company's homepage and six sitelinks that appear in search results. If you don't have seven, you have an SEO problem, he said. Your social entities such as Google+ should also be ranking, and your titles should be clear and easy to understand. If you want to see what your domain looks like from Google's perspective and check the cleanliness of your page titles, type in "site:" followed by your domain name without the "www." Below is a screenshot of a website with a good 7 pack:

[Screenshot: Macy's brand 7 pack in a Google SERP]

You can then go to your Webmaster Tools account to diagnose any problems you may see and determine exactly where the problem lies and how to fix it. From a reactive mode perspective, look at your analytics and verify. It’s very important for SEOs to live by this mantra. Webmaster Tools isn’t something to take for granted. Have an agency or consultant monitor the findings in GWT and relay information to design, development, and marketing teams.

2. HTML Improvements
Visit the HTML Improvements category to determine if your titles and descriptions look bad on a Google SERP. You can see if Google agrees, then click on anything with blue writing to learn more about the problem.

Boggs was asked after the presentation what tool might get users in trouble if they don’t understand it, and this was his answer. He explained that almost every site is going to have some duplicate descriptions and titles, so he wouldn’t try to get that number down to zero. You don’t need to remove every single warning from GWT.

How to Find the Tool: Located under Search Appearance.

3. Sitelinks
You can visit the sitelinks tab to demote a certain sitelink (one of the links under your company homepage shown on a search results page like in the screenshot above). Google is going to automatically generate links to appear as your sitelinks, but you can tell Google if you don’t want something there.

How to Find the Tool: Located under Search Appearance.

4. Search Queries
Here, you can look at the top pages as well as the top queries for your site. Most people will just take the default information, but Boggs stressed that there are tabs for a reason. Look at the top queries as well as use those “more” tabs to get more information.

How to Find the Tool: Located under Search Traffic.

5. Links
You can click on “links to your site” to get a full list of those linking back the most, but the tool that many forget to use is the “internal links” tool. Internal links are very important; Boggs explained it’s worth the time to go through and look at the number of these internal links and then download the table so you can really slice it and dice it.

How to Find the Tools: Located under Search Traffic.

6. Manual Actions and Malware
With this tool, no news is good news. If you get a manual action warning, it means you probably need to do something substantial in order to keep your rankings where they are. Malware is another section worth checking, and another place where you don't want to see anything.

How to Find the Tools: Manual Actions is located under Search Traffic; Malware is located under Crawl.

7. Index Status
If your index status shows something like 10x the number of pages your site actually has, you might have a problem. The Advanced tab here gives you a much better look at that data.

How to Find the Tool: Located under Google Index.

8. Content Keywords
What you want to look for here are the words you are using in your content. You don’t want to see a lot of “here” or promotional phrases. Identify where your gaps are or where you have too much content.

How to Find the Tool: Located under Google Index.

9. Crawl Errors
Google now has a feature phone tab to help you with crawl errors. You need to understand any crawl errors that occur, and remember that you should provide data that is specific to mobile as well. You can also take a look at your crawl stats, such as time spent downloading a page, and make sure there are no spikes.

How to Find the Tools: Both located under Crawl.

Finally, Boggs explained that Google Webmaster Tools should be thought of proactively by pairing it with Google Analytics. What is GWT telling you about your analytics, and how is that data affected? Consider this screenshot from Boggs' presentation:

[Screenshot: pairing GWT findings with Google Analytics data]

In the end, Boggs explained that expertise is knowing the most basic things about SEO and doing them repeatedly, perfectly, every time. You’re going to come across situations where there are a lot of hooks and changes in the algorithm. Something someone might have done one to five years ago could be a very bad move now. That’s part of the game.

Bing Webmaster Tools: Bing Stands for “Bing Is Not Google”

Grant Simmons, director of SEO and social product at The Search Agency, began his presentation with the quote "Bing stands for Bing is not Google," and the laughter among the marketers and SEOs just about said it all. It's true; Bing is often not taken as seriously as Google because it just isn't as popular, yet Bing Webmaster Tools (BWT) does offer some good insights that Google does not.

Once you're signed up and logged in, consider the top things you should look at first to really get a handle on BWT:

1. Dashboard
You want to make sure that the pages you think you have are the ones that Bing has indexed. If that number isn't what you expected, ask yourself a few questions: Is Bing crawling my site frequently? Am I updating my site often enough? These are all quick things you can see right from the dashboard, and you can even look at search keywords to see how people are finding you.

Quick Fact: Bing doesn’t use Google Analytics.

2. Diagnostic Tools
The diagnostic tools category comprises seven subcategories: keyword research, link explorer, fetch as Bingbot, markup validator, SEO analyzer, verify Bingbot, and site move.

How to Find the Tool: This is a category all on its own!

3. SEO Analyzer
This tool works great when analyzing just one URL. You simply type in the URL and hit “Analyze” to get an overview of the SEO connected with that URL on the right hand side of the page. The tool will highlight any issue your site is having on the page; if you click on that highlighted section, Bing will give you the Bing best practice so you can make improvements.

How to Find the Tool: Located under Diagnostics & Tools.

4. SEO Reports
This tool shares a look at what is going on with your whole site (as opposed to just one URL). You will get a list of SEO suggestions and information about the severity of your issue, as well as a list of links associated with that particular error. The tool runs automatically every other week for all of the sites you have verified with BWT (so not your competitor’s sites).

How to Find the Tool: Located under Reports & Data.

5. Link Explorer
You can run this tool on any website to get an overview of the top links associated with that site (only the top links, however, which is considered one of the limitations of the tool). Export the links into an Excel spreadsheet and then slice and dice the information as you’d like.

How to Find the Tool: Located under Diagnostics & Tools.

6. Inbound Links
Link Explorer is probably one of the more popular tools when it comes to BWT, so it’s certainly worth mentioning. However, according to Simmons, Inbound Links is a better tool that doesn’t have as many limitations. This tool will show you trends over time so you can really see if there is value on deep page links. You can see up to 20,000 links, as well as the anchor text used, with the ability to export.

How to Find the Tool: Located under Reports & Data.

7. Crawl Information
It's important to remember that the Bing bots are different from the Google bots, and the crawl information tool can help give you insight. At a high level, Simmons explained that when the tool gives you the stats, you should be looking for challenges left over from, say, a migration you did last year. Are your 301s still in place? Are they still driving traffic? Should your 302 redirects be made permanent? It's also a good idea to look at the last time your site was crawled. If it's been a while, remember Bing likes fresh content and you may need to make some updates. Again, this information is exportable.

How to Find the Tool: Located under Reports & Data.

8. Index Explorer
Simmons said this is one of the coolest things found in BWT, one reason being that Google doesn’t really have anything like it. You can see stats for a particular page, which can be good to see based on a subdirectory or section of your site. The tool has great filters and offers an awesome visual representation of crawled and indexed pages.

How to Find the Tool: Located under Reports & Data.

Of course, there is a lot more to BWT than just the eight features listed above, including the keyword research tool, geo targeting, the disavow tool (they were the first to offer this), and crawl control. Bing's features are very comparable to Google's, the navigation is excellent, and there are even a few extra capabilities. Simmons concluded the presentation by saying that we should really focus on BWT to make a difference.

Do you think Boggs and Simmons singled out the best tools in both GWT and BWT? Simmons will speak to attendees at SES Chicago in early November on what it takes to become a leading SEO mechanic, alongside Vizion Interactive’s Josh McCoy. Keep an eye out at SEW for coverage!


Original Article Post by Amanda DiSilvestro @ Search Engine Watch

How to Use PPC Data to Guide SEO Strategy in a '(Not Provided)' World

We can no longer precisely track traffic for Google in organic search at the keyword level. As "(not provided)" creeps its way up to 100 percent, so does the lack of our ability to track Google organic keyword conversions.

Tell your friends, family, loved ones, the boss. Then if you haven't immediately lost their attention with the use of acronyms and jargon, also let them know that we're still able to measure our efforts and gain strategic insight in many ways.
This article is an attempt to explain what we currently see in keyword reports, show how PPC data can help guide SEO efforts, and consolidate some initial thoughts and ideas to assist in moving forward.

Smart SEO professionals will still prove their worth. Together we can overcome this daunting hurdle.

What Do We See in Google Organic Keyword Reports?

Short answer: We aren't seeing an accurate representation of keywords people are using to get to our sites.

The easiest way to look at this is by visualizing the browser versions that are still passing keyword referral data.

[Chart: Google organic visit share vs. provided query share]

Above, the light green area is the percentage of visits that still pass keyword data, shown against total Google organic visits in darker green.

In essence, we're mostly seeing keywords from outdated versions of Safari and MSIE (Internet Explorer). So the search behavior associated with the demographics using outdated browsers is what we see coming from Google in analytics packages like Google Analytics. Probably not a comprehensive picture into what is actually happening.

Using PPC Data to Guide SEO Strategy

Google needs marketers to be able to quantify their efforts when it comes to AdWords. Therefore, keyword data for paid clicks is still passed and is there to take advantage of.

The thought here is that if a page performs well in a PPC campaign, it will translate to performing well at the top of organic listings, though people clicking ads versus organic listings probably behave differently to some degree.

There are many ways PPC data could be used to help guide SEO strategy; this is just one to get the juices flowing.

Step 1: Identify Top Performing PPC Landing Pages

If using Google Analytics, from the Dashboard click Acquisition > AdWords > Destination URLs. Assuming you have sufficient conversion tracking set up, this report should give you all the information you need to understand which pages are performing best.

After filtering out the homepage, sorting by the conversion metric of your choice, adding Keyword as a secondary dimension, and exporting 100 rows, you will have the top 100 performing landing page/keyword combinations for PPC. Revenue is always a good indication that people like what they see.
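For those who prefer to work outside the spreadsheet, the same filtering and sorting can be sketched in Python with pandas; the column names and rows below are stand-ins for the exported report, not Google Analytics' exact labels:

```python
import pandas as pd

# Stand-in for the exported Destination URLs report.
ppc = pd.DataFrame({
    "landing_page": ["/", "/widgets", "/gadgets", "/sale"],
    "keyword": ["brand", "blue widgets", "red gadgets", "widget sale"],
    "conversions": [50, 30, 22, 18],
})

# Filter out the homepage, sort by the conversion metric,
# and keep the top landing page/keyword combinations.
top = (ppc[ppc["landing_page"] != "/"]
       .sort_values("conversions", ascending=False)
       .head(100))
print(top)
```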

[Screenshot: using PPC data for SEO strategy]

Step 2: Pull Ranking Data

Next, pull in Google Webmaster Tools ranking data for the associated keywords. You can access this data in Google Analytics from Dashboard > Acquisition > Search Engine Optimization > Queries, or in Google Webmaster Tools.

Specify the largest date range possible (90 days) and download the report. Then use VLOOKUP to pull the ranking data into the spreadsheet containing the top PPC landing page/keyword combinations.
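If you'd rather script the lookup than use VLOOKUP, a pandas left join does the same job; the column names and sample values below are illustrative assumptions:

```python
import pandas as pd

# Top PPC landing page/keyword combinations (from Step 1).
ppc = pd.DataFrame({
    "keyword": ["blue widgets", "red gadgets"],
    "landing_page": ["/widgets", "/gadgets"],
    "conversions": [30, 22],
})

# Average rank per keyword from the GWT search query report.
gwt = pd.DataFrame({
    "keyword": ["blue widgets", "red gadgets", "green gizmos"],
    "avg_rank": [3.2, 14.7, 8.1],
})

# The left join plays the role of VLOOKUP: each PPC row picks up
# its average organic rank where the keyword matches.
combined = ppc.merge(gwt, on="keyword", how="left")
print(combined)
```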

[Screenshot: combining PPC data with SEO rankings]

Step 3: Form SEO Strategy

Now that we know where our site shows up in organic for the top PPC keyword/landing URL combinations, we can begin forming strategy.

One obvious strategy is to make sure that the PPC and organic landing pages are the same. Sending PPC traffic to organic canonical pages can only increase the possibilities of linking and social sharing, assuming the organic page converts well.

Another option is to filter the Average Rank column to only include first page rankings, in an attempt to identify low-hanging fruit. Once an opportunity is identified, compare SEO metrics to determine where focus should be placed and how best to meet and beat your competitors.

Additional Thoughts on SEO Strategy in a 100% '(Not Provided)' World

1. '(Not Provided)' Still Counts as Organic

Conversion information is still applied to the organic channel, don't forget! We no longer have the ability to say someone who Googled [hocus pocus] bought $1,000 worth of "hocus pocus" stuff. But we can say that someone clicked an organic listing, landed on the hocus pocus page, and bought $1,000 of stuff.

Note: "(not provided)" shouldn't be confused with the issue of iOS 6 organic traffic showing up as direct. Last we checked this was hiding about 14 percent of Google searches, but is becoming less of an issue with the adoption of iOS7.

2. Bing Still Has Organic Keyword-Level Tracking

Bing doesn't use secure search, so we can still see what people are searching to get to our sites, conversions, sales, etc. Bing data could help quantify SEO efforts, but it's still only 9.1 percent of organic search share.

Note: People searching Bing versus Google probably behave differently to some degree.

3. Google Webmaster Tool Search Query Data Provides Partial Insight

Google gives us access to the top 2,000 search queries every day. After understanding limitations, the search query report can be invaluable as it gives a glimpse of how your site performs from Google's side of the fence. Google also recently mentioned they will be increasing the amount of data available to a year!

By linking Google Webmaster Tools with AdWords, Google also has given us a report using the same search query data except with more accurate numbers (not rounded).

Conclusion

Clearly, page-level tracking is more important than ever. Google has forced SEO professionals to look at what pages are ranking and where, and then pull in other sources to guess on performance and form strategies.

Google will most likely respond to the outcry by giving us access to more detailed search query data in Google Webmaster Tools. As mentioned before, they have already announced an increase of data from 90 days to a year. This may be a sign of how they might help us out in the future.


Original Article Post by Ben Goodsell @ Search Engine Watch

Sitemaps & SEO: An Introductory Guide

Way back in the "good old days" of SEO, many "SEO firms" made a pretty good living "submitting your website to thousands of search engines." While that has never been a sound tactic/method of achieving SEO nirvana, today's SEO provides us with opportunities to ensure that we get our content – in all shapes, sizes, and forms – indexed in the search engines, to the best of our ability.

When it comes to the crawling phase of SEO and bot visibility, we often first check what we withhold from search engines via robots.txt and meta robots tag usage. But equally important is the content/URLs that we feed search engines.

Long ago, the best practice was to create an HTML sitemap of at least all your higher-level pages and link this HTML sitemap from the footer of all site pages. This allowed search engines the ability to have a buffet of site URLs from any one page on your site.

Then along came XML sitemaps. Extensible Markup Language is the preferred means of data digestion by search engines.

With this tool at our disposal, a site administrator has the ability to tell/feed search engines data on the pages of a site they want crawled as well as the priority or hierarchy of site content alongside information on when the page was last updated.

Let's walk through the initial first steps of how to create sitemaps for varied content types.

How to Build a Standard XML Sitemap

Below is an anatomy of a standard XML sitemap URL entry.

<url>
<loc>http://www.example.com/mypage</loc>
<lastmod>2013-10-10</lastmod>
<changefreq>monthly</changefreq>
<priority>1</priority>
</url>


This points out the areas I noted above where you can provide information on URLs desired for crawl as well as additional URL information.

Some content management systems allow the functionality for dynamic or auto-generated sitemaps. Is this easy? Yes. Is it error free? No. More on that in a moment.

If you don't have the functionality to generate a sitemap with your CMS, then you must create an XML sitemap from scratch. You wouldn't want to do this manually because of the time burden. That's why there are tools for this.

There are many XML sitemap generators. Some are free, but they often have a crawl cap on site URLs, so this defeats the purpose.

Most good sitemap generators are paid. One fairly straightforward tool you can use for sitemap generation is Sitemap Writer Pro. It's well worth the $25.

If you do choose to use other tools, choose the one that allows you to review the crawl of URLs and allows you to easily remove any duplicated URLs, dynamic parameters, excluded URLs, etc. Remember, you only want to include the pages on the site that you want a search engine to index and value.
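To get a feel for what a generator produces, here is a minimal Python sketch that builds a standard sitemap from a hand-curated list of URLs (placeholder values matching the anatomy shown earlier):

```python
# Each tuple: (loc, lastmod, changefreq, priority) -- placeholder values.
pages = [
    ("http://www.example.com/", "2013-10-10", "monthly", "1"),
    ("http://www.example.com/mypage", "2013-10-10", "monthly", "0.8"),
]

entries = []
for loc, lastmod, changefreq, priority in pages:
    entries.append(
        "<url>"
        f"<loc>{loc}</loc>"
        f"<lastmod>{lastmod}</lastmod>"
        f"<changefreq>{changefreq}</changefreq>"
        f"<priority>{priority}</priority>"
        "</url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>"
)
print(sitemap)
```

In practice you would write this string to /sitemap.xml at the site root, keeping only the URLs you actually want indexed.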

How to Upload and Submit Your Sitemap

Now that the standard XML sitemap is built, you need to upload the file to your site. This file should reside directly off the root, with a relevant page naming convention such as /sitemap.xml.

Once you've done this, go to Google Webmaster Tools and submit the sitemap:

Google Webmaster Tools Submit Sitemap

Then do the same with Bing Webmaster Tools:

Bing Webmaster Tools Submit Sitemap

Yes, they may find the sitemap on your site, but it's smart to feed search engines this information and give Google and Bing the ability to report on indexing issues.

How to Find Sitemap Errors

You've given your URLs to the top search engines in the preferred XML markup, but how are they indexing the content? Are they having any issues? The wonderful caveat of providing this information directly to Webmaster Tools accounts is that you can review what content you may be withholding from search engines by accident.

Google has done a much better job of sitemap issue transparency compared to Bing, which provides a much smaller amount of data for review.

Google Webmaster Tools Sitemap Errors

In this instance, we've submitted an XML sitemap and received an error that URLs in the sitemap are also featured in the robots.txt file.

It's important to pay attention to this type of error and warning information. They may not be able to even read the XML sitemap. And, we can also glean information on what important URLs we are accidently withholding from crawls in the robots.txt file.

As a follow-up to the point above on the negative aspect of dynamically generated sitemaps: these can often include many URLs that are intentionally excluded from search engine view in the robots.txt file. The last thing we want to do is tell a search engine to both crawl and not crawl the same page at the same time.
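One way to catch that conflict before submitting is to run the sitemap's URLs through the standard library's robots.txt parser; a Python sketch (the rules and URLs are made up):

```python
from urllib.robotparser import RobotFileParser

# Made-up robots.txt rules and sitemap URLs for illustration.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

sitemap_urls = [
    "http://www.example.com/mypage",
    "http://www.example.com/private/secret",
]

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any URL in this list is one we are telling search engines
# to both crawl (sitemap) and not crawl (robots.txt).
conflicts = [u for u in sitemap_urls if not parser.can_fetch("*", u)]
print(conflicts)
```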

Sitemap monitoring is essential for any SEO initiative. At its most basic, it will tell you how many URLs you have provided in your XML sitemap, how many are currently indexed in Google, and the last time the sitemap file was processed.

Wash, Rinse, Repeat

You may have run through the process above and are feeling pretty confident about transparency and delivery of site URLs to the search giants. But aside from the standard XML sitemap information you can provide to Google and Bing, these engines will also accept information on your site's image, video, news, and mobile content.

Conveniently, these types of sitemaps can be created, placed on the site, and submitted in the same fashion as the standard XML sitemap. Additionally, the preferred tool I mentioned earlier can also create these sitemaps for you.

Anatomy of Supporting XML Sitemaps

Image XML Sitemaps

Provide data on site images and the page locations of these images:

<url>
<loc>http://www.example.com/mypage</loc>
<lastmod>2013-10-10</lastmod>
<changefreq>monthly</changefreq>
<priority>1</priority>
<image:image>
<image:loc>http://www.example.com/images/myfirstimage.gif</image:loc>
</image:image>
<image:image>
<image:loc>http://www.example.com/images/mysecondimage.gif</image:loc>
</image:image>
</url>


Video XML Sitemaps

Instruct the search engines on the page locations of your videos and video embeds as well as information on their titles, descriptions, access levels, etc.:

<url>
<loc>http://www.example.com/mypage</loc>
<lastmod>2013-05-06</lastmod>
<changefreq>monthly</changefreq>
<priority>0.5</priority>
<video:video>
<video:content_loc>http://www.youtube.com/v/W10j21236=en_US</video:content_loc>
<video:player_loc allow_embed="yes">http://www.site.com/videoplayer.swf?video=123</video:player_loc>
<video:thumbnail_loc>http://img.youtube.com/vi/W1021236=1/default.jpg</video:thumbnail_loc>
<video:title>My Video Name</video:title>
<video:description>My Video Description</video:description>
<video:rating>2</video:rating>
<video:view_count>498</video:view_count>
<video:publication_date>2013-05-06</video:publication_date>
<video:family_friendly>yes</video:family_friendly>
<video:duration>10</video:duration>
<video:expiration_date>2016-05-06</video:expiration_date>
<video:requires_subscription>no</video:requires_subscription>
</video:video>
</url>


Mobile XML Sitemaps

Do you have mobile pages in a directory on your site? Let search engines know more about your URLs catering to mobile users:

<url>
<loc>http://www.example.com/mobile/oneofmymobilepages</loc>
<lastmod>2013-10-10</lastmod>
<changefreq>monthly</changefreq>
<priority>0.8</priority>
<mobile:mobile/>
</url>


News XML Sitemaps

News sites can provide information about news pieces, their location on the site, as well as news type, language, and access information:

<url>
<loc>http://www.example.com/news/mynewsarticle</loc>
<news:news>
<news:publication>
<news:name>My News Site</news:name>
<news:language>en</news:language>
</news:publication>
<news:access>Subscription</news:access>
<news:genres>PressRelease, Blog</news:genres>
<news:publication_date>2013-10-10</news:publication_date>
<news:title>Title of News Piece</news:title>
</news:news>
</url>

Conclusion

With as much effort as goes into the development of great content these days, taking the added time to ensure that you've done everything in your power to achieve full indexation is critical to getting value back out of that effort.


Original Article Post by Mark Jackson @ Search Engine Watch

Google Fixes Webmaster Tools Bug, Missing Search Query Data to Return

[Screenshot: Google Webmaster Tools search queries showing data through Sept. 25]

When Google turned on secure search last Monday, it meant webmasters were seeing nearly all their keyword referral data as "(not provided)". The best workaround to get organic search data was to access similar keyword data under Search Queries within the Search Traffic section of Google Webmaster Tools.

However, once the secure search switch was flipped, that keyword data stopped being updated or provided in Google Webmaster Tools. Well, webmasters can breathe a sigh of relief, as the missing keyword data in Google Webmaster Tools was simply a bug, Google has confirmed.

"We've recently fixed a small bug related to data reporting in Webmaster Tools. We expect reporting to return to normal in the coming days," a Google spokesperson told Search Engine Watch.

This is great news, as currently Webmaster Tools is the only way to get actual data from Google regarding the keywords that searchers are using when they land on your website. It's great that this source of keyword information will be staying around despite the secure search change.

Just yesterday, no search query data appeared in Google Webmaster Tools beyond Sept. 23. Now keyword data is available through September 25.


Original Article Post by Jennifer Slegg @ Search Engine Watch

Google Keyword Data Goes Missing Again, This Time From Webmaster Tools


Encrypted search has been hard on SEO professionals and marketers. For two years, we've been hit with keywords "(not provided)" in Google Analytics. While the keyword data was still available in Google Webmaster Tools with impressions and click-throughs, it wasn't able to be tied directly to landing pages.

Now, it appears that Google Webmaster Tools is missing the keywords data, as well. At least, it doesn't appear query data has been recorded since last Monday, September 23.

[Screenshot: search queries report showing no data after September 23]

The following, while not confirmed to Search Engine Watch by Google, has been independently verified by several SEO professionals across dozens of sites.

This is the search queries report from Webmaster Tools in the Search Traffic section. The timeframe selected is the past week from September 22 through September 29. As you can see, there are data plots for two days – 22 and 23 – and no more data afterward.

This is further confirmed in Google Analytics:

If you tie your Webmaster Tools and Analytics accounts together, you can also view query data under Traffic Sources > Search Engine Optimization. If you use it, you likely also know Webmaster Tools data takes several days to copy to Analytics, hence the yellow warning. However, this graph clearly shows there has been no data inserted for an entire week.

[Screenshot: query data comparison between Webmaster Tools and Google Analytics]

Historical query data isn't available in Google Analytics. Additionally, the dates for which query data are available don't line up between the two products. Webmaster Tools has data going back through July 2, 2013. Analytics query data goes back to June 23, 2013. Still, both have no more query data after September 23.

Search Engine Watch reached out to Google earlier today, asking if this was a new policy going forward, or if it is simply a bug. We are still awaiting a reply.

There has been discussion in the Google webmaster help forums on this topic.

"Thanks for posting, everyone!" wrote Google's John Mueller, a member of the Webmaster Tools team in Europe, in the Google Webmaster Central forum. "The team is aware of the problem and working on speeding that data back up again. Thanks for your patience in the meantime."


Original Article Post by Thom Craver @ Search Engine Watch

Google Webmaster Tools Give Users More Link Data

Google's Distinguished Engineer Matt Cutts kicked off SES San Francisco this morning by announcing a change to the way Google Webmaster Tools serves backlinks to users. Now, instead of getting a huge list of backlinks in alphabetical order, users will see a better representation of all their backlinks.

"If I download my backlinks in Webmaster Tools, my list ends at H. If you are Amazon or eBay, you get 000000a.com to 000000c.com," Cutts said.
When Google was serving 100,000 backlinks in Webmaster Tools, the list wasn't that useful when users could get so many results from a single domain and had no way to sort them.

Shortly after the announcement this morning at SES, Google published a blog post detailing the changes:
Based on feedback from the webmaster community, we're improving how we select these backlinks to give sites a fuller picture of their backlink profile. The most significant improvement you'll see is that most of the links are now sampled uniformly from the full spectrum of backlinks rather than alphabetically. You're also more likely to get example links from different top-level domains (TLDs) as well as from different domain names. The new links you see will still be sorted alphabetically.

Starting soon, when you download your data, you'll notice a much broader, more diverse cross-section of links. Site owners looking for insights into who recommends their content will now have a better overview of those links, and those working on cleaning up any bad linking practices will find it easier to see where to spend their time and effort.

This is a great change for webmasters, especially those trying to clean up after receiving a bad backlink warning. The old alphabetical sample was a problem for larger sites, and especially for anyone cleaning up a site with thousands of low-quality or spammy backlinks pointing to it.

The change has already gone live, so you can download a better cross-section of your backlinks in Google Webmaster Central right now.


Original Article Post by Jennifer Slegg @ Search Engine Watch

Better backlink data for site owners


In recent years, our free Webmaster Tools product has provided roughly 100,000 backlinks when you click the "Download more sample links" button. Until now, we've selected those links primarily by lexicographical order. That meant that for some sites, you didn't get as complete a picture of the site's backlinks because the link data skewed toward the beginning of the alphabet.

Based on feedback from the webmaster community, we're improving how we select these backlinks to give sites a fuller picture of their backlink profile. The most significant improvement you'll see is that most of the links are now sampled uniformly from the full spectrum of backlinks rather than alphabetically. You're also more likely to get example links from different top-level domains (TLDs) as well as from different domain names. The new links you see will still be sorted alphabetically.

Starting soon, when you download your data, you'll notice a much broader, more diverse cross-section of links. Site owners looking for insights into who recommends their content will now have a better overview of those links, and those working on cleaning up any bad linking practices will find it easier to see where to spend their time and effort.

Thanks for the feedback, and we'll keep working to provide helpful data and resources in Webmaster Tools. As always, please ask in our forums if you have any questions.





Google AdWords Adds New Paid & Organic Report

Google AdWords is introducing a new feature that gives advertisers more data right within the AdWords interface, even when that data isn't specific to paid ads. This is part of Google's effort to connect data across Google AdWords, Google Analytics, and Webmaster Tools.

adwords-paid-and-organic-report

The new paid & organic report helps advertisers see their full search footprint and determine whether there are keyword areas that could be supplemented with paid advertising. It also provides detailed reports showing, for particular keywords, how much organic and advertising traffic you are getting or have the potential to get.

Google suggests several ways advertisers can put the organic traffic information to work for their business. You can:
  • Look for additional keywords where you might have impressions on natural search, but without any related ads.
  • Use it to optimize your presence for your most important high-value keyword phrases, so you can see where you need to improve your presence.
  • Use it to test website improvements and AdWords changes, as you can compare traffic across both AdWords and organic in the same interface, which enables you to adjust accordingly.
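The first use case above, spotting keywords with organic impressions but no ads, can be sketched in a few lines. This is a minimal illustration, not the actual report export format; the keyword names, numbers, and dict shapes are hypothetical.

```python
# Hypothetical sketch: assume the paid & organic report has been exported
# into two keyword -> impressions dicts. Names and numbers are made up.

def organic_only_keywords(organic, paid):
    """Keywords earning organic impressions but with no paid coverage."""
    return sorted(kw for kw, imps in organic.items() if imps > 0 and kw not in paid)

organic = {"seo tools": 1200, "webmaster tools": 800, "site audit": 450}
paid = {"seo tools": 300}

print(organic_only_keywords(organic, paid))  # candidate keywords for new ad groups
```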

In order to take advantage of this new report, you need to have a Google AdWords and Webmaster Tools account, and you will need to verify and sync them.

In unrelated AdWords news:
  • Google has introduced a new option for reporting trending traffic. Now, you can toggle between daily, weekly, monthly and quarterly so you can quickly and easily see any resulting trends during those time periods.
  • And finally, Google will officially retire the AdWords Keyword Tool on August 26. However, the Keyword Planner has been out for several months, and you can get all the same data in that all-in-one tool.

Article Post @ Search Engine Watch

Google Webmaster Tools Search Query Data is Accurate (and Valuable)

Google Webmaster Tools Search Queries
There is a widespread belief within the SEO industry that Google Webmaster Tools search query data can be dismissed as worthless and inaccurate. This is wrong – an unfortunate example of how misinformation can steer people away from a free, valuable tool.

That said, it's understandable how people can reach these types of conclusions. After all, it takes some time to figure out exactly how Google displays the data and how to navigate around some of the pitfalls. It takes more than just a cursory glance to glean the true value of this data!

Here are some general items to keep in mind about Google Webmaster Tools search query data as you read this article.
  • It's generated from Google's side of the fence. It includes "(not provided)" and all other referral data lost to secured search (Android, iOS 6), so +1 for more accuracy!
  • Google masks data in the form of buckets. They round exact data into pre-determined numbers making the available data less accurate than it could be, so -1 for less accuracy!
  • Limited to the top 2,000 keywords the site received clicks for. Stay tuned for a secret fix.

'(Not Provided)' and Secure Search Referral Data

This is all included in GWT search query data! A big deal when considering that many sites see "(not provided)" as their highest referring keyword.

In an article written back in March 2012, I showed how it was possible, with careful segmentation of Google Analytics and GWT data, to recover "(not provided)" data – an example of how granular insight can be gained from search query data, despite what many people currently perceive.

GWT Clicks vs. Analytics Visits

When comparing GWT numbers with other analytics platforms, it's important to take several factors into account, or you'll see vast differences:
  • There's generally a 2-3 day lag in GWT data.
  • When comparing GWT numbers to analytics numbers, consider variables like time zone, mobile, image, video, etc. Be sure to filter everything out before comparing GWT clicks with analytics visits.
  • Google gets data upon users clicking your listing in search results. Most analytics depend on the execution of JavaScript tagging and there is probably a small percent of users who click through to a site then bounce before tracking has fired.
  • Google mentions the possibility of bots (crawlers) occasionally (rarely) negatively affecting the accuracy of numbers.
  • Always be sure to check comparison dates are the same.

Google Webmaster Tools pioneer Vanessa Fox summed it up well when she stated, “...we're looking at historical trends for traffic, not replacing web analytics programs...”

Bucketed Display

Below is a list of how Google appears to bucket its displayed numbers: values are simply rounded to the closest bucket. So a listing that received 451 clicks would be displayed as having 500 clicks.
gwt-search-query-buckets

To neutralize the obvious inaccuracy of the bucketed numbers, trend the data over time. Downloading the data every day exposes the fluctuation of the bucketed values, giving insight into a more accurate average over time. As you can see, the larger the number, the bigger the bucket becomes – all the more reason to grab search data daily, when counts are lowest and buckets are smallest.
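This bucketing-and-averaging behavior is easy to simulate. Note the bucket values below are an assumption for demonstration only – Google doesn't publish the exact list – but the effect holds for any rounding scheme: a single day's figure is coarse, while a daily average converges toward the true mean.

```python
# Illustrative only: BUCKETS is an assumed list of display buckets.
# Rounding hides detail on any single day, but averaging daily bucketed
# values over time lands much closer to the true mean.

BUCKETS = [0, 5, 12, 16, 22, 30, 50, 70, 90, 110, 150, 200, 250, 350, 500]

def to_bucket(clicks):
    """Round a click count to the nearest (assumed) display bucket."""
    return min(BUCKETS, key=lambda b: abs(b - clicks))

true_daily = [400, 380, 460, 350, 430]            # actual clicks per day
shown_daily = [to_bucket(c) for c in true_daily]  # what GWT would display

print(to_bucket(451))                        # a single day: 451 displays as 500
print(sum(shown_daily) / len(shown_daily))   # trended average: close to the true 404
```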

Between the margin of error eliminated by trending data over time and the accuracy gained from "(not provided)" and other lost or mislabeled secure search referral data, there really is value to be had here.

Data Limitations

Google documentation mentions, “Webmaster Tools shows data for the top 2,000 queries that returned your site at least once or twice in search results in the selected period."

For most sites this is a comprehensive look at search result presence. For others it is not enough, but it's possible to separately verify subfolders in GWT. By doing this you can get search query data specific to that subfolder or category. (There's the secret fix I promised.)

Identify Trends

It is easy to point out the most advantageous aspect of GWT search query data. By creating categories of keywords, you can track seasonality (impressions) on the same graph as average ranking, click-through rate (CTR), and number of clicks.

Imagine a scenario where you have all referring keywords for all products / articles / similar pages in categories and a complete and immediate understanding of whether a decrease in traffic can be attributed to a drop in ranking, CTR, or simply less search.
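The categorization step can be sketched as a simple aggregation over exported rows. This is a hedged illustration: the queries, the category mapping, and the CSV field order are all hypothetical, but the shape – group queries into categories, then total impressions, clicks, and CTR per category – is the technique described above.

```python
from collections import defaultdict

# Hypothetical rows, as if exported from the GWT search query CSV:
# (query, impressions, clicks). The category mapping is made up.
rows = [
    ("red widgets", 1000, 50),
    ("blue widgets", 600, 30),
    ("widget repair guide", 400, 8),
]

CATEGORIES = {
    "red widgets": "products",
    "blue widgets": "products",
    "widget repair guide": "articles",
}

totals = defaultdict(lambda: {"impressions": 0, "clicks": 0})
for query, impressions, clicks in rows:
    cat = CATEGORIES.get(query, "other")
    totals[cat]["impressions"] += impressions
    totals[cat]["clicks"] += clicks

for cat, t in totals.items():
    t["ctr"] = t["clicks"] / t["impressions"]  # trend this per category over time
```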

Conclusion

With more companies and products using Google Webmaster Tools search query data, and Googlers Maile Ohye and John Mueller vouching for the accuracy of the numbers, it's a shame so many people have dismissed this free insight as invalid. Don't be one of them! Downloading, categorizing, and trending over time is the best way to get the most out of what Google is giving us.


Article Post @ Search Engine Watch

Making smartphone sites load fast


Users tell us they use smartphones to search online because it’s quick and convenient, but today’s average mobile page typically takes more than 7 seconds to load. Wouldn’t it be great if mobile pages loaded in under one second? Today we’re announcing new guidelines and an updated PageSpeed Insights tool to help webmasters optimize their mobile pages for best rendering performance.

Prioritizing above-the-fold content

Research shows that users’ flow is interrupted if pages take longer than one second to load. To deliver the best experience and keep the visitor engaged, our guidelines focus on rendering some content, known as the above-the-fold content, to users in one second (or less!) while the rest of the page continues to load and render in the background. The above-the-fold HTML, CSS, and JS is known as the critical rendering path.

We can achieve sub-second rendering of the above-the-fold content on mobile networks by applying the following best practices:
  • Server must render the response (< 200 ms)
  • Number of redirects should be minimized
  • Number of roundtrips to first render should be minimized
  • Avoid external blocking JavaScript and CSS in above-the-fold content
  • Reserve time for browser layout and rendering (200 ms)
  • Optimize JavaScript execution and rendering time
These are explained in more detail in the mobile-specific help pages, and, when you're ready, you can test your pages and the improvements you make using the PageSpeed Insights tool.
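The arithmetic behind these best practices is worth making explicit. The 200 ms server and 200 ms layout/render allowances come from the list above; the per-round-trip network figures below are assumptions for a typical 3G connection, used only to show why redirects and blocking resources blow the budget.

```python
# Back-of-the-envelope sketch of the one-second above-the-fold budget.
# Network latency figures are assumed values for a 3G connection.

BUDGET_MS = 1000

def render_budget_left(dns_ms=200, tcp_ms=200, http_ms=200, server_ms=200):
    """Milliseconds left for browser layout and rendering after one round trip."""
    return BUDGET_MS - (dns_ms + tcp_ms + http_ms + server_ms)

print(render_budget_left())             # 200 ms left -- no room for a redirect
print(render_budget_left(dns_ms=400))   # one extra lookup consumes the whole budget
```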

As always, if you have any questions or feedback, please post in our discussion group.


View manual webspam actions in Webmaster Tools

We strive to keep spam out of our users’ search results. This includes both improving our webspam algorithms as well as taking manual action for violations of our quality guidelines. Many webmasters want to see if their sites are affected by a manual webspam action, so today we’re introducing a new feature that should help. The manual action viewer in Webmaster Tools shows information about actions taken by the manual webspam team that directly affect that site’s ranking in Google’s web search results. To try it out, go to Webmaster Tools and click on the “Manual Actions” link under “Search Traffic."

You’ll probably see a message that says, “No manual webspam actions found.” A recent analysis of our index showed that well under 2% of domains we've seen are manually removed for webspam. If you see this message, then your site doesn't have a manual removal or direct demotion for webspam reasons.

If your site is in the very small fraction that does have a manual spam action, chances are we’ve already notified you in Webmaster Tools. We’ll keep sending those notifications, but now you can also do a live check against our internal webspam systems. Here’s what it would look like if Google had taken manual action on a specific section of a site for "User-generated spam":

Partial match. User-generated spam affects mattcutts.com/forum/


In this hypothetical example, there isn’t a site-wide match, but there is a “partial match." A partial match means the action applies only to a specific section of a site. In this case, the webmaster has a problem with other people leaving spam on mattcutts.com/forum/. By fixing this common issue, the webmaster can not only help restore his forum's rankings on Google, but also improve the experience for his users. Clicking the "Learn more" link will offer new resources for troubleshooting.

Once you’ve corrected any violations of Google’s quality guidelines, the next step is to request reconsideration. With this new feature, you'll find a simpler and more streamlined reconsideration request process. Now, when you visit the reconsideration request page, you’ll be able to check your site for manual actions, and then request reconsideration only if there’s a manual action applied to your site. If you do have a webspam issue to address, you can do so directly from the Manual Actions page by clicking "Request a review."

The manual action viewer delivers on a popular feature request. We hope it reassures the vast majority of webmasters who have nothing to worry about. For the small number of people who have real webspam issues to address, we hope this new information helps speed up the troubleshooting. If you have questions, come find us in the Webmaster Help Forum or stop by our Office Hours.


Google Webmaster Tools: An Overview

Google Webmaster Tools (GWT) is the primary mechanism for Google to communicate with webmasters. Google Webmaster Tools helps you to identify issues with your site and can even let you know if it's been infected with malware (not something you ever want to see, but if you haven't spotted it yourself, or had one of your users tweet at you to let you know, it's invaluable).
And the best part? It's absolutely free. If you don't have a GWT account, then you need to go get one now.

This guide to Google Webmaster Tools will walk you through the various features of this tool, and give you insight into what actionable data can be found within. (For more in depth help, go to Google's Webmaster Help.)

Verification

Before you can access any data on your site, you have to prove that you're an authorized representative of the site. This is done through a process of verification.

There are five main methods of verification currently in place for GWT. There's no real preference as to which method you use, although the first two tend to be the most commonly used as they've been around for longer.

Verify
  • The HTML file upload. Google provides you with a blank, specially named file that you just have to drop in the root directory of your site. Once you've done that, you just click on the verify button and you'll have access to your GWT data for this site.
  • HTML tag. Clicking on this option will provide you with a metatag that you can insert into the head of your home page. Once it's there click on the verify button to view your GWT data. One item to note about using this method of verification is that it's possible for the tag to be accidentally removed during an update to the home page, which would lead to a revocation of the verification, but reinserting the tag and clicking verify again will fix that.
  • Domain Name Provider. Select your Domain Name provider from the drop down list and Google will give you a step by step guide for verification along with a unique security token for you to use.
  • Google Analytics. If the Google account you're using for GWT is the same account as for GA (assuming you're using GA as your analytics solution), is an admin on the GA account, and you're using the asynchronous tracking code (with the code being in the head of your home page), then you can verify the site this way.
  • Google Tag Manager. This option allows you to use the Google Tag Manager to verify your site.
Verify Alternate Methods

 

The Dashboard

Now that you're verified, you can log in and start to examine the data for your site.

GWT Site Dashboard

The first screen you'll see is the dashboard. This gives you a quick view into some of the more pertinent information for your site, along with any new messages from Google. We'll cover each of the widgets shown here in their own sections.

Site Messages

GWT Site Messages

When Google wants to communicate with a webmaster, this is the place they'll do so. Messages may inform you that you have pages infected with malware, that Google has detected an unusually large number of pages on your site (which may indicate other problems), or simply that your WordPress installation needs updating to close known security holes in that platform.

Not all messages are bad. There's also the possibility that you'll get one that congratulates you on an increase in traffic to one or more of your pages.

 

Settings

GWT Settings

Clicking on the gear icon in the top right gives you access to the tools that formerly resided in the Configuration menu item.

Webmaster Tools Preferences

GWT Preferences

Here you can specify whether you'd like to receive a daily digest of your messages or not, and the email account you'd like them sent to.

Site Settings

GWT Site Settings

Here you can tell Google some things about your site if you're not able to tell them in other ways.

For example, if you have a .com site, hosted in Duluth, but it's targeted to the UK, there aren't too many signals to the search engines that that's your intention. In this tab you can set your geographic target to the UK, which informs Google of your intentions for this site.

You can also set your preferred domain – whether you want the site to show up in the search results with the www or without the www. Most sites will redirect from one to the other, or contain canonical tags, which will preclude the need for setting this here, but if you don't have that capability, this is your way to tell Google.

The crawl rate option allows you to slow down Googlebot's crawl. You'd only really do this if you saw server issues caused by Google's crawling; for the most part, you should let Google determine the correct crawl rate for your site based on how frequently you add and update content.

Change of Address

GWT Change of Address

If, on some rare occasion, you decide to migrate your entire site to a new domain, this is where you let Google know.

Once you've set up your new site, permanently redirected the content from your old site to the new one using 301 redirects, and added and verified your new site in GWT, come to this option and inform Google of the move.

This should help the index be updated slightly more quickly than if Google were left to self-detect and follow the 301s.
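The 301 mapping in a migration is mechanical: every old-domain URL should redirect to the same path on the new domain. A minimal sketch, with both domains hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit

# Sketch of the 301 mapping step in a domain migration.
# Both domains here are hypothetical placeholders.

OLD_HOST = "old-example.com"
NEW_HOST = "new-example.com"

def redirect_target(url):
    """Return the new-domain URL for an old-domain URL, else None."""
    parts = urlsplit(url)
    host = parts.netloc.removeprefix("www.")
    if host != OLD_HOST:
        return None  # not our domain; don't redirect
    return urlunsplit(("https", NEW_HOST, parts.path, parts.query, parts.fragment))

print(redirect_target("http://www.old-example.com/blog/post?id=7"))
```

Preserving the path and query string on every redirect, rather than pointing everything at the new homepage, is what lets Google transfer each old URL's standing to its new equivalent.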

Google Analytics Property

GWT Enable Webmaster Tools Data in Google Analytics

If you'd like to be able to see your GWT data in Google Analytics (GA), you can use this tool to associate a site with a GA account. Simply select any currently linked GA account to associate it with this site. Should you not have a GA account, you have the option to create one here.

Users & Site Owners

GWT Users and Site Owners

Here you can see a list of all authorized users on the account and their levels of authorization. A new user can be added here if need be.

Owners have permission to access every item on the site.

Users with "Full" permission can do everything except add users, link a GA account and inform Google of a change of address.

Users with "Restricted" permission have the same restrictions as those with "Full" permission plus the following: they only have viewing capabilities on configuration data, cannot submit sitemaps or request URL removals, cannot submit URLs, cannot submit reconsideration requests, and only have the capability to view crawl errors and malware notifications (they can't mark any of them as fixed).

Verification Details

GWT Verification Details

This lets you see any verification issues / successes.

 

Associates

GWT Associates

This section allows you to associate different Google accounts with your GWT account, so that they can be designated as officially connected to the account/site. They can't see any data in GWT, but they can perform actions on behalf of your site (e.g., creating an official YouTube account for the site, or posting to Google+ on behalf of the site through an associated account).
To add an associate user, simply:

  • Click on the "Add a new User" button.
  • Enter the email address that's associated with the account you're associating.
  • Select the type of association you want.
  • Click "Add".

To associate a Google+ page, if it's the same account on GWT and Google+, you're done. If you're using different accounts:
  • Navigate to the Google+ page.
  • Click on the profile button on the left.
  • Click "About", in the links section.
  • Add a link to the site.

 

Search Appearance

GWT Search Appearance Overview

Clicking on the ? icon to the right of this menu option delivers a nice breakdown of the various elements of a search engine results page (SERP).

 

Structured Data

GWT Structured Data

Here you can see information about all structured data elements that Google's located on your site, whether they're from schema.org or older microformats.

 

Data Highlighter

GWT Data Highlighter

The data highlighter allows you to help Google identify some types of structured data on the pages without the need for the code to actually be implemented.

 

HTML Improvements

GWT HTML Improvements

Here is where GWT will inform you of issues with your title and description tags. As titles and descriptions should be unique for each page and should be within certain character length ranges, this section points out where you have issues that can and should be corrected.

For example, if all of your tag pages have the same description, then you aren't telling the search engines much about what is on those pages.

Clicking through on any of these errors will give you a more descriptive overview of the error and will also give you a list of pages where the error was detected.

Sitelinks

GWT Sitelinks

Whenever Google determines that your site is an authority for a particular keyword they'll show a collection of links below the main link, pointing to what they believe to be the most important links on that page. From time to time they'll show a link that you don't particularly want to be surfaced, and this is where you'll correct that issue.

Sitelinks

While you can't specify the actual pages that you want to display in the sitelinks (that would be far too open to abuse), you can specify which pages you want removed. Simply enter the URL of the page with the sitelinks (not always just the homepage), and then type in the URL of the sitelink that you want to be removed.

Note that the demoted URL may then be precluded from displaying in the sitelinks for a certain period, but it may return at some point in the future (any time after 90 days from your last visit to the sitelinks page) if it still appears to be an important link on that page, so you may want to review your sitelinks periodically. Also note that Google has now placed a limit of 100 active demotions per site.

Search Traffic

 

Search Queries

Here you can get an overview of the top keywords that returned a page from your site in the search results. Note the data shown here is collected in a slightly different way from your analytics platform, including GA, so don't expect the numbers to tally exactly.

GWT Top Search Queries

What this does is give you an idea of the top traffic driving keywords for your site, the number of impressions and clicks, and therefore the click through rate (CTR), and the average position that your page was ranking for that particular query.

GWT Top Pages

You can also view the same data by page rather than by keyword. This shows you the top traffic generating pages on your site, and perhaps helps you identify those that you should concentrate on optimizing, as a high traffic generator in 11th position would be a much higher traffic generator in 8th.

Links to Your Site

GWT Links to Your Site

This section identifies the domains that link to you the most, along with your most linked to content. While you most likely won't see every link that Google's found for your site, you will see more than if you went to google.com and performed a search for "link:yoursite.com".

Internal Links

GWT Internal Links

Here you can see the top 1,000 pages on your site sorted by the number of internal links to those pages. If you have a small number of pages on your site, you can reverse the sort order by clicking on the Links header.

Any pages that show 0 internal links have been orphaned and should either be linked to from somewhere on your site, or redirected to an appropriate page if they're old legacy pages.
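Finding those orphans programmatically is straightforward if you have a crawl of your site. A sketch, with hypothetical URLs, representing the crawl as a mapping from each page to the set of pages it links to:

```python
# Sketch: given a site crawl as {page: set of pages it links to}, find
# pages that no internal link points to -- orphans. URLs are hypothetical.

def orphan_pages(link_graph):
    linked_to = set().union(*link_graph.values()) if link_graph else set()
    return sorted(set(link_graph) - linked_to)

graph = {
    "/": {"/about", "/blog"},
    "/about": {"/"},
    "/blog": {"/"},
    "/old-promo": set(),  # exists in the crawl, but nothing links here
}

print(orphan_pages(graph))  # ['/old-promo']
```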

 

Google Index

Index Status

GWT Index Status

The Index Status allows you to track the status of your site within the Google index. How many pages are they showing as being indexed? Are there any worrying trends? Have you accidentally blocked large sections of your site from Googlebot? This is a great place to get the answers to those questions and more.

 

Content Keywords

GWT Content Keywords

This section displays the most common keywords found by the Google crawler as it navigated your site. One thing to keep an eye on here is if you see unexpected, unrelated keywords showing up, that's usually an indication that your site may have been hacked and hidden keywords have been injected into your pages.

Remove URLs

GWT Remove URLs

If you receive a cease and desist letter from an attorney demanding that you remove a page from your site, accidentally break a news embargo, or release an obituary while the person is still breathing, you'll most likely want to get that page out of Google as soon as possible.

The first step is to either remove the page itself or 301 it elsewhere so that it can't be crawled and indexed. This prevents users and crawlers from getting to it, but the URL will still be in the index, and the page can still be found in the cache. That's where this tool comes in.

Enter the URL that you want to remove, click continue, then select whether you want it removed from the search results and the cache, just from the cache or if you want an entire directory removed. Clicking Submit Request adds it to the removal queue. Typically this request will be processed in 2-12 hours.

 

Crawl

Crawl Errors

GWT Site Errors

Here's where you find out about the errors that Google has detected when crawling your site over the past 90 days. This is an invaluable tool as it can absolutely help you identify a variety of issues on your site, from server errors to missing pages, and errors in between.

GWT shows you the number of errors, lists the pages and shows a graph of your count over time for that particular error so you can see whether it's been a gradual change or a more sudden occurrence (perhaps a code push caused unforeseen errors with a section of the site that no one noticed).

This section should be a frequent port of call, as you keep an eye out for any new issues that could be impacting the crawling of your site. If your site has either a mobile presence or is in Google News, you will see tabs dedicated to any crawl errors specific to those products.

Crawl Stats

GWT Crawl Stats

The crawl stats section gives you an idea of how fast the crawlers are able to read pages on your site. Spikes are to be expected here, but if you see a sustained drop in pages crawled, or a sustained rise in time spent downloading a page or in page size, that's an indication you should look at what's changed on your site – perhaps you added a new partner module that made a bigger-than-expected addition to your pages' footprint.

Fetch as Google

GWT Fetch as Google

Here is where you can basically view your pages as Google sees them. They'll return the HTTP response, the date and time, and the HTML code, including the first 100kb of visible text on the page.

This is a great way of verifying that the Google crawler sees the page as you expect it to (remember the crawler is supposed to see the same page as the user would see), and that there are no externally injected hidden links on the page. If the page looks how you expect it to, then you can submit it to the index. You are allowed 500 fetches / submissions a week, and 10 linked page submissions per week (submitting a page and all pages linked from it at the same time).

GWT Fetch as Google How Fetched

 

Blocked URLs

GWT Blocked URLs

This section is the place to test out your current robots.txt against any pages on your site to verify whether they can be crawled or not. You can also test out modifications to your robots.txt to see whether they'd work as you anticipate against various pages on your site.
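You can run the same kind of check offline with Python's standard library robots.txt parser. The rules and URLs below are made up; note that the stdlib parser applies rules in file order (first match wins), which is why the narrower Allow line is placed before the broader Disallow.

```python
from urllib import robotparser

# Offline spot-check of a robots.txt against sample URLs, similar in
# spirit to the Blocked URLs tester. Rules and URLs are hypothetical.

rules = """\
User-agent: *
Allow: /private/press/
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

for url in ("http://example.com/private/notes.html",
            "http://example.com/private/press/launch.html",
            "http://example.com/index.html"):
    print(url, rp.can_fetch("Googlebot", url))
```

Since there's no Googlebot-specific group in these rules, the `User-agent: *` group applies; only the press subdirectory inside `/private/` remains crawlable.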

Sitemaps

GWT Sitemaps

Here's where you can access all of the information about the sitemaps GWT has been informed of. To test a sitemap, click the add/test sitemap button, and GWT will tell you whether the sitemap appears to be valid. If it is, add it using the same procedure, this time clicking the add sitemap button.
Note that the default view here is to only show the sitemaps that you have added. To view those that have been added by other authorized users on the account click the "all" tab.

The page shows you the sitemaps that you've submitted, the number of pages they found in each, and the number of those pages that they've indexed. You can also see quite clearly if there are any issues that they've detected within your sitemaps. Simply click on the warnings hyperlink to view them all.
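For reference, a minimal valid sitemap of the kind you'd submit here looks like the output of this sketch. The URLs are hypothetical, and real sitemaps can also carry optional `lastmod`, `changefreq`, and `priority` elements per URL.

```python
import xml.etree.ElementTree as ET

# Build a minimal XML sitemap (sitemaps.org 0.9 schema). URLs are made up.

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc in urls:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap_xml)
```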

 

URL Parameters

GWT URL Parameters

With the introduction and use of canonical tags, this feature isn't used as much as it once was. It allows you to specify URL query string parameters that should be ignored when Google examines URLs on the site to determine which ones are unique.

For example, if you use a tracking parameter for a particular campaign, the page is obviously the exact same page as when it's reached without that parameter. Entering the tracking parameter here tells Google to ignore it when looking at the URL.
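The canonicalization this setting asks Google to perform can be sketched directly. The parameter name `trk` is an assumed example of a campaign tracking parameter:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Sketch of the parameter rule: drop an assumed tracking parameter ("trk")
# so campaign-tagged URLs collapse to one canonical URL.

IGNORED_PARAMS = {"trk"}

def canonicalize(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonicalize("https://example.com/shoes?color=red&trk=spring_campaign"))
# -> https://example.com/shoes?color=red
```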

 

Malware

GWT Malware

If Google has detected any malware on your site, this is where they'll list it out (it will also appear in the messages section). If you see a page here you'll want to get it fixed as soon as possible and click on the "Request a Review" button that will be displayed here.

Additional Tools

GWT Other Resources

This section contains links to tools that are outside of GWT, but are of interest to webmasters, such as the Structured Data Testing tool, which enables webmasters to test their schema implementations, the Structured Data Markup Helper, and others.

 

Labs

The labs section contains functionality that's in testing mode. When it's deemed to be "ready for prime time" it will be promoted to the regular sections of GWT, or it may just vanish if it's determined to not be useful.

 

Author Stats

GWT Author Stats

With the big push to tie bylines to Google+ accounts, this tool allows you to see data for pages you're the author of; you'll need to be logged into a GWT account that you've previously set up as an author.

 

Custom Search

GWT custom search
This allows you to set up Google customized search for your own site.

 

Instant Previews

GWT Instant Previews

This tool allows you to see how your site looks using Google's Instant Preview feature (the view of your site that can be seen in the search results when you mouse over the double arrows that show up next to a result). However, Google removed Instant Previews in April, so this feature isn't of any value to webmasters.

 

Site Performance

GWT Site Performance

This section of Labs has been shut down and links off to alternative resources.

 

Bing Webmaster Tools

Now that you're up to speed on Google Webmaster Tools, don't forget about another search engine offering a free toolset to webmasters that you should also be using: Bing. See "Bing Webmaster Tools: An Overview" for a complete guide.

Article Post @ Search Engine Watch
 