
Matt Cutts on SEO, PageRank, Spam & the Future of Google Search at Pubcon Las Vegas


Matt Cutts kicked off day two at Pubcon with another of his signature keynotes, dispelling myths, talking about spammers and, at the urging of the audience, responding to Jason Calacanis’ keynote from day one.

First, Cutts spoke about Google’s “Moonshot changes,” which he broke down into these areas:
  • Knowledge Graph
  • Voice search
  • Conversational search
  • Google Now
  • Deep Learning
He explained that deep learning lets Google model relationships between words and apply them to related words, and how that can help improve search and handle the nuances of search queries.

Deep Learning in Regular and Voice Search

He explained that voice search is changing the types of queries people use, and that a query can be refined without the user repeating earlier parts of it. Google does this when it knows the user is still searching the same topic, just drilling down to more specifics.

Cutts shared an example where he searched for weather and kept refining the query without retyping “What is the weather?” because Google recognizes that the user is still refining the previous search. Asking “Will it rain tomorrow?” in voice search brought up weather results for Las Vegas, Nevada. When he then asked “What about in Mountain View?” Google showed the weather for Mountain View, recognizing it as a refinement of the earlier voice query. Finally, “How about this weekend?” returned Saturday’s weather for Mountain View.

Hummingbird, Panda & Authorship

Next up, Cutts spoke about Hummingbird; he feels that many of the blog posts about how to rank with Hummingbird are not that relevant. The fact is, Hummingbird was out for a month and no one noticed. Hummingbird is primarily a core quality change; it doesn't impact SEO that much, he said, despite the many blog posts claiming otherwise.

Of most interest to some SEOs is that Google is looking at softening Panda. Sites caught in Panda's grey areas could start ranking again, provided they are quality sites.

Google is also looking at boosting authority through authorship. We have seen authorship becoming more and more important when it comes to search results and visibility in those results; Cutts confirmed this is the direction in which Google will continue to move.

Google on Mobile Search Results

Next, he discussed the role of smartphones and their impact on search results. This is definitely an area SEOs need to continue to focus on, as it is clear that sites that are not mobile-friendly will see a negative impact on their rankings in the mobile search results.

Smartphone ranking will take several things into account, he explained:
  • If your phone doesn't display Flash, Google will not show Flash sites in your results.
  • If your website is Flash heavy, you need to consider its use, or ensure the mobile version of your site does not use it.
  • If your website routes all mobile traffic to the homepage rather than the internal page the user was attempting to visit, it will be ranked lower.
  • If your site is slow on mobile phones, Google is less likely to rank it.
Cutts was pretty clear that with the significant increase in mobile traffic, not having a mobile-friendly site will seriously impact the amount of mobile traffic Google sends you. Webmasters should begin prioritizing their mobile strategy immediately; one common markup setup for sites with separate mobile URLs is sketched below.
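For sites that serve a mobile version on separate URLs, one commonly documented setup uses rel="alternate" and rel="canonical" annotations so Google can map each desktop page to its mobile equivalent (and send mobile users to the right page rather than the homepage). A minimal sketch, with m.example.com standing in for a hypothetical mobile domain:

    <!-- On the desktop page: www.example.com/page.html -->
    <link rel="alternate"
          media="only screen and (max-width: 640px)"
          href="http://m.example.com/page.html">

    <!-- On the mobile page: m.example.com/page.html -->
    <link rel="canonical" href="http://www.example.com/page.html">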

Penguin, Google’s Spam Strategy & Native Advertising

Matt next talked about Google's spam strategy. When Penguin originally launched and spammers on blackhat webmaster forums bragged that they weren't touched by it, the webspam team's response was, “OK, well, we can turn that dial higher.” They upped the impact it had on search results. Cutts said that when spammers are posting about wanting to do him bodily harm, he knows his spam team is doing its job well.

He did say they are continuing their work on some specific keywords that tend to be very spammy, including “payday loans,” “car insurance,” “mesothelioma,” and some porn keywords. Because they are highly profitable keywords, they attract the spammers, so they are working on keeping those specific areas as spam-free as possible through their algorithms.

He discussed advertorials and native advertising, and how Google continues to penalize those who use them without proper disclaimers showing that the content is paid advertising. Google has taken action on several dozen newspapers in the US and UK that were not labeling advertorials and native advertising as such, and that were passing PageRank. He did say there is nothing wrong with advertorials and native advertising if they are disclosed as such; it's only when they are not disclosed that Google will take action.
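In markup terms, the safe pattern for an advertorial or other paid placement is a visible disclosure plus rel="nofollow" on the paid link so it does not pass PageRank. A minimal sketch, with Example Corp and example.com as placeholder advertiser names:

    <p>Sponsored post: this content was paid for by Example Corp.</p>
    <p>Learn more at <a href="http://www.example.com/" rel="nofollow">Example Corp</a>.</p>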

Spam networks are still on Google’s radar and they are still bringing them down and taking action against them.

Bad News for PageRank Fans

For PageRank devotees, there is some bad news. PageRank is updated internally within Google on a daily basis, and every three months or so that information was pushed out to the Google toolbar so it would be visible to webmasters. Unfortunately, the pipeline used to push the data to the toolbar broke, and Google does not have anyone working on fixing it. As a result, Cutts said we shouldn't expect to see any PageRank updates anytime soon, and not anytime this year. He doesn't know if they will fix it, but they are going to judge the impact of not updating it. The speculation that PageRank could be retired is not that far off from the truth, as things currently stand.

Communication with Webmasters, Rich Snippets, JavaScript & Negative SEO

Google continues to increase its communication with webmasters. It has made new videos covering malware and hacking, as Google is seeing these problems more and more, yet not all webmasters are clear about what they are and how to fix them. Google is also working on including more concrete examples in its guidelines, to make it easier for people to determine the kinds of things that cause ranking issues and to point webmasters in the right direction to fix them.

Cutts stressed that he is not the only face of Google search. The team does 100 speaking events per year and runs Hangouts on Air to educate webmasters. It also holds Webmaster Office Hours to increase communication and give users the chance to engage and ask questions of the search team.

Google is becoming smarter at reading JavaScript, which has definitely been used for evil by spammers. However, Cutts cautioned that even though Google is doing a better job of reading it, don't use that as an excuse to build an entire site in JavaScript.

Rich snippets could get a revamp, with Google dialing back the number of websites able to display them. "More reputable websites will get rich snippets, while less reputable ones will see theirs removed," said Matt.

Matt also said negative SEO isn't as common as people believe and is often self-inflicted. One person approached Matt to say a competitor was ruining them by pointing paid links at their site. Yet when he looked into it, he discovered paid links from 2010 pointing to the site, and said there was no way competitors would have bought paid links back in 2010, since the algorithm penalizing paid links didn't arrive until a couple of years after those links went live.

The Future of Google Search: Mobile, Authorship & Quality Search Results

On the future of search, he again stressed the importance of mobile site usability. YouTube traffic on mobile has skyrocketed from 6 percent two years ago, to 25 percent last year, to 40 percent of all YouTube traffic this year. Some countries have more mobile traffic than they do desktop traffic. Cutts reiterated, "If your website looks bad in mobile, now is the time to fix that."

Google is also working on machine learning and training their systems to be able to comprehend and read at an elementary school level, in order to improve search results.

Authorship is another area where Google wants to improve, because tying an identity to an authorship profile can help keep spam out of Google. Google plans to tighten up authorship to combat spam; it found that removing about 15 percent of the lesser-quality authors dramatically increased the presence of the better-quality authors.

They are also working on the next generation of hacked site detection, where Cutts said he is not talking about ordinary blackhat, but “go to prison blackhat.” Google wants to prevent people from being able to find any results for the really nasty search queries, such as child porn. Cutts said, “If you type in really nasty search queries, we don’t want you to find it in Google.”

Cutts’ current advice to webmasters is, again, that it's important to get ready for mobile. He spoke about the convenience for website visitors when you use auto-complete annotations on your web forms, making it easier for people to fill out forms on your site. The markup to add to forms is easy to implement and will be available in the next few months.
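Cutts didn't show the exact markup at the keynote, but the standard HTML autocomplete attribute illustrates the general idea of annotating form fields so browsers can fill them in for visitors; the field names below are only an illustrative example:

    <form method="post" action="/checkout">
      <input type="text"  name="fullname" autocomplete="name">
      <input type="email" name="email"    autocomplete="email">
      <input type="tel"   name="phone"    autocomplete="tel">
      <input type="text"  name="address"  autocomplete="street-address">
    </form>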

The next generation of the algorithm will look at the issue of ad-heavy websites, particularly those with a large number of ads placed above the fold. This is not really a surprise, as it makes for a bad user experience and Google has previously announced that its page layout algorithm targets this. But sites using JavaScript to make it appear to Googlebot that the ads aren't above the fold should fix this before the algorithm impacts them.

Matt Cutts Q&A

During Q&A, Cutts discussed links from press release sites. He said Google identified the sites that were press release syndication sites and simply discounted them. He stressed that press release links weren't penalized, because press release sites do have value for press and marketing reasons, but those links won't pass PageRank.

The problem of infinite-scrolling websites was raised, such as how Twitter keeps loading more tweets as you scroll down. Cutts cautioned that while Google tries to do a good job with these pages, other search engines don't handle infinite scrolling as well. He suggested that any site using infinite scrolling also offer static links, such as a pagination structure, so that bots which don't wait for the page to keep loading can still reach all the content.
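One way to provide that static fallback is a plain paginated archive with ordinary crawlable links, optionally annotated with rel="next" and rel="prev" in the head. A minimal sketch, with placeholder URLs:

    <!-- In the <head> of /timeline?page=2 -->
    <link rel="prev" href="/timeline?page=1">
    <link rel="next" href="/timeline?page=3">

    <!-- Plain links in the page body that bots can follow -->
    <a href="/timeline?page=1">Newer posts</a>
    <a href="/timeline?page=3">Older posts</a>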

Someone asked whether being very prolific on blogs and posting a ton of posts daily has any impact on search rankings. Cutts used the Huffington Post as an example: it has a huge number of authors, so logically it publishes many posts each day. However, he said posting as much as your audience expects to see is the best way to go.

In closing, Cutts said they are keeping a close eye on the mix of organic search results with non-organic search results and says he would also like to hear feedback on it.

While no new features were announced during his keynote at Pubcon, Cutts packed his presentation with many takeaways for webmasters.


Original Article Post by Jennifer Slegg @ Search Engine Watch

Google's Matt Cutts on Search vs. Social: Don't Rely on Just One Channel


Have you ever wondered if you should just throw in the towel when it comes to trying to rank in organic search results, and maybe spend more time on other avenues of traffic generation, such as social media leads or pay per click? This is the topic of the latest Google webmaster help video from Google's Distinguished Engineer Matt Cutts.
Since Google has been actively updating its search results, it is hard for people to trust Google anymore. Should one start focusing on getting leads from social media rather than from search engine results?
Google will constantly be upgrading its algorithm, Cutts said. It's never going to be complete; Google will continue to update it as spammers find new ways to rank and as it integrates new signals or changes the weight of those signals in the algorithm.

Cutts added that since the start of the summer (when this particular video was filmed) Google has been trying to make it even harder for spammers to rank from black hat techniques.

"So we are always going to be updating, always going to change things to make things better, always trying to innovate the way that we record search algorithms, that's just the nature of the beast," Cutts said. "The goal is always the same, to return the highest quality set of search results."

He said webmasters could probably spend their time better than trying to figure out every little nuance of the Google search algorithm.

"So rather than trying to figure out reverse engineering our algorithm and trying to find all the different ways to rank higher and trying to take shortcuts, as long as you're trying to make a fantastic site that people love, that's really compelling, that there are always going to, that's the sort of thing that puts you on the same side as Google," Cutts said.

Now what about getting leads from other sources, such as social media leads? Cutts discussed this type of traffic generation next.

"I'm all for having eggs in lots of different baskets, because if your website goes down and then you can always have your brick-and-mortar business," Cutts said. "If your ranking on Google is not as good, then you can have other channels that you can use – from print media advertising, to billboards, to Twitter, to Facebook. So you should always have a very well-rounded portfolio of ways to get leads whether from people walking through your door or Yellow Pages or whatever it is, because you can't count on any one channel always working out perfectly.

"As long as you have great content you should do well in Google," he added. "But you know if people are spamming or you hire a bad SEO, that can lead to unpredictable results.

So yes, this is another case where Google is praising good content as being the best path forward, because that is what the end user wants to see. And Google is about making the end user – not the SEO company – happy.


Original Article Post by Jennifer Slegg @ Search Engine Watch

Google Doesn't Consider Geolocation Techniques Spam


One of the cardinal rules of SEO is to make sure you serve the same content to Googlebot that you serve to end users. Many years ago, a common spam technique was to show Googlebot one version of a webpage while sending users to a completely different version, often one that had nothing to do with what Googlebot saw.

So if you're using geo-targeting in order to serve country-specific content to the end-user, how should you handle this? That's the topic of the latest webmaster help video.
Is using a geo-detection technique against Google's guidelines? I am offering useful information (price, USPs) to users based on their geolocation. Will Google consider this spam, i.e., showing X content to search engines and Y content to users?
Google's Distinguished Engineer Matt Cutts explained exactly how geo-targeting should be used by webmasters so that there are no problems from a Google penalty perspective.
Geolocation is not spam. As long as you are showing "oh someone is coming from a French IP address and let’s direct them to the French version of my page or the French domain for my business", that is totally fine. "Someone comes in from a German IP address I’ll redirect them over to the German version of my page" that’s totally fine.
He also made it clear that you shouldn't treat Googlebot any differently than you would an ordinary user coming to your website.
So if Googlebot comes in, you check the IP address, and imagine we’re coming from the United States, just redirect Googlebot to the United States version of the page, or the dot-com, or whatever it is you would serve to regular United States users. So geolocation is not spam.
Cutts also said that Google treats traffic differently based on geolocation, "so that if the user comes in, they send them to what they think is the most appropriate page based on a lot of different signals, but usually the IP address of the user."
Now we come to the gray area: serving different content to Googlebot than to the end user. Cutts said:
Showing X content to search engines and Y content to users, that is cloaking, that’s showing different content to Google than to users, and that is something I would be very careful about.
But as long as you’re treating Googlebot just like every other user, whatever IP address it comes from when you’re geolocating, as long as you don’t have special code that looks for the user agent of Googlebot or special code that looks for the IP address of Googlebot, and you just treat Googlebot exactly like you would treat a visitor from whatever country they were coming from, then you’ll be totally fine.
Because you’re not cloaking, you’re not doing anything different for Google; you’re doing the exact same thing for Google that you would do for any other visitor coming from that web address. As long as you handle it that way, you’ll be in good shape: you won’t be cloaking, and you’ll be able to return nicely geolocated pages for Google and search engines without any risk whatsoever.
So when it comes to geolocation, you'll be fine as long as you make sure that what you serve to Googlebot is the same as what you would serve to a user from the same country that Googlebot is coming from.


Original Article Post by Jennifer Slegg @ Search Engine Watch

Matt Cutts: Google Penguin 2.1 is Going Live Today


Google Penguin 2.1 is launching today, according to a tweet from Google's Distinguished Engineer Matt Cutts.

The first update to the second-generation Penguin algorithm designed to target web spam will affect "~1% of searches to a noticeable degree."

Google Penguin 2.0 went live on May 22 and affected 2.3 percent of English-U.S. queries. When it launched, Cutts explained that while it was the fourth Penguin-related launch, Google referred to the change internally as Penguin 2.0 because it was an updated algorithm rather than just a data refresh.

"It's a brand new generation of algorithms," Cutts said in May. "The previous iteration of Penguin would essentially only look at the home page of a site. The newer generation of Penguin goes much deeper and has a really big impact in certain small areas."

This is Google's fifth Penguin-related launch.

Google originally launched the algorithm that would eventually become known as Penguin 1.0 in April 2012. There were two data refreshes later that year, in May and October.

You can learn more about Google Penguin 2.0 in our earlier stories.

Original Article Post by Danny Goodwin @ Search Engine Watch

Matt Cutts: You Don't Have to Nofollow Internal Links


A recent Google webmaster help video brings up an interesting question: should a webmaster nofollow internal links within their site or does it not make any difference because they are simply internal links?

Here's the question asked of Google's Distinguished Engineer Matt Cutts:
Does it make sense to use rel="nofollow" for internal links? Like, for example, to link to your login page? Does it really make a difference?
"Rel='nofollow' means the PageRank won't flow through that link as far as discovering the link, PageRank computation and all that sort of stuff," Cutts said. "So for internal links, links within your site, I would try to leave the nofollow off.

"So if it’s a link from one page on your site to another page on your site, you want that featuring to flow, you want Googlebot to be able to find that page," Cutts continued. "So almost every link within your site, that is a link going from one page on your site to another page on your site, I would make sure the PageRank does flow which would mean leaving off the nofollow."

Of course, there are always exceptions to the rule, and things like login pages can be the exception. He said it doesn't hurt to put nofollow on a link pointing to a login page, or to things like terms and conditions or other "useless" pages. At the same time, it doesn't hurt at all for those pages to be crawled by Google.
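As a rough illustration of that advice, ordinary internal links are left alone so PageRank can flow, while a link to something like a login page can optionally carry nofollow; the paths below are placeholders:

    <!-- Normal internal links: leave nofollow off -->
    <a href="/products/widgets.html">Widgets</a>
    <a href="/about.html">About us</a>

    <!-- Optional: a page you don't particularly need crawled -->
    <a href="/login.html" rel="nofollow">Log in</a>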

He brought up the fact that nofollow is a useful tool when it comes to linking to outside sites, citing blog comments and forum profile links specifically as things that should be nofollowed. And when you give an ordinary, followed link, you want to make sure it points to a site you trust, because you're giving it your stamp of approval.

Cutts made a good suggestion: if there are pages on your site that you would only link to with nofollow, it makes more sense to use noindex so those pages aren't indexed at all. He also made the valid point that sometimes having login pages indexed is useful for users who want to find the login page so they can simply log into your site, especially if it's the type of site that doesn't have login fields on every page.

Cutts closed with an important reminder: it makes sense to ensure Googlebot is able to crawl all the pages on your site.


Original Article Post by Jennifer Slegg @ Search Engine Watch

Does Google Penalize For Invalid HTML? Matt Cutts Says No


It's easy to make a mistake when trying to write HTML that validates correctly every time. With Google Webmaster Tools surfacing HTML issues and validators making it easy to check your code for errors, it certainly raises the question of how important it is to have valid code.

While we all know the reasons we should write valid HTML, in reality it doesn't always happen. But from an SEO perspective, how important is valid HTML when it comes to your Google rankings in organic search? That is the topic of the latest Google webmaster help video.

Does the crawler really care about valid HTML? Validating google.com gives me 23 errors, 4 warning(s)

In a new video, Google's Distinguished Engineer Matt Cutts explained why it's best to have validated HTML code and ensure your code is clean.

"It makes it more maintainable, it makes it easier whenever you want to upgrade, it makes it better if you want to hand that code off to somebody else, there's just a lot of good reasons to do it," Cutts said.

There are a lot of coding purists for whom every little piece of code has to validate perfectly before they will launch a website, and then there's the other end of the spectrum, where many sites have atrociously coded HTML yet rank extremely well. The rest fall somewhere in between.

Cutts continued, saying that Google needs to work with the webpages that are actually available, not the perfectly validated webpages of Google's ideal world. Because of this, Google's web crawler has to compensate for poorly coded HTML, or for code that has been changed for things like faster page loading.

"So Google does not penalize you if you have invalid HTML because there would be a huge number of webpages like that and some people know the rules and then decided to make things a little bit faster or to tweak things here there and so their pages don't validate and there are enough pages they don't validate that we said OK this would actually hurt search quality if we said only the pages that validate are allowed to rank or rank those a little bit higher."

He does caution that Google could make changes in the future.

"Now I wouldn't be surprised if they correlate relatively well, you know maybe it's a signal we'll consider in the future, but at least for right now do it because it's good for maintenance, it's easier for you if you want to change the site in the future, don't just do it because you think it'll give you higher search rankings."


Original Article Post by Jennifer Slegg @ Search Engine Watch

Matt Cutts: Too Many nofollow Links Won't Hurt Your Google Search Rankings


One of the latest webmaster help videos brings up a really interesting SEO question: what happens if you get a lot of nofollow links, such as links gained for the explicit purpose of generating direct traffic? Could that negatively impact your search rankings?

SEO professionals have known for quite some time that putting nofollow on a link is a great way to tell Google that you don't want to pass PageRank through that particular link, or that you don't entirely trust the site the link points to. So if a large number of nofollowed links point at a single site, could this potentially raise a flag to Google that the site is untrustworthy for some reason?

"No, typically nofollow links cannot hurt your site – so upfront, very quick answer on that point," said Matt Cutts, Google's Distinguished Engineer. "That said, let me just mention one weird corner case, which is if you are leaving comments on every blog in the world, even if those links might be nofollow, if you are doing it so much that people know you and they're really annoyed by you and people spam report about you, we might take some manual spam action, for example."

Fortunately, things like blog comments tend to have a very particular footprint that is easy to spot. Cutts mentioned a specific example where blog comments were problematic, even though they were already nofollowed, because the commenting was done on such a massive scale.

"So I remember for a long time on TechCrunch, any time that people showed up, there was this guy, Anon.TC, would show up and make some nonsensical comment. And it was clear that he was just trying to piggyback on the traffic and drive the traffic from people reading the article directly to whatever he was promoting. And so even if those links were nofollow, if we see enough mass-scale action that we consider deceptive or manipulative, we do reserve the right to take action," Cutts said.

"So we carve out a little bit of an exception if we see truly huge-scale abuse. But for the most part, nofollow links are dropped out of our link graph as we're crawling the web, and so those links that are nofollow should not affect you from an algorithmic point of view," Cutts said.

That said, Cutts indicated that in the future this potentially might not be the case, if some spam loophole is found.

"I always give myself just the smallest out in case we find somebody who's doing a really creative attack or mass abuse or something like that. But in general, no. As long as you're doing regular, direct-traffic building and you're not annoying the entire web or something like that, you should be in good shape."

Original Article Post by Jennifer Slegg @ Search Engine Watch

Matt Cutts Talks Google Panda Basics: Make Sure You've Got Quality Content


What should you do if you think your site might be affected by Google's Panda algorithm? And what types of content get impacted negatively by Panda? That is the topic of a recent video featuring Google's Distinguished Engineer Matt Cutts.
Cutts first gave a bit of a primer on how Panda rolled out previously and how it currently rolls out into the search algorithm.

"So Panda is a change that we rolled out, at this point a couple years ago, targeted towards lower quality content. And it used to be that roughly every month or so, we would have a new update. And we would say, OK, there's something new, there's a launch, we've got new data, let's refresh the data.

"And it had gotten to the point where with Panda, the changes were getting smaller, they were more incremental. We had pretty good signals. We'd pretty much gotten the low hanging wins. So there weren't a lot of really big changes going on with the latest Panda changes."

So as Google got better at finding low quality content, they adjusted how and when new Panda updates would impact the search results.

"And we said, let's go ahead, and rather than have it be a discrete data push – that is, something that happens every month or so, at its own time, when we refresh the data, let's just go ahead and integrated into indexing. So at this point, we think that Panda is affecting a small enough number of webmasters on the edge that we said, let's go ahead and integrate it into our main process for indexing."

But what if you did get hit with Panda? First off, it likely means that your content is either poor quality, or it's of the cut-and-paste variety that can be found on many free article sites.

You should also check Google Webmaster Tools to see if there are any alerts in your account that can help you determine whether it is Panda or something else that is negatively affecting your Google search rankings.

"And so, if you think you might be affected by Panda, the overriding kind of goal is to try to make sure that you've got high-quality content, the sort of content that people really enjoy, that's compelling, the sort of thing that they'll love to read that you might see in a magazine or in a book, and that people would refer back to, or send friends to, those sorts of things," Cutts said. "So that would be the overriding goal."

So what if you think it might be the quality of your content that is affecting your rankings? Panda was pretty tough on many types of content that Google deemed to be of poor quality.

"So if you are not ranking as highly as you were in the past, overall, it's always a good idea to think about, OK, can I look at the quality of the content on my site? Is there stuff that's derivative, or scraped, or duplicate, and just not as useful?"

Not surprisingly, Cutts said this is a type of content that doesn't rank well, and it's the quality content that will be higher up in the Google search rankings.


Original Article Post by Jennifer Slegg @ Search Engine Watch

SES San Francisco Keynote: Google's Matt Cutts & Patrick Thomas on Search Engine Censorship

[Photo: Mike Grehan, Patrick Thomas, and Matt Cutts. Image credit: Simon Heseltine]

Designing a search engine is no easy task.

Should a search engine filter content such as violent images, pornography, malware, spam, personally identifiable information, hate content, hacking instructions, bomb-making instructions, pro-anorexia sites, Satanism and witchcraft, necrophilia, content farms, and black hat SEO firms from its search results?

Welcome to the slippery slope Google and many other search engines must navigate when creating algorithms to crawl the web. With 60 trillion URLs on the web, a search engine needs to make many tough calls on content to ensure users get what they're looking for.

SES San Francisco kicked off today with a keynote featuring Google's Distinguished Engineer Matt Cutts and Patrick Thomas, a specialist on Google's User Policy team, who walked through several difficult decisions Google's search team has faced recently and through the years.

The keynote began with some news about backlinks in Google Webmaster Tools, which Search Engine Watch covered separately.

As Thomas explained, there is plenty of great content on the web. While some decisions are straightforward, every search engine must determine where to draw the line.

In Google's case, they have a cheat sheet of sorts, where they aim to ensure the results are as comprehensive as possible, removals are kept to a minimum, they rely on algorithms over manual action, they help users avoid identity theft, and they don't push any offensive content to users if they didn't specifically search for it.

"We want to be a little careful not to shock and offend you," Thomas said. "We try to balance what our users search for and what's on web, but not offend users."

The web is the sum of human knowledge and experience. You're going to come across a lot of controversial content, Thomas said.

Every website believes it should rank number one, so Google realized that it must minimize the manual influence, Cutts said. For instance, everyone will disagree on the definition of spam. After all, one man's spam is another's great marketing strategy, just as one man's censorship is another's law, and one man's censorship is someone else's corporate policy.

What exactly is a content farm? Different people have different lines.
In the case of removing pages from its index due to defamatory content, Google must deal with issues of he said, she said, or even this country said, that country said. When everyone is alleging everyone is doing something wrong, what do you do?

Discussion then turned to the worst of the worst: Explicitly sexual content, hate speech, and violent content.

One sticky issue is: what exactly is pornography? Is it a woman performing a breast self-exam? Or a painting of a nude woman? Or is it only content more "edgy" than this?

In the case of filtering porn, Cutts asked: if a 10-year-old cub scout were searching Google with his mom watching, would she be shocked by Google's results? Maybe a bikini wouldn't be the end of the world, but Google definitely doesn't want to show explicitly sexual content for terms such as "breast", as most searchers want results about breast cancer.

Then there are hate speech sites. If you remove a hate site from the search results, it's still on the web. And if you remove hate sites with an algorithm, what happens to the anti-hate sites, which share much of the same language as the hate sites?

Then there's violent content. They cited an example, which ultimately wasn't censored, of a picture from Syria featuring piles of dead bodies. Thomas said it was "extremely newsworthy, probably coming from primary sources on the ground. But you probably wouldn't see this on a nightly news broadcast."

Another controversial topic Google deals with involves autocomplete search suggestions. Cutts said the raw material for Google's suggestions is what people are typing in, combined with content on the web that supported such terms, such as in the case of a "Bernie Madoff (or Allen Stanford) Ponzi scheme" search suggestion.

Most recently, Google found itself internally debating censoring autocomplete suggestions when the Boston bombing happened earlier this year. "Used pressure cooker bomb" became a top suggestion.

Google as a search engine indexes the web, and sometimes influences the web, whether it's pushing the importance of fast page speed or quality content through algorithmic updates like Panda. There may be no easy answers or ever universal agreement on the best way to design a search engine, but as this keynote session highlighted, the choices are rarely black and white.

After the session, I interviewed Cutts about the new Webmaster Tools data and more.

Original Article Post by Danny Goodwin @ Search Engine Watch

Matt Cutts on Auto-Generated Content: Google Will Take Action


The latest webmaster help video from Google's Distinguished Engineer Matt Cutts deals with auto-generated content and whether manual action is taken against these types of sites (i.e., whether the site will receive a Google penalty).

Many years ago, these websites were often referred to as "Made for AdSense" or "MFA" sites. The only reason these MFA sites were created was in the hopes that people would land on the page, not find what they're looking for, and click an AdSense ad to leave the page rather than hitting the back button.

Most of these types of sites are automatically generated with a script that takes snippets of either search results or web pages with those keywords on it. There is no real content to them, just the auto-generated snippets.

One user who has seen these types of sites asked Cutts specifically if Google is taking any action against these sites.
What does Google do against sites that have a script that automatically picks up the search query and makes a page about it? Ex: you Google [risks of drinking caffeine], end up at a page: "we have no articles for DRINKING CAFFEINE" with lots of ads.

"We are absolutely willing to take action against those sites," Cutts said. "So we have rules in our guidelines about auto-generated pages that have very little value and I have put out in the past specific calls for sites where you search for product a VCR or laptop or whatever and you think you really get a review and the new land there and the very first thing you see is '0 Reviews Found for [blah blah blah]'."

From Google's perspective, if a searcher lands on one of these pages and doesn't find what they are looking for, that results in a bad user experience, which is why Google would take action against these zero value sites.

Cutts also said that webmasters of sites with search results should make sure those search result snippets aren't being indexed, unless there's something highly unusual about them, such as data no one else has. Even then, you still need to ensure that you aren't allowing Google to index search result pages that say "0 Results Found", because that isn't helpful to Google searchers.
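A straightforward way to keep internal search result pages, including empty "0 Results Found" pages, out of Google's index is a robots noindex meta tag on the search results template; a minimal sketch:

    <!-- In the <head> of the site's search results template -->
    <meta name="robots" content="noindex">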

Bottom line: if you're creating web pages of auto-generated text and snippets, you should be aware that Google can and will penalize your site. And if you're just a user who is finding these types of pages in the Google search results, you can send in a spam report.


Article Post @ Search Engine Watch

Matt Cutts: Google +1s Don't Lead to Higher Ranking

It isn't often that Google's Distinguished Engineer Matt Cutts comes right out to debunk a highly publicized blog post regarding something to do with ranking in Google.

Everybody has their own opinions of what works and what doesn't work, and SEO in itself can be highly subjective, primarily because Google doesn't really come out and specifically admit the things that work, because they don't want people gaming the system.

Moz published a blog post "Amazing Correlation Between Google +1s and Higher Search Rankings" claiming that Google +1s had a direct correlation with higher search rankings in Google – and that it was higher than any other ranking factor. The post was written by Cyrus Shepard, the "Senior Content Astronaut" at Moz, and the data was taken from their 2013 ranking factors.

It's a pretty sensational title, and it immediately sparked a lot of discussion. The post brought up several points in support of the correlation, such as the fact that posts shared on Google+ are crawled and indexed almost immediately, and that posts on the site pass "link equity". He also noted that authorship appeared to play a role in the rankings as well. However, he presented the correlation as fact rather than a possibility, without specific hard data as proof, such as particular sites where an increase in rankings could be attributed solely to Google +1s.

In addition to grabbing the attention of many in the SEO industry (many of whom trashed the post as being highly flawed), Cutts immediately stepped in to debunk the claimed correlation between rankings and +1s. Specifically, Cutts wrote:
Just trying to decide the politest way to debunk the idea that more Google +1s lead to higher Google web rankings. Let's start with correlation != causation: http://xkcd.com/552/
But it would probably be better to point to this 2011 post (also from SEOMoz/Moz) from two years ago in which a similar claim was made about Facebook shares: http://moz.com/blog/does-google-use-facebook-shares-to-influ... . From that blog post from two years ago: "One of the most interesting findings from our 2011 Ranking Factors analysis was the high correlation between Facebook shares and Google US search position."

This all came to a head at the SMX Advanced search conference in 2011 where Rand Fishkin presented his claims. I did a polite debunk of the idea that Google used Facebook shares in our web ranking at the conference, leading to this section in the 2011 blog post: "Rand pointed out that Google does have some access to Facebook data overall and set up a small-scale test to determine if Google would index content that was solely shared on Facebook. To date, that page has not been indexed, despite having quite a few shares (64 according to the OpenGraph)."

If you make compelling content, people will link to it, like it, share it on Facebook, +1 it, etc. But that doesn't mean that Google is using those signals in our ranking.
Rather than chasing +1s of content, your time is much better spent making great content.
His view falls in line with what a lot of SEO professionals already do for long-term SEO success: creating great quality content that is more likely to be shared is the best kind of content strategy.

He does continue to reiterate that +1s and rankings are not related. "Most of the initial discussion on this thread seemed to take from the blog post the idea that more Google +1s led to higher web ranking. I wanted to preemptively tackle that perception."

Cutts also mentioned that another SEO has been conducting a rigorous study on whether +1s lead to higher rankings, which he expects to be released in the next month or two. If the study provides specific examples, it may well be the most conclusive evidence SEOs have about whether +1s are a ranking factor, with concrete data to back it up.

Cutts made similar statements last year at SES San Francisco, when he said that Google doesn't put a lot of weight on +1's yet and advised people not to assume Google+ equates to rankings.

Below are a few reactions from Twitter. What's your take?
[Tweet reactions from Dave Naylor, Ben Cook, Chad Lio, and Paul Gailey]


Article Post @ http://searchenginewatch.com/article/2290337/Matt-Cutts-Google-1s-Dont-Lead-to-Higher-Ranking

Google's Pure Spam Manual Action: Matt Cutts on How to Fix it

Author's Note: This is the first in a weekly series detailing specific spam warnings webmasters might find in the Google Webmaster Tools manual action viewer, the types of things that trigger each warning, and what webmasters should do to get the warning removed.

Google has released a series of seven videos designed to help webmasters resolve specific types of spam issues that have been identified on their site. With Google Webmaster Tools offering more specific details about why a website might be penalized, these videos are designed to help you know exactly what kind of manual spam action your site has been impacted by, and the specific steps you can take to correct the issues in Google's eyes.

What is Pure Spam?

Google considers pure spam to be the kind of spam that anyone with a bit of tech savviness can tell is spam. Often called "black hat", Cutts said this includes things such as “auto-generated gibberish, cloaking, scraping, throwaway sites, or throwaway domains, where someone is more or less doing churn and burn, where they are creating as many sites as possible to make as much money as possible before they get caught.”

Cutts said this is the type of spam that Google takes the most action against. He added that it's rare for people to actually file reconsideration requests for sites classified as pure spam, because many webmasters treat these sites as churn and burn.

For example, here's an image of an auto-generated spam site that Cutts included in a blog post a number of years ago:
[Image: an auto-generated spam site, via Matt Cutts' blog]


Fixing Pure Spam on a New Website

Sometimes there are legitimate cases where a site owner has purchased a domain only to discover a huge amount of spam in the domain's history, making it difficult for the new owner to build something legitimate on that domain. People can look up a domain's history on archive.org and see what kind of spam issues had been happening, so it becomes a priority to ensure the new owner starts with a clean slate, with none of the spam content anywhere to be seen.

If this sort of thing has happened to you, take special care to ensure that the new site you're putting on the previously spamming domain is high quality and nothing that could be remotely confused with spam. You essentially need to take actions within the site that give Google signals that the site is now trustworthy and should be included in the index.

Fixing Pure Spam on an Existing Website

If your site has been flagged as being pure spam, this is probably one of the more difficult spam flags to overcome, because it is reserved for the spammiest of websites. That means, when you file your reconsideration request, you need to ensure that there is nothing anywhere on the site that could be remotely considered spam.

When you're trying to clean up, ensure that everything violating the Google webmaster guidelines has been removed, and that the quality guidelines are being followed to the letter. You should look at it from the perspective of building an entirely new site with new, quality content.

Cutts said it's important for webmasters trying to recover from a pure spam warning to document everything they do, whether that is having purchased the domain from a previous owner, discovering and then removing spam they didn't realize existed on the site, or simply not knowing better when they created what they thought was a fabulous auto-generated site.

When you finally file a reconsideration request, be sure that you include the steps you took to clean it up and when, so that Google can investigate and decide if the site has really turned over a new leaf.

Article Post @ Search Engine Watch

Matt Cutts Advises When to nofollow Widgets, Infographics

If you're creating infographics, or other pieces of embeddable code for things such as widgets, counters, and badges, how should you handle the links in them for best SEO practice, according to Google? That is the topic of the latest webmaster help video from Google's Distinguished Engineer Matt Cutts.

What should we do with embeddable codes in things like widgets and infographics? Should we include the rel="nofollow" by default? Alert the user that the code includes a link and give them the option of not including it?

This technique got very popular a few years ago, when people were making embeddable items such as badges and counters that included a direct link back to a site, often with very keyword-rich anchor text. However, as you can imagine with any effective link technique, it soon became pretty spammy, with many of those embeds linking to poker sites or online pharmacies. As Google started discounting some of those links, the practice fell out of favor.

However, with the resurgence of infographics in the past year or two, embeddable code offered to webmasters to put on their sites has become very popular once again. But when it comes to the link back to your site, how should you handle it so Google doesn't penalize the link? Should you go for keyword-rich anchor text? Or simply nofollow it and use it strictly for the traffic a direct click would bring?

"I would not rely on widgets and infographics as your primary source to gather links, and I would recommend putting a nofollow, especially with widgets," Cutts said

He brought up the fact that a lot of the problem stems from people using things like counters and widgets without realizing there was a link back to some site embedded within them. Not all of these types of embeddable code made it obvious that you might be linking to a poker site or pharmacy site, as many of the links were hidden quite well, so there was not much editorial choice at all by the webmasters using these things on their sites.

"Depending on the scale of stuff you're doing with infographics you might consider putting a rel=no follow on infographic links as well. The value of those things might be branding, they might be to drive traffic, they might be to sort of let people to know that your site or your service exists but I wouldn't expect a link from a widget to necessarily carry the same weight as an editorial link freely given where someone is recommending something and talking about it in a blog post."



Article Post @ Search Engine Watch

Matt Cutts on How to Deal with Harmful Backlinks: Just Do a Disavow

In the post-Penguin world, webmasters have become hyper aware of the dangers of low-quality links pointing to their website. Many fear such links, such as those from a porn website, could hurt their Google traffic and rankings – even if they had nothing to do with obtaining them.

So if webmasters follow the proper steps on bad backlinks, can those links – especially if they were generated by a competitor – still hurt your website? That is the question in the latest Google Webmaster Help video:
Recently I found two porn websites linking to my site. I disavowed those links and wrote to the admins asking them to remove them, but... what can I do if someone (my competition) is trying to harm me with bad backlinks?

Google's Distinguished Engineer Matt Cutts said that is precisely what webmasters should do in this situation: try to contact the website to get the links removed, and if that doesn't work, then submit the links for Google to disavow.

Cutts said that if the links are on sites you don't want to be associated with whatsoever, disavow at the domain level (e.g., example.com) rather than the exact page the link appears on (e.g., example.com/directory/directory/spamlinkpage.html).
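For reference, the file uploaded to Google's disavow links tool is plain text with one entry per line: lines starting with # are comments, a domain: entry covers every page on that site, and a bare URL disavows a single page. The domains below are placeholders:

    # Porn sites whose admins did not respond to removal requests
    domain:example.com
    domain:example.org
    # A single page can also be disavowed by URL
    http://example.net/some-directory/spam-page.html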

Once you have completed these steps, Cutts said you should no longer have to worry about those links harming your website.


Article Post @ Search Engine Watch

Matt Cutts on Google's International Spam-Fighting Efforts

If you've ever been curious about just how much webspam fighting goes on for non-U.S. markets, that is the topic of the latest webmaster help video from Google's Distinguished Engineer Matt Cutts.
Is the webspam team taking the same measures to counter spam in international markets like India like they do in the US market? It just seems like there are a lot of junk sites that come up in the first page of results when searching on google.co.in.

"The web spam team has both engineers who work on algorithmic spam and we have the manual web spam team and both of those work on spam around the world," Cutts said. "So Google.co.in in India, we want the algorithms whether they be link spam or keyword stuffing or whatever work in every language is much as we can, and so we do try to make sure to a degree that it is possible for us to do it, we internationalize those algorithms.

Cutts said that Google engineers fight spam in 40 different languages around the world. Not only that, they also have engineers located throughout the world, including in Hyderabad, who work not only on Google.com but also on the international versions of Google, such as Google.co.in, to fight spam.

He said that English search spam in the U.S. on Google.com tends to get more attention, since not every engineer can speak specific languages. But he followed that up by saying that if you see problematic spam in any version of Google, you should submit a spam report so it can be looked into, post in the Google webmaster help forums, or even send a tweet about it.


Article Post @ Search Engine Watch

Matt Cutts: Be Careful About How You Choose Your ccTLDs

With great .com names becoming more and more scarce, and with the resale value of fabulous .com names running to thousands or even millions of dollars, many businesses have resorted to registering that great name, but in a country-specific TLD (ccTLD), even if it's a country they don't do business in.

But in a world where Google is geo-targeting search results, could using a country specific TLD hurt your search rankings in Google, or can Google figure it out for themselves that your keyword.tv domain isn’t actually a business located in Tuvalu?

Because this is an issue that more webmasters have been dealing with, the latest Google webmaster help video is all about dealing with ccTLDs and how it could affect rankings.

As memorable .COM domains become more expensive, more developers are choosing alternate new domains like .IO and .IM - which Google geotargets to small areas. Do you discourage this activity?

Google does consider many ccTLDs to be specific to their country when it comes to search results, so Cutts warned webmasters to be careful about how they choose their TLDs. In particular, be careful when you are using a ccTLD with the intent for it to mean something different from how it is generally used.

For example, Google's Distinguished Engineer Matt Cutts talked about a business wanting to use the .li ccTLD, where the .li would stand for Long Island. However, .li is the ccTLD for Liechtenstein, and that ccTLD is overwhelmingly used by sites about or targeting Liechtenstein. Using .li to stand for Long Island would be trying to change the intent of that ccTLD; it would not be in the best interest of searchers and would likely not produce the best search results for a site that wants to target Long Island.

"There are a few ccTLDs that are sort of generic," Cutts said. "Because for example.io stands for something related to the Indian Ocean but there were very few domains that were actually relevant to that and a lot of startups were using that and it was something that was really much more applicable to the entire world.

"And so we do periodically review that list and if we see something that’s primarily used worldwide and is not really all that specific to that country that we might go ahead and say okay this is a generic ccTLD so go ahead and even if you have a .io domain, don’t target it just a the Indian Ocean, anyone worldwide could potentially see that in their search results or are more likely to."

He does caution webmasters not to assume that all ccTLDs will be considered generic, or will be considered generic in the future. So be careful about the ccTLDs you might want to repurpose for your own use.

"But I wouldn't get too far ahead of it, because if you jump on to a certain, for example, there's .KY, and if you say, oh, I'm going to make that all about Kentucky, well, that might work for you, but it might not work for you. And so it's the sort of thing where if you assume that you are going to be able to take things away from the Cayman Islands and turn it into Kentucky, well, if the Cayman Islands is already using .KY, then I wouldn't assume that you'll be able to necessarily apply it in this general or generic sort of way."

Finally, earlier this year Google released an updated list of the ccTLDs, rTLDs, and gccTLDs that it now treats as generic top-level domains when it comes to search results. And yes, .tv is on that list, along with other popular ccTLDs such as .me, .bz, and .ws, as well as .io from Matt's example.


Article Post @ Search Engine Watch
 