Ever since Google launched the Penguin update back in April 2012, the SEO community has debated the impact of negative SEO, a practice whereby competitors point hundreds or thousands of low-quality backlinks at a site with the intention of harming its organic search rankings or even removing it from Google's index entirely. Just jump over to Fiverr and you can find gigs offering thousands of wiki links, directory links, or other low-quality links for $5.
By creating the Disavow Links tool, Google acknowledged this very real danger and gave webmasters a tool to protect their sites. Unfortunately, most people wait until it's too late to use the Disavow tool; they look at their backlink profile and disavow links after they've been penalized by Google. In reality, the Disavow Links tool should be used before your website suffers in the SERPs.
Backlink audits have to be added to every SEO professional's repertoire. They are as integral to SEO as keyword research, on-page optimization, and link building. Just as site owners build links to earn organic rankings, webmasters now also have to monitor their backlink profiles, identify low-quality links as they appear, and disavow them as soon as they are identified.
Backlink audits are simple: download your backlinks from Google Webmaster Tools or from a backlink tool, and keep an eye on the links pointing to your site. What is the quality of those links? Do any of them look fishy?
As soon as you identify fishy links, you can then try to remove the links by emailing the webmaster. If that doesn't work, head to Google's disavow tool and disavow those links. For people looking to protect their sites from algorithmic updates or penalties, backlink audits are now a webmaster's best friend.
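If you want to go beyond eyeballing, the first pass is easy to script. Here's a minimal triage sketch, assuming you've exported your backlinks to a CSV; the column names ("source_url", "anchor_text") and the spam keywords are placeholders to adapt to whatever your tool actually exports:

```python
# A minimal triage sketch: scan an exported backlink CSV for fishy patterns.
# The column names and the keyword list are hypothetical; adapt them to
# your own export from Webmaster Tools or your backlink tool.
import csv
from urllib.parse import urlparse

SUSPECT_HINTS = ("casino", "viagra", "payday", "pills")  # illustrative only

def flag_fishy_links(path: str) -> list[dict]:
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row["source_url"]).netloc.lower()
            anchor = row["anchor_text"].lower()
            # Flag links whose anchor text or linking domain looks spammy.
            if any(hint in anchor or hint in domain for hint in SUSPECT_HINTS):
                flagged.append(row)
    return flagged

for link in flag_fishy_links("backlinks.csv"):
    print(link["source_url"], "->", link["anchor_text"])
```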
If your website has suffered from lost rankings and search traffic, here's a method to determine whether negative SEO is to blame.
A Victim of Negative SEO?
A few weeks ago I received an email from a webmaster whose Google organic traffic dropped by almost 50 percent within days of Penguin 2.0. He couldn't understand why, given that he'd never engaged in SEO practices or link building. What could've caused such a massive decrease in traffic and rankings?
The site is a 15-year-old finance magazine with thousands of news stories and analysis, evergreen articles, and nothing but organic links. For over a decade it has ranked quite highly for very generic informational financial keywords – everything from information about the economies of different countries, to very detailed specifics about large corporations.
With a long tail of over 70,000 keywords, it's a site that truly adds value to the search engine results and has always used content to attract links and high search engine rankings.
The site received no notifications from Google. The owner simply saw a massive decrease in organic traffic starting May 22, which leads me to believe the site was impacted by Penguin 2.0.
In short, he did exactly what Google preaches as safe SEO. Great content, great user experience, no manipulative link practices, and nothing but value.
So what happened to this site? Why did it lose 50 percent of its organic traffic from Google?
Backlink Audit
I started by running a LinkDetox report to analyze the backlinks. Immediately I knew something was wrong:
Upon further investigation, 55 percent of his links were suspicious, while 7 percent (almost 500) of the links were toxic:
So the first step was to research that 7 percent: how the toxic links were acquired and what types of links they were.
In LinkDetox, you can segment by Link Type, so I was able to first view only the links that were considered toxic. According to Link Detox, toxic links are links from domains that aren't indexed in Google, as well as links from domains whose theme is listed as malware, malicious, or having a virus.
Immediately I noticed that he had many links from sites that ended in .pl. The anchor text of the links was the title of the page that they linked to.
It seemed the sites targeted "credit cards", which is only loosely within this site's niche. It was easy to see that this was scraped content, spun and dropped on spam URLs with the links intact. I also saw many domains that had expired and been re-registered for the purpose of creating content sites for link farms.
Here's an example of what I saw, repeated over and over again:
From this I knew that most of the toxic links were spam, links the target site never generated. I also saw many links pointing to other authority sites, including entrepreneur.com and venturebeat.com. It seems this site had been classified as an "authority site" and was being used by spammers to pad their outbound link profiles with authority links.
Did Penguin Cause the Massive Traffic Loss?
I further investigated the backlink profile, checking for other red flags.
His Money vs Brand ratio looked perfectly healthy:
His ratio of "Follow" links was a little high, but this was to be expected given the source of his negative backlinks:
Again, he had a slightly elevated number of text links as compared to competitors, which was another minor red flag:
One finding that was quite significant was his Deep Link Ratio, which was much too high when compared with others in his industry:
In terms of authority, his link distribution by SEMrush keyword rankings was average when compared to competitors:
Surprisingly, his backlinks had better TitleRank than competitors, meaning that the target site's backlinks ranked for their exact match title in Google – an indication of trust:
Penalized sites don't rank for their exact match title.
The final area of analysis was the PageRank distribution of the backlinks:
Even though he has a great number of high-quality links, the percentage of links that aren't indexed in Google is substantial: close to 65 percent of the site's backlinks aren't indexed in Google.
In most cases, this indicates poor link building strategies, and is a typical profile for sites that employ spam link building tactics.
In this case, the high quantity of links from pages that are penalized, or not indexed in Google, was the result of automated link building by spammers!
As a result of having a prominent site that was considered by spammers to be an authority in the finance field, this site suffered a massive decrease in traffic from Google.
Avoid Penguin & Unnatural Link Penalties
A backlink audit could've prevented this site from being penalized by Google and losing close to 50 percent of its traffic. If a backlink audit had been conducted, the site owner could've disavowed these spam links, performed outreach to get them removed, and documented his efforts in case of future problems.
If the toxic links had been disavowed, all of the ratios would've been normalized and this site would've never been pegged as spam and penalized by Penguin.
Backlink Audits
Whatever tool you use, whether it's Ahrefs, LinkDetox, or OpenSiteExplorer, it's important that you pull and evaluate your links on a monthly basis. Once you have the links, make sure you have metrics for each one so you can evaluate its health.
Here's what to do (a short script sketch for turning the results into a disavow file follows the list):
- Identify all the backlinks from sites that aren't indexed in Google. If they aren't indexed in Google, there's a good chance they are penalized. Take a manual look at a few to make sure nothing else is going on (e.g., perhaps they just moved to a new domain, or there's an error in reporting). Add all the N/A sites to your file.
- Look for backlinks from link or article directories. These are fairly easy to identify. LinkDetox will categorize those automatically and allow you to filter them out. Scan each of these to make sure you don't throw out the baby with the bathwater, as perhaps a few of these might be healthy.
- Identify links from sites that may be virus infected or have malware. These are identified as Toxic 2 in LinkDetox.
- Look for paid links. Google has long been at war with link buying and it's an obvious target. Find any links that have been paid for and add them to the list. You can find these by sorting the results by PageRank, descending. Evaluate all the high-PR links, as those are the most likely to have been purchased. Look at each and every one of the high quality links to assess how it was acquired. It's almost always obvious whether a link was organic or purchased.
- Take the list of backlinks and run it through the Juice Tool to scan for other red flags. One of my favorite metrics to evaluate is TitleRank. Generally, pages that aren't ranking for their exact match title have a good chance of carrying a functional penalty or lacking authority. In the Juice report, you can see the exact title to determine whether it's a valid title (for example, if the title is "Home", of course they won't rank for it, whether or not they have a penalty). If the TitleRank is 30+, review that link with a quick check, and if the site looks spammy, add it to your "Bad Links" file. Do a quick scan of other factors, such as PageRank and DomainAuthority, to see if anything else seems out of place.
By the end of this stage, you'll have a spreadsheet with the most harmful backlinks to a site.
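From there, turning that spreadsheet into a disavow file is mechanical. Here's a minimal sketch, assuming a hypothetical bad_links.csv with a source_url column; the output follows Google's documented disavow format (plain text, one domain: entry or URL per line, with # comments):

```python
# Convert a "bad links" spreadsheet into a Google disavow file.
# "bad_links.csv" and its "source_url" column are hypothetical names.
import csv
from urllib.parse import urlparse

def build_disavow(bad_links_csv: str, out_path: str = "disavow.txt") -> None:
    domains = set()
    with open(bad_links_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domains.add(urlparse(row["source_url"]).netloc)
    with open(out_path, "w", encoding="utf-8") as out:
        out.write("# Spam domains identified in the monthly backlink audit\n")
        for domain in sorted(domains):
            out.write(f"domain:{domain}\n")  # disavows the entire domain

build_disavow("bad_links.csv")
```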
Upload this disavow file to make sure the worst of your backlinks aren't harming your site. Then be sure to upload the same file when performing further tests in Link Detox, as excluding these domains will affect your ratios.
Don't be a Victim of Negative SEO!
Negative SEO works; it's a very real threat to all webmasters. Why spend the time, money, and resources building high quality links and content assets when you can work your way to the top by penalizing your competitors?
There are many unethical people out there; don't let them cause you to lose your site's visibility. Add backlink audits and link profile protection as part of your monthly SEO tasks to keep your site's traffic safe. It's no longer optional.
To Be Continued...
At this point, we're still working on link removals, so there is nothing conclusive to report yet on a recovery. However, once the process is complete, I plan to write a follow-up post here on SEW to share additional learnings and insights from this case.
Article posted at Search Engine Watch
Some of the most amazing minds in the world come together annually in Las Vegas, Nevada, to share bits of the somewhat incredible, totally awesome and you-won't-sleep-tonight world of online and offline security.
Yes, it's time to once again take a dive into the off-grid world of Black Hat and DEF CON, the information security and hacking conferences.
Though there is a lot of crossover, Black Hat has become more the domain of security professionals, researchers, and academics, whereas DEF CON is an anything goes world that borders on renaissance festival meets hacker fest.
So what are the newest, biggest, baddest potential dangers facing your online presence today?
The Attacks
This post will cover how you might wind up the victim of a Google malware warning, selling Viagra off your .gov, or allowing access to your internal network via your printer.
And you thought WordPress was your biggest worry! Don't fret too much – for most of you, it still is.
Still, the last thing you want to be doing this weekend is sending emails to customers explaining a security breach, exposing their private data, or watching site traffic dive and SERP positions plummet because of a Google malware warning. Knowledge is power.
Let's start with a simple but powerful vulnerability.
1. Ad Holes!
Ever get one of those nasty Google red malware warnings when navigating to your site? Or found out users were getting malicious downloads, but you weren't sure how it happened?
Aside from the typical brute force attack on your passwords, XSS, or a failure to keep up with WordPress security updates, what else might be causing those nasty headaches for you and your users?
Advertising networks.
Why?
Your advertising networks are likely among the biggest security holes on your website.
Most advertising networks, even the biggest, allow either inline JavaScript or external JavaScript source files to be pulled into the ads on your site. That's a big security hole.
What to Do
Run a little test. See if you can pull a simple "hello world" JavaScript file into an ad on your site, or run the JavaScript in the ad itself. Can you? Guess what: you have a huge security hole in your website.
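Here's one way to stage that test, as a minimal sketch: host a harmless probe script on a server you control and point the ad creative at its URL (the port and message below are arbitrary, and in practice the server needs to be publicly reachable). If your log shows the fetch and the message appears in the browser console, the network passed arbitrary JavaScript straight through.

```python
# Serve a harmless "hello world" probe script; reference it from an ad
# creative, e.g. <script src="http://your-test-host:8000/hello.js">.
import http.server

PROBE_JS = b"console.log('hello world: this ad slot executed third-party JS');"

class ProbeHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Any request hitting this handler proves the ad network fetched our script.
        print(f"probe fetched by {self.client_address[0]} for {self.path}")
        self.send_response(200)
        self.send_header("Content-Type", "application/javascript")
        self.end_headers()
        self.wfile.write(PROBE_JS)

if __name__ == "__main__":
    http.server.HTTPServer(("", 8000), ProbeHandler).serve_forever()
```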
Then ask yourself, unless you have big pull at these networks and can get them to fix this huge, obvious issue: are you making enough money off these ads to make them worth your site's security? If not, remove them; find a network that blocks these scripts, or make sure your security protocols watch for unusual behavior that would alert you to this activity before you get the red kiss of death. And as a user, this is why you use ad blockers.
2. Bypassing Malware Scanners
This one was a bit of brilliance you can't help but respect. This little exploit uses your site to deliver payloads that bypass all malware scanners, which sounds good in that you never get the malware warning.
The bad thing for you is that these payloads give your users nasty viruses. They can also be used to deliver code into your site that could knock it out of the rankings or reach into your internal networks, not to mention conscript your site into a vast bot network that attacks other sites.
Not to get too technical: the RDI ("Reflected DOM Injection") in this case uses a purpose-built website to host encrypted content, plus a web utility provided by a well-known site, such as Yahoo Cache or, in this case, Google Translate.
Why?
When you visit the page, unbeknownst to you, you're also visiting one of these services in the background; that service is used to unlock an encrypted file, which downloads the malicious attack.
Now you may be thinking: OK, but that's on their site. Couldn't the ad networks have pulled that same code into your site while evading the entire malware scan? That code could then deliver a payload that infects your users, or turns your site into a zombie. See where this is going?
That code could be anything, and nothing, no scanner, would pick it up. It looks benign until a user's actions activate it.
What to Do?
In this case you need to make sure your site is locked down against injectable scripting (cross-site scripting, or XSS), payload downloads, and password cracking. There is no failsafe method, but if you make this hard, there's a very good chance the attacker will move on to someone easier.
Unless your site is just a really, really great target. In that case you should have a security team that works on defeating these attacks already.
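Whatever the size of the target, the heart of the XSS lockdown is the same rule: never reflect raw user input into a page. A minimal sketch of that one defense, using Python's standard html module:

```python
# Never reflect raw user input: escape it first.
import html

def render_comment(user_input: str) -> str:
    # html.escape encodes <, >, &, and quotes, neutralizing injected markup.
    return f"<p>{html.escape(user_input, quote=True)}</p>"

print(render_comment('<script>alert("owned")</script>'))
# -> <p>&lt;script&gt;alert(&quot;owned&quot;)&lt;/script&gt;</p>
```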
3. HTTPS Cracking in 30 Seconds or Less
The details of how https and SSL are compromised can be found in this Information Week article, which noted that "all versions of the transport layer security (TLS) and secure sockets layer (SSL) protocols are vulnerable to the attack, but not every HTTPS-using site is necessarily at risk."
Basically, BREACH (short for the very long Browser Reconnaissance and Exfiltration via Adaptive Compression of Hypertext, an extension of the CRIME SSL/TLS attack) can extract secure data from the HTTPS session, such as login information, email addresses, and security credentials. Larger data, such as an entire email, cannot be extracted this way.
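To see why compression leaks secrets, here's a toy sketch of the side channel behind CRIME and BREACH (the token value and field names are invented): when attacker-controlled text matches a secret elsewhere in the same compressed response, the response compresses better, and the shorter length betrays the guess.

```python
# Toy compression oracle: a correct guess shares bytes with the secret,
# so the combined body compresses to fewer bytes than a wrong guess.
import zlib

SECRET = b"csrf_token=9f3a77c2"  # secret reflected in every response body

def compressed_len(attacker_controlled: bytes) -> int:
    body = b"q=" + attacker_controlled + b"&page_body=" + SECRET
    return len(zlib.compress(body))

print(compressed_len(b"csrf_token=9f3a77c2"))  # correct guess: shortest output
print(compressed_len(b"csrf_token=XXXXXXXX"))  # wrong guess: longer output
```

Real attacks refine this one character at a time against observed HTTPS response sizes.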
All of this may be unimportant to you if you run a free music download service, but it's crucial if you handle financial transactions.
Why?
Pretty self-apparent: to get at important, otherwise secure data.
What to Do?
There are certain site conditions that will make you much more susceptible to this type of attack:
- If your applications or site pages use the "HTTP Response Body" (if you don't know, no worries, your programmers will).
- In addition to "HTTP Response Body" you also (POST) string query parameters to reflect user data.
- Most important, you must have data that an attacker would want in the first place.
These conditions are why all HTTPS/SSL are vulnerable, but not all sites need to be concerned. If you would like to check your site, you can use BreachAttack.com.
Note: If you think TLS/SPDY compression (Google, for example, uses SPDY) is any better, it isn't; the previously mentioned attack, CRIME, was able to perform a similar compromise.
The severity of this is such that the Department of Homeland Security has sent out this missive. There is no cure. Your best bet, if you handle critical data, is to get a good security professional to review your site and suggest best practices for site monitoring and risk management.
4. Mass Scanners. Mass Injectors.
Let's call it a version of Moore's law for "hacking". Not quite the same, but the basics are there.
As the power of computers grows exponentially, and as the cloud and cluster systems such as Hadoop come online for easy public use, the power of a single attacker becomes that of 1,000 or even 10,000 bot machines or humans.
Wonder why anyone would ever find your site interesting? How did you wind up with that malware? Why you?
Well, why not you?
There has always been the ability to scan for vulnerabilities, but now, with the power of Hadoop, one researcher, HyperionGrey, brought the cluster scan to DEF CON.
With this tool, you can enter the type of vulnerability you're seeking, hit enter, and get back (in this case) 300,000 websites with that vulnerability in less than 10 seconds.
So? What is the power of a search query without the power to act on it? Well, with today's computing power, the ability to act is here too. Want to scan for the ability to inject your code into vulnerable SQL databases, not on one site but on several hundred thousand, fast? There you go.
HyperionGrey, who developed this scan, is very clear in telling people not to use it for evil. However, you know it will be, and even if this tool goes away, there will be others.
Either way, the power of the massive scan and the massive (insert bad thing here) injector is here, and it will only grow as computing becomes more powerful and the existing knowledge base becomes more plentiful, all at an exponential rate.
One Method – The XSS
Here is a quick view of the current state of web application exploit vulnerabilities. For cross-site scripting, it is believed that up to 82 percent of all site applications are vulnerable (if you connect to a database with a form, it qualifies).
Why?
For instance, XSS (Cross Site Scripting) is one of the most common security vulnerabilities on a website.
XSS allows attackers to do such fun activities as deface websites, sell Viagra on your .gov URL, or inject information into a website, and now it can be done to many sites at once by finding those sites quickly and easily. (This applies to SQL injection of databases, too.)
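For illustration, here's the vulnerable pattern in miniature (a hypothetical search handler, not any particular product): the page echoes the query string verbatim, so a crafted link runs script in every visitor's browser.

```python
# Hypothetical reflected-XSS bug: user input lands in HTML unescaped.
from urllib.parse import parse_qs

def search_page(query_string: str) -> str:
    q = parse_qs(query_string).get("q", [""])[0]
    return f"<h1>Results for {q}</h1>"  # BUG: no escaping

# A link like ?q=<script>steal(document.cookie)</script> now executes:
print(search_page("q=<script>steal(document.cookie)</script>"))
```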
For some, it's just fun to say "HEY SEE ME, I JUST TOOK OVER YOUR WEBSITE". For others, it's about taking you out of the SERPS, gathering important data, or just selling their product off your awesome site. And for others, it's about very serious things, like getting your users' passwords and information.
What to Do?
If you're an SEO professional, you need to know there are many "negative SEO" or "competitive leveling" techniques that can be performed with this exploit, but I'm not here to tell you how. All you need to know is that your website needs to be protected, how easy it is these days to get infected, and where to start in protecting yourself. For everyone else, meh, it's the same.
If you're looking for your site's most vulnerable spots, take a primer from Black Hat presenters Ryan Barnett of Trustwave and Greg Wroblewski of Microsoft and check these areas of your sites for potential attack surfaces:
5. Exploiting Offline to Get to Your Online
Do you know one of the most vulnerable routes into your corporate network, protected data, or secrets? Look over to one of your most benign and forgotten of devices, one that sits there alone next to your desk, or perhaps down the hall.
Yes, your most vulnerable access point may indeed be your printer!
Last year's Black Hat and DEF CON revealed how easy it was to retrieve all your corporate info off a printer. (Because they are hardly ever secure, usually connected to a port, and store everything scanned to them.)
This year they took it one step further: 97 percent of the printers mentioned haven't had their firmware updated. So here's how it works (a small probe sketch follows these steps):
- Send a print job containing firmware with a special command-and-control component and a tunnel connector at the ready.
- Update the printer with your firmware via that print job.
- Then, in this case, a command-and-control channel was opened through the port connection, and the tunnel went searching for connected devices to carry out the intended attack (in this case a denial of service, but once inside your network it could just as well do damage, grab data, or plant spyware).
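To see how little the first step takes, here's a minimal probe sketch: many network printers accept raw, unauthenticated PJL commands on TCP port 9100, the same channel a malicious firmware "print job" rides in on. The printer address below is a placeholder.

```python
# Ask a printer to identify itself over its raw print port (TCP 9100).
# No credentials required on most devices; that's the whole problem.
import socket

def pjl_query(host: str, port: int = 9100) -> str:
    with socket.create_connection((host, port), timeout=5) as s:
        # Universal Exit Language wrapper around a PJL identity request.
        s.sendall(b"\x1b%-12345X@PJL INFO ID\r\n\x1b%-12345X")
        return s.recv(4096).decode(errors="replace")

# print(pjl_query("192.0.2.10"))  # hypothetical printer on your network
```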
How easy was it? Well, if you knew what you were doing, not that hard, and once you were inside, you didn't have to be an expert.
How hard is it to find a printer? Well, I did a five-second search on Shodan, a search engine that looks for ports and devices with "opportunities."
Here is one of the first four results returned for just the word "printers". Hmm, seems like some good information to start with, doesn't it?
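If you'd rather script the lookup, Shodan also has an official Python client (pip install shodan). A minimal sketch, assuming you substitute your own API key:

```python
# Search Shodan for exposed printers, mirroring the web query above.
import shodan

api = shodan.Shodan("YOUR_API_KEY")  # placeholder key
results = api.search("printer")
print(f"{results['total']} exposed devices matched")
for match in results["matches"][:5]:
    # Each match includes the device's IP, port, and service banner.
    print(match["ip_str"], match["port"], match["data"][:60])
```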
What to Do?
Update your firmware and secure everything that is attached to your external network. This is vital. The weakest link is a way in.
In this case, espionage wasn't even the goal. They just used the printer to create a denial of service attack on another website, and it took just one printer. Imagine if they had two.
Being Security Conscious
Scared yet? If so, that really isn't my intention. However, there is a lot of insecurity in that ethereal land we call cyberspace.
Is security on your radar? Have you given thought to it and, if so, are you keeping up with your site, servers, and printer's latest updates, patches, and security warnings? Are you security conscious?
Do you have someone available, whether on staff or just a consultant, depending on your needs, who can help you if you get into an emergency security situation?
For instance, if your site goes down because of a denial of service attack, how much will that cost you? How long will it take you to get rid of that Google malware warning? How many negative links will that hidden Viagra store generate to your once top ranked site? What will your customers do if you have to send out that most dreaded of emails, announcing that you've had a security breach?
Website Security Isn't Just for the Big Guys Anymore
Site security used to be the stuff of big companies and governments, but with so many WordPress users attacked almost daily (and often en masse), and with "hacking's" ability to take your site out of your most lucrative SEO positions, can you afford not to keep site security in your direct line of sight? It is just as much a part of your SEO as your need for the proper keyword.
Ask someone who lost time, money, and their mind trying to recover from an attack; it ain't pretty. And this article is only the tip of a very large iceberg.
Think of this as the sample platter of website vulnerabilities that exist right now. How secure is your site?
Article posted at Search Engine Watch