
Halloween Inspired SEO Tricks to Keep Spiders at Bay

A Jack o'lantern

Over the years, I’ve made a habit of prodigiously extolling search engine best practices as opposed to taking shortcuts designed to trick search engines into believing that an online destination is something it is not. This Halloween, I’ve decided, by way of parody, to write an essay antithetical to my digital morals and beliefs and embrace the dark side of search engine spoofs.

Fear of spiders?

Not a problem. There are many ways to keep unwanted arachnids from crawling through your content.

For starters, why not avoid using visible text on your website? Embed as much content as possible in images and pictures. Better yet, make your site into one big splash page that appears to scroll to infinity and beyond.

Also, make certain that the imagery loads as slowly as possible. Consider yourself lucky that you will be able to streamline your web metrics around paid search campaigns and not worry about those pesky organic referral terms [not provided] by Google anymore. Keeping spiders out of your content is your first step toward complete search engine invisibility.

If your site is inherently text heavy, consider using dropdown and/or pop-up boxes for navigation. Configure these with Flash or have them require JavaScript actions to function. If possible, put the rest of your web content exclusively in frames. Designing a website in such a manner is another great way to keep all those bad robots out of your site.

When it comes to URL structures, try to include as many ampersands, plus signs, equal signs, percentage signs, session IDs, and other dynamic parameters as possible in a multifaceted, splendidly deep file structure. That way, your website will be made up of really long URL strings that can confuse humans and spiders alike. Even better, add filters and internal site search functionality, metrics tags, and other superfluous attributes to your URLs, just to keep the search engines guessing about your site structure. Get ready to burn your site’s crawl equity to the ground, while watching your bandwidth spend soar, when you wrap your site up like a mummy with this navigational scheme.

If you really want to turn your website into a graveyard for search engine spiders, consider using completely unnecessary redirects on as many different URLs as possible, taking multiple hops along the way. Combine permanent and temporary redirects with soft 404 errors that can keep your content alive in search indices forever. Make certain to build in canonical tag conflicts, XML sitemap errors, perpetual calendars and such, reveling in the knowledge that you will never have to waste precious development time fixing broken links again!

Content creation budget got you down? Build in new economic efficiencies by using the exact same content across as many domains as your budget can spawn. Invest in machine-generated content instead of having to listen to those troublesome user reviews. Make “Spamglish” the official language of your website. Since you don’t have to worry about looking at what keywords Google allows to send traffic to your Frankensite, feel free to target irrelevant keywords on as many pages as possible.

Additionally, try to keep all the title tags exactly the same on the critically important pages within your site. Spiders don’t have good eyesight – sometimes you have to shout to get their attention. Consider keyword stuffing as a way to make certain that the search engines understand precisely what your site is all about. If you don’t have room to stuff unnecessary contextual redundancies into your web content, consider using hidden text that flickers like a ghost when users mouse over what looks like dead space.

Still not convinced you can hide your site from the search engines this Halloween?

Break out the big tricks, my friends, because we’ve got some link building treats to share.

If your ultimate goal is to bury a domain name for all eternity, make certain that you participate in as many link farming free-for-all sites as possible. When you get a chance to do so, go ahead and “splog” others’ guest books and forums. In addition to buying site-wide text links, demand that your backlinks be placed in the footers. While you’re at it, sell a similar “service” to others.

Consider hiding some links in places that surprise visitors and always embrace bad linking neighborhoods. You know the type of sites I’m talking about… the spooky ones that even a non-paranormal business person would avoid.

Have a wonderful Halloween this year, secure in the knowledge that you too can make a website completely disappear!


Disclaimer: I don't actually recommend that you try any of the above; everything in this particular column should be taken with a serious dose of tongue-in-cheek.


Original Article Post by P.J. Fusco @ Search Engine Watch

The Value of Referrer Data in Link Building

referrer-links

Before we get into this article, let me first state: link building is not dead.  There are a lot of opinions floating around the web on both sides; this is just mine.  Google has shut down link networks and Matt Cutts continues to make videos on what types of guest blogging are OK.  If links were dead, would Google really put in this effort?  Would anyone get an “unnatural links” warning?

The fact is, links matter.  The death is in links that are easy to manipulate.  Some may say link building is dead but what they mean is, “The easy links that I know how to build are dead.” 

What does this mean for those of us who still want high rankings and know we need links to get them?  Simply, buckle up, because you have to take off your gaming hat and put on your marketing cap.  You have to understand people and you have to know how to work with them, either directly or indirectly.

I could write a book on what this means for link building as a whole, but this isn't a book, so I'll try to keep focused.  In this article, we're going to focus on one kind of link building and one source of high quality link information that typically goes unnoticed: referrer data.

I should make one note before we launch in: I'm going to use the term “referrer data” loosely in order to provide additional value.  We'll get into that shortly, but first, let's see how referrer data helps and how to use it.

The Value Of Referrer Data

Those of you who have ignored your analytics can stop reading now and start over with “A Guide To Getting Started With Analytics.”  Bookmark this article and maybe come back to it in a few weeks.  Those of you who do use your analytics on at least a semi-regular basis and are interested in links can come along while we dig in.

The question is, why is referrer data useful?  Let's think about what Google's been telling us about valuable links: they are those that you would build if there were no engines.  So where are we going to find the links we'd be happy about if there were no engines?  Why, in our traffic, of course.

Apart from the fact that traffic is probably one of the best, if not the best, indicators of the quality and relevance of a link to your site, your traffic data can also help you find the links you didn't know you had and what you did to get them. Let's start there.

Referrers To Your Site

Every situation is a bit different (OK – sometimes more than a bit) so I'm going to have to focus on general principles here and keep it simple. 

When you look at your referrer data, you're looking for a few simple signals.  Here's what to look for and why (a short script sketching this kind of referrer mining follows the list):
  1. Which sites are directing traffic to you?  Discovering which sites are directing traffic to you can give you a better idea of the types of sites you should be looking for links from (i.e. others that are likely to link to you, as well). You may also find types of sites you didn't expect driving traffic. This happens a lot in the SEO realm, but obviously can also happen in other niches.  Here, you can often find not only opportunities, but relevancies you might not have predicted.
  2.  What are they linking to?  The best link building generates links you don't have to actively build. The next best are those that drive traffic.  We want to know both. In looking through your referrer data, you can find the pages and information that appeal to other website owners and their visitors.  This will tell you who is linking to you and give you ideas on the types of content to focus on creating.  There's also nothing stopping you from contacting the owner of the site that sent the initial link and informing them of an updated copy (if applicable) or other content you've created since that they might also be interested in.
  3. Who are they influential with?  If you know a site is sending you traffic, logically you can assume the people who visit that site (or the specific sub-section, in the case of news-type sites) are also interested in your content (or at least more likely to be interested than audiences found through standard mining techniques).  Mining the followers of that publisher for social connections to get your content in front of them is a route that can increase your success rate in link strategies ranging from guest blogging to pushing your content out via Facebook paid advertising.  Admittedly, this third area of referrer data is more akin to refining a standard link list, but it's likely a different audience than you would have encountered (and with a higher-than-standard success rate for link acquisition or other actions).
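Here is a minimal sketch of that referrer mining, strictly my own illustration rather than anything from the original article. It assumes a hypothetical CSV export from your analytics package with referrer_url and landing_page columns; rename the file and columns to match whatever your tool actually produces.

import csv
from collections import Counter
from urllib.parse import urlparse

referring_domains = Counter()
linked_pages = Counter()

with open("referral_traffic_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = urlparse(row["referrer_url"]).netloc.lower()
        if domain:
            referring_domains[domain] += 1          # 1. which sites send traffic
            linked_pages[row["landing_page"]] += 1  # 2. what they are linking to

print("Top referring domains:", referring_domains.most_common(20))
print("Most-linked-to pages:", linked_pages.most_common(20))

The output gives you the two lists discussed above: the domains worth approaching for more links, and the pages on your site that are already earning links and traffic.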
As I noted above, I plan to use the term referrer data loosely.  As if point three wasn't loose enough, we're going to quickly cover a strategy that ties nicely with this: your competitor's referrer data.

Competitor Data

You probably can't call up a competitor and ask them for their traffic referrer data (if you can, I wish I was in your sector).  For the rest of us, I highly recommend pulling backlink referrer data for your competitors using one of the many great tools out there.  I tend to use Moz Open Site Explorer and Majestic SEO personally, but there are others.

What I'm interested in here are the URLs competitors link to.  While the homepage can yield interesting information, it can often be onerous to weed through and I generally relegate that to different link time frames. 

Generally, I will put together a list of the URLs linked to, then review these as well as the pages linking to them.  This helps give us an idea of potential domains to target for links, but more importantly, they can let us know the types of relevant content that others are linking to. 

If we combine this information with the data collected above when mining our referrer data, we can be left with more domains to seek links on and broader ideas for content creation.  You'll probably also find other ways the content is being linked to. Do they make top lists?  Are they producing videos or whitepapers that are garnering links from authority sites?  All of this information meshes together to make the energies you put into your own referrer mining more effective, allowing you to produce a higher number of links per hour than you'd be able to get with your own data alone.
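One quick, hedged sketch of that combination step (again, my own illustration, not the author's process): it assumes two hypothetical CSV exports, your own referring domains from the script above and a competitor backlink export from a tool such as Open Site Explorer or Majestic SEO, with the assumed column names shown below.

import csv

def domains_from_csv(path, column):
    # Load one column of a CSV export into a set of lowercase domains.
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column].strip().lower()
                for row in csv.DictReader(f) if row[column].strip()}

our_referrers = domains_from_csv("our_referring_domains.csv", "domain")
competitor_links = domains_from_csv("competitor_backlinks.csv", "linking_domain")

shared = our_referrers & competitor_links      # already linking to both of us
prospects = competitor_links - our_referrers   # linking to them but not to us

print(f"{len(shared)} shared domains, {len(prospects)} new prospects")
for domain in sorted(prospects)[:25]:
    print(domain)

The prospect list is only a starting point for the manual review described above; relevance and quality still have to be judged by a human.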

Is This It?

No.  While mining your referrer data can be a great source of information regarding the types of links you have that you should be seeking more of, it's limited to the links and traffic sources you already have.  It's a lot like looking to your Analytics for keyword ideas (prior to (not provided), at least): it can only tell you what's working among what you already have.


A diversified link profile is the key to a healthy long term strategy.  This is just one method you can use to help find what works now and keep those link acquisition rates up while exploring new techniques.


Original Article Post by Dave Davies @ Search Engine Watch

Matt Cutts on SEO, PageRank, Spam & the Future of Google Search at Pubcon Las Vegas

pubcon-keynote

Matt Cutts kicked off day two at Pubcon with another of his signature keynotes, dispelling myths, talking about spammers and, at the urging of the audience, about Jason Calacanis’ keynote from day one.

First, Cutts spoke about Google’s “Moonshot changes,” which he broke down into these areas:
  • Knowledge Graph
  • Voice search
  • Conversational search
  • Google Now
  • Deep Learning
He revealed that Deep Learning is the ability to draw relationships between words and apply them to more words, and how it can help improve search and handle the nuances of search queries.

Deep Learning in Regular and Voice Search

He explained that voice search is changing the types of search queries people use, but also that it can be refined without repeating previous parts of the user’s search queries. It does this when it knows the user is still searching the same topic, but drilling down to more specifics.

Cutts shared an example where he searched for weather and continued the query without having to keep retyping “What is the weather?” because Google can recognize the user is still refining the previous search. Asking “Will it rain tomorrow?” in voice search brings up weather results for Las Vegas, Nevada. When he then says “What about in Mountain View?”, Google shows weather for Mountain View, knowing that it is a refinement of the earlier voice query. Then “How about this weekend?” shows Saturday weather for Mountain View.

Hummingbird, Panda & Authorship

Next up, Cutts spoke about Hummingbird and he feels that a lot of the blog posts about how to rank with Hummingbird are not that relevant. The fact is, Hummingbird was out for a month and no one noticed. Hummingbird is primarily a core quality change. It doesn’t impact SEO that much, he said, despite the many blog posts claiming otherwise.

Of most interest to some SEOs is that Google is looking at softening Panda. Sites caught in the grey areas of Panda--if they are quality sites--could start ranking again.

Google is also looking at boosting authority through authorship. We have seen authorship becoming more and more important when it comes to search results and visibility in those results; Cutts confirmed this is the direction in which Google will continue to move.

Google on Mobile Search Results

Next, he discussed the role of smartphones and their impact on search results. This is definitely an area SEOs need to continue to focus on, as it is clear that sites that are not mobile-friendly will see a negative impact on their rankings in the mobile search results.

Smartphone ranking will take several things into account, he explained:
  • If your phone doesn’t display Flash, Google will not show Flash sites in your results.
  • If your website is Flash heavy, you need to consider its use, or ensure the mobile version of your site does not use it.
  • If your website routes all mobile traffic to the homepage rather than the internal page the user was attempting to visit, it will be ranked lower.
  • If your site is slow on mobile phones, Google is less likely to rank it.
Cutts was pretty clear that with the significant increase in mobile traffic, not having a mobile-friendly site will seriously impact the amount of mobile traffic Google will send you. Webmasters should begin prioritizing their mobile strategy immediately.

Penguin, Google’s Spam Strategy & Native Advertising

Matt next talked about their spam strategy. When Google originally launched Penguin and spammers on the blackhat webmaster forums bragged about how they weren’t touched by it, the webspam team’s response was, “Ok, well we can turn that dial higher.” They upped the impact it had on search results. Cutts said that when spammers are posting about wanting to do him bodily harm, he knows his spam team is doing their job well.

He did say they are continuing their work on some specific keywords that tend to be very spammy, including “payday loans,” “car insurance,” “mesothelioma,” and some porn keywords. Because they are highly profitable keywords, they attract the spammers, so they are working on keeping those specific areas as spam-free as possible through their algorithms.

He discussed advertorials and native advertising, and how Google is continuing to penalize those who use them without proper disclaimers showing that they are paid advertising. Google has taken action on several dozen newspapers in the US and UK that were not labeling advertorials and native advertising as such, and that were passing PageRank. He did say there is nothing wrong with advertorials and native advertising if they are disclosed as such; it’s only when they are not disclosed that Google will take action.

Spam networks are still on Google’s radar and they are still bringing them down and taking action against them.

Bad News for PageRank Fans

For PageRank devotees, there is some bad news. PageRank is updated internally within Google on a daily basis, and every three months or so Google would push that information out to the Google Toolbar so it would be visible to webmasters. Unfortunately, the pipeline they used to push the data to the toolbar broke and Google does not have anyone working on fixing it. As a result, Cutts said we shouldn’t expect to see any PageRank updates anytime soon--certainly not this year. He doesn’t know if they will fix it, but they are going to judge the impact of not updating it. The speculation that PageRank could be retired is not that far off from the truth, as it currently stands.

Communication with Webmasters, Rich Snippets, JavaScript & Negative SEO

Google continues to increase their communication with webmasters. They made new videos covering malware and hacking, as Google is seeing these problems more and more, yet not all webmasters are clear about what they are and how to fix them. They are working on including more concrete examples in their guidelines, to make it easier for people to determine the types of things that are causing ranking issues and to point webmasters in the right direction to fix them.

Cutts stressed that he is not the only face for Google search. They have 100 speaking events per year and do Hangouts on Air to educate webmasters. They hold Webmaster Office Hours, to increase communication and give users the chance to engage and ask questions of the search team.

Google is becoming smarter at being able to read JavaScript, as it has definitely been used for evil by spammers. However, Cutts cautions that even though they are doing a better job at reading it, don’t use that as an excuse to create an entire site in JS.

Rich snippets could get a revamp and they will dial back on the number of websites that will be able to display rich snippets. “More reputable websites will get rich snippets while less reputable ones will see theirs removed,” says Matt.

Matt also says negative SEO isn’t as common as people believe and is often self-inflicted. One person approached Matt to say a competitor was ruining them by pointing paid links to their site. Yet when he looked into it, he discovered paid links from 2010 pointing to their site, and said there was no way competitors would have bought paid links back in 2010 to point at the site, since the algorithm penalizing paid links didn’t arrive until a couple of years after those links went live.

The Future of Google Search: Mobile, Authorship & Quality Search Results

On the future of search, he again stressed the importance of mobile site usability. YouTube traffic on mobile has skyrocketed from 6 percent two years ago, to 25 percent last year, to 40 percent of all YouTube traffic this year. Some countries have more mobile traffic than they do desktop traffic. Cutts reiterated, “If your website looks bad in mobile, now is the time to fix that.”

Google is also working on machine learning and training their systems to be able to comprehend and read at an elementary school level, in order to improve search results.

Authorship is another area where Google wants to improve, because tying an identity to an authorship profile can help keep spam out of Google. They plan to tighten up authorship to combat spam and they found if they removed about 15 percent of the lesser quality authors, it dramatically increased the presence of the better quality authors.

They are also working on the next generation of hacked site detection, where Cutts said he is not talking about ordinary blackhat, but “go to prison blackhat.” Google wants to prevent people from being able to find any results for the really nasty search queries, such as child porn. Cutts said, “If you type in really nasty search queries, we don’t want you to find it in Google.”

Cutts’ current advice (again) to webmasters is that it's important to get ready for mobile. He spoke about the convenience for website visitors when you use Google’s auto-complete web form annotations, which make it easier for people to fill out forms on your site. The markup to add to the forms is easy to implement, and will be available in the next few months.

The next generation of the algorithm will look at the issue of ad-heavy websites, particularly those with a large number of ads placed above the fold. This is really not a surprise, as it makes for a bad user experience and Google has previously announced that their page layout algorithm is targeting this. But sites using JavaScript to make it appear to Googlebot that the ads aren’t above the fold should look at replacing the ads before the algorithm impacts them.

Matt Cutts Q&A

During Q&A, Cutts discussed links from press release sites. He said Google identified the sites that were press release syndication sites and simply discounted them. He does stress that press release links weren’t penalized, because press release sites do have value for press and marketing reasons, but those links won’t pass PageRank.

The problem of infinite scrolling websites was raised, such as how Twitter just keeps loading more tweets as you continue to scroll down. He cautions that while Google tries to do a good job, other search engines don’t handle infinite scrolling as well. He suggests any sites utilizing infinite scrolling also have static links, such as a pagination structure, so bots that don’t wait for the infinite loading of the page can still access all the information.

Someone asked about whether being very prolific on blogs and posting a ton of posts daily has any impact on search rankings. Cutts used the Huffington Post as an example, as they have a huge number of authors, so logically they have many daily posts. However, he says posting as much as your audience expects to see is the best way to go.

In closing, Cutts said they are keeping a close eye on the mix of organic search results with non-organic search results and says he would also like to hear feedback on it.

While no new features were announced during his keynote at Pubcon, Cutts packed his presentation with many takeaways for webmasters.


Original Article Post by Jennifer Slegg @ Search Engine Watch

9 Ways to Prepare for a Future Without Cookie Tracking

It was over a year ago that I first wrote about do not track legislation, and luckily for most organizations the browser-provided preference is only loosely supported and regulated today, with very few sites interpreting and complying with it.

For the most part, do not track legislation is misunderstood by the general public, and even by regulators, in its definition, its usage, and most importantly the privacy implications and confidence it is meant to instill.

From a digital practitioner’s standpoint, “do not track” is the least of my worries, but upcoming news about Microsoft and Google pursuing cookie-less tracking capabilities indicates to me that education on how digital information is collected and shared will become even more important in the near future.

Rather than panicking, there is a lot we can do today to enact guiding principles that will likely ease a transition into tighter privacy controls in the future.

Education

One of the biggest problems facing digital marketing and analytics practitioners will be education. The industry has evolved so quickly that much of the technology that we rely on every day is likely taken for granted.

Personalization is one such area that relies on tracking, profiling, and delivering a lot of information about visitor preferences and behavior, which many of us likely take for granted.

One might argue that personalization is a byproduct of contextual advertising and, without the underlying tracking technologies, wouldn't be possible to deliver.

Teasing apart a key delivery mechanism such as a session or persistent cookie will be very challenging, but explaining the importance of cookies and their usage to visitors and customers even more so.

What can you do to prepare?

1. Ensure your privacy policy is up to date and fully transparent.
2. Explain what tracking technologies are used (savvy users will know how to check for this themselves anyway).
3. Explain what cookies are employed and for what reason.

Usage

It’s probably safe to say that, aside from a few specific highly regulated industries and regions, most digital marketing practitioners don’t spend much time or due diligence reviewing data usage models with third-party vendors and their technology.

Regulators focus both on collection and usage of data in these scenarios, particularly when third parties are involved because in many cases, these partners assume ownership of the data collected on your digital properties. This is the same reason why many browsers automatically block third-party cookies, to ensure data collection services and the usage of visitor information are being entrusted to the right recipients.

What can you do to prepare?

4. Explain how data collected is used.
5. Explain how disabling functionality may affect user experience or functionality.
6. Ensure the verbiage in your privacy policy and your acceptable use policy is complementary.

Consent

In my opinion, this is where most of the opportunity is for much of North America. Very few companies actually gather consent in a clear and concise manner.

To be brutally honest, most of us think that relying on a single-line radio box at the bottom of a registration page, with a link to a hundred-page disclosure, is acceptable. From a legal standpoint, it will probably protect you from litigation, but from a customer experience perspective, hundreds of pages of disclosure tend to make the average Joe either uninterested or a little paranoid.

What can you do to prepare?

7. Humanize your terms and conditions. Less legalese and more transparency.
8. Separate your service level agreement and delivery conditions from your data consent.
9. Introduce ways visitors and customers can opt into and out of technology that enables digital marketing and personalization quickly and easily.

Conclusion

Think about the steps you can take today to instill greater confidence in your digital business and marketing efforts. Sometimes little things go a long way toward earning the respect and trust of visitors and customers, and they make future changes in tracking technology or regulatory guidelines easier to transition into.

Have you done anything to prepare your website and visitors for the future of tracking and digital marketing personalization?


Original Article Post by Garry Przyklenk @ Search Engine Watch

Become a Leading SEO Mechanic with Both Google & Bing Webmaster Tools

Webmaster Tools offerings from both Google and Bing can offer a wealth of insight to business owners. In order to get the whole spectrum of insights, marketers must learn just what they can do with both Google and Bing Webmaster tools. Using both together allows you greater insight into the factors contributing to the success—or lack thereof—of your SEO strategy.

Internet Marketing Ninjas COO Chris Boggs and Grant Simmons, director of SEO and social product at The Search Agency, shared their advice on better integrating data from Google Webmaster and Bing Webmaster Tools earlier this year at SES San Francisco.

Google Webmaster Tools: Proactively Monitor and Have a Plan in Place to React (P.R.E.P.A.R.E)

Internet Marketing Ninjas COO/CMO and SEMPO Chairman Chris Boggs started the presentation with the topic everyone really wanted to hear: Google Webmaster Tools (GWT). He started with SEO diagnostic principles and explained that you need to be both proactive and reactive when monitoring SEO. Marketers need to have a plan as well as the ability to manage from a reactive perspective, he said. If you come across something in your diagnoses, your analytics are going to be a good second opinion. Without tools, it’s just a guessing game.

Once you have this in mind, you can start digging into GWT by focusing on a few things first:

1. Quick Barometers
Boggs referred to the “Brand 7 Pack” as a company’s homepage and six sitelinks that appear in search results. If you don’t have seven, you have an SEO problem, he said. Your social entities such as Google+ should also be ranking, with titles that are clear and easy to understand. If you want to see what your domain looks like from Google’s perspective and check the cleanliness of your page titles, type in “site:” followed by your domain name without the “www” (for example, site:yourdomain.com). Below is a screenshot of a website with a good 7 pack:

macys-7-pack-google-serp

You can then go to your Webmaster Tools account to diagnose any problems you may see and determine exactly where the problem lies and how to fix it. From a reactive mode perspective, look at your analytics and verify. It’s very important for SEOs to live by this mantra. Webmaster Tools isn’t something to take for granted. Have an agency or consultant monitor the findings in GWT and relay information to design, development, and marketing teams.

2. HTML Improvements
Visit the HTML Improvements category to determine if your titles and descriptions look bad on a Google SERP. You can see if Google agrees, then click on anything with blue writing to learn more about the problem.

Boggs was asked after the presentation what tool might get users in trouble if they don’t understand it, and this was his answer. He explained that almost every site is going to have some duplicate descriptions and titles, so he wouldn’t try to get that number down to zero. You don’t need to remove every single warning from GWT.

How to Find the Tool: Located under Search Appearance.

3. Sitelinks
You can visit the sitelinks tab to demote a certain sitelink (one of the links under your company homepage shown on a search results page like in the screenshot above). Google is going to automatically generate links to appear as your sitelinks, but you can tell Google if you don’t want something there.

How to Find the Tool: Located under Search Appearance.

4. Search Queries
Here, you can look at the top pages as well as the top queries for your site. Most people will just take the default information, but Boggs stressed that there are tabs for a reason. Look at the top queries as well as use those “more” tabs to get more information.

How to Find the Tool: Located under Search Traffic.

5. Links
You can click on “links to your site” to get a full list of those linking back the most, but the tool that many forget to use is the “internal links” tool. Internal links are very important; Boggs explained it’s worth the time to go through and look at the number of these internal links and then download the table so you can really slice it and dice it.

How to Find the Tools: Located under Search Traffic.

6. Manual Actions and Malware
With this tool, no news is good news. If you get a manual action warning, it means you need to do something that is probably substantial in order to keep your rankings where they are. Malware is another area you can look into, and another place where you don’t want to see anything.

How to Find the Tools: Manual Actions is located under Search Traffic; Malware is located under Crawl.

7. Index Status
If your indexed page count is 10x the number of pages you actually have, you might have a problem. The advanced tab here gives you a much better look at that data.

How to Find the Tool: Located under Google Index.

8. Content Keywords
What you want to look for here are the words you are using in your content. You don’t want to see a lot of “here” or promotional phrases. Identify where your gaps are or where you have too much content.

How to Find the Tool: Located under Google Index.

9. Crawl Errors
Google now has a feature phone tab to help you with crawl errors. You have to understand any crawl errors that might occur and remember that you should provide data that is very specific to mobile, as well. You can also take a look at your crawl stats, which means the time spent downloading, and make sure there is no spike.

How to Find the Tools: Both located under Crawl.

Finally, Boggs explained that Google Webmasters Tools should be thought of proactively by pairing it with Google Analytics. What kinds of things is GWT telling you when it comes to your analytics and how that data is affected? Consider this screenshot from Boggs’ presentation:

gwt-ga-more-less-obvious

In the end, Boggs explained that expertise is knowing the most basic things about SEO and doing them repeatedly, perfectly, every time. You’re going to come across situations where there are a lot of hooks and changes in the algorithm. Something someone might have done one to five years ago could be a very bad move now. That’s part of the game.

Bing Webmaster Tools: Bing Stands for “Bing Is Not Google”

Director of SEO and Social Product at The Search Agency, Grant Simmons began his presentation with the quote “Bing stands for Bing is not Google,” and the laughter amongst the marketers and SEOs just about said it all. It’s true; Bing is often not taken as seriously as Google because it just isn’t as popular, yet Bing Webmaster Tools (BWT) does offer some good insights that Google does not.

Once you’re signed up and logged in, consider the top things that you should look at first to really get a handle on BWT:

1. Dashboard
You want to make sure that the pages you think you have are the ones Bing has indexed. If that number isn’t what you expected, ask yourself a few questions: Are they crawling my site frequently? Am I not updating my site? These are all quick things you can see right from the dashboard, and you can even look at search keywords to see how people are finding you.

Quick Fact: Bing doesn’t use Google Analytics.

2. Diagnostic Tools
The diagnostic tools category comprises 7 subcategories: keyword research, link explorer, fetch as Bingbot, markup validator, SEO analyzer, verify Bingbot, and site move.

How to Find the Tool: This is a category all on its own!

3. SEO Analyzer
This tool works great when analyzing just one URL. You simply type in the URL and hit “Analyze” to get an overview of the SEO connected with that URL on the right hand side of the page. The tool will highlight any issue your site is having on the page; if you click on that highlighted section, Bing will give you the Bing best practice so you can make improvements.

How to Find the Tool: Located under Diagnostics & Tools.

4. SEO Reports
This tool shares a look at what is going on with your whole site (as opposed to just one URL). You will get a list of SEO suggestions and information about the severity of your issue, as well as a list of links associated with that particular error. The tool runs automatically every other week for all of the sites you have verified with BWT (so not your competitor’s sites).

How to Find the Tool: Located under Reports & Data.

5. Link Explorer
You can run this tool on any website to get an overview of the top links associated with that site (only the top links, however, which is considered one of the limitations of the tool). Export the links into an Excel spreadsheet and then slice and dice the information as you’d like.

How to Find the Tool: Located under Diagnostics & Tools.

6. Inbound Links
Link Explorer is probably one of the more popular tools when it comes to BWT, so it’s certainly worth mentioning. However, according to Simmons, Inbound Links is a better tool that doesn’t have as many limitations. This tool will show you trends over time so you can really see if there is value on deep page links. You can see up to 20,000 links, as well as the anchor text used, with the ability to export.

How to Find the Tool: Located under Reports & Data.

7. Crawl Information
It’s important to remember that the Bing bots are different than the Google bots, and the crawl information tool can help give you insight. From a high level, Simmons explained that when the tool gives you the stats, you should be looking at the challenges you might have from the migration you did last year. Are your 301s still in place? Are they still driving traffic? From the 302 pages, should they be made permanent? It’s also a good idea to look at the last time your site was crawled. If it’s been a while, remember Bing likes fresh content and you may need to make some updates. Again, this information is exportable.

How to Find the Tool: Located under Reports & Data.

8. Index Explorer
Simmons said this is one of the coolest things found in BWT, one reason being that Google doesn’t really have anything like it. You can see stats for a particular page, which can be good to see based on a subdirectory or section of your site. The tool has great filters and offers an awesome visual representation of crawled and indexed pages.

How to Find the Tool: Located under Reports & Data.

Of course, there is a lot more to BWT than just the eight features listed above, including the keyword research tool, geo targeting, the disavow tool (they were the first to offer this), and crawl control. Its features are very comparable to Google’s, with excellent navigation and even a few extra capabilities. Simmons concluded the presentation by saying that we should really focus on BWT to make a difference.

Do you think Boggs and Simmons singled out the best tools in both GWT and BWT? Simmons will speak to attendees at SES Chicago in early November on what it takes to become a leading SEO mechanic, alongside Vizion Interactive’s Josh McCoy. Keep an eye out at SEW for coverage!


Original Article Post by Amanda DiSilvestro @ Search Engine Watch

Searcher Personas: A Case for User-Centric SEO

It wasn't so long ago that, when educating the uninitiated on the SEO process from the bottom up, we would explain that keywords are foundational to SEO – start with your keywords and work up from there.

Lately, we've seen some fascinating (though not unexpected) developments from Google that put users ever more firmly in the driver's seat.

Hummingbird was a big stride toward better semantic and conversational search. "(Not provided)" took website visitor keywords data away from us. Keyword search volume data moved deeper inside AdWords with the Keyword Planner. Meanwhile, user segmentation was introduced to Google Analytics, giving marketers the ability to perform cohort analysis.

The message is clear: Google is moving away from keywords. Today's SEO is about the user and the way people explore using search queries. In fact, digital marketing as a whole is moving further into user-centricity; we, as SEO professionals, are on the bandwagon whether we like it or not.

So how do we put the user-centric concept into practice for SEO?

Keyword research is as important as ever, but we now start with searcher personas (which are very similar to user personas, marketing personas and customer personas). We use the keyword research as a data source to better understand those personas, a concept well established in marketing.

Personas: The Fundamentals

There are two main functions of the persona: to provide context around the users represented by the persona and to create a sense of empathy for those users.

In the ideal world, we would be capable of understanding and being empathetic towards each and every potential customer. Since that's impossible at the scale we work in, the idea is to group target customers together and give each group the qualities of an individual human. So each persona you create will effectively "speak for" all users represented by it.

To understand the concept of the persona, you need to understand the concept of the archetype. The definition of an archetype is "…the original pattern of which all things of the same type are copies."

Read up on Carl Jung, a famous Swiss psychoanalyst, to further your knowledge in this area. A classic example of the archetype is "The Shadow," who is expressed in popular culture by a variety of characters like Darth Vader, Agent Smith or Mr. Hyde.
qualities-of-the-shadow

Although each of these characters has unique qualities, they also share a number of collective qualities: those of "The Shadow."

When we're building a persona, we will use data (like keyword search volumes, market research, user polls, web analytics, etc.) to find those collective qualities that are similar across a large group of people. When it comes down to actually creating our persona, we want a character that is like a real person.

A real person isn't in the 35-45 year old age bracket. A real person was born at a specific time on a specific day.

In creating personas, we seek precision over accuracy. We will give our persona a specific age, knowing that it doesn't accurately reflect the age of each person represented in our persona.

Incorporating Searcher Personas into Your SEO Work

Ideally, when beginning your persona development efforts, you will take a "top-down" approach, where you begin by creating digital marketing personas that will work across all channels (not just search engine marketing). You continue by performing deeper analysis on the habits of each persona from the natural search perspective.

I prefer this approach because the same high-level personas can be used to tie together all digital (and maybe even offline) marketing efforts – everyone involved in marketing and content production (not just the SEO) uses the same personas.

When you're ready to start building personas, think about your process. Consider something like this:
  • Choose target personas. What types of people are you looking to attract to your website? Group them together into 3-5 types of people and give them titles that reflect who they are.
  • Sticky noting. Gather some team members together and brainstorm to flesh out each of your personas using your own existing knowledge and assumptions. Some people use actual sticky notes; I prefer a big Excel spreadsheet.
  • Define business context. Check your work to be certain that each persona is aligned with your business objectives and offerings, and that you fully understand the context. Add this to your sticky noting.
  • Gather data. This is possibly the most challenging portion of the process. Data is not easy to get a hold of and the good stuff is usually very expensive. It's extremely important that you use the right data and interpret it correctly, as you don't want to let poor data interpretation steer your personas off course. For search, your keyword research is an important data source.
  • Create cards. Gather all of your sticky noting and data together, then summarize it all on one shareable, printable, visually attractive "card" (a PowerPoint slide works like a charm) for each persona. Remember that you are aiming for a precise, not accurate, portrait of your persona.
The end result may look something like this:
the-trade-buyer

Ultimately, you want everyone involved in marketing to be thinking and talking (and dreaming) about the target personas when developing and executing marketing initiatives. The personas are the "people" you are marketing to, whose needs you are serving.

There is much more to share on the subject of building data-driven personas, and how the keyword research process is impacted. We would love to know, for future discussions: do you already use (or plan to use) personas in your search engine marketing initiatives?

Wes Walls of iProspect contributed to this post.


Original Article Post by Guillaume Bouchard @ Search Engine Watch

How to Use PPC Data to Guide SEO Strategy in a '(Not Provided)' World

We can no longer precisely track traffic from Google organic search at the keyword level. As "(not provided)" creeps its way up to 100 percent, so does our inability to track Google organic keyword conversions.

Tell your friends, family, loved ones, the boss. Then if you haven't immediately lost their attention with the use of acronyms and jargon, also let them know that we're still able to measure our efforts and gain strategic insight in many ways.

This article is an attempt to explain what we currently see in keyword reports, show how PPC data can help guide SEO efforts, and finally consolidate some initial thoughts and ideas to assist in moving forward.

Smart SEO professionals will still prove their worth. Together we can overcome this daunting hurdle.

What Do We See in Google Organic Keyword Reports?

Short answer: We aren't seeing an accurate representation of keywords people are using to get to our sites.

The easiest way to look at this is by visualizing the browser versions that are still passing keyword referral data.

Google Organic Visit Share vs Provided Query Share

Above, the light green color shows the share of Google organic visits that are still passing keywords, next to the darker total Google organic visits.

In essence, we're mostly seeing keywords from outdated versions of Safari and MSIE (Internet Explorer). So the search behavior associated with the demographics using outdated browsers is what we see coming from Google in analytics packages like Google Analytics. Probably not a comprehensive picture of what is actually happening.
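If you want to reproduce that browser-version breakdown for your own site, here is a minimal sketch (my illustration, with an assumed export format rather than anything from the original article). It assumes a hypothetical CSV export of Google organic visits with browser, browser_version, and keyword columns, where "(not provided)" marks visits that passed no keyword.

import csv
from collections import Counter

visits = Counter()
provided = Counter()

with open("google_organic_visits.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        key = (row["browser"], row["browser_version"])
        visits[key] += 1
        if row["keyword"] != "(not provided)":
            provided[key] += 1  # this visit still passed a keyword

for key, total in visits.most_common(15):
    share = provided[key] / total * 100
    print(f"{key[0]} {key[1]}: {total} visits, {share:.1f}% with keywords provided")

The output makes it easy to see which (mostly outdated) browser versions account for whatever keyword data you still receive.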

Using PPC Data to Guide SEO Strategy

Google needs marketers to be able to quantify their efforts when it comes to AdWords. Therefore, keyword data is passed and there to take advantage of.

The thought here is that if a page performs well in a PPC campaign, it will translate to performing well at the top of organic listings, though people clicking ads versus organic listings probably behave differently to some degree.

There are many ways PPC data could be used to help guide SEO strategy; this is just one to get the juices flowing.

Step 1: Identify Top Performing PPC Landing Pages

If using Google Analytics, from the Dashboard click Acquisition > AdWords > Destination URLs. Assuming you have sufficient conversion tracking set up here, it should give you all the information you need to understand which pages are doing the best.

After filtering out the homepage, sorting by the conversion metric of your choice, adding Keyword as a secondary dimension, and then exporting 100 rows, you will have the top performing 100 landing page/keyword combinations for PPC. Revenue is always a good indication that people like what they see.

Using PPC data for SEO strategy

 

Step 2: Pull Ranking Data

Next, pull in Google Webmaster Tools ranking data for the associated keywords. You can access this data in Google Analytics from Dashboard > Acquisition > Search Engine Optimization > Queries, or in Google Webmaster Tools.

Specify the largest date range possible (90 days) and download the report. Then use VLOOKUP to pull the ranking data into the spreadsheet containing the top PPC landing page/keyword combinations.

Using PPC data and SEO Rankings strategy
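For those who prefer a script to a spreadsheet, here is a minimal sketch of that VLOOKUP step (my own illustration, assuming hypothetical file and column names; rename them to match your actual exports from Steps 1 and 2).

import pandas as pd

# Step 1 export: keyword, landing_page, revenue (or another conversion metric)
ppc = pd.read_csv("top_ppc_landing_pages.csv")
# Step 2 export: query, average_position
rankings = pd.read_csv("gwt_search_queries.csv")

# The VLOOKUP equivalent: attach the average organic position to each
# PPC keyword/landing page combination.
combined = ppc.merge(rankings, left_on="keyword", right_on="query", how="left")

# Step 3's filter: keep only keywords already ranking on the first page,
# then sort by the conversion metric to surface the low-hanging fruit.
first_page = combined[combined["average_position"] <= 10]
print(first_page.sort_values("revenue", ascending=False).head(20))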

 

Step 3: Form SEO Strategy

Now that we know where our site shows up in organic for the top PPC keyword/landing URL combinations, we can begin forming strategy.

One obvious strategy is to make sure that the PPC and organic landing pages are the same. Sending PPC traffic to organic canonical pages can only increase the possibilities of linking and social sharing, assuming the organic page converts well.

Another option is to filter the Average Rank column to only include first page rankings, in an attempt to identify low-hanging fruit. Once an opportunity is identified, compare SEO metrics to determine where focus should be placed and how best to meet and beat your competitors.

Additional Thoughts on SEO Strategy in a 100% '(Not Provided)' World

1. '(Not Provided)' Still Counts as Organic

Conversion information is still applied to the organic channel, don't forget! We no longer have the ability to say someone who Googled [hocus pocus] bought $1,000 worth of "hocus pocus" stuff. But we can say that someone clicked an organic listing, landed on the hocus pocus page, and bought $1,000 of stuff.

Note: "(not provided)" shouldn't be confused with the issue of iOS 6 organic traffic showing up as direct. Last we checked this was hiding about 14 percent of Google searches, but is becoming less of an issue with the adoption of iOS7.

2. Bing Still Has Organic Keyword-Level Tracking

Bing doesn't use secure search, so we can still see what people are searching to get to our sites, conversions, sales, etc. Bing data could help quantify SEO efforts, but it's still only 9.1 percent of organic search share.

Note: People searching Bing versus Google probably behave differently to some degree.

3. Google Webmaster Tool Search Query Data Provides Partial Insight

Google gives us access to the top 2,000 search queries every day. Once you understand its limitations, the search query report can be invaluable, as it gives a glimpse of how your site performs from Google's side of the fence. Google also recently mentioned they will be increasing the amount of data available to a year!

By linking Google Webmaster Tools with AdWords, Google also has given us a report using the same search query data except with more accurate numbers (not rounded).

Conclusion

Clearly, page-level tracking is more important than ever. Google has forced SEO professionals to look at what pages are ranking and where, and then pull in other sources to guess on performance and form strategies.

Google will most likely respond to the outcry by giving us access to more detailed search query data in Google Webmaster Tools. As mentioned before, they have already announced an increase of data from 90 days to a year. This may be a sign of how they might help us out in the future.


Original Article Post by Ben Goodsell @ Search Engine Watch

20 Inspiring (Mildly Edited) Historical Quotes About SEO & Social

Great philosophers, writers, politicians, comedians, and darts commentators throughout the years have been quoted over and over again; their words of wisdom resonate as we strive to improve our very being.

But what if they had been around today, in the era of online marketing, Google, Twitter, and Facebook? What might they have said or written instead?

We'll never know for sure, but we can take a guess at what Shakespeare, FDR, Bacon, Marx and others may have said...

On SEO

1. Teddy Roosevelt
2. Leonardo DaVinci
3. Groucho Marx
4. FDR
5. Plato
6. Lech Walesa
7. Andy Warhol
8. Sid Waddell
9. Douglas Bader
10. William Shakespeare
11. Francis Bacon
12. Ben Franklin
13. Francis Bacon
14. Leonardo DaVinci

On Social

15. Winston Churchill
16. Rudyard Kipling
17. Teddy Roosevelt
18. FDR
19. Winston Churchill
20. Oscar Wilde

Original Article Post by Simon Heseltine @ Search Engine Watch

6 Major Google Changes Reveal the Future of SEO

The last few weeks have been amazing. Google has made some big changes and they are all part of a longer term strategy that has many components.

In short, Google is doing a brilliant job of pushing people away from tactical SEO behavior and toward a more strategic approach.

You could argue that "tactical SEO is dead", but that's not quite right. And don't run around saying "SEO is dead" because that is far from the truth, and I might just scream at you.

Instead, let's take a few steps back and understand the big picture. Here's a look at the major developments, some of Google's initiatives driving this change, and the overall impact these changes will have on SEO.

1. '(Not Provided)'

Google made the move to make all organic searches secure starting September 23. This means we've lost the ability to get keyword data for users arriving to our websites from Google search.

Losing Google keyword data is sad for a number of reasons. This impacts publishers in many ways, including losing a valuable tool for understanding the intent of the customers who come to their site, for conversion optimization, and much more.

For tactical SEO efforts, it just means that keywords data is harder to come by. There are ways to work around this, for now, but it just won't be quite as simple as it used to be.

2. No PageRank Update Since February

Historically, Google has updated the PageRank numbers shown in the Google Toolbar every 3 months or so, but those numbers haven't been updated since February. This means 8 months have gone by, or two updates have been skipped.

In addition, Google's Distinguished Engineer Matt Cutts has said Toolbar PageRank won't be updated again this year, leading many to speculate that PageRank is going away. I won't miss it because I don't look at PageRank often and I normally don't have a Google toolbar in my browser.

However, a lot of people still use it as a crude measurement of a site's prominence.

For sites with a home page that has a PageRank of 7 or higher, it may in fact be reasonable to assume that the site has some chops. Correspondingly, if a site's home page has a PageRank of 3 or lower, the site is either new or probably a low-quality experience. Stuff in the middle, you just don't know.

If Google shuts off this data flow entirely, which wouldn't be surprising, then we will have to rely on other real-world (and better) measurements instead. This would actually be better than using PageRank anyway, because Google says they don't use it that way themselves, so why should we?

3. Hummingbird

There are a few elements to Google's Hummingbird algorithm, announced in time for Google's official birthday, but like Caffeine before it, this is really a major platform change. Google has built a capability to understand conversational search queries much better than before.

For example, submit a query to Google such as "show me pictures of Fenway Park", and it does just that:

Knowledge Graph show me pictures of Fenway Park

Then you can follow that query with this one: "who plays there", and you get this result:
Knowledge Graph who plays there

Both of these show conversational search at work (but note that the Boston Beacons folded in 1968 after just one season, so that is an error in that result – shows that they have much work to do!).

Hummingbird really changes the keyword game quite a bit. Over time, exact keyword matches will no longer be such a big deal.

The impact of this algorithm is likely to be quite substantial over the next 2 or so years. Net-net, they have drastically reduced access to the raw data, and are rolling out technology that changes the way it all works at the same time!

4. Google+

OK, this one isn't new. Google launched Google+ on June 28, 2011.

While it seemed to get off to a slow start initially, many argue that it has developed a lot of momentum, and is growing rapidly. The data on Google+'s market share is pretty hard to parse, but there are some clear impacts on search, such as the display of personalized results:

google plus usage personalization

In addition, you can also see posts from people on Google+ show up in the results too. This is true even if you perform your search in "incognito" mode:
google plus impact on SEO

And, while I firmly believe that a link in a Google+ share isn't treated like a regular web link, it seems likely to me that it does have some SEO value when combined with other factors.

How Google+ fits into this picture is that it was built from the ground up to be a content sharing network that helps with establishing "identities" and "semantic relevance". It does this quite well, and in spite of what you might read in some places, there is a ton of activity in all kinds of different verticals on Google+.

5. Authorship

OK, authorship also isn't new (launched on June 7, 2011), but it is a part of a bigger picture. Google can use this to associate new pieces of content with the person who wrote it.

Over time, this data can potentially be used to measure which authors write content that draws a very strong response (links, social shares, +1s, comments) and give them a higher "Author Rank" (note that Google doesn't use this term, but those of us in the industry do).

We won't delve into the specifics of how Author Rank might work now, but you can read "Want to Rank in Google? Build Your Author Rank Now" for my thoughts on ways they could look at that.

That said, in the future you can imagine that Google could use this as a ranking signal for queries where more comprehensive articles are likely to be a good response. Bottom line: your personal authority matters.

I should also mention Publisher Rank, the concept of building a site's authority, which is arguably more important. Getting this payoff depends on a holistic approach to building your authority.

6. In-Depth Articles

Google announced a new feature, in-depth articles, on August 6. You can see an example of it here:

[Screenshot: in-depth articles results for "Obamacare"]

The Google announcement included a statement that "up to 10% of users' daily information needs involve learning about a broad topic." That is a pretty big number, and I think that over time this feature will become a pretty big deal. Effectively, this is an entirely new way to rank in the SERPs.

This increases the payoff from Author Rank and Publisher Rank – there is a lot to be gained by developing both of these, assuming that Google actually does make them ranking factors at some point. Note that I wrote some thoughts on how the role of in-depth articles could evolve.

Is There a Pattern Here?

Yes, there is. The data Google has taken away has historically been used by publishers to optimize their SEO efforts in a very tactical manner.

How do I get higher PageRank? What are the keywords I should optimize for? Taking these things out of the picture will reduce the focus on these types of goals.

On the other side of the coin, the six major Google changes listed above all encourage more strategic behavior. Note that I didn't bring up Google Now, which is also a really big deal and another big piece of the Google plan, just not a major driver of the point I'm trying to make today.

All of these new pieces play a role in getting people to focus on their authority, semantic relevance, and the user experience. Again, this is what Google wants.
For clarity, I'm not saying that Google designed these initiatives specifically to stop people from being tactical and make them strategic. I don't really know that. It may simply be the case that Google operates from a frame of reference that they want to find and reward outstanding sites, pages, and authors that offer outstanding answers to users' search queries. But the practical impact is the same.

The focus now is on understanding your target users, producing great content, establishing your authority and visibility, and providing a great experience for the users of your site. Properly architecting your site so that the search engines can understand it, including using schema and related markup and addressing local search (if that is relevant to you), still matters, too.
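
As one small illustration of the schema point, here is a minimal sketch that emits schema.org Article markup as a JSON-LD block you could embed in a page; the property names follow schema.org's Article type, and the values are placeholders:

# Minimal sketch: emit schema.org Article markup as a JSON-LD block for
# embedding in a page. Property names follow schema.org's Article type;
# the values are placeholders.
import json

article = {
    "@context": "http://schema.org",
    "@type": "Article",
    "headline": "Example Headline",
    "author": {"@type": "Person", "name": "Example Author"},
    "datePublished": "2013-10-01",
}

print('<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(article, indent=2)
))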

But the obsession with tactical items like PageRank and keywords is going to fade away. As Google tweaks the way their service operates and looks for ways to capture new signals, they do things that naturally push you in that direction. It isn't going to stop. Expect more of the same going forward!


Original Article Post by Eric Enge @ Search Engine Watch

3 Key Ways B2B SEO Impacts Demand Generation Efforts

Pardot recently announced the launch of their 2013 State of Demand Generation Report, surveying 400 B2B buyers and 138 B2B companies to get insight into their demand generation habits and best practices.

As the concept of demand generation takes greater hold in online marketing discussions, here are critical points B2B SEO professionals need to consider, based on select research highlights and evolving trends.

What Is Demand Generation?

Demand generation as a concept is meant to cover all marketing initiatives that drive awareness of a company and its products and solutions. Demand generation is sometimes confused with lead generation, but the latter is more focused on building the sales funnel and arguably places less relevance on initiatives that are not directly related to that effort (such as brand development).
[Image: online media funnel]

Here are a few popular definitions of demand generation:
  • Demand generation covers all marketing activities that create awareness about your product, company and industry. It includes a mix of inbound and outbound marketing. via LeadFormix
  • Focused on targeted marketing programs, demand generation drives awareness and interest in a company's products and/or services. Predominant in BtoB marketing, demand generation marries marketing programs and structured sales processes. via DemandGenReport
  • Create, nurture, and manage buying interest through campaign management, lead management, marketing analysis, and data management. via Eloqua
SEO plays a key role in demand generation initiatives because of the significance of organic search traffic and leads as a percentage of total inbound marketing efforts. I would also argue that a B2B SEO strategy can gain even more traction when marketers take link building and social media impact in outbound sales efforts into consideration (for example, link placement in tradeshow sponsorships, or 301 redirects when using "marketing friendly" promotional links).
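
As a small illustration of that last point, a "marketing friendly" promotional URL can simply issue a permanent redirect to the canonical landing page so that any links pointing at it consolidate there. A minimal sketch, assuming Flask and hypothetical paths:

# Minimal sketch (assumes Flask): a short promotional URL such as /tradeshow
# that 301-redirects to the canonical landing page, so links and shares that
# point at the short URL consolidate on the real page. Paths and domain are
# hypothetical.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/tradeshow")
def tradeshow():
    return redirect("https://www.example.com/events/spring-tradeshow", code=301)

if __name__ == "__main__":
    app.run()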

In Pardot's announcement of the report, three key findings stood out that should catch the B2B SEO professional's attention.

1. Search Engines (Google) Dominate First Phase Research

Finding: 72 percent of product research for a future business purchase begins on Google.

While this finding is a major boon for B2B SEO professionals, a critical component of continued strategic buy-in for SEO is performance metric development. As illustrated in a recent MarketingSherpa article, even though organic search can be a significant contributing channel for lead volume, a fair percentage of B2B marketers are skeptical of SEO in particular. Google's recent move to encrypted search by default, rendering all keyword traffic "(not provided)", certainly complicates measurement as well.

Ray "Catfish" Comstock just wrote a column on moving forward with SEO in a "(not provided)" world and I've also talked about performance measurement late last year.

At a high level, evolving benchmarks SEO professionals need to consider include:
  • Developing multiple levels of conversion goals
  • Integrating lead scoring, both universally and specifically for SEO leads
  • Measuring web traffic performance and comparing it between channels (see the sketch after this list)
  • Evaluating Google and Bing Webmaster Tools reports, especially when significant percentages of traffic are "(not provided)"
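
As a sketch of the channel-comparison point above, the snippet below reads a hypothetical analytics export and computes a conversion rate per channel; the file name and the channel/visits/conversions column names are assumptions about your export, not a fixed format:

# Sketch: compare visits and conversions by channel from an exported analytics
# CSV. The file name and the channel/visits/conversions column names are
# assumptions about your export.
import csv
from collections import defaultdict

totals = defaultdict(lambda: {"visits": 0, "conversions": 0})

with open("channel_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        totals[row["channel"]]["visits"] += int(row["visits"])
        totals[row["channel"]]["conversions"] += int(row["conversions"])

for channel, t in sorted(totals.items()):
    rate = t["conversions"] / t["visits"] if t["visits"] else 0.0
    print(f"{channel}: {t['visits']} visits, {t['conversions']} conversions ({rate:.1%})")
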
Bottom line: analyzing search engine performance as part of inbound marketing is now even more significant for SEO professionals. "(Not provided)" is an inconvenience, but it doesn't really restrict our ability to demonstrate the impact of SEO initiatives.

2. Content Must Be Optimized Across Buying Stages

Finding: 76 percent of buyers prefer different content at each stage of their research.

I've been a proponent of content marketing for quite some time, but content created solely for the sake of SEO is short-sighted. Marketers increasingly understand that content marketing assets need to be built with multiple lead development goals in mind.

SEO professionals can be more valuable in the research and measurement components of content marketing development, regardless of its place in the sales funnel.

For example, the Google Webmaster Tools search queries report gives SEO professionals insight into the phrases visitors used to find content marketing assets on the site. This data can be used to make more informed decisions about new content assets to create, and to evaluate the SEO performance of already-optimized pages.

[Screenshot: Webmaster Tools search query impressions report]
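
As a sketch of how that data might be mined, the snippet below reads an exported search-query report and flags queries with plenty of impressions but a weak click-through rate, which are likely candidates for new or better-optimized content. The file name, the column names (query, impressions, clicks), and the thresholds are all assumptions about your export, not any Webmaster Tools API:

# Sketch: read an exported search-query report and flag queries with many
# impressions but a low click-through rate, i.e. candidates for new or
# better-optimized content. File name, column names, and thresholds are
# assumptions about your export.
import csv

rows = []
with open("search_queries.csv", newline="") as f:
    for row in csv.DictReader(f):
        impressions = int(row["impressions"])
        clicks = int(row["clicks"])
        ctr = clicks / impressions if impressions else 0.0
        rows.append((row["query"], impressions, ctr))

# High visibility, low engagement: at least 500 impressions and CTR under 2%.
opportunities = [r for r in rows if r[1] >= 500 and r[2] < 0.02]
for query, impressions, ctr in sorted(opportunities, key=lambda r: -r[1]):
    print(f"{query}: {impressions} impressions, CTR {ctr:.1%}")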

Coupling this information with conversion metrics in Google Analytics or a marketing automation solution provides a more complete picture of SEO, content marketing, and ROI.

For content assets meant more for brand awareness, SEO professionals can tie in inbound link reports (such as Google Webmaster Tools) and reporting data from social media platforms like Twitter, Facebook, and LinkedIn. I wrote about the advancement of reporting elements in social media platforms in a recent blog post as well.

3. Technological Innovation Creates Opportunity & Challenges for B2B SEO

Finding: 76 percent of the top 17 SaaS companies are already using marketing automation.

While this finding might not have significant face value to B2B organizations outside of this industry niche, the important takeaway for B2B SEO professionals is that marketers are gaining greater access to information related to buyer behavior. Technological innovation puts every marketing program under the spotlight, with its effectiveness open to scrutiny.

B2B SEO professionals need to stay aware of the vendors developing new resources for marketing performance management, gaining at least a high-level understanding of their features and functionality. Destinations for research directly related to Search Engine Watch include ClickZ's Stats and Tools section, ClickZ Intel for white papers across marketing disciplines, and the Online Marketing Institute.

Final Thoughts

Demand generation is a concept B2B SEO professionals need to understand because of the impact (or lack of impact) we can have in the process. While tactical SEO is a given, taking ownership of performance measurement for demand generation initiatives, with an emphasis on organic search, is also important.

Are your SEO initiatives coinciding more closely with broader demand generation initiatives? I would love to read your perspective via comments below.


Original Article Post by Derek Edmond @ Search Engine Watch
 