Leap year meditation


Once in four years, just once, perhaps we could:

Forgive, forget, relax, care, stand out, speak up, contribute, embrace, create, make a ruckus, give credit, skip, smile, speak truth and refuse to compromise–more than we usually do. Pick just one or two and start there.

Hey, it’s just one day.

Careful, though, it might become a habit.

February 29th 2012 Uncategorized

Startup, Meet Agency: How Silicon Alley and Madison Avenue Are Finally Working Together


While entrepreneurs have increasing resources for cracking the marketing world, challenges remain — and a number of agencies are helping out with incubator programs.

February 29th 2012 Uncategorized

Yes Virginia, Bing DOES use the Meta Keywords Tag


Not sure how this escaped my attention until now, but late last year, Bing apparently publicly acknowledged that they still support the META Keywords tag.

Here we all were thinking that the tag had quietly died in its sleep after a prolonged illness. After all, the last remaining engines it was hooked up to had gradually switched off support, Danny Sullivan had publicly declared the tag dead and spammy meta tag content had become the subject of myth and legend, mentioned only in humorous anecdotes shared over drinks at search industry conferences.

But despite what we all thought, Bing has indeed been indexing the content of the META Keywords tag, but as a signal for detecting low quality sites, rather than influencing page rank. What has caused all the renewed interest and kerfuffle? Duane Forrester, Senior Product Manager for Bing, kicked off the confusion when he said this about the tag on Webmaster World last year:

Meta keywords is a signal. One of roughly a thousand we analyze… Abusing meta keywords can hurt you.

Then followed a cloud of webmaster confusion and forum banter about whether SEOs should resurrect the tag on their client sites or not.

As far as I know, Bing is the only remaining major search engine putting their hand up to claim they index the META Keywords tag. According to Forrester’s recent discussions with Danny Sullivan, they use it as one of their page quality signals, so it doesn’t contribute to page ranking as such.

So here’s the deal: the content of the tag may help Bing understand the context of your page, but it won’t impact where your page ranks on Bing. In fact, if it is stuffed with too many keywords or repetitions, the tag may send a *low quality* signal to Bing about your site, so it is best created very carefully or not used at all.

For educational purposes, we are still asking Search Engine College students to create a META Keywords tag as part of their assessment items. However, this is to ensure that students know how to craft the tag correctly, in case they decide to use it or are required to craft one in the future for employers or client sites.

If you are still using the META Keywords tag and it looks like an endless keyword repository, I’d suggest changing the content to focus on keyword variations that are still related to your page content, but that you’re unlikely to use in the visible content on the page. Things like synonyms, plurals, jargon, regional variations, related terms and word stemming.
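To make the crafting advice above concrete, here is a minimal sketch (my own illustration, not an official Bing guideline) of assembling a META Keywords tag from synonym and variant lists while de-duplicating and capping the count, so the tag never reads like an endless keyword repository:

```python
# Build a carefully limited META Keywords tag from keyword variants.
# De-duplicates case-insensitively and caps the total, since repetition
# and stuffing are exactly what may send a low-quality signal to Bing.

def build_meta_keywords(variants, max_keywords=10):
    """Return a meta keywords tag string from a list of keyword variants."""
    seen, keywords = set(), []
    for kw in variants:
        key = kw.strip().lower()
        if key and key not in seen:
            seen.add(key)
            keywords.append(kw.strip())
        if len(keywords) == max_keywords:
            break
    return '<meta name="keywords" content="{}">'.format(", ".join(keywords))

# Synonyms, regional spellings and near-duplicates of the kind suggested
# above -- variants you wouldn't use in the visible page copy.
tag = build_meta_keywords([
    "search engine optimisation",  # regional spelling variant
    "SEO courses", "seo courses",  # duplicate differing only in case
    "search marketing training",
])
print(tag)
```

The cap value of 10 is an arbitrary placeholder; the point is simply to keep the tag short, varied and non-repetitive.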

If you don’t currently use a Meta Keywords tag, don’t automatically assume you need to create one just for Bing. If you get it wrong, it will likely do your site more harm than good.


February 29th 2012 bing, News, SEO

Fade or gain?


An idea introduced to a population almost always fades away.

Send 1,000 people a coupon, and perhaps 20 use it. To get more usage, you either need to ping the audience again or find a new group of people.

This explains why marketers are always in search of new people to reach, and also insist on frequency of messaging–it maximizes the percentage of the group that is reached and minimizes the fade of the idea.

There’s an important exception to the rule of fading ideas, though. Every once in a while, an idea starts with a small population and actually reaches new users, people outside the population. Instead of the idea fading, it gains traction as it spreads. Imagine a cold getting started at an elementary school; soon the cold infects parents, teachers and the co-workers of those parents…

Eventually, even these viral ideas fade away (if they didn’t, then every single person on Earth would know about LOLcats and be into slacklining.) But before that happens, an idea spread by an excited tribe can have huge reach, particularly if it’s digital.

One mathematical cause of this viral spread is the outlier who becomes quite active in sharing the idea. This superuser might tell a hundred or a thousand or more other people about it. Using his own pulpit, reaching his own tribe, the superuser raises the average (the R0 value) to over one, causing the idea to continue spreading.
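The arithmetic behind that R0 threshold can be sketched in a few lines (an illustrative branching model of my own, not from the original post): each "generation" of sharers reaches R0 times as many new people as the last, so R0 below one fades toward a ceiling while R0 above one compounds.

```python
# Back-of-the-envelope model of idea spread: with an average reproduction
# number r0, each generation of sharers reaches r0 times as many new
# people as the previous one.

def total_reach(r0, generations, seed=1):
    """People reached after `generations` rounds of sharing."""
    reached, current = 0.0, float(seed)
    for _ in range(generations):
        reached += current
        current *= r0  # each person shares with r0 others on average
    return reached

print(round(total_reach(0.8, 10)))   # r0 < 1: the idea fades out
print(round(total_reach(1.5, 10)))   # r0 > 1: the idea keeps compounding
```

A single superuser who shares with hundreds of people raises the *average* R0 of the whole population, which is why one outlier can tip an idea from fading to gaining.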

Monday’s publication of Stop Stealing Dreams has exceeded my expectations for feedback and impact. While a typical bestseller might sell 2,000 copies a day, this free manifesto has been downloaded and shared more than 60,000 times since yesterday. I’ve gotten comments from around the world, and it’s clear that the manifesto has struck a chord–and that’s exactly why I wrote it. (Translations in two countries are already underway… I’ll post them on the download page as they become available.)

And now the moment of truth–will the people who read it, share it? Will they take the file and email it to 5 or 50 of their peers? Will they use it to start a conversation among parents or teachers or, best of all, students?


February 29th 2012 Uncategorized

Exploring the New Features in Bing Webmaster Tools


Posted by Daniel Butler

Bing recently announced some pretty cool new features within their Webmaster Tools, so in this blog post we are going to delve a little deeper to see exactly what these tools are capable of.

The Markup Validator (Beta)


Found within the ‘Crawl’ tab of BWMT, the Beta Markup tool works in a similar way to the Google rich snippets testing tool, extracting the following elements from a specified URL:

  • Microdata
  • Microformats
  • RDFa
  • Schema.org
  • Open Graph

The inclusion of Open Graph is a nice touch, and I can see this coming in handy. Upon submitting a URL, we are presented with a neat extract of any featured markup. Let’s use imdb.com as an example:


However, other than extracting elements from a page, there seems to be little actual validation taking place. There are no references to missing elements, for example, or any indication of whether the markup could potentially generate a rich snippet.

Let's take a closer look at a URL with incomplete markup. In the following example, an “fn” field is missing from the hproduct element of a page, causing a flag to be raised within Google’s testing tool:


However, pasting this same URL into the Bing markup validator just produces the below:


The URL actually being tested here contains hreview-aggregate and makes extensive use of hreview, but there are no references to either within the Bing Validator, so results are also incomplete.
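For readers curious what a required-property check like the one Google performs (and Bing's Beta validator apparently skips) might look like, here is a simplified, stdlib-only sketch; the class names follow the microformats hproduct convention, but the HTML snippet itself is made up:

```python
# Simplified check: flag markup that declares an hproduct block but is
# missing the required "fn" (product name) property -- the exact problem
# Google's testing tool flags and Bing's validator ignores.

from html.parser import HTMLParser

class ClassCollector(HTMLParser):
    """Collect every class name used anywhere in the document."""
    def __init__(self):
        super().__init__()
        self.classes = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "class" and value:
                self.classes.update(value.split())

def missing_fn(html):
    """Return True if hproduct is declared without an fn property."""
    collector = ClassCollector()
    collector.feed(html)
    return "hproduct" in collector.classes and "fn" not in collector.classes

snippet = '<div class="hproduct"><span class="price">$9.99</span></div>'
print(missing_fn(snippet))  # hproduct present, fn missing
```

A real validator would of course check nesting and the other required properties too; this only illustrates the kind of "missing element" reporting the Bing tool currently lacks.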

I really want to like this tool, but I need jam in my Victoria sponge – as this is still in a Beta format, fingers crossed for an update (or perhaps a rename).

Bing Keyword Research Tool

So Bing have finally released their own keyword tool:


Overview of features:

  • Broad/Exact (select ‘strict’ for exact) match keyword search volumes
  • 6 month data history (you can select any date range within this period)
  • Export data for a max of 100 keywords at a time
  • Filter by country and language
  • History feature to track previous research queries

A very clean, simple-to-use interface, but it's a shame the data isn’t yet available via an API, as there is going to be quite a bit of heavy lifting if you’re running a substantial keyword research campaign. Nonetheless, we now have some data to play with from Bing directly.

There are a ton of awesome posts to check out on SEOmoz that go into detail about the keyword research process, so I’m not going to go into great detail here, but with the data available from Bing I would be looking to:

  1. Consolidate data into a single spreadsheet
  2. Obtain current rankings for each keyword in both Bing and Google
  3. Use the Google Adwords API to extract monthly search volume for each keyword
  4. Using Google analytics, marry up keywords and associated traffic
  5. Break down keywords into meaningful categories
  6. Use pivot tables/charts to compile this data for identifying key opportunities (low hanging fruit) in both search engines:

    1. Along one axis display separated search volumes for both Google and Bing, also traffic from analytics
    2. On the other axis display current ranking position in both Google and Bing
    3. Filter this chart by ranking between position 5 and 20.
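The consolidation and filtering steps above can be sketched in a few lines of stdlib-only Python (the keyword rows and volumes below are made-up placeholders, not real Bing or Google data):

```python
# Steps 1-6 in miniature: consolidate per-keyword volumes and rankings,
# then surface the "low hanging fruit" ranking between positions 5 and 20
# in either engine.

rows = [
    # keyword, bing_volume, google_volume, bing_rank, google_rank
    ("seo training",    1200, 8000,  4, 14),
    ("seo courses",      900, 6500, 12,  7),
    ("learn seo online", 300, 2100, 25, 31),
]

def low_hanging_fruit(rows, lo=5, hi=20):
    """Keywords ranking between `lo` and `hi` in either search engine."""
    return [r for r in rows if lo <= r[3] <= hi or lo <= r[4] <= hi]

for kw, bing_vol, goog_vol, bing_rank, goog_rank in low_hanging_fruit(rows):
    print(f"{kw}: Bing #{bing_rank} ({bing_vol}/mo), Google #{goog_rank} ({goog_vol}/mo)")
```

In practice you would feed in the exported Bing keyword CSV plus AdWords and analytics data, and build the pivot chart from the filtered rows.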

For illustration purposes here is a quick mock up of how this can be developed:


The numbers along the bottom reflect specific keywords, but for demonstration purposes these have been labelled as numbers.

Although the keyword data from Bing isn’t yet available within an API, Bing has released an API for the rest of the data within Webmaster Tools (looking forward to having a play around with this).

I look forward to hearing about your experiences using Bing’s latest tools.

Whew! That, my friends, was my first ever SEOmoz post. Did I get round to introducing myself? I’m Dan, Senior SEO consultant at SEOgadget. I’d love to know what you think and how you’re using the new features in Bing’s toolset. Until the next time!


February 29th 2012 Uncategorized

Think And Act Like a Publisher Part 2 : How To Write Headlines That Sell #smallbiz #writetip


If you visit a newsagent and scan the magazines on offer, you’ll see an array of headlines trying to stand out in a cluttered space. Whatever your interest, from philately to photography, publishers try to entice you to choose their publication and read their stories.


How to stand out from the crowd

Running a small business is difficult enough at the moment, never mind working out how you can make potential customers choose your business out of all those on offer. If you want to get found and read online, you have to learn some lessons from publishers – in fact, you have to start behaving like a publisher!

The first step in this process is to understand your audience, the basis for any marketing campaign. Understand what problems and challenges your audience may have and see if you can ‘tailor’ your product or service as a ‘solution’ to that.

I often find that once you’ve created a heading, it’s easier to weave an article or story around it – in fact, that’s just what I’ve done with this blog post! So the tip of the day is to brainstorm a headline that would work for your audience, forcing you to focus on a topic that will make that audience look up and take note. If your own creative juices aren’t flowing freely, this is something you could ask employees, friends, suppliers or customers to help with. You could even create a mini competition for any headline (and topic!) that you end up using.

Remember: you are looking to ‘engage’ with your customers, not alienate them. Being a bit provocative can sometimes work – but sometimes it can backfire completely.

Happy Headline Hunting!

Nick will be speaking at The Online Business Makeover at Microsoft HQ London on 12th March 2012. You can find a full list of speakers here. To find out more about top content for search engines and actual readers, as well as how to use social media to boost your online presence, book your ticket today. And we still have a couple of 50% discounted places left if you quote promotional code MICROSOFTGUEST when you book. Don’t miss out – order soon!

Read Nick’s previous blog posts here: Think and Act Like a Publisher #SEO

As always, if you have any questions or comment, feel free to leave them in the comments below, post a thread in our forums, ping us on Twitter or write on our Facebook wall and join our Microsoft Advertising for SMB LinkedIn group!

February 29th 2012 advertisers, SEO

Comic for February 29, 2012


February 29th 2012 Uncategorized

Product Pulse- February 28th, 2012


The updates just keep coming! Yahoo! has just rolled out the latest in a series of product launches and updates to ensure that users are getting the best possible experience on Yahoo! sites. From a new comedy channel on Yahoo! Screen, to a prediction engine that could forecast the results of the next presidential election, Product Pulse has you covered on the latest product news from Yahoo! Stay tuned each month as we keep you posted on the next and best that Yahoo! has to offer.

Flickr Contacts page gets a sleeker makeover
The first phase in Flickr’s 2012 redesign has just launched! Flickr’s Contacts page has been revamped using the new Justified layout, which will make it easier to see the stories your friends are telling with their photos. While the previous layout choices are still available, the new design optimizes for seamlessly displaying more images at larger sizes, so you can see more of the activity from contacts, friends and family at once. The new design offers Flickr members:

A more elegant layout, with whitespace and thumbnails removed, for vibrant, un-cropped photos displayed the way they’re meant to appear (portrait, landscape, panorama, Instagram square)

Simpler and more dynamic navigation, including:

  • Scroll-over functionality gives way to high-resolution pop-up images
  • A simple light-box option for a full-screen view of the image selected
  • Limitless photo scrolling removes the need to click from page to page

New ways to interact with images, including quick access to “favorite” and comment on images right from the thumbnails

Check out the new Flickr contacts page here

Yahoo! Screen introduces the Yahoo! Comedy Channel
The live broadcast of Bill Maher’s “CrazyStupidPolitics: Live from Silicon Valley” last night kicks off the launch of the Yahoo! Comedy Channel on Yahoo! Screen, which will feature a slate of original web shows from some of the top production companies and renowned comedic talent. The lineup includes “Sketchy,” a weekly video comedy series starring some of the biggest names in comedy, and many others. Missed the Bill Maher live broadcast? You can watch all the clips on Yahoo! Screen, hear all about his surprise headline announcement, and check out the rest of the video series available.

Flickr Android App Coming To A Language Near You!
Flickr rolled out an update to the Flickr Android app that adds nine new languages: German, French, Italian, Spanish, Portuguese, Korean, Traditional Chinese, Vietnamese and Bahasa Indonesia. The popular Flickr Android app, which initially launched in September 2011, allows even more Flickr members worldwide to take photos directly from the app, enhance them with filters, and quickly send pictures to Flickr, Facebook, Twitter and elsewhere. Check it out here.

The Signal Predicts Obama will win the Election
David Rothschild and David Pennock from Yahoo! Labs made a bold prediction in The Signal last week that Obama will win the general election in November with 303 electoral votes. The Signal, a Yahoo! blog, uses the first ever real-time prediction engine that combines powerful scientific algorithms with multiple real-time and historical data sources to generate predictions. This model does not use polls or prediction markets to directly gauge what voters are thinking. Instead, it forecasts the results of the Electoral College based on past elections, economic indicators, measures of state ideology, presidential approval ratings, incumbency, and a few other politically agnostic factors. To date, the post has attracted over 40,000 comments, a record for The Signal. Stay current on The Signal’s latest posts here.

February 29th 2012 Android

SMX West 2012: Evening Forum with Danny Sullivan


SMX West 2012: Evening Forum with Danny Sullivan was originally published on BruceClay.com, home of expert search engine optimization tips.

From the SMX West agenda: The audience is the panel in this session. Search marketers share thoughts, ideas and knowledge. The session is moderated by Search Engine Land editor-in-chief Danny Sullivan, and the audience shapes the agenda.

Audience: Dave Naylor today, on his blog, writes about what might have changed in respect to link value in Google.

Danny: They may have stopped following nofollow. They may be changing the weight. He has no idea what they changed, but suspects it is related to spamming rather than changing what was working well. It was interesting they said it was a signal they used for a long time. They could be looking at words around a link to associate with the link itself. People will look into it further and we’ll see what emerges. It’s also difficult because we don’t know what they do with link signals to begin with. It might end up being a very specific thing you won’t want to do, which people aren’t really doing much of anyway.

Audience: What would happen with search engine results if there was no SEO?

Danny: Matt Cutts would be working for Facebook… ;) That’s not possible. Even if you’re not doing SEO, you’ve done SEO on your pages. Let’s say everyone just published a bunch of pages. The SE still has to come up with an algorithm that uses signals. The result would be accidental SEO. Maybe better to think of the debates in the industry about whether or not SEO is ethical, black hat, fair, etc. He finds this debate boring; he’s heard it all so many times. There are people that understand SEO means working with good content and making that content visible to search engines. It’s always been akin to public relations. You have a good story to tell. If you’re smart, you package the story so a reporter wants to hear more about it. Ultimately the story succeeds not because you tricked the reporter, but because the story is a good story. If content deserves to do well, then it should do well from there. Results, in a world without SEO, wouldn’t be better; they’d be different or worse, most likely.

Audience: We recently revamped our site. We’ve reduced the number of crawl errors significantly. Is there a good way to manage crawl errors for large sites that have been revamped? Is there a number that the crawl errors should be under?

Danny: SEs will spend a certain amount of time on your site. Reducing errors helps make sure more of your site is visible. Also, you may have errors but still have great traffic. Worry more if you have a drop in traffic. Check out the technical SEO track tomorrow.

Audience: With SEOmoz, it’ll still rank my page, but a failing grade I get is for lack of a Meta description.

Danny: Meta descriptions aren’t a ranking signal. But it’s one of the few things we have left that we have control over. Google or Bing, in their infinite wisdom, may use it as an excerpt in the SERP.

Audience: Do you think there will be a time when Google stops using the link graph and replaces it with social signals?

Danny: Social signals are growing in importance and will potentially eclipse links. Before Google came along, engines ranked pages via the words on a page. That was gamed, so Google decided to let people vote – that was the link model. Counting links, as a way to vote for what’s popular on the web, is like when the country said everyone can vote, as long as they’re a white male landowner. Not all of the Internet population is a landowner. Do they rent space from WordPress, which automatically nofollows links? Participation has now gotten simpler, as you can just click a “Like” button.

Audience: What’s the best tool you recommend to measure authority of a domain?

Danny: He understands the use of such metrics in order to prioritize where to put your effort. But these third-party tools (Compete, Alexa) are just guesses and you can look at PageRank but that’s outdated.

Audience: Google’s given a fair amount of screen real estate to Google+. It’s a lot of hype. Given the importance of the real estate, is it overhyped or underhyped?

Danny: A string of articles this week have suggested Google+ is a ghost town. But even if it’s a ghost town, to participate in Search Plus results and other Google features, you need a Google+ account. It’s having a direct impact on rankings. It’s one of Google’s main ranking signals. Who wants to ignore that? It’s not hard – slap the button on your page. Even just doing read-only is better than not being there at all. The influence of +1s adds to this issue. I don’t think it’s hype when you’re a search marketer. If you care about Google, you’ve got to care about Google+. If you’re a general user, maybe you don’t care about Google+.

Audience: If a U.S. site has had a few penalties, not great rankings, but great rankings overseas, should I rel=”canonical”?

Danny: The fact it’s doing well overseas shows they don’t think it’s doing completely bad. It makes me think it’s not actually penalized in the U.S., maybe something else is wrong with it. I’d reach out to Google to find out if it’s got standing penalties. Leave the sites that are working alone and keep them away from the site that’s not doing well. Try to catch lunch with a Google engineer this week.

Audience: This one’s for domainers/developers/affiliate marketers. It’s regarding the documentation about Google quality raters. When manual raters see .net and .org sites with the same keyword structures pop up, how likely are they to flag them?

Danny: Google will give new sites a chance at the top to see how they’ll be met by users. Quality raters generally adjust algos overall. If a site drops out of results in a scenario like this it’s more likely an engineer came by and saw the site is part of a spammy network and so will drop the whole network. If there was something decided to be wrong with the company, all related sites would go. Probably one site was doing better with click-thrus and conversions, etc., and one was doing worse and didn’t stay.


February 29th 2012 Uncategorized

An Evening Forum With Danny Sullivan


Hey, hey, hey! It’s the last session of Day 1 and it’s my most favorite session because it’s straight-up Danny banter. What’s more awesome than that?

Danny kicks things off by asking who’s attended this forum before. Apparently I’m the only one who has. OKAY THEN! He’s basically just going to open things up and let people ask questions. He’ll try to answer them and then he’ll let the audience answer each other’s questions too. Something like that.

Okay, let’s start.

Dave Naylor shared his speculations about what might change in the Google algo in regard to link value. What are your thoughts on that?

Google every month now has been announcing the changes they made to search. And they bury things in there. Yesterday they said they stopped using a certain link signal. Maybe it means they’re not using the anchor text in a link anymore. He doesn’t think that’s it. They could be recalculating how they were counting links. They could be looking at the proximity. It could be that they’re following nofollow now – he doesn’t think that’s the case. He did ask them but Google wouldn’t say. Danny doesn’t know what they would have changed. He suspects it’s more related to spamming things than trying to change something that was working well. It was interesting that they said it was a signal they’ve used for a long time, but there are so many things they could have been doing.

He thinks people will be looking into it further and we’ll see if something emerges from it. It’s hard because we don’t know what they did with link signals to begin with. It might be a very specific thing you shouldn’t do anymore that people weren’t doing anyway.

What would happen to search engine results if there was no SEO?

Matt Cutts would be working for Facebook. He doesn’t think that’s possible. The reason he says that is that even if you’re not doing SEO, you’ve done SEO to your pages. The search engine is going to look for something on a page. If no one did anything to their pages, the search engine would still have to decide what it thinks is relevant, and it’s going to use a certain set of signals. You’d have accidental SEO happening. The results wouldn’t be better if there was no SEO. They would just be different. And maybe even worse.

We went through a site revamp. We had more than 110k crawl errors. We’re currently at 60k crawl errors. Is there a better solution to managing crawl errors in addition to 301 redirects? How much importance should be given to crawl errors?

The search engines are going to spend a certain amount of time at your site. If they’re deciding they’re going to crawl 200k pages and they get 100k errors, they’re not going to get to the good stuff. Some of the crawl errors may be that you have the same page with a bunch of different parameters, so your links are getting split up; zero is what you’d like to reduce it to. He’d go back to the console and see if there are common errors that Google is finding. If you can ID patterns, then you can have someone technical go through your redirects all at once. He’d worry more if he saw a drop in traffic and then saw a rise in the number of errors.

Speaking of errors, she has an old meta question.  Are Meta Descriptions worth our time?

They’re important. They’re not a ranking element.  They encourage people to click through to your Web site. That’s kind of why you want to have it. It lets you put your best foot forward. And then the other ranking things come into it.

Will there be a time when Google moves to a user graph instead of a link graph?

He thinks they’re already going there. He doesn’t think they’ll abandon using links but the social signals are going to grow up to be another significant source and will pass links.

What do you think is the best metric to measure authority of a Web site or how authoritative a site is?

He’s not a big fan of those kind of metrics. He understands why people want them and you’re trying to prioritize your Web site but the third-party metrics are guesses. They don’t represent what the search engine thinks about your site. He recommends looking at the metrics the search engines are giving you. The better metric is to go back to your own traffic and benchmarking yourself against others in your space. That’s where he would look.

Google’s given a fair amount of real estate to Google+ so it’s gotten a lot of hype. Given the importance of that real estate do you think it’s under- or over-hyped?

It’s funny because we had a whole string of articles this week about how there’s no one on Google+. And not once did it mention that even if it is a ghost town, if you want to show up in those People & Pages boxes, you need to have an account. He doesn’t know anyone who’s an SEO and saying they can’t be bothered. It’s having a direct impact on rankings. He did a search for [cars] and Ford came up at 9 because he’s friends with them on Google+. Who wants to ignore that? And it’s not hard. Stick the button on your page. He doesn’t think it’s hype when you’re a search marketer. If you care about Google, you have to care about Google+. If you’re a general user, then maybe you don’t have to care so much.

Google took all the ads they make money off of and TOOK THEM DOWN for Google+.

Any tools for predicting traffic based on SEO/long-tail traffic?

You can’t go through and try to add up all the ways someone can find your site. You can look at your pages and the words on it and try and go that way. You’re probably better off looking at your peers. Another thing you can do is look at a peer site and look at the percentage of traffic they get from different sources.

An attendee creates a Missed Opportunity Matrix to bring home the fact that they’re doing this thing that feels completely black box.

If a domain in the US went through a few penalties but has excellent rankings overseas, what would you do to boost in the US?

If a domain’s been pinged, I probably wouldn’t touch it. The fact that it’s doing well overseas makes me think you don’t have a penalty and there’s something else going on with it. He’d go back to Google first and see what’s up.

And that’s it from Day 1. Catch up on everything you missed today and we’ll see you back here tomorrow.

February 29th 2012 Uncategorized