The Next Big Phase of Google Search Is Coming Early Next Year

Last month, Google announced Accelerated Mobile Pages, a new open source project, which is basically its answer to Facebook’s Instant Articles. Like Instant Articles, the purpose of the project is to enable web pages to load more quickly on mobile devices.

Google announced on Tuesday that it will begin sending traffic to AMP pages in Google search early next year. The company didn't give a specific date, but said it intends to share "more concrete specifics on timing very soon." Stay tuned for that. It remains to be seen whether these pages will get a ranking boost by default, but given Google's emphasis on the mobile experience, it seems very likely that AMP pages will benefit.

Are you planning to implement Accelerated Mobile Pages? Let us know in the comments.

“We want webpages with rich content like video, animations and graphics to work alongside smart ads, and to load instantaneously,” Google explained when the project was announced. “We also want the same code to work across multiple platforms and devices so that content can appear everywhere in an instant—no matter what type of phone, tablet or mobile device you’re using.”

The program utilizes a new open framework called AMP HTML, which is built on existing web technologies and is aimed at letting websites build lightweight pages.
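For a sense of what the framework involves, here is a stripped-down sketch of an AMP page. The canonical URL and image are placeholders, and the required amp-boilerplate style rules are omitted for brevity; the full template is in the project's documentation.

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <!-- Points back to the regular version of the page -->
  <link rel="canonical" href="">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <!-- Required amp-boilerplate <style> rules omitted here for brevity -->
  <!-- The AMP runtime, loaded asynchronously from Google's CDN -->
  <script async src=""></script>
</head>
<body>
  <h1>Hello, AMP</h1>
  <!-- Plain <img> is disallowed; amp-img declares its dimensions up front -->
  <amp-img src="photo.jpg" width="600" height="400"></amp-img>
</body>
</html>
```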

As far as ranking goes, Danny Sullivan said in a tweet last month that AMP pages won’t rank better because they’re AMP, but noted that Google already rewards speedy pages, so they can still benefit.

Google has already made mobile-friendliness a ranking signal, and the whole point of AMP is to create a better mobile experience. It's hard to imagine a scenario in which AMP pages don't benefit in rankings.

“Thousands of publishers have expressed interest in AMP since the preview launched with the likes of the BBC, Sankei, New York Times, News Corp, Washington Post and more,” write David Besbris (Vice President of Engineering, Google Search) and Richard Gingras (Head of News, Google) in a blog post. “Since then, many others have committed their support to the project, including NZN Group in Brazil; CBS Interactive, AOL, Thrillist, Slate, International Business Times/Newsweek, Al Jazeera America and The Next Web in the US; El Universal and Milenio in Mexico; The Globe and Mail and Postmedia in Canada, as well as many more across the globe. The Local Media Consortium (LMC), a partnership of 70+ media companies collectively representing 1,600 local newspapers and television stations, has also voiced their support.”

The two also announced that Outbrain, AOL, OpenX, DoubleClick, and AdSense are working within the project's framework to improve the ad experience for users, publishers, and advertisers. More information on this will come in the near future, they say.

“Ensuring that traffic to AMP articles is counted just like current web articles is also a major focus of the project,” they write. “comScore, Adobe Analytics, and Chartbeat have all stated that they intend to provide analytics for AMP pages within their tools. They have since been joined by many others: Nielsen, ClickTale and Google Analytics. This development is significant for the AMP Project because publishers developing for AMP will not skip a beat in terms of analytics and measurement — analytics for AMP are real time and will work within your existing provider.”

According to Google, more than 4,500 developers have expressed interest in AMP, and over 250 contributions of new code, samples, and documentation have been made. Discussions are also underway related to analytics and template features.

The mobile experience has been the key narrative for Google Search throughout 2015, and it looks like that will continue throughout next year, largely driven by AMP.

Has this development been on your radar thus far? What do you think of the project? Discuss.

Image via Google

November 26th 2015 blogging, Google, Mobile, Search, SEO

Google Confirms Next Penguin Will Be ‘Huge’

As far as we know, webmasters can expect Google’s Penguin update to make a return before the year is over. It’s possible that this won’t be the case, but based on previous comments from Googlers, it should be.

The SEO and webmaster communities have been waiting for Google to launch a new Penguin refresh for a long time. Google has been promising a new version that will update in real time, so those impacted by it won’t have to wait for Google to push another one to have any hope of recovery. It will instead be constantly updating.

A couple of months ago, Google's John Mueller said he expected Penguin to arrive before the end of the year. Google's Gary Illyes said it was in the "foreseeable future" and that he "hoped" it would be here before the end of the year.

Late last month, Illyes talked about it a little on Twitter with curious parties, tweeting that the update was still not ready for primetime.

He did, however, indicate that it was still on track for this year.

Illyes also indicated at the time that it was not too late to submit disavow files for the next Penguin to take into account.
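For anyone unfamiliar with it, a disavow file is just a plain text file uploaded through Search Console, one rule per line, with comments prefixed by #. A minimal sketch of the documented format (the domains and URLs below are placeholders):

```
# Spammy directory links; removal requests sent, no response.

# Disavow a single bad page rather than a whole domain
```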

This week, Barry Schwartz at Search Engine Roundtable points to further comments from Illyes on Twitter confirming that the next Penguin will indeed update in real time, and will be an actual update to the algorithm as opposed to a data refresh. He calls it a “huge change”.

Assuming that the update does come before the end of the year, that gives us roughly five weeks, so it should be very soon.

The update will be quite welcome to webmasters waiting for a chance to recover from previous iterations of Penguin, but even more significantly, those impacted by the algorithm in the future won’t have to wait so long to see the fruits of any potential recovery efforts. At least in theory.

November 25th 2015 Google, Search, SEO

Google Updates Search Quality Rater Guidelines

Google announced that it has updated its guidelines for search quality raters. The reason behind this (much like the reason for many of the company’s announcements) is the increasing use of mobile devices.

The company says it recently completed a “major” revision of the guidelines with mobile in mind.

“Developing algorithmic changes to search involves a process of experimentation,” says Google search growth and analysis senior product manager Mimi Underwood. “Part of that experimentation is having evaluators—people who assess the quality of Google’s search results—give us feedback on our experiments. Ratings from evaluators do not determine individual site rankings, but are used to help us understand our experiments. The evaluators base their ratings on guidelines we give them; the guidelines reflect what Google thinks search users want.”

“In 2013, we published our human rating guidelines to provide transparency on how Google works and to help webmasters understand what Google looks for in web pages,” Underwood adds. “Since that time, a lot has changed: notably, more people have smartphones than ever before and more searches are done on mobile devices today than on computers. We often make changes to the guidelines as our understanding of what users want evolves, but we haven’t shared an update publicly since then.”

You can see the update here.

Google says it won’t update the public document with every little change, but will try to do so for the big ones.

Image via Google

November 20th 2015 Google, Search, SEO

Search Expert Duane Forrester Joins Bruce Clay, Inc. as VP of Organic Search Operations to Take the Road Less Traveled

Search Expert Duane Forrester Joins Bruce Clay, Inc. as VP of Organic Search Operations to Take the Road Less Traveled was originally published on the Bruce Clay, Inc. blog, home of expert search engine optimization tips.

Where does the former lead SEO at Microsoft/MSN and leader in the development of Bing Webmaster Tools go after an eight-year tenure at Microsoft? If the corporate world is a freeway, Duane Forrester heads for the exit, takes the road less traveled, and joins Bruce Clay, Inc. in a newly created position as Vice President, Organic Search Operations.

Duane Forrester: "It's time to enjoy the open road."

Bruce Clay, Inc. adds Duane Forrester to the team in order to provide businesses with an exclusive advantage in search engine optimization methodology and digital marketing strategy. Forrester, who was awarded Search Personality of the Year at the 2014 U.S. Search Awards, is a visible and popular figure in the search industry.

“Everyone knows that Duane could have gone to work for any company he wanted,” says our president, Bruce Clay. “It is an honor that he chose to work here.”

Duane will have oversight of the organic search direction at Bruce Clay, Inc., as well as the company’s content, social media and design departments. He will also serve as a spokesman for Bruce Clay, Inc. by attending and speaking at industry conferences.

“Duane has great experience with how things work, and that will help us to enhance our existing capabilities for all aspects of digital marketing, including SEO, social media and web design. This is the man who created the Bing Webmaster Tools — that will give us a wealth of insight,” Bruce points out.

An SEO with Many Talents

Duane’s professional career path reveals a man who wears many hats — but one who wears each hat confidently. It’s easy to see why, no matter which hat he wears, Duane gives his all to the project or company at hand. He comes from a small business background and helped run a family-owned motel for 16 years. It was during this time that he learned what it takes to run a small business and discovered that he has a passion for helping companies grow.

Duane has made significant contributions to every place he’s worked, from developing the first player’s program for Caesar’s Palace in Canada to managing SEO at a sports startup that has since become a leader in the sports content publishing world.

Then, Microsoft called. During his eight-year career at Microsoft, Duane served as senior product manager at Bing — a role that led to the creation of Bing Webmaster Tools — and spent nearly five years as the primary touchpoint between Bing and the webmaster community. During this time he became known as the approachable face of Bing, bridging the gap between the search engine’s inner workings and the information-hungry SEO community.

“If you’ve ever heard Duane speak at a conference, you know that he is able to answer any question about how a search engine handles particular SEO concerns, from 404 pages to HTTPS,” said Mindy Weinstein, author and director of training at Bruce Clay, Inc. “He always has an answer and a thorough explanation — we are so excited to be able to take that experience and infuse it into our analysts’ ongoing education and our open enrollment SEOToolSet Training.”

So, the Big Question: Why BCI?

“Over the past decade I’ve been approached by countless agencies,” says Duane. “However, Bruce Clay, Inc. is different because of three main pillars that we stand on: services, proprietary tools, and talent. Most agencies don’t have as deep a tool set. They use rich tools, but not proprietary tools. If they need something fine-tuned for a client, they can’t walk down the hall and ask the programmer to add a parameter — which is something we can do.”

“When it comes to talent, I see agencies struggle. BCI is very stable and has an expert knowledge base. I know the team that’s here — and they’re smart SEOs.”

The Road Ahead

For Duane, the role of VP of organic search operations at BCI will allow him to take all of his cumulative experience and use it in new ways. He will be free to talk openly about Google, turn ideas into products faster, and enjoy the open roads of sunny Southern California — on the weekends, of course.

For BCI, having Duane on the team is a natural part of the company’s growth and commitment to excellence. For the last eight years in a row, Bruce Clay, Inc. has made the Inc. Magazine 500|5000 list as one of the fastest-growing private companies in the U.S. Most recently, in September the company added author, speaker, and paid search expert David Szetela as VP of paid search marketing operations.

“A growing number of our clients come to us for both SEO and PPC services, and Duane’s deep understanding of both disciplines will help accelerate that growth,” David explains.

Are You Confident in Your SEO Agency?

SEO is technical. You pay an SEO agency for its technical insight, experience, and ability to drive traffic that grows your business. What if an agency had all of the above, plus skills and experience that no other agency could claim?

What if an SEO agency had Duane Forrester on the team?

With the addition of Duane Forrester, Bruce Clay, Inc. brings his knowledge of SEO to the world-class offerings of the company.

“If companies want to win at SEO, they should talk to us at some point,” says Bruce.

So let’s talk.


November 17th 2015 SEO

Facebook Lets Google Index Mobile App

Google is now reportedly indexing Facebook’s mobile app as part of its app indexing efforts.

Google has been encouraging app developers to utilize its app indexing resources to enable mobile users to get to content within apps from Google search results. Google has even made app indexing a ranking signal.
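For background, app indexing maps web URLs to matching screens inside an app. On Android, one documented way to declare that mapping is a rel="alternate" link in the head of the corresponding web page, following the android-app://{package_name}/{scheme}/{host_path} format; the package name and path below are placeholders:

```html
<!-- On declare the matching screen
     in the Android app. Package name and path are placeholders. -->
<link rel="alternate"
      href="android-app://com.example.android/http/" />
```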

Apparently Facebook wasn’t getting involved until now. The Wall Street Journal is reporting that Facebook is now allowing Google to crawl in-app content. Of course the content is the same stuff Google was already accessing on the web version, such as public profile information. From the report:

The agreement means that results from Google searches on smartphones will display some content from Facebook’s app, including public profile information. The listings will appear as “deep links” that will take users to the relevant part of the Facebook app, the spokeswoman said.

Google can’t show content shared through logged-in and private Facebook app sessions, meaning it is still locked out of most information inside the walled garden of Facebook’s social network. For those searches, users will have to use Facebook’s search service, which it recently updated.

Facebook has been going to great lengths to improve its own search service. The company recently announced some major improvements to search across personal connections and public posts, and is now testing in-profile search.

It’s unclear from the Journal’s report whether Google is indexing Facebook’s app on just Android, just iOS, or both. The ability to index Android apps has been around longer, but Google recently expanded its efforts to include iOS.

For more on app indexing, view our related coverage here.

Image via Google Play

November 17th 2015 Facebook, Google, Mobile, Search, SEO

Is Google about to Kill Its Penguin?

Is Google about to Kill Its Penguin? was originally published on the Bruce Clay, Inc. blog, home of expert search engine optimization tips.

TL;DR – A theory: The next Google Penguin update, expected to roll out before year’s end, will kill link spam outright by eliminating the signals associated with inorganic backlinks. Google will selectively pass link equity based on the topical relevance of linked sites, made possible by semantic analysis. Google will reward organic links and perhaps even mentions from authoritative sites in any niche. As a side effect, link-based negative SEO and Penguin “penalization” will be eliminated.

Is Google about to kill Penguin?

Is the End of Link Spam Upon Us?

Google’s Gary Illyes has recently gone on record regarding Google’s next Penguin update. What he’s saying has many in the SEO industry taking note:

  1. The Penguin update will launch before the end of 2015. (Since it’s been more than a year since the last update, this would be a welcome release.)
  2. The next Penguin will be a “real-time” version of the algorithm.

Many anticipate that once Penguin is rolled into the standard ranking algorithm, ranking decreases and increases will be doled out in near real-time as Google considers negative and positive backlink signals. Presumably, this would include a more immediate impact from disavow file submissions — a tool that has been the topic of much debate in the SEO industry.

But what if Google’s plan is to actually change the way Penguin works altogether? What if we lived in a world where inorganic backlinks didn’t penalize a site, but were instead simply ignored by Google’s algorithm and offered no value? What if the next iteration of Penguin, the one that is set to run as part of the algorithm, is actually Google’s opportunity to kill the Penguin algorithm altogether and change the way they consider links by leveraging their knowledge of authority and semantic relationships on the web?

We at Bruce Clay, Inc. have arrived at this theory after much discussion, supposition and, like any good SEO company, reverse engineering. Let’s start with the main problems that the Penguin penalty was designed to address, leading to our hypothesis on how a newly designed algorithm would deal with them more effectively.

Working Backwards: The Problems with Penguin

Of all of the algorithmic changes geared at addressing webspam, the Penguin penalty has been the most problematic for webmasters and Google alike.

It’s been problematic for webmasters because of how difficult it is to get out from under. If some webmasters had known just how difficult it would be to recover from Penguin penalties starting in April of 2012, they might have decided to scrap their sites and start from scratch. Unlike manual webspam penalties, where (we’re told) a Google employee reviews link pruning and disavow file work, algorithmic actions rely on Google refreshing the algorithm in order for sites to see recovery. Refreshes have only happened four times since the original Penguin penalty was released, making opportunities for contrition few and far between.

Penguin has been problematic for Google because, at the end of the day, Penguin penalizations and the effects they have on businesses both large and small have been a PR nightmare for the search engine. Many would argue that Google couldn’t care less about negative sentiment among the digital marketing (specifically SEO) community, but the ire toward Google doesn’t stop there; many major mainstream publications like The Wall Street Journal, Forbes and CNBC have featured articles that highlight Penguin penalization and its negative effect on small businesses.

Dealing with Link Spam & Negative SEO Problems

Because of the effectiveness that link building had before 2012 (and, to a degree, since), Google has been dealing with a huge link spam problem. Let’s be clear about this: Google created this monster when it rewarded inorganic links in the first place. For quite some time, link building worked like a charm. If I can borrow a quote from my boss, Bruce Clay: “The old way of thinking was he who dies with the most links wins.”

This tactic was so effective that it literally changed the face of the Internet. Blog spam, comment spam, scraper sites – none of them would exist if Google’s algorithm didn’t, for quite some time, reward the acquisition of links (regardless of source) with higher rankings.

Negative SEO: a problem that Google says doesn’t exist, while many documented examples indicate otherwise.

And then there’s negative SEO — the problem that Google has gone on record as saying is not a problem, while there have been many documented examples that indicate otherwise. Google even released the disavow tool, designed in part to address the negative SEO problem they deny exists.

The Penguin algorithm, intended to address Google’s original link spam issues, has fallen well short of solving the problem of link spam; when you add in the PR headache that Penguin has become, you could argue that Penguin has been an abject failure, ultimately causing more problems than it has solved. All things considered, Google is highly motivated to rethink how they handle link signals. Put simply, they need to build a better mousetrap – and the launch of a “new Penguin” is an opportunity to do just that.

A Solution: Penguin Reimagined

Given these problems, what is the collection of PhDs in Mountain View, CA, to do? What if, rather than policing spammers, they could change the rules and disqualify spammers from the game altogether?

By changing their algorithm to neither penalize nor reward inorganic linking, Google can, in one fell swoop, solve their link problem once and for all. The motivation for spammy link building would be removed because it simply would not work any longer. Negative SEO based on building spammy backlinks to competitors would no longer work if inorganic links ceased to pass negative trust signals.

Search Engine Technologies Defined

Knowledge Graph, Hummingbird and RankBrain — Oh My!

What is the Knowledge Graph?
The Knowledge Graph is Google’s database of semantic facts about people, places and things (called entities). Knowledge Graph can also refer to a boxed area on a Google search results page where summary information about an entity is displayed.

What is Google Hummingbird?
Google Hummingbird is the name of the Google search algorithm. It was launched in 2013 as an overhaul of the engine powering search results, allowing Google to understand the meaning behind words and relationships between synonyms (rather than matching results to keywords) and to process conversational (spoken style) queries.

What is RankBrain?
RankBrain is the name of Google’s artificial intelligence technology used to process search results with machine learning capabilities. Machine learning is the process where a computer teaches itself by collecting and interpreting data; in the case of a ranking algorithm, a machine learning algorithm may refine search results based on feedback from user interaction with those results.

What prevents Google from accomplishing this is that it requires the ability to accurately judge which links are relevant for any site or, as the case may be, subject. Developing this ability to judge link relevance is easier said than done, you say – and I agree. But, looking at the most recent changes that Google has made to their algorithm, we see that the groundwork for this type of algorithmic framework may already be in place. In fact, one could infer that Google has been working towards this solution for quite some time now.

The Semantic Web, Hummingbird & Machine Learning

In case you haven’t noticed, Google has made substantial investments to increase their understanding of the semantic relationships between entities on the web.

With the introduction of the Knowledge Graph in May of 2012, the launch of Hummingbird in September of 2013 and the recent confirmation of the RankBrain machine learning algorithm, Google has recently taken quantum leaps forward in their ability to recognize the relationships between objects and their attributes.

Google understands semantic relationships by examining and extracting data from existing web pages and by leveraging insights from the queries that searchers use on their search engine.

Google’s search algorithm has been getting “smarter” for quite some time now, but as far as we know, these advances are not being applied to one of Google’s core ranking signals – external links. We’ve had no reason to suspect that the main tenets of PageRank have changed since they were first introduced by Sergey Brin and Larry Page back in 1998.
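As a refresher, those tenets fit in a few lines of code. Below is a simplified sketch of the 1998 PageRank iteration; the four-site graph is invented, the 0.85 damping factor follows the original paper, and real-world refinements (dangling-node handling, spam filtering) are ignored:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        # Every page gets a small baseline, plus a share of the rank
        # of each page that links to it.
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = rank[page] / len(outlinks)  # equity splits evenly across outlinks
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

web = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
    "d.com": ["c.com"],
}
print(pagerank(web))  # c.com, with three inbound links, accumulates the most rank
```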

Why not now?

What if Google could leverage their semantic understanding of the web to not only identify the relationships between keywords, topics and themes, but also the relationships between the websites that discuss them? Now take things a step further; is it possible that Google could identify whether a link should pass equity (link juice) to its target based on topic relevance and authority?
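To make the supposition concrete, here is a purely hypothetical sketch, not anything Google has confirmed: link equity scaled by the topical similarity between the linking and linked sites, so that an off-topic inorganic link passes roughly nothing rather than triggering a penalty. The topic vectors, authority score, and function names are all invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two sparse topic vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def link_equity(source_topics, target_topics, source_authority):
    # Hypothetical rule: equity passed = authority scaled by topical overlap.
    return source_authority * cosine(source_topics, target_topics)

seo_blog = {"seo": 0.9, "links": 0.7, "google": 0.5}
seo_site = {"seo": 0.8, "google": 0.6, "content": 0.4}
casino_spam = {"casino": 0.9, "poker": 0.8}

print(link_equity(seo_blog, seo_site, 0.6))     # on-topic link passes equity
print(link_equity(casino_spam, seo_site, 0.6))  # off-topic link passes ~0: ignored, not penalized
```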

Bill Slawski, the SEO industry’s foremost Google patent analyzer, has written countless articles about the semantic web, detailing Google’s process for extracting and associating facts and entities from web pages. It is fascinating (and complicated) analysis with major implications for SEO.

For our purposes, we will simplify things a bit. We know that Google has developed a method for understanding entities and the relationship that they have to specific web pages. An entity, in this case, is “a specifically named person, place, or thing (including ideas and objects) that could be connected to other entities based upon relationships between them.” This sounds an awful lot like the type of algorithmic heavy lifting that would need to be done if Google intended to leverage its knowledge of the authoritativeness of websites in analyzing the value of backlinks based on their relevance and authority to a subject.

Moving Beyond Links

SEOs are hyper-focused on backlinks, and with good reason; correlation studies that analyze ranking factors continue to score quality backlinks as one of Google’s major ranking influences. It was this correlation that started the influx of inorganic linking that landed us in our current state of affairs.

But, what if Google could move beyond links to a model that also rewarded mentions from authoritative sites in any niche? De-emphasizing links while still rewarding references from pertinent sources would expand the signals that Google relied on to gauge relevance and authority and help move them away from their dependence on links as a ranking factor. It would also, presumably, be harder to “game” as true authorities on any subject would be unlikely to reference brands or sites that weren’t worthy of the mention.

This is an important point. In the current environment, websites have very little motivation to link to outside sources. This has been a problem that Google has never been able to solve. Authorities have never been motivated to link out to potential competitors, and the lack of organic links in niches has led to a climate where the buying and selling of links can seem to be the only viable link acquisition option for some websites. Why limit the passage of link equity to a hyperlink? Isn’t a mention from a true authority just as strong a signal?

There is definitely precedent for this concept. “Co-occurrence” and “co-citation” are terms that have been used by SEOs for years now, but Google has never confirmed that they are ranking factors. Recently, however, Google began to list unlinked mentions in the “latest links” report in Search Console. John Mueller indicated in a series of tweets that Google does in fact pick up URL mentions from text, but that those mentions do not pass PageRank.

What’s notable here is not only that Google is monitoring text-only domain mentions, but also that they are associating those mentions with the domain that they reference. If Google can connect the dots in this fashion, can they expand beyond URLs that appear as text on a page to entity references, as well? The same references that trigger Google’s Knowledge Graph, perhaps?
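As a toy illustration of the first step, detecting unlinked domain mentions in page copy is straightforward; the sketch below is a naive regex approach invented for illustration, not Google's method, and the page content is made up.

```python
import re

ANCHOR = re.compile(r"<a\b[^>]*>.*?</a>", re.IGNORECASE | re.DOTALL)
DOMAIN = re.compile(r"\b(?:[a-z0-9-]+\.)+(?:com|org|net)\b", re.IGNORECASE)

def unlinked_mentions(html):
    # Strip out everything inside <a> tags, then look for bare domain
    # names left in the remaining text.
    text_only = ANCHOR.sub(" ", html)
    return set(DOMAIN.findall(text_only))

page = '<p>Barry covered this on and <a href="">example.net</a>.</p>'
print(unlinked_mentions(page))  # {''}
```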

In Summary

We’ve built a case based on much supposition and conjecture, but we certainly hope that this is the direction in which Google is taking their algorithm. Whether Google acknowledges it or not, the link spam problem has not yet been resolved. Penguin penalties are punitive in nature and exceedingly difficult to escape from, and the fact of the matter is that penalizing wrongdoers doesn’t address the problem at its source. The motivation to build inorganic backlinks will exist as long as the tactic is perceived to work. Under the current algorithm, we can expect to continue seeing shady SEOs selling snake oil, and unsuspecting businesses finding themselves penalized.

Google’s best option is to remove the negative signals attached to inorganic links and only reward links that they identify as relevant. By doing so, they immediately eviscerate spam link builders, whose only quick, scalable option for building links is placing them on websites that have little to no real value.

By tweaking their algorithm to only reward links that have expertness, authority and trust in the relevant niche, Google can move closer than ever before to solving their link spam problem.

November 13th 2015 Google, SEO

Overstock Hurt By Google Search Changes Again

Overstock has had a lot of financial trouble over the years as a direct result of how its content shows up in Google search results. It is perhaps one of the best examples of how drastically a reliance on Google traffic can hurt a business when things go wrong.

Overstock released its financials for Q3 this week, with earnings down a reported 11% thanks in part to algorithmic changes at Google. This wasn’t the only problem the company pointed to, but it was a significant one.

In the actual earnings report, Overstock said, “We are experiencing some slowing of our overall revenue growth which we believe is due in part to changes that Google made in its natural search engine algorithms, to which we are responding. While we work to adapt to Google’s changes, we are increasing our emphasis on other marketing channels, such as sponsored search and display ad marketing, which are generating revenue growth but with higher associated marketing expenses than natural search.”

CEO Dr. Patrick Byrne told investors on a conference call, “Third of the problem was the Google search change, as it affects everybody. It affected — it was a little bit different this year than it was in previous years in some respects in who it helped and who it hurt. But we think we’ve already learned our way out of that.”

These comments did nothing to help the company’s stock price, which immediately tanked by 17%.

Overstock was famously penalized by Google for its search tactics in 2011. The company had been encouraging websites and colleges to post links to Overstock pages so students could get discounts. The program had already been stopped before the penalty hit, but some participating sites were slow to remove the links; Google caught wind of them and dealt a major blow to the company, leading to an ugly financial year. Overstock went from top-position search results to results on pages five and six.

That debacle happened because of what Overstock did. Those were unnatural links, and the company learned the hard way that Google won’t stand for them. This time, the company simply got hit by the algorithm, as so many sites do.

Image via Overstock

November 12th 2015 Google, Search, SEO

The Power of a Page Analyzer: I Ran ‘The Great Gatsby’ through an SEO Tool & This Is What Happened

The Power of a Page Analyzer: I Ran ‘The Great Gatsby’ through an SEO Tool & This Is What Happened was originally published on the Bruce Clay, Inc. blog, home of expert search engine optimization tips.

Ever wondered what would happen if you ran classic literature through an SEO tool? Me, too!

I’ve got a soft spot for tools that give me an idea of how I’m doing as a search marketer and content publisher. One test of an SEO tool’s power is whether the software can do the job of a careful human expert in a fraction of the time.

The following experiment details what happened when an SEO tool met F. Scott Fitzgerald’s “The Great Gatsby.” While it was devised in fun and out of true curiosity, it ended up being a real-life study of SEO tools at work, worth sharing.

The Setup

I’m a writer at digital marketing agency Bruce Clay, Inc., and I’m also an avid reader. Literature lovers like me can spend hours picking apart themes and character dynamics. SEO analysis, meanwhile, should be as efficient as possible with time and resources. Can an on-page analyzer tell me:

  • How hard or easy text is to understand
  • The relationship between characters (or in the case of websites, the relationship between keywords)
  • And the theme

If a tool can accomplish that with a piece of literature, then it can certainly analyze a web page for appropriate language and SEO relevance. I used a free tool, the SEOToolSet’s Single Page Analyzer, and much to my delight, the SEO tool was able to peg character relationships, point to the theme of this American classic, and crunch 46,000 words in a fraction of the time a person requires. Super sweet.

Fair warning: Yes, there are some spoilers for “The Great Gatsby” but nothing that will keep you from enjoying the book – or Baz Luhrmann’s recent cinematic stunner. (Highly recommended, by the way.)

Reading Level

It’s relatively straightforward for a computer program to grade the difficulty of text content. Readability is one of the word metrics reported by the Single Page Analyzer.

The SPA puts the text at a Fog reading level of 11.5 – which is right on target, as “The Great Gatsby” is most often taught during the junior year.
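The Gunning Fog formula itself is simple: 0.4 times the sum of the average sentence length and the percentage of "complex" words of three or more syllables. A rough sketch follows; the syllable counter is a crude vowel-group heuristic, not what the SPA actually uses.

```python
import re

def syllables(word):
    # Crude heuristic: each run of vowels counts as one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words if syllables(w) >= 3]
    avg_sentence_length = len(words) / len(sentences)
    pct_complex = 100.0 * len(complex_words) / len(words)
    return 0.4 * (avg_sentence_length + pct_complex)

opening = ("In my younger and more vulnerable years my father gave me some "
           "advice that I've been turning over in my mind ever since.")
print(round(gunning_fog(opening), 1))
```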

While the 11.5 grade level is fine for Fitzgerald, for your average web page, it’s a little higher than the recommended target. If this was your own website, you could use the reading level score to think about how accessible your text is to your target audience.

The Great Gatsby Page Analysis

Characters and Their Relationships, or Keywords and Distribution

One of the basic tasks an on-page analyzer performs is identifying the most commonly used words and phrases on a page – in SEO world, a page’s keyword phrases.

The Single Page Analyzer reports the most-used words on a page, organized as one-word, two-word, three-word and four-word keyword phrases. Here are the one-word and two-word keywords of “The Great Gatsby” according to the SPA:
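Under the hood, a report like this is essentially n-gram frequency counting. A minimal sketch (the stopword list is truncated, and gatsby.txt stands in for a hypothetical local copy of the novel's text):

```python
import re
from collections import Counter

STOPWORDS = {"the", "and", "a", "an", "of", "to", "in", "was", "i", "he",
             "she", "it", "that", "had", "his", "her", "at", "with", "as"}

def keyword_phrases(text, n, top=10):
    words = re.findall(r"[a-z']+", text.lower())
    grams = [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]
    # Drop phrases made up entirely of stopwords so "of the" doesn't win.
    grams = [g for g in grams if not all(w in STOPWORDS for w in g)]
    return Counter(" ".join(g) for g in grams).most_common(top)

novel = open("gatsby.txt").read()  # hypothetical local copy of the text
print(keyword_phrases(novel, 1))   # one-word keywords: gatsby, tom, daisy...
print(keyword_phrases(novel, 2))   # two-word phrases: "old sport"...
```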

The Great Gatsby Page Keyword Phrases

The first three one-word keywords are the main characters of the novel, a good indication that the analysis is on point. The SPA has correctly identified that “The Great Gatsby” follows the story of protagonist Gatsby, Tom, Daisy, and the narrator, Nick Carraway. In this analysis, we see Nick by the name he’s most often called – “old sport,” the most frequently used two-word keyword phrase and Gatsby’s favorite term of endearment. Also among the two-word phrases from the analyzer are more central characters and the main location of the action, West Egg.

So far, so good.

A Clue about Relationships, Courtesy of the Keyword Heat Map

Here’s a unique feature of the SPA. Keywords get a visual treatment in the keyword heat map. The report lays out the identified keywords like a topographical map, where the words used the most are the highest peaks, and words that are physically located close together in the text are also placed near each other on the map.

Here’s the SPA’s keyword heat map for “The Great Gatsby”:

The Great Gatsby SPA

Notice that Gatsby and Tom are more closely connected than either of the men is with Daisy! A superficial reading of “The Great Gatsby” may suggest the book is a story of a romance between Gatsby and Daisy. But this data visualization reveals the layer of meaning beyond that. There is a great distance between Gatsby and Daisy. Gatsby is much closer, in fact, to his foil, Tom, reflecting their constant competition and the story’s central tension. How’s that for in-depth character analysis?
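The proximity dimension of such a map can be approximated by measuring how far apart two words tend to appear in the text. The sketch below uses an invented metric, the average token gap between occurrences, not the SPA's actual algorithm:

```python
def mean_gap(words, a, b):
    """Average distance (in tokens) from each occurrence of `a`
    to the nearest occurrence of `b`."""
    pos_a = [i for i, w in enumerate(words) if w == a]
    pos_b = [i for i, w in enumerate(words) if w == b]
    if not pos_a or not pos_b:
        return None
    return sum(min(abs(i - j) for j in pos_b) for i in pos_a) / len(pos_a)

words = open("gatsby.txt").read().lower().split()  # hypothetical local copy
for pair in [("gatsby", "tom"), ("gatsby", "daisy"), ("tom", "daisy")]:
    print(pair, mean_gap(words, *pair))  # smaller gap = closer on the map
```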

Through these three metrics/reports we’ve assessed reading difficulty and important keywords/characters. Next, the real magic of the SPA’s analysis: understanding the theme.

Identifying the Theme

A theme is more than keywords; it is the underlying meaning or idea expressed. On a web page, a theme is pretty much determined by the topic (or subject) and the purpose (do/know/go, i.e., transaction, information, or navigation).

In a novel, a theme is more complex. We can’t understand a literary theme by looking at the keywords alone, but, of the nearly 46K words the SPA counted in “The Great Gatsby,” the most frequently used words can clue us in to where to look closer for the theme.

If you filter out character names from the list of one-word keywords, a few words stand out, namely: house, eyes, and time, all of which are critical thematic elements in “The Great Gatsby.” Together, the keywords of time, house and eyes do, in fact, point to major themes when explored further: an obsession with the past (time), a preoccupation with wealth and materialism (house), and the modernist shift away from God (eyes).


Time

“The Great Gatsby” is filled with references to time. An obsession with time is laced throughout the pages from start to end — particularly with the past. Recall narrator Nick Carraway’s opening lines: “In my younger and more vulnerable years my father gave me some advice that I’ve been turning over in my mind ever since,” as well as the closing lines: “So we beat on, boats against the current, borne back ceaselessly into the past.” The entire premise of the book is a man obsessed with a time gone by. (And note that other top keywords support this idea, too: came, went, back. The verbs themselves tend to look backward, and the actual adverb “back” is mentioned enough to appear in the top keywords.)


House

Chapter 1 holds Nick’s first description of Gatsby’s house, standing next to his own.

“My house was at the very tip of the egg, only fifty yards from the Sound, and squeezed between two huge places that rented for twelve or fifteen thousand a season. The one on my right was a colossal affair by any standard — it was a factual imitation of some Hotel de Ville in Normandy, with a tower on one side, spanking new under a thin beard of raw ivy, and a marble swimming pool, and more than forty acres of lawn and garden. It was Gatsby’s mansion.”

Countless descriptions of Gatsby’s house tell of a lavish palace in nouveau riche style — the setting of constant celebrity-studded galas that last until morning and set the night sky ablaze with lights. Both Gatsby’s house and Daisy’s house (over which Gatsby keeps near-constant vigil) play integral roles in the text. Gatsby’s house, after all, was bought for two reasons: its proximity to Daisy and its grand splendor, with which he hopes to impress her. And both homes speak to the “unprecedented prosperity and material excess” of America in the 1920s (SparkNotes).


Eyes

There are multiple descriptions of characters’ eyes throughout the book, but they’re all overshadowed by the recurring focus on Eckleburg’s eyes.

Remember T.J. Eckleburg? Eckleburg is a faded sign advertising an eye doctor that Gatsby & co. pass by every time they drive into New York from West Egg: “Above the grey land and the spasms of bleak dust which drift endlessly over it, you perceive, after a moment, the eyes of Doctor T. J. Eckleburg. The eyes of Doctor T.J. Eckleburg are blue and gigantic — their retinas are one yard high. They look out of no face but, instead, from a pair of enormous yellow spectacles which pass over a nonexistent nose … his eyes, dimmed a little by many paintless days under sun and rain, brood on over the solemn dumping ground.”

More than just a pair of eyes, Eckleburg is often seen as a god-like figure watching over the sordid affairs of the main players.

Putting It All Together

Here’s my final analysis: “The Great Gatsby” would likely require a few days of engrossed reading to cover the above ground. However, the Single Page Analyzer’s computer processing power effectively discovered key elements and themes of the text, successfully performing a complex analysis of hundreds of pages of signal-dense text in minutes. Wasn’t that fun?!

November 5th 2015 SEO, SEO Tools

LinkedIn: An Underappreciated Gem for SEO

by Jayson DeMers

In the game of SEO, social media networks are becoming more important than ever. And out of all the social platforms, LinkedIn is perhaps the fastest rising star as it relates to SEO.

LinkedIn has more power to impact rankings than it generally gets credit for. It began as a tiny networking startup, but as of the start of 2015, LinkedIn has managed to rack up 364 million members in the 13 years since its founding.

A website with that much attention can’t be ignored by search engine ranking algorithms. It’s especially lucrative for B2B companies, since more than 50 percent of them are finding new buyers through LinkedIn. The underlying networking structure makes it especially relevant for searches in the B2B realm.

The open-publishing aspect of LinkedIn, the ability to post job descriptions, and the capacity to reach out to other firms all provide further benefits. Getting results from these methods isn’t as easy as you might think, though; it may require some changes to your strategy and a little more effort.

Here are a few suggestions.

1. Polish Your Profile

Your profile should be complete to attain the SEO power it’s capable of. Many businesses haven’t taken the trouble to update their pages, however.

It should include information about your company, a link to your website, your company’s physical address, a high-quality photo, and any other relevant information.

A polished profile will strongly improve your rankings for branded search terms, which can go a long way toward protecting your reputation in organic search results. Take a look at the LinkedIn page for Park West Gallery, one of the largest art galleries in the world, for an example of a polished profile. It not only has information about the business, but also job postings.

With this data on the page, Park West becomes relevant for a variety of search terms, which adds relevance to its desired ranking keywords, boosting its rankings for branded and keyword-related queries not only in Google and Bing, but also for searches conducted in LinkedIn itself.

2. Optimize Job Descriptions

Job descriptions on LinkedIn can also enhance SEO visibility. When users search for related keywords or branded terms in Google or Bing, a job description published on LinkedIn can display in search results. To achieve this, begin by focusing on the keywords in the job title. The more specific you can be in your job description, the better. Search engines are more likely to display your company profile if it points to something specific rather than generic.

As a word of caution, avoid keyword stuffing. You don’t want to miss out on keywords with your LinkedIn content, but you don’t want it to appear spammy either.

Search engines can tell the difference between relevant keywords within an article on a profile page and keywords used with no context.

3. Reach Out to Influencers

An effective tactic for getting your page noticed by search engines is bringing it to the attention of important people. Reaching out to super connectors is a great idea, particularly if they’re influential within your field.

Following influencers will give you insight into how other companies and marketers are using SEO to get better rankings. Matthew Capala, best-selling author and founder of Alphametic, has developed a list of top SEO super connectors to follow on LinkedIn.

With the help of these social geniuses, you’ll be able to discover new ways that LinkedIn can get more attention for your business.

Creating brand exposure through SEO is a common goal for most digital marketers, and LinkedIn is becoming more of a tool for doing so. Smart marketers are including LinkedIn as part of their strategy in order to attract more brand exposure and search visibility.

Be sure to visit our small business news site.

3rd Edition of SEO for Dummies is the Perfect SEO Companion

3rd Edition of SEO for Dummies is the Perfect SEO Companion was originally published on the Bruce Clay, Inc. blog, home of expert search engine optimization tips.

Remember me? Who can forget the fresh face we met in 2009, so eager to make us coffee and protect us from evil villains?

SEO for Dummies 2009

The first edition instantly became a reliable reference guide for us, and it remained one until 2012, when it grew in ways that strengthened our reliance on it. The second edition of “Search Engine Optimization All-in-One For Dummies” proved that it was more than just a helper, but an expert in SEO that deserved to be served coffee instead of making it for us.

SEO for Dummies being served coffee

Three years have passed since the second edition, and the SEO world has changed drastically. More than ever, digital marketers need a reference guide that’s accessible, reliable, and reflective of the latest updates in the industry. That’s what they’re getting in the freshly released third edition. Basically, the book is the perfect companion for any business owner, digital marketer, or anyone responsible for driving traffic to a site. This Halloween season, it also proved the perfect companion for Darth Vader, Tinkerbell, Minnie Mouse and more as the book got into various characters.

The book wasn’t the only one in character … we all take Halloween pretty seriously, too.

Your SEO Companion is Here

It’s hard to find the perfect companion, someone who knows what you’re going through and supports you along the way. Harder yet is to find someone who will grow and change with you over time.

But when the third edition of the “Search Engine Optimization All-in-One For Dummies” book arrived at our office, our hearts jumped! Our dear companion is everything we remember it to be, plus a whole lot more.

Sure, it can still teach us the fundamentals of SEO, but it’s grown so much since then! It can also walk us through today’s hottest subjects, including mobile, advancements in search engine algorithms, and the latest internet marketing technologies that pertain to SEO.

The Spooky Side of SEO

One of the scariest things about the SEO industry is that it’s changing at such a rapid pace. It can feel quite lonely without a companion you can trust. The good news is that you’re not alone. “Search Engine Optimization All-in-One For Dummies, 3rd Edition” can be your trusted friend; it’s there to guide you on all things SEO in a way that’s easy to understand.

Get Your SEO Companion

The 3rd edition of “Search Engine Optimization All-in-One For Dummies” is available on Amazon and at Barnes & Noble. Buy a copy today, or better yet, buy it for your office or the business owner or marketer in your life.

Everyone needs an SEO companion; be sure to get the right one. You can also learn more about Bruce Clay’s nearly 800-page SEO reference guide and take a sneak peek inside the book.

October 31st 2015 SEO