By: seo india


thanks for the book list

May 24th 2008 Uncategorized

Links for 2008-05-22 []


May 23rd 2008 News

Product Review: Auto Stumble


Oh boy, I haven’t done an actual product review in a really long time. Well, with my effort to get back into posting regularly, it couldn’t hurt to do one for the sake of catching up. The more the merrier. :) This one was sent in by the famous Mark from Digerati Marketing. It’s called Auto Stumble. Its job is pretty apparent: it helps you exchange stumbles automatically.

I’ve been doing a lot of Stumble work lately due to my recent release of several large community sites. StumbleUpon traffic doesn’t convert very well, but it has some very good advantages beyond the fact that it’s actual traffic:

1) The few users that convert tend to be very active.
2) StumbleUpon users tend to share links a lot with their friends. Great for branding and word of mouth.
3) They bookmark anything and everything, which means lots of social bookmarking links.
4) They’re suckers for linkbait such as kittens in diapers and neatly featured sites. They really appreciate a good layout.
5) If the timing is right, they do wonders for a site with a Digg button.

Auto Stumble is available for £10 (£20) <- British pounds?... Damn PayPal. It’s OK, PayPal should automatically convert, and it’s available for immediate download after the exchange. That’s about $19.66 ($39.32) for you normal people.

Auto Stumble automates the process in a pretty ingenious way. It runs as a background app in your system tray, automatically stumbles the sites of other users of the program using your account, and has them do the same for yours. It’s pretty nice, but a tough tool to pull off because you’ve really got to have a lot of people running it 24/7. I for one am a fan of this system because, facts be faced, Piqq and StumbleXchange have just gone to shit, and it’s really annoying to log in every day and “earn credits.” Once again, though, lots of people are required to pull this tool off effectively. I’ve already given the tool to the SQUIRT members as a bonus, and Mark has recruited some buyers, so there are enough to at least pull off a featured listing for most topics for at least an hour.

Here’s a video with some more info:

May 21st 2008 SEO Tools

Confirm that you’re using Analytics on all pages


Here’s something from my mailbox – someone wanted to know how he could crawl his site and confirm that all of his pages really have the Google Analytics tracking code on them. WordPress users have it easy; there are plugins that handle it automatically. Sometimes it’s worth asking nicely :) – let me show you how I did it. As a bonus, I’ll also show how you can check the AdSense ID on your pages, if you’re worried that you copied/pasted it incorrectly.

This is pretty much cross-platform, but as a Windows user you’ll have to grab and install two tools first:

  • wget – a tool to download copies of web pages
  • UnxUtils – a collection of popular Unix/Linux tools for the hacker in you

Extract the ZIP files, copy the contents somewhere where you can find them, and make sure that the appropriate folders are in your “path” (the files you’ll need from UnxUtils are in “…\usr\local\wbin”). We’ll need to access these tools through the command line. I have a feeling I may need to elaborate on that for Windows users :) – let me know if that’s the case.

First, we’ll mirror our site on our local machine (this assumes that your site is crawlable; if it isn’t, then fix it first :D ):

  1. Open a command box or terminal window (on Windows, hit Start / Run … and enter “cmd”)
  2. Go to or create a temporary folder
  3. Run the following command to mirror your site:

    wget --mirror --accept=html,htm,php,asp,aspx http://www.example.com/

    This command mirrors pages with .html, .htm, .php, .asp and .aspx extensions on www.example.com (substitute your own domain). It’ll create a folder for the domain and put all the files in it. Dynamic URLs will get adjusted so that they can be used as file names.

  4. Wait … until it’s all downloaded … if it feels endless, you might have endless URLs, perhaps an infinite calendar script or something similar? It’s worth fixing!

Alrighty, now that we have a copy of your site, let’s check things out.

Finding pages without Analytics

We can find pages without the Analytics tracking code by listing all pages which do not have certain content in them:

grep -r -L "google-analytics.com" *.*

This command goes through all subfolders (the “-r” option) and lists the files that do not contain a match (“-L”) for “google-analytics.com”, a string that appears in the standard Analytics tracking snippet (if your pages load the tracker differently, search for a string from your own snippet instead). That could be extended to just about anything :) .

How about pages that don’t have a “description” meta tag?

grep -r -L "meta name=.description" *.*

The “.” (period) matches any character; in this case, it is used to match the " (double quote) before “description”.

Finding pages with AdSense (and the ID used)

Finding pages that contain a certain text is even easier:

grep -r "google_ad_client" *.*

Note that all we did was drop the “-L” (and change the text, obviously). It will show the lines that match this pattern in all of your pages, which includes the AdSense ID.

Similar to the earlier check for missing “description” meta tags, assuming you have the contents of that tag all in one line, you can easily find all of these meta tags with:

grep -r "meta name=.description" *.*

What would you like to search for today?
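If you’d rather not install wget and the Unix tools, the same checks over a mirrored copy of the site can be sketched in a few lines of Python. This is a minimal sketch, not the tool from the post: the function names are made up for illustration, the mirror is assumed to sit in a local folder, and the marker string you pass in should match whatever snippet your pages actually use.

```python
import os
import re

PAGE_EXTS = (".html", ".htm", ".php", ".asp", ".aspx")

def scan_mirror(root, marker):
    """Walk a mirrored site and split pages into those that do and
    do not contain `marker` (e.g. part of the Analytics snippet)."""
    present, missing = [], []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith(PAGE_EXTS):
                continue
            path = os.path.join(dirpath, name)
            with open(path, errors="ignore") as fh:
                text = fh.read()
            (present if marker in text else missing).append(path)
    return present, missing

def adsense_ids(root):
    """Collect every AdSense publisher ID found across the mirror."""
    pattern = re.compile(r'google_ad_client\s*=\s*"([^"]+)"')
    ids = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(PAGE_EXTS):
                with open(os.path.join(dirpath, name), errors="ignore") as fh:
                    ids.update(pattern.findall(fh.read()))
    return ids
```

The `missing` list is the grep -L output; if `adsense_ids` returns more than one ID, something got copy/pasted wrong somewhere.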


May 21st 2008 News

Blue Hat Technique #19 – Keyword Spinning


Holy cripes! It’s been a while since I’ve sat down and written a Blue Hat Technique. It just so happens I need this one for the next SEO Empire post. I’m like blah blah, talking about Keyword Spinning, then I realized you guys have no fuckin’ clue what I’m yammering about. So I figure now’s a good time to fix all that, and luckily this one is really, really easy, but like all Blue Hat Techniques it works like a mofo in many situations.

The Problem
Let’s say you have a database driven website. A great example would be a Madlib Site or an E-commerce site. In fact, this technique works so damn well with Ecom sites it should be illegal alongside public urination, so we’ll use that as our primary example. You’ve got your site set up, each product page/joint page has its keywords such as “17 Inch Water Pipes For Sale,” and the page titles and headers match accordingly. You have several thousand pages/products put together and well SEO’d, but it’s impossible to monitor and manually tweak each one, especially since most of the keyword research tools available aren’t entirely accurate about keyword order. They may say “Myspace Pimps” gets 50 billion searches a day when really “Pimps On Myspace” is getting it. So while amongst your thousands of pages you have one page that could be ranking for a solid phrase and getting an extra 100 visitors/day from people searching for “Water Pipes For Sale 17 Inch,” you’re stuck with virtually no search traffic to that page and never knowing the difference. It’s quite the dilemma, and you probably realize that it’s more than likely already happening to you. Luckily it’s easily fixed with a simple tool you can create yourself to fit whatever needs and sites you have.

1) Add an extra field to all your database entries. For any row that creates a page of some sort, add an extra field called TrafficCount or something you can remember.

2) Add a snippet of code into your template or CMS that counts each pageview coming from a Goohoomsn referrer and increments the appropriate field.

3) Wait a month….*Goes for a bike ride*

4) Pull the titles from the database. Even in a commercial/free CMS, it can safely be assumed that the titles or keywords are held somewhere in the database. Locate them and scan through them one by one.

5) Use the Google/Yahoo/MSN APIs to see if the page ranks for its keywords.

6) If it does rank, compare the traffic count for the month to some sort of threshold you’ve preset. I prefer to use a really small number like 5 for the first month or two, then start moving it up as needed. If the traffic is too low, split the titles/keywords and randomly reorganize them.

*Sometimes you’ll end up with some really messed-up titles like “Pipes Sale Water For Inch 17,” so if it’s too user-unfriendly you may want to make a few adjustments, such as never putting a For/The/If/At type word in front, or never rearranging the front two words, so that “Water Pipes” always stays in front and only the trailing end gets shuffled. Once again, it depends on how your site is already organized.

7) Reset the traffic count.

8) Wait another month and watch your search traffic slowly rise. Every month the site will get more and more efficient and pull more and more deep traffic. The pages that are already performing well won’t change, and the poor performers will start becoming higher performing pages. As an added bonus, it will help improve your site’s freshness factors.

9) Take a scan of your average number of keywords or title sizes. Let’s say your average page has a very short key phrase such as “Large Beer Mugs.” There are only so many combinations those keywords can produce, so if it’s just a low-traffic keyword there’s no point in changing the titles every single month forever. I like to have the Keyword Spinning script run for only a preset number of months on each site. For instance, if my average keyword length is three words, the most combinations I can have is six (3! = 6), so I should logically quit running it after 6-8 months. At that point my site is about as perfect as it can be without human intervention. Lastly, don’t forget to make improvements to your CTR.
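The monthly pass described in the steps above can be sketched in a few lines of Python. This is a hedged illustration rather than a ready-made script: the field names (`title`, `traffic_count`), the stop-word list, and the threshold of 5 are placeholder assumptions you’d adapt to your own database schema, and the rank-checking API calls from step 5 are left out.

```python
import random

# For/The/If/At type words that shouldn't lead the shuffled tail (assumed list)
STOP_WORDS = {"for", "the", "if", "at"}

def spin_title(title, keep_front=2, rng=random):
    """Shuffle the trailing words of a title, keeping the first
    `keep_front` words fixed (e.g. "Water Pipes" stays in front)."""
    words = title.split()
    if len(words) <= keep_front + 1:
        return title  # too short to produce a meaningfully new order
    head, tail = words[:keep_front], words[keep_front:]
    rng.shuffle(tail)
    # never let a stop word land right after the fixed head
    for i, w in enumerate(tail):
        if w.lower() not in STOP_WORDS:
            tail.insert(0, tail.pop(i))
            break
    return " ".join(head + tail)

def monthly_pass(pages, threshold=5):
    """pages: rows with 'title' and 'traffic_count' fields.
    Spin under-performing titles, then reset every counter (step 7)."""
    for page in pages:
        if page["traffic_count"] < threshold:
            page["title"] = spin_title(page["title"])
        page["traffic_count"] = 0
    return pages
```

Keeping the front two words fixed implements the “Water Pipes always stays in front” adjustment from the note under step 6, and resetting the counter on every pass matches step 7.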

Simple, huh? Keyword Spinning is a really easy way to get the most out of nearly all your sites. The more you can squeeze out of each site, the fewer sites you have to build to reach your profit goals. With minimal scripting it’s all very quick to implement and automate (please don’t do it by hand!). That’s all there is to it. :)

Usually with my Blue Hat Techniques I like to drop a little hint somewhere that references a really cool trick or spin on the method that’ll greatly increase your profits. Since you’ve all been so damn patient about me being late on the SEO Empire part 2 post, and have, for the moment at least, quit asking me why Blue Hat sucks now, I’ll just tell it to ya. My answer to that question, BTW, is that I’m still working on my projects, which is eating up some time, and I’m not happy with what I’ve written so far. If I’m not happy, it doesn’t get published. Sorry, but the boss has spoken. :)

The Secret Hint

3) Wait a month….*Goes for a bike ride*

Use this technique on your Cycle Sites that you’ve chosen not to cycle out. Instead of competing with the original author, whom you are probably linking to might I add, you can sometimes grab even better phrases and rank for them, giving you a ton more traffic (I’ve seen Cycle Sites increase their SE traffic over 50x by doing this). If not, you’ll eventually get their original title again, which at least will put you back where you started. It’s also the strangest damn thing: you’ll get a percentage fewer complaints and pissed-off bloggers when you switch the titles around; maybe they don’t care as much when they don’t see you ranking for their post titles.

May 19th 2008 News

Millions of Links won’t get you indexed, a Letter to Matt Cutts


Hi Matt,
I am really pissed off at Google.
Out of the 4,700 websites I launched last month, only 17 got indexed by Google.
What really pisses me off is that all of them contain original content (by original I mean non-duplicate; no way can any other algorithm duplicate the way I am generating these unique articles. Just a hint: each 50-word article contains words from at least 10 languages, and I am not aware of any other algorithm this advanced), and yet Google won’t index my babies.
I think Google has some real issues with indexing, as every one of my sites has 4,699 site-wide links. On average each site has 25,000 pages, so that makes around 117 million relevant links to each website (the relevancy here comes from the fact that all these sites were generated by the same script, and hence the multi-language relevancy).
I spent the whole week generating sitemaps and uploading them to Google Webmaster Central, to no avail.
The only consolation in all this is the AdSense revenue from this network. I make more than $400 a day. Imagine if I can get all the sites indexed. Really think about it. Well, I must say that I am using a unique algorithm to generate this income stream; I think it is borderline with the AdSense terms of service, so I won’t disclose it, sorry guys.
Now I am thinking about renting a dedicated server and donating it to Google, on the condition that they use it to crawl my network of sites. If 1 server is not enough, I can throw in a couple more. Thanks for taking the time to look into this.

May 13th 2008 Uncategorized

By: Greg


I’m in! Let’s fix this developing mess and misconception.

May 1st 2008 Uncategorized

By: Greg


Couldn’t agree more. When you are ready to build this team, count me in!

Let’s fix this developing mess and misconception.

May 1st 2008 Uncategorized