
Why I Keep Banging On About Google – An Update

Posted by Pat | Posted in 'Work' | Posted on 12-04-2011


Today is the 90th day since one of my sites was penalised by Google.

Wanna know what that looks like? Take a look at this!

One of the more frustrating things about this situation – and don’t worry, I will be outlining rather more later on you lucky, lucky people – is that everything I am about to say could be completely irrelevant. That is, nothing I have done since January 12th may have any bearing on the future prospects of obtaining further Google traffic for that site.

What I have done, and what all webmasters have to do, is follow Google’s official webmaster guidelines – especially if, like me, they find themselves in all manner of turd, search-results-wise.

I hate to keep reiterating this in my Blog, but it’s a bit like when I used to tell people I was in advertising and they’d say “Oh, you’re the bastard who pops windows on my computer all the time!” or “Oh, so you’re the one to blame for all the spam in my inbox!”. Neither was anywhere near the truth. Explaining that to people is hard work when you’ve never done anything but work hard on a website for its content, in the hope that someone will eventually click an ad and you might get a small smidgen of your income from it. But that’s what I did, and what I still do… so to that end…

I am a ‘White Hat’, as it is known in SEO circles – that is, everything I do with my sites I do in good faith, not to deliberately manipulate my search positioning or to use any other nefarious techniques which the search engines detect on your site and give you a big bollocking for, in terms of traffic.

Like all who sail the Google gravy boat, there have obviously been times when adding some of your keywords/phrases to the content of a page (in context and making sense, naturally) may have set the filters in your own head going (“Is that one too many mentions of that word?”), but I’ve been careful to keep things like that very much on the ‘Optimised’ side of ‘Over Optimised’. If you don’t describe your site properly then it clearly isn’t going to tell the search engines (which are just computer programs, after all) what your site actually is. That is just one example of how, as an amateur SEO, you get a small dose of the heebie-jeebies just doing a simple update. Either way, I’m fairly certain now that nothing like this example is to blame for my current problem. I’ll get to that at the end.

The old advice from Google to help you with ranking well was fairly simple – follow the guidelines, if you find something that doesn’t quite comply, fix it and submit a reconsideration request.

During the last three months, the rules have changed, the feedback from Google has become different, and where I stand and how to go forward are more unclear than when I started.

Let me take you through my six reconsideration requests (I have put things in <>’s to remove the specific site in question for the purposes of paranoia):

Reconsideration Request 1 – 14th January 2011:

Hi,

Thanks for helping previously with this site. It is actually almost exactly a year since the site came out of some kind of previous penalty.

In the last few days, my Google rankings have all but vanished, in a way which looks very unusual and suggests a penalty may have been applied. On this occasion I think I need to make you aware of a site which appears to be deliberately attempting to harm my site’s ranking. I noticed it in December, and I believe it may now have taken effect. The culprit is http://www.<mysite>yo.com/ – obviously they even went to the effort of registering a very similar domain, followed by scraping my content. I received a massive spike of traffic from that domain at the beginning of December and all the traffic was from hard pornography sites. The traffic stopped after a day but the site has seemingly been spidered since.

For your information I have sent a cease and desist to the hosting company (gogax.com), which they have not acted on yet, but I was hoping a human could check whether the fallout from that site is affecting my site’s authority – it seemed right to tell you about it so things can be rectified.

In the meantime, I have as ever been researching and updating the Blog on the site, and tidying up the directory to eliminate dead links and keeping things as my users expect. All in good faith. I have not participated in any ‘in-linking’ or other kinds of link schemes which could possibly flag up a penalty and this is why I am so suspicious of what the above is seemingly trying to do (and succeeding?).

I feel a little helpless in this case, as I learned a big lesson in turning my site around and keeping on the right side of Google over a year ago, as you may see in my previous correspondence. All I can say is I will continue to keep the site fresh and interesting, and hope the rankings hiccup is rectified soon.

Thanks again. All the best,

Pat

The one and only response (at the time) from Google came just 12 days later. Which for those interested is entitled “We’ve processed your reconsideration request for <my site>” and says:

We received a request from a site owner to reconsider how we index the following site: http://www.<mysite>.com/.

We’ve now reviewed your site. When we review a site, we check to see if it’s in violation of our Webmaster Guidelines. If we don’t find any problems, we’ll reconsider our indexing of your site. If your site still doesn’t appear in our search results, check our Help Center for steps you can take.

So I waited and after a week decided my little first request (as mentioned on my previous blog entry) hadn’t worked.

OK! So, to the grindstone! What could actually be wrong with the site that means it doesn’t comply with the guidelines?

Reconsideration Request 2 – 31st January 2011:

Hi again,

So having had my first request processed, clearly that hasn’t hit the spot – so I’ve taken a fresh look at what could be going on.

First – the site I mentioned in the first request which I believed could have caused the trust to vanish from <site name> (<sitename>yo.com) which someone else set up seemingly specifically to hurt me – they scraped my content, bought traffic to it and didn’t seemingly use it for any other purpose – has now gone, thanks to the cease and desist I sent in. So I’m hoping perhaps that will slowly help regain the ranks.

I have had a look at the link profile on <sitename>. Obviously as a <subject> directory there are a lot of links, but they are all individually moderated and unpaid, relevant and properly categorised/described to give the user the best experience. There is no incentive or enticement for the site owners to link back to <sitename>. I have spent some time removing all links on the site which lead to ‘404’ or other pages with outdated information, by hand, so that the directory is completely up to date. That was quite a bit of work!

The blog is written by myself and two writers I contract to write unique and interesting articles. I personally vet all articles before they are published. It occurred to me that in some of the pieces in the blog, links to some places aren’t completely <subject>-related and as a result could look ‘sponsored’. I hate the idea of ‘link sculpting’ but have asked the writers to go back and make any links which were to non-<subject> places (always topical to the story, however) into ‘nofollow’ links, so that you know there is no kind of ‘paid link’ scheme going on. We don’t write articles for exposure or money, we write them because they’re interesting – in fact we’ve never even been approached to do that, but I have a sneaking suspicion that Google doesn’t believe that to be the case for some reason. So, whilst the directory links are still ‘follow’ (I can’t see why they shouldn’t be, given we trust them and recommend them), some are now ‘nofollow’ in the blog.

Also, I wondered if you believed I was obfuscating my affiliate link on the <subject> page by putting it behind a form. I wasn’t – I wanted it tougher for people just to click straight through – but in case this is one of the reasons, I have turned that link into a simple text link. I note that other <subject> directory sites are still holding their positions in Google, sending traffic to the exact same company, but not using forms, so that was a difference I saw.

I have looked at the sites linking to me within Webmaster Tools and the only site in there to do with me was <a different site I own> (5th biggest linker) – but links to <mysite> within <a different site I own> were removed over a year ago when sorting out some cross-site interlinking, which I explained to you at the time and which helped me get out of a previous penalty. I have no idea why they are still showing up in Webmaster Tools all this time later – they’re simply not there and haven’t been for 12+ months.

Literally no other site in that list is anything to do with me, nor have I communicated with any of them to try and purchase/sell links. I have never in fact bought/sold a link in my entire webmaster history of over 10 years, I can’t see why people do it if the content is good!

I continue to keep things fresh, and am undertaking a project soon to add more functionality to the <subject> pages, allowing for commenting and more user interaction. In the meantime, I still remain mildly puzzled by the penalty and have done everything I can think of to get out of it. Please help!

Can I just state for the final time that the site does not participate in ANY kind of linking scheme, we just love <subject>, write about them and show as many as we can find. I don’t know how else to prove it.

All the best,

Pat

OK, so that was quite grovelly and really says nothing much. To be honest I remember thinking “This just HAS to be temporary”, so each request was a small amount of analysis followed by sending it in – but over time they built up into quite a picture of what was going on.

I will also point out that in between requests I was sorting out minor on-page/server issues which I couldn’t for the life of me see would be causing the penalty, so I rarely mentioned them. Anyway, at this point I got a bit more stuck in and submitted the next request before waiting for the canned “We’ve processed” email to arrive from the previous one.

Reconsideration Request 3 – 2nd February 2011:

Hi,

Apologies for slightly superseding my 2nd request in recent days with this one. I wanted to let you know of action I have taken.

1. I have removed references to <niche subject to do with subject> from the title of the site and the description in the page. Whilst that does help with affiliate sales, I understand completely why it would seem wrong to use, as the site prides itself on only concerning itself with mainstream <subject> links and news. The affiliate link is there, but people have to make a concerted effort to enter the link now, as:

2. I have reinstated the form as previously on ‘<niche subject>.html’. I prefer it that way – it stops wandering and unwanted clicks and provides a safeguard to a certain extent, which I am happier with than a plain link. I still think it is relevant to have that link on the site – many, many people do go there and continue to use <sitename>, and it helps pay for the site to run. I have never received a complaint about it.

3. I took a hard look at my web stats (these ones generated from actual server logs) – and have found several ‘SEO’ companies – the sort which sell you back links – that have linked to <sitename>. I will list the domains in question – but again, wanted to tell you that I have never attempted to boost the ranking of <sitename> artificially and so have never ever dealt with these companies. They are:

<deleted links – don’t want to give them any bloody link juice from my blog!>

I have uploaded a screenshot to show you what I was looking at: <mysite/picture.jpg>

I performed the search for the whole of 2010.

Clicking around those and some other referrers (some junk ‘Pharm’ and porn sites), it is clear that whatever links there were to <sitename> on the bad sites have now been removed (I ‘viewed source’ and searched for ‘<sitename>’). But again, this adds ammunition to the idea that someone is intentionally placing <sitename> into what looks like a horrible neighbourhood.

Having no comeback with Google is a little frustrating obviously because I don’t know how I can prove my innocence. I am passionate about the issue because I believe <sitename> is a neat little site which deserves to stay where it was in Google. I hope you can agree, especially after the alterations I have performed.

Thanks again for your time.

All the best,

Pat
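As an aside, the ‘viewed source and searched’ check from that request is easy to automate if you have a list of referrers to work through. A rough sketch in Python – the function and the sample HTML are my own, purely for illustration, not part of any tool mentioned here:

```python
import re

def page_links_to(html: str, domain: str) -> bool:
    """True if the page's HTML contains an href pointing at the given domain."""
    pattern = re.compile(r'href=["\'][^"\']*' + re.escape(domain), re.IGNORECASE)
    return bool(pattern.search(html))

# A referrer that still links out vs. one where the link has been removed.
still_linking = '<a href="http://www.example-directory.com/page.html">a nice site</a>'
link_removed = "<p>nothing to see here any more</p>"
print(page_links_to(still_linking, "example-directory.com"))  # True
print(page_links_to(link_removed, "example-directory.com"))   # False
```

In practice you’d fetch each referrer from your server logs and run this over the response body, keeping a note of which sites still carry a link to you.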

It intrigues me that the first four requests really focussed on incoming links – yet I didn’t do a proper analysis of these until the 6th request. I am mildly kicking myself for not getting the 5th out of the way first, but to be honest it appears it would have made no difference. The canned “We’ve processed” email arrived on the 7th of February.

Reconsideration Request 4 – 10th February 2011:

Hi again.

I’m really sorry this is the 4th request, but the further I look at things, the more issues I see which could be affecting <sitename>.

Having analysed the drop in traffic, I’m convinced the site lies in a penalty as every single keyword combination to hit the front page of the site has dropped in terms of Google traffic by the same proportions. So I’m sorry to keep requesting reconsideration if this is not the case – it just looks and feels like a proper kicking!

To that end, I continued to look at if there’s anything on and off-site which could be causing problems.

In Webmaster Tools, I noted a lot of 404 Not Found errors. Some of these were expected, as there has been a good update recently which will temporarily create 404s from any indexed old links – however, the actual links are no longer on the site (i.e. on pages which people will browse to). A few more came to light in the Tools, and I have updated/deleted the links where appropriate.

The biggest issue in the ‘Crawl Errors’ section appeared to derive from a piece of web stats code which is inserted into the bottom of each page. It seems it was looking for an image on my server rather than theirs. I have liaised with the stats company and updated the code on every page so that the 404 error should now go. Because it was across every page, it could possibly reflect badly on the site.

All in all, the updates performed recently have improved the internal site linking and user experience.

Now, to the external issues. Again, the further I delve into the ‘Links to your site’ section, the more ‘bad’ sites keep cropping up. Along with my original problem with the person who set up ‘<sitename>yo.com’, a lot of these sites (they are too numerous to list), whilst no longer carrying a link, seem to have an SEO/link-buying connection. It genuinely appears to me that I have been ‘stitched up’ by a competitor.

Thankfully it seems many of the links are no longer on the ‘bad’ sites (unless they’re hidden to me as a browser), so in time should drop off Google’s view of my backlinks I hope.

In terms of on-page issues I saw – there was a ‘search result’ for ‘<niche>’ which had been there for years and I had forgotten about, but it had been spidered either way. I have removed that result. Whilst, as previously explained, there is an affiliate link which is <niche>-themed on <sitename>, I have made more effort over the last couple of requests to take away on-page keywords and optimisation on the front page and throughout the site which may mean an unfair ranking for anything ‘<niche>’-related (the worst word was ‘(a word)’ within the title/main content – I never use bad words!).

A surfer has to actively read and click the link on the left-hand side to get to the ‘<niche>’ page, which has a form on it – so reaching the eventual site is not a case of an accidental click or any other on-page trickery. I hope you understand that I’m not trying to shove it down people’s throats (so to speak!) – it is there, people do go there and, if I’m lucky, I make some money. I don’t think Google takes a ‘moral’ stance on such things. What I’m trying to get across is that the site itself was never and is never going to be about <niche>, and therefore it does disturb me to see the lengths that people have seemingly gone to in order to discredit the site’s (and, as you’re reading this, I guess my) credibility by linking <sitename> from loads of adult/junk domains.

I am searching for anything else I can do to help the site, and am continuing to work on improving the user experience every day.

I have read through and digested Google’s quality guidelines in their entirety. Aside from one advisory part about creating a site map (it’s a technical challenge at the moment), I am happy that <sitename> conforms to every guideline positively.

Thanks for your time again – I hope it’s the last time in the nicest possible way.

All the best,

Pat

Honestly, looking back, it seems like a masterclass in how NOT to do reconsideration requests, but to be honest, during these first few requests I could not ‘see’ a single problem on my site. It was far more ‘Google friendly’ than similar sites in the same arena and had a structure/bulk of content which had not really changed over the last year, when rankings were great. So I just couldn’t (and still can’t) see why it would suddenly fall out of favour. It took until the 23rd of February to receive the canned “We’ve processed” email.
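Incidentally, the dead-link tidy-up described in requests 2 and 4 (done by hand at the time) is very scriptable. A minimal sketch, assuming you can supply a list of URLs and something that returns their HTTP status codes – the names here are mine, not from any tool mentioned above:

```python
from typing import Callable, Iterable

def find_dead_links(urls: Iterable[str], fetch_status: Callable[[str], int]) -> list[str]:
    """Return the URLs whose HTTP status indicates a dead page (4xx/5xx)."""
    dead = []
    for url in urls:
        if fetch_status(url) >= 400:  # 404 Not Found, 410 Gone, server errors...
            dead.append(url)
    return dead

# A stub stands in for a real HTTP HEAD request so the logic is visible.
statuses = {"/good.html": 200, "/gone.html": 404, "/broken.html": 500}
print(find_dead_links(statuses, lambda u: statuses[u]))  # ['/gone.html', '/broken.html']
```

A real fetch_status would wrap an HTTP HEAD request against each link in the directory, leaving you a shortlist of pages to update or delete.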

In this time, I employed a friend to help me do some data mining within the site and create some useful code to help me with new sections of the site and, effectively, more good-quality content. It took a lot of work (on his part and later on mine!) but I did it because I had started to become convinced the site was ‘thin’ on content – something that another massive Google algorithmic change (later labelled the ‘Panda’ update) had indicated. Ironically, the site wasn’t affected by either the first (USA) ‘Panda’ rollout on 24th February or the worldwide one on 11th April. Either way, the few Google employees who do say the odd thing here and there online about what Google Search is up to had indicated that this year it was all about ‘tackling spam’.

So, with a background of “create more good content” in my head, the site was updated in a fairly big way, which was scary in itself – it had stayed fairly similar structurally since 2004. At the same time, any pre-existing pages were kept, and the URL structure and other things that Google looks at didn’t alter drastically. But it was a good time to do it anyway, given it didn’t really matter *that much* if Google wasn’t pleased.

The updates were released on the evening of the 3rd of March and guess what I did then:

Reconsideration Request 5 – 3rd March 2011:

Hi,

This request is to do with the penalty incurred by <sitename> on 12th Jan 2011, nothing that’s happened since (I can appreciate you may be swamped with things since the ‘farmer’ <what later became known as ‘Panda’> update!). This is to go with the 4 previous requests, and hopefully will be the very last, having identified more potential problematic areas of my site.

I will outline what I saw and what I have done to fix it.

Problem: Saw that server was sending pages quite slowly in the Webmaster Tools Crawl Stats. Also, the site was on shared hosting and therefore sat on a shared IP with some sites possibly considered ‘bad’

Solution: Move site to dedicated server and IP. This also improved page delivery speed as seen in the crawl stats.

Problem: Noticed in the ‘Keywords’ section of Webmaster Tools that some words were being repeated far too many times considering the number of pages on the site.

Solution: Realised the vast majority of repetitions came from the use of ‘tags’ within the Blog. For example, the word ‘<subject>’ ended up on there almost 40,000 times! I read an article written by Matt Cutts that same day to do with tag clouds possibly being problematic in terms of keyword stuffing, so I set about removing all tags from posts, deleted all links to tags and any tag pages within the site. The index has not completely caught up with these changes yet a week or so on. <and still hasn’t, as of April 12th>

Also, doing the Tag work made me realise there was potential for ‘duplicate content’ issues on the site. The action taken was as follows, on top of the Tag work:

Added ‘noindex, follow’ to any pages where duplicated content and/or keywords may have been flagged, including:

/blog/page/x (paginating the posts)
/blog/category/categoryname/ (where whole blog posts had been shown in multiple places)
/cgi-bin/rate.pl (a script which looks no different no matter the ID passed to it)
/blog/year/month/date/ (an ‘archive’ which will repeat blog posts shown at their true URLs)

I also abbreviated any Blog posts on the front page of the Blog (which is indexed) so that the actual article had the full wording and therefore should not be duplicated in its entirety anywhere else.

Problem: A multitude of duplicated Meta Descriptions mentioned in Webmaster Tools

Solution: I have written by hand much better meta descriptions which aren’t all generated by the scripts on the site and are suitable descriptions of each page’s contents. For the blog, the article itself is used in generating the description (before, it was the same for all blog posts!). This is ongoing work in terms of the <subject> directory but I would say over 2/3rds of pages have been updated. The index is slowly getting these. <all pages updated and indexed now on April 12th>

Problem: Variations of the correct URL showing in the index, e.g. http://www.<sitename>.com/subdir/index.html vs http://www.<sitename>.com/subdir/

Solution: Added ‘canonical’ tags to the appropriate pages so Google should know which is the correct URL to consider.

Problem: Some old files (e.g. <listed some old pages>) were still on the server which I found when doing a ‘site:’ command in Google.

Solution: Deleted the files. They were all irrelevant to the content of the site for the last few years.

Problem: Discovered a backlink from <a site I own> (A site I own).

Solution: Removed it. It’d been there years and I’d simply forgotten about it.

Problem: Not so much a problem, but I was concerned that Google may consider the site ‘thin’ because of it being an (albeit well-maintained) directory.

Solution: I have added an extra page for each <subject> which gives (if available) actual thumbnail previews of the <subject>, plus allowing users to comment on and rate each <subject>. There was some seriously fun programming involved in obtaining the <subject> previews and I’m glad it was done. I hope you can have a little browse and see yourselves how the site has improved.

I am now at a point where I am really struggling to see what else I can do on the site to get back into Google’s good books. If it’s simply a case of the site being flagged as ‘<niche>’, I’d kindly ask you browse the site yourself and ask if you’d put it into that bracket.

Either way I hope if nothing else, you have had a chance to read this and see I am serious about making my site completely comply with the guidelines and will do anything to improve it. It’s been hard work but I will continue, and thanks again for reading this through.

All the best,

Pat
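For anyone wondering what the ‘noindex, follow’ and canonical changes in that request actually look like in a page, here is a sketch of the head fragment being generated. The helper function is my own invention for illustration – it isn’t part of any Google or WordPress tooling:

```python
def head_tags(canonical_url: str, index: bool = True) -> str:
    """Build the robots meta tag and canonical link for a page's <head>."""
    robots = "index, follow" if index else "noindex, follow"
    return (
        f'<meta name="robots" content="{robots}">\n'
        f'<link rel="canonical" href="{canonical_url}">'
    )

# A paginated blog page: kept out of the index, but its links still followed.
print(head_tags("http://www.example.com/blog/page/2/", index=False))
```

The idea: paginated, category and archive pages get ‘noindex, follow’ so they stay out of the index but links on them are still crawled, while every page declares its one true URL so variants like /subdir/ and /subdir/index.html don’t compete as duplicates.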

What is mildly frustrating but also good is that during this period I figured out the direction I’d like the site to take, regardless of any Google reasons to be doing it – the frustration comes because I’m a shit programmer so my friend is basically lumbered with doing it – and as he charges like a slave and works brilliantly and totally understands the whole thing I don’t want to subcontract the work. Which means, I have to wait for my even better site.

I really do think the site is pretty cool now. I worry it could “just be” this one small thing or that one small thing – but you can’t just go updating every last thing you read about just to please Google. If I did, I’d have had a nervous breakdown by now and probably fallen out of favour with Bing (and therefore Yahoo!), who incidentally rank and like the site just like Google used to, but have always sent 10% of the kind of traffic Google used to, simply because of their market share.

So this time, without waiting for the response (it had been nearly a month), I submitted this:

Reconsideration Request 6 – 1st April 2011:

Hi,

My request, with links enabled to make life easier for you guys is here: http://www.<mysite>.com/googlerr6/rr.html

Username: <deleted>
Password: <deleted>

I seemingly exceeded the word count on this form otherwise I would have submitted it ‘as is’. I don’t want to truncate it as I feel it explains everything properly and wholeheartedly.

Thanks again for your time

Pat.

And yes, you lucky, lucky people, here it is in full:

Hi again,
Having done everything I can possibly think of to the ‘on page’ side of my site as described in the 5 previous requests (though some nice features are on their way, but that is about user experience, not technical/SEO problems), I thought I’d shift my focus to links, specifically those pointing at the <mysite>.com domain.

First I will give a little history of what I know (in case some year+ old requests aren’t visible to you) and what I have done about it and what I see now – and how that relates to the position I find the site in now.

<mysite> first incurred a penalty in April 2007. At the time I guess I was in denial as to what could have happened and took a stoical approach believing that perhaps the site had just fallen out of favour. I had no idea that the sitewide cross-linking of some of my sites which I had on the site was causing me problems. I gradually learned some SEO and come October 2009 I was determined to sort out the problems by following the guidelines completely and wholeheartedly and submitting a reinclusion request, owning up to creating links between sites, although I had done it in good faith at the time as a means of passing relevant traffic, not enhancing my search placement.

After 3 requests and 3 months and having removed the interlinking and making changes for the better, I saw the site bounce back. I was elated! I had doubted that the site was penalised at all before then, so this is why I am now convinced what I see is a penalty again, having emerged before. Since that day I have been very careful not to tread any line which could have been seen as ‘bad’ in Google’s eyes.

So when the site all but vanished from Google’s results in January 2011, almost exactly a year after emerging from a penalty I was (and to a point, still am) hopelessly stumped, but have worked very hard to sort this out.

Obviously with no feedback from Google, it is frustrating to not know if anything you are doing is completely fruitless, but it has focussed me on doing better and as a result, the user experience is improving all the time and I am excited about the site’s future.

I know that’s very non specific and the sort of thing they tell you not to put in a reinclusion request, but I feel I have to get it off my chest. I’m not an SEO by trade, this situation has forced me to become an amateur one, so I’m sorry if this sounds more personal than it should – I’m not the sort of person to just outsource this to a company, and besides, I couldn’t trust they wouldn’t do something nasty after what I’ve seen lately.

Getting to the point…(!)…

This is my Google traffic graph to <mysite> with the 12th (or thereabouts) of January hit: <a link to a picture much like that at the top of this Blog post>

It continues to drop further now, but I am aware this is probably because of deoptimising some keywords onsite in case that was the problem.

I took a second look in Webmaster Tools at the links in to the site, and was surprised just how many links there were to the <subject> submission form (http://www.<mysite>.com/add.html).

I learned of a company, Majestic SEO, who can generate charts detailing your site’s backlinks over time – using their tools immediately stunned me. Take a look at http://www.<mysite>.com/googlerr6/majestic-1.jpg (cumulative) and http://www.<mysite>.com/googlerr6/majestic-2.jpg (last 30 days)

It seems many of the links have come and gone, but according to Majestic’s cumulative and ‘Fresh’ (last 30 days) data, these are links pointing to that page: http://www.<mysite>.com/googlerr6/backlinksforurl.csv (cumulative) and http://www.<mysite>.com/googlerr6/backlinksforurl-2.csv (fresh).

Looking at the anchor text, it is horribly apparent that it is a mixture of the completely irrelevant, the slightly adult (which could be deemed mildly relevant to <subject>, I guess) and the outright pornographic/illegal-sounding. I am convinced these links are raising a red flag at your end and causing my current problems. In November 2010, according to http://www.<mysite>.com/googlerr6/majesticseo_backlinks_history_backlinks.csv there was a massive spike of these links. It would make sense that this has done its part to knock <mysite> out, especially when you compare it to the Google traffic graph above, where there seems to be a spike, followed by the penalty.

Having read these two (and many other) articles backing this theory up, I am doing what I am told is the right thing to do, and file this request:
http://www.webmasterworld.com/google/3964441.htm
http://www.searchenginejournal.com/what-if-my-competitor-buys-spam-links-to-my-website/28786/

I want to categorically state, and would happily sign a legal document to attest that absolutely none of those spam links were bought, organised, arranged or otherwise by me or anyone else who has anything to do with <mysite>. Also I have not bought, sold or otherwise other links of any nature to <mysite> for any SEO purpose, ever. I have one affiliate link running from the site, and that is IT.

My ‘confessional’ to Google to do with links is really all tied up in the 2009/10 requests where I admitted and removed:
– The interlinking between my (related topic) sites as mentioned above
– a page (http://www.<mysite>.com/<subject>-links.html) which incentivised those who liked <mysite> to link back to the site
Again, at the time these two ideas were done in good faith purely to exchange traffic, and in no way trying to game any search engine.

I do not participate in ‘link schemes’. I don’t answer anyone emailing for a link and never have, and I don’t ask my <subject> owners to link back to the site. The site is a directory of <subject>, with eventual links out to the source of each <subject> – so there are external links involved, clearly – but every submission through the add.html page which has seemingly been ‘googlebombed’ is heavily moderated. It used to end up with a lot of spam in it, but none of that ever made it onto the site. This year I added a ‘security code’ to reduce false/automated submissions, but either way nothing gets onto <mysite> automatically. Quite often a large number of <subject> don’t make it through as submitted, because many people hotlink from elsewhere for the sake of running ‘Made for Adsense’ sites, and I like to attribute each <subject> to its actual source – so those do not get included as a rule.

I have written more about this in the new about and contact page: http://www.<mysite>.com/about-contact.html – I want to be as transparent as possible to you and the <subject> owners/people who submit new <subject> about how I am doing everything in my power to keep the site as a great destination for all.

Back to these links to add.html (why they chose to bomb that page, by the way, I have no idea – other than perhaps someone has implied it is used for nefarious purposes and told Google so; call me paranoid). I have gone through the spreadsheets of 309 / 261 (cumulative / 30 days) links – presumably Majestic believes the other 185,000 have dropped out of the link graph, as that extra data is not there when I download the spreadsheets. As quite a lot were in Russian and Chinese, it has been a case of using ‘whois’ information rather than finding contact details on the sites themselves. Many didn’t mention contact information anyway, and looked ‘set up’ for spamming. I’ve had 27 emails bounce so far, and not a single reply yet. I am checking all the time to see if my links are being removed. Interestingly, there is a Russian SEO (oggin.net) who has an OK reason for linking to that page, having translated it, so I don’t really mind that one. But that is literally the only ONE link to that page that seems valid.
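(Checking whether each of those 300-odd pages still carries the link is also scriptable rather than a manual slog. A rough Python sketch – the target URL is a placeholder, and the substring match is deliberately crude; a proper audit would parse the anchors:)

```python
import urllib.request

TARGET = "http://www.example.com/add.html"  # placeholder for the bombed page

def contains_link(html, target=TARGET):
    """Crude substring check; a real audit would parse <a href> attributes."""
    return target in html

def still_links(source_url, target=TARGET, timeout=10):
    """Fetch a linking page; True/False if checkable, None if unreachable
    (unreachable pages may simply have dropped out of the link graph)."""
    try:
        with urllib.request.urlopen(source_url, timeout=timeout) as resp:
            return contains_link(resp.read(500_000).decode("utf-8", "replace"), target)
    except Exception:
        return None
```

Run over the Majestic export, it gives a rolling picture of which removal requests have actually been honoured.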

What you will do with this data, I honestly don’t know. If it goes even a tiny way towards helping Google’s algorithms, then I feel I’ve done something good at least.

I just want you to know I’m constantly on the lookout, and that none of the spammy/porn links are anything to do with me. The campaign seems to have continued well through last month, even though <mysite> is down and out. Though I fear I will be fighting a losing battle, I will continue to contact the sites to try to get my links removed. It’s arduous and obviously tedious, and what I really want is to get back to enjoying building the site and making it better for everyone.

I really want to post all this on your Webmaster Central forum – but as I am seemingly being targeted by spammers, I fear it would add fuel to the fire. Why this is happening, I honestly don’t know; I have never put anyone’s back up or made enemies with the site. So again, apologies for being more personal than businesslike, but this is really the only way I know to ensure a human being at Google reads this and hopefully understands the anguish this penalty is causing me. I have read and digested so much material on Google’s Webmaster Central and other forums that I am absolutely sure <mysite> cannot be breaking the guidelines at this point. I hope this new information about links has finally put the issue to bed, and as ever I will continue to improve the site for those who do keep coming back.

Thank you again for your time.

All the best,

Pat

Phew! I know. As Alan Partridge would probably say, what a load of “Blabbering Crap!”. It’s not untrue, it just waffles – I understand that. But as you might gather from the tone of it, I am becoming just a little bit frustrated, and it really does look like somebody else has had a go (and is continuing) at harming the site – something I have absolutely no control over.

Again, I might be completely wrong with this rather annoyingly repeated diagnosis about links. But other than the minor issues I sorted out as part of the 5th request, I honestly couldn’t find problems on the site itself – and still can’t, no matter how many hours I spend reading the guidelines, forum posts, etc. The whole subject has taken over my life, to be honest, and it feels rather unfair – especially if I am actually right and others can harm your site.

Google’s guidelines do say, brilliantly ambiguously:

“There’s almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question.”

ALMOST being the operative word, and the source of my newfound and irritating (not only to me, but to my entire circle of friends, cos I won’t shut up about it) anger at Google penalising the site.

There are a few indicators to me that Google may be accidentally penalising sites which suddenly get a flow of bad links.

Firstly, the traffic graph just before the penalty hit shows quite a spike, during which the site obtained a ridiculously large amount of (what can only be described as) shit links – especially in November/December, shortly before Google took the site down to the doldrums.

Also, there is the issue with the <mysite>yo.com site – it received over 30,000 visitors in one day in December, all from pretty hardcore porn sites. They had stolen the front page of my site and not even bothered to remove the stats tracking code, so I saw when the traffic arrived. The navigation on that stolen front page all points to the actual site… so the shit traffic arriving there made it look like a doorway page to my actual site. Again, I had no control over that at all – aside from eventually getting it shut down with a formal cease and desist. The reason I go on about this one so much is that it was so obviously deliberate – they hadn’t changed my content AT ALL on the page they stole. There was no obvious gain for them, yet they even bothered to register a domain almost identical to the actual site’s. To me, it’s a clear example of someone knowing how to dick you over, and succeeding.

I’ve never bought or sold a link in my life, but if you have, Google suggests you submit a reconsideration request, own up to it and tell them what you’ve done to fix it. Well, Christ! I almost wish I had gone and bought some, so I could confess all and maybe the human at the other end would get that confessional-priest buzz out of it and push the ‘Hail Marys’ button, rather than not believing the truth I am actually telling them. It feels crap to be in a ‘guilty until proven innocent’ scenario, and the irony is I wish I were guilty, so I could get some form of slap on the wrist and forgiveness from them.

Matt Cutts (Google’s webmaster-spam God bloke, basically) recently posted a video explaining that reconsideration requests only apply if manual action has taken place to knock your site down – that is, as a result of a spam report. The response to my final request came through in 5 days (7th April) and was different. It was entitled “Reconsideration request for http://www.<mysite>.com/: No manual spam actions found” and reads:

Dear site owner or webmaster of http://www.<mysite>.com/,

We received a request from a site owner to reconsider http://www.<mysite>.com/ for compliance with Google’s Webmaster Guidelines.

We reviewed your site and found no manual actions by the webspam team that might affect your site’s ranking in Google. There’s no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team.

Of course, there may be other issues with your site that affect your site’s ranking. Google’s computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users.

If you’ve experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site’s content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you’ve changed the URLs for a large portion of your site’s pages. This article has a list of other potential reasons your site may not be doing well in search.

If you’re still unable to resolve your issue, please see our Webmaster Help Forum for support.

Sincerely,

Google Search Quality Team

Not long ago, Matt Cutts indicated that communication with webmasters was going to be improved, and it seems I was one of the first to see this. I posted about it on Webmaster World – where it was picked up elsewhere, and yesterday (11th) it was confirmed on that page by another Google employee and by Matt Cutts himself. I’d like to think Matt read through the Webmaster World thread, as I go off on a rant somewhat about, well, everything you see above.

Anyway, as a result of this new message, it seems I have been barking up the wrong tree completely in sending these requests. But that leaves me with nowhere to turn to privately engage someone at Google.

The last resort is to post to their own forum, but having done about 100 hours’ worth of reading on there, I already know what the self-righteous volunteers who man it will say. And they’re not Google. And it’s all in public, with people picking apart your site – quite possibly including whoever has done the damage to it, getting a cheap laugh out of it too. They’re not the ones who can actually help me. I’m not saying I’m king of SEO now, but I have banged my head on the desk hard enough to know there is nothing else for me to consider at this stage – and whilst I know it’ll increasingly make me look like a wacko, I’m becoming convinced it’s Google’s fault for allowing spammers to win in my case.

With a lot of people suffering the fallout of the two recent ‘Panda’ rollouts, I think Google are going to be too bombarded to hear me any more. Not that I have any means of communicating with them now anyway.

It’s annoying and bizarre to me that one of Google’s “helpful” communications in Webmaster Tools, if ‘unnatural links’ have been detected, is to use a reconsideration request. I simply don’t know if that message existed back when my site was getting reams of the shitty things – I didn’t get one. If it did, and I’m supposed to submit a request as it suggests – well, I’ve done that, and now the new message basically says ‘Bugger off’.

And guess what: according to my access logs 10 minutes ago, they never accessed the 6th request or any of the supporting documents on the site – so fat lot of use it would have been anyway, even if I was supposed to send it in.
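(Incidentally, that access-log check is easy to repeat. A sketch in Python, assuming a standard Apache combined log format – the paths listed are just the evidence files mentioned in this post:)

```python
import re

# The supporting documents referenced in the request (paths from this post)
EVIDENCE = {
    "/googlerr6/backlinksforurl.csv",
    "/googlerr6/backlinksforurl-2.csv",
    "/googlerr6/majesticseo_backlinks_history_backlinks.csv",
}

# Matches the request section of an Apache combined-format log line
REQUEST = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def googlebot_hits(log_lines):
    """Yield (path, status) for Googlebot requests to the evidence files."""
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        m = REQUEST.search(line)
        if m and m.group("path") in EVIDENCE:
            yield m.group("path"), int(m.group("status"))
```

An empty result over the relevant date range is exactly the “they never looked” situation described above.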

Right, it’s time for me to get on with some actual work again. Some penalties apparently only last 90 days. I’ll let you know tomorrow.

Update – 20th April 2011

Ok, so it’s not tomorrow. I thought I’d give it some leeway. In fact, the site has dropped a little further with the rollout of global Panda – which at the time I thought I had escaped. Google traffic has levelled out about another 15% lower.

I’m becoming increasingly stoical about this. One thing I haven’t mentioned is: it’s not the end of my life. It feels like it sometimes, though. It’s half my income gone – but I can manage for now, so don’t get me wrong. Sometimes one must reassess one’s position and go forward from there.

Things I’m doing now are all algorithm-related, on-page issues – one is a possible ‘doorway page’ issue to do with the ad I linked to from *every page* (I know). I have taken it off a massive chunk of the pages, and will probably remove anything ‘flabby’ completely if there’s no effect.

I’ve also ploughed on with general improvements and thought less about Google, and more about the few nice people who still use the thing.

Either way, I will bring this post to a close, and hopefully one day you’ll get the “Woohoo” post where I’ll outline what I’ve done since the last request – might as well share any nuggets of joy if there are any to be had!

See you on the other side…

Comments (6)

I genuinely sympathise, and I’m sorry I can offer no helpful technical assistance. However, have you considered asking the genuinely good people of reddit? If you kept it (relatively!) brief, this is exactly the kind of thing someone on there might just be able to help you with. Best of luck.

Will,

I have confined my rants to here and Webmaster World for now purely because I want to see if I can unearth what is causing the problem without having to lay it all bare and look a bit desperate in public!

Also, I have some theories to do with padding the site out a bit and removing ads which might harm me short term but at least help me understand where I’ve been going wrong!

Thanks for the comment, it’s nice you took the time… in a way I hope you didn’t actually read all the post for your own sake!!

Are you in a similar situation?

Pat

I’m in a similar situation. I did a reconsideration request in October 2010 that worked – it brought all the traffic back. On March 3rd this year traffic dropped again (a 90% drop). I did one reconsideration request – no effect. I did another and got the “No manual spam actions found” message. This is interesting. There is definitely a penalty applied to the site (it can’t rank for exact-match unique title text, for example), so it must be an “algorithmic penalty”: something the site does has tripped a red flag with the algorithm. Knowing this, I’m actually quite excited, because I can now go through a logical process to try to work out what that flag is. Here’s what I think I’ll do:

1. Register 5 new domains and 5 expiring domains
2. Put simple WordPress blogs on them, with all the “bits” that Google supposedly likes (contact page/policies etc)
3. Divide content from the old site into 10 different categories and move it onto the 10 new blogs with the same titles and content
4. Start building a few links to these pages until they rank
5. Gradually change the “new” sites until they become the same as the “old” site, noting the changes applied (obviously, different things should be changed on different sites)
6. See if the penalty is applied to any of them – if so, the flag is identified
7. If not, I have regained the traffic I had before, except now I have 10 domains instead of 1
8. It will be interesting to see whether the expiring domains perform better than the new ones

Andy,

I love your proactive approach to figuring it out – it obviously can’t be make or break(?), or you wouldn’t risk duplicating or weakening what you already have.

I’m sure, like me, when you come up with a plan like that it’s easy to see the flipside – what anyone in SEO would say and think. Sometimes it’s good to think outside the box and just forget about the repercussions. In a way, I suppose this is what Google means by thinking of the users – a lot of what I did to the site in question threw caution to the wind as regards the algorithm, but improved the look and feel.

One change I made to the site recently created 1400 ‘thin’ pages, but really improved the experience. Obviously none of those pages are going to have any natural backlinks to start with – but I figured it couldn’t hurt something already dead. As it turns out, those pages are starting to get a slow upwards curve of longtail traffic now. Which says to me that the homepage (which links to all of them and seems to be the suffering page) still holds PageRank and flows its juice downwards!

I would be wary of launching so many sites to try to replicate a penalty that simply might not apply to a new (or even an old) domain, because each site’s link profile will be so different from your suffering one. Given that I have a few sites in a similar niche (all linking to the same affiliate), I note that doing one thing on one site doesn’t necessarily have the same effect on another. It’s how strong your trust is that stops things flitting in and out of the filter, which is what I think we’ve both experienced… our sites are just on the cusp, annoyingly.

You might find your October reconsideration request didn’t actually get you back out of it – it might have been what you did, rather than them pushing a button. I have had to reconcile myself to that concerning a three-and-a-half-year penalty on the same site, which it came out of in January ’10 after 3 requests (but also a few changes which could have helped). I have tried every last piece of technical SEO on the site now, and it hasn’t budged – in fact the only massive effect has been harming my rankings in Bing & Yahoo.

I observed something last night which may or may not apply to you. My home page, which I believe is holding back the whole site, was the last page on the site still running an affiliate ad. I recently removed the same sitewide affiliate link from every other page. When doing a ‘site:website.com’ search, the pages which are cached without the ad have the little binocular icons for Google Site Preview – those still carrying the ad don’t. Basically, Site Preview doesn’t consider my content suitable with the ad in place.

It was at the same time (+2 days after removing the ad from every page but the index page) that the long tail to my new (and existing) inner pages improved, which doesn’t seem coincidental.

The affiliate link does lead to adult content FYI (but is relevant to the niche).

Now I have removed all ads from the site completely, so it is utterly, 100% ad free – and so I’m not earning anything from it – but I’m waiting for my ‘Site Preview’ binoculars to reappear, especially on the index page, and crossing my fingers that within a week I’ll see some progress. I’ll let you know.

If you can comment back on your little experiment if you go ahead with it, it’d be fascinating.

No it’s not make or break, although it pretty much was in April 2010 when it was first penalized. When that happened I decided to duplicate the site although with a completely different design and all the content rewritten. Now that it’s been penalized again that first “duplicate” is going strong, which means I’m not in a break situation. Yet. Which is now why I think gradually creating 10 duplicates this time around might be a good idea. Back in April 2010 it took me 6 months before I filed a reconsideration request; I thought the site would come back on its own. Now, I’m much more inclined just to start slinging mud at the wall – reconsideration requests, building exact duplicates, building duplicates with rewritten content etc. Maybe in the process I’ll be able to get a better idea of how G’s algorithm works to avoid red flags. Maybe not. But it’s certainly worth a try, should be fun and will hopefully give me some added security against future slaps with the additional income streams I’ll be creating. I think Google’s lack of communication or customer service of any kind really means you have to get a little blackhat if you want any answers.

I will comment back if I turn up anything interesting.

