Welcome to Search Kingdom

Castles, keeps, moats... No, sadly we haven't got any of those, but we do have all the first hand knowledge you need to help your website to rank well in search engine results. No hype, no false promises, just clear advice, training or direct assistance to get your website found.

Archive for the ‘SEO Tips’ Category

Google+ Vanity URLs Redirection is a 302!

Thursday, November 7th, 2013

Now, Google have never said they were SEO experts for their own sites. In fact, many years ago they did a ‘drains up’ look at many of their properties with a view to improving the way they interact with, well… with themselves, I guess.

This is a good one though!

Google have finally got around to doing some ‘invited’ vanity URLs on Google+ and I think the way they avoided the usual land grab is to be applauded.

However, when you choose to accept their offer of a vanity URL (BTW, why the odd capitals in the URL?), the previous URL has a redirect, but it is a 302 (i.e. temporary) redirect! Check our old one in a header checker – https://plus.google.com/108616727453700388455.
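
You can check any redirect’s status code yourself; here is a minimal Python sketch of a header checker using only the standard library (the permanent/temporary classification simply follows the HTTP status code definitions):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib following the redirect so we can see the raw status code."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def first_response(url):
    """Return (status, Location header) of the first response for a URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as err:  # an unfollowed 3xx raises here
        return err.code, err.headers.get("Location")

def is_permanent(status):
    """301 and 308 are permanent redirects; 302, 303 and 307 are temporary."""
    return status in (301, 308)
```

Running `first_response()` against the old profile URL above is exactly the ‘header checker’ test: at the time of writing it showed the 302 in question.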


301 Redirects and PageRank Dilution

Friday, October 11th, 2013

Just a quick one…

As we have known for some time, a 301 redirect is treated by Google the same as a normal link, and the PageRank/link juice/weight/whatever… flows the same through both.

So when you need to do a 301 redirect, you don’t have to worry about more PageRank dissipating than for a normal link. So why did I emphasise need? Well, although a 301 doesn’t hurt you any more than a normal link, if you change your URL structure when you don’t really need to and then redirect the old URLs with a 301, you are losing PageRank unnecessarily.

Remember, part of the reason PageRank works the way it does is that a percentage of the power is never transmitted via any link; some say this is between 10% and 15%. It is also worth remembering that, whatever the percentage, the remainder is split between all the links on the page, so the actual loss through any one link depends on that split.
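
To put rough numbers on that, here is a sketch using the textbook PageRank assumptions (the 0.85 damping factor and the even split across a page’s links are the classic published model, not confirmed current Google behaviour):

```python
DAMPING = 0.85  # assumed: the classic model passes on ~85% per hop

def passed_per_link(page_rank, outlinks):
    """PageRank handed to each link target: the damped share, split evenly."""
    return page_rank * DAMPING / outlinks

# A page with PageRank 1.0 and 10 outbound links passes ~0.085 per link.
direct = passed_per_link(1.0, 10)

# If that target was needlessly moved and now 301s onward, the redirect
# behaves like one more link hop, so the damping applies a second time.
via_301 = direct * DAMPING  # ~0.072, an avoidable ~15% loss
```

The extra hop is the whole point: a needed 301 costs you nothing more than a link would, but an unneeded one inserts a hop that did not have to exist.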

Google and The Bank of England

Monday, September 12th, 2011

Strange name for a post, eh? Still, got your attention…

It occurred to me the other day that even though PageRank is a far lesser factor in Google’s algorithm than ever before, it is still the most fascinating. Also, it is and will always be the fundamental premise of what has made Google so successful. So even though it has taken a little bit more of a back seat in our thoughts, don’t ever forget this founding father.

So why the Bank of England? Well, the Bank of England regulate the flow and amount of money (well, pounds anyway) in circulation. They also do things like ‘quantitative easing’ (which we now unfortunately know so well), where they inject money into the system by buying bonds, etc.

We know that PageRank dissipates and weakens as it flows from page to page and site to site. This is part of the complex algorithm that regulates the PageRank in the system (you see where I am going with this?). So where does the master PageRank flow regulator sit? How regularly does it pump PageRank into circulation? Is it based upon the number of sites (or, more likely, pages) indexed, the number of links found, etc.? Does Google have a dial that alters the weight of certain links and then pumps some more PageRank into the system to flow to these?

In the same way that every page has a defined PageRank number, there must also be a defined amount of PageRank in the system at any given time. Mervyn King meets Matt Cutts!
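
That ‘fixed amount in circulation’ idea falls straight out of the original PageRank formulation. A toy sketch of the standard power-iteration algorithm (textbook PageRank, nothing Google-specific) shows the total staying constant while the ‘regulator’ redistributes it:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Classic power iteration; `links` maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # seed: 1.0 unit of PageRank in the system
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}  # the share 'pumped in'
        for page, targets in links.items():
            share = damping * rank[page] / len(targets)
            for target in targets:
                new[target] += share
        rank = new
    return rank

ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
total = sum(ranks.values())  # stays ~1.0: the 'money supply' is fixed
```

Every iteration pays out exactly what it takes in, so however Google’s real dials are set, the model itself behaves like a central bank with a fixed money supply.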

Is there a TrustRank regulator too?

Fresh Content Is Not King

Monday, March 21st, 2011

Good content is king, not ‘fresh’ content or ‘chase the long tail’ content or ‘copied and rehashed’ content. No, just plain old good content. Why? Well, Google says so, and anyone who has ever accessed the web to try to find something out says so too… but mainly Google.

Whichever way you look at it, the ‘Farmer’ or ‘Panda’ update from Google was coming from a long way off. Why? Well, people thought they had found a loophole that got less-than-great content to rank well. You know the type of sites/pages? No? Well, have you ever looked for something like ‘get ketchup off of a carpet’ (beautifully worded, I know, but intentionally worded for search)? If you have, you could have come across a page like this…

How to get Ketchup off a Carpet

1. Ketchup is another name for tomato sauce and is used to complement many dishes.
2. Carpets are used on hard floors and come in a variety of different styles and colours.
3. Spilling ketchup on carpets can cause stains.
4. If you have spilt ketchup on your carpet and want to know ‘how to get ketchup off a carpet’ you should consult a specialist carpet cleaner.

Cue AdSense adverts…

Know what I mean? Did you think that Google would not deal with this type of site at some stage? There are other types of sites that Google have attacked on this recent update too, but they all mainly centre around ‘thin’ content or ‘content free’ content as I like to call it.

How have Google done it? Well, the update seems to be site specific and not page specific. This seems like a sensible first stab at this as it is a lot easier to create some rules that would apply to a domain with thin content rather than do this by page, which would lead to lots more work and a far deeper algorithm change. However, watch this space as it will come one day.

So, just to revise my initial words, ‘fresh’ content is king (update your website every minute of the day if you can) and ‘long tail’ content is king too (around 30% of Google searches every day are totally ‘new’ to Google). However, if it is not ‘good’ content then you are most definitely on Google’s radar. Be warned…

Google Ban?

Friday, February 18th, 2011

Don’t worry, Google bans are not as common as you think. However, if you are doing some things that may run close to the wire, then this video from Matt Cutts may help you do what you need to do.

In essence, if you don’t play by Google’s rules, then Google will get you in the end. If you are in your business for the long term, then it makes sense to plan your SEO over the long term too. Google will NEVER stop being attracted by great links and great on-topic content. Anything else is up for debate (no matter what your SEO guru tells you…).


PageRank, TrustRank or Both?

Tuesday, December 21st, 2010

Thank goodness for Google! Even though our business as SEO consultants gets harder, at least Google are constantly mixing things up to enable those who specialise to be… well, specialists… Now, don’t get me wrong, I don’t completely get ‘gooey’ eyed over everything about the big G, but if you like a boss (and let’s face it, they are the boss) who constantly challenges you, then they do this just fine.

So, what am I talking about? Well, this time it is Google (and Bing… liking what I am seeing, by the way, guys!) talking about what signals they get from Twitter and Facebook for natural search. Thank you, Danny Sullivan, for such a great article that kicks this process off.

To get a further insight into this, check out this Matt Cutts’ video.

Don’t Twitter and Facebook ‘nofollow’ their links? Yes, they do, but as Danny points out, both Google and Bing get a feed from both without the ‘nofollow’, so in theory they can pass PageRank (or whatever Bing calls it).

Aren’t Twitter and Facebook links mainly done through a URL shortener? Yes, but do you really think that Google and Bing can’t easily circumnavigate this?

Wait a minute, I just realised this article was turning into a mini Q&A with myself… just one more though…

What about Wikipedia links then? They are ‘nofollow’, but will they now help me rank better? Ah ha! Good question. Does the link pass PageRank? Probably not. Does it pass trust or TrustRank (can I trademark this?)? I would say yes. Put it this way: if you had a page that was hugely authoritative in its genre, was well linked to and had a bunch of axe-wielding (not literally, but you know what I mean) custodians who kept it pretty much spam free, wouldn’t you find a way of using this as a signal? Remember, it is your search engine and algorithm.

OK, one last question then… Why don’t these links pass PageRank then? Because Google has said that if you can’t vouch for a link, or it is a paid link, then use ‘nofollow’. That doesn’t mean they can’t find some other way to use it as a signal, does it?

Instant Caffeine Fix?

Tuesday, September 14th, 2010

I have not really ever written about Google Caffeine. This was deliberate, as I wanted to see how it panned out. And well… it seems to have done exactly what Google said it would, i.e. provide a better and more up-to-date way for them to index. This brings all sorts of advancements, but the main one is speed. Quicker indexing, quicker response, etc.

Did Google Caffeine change things for us greatly? No, not really, but the landscape has changed for Google to build upon, which is the part that does change things a great deal.

Tried Google Instant yet? Now that is a big change for all of us. Instant was in many ways made possible by the Caffeine update and it will greatly affect the way we access Google’s search results. Add this to the personalisation elements that have been happening for the last few years and the way we search and, most importantly, what we choose to access once we have searched has changed a great deal in this time frame.

Try Google Instant for yourself (currently you need to be signed in to your Google account). Did you type less? Did you type more? Did the top results that occurred half way through your query help you? Did it alter the way you typed the rest of the query? Will you turn Google Instant off? Did you use the suggested search less or more? What did you click on? Lots of questions and all of the answers can change each time we search.

Anything that changes the way we search will clearly affect those people who have a view on how best to affect the way Google orders their results. But that is it, we in the SEO world try to affect the way Google orders its results and not how people search. We can never affect that.

So for me Caffeine was and is always something that I had to watch and learn from, not worry about. The May Day update was the big news that happened around the time Caffeine was launched because that directly affected the way I need to think about SEO. As long as Google delivered Caffeine in the way it was meant to happen (with no mess ups that affected search results) then I was and still am prepared to watch how things develop. Sure, quicker results, quicker indexing, more ways to slice and dice the results (Instant), more pages indexed, but nothing that should affect the order of results.

Will Instant change what gets clicked? Yes. Can you affect it from an SEO perspective? No. Can I learn from it over time? Yes.

Oh, and by the way, I am only talking about SEO here. PPC? That is a totally different matter… Head term guys, watch your budgets and ROI! Remember, Google search is all about being the best and getting ads clicked. If you view every change through these eyes, life becomes far less complicated.

Lastly, tried Google Realtime yet? Caffeine at work again.

The Definitive Answer On Nofollow

Thursday, July 8th, 2010

Readers of this blog will know we went through this one when Google announced they had changed the way they viewed the ‘nofollow’ tag.

You can read these posts here:

PageRank Sculpting
Matt Cutts Answers PageRank Sculpting Question
PageRank Sculpting Phase Two

Also, here is a new video from Matt on this subject.

There are many (many…) sites on the web that still have a ‘nofollow’ based PageRank sculpting architecture. There is an element of ‘not broke, don’t fix’ about this, but it is worth bearing in mind that if you have this environment you are burning a great deal of PageRank that you could be channelling more wisely.

Frequency, Quality and Search Engine Rankings

Monday, June 28th, 2010

I am not going to try to cover this huge topic in just one post… What an uninspiring start, eh? Still, now that I have got your attention, let’s see where this leads…

Firstly, what is ‘quality’ in regard to website content? Wholly subjective, isn’t it? Just like the daily newspaper you buy, the ‘quality’ factor is whatever you judge it to be. It is far easier to judge ‘quality’ by standards: a newspaper that is poorly printed, has bad spelling, etc. would be judged as ‘bad quality’. Likewise, a website with copied, badly written, badly formatted content would be judged as ‘bad quality’. Your website’s ‘quality’ is judged by whoever eventually reads it and, crucially, whether they feel it has served them what they wanted or is interesting, inspirational, informative, etc.

Now, frequency and quality with regard to website content can get somewhat blurry when mixed. Is it better to churn out lots of, at best, mediocre content, or to deliver something good and insightful whenever you feel it is appropriate? Even if you are passing on information, do you pass on everything or only that which has real merit for your readership?

These frequency and quality questions mainly come down to what your website is for and what you are trying to achieve. Is it for your own interest? Are you trying to sell something? Are you delivering important information? Etc.

Simple stuff, so far? Well, yes until you bring search engines into the equation. Then these pretty basic assumptions change and break up into a fantastically silly guessing game. Does Google like lots of updates? Should I change my home page regularly? Can Google look at my content and see if it is rubbish? Does it care?

Now is a good time to bring in a recent video from Matt Cutts about this subject.

So, is it any clearer? Now, I don’t think for a moment that Google or Matt Cutts will ever be transparent enough to tell you the whole story. However, I also think that the steers they give us are never too far away from the direction we should be heading. The message above all else that has been communicated over the past couple of years from Google is that ‘producing great content will give you the best chance of getting good links’ (except they don’t always mention the ‘good links’ part). The rider is that the great content needs to be known about in the first place, which is somewhat of a Catch-22.

Does Google know if your content is good? Well, no, not really. It knows if you are on topic, it knows if you have copied your content, it knows if it is link-worthy, etc. But, unless they do a hand sort, it does not know if your content is good, and even then it won’t be subjective and will only look at the ‘bad quality’ I mentioned above; in a search engine’s case they are looking for ‘bad quality’ that tries to cheat them or us. The algorithms will pick up most of the attempts to ‘cheat’ Google and a great deal of the semantic elements, but will never pick up whether your post is fantastic. Then again, it doesn’t need to; the web will tell it if it is.

So will frequency help me rank well? Yes, it will, for all the reasons Matt says and many others. But will frequency on its own help me? Somewhat, but not in real terms, and certainly not without the other ‘trust’ and ‘popularity’ factors that Google puts above all others. More than anything, ‘frequency’, as long as it is aligned with good elements of appropriate diversity, will help your ‘long tail’ exposure. For ‘head terms’ there is a much bigger reliance on ‘quality’ mixed with ‘frequency’ to bring link weight to your site as a whole, which will then, in turn, help your site (and its targeted key phrases) rank better. Frequency without quality and diversity will not help you very much, and will thin and spread your PageRank/trust weight at the same time.

If you are looking for search engine spiders to visit your site more, then frequency does help, in the same way that individual page improvements help. But frequency will not help if Google is not really that interested in your site, and even though the ‘supplemental index’ has long been forgotten about, the principles still play a part in what Google will and will not index and how it indexes your content.

This post was really meant to look a bit harder at the ‘fresh content’ mantra of SEO, where some people have taken Google’s words and built their own theory. Personally, I agree with certain elements of the theory, but average-at-best content and average-at-best links will only get you so far, and there is still a lot of effort and money involved in taking this path.

So is content king? Not in my opinion, with regard to better search engine exposure. Google’s fundamental principle has never changed: links and citations are king. However, without quality, popular, authoritative or crucial content, links and citations will always be contrived. And in essence, that can only take you so far, and nowhere near far enough in a competitive search engine ranking environment.

Will May Day Wag The Long Tail?

Thursday, June 3rd, 2010

Back in the day when Google updates used to cause panic and elation in equal measures, it was always pretty apparent what had happened. In some ways it is a shame that Google are now very much in the ‘law of diminishing returns’ phase where many of their algorithm changes go pretty much unnoticed. However, within the last month we have had the ‘May Day’ update which is a little bit more than one of the usual tweaks and is worth mentioning.

The phrase ‘long tail’ when applied to search engine listings describes the countless phrases that are used by us all that fit outside of the ‘head terms’. A ‘head term’ would be something that is used many, many times by lots of different searchers, e.g. ‘pizza delivery’, ‘mortgage quote’, etc. The ‘long tail’ comprises the less used, but many and varied, search queries that often use more qualifying words, e.g. ‘negative equity mortgage advice company’, ‘wheat-free pizza base delivery’, etc.

The (very) basic premise of the ‘long tail’ is that, roughly speaking, you will get most of your traffic (or sales if you run an ecommerce shop) from the ‘head terms’, but from a diversity perspective these head terms will be far fewer in number than the multitude of different ‘long tail’ queries. For different niches this weighting can be very different, and in fact the ‘long tail’ can be your most important traffic source and the one that leads to most sales.

So how does the ‘May Day’ update relate to the ‘long tail’? Well, Google have decided to tackle this type of query in more of an isolated way and try to more closely match the needs of the searcher in relation to the page(s) that are delivered.

Here is a video from Matt Cutts that talks about this change.

So how does this relate to your site? Well, the best way to evaluate your ‘long tail’ exposure is to run a search query report on your analytics programme for a relevant month and look at all the individual searches that bring traffic to your site that are relatively low in number (but are many when all added together) and contain multiple words. Then you can run a report from around mid-May onwards and see whether this has changed, either positively or negatively.
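
As a rough sketch of that evaluation (the query data here is invented for illustration; in practice you would export it from your analytics package):

```python
def split_long_tail(queries, min_words=4, max_visits=3):
    """Separate head terms from long-tail queries.

    A query counts as 'long tail' here if it uses several qualifying
    words and brings only a handful of visits on its own -- the two
    thresholds are arbitrary knobs, not anything Google-defined.
    """
    tail = {q: v for q, v in queries.items()
            if len(q.split()) >= min_words and v <= max_visits}
    head = {q: v for q, v in queries.items() if q not in tail}
    return head, tail

# Hypothetical monthly export: search query -> visits
report = {
    "pizza delivery": 480,
    "mortgage quote": 310,
    "wheat free pizza base delivery": 3,
    "negative equity mortgage advice company": 2,
    "late night pizza delivery north london": 1,
}
head, tail = split_long_tail(report)
tail_total = sum(tail.values())  # lots of small queries add up
```

Run the same split on a pre-May month and a post-May month and compare the tail totals to see which way the update moved you.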

I am still evaluating what I think the triggers are for this change and how Google is making the judgment call on the relevancy and quality of the results it hopes to deliver for ‘long tail’ queries.

More on this in a future post, but in essence this could be a great directional change for Google. However, I am sensitive to those of you out there who have had a real and negative traffic hit from this change.

Shock! Google Doesn’t Use The Keywords Meta Tag

Tuesday, September 22nd, 2009

Well, not really a shock, but good that Google has come out and said this finally.

We have known for a long time that the ‘keywords’ meta tag was not being used by Google. There was also the suspicion that neither Yahoo nor Microsoft were using it either. As we know, the tag had been open to abuse since the early days of search engines and, because it is hidden from the web viewer, it was pretty much fair game for all sorts of spam and keyword stuffing.

Here is a video from Matt Cutts explaining the whole thing.

So until Microhoo come out with the same information, it pays to still use the keywords tag sparingly. It is not much of an effort to put the three or four key phrases your page is being optimised for in the tag. Any more than this is a waste of effort. Also, Google said ‘we don’t use the keywords in the tag’; it didn’t say ‘we don’t use any of the meta tag information to help us gauge trust’. So, don’t stuff that meta tag for old times’ sake and think it won’t be noted!
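
If you want to sanity-check your own pages against that ‘three or four phrases’ rule of thumb (my suggested limit above, not a published threshold), here is a minimal sketch using the standard library’s HTML parser:

```python
from html.parser import HTMLParser

class MetaKeywords(HTMLParser):
    """Collect the content of a page's <meta name="keywords"> tag."""
    def __init__(self):
        super().__init__()
        self.phrases = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            content = attrs.get("content", "")
            self.phrases = [p.strip() for p in content.split(",") if p.strip()]

def keywords_ok(html, max_phrases=4):
    """True if the keywords meta tag holds a sensible number of phrases."""
    parser = MetaKeywords()
    parser.feed(html)
    return len(parser.phrases) <= max_phrases, parser.phrases

ok, phrases = keywords_ok(
    '<head><meta name="keywords" content="seo, seo training, seo consultant"></head>'
)
```

Anything that fails the check is a candidate for trimming back to the handful of phrases the page is actually about.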

Are Your Title Tags ‘Just a Little Bit Rubbish’?

Monday, August 3rd, 2009

Very eloquent title, eh?

Still, now I have got your attention, are they? Just a little bit stupid? Redundant? Boring?

Not going to go massively in depth here (it’s in the training course!), but I have just been doing some analysis for a new client and part of this was checking out the competition. Now, I can understand if you have never paid any attention to your page titles (it is quite nice to walk into a project and know that you can make a big change that is totally under your control), but when you see some of the supposedly ‘optimised’ titles that have been put together, well… why bother?

Here is a random list of things I have come across this morning.

  1. Titles that are so unattractive that nobody would ever click on them (you wouldn’t do this in your PPC, would you?)
  2. Two word titles for a company’s home page (is that all you do?)
  3. Fifty word titles on a company’s home page (no need to list ALL you do)
  4. Company name only on every title (that old chestnut)
  5. Using the ‘|’ (pipe) to separate key phrases (ugly, overused and, because it needs a space either side, it uses more characters than just a comma)
  6. Site-wide titles (yes, they still do exist!)

There are lots more, but these are the ones that come to mind right now.
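
For what it is worth, most of the list above can be screened automatically; here is a quick sketch (the length and word-count thresholds are my own rough numbers, not anything Google publishes):

```python
def audit_title(title, max_chars=65, min_words=3, max_words=15):
    """Flag the common page-title mistakes from the list above."""
    problems = []
    words = title.split()
    if len(words) < min_words:
        problems.append("too short to say what the page does")
    if len(words) > max_words or len(title) > max_chars:
        problems.append("too long; likely truncated in the results")
    if "|" in title:
        problems.append("pipe separators waste characters; try commas")
    return problems
```

Anything that comes back with an empty list at least clears the basic hurdles; whether the title is actually attractive enough to click (point 1) is still your call.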

Your page title is a powerful thing. Use it well and it will improve your on-site SEO by a bigger percentage than almost any other element. However, always remember this is what will be the clickable link when you get listed. If you get to the top three you will get traffic anyway, but anything lower than that and your title will make the difference.

Free SEO Site Review

Tuesday, July 28th, 2009

Sorry for the lack of posting recently. What with one thing or another, I just haven’t had the time to get some done. Maybe it is the temptation to just micro blog things now. Still, no excuses.

As you will have noticed, I put up a box on the top right a while ago offering a free SEO assessment from a top UK SEO, no less… (self-proclamation: is it really needed or worthwhile?). Well, we have now got to the stage where I am getting about two or three enquiries per day for this; thank you! Every person who submits a request gets a personalised assessment back. Some are brief and some are somewhat longer (depending on my current time pressures), but all of them should add some insight and value to the SEO situation for the site concerned.

There were two reasons I started this service. The first was to keep myself fresh (you know what I mean) and ‘with it’ with new sites and situations to look at and analyse. The second was to maybe get some paid assignments out of it too. This is when the initial assessment is greeted with a ‘let’s talk further’. I am pleased to say that I have hit the mark with both of these goals so far.

So what has changed? Well, I would like to open these out a bit and maybe once a month use a particularly good (or bad) situation and write about it publicly. The good news here is that the public ones will be quite in depth (the ones that usually have a fee attached if the person/company wants me to go a lot further) and this (public) assessment will now be free. This will only be by agreement (if you ask for an assessment you won’t now suddenly find me dissecting your site in a live post) and I will also make sure the version that appears here is a little more truncated than the one you will get. Hey, we might even do some live video or screencam ones.

Now, here comes my get out. If this takes off too much I may have to rein back on the amount that I can do and be more selective. But let’s see how it goes.

If you would like me to give you an SEO assessment overview of your site, just send your details via the contact form. You never know, you might be the first one to get an extended assessment for free and make it on to the site too.

P.S. As the name of the site suggests, this site is (meant to) concentrate on the UK SEO scene. So, please don’t be offended, but I do ask for the sites submitted to be UK based (they don’t need to be hosted in the UK, but do need to be administered in the UK). However, if you have a site outside of this region and you think (or know) I couldn’t resist delving further on the SEO side, you are welcome to submit it and give it a try!

Matt Cutts on Google link: Command

Tuesday, July 7th, 2009

My goodness, I knew that Matt was making some videos about questions he has had regarding SEO, etc., but I didn’t realise how many he had made. Well done Matt, it must have been a hell of a session(s).

I found this video about checking for back links, etc., in which Matt also mentions the ‘link:’ search command. I did an article on the Google ‘link:’ command a few days ago and talked about its flakiness. Matt mentioned the Google stance that they only show a few links, to give a small overview of the back link profile while protecting the link data from prying eyes, etc. I am not saying that this is not on the money; it is just that the ‘weirdness’ of some of the results is puzzling. Also, how exactly do Google get real randomness into this subsection of results?

Anyway, here is Matt’s video.

Let’s not put too much weight on this analysis. I am sure it is not going to lead us to the Google search holy grail or anything. I have just always thought that it was very strange for the best search company in the world to put its name to flaky results, no matter how much they downplay it.

Google’s link: Command Revisited

Friday, July 3rd, 2009

Have you ever received one of those emails that says,

“As you may know Google puts a great deal of store in incoming links. We have noticed that your site has very few incoming links according to Google. You can try this for yourself by typing ‘link:www.yourwebsite.co.uk’ into Google. This will show you how many incoming links Google can see for your website. As you can see you only have 23!”

The email usually goes on to say what a wonderful job they could do for you and how 23 links is a pretty rubbish effort.

As you probably know, Google has always treated the ‘link:’ command in a very lacklustre way. This is the direct opposite of the way it treats literally any other search term. I have always been quite puzzled by this and thought that Google may as well just pull the facility rather than leave something out there that was, at best, poor and, at worst, damaging.

So based upon my last post ‘Google Search Operators‘ I thought I would have a small play with the ‘link:’ command mixed in with some other operators. The theory is that any insight you can get into how Google views links must be worth something.

To start with this is what usually happens with the ‘link:’ command in Google. Listed below are the results for this site and some others.

Search Kingdom ‘link:’ results in Google

Search Engine Land ‘link:’ results in Google

Matt Cutts ‘link:’ results in Google

Apart from the fact that both Danny’s and Matt’s sites have infinitely more links than mine (boo hoo), you can see the way that Google treats sites with more importance (and links) via the ‘link:’ command.

The results you get usually mimic the ratio of internal links/pages to external links. Both Danny’s and Matt’s sites have thousands of incoming links from a wide variety of websites, so the results the ‘link:’ command returns for their sites are varied and depict the ratio of incoming and internal links on the sites in question. Try this for yourself on your own site or any sites you may be working on. Are there many results? Can you see more internal links than external links? Are the internal links/pages at the top of the results? Do you see a site-wide external link coming up first? If so, how many pages are shown and what pages are they? Do they look pretty random?

Answering some or all of the above may be a small insight into the way that Google looks at link weight and importance of your site or in general.

Now, after mixing the ‘link:’ command with other operators and having a little bit of a play with this, it seems that in most cases the results go crazy! If you are going to play with this yourself, pick websites that have small to medium amounts of links and, in contrast, ones that have lots of incoming links. The craziness for a site with a small to medium number of links is really interesting and seems to go really off the wall. For instance, the ‘link:’ command really breaks down if you use this site (www.searchkingdom.co.uk) as an example and then add the ‘-site:www.searchkingdom.co.uk’ operator to the search command. For example:

Weird incoming ‘link:’ search results for www.searchkingdom.co.uk

The results really expand from the paltry three for the ‘link:’ command without any operators. As you can see, the thread holds for some of the results, but Google also decides to mix in some results for the term ‘search kingdom’ and include pages that do not link to this website and seem to be about Kingdom Hearts. There are lots of references on the web to ‘search Kingdom Hearts’ and for some reason Google decided to mix these results in with my ‘link:’ command. Does this mean that the operator has ‘broken’ the results here or just made them more interesting?

Try this on your own site and also test this out with some more operators. Also, have a look at the ‘related:’ command. Both of these commands mixed with other operators spit out some really interesting results that are worth examining.

Overall, there is a definite possibility that the ‘link:’ command in Google is just a broken and forgotten thing that no one pays much attention to. This is certainly the reputation the command has built up. However, it is worth having a closer look at the craziness that some of these results throw up and seeing if they can give us even a small insight into how Google views some of its link structure, the weight it places on some links and how it deals with unique links.

Now we all know that a good link is one that is relevant, not bought, intrinsic and valid. These are the links you need to find to give your site the importance and exposure you would like it to have. The quality of your content and the way you market that content will give you more reward than anything else. However, Google holds the cards in this particular game, so shedding any light on what hand they have is always useful.

I would be interested to hear from anyone who has discovered some more interesting results. This will help to determine whether it would be valid and useful to take this analysis further.