Many of you will probably have noticed that the Google keyword tool has officially switched over to the new interface. But after using the new version (well, new to me, as I always preferred the old interface), I started to notice some strange results: figures much, much lower than I had seen before. So I decided to do some digging and came across this article: R.I.P Google Keyword Tool. It looks like other people are having problems with the tool as well. It seems that Google have also admitted the tool now has a filter on it and will only show data on commercial keywords. This is bound to make keyword research much harder if they are only giving data on certain keywords. Only time will tell whether this is a permanent change or whether Google will change the tool due to the public reaction.
Google have a great hour-long video on how to bring more users to your sites through the search engines. You might know a lot of the information, but there could be a few new things you have yet to discover.
A thread sprang up last month on the Google Webmaster Help forums with an interesting issue: duplicate content harming rankings in instances where the duplication may, in fact, have been appropriate.
The particular problem was for two sites that required separate, but incredibly similar, content for an Irish audience and a UK audience. Each one was hosted on a country-specific domain, but due to the overlap between the UK and Ireland, the Irish site was appearing in the UK search rankings.
Because of the duplicate content showing up in one set of rankings, both sites dropped in rankings.
What then follows is some helpful advice from two contributors on how to avoid this problem, the best option being to utilise sub-domains so that the duplicate content is contained within one place.
The highlight of the thread is when ‘JohnMu’, a Google employee, enters to outline the company’s policy in these matters, and the difficulties they face when deciding what to do. He mentions that normally in these circumstances, one site will be removed from the rankings so that the users are not faced with duplicate entries.
If you care to check the two websites listed by ‘Mr Code Red’ and attempt to find them in Google, it appears that the measures ‘JohnMu’ talked about have been taken. The Irish site does not appear, but the UK one does.
From this, we can see that overlap between country-specific search results has to be considered during SEO work. We’re also reminded of the fact that research beforehand can stop these problems from cropping up in the first place.
A quick read and potentially useful. So, as JohnMu says, “Hope it helps!”
Remember back in the day when you used to see AOL Keywords printed on video cassettes and adverts? Well, it seems the trend may be starting back up again. While watching a trailer for Watchmen on British TV, I noticed that at the end of the advert it displayed a message along the lines of “Search online for Watchmen”. So are some companies deciding to ditch giving out URLs in favour of giving out keywords? I will report back when I find out more on this topic.
So where do we start? If you have been looking into the use of anchor text on websites then you may have already established an opinion of what is going to work. You may already be familiar with a few of the thousands of posts, documents and articles available to read about the correct use of anchor text, but it seems people struggle to agree on the most beneficial way of its use.
This post will look at another suggestion of how anchor text is analysed by Google and put forward a theory on how the system may work. Those knowledgeable people over at SEOmoz have produced another theory on the benefits and drawbacks of anchor text and its location on a webpage.
For those people who are thinking ‘what is anchor text?’: anchor text is the clickable text of a link, giving the user relevant information about the destination of that link. It may also be worthwhile knowing what a deep link is: a link that leads to any webpage (on the same domain or a different one) that isn’t the home page – this is usually associated with anchor text. It is a way of getting you to the page you need without making you go through the home page.
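To make the definition concrete, here is a minimal sketch (the domain and URLs are made up for illustration) that treats any link whose path goes beyond the home page as a deep link:

```python
from urllib.parse import urlparse

def is_deep_link(url):
    """Return True if the URL points anywhere other than the site's home page."""
    path = urlparse(url).path
    return path not in ("", "/")

# Hypothetical example URLs:
print(is_deep_link("http://example.com/"))                  # home page -> False
print(is_deep_link("http://example.com/products/widgets"))  # deep link -> True
```

A real crawler would also need to handle things like query strings and trailing index pages, but the idea is the same.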
The new theory on anchor text appears to be (according to the published information at SEOmoz) that the location of your anchor text plays a part in the result of the keyword term searched. For example, suppose you have a series of links at the top of the page – ‘Home’, ‘Information’, ‘Products’, ‘Contact’, ‘Location’ and ‘FAQ’ – yet you also have an anchor link further down the page saying ‘see our products’ which leads to exactly the same place as the ‘Products’ link at the top of the page. It seems that Google will only acknowledge the first of the anchor text links to the products page.
To make things a little more complicated, it seems that Google doesn’t read the page like a user would. It appears that it is the code that is read by Google and the order of the links in the code bears the most weight as to what Google identifies first as anchor text.
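To see what “first link in the code” means in practice, here’s a small sketch of the idea – my own illustration, not Google’s actual parser – that walks a page’s HTML in source-code order and keeps only the first anchor text seen for each destination URL:

```python
from html.parser import HTMLParser

class FirstAnchorFinder(HTMLParser):
    """Record the first anchor text seen for each href, in source-code order."""
    def __init__(self):
        super().__init__()
        self.current_href = None
        self.first_anchor = {}  # href -> anchor text of the FIRST link to it

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.current_href = dict(attrs).get("href")

    def handle_data(self, data):
        # Only keep the text if this href hasn't been seen earlier in the code.
        if self.current_href and self.current_href not in self.first_anchor:
            self.first_anchor[self.current_href] = data.strip()

    def handle_endtag(self, tag):
        if tag == "a":
            self.current_href = None

# Two links to the same page: only the first one's anchor text is kept.
html = '<a href="/products">Products</a> ... <a href="/products">see our products</a>'
finder = FirstAnchorFinder()
finder.feed(html)
print(finder.first_anchor)  # {'/products': 'Products'}
```

Under this theory, ‘see our products’ would contribute nothing, because ‘Products’ appears first in the code.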
Why are these things never straightforward?
For more information, check out what the people from SEOmoz had to say about anchor text.
Team member Sam found this cool video with Maile Ohye from the Google team talking about some SEO topics. It’s a long one, so I will report back with the important points later.
The expectation from clients of seeing top ten results in Google and Yahoo within a short space of time can be demoralizing for most SEO executives. I mean you’ve only got to make vast onsite changes as well as start link building and viral marketing campaigns…
The problem is: how do you explain to a client that SEO rarely works quickly? Results aren’t visible within hours, and recent links may take a while to have an effect. Every one of us working in SEO understands that clients want to see positive changes as soon as possible, but next time you are getting it in the neck for not “producing the expected results”, remember and pass on these details to keep their business.
Search engines take time to re-crawl sites – Authority sites that are used by millions daily won’t be troubled as much, but can still take days or weeks to be crawled. Sites that are slightly lower in Google’s “have to crawl” list can take up to 4 months before all of the site’s pages have been fully indexed.
Linked sites need to be crawled too! – Sites that you are linking with will need to be crawled by Google too. Unless you are linking with authority sites (in which case, well done!) you will have to be patient to experience the fruits of your labour.
Search engines reward patience – Google and Yahoo quickly learnt that black hat link building is often done for the short term, whilst good, quality links last. This will see a link’s power/worth increase over time, thus making your site a trustworthy one. New sites and domains suffer with trust issues, especially with Google, but over time you will see your links having a positive effect on your rankings.
Patience is one of the most important qualities when it comes to SEO, particularly whilst we are going through tough economic times. It is advised that you give a good 3-4 months’ work on a client’s site before expecting the positive results to roll in.
Not really SEO related, but Google have put up a site to celebrate their 10th birthday; there is an interesting interactive timeline that shows you the history of Google.
Here’s an oldie but a goodie: ever wondered how to check which pages on your site carry the most link juice for the main search engines? Well, wonder no more – below is an example of how you can find your strongest pages on Google and Yahoo.
The results will probably vary between the two due to their respective algorithms, but they should give you an idea of which pages have more juice, so that you can redirect it to pages that need it.
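As a rough, do-it-yourself companion to those checks, you can also approximate which of your own pages hold the most internal link equity by running a simple PageRank-style calculation over your site’s internal link graph. This is a minimal sketch of the published PageRank idea with a made-up link graph, not how Google or Yahoo actually score pages:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank scores for a {page: [linked pages]} graph."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        # Every page gets a base share, plus a share from each page linking to it.
        new_ranks = {page: (1 - damping) / n for page in pages}
        for page, targets in links.items():
            if targets:
                share = damping * ranks[page] / len(targets)
                for target in targets:
                    new_ranks[target] += share
        ranks = new_ranks
    return ranks

# Hypothetical internal link graph for a small site:
site = {
    "/": ["/products", "/contact"],
    "/products": ["/"],
    "/contact": ["/"],
}
for page, score in sorted(pagerank(site).items(), key=lambda x: -x[1]):
    print(page, round(score, 3))
```

In this toy graph the home page comes out strongest, since both inner pages link back to it – which matches the intuition that pages with the most internal links pointing at them tend to carry the most juice.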