SpiderFriendly.co.uk

Search Engine Optimisation :: Frequently Asked Questions

  1. What does SEO stand for?
  2. Why doesn't my site rank well for the keywords I target?
  3. How do I start?
  4. How do I optimise the site copy? Where do I place my keywords?
  5. How do I build links pointing to my site?
  6. How do I optimise the <title> tag contents?
  7. What are meta-tags? Are they really important for SEO purposes?
  8. What is search engine spam? Why is it a bad idea? Is there a SE spam classification?
  9. What is Google PageRank (PR)? What is all the fuss about? I see a lot of sites in the TOP 10 that show 0/10 Google PR - how is it possible?
  10. How do I optimise the site's navigation for SEs? What is anchor text?
  11. How soon will the search engines find my site?
  12. My site used to rank well for my targeted key phrases but today I found out it lost the rankings. Am I banned? What have I done?
  13. Such and such a site ranks #1 for my targeted keywords, but it's no better than mine - why?
  14. I've read a lot about SEO. I've applied all the SEO advice I've read about to my site but it still doesn't show in the TOP 10. Why? I'm desperate. Is Google evil? Help!
  15. Are reciprocal links considered spam? Are they counted towards my link popularity? Aren't they ruining the business image of my site?
  16. Why is it believed that all the directory submissions should be done manually? There are so many auto-submission programs around - and manual submission will take a lot of time, won't it?
  17. How do I create a spider-friendly site map?
  18. Do spiders follow image links?
  19. Do spiders crawl dynamic sites?
  20. Do I hurt my rankings by linking to other sites?
  21. What is the "Google sandbox effect"? Can it be avoided?



What does SEO stand for?

SEO is an abbreviation for search engine optimisation. SEO appeared on the scene relatively soon after the World Wide Web (and the HTTP protocol) was born. With a huge number of web sites and the opportunities they brought as effective marketing tools, the idea of influencing search engine rankings through certain methods and skills quickly found its way into people's minds.

Now, organic search engine rankings and the advantages they can bring are becoming a consideration for more and more business owners, and thus SEO is becoming more important.

There are, generally, two main factors that affect search engine rankings: the [1] content of the site's pages and the number and quality of [2] links pointing to it from the rest of the web. By improving your content and making it more relevant to your preferred theme and keywords, you can achieve great results in the organic search engine listings, but links are also very important, as they add authority to your site.

So, when doing search engine optimisation, pay attention to your on-page factors (body copy, <title> tags, link text) and off-page factors (incoming links from other authoritative and relevant sites).

Why doesn't my site rank well for the keywords I target?

There can be a lot of different reasons for that. Your chosen keywords may be too competitive for the current authority of your site; your copy may not actually emphasise the phrases you are targeting; your <title> tags may be poorly written; you may not have enough quality inbound links; your site may be too new and still subject to the [3] sandbox effect; or you may have been penalised for [4] spam. Go through your on-page and off-page factors one by one.

And patience, patience, patience!

How do I start?

Start from defining your overall goals. Analyse your current positions and ROI, and your current traffic sources (if any). Check your server logs to see what keywords already bring you traffic. Brainstorm more key phrases, then check them using Wordtracker or other keyword research tools.

Now think how you can re-write your existing pages (or write new ones) to emphasise the search terms you've chosen without breaking the overall concept or destroying the marketing quality of your site. If this seems too complicated, reconsider your keywords; most likely, you've selected the wrong ones.

How do I optimise the site copy? Where do I place my keywords?

The site copy is the core of your site's concept and image, so its optimisation is the most important part of your SEO process.

When writing for SEO purposes, place your keywords within the copy, closer to the top. Include them in headers and sub-headers (<h1>-<h6>), but do not try to fool search engines by placing these tags within your paragraphs. Use <strong> or <em> tags instead.
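
To illustrate, here is a minimal sketch of optimised copy for a hypothetical page about handmade oak furniture (all names are made up):

  <h1>Handmade Oak Furniture</h1>
  <p>Our <strong>handmade oak furniture</strong> is built to order in our own
  workshop. Browse the <em>oak tables and chairs</em> in the catalogue below.</p>
  <h2>Why choose handmade oak furniture?</h2>
  <!-- keywords sit near the top, in headers and in <strong>/<em> tags; the
       header tags wrap whole headings, never words inside a paragraph -->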

When your keywords are included in the link anchor text, it increases their weight.

Make sure your copy remains readable and pleasing. Do not stuff it with keywords. When finished, read it out loud to check if you are overusing your search terms - or ask somebody else to read it for you.

Use the Google Toolbar highlighter feature to check your keyword density.

How do I build links pointing to my site?

There are several ways of building links to your site. The first and most well known is reciprocal linking: you contact another webmaster and offer to link to his/her site in exchange for the same favour.

Another common way is directory submission. Directories are numerous, and their main goal is to list sites, so your submissions are, in most cases, welcome. Just be sure you've read and understood the guidelines.

Publish articles and press releases across the web. Offer testimonials and add your link to your signature. Add an RSS feed to your site (it will help you if your content is frequently updated and interesting to a wide audience). Register your RSS feed in the appropriate directories, and soon lots of blogs and news pages will link to you in exchange for an intro paragraph of each article you publish and feed.
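
As a sketch, this is the standard RSS autodiscovery tag you could add to the <head> of your pages so that feed readers and directories can find your feed (the feed URL is a made-up placeholder):

  <link rel="alternate" type="application/rss+xml"
        title="Latest articles" href="http://www.example.com/articles.xml">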

Stay away from FFAs (free-for-all schemes), link farms, pyramids, automated reciprocal linking scripts and reciprocal link directories that offer automated link exchange procedures.

Make your site good. Quite likely, you will soon find out people are linking to it just because they like it.

Do not buy links for the sole purpose of boosting your link popularity. Buy advertising - and make sure it converts.

See also: [5] more on link building.

How do I optimise the <title> tag contents?

Optimising the <title> tag in the <head> portion of your HTML-based web page is a very important part of your overall SEO work. The page title is displayed in the topmost area of your browser window, and it pretty much tells your users what your page is about - for this reason the search engines pay special attention to it.

To increase the relevancy of your main keywords, include them in your <title> tag, but do not overuse this tactic: twice is OK, but the same word repeated three times may be considered spam. If your brand name doesn't consist of keywords, add it to the end of the title. The search engines only read the first part of a long title, so your keywords should be closer to the beginning.

The title of your page should be readable, short and descriptive. If some words are unnecessary, remove them.

If you are targeting both singular and plural forms of your main keyword, it is sometimes possible to include both of them in the title. Do so only if it sounds logical.
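
For example, a hypothetical title for a site selling garden fountains, with the key phrase (in both singular and plural) near the beginning and the brand name at the end:

  <title>Garden Fountains: Stone Garden Fountain Catalogue - Acme Ltd</title>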

What are meta-tags? Are they really important for SEO purposes?

Meta tags are special tags placed in the <head> portion of an HTML-based page; they are intended to provide certain information to browsers and search engines, such as the description, keywords, copyright, etc.

Their common format is: <META NAME="tag name" CONTENT="value">

The following meta tags are most often mentioned in connection with SEO: "description", "keywords", "robots" and "revisit-after".

<META NAME="description" CONTENT="the description of your page">

You can use this tag to add a good, descriptive summary of the page's content. There is no need to make it too long, but pay attention to its marketing quality. It will not help you achieve better rankings in Google and will do very little for them in Yahoo!, but there is a chance Google or Yahoo! (as well as some other SEs) will use the content of your "description" meta tag as the snippet when your site is listed in their result pages. A good snippet impels potential visitors to click on your listing when they see what they are actually searching for - or to go elsewhere. So, include your keywords, but do it moderately. Stuffing won't work in this case.

<META NAME="keywords" CONTENT="your keywords">

This tag is ignored by Google, but other search engines pay some attention to it. However, stuffing it with keywords is not recommended - it might harm rather than help your rankings. Separate your keywords using blank spaces or commas - it makes no difference. Do not make the list too long: the minor search engines that still read this tag and can bring you traffic should know exactly what you are targeting, and there is no need to confuse them.

You may delete this tag completely if you wish. It won't change your overall traffic significantly.

<META NAME="revisit-after" CONTENT="10 days">

This can be set at 15 days, 7 days, 1 day or whatever you like. Many webmasters add this tag to their pages in the hope that it will force a visiting spider to come and re-crawl the page at a frequency of their choosing. It does nothing of the kind. Most spiders simply ignore the tag, and many interpret it quite the other way, which is: "Do not come earlier than instructed". If your pages contain this tag, remove it.

<META NAME="robots" CONTENT="INDEX,FOLLOW">

This tag tells the spiders to index the page and follow all the links found on it. This is the default behaviour, so there is no need to add the tag at all.
Use <META NAME="robots" CONTENT="NOINDEX,NOFOLLOW"> if you do not wish the spiders to read and index a particular page.
Use <META NAME="robots" CONTENT="INDEX,NOFOLLOW"> if you want the page indexed but the links on it not followed.
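
Putting it all together, the <head> of a hypothetical page about handmade candles might look like this (the robots tag is shown only for completeness - being the default, it can safely be omitted):

  <head>
  <title>Handmade Candles: Scented Candle Shop - Example Ltd</title>
  <META NAME="description" CONTENT="Handmade scented candles, made to order and shipped worldwide.">
  <META NAME="keywords" CONTENT="handmade candles, scented candles">
  <META NAME="robots" CONTENT="INDEX,FOLLOW">
  <!-- no "revisit-after" tag: spiders either ignore it or read it as "stay away" -->
  </head>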

What is search engine spam? Why is it a bad idea? Is there a SE spam classification?

Search engine spam is a common name for all the deceptive techniques aimed at artificially increasing the relevancy of a page to frequently searched keywords that have little or no relevancy to the site's theme.

Search engine spam includes:

  - hidden or barely visible text stuffed with keywords;
  - keyword stuffing in the copy, tags or attributes;
  - doorway pages created for spiders rather than humans;
  - cloaking (serving one page to spiders and another to human visitors);
  - misleading redirects;
  - link farms, FFA pages and similar artificial linking schemes;
  - duplicate pages and mirror sites.

There is no need to spam the search engines. Good rankings can be achieved without all those outrageous methods, and if you look at actual SERPs, you won't see many spammy sites close to the top. Search engines are interested in producing good, relevant results, so they constantly improve their spam filters and ban sites that abuse their algorithms too obviously. It goes without saying that the resources and talents of their developers could be put to better use than fighting spam.

Spammy techniques are bad for sites that use them. Usually, the quality of such sites is very low. They are bad for the Internet surfers who have to spend more time finding what they really need. Unscrupulous SEOs who use such methods to "optimise" their clients' sites often leave these clients with banned domain names and ruined businesses, which is, perhaps, the worst of it.

What is Google PageRank (PR)? What is all the fuss about? I see a lot of sites in the TOP 10 that show 0/10 Google PR - how is it possible?

Google PR (PageRank) is an indicator of a page's link popularity, calculated according to a complicated formula that uses the PR values of all the pages linking to the page in question. It doesn't evaluate such things as relevancy or link anchor text, and the formula, however complex it might be, is still not sophisticated enough to say everything about the link authority a site receives. It is, therefore, only one of the factors used when the actual rankings of the page are calculated, and by no means the most important one. There are many other factors to take into account, so if you analyse the sites ranking in the TOP 10 for particular keywords, you won't see their pages ordered by descending PR.

The Google PR value shown in the Google toolbar is rather inaccurate and only gives you a vague idea of a site's real link popularity. So, when choosing the best link partners for an exchange, do not go by the toolbar PR value alone. Analyse other factors such as relevancy, quality - and, of course, actual search engine rankings.

How do I optimise the site's navigation for SEs? What is anchor text?

In order to optimise your navigation for the search engines, simply optimise it for your human visitors. Improve usability, make your navigation logical, and link your keywords to the pages where those concepts, services, or products are described in detail (if your site contains such pages, of course). Anchor text (the text of the link between the <a> and </a> tags) should contain keywords: this increases their weight and helps your pages achieve good rankings. Unless you abuse the technique, it genuinely improves usability. Do not link from your body copy to the very page that contains it with the sole purpose of including keywords in a link: it looks ridiculous, no human visitor needs such a link, and anything done only for spiders is bad SEO. Linking a page to itself from a navigation block, however, is normal practice: navigation blocks are often repeated across the whole site.
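
A quick illustration with made-up page names - keyword-rich anchor text against the uninformative alternative:

  <!-- good: the link text tells both visitors and spiders what the target page is about -->
  <a href="/oak-tables.html">handmade oak tables</a>

  <!-- bad: the link text says nothing about the destination -->
  <a href="/oak-tables.html">click here</a>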

When your site grows and gets hard to navigate, add a site map. It is equally beneficial for humans and for spiders, provided it is done using plain spiderable HTML links with no JavaScript or drop-down boxes.

JavaScript-based links, as well as drop-downs, are very spider-unfriendly, so be sure to provide plain HTML links to all the pages within the site. Otherwise, your site won't be spidered properly.

How soon will the search engines find my site?

Different search engines have different crawling speeds. Google is very fast; Yahoo! is very slow. But if you know for sure that your site has at least one incoming link from a well-established site, and the page it is placed on is indexed by search engines, then those engines will send their spiders to you - sooner or later.

Google will most likely come the next day. If the link pointing to your site is authoritative enough, Google will index several pages on its first visit, not just the home page; otherwise, try to get more links.

In some cases, you will have to wait a month or two for Yahoo! to learn about your site. But it will come, too. MSN is sometimes even quicker than Google.

My site used to rank well for my targeted key phrases but today I found out it lost the rankings. Am I banned? What have I done?

Most likely, you have done nothing bad (because, of course, you know better). There are different reasons why you could lose your positions, such as an algorithm dance (the search engine has tweaked its ranking algo and changed the weight given to different ranking factors) or a datacenter dance (the fresh information acquired by spiders is being propagated across the servers that store the index); anything can happen to sites' rankings during such periods.

But if you know for sure you have done something very questionable, it is a good idea to check if you are actually banned. If you aren't, remove everything deceptive ASAP.

Such and such a site ranks #1 for my targeted keywords, but it's no better than mine - why?

Well, search engines must have their reasons to think otherwise. Are you sure you've checked all factors?

Check your competitors' inbound links. Try to get links from the same sites. Write more content. If possible, make your site considerably better. Never stop working on it. The day will come for you to win if you are a dedicated and hard-working person. But don't become obsessed with your competitors. Think about your site.

I've read a lot about SEO. I've applied all the SEO advice I've read about to my site but it still doesn't show in the TOP 10. Why? I'm desperate. Is Google evil? Help!

Google is not evil. Search engines will only survive if they help searchers, and for this reason alone they are our good friends and deserve respect and friendly treatment. They may not always be very successful in delivering the results we are searching for, but their algorithms are improving continuously.

When doing SEO we should always remember that search engines owe us nothing. They work for searchers, not for SEOs. But we are supposed to help them if we want to get something in return.

The SEO advice you've read and applied might be wrong. Then again, it might be right, but the problems appear because you haven't applied it completely. There are always ways to improve further; SEO is a never-ending process, so take a quick rest - and then carry on with your good work. If your niche is very competitive, try reconsidering your keywords and choosing less competitive ones. And if your site is new, it may simply still be sitting in the [6] sandbox.

If you feel like your patience is exhausted, the only thing that is going to help you is a good sense of humour.

Are reciprocal links considered spam? Are they counted towards my link popularity? Aren't they ruining the business image of my site?

Reciprocal linking is a legitimate method of promoting websites online, and when done properly it doesn't ruin your business image at all; approached thoughtfully, it can even improve it.

Despite the well-known opinion that search engines (e.g. Google) consider reciprocal links spam and will soon penalise them, this just doesn't seem logical. There are too many cases where sites are naturally supposed to link to each other - partners' sites and academic works, for example. Penalising them for doing so would be harmful to the Internet, and the people who develop search engine algorithms are smart enough to realise it.

It is true that link pages do not transmit as much authority as content pages, but this is easily explained by the fact that they contain more links - and the authority is shared among them all. Only if you abuse reciprocal linking can your link pages end up disappearing from Google's index.

Link exchange pages that look like link farms do spoil the appearance of your site. However, resource pages that are easy to find and use can be really helpful to your visitors, so adding categories, implementing user-friendly design, and not overloading them can work wonders. And if the pages still seem too long, sub-categorise further.

Do not be afraid of one-way links to sites you really like. Generosity won't hurt you in this case: when your resource page contains valuable one-way links, people will realise you've created it for purposes more noble than simply acquiring more links to your site. In fact, it is recommended that the majority of your outbound links be one-way links to high-quality, relevant sites.

To protect yourself from possible complications, just follow a few simple rules:

(1) [7] Do not link to bad neighbourhoods. Avoid unrelated sites unless they are useful for all Internet surfers, like search engines, general directories or communities.

(2) Before writing a link exchange request, be sure to read linking policies, which can differ from site to site. Let the other party know you've actually done so (i.e. quote these guidelines in your request if appropriate). If you can find the name of the site owner, use it in your request.

(3) Resist the temptation to automate the process.

(4) Remember that quick and aggressive link exchange campaigns are a thing of the past. It can still be a good idea to initiate 5 to 15 link exchanges when the site is young; after that, just let natural links come through.

(5) Trash all the link exchange requests that look automated (99% of them do). If you receive a personalised email, review the site to see if it is of any real value to your visitors, and check that it is not a bad neighbourhood. If the site's linking policy says a reciprocal link is a requirement, that is yet another reason not to link to it.

(6) Ask yourself if the link they are offering you is likely to bring direct visitors.

Remember that all new inbound links - reciprocal or one-way - are now given a trial period before they start giving you any ranking benefit in Google, so be patient. As long as the overall number of reciprocal links is not unnaturally high, Google will still count them, but if you get too aggressive, your links will sooner or later be devalued. Other engines don't rely on link popularity nearly as much and pay far more attention to on-page factors.

Using three-way links (where A links to B, B to C and C to A) instead of reciprocal links is not a good idea.

[8] More on natural linking patterns and reciprocal linking issues.

Why is it believed that all the directory submissions should be done manually? There are so many auto-submission programs around - and manual submission will take a lot of time, won't it?

Directory submission - like everything in SEO - is only good when done artistically. That means varying your titles and descriptions, and choosing the most relevant category for your site in each case. The latter task is not as easy as it may seem: every directory has a different category structure.

It is good style to read the site-specific guidelines before you start submitting your site. Like everything else, guidelines differ slightly from directory to directory, and your auto-submitter is obviously incapable of reading them. Many directory owners also attach a "test for humanity" to their submission forms: you have to manually enter certain characters shown to you as graphics. That's done precisely to stop auto-submitters, as the practice is commonly recognised as spam.

How do I create a spider-friendly site map?

Make it using pure HTML, not JavaScript. If you like JavaScript-powered site maps, add a static alternative inside <NOSCRIPT> tags. It will be visible to search engines, as well as to the Internet users who prefer to turn scripts off for security reasons.
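
A minimal sketch of the idea, assuming a hypothetical JavaScript-powered menu in menu.js; the static links inside <NOSCRIPT> remain spiderable even though the script is not:

  <script type="text/javascript" src="menu.js"></script>
  <NOSCRIPT>
  <a href="/index.html">Home</a>
  <a href="/products.html">Products</a>
  <a href="/contact.html">Contact us</a>
  </NOSCRIPT>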

If you make it in static HTML, it is good to indent the entries according to your site's structure: say, your home page is Level 1; pages directly linked from it are Level 2 (with a 5-pixel indent); then come Level 3 pages (with a 10-pixel indent), and so on. Such indents improve usability, and you can use CSS to achieve them.

Be sure to include keywords in your site map links. This will happen naturally if your pages are well optimised: on-page optimisation includes headers, and if you've done everything properly, they should already be keyword-rich. Just reproduce the headers of all the pages in the site map - and you are done. Do not, though, include multiple links to the same page with the sole purpose of squeezing in slightly different keywords: that is a spammy technique.
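
For example, a fragment of such a site map, with hypothetical pages and keyword-rich link text; simple CSS classes handle the per-level indents:

  <style type="text/css">
  .level2 { margin-left: 5px; }
  .level3 { margin-left: 10px; }
  </style>
  <div><a href="/index.html">Handmade Oak Furniture</a></div>
  <div class="level2"><a href="/tables.html">Oak Dining Tables</a></div>
  <div class="level3"><a href="/extending-tables.html">Extending Oak Tables</a></div>
  <div class="level2"><a href="/chairs.html">Oak Chairs</a></div>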

Do spiders follow image links?

Yes, they do. To improve your rankings, add an alt attribute to such images; but we would recommend using plain keyword-rich text links whenever possible.

A mini-sitemap containing links to your most important pages placed at the bottom of every page is, therefore, a good practice; just do not overuse it.
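
To sketch the point with made-up file names - an image link carrying a keyword-bearing alt attribute, backed up by a plain text link to the same page:

  <a href="/oak-tables.html"><img src="/img/oak-table.jpg" alt="handmade oak tables"></a>
  <a href="/oak-tables.html">handmade oak tables</a>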

Do spiders crawl dynamic sites?

Again, yes, they do. They do not like parameter strings with too many parameters (generally, more than two at once), and they hate session IDs, though modern search engines can handle such situations.

So, avoid showing session ID parameters to spiders whenever possible. Try to use POST instead of GET. Make sure different URLs do not point to the same page (spiders will treat them as duplicates); if you think you cannot avoid this, add a <META NAME="robots" CONTENT="NOINDEX,NOFOLLOW"> tag dynamically to all the excessive copies of your pages, or hide them from spiders via the robots.txt file. There is no need to waste spiders' bandwidth by making them crawl the same content several times: they will filter such pages out of their SERPs anyway.
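
For instance, the <head> of a printer-friendly duplicate of an existing page (a purely hypothetical example) could carry the robots tag so that spiders skip the copy:

  <head>
  <title>Handmade Oak Furniture (printer-friendly)</title>
  <META NAME="robots" CONTENT="NOINDEX,NOFOLLOW">
  </head>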

And remember that SE spiders do not support cookies, so if your application requires them, it may become non-spiderable.

Do I hurt my rankings by linking to other sites?

No, you do not - unless you have too many links to bad neighbourhoods. The Internet is supposed to be interlinked, so an attempt to penalise sites for linking out would kill the very idea at once. Quite the contrary: many experts will tell you that if your site links reasonably to good, relevant resources, it will quite probably help your rankings. Our own experience confirms this.

There is nothing wrong with being generous.

What is the "Google sandbox effect"? Can it be avoided?

There are several different Google-specific effects, noticed by SEOs fairly recently, that are somehow connected with a time/age factor. For simplicity, they all come under the umbrella name of the "sandbox effect", but they are not all the same.

Most often, when people say "sandbox effect", they refer to the effect whereby new sites don't receive any decent Google rankings for any valuable keywords during the first months of their existence. This effect is also known as the "aging delay" or "aging filter" and was first noticed after March 2004. Sites that had been launched earlier weren't affected.

Some people say that the aging delay can be avoided. So far, I know of only three examples of sites that actually avoided the effect completely. Two of them were pure blogs and the third a pure forum, which, I think, shows preliminary evidence of a pattern. Regular sites (neither forums nor blogs) do get delayed, though in non-competitive niches they can appear in the TOP 10 for some of their targeted terms quite soon (in about three months). In more competitive niches (like SEO or real estate) the process usually lasts much longer (8 to 14 months).

The aging delay effect wears off step by step and requires a lot of patience from the site owners.

Another type of sandbox effect is the so-called redesign sandbox. When a site is completely redesigned, it is in danger of losing a lot of previously acquired Google rankings for an undefined period of time - but this doesn't happen to all sites.

Sometimes a complete redesign doesn't affect the rankings at all. It is recommended to preserve the old URLs of pages if possible. If that is not possible, a 301 redirect from each old location to the corresponding new one is very strongly recommended. Even if it doesn't help you avoid the redesign sandbox completely, it should at least minimise it.

The third manifestation of the sandbox effect is the trial period applied to all new inbound links. It means that when new inbound links are added, they don't immediately give any ranking boost to the site they point to. They start working partially after about a month and a half, and then, step by step, their weight increases.

People who like to jump-start their linking campaigns sometimes complain that their sites suddenly don't show in the TOP 1000 of Google for any terms, and assume it is because their links trigger yet another type of Google sandbox. It may or may not be true, but jump-starting link campaigns is not a good idea any more. It doesn't comply with natural linking patterns, which are now valued most.

The last type of Google sandbox effect comes into play when webmasters use 301 redirects to merge several of their websites, pointing the more authoritative sites to a less authoritative one. In such cases, it can take up to three months before the full authority of a redirected site is transferred to the main domain and the rankings recover.

Links on this page:

[1] content of the site's pages: http://www.spiderfriendly.co.uk/seo-faq.php#4
[2] links pointing to it: http://www.spiderfriendly.co.uk/seo-faq1.php#5
[3] sandbox effect: http://www.spiderfriendly.co.uk/seo-faq4.php#21
[4] spam: http://www.spiderfriendly.co.uk/seo-faq2.php#8
[5] more on link building: http://www.spiderfriendly.co.uk/linkbuilding-secrets.php
[6] sandbox effect: http://www.spiderfriendly.co.uk/seo-faq4.php#21
[7] Do not link to bad neighbourhoods: http://www.spiderfriendly.co.uk/link-safely.php
[8] More on natural linking patterns and reciprocal linking issues: http://www.spiderfriendly.co.uk/natural-linking.php
