Everyone knows that search engine optimization is the game to play these days. After all, it seems as though it’s the only thing anyone talks about anymore. But how do you go about properly optimizing your website? Do you really need to spend a bundle of money on special tools and consultants?
The answer is no – as long as you have a little bit of time and willingness to learn. In this article, we’ll take a look at some simple steps you can take that have been proven to greatly improve the ranking of websites within search engine results.
But first, I’ll define search engine optimization, according to Dictionary.com, so that we start on the same page: [search engine optimization is] the process of choosing targeted keywords and keyword phrases related to a Web site so the site will rank high when those terms are part of a Web search; abbr. SEO. A good basic definition – you will see in a moment, however, that SEO does go above and beyond mere keywords.
So let’s get started from the ground up.
We’ll start at the very base – your website’s structure. This pertains to the way your site is put together, and every other aspect of SEO builds upon this. I’ll start with the basic website/web page elements you do not want to have, and I’ll also explain why:
> Re-directs – Re-directs refer to pages that are blank and just point to another page. Many search engines run into trouble when they try to add re-direct pages to their databases, so this is something you want to stay away from. Many re-direct pages are not indexed, and if you are using a re-direct for your homepage, there goes any chance you may have had of any part of your website landing in search results.
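A blank re-direct page of this kind is usually nothing more than a “meta refresh”. Here is a minimal sketch of what one looks like (the URL is just a placeholder) – note that there is no content at all for a search engine to index:

```html
<!-- A blank re-direct page of the kind to avoid: it contains no
     indexable content, only an instruction to jump to another URL.
     The destination address here is purely an example. -->
<html>
<head>
  <meta http-equiv="refresh" content="0; url=http://www.example.com/real-home.html">
  <title>Redirecting...</title>
</head>
<body>
</body>
</html>
```

If a spider lands on a page like this as your homepage, an empty body is all it finds.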
> Frames – Frames consist of essentially two separate web pages being displayed as one. Frames are usually used so that, for instance, a constant navigation menu can be displayed in a panel on the left side of the screen, while the rest of the screen changes each time a link in the navigation panel is clicked. This can make a website look snazzy, but search engines just can’t process frames, and you’ll end up with the same problems as if you tried to use re-directs. It is best to stay away from frames. The good news is that you can emulate the function of frames pretty easily, through the use of simple HTML – the only difference is that you’ll have to include the navigation menu code on each page, but it is worth doing this to have your website indexed.
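To illustrate the work-around, here is a rough sketch of a frameless page that gives the same left-hand-menu effect using a plain HTML table. The page and file names are examples only; the point is that the menu block is repeated in the code of every page, so spiders can follow ordinary links:

```html
<!-- Emulating a two-panel frame layout with plain HTML.
     The menu cell is copied onto every page of the site,
     so search engine spiders can follow the links. -->
<html>
<body>
<table>
  <tr>
    <td valign="top">
      <!-- Navigation menu: include this block on each page -->
      <a href="home.html">Home</a><br>
      <a href="help.html">Help</a><br>
      <a href="links.html">Links</a>
    </td>
    <td valign="top">
      <!-- Page content goes here and changes from page to page -->
    </td>
  </tr>
</table>
</body>
</html>
```

The small cost of repeating the menu on each page buys you a fully indexable site.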
> Image maps – Image maps are images that have had separate links added to them. To illustrate, suppose you have created an image with three areas of text on it: ‘Home’ ‘Help’ and ‘Links’. By “mapping” that image, you can create three separate links around each area of text, without having to break the image into three smaller images. Image maps are usually used for navigation menus, but here is where the problem comes in: search engines cannot follow the links in an image map. (Aren’t you starting to get tired of all the problems these navigation menus cause?) There is an easy work-around, though, if you really want to use image maps – just include the links as text links somewhere else on each page that has an image map. Search engines will be able to follow these text links and index the rest of your site.
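Here is a sketch of the three-button menu described above as an image map, followed by the plain text links that give spiders a path to the same pages. All file names and pixel coordinates are examples:

```html
<!-- An image map for a 'Home' / 'Help' / 'Links' menu image,
     with the crawlable text-link work-around underneath. -->
<img src="menu.gif" usemap="#mainmenu" alt="Site navigation menu">
<map name="mainmenu">
  <area shape="rect" coords="0,0,100,30"  href="home.html"  alt="Home">
  <area shape="rect" coords="0,30,100,60" href="help.html"  alt="Help">
  <area shape="rect" coords="0,60,100,90" href="links.html" alt="Links">
</map>

<!-- Text links a spider can actually follow: -->
<p>
  <a href="home.html">Home</a> |
  <a href="help.html">Help</a> |
  <a href="links.html">Links</a>
</p>
```

A small text-link row in the page footer is a common place to tuck these.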
> Macromedia Flash – You either have a love or hate relationship with Flash, but the bottom line is it chokes up search engines too. If you don’t have to use it, please, don’t. (Not only for your sake, but for the sake of us who have gotten just a little bit tired of those tacky Flash headers on every website now!)
> CGI, PHP, and all that stuff – CGI, PHP, and other web programming and scripting languages are very powerful and can contribute a great deal to a website’s functionality. But dynamic web pages are just one more thing that search engines can’t digest. Oftentimes, there may be no getting around using scripting in a website, but use it sparingly, and if you’re using it for the sole reason of looking impressive when HTML would do the job just fine, by all means, switch to HTML!
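When a page never actually changes, a static HTML file does the same job as a script without giving crawlers a dynamic URL (such as `page.php?id=7`) to puzzle over. A minimal sketch, with the file name and text as placeholders:

```html
<!-- about.html : a plain static page that any spider can index.
     Compare this to serving the same text from a script behind a
     query-string URL - the content is identical, but the static
     file gives search engines nothing to choke on. -->
<html>
<head>
  <title>About Us</title>
</head>
<body>
  <p>We have been building handmade oak furniture since 1985.</p>
</body>
</html>
```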
So now you have a basic understanding of the technologies and areas to avoid when building – or re-designing – your website. A properly structured website makes it easier for search engine “spiders” to crawl, and an easily crawled site earns a higher ranking in the search engine results.
We’ll now move on to the content of your website, an often-overlooked aspect of search engine optimization.
The content your website contains is very important. All the search engine crawling in the world won’t do you any good if the search engine “spiders” don’t find any useful content on your site. Oftentimes, you won’t see search engine optimization programs cover your website’s content, but this is a very important point that I think should not be left out.
Your website should consist of a good amount of relevant, explanatory content, containing your keywords (which we’ll talk about in a few moments) and several good supporting images (we’ll get to those in a second as well).
Upon hearing that more text is better, many webmasters make the mistake of adding indecipherable gibberish to their sites. This is also something we’ll discuss under the Keywords & META Tags section, but for now I will say that this is a big no-no. Search engines can tell when you’re trying to fool them, and if they consider your site to have committed a serious offence, they may even permanently blacklist your domain from their results database. So it is extremely important that the content on your website makes sense.
One good idea is to include articles on your website. Search engines greatly prefer content that changes often, and what better way to have changing content than to continuously post articles on your site? Not only will you be perceived as an expert in your field, you’ll also make the search engines happy.
I think that’s as far as I’ll go with content – much of what will be discussed in the next section applies here as well, so keep reading!
Keywords & META Tags
Keywords are yet another integral part of your search engine optimization campaign. For starters, you need to sit down with a sheet of paper (and don’t forget your good old thinking cap!) and think of keywords that pertain to your business. (For the record, keywords can be both single words and phrases – just make sure they’re not too long!) Think of what your website does, and develop keywords that you would use to describe your website and its content. It does not matter how many keywords you come up with, although it is best to have no fewer than five (and the fewer you have, the more targeted your search engine marketing can be).
After you’ve developed your list of keywords, there are a few things you’re going to want to do with them. First, sprinkle them within the pages of your site. Anywhere you have text on your website, include these keywords. It is best to have at least three instances of each keyword per page, as the search engines likely will not pay attention to them otherwise. Keep these tips in mind when inserting your keywords:
> Do not repeat a keyword over and over.
> Do not repeat a keyword over and over (or even just once) and make it the same colour as your website’s background.
> Do not include your keywords all together as gibberish that no normal human can understand.
The second thing you’re going to want to do is to include these keywords in the descriptive sentences you will use in the ALT tags for the images you have on your site. Again, follow the ‘Do not’ rules above wherever they apply to ALT tags.
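A quick sketch of what this looks like in practice – here “handmade oak furniture” stands in for one of your own keywords, and the file name is an example:

```html
<!-- A descriptive ALT tag that works a keyword in naturally,
     rather than stuffing it with a list of terms. -->
<img src="workshop.jpg"
     alt="Our handmade oak furniture workshop in action">
```

The ALT text should still read as a sensible description to a human visitor.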
Third, add these keywords to your META tags. There are three elements to the META tags that you will be using: Title, Description, and Keywords. For the Title and Description META tags, follow the ‘Do not’ rules above – and be sure to include your company name in these two tags, if it is not already one of your keywords. For the Keywords META tag, you can go to town adding your keywords, one right after the other! Separate them with a comma, and do not include the same keyword more than once! Doing so (as well as violating any of the ‘Do not’ points above) can also get your domain name permanently blacklisted.
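Putting the three tags together, here is a sketch of the head section for a hypothetical furniture site – the company name, description, and keywords are all made-up examples:

```html
<!-- Title, Description, and Keywords tags for a hypothetical site.
     Note: the company name appears in the Title and Description,
     and each keyword appears only once, comma-separated. -->
<head>
  <title>Smith &amp; Sons - Handmade Oak Furniture</title>
  <meta name="description"
        content="Smith &amp; Sons build handmade oak furniture, custom tables and bespoke bookcases.">
  <meta name="keywords"
        content="handmade oak furniture, custom tables, bespoke bookcases, Smith and Sons">
</head>
```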
Another great search engine optimization technique that is often overlooked is incoming links. Every link you have coming into your website raises the “awareness” of your site in the search engines’ minds. If the links come from websites with high Google PageRank ratings, that is even better (in Google’s eyes, anyway – this doesn’t apply to MSN, Yahoo!, etc.).
How do you go about getting links to your website? One of the easiest ways is to add your site to as many industry-related online directories as you can find. A couple of other quick and easy methods include posting ads at Craigslist and other free classifieds sites, and swapping your links with similar (competing or not – this is up to you!) sites.
Remember, the more inbound links you have, the higher your Google PageRank, as well as your position in the search results of Google, MSN, and the others.
Did you know that outbound links are just as important as inbound links? Perhaps the search engines don’t care for traffic hogs, but whatever their reason, websites that link out to other sites consistently rank higher in search engine results than those that do not.
So who do you link to? Remember that you can swap links with other related sites and you can both benefit. You can also link to complementary websites that visitors to yours may be interested in (but that don’t in any way compete with you). The main idea for getting both inbound and outbound links is to be aggressive, as well as innovative in your approach.
This is by no means an exhaustive guide. But I can’t keep rambling on and on either. You need time to go perform the tasks we’ve discussed here today, so that your site can be on its way to the top of the search engine results.
If you are writing articles for the purpose of off-page SEO, it is time to consider just how swiftly Google’s algorithm is changing over a short period of time. We don’t need to concern ourselves too much with the other search engines at this time. You will have heard many times that unique content is King, and it still is, but you must also consider that contextual relevance is now Queen.
The whole way of thinking behind article content is changing. Semantics, overall page theme, phrase matching (comparisons with millions of other similar pages in Google’s index to see whether phrases are relevant to the topic or theme), and even the use of buzz or niche words all come into play – woodworkers, surfers, stamp collectors and so on all have their own specialised words and phrases that would mean nothing to the rest of us. This is just a part of what Google will use to determine how important and relevant your article actually is.
This ongoing war between the guys who find a chink in Google’s armour and then write software to exploit it will never stop. This is human nature. The thing is that these guys pushing their “latest, most powerful” scripts and PC applications have a good job for life. There are always millions of people who are quick enough to snap up their $97 products if there is a hint they can get to #1 without doing any work.
They have customers for life. As soon as Google addresses an issue then another script is on the market and so the loop continues. The sad thing is that the buyers of these scripts are spending so much time trying to be “smart” they probably never actually build a business.
A similar problem exists with the use of Private Label Rights articles. There is little wrong in principle with using PLR properly – such articles can give you some great ideas to get started – but the vast majority of users who have downloaded 1000 articles either free or for next to no cost simply load them up and fire them out onto the Net without even looking at them! This, together with spinning software, is the cause of the incredible amount of useless and senseless junk on the Internet that Google is determined to deal with. And rightly so.
There is no doubt that the duplicate content issue is presently a huge cause for concern for Google. We all know that there was never any penalty imposed by Google for duplicates; it simply ignored them, giving weight to the oldest, most relevant copy.
It is unlikely that we shall see any penalty against duplicate content, as it is quite reasonable to expect more than one version of an article on different websites. What we may see is penalties against what Google determines to be spun, low-quality content with no relevance. I will not speculate on those penalties, but we may have an issue over exactly what is determined to be sub-standard content.
It would seem that the days of us “leading” Google into what our content is all about through the use of titles, keyword density, file names and so on are coming to an end. Once the programming is in place, it may well be that all we have to do is write naturally, with a certain amount of passion, about a theme we have an interest in or have researched well. Hmmm! Didn’t we use to do this before search engines and algorithms and programmers came on the scene?