We’ve been over the steps to take when you’ve been penalized and dropped from the index, but once you’ve followed all of them, you might be wondering what’s next. To recap quickly: first, go over the email you received (and you almost certainly have one) and address their major points of issue. If it’s bad backlinks, do your best to have them removed. Spammy content? Get a handle on it and rewrite it. Found out your SEO is playing the black hat game of gaming the engines instead of working with them? It’s time to drop them and call in the real experts in search. After all of those steps, you resubmit your site for inclusion.
But once you’ve done all of that, it’s in the hands of the search gods. You need to sit on your hands and wait for them to decide whether you’ve done enough to be reindexed and included back in the search rankings. What some people don’t realize, though, is that the search engines don’t always fully clean your record; it may only be a partial pardon, an incentive to clean up the rest of your act. Just as search engine optimization isn’t a black and white industry, neither is directing traffic at Google or Bing.
So, just how relevant is too relevant? It’s a question being asked more and more lately, as the results page tends to be overtaken by the same website. Matt Cutts and the Google team put out a short video trying to describe just what’s going on.
The method for displaying these newer results, however, has been getting under users’ skin. How diverse do the search results really look when the top three or four spots, and sometimes the entire page, are taken up by a single site? Relevance to the search query is obviously what drives Google and the other search engines to deliver their results, and the better refined they are, the better it is for the end user. Have you had any instances recently where the search results page has been dominated by a single result?
The web is a huge place, full of anything you can think of at any given time, because chances are if you can think of it, someone has made a website or web page for it somewhere. It could be as common as people writing about the latest movie or song, or as low key as a new local band, but if you hit up a search engine you will almost always find at least a web page about it.
And with all of the billions and billions of web pages and websites out there, a market is created, and with any market come the marketers. Search engine optimization, AdWords, white hat, black hat: when you start reading about the industry, you will find yourself running into terms which become more and more unfamiliar as you go. It’s no wonder that when you start the conversation with a prospective, or sometimes even an existing, client, the question comes up: “Do you know how Google/Bing/Yahoo works? Can you promise me number 1?” Now the polite, short answer to that question is “No”, and the long version is “No, we can’t promise number 1”. And then the inevitable happens: they utter the beginning of the worst phrase you can hear as an SEO, “But I read/heard/was told that...”
Here’s the short reason why we can’t guarantee you number 1 in search for your business: the web and the search algorithms are always changing. When Sergey and Larry initially created the Google algorithm to run around and start indexing the web, it wouldn’t be a surprise to hear they never imagined it would get so massive. It’s rumored that the algorithm now weighs somewhere between 250 and 300 ranking factors as it parses your website. Some of the confusion for those outside the market comes when they read an article about how someone has cracked the algorithm to always rank on top. I apologize for being blunt, but anyone who tells their clients that is a conman. At this stage of the search game, with as long as the algorithms have been changing and adapting, I doubt there is any one person employed by Google or Bing who can sit down and tell you just how it all works. At this point, they are simply too big, too complex, and take into account so many different factors that it’s mind boggling.
So your best course of action is to adhere to the KISS principle: Keep It Simple, Stupid. Don’t get crazy with your site, don’t get too smart with your content, follow the best practice guidelines, and you’ll be okay.
You can’t knock them for trying: even though they’re well out of the search market share race, Yahoo has thrown another punch in the fight to stay relevant. Their new piece of software, Axis, is actually an interesting project, born from the idea of making search quicker and synchronized across your devices.
When you’re on your PC or laptop, the new tool is only available as a browser plugin, giving you a more visual display of your searches. With results pulled directly from Bing-powered search, you can browse through them in a manner more akin to flipping album covers in your media player. The results are the same, powered by Bing as they have been for a while now, just delivered in a different package. As well as a bottom bar taking up some of your screen real estate, you’re given tabs on the left and right of the screen to quickly navigate deeper into your search results. It’s a different take on the game, and it essentially eliminates using the back button to locate exactly what you were looking for.
There is a bigger difference when you start to use Axis on your iPhone or iPad, however. The app functions more like an instant share button, allowing you to spread the word about the newest deal you’ve found. It also lets you quickly send those results to any of your contacts, and it adds a new spin to mobile browsing by letting you preview your destination. Currently no other app in the marketplace has that capability, and by innovating with Axis, Yahoo has kept itself relevant in the search marketing game. As a bonus, they’ve also let you sync your devices when signed into your Yahoo account: while looking for that home repair guide on your desktop, you can open up the same page on your iPad and get right down to work, as it will have your spot saved right where you left off.
With mobile marketing and mobile search set to grow at a massive rate over the next few years, pushing into this market early has definitely made Yahoo a player in the meantime, and likely part of the game for a few more years to come.
An interesting blog post from Bing has been gaining steam in discussion forums, and unsurprisingly, it pokes fun at Google and the recent Penguin update. A little poem of sorts has been made up, and it goes something like this:
Animal kingdom hurting ROI?
Pandas and penguins, oh my!
Take control and tell the fauna “Bye Bye”,
With these helpful suggestions to diversify!
It’s a silly little rhyme, but there is great sense in it: diversify. They go on to explain that by diversifying your website’s optimization techniques, you can soften, or even eliminate, the blow felt across the web when algorithms change. If your organic optimization is flowing strong and healthy, focus on a weaker area, perhaps pay per click optimization, and help boost its output. Organic results are typically the hardest hit when there is an algorithm update or a sweeping change made à la Panda or Penguin. By having your additional channels of traffic performing at their peak, you can protect your position online and react if a drastic change is occurring.
The Bing post went on to make other great points, diversification aside, about how to manage your presence on the web. Keep an ear to the ground for new and trending websites or aggregators, like Pinterest just a couple of months ago. It went from a simple board where people could share interests quickly and easily, to having a Pin button pop up on almost every major site out there. Pinterest had some key factors which made it incredibly relevant: strong, rapid growth; easily adopted technology; media coverage that spread the word quickly; and the interaction of friends and family. Add those together and it took off like a rocket. Keeping your eyes on the horizon and watching for a trend can be an extremely helpful safety net.
There were a few other great points covered in the post, most of them basics that fall under the core webmaster skillset. Take care of your sitemaps: are all of the links relevant, with none broken? Same with your robots file: when was the last time you looked at its contents to ensure it was still correctly configured? Do you have social sharing on the pages you want shared, and have you kept duplicate content issues down to nothing? Very, very basic work, not even necessarily from a search engine optimization standpoint, but simply from a webmaster standpoint. Keep it clean, keep it basic, follow the news and trends, and you’ll be ready for the algorithm shifts across nearly all search engines.
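As a small sketch of that sitemap housekeeping, you can pull every URL out of a sitemap and then check each one for broken links. The parsing step looks something like this (the fetch-and-check loop is left as a comment, since it depends on your own site):

```python
import xml.etree.ElementTree as ET

def sitemap_urls(xml_text):
    """Collect every <loc> entry from a sitemap, ignoring the XML namespace."""
    root = ET.fromstring(xml_text)
    return [el.text.strip() for el in root.iter() if el.tag.endswith("loc")]

# From here you would request each URL (for example with urllib.request)
# and flag anything that answers with a 4xx or 5xx status as a broken link.
```

This only covers the standard sitemap format; a sitemap index file would need one extra level of the same treatment.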
Bing is currently going through a transformation of sorts: they’ve revamped their look and performance, changed up the way they do social, and tried to streamline everything overall. The end result, in their own internal testing: they’ve come out ahead of Google, with a near 10% gain while Google lost 10% of its score. So what has Bing been up to?
Firstly, they’ve been working hard at incorporating more of the social web into your search results. Earlier in the year, Google introduced their version of this idea as Search, plus Your World, and it was met with the ire of the masses. The claim was that Google was favoring its own social network and shunning Facebook and Twitter, with Google countering that it couldn’t gather information from those sources. Bing currently manages to pull information for searches from all of these sources: Facebook, Twitter, as well as Google+. It may seem as though Google was just blowing hot air, but it should be mentioned that late last year Twitter did effectively block the search engine, and Facebook keeps a pretty tight handle on what gets out onto the web, even with open and social profiles. Microsoft’s Bing currently has deals worked out with both of these parties to index their information, and as for Google+ profiles, if they’re set to public then everything on the page is indexed like a public website.
Bing used to mix your social results in with your search results, but they’ve changed that idea and gone in a completely different direction. All of the social search results have been moved off to the right side of your screen, where your friends, family and colleagues are ranked by relevancy to your search. Also included in those social results are people and items which may be relevant to your search. The reason for the change, according to Bing, is that mixing social results in with organic diluted the page too much and affected your searches.
So where does that leave us? Bing is in the process of launching their completely revamped search and social service, and they’ve made big gains in the search world, based on their own internal testing. A blog post on that point makes it a little clearer:
We regularly test unbranded results, removing any trace of Google and Bing branding. When we did this study in January of last year, 34% of people preferred Bing, while 38% preferred Google. The same unbranded study now shows that Bing search results have a much wider lead over Google’s. When shown unbranded search results, 43% prefer Bing results while only 28% prefer Google results.
As with all things, the way we use the internet changes on an almost daily basis. From a single browser interface to the half dozen now available depending on preference and platform, web tech has been changing and evolving almost as fast as the web itself.
Take browsers, for example. Just a few years ago, in 2008, the online world was dominated by Internet Explorer, followed by Firefox and a sprinkle of odd ones here and there. That was the year Google Chrome was introduced, and since then the top seeds have changed. As of the start of 2012, the browser market is fairly evenly split between the top two, Firefox and Chrome, with Internet Explorer a distant third and the rest still just a smattering on the internet landscape. As of March 2012, Internet Explorer has dipped under 20% of the browser landscape; thankfully at least half of that market uses an updated version of the browser, version 8.
But browsers aren’t the only change we’ve had in the last few years online; social media has become a massive market on the web. The largest player in the space needs no introduction: Facebook entirely crushes the social market, with around half a billion users logged in on an average day. The discouraging part of that number, however, is that nearly half of businesses don’t use social media marketing to their advantage at all. Only about 20% of businesses are using Facebook to push their brand, with smaller business owners more readily embracing the technology. Knowing an avenue needs to be explored and actually taking the step to do so are two different things, and a lot of the time people make it more complicated than it is. The key concern for any marketing is return on investment, and while organic search engine optimization is the best return in the business, its cost and time factors make it difficult for those with very shallow pockets. Free advertising, though, like that found on Facebook and Twitter, can be easily measured: broadcast your ad or tweet, and measure your traffic over the next couple of days. It’s not magic, it’s simple math when you keep it basic.
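That “simple math” can be sketched in a few lines: compare your average daily traffic before a post with the traffic that follows it. All the numbers here are hypothetical, just to show the arithmetic:

```python
def average(visits):
    """Mean daily visits over a period."""
    return sum(visits) / len(visits)

def traffic_lift(before, after):
    """Percentage change in average daily visits after a campaign post."""
    base = average(before)
    return (average(after) - base) / base * 100

# Hypothetical numbers: a week of normal traffic, then two days after a tweet.
baseline = [200, 220, 180, 210, 190, 205, 195]
post_tweet = [260, 240]
print(traffic_lift(baseline, post_tweet))  # 25.0 (a 25% lift)
```

If the lift is consistently near zero across a few broadcasts, that channel isn’t earning its keep; if it spikes, you’ve found a cheap source of visitors.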
The goal of SEO is relatively simple: to make your site rank as highly as possible within the search pages for your niche. Whether you build houses, write stories, or draw pictures, search engine optimization is applicable to any website online. What a lot of smaller business owners can also use SEO for is to knock the big players down a peg or two.
It’s important for all parties to consider SEO as a great equalizer online; you do, however, have to remember to stay within the rules. There are billions of web pages online, and yet, with that daunting number in mind, it’s still a relatively simple process to stay within the sights of the search engines. All you really need to keep in mind are the basics: even just following the best practices guidelines gives your website a shot at being picked up and indexed. But you also need to remember that the internet isn’t exactly a friendly place yet; a great deal of the web is free and wild. As a small example, you can’t control which websites choose to link to you. This can be a difficult hurdle to overcome, as irrelevant or inappropriate backlinks leading to your website can seriously hamper any SEO efforts you may have in place. This is only a single element of what’s known as negative SEO.
Larger, more established and authoritative sites such as Amazon are somewhat safer in this regard; however, no one is completely immune to negative SEO. Negative search engine optimization can be defined as spammy links, blatant keyword stuffing, duplicate content or anything else that isn’t considered white hat SEO by the search engines. Smaller, newer sites, unfortunately, are more susceptible to negative optimization problems. In the beginning of a site’s growth, it may not have much content or many links pointing to it. If you’re not careful with how you craft your content or structure your links and navigation, you may even get dinged as having duplicate or irrelevant content in your niche. The number one point to keep at the forefront of your mind, though, is that even though the internet is still wildly untamed, the playing field is actually relatively plain and simple. Follow the rules, manage your website, and monitor your content to make sure it doesn’t get scraped and that it hasn’t been copied from another resource. Even the big hitters can be taken down online; no target is too big or too relevant on the web.
When you’ve decided to build yourself a new site, whether due to needing an update or just looking for a new image, there’s a very important step to monitor. You need to ensure, before you get too far into the process, that you’re not making the rookie mistake of allowing the search engines to index both versions of your website. Doing so can cause you grief and could ultimately penalize both websites for duplicate content.
When you’ve begun working on the newest version of your site, make sure it isn’t being indexed by the search engines, so you can work all you like without worry. The simplest way is to use your htaccess file to block the bots, or alternatively, if you have the means, you could work on a local server where the site isn’t technically on the internet. Duplicate content can leave Google or Bing unsure which page to list in response to a search: the engines suddenly have two versions of your website and content to consider, and need to determine which they feel is the most relevant of the two. Seeing as your old site had the content first, you stand to injure your brand’s reputation and new URL simply by working on a new site or look.
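As a sketch of the blocking idea (the file path and realm name below are placeholders, not anything from the post), the staging copy can ask crawlers to stay away via its robots.txt, or be locked down entirely with HTTP Basic authentication in .htaccess:

```apacheconf
# robots.txt at the staging site's root — asks well-behaved crawlers not to index anything
User-agent: *
Disallow: /

# .htaccess on the staging site — HTTP Basic auth keeps out crawlers and stray visitors alike
AuthType Basic
AuthName "Staging site"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Keep in mind robots.txt is only a polite request; password protection (or working on a local server) is the reliable way to keep a work-in-progress out of the index.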
Duplicate content isn’t just a concern when you’re working on your own website; it’s actually something you should make a point of occasionally monitoring. A bothersome trait, and a difficult problem to tackle, is when your own original content ends up being scraped by a bot and winds up on an aggregator site. You can look for copies of your content by searching for key phrases and terms used within the content and/or title; hopefully the only sites which come up are your own or those you’ve given permission to reproduce it. Typically scraper sites don’t rank that highly in search anymore, but there are still occasions where they show up higher in the results than the original creators. When this happens, you can become trapped in a terrible cycle of trying to have your own hard-earned content removed from the index, and of trying to have credit given where credit is due.
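If you want to go beyond spot-checking phrases by hand, a naive way to measure how much of your text a suspect page reuses is word-shingle overlap. This is just an illustrative sketch, not a polished plagiarism detector:

```python
def shingles(text, n=8):
    """Every run of n consecutive words in the text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(original, candidate, n=8):
    """Fraction of the original's n-word shingles that also appear in the candidate."""
    ours = shingles(original, n)
    theirs = shingles(candidate, n)
    return len(ours & theirs) / len(ours) if ours else 0.0
```

A ratio near 1.0 means the candidate page contains your text nearly verbatim; for very short passages, drop n to something smaller so there are shingles to compare at all.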
It’s somewhat common knowledge that when someone performs a search, there will be a box of “Sponsored results” above, beside, and sometimes even below the organic results. Bing has a paid service, as does Yahoo, and Google has AdWords, which proved a business in search can be profitable. There’s a discussion lately surrounding paid search advertising and the big three search engines, and if you’re not careful with how you read it, you may walk away with the wrong idea.
Compared to this time last year, Google’s CPC has fallen for the second quarter in a row, while Bing’s and Yahoo’s CPCs have continued to climb. On the surface, that can make it sound like Bing and Yahoo have been grabbing ad space from Google. Closer to the truth is that Google has become an even better choice to advertise with, as opposed to Bing and Yahoo. Search engine marketing via the AdWords platform, or one like it, has to be measured differently than organic results; you can’t take positioning as the end goal.
When you begin to break down the numbers involved in SEM and SEO, there are some key differences you need to understand. Both depend on conversion rates, because without converting your traffic, you’re wasting time and money. One of the largest, and most important, differences however is the click-through rate of your positioning. You could be ranked at the very top of the AdWords results, but if you have a poorly written ad or a poorly built website, chances are your conversions will be limited.
Another major point to keep in mind is cost per click, or CPC, as discussed earlier. Where paid advertising is concerned, CPC is a literal measure of how much it costs you each time someone clicks on your listing. Organic SEO is more difficult to pin down, as you’re not paying each time someone clicks your organic listing, but after a few months you can break it down more easily. A high cost per click for your search term can mean that there are many people in the same space, or it can mean that one of your competitors is driving up the bid on the keyword to try and gain dominance. A declining average cost per click isn’t necessarily a bad omen: it can point to reduced competition, or it can mean an improved conversion rate.
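To make those numbers concrete (all the figures below are hypothetical), CPC and its rough organic counterpart are simple divisions:

```python
def cost_per_click(ad_spend, clicks):
    """Paid search: what each click on your ad actually cost."""
    return ad_spend / clicks

def organic_cost_per_visit(seo_spend, organic_visits):
    """Organic: spread your SEO investment over the visits it brought in."""
    return seo_spend / organic_visits

# Hypothetical month: $500 of ads buying 1,000 clicks,
# versus a $1,200 SEO retainer that drew 6,000 organic visits.
print(cost_per_click(500, 1000))           # 0.5  (50 cents per paid click)
print(organic_cost_per_visit(1200, 6000))  # 0.2  (20 cents per organic visit)
```

The organic figure only becomes meaningful after a few months of data, which is exactly why the comparison is harder to make than the paid one.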
There are all sorts of experts out there in the SEO world, and for all of the experts willing to take only a couple of hundred dollars to place your site, there is a larger roadblock to finding the real pros in the industry: information. Good info, bad info, just plain wrong info; if you search for “SEO expert” or anything along that line of thought, you will run into some real winners if you’re willing to dig deep enough.
For the last year or so, the internet search world has been buzzing about Panda: it dropped this client’s site, or it ruined the results for the term which magically brought their site 20,000 visitors in a day. Casting aside the hyperbole, Panda didn’t affect the vast majority of websites out there; the main aim of the algorithm is to seek out spammy sites, sites with content scraped from other sources, and even sites which use automated posting tools. Just like any of the other major algorithm shifts, if you weren’t doing anything wrong, you’ll have noticed very little change in your positioning and in your visitors.
But if you happened to be working a backlinking scheme to garner thousands of links from a seemingly active blog, and you magically dropped in the rankings, then chances are the blogger wasn’t quite doing things the proper way. Before you start taking information about search at face value, dig deeper into the threads and posts on the site which calls itself expert. If it’s only comprised of a handful of pages, chances are they haven’t done anything except find some decent content and copy it. If the information sounds good, check the post date: if it was posted even a year ago, then as great as it sounds, there are likely vast portions of it that are no longer usable. And finally, actually take the time to dig into the post; read it both silently and out loud. If something isn’t adding up as you read it, the sentence structure is off, or the cadence is jerky, then there are a couple of strong contenders: the post was either scraped and put together in a hodgepodge fashion to try and dupe the algorithms, or the piece was written by a piece of software.
That last bit may seem a little odd, but there are programs available now which will truly write all of your blogs for you. You can feed one a topic, how many words you want, and what type of emphasis, and a few minutes later you have a post. The key issue with these programs, however, is that the posts are almost entirely made up of scraped content, which is just as bad as someone manually scraping the web; the software is simply designed to piece it together to fit the parameters you have set.
There is really only one rule to bear in mind when searching for an SEO, or when one approaches you: can you find them when you search for them? Because if they can’t rank their own site in the top few pages, then chances are very strong they can’t do a thing for yours either.