So if you’ve been tracking your site’s progress on Google’s search results pages and noticed some funny movement in the last week or so, you’re not imagining things. Google has finally come out and admitted that yes, they’ve had another regular update, but with Panda as part of the equation this time. Some have noticed their sites shifting a half dozen places or so, while others have completely disappeared for some of their optimized terms.
As shocking and disturbing as it may be to suddenly find you’re not in the results where you were in previous weeks, you may want to hold off on that complete site revamp to address your disappearance. To put it another way, Google took their search index, full of billions and billions of terms, and tossed it up in the air, and all of the websites are still coming down, being filtered into their most relevant terms based on the current algorithm. It’s safe to wait just a few more days to see what happens through the weekend.
Google and +1
Search is a funny game: moving, shifting, always changing. Facebook has their ‘Like’ button, to which Bing has added their own special metric and weight. And Google has their newer +1 button, which they’ve come out and said is, basically, good to have on your site alongside the Like button. One basic fact though: the implementation of the +1 button was actually bogging sites down as of late, cutting page performance almost in half in some extreme cases.
While the Facebook ‘Like’ button is a flat blue color, the +1 button is a script or two which glows and stands out from your web pages. Definitely a hindrance for performance-conscious site owners, and it wasn’t long until another disturbing trend was noticed: visits to pages with the +1 button were slowly and steadily dropping. Almost on cue, Google has released a new version of their +1 button: faster, sleeker and much more in line with current web speed standards.
And just like the Facebook button, the +1 button has attracted those scandalous people making a living selling their browser clicks. Because the +1 button can have a positive effect on your search ranking, some of the less scrupulous SEO companies out there are now selling their clicks. It’s not much of a stretch or a surprise really, as there are grey-hat SEOs to be found all over the web selling all manner of SEO tricks: selling links, scraping and rewriting content for you, Facebook ‘Like’ sellers and now +1 sellers. Cut the SEO juice from the button and its true use will emerge: promoting content because it’s genuinely good content.
Unless you’re a member of the tinfoil hat group, you’ve undoubtedly used the internet and a search engine at some point in the last few days. You may have used Bing, maybe Google, but you had that need for information. Regardless of which search engine you decided on, you made your choices based on what you learned. But if you’ve ever been curious and taken the time to compare, the results from Bing and Google can sometimes completely differ for the exact same search.
Effective searching is, strangely enough, a skill that everyone who is online should have, yet few do. It’s sometimes difficult to explain to clients, both existing and prospective, that the more complicated you make search in your head, the more frustrating your SEO campaign will be. The first idea you need to overcome as a business and website owner is that when people search for you online, they use the niche or specialized terms you use at work. This is where things tend to get overcomplicated. If you own a website and business which fixes vacuums, then it’s in your best interest to optimize and build your site around that theme. The wrong approach would be to try to optimize your site around all of the different brands you deal with instead of using an all-encompassing term.
Different search engines display their results differently as well, and you’ll show up for different terms in each. Your content, your URL structure and even a lack of content can all influence where, when and how you appear. There’s no such thing as too much content, provided of course it’s relevant to your business and website. Keep it simple, don’t overthink it, and before you know it you’ll be showing up in the SERPs for all sorts of terms and phrases relevant to your business.
When you log on to your computer, fire up your browser and start your internet trek for knowledge, entertainment or whatever it is that has your mind occupied, are you going to be able to find your answer? It’s a question which has been gaining more and more traction in the last year or so, and DuckDuckGo, a new start-up search engine, has been shaking the search cage in an effort to forge its own path.
Recently they put up a page detailing how, when you perform a search on Google, Bing or Yahoo, you’re not getting a true results page. The screenshot of the search results clearly shows that different people will receive different results for ‘Egypt’ as a search term. Without even reading the link text, it’s clear that the results pages are vastly different. Why they differ comes down to dozens, if not hundreds, of different reasons. It can be as simple as your location in the country, the time of day or the trend in the news lately. The short pictorial provided on the DuckDuckGo page details how search engines, Facebook, Twitter and the like are all delivering pre-packaged results based on your web usage, and they contend that this shouldn’t be happening.
DuckDuckGo is a search engine which doesn’t save your search history, doesn’t pass your search terms on to the websites you click through to, and has a nifty red box they call zero-click info (powered by Wolfram Alpha) which appears on some searches. After all that, it’s throwing its hat into the search engine ring. Being a new player at an old game is a tough market to break into, and DuckDuckGo is attempting to deliver a filtered *and* unfiltered internet. It’s a noble idea with some merit; if you’d like to perform somewhat private searches on sensitive matters, it may be an alternative for you. Google Chrome and Internet Explorer, however, both offer a cookieless browsing mode which accomplishes much the same result, so you don’t really have to give up the engine you know and are familiar with.
The only real way to test whether you genuinely live in a “search bubble” is to perform the same search, with zero clicks, on multiple computers. If you begin seeing that your results are significantly different from other people’s, then perhaps you have a case. Personally, after viewing the screenshots and looking closely at how many pages were fetched for each search term, there are tens of millions of pages of difference, so of course the results are going to differ. Part of Google, Bing and Yahoo’s success comes from the fact that they pass some search data to the referred website in the form of the search term; it’s what enabled the search engines to build their ad programs. There are dozens of variables behind the results you receive after you click that search button, and even something as simple as which data center serves your results influences your page. If it happens to be running with an index which is a few hours older than the others, you can very easily get different results when performing the same search multiple times.
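If you want to make that multi-computer comparison concrete, a minimal sketch might diff the top results captured on two machines. Everything here is illustrative: the URLs are made up, and `result_overlap` is a hypothetical helper, not part of any search engine’s API.

```python
def result_overlap(results_a, results_b):
    """Compare two captured lists of result URLs for the same query.

    Returns the fraction of URLs the two lists share (Jaccard overlap),
    plus the URLs unique to each side, to show how 'bubbled' they are.
    """
    set_a, set_b = set(results_a), set(results_b)
    union = set_a | set_b
    overlap = len(set_a & set_b) / len(union) if union else 1.0
    return overlap, sorted(set_a - set_b), sorted(set_b - set_a)

# Hypothetical top-5 results for 'Egypt' captured on two machines:
machine_1 = ["example.org/egypt-news", "example.com/egypt",
             "example.net/travel-egypt", "example.org/pyramids",
             "example.com/cairo"]
machine_2 = ["example.org/egypt-news", "example.com/egypt",
             "example.info/egypt-protests", "example.org/pyramids",
             "example.net/egypt-history"]

overlap, only_1, only_2 = result_overlap(machine_1, machine_2)
print(f"Overlap: {overlap:.0%}")  # shared URLs as a share of all URLs seen
```

Anything well below full overlap may simply reflect a stale data center index or a locale difference, as described above, rather than deliberate personalization.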
The founder of the newer engine DuckDuckGo recently discovered that he’s being hit with tons of spam queries for all sorts of seemingly random searches. He’s noted that while he can block these botnets from spamming his servers with the same query over and over again, the traffic has left him with a question.
In his own words from his blog:
“if other search engines include this errant traffic in their query counts. We work hard to keep them completely out because they would overwhelm our real direct queries #s and therefore distort our perception of progress. “
And while Gabriel makes a solid point and raises a great question about the quality of the searches and query numbers being generated, I think he’s missing the simplest answer. He has managed to block the botnets at the firewall level to keep them from skewing his query numbers. Given that the other search engines, Google, Bing and Yahoo, have been around far longer, it’s safe to assume they’ve already dealt with this issue of false searches. As far as SEO is concerned, this kind of activity is query spam: bots making it appear that the websites in question have received hundreds of thousands of queries and clicks from real users. It’s a game and method which was dealt with years ago by both Google and Bing, so it’s almost completely a non-issue.
I think the more realistic explanation for the botnet traffic on the new search engine is a very simple problem that anyone with a website that has an input box and no validation can relate to. It’s just spam, looking for an exploit or a kink in the code of whatever website software it detects. It’s argued that small search engines like Blekko and DuckDuckGo offer a better quality of search because they are smaller and less bloated than their big brothers. In time, however, I can see it being realized that the larger these small engines become, the more difficult it will be to deliver incredibly fast results (less than half a second) while maintaining a complex index of hundreds of billions of pages. Google just last year reached the 3 trillion pages indexed mark, a number which would cripple most data centers in existence.
With the addition of the Google +1 button to the social world, an old question has been making advertising agencies take notice again. Is the social element of search beginning to shape where you show up in the SERPs?
It has long been known that when you “Like” a topic via the Facebook button, you can generate a fair amount of traffic with a simple click. It’s only recently, though, that those “Like”s have become entrenched in the Bing results and begun to shape your personal results pages. When you’re signed into your Facebook account and perform a search for model racecars in Bing, for example, you’ll be able to see, mixed within your search results, whether any of your friends are involved in the same model racing scene as you are. It can create a good deal of traffic if your site caters to a social crowd. With the addition of the Google +1 button, it’s assumed we’ll begin to see the same types of results integrated into Google as you see in Bing with the Facebook “Like” button.
Part of the idea is that you can discover which of the people on your friends list you may have more in common with than you already knew. It’s really a personal preference at this point in the game, as you need to be signed in to both services to view your friends’ likes in your search results. It’s going to be an interesting shift in the search game, depending on how heavily your friends’ connections are valued compared to the organic listings as they stand presently.
If there is anything in the world of search which can change your site’s ranking performance overnight, it’s being tagged as a malicious site or having malware links on your pages. Searchers are warned, right in the SERPs, that visiting your site may harm their computer, and it’s not hard to imagine that they would choose to stay away as a result.
Being flagged as a malicious site or as having malware links on your page can happen a number of different ways. At the serious end of things, your site and/or server may have been hijacked and your pages rewritten. You could have been caught by an iframe attack, a clever hacker could have injected code into a page comment or link, or you could simply have been reported as malicious by a jealous competitor. There are a great many ways to end up flagged as a malware/malicious site.
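The iframe attack mentioned above usually injects an invisible frame that points at an attack server. As a rough illustration of what to look for, and certainly not a substitute for a real scanner, a short script using Python’s standard `html.parser` can flag iframes that are sized or styled to be invisible (the attacker URL and heuristics here are purely made up):

```python
from html.parser import HTMLParser

class HiddenIframeFinder(HTMLParser):
    """Flags <iframe> tags styled or sized to be invisible -- a common
    signature of injected malware frames. Illustrative heuristics only."""

    def __init__(self):
        super().__init__()
        self.suspicious = []  # src attributes of hidden iframes found

    def handle_starttag(self, tag, attrs):
        if tag != "iframe":
            return
        a = dict(attrs)
        style = (a.get("style") or "").replace(" ", "").lower()
        hidden = (
            a.get("width") == "0" or a.get("height") == "0"
            or "display:none" in style
            or "visibility:hidden" in style
        )
        if hidden:
            self.suspicious.append(a.get("src", "(no src)"))

page = """<html><body>
<p>Normal page content.</p>
<iframe src="http://attacker.example/x.html" width="0" height="0"></iframe>
</body></html>"""

finder = HiddenIframeFinder()
finder.feed(page)
print(finder.suspicious)  # -> ['http://attacker.example/x.html']
```

A periodic check along these lines won’t catch every injection technique, but it can surface the obvious ones before a search engine does.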
There is a story circulating in the search world today about one such website owner’s dilemma. Their website, a Yahoo-based store, was flagged by Bing as a malicious website containing malware links. This is a web-based business whose CMS is sandboxed, after a fashion, by Yahoo’s controls, yet Bing labelled them as malicious. Being flagged is a terrible thing, but seeing as the due diligence had been done and no fault was found, the owner submitted a help ticket with Bing for the erroneous flag to be removed. The response they received has to be nothing less than shattering: malware re-evaluation with Bing can take anywhere from three to six weeks to be resolved and for the flag to be removed. When your business is primarily generated through your online presence, losing three to six weeks of business due to an error on a search engine’s part is devastating to your livelihood.
It’s always worthwhile to search for yourself online, to ensure you’re placing where you’re aiming and displaying the information you want to be known for. What you definitely do not want to see is the malicious website warning tied to your site, as it can take Bing anywhere from three to six weeks to remove it. At least on the upside, Google needs only about 24 hours to remove a misplaced flag.
There are some general misconceptions about SEO which crop up from time to time, often when going over the process with clients. Some points are extremely valid questions to bring up, while others can only receive ambiguous answers because the landscape changes every day.
Discussion points like “Why do we need to wait while building backlinks to our site?” tend to come up, for example. Building quality backlinks to your website takes time, first of all. Secondly, if you were to go the shady route and buy thousands of links to boost your PageRank, it’s a very quick way to get the search engines’ attention. And not in a good way!
“Why should I pay you every month when this other guy says he can do the same for a one-shot job?” This is probably the largest misconception about the SEO industry and one of the hurdles we are met with in dealing with new clients. The biggest reason you can’t do just a once-over and expect the results to carry on forever is that the internet doesn’t shut off. It doesn’t stop, it doesn’t sleep, it’s always changing. In order to compensate and keep up, the search engines do exactly the same. They change their algorithms, tweak the results and shift the rankings on a weekly, and sometimes daily, basis. Upkeep is absolutely essential to remain competitive in search engine optimization, and someone telling you they can plant you firmly at the top for a one-time cost of $200 is yanking your chain.
“SEO doesn’t seem so bad, I’m sure our techs can do it here.” This is perhaps the most closed-minded statement to be encountered. I’ve written about it here on the blog before: pick the right horse for the course. When you’re building a new website, contract a web designer. When you’re adding or updating basic information on your site, use your techs. When you want to bring your brand and website up in the rankings, use a search engine optimization expert. Assuming the tech who does your database scripting can do your optimization is money lost at best. At worst, they try to shortcut your site and you get kicked from the index for breaking a rule or two.
There are a million and one ways to make yourself found online: local, mobile, social, organic, PPC, and within each of these there are countless other methods to work on. Let’s start with the assumption that you’ve followed all of the best practices when it comes to building your website.
You’ve used CSS and XML to create a uniform and attractive look, and even simple things such as a doctype to tell the browser what it’s reading. You have creative, compelling content with a strong call to action which drives your visitors to buy your product, sign up for your newsletter or forum, and keep visiting your pages. Your images are tagged, your categories are tagged; you’ve worked hard at being the best in your niche market and are steadily enjoying the growing fruits of your labor. And then you learn there is more you can do to increase your traffic flow and visibility and, as a result, improve your bottom line.
There’s always more that can be done in marketing yourself online, more steps you can take to become more visible. That step you’ve taken to tag all of the images on your website properly? Congratulations: with that very simple step you’ve increased your visibility in the image searches on both Bing and Google. Properly tagged and titled images help customers reach your site through clear pictures of your product.
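Auditing those image tags doesn’t have to be a manual job. As a rough sketch using Python’s built-in `html.parser` (the page content and file names below are made up for illustration), you can list every image missing the descriptive alt text that image search relies on:

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Collects <img> tags missing alt text, so they can be fixed for
    image search. A quick illustrative sketch, not a full audit tool."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []  # src attributes of untagged images

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        # Treat an absent or empty/whitespace alt attribute as missing.
        if not (a.get("alt") or "").strip():
            self.missing_alt.append(a.get("src", "(no src)"))

    # Self-closing <img ... /> tags arrive here; handle them the same way.
    def handle_startendtag(self, tag, attrs):
        self.handle_starttag(tag, attrs)

page = """<html><body>
<img src="/products/knocker-brass.jpg" alt="Brass door knocker">
<img src="/products/knocker-iron.jpg">
<img src="/logo.png" alt="">
</body></html>"""

audit = AltTextAudit()
audit.feed(page)
print(audit.missing_alt)  # -> ['/products/knocker-iron.jpg', '/logo.png']
```

Run against your own templates, a list like this gives you a concrete to-do list of images that aren’t yet pulling their weight in Bing and Google image results.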
Another strong step is issuing news about your company consistently, whether you’ve closed that massive new merger which will allow you to double production or support, or all you’ve done is decide to hold a spring cleaning sale. It’s important to remain active in the eyes of your customers. This is where a blog is an amazing tool for businesses both small and large. It’s an ideal space for all of the aforementioned releases, as well as a place for your clients and customers to reply to your posts and even suggest improvements where they see the need.
If you’ve cornered your niche market and created your very own brand image offline, it’s extremely important to carry that leverage online. As an example, it wouldn’t do Pepsi or Coca-Cola any good to have direct queries for their brand names send users to competitors’ websites. It’s lost revenue and a lost avenue for income.
And if you provide a product or a service which has many steps or intricacies, it only helps your case to develop your very own how-to pages on your website. If you sell a specific style of door knocker, for example, providing clear and concise directions on how to install and care for your product can instantly transform a curious searcher into a new customer.
There are some interesting threads around the web at the moment concerning the recent global Panda roll-out. Some websites which were dropped from the engine when the initial roll-out came out a while back are noticing their traffic starting to return to pre-Panda levels. It would certainly have been frustrating to deal with not knowing whether you had in fact been scraping content, or whether you had been penalized when you hadn’t done anything wrong. It’s an anxiety Google could have eliminated with even a quick post along the lines of “We’re addressing your concerns in an upcoming roll-out, please be patient with us” instead of staying quiet.
Anyhow, the global roll-out has occurred, and it looks like for the moment the farmers have been hit with a drought. The initial numbers have started to appear on various communities online, and there are some familiar names in the list with big losses in ranking. In what seems to have been a long time coming, ehow.com has received its penance. Long touted as one of the worst offenders for aggregating content, the site was left virtually untouched in the preliminary Panda roll-out. It seems, however, that whatever loophole it slipped through the first time snagged it on the second pass. Initial reports are showing a drop of over 80% in search result representation for the site. Other sites hit hard were live123.com, findarticles.com and associatedcontent.com. No real surprises there.
And just to mention a bit of a pet peeve of mine: the overthinking, or sensationalizing, of somewhat arbitrary numbers on the internet. I had been following a thread for a few days in which the discussion centered on what could be considered the biggest threat to Google’s online presence. The top three came back as no real surprise: Bing, Facebook and Google themselves.
The part of the discussion which really made me question the poster’s reading comprehension was the claim that because Facebook is most likely going public, that automatically makes them the biggest threat to Google, full stop. Their reasoning was based on an online tool showing that Google had dropped 2.5% in yearly traffic while Facebook had grown by 15%. That clearly said to them that Facebook is the winner in the dominance race.
My issue with their reasoning, besides the fact that they were spouting opinion as fact, was that they never compared the metrics used to reach those percentages. Problem number one: Google and Facebook are two different online tools. One is social, one is search. Only if and when Google becomes more social, or Facebook focuses on search, can comparisons begin to be drawn. Problem number two: when comparing apples to oranges the numbers will always be skewed, yet that was ignored. Problem number three: Bing was unfairly left out of the comparison. Throwing Bing into the mix really tosses a monkey wrench into things, as they experienced 44% growth from April 2010 to February 2011. In keeping with comparing apples to oranges, I contend that Bing is actually Facebook’s largest competitor, excluding the fact that they have an online partnership.
If you think of the internet as the wild west, then it’s safe to make the correlation of there being good guys, bad guys and everyone else.
Using this basis of comparison, who fits into what category is a completely arbitrary decision that changes between people and organizations. The search engines, for example, Bing, Google, Yahoo and the rest: are they the good guys because of the services they provide? Or are they the bad guys because they can provide you with a basically clear window to the internet? What about the RIAA, the FCC and those of the same ilk? Are they good or bad for wanting to monitor online content, filter it according to rights and punish all who may dare to break their rules?
It’s a new age of content creation, distribution and monitoring, so I find it a little strange that the policy makers are pointing fingers at the big guy, Google. Their claim, as part of the proposed web censorship bill, is that Google (in a nutshell) is responsible for policing the internet and what its searches turn up. A spokesman for Google, Kent Walker, was plain in his answer, saying that if this bill were to pass, private companies would have a tremendous amount of power over Google and its behaviour. He also pointed out that there are flaws in any system, that the bad eggs are out there specifically working on gaming the system, and that just because a website links to content it doesn’t host, it shouldn’t be punished.
Because let’s be honest: as any web designer can tell you, a site can be created in about 20 minutes and uploaded and active online in 30 total. That site will then be crawled and placed in the index as appropriately as possible. The people trying to game the censorship system need only create site after site after site. The pages will be up and indexed faster than they could ever be taken down; anyone with even half an idea of how the web works knows this.
If the bill should pass, it will mean stringent new guidelines to adhere to, and, god forbid, should you post something that someone in power dislikes, you may just find yourself invisible in search no matter what you do. It’s authoritarian rule, managed by those with the most power. And Scarface said it best:
In this country, you gotta make the money first. Then when you get the money, you get the power.
For further information and reading, you can find both sides of the argument, those for the bill and those against it, discussed at Ars Technica.