It really shouldn’t have to be said, but search is changing: it’s evolving into a faster, finer-tuned machine than it was 10 years ago. Within all of those changes, there are different focal points which are constantly being tweaked. Recently, MDG Advertising produced a great graphic outlining the current and future trends of local search.
The term ‘local search’ shouldn’t be an unfamiliar one: any business owner with a website absolutely needs to be concerned with their online exposure. The graphic which MDG produced puts a lot of the research gathered about local search into an easily digestible format, and some of the information, while straightforward, is still exciting to read.
There are a number of factors directly driving current local search growth. One is that business owners, and search engines for that matter, are getting better at targeting where it is you’re conducting your search from. If you’re searching in Winnipeg, there’s no reason for Google/Bing/Yahoo to give you results for Regina when you’re looking for a new car or home repairs. A second large metric to consider is the sharply increased use of mobile phones to conduct searches on the go. It’s estimated that on average 33% of all mobile subscribers use their phones to conduct searches, and 20% of those do it on a daily basis.
All of this local search business has a lot of value in it, especially when you work hard at becoming a local leader in your niche. It’s estimated that this year nearly $6 billion will be generated by local search, and by 2015 that number will be over $8 billion. That’s a huge amount of cash flow which seems to be going largely untapped, because at present, less than 40% of businesses have a local search presence on the SERPs. Local search is only starting to pick up; soon the game will be running at full speed. Have you taken the proper local search optimization steps?
There are a lot of tips online about search engine optimization and the methods you can put to use to rank higher in the Google/Bing/Yahoo SERPs; you can find some of the same type of posts on our blog here as well. You’ll find discussion of white hat techniques, black hat techniques, the commonly known steps as well as some of the not-so-obvious ones.
What you don’t find very often, however, are posts about what not to do, or what to look out for when you’re contracting a company to perform SEO on your website. While the search engines are somewhat flexible in what you’re allowed to do, there are most definitely some tricks which can get you black-marked, all the way to being completely kicked from the SERPs.
So, when you’re looking for a company to perform optimization on your site, keep your ears open for any of the below terms. If there is mention of using any of these practices, it’s time to run for the hills.
Using Cloaked Content
This is one of the most common practices out there, and the one most likely to get your company banned. For the most part, when you create content for your site you’re telling the search engines what your site is for; Google/Bing/Yahoo then list the website under the titles and keywords that are found in that content. Cloaking is when a company shows Google one set of content, and then shows viewers different content, such as ads or links to malware-infected sites. This will get a site removed from Google in very short order.
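As a rough illustration of why cloaking is so easy for engines to catch, a crawler only has to fetch the same URL twice under different user agents and compare what comes back. This is a minimal sketch, not any search engine’s actual pipeline; `fake_fetch` and the helper names are invented for the demo:

```python
import hashlib

def fingerprint(html: str) -> str:
    """Hash the page body so two fetches can be compared cheaply."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def looks_cloaked(fetch, url: str) -> bool:
    """Fetch the same URL as a crawler and as a browser; differing
    content is a cloaking red flag. `fetch(url, user_agent)` is a
    stand-in for any HTTP client."""
    as_bot = fetch(url, user_agent="Googlebot/2.1")
    as_user = fetch(url, user_agent="Mozilla/5.0")
    return fingerprint(as_bot) != fingerprint(as_user)

# Simulated cloaking server: a clean copy for the crawler, junk for people.
def fake_fetch(url, user_agent):
    if "Googlebot" in user_agent:
        return "<html><p>Helpful article about water softeners</p></html>"
    return "<html><p>Buy cheap pills! Click here!</p></html>"

print(looks_cloaked(fake_fetch, "http://example.com"))  # True
```

An honest site returns the same bytes to both, so the check comes back clean; a cloaked one can’t hide from a second fetch.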
A lot of blogs say the meta keywords and description tags are defunct, but Google often looks to these as indicators of the keywords that make up a site. For example, a site about water softeners will typically contain content relevant to that industry. Some companies, however, try to game this through what is known as “keyword stuffing”. Mainly this involves hiding keywords in single-pixel-sized fonts, or camouflaging text the same colour as the background, to try to get listed more often, for more terms. It may seem to work short term, but it will get a site removed from the SERPs.
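Those hidden-text tricks are just as mechanical to spot. Here’s a minimal sketch of the kind of checks involved; the regexes and function name are illustrative only, not how any engine actually parses CSS:

```python
import re

def hidden_text_flags(html: str) -> list:
    """Naive checks for the two tricks described above: tiny fonts
    and text coloured to match the background."""
    flags = []
    # 1px (or smaller) font sizes declared in inline styles
    if re.search(r"font-size\s*:\s*[01]px", html, re.I):
        flags.append("tiny font")
    # text colour identical to a declared background colour
    for bg, fg in re.findall(
            r"background(?:-color)?\s*:\s*(#[0-9a-f]{6}).*?color\s*:\s*(#[0-9a-f]{6})",
            html, re.I | re.S):
        if bg.lower() == fg.lower():
            flags.append("text matches background")
    return flags

page = ('<div style="background:#ffffff; color:#ffffff; font-size:1px">'
        'water softener water softener</div>')
print(hidden_text_flags(page))  # ['tiny font', 'text matches background']
```

Real crawlers render the page and check far more than inline styles, but the point stands: white-on-white text is not invisible to a machine.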
Duplicate Content Websites
Some novice SEOs and SEO companies try to increase rankings by putting the exact same content on different pages across multiple sites. Typically they also use a scraper tool to gather quality content from other websites for their own. Search engines have gotten adept at catching this and will happily penalize a website that has too much duplicate content.
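Duplicate detection is often explained in terms of “shingling”: breaking each page into overlapping word chunks and comparing the sets. A toy sketch of the idea, with invented sample text, and not any engine’s actual implementation:

```python
def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word chunks ('shingles')."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard overlap of the two shingle sets: 1.0 means identical
    wording, values near 0 mean independently written text."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original  = "our water softeners remove hard minerals from your water supply"
scraped   = "our water softeners remove hard minerals from your water supply"
rewritten = "hard minerals are removed from the supply by a good softener"

print(similarity(original, scraped))    # 1.0
print(similarity(original, rewritten))  # much lower
```

A scraped copy scores a perfect 1.0 no matter which site it sits on, which is exactly why spreading one article across many domains gains nothing.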
Auto-Generating Content
Another poor technique is using a program to write content for your website. This is exactly what it sounds like: taking one article and having a program rewrite it by changing a few sentences and keywords, over and over again.
Those are only a few of the terms you need to be aware of when speaking with an SEO company. It bears stressing: if any of the above techniques are mentioned as a tool they use, avoid them at all costs. There is no shortcut to success in online marketing; real SEO takes time, and the more time and effort you can put into it, the bigger return on investment you can expect.
The anti-trust hearings against Google and its supposed stranglehold on the web have been continuing in front of the Senate. There are people on all sides of the argument, it seems: Google on the defensive, Microsoft and a few others decrying that they’ve been wronged by the search giant. And one of the most basic arguments Schmidt has used to rebut the claims of unfair business could very well win the day. Schmidt’s defence basically says:
“Google faces competition from numerous sources including other general search engines (such as Microsoft’s Bing, Yahoo!, and Blekko); specialized search sites, including travel sites (like Expedia and Travelocity), restaurant reviews (like Yelp), and shopping sites (like Amazon and eBay); social media sites (like Facebook); and mobile applications beyond count, just to name a few.”
Now on one hand, yes, Google can provide all of the services that are available on the web, but there are simply better options. If you’re big into social networking, Facebook is still the king; if you travel a lot, you use Expedia to find tickets and deals. I’ve personally used Amazon, eBay and Kijiji to post and purchase items, and even the smaller search engines like Blekko have their place and a few tricks that Google just can’t do.
So Schmidt’s argument, that there are options available online and users just need to navigate to them, is utterly true. Google doesn’t so much have dominance over the internet as it has a dominating presence in the search arena. And as many out there would point out, Bing, Yahoo and the little start-ups like Blekko that come along chip away, little by little, at that armour. Google’s search advantage isn’t going to disappear or diminish in any great capacity until a revolutionary game changer makes itself known, just as Larry and Sergey did with Google.
So don’t worry about Google’s “dominating web presence” so much; instead, use your keyboard and mouse and investigate the alternatives. Just because one site offers similar products doesn’t automatically mean you have to use it. After all, you wouldn’t call Coca-Cola to order some Pepsi.
comScore has put out the August search numbers, and while it shouldn’t be a surprise, Google’s search market share is nearly 65 percent while Yahoo and Bing are collectively in the low-30 percent range.
Google had 64.8 percent last month, down from 65.1 percent in July, a drop of 0.3 percentage points. But as Google dipped ever so slightly, Yahoo and Bing picked up its losses. Yahoo’s share grew by 0.2 points to 16.3 percent from the month before, and Bing rose too, by 0.3 points to 14.7 percent.
While Google slips slightly, the other two are growing their user base. Considering the Microsoft-Yahoo deal last year, whereby Microsoft handles search queries for Yahoo’s pages, the combined effort is beginning to leave its mark on the search statistics front.
Americans conducted 19.5 billion total core search queries in August (up 1 percent). Google ranked first with 12.5 billion searches (up 1 percent), followed by Yahoo! with 3.6 billion (up 5 percent) and Microsoft with 2.6 billion (up 1 percent). But as Microsoft is still behind Yahoo by nearly two percentage points, one has to wonder whether all the investment and deals around Bing are even worth it.
When Google launched their first salvo into the social war with Buzz, they made some really big mistakes. Allowing anyone on your contact list to browse your contacts was a pretty big breach of trust for any social network, and it nearly sank all of Google’s aspirations in one swoop. But fast forward 18 months or so, and we’re over a month into their latest social offering, Google+.
They’ve made some serious improvements to their social understanding by watching the explosive growth of Facebook and their own flop with Buzz. Privacy controls are easy and intuitive to manipulate, friends are easy to arrange, and messaging controls are plain and straightforward. It’s easy to say that Google+ may be a contender in the social arena, having hit 25 million accounts in a fraction of the time Facebook took, but public understanding and acceptance need to temper that growth. People are beginning to understand the nature of social websites, with Facebook having been the king for so long. Many, myself included, find they have entire friend feeds blocked because all those friends do is play Farmville or Cafe World. Facebook boasts high day-to-day activity and retention rates, but if the majority of those people are just there to play games, the quality of that use is definitely in question.
But just like Google’s AdSense and the paid advertising you see on results pages, those game players on Facebook are served ads. Social media marketing is a very real avenue to explore if you’re a small company on a tight budget. Google+ at present doesn’t have business options set up, but they’ve made clear that yes, they are coming. So get your practice in with Facebook, Twitter and PPC/AdSense marketing, because even with a “paltry” 25 million users, Google+ will be a qualified market for advertising.
All of the taglines you generate with Twitter, Facebook and soon Google+ may have more strength than you might think. Nicholas Schiefer recently won a Canada-Wide Science Fair and made interesting inroads in the realm of search. The 17-year-old is being compared to Mark Zuckerberg for the idea and implementation of his search algorithm, and those are no small shoes to fill.
The algorithm, as it’s written, searches short documents like tweets, Facebook statuses and news headlines for starters. That 140-character string of gold is crunched and parsed by his infant algorithm to deliver results. It may not seem much different from what Google, Bing or Yahoo offer, but where it does differ is that his algorithm applies context to the results. A semantic algorithm which could determine context in the results it retrieves would be a great improvement in the realm of social search. As an example: you’ve been out for dinner and had a poor experience, and you could use that type of search engine to determine whether others have had the same experience. It’s possible to do so with the existing search engines, but it takes a bit of work to sort through the results to find customer reviews if you don’t include that as part of your initial search. It’s an impressive start for a young man who may be part of changing the way the world searches. Time will tell how interested the world is in semantic, contextual searching, should Mr. Schiefer continue his project.
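To make the restaurant example concrete, here’s a toy sketch of context-aware scoring over short documents. To be clear, this is not Schiefer’s algorithm; the complaint-word list, weights and sample tweets are invented purely for illustration:

```python
# Invented context vocabulary: words that signal a complaint.
COMPLAINT_WORDS = {"terrible", "awful", "slow", "rude", "cold", "worst"}

def score(doc: str, query: str) -> float:
    """Score a short document against the query plus a complaint context."""
    doc_words = set(doc.lower().split())
    query_hits = len(doc_words & set(query.lower().split()))
    context_hits = len(doc_words & COMPLAINT_WORDS)
    # weight the context so a matching complaint outranks a bare mention
    return query_hits + 2 * context_hits

tweets = [
    "great dinner at marios tonight",
    "service at marios was slow and the food was cold",
    "marios has a new patio",
]
results = sorted(tweets, key=lambda t: score(t, "marios dinner"), reverse=True)
print(results[0])  # the complaint ranks first
```

A plain keyword engine would rank the cheerful “dinner at marios” tweet highest; adding even this crude notion of context surfaces the bad experience you were actually looking for.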
When you log on to your computer, fire up your browser and start your internet trek for knowledge, entertainment or whatever it is that has your mind occupied, are you going to be able to find your answer? It’s a question which has been gaining more and more traction in the last year or so, and DuckDuckGo, a new start-up search engine, has been shaking the search cage in an effort to forge its own path.
Recently they put up a page detailing how, when you perform a search on Google, Bing or Yahoo, you’re not getting a neutral results page. The screenshots of the search results clearly show that different people will receive different results for ‘Egypt’ as a search term. Without even reading the link text, it’s clear that the results pages are vastly different. Why they’re different comes down to dozens, if not hundreds, of reasons. It can be as simple as your location in the country, the time of day, or the trend in the news lately. The short pictorial provided on the DuckDuckGo page details how search engines, Facebook, Twitter and the rest are all delivering pre-packaged results based on your web usage, and they contend that this shouldn’t be happening.
DuckDuckGo is a search engine which doesn’t save your search history, doesn’t pass your search terms on to the websites it refers you to, and has a nifty red box they call “zero-click info” (powered by Wolfram Alpha) which appears on some searches; after all that, it’s throwing its hat into the search engine ring. Being a new player at an old game is a tough market to break into, and DuckDuckGo is performing search in a way that attempts to deliver an unfiltered internet. It’s a noble idea and does have some merit: if you’d like to perform somewhat private searches on sensitive matters, it may be an alternative for you. Google Chrome and Internet Explorer, however, both offer a cookieless browsing mode which accomplishes much the same result, so you don’t really have to give up the engine you know and are familiar with.
The only real way to test whether you genuinely live in a “search bubble” is to perform the same search, without clicking anything, on multiple computers. If you begin seeing that your results are significantly different from other people’s, then perhaps you have a case. Personally, after viewing the screenshots and looking closely at how many pages were fetched for each search term, there are tens of millions of pages of difference, so of course the results are going to differ. Part of Google, Bing and Yahoo’s success comes from the fact that they pass some search data to the referred website in the form of the search term; it’s what enabled the search engines to build their ad programs for web users. There are dozens of variables behind the results you receive after you click that search button, and even a simple one like which data centre serves your results influences your page. If that data centre happens to be running an index a few hours older than the others, you can very easily get different results when performing the same search multiple times.
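If you do want to run that experiment, the comparison itself is simple: collect the top results from each machine and measure the overlap. A quick sketch, with invented example URLs standing in for real result pages:

```python
def overlap(results_a, results_b) -> float:
    """Share of URLs two result pages have in common (Jaccard index)."""
    a, b = set(results_a), set(results_b)
    return len(a & b) / len(a | b)

# Top results for 'egypt' as seen from two hypothetical sessions
machine_1 = ["wikipedia.org/egypt", "news.example/egypt-protests",
             "travel.example/egypt", "cia.gov/egypt"]
machine_2 = ["wikipedia.org/egypt", "travel.example/egypt",
             "flights.example/cairo", "hotels.example/egypt"]

print(f"{overlap(machine_1, machine_2):.0%} of results shared")
```

A consistently low overlap across many machines and many queries would be evidence of a bubble; one divergent query, for the reasons above, proves nothing.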
The founder of the newer engine DuckDuckGo recently discovered that he’s being hit with tons of spam queries for all sorts of seemingly random searches. He’s made note of the fact that while he can block these botnets from spamming his servers with the same query over and over again, the traffic has left him with a question.
In his own words from his blog:
“if other search engines include this errant traffic in their query counts. We work hard to keep them completely out because they would overwhelm our real direct queries #s and therefore distort our perception of progress. “
And while Gabriel makes a solid point and raises a great question about the quality of the searches and query numbers being generated, I think he’s missing the simplest answer. He has managed to block the botnets at the firewall level to keep them from skewing his query numbers. And given that the other search engines, Google, Bing and Yahoo, have been around far longer, it’s safe to assume they’ve already dealt with the issue of false searches. As far as SEO is concerned, this kind of activity is query spam: bots making it appear that the websites in question have received hundreds of thousands of queries and clicks from real users. It’s a game which was dealt with years ago by both Google and Bing, so it’s almost completely a non-issue.
I think the more realistic explanation for the botnet traffic on the new search engine is a very simple problem that anyone with a website that has an input box and no validation can relate to. It’s just spam, looking for an exploit or a kink in the code of whatever website software it has picked up. It’s argued that the small search engines like Blekko and DuckDuckGo offer a better quality of search because they are smaller and less bloated than their big brothers. In time, however, I can see it being realized that the larger these small engines become, the more difficult it will be to deliver incredibly fast results (less than half a second) while maintaining a complex index of hundreds of billions of pages. Google reportedly crossed the trillion-URL mark back in 2008, a number which would cripple most data centres in existence.
With the addition of the Google +1 button to the social world, an old question is making advertising agencies take notice again: is the social element of search beginning to shape where you show up in the SERPs?
It has long been known that when you “Like” a topic via the Facebook button, you can generate a fair amount of traffic with a simple click. Only recently, though, have those “Like”s become entrenched in the Bing results, beginning to shape your personal results pages. When you’re signed into your Facebook account and you perform a search for, say, model racecars in Bing, you’ll be able to see, mixed within your search results, whether any of your friends are involved in the same model racing scene as you are. It can create a good deal of traffic if your site caters to that social level. With the addition of the Google +1 button, it’s assumed we’ll begin to see the same types of results integrated into Google as you see in Bing with the Facebook “Like” button.
Part of the idea is that you can discover which of the people on your friends list you have more in common with than you already knew. It’s really a personal preference at this point in the game, as you need to be signed in to both services to view your friends’ likes in your search results. It’s going to be an interesting shift in the search game, depending on how heavily your friends’ connections come to be valued against the organic listings as they stand presently.
There are some general misconceptions about SEO which crop up from time to time, often when going over the process with clients. Some points are extremely valid questions to raise, while others receive ambiguous answers because the landscape changes every day.
A discussion point like “Why do we need to wait while building backlinks to our site?” tends to come up, for example. Building quality backlinks to your website takes time, first of all; secondly, if you were to go the shady route and buy thousands of links to boost your PageRank, it’s a very quick way to get the search engines’ attention. And not in a good way!
“Why should I pay you every month when this other guy says he can do the same for a one-shot job?” This is probably the largest misconception about the SEO industry and one of the hurdles we’re met with in dealing with new clients. The biggest reason you can’t do just a once-over and expect the results to carry on forever is that the internet doesn’t shut off. It doesn’t stop, it doesn’t sleep, it’s always changing. And in order to keep up, the search engines do exactly the same: they change their algorithms, tweak the results and shift the rankings on a weekly, and sometimes daily, basis. Upkeep is absolutely essential to remain competitive in search engine optimization, and someone telling you they can plant you firmly at the top for a one-time cost of $200 is yanking your chain.
“SEO doesn’t seem so bad; I’m sure our techs can do it here.” This is perhaps the most closed-minded statement to be encountered. I’ve written of it here on the blog before: pick the right horse for the course. When you’re building a new website, contract a web designer. When you’re adding or updating basic information on your site, use your techs. When you want to bring your brand and website up in the rankings, use a search engine optimization expert. Having the tech who does your database scripting do your optimization is money lost at best. At worst, they try to shortcut your site and you get kicked from the index for breaking a rule or two.
If you think of the internet as the wild west, then it’s safe to say there are good guys, bad guys and everyone else.
Using this basis of comparison, who fits into what category is a completely arbitrary decision that changes between people and organizations. The search engines, for example, Bing, Google, Yahoo and the rest: are they the good guys because of the services they provide? Or are they the bad guys because they can provide you with a basically clear window to the internet? What about the RIAA, FCC and those of the same ilk? Are they good or bad because they want to monitor online content, filter it according to rights and punish all who dare to break their rules?
It’s a new age of content creation, distribution and monitoring, so I find it a little strange that the policy makers are pointing fingers at the big guy, Google. Their claim, as part of the proposed web censorship bill, is (in a nutshell) that Google is responsible for policing the internet and what its searches turn up. A spokesman for Google, Kent Walker, was plain in his answer, saying that if this bill were to pass, private companies would have a tremendous amount of power over Google and its behaviour. He also pointed out that there are flaws in any system, that the bad eggs are out there specifically working on gaming the system, and that just because a website links to content it doesn’t host, it shouldn’t be punished.
Because let’s be honest: as any web designer can tell you, a site can be created in about 20 minutes and uploaded and active online in 30 total. That site will then be crawled and placed in the index as appropriately as possible. For the people trying to game the censorship system, all they have to do is create site after site after site. The pages will be up and indexed faster than they could ever be taken down; anyone with even half an idea of how the web works knows this.
If the bill should pass, it will mean stringent new guidelines to adhere to, and god forbid you post something that becomes disliked by someone in power, because you may just find yourself invisible in search no matter what you do. It’s authoritarian rule, managed by those with the most power. And Scarface said it best:
In this country, you gotta make the money first. Then when you get the money, you get the power.
For further information and reading, you can find both sides of the argument, for and against the bill, discussed at Ars Technica.