I had an odd occurrence recently that speaks to how search is evolving, and it involved a rogue browser extension. It’s mildly annoying when a toolbar gets installed in your browser of choice, but it’s frustrating when it’s installed without your express knowledge, say by having the install clause buried in the EULA for another program.
The rogue extension in question was Surf Canyon, best described in short as a real-time search reorganizer. With the internet comprising trillions of web pages, search engines like Google, Bing and Yahoo are the big hitters in locating what you need online. They all have their own pros and cons, but Google has been the weapon of choice for the majority of searchers for the past 10+ years.
Real-time search results have become a challenge for all of the search players, with everyone working on a solid solution for serving up relevant results which complement the current organic offerings. The idea behind the Surf Canyon extension is that it personalizes the web for you as you search. A fine enough idea; what was actually noticed, however, was that the extension has somewhat of a mind of its own.
Toolbars are a nuisance in a browser, and fake links on webpages are a pain because you don’t really know what’s real and what isn’t without clicking. But a browser extension which plants false links into webpages you know have no outgoing links? That’s poor business practice and sketchy access to a computer and its browsing history.
Is Bing more biased than Google when it comes to the results pages? According to research that has been gaining traction as of late, the answer seems to be yes. It wasn’t a directed study on a few select terms either; it was a large random sampling of the SERPs conducted by a professor at George Mason University.
What he found in his tests was that, for the most part, Bing favors Microsoft content more often and more prominently than Google favors its own content. According to the findings, Google references its own content in its first results position in just 6.7% of queries, while Bing links to Microsoft content more than twice as often (14.3%). The percentages may seem small, but when you consider that billions of searches are performed daily, suddenly 14% isn’t such a small number.
The findings also cast a different light on the recent FTC antitrust complaints Google has been handling surrounding anti-competitive behaviour. It’s also a stark contrast to a similar study done earlier in the year, which concluded that “Google intentionally places its results first.” So, as a user faced with two completely different data points, which set is to be believed?
Well, the second study had two goals in mind: to replicate the findings of the first study, and to expand on the methods used to determine whether the issue lay in how the results came about. From the very beginning, it was found that while Google does favor its own content at some points, the selection of terms involved is exceedingly small. It was also learned, though it wasn’t mentioned in the first study, that Bing does precisely the same in preferring Microsoft results, but for a much wider range of terms, and is much more likely to do so: “For example, in our replication of Edelman & Lockwood, Google refers to its own content in its first page of results when its rivals do not for only 7.9% of the queries, whereas Bing does so nearly twice as often (13.2%).”
As for the second part, the study used a much larger, more random sampling of search queries, as opposed to the mere 32 samples the first study used to portray Google as the big bad guy of search. The findings were those related at the beginning of this post: Google references its own content in its first results position when no other engine does in just 6.7% of queries, while Bing does so over twice as often (14.3%).
So what does this mean for you as an end user? Google (and Bing, though less so) really are trying to deliver the best results possible, regardless of whether they come from their own services (local search, product search, etc.) or not. It all comes down to preference.
There are many steps which are part of a successful organic SEO campaign. There are all the little steps like writing good content, making sure the titles and meta tags are in place and having a comprehensive menu. Once you’re finished with those good-practice pieces, you begin to read about one of the most time-intensive steps of the campaign: link building.
Since Panda reared its head over the last year or so, there’s been chatter about how the SEO game has fundamentally changed; that the scrapers and content aggregators, the black hatters and the link buyers would just disappear and we’d have pristine, precise results. Time has started to play its part, and while the scrapers, aggregators, black hatters and link buyers have mostly been swept away, there has recently been a new call to revamp the way the system works. The desire to change the link-building portion of the search game comes up in discussion from time to time, as the points for and against the practice are argued. When you break it all down to the basics, practically every search engine will tell you the same thing: content is king. If you produce quality, relevant content, you will rank in the SERPs.
The kicker about producing this kind of content, however, is that you will naturally receive backlinks to your site and its pages. When you’re a new site and you need to visit and email potential customers and partners in the same niches, building those backlinks takes time. But they will be built, they will be counted as a metric by the search engines, and until an algorithm comes along which can read and evaluate content as a user would, link building will remain relevant. It will be an important portion of any and every organic SEO campaign, no matter how big or how small. The success of your link-building campaign can be directly tied to how much work you’re willing to put into contacting those in industries which complement your own.
It really shouldn’t have to be said, but search is changing; it’s evolving into a faster, finer-tuned machine than it was 10 years ago. Within all of those changes, there are different focal points which are constantly being tweaked. Recently, MDG Advertising produced a great graphic outlining the current and future trend of local search.
The term ‘local search’ shouldn’t be an unfamiliar one; any business owner with a website absolutely needs to be concerned with their online exposure. The image MDG produced puts a lot of the research gathered about local search into an easily digestible format, and some of the information, while straightforward, is still exciting to read.
There are a number of factors directly affecting current local search growth. One is that business owners, and search engines for that matter, are getting better at targeting where it is you’re conducting your search from. If you’re searching in Winnipeg, there’s no reason for Google/Bing/Yahoo to give you results for Regina, for example, if you’re looking for a new car or home repairs. A second large metric to consider is the greatly increased use of mobile phones to conduct searches while on the go. It’s estimated that on average 33% of all mobile subscribers use their phones to conduct searches, and 20% of those do it on a daily basis.
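Those two figures compound. A quick back-of-the-envelope calculation (using only the estimates quoted above) shows what they mean for the subscriber base as a whole:

```python
# Estimates quoted above: the share of mobile subscribers who search at all,
# and the share of those searchers who do so daily
searchers = 0.33
daily_among_searchers = 0.20

# Fraction of ALL mobile subscribers who search daily
daily_overall = searchers * daily_among_searchers
print(f"{daily_overall:.1%} of all mobile subscribers search daily")  # 6.6%
```

In other words, roughly one in fifteen mobile subscribers is running a search every single day.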
All of this local search business has a lot of value in it, especially when you work hard at becoming a local leader in your niche. It’s estimated that nearly $6 billion will be generated by local search this year, and by 2015 that number will be over $8 billion. That’s a huge amount of cash flow, and it seems to be going largely untapped, because at present less than 40% of businesses have a local search presence on the SERPs. Local search is only starting to pick up; soon the game will be running at full speed. Have you taken the proper local search optimization steps?
There are a lot of tips online about search engine optimization and the methods you can put to use to rank higher in the Google/Bing/Yahoo SERPs. You can find some of the same type of posts on our blog here as well. You’ll find discussion of white hat techniques, black hat techniques, the commonly known steps as well as some of the not-so-obvious ones.
What you don’t find very often, however, are posts about what not to do, or what to look out for when you’re contracting a company to perform SEO on your website. While the search engines are somewhat flexible in what you’re allowed to do, there are most definitely some tricks which can get you black-marked, or even completely kicked from the SERPs.
So, when you’re looking for a company to perform optimization on your site, keep your ears open for any of the below terms. If there is mention of using any of these practices, it’s time to run for the hills.
Using Cloaked Content
This is one of the most common practices out there, and one of the most likely to get your company banned. For the most part, when you create content for your site you’re telling the search engines what your site is for; Google/Bing/Yahoo then list the website under the titles and keywords found in that content. Cloaking is when a company shows Google one set of content, and then shows viewers different content such as ads or links to malware-infected sites. It will get a site removed from Google in very short order.
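As a rough illustration of how cloaking can be caught, one could fetch the same URL twice, once with a crawler User-Agent and once with a browser one, and compare the responses. The sketch below is my own invention (the function name, sample markup and 0.5 threshold are all placeholders, not any engine’s actual method) and works on two already-fetched HTML strings:

```python
from difflib import SequenceMatcher

def looks_cloaked(bot_html: str, user_html: str, threshold: float = 0.5) -> bool:
    """Flag a page whose crawler-served HTML barely resembles the HTML
    served to a normal browser (a rough, invented heuristic)."""
    similarity = SequenceMatcher(None, bot_html, user_html).ratio()
    return similarity < threshold

# Hypothetical responses for the same URL under two User-Agent strings
served_to_googlebot = "<h1>Water Softeners</h1><p>A guide to softening hard water.</p>"
served_to_browser = "<h1>Cheap pills!</h1><a href='http://malware.example'>Click here</a>"

print(looks_cloaked(served_to_googlebot, served_to_browser))
```

A real check would also have to account for legitimate differences, such as ads or personalization, which is exactly why the engines treat intent, not mere difference, as the offence.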
Keyword Stuffing
A lot of blogs talk about how the meta keywords and description are defunct, but Google often looks to these as indicators of the keywords that make up a site. For example, a site about water softeners will often contain content relevant to that industry. Some companies, however, try to gain extra rankings through what is known as “keyword stuffing”. Mainly this involves hiding keywords in single-pixel-sized font, or camouflaging text by making it the same color as the background, to try and get listed more often, for more terms. It may seem to work short term, but it will get a site removed from the SERPs.
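A hidden-text check can be sketched as a scan for the telltale inline styles mentioned above. This is a simplified heuristic of my own devising, real crawlers render full CSS, but it shows the idea:

```python
import re

# Inline-style patterns commonly used to hide stuffed keywords
HIDDEN_STYLE = re.compile(
    r"font-size:\s*[01]px"                               # unreadably tiny text
    r"|display:\s*none"                                  # removed from rendering
    r"|color:\s*#fff[^;]*;\s*background(?:-color)?:\s*#fff",  # white on white
    re.IGNORECASE,
)

def hidden_text_spans(html: str) -> int:
    """Count style attributes that look like hidden-keyword tricks."""
    return len(HIDDEN_STYLE.findall(html))

page = '<div style="font-size:1px">water softener water softener</div>'
print(hidden_text_spans(page))  # 1
```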
Duplicate Content Websites
Some novice SEOs and SEO companies try to increase rankings by putting the exact same content on different pages across multiple sites. Typically they also use a scraper tool to gather quality content from other websites for their own. Search engines have gotten adept at catching this and will happily penalize a website that has too much duplicate content.
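One common way duplication gets detected is by comparing overlapping word sequences, or “shingles”, between two pages. Here’s a minimal sketch of that idea; the exact techniques search engines use aren’t public, so treat this as illustrative only:

```python
def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word 'shingles'."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of two pages' shingle sets (0 = unrelated, 1 = identical)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "hard water leaves scale deposits on pipes and appliances"
scraped = "hard water leaves scale deposits on pipes and appliances"
print(similarity(original, scraped))  # 1.0 -- an exact copy
```

Scraped copies score near 1.0 even when a few words are shuffled, which is why wholesale copying is so easy to catch.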
Auto-Generating Content
Another poor technique is using a program to write content for your website. This is exactly what it sounds like: taking one article and having a program rewrite it by changing a few sentences and keywords, over and over again.
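A toy version of such a “spinner” makes the technique, and why its output reads so poorly, obvious. The synonym table here is an invented example:

```python
import random

# A toy synonym table -- real "spinners" ship much larger dictionaries
SYNONYMS = {
    "good": ["great", "fine", "solid"],
    "article": ["post", "piece", "write-up"],
    "write": ["compose", "produce", "craft"],
}

def spin(text: str, rng: random.Random) -> str:
    """Rewrite text by swapping listed words for random synonyms --
    the low-effort auto-generation technique described above."""
    out = []
    for word in text.split():
        choices = SYNONYMS.get(word.lower())
        out.append(rng.choice(choices) if choices else word)
    return " ".join(out)

print(spin("write a good article", random.Random(0)))
```

The word swaps ignore grammar and context entirely, which is exactly why spun content reads like gibberish and gets penalized.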
Those are only a few of the terms you need to be aware of when speaking with an SEO company. To stress the point: if any of the above techniques are mentioned as a tool they use, avoid them at all costs. There is no shortcut to success in online marketing; real SEO takes time, and the more time and effort you put into it, the bigger the return on investment you can expect.
The antitrust hearings against Google and its supposed stranglehold on the web have been continuing in front of the Senate. There are people on all sides of the argument, it seems: Google on the defensive, Microsoft and a few others decrying that they’ve been wronged by the search giant. And one of the most basic arguments that Schmidt has used to rebut the claims of unfair business could very well win the day. Schmidt’s defence basically says:
“Google faces competition from numerous sources including other general search engines (such as Microsoft’s Bing, Yahoo!, and Blekko); specialized search sites, including travel sites (like Expedia and Travelocity), restaurant reviews (like Yelp), and shopping sites (like Amazon and eBay); social media sites (like Facebook); and mobile applications beyond count, just to name a few.”
Now, on one hand, yes, Google can provide all of the services that are available on the web, but there are simply better options. If you’re big into social networking, Facebook is still the king; if you travel a lot, you use Expedia to find tickets and deals. I’ve personally used Amazon, eBay and Kijiji to post and purchase items, and even the smaller search engines like Blekko have their place, with a few tricks that Google just can’t do.
So Schmidt’s argument, that there are options available online and users just need to navigate to them, is utterly true. Google doesn’t so much have dominance of the internet as it has a dominating presence in the search arena. And as many out there would point out, Bing, Yahoo and the little start-ups like Blekko which come along chip away, little by little, at that armour. Google’s search advantage isn’t going to disappear or diminish in any great capacity until a revolutionary game changer makes itself known, just as Larry and Sergey did with Google.
So don’t worry about Google’s “dominating web presence” so much; instead, use your keyboard and mouse and investigate the alternatives. Just because one site offers similar products doesn’t automatically mean you have to use it. After all, you wouldn’t call Coca-Cola to order some Pepsi.
Some would think it foolish to continually bail out a sinking ship as it springs more and more holes the longer it’s in the water. Yet as strange a metaphor as that may be, that’s exactly what Microsoft has been doing for the last 4+ years with its search engine, first as Live and now as Bing.
Realistically, you might initially think it can’t really be that bad: search is a multi-billion-dollar-a-year industry, and Bing, along with search powered by Bing, has about a third of the market, so how bad could it really be? Bing’s search share has been growing slowly, albeit steadily, since the rebranding, and the growth and change have been positive. So how bad is it really? Since 2009, Bing has lost Microsoft $5.5 billion.
The image itself should put into stark contrast the amount of money that’s been lost in the search game. But Microsoft isn’t without tricks or ideas, and in an interview they hinted at some upcoming surprises.
Microsoft President of Online Services Qi Lu gave an impassioned speech about how Bing would improve search by “reorganizing the Web.” To do that, Microsoft plans to leverage its network of products and partnerships to gain a better understanding of what the user is after when they enter a query into a Bing search box.
And as if to emphasize that Microsoft has recognized the necessity for radical change, Lu also said in the same interview:
Microsoft could not and would not try to “out-Google” Google. Instead, it must “change the game fundamentally.”
If Bing stays the course, even the money analysts are predicting positive cash flow in the next 3-4 years, provided of course that Bing continues to earn search share from Google. Lu’s statement, however, sounds as if Bing is poised to be sent in a new direction.
comScore has put out the August search numbers, and while it shouldn’t be a surprise, Google’s search market share is nearly 65 percent while Yahoo and Bing are collectively in the low-30-percent range.
Google had 64.8 percent last month, down from 65.1 percent in July, a drop of three-tenths of a percentage point. But as Google dipped ever so slightly, Yahoo and Bing picked up its losses. Yahoo’s share grew by 0.2 points to 16.3 percent from the month before, and Bing rose too, by 0.3 points to 14.7 percent.
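It’s worth separating the absolute dip from the relative one, since “dropped 0.3” can be read two ways:

```python
july, august = 65.1, 64.8  # comScore share figures quoted above

point_change = august - july           # change in percentage points
relative_change = point_change / july  # change relative to July's share

print(f"{point_change:+.1f} points, {relative_change:+.2%} relative")
# -0.3 points, -0.46% relative
```

A 0.3-point slip is a decline of just under half a percent of Google’s actual user share: small either way, but not the same number.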
While Google drops slightly, the other two are growing their user bases. Considering the Microsoft-Yahoo deal last year, whereby Microsoft handles search queries for Yahoo’s pages, the combined effort is beginning to leave its mark on the search statistics front.
Americans conducted 19.5 billion total core search queries in August (up 1 percent). Google ranked first with 12.5 billion searches (up 1 percent), followed by Yahoo! with 3.6 billion (up 5 percent) and Microsoft with 2.6 billion (up 1 percent). But as Microsoft is still behind Yahoo by nearly two percentage points of share, one has to wonder whether all the investment and deals around Bing are even worth it.
When Google launched their first salvo into the social war with Buzz, they made some really big mistakes. Allowing anyone on your contact list to browse your contacts was a pretty big breach of trust for any social network, and it nearly sank all of Google’s aspirations in one swoop. But fast forward 18 months or so, and we’re over a month into their latest social offering, Google+.
They’ve made some serious improvements to their social understanding by watching the explosive growth of Facebook and their own flop with Buzz. Privacy controls are easy and intuitive to manipulate, friends are easy to arrange, and messaging controls are plain and straightforward. It’s easy to say that Google+ may be a contender in the social arena, having hit 25 million accounts in a fraction of the time Facebook took, but public understanding and acceptance need to temper that growth. People are beginning to understand the nature of social websites, with Facebook having been the king for so long. Many, myself included, have entire friend feeds blocked because all those friends do is play Farmville or Cafe World. Facebook boasts high day-to-day activity and retention rates, but if the majority of those people are just there to play games, the quality of that use is definitely in question.
But just like Google’s AdSense and the paid advertising you see on results pages, those game players on Facebook are served ads. Social media marketing is a very real avenue to explore if you’re a small company on a tight budget. Google+ at present doesn’t have business options set up, but they’ve made it clear that yes, they are coming. So get your practice in with Facebook, Twitter and PPC/AdSense marketing, because even with a “paltry” 25 million users, Google+ will be a qualified market for advertising.
All of the taglines you generate with Twitter, Facebook and soon Google+ may have more strength than you might think. Nicholas Schiefer recently won a Canada-Wide Science Fair and made interesting inroads in the realm of search. The 17-year-old is being compared to Mark Zuckerberg for the idea and implementation of his search algorithm, and those are no small shoes to fill.
The algorithm, as it’s written, searches short documents like tweets, Facebook statuses and news headlines for starters. That 140-character string of gold is crunched and parsed by his infant algorithm to deliver results. It may not seem much different from what Google, Bing or Yahoo offer, but where it does differ is that his algorithm applies context to the results. The advantage of a semantic algorithm which could determine context in the results it retrieves would be a great improvement in the realm of social search. As an example, if you’ve been out for dinner and had a poor experience, you could use that type of search engine to determine whether others have had the same experience. It’s possible to do so with the existing search engines, but it takes a bit of work to sort through the results to find customer reviews if you don’t include that as part of your initial search. It’s an impressive start for a young man who may be part of changing the way the world searches. Time will tell how interested the world is in semantic, contextual searching, should Mr. Schiefer continue his project.
So, in the world of search there’s a handful of true search engines, those little boxes into which you type your current question or conundrum before heading off into the wild internet. We have Bing, which holds somewhere around 27% or so of the search market; Google, which holds the lion’s share at just over 65%; and all those little crumbs at the bottom, search engines like Ask.com and the rest.
It’s not difficult to find press about how Bing is making massive inroads into Google’s share of search, or how last year Bing grew by over 90%... blah blah blah. When you boil the numbers all the way down, however, all you’re really left with is Google and Bing, and the only way Bing is going to make positive growth in search is to take it from Google. So titles to the tune of “Bing overtaking Google” or “Bing grows 90% over the year” are almost wholly misleading. Even with all of this “incredible growth”, and all of the add-ins and marketing strategies Microsoft throws at Bing, they’re left with a fairly large problem. Despite owning more than 25% of the world’s search volume, Bing doesn’t make any money for Microsoft.
That may not seem to make any sense, but look at it from a different perspective: the advertising angle. The sole product sold by search engines is the advertisements that appear on search pages, which are sold not for a set amount, but based on how many times customers click on an ad tied to the search phrase that brought the user to the page. And since Google has such a huge search market share, they’re rolling in cash right from the start because of the cost per click in their AdWords program. Now, the single biggest reason Bing doesn’t make money isn’t just that they have a smaller search share than Google; as it turns out, the cost per click tied to their advertising model is as little as 1/5 of Google’s. As bad as that may sound as a revenue model, it actually gets a little worse for the Bing machine. A lower CPC looks great on the surface, but as an advertiser it raises the question of what is driving that low cost. Bing has less traffic than Google at the outset, the CPC to serve the same ad on Bing is cheaper than on Google, and in the end it translates into fewer ad impressions on the Microsoft search engine.
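To see how market share and CPC compound against Bing, here’s a toy revenue model. All inputs are made-up round numbers for illustration; only the roughly 1/5 CPC ratio comes from the discussion above:

```python
def ad_revenue(volume: float, click_rate: float, cpc: float) -> float:
    """Revenue = search volume x fraction of searches that click an ad x price per click."""
    return volume * click_rate * cpc

# Made-up round numbers: relative search volumes, identical click behaviour,
# and Bing's CPC at roughly 1/5 of Google's (the ratio discussed above)
google = ad_revenue(volume=65, click_rate=0.05, cpc=1.00)
bing = ad_revenue(volume=27, click_rate=0.05, cpc=0.20)

print(f"Google earns {google / bing:.1f}x Bing's ad revenue in this model")
```

Even with more than a third of Google’s volume in this sketch, the depressed CPC leaves Bing with less than a tenth of the revenue: share and price multiply, they don’t add.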
So the question, in the end: is there ever really going to be a solid competitor to the Google machine? If a multi-billion-dollar-a-year company can’t even step into the same arena as the giant and succeed, who truly can? I say bring them all on; competition is what made the web what it is today, and more will only make it better.