It’s long been clear that a handful of search engines receive the majority of online traffic. Google, Bing, and Yahoo handle most searches, with sites like Blekko, DuckDuckGo, and Ask serving those who want a different experience. Facebook, however, is one of the largest websites online, and with a user base approaching a billion, the world has been waiting to see whether Facebook will enter the search arena.
Ever since Facebook went public, the stock has been a sort of tepid pool, and with no clear revenue model, the online mutterings often turn to the topic of a search engine. And when Mark Zuckerberg throws around statements like “Facebook is doing a billion searches per day without trying,” the mutterings pick up some volume. In a recent TechCrunch interview, Zuckerberg made it clear that the company sees a huge opportunity for search within Facebook, but tempered that by noting that the way users search is ever evolving.
“Search engines are evolving” to “giving you a set of answers,” Zuckerberg said. “Facebook is pretty uniquely positioned to answer a lot of questions people have. At some point we’ll do it,” he went on. “We have a team working on search.”
There are some real concerns about just how Facebook could leverage its massive user base as a search engine, however, and they have less to do with spidering capabilities than with privacy. The system as Zuckerberg briefly described and envisioned it would take the opinions of your friends, family, and contacts and form a result for your query. Searches for terms like ‘best burger’ or ‘new Batman movie reviews’ wouldn’t necessarily return informational pages, but would deliver a list of opinions from your contact list. At any rate, it’s not happening today or tomorrow, or even soon for that matter. But if (and when) it does, it will introduce change to the search landscape and the online experience as a whole, and change is very, very good.
Google is the most widely used search engine globally, accounting for roughly 65% of usage in North America. Bing is the rebrand of the Live Search service, and it has lately been holding steady at around 30% of the search marketplace.
Bing has long contended that it offers a comparable search service, and some in the search world share that sentiment. But even with the rebrand, the television commercials, and the takeover of Yahoo’s search market, its share remains steady at roughly a third of the market. Its latest effort, dubbed the Bing It On test, is a blind survey that displays unformatted, unbranded results and lets the user decide which of the two sets they would use. It’s the same testing method as the Pepsi Challenge, where random people were given samples of two drinks and asked which they preferred.
When Bing tallied the results of its (very small) online sample of 1,000 people, it found that users chose the Bing results at almost a 2:1 ratio. That’s a large statistical departure from the current norm of Google dominating the search market, so why don’t the numbers match current market share? Well, for starters, the sample is incredibly small. A data sample of 1,000 people in the 18+ demographic is a drop in the ocean when there are somewhere north of 200 million adults in the US alone. If you’re interested in which search engine appeals to you as a user, you can try out the survey for yourself here.
Previously I wrote about a Google patent which has gained more and more traction in the search community. The patent in question is named Ranking documents, and as mentioned previously, it seems like an interesting tactic for catching those who use less-than-ideal tactics to rank a website.
Unfortunately, you’ll often find that people discussing search engine optimization tack on terms like spamming or buying links, but the truth of it is this: when you do it right, SEO is none of those things. I think of the newly discovered patent like a magic trick; it’s a bit of sleight of hand that Google is using to get those unscrupulous tacticians to reveal themselves. Google shows them the index, and they spam to get ranked highly. When Google notices the spam, it shows them a different set of results (if you don’t think they can do that, you’re not thinking clearly) just to see how the target reacts. If they continue to offend, their site will likely be penalized, while the legitimately ranking pages remain unaffected.
Spamming links to a website to give unearned weight to a term is a measure used by those in the industry who are in the game for only a few quick clicks. Often working with disposable website URLs, they target a hot, trending term (Paul Ryan’s RNC speech, as an example) and try to garner as many links and as much visibility as possible in as short a time frame as they can. It’s a pump-and-dump tactic that is very likely to get the offending site banned from the index and the URL blacklisted.
I foresee this patent being dissected, discussed, scrutinized, and blamed for all manner of SEO problems and headaches to come. The only problem some site owners have with Google, Bing, and Yahoo is that the engines are getting better and better at catching people who cut corners or use less-than-natural methods to rank their sites. Maybe one day soon we’ll even have an adaptive algorithm that detects and removes spam sites as you search, so black hatters go the way of the dodo bird.
There have been adjustments, changes, and what seem like complete rewrites of the algorithm Google started with in the beginning. At first, the results you were given were based directly on the query as you entered it, sorted by how many backlinks each page had. Now, however, when you search for ‘Winnipeg Jets’, images and video for the team appear even though the words “images” or “videos” weren’t in the query.
The algorithm that Google, Bing, and a handful of others use has grown and evolved to the point where it tries to anticipate what you’re searching for in addition to answering the direct query you typed. The search engines are getting better at surfacing what you want to see on a given topic, and seem to weigh clicks through to a result along with all of the previous criteria. If you’ve ever mistyped a word while searching, you’ve probably noticed that search engines can correct commonly made spelling mistakes. What the engines are getting much better at, however, is not just correcting your spelling but interpreting what you may actually be searching for. Google can load dictionaries of correctly spelled words and common misspelled variations of those words, observe how searchers correct their own queries and when they click on different variations, and use this data not only to suggest a query with a different spelling but to treat the misspelling as a synonym behind the scenes and rank the correctly spelled matches.
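The core of that correction step can be illustrated with a toy sketch. This is not Google’s actual system, just a minimal edit-distance corrector under assumed inputs: the `corpus` list stands in for the query logs a real engine would mine, and all the names here are hypothetical.

```python
from collections import Counter

# Hypothetical toy corpus of correctly spelled queries; a real engine
# would learn these frequencies from billions of searches and clicks.
corpus = ["batman movie", "best burger", "winnipeg jets", "batman", "burger"]
word_counts = Counter(w for phrase in corpus for w in phrase.split())

def edits1(word):
    """All strings one edit (delete, swap, replace, insert) away from word."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [l + r[1:] for l, r in splits if r]
    swaps = [l + r[1] + r[0] + r[2:] for l, r in splits if len(r) > 1]
    replaces = [l + c + r[1:] for l, r in splits if r for c in letters]
    inserts = [l + c + r for l, r in splits for c in letters]
    return set(deletes + swaps + replaces + inserts)

def correct(word):
    """Pick the most frequent known word within one edit, else keep the word."""
    if word in word_counts:
        return word
    candidates = [w for w in edits1(word) if w in word_counts]
    return max(candidates, key=word_counts.get) if candidates else word
```

Given that corpus, `correct("batmann")` comes back as `"batman"`. An engine treating the misspelling as a synonym would then match documents under both spellings rather than merely suggesting the fix.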
And in the same basket as interpreting what you might be looking for, Google is noticeably moving forward on discerning your intent while searching as well. A basic description of how it works:
Keyphrases don’t have to be in their original form. We do a lot of synonyms work so that we can find good pages that don’t happen to use the same words as the user typed. – Matt Cutts
Even Bing is getting in on the act: if you misspell a common word or phrase, you’ll still often be brought to the correct results. Perhaps soon enough you won’t need to search by typing at all, but simply by visiting your preferred search engine. Because you researched a new car and purchased one through an online dealership, the engine will know that in a few months it should perhaps deliver information on local garages offering oil changes and tire rotation services.
Google has been king of the search world almost from the day it became a tool on the web. There are a handful of other search engines as well, all of which do their best to offer a choice when you’re looking for information online.
There’s been some discontent with Google as a service, and it sometimes leaves users craving an alternative to the giant. There’s a small problem with that idea, however, and it’s the same reason Google is so successful. Consider the basics of search: if you have quality content that people readily link to, you’re going to be well represented on the search engines, whether Google, Bing, or the rest. Google has simply worked out how best to deliver the content you’re most likely searching for, because it can evaluate both the content and the links leading back to that content.
That doesn’t mean, though, that the web and search are due for stagnation. Everyone is working to innovate in the space, trying to find the newest and biggest evolution in search and online interaction. Some engines do their best to be an answer engine, where you essentially query a database for an answer, while others pride themselves on being hand-curated by teams of human users who promote the most relevant content. Bing and Google are both trying their hands at integrating your social life into your search results, with mixed success at present. But what all of these search engines stumble over is the same issue: all of the current relevant results are built primarily upon links and link structures to give value and authority to a website.
The future of search won’t lie in constructing links back to your quality content; it will arrive when someone comes up with a search engine that can predict what you may be searching for. When you start looking for a home for sale in a new city, for example, it could determine from your current and previous searches that you need a home near a school for your children, and deliver those results as the most relevant. The technology doesn’t quite exist in that form at the moment, as it would require massive amounts of calculation to hold the web open, ready to pick out the points you’re searching for. But the web and its technology grow every day, and perhaps soon enough we’ll be able to talk to our devices to find what we want.
Around the middle of last year, Google started slowly pushing out warnings to webmasters about what it deemed ‘unnatural links’ pointing to their websites. Unnatural links, for lack of a better description, are links unrelated to your website; a plumbing forum linking to a site about cooking or gardening, for example. Earlier this year, Google stepped up the notifications significantly and almost immediately sent the world of search engine optimization into a tizzy.
It was at that juncture that webmasters began to drop links to and from their websites, probably in the hope that by sending in a reconsideration request they could clear the mark from their Webmaster Tools page. It’s an interesting process Google has put in place with the unnatural-links notifications: some webmasters have presented evidence showing that they did nothing and still plummeted in the search results, while others, who went through untold rigmarole trying to get their links cleaned up, reported no change in their positioning despite multiple notifications.
And to muddy the waters just a little more, over the last day or so Google has sent out another massive batch of unnatural-link notifications to webmasters everywhere. It seems that with all of the features Google has lately been adding to its Webmaster Tools suite, it’s really looking to place responsibility on site owners. It’s an interesting twist when you consider that search engines base at least some portion of their ranking on the links pointing to a website. Maybe this is the beginning of Google diversifying its ranking algorithm and ideals? Time will tell, but giving webmasters the idea that they need to carefully maintain their link profiles is an interesting step.
Just as Hitwise measures search market share, the ACSI (American Customer Satisfaction Index) puts out a report that tries to put a number on how happy users are with the various search engines and social media sites out there. While the survey held some expected results, there was a surprise or two to be seen.
As far as search engines were concerned, it wasn’t a huge surprise to see Google still at the top of the list with 82 points out of 100, with Bing picking up a little ground and coming in at 81 points of satisfaction. When pressed for reasons, more than half of the respondents who chose Bing noted that they liked its ease of use. I may be somewhat biased, as I’ve always primarily used Google for the bulk of my searching, and perhaps it’s a difference of Bing.ca versus Bing.com, but I’m not sure how Bing is easier to use than Google when both are just a search box. The links that appear after performing a search are nearly identical, and it’s a rather short affair to refine your results and tailor them as you like. But opinions differ for everyone, and that is the main point of a survey, after all: to gather as many different ones as possible.

Back to the list. Plonking our way down, we pass Ask.com at 80 points and Yahoo at 78 points in customer satisfaction. Note that the survey wasn’t about who uses which search engine; Hitwise covers that quite well, and those numbers are fairly static, with Google holding the lion’s share of the market. The point of the survey was satisfaction with one’s preferred search engine, and acquiring a rounded opinion would mean that after a point, the survey would have filled its quota of Google users and been looking for Bing, Yahoo, and Ask users.
A new report ACSI has put out, however, details the satisfaction of those who use social media sites like Facebook and Google+. And just like the search engine report, it’s not about market share but satisfaction, so the same principle applies: to form a rounded opinion you need as equal a number of respondents as possible for each social media site. It’s in this report that the numbers start to surprise. Top marks actually went to Google+, with 78 points out of 100, followed by YouTube (73 points) and Pinterest (69 points). Twitter, LinkedIn, and Facebook all took the bottom spots, with Facebook holding the basement at 61 points.

With such a vastly diverse user base, it’s understandable that some users hold strong opinions about how Facebook handles itself, but a few key reasons emerged that hurt the social media giant. The biggest issues came from the implementation of the Timeline feature; users felt there were too frequent, unnecessary changes to the user interface. Intrusive advertising was an issue for nearly 20% of respondents, and one of the largest contributors to dissatisfaction was the privacy concerns that still dog the social giant: nearly half of those surveyed rated Facebook a 5 or lower on a 10-point scale for how it handles privacy.

Not surprisingly, the reasons Google+ excelled in the survey happened to be the reasons Facebook tanked in comparison. On that same 10-point scale, 60% of Google+ respondents ranked the fledgling social site’s privacy protection as excellent. No advertising, at least not in the sense that dominates Facebook, exists on the service, with no present plans to add any, and a very strong mobile presence also helped Google+ attain the top marks in satisfaction this year. There is, however, a small caveat to bear in mind with the social media results.
On the whole, taking all of the social sites together, users rate their satisfaction with social media at only 69 points out of 100, putting it near the basement of the study alongside television, newspapers, and airlines.
When discussing links and linking strategies, I’ve mentioned the negative connotations of having poor or unrelated backlinks pointing to your website. The watered-down version: say you own a window repair business, and in passing the search engines notice a few hundred links from a taxidermy site pointing to your URL. That’s a very quick way to get yourself in trouble, have your site scrutinized, and quite possibly be dropped from the index until you’ve gone over all of the backlinks pointing to your site.
Dealing with the proliferation of improper linking schemes employed by sketchy SEO practitioners has long been an issue for the search engines. Google has its list and documentation about what it will do to your site should improper links or linking strategies point at it, but has it been beaten to the punch in truly dealing with the problem? Very recently, one of the other players in search, Bing, released a way for users to disavow the links pointing to a website. What leads to a bit of confusion, however, is the way Bing talks about how improper backlinks affect your site, sometimes saying they won’t do any harm and other times saying they could very much hurt your position within the rankings.
From the Bing Webmaster Blog:
Today we’re announcing the Disavow Links feature in Bing Webmaster Tools. Use the Disavow Links tool to submit page, directory, or domain URLs that may contain links to your site that seem “unnatural” or appear to be from spam or low quality sites. There is no limit on the number of links you can disavow via this tool.
It’s a great way to have more control over who is pointing their content at your site, as well as over your own web positioning. It’s a solid first step in controlling your backlinks; it will be interesting to see how Bing deals with the reports that are submitted, as it has been notoriously slow to handle changes and updates to its index.
With the way things have been progressing online, it’s not much of a stretch to think that some of the old ways have gone by the wayside for search engines. Google, Bing, Yahoo, and all of the others out there need a metric of sorts whereby they can determine what is relevant to certain topics and categories. The longest-running and highest-contributing factor, since the beginning, has been linking between websites, both those relevant to your business and those that may help your positioning. It’s a very simple formula, really: site A compiles a great deal of information about cogs and becomes known across the country as the top producer and information source for them. Site B, a reseller of site A, provides a link directly to site A, helping cement its position online as the top purveyor in the cog industry. That’s a very basic example; multiply it a trillion times and you begin to see the beginnings of the internet, and how linking works to sort out the web.
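The site A/site B formula above is essentially what PageRank-style link analysis computes. Here’s a minimal sketch of that idea, not any engine’s actual algorithm: a tiny hypothetical link graph and a power iteration that lets rank flow along links, so the most linked-to site accumulates the most authority.

```python
# Toy link graph: links[page] lists the pages that page links out to.
links = {
    "siteA": [],            # the cog authority; only receives links
    "siteB": ["siteA"],     # reseller linking back to siteA
    "siteC": ["siteA", "siteB"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Iteratively redistribute rank along outgoing links until stable."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}   # start with equal rank everywhere
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}  # baseline for every page
        for page, outlinks in links.items():
            if outlinks:
                # A page splits its rank evenly among the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling pages spread their rank evenly across the graph.
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank
```

Run on this toy graph, siteA ends up with the highest rank because both other sites link to it, which is exactly the “cement their positioning” effect described above, scaled down from a trillion links to three.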
Over the last couple of years especially, though, the social web has made a big splash. Facebook, Twitter, LinkedIn, Pinterest, Google+: all social sites that are being crawled by the spiders and more frequently plugged into your search results. Some companies like to draw the correlation that the larger the web has gotten, the more community-focused people have become. Instead of searching for the best cog company, people are asking their friends on Facebook and Twitter, for example. A shock ran through the SEO industry at the thought that the value of links would be tossed away, all because people suddenly had the ability (not that they didn’t already) to ask their friends for an opinion. It’s a (non)issue that continues to be blown out of proportion by the unseasoned search experts out there.
But a very simple truth is this: the weight given to linking and backlinking isn’t going to go away. Not just yet, and not in the foreseeable future at any rate. That’s not to say that social signals and social linking won’t become the heavyweights at some point, but that point is not today, not next month, and not next year.
With all of the updates that have been applied in the last while (Penguins, Pandas, and who knows what else is coming), it’s becoming fairly common to read the occasional article on how poorly Google is faring as a search company. The headlines are even beginning to creep into mainstream media more and more often, especially with Google+ trying to creep into Facebook territory.
But when you look at the numbers year over year, nobody is really going anywhere. Where online search is concerned, just over two-thirds of users choose Google as their search engine when looking for information online. The Bing/Yahoo machine (Bing provides all of the results for Yahoo) stayed at nearly 30% search share for the month of May, overall a loss of search share for the duo. Bing remained constant from April and gained from a year ago, but since it fills the role of search engine for Yahoo, it’s only logical to lump the pair together. The remainder of the search market is taken up by everyone else: Ask, AOL, and all of the other smaller engines out there like DuckDuckGo. These numbers apply to the desktop search market.
The mobile search market is much different from the desktop variant. While the mobile market may have a much more varied platform base, it is absolutely dominated by Google, which holds a monster share of 95% in the US. It seems that regardless of how loudly some SEOs decree the death of Google as a search engine, the general user disagrees. At this point in the life of the web, the original search engine is still the best search engine, going by the numbers. Your personal use and interpretation will vary somewhat from the general public’s.
Google, the Government, and you
Going over the search share numbers, it’s very plain to see that Google is sitting on the largest share of the pie by a very clear margin. Being a company of such huge size, with such a massive market share, makes you an impressively large target to take aim at. A couple of years back, in order to make information more available for view, Google began a new feature it dubbed the Transparency Report. The introduction of the information was to give the general public an idea of the types of removal requests the company faces on an ongoing basis. Google has now released its fifth data set, which gives a fairly clear timeline of events and online postings, and in its blog post from yesterday, Google notes a disturbing trend.
“We noticed that government agencies from different countries would sometimes ask us to remove political content that our users had posted on our services. We hoped this was an aberration. But now we know it’s not.”
It should be no surprise that governments are keenly interested in online activity and online content; it was only a short time ago that portions of the internet went dark in opposition to the proposed SOPA bill. But even though governments have been requesting that blog posts, videos, and sometimes even entire websites be removed from the index, in the end they are just that: requests. And requests can be denied, which is what Google has done with most of the requests it has received. You can delve deeper into the report by following this link, and it’s safe to assume that other search engines often receive the same requests to remove content from their indexes as well.