Browsing "internet news"
In a somewhat strange twist of irony, the most-followed member of Google's social site Google+ is Mark Zuckerberg. "Isn't Mark Zuckerberg banned from using Google+?" you might ask, but his presence there is probably the best indication that the two giants don't really compete with each other. On the other side of the argument, Google is making some decidedly strong headway into the social arena with the Google+ beta, so who better to push its boundaries than the head of the largest social media network on the web?
Some of the reports coming out of the beta testing waters are interesting. Little tweaks to the social experience, like group video chat, better friend controls and more powerful privacy tools, go a long way toward providing an experience unique enough to stand apart from Facebook. Google+, being one of the search giant's products, is going to be widely accessible right from the get-go, as its development on multiple platforms is occurring in tandem. It will be available in browsers, on mobile, through search and, as rumor has it, as an enterprise product as well.
Staying within the boundaries of the social web, Farmville creator Zynga filed their S-1 form last week. For those of us (myself included) who have no idea what that means, the social gaming innovator is working on going public. Their filing spells out just how dependent they are (at present) on their relationship with Facebook to remain as profitable as they are for as long as possible. And with the switch to Facebook Credits as the currency for their online social offerings, Facebook stands to earn a good lump sum: Zynga reported that their 'hardcore' players alone spent $600 million last year. At Facebook's 30% share of the pie, that's a little more than pocket change.
To continue in the same grain, of sorts, from yesterday's blog, Matt Cutts made an appearance on Hacker News to shed some light on the notion of users being trapped in a search bubble.
Some of the main points that he made were:
- If someone prefers to search Google without personalization, add "&pws=0" (the "pws" stands for "personalized web search") to the end of the Google search URL to turn it off, or use the incognito mode of Chrome.
- Personalization has much less impact than localization, which takes things like your IP address into account when determining the best search results.
- "We do have algorithms in place designed specifically to promote variety in the results page. For example, you can imagine limiting the number of results returned from one single site to allow other results to show up instead."
As I mentioned yesterday, a search bubble doesn't strictly exist; nothing removes sections of the internet from your searches and viewing history. It does, however, try to give you a results list that fits, at the very least, your location. Using the incognito mode of Chrome or the private mode of Internet Explorer will let you access as fully unfiltered a web as you like. You can even go so far as to remove the location factor in your Google searches by pointing your browser to www.google.com/ncr. The /ncr (no country redirect) address essentially sticks you into cyberspace with no location setting.
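For the technically inclined, the tweak above amounts to nothing more than adding one query parameter to the search URL. A minimal sketch in Python (the function name and structure are mine for illustration; only the pws=0 parameter comes from Cutts' comments):

```python
from urllib.parse import urlencode

def depersonalized_search_url(query):
    # pws=0 asks Google to skip personalized web search results,
    # per the opt-out described above
    params = urlencode({"q": query, "pws": 0})
    return "https://www.google.com/search?" + params

print(depersonalized_search_url("egypt"))
# https://www.google.com/search?q=egypt&pws=0
```

The same trick works by hand, of course: just tack "&pws=0" onto the address bar after any Google search.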
When you log on to your computer, fire up your browser and start your internet trek for knowledge, entertainment or whatever it is that has your mind occupied, are you going to be able to find your answer? It's a question which has been gaining more and more traction in the last year or so, and DuckDuckGo, a new startup search engine, has been shaking the search cage in an effort to forge its own path.
Recently they put up a page detailing how, when you perform a search on Google, Bing or Yahoo, you're not getting a true results page. The screenshot of the search results clearly shows that different people will receive different results for 'Egypt' as a search term. Without even reading the link text, it's clear that the results pages are vastly different. Why they're different comes down to dozens, if not hundreds, of reasons: it can be as simple as your location in the country, the time of day or what's been trending in the news lately. The short pictorial provided on the DuckDuckGo page details how search engines, Facebook, Twitter and the like all deliver pre-packaged results based on your web usage, and they contend that this shouldn't be happening.
DuckDuckGo is a search engine which doesn't save your search history, doesn't pass your search terms on to the websites you click through to, and has a nifty red box they call zero-click info (handled by Wolfram Alpha) which appears on some searches; after all that, it's throwing its hat into the search engine ring. Being a new player at an old game is a tough market to break into, and DuckDuckGo is attempting to deliver both a filtered *and* an unfiltered internet. It's a noble idea with some merit: if you'd like to perform somewhat private searches on sensitive matters, it may be an alternative for you. Google Chrome and Internet Explorer, however, both offer a cookieless browsing mode which accomplishes much the same result, so you don't really have to give up the engine you know and are familiar with.
The only real way to test whether you genuinely live in a "search bubble" is to perform the same search, with zero clicks, on multiple computers. If your results are significantly different from other people's, then perhaps you have a case. Personally, after viewing the screenshots and looking closely at how many pages were fetched for each search term, there are tens of millions of pages of difference, so of course the results are going to differ. Part of Google, Bing and Yahoo's success comes from the fact that they pass some search data to the referred website in the form of the search term; it's what enabled the search engines to build their ad programs for web users. There are dozens of variables behind the results you receive after you click that search button, and even something as simple as which data center serves your results influences your page: if it happens to be running an index a few hours older than the others, you can easily get different results when performing the same search multiple times.
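If you did want to run that multi-computer experiment, the comparison step is just simple set math over the URLs each person saw. A rough sketch (the function name and the sample domains are hypothetical, purely for illustration):

```python
def result_overlap(results_a, results_b):
    # Jaccard similarity of two result lists:
    # 1.0 means identical result sets, 0.0 means no URL in common
    a, b = set(results_a), set(results_b)
    if not (a | b):
        return 0.0
    return len(a & b) / len(a | b)

# two hypothetical users searching 'Egypt' from different machines
user1 = ["egypt.gov.eg", "en.wikipedia.org/wiki/Egypt", "cnn.com/egypt"]
user2 = ["en.wikipedia.org/wiki/Egypt", "bbc.co.uk/egypt", "cnn.com/egypt"]
print(result_overlap(user1, user2))  # 0.5
```

A score well below 1.0 across many friends' machines would be the "significantly different" signal described above, though as noted, index age and data center alone can produce some drift.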
The founder of the newer engine DuckDuckGo recently discovered that he's being hit with tons of spam queries for all sorts of seemingly random searches. He's noted that while he can block these botnets from hammering his servers with the same query over and over again, the traffic has raised a question for him.
In his own words from his blog, he wonders:
“if other search engines include this errant traffic in their query counts. We work hard to keep them completely out because they would overwhelm our real direct queries #s and therefore distort our perception of progress.”
And while Gabriel makes a solid point and raises a great question about the quality of the searches and query numbers being generated, I think he's missing the simplest answer. He has managed to block the botnets at the firewall level to keep them from skewing his query numbers. Given that the other search engines, Google, Bing and Yahoo, have been around far longer, it's reasonable to assume they've already dealt with the issue of false searches. As far as SEO is concerned, this kind of activity is a form of query spam: bots making it appear that the websites in question have received hundreds of thousands of queries and clicks from real users. It's a game both Google and Bing dealt with years ago, so it's almost completely a non-issue.
I think the more realistic explanation for the botnet traffic on the new search engine is a very simple problem that anyone with a website that has an input box and no validation can relate to. It's just spam, probing for an exploit or a kink in the code of whatever website software it has picked up on. It's argued that small search engines like Blekko and DuckDuckGo offer better-quality search because they are smaller and less bloated than their big brothers. In time, however, I suspect we'll see that the larger these small engines become, the more difficult it will be to deliver incredibly fast results (less than half a second) while maintaining a complex index of hundreds of billions of pages. Google has already reported seeing more than a trillion unique URLs, a number which would cripple most data centers in existence.
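To be clear, this is not DuckDuckGo's actual code, but the kind of filtering being described, dropping a client that repeats the same query too many times in a short window so it never pollutes the query counts, can be sketched in a few lines:

```python
import time
from collections import defaultdict, deque

WINDOW = 60        # seconds to look back
MAX_REPEATS = 5    # identical queries allowed per client per window

# (client_ip, normalized query) -> timestamps of recent requests
history = defaultdict(deque)

def allow_query(client_ip, query, now=None):
    now = time.time() if now is None else now
    key = (client_ip, query.strip().lower())
    times = history[key]
    # forget requests that fell outside the window
    while times and now - times[0] > WINDOW:
        times.popleft()
    if len(times) >= MAX_REPEATS:
        # looks like bot traffic: serve or drop it, but
        # keep it out of the "real queries" statistics
        return False
    times.append(now)
    return True
```

In practice a firewall-level block, as Gabriel describes, is cheaper than doing this in application code, but the bookkeeping idea is the same: distinguish repeated identical hits from genuine direct queries.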
Facebook has up and done it again, stepping over another line and prompting EPIC to petition the FTC to get involved. The feature that stoked their fire? The facial recognition software which Facebook enabled to auto-tag members' pictures.
The Electronic Privacy Information Center (EPIC) stepped forward and asked the FTC to look into the new software because, in their eyes, the service is unfair and deceptive in its use and methods. They've called for the FTC to force Facebook to suspend the entire program until stronger privacy standards are put in place and the feature is made opt-in only. Facial recognition technology has been a touchy point since Facebook announced they were beginning to test it in December, and they've decided to treat it like the rest of their programs and make it opt-out; hence the attention of EPIC. The statement EPIC released on the issue:
“users could not reasonably have known that Facebook would use their photos to build a biometric database in order to implement a facial recognition technology under the control of Facebook”.
To change the tone a little, location-based mobile services, including search and advertising, are projected to reach more than $10 billion in the next few years. Location-based advertising has attracted a lot of negative attention lately, namely because of the privacy angle of how much data your phone stores about you. Both Apple and Google admitted that, yes, their phones temporarily store your location data, but that the information was obtained by triangulating your position using cell phone towers.
Location-based advertising and marketing, say flash sales for consumers in the area, is only just starting to become a lucrative angle for businesses. The full potency of how far the metric can take you and your business is only beginning to be realized. Google is far and away the leader of the pack in mobile search, and with a click-through rate of more than 75%, mobile search advertising is becoming the newest marketing tool for businesses everywhere.
The folks over at Search Engine Land have developed a great chart detailing SEO which, used properly, may even help pinpoint issues in your current website and campaign.
It has several areas which lay out a basic SEO checklist, beginning with the coding of your pages and going all the way up to off-page SEO tactics. It also has a section labeled Violations, which ranges from the lighter errors of poor content and keyword stuffing up to methods which can draw severe penalties, like paid links and cloaked pages.
It's a well-put-together chart and a great starting point for the technically minded business owner who'd like to take a peek under the hood of their website, so to speak. Bear in mind, though, that the graphic is not the be-all and end-all list of SEO tips and guidelines; it's merely Search Engine Land's version of what they feel is the absolutely necessary list of pros and cons.
You can have a better look at their image here.
With the addition of the Google +1 button to the social world, an old question has advertising agencies taking notice again: is the social element of search beginning to shape where you show up in the SERPs?
It has long been known that when you "Like" a topic via the Facebook button, a simple click can generate a fair amount of traffic. Only recently, though, have those "Like"s become entrenched in Bing's results, beginning to shape your personal results pages. When you're signed into your Facebook account and you search Bing for, say, model racecars, you'll see mixed within your search results whether any of your friends are involved in the same model racing scene as you are. That can create a good deal of traffic if your site caters to a social audience. With the addition of the Google +1 button, it's assumed we'll begin to see the same types of results integrated into Google as we now see in Bing with the Facebook "Like" button.
Part of the idea is that you can discover which people on your friends list you may have more in common with than you already knew. It's really a personal preference at this point in the game, as you need to be signed in to both services to view your friends' likes in your search results. It's going to be an interesting shift in the search game, depending on how heavily your friends' connections are weighted against the organic listings as they stand today.
Eric Schmidt, Google's long-running former CEO, had a lot of thoughts to share about the online world during a conference yesterday.
Facebook, for example, has connections to all of the friends you have, have ever had and even the friends you forgot about; they're almost all there, ready for you to find and become reacquainted with. Microsoft has its finger in the business pie, so to speak, as that is its strongest market. Amazon, Schmidt shared, is seen as the largest "store front" on the internet, and Apple makes pretty things.
For all the merits he bestowed on his comrades in the online world, he was also quick to add that, as strong as they are, one of the companies discussed is out of the expanding loop of the internet. The giant that just seems to be missing the bus is Microsoft; they just don't seem to be using the same "platforming strategy", as Schmidt called it, as the rest of the bunch.
The discussion, however, was not limited to Microsoft's perceived weakness in the current digital age. In a rather candid moment, Schmidt declared, "I screwed up." Of the laundry list of complaints people the world over have had about the search giant and its practices and procedures, the mistake Schmidt was speaking of was missing the boat to the social party. That's not to say Google is a one-trick pony, of course, just that he missed the social boom, so to speak.
Schmidt has since passed the CEO reins to Larry Page, who has shifted the company's focus toward social with a very focused vision of becoming a serious player.
If there is anything in the world of search that can change your site's ranking performance overnight, it's being tagged as a malicious site or as having malware links on your pages. When searchers are warned in the SERPs that visiting your site may harm their computer, it's not hard to imagine they'll choose to stay away.
Being flagged as a malicious site or as having malware links on your page can happen a number of different ways. At the serious end of things, your site and/or server may have been hijacked and your pages rewritten. You could have been picked up by an iframe attack, a clever hacker could have injected code through a page comment or link, or you could simply have been reported as malicious by a jealous competitor. There are a great many ways to end up flagged as a malware/malicious site.
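If you'd rather not wait for a search engine to flag you, you can run a crude self-check for one common symptom mentioned above: injected iframes pointing at domains you never chose to embed. A rough sketch (the scanner and its trusted-domain list are my own illustration, not any search engine's actual detection logic):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class IframeScanner(HTMLParser):
    """Collects iframe src URLs whose domain isn't on a trusted list."""

    def __init__(self, trusted_domains):
        super().__init__()
        self.trusted = set(trusted_domains)
        self.suspicious = []

    def handle_starttag(self, tag, attrs):
        if tag != "iframe":
            return
        src = dict(attrs).get("src", "")
        host = urlparse(src).netloc
        # flag absolute URLs pointing at domains you don't embed on purpose
        if host and host not in self.trusted:
            self.suspicious.append(src)

def scan_page(html, trusted_domains=()):
    scanner = IframeScanner(trusted_domains)
    scanner.feed(html)
    return scanner.suspicious

page = '<p>Welcome!</p><iframe src="http://evil.example/x" width="0"></iframe>'
print(scan_page(page, trusted_domains={"youtube.com"}))
# ['http://evil.example/x']
```

It won't catch obfuscated JavaScript injections or server-side compromises, but running something like this over your own pages is a quick first pass before assuming a flag was applied in error.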
There is a story circulating in the search world today about one such website owner's dilemma. Their website, a Yahoo-based store, was flagged by Bing as a malicious website containing malware links. This is a web-based business whose CMS is sandboxed, after a fashion, by Yahoo's controls, and yet Bing has labelled them as malicious. Being flagged is a terrible thing, but since due diligence was done and no fault was found, the owner submitted a help ticket with Bing for the flag to be removed, as it had been applied in error. Here's the big problem: the response they received has to be nothing less than shattering. Malware re-evaluation with Bing can take anywhere from 3 to 6 weeks to be resolved and for the flag to be removed. When your business is primarily generated through your online presence, losing 3 to 6 weeks of business due to an error on a search engine's part is devastating to your livelihood.
It's always worthwhile to search for yourself online, to ensure you're placing where you're aiming and displaying the information you want to be known for. What you definitely do not want to see is the malicious website warning tied to your site, as it can take Bing a month or more to remove it, possibly as long as 6 weeks. On the upside, Google needs only about 24 hours to remove a misplaced tag.
The most recent effort to introduce a bill aimed at assigning responsibility for policing the internet and its content, of sorts, has been blocked by Senator Ron Wyden, an Oregon Democrat.
The PROTECT IP Act was laid out and written in such a fashion that it would fall to internet service providers and search engines to essentially censor the internet. The proposed aim was to cut the flow of business to websites selling counterfeit name-brand products. And while the goal is a noble one, the powers granted to the government over ISPs and search engines that didn't comply with its directives were too far-reaching. Basically, any business could rat out another to the government, which would then turn around and say "block this website" to the search engines and service providers. If they didn't comply, they'd be subject to the whims of the body put in place to oversee their actions.
The largest issue with the bill as written is that the burden of proof was placed on the accused, not the accuser. In essence, if you wanted to stop a competitor from advertising on the web and placing within the SERPs, all you would need to do is accuse them of infringing on your copyrights. The burden of proof would then fall on the accused, and they would be effectively blacklisted to the corners of the internet.
A strong advocate of the bill had his own take on its necessity:
“American consumers are too often deceived into thinking the products they are purchasing at these websites are legitimate because they are easily accessed through their home’s Internet service provider, found through well known search engines, and are complete with corporate advertising, credit card acceptance, and advertising links that make them appear legitimate” - Senator Patrick Leahy
It's easy enough to counter his comment, however, with the simple adage: if it's too good to be true, it probably is. If you're looking to buy a Rolex and you stumble upon that "hidden" gem online where you can buy one for a tenth of the retail cost, I would bet you're buying a counterfeit. Big business has a problem with the counterfeiters namely because they're almost entirely fly-by-night. They'll engage in ruthless, cut-throat, black-hat SEO tactics to continually rank above the legitimate brands in the SERPs and gain visibility. The most consistent way to "win" the counterfeit war is simply to rank above the gamers of the system: invest in your website, invest in organic SEO and, most importantly, invest in your brand's online visibility.