Browsing "internet news"
So it’s no secret that Bing and Google aren’t the best of friends, but with Microsoft behind the Bing machine, it was a shock for the web to suddenly find Google labelled as malware.
You may think it’s really not that big of a deal, but it only takes one red flag to turn many novice users away from using any service or website. The mistake has since been ironed out on Microsoft’s end, and Google is no longer labelled as a security risk. Malware is a rather generic term, basically covering any kind of code or software which either steals your private information or messes up your computer enough that you can’t really use it effectively. Unfortunately for those same novice searchers and computer users, malware has another, more inconvenient side.
It should be no surprise that the scripters and hackers who develop malware are also tied to the black hat side of the SEO world. Search is a multi-billion dollar a year industry, and sitting atop the search results for highly competitive terms, even for just a few days, can be worth millions. This is often where you'll find a specific type of malware usually known as ransom software, or scareware. When a user clicks on the address of what they innocently think is their top result, they're instead greeted with a popup message, usually along the lines of "Your computer is infected – click here to protect your data!" Once that user clicks the button, they've been hooked, and once that back door has been opened, it is notoriously difficult to shut. It leaves you open to backdoor access, which the scripter can use to steal your information, or even to turn your own computer against other unsuspecting searchers.
The first step to defending yourself is to have a proper anti-virus product; even a basic one will stop the majority of malware. The second step is to know what you're seeing when you search. A proper website URL will look like www.this-is-a-real-site.com/yourresults.html, shown in green below your search results. A strong indication of a hijacked site or possible malware trap is an address that looks like this: www.possibly-malware.com/?p=23466. If you find an address which leads straight into a query string, there's a good chance you won't end up where you'd hoped.
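The distinction above can be expressed as a simple rule of thumb. This is only a toy illustration of the article's heuristic, not a real malware detector (plenty of legitimate sites use query strings too); the function name is my own, and a scheme has been added to the example addresses so they parse as full URLs:

```python
from urllib.parse import urlparse

def leads_straight_into_query_string(url):
    """The article's red flag: an address with no real page path
    that drops you straight into a query string."""
    parts = urlparse(url)
    return parts.query != "" and parts.path in ("", "/")

# The two example addresses from the text:
print(leads_straight_into_query_string("http://www.this-is-a-real-site.com/yourresults.html"))  # False
print(leads_straight_into_query_string("http://www.possibly-malware.com/?p=23466"))             # True
```

A real check would weigh many more signals, but this captures the "look before you click" habit the article recommends.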
Money is a great thing; it's needed for pretty much everything you need or want in this world. There are times to save money, and there are times to spend it. With the new year still fresh, now is the time to spend on your online presence so you can make 2012 your best earning year to date.
Search engine optimization is, for some odd reason, still a largely overlooked advertising expenditure. The internet is the ultimate storefront: it never rests, and it is always waiting to bring customers to your doors. Even beginning to craft an SEO campaign takes time, patience, an understanding of your current website and traffic, and a clear sense of your ultimate goals.
There's no 'one size fits all' version of optimization, as each and every client and website has its own unique set of problems. When you're in the market for SEO, you need to bear that point in mind. If you're searching for someone truly qualified in the area, there's a very high chance they won't have pre-packaged services for you to choose from. There are really two main steps when you're hammering out the costs associated with search engine optimization. The first, which directly affects your cost, is deciding what you're trying to achieve and which key terms interest you. If you're looking to rule the SERPs on a term which returns tens of millions of pages, your contract will carry a steeper cost than a more niche market would. The second step is where the compromising comes into play where terms are concerned.
Working as SEOs, we see the web a bit differently than other people do. I know I haven't browsed or used the internet the same way since I began. Sometimes the key terms clients choose need some adjustment, and through discussion we decide which route to pursue. It can mean the difference between a page 1 ranking and a few thousand dollars in the terms of a contract. Our goal, in the end, is to bring you all the traffic you can convert. Are you ready for the 2012 rush?
Once you've built your website, there are many key elements you need to stay on top of besides search engine optimization: updating content, perhaps running a blog or a Twitter account with which to interact with customers and clients, and, if your niche demands it, publishing a newsletter or email campaign to keep in touch with subscribers.
In the background of your website, there's also another element which needs consistent attention. Every time you add a page to your website, create a new form or add a photo album, your sitemap needs to be attended to. A sitemap is exactly what the name implies: a table of contents for every page on your site. If you add, change, or remove pages, you also need to update your sitemap to reflect the changes. Typically your sitemap will be in XML or HTML format, but the important point is that it needs to be updated every time you make a change.
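For reference, a minimal XML sitemap covering two pages might look like the following. The domain and dates are placeholders, and only the required `loc` element plus the optional `lastmod` element are shown:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/new-photo-album.html</loc>
    <lastmod>2012-01-20</lastmod>
  </url>
</urlset>
```

When you add or remove a page, you add or remove the matching url entry and bump its lastmod date, which is exactly the upkeep described above.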
Until now, there's never really been a way to validate your sitemap without waiting until the search engines passed by your site to index it. At that point you could sign into your webmaster tools or site analytics and verify that you'd either done things right or needed to make some adjustments. But now, for those who may be a little less technically inclined, Google has added the ability to test your sitemap before the spiders get to it, to make sure that everything is done correctly. This update, as well as a handful of other new and upcoming changes to their site tools, is detailed in their blog post.
Over the last couple of weeks people have been hacking and slashing at Google because they've rolled out a change to how your results pages show up when you conduct a search. They've dubbed the change "Search plus Your World", and the idea is that you receive Google+ data in your results when you're signed into your Google account and conduct a search. Personally, I really don't see the issue with their idea, and here's why.
Number one reason: if you're signed into your Google account, searching Google.com, why would it surprise you to find publicly available information from Google+ in your results pages if it's relevant? And judging from all of the screenshots of the integrated social results, a click of a button and they're gone. Another argument I've seen about Google integrating the information into the SERPs is that they are prioritizing their own content instead of linking out to third-party sites, which is arguably the whole point of a search engine. A valid point to bring up, but again, you can simply shut the option off with a few clicks at most. In an online world where 800 million or so people are used to the "opt-out" model thanks to Facebook, it's almost surprising that it's taken this long for another major web player to try it. Twitter and Facebook even backed a small browser bookmarklet of sorts to help cull the Google+ results from your results pages. It's outraged enough people that bloggers are already forecasting that Bing is the new King of Search.
It's perhaps those last two points which contributed to my puzzlement. For all of the people up in arms with Google and switching over to Bing, I can only assume one of two things: you were born on January 1, 2012, or you don't have a Facebook account; amazing, really, considering how many people do. Here's a brief excerpt from an article stabbing at the changes Google has recently made:
The new feature is baked right into Google and aims to personalize your search results by including Google+ data when you are signed into your Google account.
And here is an excerpt from an article written in May 2011:
The worlds of SEO and social media were rocked the other day when Bing announced they will incorporate Facebook data into their search results for the most personal social-search integration to hit the web. What does this mean for the user? If you search for something on Bing and are logged into your Facebook account, you will see which pages, products and websites your friends Like and recommend high in the results, regardless of where that page ranks in the general SERP.
Perhaps Facebook should recite the idiom that people in glass houses shouldn't throw stones, as Bing and Facebook have been at social search integration for coming up on a year now.
There's been lots of noise in the social arena over the last week or so, with Google lauding its social network, Google+, by announcing that over 90 million people have joined so far. It's still just a drop in the social bucket compared to Facebook's 800+ million, but some have started to question just how genuine Google's numbers are.
Reports have been growing online that the process of signing up for a Gmail account has changed from being a simple matter of choosing a login name and a password. It seems that now you're required to fill out a social profile for your newly created Google+ account. To say it's irked people who were previously considering getting a Gmail account would be putting it mildly. A workaround was quickly found for those interested in only a Gmail account, but that loophole obviously will not last. And although I haven't noticed any changes in my own search results, there are reports of people saying that Google+ is taking over their SERPs. It's gotten bad enough that just today, Facebook and Twitter developers released a tool to remove the social results from your pages, named, of all things, "don't be evil". The tool is a bookmarklet that you place in your shortcuts bar which can 'fix' your results if you've found they're full of Google+ results.
Throwing another curve into the already twisted social signals Google has been sending, in the earnings report just released, Larry Page was ecstatic about the reported 'gains' they were making in the social arena. It's not really a surprise that Page would be so excited about Google+; when he took the helm last year, he was throwing out big bonuses for improvements on the Google social front. It's a rocky start for the online giants, but it's a start which will make 2012 an impressive year for social media.
There's been a shift in the algorithm lately, as most in search are aware, and it may have a little to do with a Google blog post recently put up. In it, Google commented on websites with heavy advertising at the top of the page, which forces visitors to hunt for the actual content, and on how things are going to change.
Their post notes that there are complaints about basically having to search twice on the web: a user clicks on what they deem most relevant to their search, is greeted with a website heavy with advertisements at the top of the page, and then has to search that website again for the content. Top-heavy websites were described as so heavily populated with ads that a user has to scroll to even begin to find content. Google also mentioned that, going forward, websites which are advertisement heavy 'may not rank as highly as before'.
Now don't fret if you have a block of ads on your website; you're not going to be kicked into the basement of search. This change is going to affect less than 1% of searches conducted globally, so unless you're in the business of having tons of spam on your webpages, you should be just fine. As usual with any kind of algorithm change, no doubt the 'end of SEO' will be heralded online within a few days to start the year. But those who have been playing the game from the beginning, who helped shape how search works and hammer out the rules, they should be the ones you watch. Until one of the real search experts says that SEO is dead, I'll just keep on plugging along.
In just a few more hours, the internet will officially be on strike in protest of the SOPA bill currently being pushed in the States. A great number of sites are 'going dark' in support of the event, with the idea that it will give web users an indication of what the web may be like should this bill, or any like it, come to pass.
Some popular web destinations like Reddit (which bills itself as the front page of the internet), Facebook, Wikipedia, Tucows and even Google are gearing up to participate in tomorrow's blackout. There is some very strong language circulating where SOPA is concerned, going so far as to name the bill an attack on free speech on the web. Those who support the bill deem the blackout a knee-jerk reaction meant to cast the bill in a bad light; the documents are out there for you to make your own decision.
Google has also come along with a handy guide to assist with going dark in support of tomorrow's events. The advice comes in the form of returning a 503 header for your website, which is your way of telling web spiders that your site is currently unavailable and they should come back later. The same implementation will work should you decide to be part of tomorrow's protest, and it will come in handy should you ever need to take your site down for emergency work.
It's fairly easy to return the 503 instruction for bots, especially if your site runs on Apache with mod_rewrite available. Adding the following lines to your .htaccess file can take care of it for you:
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/path/to/file/myerror503page\.php$
RewriteRule .* /path/to/file/myerror503page.php [L]
(the RewriteCond excludes the error page itself, so the redirect doesn't loop)
Just adjust the paths in accordance with your web server setup. What this will do is send visitors to your site to the error page, as well as take care of any spiders poking around, so as not to thwart any search engine optimization efforts you've been working on for your site.
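One crucial detail: the error page itself must actually send the 503 status code, ideally with a Retry-After header telling crawlers when to return; otherwise spiders see an ordinary 200 page and may index your maintenance notice. As a sketch of what a spider should receive during the blackout, here is a minimal demonstration using Python's standard library (the wording and the one-day retry hint are illustrative choices, not part of Google's guide):

```python
import http.server
import threading
import urllib.error
import urllib.request

class MaintenanceHandler(http.server.BaseHTTPRequestHandler):
    """Answer every request with 503 Service Unavailable, the way a
    site 'going dark' should look to a passing search engine spider."""
    def do_GET(self):
        self.send_response(503)                   # temporarily unavailable
        self.send_header("Retry-After", "86400")  # hint: come back in a day
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance - back soon!</h1>")

    def log_message(self, *args):
        pass  # keep the demo quiet

# Serve on a random free local port and fetch the page once.
server = http.server.HTTPServer(("127.0.0.1", 0), MaintenanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/" % server.server_address[1]
try:
    urllib.request.urlopen(url)
    status, retry_after = None, None
except urllib.error.HTTPError as err:
    status, retry_after = err.code, err.headers["Retry-After"]

server.shutdown()
print(status, retry_after)  # → 503 86400
```

In the Apache setup above, the same effect comes from having myerror503page.php emit the 503 status and Retry-After header before its HTML.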
So it seems Google made a little bit of a blunder with their Chrome advertising, and what was the end result? Perhaps the best way to understand what happened, and its ensuing result, is to first understand the algorithm a little better.
The Google search algorithm was intentionally designed to go out and read as much of the content of the web as it could find. It pays no heed to the race, color or quality of the content. It doesn't care how pretty your pictures are, how impressive your flash intro is or how quickly you can flip through the items on your navigation bar. It takes in the content of the web and spits it out when you ask it a question. It's because it's so simple that filters needed to be put into place, and penalties levied against people who managed, either by accident or on purpose, to get around the quality controls.
Paying for PageRank, that intangible mega star of the Google world, is a heavily punishable offence under the quality guidelines. So it came as a rather big surprise when it was suddenly found that Google was seemingly paying for advertising which passed PageRank to its Google Chrome web page. The skeptics of the web automatically assumed that the Google machine would just shrug, apologize to the web for an unintended slip, and everyone would be on their way. The outcome, however, was actually the opposite.
Matt Cutts, via his Google+ account, had the following to say of the incident:
“Google was trying to buy video ads about Chrome, and these sponsored posts were an inadvertent result of that. If you investigated the two dozen or so sponsored posts (as the webspam team immediately did), the posts typically showed a Google Chrome video but didn’t actually link to Google Chrome… we did find one sponsored post that linked to www.google.com/chrome in a way that flowed PageRank.. we only found a single sponsored post that actually linked to Google’s Chrome page and passed PageRank, that’s still a violation of our quality guidelines”
So okay, there was a minor slip between what was intended and what actually resulted. What did they do about it?
“In response, the webspam team has taken manual action to demote www.google.com/chrome for at least 60 days. After that, someone on the Chrome side can submit a reconsideration request documenting their clean-up just like any other company would. During the 60 days, the PageRank of www.google.com/chrome will also be lowered to reflect the fact that we also won’t trust outgoing links from that page.”
If anyone ever questioned whether the machine would point its gun at itself, question no longer. As the webspam team has shown, no one is above the rules set with quality searches in mind. So bear in mind when you next work on your website's SEO: ensure that you're following the best practices and the search guidelines readily found all over the web, or you'll find yourself flung deeper down the rankings than you could imagine.
Did Bing play dirty over the shopping holidays? If you tried to use the Bing search engine this most recent Cyber Monday, the signs currently point to yes, they did.
The creators of the idea of Cyber Monday found themselves lost in Bing's search listings because, according to Bing, their content was too "thin". If the term is familiar, it's because it sounds a lot like Google-speak from when they started rolling out the infamous Panda updates and culling "thin content" websites from their index. One difference to note, however: Panda didn't actually remove the offenders from the index; it just meant the odds of those sites ranking well plummeted.
Back now to Bing's version of taking care of thin content, which is removing websites which fall into this category. Cyber Monday is now a billion dollar online shopping event, where website owners have the opportunity to make some good money heading into the holiday shopping weeks. A site which could promise and deliver strong referrals, and rank well, would stand to make a fair bit of change. Shop.org coined the term Cyber Monday in '05 and a year later created the corresponding website, cybermonday.com. This past Cyber Monday, Google had the website in their SERPs, while Bing did not. Bing did, however, have their own shopping channel listed at the top of their results for the search "cyber monday".
Bing has stated previously that they will dispense internet justice on sites deemed unworthy to be listed as part of their SERPs, but completely removing any and all traces of a site? Bing defines spam as:
Some pages captured in our index turn out to be pages of little or no value to users and may also have characteristics that artificially manipulate the way search and advertising systems work in order to distort their relevance relative to pages that offer more relevant information. Some of these pages include only advertisements and/or links to other websites that contain mostly ads, and no or only superficial content relevant to the subject of the search. To improve the search experience for consumers and deliver more relevant content, we might remove such pages from the index altogether, or adjust our algorithms to prioritize more useful and relevant pages in result sets.
If Bing were to stick to their guidelines, then by removing the cybermonday.com website they should remove all "thin" websites which fall under the same blanket. Yet they did not: websites which feature almost identical content to cybermonday.com still appeared in their results. To further muddy the waters, the Bing-powered search results served up in Yahoo would turn up Black Friday "websites" which would be deemed even thinner than the Cyber Monday site. With all the fuss Bing has made about Google favoring its own results over all others, this sure doesn't look good on the Bing radar. The Panda updates may drop a website's rank if it's found to be too thin, but at least they're not completely removing it from the index à la Bing.