Late last week Google announced an additional signal for how it will handle search results. Starting last Friday, Google will take valid DMCA requests into consideration when ranking the index. While the new portion of the algorithm hasn’t gone live yet, they did have this to say:
Sites with high numbers of removal notices may appear lower in our results. This ranking change should help users find legitimate, quality sources of content more easily – whether it’s a song previewed on NPR’s music website, a TV show on Hulu or new music streamed from Spotify.
There are a couple of Google-owned properties which are notorious for hosting copyrighted content, specifically YouTube and Blogger. And while they tend to receive the lion’s share of DMCA requests, Google has said it’s the valid takedown requests which will be used as the metric to decide who should stay and who should fall. It’s the next major algorithm shift in store for site owners, and it will be interesting to see where it takes the content of the web.
Google is taking another page from Facebook’s social networking playbook, and will begin allowing vanity URLs for some select profiles. Currently the majority of Google+ URLs end in a long string of numbers denoting your profile, while some are now being tidied up.
While the idea is to roll out the feature and its vanity value to all users, currently only a few on the social network have been given the cleaned-up addresses. It’s a step in the right direction for Google+’s social offering, but they still have a fair amount of ground to cover to catch up to Facebook. One small problem has already been spotted with the change: the new vanity URLs weren’t rolled out with optimization in mind. The new addresses are being used as canonical URLs rather than as full 301 redirects, so the full and proper ranking signal isn’t passed along to the search engines.
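The distinction matters for SEO: a 301 redirect answers the request itself with the new location, moving both users and link equity, while a rel=canonical tag is only a hint embedded in the served page that engines may or may not honor. As a minimal sketch of the two signals (the helper name `consolidation_signal` and the example URL are hypothetical, purely for illustration):

```python
import re

def consolidation_signal(status, headers, body):
    """Return (kind, target_url) describing how a page points at its
    preferred URL: a true 301 redirect, a rel=canonical hint, or nothing."""
    # A 301 moves the request itself: the old URL stops serving content,
    # and search engines transfer full authority to the Location target.
    if status == 301 and "Location" in headers:
        return ("301", headers["Location"])
    # rel=canonical is only advisory: the old URL keeps serving the page,
    # and the hint sits inside the HTML for crawlers to interpret.
    for tag in re.findall(r"<link\b[^>]*>", body, re.I):
        if re.search(r'rel=["\']canonical["\']', tag, re.I):
            href = re.search(r'href=["\']([^"\']+)["\']', tag, re.I)
            if href:
                return ("canonical", href.group(1))
    return ("none", None)
```

A vanity URL served only as a canonical hint is the weaker of the two signals, which is the optimization complaint described above.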
Google has been king of the search world almost from the day it became a tool on the web. There are a handful of other search engines as well, all of which do their best to offer a choice when you’re looking for online information.
There’s been some discontent with Google as a service, and it sometimes leaves users craving an alternative to the giant. There’s a small problem with that idea, however, and it’s the same thing that makes Google so successful. Consider the basics of search: if you have quality content that people readily link to, you’re going to be well represented on the search engines, whether Google, Bing, or the rest. Google has simply worked out how best to deliver the content you’re most likely searching for, because it can evaluate both the content and the links leading back to it.
That doesn’t mean, though, that the web and search are doomed to stagnation. Everyone is working to innovate in the space, trying to find the newest, biggest evolution in search and online interaction. Some out there do their best to be an answer engine, where you essentially query a database for an answer, and others pride themselves on being hand-curated by teams of human editors to help promote the most relevant content. Bing and Google are both trying their hands at integrating your social life into your search results, with mixed success at present. But what all of these search engines stumble over is the same issue: all of the current relevance signals are built primarily upon links and link structures to give value and authority to a website.
The future of search won’t lie in constructing links back to your quality content; it will arrive when someone comes up with a search engine which can predict what it is you may be searching for. When you start looking for a home for sale in a new city, for example, and based upon your current and previous searches it determines that you need a home near a school for your children, and delivers those results to you as the most relevant. The technology doesn’t quite exist in such a form at the moment, as it would require massive amounts of computation to hold the web open, ready to pick out the points you’re searching for. But the web and its technology grow every day, and perhaps soon enough we’ll be able to talk to our devices to find what we want.
Around the middle of last year, Google started slowly pushing out warnings to webmasters about what it deemed ‘unnatural links’ pointing to their websites. Unnatural links, for lack of a better description, are links unrelated to your website; for example, a plumbing forum with links pointing to a website on cooking or gardening. Earlier this year, Google stepped up the notifications significantly and, almost immediately, sent the world of search engine optimization into a tizzy.
It was at that juncture that webmasters began to drop links to and from their websites, presumably in the hope that by sending in a reconsideration request they would be able to clear the mark from their Webmaster Tools page. It’s an interesting process that Google has put into place with the unnatural links notifications: some webmasters have presented evidence showing that they did nothing and still plummeted in the search results, while others, who went through untold rigmarole trying to get their links cleaned up, reported no change in their positioning despite multiple notifications.
And to muddy the waters just a little more, over the last day or so Google has sent out another massive batch of unnatural links notifications to webmasters everywhere. It seems that as of late, with all of the features Google has been adding to its Webmaster Tools suite, they’re really looking to place responsibility on site owners. An interesting twist to the equation is that search engines place at least a portion of their ranking weight on the links pointing to a website. Maybe this is the beginning of Google trying to diversify its ranking algorithm and ideals? Time will tell, but giving webmasters the idea that they need to carefully maintain their link profiles is an interesting step.
Just as Hitwise measures search market share, the ACSI (American Customer Satisfaction Index) puts out a report which tries to put a number on how happy users are with the various search engines and social media sites out there. While the survey held some expected results, there was a surprise or two to be seen.
As far as search engines were concerned, it wasn’t a huge surprise to see Google still at the top of the list with an overall 82 points out of 100, with Bing picking up a little ground and coming in at 81 points of satisfaction. When pressed for reasons, more than half of the respondents who chose Bing noted that they liked its ease of use. I may be somewhat biased, as I’ve always primarily used Google for the bulk of my searching, and perhaps it’s a difference of Bing.ca versus Bing.com, but I’m not sure how Bing is easier to use than Google when both are just a search box. The links which appear after performing a search are nearly identical, and it’s a rather short affair to refine your results and tailor them as you like. Opinions differ for everyone, though, and that is the main point of a survey after all: to gather as many different ones as possible.

Back to the list: working our way down, we pass Ask.com at 80 points and Yahoo at 78 points in customer satisfaction. Note that the survey wasn’t about who uses which search engine; Hitwise covers that quite well, and those numbers are fairly static, with Google holding onto the lion’s share of the market. The point of the survey was satisfaction with a preferred search engine, and acquiring a rounded opinion means that after a point the survey would have filled its quota of Google users and been looking for Bing, Yahoo, and Ask users.
A new report ACSI has put out, however, details the satisfaction level of those who use social media sites like Facebook and Google+. And just like the report on search engine satisfaction, it’s not about market share; the same principle applies: to form a rounded opinion you need as equal a number of respondents as possible for each social media site. It was with this report that the numbers began to be surprising. The top marks in the survey actually went to Google+, with 78 points out of 100, followed by YouTube (73 points) and Pinterest (69 points). Twitter, LinkedIn and Facebook all took the bottom spots, with Facebook holding the basement spot at 61 points.

With such a vastly diverse user base, it is understandable that some users would hold strong opinions about how Facebook handles itself, but a few key reasons emerged that hurt the social media giant. The biggest issues came from the implementation of the Timeline feature: users felt there were too frequent, unnecessary changes to the user interface. Intrusive advertising was an issue for nearly 20% of the respondents, and one of the largest contributors to dissatisfaction was the privacy concerns which still dog the social giant. Nearly half of those surveyed rated Facebook a 5 or lower on a 10-point scale for how it handles privacy.

Not surprisingly, the reasons Google+ excelled in the survey happened to be the reasons Facebook tanked in comparison. On that same 10-point scale, 60% of the Google+ respondents ranked the fledgling social site’s privacy protection as excellent. No advertising (at least not in the sense that dominates Facebook) exists on the service, with no plans at present to add any, and a very strong mobile presence also helped Google+ attain the top marks in satisfaction this year. There is, however, a small caveat to bear in mind with the social media results.
On the whole, taking all of the social sites together, users rated their satisfaction with social media at only 69 points out of 100, almost putting it in the basement of the study alongside television, newspapers and airlines.
When discussing links and linking strategies, I’ve mentioned the negative connotations of having poor or unrelated backlinks pointing to your website. The watered-down version of this would be: say you own a window repair business, and in passing the search engines notice that you have a few hundred links from a taxidermy site pointing to your URL. That’s a very quick way to get yourself in trouble, have your site scrutinized and, quite possibly, dropped from the index until you have gone over all of the backlinks pointing to your site.
It’s long been an issue for the search engines to deal with the proliferation of improper linking schemes employed by sketchy SEO practitioners. Google has its list and documentation about what it will do to your site should improper links or linking strategies point at it, but has it been beaten to the punch in truly dealing with the problem? Very recently one of the other players in search, Bing, released a way to let site owners disavow the links pointing to a website. What leads to a bit of confusion, however, is the way Bing talks about how improper backlinks affect your site, sometimes saying they won’t do any harm and sometimes saying they could very much negatively affect your position within its rankings.
From the Bing Webmaster Blog:
Today we’re announcing the Disavow Links feature in Bing Webmaster Tools. Use the Disavow Links tool to submit page, directory, or domain URLs that may contain links to your site that seem “unnatural” or appear to be from spam or low quality sites. There is no limit on the number of links you can disavow via this tool.
It’s a great way for you to have more control over who is pointing their content at your site, as well as over your own web positioning. It’s a solid first step toward being able to control your backlinks, and it will be interesting to see how Bing deals with the reports that are submitted, as they have been notoriously slow to deal with changes and updates to their index.
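The tool accepts URLs at the three scopes named in the announcement: a single page, a directory, or a whole domain. As a rough illustration of how a spammy backlink URL maps onto those scopes, here is a hypothetical helper (`disavow_scope` is not part of any real API, and the example domain is invented):

```python
from urllib.parse import urlparse

def disavow_scope(url):
    """Classify a URL into the three scopes Bing's Disavow Links tool
    accepts: a whole domain, a directory, or a single page."""
    parts = urlparse(url)
    path = parts.path
    # No path (or just "/") means the whole domain is being disavowed.
    if path in ("", "/"):
        return ("domain", parts.netloc)
    # A trailing slash marks a directory: everything beneath it.
    if path.endswith("/"):
        return ("directory", parts.netloc + path)
    # Otherwise it is one specific page.
    return ("page", parts.netloc + path)
```

Disavowing at the domain level is the bluntest option; the directory and page scopes let you keep any legitimate links from the same site.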
At present, there are a couple of different active penalty systems being tossed around at Google: the Penguin update, and the link penalties being levied against sites with strange link profiles. If you’ve noticed a drop in your site’s position, how are you to know which way to proceed? Let’s have a quick look and see if it can be narrowed down some.
First, the Penguin update, which is still running around, seemingly causing mass havoc for some site owners. The most recent version of Penguin in the wilds of the web searches for unnatural backlinks, applies a tighter field of view when looking for fresh content, generates better page titles (if your page doesn’t have one), and improves detection of hacked websites and pages. Note as well that all of these algorithm changes, and many more, run automatically as the spiders work through your site. If you’ve received no notification in your Webmaster Tools area, chances are you’ve been hit by a Penguin drive-by.
The second most common way that sites are currently being hit is with what is called an unnatural link penalty. The key difference between this method and Penguin is that the link penalty is manually handed down against your website. If you pay heed to your Webmaster Tools notifications (you did remember to set those up, right?), then just follow the steps in place to correct the penalties levied.
Or as I saw it put so succinctly:
Unnatural links is more about link networks, paid links, blog networks and unnatural link patterns.
Penguin is more about low quality links with weird looking anchor text, plus other over optimization related link building techniques.
In the discussion that has followed, it has been noted that the two seem to be interrelated, so be sure to keep tabs on your website’s performance, and don’t ignore the notifications you receive. You’ll only be sinking yourself faster should you choose to ignore any warnings or messages.
Search engine optimization: it’s been the big marketing buzzword of the last 5 years or so. What was once a highly technical and relatively unknown business tactic has become a medium embraced by the masses. So well embraced, in fact, that it’s become more and more populated by people who barely understand what the term means, let alone how to properly implement it on a client’s site.
It’s becoming ever more obvious when we speak with prospective clients that their first introduction to the world of SEO wasn’t all it was cracked up to be. The most common way to be taken in is usually through your web host, offering what seems to be an amazing suite of submissions to directories and search engines. What these small, and sometimes large, companies don’t seem to realize is that directories don’t carry much influence with the search engines, and as for being submitted to them, well, it’s not a process that exists in so many words. This scenario, as bleak as it may seem, is unfortunately the best case.
The worst case scenario, and we’ve run into it a few times, is a client who has been attracted by the false promises of the ‘darker’ side of SEO. The black hat entrepreneurs, if you can call them that, lure in their clients with promises of page 1, top 10 positions and a quick return on investment. The problem here is that the tactics employed often destroy the online reputation of the company and frequently lead to the website being removed from the index. When we’re engaged with a client in such a predicament, we’ve actually had them start over entirely: new URL, new website, the works.
Key points to remember about legitimate search engine optimization are:
- It is not a one-shot deal that will place you in the top rankings
- It is a long-term, high-ROI solution
- It is the highest-ROI marketing solution when the costs and gains are weighed
- A real SEO expert will engage you as a client, helping you create content and an online experience that brings qualified visitors consistently to your site
Don’t be fooled by get-rich-quick schemes where SEO is concerned: they don’t work, they don’t exist, and anyone who is trying to sell you one should be blacklisted in your contact book.
With all of the updates that have been applied in the last while, Penguins, Pandas and who knows what else coming, it’s becoming fairly common to read the occasional article on how poorly Google is faring as a search company. The headlines are even creeping into mainstream media more and more often, especially with Google+ pushing into Facebook territory.
But when you look at the numbers, year over year, nobody is really going anywhere. Where online search is concerned, just over two-thirds of users choose Google as their search engine when looking for information online. The Bing/Yahoo machine (since Bing provides all of the results for Yahoo) stayed at nearly 30% search share for the month of May, overall a loss of share for the duo. Bing remained constant from April and gained from a year ago, but since they fill the role of search engine for Yahoo, it is only logical to lump the pair together. The remainder of the search market is taken up by everyone else: Ask, AOL, and all of the other smaller engines out there like DuckDuckGo. These numbers apply to the desktop search market.
The mobile search market is much different from the desktop variant. While there may be a much more varied platform base in the mobile market, it is absolutely dominated by Google, which takes up a monster share of 95% of the US market. It seems that regardless of how loudly some SEOs proclaim the death of Google as a search engine, the general user disagrees. At this point in the life of the web, the original search engine is still the best search engine, going by the numbers. Your personal use and interpretations will vary somewhat from the general public’s.
Google, the Government, and you
Going over the search share numbers, it’s plain to see that Google is sitting on the largest share of the pie, by a very clear margin. Being a company of such huge size, with such a massive market share, makes you an impressively large target to take aim at. A couple of years back, in order to make information more available to the public, Google began a new feature it dubbed the Transparency Report. The idea was to give the general public a sense of the types of removal requests the company faces on an ongoing basis. They’ve now released their fifth data set, which gives a fairly clear timeline of events and online postings, and in their blog post from yesterday, Google noted a disturbing trend.
“We noticed that government agencies from different countries would sometimes ask us to remove political content that our users had posted on our services. We hoped this was an aberration. But now we know it’s not.”
It should be no surprise that governments are keenly interested in online activity and online content; it was only a short time ago that portions of the internet went black in opposition to the proposed SOPA bill. But even though governments have been requesting that blog posts, videos, and sometimes even entire websites be removed from the index, in the end they are just that: requests. And with the nature of requests can come denial, which is what Google has been doing with most of the requests it has received. You can delve deeper into the report by following this link, and it’s safe to assume that other search engines often receive the same requests to remove content from their indexes as well.
We’ve been over the steps of what you need to do when you’ve been penalized and dropped from the index, but once you’ve followed all of those steps, you might be wondering just what’s next. To recap quickly: first go over your email (which you undoubtedly have) and follow their major points of issue. If it’s bad backlinks, do your best to have them removed. Spammy content? Get a handle on it and rewrite it. Found out your SEO is playing the black hat game of gaming the engines instead of working with them? It’s time to drop them and call the real experts in search. After all of those steps, you resubmit your site for inclusion.
But once you’ve done all of that, it’s in the hands of the search gods. It’s where you need to sit on your hands and wait for them to decide if you’ve done enough to be reindexed and included back in the search rankings. What some people don’t realize, though, is that sometimes the search engines don’t fully clean your record; it may only be a partial pardon, an incentive, really, to clean up the rest of your act. Just as search engine optimization isn’t a black and white industry, neither is the way Google or Bing direct traffic.
So, just how relevant is too relevant? It’s a question being asked lately as, more and more often, the results page tends to be overtaken by the same website. There was a short video put out by Matt Cutts and the Google team trying to describe just what’s going on.
The method for displaying these newer results, however, has been getting under users’ skin. How diverse do the search results really look, or seem, when the top three or four spots, and sometimes the entire page, are taken up by a single site? Relevance to the search query is obviously what drives Google and the other search engines to deliver their results, and the better refined they are, the better it is for the end user. Have you had any instances recently where the search results page has been dominated by a single result?
With Google commanding somewhere around two-thirds of the internet search market, it’s important to remember the basic steps we’ve discussed here: simple navigation, a solid website built as simply as possible while maintaining an aesthetic that you enjoy, and solid content with which to bait and capture the bots, and your target audience.
Of these items, it’s content which can actually make or break your online presence. Your content is the meat of your website; it’s what captures the search engines’ attention and what makes you relevant to your target market. If you’ve written it well and made sure it’s relevant to the theme of your business, then you’ve started yourself on the road to the top. When it comes to your content, though, you also need to keep in mind the people you want to read your information. Most visitors to a website will just as soon click that back button if they can’t find their way to the content they desire quickly and easily. It’s a great idea, therefore, to break up the monotony of your website and have snippets of highly relevant information stand out on your pages. Text that is bolded, italicized, and placed high on your pages helps deliver a message quickly and clearly to visitors.
At the other end of the spectrum, you have your full articles placed within your pages for the audience you want to return to your site continually. This is for the captivated visitor, whom you’ve already sold on your business or website. To a reader, all information is generally good information. The more they know about the product, your company and anything else that increases your credibility, the more secure they will feel doing business with you instead of a competitor. Text is an important part of the decision-making process. From the homepage to categories and sub-categories to the actual product page, the reader is intensely interested in what you have to say, as it will be the determining factor in whether you get a conversion or not.
Good website marketing isn’t about building a site for any one type of visitor, it’s about building a site that speaks to as many different visitor types as possible without alienating any. You must have the right pieces in the right places in the right way. Skimmable content allows you to target all types of readers and give them even more than they want. That way, everyone has a positive experience.