
Facebook Search Engine on the Horizon?

Sep 12, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

It’s long been clear that a search engine is a search engine, and a handful of them receive the majority of the traffic out there. Google, Bing and Yahoo are the usual choices, with other sites like Blekko, DuckDuckGo, and Ask also being used by those desiring a different experience. Facebook, however, is one of the largest websites online, and with what is approaching a billion users, the world has been waiting to see if Facebook is going to try and enter the search arena.

Ever since Facebook went public, the stock has been a sort of tepid pool, and with no real revenue model in sight the online mutterings often turn to the topic of a search engine. And when Mark Zuckerberg throws around statements akin to “Facebook is doing a billion searches per day without trying,” the mutterings pick up some volume. In a recent TechCrunch interview, Zuckerberg made it clear that the company realizes there is a huge opportunity for a search engine within Facebook, but tempered that by also talking about how the way users search is ever evolving.

“Search engines are evolving” to “giving you a set of answers,” Zuckerberg said. “Facebook is pretty uniquely positioned to answer a lot of questions people have. At some point we’ll do it,” he went on. “We have a team working on search.”

There are some real concerns about just how Facebook could leverage their massive user base as a search engine, however, and they have less to do with spidering capabilities than with privacy. The system as he briefly described and envisioned it would mean taking the opinions of your friends, family and contacts and trying to form a result for your query. Searching for terms like ‘best burger’ or ‘new Batman movie reviews’ wouldn’t necessarily be informational, but would deliver you a list of opinions from your contact list. At any rate, it’s not happening today or tomorrow, or even soon for that matter. But if, or when, it does, it will introduce change into the search landscape and the online experience as a whole, and change is very, very good.

More Google Patent Thoughts

Aug 30, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

Previously I wrote about a Google patent which has gained more and more traction in the search community. The patent in question is named Ranking Documents, and as mentioned previously it seemed like an interesting tactic to employ to catch those who use less than ideal tactics to rank a website.

Unfortunately, you’ll often find that people discussing search engine optimization will tack on the terms spamming, or buying links, but the truth of it is that when you do it right, SEO is none of those things. The new patent that was discovered I think of like a magic trick: it’s a bit of sleight of hand that Google is using to get those unscrupulous tacticians to reveal themselves. Google shows them the index, and they try to spam their way to a high ranking. When Google notices the spam, it shows them a different set of results (if you don’t think they can do that, you’re not thinking clearly) just to see the reaction of their target. If they continue to offend, their site will likely be penalized, and the pages that rank legitimately won’t have been affected negatively.
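To make the idea concrete, here is a toy sketch of the tactic as I’ve described it above, not the patent’s actual method; the spam signal, the rank shuffle and the thresholds are all invented for illustration.

```python
import random

# Toy illustration of the tactic described above: if a page trips a spam
# signal, show it a deliberately perturbed rank and watch whether the owner
# reacts with even more manipulation. Every signal and threshold here is
# invented for the example; the real patent and Google's systems are far
# more involved.

def spam_score(page: dict) -> float:
    """Crude spam signal: ratio of suspect-looking links to total links."""
    return page["suspect_links"] / max(page["total_links"], 1)

def displayed_rank(page: dict, true_rank: int) -> int:
    """Suspected spammers are shown a shuffled rank instead of their true one."""
    if spam_score(page) > 0.5:
        return true_rank + random.randint(3, 10)  # push them down a few spots
    return true_rank

def observe_reaction(page_before: dict, page_after: dict) -> bool:
    """If the site responds to the demotion with more suspect links,
    that reaction itself becomes further evidence of manipulation."""
    return spam_score(page_after) > spam_score(page_before)

site = {"suspect_links": 80, "total_links": 100}
print(displayed_rank(site, true_rank=2))  # shown a worse rank than it earned
print(observe_reaction(site, {"suspect_links": 150, "total_links": 170}))  # True
```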

Spamming links to a website to give undeserved weight to a term is a measure used by those in the industry who are in the game for only a few quick clicks. Often working with disposable website URLs, they will target a hot, trending term – Paul Ryan’s RNC speech, as an example – and will try to garner as many links and as much visibility as possible in as short a time frame as they can. It’s a pump and dump tactic that is very likely to get the offending site banned from the index and the URL blacklisted.

I foresee that this patent will be dissected, discussed, scrutinized and blamed for all manner of SEO problems and headaches to come. The only problem that some site owners have with Google, Bing and Yahoo is that they’re getting better and better at catching the people who try to cut corners or use less than natural methods to rank their site. Maybe one day soon we’ll have an adaptive algorithm which detects and removes spam sites as you actively search, so black hatters go the way of the dodo bird.

Algorithm Updated and It’s Getting Smarter

Aug 27, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

There have been adjustments, changes, and what seem like complete rewrites of the algorithm that Google started with in the beginning. At first when you searched, the results you were given were based directly upon the query as you’d entered it, and sorted by how many backlinks each page had. Now, however, when you search for ‘Winnipeg Jets’, images and video for the team appear even though the words “images” or “videos” weren’t in the query.

The algorithm that Google, Bing and a handful of others use has grown and evolved to a point where it’s trying to anticipate what you’re searching for, as well as answer the direct query you may have typed. The search engines are getting better at bringing up what you want to see on your given topic, and seem to be weighing the number of clicks through to a result alongside all of the previous criteria. If you’ve ever mistyped a word or two while searching, you’ve probably noticed that search engines are also able to correct commonly made spelling mistakes. What the engines are getting much better at, however, is not correcting your spelling but interpreting what you may actually be searching for. Google can load dictionaries of how words should be spelled along with common misspelled variations of those words, and can look at how searchers correct their searches and when they click on different variations. It can use this data not only to suggest a query with a different spelling, but to treat the misspelling as a synonym behind the scenes and rank the correctly spelled matches.
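As a rough illustration of the misspelling-as-synonym idea described above (and nothing like the scale or sophistication of what Google actually runs), here is a minimal sketch; the misspelling dictionary, documents and scoring are all made up for the example.

```python
# Toy illustration of treating known misspellings as synonyms at query time.
# The misspelling dictionary and documents are invented for the example;
# real engines learn these mappings from query logs at a vastly larger scale.

MISSPELLING_SYNONYMS = {
    "winipeg": "winnipeg",
    "jetts": "jets",
    "recieve": "receive",
}

DOCUMENTS = {
    "doc1": "winnipeg jets schedule tickets and highlights",
    "doc2": "receive email updates about the winnipeg jets",
    "doc3": "gardening tips for a short growing season",
}

def normalize(term: str) -> str:
    """Map a known misspelling onto its correctly spelled form."""
    return MISSPELLING_SYNONYMS.get(term.lower(), term.lower())

def search(query: str) -> list[str]:
    """Rank documents by how many normalized query terms they contain."""
    terms = [normalize(t) for t in query.split()]
    scored = []
    for doc_id, text in DOCUMENTS.items():
        words = set(text.split())
        score = sum(1 for t in terms if t in words)
        if score:
            scored.append((score, doc_id))
    return [doc_id for score, doc_id in sorted(scored, reverse=True)]

print(search("Winipeg Jetts tickets"))  # matches doc1 despite the misspellings
```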

In the same basket as interpreting what you might be looking for, Google is noticeably moving forward on trying to discern your intent while searching as well. A basic description of how it works:

Keyphrases don’t have to be in their original form. We do a lot of synonyms work so that we can find good pages that don’t happen to use the same words as the user typed. – Matt Cutts

Even Bing is getting in on the act: if you misspell a common word or phrase, you’ll still often be brought to the correct results. Perhaps soon enough, you won’t need to search by typing, but by simply visiting your preferred search engine. And because you researched a new car and purchased one via online dealership shopping, the engine will know that in a few months it should perhaps deliver you information on local garages which offer oil changes and tire rotation services.

The Future of Search

Aug 9, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

Google has been king of the search world from almost the day it became a tool on the web. There are a handful of other search engines as well, all of which do their best to offer a choice when you’re looking for online information.

There’s been some discontent with Google as a service, and it sometimes leaves users craving an alternative to the giant. There is a small problem with that idea, however, and it’s the same reason that makes Google so successful. When you consider the basics of search, if you have quality content that people easily link to, you’re going to be well represented on the search engines – Google, Bing, etc. Google has simply worked out how best to deliver the content you’re most likely searching for, because it can evaluate both the content and the links leading back to that content.

That doesn’t mean, though, that the web and search are due to stagnate. Everyone is working to innovate in the space, trying to find the newest, and biggest, evolution in search and online interaction. There are some out there that do their best to be an answer engine, where you can basically query a database for an answer, and there are others which pride themselves on being hand curated by teams of human users to help promote the most relevant content. Bing and Google are both trying their hands at integrating your social life into your search results, both with mixed success at present. But what all of these search engines get stuck and stumble upon is the same issue: all of the current relevant results are built primarily upon links and link structures to help give value and authority to the website.

The future of search won’t lie in constructing links back to your quality content; it will arrive when someone is able to come up with a search engine which can predict what it is you may be searching for. When you start looking for a new home for sale in a new city, for example, it could determine, based upon your current and previous searches, that you need a home near a school for your children, and deliver those results to you as the most relevant. The technology doesn’t quite exist in such a form at the moment, as it would require massive amounts of computation to hold the web open, ready to pick out the points you’re searching for. But the web and its technology grow every day, and perhaps soon enough we’ll be able to talk to our devices to find what we want.

Google and Google+ Earn Top Marks

Jul 18, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

Just as Hitwise measures search market share, there is a report put out by ACSI (the American Customer Satisfaction Index) which tries to put a number on how happy users are with the various search engines and social media sites out there. While there were some expected results in the survey, there was a surprise or two to be seen.

As far as search engines were concerned, it wasn’t a huge surprise to see Google still at the top of the list with an overall 82 points out of 100, with Bing picking up a little ground on them, coming in at 81 points of satisfaction. When pressed for reasons about their satisfaction, more than half of the respondents who chose Bing noted that they liked its ease of use. I may be somewhat biased, as I’ve always primarily used Google to do the bulk of my searching, and perhaps it’s a difference of Bing.ca versus Bing.com, but I’m not sure how Bing is easier to use than Google when both are just a search box. The links which appear after performing a search are nearly identical, and it’s a rather short affair to specify your results and tailor them as you like. Then again, opinions are different for everyone, and that is the main point of a survey after all: to gather as many different ones as possible. Back to the list.

Plonking our way down, we pass Ask.com at 80 points and Yahoo at 78 points in customer satisfaction. And note, the survey wasn’t about who uses which search engine; Hitwise covers that quite well, and those numbers are fairly static, with Google holding onto the lion’s share of the market. The point of the survey was satisfaction with the respondent’s preferred search engine, and acquiring a rounded opinion means that after a point the survey would have filled its quota of Google users and gone looking for Bing, Yahoo, and Ask users.

A new report that ACSI has put out, however, details the satisfaction level of those who use social media sites like Facebook and Google+. And again, just like the report on search engine satisfaction, it’s not about market share, it’s about satisfaction, so the same principle applies: to form a rounded opinion you need as equal a number of respondents as possible for each social media site. It was with this report that the numbers began to be surprising. The top marks in the survey actually went to Google+, with 78 points out of 100, followed by YouTube (73 points) and Pinterest (69 points). Twitter, LinkedIn and Facebook all took the bottom spots, with Facebook holding the basement spot at 61 points.

With such a vastly diverse user base, it is understandable that some users would hold strong opinions about how Facebook handles itself, but a few key reasons emerged that hurt the social media giant. The biggest issues came from the implementation of the Timeline feature; users felt there were too frequent, unnecessary changes to the user interface. Intrusive advertising came up as an issue, with nearly 20% of the respondents complaining, and one of the largest contributors to dissatisfaction was the privacy concerns which still dog the social giant. Nearly half of those surveyed rated Facebook a 5 or lower on a 10 point scale for how it handles privacy.

Not surprisingly, the reasons Google+ excelled in the survey happened to be the reasons Facebook tanked in comparison. On that same 10 point scale, 60% of the Google+ respondents ranked the fledgling social site’s privacy protection as excellent. There is no advertising on the service, at least not in the sense that dominates Facebook, and at present there aren’t any plans to add it; that, plus a very strong mobile presence, helped Google+ attain the top marks in satisfaction this year. There is, however, a small caveat to bear in mind with the social media results. On the whole, taking all of the social sites together, users are only 69 points out of 100 satisfied with social media sites, which almost puts the category in the basement of the study alongside television, newspapers and airlines.

Is Linking Dead?

Jul 6, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

With the way things have been progressing online, it’s not much of a stretch to think that some of the old ways have fallen by the wayside for search engines. Google, Bing, Yahoo and all of the others out there need to choose a metric of sorts whereby they can determine what is relevant to certain topics and categories. The long running, and highest contributing, factor since the beginning has been linking to websites; both those which are relevant to your business and those which may help your positioning. It’s a very simple formula really: site A compiles a great deal of information about cogs and becomes known across the country as the top producer and information source for them. Site B is a reseller for site A, and as such provides a link directly to site A, helping cement site A’s position online as the top purveyor in the cog industry. That’s a very basic example; multiply it a trillion times and you’re beginning to see the beginnings of the internet, and how linking works to sort out the web.
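That site A / site B example is essentially what link-based ranking formalizes: a link from B to A counts as a vote for A. Below is a minimal PageRank-style sketch of the idea, with a tiny made-up link graph; real engines chew through trillions of links and blend in many more signals.

```python
# Minimal PageRank-style sketch of how links confer authority.
# The link graph is invented for illustration; "site_a" is the cog
# authority from the example above and "site_b" its reseller.

def pagerank(links: dict[str, list[str]], damping: float = 0.85, iters: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {
    "site_a": [],                    # the cog authority, links out to no one
    "site_b": ["site_a"],            # reseller linking to the authority
    "site_c": ["site_a", "site_b"],  # a directory linking to both
}
print(pagerank(graph))               # site_a ends up with the highest score
```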

Over the last couple of years especially though, the social web has made a big splash. Facebook, Twitter, LinkedIn, Pinterest, Google+, all social sites which are being hooked by the spiders and are being more frequently plugged into your search results. Some companies out there like to make the correlation that the larger the web has gotten, the more community focused people have become. Instead of searching out for the best cog company, people are asking their friends on Facebook and Twitter for example. It was a shock that ran through the SEO industry to think that the value of links was going to be tossed away, all because people suddenly had the ability (not that they didn’t already) to ask their friends for their opinion. It’s a (non)issue that continues being blown out of proportion by the unseasoned search experts out there.

But a very simple truth is this: the weight that linking and backlinking carry for websites isn’t going to go away. Not just yet, and not in the near, foreseeable future at any rate. That’s not to say that social signals and social linking aren’t going to become the heavyweights at some point, but that point is not today, not next month, and not next year.

Search Market Share Numbers & Transparency

Jun 18, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

With all of the updates that have been applied in the last while – Penguins, Pandas and who knows what else is coming – it’s becoming fairly common to read the occasional article on how poorly Google is faring as a search company. The headlines are even beginning to creep into mainstream media more and more often, especially with Google+ pushing into Facebook territory.

But when you start to look at the numbers, year over year, nobody is really going anywhere. Where online search is concerned, just over 2/3 of the users choose to use Google as their search engine when looking for information online. The Bing/Yahoo machine (since Bing provides all of the results for Yahoo) stayed at a near 30% search share for the month of May, overall a loss of search share for the duo. Bing remained constant from April, and gained from a year ago, but since they’re filling the role of search engine for Yahoo, it is only logical to lump the pair together. The remainder of the search market is taken up by everyone else, Ask, AOL, and all of the other smaller engines out there like DuckDuckGo. These numbers are relevant to the desktop search market.

The mobile search market is much different than the desktop variant. While there may be a much more varied platform base in the mobile market, it is absolutely dominated by Google, which takes a monster share of 95% of the US market. It seems that regardless of how loudly some SEOs proclaim the death of Google as a search engine, the general user disagrees. At this point in the life of the web, the original search engine is still the best search engine, going by the numbers. Your personal use and interpretations will vary somewhat from the general public’s.

Google, the Government, and you

Going over the search share numbers, it’s very plain to see that Google is sitting on the largest share of the pie, by a very clear margin. Being a company of such a huge size, with such a massive market share, makes you an impressively large target to take aim at. A couple of years back, in order to make information more available for view, Google began a new feature they dubbed the transparency report. The idea was to give the general public a sense of the types of removal requests the company faces on an ongoing basis. They’ve released their fifth data set, which gives a fairly clear timeline of events and online postings, and in their blog post from yesterday, Google noted a disturbing trend.

“We noticed that government agencies from different countries would sometimes ask us to remove political content that our users had posted on our services. We hoped this was an aberration. But now we know it’s not.”

It should be no surprise that governments are keenly interested in online activity and online content; it was only a short time ago that portions of the internet went black in opposition to the proposed SOPA bill. But even though governments have been requesting that blog posts, videos, and sometimes even entire websites be removed from the index, in the end they are just that: requests. And requests, by their nature, can be denied, which is what Google has done with most of the requests they’ve received. You can delve deeper into the report by following this link, and it’s safe to assume that other search engines often receive the same sorts of requests to remove content from their indexes as well.

Yahoo Fighting to Remain Relevant

May 24, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

You can’t knock them for trying, and even though they’re well out of the search market share race, Yahoo has thrown another punch in the fight to stay relevant. Their new piece of software, Axis, is actually an interesting project, born of the idea of making searching quicker and synchronized across your devices.

When you’re on your PC or laptop, you can only use the new tool as a browser plugin, which gives you a more visual display of your searches. With the results pulled directly from Bing-powered search, you can browse through your results in a manner more akin to flipping album covers in your media player. The results are the same, powered by Bing as they have been for the last while, just delivered in a different package. As well as a bottom screen bar taking up some of your screen real estate, you’re also given tabs on the left and right of your screen to quickly navigate deeper into your search results. It’s a different take on the game, and basically eliminates using the back button to locate exactly what you were looking for.

There is a bigger difference when you start to use Axis on your iPhone or iPad, however. The app functions more like an instant share button, allowing you to spread the word about the newest deal you’ve found. It also allows you to quickly send those results to any of your contacts, and has added a new spin to mobile browsing by letting you preview your destination. Currently no other app in the marketplace has that capability, and by innovating, Yahoo, with Axis, has kept itself relevant in the search marketing game. A bonus is that they’ve also allowed you to sync your devices when signed into your Yahoo account, so while looking for that home repair guide on your desktop, you can open the same page on your iPad and get right down to work, as it will have your spot saved right where you left off.

With mobile marketing and mobile search set to grow at a massive rate over the next few years, by pushing into this market early Yahoo has definitely made themselves a player in the meantime, and likely a part of the game for at least a few more years to come.

Google Saving Your Advertising Dollars

Apr 12, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

It’s somewhat common knowledge that when someone performs a search, there will be a box of “sponsored results” to the left, above, and sometimes even below the organic results. Bing has a paid service, as does Yahoo, and Google has AdWords, which proved that a business in search can be profitable. There’s been discussion lately surrounding paid search advertising and the big 3 search engines, and if you’re not careful with how you read it, you may walk away with the wrong idea.

Compared to this time last year, the CPC for Google has fallen again, for the second quarter in a row, while Bing and Yahoo’s CPCs have continued to climb. On the surface that can make it sound like Bing and Yahoo have been managing to grab ad space from Google. Closer to the truth, however, is that Google has become an even better choice to advertise with, as opposed to Bing and Yahoo. Search engine marketing via the AdWords platform, or one like it, has to be measured differently than the organic results; you can’t take positioning as the end goal.

When you begin to break down the numbers involved in SEM and SEO, there are some key differences that you need to understand. They both depend on conversion rates, because without converting your traffic, you’re wasting time and money. One of the largest and most important differences, however, is the click-through rate of your positioning. You could be ranked at the very top of the AdWords results, but if you have a poorly written ad, or a poorly built website, chances are your conversions will be limited.

Another major point you need to keep in mind is cost per click, or CPC, as was being discussed earlier. Where paid advertising is concerned, CPC is a literal interpretation of how much it is costing you to have someone click on your listing. Organic SEO is more difficult to define, as you’re not paying each time someone clicks your organic listing, but after a few months you can more easily break it down. A high cost per click for your search term can mean that there are many people in the same space, or it can mean that one of your competitors is driving up the bid on the keyword to try and gain dominance. A declining average cost per click isn’t necessarily a bad omen either: it can point to reduced competition, and it can also mean an improved conversion rate.
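To make the difference between CPC, click-through rate and conversions concrete, here is a small worked sketch; every number in it is invented for illustration, not real campaign data.

```python
# Worked sketch of the paid-search math discussed above.
# All numbers are invented for illustration only.

impressions = 10_000   # times the ad was shown
clicks = 250           # times it was clicked
spend = 500.00         # total ad spend in dollars
conversions = 10       # sales or leads produced by those clicks

ctr = clicks / impressions                 # click-through rate
cpc = spend / clicks                       # cost per click (the literal interpretation)
conversion_rate = conversions / clicks
cost_per_conversion = spend / conversions  # what a sale or lead actually costs you

print(f"CTR: {ctr:.1%}")                                    # 2.5%
print(f"CPC: ${cpc:.2f}")                                   # $2.00
print(f"Conversion rate: {conversion_rate:.1%}")            # 4.0%
print(f"Cost per conversion: ${cost_per_conversion:.2f}")   # $50.00

# A falling average CPC with a steady cost per conversion is the benign case
# described above: cheaper clicks, same or better return on the spend.
```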

SEO’s Biggest Enemy – Itself

Apr 2, 2012   //   by FreshTraffic   //   internet marketing  //  Comments Off

There are all sorts of experts out there in the SEO world, and for all of the experts willing to take only a couple of hundred dollars to place your site, there is a larger road block to finding the real pros in the industry: information. Good info, bad info, just plain wrong info – if you search for “seo expert”, or anything along that line of thought, you will run into some real winners if you’re willing to dig deep enough.

For the last year or so the internet search world has been buzzing about Panda: it dropped this client’s site, or it ruined the results for the term which magically brought their site 20,000 visitors in a day previously. Casting aside all of the hyperbole, Panda didn’t affect the vast majority of the websites out there; the main aim of the algorithm is to seek out spammy sites, sites with content scraped from other sources, and even sites which use automatic posting tools. Just like with any of the other major algorithm shifts, if you weren’t doing anything wrong at all, you’ll have noticed very little change in your positioning and in your visitors.

But if you happened to be working a back linking scheme to garner thousands of links from a seemingly active blog, and you magically dropped in the rankings, then chances are the blogger wasn’t quite doing things the proper way. Before you start reading information on search and taking it at face value, you need to dig deeper into the threads and posts on the site that calls itself expert. If it comprises only a handful or so of pages, chances are they haven’t done anything except find some decent content and copy it. If the information sounds good, check the post date on it; if it was posted even a year ago, then as great as it sounds, there are likely vast portions of it that are unusable. And finally, actually take the time to dig into the post, and read it both silently and out loud. If something isn’t adding up as you read it, the sentence structure is off, or the cadence is jerky, then there are a couple of strong contenders. The post in question was either scraped and put together in a hodgepodge fashion to try and dupe the algorithms, or the piece was written by a piece of software.

The last bit may seem a little odd, but there are programs available now which can write all of your blog posts for you. You can feed one a topic, how many words you want, and what type of emphasis, and a few minutes later you have a post. The key issue with these programs, however, is that the result is just as bad as someone manually scraping the web for content: the posts are almost entirely made up of scraped content. The software is just designed to piece it together to fit the parameters you have set.

There is really only one rule to bear in mind when searching for an SEO, or when one approaches you: can you find them when you search for them? Because if they can’t list their own site in the top few pages, then chances are very strong they can’t do a thing for your site either.
