If you’re building yourself a website to get your brand out there and it’s your first foray into online marketing, there are some key points you need to keep in mind. Whether this is your first step in digitizing your presence, or you’re well versed in the jargon, a refresher course is always a prudent way to dissect your presence and evaluate how effectively your search engine optimization has been implemented.
How is your content written? Is it clearly worded so visitors can quickly find what they’re looking for? Or have you crammed your pages with industry-specific terms which only those ‘in the know’ would understand? When you’re creating content for your website, new or established, you need to keep your target demographic in mind. You also need to bear in mind the overall theme of your site as you create your content. Your keyword balance needs to be at the forefront of your mind, as does your target audience.
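One rough way to keep an eye on keyword balance is to measure how often your target terms appear relative to the total word count of a page. The sketch below is a minimal illustration in plain Python; the sample page text and the keywords are made up for the example, and real tools weigh far more than raw frequency.

```python
import re
from collections import Counter

def keyword_density(text, keywords):
    """Share of the page's total words that each target keyword
    accounts for. A rough sanity check, not a ranking predictor."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words)
    return {kw: counts[kw] / total for kw in keywords}

# Hypothetical page copy for illustration only.
page = ("Our bakery ships fresh sourdough daily. "
        "Order sourdough online and taste real bakery quality.")
print(keyword_density(page, ["sourdough", "bakery"]))
```

If a single term dominates the output, the copy likely reads as stuffed to both visitors and the search engines; if your target terms barely register, the page may not match the searches you’re after.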
You should take the time to examine your website navigation and how visitors flow from page to page. Is your menu well ordered and intuitive to the user? Or have you crammed it with every single page on your website? Just because you may offer 35 different services as a company doesn’t mean your menu needs a flyout of 35 different pages. A sitemap takes care of a great deal of the indexing work for the robots and allows them to follow it to double-check your links for you.
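A sitemap of the kind described above is just an XML file following the sitemaps.org protocol. As a sketch of how little is involved, the snippet below builds a minimal one with Python’s standard library; the example URLs are invented, and real sitemaps often add optional tags like `lastmod` that are omitted here.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap so crawlers can discover every
    page, even the ones you keep out of a bloated menu."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page  # the page's full URL
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([
    "https://example.com/",
    "https://example.com/services/plumbing",
]))
```

Once generated, the file is typically saved as `sitemap.xml` at the site root and submitted through the search engines’ webmaster tools.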
Another consideration to keep in mind: what is your target area? Are you chasing multi-national rankings, or do you want to own your local market? Your site needs to be tailored to your goals; sometimes shooting for a smaller target can lead to larger gains as time goes on. These are only a few of the good practices you should employ as a website owner or builder, but they’ll go a long way towards helping you reach your goal of ranking well on the SERPs.
“From destruction comes opportunity…”
Most people panic and throw in the towel when Google makes updates. That’s why so many “amateurs” never make it in the marketing game.
On the other hand, professionals are prepared. They constantly push forward and test everything to see what works. Also, when the amateurs get wiped out, that makes more room for professionals like us to rule the landscape.
So, are you an amateur or professional?
“What works in the aftermath…”
Surprisingly, the same tried and true, quality promotion tactics are still the best. (We can say this because we have over twelve years of experience living through all these updates.)
But, the difference is HOW you do it. It’s the PROCESS and the FLOW that has changed. The promotional steps change, and the timing changes. See, you can’t just randomly promote your sites and build links anymore. You have to time everything properly. Otherwise Google will not reward you.
So, the frequency of your promotions has to change. And the order of your promotions has to change. Listen, we’ve been dealing with updates like this from Google for over a decade. And it’s really not that scary when you understand what Google wants.
The problem is that Google always modifies what they want, and when they want it. What’s more, they aren’t very good at telling you what they want. That’s where it gets confusing. Sometimes the changes are subtle; sometimes they’re extreme. The good news is we’ve always come out ahead, and now so can you! Call us today for help.
There are many steps which are part of a successful organic SEO campaign. There are all the little steps, like writing good content, making sure you have titles and meta tags in place, and building a comprehensive menu. Once you’re finished with the good-practices pages, you begin to read about one of the most time-intensive steps of the campaign: link building.
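The “little steps” like titles and meta tags are easy to audit automatically. As an illustration, the sketch below uses Python’s standard-library HTML parser to flag a page missing a `<title>` or a meta description; the sample HTML is invented, and a real audit would check many more on-page elements.

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Flag pages missing a <title> or a meta description --
    two of the small on-page steps that are easy to forget."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_description = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.has_title = True
        if tag == "meta" and dict(attrs).get("name") == "description":
            self.has_description = True

audit = HeadAudit()
audit.feed("<html><head><title>Acme Plumbing</title></head>"
           "<body></body></html>")
print(audit.has_title, audit.has_description)  # title found, description missing
```

Run over every page of a site, a check like this quickly surfaces the gaps before the crawlers find them.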
Since Panda reared its head over the last year or so, there’s been chatter about how the SEO game has fundamentally changed: that the scrapers and content aggregators, the black hatters and the link buyers would just disappear, and we’d have pristine, precise results. Time has started to play its part, and while the scrapers, aggregators, black hatters and link buyers have mostly been swept away, there has recently been a new call to revamp the way the system has been working. The desire to change the link-building portion of the search metrics comes up in discussion from time to time as the points for and against the practice are argued. When you break it all down to the basic points, practically every search engine will tell you the same thing: content is king. If you produce quality, relevant content, you will rank in the SERPs.
The kicker about producing this kind of content, however, is that you will naturally receive backlinks to your site and its pages. When you’re a new site and you need to visit and email potential customers and partners in your niche, building those backlinks takes time. But they will be built, they will be counted as a metric by the search engines, and until an algorithm comes along which can read and evaluate content as a user would, link building will remain relevant. It will be an important part of any and every organic SEO campaign, no matter how big or how small. The success of your link-building campaign can be directly tied to how much work you’re willing to put into contacting those in industries which complement your own.
What is a domain name worth? Well, the average price of a .COM domain name is $2,595, according to a study released last week that analyzed 10,608 domain sales during the first quarter of 2011.
This could be pretty useful information for digital marketers out there to work into their budgets, but more importantly, they should look at the overall value that a domain provides because the return on investment can be fairly substantial.
Domain names are a pretty basic tool in the digital marketer’s arsenal and should be a main component of any campaign, brand management strategy, product marketing strategy, or even an SEO strategy. However, their importance is often overlooked and can sometimes be cast aside due to the sticker shock of how much the right name costs.
Domains have been sold for $13 and for $13 million, but if you consider the average price, it’s a reasonable investment in the grand scheme of a marketing budget. To put it in a brick-and-mortar perspective that most anyone can understand, $2,600 is roughly the cost of a vinyl sign or display booth, making it a very reasonable investment for most companies.
Another thing to remember is that a domain is an investment. The money you spend upfront on a domain will pay dividends in the traffic it helps generate, but it’s also an asset that can appreciate in value over time. According to the same market study that benchmarks domain transactions, the average price of a .COM increased 9 percent from the first quarter of 2010 to the first quarter of 2011.
We often take domains for granted because they’ve become a part of everyday life, but they’re a valuable tool for driving traffic, and in the end, that’s what it’s all about. Short and memorable domains can make your site easier to find for new and returning customers; keyword domains can improve SEO and reduce the money you spend on SEM; domains that define a category can capture natural type-in traffic. With the right strategy, domains prove their value many times over.
Every domain name is unique; when it’s gone, it’s gone. Securing your business or personal domain name should be one of the first things you do online for Branding, Marketing & Sales.
If you require help securing a domain name for your business or to check out our stable of branded domains, call us today 1.866.259.2483 or drop us a line, we would be happy to help.
There’s rumor of another Panda update on the way, so it might be a good time for all web designers and programmers to do an in-depth website audit. The purpose: to identify and rectify any potential problems which may conflict with the coming update.
The previous Panda applications to the search index made a couple of general changes to the way the algorithm works. One factor which got the axe: the Google index has become more stringent about the presence of scraped content. This hit the content farms hardest initially, but it also managed to scoop up a number of innocent site owners in a roundabout way. The scenario played out like this: a website owner publishes unique content, relevant to their niche. Keyword-rich, interesting and informative, it quickly begins to climb the search rankings in its niche. Along comes a scraper or aggregator from a major mashing site. Suddenly, you realize your original content has been dropped and the aggregator has essentially taken your place. It’s a scenario which has bitten creative owners in the backside for a long time, but it became a pronounced issue with the introduction of the Panda update. Legitimate creators were being bounced from the index along with the scrapers in a wide-reaching effort to reduce web spam. It opened the door to a long and arduous process of having your website and its content reconsidered for inclusion in the index.
Another variable which Panda introduced ties into the first: it began to affect affiliate ecommerce websites. What happened was that all of the affiliates had been given permission to use the same content as the original seller, and in doing so they basically signed their own warrant for the Panda bot. The net effect was the same as previously discussed: legitimate business owners found they were being removed from the index where previously they had been able to make a comfortable living.
In the end, when you’re preparing for the upcoming Panda update, be sure to take a good, hard look at your content. Do your due diligence and take every step to make sure your content is as unique and informative as possible for your market, to limit any accidental removal from the index. If you don’t do your homework, the only one to blame in the end is yourself.
If you own a website of any kind, and you pay attention to the traffic coming to and navigating it, you may discover that traffic is not flowing naturally through your pages. For example, a potential visitor arrives at your site, but upon not finding the information they were looking for quickly or efficiently, they leave your site and head to a competitor. Another issue you might discover is heavy traffic on pages which contain lots of images related to your market, followed by those same images popping up on various other sites around the web.
There are a number of ways you can direct traffic on your website, the easiest of which is building an easy-to-understand navigation menu, highly visible on the page, to help drive visitors where you would like them. Another method you can use to direct visitors to your unique content is to sculpt your traffic flow towards the more popular interior pages of your site which contain more information than your front page. Think of it like setting up a series of traffic signals for the internet that helps people land on the pages they’re really looking for.
In the event you discover your content is being scraped and used here or there on the internet, there are a couple of options easily and immediately available to you. You can contact the site owner and ask them to remove your content, and depending on the severity of the hijacking, you may even be able to leverage the power of the DMCA (Digital Millennium Copyright Act) to help your case. If it’s a repeat offender, a more drastic way to deal with the prying eyes and light fingers is to completely block their IP address from accessing your website. It’s quick, fairly simple to implement and mostly absolute.
Blocking IP ranges can also help you trim your traffic to the customers you’re truly interested in having on your site. For example, with the recent buying frenzy created by Winnipeg’s new NHL team returning to town, the sales website could have essentially blocked all IPs not originating from Manitoba for the day of the sale and reversed the change when the sale was finished. At any rate, that would have cut down on out-of-country ticket brokers getting their hands on tickets they have no intention of using.
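The range-blocking idea above can be sketched in a few lines with Python’s standard `ipaddress` module. The CIDR ranges below are invented placeholders, not real Manitoba geo data; in practice the blocking usually happens at the web server or firewall, and a geo-IP database supplies the real ranges.

```python
import ipaddress

# Hypothetical allow-list for the day of the sale: only clients in
# these example ranges get through. CIDRs here are illustrative only.
ALLOWED_NETS = [ipaddress.ip_network(cidr) for cidr in
                ("142.161.0.0/16", "204.112.0.0/16")]

def is_allowed(client_ip):
    """True if the client's address falls inside any allowed range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETS)

print(is_allowed("142.161.5.20"))  # inside the first range
print(is_allowed("8.8.8.8"))       # outside every range
```

Flipping the logic (a deny-list instead of an allow-list) handles the repeat-offender scraper case from the previous paragraph with the same few lines.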
The most recent effort to introduce a bill aimed at placing the responsibility for policing the internet and its content on service providers has been blocked by Senator Ron Wyden, an Oregon Democrat.
The PROTECT IP Act was laid out and written in such a fashion that it would fall to internet service providers and search engines to essentially censor the internet. The proposed aim was to reduce the flow of business to websites selling counterfeit name-brand products. And while the goal is a noble one, the powers granted to the government over the ISPs and search engines if they didn’t comply with its directives were too far-reaching. Basically, any business could rat out another to the government, who would then turn around and tell the search engines and service providers to block the website. If they didn’t comply, they’d be subject to the whims of the body put in place to oversee their actions.
The largest issue with the bill as it was written: the burden of proof was placed on the accused, not the accuser. In essence, if you wanted to stop a competitor from advertising on the web and placing within the SERPs, all you would need to do is accuse them of infringing on your copyrights. The burden of proof would then fall on the accused, and they would be effectively blacklisted to the corners of the internet.
A strong advocate of the bill had his own take on the necessity of the bill:
“American consumers are too often deceived into thinking the products they are purchasing at these websites are legitimate because they are easily accessed through their home’s Internet service provider, found through well known search engines, and are complete with corporate advertising, credit card acceptance, and advertising links that make them appear legitimate”- Senator Patrick Leahy
It’s easy enough to debate his comment, however, with one simple statement: if it’s too good to be true, it probably is. If you’re looking to buy a Rolex and you stumble upon that “hidden” gem online where you can buy one for a tenth of the retail cost, I would bet you’re buying a counterfeit. Big business has a problem with the counterfeiters mainly because they’re almost entirely fly-by-night operations. They’ll engage in ruthless, cutthroat, black-hat SEO tactics to continually rank above the legitimate brands in the SERPs and gain the visibility. The most consistent way to “win” the counterfeit war is simply to rank above those gaming the system: by investing in your website, investing in organic SEO and, most importantly, investing in your brand’s online visibility.
There are some general misconceptions about SEO which crop up from time to time and often come up when going over the process with clients. Some are extremely valid questions to bring up, while others receive ambiguous answers because the landscape changes every day.
Discussion points like “Why do we need to wait while building backlinks to our site?” tend to come up. First of all, building quality backlinks to your website takes time; secondly, if you were to go the shady route and buy thousands of links to boost your PageRank, it’s a very quick way to get the search engines’ attention. And not in a good way!
“Why should I pay you every month when this other guy says he can do the same for a one-shot job?” This is probably the largest misconception about the SEO industry and one of the hurdles we meet in dealing with new clients. The biggest reason you can’t do just a once-over and expect the results to carry on forever is that the internet doesn’t shut off. It doesn’t stop, it doesn’t sleep, it’s always changing. And in order to compensate and keep up, the search engines do exactly the same. They change their algorithms, tweak the results and shift the rankings on a weekly, and sometimes daily, basis. Upkeep is absolutely essential to remain competitive in search engine optimization, and someone telling you they can plant you firmly at the top for a one-time cost of $200 is yanking your chain.
“SEO doesn’t seem so bad; I’m sure our techs can do it here.” This is perhaps the most closed-minded statement to be encountered. I’ve written about it here on the blog before: pick the right horse for the course. When you’re building a new website, contract a web designer. When you’re adding basic information to your site or updating information, use your techs. When you want to bring your brand and website up in the rankings, use a search engine optimization expert. Assuming the tech who does your database scripting can do your optimization for you is money lost at best. At worst, they try to shortcut your site and you get kicked from the index for breaking a rule or two.
There’s a new Google search results page being tested out in the wilds of the internet. Reports vary at present, but there seems to be a central theme to the different layouts discussed. The most consistent trend being reported in the new pages is… more white space in the results.
It doesn’t sound like a game-changing shift for search, but in reality it very much is. Millions of people use the internet every day to do a myriad of things: searching, playing games, writing stories and blogs, or researching who knows what. Most users have a 17-inch monitor or larger and a resolution to match. As strange as it may seem, monitor size and resolution also play into the new search results page and how it may affect your search ranking.
Simply defined, white space is the amount of blank space between elements on a web page. By adding more white space to the search results pages, Google has effectively lengthened the page, meaning that to get to number 10 of the top 10 results, you have to scroll down. Just as in the real world, location is everything when it comes to search results. If you’re not in the top 10, you’re not getting the views you need to be competitive, as a very high percentage of search users never click through to page 2, let alone page 5, of their results. Analytics bear this out as well: most users don’t even scroll down to the bottom of the top 10 of their searches!
At present, most users see the top 5 or 6 results on their search results page. If Google decides to go live with this added white space, you might only see the top 4, or even 3, depending on your monitor and resolution. If you were happy and content sitting at number 5 or 6 in your niche, it may be time to take a long, hard look at your current site to see if you can kick-start some forward gains. Being in the top 3 is becoming more and more important when it comes to being found.
The Panda update has been out for a little while now, and while some users have reported a rankings decline, it seems that for the most part, if your site wasn’t being scraped of content, you’re doing just fine. But the underlying point is that Google is going to adjust the algorithm again. They’ll tweak, tune and make mistakes. They have thousands of employees; it’s difficult enough when you have a team of 10 working together, let alone a small town of people making adjustments.
So what is next on the chopping block for the Google bot? Only Google can tell you, but there are definitely points up for grabs. One of the most likely candidates for demotion in search is the bogus blog post full of anchor text pointing out to different sites. We’ve all run into them at some point, usually when you’re looking for information on how to change a sink tap or what type of air conditioner to buy this coming summer. You arrive at a blog post with a dozen different anchors in it that doesn’t really tell you anything concrete about your search topic. They still come up rather prominently in the SERPs, but it wouldn’t be too surprising to see them culled in the next big update.
Other points I’ve seen discussed are links placed in the footers of websites and site stat counters. Both of these, as of this writing, have had their link weight devalued already, but that doesn’t mean the search gods won’t turn them down some more. Footer links are great to use as internal site navigation, and if you’re honest with yourself, a visible stat counter on your website is gaudy at best.
The last point of issue is that some website owners are reporting vast numbers of erroneous links pointing to their websites from domains of ill repute. Because the search engines use backlinks as one of the metrics by which they rank websites in search, this matters. The problem stems from the fact that you can’t fully control who links to your website. You can actively search for the backlinks pointing to your website, and if you don’t like where they’re coming from, you can ask for them to be removed. It’s a double-edged sword which needs to be monitored; no one is immune to the Google ban hammer. Just ask JCPenney…
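Part of monitoring those backlinks is verifying that a page reported as linking to you actually does, before you fire off a removal request. The sketch below does the verification step with Python’s standard library; the sample HTML and the `example.com` domain are placeholders, and in practice you would fetch the reported page first and feed its HTML in.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkFinder(HTMLParser):
    """Collect every anchor href on a page so a reported backlink
    can be confirmed before you ask for its removal."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def links_to(html, domain):
    """Return the hrefs on the page that point at the given domain."""
    finder = LinkFinder()
    finder.feed(html)
    return [h for h in finder.hrefs if urlparse(h).netloc.endswith(domain)]

# Hypothetical snippet of a page reported as linking to our site.
page = '<p>Great post! <a href="https://example.com/widgets">widgets</a></p>'
print(links_to(page, "example.com"))
```

An empty result means the reported link is already gone (or never existed), saving you an unnecessary removal request.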