
SEO: Beware of the dark side

21 Sep

Search engine optimization offers benefits for those who understand its nuances and dangers for those who don’t.

Author: Lamont Wood

Computerworld – Being at the top of a search engine results page can mean the difference between business success and failure. So, what would you do to ensure a listing there?

Absolutely anything?

If so, you could be walking into a minefield.

Search engine optimization (known as SEO) involves actions intended to get your page listed higher on a search engine results page. In the past 15 years, SEO has evolved into a complex art, one that is now the foundation of many businesses.

The problem is that there are ways of trying to improve your standing that are considered legitimate by the search engine companies like Google, but there are also methods that can get you into trouble. Google (which receives 90% of the world’s search engine traffic, according to StatCounter, and 65.4% of the U.S. market, according to comScore) does not appreciate being gamed — and will retaliate.

Just ask $17 billion retailer J.C. Penney, which got caught using black-hat (i.e., illegitimate) methods to boost its search results during the 2010 holiday shopping season. Penney was accused of taking part in a so-called link scheme, probably the most complicated black-hat SEO technique.

“Our high [search engine result] rankings were pushed down,” Darcie Brossart, Penney’s vice president of communications in Plano, Texas, confirmed concerning the sanctions Google imposed. “We have terminated our relationship with our former natural SEO firm. We don’t know how it happened. We did not authorize it, and we were not involved.”

It’s important to recognize if your SEO firm (or your in-house Web expert) is venturing too close to the edge of the black-hat cliff — because if Google or other search engines find there is some hanky-panky happening, it’s your site that will suffer. “I’m not saying everyone is doing it, but it’s not unusual,” says Vanessa Fox, former Google Search employee and author of Marketing in the Age of Google. “A company might hire an SEO firm without knowing a lot about SEO, or they might think it’s not risky,” she adds. (Google publishes advice for those considering hiring SEO firms.)

A good grounding in what the major search engines do and don’t consider acceptable can help companies avoid these issues. What follows are some of the techniques that are considered legitimate — and not so legitimate — and how you can tell the difference.

But first, let’s take a quick look at how Google ranks sites.

Google’s secret sauce

The foundation of Google’s trademarked site-ranking technique, called PageRank, is links, explains spokesman Jake Hubert. PageRank is based on the number of outside Web pages that link to a page, the number of pages that link to those pages, and so on.
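The recursive link-counting idea is simple enough to sketch in code. The following is a simplified, hypothetical illustration of that principle only; Google's actual algorithm is secret and, as described below, far more elaborate:

```javascript
// Simplified sketch of the link-based idea behind PageRank.
// The three-page "web" below is invented purely for illustration.
function pageRank(links, damping = 0.85, iterations = 50) {
  const pages = Object.keys(links);
  const n = pages.length;
  // Every page starts with an equal share of rank.
  let rank = Object.fromEntries(pages.map(p => [p, 1 / n]));

  for (let i = 0; i < iterations; i++) {
    // Base rank that every page receives regardless of links.
    const next = Object.fromEntries(pages.map(p => [p, (1 - damping) / n]));
    for (const page of pages) {
      const outbound = links[page]; // assumes every page links somewhere
      for (const target of outbound) {
        // A page splits its current rank evenly among its outbound links,
        // so a link from a highly ranked page is worth more.
        next[target] += damping * rank[page] / outbound.length;
      }
    }
    rank = next;
  }
  return rank;
}

const ranks = pageRank({ A: ["B"], B: ["C"], C: ["B"] });
console.log(ranks);
```

Running this on the three-page example gives B the highest rank, since two pages link to it, which is the intuition behind link-focused SEO.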

But while links remain a major consideration in PageRank ratings, Google’s techniques have evolved since the search engine was launched in 1997. The company now ranks pages with an algorithm that has about 200 factors, Hubert says. These factors are adjusted on a daily basis; he says he counted about 500 changes in the past year.

Further details about the algorithm are not made public, Hubert says. A public version of an individual page’s PageRank rating is displayed by Google Toolbar. However, those ratings sometimes need to be taken with a grain of salt, as we’ll see later in the story.

At Bing, a Microsoft spokesman would only say that the site uses upwards of a thousand signals when deciding search results ranking, and that the nature and weight of the signals are constantly being adjusted. Meanwhile, Yahoo Search has announced that it will use the Bing search engine.

Because the algorithms remain a closely guarded secret, many white-hat SEO techniques focus on the known element — links — and are aimed at getting other sites with high PageRank ratings to link to lower-ranked sites.

White-hat SEO

There are a lot of experts willing to give advice about how to legitimately get your site higher in search engine rankings. When you boil it down, what they’re all saying is that the most important thing to do is to build a good site.

“The goal is to be relevant to the user — and then think about search engine strategy,” says Chris Koller, president of IdealGrowth, a digital advertising agency in Dallas.

“Have a differentiator that makes your site compelling and unique,” adds Maile Ohye, a Google developer advocate who liaises with webmasters. “Design it so that users can do what they need to do. Make it accessible to Web crawlers, so they can follow links through the site. And then develop buzz about your site.”

“You want to have the most compelling content so people will be inclined to link to you naturally,” says Doug Pierce, marketing strategist at Blue Fountain Media in New York. “Also, the URL structure of your site should make sense and have keywords in the page titles.”

There are other strategies that can help. Rand Fishkin, CEO and co-founder of SEOmoz, a Seattle-based SEO software firm, suggests using an interlocking array of online marketing, public relations and brand-building activities designed to find the right audience. These methods include producing data-rich blogs of genuine interest to the readers, real conversations on social sites, news bulletins and interesting tweets. The use of infographics, podcasts, webinars, white papers, videos, forums and referring links should not be overlooked, he adds.

Fox has offered a variety of suggestions in blogs and other venues, including the use of text that is not built into graphics or JavaScript routines where a search engine can’t see it, the use of descriptive text to accompany videos, and the descriptive use of HTML metatags. Avoid having multiple sites with the same content, since the search engines don’t like that, she suggests. Instead, use redirection to cover multiple variations of the site’s name. If there is a FAQ, design it around keywords from questions that users have a history of searching for in your topic, as determined through tools like Wordtracker.
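In markup terms, several of these suggestions come down to a well-built page head. A hypothetical example (the site name and URLs are invented for illustration, and the canonical link is one common way of consolidating duplicate variations of a page):

```html
<head>
  <!-- A descriptive title and metatag that crawlers can read,
       unlike text baked into graphics or JavaScript -->
  <title>How to Repot an Orchid | Example Garden Supply</title>
  <meta name="description" content="A step-by-step guide to repotting orchids, with photos and a video walkthrough.">
  <!-- Point duplicate variations of the page at a single preferred URL -->
  <link rel="canonical" href="http://www.example.com/orchid-repotting">
</head>
```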

A too-optimistic outlook?

Critics complain that such approaches work mostly for topics that are of interest to bloggers, who link to the material. These are the kind of noncontroversial links that boost PageRank ratings. But more obscure topics, while important in their own fields, develop no following from bloggers and so get no links.

There’s no getting around the fact that, at least temporarily, black-hat techniques can work — much to the disgust of more legitimate SEO advisers who are trying to get their sites higher in the search rankings. “Frustration among white-hat SEOs about manipulative sites outranking them is total,” says Fishkin. “By the time Google catches up with one site, there’s a new one that outranks you.”

Aaron Wall, founder of SEO Book, a website devoted to SEO training, says that conventional white-hat methods (he calls them “vanilla methods”) do work, but he admits that they work best for sites with little competition, or for large, established sites with high relevancy.

Even then it may take several years and large sums of money to rise to the top, he warns. For sites in competitive fields, the backers must assess what portion of the desired traffic that site is attracting and what portion can realistically be obtained, and gauge their SEO efforts accordingly. There is no simple answer, he says.

“You can succeed without spam. It will take longer and it will be more expensive, but the trade-off is that it should not all come crashing down,” Wall says.

Black-hat methods

J.C. Penney was accused of involvement in a so-called link scheme, but other established black-hat techniques include content schemes, cloaking, and outright hacking. But whatever they’re called, all black-hat methods share one basic attribute: They are based on gaming the system, with no thought of providing any benefit to the site’s visitors.

Anyone investing in SEO services must learn to recognize that difference, says Fishkin. “There are many fantastic people in the SEO field, but there are also dilettantes and outright scammers,” he notes.

Link schemes

Formerly, Google’s PageRank ratings could be gamed using links placed in comment spam. “Between about 2004 and 2007, people used to be able to leave comments on blogs on other sites with links to their own sites, to build up links quickly,” explains Wall.

In response, starting in 2005, Google promoted the “nofollow” attribute for link coding. Nofollow links are excluded when calculating PageRank ratings. Nofollow was subsequently adopted by blogging systems like WordPress and Movable Type for links inserted within comments, cutting off that source of links, Wall says.
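In markup, the difference is a single attribute on the anchor tag; blogging systems add it automatically to links left in comments (the URL here is a placeholder):

```html
<!-- An ordinary link: counted when Google calculates PageRank -->
<a href="http://www.example.com/">Example Site</a>

<!-- A nofollow link: excluded from PageRank calculations -->
<a href="http://www.example.com/" rel="nofollow">Example Site</a>
```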

Webmasters also used to trade links to mutually boost their PageRank ratings, but now they are supposed to use the nofollow attribute in those links, Wall adds. Consequently, such links are no longer of any benefit. (Advertisements on a page don’t get counted as links, as they are typically linked to a click counter.)

Because they are no longer able to use linked comments and traded links, webmasters seeking to maximize their PageRank rating have two options: offer compelling content that other sites (especially blogs) will spontaneously link to — or pay other sites to link, like J.C. Penney did. The latter is a direct violation of Google’s guidelines.

When done by amateurs, paid links are easy to spot, says Fishkin. “When you see a dentist’s site with links to student credit card loan sites, you can assume the guy is getting a couple of hundred bucks a month,” he says.

In order to get around this, businesses called link farms arrange paid links from established sites with respectable public PageRank ratings, since those links carry more weight.

The most successful link farms don’t operate openly. “You just have to know about them,” notes Fox. “But often they have been discovered by Google. They continue operating, but their links are not valued by the search engine.”

Several reputed link farms were approached for comment but none responded — except one, whose spokesman said, off the record, that it was an advertising firm and not a link farm, and then hung up. Another announced in a blog that it was getting out of the link business.

Another problem with the link farm business is that it is based on the public PageRank ratings of the linking sites. But Google’s Ohye says that public PageRank ratings are actually manipulated by Google to make it harder to game the system. While Google updates the public PageRank database every few months, the public ratings of pages known to be selling links are never updated, rendering their PageRank rating meaningless, she explains.

Anyway, Ohye says, Google’s algorithm does not even use the public PageRank ratings to decide how to rank search results. Instead, it uses a completely different, nonpublic database, whose values (fractional numbers rather than a zero to 10 scale) are updated continuously.

As for how this works in practice, Ohye uses the hypothetical example of a news site with a breaking story whose owners find that a lot of other sites are linking to the page with that story.

“They might decide to try to sell links from that page, since it has PageRank,” she says, “so we may decide that the rest of the site is good, but we are unsure about the links from that particular page. So we adjust the database so that links from that article don’t propagate PageRank. So many link buyers are getting nothing.”

Content schemes

The first search engines rated pages purely on their content, spawning keyword-stuffing schemes in which selected words were added to a page to suggest that it was about a popular topic. The added words were then hidden using various coding tricks, such as using white text on a white background or indenting the text so far that it wouldn’t appear on the screen.
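In practice, the stuffing amounted to markup like the hypothetical fragment below, shown only to illustrate what the search engines learned to detect:

```html
<!-- White text on a white background: invisible to users, visible to crawlers -->
<p style="color: #ffffff; background-color: #ffffff;">
  cheap flights cheap hotels vacation deals cheap flights
</p>

<!-- Text indented far off-screen so it never appears in the browser -->
<div style="text-indent: -9999px;">more stuffed keywords here</div>
```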

A variant of this is called cloaking, where the search engine is shown one thing (usually text) and the user is shown something else (usually ads). There are also screen scrapes, which are spam sites composed of material copied from other sites just to draw traffic for ad revenue.

“Do any of these things, and you will probably get caught,” Fox warns. “Once Google has found a site that uses one technique, they can use that knowledge to find all other sites that use that technique.”

Site hacking

Finally, there is outright hacking, which involves taking over sites with poor security for use in linking schemes. Ohye says that hacking appears to follow two-year cycles, as new techniques erupt and are then brought under control with security patches. Currently, things are under control, she says.

Crime and punishment

There are risks in using these methods in order to boost your rankings. J.C. Penney got off lightly.

“The likelihood of being caught in a few days is low, within six months is pretty good, and within two to four years it’s nearly impossible to avoid,” Fishkin says. “I have seen black-hat techniques that worked for multiple years, but no one who used a particular technique in 2007 is using that technique today. It takes Google a while to catch up, but it does catch up.”

As a result, “I get a call about once a week from firms that have tried black-hat methods and their ranking got hurt when Google found them out,” Fox says. “If you were really egregious and not trying to build content at all, you will be yanked out of the index entirely. If there is actual value in your site, you will be demoted in ranking. You must fix the problem and then file a request with Google to get your ranking back. It may take some time to do that.

“It can put you out of business,” Fox adds. “I have seen it where the traffic drop-off was so severe that by the time they fixed it, they had had to lay everyone off. In other cases, they saw from the start that it would take too much investment to fix the problem. But most of the time you will see the site come back — it can be done in a few weeks.”

Getting reinstated can be a Kafkaesque process, Fishkin notes, since webmasters are not informed of the specific complaint against them.

Google’s Ohye confirms that this is intentional. “Since we are trying to protect our algorithm, we cannot tell you that you have not done X, since you could be a spammer trying to find out where the line is,” she says. But Google does give webmasters as much information as possible if their site has been hacked, she adds.

Gray-hat SEO: The Twilight Zone

Between white-hat and black-hat SEO lies the gray-hat area, where questionable practices may look very similar to acceptable practices.

“The difference comes down to your intent and to the technical application,” says Fishkin of SEOmoz. “If you follow the search engines’ guidelines and you intend to provide value to your visitors, you will generally be in the white-hat world.

“Suppose I say that in order to enter my contest to win a new iPad, you need to re-tweet my link. That is common,” continues Fishkin. “But if I say that you need to link to my Web page from your blog in order to enter, Google might consider that to be against their guidelines, since it is a financial or in-kind reward for linking.

“Or I might go to a magazine and say that I want to buy some ads on its site for $3,000 a month, and the magazine thinks that’s wonderful,” Fishkin explains. “But then I say that the links from the ads back to my site can’t go through ad redirection — they must appear to be editorial links. This is technically against the rules. Unfortunately, it does happen quite a bit, but in the last few years ad departments have been getting savvy about it.”

A third gray area involves a variant of a cloaking technique that Fishkin calls faceted navigation. Faceted navigation means giving every user a customized page. There is nothing wrong with that if it is done to help the users, says Fishkin, but if you add a page intended purely to impress search engines, that falls into the gray area.

“It has gotten people into trouble,” Fishkin says.

But the most visible gray technique is undoubtedly content farming, where a site posts large numbers of hastily written pages covering popular search topics, often containing little real information.

Google responded to content farming in February when it announced a change in its algorithm, called the “Farmer Update,” that would reduce the PageRank ratings of sites that consist of little more than general text written around a search phrase, while improving the ratings of sites with original content and useful information.

“It did hurt most of the obvious content farms,” says SEO Book’s Wall. “But tens of thousands of sites were impacted, including structures — especially some e-commerce sites — that were not bad sites but had certain characteristics in common with content farms. Algorithms have false positives, and the Web is so huge that an algorithm can’t always be right. Three months later, none of the sites that were hit have recovered. We are still on uncertain ground and can’t say, ‘Here is the footprint of the change, and this is how you get out of it.’ ”

One of the leading owners of content farms, Demand Media in Santa Monica, Calif., told Reuters in May that search engine referrals to its eHow site fell 20% after the change, and overall page views fell 12%. According to the article, it announced plans to switch over to higher-quality content, with commissioned articles from professional writers.

Look at the long term

In the end, white-hat SEO is, for most sites, the most practical way to assure yourself of long-term success in rankings.

But that may not be as limiting as it sounds. If you eliminate the obvious black-hat and gray-hat approaches, “white-hat SEO is nearly every other marketing, branding and traffic-growth activity or operation on the Web, including millions yet to be discovered,” says Fishkin.

“If what you are doing requires things like expensive research, or building and leveraging real human relationships, and reflects those relationships, then it is generally considered white-hat,” adds Wall. “White-hat concepts will stay the same, but black-hat methods have to change constantly” in order to stay ahead of Google.

“You might get minimal short-term results with black-hat methods, but the penalization of your website will outweigh that in the long term,” Koller agrees.

For the original article go to:

How To Optimize Your Website for Local Search

9 Mar


Follow this guide to maximize your visibility in regional markets.

Author: Jon Rognerud

This article has been excerpted from Ultimate Guide to Search Engine Optimization, Second Edition by Jon Rognerud, published by Entrepreneur Press.

Local search engine optimization can be just as time consuming and competitive as “regular” SEO. The same rules apply — you need to have good content and quality links. However, the tactics are slightly different in specific areas.

Local search is essential to small businesses. In 2010, Google revealed that one in 13 of its search results pages shows a map. A few months later, Google replaced its Local Business Center (LBC) with Google Places, which enables businesses to communicate with customers as well as supplement their Google profile information with hours of operation, photos, videos, coupons and product offerings. We can assume Google is serving maps more than 1 billion times a month.

With all of this in mind, here’s what you need to know to successfully tap into local markets.

Where to Begin
List your business in Google Places — it’s free. Watch the training videos and explore the features, including tools that track actions (meaning how many times users showed interest in your business listing), clicks for more information on maps, driving directions or direct clicks, as well as impressions (how many times users saw your business listing as a local search result). As you’ll see, it will be important to get ratings and references, too.

While much focus is placed on Google Places, don’t forget to also register with the other major local listing services.

Verify Your Business
One easy way to find out if your business is listed anyplace online is to search for your brand name. Include the city or locale you are supposed to be listed in.

If you are not listed, take action. For instance, if I’m a tax attorney in Beverly Hills, California, I would search for “tax attorney Beverly Hills.” I’d see the top local results (just below Google’s paid search results) as well as a local map on the right, hovering over more paid listings.

When I click on one particular local result, I notice that it has not been verified, meaning it does not include a “verified business owner” link. An arrow points to “Business owner,” indicating it needs verification.

Google pulls the data on this result (address, phone number) from some of the larger business aggregators like infoUSA, and attempts to match it up correctly. However, that data could be wrong. If that’s the case, it would be important for this particular owner to take corrective measures. This is why it’s important to verify your business information in local search results.

Select Your Categories
When registering, make sure to assign your business to the listed categories that best describe it. You can add up to five categories. Once you start typing, Google Places will display related categories.

Continue to add as much information as possible, including hours, payment types, e-mail address, phone number, URL/web address, photos, videos and coupons. Fill out each field, if appropriate.

When you’re done, make videos, upload them to YouTube and link them back into your local profile on Google Places. With a free screen-capture tool, it’s easy to make an informational, useful video; screen cam your PowerPoint presentations, for example. You can include up to 10 pictures and five videos.

Get Listed in Local Directories
Obtaining citations from local business directories like Yelp and Merchant Circle can be a powerful tool to get exposure and drive traffic. (See the list of directories in the Local SEO resources sections below.) Make sure that all your information is correct, and keep the same formatting across all locations.

Ask for Reviews
Don’t be afraid to ask customers for reviews. Offer special incentives and discounts for return visits to your office. You can also add a postcard or business card into your office invoice mailings asking your customers to review the visit and talk about the experience.

However, don’t unwittingly spam this system by asking all your friends to review you in a week. You should also get references from the Better Business Bureau, your local chamber of commerce and the top local directories.

For the original article go to:

10 SEO Tips to Remember When Building Your Site

2 Mar


Author:  Mark Cronin

Let us assume that, just like everyone else, you are building a website–after all, the Web is where it is all happening now. As soon as your website goes live–and especially while you are still in the design and development phase–you need to make sure your site’s content will be found through search engines such as Google, Yahoo!, and Bing, where many people go to look for information.

Whether you’re starting a blog or deploying an e-commerce solution for your clients, it’s a good idea to keep in mind some good web development practices that will enhance your chances in search rankings.

This article follows up on a previous Six Revisions post called 9 Ways To Improve the SEO of Every Website You Design, sharing with you a few more tips for improving the search engine optimization, semantics, web accessibility, and interoperability of your web designs.

1. Use Flash Wisely

Don’t call me patronising–it goes without saying–but try not to use Flash where HTML/CSS should be used. Flash has its place on the Web: it’s great for interactive components such as sophisticated learning games with audio and video, and for 3D animation. But it’s better used for components of an HTML/CSS site than as the technology that powers the entire site. For example, check out Kongregate, a popular social Flash gaming site. Although Flash is its bread and butter, Kongregate still uses HTML, CSS, and server-side scripting to power its site’s functionality.


Flash can be made SEO-friendly and web accessible; however, doing so is more difficult than working with open web languages like HTML, CSS, and JavaScript.

2. Use Gracefully-Degrading JavaScript for Hidden Content

If you are hiding and showing content on a web page, it’s best to use JavaScript so that the content is still within the web page’s markup. This is not only good for web robots such as search engines, but also great for those that use screen readers.


A good test to see whether hidden content on a web page is visible to web crawlers is to temporarily disable JavaScript and check whether you can still read the hidden content. You can do this from within your browser settings, but more conveniently, you can use a browser plugin like Web Developer Toolbar.
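A minimal sketch of the pattern, with hypothetical element names: the content ships in the markup and is hidden only after the script is known to be running, so crawlers and screen readers always see it:

```html
<h3 id="faq-question">What is your return policy?</h3>
<div id="faq-answer">Returns are accepted within 30 days of purchase.</div>

<script>
  // Hide the answer only once JavaScript is confirmed to be available;
  // with scripting disabled, the content simply stays visible.
  var answer = document.getElementById('faq-answer');
  answer.style.display = 'none';
  document.getElementById('faq-question').onclick = function () {
    answer.style.display = (answer.style.display === 'none') ? 'block' : 'none';
  };
</script>
```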

3. Name Your Image Files Accurately

It is easy to forget how important images can be for SEO and web design alike. Name the actual file descriptively by giving it a key term (e.g. yellow-banana.jpg, not a random name like img2gtc92.jpg), because this gives your site assets extra context. Make sure that you also give the image’s alt attribute similar key terms and a decent, succinct description; aim to keep it to 10 words or fewer.

In addition, Google Images is another way to get traffic to your site, and if you name your image files well and give them good context through their alt attributes, you will improve your chances of showing up in Google Images results.
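Putting both tips together, the banana example might be marked up like this:

```html
<!-- Descriptive file name plus a succinct alt attribute (10 words or fewer) -->
<img src="/images/yellow-banana.jpg" alt="Ripe yellow banana on a wooden table">
```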

4. Don’t Drown Your Home Page with Links

Internal linking–hyperlinks that point to other web pages on your site–is important, but try not to have more than 150 links on a page, so that you don’t dilute your web pages’ rank.


Too many internal links can overcrowd the page and can also slow down your users’ ability to find the link they need.

5. Don’t Use Redundant Links

Some may think that increasing the value of a particular page involves repeatedly linking to the same page from another page. Search engines will only count the first instance of that link, so there is no need to repeat links. In addition, this is a poor practice that will confuse your users.

6. Deep Linking Can Improve Conversions

Deep links are links that point to internal pages instead of the main/home page. Deep linking is a fantastic way to send power to pages deeper in your site, beyond your home page. It also promotes the exploration of your site by visitors, providing additional points of conversion.

7. Have a Blog

Blogs are a great way to keep building fresh content on your site and to target long-tail key terms. One idea is to pull snippets of blog posts onto related pages, keeping the content on those specific pages fresh.

8. Make Your Brand Obvious

Make sure that your branding is very clear and that your brand name is obvious on your website. This makes it easy for people to remember who you are and increases the likelihood that people will search for your brand name on search engines. Being found by your brand’s name also means that you won’t compete with the generic words people often use in searches (e.g. “Budweiser” versus “American beer”).

9. Use an XML Sitemap

An XML Sitemap follows a standard protocol that helps search engine crawlers gain contextual information about your site’s web pages.


If you’re using a content management system, see if it has an XML Sitemap extension (or built-in feature) that will automatically generate the XML file for you. If not, you can use a tool like XML Sitemaps Generator.
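A minimal sitemap following the protocol looks like the sketch below; the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-03-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/maldives-holidays.html</loc>
    <lastmod>2011-02-15</lastmod>
  </url>
</urlset>
```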

10. Use Anchor Text Accurately for Deeper Pages

When linking through to your deeper pages, keep your anchor text short and precise. It is important to use keywords that search engine bots can relate exactly to your page. For example, if you’re linking to a web page about Maldives holidays, your link should be <a href="/maldives-holidays.html">Maldives holidays</a>. This way people, as well as web robots, easily know what they are going to get.

For the original article go to: