
Friday, December 2, 2011

SEO Tips for Beginners - Optimizing Images


Search engines are advanced enough to find and index images, but they still need help determining what an image actually shows. Eventually search engines will be able to scan an image and work out its subject on their own, but we aren't there quite yet. For now, they need us to tell them what our images are so they can index them into image search for related keywords.

There are a few reasons search engines give boosts to websites that use proper image names and alt tags. The first is that it helps them understand what the image is so they can put it into their image index, where other searchers can find it. The other reason search engines reward websites that optimize images is accessibility. Search engines want your content to be accessible to as many people as possible and will reward those who help them with that goal. If you optimize your images correctly, it helps the blind and users with text-only browsers understand what the image is, so they can get a more complete idea of the page.

Let's say you wrote an article called "Dog Training Techniques" and you have images of all the types of dogs that the particular methods work for. If your images are automatically named something like image1.jpg or postimage2123.jpg it doesn't tell us what the image is about at all. A better and more user‐friendly alternative for a picture of a golden retriever would be golden‐retriever.jpg or goldenretriever‐puppy.jpg. That way both search engines and visually impaired people know what the image is supposed to be even if they can't see it. It might sound like common sense, but give your images accurate names and search engines will reward you.

Note: Use hyphens in your image names to separate words instead of underscores; hyphens are easier for search engines to read as word separators.

The other major image attribute search engines use as a ranking factor is the alt tag. The alt tag holds the "alternate" text shown if an image is unable to load or can't be displayed on the page. With alt tags in place, anyone accessing the page can at least read what the image was supposed to be if it fails to load.

You can use the image name for the alt text unless it doesn't make sense to. A good alt tag for the image golden‐retriever‐puppy.jpg would be "Golden Retriever Puppy." The key with alt tags is to use a short description of what the image is or what it's supposed to represent.
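Putting the two together, here's a minimal sketch of an optimized image in HTML (the file name and alt text are the invented examples from above):

<!-- Descriptive, hyphenated file name plus matching alt text -->
<img src="golden-retriever-puppy.jpg" alt="Golden Retriever Puppy" />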

You can pick up a lot of free traffic just by naming your images correctly. So few website owners name images accurately that you can sometimes rank highly in image search without doing anything else. Easy traffic coupled with a higher search engine score makes image optimization a necessity when it comes to SEO.

Monday, November 28, 2011

SEO Tips for Beginners - On Page SEO Checklist

  1. Title Tag - Use an accurate title tag with your target keyword for that page (this is what shows up as the blue hyperlink in Google search results); keep it to 60 characters or less. A combined sketch of items 1 through 5 follows this list.
  2. Description Tag - Use an accurate, helpful description tag to explain what the page is about (this is what shows up under the blue hyperlink in Google search results); keep it to 160 characters or less.
  3. H Tags - Use keyword-rich H1, H2, and H3 tags to break your page content into sections and subsections.
  4. Image Alt Tags - Use keywords relevant to your images in your alt tags.
  5. Keyword in URL - Whenever possible, pick a domain name with your keyword in it, and make sure your page URLs contain your keywords too. Example: http://www.yourdomain.com/keywords
  6. LSI Keywords - Try not to overuse one keyword; instead, find variations of it that can expand the visibility of your pages to more searchers.
  7. Bold and Italicize - Bold and italicize your keywords in your content, just don't overdo it!
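Here's how items 1 through 5 might fit together on a single page. This is a minimal HTML sketch, and every keyword, file name, and URL in it is invented for illustration:

<html>
<head>
  <!-- 1. Title tag: target keyword near the front, 60 characters or less -->
  <title>Dog Training Techniques for New Owners</title>
  <!-- 2. Description tag: a helpful summary, 160 characters or less -->
  <meta name="DESCRIPTION" CONTENT="Simple dog training techniques for puppies and older dogs, with step-by-step tips for new owners." />
</head>
<body>
  <!-- 3. H tags: keyword-rich headings for sections and subsections -->
  <h1>Dog Training Techniques</h1>
  <h2>House Training Your Puppy</h2>
  <!-- 4. Image alt tag: short description matching the file name -->
  <img src="golden-retriever-puppy.jpg" alt="Golden Retriever Puppy" />
  <!-- 5. Keyword in URL: the page lives at a keyword-rich address -->
  <a href="http://www.yourdomain.com/dog-training-techniques">More dog training tips</a>
</body>
</html>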

Wednesday, November 9, 2011

Why Is Site Structure Important?

Time and time again over the years, I have seen students take their established sites to entirely new levels of traffic, simply by making the right kind of adjustments to the way their sites were structured.

These days, everyone seems to think that SEO is all about getting links. Now I’m not saying that links aren’t important, but in most of these cases, they hadn’t even begun to work on link building.

Most of these increases in traffic have come from improvements we made in the structure of the web site itself - and this kind of result isn’t all that unusual. In fact, site structure is probably the most overlooked and misunderstood aspect of SEO.

While most of your competitors are still trying to use a “sledgehammer” approach, and overwhelming the search engines with massive quantities of inbound text links, you can gain a tremendous advantage by paying attention to how your site is linked together.

Link building certainly magnifies the benefit of a good site structure, but the reverse is also true: good site structure greatly amplifies the benefit of your investment in link building.

There are four primary goals in structuring, or restructuring, a web site:
  1. Improving the user experience is your first goal, because this leads to higher conversion rates, happy customers, etc. If I ever have to choose between creating a good user experience and an SEO objective, I will choose my site’s visitors every time.
  2. Improving the “crawlability” of the site and channeling “link juice” (PageRank at Google, other search engines have their own formulas) into the most important pages – the ones that you’re trying to get ranked in search results. One method we use for this is called dynamic linking.
  3. Increasing the ranking of individual web pages within the site, and “broadening the profile” of our most important pages. By using the “anchor text” of our own internal links, and adding the right links in strategic places, we can boost our own search engine rankings.
  4. Getting more pages into the search engines’ index, also known as “index penetration.” Every additional page that gets indexed adds to our ability to improve our rankings, and in fact makes it easier to get still more pages indexed.
It shouldn’t be terribly shocking that the four stages of the “site structure” step are mapped against these four goals.

Friday, August 19, 2011

Optimize Your Web Content

Search engines don’t read like humans. We actually make sense of individual words and their combinations (phrases, sentences, paragraphs, pages, page hierarchies, etc.), we read between the lines, and we take visual design and aural elements into account. Search engines aren’t that sophisticated, not even Google. In fact, they don’t really process meaning at all; they categorize a site’s subject matter based on the words used most often in the body copy, headings, links, etc. So content optimization is simply the act of using your target keyword phrases frequently on your site, in the places that matter. ‘Target keyword phrases’ are the words your target customers are searching for when they’re looking for your product or service.

When you optimize your website for a particular word, you’re essentially telling the search engines to include you in the results when people search for that word. As a rule of thumb, the more frequently you use your keywords, the more relevant you’ll be considered by the search engines, and the more likely you are to appear in searches for those words.

Sunday, August 14, 2011

Do Not Spam!

It’s almost impossible to spam unintentionally. Search engine spamming usually involves quite a bit of work and knowledge. But just to be sure, here’s a quick look at what you shouldn’t be doing.

What is search engine spam?
- A website is considered search engine spam if it violates a specific set of rules in an attempt to seem like a better or more relevant website. In other words, if it tries to trick the search engines into thinking that it’s something it’s not.

What is On-page spam?
- It’s deceptive content that appears on your website. Here are some examples of on-page spamming.

  • Cloaking - Showing one thing to search engines and something completely different to visitors.
  • JavaScript Redirects - Because search engines don’t usually execute complex JavaScript, some spammers will create a page that looks innocent and genuine to search engines, but when a visitor arrives, they’re automatically redirected to a page selling Viagra, Health Products, etc.
  • Hidden Content - Some webmasters just repeat their keywords again and again, on every page, then hide them from visitors. These keywords aren’t in sentences; they’re just words, and they provide no value. That’s why they’re hidden, and that’s why it’s considered spam. The intent is to trick the search engines into thinking that the site contains lots of keyword-rich, helpful content when, in fact, the keyword-rich content is just keywords, nothing more. These spammers hide their keywords by using very, very small type (1pt font), or by using a font color that’s the same as the background color (a sketch of what this looks like follows this list).
  • Keyword Stuffing - Severely overdoing your keyword density. Try to stick to around 3% keyword density. This is the most reader-friendly density. Usually anything over 5% starts to seem very contrived.
  • Doorway Pages - Page after page of almost identical content intended simply to provide lots and lots of keyword-rich text and links, without providing any genuine value to readers.
  • Scraping - Spammers who are lazy or incapable of creating their own content will steal it from other sites, blogs, articles and forums, then re-use it on their own site without permission, and without attributing it to its original author. The intent is to create lots of keyword rich content on their website, and trick the search engines into thinking their site is valuable, without actually doing any of the work themselves.
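For recognition purposes only, here's roughly what the hidden-content trick looks like in markup. The keywords and colors are invented, and using anything like this will get a site flagged as spam:

<!-- Spam: a keyword list in 1pt type, white text on a white background -->
<div style="color:#ffffff; background-color:#ffffff; font-size:1pt;">
weight loss weight loss tips weight loss guide weight loss strategies
</div>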

Tuesday, August 9, 2011

How To Resolve Duplicate Content Problems

You have more than one version of any page - Multiple versions of the same page are clearly duplicate content (e.g. a print-friendly version and the regular display version). The risk is that Google may choose the wrong one to display in the SERPs.
  • Solution: Use a nofollow link to the print-friendly version so that Google’s bots don’t follow that link and crawl it (note that this alone doesn’t guarantee the page won’t be indexed if other links point to it). The HTML of a nofollow link looks like this: <a href="page.htm" rel="nofollow">go to page</a>
  • Or use your robots.txt file to tell the search bots not to crawl the print-friendly version (a sketch follows).
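As a sketch, assuming your print-friendly pages live in a /print/ directory (an invented convention; adjust the path to match your site), the robots.txt rule would look like this:

User-agent: *
Disallow: /print/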
You reference any page with more than one URL - Even though there’s really only one page, the search engines interpret each discrete URL as a different page. The usual cause of this problem is that no canonical URL has been specified. A canonical URL is the master URL of your home page, the one that displays whenever your home page displays. For most sites, it would be http://www.yourdomain.com. To test whether your site has a canonical URL specified, open your browser and visit each of the following URLs:
    • http://www.yourdomain.com/
    • http://yourdomain.com/
    • http://www.yourdomain.com/index.html
    • http://yourdomain.com/index.html
If your homepage displays, but the URL stays exactly as you typed it, you have not specified a canonical URL, and you have duplicate content.
  • Solution: Choose one of the above as your canonical URL (it doesn’t really matter which one), then redirect the others to it with 301 redirects; a hedged server-config sketch follows this list. Read more here: 301 Redirects.
  • Specify your preferred domain in Google Webmaster Tools (you have to register first). To do this, at the Dashboard, click your site, then click Settings and choose an option under Preferred domain. This is the equivalent of a 301 redirect for Google. But it has no impact on the other search engines, so you should still set up proper 301 redirects.
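For illustration, if your server runs Apache and honors .htaccess files (an assumption; other servers use different syntax), a 301 redirect from the bare domain to the www version might look like this:

RewriteEngine On
# Send yourdomain.com requests to www.yourdomain.com with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]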
Someone has plagiarized your content - If someone has plagiarized your content, Google may mistakenly identify their plagiarized version as the original. This is unlikely, however, because most webmasters who plagiarize content are unlikely to have a very credible, authoritative site.
  • Solution: You can contact the offender and ask that they remove the content, and you can also report the plagiarism to Google (http://www.google.com/dmca.html). You can also proactively monitor who’s plagiarizing your content using Copyscape.
You syndicate content - If you publish content on your site and also syndicate it, your site’s version may not appear in the SERPs. If one of the sites that has reprinted your article has more domain authority than yours, their syndicated version may appear in the SERPs instead of yours. Also, other webmasters may link to the syndicated version instead of yours.
  • Solution: One way to try and avoid this situation is to always publish the article on your site a day or two before you syndicate it. Another is to always link back to the original from the syndicated copy (a sketch follows). Whatever the case, the backlink from the syndicated article still contributes to your ranking. You just may not get as much direct search-driven traffic to the article, which really isn’t the point of content syndication anyway.
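The link back from the syndicated copy is just an ordinary anchor pointing at the original; as a sketch, with an invented URL:

<p>This article was originally published at
<a href="http://www.yourdomain.com/original-article.htm">yourdomain.com</a>.</p>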

Monday, August 8, 2011

Avoid Duplicate Content In Your Website/Blog

You have duplicate content when:

  • You have more than one version of any page
  • You reference any page with more than one URL
  • Someone plagiarizes your content
  • You syndicate content
And it’s a problem for two reasons:
  1. Duplicate content filter - Let’s say there are two pages of identical content out there on the Web. Google doesn’t want to list both in the SERPs, because it’s after variety for searchers. The duplicate content filter identifies the pages, then Google applies intelligence to decide which is the original. It then lists only that one in the SERPs; the other one misses out. The problem is, Google may choose the wrong version to display in the SERPs. Note that this is a filter, not a penalty; there’s no such thing as a duplicate content penalty.
  2. PageRank dilution – Some webmasters will link to one page/URL and some will link to another, so your PageRank is spread across multiple pages, instead of being focused on one. Note, however, that Google claims that they handle this pretty well, by consolidating the PageRank of all the links.

Thursday, August 4, 2011

The Truth About Rewriting Dynamic URLs to Static URLs

When a site’s content is called from a database, its URLs are normally generated dynamically. You can tell a URL is dynamic because it’ll have characters like “?”, “=” and “&” in it. This is typical of sites that use a Content Management System (CMS), including blogs. Example:
http://www.website.com/main.php?category=books&subject=biography

Static URLs, on the other hand, are tied to their content, and are generally a combination of the page’s filename and directory location. Example:
http://www.website.com/projects.htm

The 3 main problems with dynamic URLs are:
  1. They can lead to duplicate content issues.
  2. Search engines can have trouble reading them properly.
  3. They reduce click-throughs from search engine results: they’re harder to remember, share and write down; they’re easily clipped; they’re often not keyword-rich; and they often give readers no clue about what to expect at the destination page.
These issues can be overcome by rewriting your dynamic URLs in such a way that they become static. For example, the following dynamic URL:
http://www.website.com/main.php?category=books&subject=biography

Could be rewritten to become the following static URL:
http://www.website.com/books-biography.htm
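Under the hood, this kind of rewrite is typically done by the web server, for example with Apache’s mod_rewrite (mentioned below). As a rough sketch, not a drop-in rule, this maps the static-looking URL back to the real dynamic one:

RewriteEngine On
# /books-biography.htm is served internally as main.php?category=books&subject=biography
RewriteRule ^([a-z]+)-([a-z]+)\.htm$ main.php?category=$1&subject=$2 [L]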

Unfortunately, static URL rewriting is not without risks of its own. If done incorrectly, it can cause Google problems crawling and indexing your pages. Google now outright advocates dynamic URLs:
“Providing search engines with dynamic URLs should be favored over hiding parameters to make them look static.” Of course, it’s important to remember that Google’s a public company, answerable to shareholders. Its ability to crawl and index dynamic URLs better than its competitors is a significant competitive advantage, if leveraged. My advice is, if you’re using a CMS that doesn’t offer trustworthy dynamic URL rewriting, stick with dynamic URLs. If, however, your CMS rewrites dynamic URLs very well (e.g. WordPress or any CMS using mod_rewrite), then consider rewriting to static URLs, if it will help your customers and aid your promotions significantly. Rewriting dynamic URLs isn’t likely to have a huge impact on your rankings, so I would avoid it unless I was sure it wasn’t going to cause problems.

Wednesday, August 3, 2011

Check For Broken Links

Broken links are bad for visitors because they convey the impression that your site is not well maintained, and they’re bad for SEO because they can stop the search engine bots from crawling all your pages. Note that Google specifically advises webmasters to check for broken links. It’s entirely possible that Google views broken links as a sign that your site is in poor repair, just as human visitors do, and that the existence of broken links may impact your ranking simply because Google wants well-maintained sites at the top of its SERPs. You can use a tool called Xenu to find broken links. It’s simple to use and the reports are self-explanatory.

Add Titles To Internal Links

HTML links can include a title, which becomes a tooltip when a visitor hovers their mouse over the link. Titles are also read out by screen readers for vision-impaired visitors. Because this aids accessibility and helps reduce visitor disorientation, and because it’s indicative of the content of the destination page, search engines crawl it and it plays a part in how they index the page. Link titles look like this:
<a href="http://www.example.com/products.htm" title="Cheap Products">Products</a>

Monday, July 25, 2011

Optimize Your Internal Links

For search engine bots, text links are like doorways from page to page and site to site. This means websites are generally better indexed by search engines if their bots can traverse the entire site using text links. But there’s more to it than that. Links from top-level pages like the ‘Home’ and ‘Products’ pages carry more weight than links from lower-level pages (e.g. a deep article page). The logic here is that if you link to a page from a top-level page, you obviously want a lot of your visitors to see that link, so it must be key to your subject matter and business model.

Internal links also tell the search engines which pages are important. In other words, if you link to a page again and again, and you use meaningful anchor text, Google will assume that page is a core part of your subject matter and index you accordingly. What’s more, every time you link to a page, it’s passed a bit of PageRank. Link to it enough, and it will become one of your higher-ranking pages as it develops ‘link equity’.

Limit links to fewer than 100 per page. Jakob Nielsen’s advice is to “include links to other resources that are directly relevant to the current location. Don't bury the user in links to all site areas or to pages that are unrelated to their current location.” Place your links prominently on each page. The search engines pay more attention to links toward the top of the page, and visitors prefer prominent links too. Consider adding a nofollow to links that point to less important pages, so that the search engines don’t visit those pages. This increases the relative link equity of all your other pages.
A nofollow looks like this: <a href="page1.htm" rel="nofollow">Go To Page 1</a>

Thursday, July 21, 2011

Optimizing HTML Meta Tags

Within the HTML code behind your page, there are things called ‘meta tags’. These are short notes within the header of the code that describe some aspects of your page to the search engines. Although there is some debate over how important meta tags are to SEO, it’s generally agreed that they shouldn’t be ignored.

Title Tag
Because of its function as the headline of your SERPs listing, the search engines figure it’s likely you’ll make it something fairly relevant to the content of the target page, in order to get people to click through. As a result, they pay more attention to it than to the other tags when indexing your site. Try to use your keyword at least once in the Title, as close to the beginning of the tag as possible. But don’t use it again and again; that’s keyword stuffing, and you could be penalized. You have 66 characters including spaces in which to write a compelling, keyword-rich headline for your listing. The better your title, the more people will click on it. Be descriptive and accurate.

It can also be a good idea to include your company name in the Title. Above all else, this helps develop brand recognition, especially when you rank on page 1, and lends credibility to your listing. E.g.: Effective Weight Loss and Control Tips - Domain.com. And finally, it’s best not to use the same Title tag on every page. It’s supposed to be a headline, compelling searchers to click through to your page. If it’s generic enough to be suitable for every page, it’s not going to be particularly compelling. What’s more, if Google sees duplicate Title tags, it may choose to display DMOZ data instead of your actual tag data.

The Title Tag looks like this:
<title>Effective Weight Loss and Control Tips - Domain.com</title>

Description Tag
Think of your description tag as the copy for an ad. You have 155 characters (including spaces) in which to craft an informative, compelling description. Try to use your keyword at least once in the Description, as close to the start as possible. For a product website, you might consider including the vital statistics about each product in the Description tag. E.g. Brand names, model numbers, colors, etc.
Note, however, that you don’t actually have to define a Description tag. Most search engines are capable of extracting what they need for the description from your site copy. Danny Dover, of SEOmoz, recommends defining a Description tag for the Home page, and leaving the rest blank and letting the search engines decide what to display (they’ll choose what content to pull from your page based on the search query).

I’m not convinced. If you leave the search engines to their own devices, there’s no guarantee they’ll choose a section that’s well written or even intended to be the “copy for an ad” as I’ve suggested the Description should be. I recommend defining the Description on all pages. It’s not a good idea to use the same Description on every page. It’s supposed to be helpful and persuade searchers to click through to your page. If it’s generic enough to be suitable for every page, it’s not going to be particularly engaging, compelling or helpful. What’s more, if Google sees duplicate Description tags, it may choose to display DMOZ data instead of your actual tag data.

The Description Tag looks like this:
<meta name="DESCRIPTION" CONTENT="From healthy diet plans to helpful weight loss tools, here you'll find the latest diet news and information." />


Keywords Tag

A comma-separated list of keywords that are most relevant to the subject matter of the page. Stick to about 300 characters and don’t repeat your keywords over and over. You can, however, include variations of your keyword, such as “weight loss”, “weight loss tips”, “weight loss guide” and “weight loss strategies.” You can also re-use a keyword so long as it’s part of a different phrase. The Keywords tag isn’t visible to visitors of your website unless they view the source. It’s really just a legacy from a time when the search engines used it as their sole means of identifying a site’s subject matter.

These days, most search engines pay it little or no mind. The key exception is Yahoo. Yahoo likes your Keywords tag to be ‘aligned’ with your web copy. So don’t include keywords in your tag that don’t appear in your copy if you want to rank in Yahoo.

Keywords Tag looks like this:
<meta name="KEYWORDS" CONTENT="weight loss, weight loss tips, weight loss guide, weight loss strategies" />