Monday, December 29, 2008

Image Search Optimization

As an obvious first step, figure out how images can and should fit into the user experience on your site. This is a non-trivial step. Then, determine how and where you can obtain original images for your site. Image search engines don't like duplicate content any more than web search engines, so you need to obtain your own original images. Once you have this in place, here are the major steps you can take to optimize for image search engines:

  1. Use keywords in the alt attribute. This is a critical step, as it is one of the best opportunities you have to unambiguously label the image (see the example after this list). Bear in mind that a huge amount of image search volume includes words like: photo, picture, image, pics, or pix. Also, if your image is a picture of a physical location, include some location information in the alt attribute.
  2. Note that the title attribute is usually ignored by image search engines. Don't waste your time on it.
  3. Pick a logical file name that reinforces the keywords. Using hyphens in the file name to separate the words of the keyword is fine; just try not to exceed two hyphens. Do not use underscores as a word separator.
  4. Use a descriptive file name, in a similar fashion to the alt attribute.
  5. Pay attention to the file extension too. For example, if the image search engine sees a ".jpg" (JPEG) file extension, it's going to assume that the file is a photo.
  6. Basic web page optimization applies too. For example:
    • The title tag of the web page
    • The text nearby the image
    • The overall theme of the content of the page
    • The overall theme of the site (or section of the site)
  7. Also important is to get links to the page with the image on it. This could become an entire link building discussion in itself, but one simple way to do it is to post the pages with images on them to del.icio.us.
  8. Avoid duplicate content on your site. If, for example, you have a thumbnail, a medium-size image, and a full-size image, you don't want all of these to be indexed. The best way to handle this is to use robots.txt to prevent the crawler from looking at the versions you don't want indexed (most likely the thumbnail or the full-size image).
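To illustrate points 1, 3, 4, and 5, here is a minimal sketch of an image tag that follows these guidelines (the file name and alt text are hypothetical):

    <img src="eiffel-tower-paris.jpg"
         alt="Photo of the Eiffel Tower in Paris at dusk"
         width="450" height="300">

The file name uses hyphens (no more than two), the .jpg extension signals a photo, and the alt text works in both a "photo" qualifier and the location.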

Saturday, December 27, 2008

SEO Checklist

Domain name & URLs

* Short and memorable
* Uses Keywords
* Used in email addresses
* Uses Favicon
* Site.com redirects to www. version (see the redirect sketch after this list)
* Alternate Domain redirects
* Home page redirect to root
* No underscores in filenames
* Keywords in directory names
* Multiple pages per directory
* Registered for 5+ years
* Multiple versions:
* .com
* .org
* .net
* .biz
* Hyphenations
* Misspellings
* Product names
* Brand names
* Type-in keywords URLs
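For the non-www redirect item above, here is a minimal sketch assuming an Apache server with mod_rewrite enabled (substitute your own domain for site.com):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
    RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]

The 301 status tells search engines the move is permanent, so link popularity consolidates on the www version.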

Browser issues

* Visible address bar
* Fully functional navigation tools
* Visible status bar
* Site works in multiple browsers
* No browser hi-jacking

Site logo

* Displays company name clearly
* Isn't hidden among clutter
* Links to home page
* Unique and original
* Use tagline consistently across site

Design considerations

* Instant site identification
* Crisp, clean image quality
* Clean, clutter-less design
* Consistent colors and type
* Whitespace usage
* Minimal distractions
* Targets intended audience
* Meets industry best practices
* Easy to navigate
* Descriptive links
* Good on-page organization
* Easy to find phone number
* Don't link screen captures
* Skip option for flash
* Consistent page formatting
* No/minimal on-page styling
* Avoid text in images
* Font size is adequate
* Font type is friendly
* Paragraphs not too wide
* Visual cues to important elements
* Good overall contrast
* Low usage of animated graphics
* Uses obvious action objects
* Avoid requiring plugins
* Minimize the use of graphics
* Understandable graphic file names
* No horizontal scrolling
* Non-busy background
* Recognizable look and feel
* Proper image / text padding
* Uses trust symbols
* Works on variety of resolutions
* Works on variety of screen widths

Architectural issues

* Correct robots.txt file
* Declare doctype in HTML
* Validate HTML
* Don't use frames
* Alt tag usage on images
* Custom 404 error page
* Printer friendly
* Underlined links
* Differing link text color
* Breadcrumb usage
* Nofollow cart links (see the snippet after this list)
* Robots.txt non-user pages
* Nofollow non-important links
* Review noindex usage
* Validate CSS
* Check broken links
* No graphics for ON/YES, etc.
* Page size less than 50K
* Flat directory structure
* Proper site hierarchy
* Unique titles on all pages
* Title reflects page info and heading
* Unique descriptions on pages
* No long-tail page descriptions
* Proper bulleted list formats
* Branded titles
* No code bloat
* Minimal use of tables
* Nav uses absolute links
* Good anchor text
* Text can be resized
* Key concepts are emphasized
* CSS-less browsing
* Image-less browsing
* Summarize all tables
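As a sketch of the nofollow and robots items above (URLs are hypothetical):

    <!-- Keep unimportant links, such as cart links, from passing weight -->
    <a href="/cart" rel="nofollow">View cart</a>

    <!-- Keep a non-user page out of the index entirely -->
    <meta name="robots" content="noindex,follow">

rel="nofollow" leaves the link usable for visitors while telling engines not to count it; the noindex meta tag goes in the head of any page you don't want indexed.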

Navigation

* Located top or top-left
* Consistent throughout site
* Links to Home page
* Links to Contact Us page
* Links to About Us page
* Simple to use
* Indicates current page
* Links to all main sections
* Proper categorical divisions
* Non-clickable is obvious
* Accurate description text
* Links to Login
* Provides Logout link
* Uses Alt attribute in images
* No pop-up windows
* No new window links
* Do not rely on rollovers
* Avoid cascading menus
* Keep scent from page to page
* Targets expert and novice users
* Absolute links

Content

* Grabs visitor attention
* Exposes need
* Demonstrates importance
* Ties need to benefits
* Justifies and calls to action
* Gets to best stuff quickly
* Reading level is appropriate
* Customer focused
* Benefits and features
* Targets personas
* Provides reassurances
* Answers WIIFM (What's In It For Me?)
* Consistent voice
* Eliminate superfluous text
* Reduce /explain industry jargon
* No typo, spelling or grammar errors
* Contains internal contextual links
* Links out to authoritative sources
* Enhancing keyword usage (SEO)
* Date published on articles/news
* Web version of PDF docs available
* Consistent use of phrasing
* No unsubstantiated statements


Content Appearance

* Short paragraphs
* Uses sub-headings
* Uses bulleted lists
* Calls to action on all pages
* Good contrast
* No overly small text for body
* No overly small text for headings
* Skimmable and scannable
* Keep link options in close proximity

Links and buttons

* Limit the number of links on a page
* Avoid small buttons and tiny text for links
* Leave space between links and buttons
* Avoid using images as the only link
* Link important commands
* Underline all links
* Accurately reflects the page it refers

Home page

* No splash page
* Instant page identification
* Provides overview of site
* Site purpose is clear
* Robot meta: NOODP,NOYDIR
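That robots meta tag is a single line in the page head; it asks engines not to replace your snippet with DMOZ (ODP) or Yahoo! Directory descriptions:

    <meta name="robots" content="noodp,noydir">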

About Us page

* Adequately describes company
* Shows team biographies
* Shows mission statement
* Up to date information
* Note associations, certifications & awards
* Links to support pages:
* Contact page
* Investor relations
* Company news
* Registration info
* Job opportunities
* Newsletters
* Link to social media profiles

Contact Us page

* Easy to find
* Multiple contact options:
* Phone
* Fax
* Email
* Form
* Chat
* Customer feedback
* Street map
* Hours of operation
* Final call to action
* Multiple points of contact:
* Customer service
* Tech support
* Inquiries
* General info
* Job applications
* Billing
* Management team
* Ad-free
* Form requires only essential info

E-Commerce considerations

* Mini-product basket always available
* Displays payment options:
* CC
* Paypal
* Google Checkout
* No multiple paths to dupe product pages
* No tracking IDs in URLs
* Exclude shopping cart pages
* No (or nofollowed) links to secure pages
* Keep secure cert current

Product pages

* Visible calls to action
* Clear contact info (phone #)
* Consistent layout
* Clear pricing
* Show additional fees
* Clear product presentation
* Show shipping cost
* Show availability
* Provide delivery options, details
* Estimate delivery date
* Link to site security info
* Return / guarantee info
* Allow "save for later"
* Related products & up sells
* Clear product image
* Describe images
* Enhanced multiple image views
* Product description
* Product details & specs
* Product selection options
* Customer product reviews
* Product comparisons
* Printer-friendly option
* "Add to cart" close to item
* Secondary "add" button at bottom
* Standardized product categorization
* Clutter-free page
* Provide International pricing
* Provide product search
* Emphasize brand quality and trust
* Compare to offline competitors
* Short URLs with keywords

Basket page

* Obvious checkout link
* Product descriptions
* Product image
* Show availability
* Updatable quantities
* Ability to remove items
* Link to products
* Product price
* Payment options
* Promos/vouchers explained
* Link to security
* Link to guarantees
* Show delivery costs
* Show delivery date
* Allow gift options
* "Continue shopping" link or options
* Show contact information
* No advertising/upselling
* Don't keep personal info w/o authorization
* Shipping questions answered
* International shipping
* International address forms

Mini baskets

* Make new products added obvious
* Link to full basket page
* Allow removal of products
* Show order total

Checkout process

* No hidden fees
* No pre-registration
* Keep checkout process short
* Show benefits of registration:
* Faster checkout in future
* Access to order history
* Check order status
* Saved for later information
* Access to special promotions
* Personalization
* Joining a community
* Show checkout progress meter
* Effective after-order follow-up
* Receipt / Confirmation:
* Printable
* Emailed
* Thank you message
* Order number
* Order date
* Items purchased
* Expected delivery date
* Payment method
* Cancellation policy
* How to cancel
* Return policy
* Address return costs
* After-sale guarantees

Login & My Account pages

* Easy to find login access
* Use security protocols
* Provide security assurances
* Link for new registrations
* Outline account benefits
* Reclaim lost password option
* "Remember me" option
* Link to privacy policy
* Logged-in status is clear
* Account info change access
* Confirmation of change info
* Links to financial info
* Transaction history
* Invoices
* Balances
* Payment methods
* Choose method of delivery:
* Text email
* HTML email
* Snail mail
* Overnight
* Etc.

Help and FAQ pages

* Avoid marketing hype
* Allow Help search
* Provide printable text
* Link to additional resources:
* User guides
* Product support
* Customer support
* Downloads

Forms and errors

* Flexible entry requirements
* Allow for tabbing between fields
* Proper tab order
* Clear field labels
* Text label above field box
* Only require necessary information
* Minimal instructions
* Instructions above field
* Friendly error output
* Errors obviously indicated
* Errors describe remedy
* Errors provide contact / help option
* Preserve data with errors
* Provide pre-selected choices
* Don't overdo choices
* Note required fields
* Progress indicator
* Progress navigation
* Remove navigation
* Link to privacy information
* Final info verification check
* Confirmation/thank you page
* Stack fields vertically
* Proper use of radio buttons
* Keep "submit" close to fields
* Field boxes adequately wide
* No "reset" or "cancel" buttons
* Autocomplete=off as necessary
* Buttons denote action

Site search

* Located in top-right corner
* Search not case sensitive
* Properly labeled as "search"
* Link to "advanced search"
* Forgiving of misspellings
* Shows similar products
* Shows related items in results
* No "no products found"
* Provide refinement options
* Provide alternate spellings
* Provide links to relevant pages
* Show search string in results
* Don't place results in tables
* Display exact matches first
* Display close matches second
* Bold query words in results
* Display titles with descriptions
* No more than 20 results per page
* Option to increase results per page
* Link to additional results pages

Privacy and Security pages

* Present info in easy to read format
* Make information easily scannable
* Provide section summaries
* Identify information types collected
* Explain how cookies are used
* Explain how user information will be used
* Explain how info will be protected
* Provide additional protection tutorials
* Link to these pages in footer
* Provide links to contact info

Site map

* Keep information current
* Link to site map in footer
* Linked from help and 404 pages
* Provide overview paragraph
* Provide intro to main sections
* Visible site hierarchy
* Descriptive text and links
* Link to xml sitemap in robots.txt file
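The robots.txt reference above is a single extra line, for example:

    Sitemap: http://www.example.com/sitemap.xml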

Friday, December 19, 2008

SMO, Cousin of SEO

The concept of Search Engine Optimization (SEO) has created a cousin from the same fray: Social Media Optimization, popularly called SMO. Quite congruent to the SEO process, SMO involves making sure that a website is popular and easily linkable from different media resources like blogs, social bookmarking websites and other media-sharing websites. The whole purpose of SMO is to popularize the website in the social circles of the internet and make netizens aware of it.

SEO and SMO, in spite of having their share of differences, are quite similar. However, their roles are not interchangeable. Both concepts are best used in complement to each other. Thus it becomes very important to know and understand the crucial characteristics of each, and how and when to use them.

Similarities of SEO and SMO

Here are some common objectives of both Search Engine Optimization and Social Media Optimization:

Linkability
Linkability determines how linkable your website is in cyberspace, i.e. how many people would like to link to your website. If they do not have reason enough to link to your website, then there is no good reason why they should visit you. After all, a bad website is as good as no website.

However, do not confuse linkability with link bait. The two are different things. Link bait is content on your website that other sites link to automatically because they like it, without your asking. Linkability, on the other hand, is how appealing you can make your website to the members of cyberspace. It depends on content quality, website design and usability.

Both SEO and SMO strive to increase the linkability factor of a website.

Quality Content

This is a natural consequence of the linkability factor. Good content not only increases the linkability of a website but also gives visitors a reason to come back. And as repeat visitors increase, the credibility of the website goes up.

Pull Marketing

Pull marketing is a tad different from regular marketing attempts. Quite unlike the latter, it aims to attract visitors by showing them the potential value of the website and what they can gain through it. Obviously it scores above, and is more effective than, blatant advertisement of the website.

Differences of SEO and SMO

Here are the areas where the SEO and SMO processes differ:

Channels of Process

SEO is the process of popularizing the website on major search engines like Google, Yahoo! and Microsoft. Thus it follows the guidelines dictated by search engines only. SMO, as the name suggests, is all about making the website popular in the social circles and media sources of the internet, like blogs, forums, online communities and bookmarking websites.

Target Users

SEO targets users who have something particular in mind. That is because people using a search engine have an intention and wish to acquire more information about it. SMO, however, has a wider target audience that mostly consists of savvy net users who don't mind looking through a good piece of content.

SEO Basics

1. Insert keywords within the title tag so that search engine robots will know what your page is about. The title tag is located right at the top of your document, within the head tags. Inserting a keyword or key phrase will greatly improve your chances of bringing targeted traffic to your site. Make sure that the title tag contains text which a human can relate to; the text within the title tag is what shows up in a search result. Treat it like a headline.
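As an example, a title tag for a page about handmade leather wallets (a hypothetical keyword) might look like this:

    <head>
    <title>Handmade Leather Wallets - Care and Buying Guide</title>
    </head>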

2. Use the same keywords as anchor text to link to the page from different pages on your site. This is especially useful if your site contains many pages. The more keywords that link to a specific page, the better.

3. Make sure that the text within the title tag is also within the body of the page. It is unwise to have keywords in the title tag which are not contained within the body of the page.

Adding the exact same text for your h1 tag will tell the reader who clicks on your page from a search engine result that they have clicked on the correct link and have arrived at the page where they intended to visit. Robots like this too because now there is a relation between the title of your page and the headline.

Also, sprinkle your keywords throughout your article. The most important keywords can be bolded or colored in red. A good place to do this is once or twice in the body at the top of your article and in the sub-headings.

4. Do not use the exact same title tag on every page on your website. Search engine robots might determine that all your pages are the same if all your title tags are the same. If this happens, your pages might not get indexed.

I always use the headline of my pages as the title tag to help the robots know exactly what my page is about. A good place to insert the headline is within the h1 tag. So the headline is the same as the title tag text.

5. Do not spam the description or keyword meta tags by stuffing in meaningless keywords, and do not spend too much time on these tags. SEO pros all agree that they are not as important today as they once were. I just place my headline once within the keywords and description tags.

6. Do not link to link farms or other search-engine-unfriendly neighborhoods.
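Pulling tips 1 through 5 together, a minimal page sketch (the keyword and copy are hypothetical) could look like this:

    <head>
    <title>Handmade Leather Wallets - Care and Buying Guide</title>
    <meta name="description" content="Handmade Leather Wallets - Care and Buying Guide">
    </head>
    <body>
    <h1>Handmade Leather Wallets - Care and Buying Guide</h1>
    <p>Our <b>handmade leather wallets</b> are cut and stitched by hand ...</p>
    </body>

Note how the title tag, the h1 headline and the description carry the same text, and the keyword appears (bolded) near the top of the body.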

7. Do not use doorway pages. Doorway pages are designed for robots only, not humans. Search engines like to index human friendly pages which contain content which is relevant to the search.

8. Title attributes for text links. Insert the title attribute within the HTML of your text link to add weight to the link and the page where the link resides. This is like the alt attribute for images.

My site contains navigation menus on the left and right of the page. The menus consist of links, not images. When you hover over a link with your mouse, the title of the link appears. View the source of this page to see how to add this attribute to your links.
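For example (a hypothetical page), a text link with a title attribute looks like this, and the title text appears when you hover over the link:

    <a href="seo-tips.html" title="Beginner SEO tips and tutorials">SEO Tips</a>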

9. Describe your images with the alt attribute. This will help search engines that index images to find your pages, and will also help readers who use text-only web browsers.

10. Submit to the search engines yourself. Do not use a submission service or submission software; doing so could get your site penalized or even banned. Submit only once. There is no need to submit every two weeks, and no need to submit more than one page. Robots follow links, so if your site has a nice link trail, your entire site will get indexed. My site has a nice, human-friendly link trail which robots follow easily. All my pages get indexed without my ever submitting more than the main index page, once.

What is robots.txt optimization?

Some of you may ask what it is and why we need it. In a nutshell, as its name implies, the Robots Exclusion Protocol is used by webmasters and site owners to prevent search engine crawlers (or spiders) from indexing certain parts of their websites. It could be for a number of reasons: sensitive corporate information, semi-confidential data, information that needs to stay private, or to prevent certain programs or scripts from being indexed, etc.

A search engine crawler or spider is a web "robot" and will normally obey the robots.txt file (the Robots Exclusion Protocol) if it is present in the root directory of a website. The protocol was developed in 1994 and still today remains the Internet's standard for controlling how search engine spiders access a particular website.

While the robots.txt file can be used to prevent access to certain parts of a website, if not correctly implemented it can also prevent access to the whole site! On more than one occasion, I have found the robots exclusion protocol (the robots.txt file) to be the main culprit of why a site wasn't listed in certain search engines. If it isn't written correctly, it can cause all kinds of problems and, the worst part is, you will probably never find out about it just by looking at your actual HTML code.
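As a sketch, here is a minimal robots.txt (the paths are hypothetical), along with the one-character mistake that can de-list an entire site:

    # Keep crawlers out of the cart and admin areas only
    User-agent: *
    Disallow: /cart/
    Disallow: /admin/

    # Careful: this variant blocks the WHOLE site from compliant crawlers
    # User-agent: *
    # Disallow: /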


As the name implies, the "Disallow" command in a robots.txt file instructs a search engine's robots to "disallow reading", but that certainly does not mean "disallow indexing". In other words, a disallowed resource may be listed in a search engine's index, even if the search engine follows the protocol. On the other hand, an allowed resource, such as the many public (HTML) files of a website, can be prevented from being indexed if the robots.txt file isn't carefully written for the search engines to understand.

The most obvious demonstration of this is the Google search engine. Google can add files to its index without reading them, merely by considering links to those files. In theory, Google can build an index of an entire Web site without ever visiting that site or ever retrieving its robots.txt file.

In so doing, it is not violating the robots.txt protocol, because it’s not reading any disallowed resources, it is simply reading other web sites' links to those resources, which Google constantly uses for its page rank algorithm, among other things.

Contrary to popular belief, a website does not necessarily need to be "read" by a robot in order to be indexed. As to how the robots.txt file can be used to prevent a search engine from listing a particular resource in its index: in practice, most search engines have placed their own interpretation on the robots.txt file, which allows it to be used to prevent them from adding resources or disallowed files to their index.

Most modern search engines today interpret a resource being disallowed by the robots.txt file as meaning they should not add it to their index. Conversely, if it’s already in their index, placed there by previous crawling activity, they would normally remove it. This last point is important, and an example will illustrate that critical subject.

The inadequacies and limitations of the robots exclusion protocol are indicative of what sometimes could be a bigger problem. It is impossible to prevent any directly accessible resource on a site from being linked to by external sites, be they partner sites, affiliates, websites linked to competitors or, search engines.

Wednesday, December 17, 2008

How to make your website load fast

1. Use a simple design: Always go for a simple website design. It is easy to maintain and quick to load on a visitor's PC. A complicated design will increase loading time. Use the minimum lines of code needed to develop the webpage.

2. Maximize the use of HTML and CSS: HTML (Hypertext Markup Language) is one of the most popular languages used in website design. Use HTML wherever possible to design your web pages. HTML coupled with CSS (Cascading Style Sheets) based coding enables your website not only to load and navigate fast, but also to be modified easily at a later stage if need be.
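As a sketch, moving all presentation into one external stylesheet keeps every page lighter, since browsers download the styles once and then cache them (file names are hypothetical):

    <head>
    <link rel="stylesheet" type="text/css" href="styles.css">
    </head>

    /* styles.css - shared by every page on the site */
    body { font-family: Arial, sans-serif; color: #333333; }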

3. Use light images: A light webpage will load faster. Use only a limited number of images, whether they contain text or pictures. This allows the webpage to load faster and even helps with SEO (Search Engine Optimization), since it is easier for search engines to comprehend text on a webpage than an image. Use the various image compression tools available online to reduce image sizes in kilobytes. Generally a GIF image will be lighter than a JPEG one, but there is a trade-off in image quality.

4. Flash: Using Flash in web pages may not be a very good idea. It does not fare well if you are trying to optimize your website for search engines like Yahoo, Google, MSN, etc., and in addition it increases the loading time of the webpage. So the better option is to avoid a Flash website design; if it is still necessary, use it within an HTML site.

5. Use tables: Tables load quickly because they are just HTML. Tables can be used anywhere: on your home page, in a menu, or elsewhere. Using tables in your coding lets you produce a properly aligned design, and it definitely scores over frames. Frames of any kind should be avoided on a website.

6. Increase the content section: Content provides information about your services and products to visitors. Used wisely, content can make your webpage load faster. Imagine replacing a heavy image that is there just to beautify the page with some meaningful content. This also allows your website to perform better in search engine results.

7. Text links: On some sites you will find that buttons don't fit properly on the webpage, whether due to the size, shape or color of the buttons. It is always desirable to use text links in place of graphic buttons. Another benefit of text links is that you can use them within the text content area of a webpage.

8. JavaScript: The main disadvantage of using JavaScript is that it is not supported by some browsers. It is also not search engine friendly and increases the loading time of the page, so use it as little as possible. On the other hand, JavaScript is a very powerful tool in web programming, and AJAX, which builds on JavaScript, is being used widely in Web 2.0 projects around the world.

9. Check load time: Periodically check the load time of your website. There are many free sites on the internet where you can check your website's loading time. This lets you see whether any change to the website has increased or decreased its loading time. These webpage speed tests also provide valuable suggestions to improve your website's performance.

Hi friends, a famous SEO company in Hyderabad: http://www.searchenginefactors.com

Monday, December 15, 2008

Page rank, an introduction

Google uses the 'page rank algorithm', amongst many other factors, to determine which web pages are displayed at the top of search results. Expressed in a range from 0 to 10, the algorithm essentially 'confers' ranking from one page to another. Thus, for example, if a page rank 6 page links to you and there are no other external links on that page, your page potentially gets a page rank boost of up to 4.8 (only a maximum of around 0.8 of the page's own rank can be conferred). In reality the conferred ranking is distributed across ALL links on a page, so a page rank 5 page with 5 links on it confers at most 0.8 page rank to each of its linked-to pages.
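The arithmetic above mirrors the classic PageRank formula from the original Google paper, where d is a damping factor (the "around 0.8" above) and C(T) is the number of links on page T:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

So a lone link from a page rank 6 page contributes roughly 0.8 * 6 = 4.8, while a page rank 5 page with 5 links contributes roughly 0.8 * 5 / 5 = 0.8 to each linked-to page, matching the figures above.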

In practice, page rank is something webmasters tend to obsess about, frankly because it is one of the few visible measures of ranking provided by Google, other than of course the presence of a site at the top of the search engine results pages (SERPs). Page rank is, in fact, a measure of... page rank. Many webmasters rigidly refuse to trade links with sites with lower page rank than their own, regardless of relevance. This is a mistake.

There are a good many page rank 0 websites that appear at the top of the results for relatively competitive terms. Equally some high page rank sites don’t appear for their target keywords. All of this indicates that, as you may expect, page rank is but one of many ranking factors in the Google algorithm.

The Google algorithm has developed over time and is far more complicated than it used to be. For example, trusted sites will have higher search rankings, and Google is known to have had teams hand-reviewing high-traffic and 'major brand' sites. The ranking that really matters is overall ranking in the SERPs. Over the last few years many methods have developed to improve a page's rank. Today, Search Engine Optimisation (SEO) is the practice of promoting a site's overall search ranking by using a combination of various legitimate techniques.

Friday, December 12, 2008

SEO (Search Engine Optimisation)

Search engine optimization is the process of making one's website content more search engine friendly in order to attract traffic by ranking higher. Search Engine Factors is the most experienced hand in the field of search engine optimization. Search Engine Factors is a specialized search engine optimization, marketing, promotion and ranking firm. We set very high but realistic goals and do not believe in misleading our clients. Search engine optimization, also known as ranking, is a gradual process, as we all know, and at the end of the time frame agreed upon, we guarantee top ranking results.

We assure you that at Search Engine Factors we use completely tested and perfectly legal methods to optimize your website. We have people who are highly qualified and capable of understanding any complex search engine algorithm. Our team of consultants and marketing specialists is well versed in the algorithms of every major search engine.

Hi friends, I have launched a new website for SEO: www.searchenginefactors.com

Supplemental Pages

These are pages which are indexed in Google but do not exist at this time. During a search for a particular thing, they are still shown in the search result pages. These pages provide additional information about the particular search.

Wednesday, December 10, 2008

Video Search Engine Optimization

Video search engine optimization is one of the newest parts of SEO. Nowadays video creation is increasing tremendously. Why is that? No one can say that simply writing content means users will read an entire article or news item. So most SEO people are turning towards video creation and video optimization. For example, a normal user goes [...]

Keyword Research is the king of your website

When you want to publish your website in a competitive market, what basics will make your website famous in it? First, decide what subject your website is based on, then try to choose the correct keywords for the website. After all, everyone knows the keyword is the king of a website for getting more hits in terms of [...]

Monday, December 8, 2008

AltaVista

AltaVista used to be the #1 search engine until Google came along.

Link Popularity history:

To gain a better understanding of link popularity, it is useful to know why it became so crucial for search engine rankings. In the past, a web page's ranking was determined, amongst other factors, by the number of keyword occurrences within 'on-page' elements, i.e. in the page text, META tags and title tag. When web developers learned that they could trick a search engine into returning their web pages by cramming keywords into their pages, the search engines had to get a bit smarter. Since 'on-page' elements were being used to determine relevance, it was only natural that the engines would look to elements outside the direct control of the web page creator, i.e. 'off-page' elements. Search engines made the assumption that the greater the number of links from other sites pointing to a web site, the more popular the web site is, and therefore the higher-quality a resource. This worked nicely in theory, but in practice it too was abused.

Web site owners figured out many ways to get links pointing to their web sites, one example of which was the use of link farms: pages that contained nothing more than a collection of links. The quantity of links was being abused, so the search engines took the old saying "quality, not quantity" to heart and began to assign a quality factor to each of the links pointing to a web site. Now web sites with a higher number of high-quality links were looked upon favourably by the search engines. Building link popularity became a science in itself, and today it is still the most time-consuming and frustrating activity for a search engine optimizer.