Monday, December 29, 2008

Image Search Optimization

As an obvious first step, figure out how images can and should fit into the user experience on your site. This is a non-trivial step. Then, determine how and where you can obtain original images for your site. Image search engines don't like duplicate content any more than web search engines do, so you need to obtain your own original images. Once you have this in place, here are the major steps you can take to optimize for image search engines:

  1. Use keywords in the alt attribute. This is a critical step, as it is one of the best opportunities you have to unambiguously label the image. Bear in mind that there is a huge amount of search volume that includes words like: photo, picture, image, pics, or pix, as well as location names. If your image is a picture of a physical location, include some location information in the alt attribute.
  2. Note that the title attribute is usually ignored. Don't waste your time on it.
  3. Pick a logical file name that reinforces the keywords. Using hyphens in the file name to separate the words of the keyword is fine; just try not to exceed two hyphens. Do not use underscores as a word separator.
  4. Use a descriptive file name, in a similar fashion to the alt attribute.
  5. Pay attention to the file extension too. For example, if the image search engine sees a ".jpg" (JPEG) file extension, it's going to assume that the file is a photo.
  6. Basic web page optimization applies too. For example:
    • The title tag of the web page
    • The text nearby the image
    • The overall theme of the content of the page
    • The overall theme of the site (or section of the site)
  7. Also important is to get links to the page with the image on it. This could become an entire link building discussion in itself, but one simple way to do it is to post the pages with images on them to del.icio.us.
  8. Avoid duplicate content on your site. If, for example, you have a thumbnail, a medium-size image, and a full-size image, you don't want these all to be indexed. The best way to handle this is to use robots.txt to prevent the crawler from looking at the versions you don't want indexed (most likely the thumbnail or the full-size image).
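Putting points 1, 3 and 4 together, here is a sketch of how such an image might be marked up; the file name, alt text and location are invented for illustration:

```html
<!-- Hyphenated, descriptive file name; keyword and location info in the alt attribute -->
<img src="/images/golden-gate-bridge-sunset.jpg"
     alt="Photo of the Golden Gate Bridge at sunset, San Francisco"
     width="600" height="400" />
```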

Saturday, December 27, 2008

SEO Checklist

Domain name & URLs (click heading for additional details)

* Short and memorable
* Uses Keywords
* Used in email addresses
* Uses Favicon
* Site.com redirect to www. version:
* Alternate Domain redirects
* Home page redirect to root
* No underscores in filenames
* Keywords in directory names
* Multiple pages per directory
* Registered for 5+ years



* Multiple versions:
* .com
* .org
* .net
* .biz
* Hyphenations
* Misspellings
* Product names
* Brand names
* Type-in keywords URLs
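As a sketch of the redirect items above, an Apache .htaccess rule (assuming Apache with mod_rewrite enabled, and example.com as a placeholder domain) can 301-redirect the bare domain to the www. version:

```apache
RewriteEngine On
# Permanently (301) redirect site.com to the www. version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```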

Browser issues (click heading for additional details)

* Visible address bar
* Fully functional navigation tools
* Visible status bar



* Site works in multiple browsers
* No browser hi-jacking

Site logo (click heading for additional details)

* Displays company name clearly
* Isn't hidden among clutter
* Links to home page



* Unique and original
* Use tagline consistently across site

Design considerations (click heading for additional details)

* Instant site identification
* Crisp, clean image quality
* Clean, clutter-less design
* Consistent colors and type
* Whitespace usage
* Minimal distractions
* Targets intended audience
* Meets industry best practices
* Easy to navigate
* Descriptive links
* Good on-page organization
* Easy to find phone number
* Don't link screen captures
* Skip option for flash
* Consistent page formatting
* No/minimal on-page styling
* Avoid text in images



* Font size is adequate
* Font type is friendly
* Paragraphs not too wide
* Visual cues to important elements
* Good overall contrast
* Low usage of animated graphics
* Uses obvious action objects
* Avoid requiring plugins
* Minimize the use of graphics
* Understandable graphic file names
* No horizontal scrolling
* Non-busy background
* Recognizable look and feel
* Proper image / text padding
* Uses trust symbols
* Works on variety of resolutions
* Works on variety of screen widths

Architectural issues (click heading for additional details)

* Correct robots.txt file
* Declare doctype in HTML
* Validate HTML
* Don't use frames
* Alt tag usage on images
* Custom 404 error page
* Printer friendly
* Underlined links
* Differing link text color
* Breadcrumb usage
* Nofollow cart links
* Robots.txt non-user pages
* Nofollow non-important links
* Review noindex usage
* Validate CSS
* Check broken links
* No graphics for ON/YES, etc.
* Page size less than 50K



* Flat directory structure
* Proper site hierarchy
* Unique titles on all pages
* Title reflects page info and heading
* Unique descriptions on pages
* No long-tail page descriptions
* Proper bulleted list formats
* Branded titles
* No code bloat
* Minimal use of tables
* Nav uses absolute links
* Good anchor text
* Text can be resized
* Key concepts are emphasized
* CSS less browsing
* Image-less browsing
* Summarize all tables

Navigation (click heading for additional details)

* Located top or top-left
* Consistent throughout site
* Links to Home page
* Links to Contact Us page
* Links to About Us page
* Simple to use
* Indicates current page
* Links to all main sections
* Proper categorical divisions
* Non-clickable is obvious
* Accurate description text



* Links to Login
* Provides Logout link
* Uses Alt attribute in images
* No pop-up windows
* No new window links
* Do not rely on rollovers
* Avoid cascading menus
* Keep scent from page to page
* Targets expert and novice users
* Absolute links

Content (click heading for additional details)

* Grabs visitor attention
* Exposes need
* Demonstrates importance
* Ties need to benefits
* Justifies and calls to action
* Gets to best stuff quickly
* Reading level is appropriate
* Customer focused
* Benefits and features
* Targets personas
* Provides reassurances
* Answers WIIFM



* Consistent voice
* Eliminate superfluous text
* Reduce /explain industry jargon
* No typo, spelling or grammar errors
* Contains internal contextual links
* Links out to authoritative sources
* Enhancing keyword usage (SEO)
* Date published on articles/news
* Web version of PDF docs available
* Consistent use of phrasing
* No unsubstantiated statements


Content Appearance (click heading for additional details)

* Short paragraphs
* Uses sub-headings
* Uses bulleted lists
* Calls to action on all pages
* Good contrast



* No overly small text for body
* No overly small text for headings
* Skimmable and scannable
* Keep link options in close proximity

Links and buttons (click heading for additional details)

* Limit the number of links on a page
* Avoid small buttons and tiny text for links
* Leave space between links and buttons
* Avoid using images as the only link



* Link important commands
* Underline all links
* Accurately reflects the page it refers

Home page (click heading for additional details)

* No splash page
* Instant page identification
* Provides overview of site



* Site purpose is clear
* Robot meta: NOODP,NOYDIR
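The robots meta item above is a single tag in the page head; NOODP and NOYDIR ask engines not to substitute DMOZ or Yahoo Directory descriptions for your own:

```html
<!-- Keep engines from replacing your meta description with directory snippets -->
<meta name="robots" content="NOODP,NOYDIR" />
```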

About Us page (click heading for additional details)

* Adequately describes company
* Shows team biographies
* Shows mission statement
* Up to date information
* Note associations, certifications & awards
* Links to support pages:
* Contact page



* Investor relations
* Company news
* Registration info
* Job opportunities
* Newsletters
* Link to social media profiles

Contact Us page (click heading for additional details)

* Easy to find
* Multiple contact options:
* Phone
* Fax
* Email
* Form
* Chat
* Customer feedback
* Street map
* Hours of operation
* Final call to action



* Multiple points of contact:
* Customer service
* Tech support
* Inquiries
* General info
* Job applications
* Billing
* Management team
* Ad-free
* Form requires only essential info

E-Commerce considerations (click heading for additional details)

* Mini-product basket always available
* Displays payment options:
* CC
* Paypal
* Google Checkout



* No multiple paths to dupe product pages
* No tracking IDs in URLs
* Exclude shopping cart pages
* No (or nofollowed) links to secure pages
* Keep secure cert current

Product pages (click heading for additional details)

* Visible calls to action
* Clear contact info (phone #)
* Consistent layout
* Clear pricing
* Show additional fees
* Clear product presentation
* Show shipping cost
* Show availability
* Provide delivery options, details
* Estimate delivery date
* Link to site security info
* Return / guarantee info
* Allow "save for later"
* Related products & up sells
* Clear product image
* Describe images


* Enhanced multiple image views
* Product description
* Product details & specs
* Product selection options
* Customer product reviews
* Product comparisons
* Printer-friendly option
* "Add to cart" close to item
* Secondary "add" button at bottom
* Standardized product categorization
* Clutter-free page
* Provide International pricing
* Provide product search
* Emphasize brand quality and trust
* Compare to offline competitors
* Short URLs with keywords

Basket page (click heading for additional details)

* Obvious checkout link
* Product descriptions
* Product image
* Show availability
* Updatable quantities
* Ability to remove items
* Link to products
* Product price
* Payment options
* Promos/vouchers explained
* Link to security



* Link to guarantees
* Show delivery costs
* Show delivery date
* Allow gift options
* "Continue shopping" link or options
* Show contact information
* No advertising/upselling
* Don't keep personal info w/o authorization
* Shipping questions answered
* International shipping
* International address forms

Mini baskets (click heading for additional details)

* Make new products added obvious
* Link to full basket page



* Allow removal of products
* Show order total

Checkout process (click heading for additional details)

* No hidden fees
* No pre-registration
* Keep checkout process short
* Show benefits of registration:
* Faster checkout in future
* Access to order history
* Check order status
* Saved for later information
* Access to special promotions
* Personalization
* Joining a community
* Show checkout progress meter
* Effective after-order follow-up
* Receipt / Confirmation:



* Printable
* Emailed
* Thank you message
* Order number
* Order date
* Items purchased
* Expected delivery date
* Payment method
* Cancellation policy
* How to cancel
* Return policy
* Address return costs
* After-sale guarantees

Login & My Account pages (click heading for additional details)

* Easy to find login access
* Use security protocols
* Provide security assurances
* Link for new registrations
* Outline account benefits
* Reclaim lost password option
* "Remember me" option
* Link to privacy policy
* Logged-in status is clear
* Account info change access
* Confirmation of change info

* Links to financial info
* Transaction history
* Invoices
* Balances
* Payment methods
* Choose method of delivery:
* Text email
* HTML email
* Snail mail
* Overnight
* Etc.

Help and FAQ pages (click heading for additional details)

* Avoid marketing hype
* Allow Help search
* Provide printable text
* Link to additional resources:
* User guides
* Product support
* Customer support
* Downloads

Forms and errors (click heading for additional details)

* Flexible entry requirements
* Allow for tabbing between fields
* Proper tab order
* Clear field labels
* Text label above field box
* Only require necessary information
* Minimal instructions
* Instructions above field
* Friendly error output
* Errors obviously indicated
* Errors describe remedy
* Errors provide contact / help option
* Preserve data with errors
* Provide pre-selected choices
* Don't overdo choices

* Note required fields
* Progress indicator
* Progress navigation
* Remove navigation
* Link to privacy information
* Final info verification check
* Confirmation/thank you page
* Stack fields vertically
* Proper use of radio buttons
* Keep "submit" close to fields
* Field boxes adequately wide
* No "reset" or "cancel" buttons
* Autocomplete=off as necessary
* Buttons denote action
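Several of the form points above (text labels above fields, noting required fields, autocomplete off where the data is sensitive, a single action-denoting button) can be sketched in plain HTML; the field names and action URL are invented for illustration:

```html
<form action="/contact" method="post">
  <!-- Text label above the field box; required fields are noted -->
  <label for="email">Email address (required)</label><br />
  <input type="text" id="email" name="email" size="40" />

  <!-- Autocomplete off only where the data is sensitive -->
  <label for="card">Card number (required)</label><br />
  <input type="text" id="card" name="card" size="20" autocomplete="off" />

  <!-- One button that denotes the action; no "reset" or "cancel" buttons -->
  <input type="submit" value="Send message" />
</form>
```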

Site search (click heading for additional details)

* Located in top-right corner
* Search not case sensitive
* Properly labeled as "search"
* Link to "advanced search"
* Forgiving of misspellings
* Shows similar products
* Shows related items in results
* No "no products found"
* Provide refinement options
* Provide alternate spellings

* Provide links to relevant pages
* Show search string in results
* Don't place results in tables
* Display exact matches first
* Display close matches second
* Bold query words in results
* Display titles with descriptions
* No more than 20 results per page
* Option to increase results per page
* Link to additional results pages

Privacy and Security pages (click heading for additional details)

* Present info in easy to read format
* Make information easily scannable
* Provide section summaries
* Identify information types collected
* Explain how cookies are used

* Explain how user information will be used
* Explain how info will be protected
* Provide additional protection tutorials
* Link to these pages in footer
* Provide links to contact info

Site map (click heading for additional details)

* Keep information current
* Link to site map in footer
* Linked from help and 404 pages
* Provide overview paragraph

* Provide intro to main sections
* Visible site hierarchy
* Descriptive text and links
* Link to xml sitemap in robots.txt file
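The last item is a single line in robots.txt (the domain is a placeholder):

```
Sitemap: http://www.example.com/sitemap.xml
```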

Friday, December 19, 2008

SMO, Cousin of SEO

The concept of Search Engine Optimization (SEO) has given rise to a cousin from the same fray: Social Media Optimization, popularly called SMO. Quite congruent to the SEO process, SMO involves making sure that a website is popular and easily linkable from different media resources like blogs, social bookmarking websites and other media sharing websites. The whole purpose of SMO is to popularize the website in the social circles of the internet and make netizens aware of it.

SEO and SMO, in spite of having their share of differences, are quite similar. However, their roles are not interchangeable; the two concepts are best used in complement to each other. Thus it becomes very important to understand the crucial characteristics of each, and how and when to use them.

Similarities of SEO and SMO

Here are some common objectives of both Search Engine Optimizations and Social Media Optimization

Linkability
Linkability determines how linkable your website is in cyberspace, i.e. how many people would like to link to your website. If they do not have reason enough to link to your website, then there is no good reason why they should visit you. After all, a bad website is as good as no website.

However, do not confuse linkability with link bait; the two are different things. Link bait is content on your website that other sites link to automatically because they like it, without your asking. Linkability, on the other hand, is how appealing you can make your website to the members of cyberspace. It depends on content quality, website design and usability.

Both SEO and SMO strive to increase the linkability factor of a website.

Quality Content

This is a natural consequence of the linkability factor. Good content not only increases the linkability of a website but also gives visitors a reason to come back. And as repeat visitors increase, the credibility of the website goes up.

Pull Marketing

Pull marketing is a tad different from regular marketing attempts. Quite unlike the latter, the former aims to attract visitors by showing them the potential value of the website and what they can gain through it. Naturally it scores above, and is more effective than, blatant advertisement of the website.

Differences of SEO and SMO

Here are the areas of difference of both the SEO and SMO processes:

Channels of Process

SEO is the process of popularizing a website on major search engines like Google, Yahoo and Microsoft. Thus it follows the guidelines dictated by the search engines. SMO, as the name suggests, is all about making the website popular in the social circles or media sources of the internet: blogs, forums, online communities, bookmarking websites and so on.

Target Users

SEO targets users who have something particular in mind, because people using a search engine have an intention and wish to acquire more information about it. SMO, however, has a wider target audience, mostly consisting of savvy net users who don't mind looking through a good piece of content.

SEO Basics

1. Insert keywords within the title tag so that search engine robots will know what your page is about. The title tag is located right at the top of your document within the head tags. Inserting a keyword or key phrase will greatly improve your chances of bringing targeted traffic to your site. Make sure that the title tag contains text which a human can relate to. The text within the title tag is what shows up in a search result. Treat it like a headline.

2. Use the same keywords as anchor text to link to the page from different pages on your site. This is especially useful if your site contains many pages. The more keywords that link to a specific page, the better.

3. Make sure that the text within the title tag is also within the body of the page. It is unwise to have keywords in the title tag which are not contained within the body of the page.

Adding the exact same text for your h1 tag will tell the reader who clicks on your page from a search engine result that they have clicked on the correct link and have arrived at the page where they intended to visit. Robots like this too because now there is a relation between the title of your page and the headline.

Also, sprinkle your keywords throughout your article. The most important keywords can be bolded or colored in red. A good place to do this is once or twice in the body at the top of your article and in the sub-headings.

4. Do not use the exact same title tag on every page on your website. Search engine robots might determine that all your pages are the same if all your title tags are the same. If this happens, your pages might not get indexed.

I always use the headline of my pages as the title tag to help the robots know exactly what my page is about. A good place to insert the headline is within the h1 tag. So the headline is the same as the title tag text.

5. Do not spam the description or keyword meta tags by stuffing in meaningless keywords, and do not spend too much time on these tags. SEO pros all agree that they are not as important today as they once were. I just place my headline once within the keywords and description tags.

6. Do not link to link farms or other search-engine-unfriendly neighborhoods.

7. Do not use doorway pages. Doorway pages are designed for robots only, not humans. Search engines like to index human friendly pages which contain content which is relevant to the search.

8. Title attributes for text links. Insert a title attribute within the HTML of your text links to add weight to the link and the page where the link resides. This is like the alt attribute for images.

My site contains navigation menus on the left and right of the page. The menu consists of links not images. When you hover over the link with your mouse, the title of the link appears. View the source of this page to see how to add this tag to your links.

9. Describe your images with the alt attribute. This will help search engines that index images to find your pages, and will also help readers who use text-only web browsers.

10. Submit to the search engines yourself. Do not use a submission service or submission software; doing so could get your site penalized or even banned. Submit only once. There is no need to submit every two weeks, and no need to submit more than one page. Robots follow links: if your site has a nice link trail, your entire site will get indexed. My site has a nice, human-friendly link trail which robots follow easily. All my pages get indexed without ever submitting more than the main index page, once.
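A minimal page sketch tying points 1 through 4, 8 and 9 together; the keyword, file names and text are invented for illustration:

```html
<html>
<head>
  <!-- Point 1: keywords in the title tag, written as a human-readable headline -->
  <title>Running Shoes Buying Guide</title>
</head>
<body>
  <!-- Points 3 and 4: the h1 headline repeats the title tag text -->
  <h1>Running Shoes Buying Guide</h1>

  <!-- Point 9: describe images with the alt attribute -->
  <img src="/images/running-shoes.jpg" alt="Pair of running shoes" />

  <!-- Points 2 and 8: on other pages of the site, keyword anchor text
       with a title attribute links back to this page -->
  <p>Read our <a href="/running-shoes-guide.html"
        title="Running shoes buying guide">running shoes</a> guide.</p>
</body>
</html>
```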

What is robots.txt optimization?

Some of you may ask: what is it, and why do we need it? In a nutshell, as its name implies, the Robots exclusion protocol is used by webmasters and site owners to prevent search engine crawlers (or spiders) from indexing certain parts of their web sites. This could be for a number of reasons: sensitive corporate information, semi-confidential data, information that needs to stay private, or to prevent certain programs or scripts from being indexed.

A search engine crawler or spider is a Web “robot” and will normally follow the robots.txt file (Robots exclusion protocol) if it is present in the root directory of a Website. The robots.txt exclusion protocol was developed at the end of 1993 and still today remains the Internet’s standard for controlling how search engine spiders access a particular website.

While the robots.txt file can be used to prevent access to certain parts of a web site, if not correctly implemented it can also prevent access to the whole site! On more than one occasion, I have found the robots exclusion protocol (the robots.txt file) to be the main culprit of why a site wasn't listed in certain search engines. If it isn't written correctly, it can cause all kinds of problems and, worst of all, you will probably never find out about it just by looking at your actual HTML code.


As the name implies, the “Disallow” command in a robots.txt file instructs the search engine’s robots to "disallow reading", but that certainly does not mean "disallow indexing". In other words, a disallowed resource may be listed in a search engine’s index, even if the search engine follows the protocol. On the other hand, an allowed resource, such as many of the public (HTML) files of a website can be prevented from being indexed if the Robots.txt file isn’t carefully written for the search engines to understand.

The most obvious demonstration of this is the Google search engine. Google can add files to its index without reading them, merely by considering links to those files. In theory, Google can build an index of an entire Web site without ever visiting that site or ever retrieving its robots.txt file.

In so doing, it is not violating the robots.txt protocol, because it’s not reading any disallowed resources, it is simply reading other web sites' links to those resources, which Google constantly uses for its page rank algorithm, among other things.

Contrary to popular belief, a website does not necessarily need to be “read” by a robot in order to be indexed. As to how the robots.txt file can then be used to keep a search engine from listing a particular resource in its index: in practice, most search engines have placed their own interpretation on the robots.txt file, one which allows it to be used to prevent them from adding disallowed resources or files to their index.

Most modern search engines today interpret a resource being disallowed by the robots.txt file as meaning they should not add it to their index. Conversely, if it’s already in their index, placed there by previous crawling activity, they would normally remove it. This last point is important, and an example will illustrate that critical subject.

The inadequacies and limitations of the robots exclusion protocol are indicative of what sometimes could be a bigger problem. It is impossible to prevent any directly accessible resource on a site from being linked to by external sites, be they partner sites, affiliates, websites linked to competitors or, search engines.
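A minimal robots.txt illustrating the protocol discussed above; the paths are placeholders. Note, per the discussion, that Disallow only stops compliant crawlers from reading these paths, and a stray "Disallow: /" (a lone slash) would shut crawlers out of the entire site:

```
# Applies to all compliant crawlers
User-agent: *
# Keep crawlers from reading these areas (this alone does not guarantee de-indexing)
Disallow: /private/
Disallow: /cgi-bin/
```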

Wednesday, December 17, 2008

How to make your website load fast

1. Use a simple design: Always go for a simple website design. It is easy to maintain and quick to load on a visitor’s PC, while a complicated design will increase loading time. Use the minimum number of lines of code to develop the webpage.

2. Maximize the use of HTML and CSS: HTML (Hypertext Markup Language) is one of the most popular languages used in website design. Use HTML wherever possible to design your web pages. HTML coupled with CSS (Cascading Style Sheets) based coding enables your website not only to load and navigate fast, but also to be modified easily at a later stage if need be.

3. Use light images: A light webpage will load faster. Use only a limited number of images, whether they contain text or pictures. This allows the webpage to load faster and even helps in SEO (Search Engine Optimization) of the website, as it is easier for search engines to comprehend text on a webpage than an image. Use the various image compression tools available online to reduce the image size in kilobytes and make it light. Generally a GIF image will be lighter than a JPEG one, but there is a trade-off in image quality in such a case.

4. Flash: Use of Flash in web pages may not be a very good idea. It does not fare well if you are trying to optimize your website for search engines like Yahoo, Google, MSN, etc., and in addition it increases the loading time of the webpage. So the better option is to avoid an all-Flash website design; if Flash is still necessary, use it within an HTML site.

5. Use tables: The loading time of tables is low because they are just HTML. Tables can be used everywhere, be it your home page, menu or somewhere else, and they let you code a properly aligned design. They definitely score over frames; all kinds of frames should be avoided on a website.

6. Increase the content section: Content provides information about your services and your products to visitors. Used wisely, content can make your webpage load faster. Imagine replacing a heavy image on your webpage, one which is there just to beautify the page, with some meaningful content. This also allows your website to perform better in search engine results.

7. Text links: On some sites you will find that buttons don’t fit properly on the webpage. This can be due to the size, shape or color of the buttons. It is always desirable to use text links in place of graphical buttons. Another benefit of text links is that you can use them within the text content area of a webpage.

8. JavaScript: The main disadvantage of using JavaScript is that it is not supported by some browsers. It is also not search engine friendly and increases the loading time of the page, so use it as little as possible. On the other hand, JavaScript is a very powerful tool in web programming, and AJAX, which builds on JavaScript, is being used widely in Web 2.0 projects around the world.

9. Check load time: Periodically check the load time of your website. There are many free sites on the internet where you can check your website’s loading time. This lets you see whether a change to the website has increased or decreased its loading time. These webpage speed tests also provide valuable suggestions to improve your website’s performance.
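As a rough sketch of why page weight matters, transfer time can be estimated from page size and connection speed; the function and figures here are illustrative only, ignoring latency, rendering and connection overhead:

```python
def estimated_load_seconds(page_bytes, link_kbps):
    """Rough transfer time for a page of the given size over a link of the
    given speed, ignoring latency, rendering and connection overhead."""
    bits = page_bytes * 8
    return bits / (link_kbps * 1000)

# A 50 KB page: about 7.3 s on a 56 kbps modem, about 0.4 s on 1 Mbps broadband
dialup = estimated_load_seconds(50 * 1024, 56)
broadband = estimated_load_seconds(50 * 1024, 1000)
```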


Monday, December 15, 2008

Page rank, an introduction

Google uses the ‘page rank algorithm’ amongst many other factors to determine which web pages are displayed at the top of search results. Expressed in a range from 0 to 10, the algorithm essentially ‘confers’ ranking from one page to another. Thus, for example, if a page rank 6 page links to you and there are no other external links on that page your page potentially gets a page rank boost of up to 4.8 (only a maximum of around 0.8 of the page’s own rank can be conferred). The reality is that the conferred ranking is distributed across ALL links on that page so a page rank 5 page with 5 links on it confers at most 0.8 page rank to each of its linked-to pages.
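The distribution described above can be sketched as a tiny function using the post's simplified model, with 0.8 as the maximum fraction of a page's own rank that can be conferred:

```python
def conferred_rank(page_rank, outbound_links, damping=0.8):
    """Rank conferred to each linked-to page in the simplified model above:
    at most ~0.8 of the linking page's own rank, split across all its links."""
    return damping * page_rank / outbound_links

pr6_single_link = conferred_rank(6, 1)  # a PR6 page with one link confers up to 4.8
pr5_five_links = conferred_rank(5, 5)   # a PR5 page with five links confers 0.8 each
```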

In practice, page rank is something webmasters tend to obsess about, frankly because it is one of the few visible measures of ranking provided by Google, other than of course the presence of a site at the top of the search engine results pages (SERPs). Page rank is, in fact, a measure of . . . page rank. Many webmasters rigidly refuse to trade links with sites with lower page rank than their own, regardless of relevance. This is a mistake.

There are a good many page rank 0 websites that appear at the top of the results for relatively competitive terms. Equally some high page rank sites don’t appear for their target keywords. All of this indicates that, as you may expect, page rank is but one of many ranking factors in the Google algorithm.

The Google algorithm has developed over time and is far more complicated than it used to be. For example trusted sites will have higher search rankings and Google is known to have had teams hand reviewing high traffic and ‘major brand’ sites. The ranking that really matters is overall ranking in the SERPs. Over the last few years many methods have developed to improve a page’s rank. Today, Search Engine Optimisation (SEO) is the practice of promoting a site’s overall search ranking by using a combination of various legitimate techniques.

Friday, December 12, 2008

SEO (Search Engine Optimisation)

Search engine optimization is the process of making one's website content more search engine friendly to attract traffic by ranking higher. Search Engine Factors is the most experienced hand in the field of search engine optimization. Search Engine Factors is a specialized search engine optimization, marketing, promotion and ranking firm. We set very high but realistic goals and do not believe in misleading our clients. Search engine optimization, also known as ranking, is a gradual process, as we all know, and at the end of the agreed time frame we guarantee top ranking results.

We assure you that at Search Engine Factors we use completely tested and perfectly legal methods to optimize your website. We have people who are highly qualified in understanding complex search engine algorithms; our team of consultants and marketing specialists is well versed in the algorithms of every major search engine.


Supplemental Pages

These are pages which are indexed in Google but do not exist at this time. When searching for a particular thing, they are still shown in the search result pages, where they provide additional information about the particular search.

Wednesday, December 10, 2008

Video Search Engine Optimization

Video search engine optimization is one of the newest parts of SEO. Nowadays video creation is increasing tremendously. Why is that? No one can say that merely writing content guarantees users will read every article or news item, so many SEO people are turning towards video creation and optimization. For example, a normal user goes [...]

Keyword Research is the king of your website

When you want to publish your website in a competitive market, on what basis will your website become famous in it? First decide which subject your website is based on, then try to choose the correct keywords for the website, because everyone knows the keyword is the king of a website for getting more hits in terms of [...]

Monday, December 8, 2008

AltaVista

AltaVista:
Used to be the #1 search engine until Google came along.

Link Popularity history:

To gain a better understanding of link popularity it is useful to know why it became so crucial for search engine rankings. In the past a web page's ranking was determined, amongst other factors, by the number of keyword occurrences within 'on-page' elements, i.e. in page text, META tags and the title tag. When web developers learned that they could trick a search engine into returning their web pages by cramming keywords into their pages, the search engines had to get a bit smarter. Since 'on-page' elements were being used to determine relevance, it was only natural that the engines would look to elements outside the direct control of the web page creator, i.e. 'off-page' elements. Search engines made the assumption that the greater the number of links from other sites pointing to a web site, the more popular the web site is, and therefore the more likely it is a quality resource. This worked nicely in theory, but in practice it too was abused.

Web site owners figured out many ways to get links pointing to their web sites, one example of which was the use of link farms: pages that contained nothing more than a collection of links. Quantity of links was being abused, so the search engines took the old saying "quality, not quantity" to heart and began to assign a quality factor to each of the links pointing to a web site. Now web sites that had a higher number of high-quality links were looked upon favourably by the search engines. Building link popularity became a science in itself, and today it is still the most time-consuming and frustrating activity for a search engine optimizer.

Friday, November 28, 2008

Negative Keyword

Negative Keyword is a term referenced by Google AdWords and is a form of keyword matching. This means that an advertiser can specify search terms that they do not want their ad to be associated with.

For example, if you add the negative keyword "-nike" to the keyword "running shoes", the ad will not be displayed when a person searches for the term "nike running shoes".

Negative keyword matching helps ensure that only qualified traffic clicks on your advertising.
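The matching logic can be sketched in a few lines of Python. This is a simplified model for illustration only; real AdWords matching is considerably more involved:

```python
def ad_eligible(query: str, negatives: list[str]) -> bool:
    """An ad is shown only if no negative keyword appears in the query."""
    terms = query.lower().split()
    return not any(neg.lower() in terms for neg in negatives)

# The "running shoes" ad with the negative keyword "nike":
print(ad_eligible("nike running shoes", ["nike"]))   # False: the ad is suppressed
print(ad_eligible("cheap running shoes", ["nike"]))  # True: the ad may show
```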


Tuesday, November 25, 2008

Inbound link

Also known as back link, backward link, or backlinks, inbound links are all of the links on other websites that direct the users who click on them to your site. Inbound links can significantly improve your site’s search rankings, particularly if they contain Anchor Text keywords relevant to your site and are located on sites with high Page Rank.

Outbound Link

A link that points away from your website.

Monday, November 24, 2008

Hidden Text

This is text that cannot be seen in a normal browser. Some websites use hidden text to trick search engine spiders. Hidden text can contain high-density keywords that make no sense to a normal reader but can make search engines believe a site is about a certain subject. Hidden text can be implemented in many ways: beneath images, in white on a white background, within certain code tags, or just in a very, very small font. It is hard for search engine spiders to detect this kind of fraud, but they are becoming increasingly better at it.
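As a rough illustration, one naive check a spider might perform is comparing inline font and background colors. This is a sketch only; real detection would also have to handle external CSS, off-screen positioning, and tiny fonts:

```python
import re

def same_color_text(html: str) -> bool:
    """Flag inline styles where the font color equals the background color,
    e.g. white text on a white background (a naive hidden-text check)."""
    pairs = re.findall(
        r'color:\s*(#?\w+)\s*;\s*background(?:-color)?:\s*(#?\w+)', html, re.I)
    return any(fg.lower() == bg.lower() for fg, bg in pairs)

print(same_color_text('<p style="color: white; background: white">spam</p>'))   # True
print(same_color_text('<p style="color: black; background: white">normal</p>')) # False
```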

Wednesday, November 19, 2008

paid placement

Paying a search engine to have your listing show up prominently. These listings are usually denoted as "sponsored listings."


Tuesday, November 18, 2008

Back link

A back link is a link from an external site to yours. Getting many back links is essential for search engine optimization, as it causes PageRank to increase. Most often a back link is acquired by exchanging links with other webmasters, in other words reciprocal linking. Be on the lookout for too many reciprocal links (link farms), as Google punishes websites for link farming.


Monday, November 17, 2008

Googlewashing

Googlewashing is when a website's content is illegally copied to someone else's website. It is also referred to as pagejacking. Usually copying content doesn't pay off, because of Google's duplicate content filter. However, a site with a high PageRank is trusted more than a newly started site, so when a new page is Googlewashed by a high-PR site, it can damage the original site's visitor numbers.

Sunday, November 16, 2008

Meta Keywords Tag

A meta tag holds information about the content of a page. The meta keywords tag isn't used much nowadays. Its initial purpose was to hold keywords that described a page's content, but as webmasters became more cunning and started abusing keywords, search engines stopped using the meta keywords tag for indexing purposes. Meta keywords tags are far from important for SEO, but having them never hurts (as long as you don't stuff them), because some older search engines still use them.

Saturday, November 15, 2008

Directory

A directory is a list of websites that have been categorized, usually by humans. For SEO it's very easy to get inbound links from directories. One can usually get a free inbound link by submitting a website. Some directories require reciprocal links, but one should be careful when linking back to websites: if you link to so-called bad neighborhoods, your site may be punished. Check that the directory has not been banned from Google (the PR bar in your Google toolbar is grey if a site has been banned).
Other directories require payment for inclusion. Consider the PR of the directory, the directory's relevancy, and your budget before paying them.

Several directories, like DMOZ, are trusted by Google, and getting listed in them is very important for your website's SEO.

Friday, November 14, 2008

CTR

CTR, or click-through rate, is the number of clicks an ad gets compared to the number of times it has been shown. A lot of research has been done on which ads generate the highest click-through rate. This is important, as most publisher networks still charge per 1000 exposures (times an ad is displayed).
One small fact the automotive branch uses: cars viewed from low angles, focused on the left or right front of the car, generate a higher CTR.
There are many other techniques that boost CTR, though, and it pays to do some research on this topic.

One way to do this when using Google AdWords or Yahoo Search Marketing is to run several campaigns at once and compare their CTRs. Keep the campaign with the highest CTR and copy it to a new campaign. Then write a new campaign text and let the three campaigns run for a while. After some time has passed, again keep the one with the highest CTR and begin the process again.

The good thing about this system is that you only put 33% of your advertising budget on the line to find new campaign strategies.
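The rotation above boils down to computing each campaign's CTR and keeping the winner. A minimal sketch (the campaign names and numbers are invented for illustration):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks divided by the times the ad was shown."""
    return clicks / impressions if impressions else 0.0

# Three hypothetical campaigns running side by side: (clicks, impressions)
campaigns = {"A": (45, 3000), "B": (60, 3000), "C": (30, 3000)}
winner = max(campaigns, key=lambda name: ctr(*campaigns[name]))
print(winner, ctr(*campaigns[winner]))  # B 0.02
```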

Cache

Search engines cache websites in their databases after spidering them. A cache is basically a copy of a website stored on the search engine's server.

Wednesday, November 12, 2008

5 Steps to Increase your Google Page Rank.

Google PageRank is based on back links: links pointing to your website from other websites. The more back links you have, the higher your PR will be.

1. Join forums. Forums are a great way to obtain links to your website. In most forums you are allowed to have a signature, and in your signature you can put a link to your website. Another important thing to check is that the forum is somewhat related to your website. You will still get credit if it's not, but if it is related to your website then you will be accomplishing two tasks at once:

You will be advertising your website (bringing in targeted traffic), and you will also be building your website's presence.

Your website's presence is very important to your survival. The more people see or hear about your website, the more credibility you will have, and this increases your chances of having visitors come back and possibly become leads.

2. Submit to search engine directories. Search engine directories are a good way to get a free link to your website. They also increase your chances of being listed higher on popular search engines like Google and Overture.

Most search engine directories allow you to submit your website for free. This will allow you to increase your web presence by being listed on another search engine, and it will also give you a free link.

Remember: the more links you have, the higher your PR will be.

3. Use ezine ads (or newsletters). Creating an ezine will probably be the most beneficial step you can take to increase your web presence. When you create an ezine you will be able to keep visitors coming back to your website by using signatures and giving special deals.

Ezines will also allow you to increase your back links. After creating an ezine you can submit information about it to an ezine directory. The directory will then link to your website (thus giving you a free link).

4. Create and publish articles. Articles are an easy source of new traffic. You can include your signature in your article, which will bring in more traffic from article submission directories.

Your signature usually consists of 4 to 8 lines. Usually the first line is the title of the website you are trying to advertise, the last line is the link to the website, and the lines in between are a sales pitch to draw viewers into your website.

5. Get links from related websites. Gaining links from related websites can be one of the most frustrating tasks you can attempt.

Robots.txt

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection). Putting up a robots.txt file is something like putting a note "Please do not enter" on an unlocked door: you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why, if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.

The concept and structure of robots.txt were developed more than a decade ago. If you are interested in learning more, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we deal only with the most important aspects of a robots.txt file. Next we will continue with the structure of a robots.txt file.
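Python's standard library can parse robots.txt rules, which makes it easy to check what a polite crawler would and would not fetch. The domain and rules below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, as they might appear at http://mydomain.com/robots.txt
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "http://mydomain.com/private/page.html"))  # False
print(parser.can_fetch("*", "http://mydomain.com/public/page.html"))   # True
```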

Monday, November 10, 2008

Meta element used in search engine optimization

Meta elements provide information about a given webpage, most often to help search engines categorize them correctly. They are inserted into the HTML document, but are often not directly visible to a user visiting the site.
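A short Python sketch of how a crawler might pull meta elements out of a page, using the standard-library HTML parser (the sample page is made up):

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collect name/content pairs from <meta> elements."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if "name" in d and "content" in d:
                self.meta[d["name"]] = d["content"]

extractor = MetaExtractor()
extractor.feed('<head><meta name="description" content="A page about SEO"></head>')
print(extractor.meta)  # {'description': 'A page about SEO'}
```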

They have been the focus of a field of marketing research known as search engine optimization (SEO), where different methods are explored to provide a user's site with a higher ranking on search engines. In the mid to late 1990s, search engines were reliant on meta data to correctly classify a webpage and webmasters quickly learned the commercial significance of having the right meta element, as it frequently led to a high ranking in the search engines — and thus, high traffic to the website.

As search engine traffic achieved greater significance in online marketing plans, consultants were brought in who were well versed in how search engines perceive a website. These consultants used a variety of techniques (legitimate and otherwise) to improve ranking for their clients.

Meta elements have significantly less effect on search engine results pages today than they did in the 1990s and their utility has decreased dramatically as search engine robots have become more sophisticated. This is due in part to the nearly infinite re-occurrence (keyword stuffing) of meta elements and/or to attempts by unscrupulous website placement consultants to manipulate (spamdexing) or otherwise circumvent search engine ranking algorithms.

While search engine optimization can improve search engine ranking, consumers of such services should be careful to employ only reputable providers. Given the extraordinary competition and technological craftsmanship required for top search engine placement, the implication of the term "search engine optimization" has deteriorated over the last decade. Where it once implied bringing a website to the top of a search engine's results page, for the average consumer it now implies a relationship with keyword spamming or optimizing a site's internal search engine for improved performance.

Major search engine robots are more likely to quantify such extant factors as the volume of incoming links from related websites, quantity and quality of content, technical precision of source code, spelling, functional v. broken hyperlinks, volume and consistency of searches and/or viewer traffic, time within website, page views, revisits, click-throughs, technical user-features, uniqueness, redundancy, relevance, advertising revenue yield, freshness, geography, language and other intrinsic characteristics.

Robots exclusion standard

The robot exclusion standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web spiders and other web robots from accessing all or part of a website which is otherwise publicly viewable. Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code. The standard complements Sitemaps, a robot inclusion standard for websites.

Sunday, November 9, 2008

One-way links: Definition

One-way links are links to your site from sites which do not receive a link from your site.

They send a powerful message to the search engines - that your website is so valuable or interesting or useful that other sites want to tell people about it.

One-way links are wonderful things to have because they increase your link popularity - the number of pages linking to your site. Search engines such as Google place huge importance on link popularity when ranking your site.

You can also receive direct traffic to your site from people who click on the links.

All links to your site are good, but where possible always aim for topic-related links.

Late in 2005, Matt Cutts of Google made it plain in his blog that Google frowns upon sites which "overdo" reciprocal links. Heed his warning and try to make as many of your links as possible one-way links.


Friday, November 7, 2008

BackLinks

Backlinks are incoming links to a website or web page. Backlinks enable you to keep track of other pages on the web that link to your website. The number of backlinks is an indication of the popularity or importance of that website or page. In basic link terminology, a backlink is any link received by a website (web page, directory, website, or top level domain) from another website. Backlinks are also called incoming links, inbound links, inlinks, and inward links.



Why are backlinks so important?

Search engine results. That is the short answer to this question. When search engines calculate the relevance of a site to a keyword, they consider the number of QUALITY inbound links (backlinks) to that site. The more backlinks to your website, the higher your search engine result position. Backlinks hold an important place in the field of search engine optimization: they are one of several criteria considered by search engines when indexing and driving traffic to your site. Link popularity and the number of backlinks are directly proportional to each other; the more backlinks, the better the popularity of a site. Backlinks are crucial for search engine optimization and ranking. Some search engines, especially Google, give a lot of importance to the number of backlinks a website has. Google considers a website with many quality backlinks more credible than others and regards it as more accurate and relevant in its results pages for search queries.

What are quality backlinks and relevant backlinks?

Please note that quality backlinks and relevant backlinks are two very important considerations for SERPs. A backlink from a site that has quality content is referred to as a quality backlink or relevant backlink. Relevant backlinks will hold you in much better stead than a backlink coming from a poorly designed or poorly optimized site, or from a site that has nothing to do with the theme of your site. Major search engines such as Google see incoming links from relevant sites and give them more weight than a backlink from an unrelated site. For example, if you have a website about web design products, your link strategy should target web design related sites.



Thursday, November 6, 2008

Common mistakes in SEO

1. Incorrectly Designed Websites:

  • Lack of proper navigation
  • Using frames to save web designers' design time
  • Large images cause pages to take more time to download. If it is necessary to use large images, consider using thumbnails and opening the full image in a separate page. (This also creates more pages and more text, which helps the spiders to crawl.)
  • Using high-resolution graphics (try to use low-resolution graphics)
2. Poorly Written Content:

Content absolutely must have targeted keywords and phrases. If content is written properly, you can work in more targeted keywords and appropriate phrases.

Absence of targeted keywords and phrases can break your site. If you have not used related keywords in your body text, your site will not appear in the listings when users type keywords related to your site.

Make sure the keywords placed in the meta keywords tag are logical for your content.

People visiting your site as a result of a search will leave as soon as they see the home page if it is irrelevant or doesn't match the keyword or phrase they were searching for. Use tools such as Wordtracker to find what people are actually typing into the search engines to find goods and services similar to yours, and concentrate on ranking well for those terms.

3. Replica of Content:

If you use more than one page with a different name but the same content, search engines will consider this a trick, and it will affect your ranking. Never try to copy content from other websites.

Wednesday, November 5, 2008

Link popularity: why it is important

Link popularity is the total number of web sites that link to your site.

Why does it matter? Because good link popularity can dramatically increase traffic to your web site. Well-placed links are an excellent source of consistent and targeted traffic, and due to recent developments, they can even generate additional search engine traffic to your site.

Most of the major search engines now factor Link Popularity into their relevancy algorithms. As a result, increasing the number of quality, relevant sites which link to your site can actually improve your search engine rankings. There is still no one "secret trick" to getting good rankings, but boosting your site's popularity may give it the edge it needs.

Knowing who links to your site, and increasing the number of quality links, is an important part of any web site promotion effort. Free services exist that query Google, Yahoo, and MSN and report on link popularity.



Sunday, October 26, 2008

What does “organic search results” mean?

It means the same as “natural search results”; this is the opposite of paid listings.

FAQs in SEO

Explain the importance of domain name

Ans:
Although the age of a domain matters, search engines do not favor older, established sites whose content is not regularly updated. It is better to keep content separated and to use a different domain (preferably with the keywords in it) for different niches.

From an SEO point of view, which is better: one big site or several smaller sites?

Ans:
If you have enough content in the same niche, it is better to have one big site: first, the site is easier to maintain, and second, the greater number of pages is good for ranking high in search results. Many small sites, on the other hand, allow you to focus on specific niches and therefore compete for different keywords.

Saturday, October 25, 2008

White Hat Search Engine Optimization

There are ethical and unethical practices in search engine optimization; the ethical practices are called white hat SEO, the opposite of black hat SEO. White hat SEO includes activities such as arranging content in a clear hierarchy so that important words and phrases are used as required.

Getting page views at any cost to your website is not a good tactic. It never works in the long run, even if it may work for a short while. Beware of any SEO company that recommends you employ any of these techniques.

Black Hat SEO

Black hat SEO includes things like keyword stuffing. The terminology "black hat" comes from the old cowboy western movies, where all the bad guys wore black hats and all the good guys wore white hats. Cloaking is another black hat SEO technique. Google will penalize sites that flagrantly buy and sell links; those who do it anyway do it covertly, under the radar. Buying and selling links without the nofollow tag is now officially a black hat SEO practice that is against Google's Webmaster Guidelines.

Black hat SEO techniques usually have the following characteristics:

  • Breaking search engine rules and regulations
  • Unethically presenting content in a different visual or non-visual way to search engine spiders than to search engine users

Black hat SEO practices may provide short-term gains in rankings, but if we are discovered using these spammy techniques on a web site, we run the risk of being penalized by search engines.

Black Hat SEO Techniques To Avoid
  • Keyword stuffing
  • Invisible text
  • Doorway Pages

A website will be penalized by search engines if its web pages are stuffed with long lists of keywords and no other content. Web masters try to put lists of keywords in black text on a black background, or white text on a white background, in order to attract more search engine spiders. When searchers view such a page these keywords are not visible to them, but search engines read the keywords; this is considered a black hat SEO technique.
A doorway page is a fake page which a user will never see. It is created purely for search engine spiders, and web masters attempt to get these pages indexed higher.
Black hat SEO techniques work only temporarily. It is better to stay away from anything that even looks like black hat SEO.

Example: BMW, the car maker from Germany, was kicked out of the Google index for spamming.

The reason for the ban of the BMW website is likely that they were caught employing a technique used by black-hat search engine optimizers: doorway pages. (A doorway page is stuffed full of keywords that the web site feels a need to be optimized for; however, as opposed to real pages, the doorway is only displayed to the Googlebot. Human visitors are immediately redirected to another page upon visiting.) And that's exactly what happened at BMW.de.
BMW removed the pages almost immediately after the news broke, but it was too late. The German BMW web site then suffered the "Google death penalty": a ban from almost any imaginable top search result, and a degrading of its PageRank to the lowest possible value.

The penalty on the BMW website is a good example of what can happen to sites going against the Google webmaster guidelines, no matter how big or important one might deem the web site.
So always remember: "If an SEO company creates deceptive or misleading content on your behalf, such as doorway pages or 'throwaway' domains, your web site could be removed entirely from Google's index."

Google’s guidelines say webmasters should optimize for humans, not machines, because Google doesn’t like to be cheated. Many ethical SEOs provide useful services for website owners, from writing copy to giving advice on site architecture and helping to find relevant directories to which a web site can be submitted. But a few unethical SEOs have given the industry a black eye through their unfair practices like manipulating search engine results and doing aggressive marketing.

Friday, October 24, 2008

Link Building

There are three types of links which will increase link popularity for a website.

Internal links, incoming links, and outgoing links. Link popularity is defined as the number of links pointing to and from related websites, and it is an extremely important method for improving a site's relevancy in search engines.

Internal links :
The number of links to and from pages within a site is called internal link popularity. When we cross-link important related pages, search engine spiders find and index important pages quicker, even when some pages are buried deep within the site.
Incoming links :
Incoming links are of two types:
(Links from sites we control)
(Links from sites we don't control)
Links pointing to a website from other related sites are called incoming link popularity. To find the link popularity of a website, or to see which sites are linking to our website or a competitor's website, go to the Google search box and enter "link:" followed by the domain name without "www".

Outgoing links :
Outgoing links refers to links pointing to other related sites from your site. Search engine spiders will crawl your site's outgoing links and determine whether the content of the sites you link to is related to the content of your own site. How much importance outgoing links add to a site's link popularity rating is still being debated by search engine optimization specialists.

Site maps :
Site maps provide links that lead to most or all pages of a website. Site maps are hierarchically organized; they are visual models of a web site's content that allow users to find a specific web page. If a website has many pages, it is recommended to have a site map; using the site map, search engine spiders will crawl the links and index the entire website.
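A machine-readable sitemap in the standard XML format (the one Google Sitemaps consumes) can be generated with Python's standard library. The URLs here are placeholders:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical page list; a real sitemap would enumerate the site's actual pages.
urls = ["http://mydomain.com/", "http://mydomain.com/about.html"]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in urls:
    loc = SubElement(SubElement(urlset, "url"), "loc")
    loc.text = u

print(tostring(urlset, encoding="unicode"))
```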

Keywords Terminology

1.Keyword Density :
Keyword density refers to the ratio or percentage of keywords contained within the total number of indexable words on a web page. It is important for your main keywords to have the correct keyword density to rank well in search engines. The keyword density ratio varies from search engine to search engine; the recommended keyword density is 2 to 8 percent. Keyword quality matters more than keyword quantity: keywords in the page title, the headings, and the first paragraphs count for more. The reason is that the URL (and especially the domain name), file names and directory names, the web page title, and the headings for separate sections are more important than ordinary text on the web page. We may have the same keyword density as a competitor's web site, but if we have keywords in the URL, this will boost our ranking considerably, especially with Yahoo search.
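The ratio is straightforward to compute. A simple sketch, with the caveat that it counts whole words only and ignores HTML markup:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` as a percentage of all words in the text."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

sample = "seo tips and seo tools for better seo"
print(keyword_density(sample, "seo"))  # 37.5 -- far above the recommended 2-8%
```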

2.Keyword Frequency:
Keyword frequency means the number of times a keyword or keyword phrase appears within a web page. The more times a keyword or keyword phrase appears within a web page, the more relevance search engines are likely to give the page for a search with those keywords or that key phrase.

3.Keyword Prominence :
Keyword prominence refers to how prominent keywords are within a web page. It is recommended to place important keywords at or near the start of a web page, at the start of a sentence, or in the TITLE or META tags.

4.Keyword Proximity:
Keyword proximity refers to the closeness between two or more keywords. It is better to have keywords placed close together in a sentence. Keyword proximity examples:
Example 1: How Keyword Density Affects Search Engine Rankings.
Example 2: How Keyword Density Affects Rankings In Search Engines.
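Proximity can be measured as the number of words separating two keywords; a minimal sketch (it assumes both keywords actually appear in the text):

```python
def proximity(text: str, kw1: str, kw2: str) -> int:
    """Smallest number of words separating kw1 and kw2 (0 means adjacent)."""
    words = text.lower().split()
    pos1 = [i for i, w in enumerate(words) if w == kw1]
    pos2 = [i for i, w in enumerate(words) if w == kw2]
    return min(abs(i - j) for i in pos1 for j in pos2) - 1

# "search" and "engine" are adjacent in the first phrase,
# while "affects" and "engine" are separated by three words in the second.
print(proximity("how keyword density affects search engine rankings", "search", "engine"))     # 0
print(proximity("how keyword density affects rankings in search engine", "affects", "engine")) # 3
```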

Levels of keywords

There are three levels of keywords:

1. Low level keywords :
If the keyword competition (in Google) is a number of at most 4 digits, it is considered a low level keyword. For a low-level keyword we can optimize a website within 3 months.

2. Medium level keywords :
If the keyword competition (in Google) is a number of at most 7 digits, it is considered a medium level keyword. For medium-level keywords we can optimize a site within 6 to 7 months.

3. High level keywords :
If the keyword competition is a number of 8 or more digits, it is considered a high level keyword. Optimizing for a high level keyword can take a year.
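The digit-count rule of thumb above can be expressed directly (the thresholds simply restate the levels given here):

```python
def keyword_level(competition: int) -> str:
    """Classify a keyword by the number of digits in its competition count."""
    digits = len(str(competition))
    if digits <= 4:
        return "low"     # optimizable in about 3 months
    if digits <= 7:
        return "medium"  # about 6 to 7 months
    return "high"        # about a year

print(keyword_level(8500))       # low (4 digits)
print(keyword_level(2500000))    # medium (7 digits)
print(keyword_level(120000000))  # high (9 digits)
```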

Common Mistakes in SEO

1. Incorrectly Designed Websites :

Lack of proper navigation
Using frames to save web designers' design time
Large images cause pages to take more time to download. If it is necessary to use large images, consider using thumbnails and opening the full image in a separate page. (This also creates more pages and more text, which helps the spiders to crawl.)
Using high-resolution graphics (try to use low-resolution graphics)

2. Poorly Written Content :
Content absolutely must have targeted keywords and phrases. If content is written properly, you can work in more targeted keywords and appropriate phrases. Absence of targeted keywords and phrases can break your site: if you have not used related keywords in your body text, your site will not appear in the listings when users type keywords related to your site. Make sure the keywords placed in the meta keywords tag are logical for your content. People visiting your site as a result of a search will leave as soon as they see the home page if it is irrelevant or doesn't match the keyword or phrase they were searching for. Use tools such as Wordtracker to find what people are actually typing into the search engines to find goods and services similar to yours, and concentrate on ranking well for those terms.


3. Replica of Content :
If you use more than one page with a different name but the same content, search engines will consider this a trick, and it will affect your ranking. Never try to copy content from other websites.


4. Improper use of Meta Tags, or a site without Meta Tags :
Meta tags are used to include keywords and a description of the page. These meta tags help search engines search quickly, and they help websites increase their ranking in search engines. Meta tags have to be included on all pages of the website. Improper meta tags, or the absence of meta tags, can misguide search engines and lead to improper listing of websites; the absence of a page title will harm the ranking of the website.


5. Absence of a Sitemap for the Website :
Sitemaps assist web crawlers in indexing websites more quickly and efficiently. A sitemap provides the structure of the entire website in one page, which is very useful for search engine optimization. Generally, site maps lead search engines directly to pages instead of making them follow links. Google Sitemaps is an easy way to tell Google about all the pages on your site, which pages are most important to you, and when those pages change, for a smarter crawl and fresher search results.


6. Page Cloaking :
In this method web masters deceive search engines by substituting the real page content for the declared description. Normally, spidering robots recognized by their IP addresses or host names are redirected to a page that is specially polished to meet search engines' requirements but is unreadable to a human being. In order to detect cloakers, spiders often come from fake IP addresses and under fictitious names. Also, user feedback is collected to check the relevancy of content against the description, and pages are reviewed by search engine staff; if any difference is found, the sites are penalized.

SEO Methodology

When a website is specifically designed so that it is friendly to the tools that search engines use to analyze websites (called spiders), that process is called Search Engine Optimization.
SEO Methodology :
Offline/off page Optimization
Online/on page Optimization
Position Monitoring

Onpage Optimization :
Pre Optimization Report
Keyword Research
Competitor Website Analysis
Rewriting Robot-Friendly Text
H1/H2 Tags Optimization
Title Tag Optimization
Meta Tag Optimization
Keyword Optimization
Alt Tag Optimization
Website Structure Optimization
Body Text and Content Optimization
Sitemap for Link Optimization
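Several of the on-page items above can be sketched in a single page skeleton; the store name, file names, and text below are invented examples:

```html
<html>
<head>
  <!-- Title tag optimization: keywords near the front -->
  <title>Handmade Leather Bags - Italian Craftsmanship | Example Store</title>
  <!-- Meta tag optimization -->
  <meta name="description" content="Handmade leather bags crafted in Italy from full-grain leather.">
</head>
<body>
  <!-- H1/H2 tags optimization: headings reinforce the page's keywords -->
  <h1>Handmade Leather Bags</h1>
  <h2>Why Full-Grain Leather Lasts</h2>
  <!-- Alt tag optimization: descriptive file name and alt text -->
  <img src="brown-leather-bag.jpg" alt="Brown handmade leather bag">
  <!-- Body text optimization: content repeats the keywords naturally -->
  <p>Every handmade leather bag in our collection is stitched by hand...</p>
</body>
</html>
```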

Offpage Optimization :
Hosting of Google Sitemap
Website submission to all leading search engines having a global database
Submission to country-specific search engines having country-related databases
Submission to general directories
Submission to product-specific directories
Submission to country-specific directories
Trade lead posting

Position Monitoring :
Monitoring website ranking for different keywords
Renewing expired trade leads and posting new trade leads
Constant research into updated technology for better positioning
Research on currently popular directories and site submission
Changing methodology with changes in search engine algorithms

Architecture of search engines:

Spider - a browser-like program that downloads web pages.
Crawler - a program that automatically follows all of the links on each web page.
Indexer - a program that analyzes web pages downloaded by the spider and the crawler.
Database - storage for downloaded and processed pages.
Results engine - extracts search results from the database.
Web server - a server that is responsible for interaction between the user and other search engine components.

How do search engines work

All search engines consist of three main parts:

The Spider (or worm)
The Index
The Search Algorithm.
The spider (or worm) continuously 'crawls' web space, following links that lead either to a different website or stay within the limits of one website. The spider 'reads' all page content and passes the data to the index.
The index is the next stage of the search engine after crawling. It is a storage area for spidered web pages and is of a huge magnitude; Google's index, for example, is said to consist of more than three billion pages.
The search algorithm is the third and most sophisticated part of a search engine system. It is a very complicated mechanism that sorts an immense database within a few seconds and produces the results list. The more relevant the search engine judges a web page, the nearer it appears to the top of the list, so site owners and webmasters should look to their site's relevancy to its keywords. The algorithm is unique to each search engine, and is a trade secret kept hidden from the public.
Most modern web search services combine the two systems (crawler-based engines and human-edited directories) to produce their results.
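The three parts can be sketched in miniature; the URLs and page text are invented, and the crawl is faked with a dictionary standing in for downloaded pages:

```python
from collections import defaultdict

# Stand-in for spidered pages: URL -> page text
crawled_pages = {
    "http://example.com/seo": "search engine optimization tips",
    "http://example.com/ppc": "pay per click advertising tips",
}

def build_index(pages):
    """Indexer: map each word to the set of pages containing it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

def search(index, query):
    """Search step: return pages containing every query word."""
    results = None
    for word in query.lower().split():
        pages = index.get(word, set())
        results = pages if results is None else results & pages
    return sorted(results or [])

index = build_index(crawled_pages)
print(search(index, "optimization tips"))
```

A real search algorithm of course also ranks the matching pages by relevancy, which is the trade-secret part; this sketch only shows the crawl-index-lookup pipeline.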

History Of Search Engines

Running a company by having a website does not guarantee success. Following the tips given on this website will surely help your website stand high in directories and search engine results, and therefore increase traffic and the number of potential clients.

Before the HTTP protocol was invented, the internet was just a huge network of FTP servers, used as a means of file exchange, and the first search engines ran over FTP and similar protocols before websites existed. The internet acquired its actual shape only after Tim Berners-Lee created the HTTP protocol and we got the World Wide Web.


Search engines automatically 'crawl' web pages by following hyperlinks and store copies of them in an index, so that they can generate a list of resources according to users' requests. Directories are compiled by category by humans, who are site owners or directory editors.

The top search engines listed below cover about 90 percent of all online searches performed on the Internet. Those search engines are:
www.Google.com
www.Yahoo.com
MSN Search
AOL Search
www.AltaVista.com
www.Lycos.com
www.Netscape.com
www.HotBot.com
Ask Jeeves/Teoma
www.AllTheWeb.com
www.Wisenut.com
www.iWon.com