Sunday, October 26, 2008

What does “organic search results” mean

It is the same as “natural search results”: the unpaid results, as opposed to paid listings.

FAQs in SEO

Explain the importance of domain name

Ans:
Although the age of a domain matters and search engines tend to favor older, established sites, they do not favor sites with no recently updated content. It is better to keep the content separated and use a different domain (preferably with the keywords in it) for each niche.

From seo point of view, which is better - one big site or several smaller site?

Ans:
If you have enough content in the same niche, it is better to have it in one big site: such a site is easier to maintain, and a great number of pages is good for ranking high in search results. Several smaller sites, on the other hand, let each site focus on a specific niche and therefore compete for different keywords.

Saturday, October 25, 2008

White Hat Search Engine Optimization

There are ethical and unethical practices in search engine optimization. The ethical practices are called white hat SEO; the unethical ones are called black hat SEO. White hat SEO includes activities such as arranging content in a clear hierarchy so that important words and phrases are used where required.

Getting page views for your website at any cost is not a good tactic: it may work for a short while, but it never works in the long run. Beware of any SEO company that recommends you employ such techniques.

Black Hat SEO

Black Hat SEO includes things like keyword stuffing. The term “black hat” comes from the old cowboy western movies, in which all the bad guys wore black hats and all the good guys wore white hats. Cloaking is another black hat SEO technique. Google will penalize sites that flagrantly buy and sell links, so those who buy or sell links do it covertly, under the radar. Buying and selling links without the nofollow attribute is now officially a black hat SEO practice that is against Google’s Webmaster Guidelines.

Black hat SEO techniques usually have the following characteristics:

  • Breaking search engine rules and regulations
  • Unethically presenting content in a different visual or non-visual way to search engine spiders than to search engine users

Black hat SEO practices may provide short-term gains in rankings, but if a website is discovered using these spammy techniques, it runs the risk of being penalized by search engines.

Black Hat SEO Techniques To Avoid
  • Keyword stuffing
  • Invisible text
  • Doorway Pages

A website will be penalized by search engines if its pages are stuffed with long lists of keywords and little other content. Webmasters sometimes put a list of keywords in black text on a black background, or white text on a white background, to attract search engine spiders. Searchers never see these keywords, but search engines read them, which is why this is considered a black hat SEO technique.
A doorway page is a fake page that a user will never see. It is created purely for search engine spiders, in an attempt to get such pages indexed and ranked higher.
Black hat SEO techniques work only temporarily. It is better to stay away from anything that even looks like black hat SEO.

Example: the German car maker BMW was kicked out of the Google index for spamming.

The likely reason for the ban is that BMW was caught employing a technique used by black-hat search engine optimizers: doorway pages. (A doorway page is stuffed full of the keywords the website wants to be optimized for; however, unlike real pages, the doorway is only displayed to the Googlebot, while human visitors are immediately redirected to another page.) And that is exactly what happened at BMW.de.
BMW removed the pages almost immediately after the news broke, but it was too late. The German BMW website then suffered the “Google death penalty”: a ban from almost any imaginable top search result, and a degrading of its PageRank to the lowest possible value.

The penalty on the BMW website is a good example of what can happen to sites that go against the Google webmaster guidelines, no matter how big or important one might deem the site.
So always remember: “If an SEO company creates deceptive or misleading content on your behalf, such as doorway pages or ‘throwaway’ domains, your web site could be removed entirely from Google’s index.”

Google’s guidelines say webmasters should optimize for humans, not machines, because Google does not like to be cheated. Many ethical SEOs provide useful services for website owners, from writing copy to giving advice on site architecture and helping to find relevant directories to which a site can be submitted. But a few unethical SEOs have given the industry a black eye through unfair practices such as manipulating search engine results and overly aggressive marketing.

Friday, October 24, 2008

Link Building

There are three types of links that increase link popularity for a website: internal links, incoming links, and outgoing links.

Link popularity is the number of links pointing to and from related websites, and it is an extremely important factor in improving a site’s relevancy in search engines.

Internal links :
The number of links to and from pages within a site is called internal link popularity. When important related pages are cross-linked, search engine spiders can more quickly find and index pages that would otherwise be buried deep within the site.
Incoming links :
Links pointing to a website from other related sites create incoming link popularity. Incoming links are of two types: links from sites we control and links from sites we don’t control. To find the link popularity of a website, that is, which sites are linking to our website or a competitor’s website, go to the Google search box and enter “link:” followed by the domain name without “www”.

Outgoing links :
Outgoing links are links pointing from your site to other related sites. Search engine spiders crawl your site’s outgoing links and determine whether the content of the sites you link to is related to the content of your own site. How much importance outgoing links add to a site’s link popularity rating is still being debated by search engine optimization specialists.

Site maps :
A site map is a page of links leading to most or all pages of the website, organized hierarchically. Site maps are visual models of a site’s content that allow users to find a specific web page. If a website has many pages, it is recommended to have a site map; search engine spiders will crawl its links and index the entire website.
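As a rough sketch of how a sitemap can be generated automatically, the following Python snippet (the function name and URLs are hypothetical, not a standard tool) builds a minimal XML sitemap in the sitemaps.org format from a list of page URLs:

```python
from xml.sax.saxutils import escape

def make_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from page URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # One <url> entry per page; escape() guards special characters.
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)
```

The resulting text is typically saved as sitemap.xml in the site root, where search engine spiders can fetch it.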

Keywords Terminology

1.Keyword Density :
Keyword density is the ratio, or percentage, of keywords contained within the total number of indexable words on a web page. It is important for your main keywords to have the correct keyword density to rank well in search engines. The exact ratio varies from search engine to search engine, but the recommended keyword density is 2 to 8 percent. Keyword quality matters even more than keyword quantity: keywords in the page title, the headings, and the first paragraphs count more, because the URL (and especially the domain name), file names, directory names, page title, and section headings carry more weight than ordinary text on the page. You may have the same keyword density as a competitor’s site, but if you also have keywords in the URL, this will boost your ranking considerably, especially with Yahoo search.
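The calculation itself is straightforward. Here is a minimal sketch in Python (the function name is mine, not a standard tool) that measures keyword density as a percentage of the indexable words on a page:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of all indexable words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)
```

For instance, keyword_density("SEO tips for SEO beginners", "seo") gives 40.0, far above the recommended 2 to 8 percent, so such text would look stuffed to a search engine.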

2.Keyword Frequency:
Keyword frequency is the number of times a keyword or keyword phrase appears within a web page. The more times it appears, the more relevance search engines are likely to give the page for a search on those keywords.

3.Keyword Prominence :
Keyword prominence refers to how prominently keywords are placed within a web page. It is recommended to place important keywords at or near the start of the page, at the start of a sentence, and in the TITLE and META tags.

4.Keyword Proximity:
Keyword proximity refers to the closeness between two or more keywords; it is better to have keywords placed close together in a sentence.
Keyword proximity examples:
Example 1: How Keyword Density Affects Search Engine Rankings
Example 2: How Keyword Density Affects Rankings In Search Engine
In Example 1 the words of the phrase “search engine rankings” are adjacent, so the phrase has higher proximity than in Example 2.
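To make the notion concrete, here is a small Python sketch (a hypothetical helper, not a standard tool) that measures proximity as the number of words separating two keywords:

```python
def keyword_distance(text, kw1, kw2):
    """Smallest number of words between two keywords (0 means adjacent)."""
    words = text.lower().split()
    pos1 = [i for i, w in enumerate(words) if w == kw1]
    pos2 = [i for i, w in enumerate(words) if w == kw2]
    if not pos1 or not pos2:
        return None  # one of the keywords does not occur in the text
    return min(abs(i - j) for i in pos1 for j in pos2) - 1
```

On Example 1 the distance between “engine” and “rankings” is 0 (adjacent, high proximity); on Example 2 it is 2 (lower proximity).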

Levels of keywords

There are three levels of keywords:

1.Low level keywords :
If the keyword competition (the number of competing results in Google) is at most 4 digits, i.e. below 10,000, it is considered a low-level keyword. For a low-level keyword a website can be optimized within 3 months.

2.Medium level keywords :
If the keyword competition (in Google) is at most 7 digits, i.e. below 10,000,000, it is considered a medium-level keyword. For medium-level keywords a site can be optimized within 6 to 7 months.

3.High level keywords :
If the keyword competition is 8 digits or more, i.e. 10,000,000 and above, it is considered a high-level keyword. Optimizing for a high-level keyword can take a year.

Common Mistakes in SEO

1.Incorrectly Designed Websites :

Lack of proper navigation
Using frames to save web designers’ design time
Large images that take a long time to download. If large images are necessary, consider using thumbnails that open the full image on a separate page. (This also creates more pages and more text for the spiders to crawl.)
Using high-resolution graphics (try to use low-resolution graphics instead)

2.Poorly Written Content :
Content absolutely must have targeted keywords and phrases. Properly written content lets you target more keywords and appropriate phrases, while the absence of targeted keywords and phrases can break your site. If you have not used related keywords in your body text, your site will not appear in the listings when a user types keywords related to your site. Make sure the keywords placed in the meta keywords tag are consistent with your content. People visiting your site as a result of a search will leave as soon as they see that the home page is irrelevant or does not match the keyword or phrase they searched for. Use tools such as Wordtracker to find what people are actually typing into search engines to find goods and services similar to yours, and concentrate on ranking well for those terms.


3.Replica of Content :
Using more than one page with different names but the same content is considered a trick by search engines and will hurt your ranking. Never copy content from other websites.


4.Improper use of Meta Tags or site without Meta Tags :
Meta tags are used to include the keywords and description of a page. These meta tags help search engines search quickly, and they also help websites increase their rankings, so meta tags should be included on all pages of the website. Improper or missing meta tags can misguide search engines and lead to improper listing of websites, and the absence of a page title will harm the site’s ranking.
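As an illustration, here is a short Python sketch using the standard library’s html.parser module (the class name is mine) that extracts the title and named meta tags from a page, so that missing tags can be flagged:

```python
from html.parser import HTMLParser

class MetaTagChecker(HTMLParser):
    """Collects the <title> text and named <meta> tags of an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta = {}          # e.g. {"description": "...", "keywords": "..."}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
```

After calling checker.feed(html) on a page, testing whether checker.title and checker.meta.get("description") are empty flags pages with missing tags.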


5.Absence of Sitemap for Websites :
Sitemaps assist web crawlers in indexing websites more quickly and efficiently. A sitemap presents the structure of the entire website on one page, which is very useful for search engine optimization; site maps generally let search engines go directly to a page instead of hunting through links. Google Sitemaps is an easy way to tell Google about all the pages on your site, which pages are most important to you, and when those pages change, for a smarter crawl and fresher search results.


6.Page Cloaking :
In this method webmasters deceive search engines by substituting the real page content for the declared description. Spidering robots, normally recognized by their IP addresses or host names, are redirected to a page specially polished to meet the search engines’ requirements but unreadable to a human being. To detect cloakers, spiders often visit from fake IP addresses and under fictitious names. Users’ feedback is also collected to check the relevancy of the content against the description, and pages are reviewed by the search engine’s staff; if a difference is found, the site is penalized.

Seo Methodology

Search engine optimization is the practice of designing a website specifically so that it is friendly to the tools search engines use to analyze websites (called spiders).
Seo Methodology :
Onpage Optimization
Offpage Optimization
Position Monitoring

Onpage Optimization :
Pre Optimization Report
Keyword research
Competitors website analysis
Rewriting robot-friendly text
H1 H2 Tags Optimization
Title Tag Optimization
Meta Tag Optimization
Keywords Optimization
Alt Tag Optimization
Website structure Optimization
Body text and content Optimization
Sitemap for link Optimization

Offpage Optimization :
Hosting of Google sitemap
Website submission to all leading search engines having a global database
Submission to country-specific search engines having a country-related database
Submission to general directories
Submission to product-specific directories
Submission to country-specific directories
Trade lead posting

Position Monitoring :
Monitoring website ranking with different keywords
Renewal of expired trade leads and posting of new trade leads
Constant research of updated technology for better positioning
Research on current popular directories and site submission
Changing methodology with change in search engine algorithm

Architecture of search engines:

Spider – a browser-like program that downloads web pages.
Crawler – a program that automatically follows all of the links on each web page.
Indexer – a program that analyzes web pages downloaded by the spider and the crawler.
Database – storage for downloaded and processed pages.
Results engine – extracts search results from the database.
Web server – a server that is responsible for interaction between the user and the other search engine components.

How do search engines work

All search engines consist of three main parts:

The Spider (or worm)
The Index
The Search Algorithm.
The spider (or worm) continuously ‘crawls’ web space, following links that lead either to different websites or stay within the limits of one website. The spider ‘reads’ all page content and passes the data to the index.
The index is the next stage of the search engine, after crawling. The index is a storage area for spidered web pages, and it is of a huge magnitude: Google’s index, for example, is said to consist of more than three billion pages.
The search algorithm is the third and most sophisticated part of a search engine system. It is a very complicated mechanism that sorts an immense database within a few seconds and produces the results list. The more relevant the search engine considers a web page, the nearer the top of the list it appears; site owners and webmasters should therefore look to their site’s relevancy to its keywords. The algorithm is unique to each search engine, and it is a trade secret kept hidden from the public.
Most modern web search services combine the two systems, crawler-built indexes and human-edited directories, to produce their results.
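The index and search algorithm can be caricatured in a few lines of Python: a toy inverted index that maps each word to the pages containing it, and a ranking rule that counts matching query words. (The page names and the ranking rule are illustrative only; real search algorithms are far more sophisticated.)

```python
from collections import defaultdict

def build_index(pages):
    """pages: {url: text}. Returns a word -> set-of-urls inverted index."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

def search(index, query):
    """Rank urls by how many query words they contain (ties broken by url)."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for url in index.get(word, ()):
            scores[url] += 1
    return sorted(scores, key=lambda url: (-scores[url], url))
```

A page matching more of the query's words scores higher, which mirrors, in miniature, how relevance pushes a page nearer the top of the results list.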

History Of Search Engines

Running a company website does not by itself guarantee success. Following the tips given on this site will surely help your website stand high in directories and search engine results, and therefore increase traffic and the number of potential clients.

Before the HTTP protocol was invented, the internet was just a huge network of FTP servers, used mainly as a means of file exchange, and before websites existed the first search engines ran over FTP and similar protocols. Only after Tim Berners-Lee created the HTTP protocol did we get the World Wide Web, and the Internet acquired its present shape.


Search engines automatically ‘crawl’ web pages by following hyperlinks and store copies of them in an index, so that they can generate a list of resources matching a user’s request. Directories are compiled into categories by humans, who are either site owners or directory editors. The top search engines listed below cover about 90 percent of all online searches performed on the Internet. Those search engines are:
www.Google.com
www.Yahoo.com
MSN Search
AOL Search
www.AltaVista.com
www.Lycos.com
www.Netscape.com
www.HotBot.com
Ask Jeeves/Teoma
www.AllTheWeb.com
www.Wisenut.com
www.iWon.com