Saturday, December 29, 2012

What is Social Media Optimization?

Social Media Optimization

When you understand what social media are, it’s not a far step from there to social-media optimization. It’s about using social media to spread the news about your web site. And instead of tweaking the elements of your web site, when you’re practicing social-media optimization you’re participating in social networks and tweaking the content that you share within those networks.


It’s through that participation and optimization that your site is distributed, and that’s what brings traffic to your web site. There are also a couple of added benefits to social-media optimization. First, it’s free in terms of monetary investment. There is no cost to participate in these networks except your time. But make no mistake about it, you will have to invest a chunk of front-end time into this type of optimization. You must become familiar with the communities and participate to establish a name for yourself before anyone will take serious notice of you.

The other benefit of social-media optimization is that it’s really a viral marketing strategy. Nearly everyone is familiar with a virus. It’s that nasty little bug that multiplies exponentially and makes you believe death would be an improvement. Viral marketing is neat, not nasty, but it works in a somewhat similar way.

When you can tap viral marketing for your online presence, word of your brand spreads like a virus — it moves faster and faster as the buzz increases, and there doesn’t seem to be any way to stop it. And that’s good. If your marketing message has become viral, you can expect to see some serious growth in the number of visitors to your site and by extension to the number of goal conversions that you see.

Viral marketing is a good thing. And when you’ve optimized your social-media participation, you can almost guarantee that your marketing will soon have the viral quality that you’re looking for.

Source : Search Engine Optimization Bible by Wiley

Thursday, December 6, 2012

What is SEO Spam?

What Constitutes SEO Spam?

So, if SEO spam is so hard to define, how do you know whether what you’re doing is right or wrong?

Good question. And the answer is that you don’t always know, but there are some guidelines that you can follow that will help you stay out of the spam category. Basic, good web-design practices are your best defense. If you’re handling your web-site search marketing using the guidelines provided by the various search engines you’ll target, you should be in a position not to worry about being classified as a spammer.

Don’t do anything that makes you worry that you’re doing something wrong.
It sounds like simple advice, but when you think about it, if you’re doing something on your web site that you have to worry is going to get you banned from a search engine, you probably shouldn’t do it. This includes strategies like using hidden text on your web pages, using doorway pages or cloaking your pages, and creating false link structures. Even if you don’t know that these strategies are banned by search engines, when you consider the sneakiness of the strategy, you’ll be able to tell that it’s not likely a strategy that you should use.

Don’t make your web site appear to be something that it’s not.
It’s easy to “put a spin” on something to make it appear more attractive than it really is. People do it all the time where products and services are concerned. But using that same strategy on your web site may get you banned. Creating false link structures is one way you might make your site appear more popular than it really is. The problem with using that strategy is that it won’t take a crawler long to figure out that all of those sites are interconnected.

Don’t trust someone who says that a certain practice is acceptable, if you even suspect that it’s not.
Some unethical SEO people will tell you that it’s okay if you use certain obvious spam techniques as long as you use them correctly. Wrong. Spam is spam. It doesn’t matter how you use it, the search crawler will still see it as spam and you’ll still pay the consequences, while the unethical SEO consultant will take the money and leave you in the crawler’s bad graces.

Source : Search Engine Optimization Bible by Wiley

Thursday, November 22, 2012

SEO Tools for better Internet marketing

SEO Tools

There are certain SEO tools that help you track the number of visits, ROI (return on investment), bounce rate, new online customers, SERP (search engine results page) positions, and more. These help you monitor an SEO campaign for better optimization. Below is a list of SEO tools that help you find complete details about a website and its promotion.



Traffic Travis : Market Research, SEO and PPC Optimization Tool


Backlink Watch : Backlink Checker

Quantcast : Website Evaluation Tool

SEO For Firefox : Page Ranking Extension for Firefox

Goingup : Web Analytics and SEO Tool

SEO Monitor : SEO Campaign Monitoring Tool

Sitemapdoc : Google Sitemap Generator and Editor

Automapit : Sitemap Creation Service

GWebCrawler & Google Sitemap Creator : Open Source Web Indexing Engine

Free Online Sitemap Generator : Online Sitemap Creation

Sitemapxml : XML Sitemap Generator

Brokenlinkcheck : Broken link check 

Broken Link Checker : Broken Link Checking Tool

W3C Link Checker : Link Checker


Keyword Pad : Keyword List Generator

Primitive Word Counter : Keyword Density Calculator

Keyword Analyzer : Tool for keyword gathering and result estimation

Xedant Keyword Harvester : Keyword Harvest Tool

Google Keyword Tool : Free Keyword Tool

Keyword Density Analyzer : Keyword Density and Word Depth Calculator

Niche Watch : Niche Keywords Research

Google Suggest Keyword Suggestion : Keywords Suggestion Tool

Free Keyword Suggestion : Keywords vs Search Volume Estimator 

Keyword Suggestion and Keyword Popularity Tool : Keyword Suggestion along Popularity

Google Semantics : Firefox Add-on

Alexa Toolbar : Free Alexa Traffic Rank Toolbar

Search Status : Firefox Toolbar Extension 

Meta Tags : Firefox Sidebar Add-on for SEO

Web Developer : Firefox, Flock, Seamonkey Extension

Web Tools : Ranking and related Tools

Pagerank Lookup : Page Rank Checker

Rank Tracker : Search Engine Ranking Tool

Check Google Pagerank : Positioning Estimation Tool

Google Ranking : Ranking Tool for popular Search Engines

SEO Rank Tool : Ranking and Backlink Tool

Rank Checker : Ranking Checking Tool for Firefox

Gorank : Professional SEO Ranking Tool

Alexa Site Information : Website Monitoring Tools

Sitemeter : Real Time Reporting Tool

Traffic Estimator : Google Traffic Estimation Tool

Social Poster : Social Bookmarking Tool

Google Analytics : Website Traffic Analysis Tool

Social Maker : Social Bookmarking and Promotion Tool

Website Optimizer : Website Testing and Optimization Tool

Web Page Analyzer : Website speed Test

Copyscape : Website Duplicate Content Checker

Search Engine Spider Simulator : Spider Simulation Tool

Web Page Analyser : Reports on the good and bad points of a page in SEO terms

Builtwith : Website Technology Profiler

Xml Sitemaps : Sitemap Generator

Thursday, November 8, 2012

How to Handle Google Penguin Algorithm Change

Google Penguin Algorithm Change 

Google Penguin: Putting Link Abusers On Ice

What it is: First announced on April 24, 2012, the Penguin update was "huge," Meyers says. Unlike previous algorithm updates, he adds, Penguin was more punitive, as opposed to simply being designed to improve search quality.

Google named its new algorithm Penguin. Initially, it affected about 3.1 percent of English-language search queries, according to Search Engine Land. Penguin sought to decrease rankings for websites that engaged in dubious link exchanges, built unnatural links, relied too heavily on identical anchor text links, and so on. (Anchor text links are hyperlinks that contain a targeted keyword phrase.)
 
Google's goal: With Penguin, Google is cracking down on a common black hat SEO practice: abusing links to gain search engine rankings. If you paid for links from lots of dubious, low-quality link directories, link exchanges and other sites, you may have felt the Penguin slap. 

What you should do: Penguin has already been updated twice and is likely to be updated again soon, Meyers says. As a result, it's more important than ever to have link quality and diversity. Earn "natural" links from a variety of other quality sites because you've posted compelling, useful content.


Don't focus on getting links from other sites using identical anchor text. "Look at where your links are coming from using Google Webmaster tools and what the anchor text links are," says Ting-Yu Liu, Covario's manager of paid media services. "Try to have at least 60 percent keyword diversification. If you have 80 percent of external sites linking to you with the same anchor text, that's a problem."

How to Handle Google Panda Algorithm Change?

Google Panda: Putting Content Farms Out to Pasture

What it is: In February 2011, Google rolled out a major new algorithm. It was called "Farmer" because it was targeted at demoting high-volume content farms in Google search results. The update eventually became known as Panda, a reference to the name of Google engineer Navneet Panda. Since February 2011, Google Panda has been updated 20 times, Meyers says.

The initial Panda update reportedly affected the rankings of nearly 12 percent of all search results, according to the Search Engine Land blog.

Google's goal: Panda was designed to push down sites that are overly optimized, offer "thin" content and/or operate as content farms, explains Michael Martin, SEO manager at Covario, a global search marketing agency. (A content farm produces large amounts of content specifically to attract traffic from search engines and use those page views to generate easy advertising revenues.)

Meyers gives as an example a pest control service, operating nationwide, which may have created a specific Web page for every U.S. city in which it operates. The content on those pages is nearly identical except for the different geographic locations. With Panda, Google's search technology is better able to identify nearly duplicate content like that, recognize that those pages offer no real value to its users and push that content way down in search result rankings.

What you should do: Don't create content simply based on keyword optimization or post thousands of pages with nearly duplicate content. If you do, Google is likely to push down your entire site in its rankings, Meyers advises. Instead, make sure your site's content is as unique as possible and that it adds reader value. Ask yourself: "What does my content do for people who find it?" Does it help them, educate them or engage them in some way?

Sometimes, duplicate content is part of what a company legitimately offers. A large publishing company, for instance, may publish the same article on multiple sites it owns. In those cases, to avoid a Google penalty, publishers should properly identify the parent content and make sure others use rel=canonical to point back to the original content, Martin says. (You can learn more at the Google Webmaster Tools' rel=canonical tutorial.)
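
For reference, the canonical tag is a single line placed in the head section of each duplicate page, pointing back to the original. The URL here is purely illustrative:

<link rel="canonical" href="http://www.example.com/original-article.html" />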

Friday, November 2, 2012

Types of Pay Per Click Marketing Campaigns

Types of Pay Per Click Programs

Pay-per-click programs are not all created equal. When you think of PPC programs, you probably think of keyword marketing: bidding on a keyword to determine where your site will be placed in search results. And that’s an accurate description of PPC marketing programs as they apply to keywords. However, there are two other types of PPC programs as well. And you may find that targeting a different category of PPC marketing is more effective than simply targeting keyword PPC programs.

Keyword pay-per-click programs:

Keyword PPC programs are the most common type of PPC programs. They are also the type this book focuses on most often. As the name implies, keyword PPC programs are about bidding on keywords associated with your site. The amount that you’re willing to bid determines the placement of your site in search engine results.

In keyword PPC, the keywords used can be any word or phrase that might apply to your site. However, remember that some of the most common keywords have the highest competition for the top spot, so it’s not always advisable to assume that the broadest term is the best one. If you’re in a specialized type of business, a broader term might be more effective, but as a rule of thumb, the more narrowly focused your keywords are, the better results you are likely to have with them.

Search PPC marketing programs such as those offered by Google, Yahoo! Search Marketing, and MSN are some of the most well-known PPC programs.

Product pay-per-click programs

You can think of product pay-per-click programs as online comparison shopping engines or price comparison engines. A product PPC program focuses specifically on products, so you bid on placement for your product advertisements.

The requirements for using a product PPC program are a little different from keyword PPC programs, however. With a product PPC program, you must provide a feed (think of it as a regularly updated price list for your products) to the search engine. Then, when users search for a product, your links are given prominence, depending on the amount you have bid for placement. However, users can freely display those product listings returned by the search engine in order of price, from lowest to highest, if that is their preference. This means that your product may get good placement initially, but if it’s not the lowest-priced product in its category, there is no guarantee that your placement results will stay in front of potential visitors.
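
To make the idea of a feed concrete, here is a purely illustrative snippet. Real feed formats vary by engine, and each engine publishes its own required fields, so treat this only as a sketch of the concept:

title, link, price
Blue Widget, http://www.example.com/widgets/blue, 9.99
Red Widget, http://www.example.com/widgets/red, 12.49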

Some of these product PPC programs include Shopping.com, NexTag, Pricegrabber.com, and Shopzilla.com.

Service pay-per-click programs

When users search for a service of any type, such as travel reservations, they are likely to use search engines related specifically to that type of service. For example, a user searching for the best price for hotel reservations in Orlando, Florida, might go to TripAdvisor.com. Advertisers, in this case hotel chains, can choose to pay for their rank in the search results using a service PPC program.

Service PPC programs are similar to product PPC programs with the only difference being the type of product or service that is offered. Product PPC programs are more focused on e-commerce products, whereas service PPC programs are focused on businesses that have a specific service to offer.

Service PPC programs also require an RSS feed, and even some of the same attribute listings as product PPC programs. Some of the service PPC programs you might be familiar with are SideStep.com and TripAdvisor.com.

Source : Search Engine Optimization Bible by Wiley

Wednesday, October 31, 2012

How Pay Per Click Works

What is Pay Per Click?

Pay-per-click marketing is an advertising method that allows you to buy search engine placement by bidding on keywords or phrases. There are two different types of PPC marketing.

In the first, you pay a fee for an actual SERP ranking, and in some cases you also pay a per-click fee, meaning that the more you pay, the higher in the returned results your page will rank.

The second type is more along true advertising lines. This type of PPC marketing involves bidding on keywords or phrases that appear in, or are associated with, text advertisements. Google is probably the most notable provider of this service. Google’s AdWords service is an excellent example of how PPC advertisements work.

PPC advertisements are those advertisements that you see at the top and on the sides of search pages.

Putting pay-per-click to work:

Now you can begin to look at the different keywords on which you might bid. Before you do, however, you need to look at a few more things. One of the top mistakes made with PPC programs is that users don’t take the time to clarify what it is they hope to gain from using a PPC service. It’s not enough for your PPC program just to have a goal of increasing your ROI (return on investment). You need something more quantifiable than just the desire to increase profit. How much would you like to increase your profit? How many visitors will it take to reach the desired increase?

Let’s say that right now each visit to your site is worth $.50, using our simplified example, and your average monthly profit is $5,000. That means that your site receives 10,000 visits per month. Now you need to decide how much you’d like to increase your profit. For this example, let’s say that you want to increase it to $7,500. To do that, if each visitor is worth $.50, you would need to increase the number of visits to your site to 15,000 per month. So, the goal for your PPC program should be “To increase profit $2,500 by driving an additional 5,000 visits per month.” This gives you a concrete, quantifiable measurement by which you can track your PPC campaigns.
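
Because the arithmetic above is simple, it is easy to script. Here is a minimal JavaScript sketch using the made-up numbers from the example:

function visitsNeeded(targetProfit, valuePerVisit) {
  return targetProfit / valuePerVisit;
}

var currentVisits = visitsNeeded(5000, 0.50); // 10,000 visits per month today
var targetVisits = visitsNeeded(7500, 0.50);  // 15,000 visits per month needed
console.log("Additional visits needed: " + (targetVisits - currentVisits)); // 5,000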

Once you know what you want to spend, and what your goals are, you can begin to look at the different types of PPC programs that might work for you. Although keywords are the main element associated with PPC marketing, there are other types of PPC programs to consider as well.

Source : Search Engine Optimization Bible By Wiley

Thursday, October 11, 2012

How to do Competitive Analysis in SEO

Competitive Analysis:

Competitive analysis is a step you should take in the very beginning of your SEO efforts. It should be right at the top of your to-do list, along with keyword analysis and tagging your web site. In fact, you should probably do a competitive analysis even before you begin tagging your site.

But did you know that your competitive analysis doesn’t end there? Like analyzing your web statistics, conversions, and other elements of your web site, your competitive analysis should be ongoing. Your competitors will change. They’ll learn how to rank better in the search engines. They may even change their customer approach just enough to always stay ahead of you. They’ll keep you guessing, and the only way to figure out what they’re doing that you’re not is to spend the time it takes to analyze what they’re doing.

As you’re going through this analysis process, the first thing to keep in mind is that you’re not checking out only your direct competitors. You need to look at those competitors who are ahead of you in search rankings, even if their offerings are different from yours.

Plan to spend a few hours a week on this analysis. You should look at all the sites that are ahead of you, but especially those sites that rank in the top five to ten positions in the SERPs. Look for the same indications that you examined during your original competitive analysis. These include:

Site rankings: Where in the SERPs is the site ranked? Make note, especially, of the top three to five sites.

Page saturation: How many of the competition’s pages are indexed? Not every page on a site will be indexed, but if your competition has more or fewer pages ranked, there may be a factor you haven’t taken into consideration about how to include or exclude your site pages.

Page titles: Are page titles consistent? And what keywords do they contain, if any at all? How your competition uses titles can give you an indication of what you’re doing right or wrong with your own.

Meta data: What meta data is your competition including? How is it worded? And how does it differ from your own? Remember that you can access the source code of a web site by selecting Source from the View menu of your web browser.

Site design: How is the competition’s web site designed? Site architecture and the technology that is used to design and present the site are factors in how your site ranks. Learn what the competition is doing and how that differs from what you’re doing.

A robots.txt file: The robots.txt file is accessible to you, and looking at it could give you some valuable insight into how your competition values and works with search engines (see the sample file after this list).

Content quality and quantity: How much content is included on your competitor’s site, and is it all original, or is it reused from some other source? If a site is ahead of you in search rankings, its content is probably performing better than yours. Analyze it and find out why.

Link quality and quantity: Your competitors’ linking strategies could hold a clue about why they rank well. Look at the link structure. If they’re using legitimate linking strategies, what are they? If they’re not, don’t try to follow suit. Their actions will catch up with them soon enough.
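
For reference, a robots.txt file lives at the root of a site (www.example.com/robots.txt). This sample is purely illustrative, but a competitor whose file carefully disallows low-value sections and advertises a sitemap is clearly paying attention to search engines:

User-agent: *
Disallow: /admin/
Disallow: /checkout/
Sitemap: http://www.example.com/sitemap.xml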

Source : Search Engine Optimization Bible by Wiley

Thursday, September 13, 2012

SEO and Programming Languages

SEO and Programming Languages

One aspect of web-site design you might not think of when planning your SEO strategy is the programming language used in developing the site. Programming languages all behave a little differently. For example, HTML uses one set of protocols to accomplish the visuals you see when you open a web page, whereas PHP uses a completely different set of protocols. And when most people think of web-site programming, they think in terms of HTML.

But the truth is that many other languages also are used for coding web pages. And those languages may require differing SEO strategies.

JavaScript :

JavaScript is a programming language that allows web designers to create dynamic content. However, it’s also not necessarily SEO-friendly. In fact, JavaScript often completely halts a crawler from indexing a web site, and when that happens the result is lower search engine rankings or complete exclusion from ranking.

To overcome this, many web designers externalize any JavaScript that’s included on the web site. Externalizing the JavaScript creates a situation where it is actually run from an external location, such as a file on your web server. To externalize your JavaScript:

1. Copy the code between (but not including) the opening and closing script tags, and paste it into a Notepad file.
2. Save the Notepad file as filename.js.
3. Upload the file to your web server.
4. Create a reference on your web page to the external JavaScript code. The reference should be placed where the JavaScript will appear and might look like this:

<script language="JavaScript" type="text/javascript" src="filename.js"></script>

This is just one of the solutions you can use to prevent JavaScript from becoming a problem for your SEO efforts. There are many others, and depending on your needs you should explore some of those.

Flash :

Flash is another of those technologies that some users absolutely hate. That’s because Flash, though very cool, is resource intensive. It causes pages to load slower, and users often get stuck on an opening Flash page and can’t move forward until the Flash has finished executing. If the user is in a hurry, it’s a frustrating thing to deal with.

Flash is also a nightmare when it comes to SEO. A Flash page can stop a web crawler in its tracks, and once stopped, the crawler won’t resume indexing the site. Instead, it will simply move on to the next web site on its list.

The easiest way to overcome Flash problems is simply not to use it. But despite the difficulties with search rankings, some organizations need to use Flash. If yours is one of them, the Flash can be embedded in HTML and an option can be added to test for the user’s ability to see Flash before the Flash is executed. However, there’s some debate over whether or not this is an “acceptable” SEO practice, so before you implement this type of strategy in an effort to improve your SEO effectiveness, take the time to research the method.

Dynamic ASP :

Most of the sites you’ll encounter on the Web are static web pages. These sites don’t change beyond the regular updates by a webmaster. On the other hand, dynamic web pages are web pages that are created on the fly according to preferences that users specify in a form or menu. The sites can be created using a variety of different programming technologies, including dynamic ASP. The problem with these sites is that they don’t technically exist until the user creates them. Because a web crawler can’t make the selections that “build” these pages, most dynamic web pages aren’t indexed in search engines. There are ways around this, however. Dynamic URLs can be converted to static URLs with the right coding. It’s also possible to use paid inclusion services to index dynamic pages down to a predefined number of levels (or number of selections, if you’re considering the site from the user’s point of view).
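
As a sketch of what “the right coding” can look like: on an Apache web server with the mod_rewrite module enabled (an assumption; other servers use different tools), rules like these in an .htaccess file map a static-looking URL onto the real dynamic page. The URL pattern and page name are hypothetical:

RewriteEngine On
# Serve the static-looking /products/42 from the dynamic page product.asp?id=42
RewriteRule ^products/([0-9]+)$ /product.asp?id=$1 [L]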

Dynamic ASP, like many of the other languages used to create web sites, carries with it a unique set of characteristics. But that doesn’t mean SEO is impossible for those pages. It does mean that the approach used for the SEO of static pages needs to be modified. It’s an easy enough task, and a quick search of the Internet will almost always provide the programming code you need to achieve SEO.
 
PHP :

Search engine crawlers being what they are (preprogrammed applications), there’s a limit to what they can index. PHP is another of those programming languages that falls outside the boundaries of normal web-site coding. Search engine crawlers see PHP as another obstacle if it’s not properly executed.

Properly executed means that PHP needs to be used with search engines in mind. For example, PHP naturally stops or slows search engine crawlers. But with some attention and a solid understanding of PHP and SEO, it’s possible to code pages that work, even in PHP.

One thing that works well with PHP is designing the code to look like HTML. It requires an experienced code jockey, but it can be done. And once the code has been disguised, the PHP site can be crawled and indexed so that it’s returned in search results.

Source : Search Engine Optimization Bible by Wiley

Thursday, August 23, 2012

What is Search Algorithm?

Search Algorithm

In very general terms, a search algorithm is a problem-solving procedure that takes a problem, evaluates a number of possible answers, and then returns the solution to that problem. A search algorithm for a search engine takes the problem (the word or phrase being searched for), sifts through a database that contains cataloged keywords and the URLs those words are related to, and then returns pages that contain the word or phrase that was searched for, either in the body of the page or in a URL that points to the page.

There are several classifications of search algorithms, and each search engine uses algorithms that are slightly different. That’s why a search for one word or phrase will yield different results from different search engines. Some of the most common types of search algorithms include the following:

  • List Search: 
A list search algorithm searches through specified data looking for a single key. The data is searched in a very linear, list-style method (see the sketch after this list). The result of a list search is usually a single element, which means that searching through billions of web sites could be very time-consuming, but would yield a smaller search result. 
  • Tree Search: 
Envision a tree in your mind. Now, examine that tree either from the roots out or from the leaves in. This is how a tree search algorithm works. The algorithm searches a data set from the broadest to the most narrow, or from the most narrow to the broadest. Data sets are like trees; a single piece of data can branch to many other pieces of data, and this is very much how the Web is set up. Tree searches, then, are more useful when conducting searches on the Web, although they are not the only searches that can be successful.
  • SQL Search:
One of the difficulties with a tree search is that it’s conducted in a hierarchical manner, meaning it’s conducted from one point to another, according to the ranking of the data being searched. A SQL (pronounced See-Quel) search allows data to be searched in a non-hierarchical manner, which means that data can be searched from any subset of data.

  • Informed Search: 
An informed search algorithm looks for a specific answer to a specific problem in a tree-like data set. The informed search, despite its name, is not always the best choice for web searches because of the general nature of the answers being sought. Instead, informed search is better used for specific queries in specific data sets.
  • Adversarial Search: 
An adversarial search algorithm looks for all possible solutions to a problem, much like finding all the possible solutions in a game. This algorithm is difficult to use with web searches, because the number of possible solutions to a word or phrase search is nearly infinite on the Web.
  • Constraint Satisfaction Search:  
When you think of searching the Web for a word or phrase, the constraint satisfaction search algorithm is most likely to satisfy your desire to find something. In this type of search algorithm, the solution is discovered by meeting a set of constraints, and the data set can be searched in a variety of different ways that do not have to be linear. Constraint satisfaction searches can be very useful for searching the Web.
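
To make the simplest of these concrete, here is a minimal JavaScript sketch of a list search. It scans the data one element at a time, looking for a single key; the data is made up:

function listSearch(items, key) {
  // Walk the list linearly until the key is found.
  for (var i = 0; i < items.length; i++) {
    if (items[i] === key) return i; // found: return its position
  }
  return -1; // the key is not in the list
}

console.log(listSearch(["seo", "ppc", "smm"], "ppc")); // prints 1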

Classification of Search Engines

Types of Search Engines

There are lots of search engines on the Internet, but users give preference to only some of them, because all search engines are not created equal and the algorithm of each search engine differs from the others. Based on website traffic generation and search relevance, search engines are classified into three types. They are:
  • Primary Search Engine
  • Secondary Search Engine
  • Targeted Search Engine

Primary Search Engine:

A primary search engine is the type you think of most often when search engines come to mind. Some index most or all sites on the Web. For example, Yahoo!, Google, and MSN are primary (also called major) search engines.

Primary search engines will generate the majority of the traffic to your web site, and as such will be the primary focus of your SEO efforts. Each primary search engine differs slightly from the others. Most primary search engines are also more than just search. Additional features such as e-mail, mapping, news, and different types of entertainment applications are also available from most of the primary search engine companies. These elements were added long after the search was established, as a way to draw more and more people to the search engine.

Secondary Search Engine: 

Secondary search engines are targeted at smaller, more specific audiences, although the search engine’s content itself is still general. They don’t generate as much traffic as the primary search engines, but they’re useful for regional and more narrowly focused searches. Secondary search engines, just like the primary ones, will vary in the way they rank search results. Some will rely more heavily upon keywords, whereas others will rely on reciprocal links. Still others might rely on criteria such as meta tags or some proprietary criteria.

Secondary search engines should be included in any SEO plan. Though these search engines might not generate as much traffic as the primary search engines, they will still generate valuable traffic that should not be overlooked. Many users of secondary search engines are users because they have some loyalty to that specific search engine.

Targeted Search Engine:

Targeted search engines - sometimes called topical search engines - are the most specific of them all. These search engines are very narrowly focused, usually to a general topic, like medicine or branches of science, travel, sports, or some other topic.

When considering targeted search engines for SEO purposes, keep in mind that many of these search engines are much more narrowly focused than primary or secondary search engines.

Source : Search Engine Optimization Bible - Wiley

Monday, August 13, 2012

Basics of Search Engine

Search Engine and its Basics

A web search engine, or simply a search engine, is designed to search for information on the World Wide Web. In the modern world, a huge share of the population uses the Internet daily to find information and things online. In its early days, the Internet was actually a collection of FTP (File Transfer Protocol) sites that users could access to download or upload files.

Unlike web directories, which are maintained manually by humans, search engines maintain real-time information by running algorithmic web crawlers. These agents, which save web pages into a database, are called crawlers, spiders, or robots. Their work is to crawl each web page and store its information in the database by means of indexing. So when the user types something into the search box, the relevant results shown are fetched from that database.
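
As a toy sketch of that indexing idea, an index maps each word to the pages that contain it, so lookups at search time are fast. The URLs and page text here are made up:

var index = {};

function addPage(url, text) {
  // Split the page text into words and record the page under each word.
  text.toLowerCase().split(/\W+/).forEach(function (word) {
    if (!word) return;
    if (!index[word]) index[word] = [];
    if (index[word].indexOf(url) === -1) index[word].push(url);
  });
}

function search(word) {
  // Return every page that contains the word.
  return index[word.toLowerCase()] || [];
}

addPage("http://www.example.com/a", "Search engines index the Web");
addPage("http://www.example.com/b", "Crawlers index web pages");
console.log(search("index")); // lists both pages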

The first real search engine, in the form that we know search engines today, didn’t come into being until 1993. It was developed by Matthew Gray, and it was called Wandex. Wandex was the first program to both index and search the index of pages on the Web. This technology was the first program to crawl the Web, and later became the basis for all search crawlers. And from there, search engines took on a life of their own. From 1993 to 1998, the major search engines that you’re probably familiar with today were created:

Excite - 1993
Yahoo! - 1994
WebCrawler - 1994
Lycos - 1994
Infoseek - 1995
AltaVista - 1995
Inktomi - 1996
Ask Jeeves - 1997
Google - 1997
MSN Search - 1998

Source : Search Engine Optimization Bible by Wiley

Friday, August 10, 2012

On Page and Off Page Optimization Technique

On-page and Off -Page Optimization

There are certain activities that must be done to get a website performing well in the search engines. SEO techniques are broadly divided into two categories: on-page optimization and off-page optimization.

On-page optimization concentrates on work within the pages of the site itself, which is reflected immediately on the website, whereas off-page optimization is completely promotion based: the website is promoted on various other sites so that it gets listed in the search engines.

Below is a list of on-page and off-page optimization techniques.

On-Page Optimization Techniques:
  1. URL Optimization
  2. Content Optimization
  3. Footer links optimization
  4. Determining the Structure of website
  5. Adding Meta tags (Meta Title, Meta Keywords, and Meta Description) - see the sample tags after this list
  6. Keywords research and analysis
  7. Competitor analysis
  8. Anchor text optimization
  9. Internal and External Link Structuring 
  10. Sitemap Generation
  11. Robots.txt creation
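
As a reference for item 5 above, the basic meta tags sit in a page’s head section. This sample (with made-up content) shows the usual form:

<head>
  <title>Acme Widgets | Buy Durable Widgets Online</title>
  <meta name="keywords" content="widgets, buy widgets, durable widgets" />
  <meta name="description" content="Acme Widgets sells durable, affordable widgets with free shipping." />
</head>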

Off-Page Optimization Techniques:

  1. Directory Submission
  2. Link building
  3. Article Creation and Submission
  4. Forum posting
  5. Social Bookmarking
  6. Profile link creation
  7. RSS Feed Generation
  8. Classified Submission
  9. Guest Posting
  10. Press Release
  11. Blog creation, posting and commenting 
  12. Review submission
  13. Document Sharing
  14. CSS Submission

Above is the list of on-page and off-page SEO techniques that help the website gain traffic.

Thursday, August 9, 2012

What is Social Media Marketing?

Social Media Marketing

Social Media Marketing is also a part of Internet marketing; it helps to increase branding and gain attention and traffic for the website. Social media encourages readers to share their thoughts through social networks.

Corporate, private, and public companies use social media networks to share news about their companies and to build general awareness. Through social media networks, a message can easily be delivered to a large number of users. Social media also allows individuals to interact with one another and build relationships. When companies or products join social media sites, people can easily interact with each other about the product or the company.

There are lots of social media marketing sites that help to promote a business online, gain customers online, and increase the branding of a website. Some of the most important social media marketing sites are:

  1. Facebook
  2. Linkedin
  3. Youtube
  4. Flickr
  5. Twitter
  6. Google+
  7. Delicious
  8. Digg
  9. Stumbleupon
  10. Squidoo, and many more

Wednesday, August 8, 2012

What is SEO?

Search Engine Optimization?

SEO (Search Engine Optimization) is the process of increasing the visibility of a website in the search engines through organic search results. The greater the visibility of the website, the more visitors it gets, which leads to an increase in online customers. SEO may target different kinds of search, including image search, local search, video search, academic search, and news search.

Search Engine Optimization is a part of Internet marketing that helps to market the website online and get exposure in the search engines. The SEO effectiveness of a website is determined by the position of the website in the SERP (Search Engine Results Page) when searching for certain keywords, and also by web analytics.

Various Techniques of SEO

There are certain legal and illegal techniques that Internet marketers follow to get a website or web page listed in the search engines. They are:
  • Black Hat Technique - An illegal approach that handles a website without following any search engine rules, which leads to the website getting banned
  • White Hat Technique - An honest approach that optimizes a website by following the search engine rules
  • Grey Hat Technique - A combination of the White Hat and Black Hat techniques, where the results don't last long