Thursday, January 9, 2014

What are Robots, Spiders, and Crawlers?

Know about Robots, Spiders, and Crawlers

You should already have a general understanding that a robot, spider, or crawler is a piece of software that is programmed to “crawl” from one web page to another based on the links on those pages. As this crawler makes its way around the Internet, it collects content (such as text and links) from web sites and saves it in a database that is indexed and ranked according to the search engine's algorithm.

When a crawler is first released on the Web, it’s usually seeded with a few web sites and it begins on one of those sites. The first thing it does on that first site is to take note of the links on the page. Then it “reads” the text and begins to follow the links that it collected previously. This network of links is called the crawl frontier; it’s the territory that the crawler is exploring in a very systematic way.

The links in a crawl frontier will sometimes take the crawler to other pages on the same web site, and sometimes they will take it away from the site completely. The crawler will follow the links until it hits a dead end and then backtrack and begin the process again until every link on a page has been followed.

As to what actually happens when a crawler begins reviewing a site, it’s a little more complicated than simply saying that it “reads” the site. The crawler sends a request to the web server where the web site resides, requesting pages to be delivered to it in the same manner that your web browser requests pages that you review. The difference between what your browser sees and what the crawler sees is that the crawler is viewing the pages in a completely text interface. No graphics or other types of media files are displayed. It’s all text, and it’s encoded in HTML. So to you it might look like gibberish.
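The crawl cycle described above — start from seed pages, note the links, then work through the crawl frontier systematically — can be sketched in a few lines of Python. This is a minimal in-memory illustration, not a production crawler: the URLs and HTML below are invented, and a real crawler would fetch each page over HTTP and respect the site's robots.txt.

```python
from collections import deque
from html.parser import HTMLParser

# A tiny invented "web": page URL -> the HTML the server would return.
PAGES = {
    "http://a.example/": '<html><body><a href="http://b.example/">B</a></body></html>',
    "http://b.example/": '<html><body><a href="http://a.example/">A</a>'
                         '<a href="http://c.example/">C</a></body></html>',
    "http://c.example/": '<html><body>No links here.</body></html>',
}

class LinkCollector(HTMLParser):
    """Records the href of every <a> tag it encounters."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed):
    """Breadth-first crawl: the deque plays the role of the crawl frontier."""
    frontier = deque([seed])
    visited = set()
    while frontier:
        url = frontier.popleft()
        if url in visited or url not in PAGES:
            continue
        visited.add(url)              # "read" the page...
        parser = LinkCollector()
        parser.feed(PAGES[url])       # ...and note the links on it
        for link in parser.links:
            if link not in visited:
                frontier.append(link)  # extend the crawl frontier
    return visited

print(sorted(crawl("http://a.example/")))
```

Starting from page A, the frontier grows to include B and then C, and the crawl ends once every discovered link has been followed — the "dead end and backtrack" behavior described above.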

The crawler can request as many or as few pages as it's programmed to request at any given time. This can sometimes cause problems with web sites that aren't prepared to serve up dozens of pages of content at a time. The requests can overload the site and cause it to crash, or they can slow the server's responses so badly that the crawler gives up and goes away without finishing.

If the crawler does go away, it will eventually return to try the task again. And it might try several times before it gives up entirely. But if the site doesn't eventually begin to cooperate with the crawler, the site is penalized for the failures and its search engine ranking will fall.

Source : Search Engine Optimization Bible - Wiley

Friday, December 20, 2013

Content Optimization and its types

Website Content Optimization

When your content stinks, your site's ranking is headed in the wrong direction. If you have content on your site that's not professional, the search engine crawler will register this, your ranking may drop lower and lower, and you could possibly be delisted altogether.

How do you know if your content stinks or not? It’s mostly a game of finding the right combination of content types and consistent updates. But before you can even get to determining the right type of content, you need to create a content strategy.

Your content strategy is the plan by which you’ll infuse your site with the right types of content at the right times. It starts with determining how to reach your target audience. By now your target audience should be engraved on your forehead, but how you reach that audience is something entirely different. If your audience is teens, the language and method with which you’ll reach them will be different than if your audience is senior adults or stay-at-home moms, or even full-time professionals.

So what words and phrases will your target audience use to find your web site? Those are some of the keywords that you’ll be using in your content. Additional keywords may be discovered using some of the methods that have been covered in previous chapters.

Next, determine what users will benefit from visiting your site. Visitors click through a link looking for something. If you don’t provide some benefit, the users will click away nearly as fast as they found you. When determining what value you have to offer, don’t think in terms of your desire to bring users to your site, think in terms of what those users are seeking. What do they want?

There are several different types of content, and each type has its own implications:

Licensed Content: Licensed content is what you might buy from a content broker. For example, if you’re looking for a quick way to populate your site with articles, you might turn to a company like FreeSticky.com that offers many different articles you can use. The problem with this type of content is that it’s often repeated all over the Internet. You’re not the only one who will need to populate your site, and others will likely use some of the same services you do. Being used often doesn’t make the content less valuable, but it’s not going to rank as well with search engines because of the duplication.

Original Content: There are several types of original content. There’s the content you write and share with others for free. This is a good way to get links back to your site. You can use the content for a limited amount of time exclusively on your own site and then allow others to use it for nothing more than a link back to your site. This incoming link adds credibility to your site.

Another type of original content is that which is distributed freely by visitors to your site. This original content can take the form of comments on your site or forum message boards. This type of original content is an excellent addition to your SEO efforts, because it tends to be focused on a specific subject.

Some original content is exclusive to your site. This is content that you create, and the only place it appears is your web site. This is the most valuable type of content for your site, and it’s the type search engine crawlers like the best. Think of it as giving the crawler some variety in its diet. It gets tired of the same thing day in and day out, just like we do. The more original and exclusive content you can provide for a crawler, the better you’ll rank in search results. It also doesn’t hurt if that content has an appropriate number of keywords in it.

Dynamic Content: Dynamic content can be licensed or original. Blogs are the perfect example of dynamic content.

Source : SEO Bible, Wiley

Saturday, December 29, 2012

What is Social Media Optimization?

Social Media Optimization

When you understand what social media are, it’s not a far step from there to social-media optimization. It’s about using social media to spread the news about your web site. And instead of tweaking the elements of your web site, when you’re practicing social-media optimization you’re participating in social networks and tweaking the content that you share within those networks.


It’s through that participation and optimization that your site is distributed, and that’s what brings traffic to your web site. There are also a couple of added benefits to social-media optimization. First, it’s free in terms of monetary investment. There is no cost to participate in these networks except your time. But make no mistake about it, you will have to invest a chunk of front-end time into this type of optimization. You must become familiar with the communities and participate to establish a name for yourself before anyone will take serious notice of you.

The other benefit of social-media optimization is that it’s really a viral marketing strategy. Nearly everyone is familiar with a virus. It’s that nasty little bug that multiplies exponentially and makes you believe death would be an improvement. Viral marketing is neat, not nasty, but it works in a somewhat similar way.

When you can tap viral marketing for your online presence, word of your brand spreads like a virus — it moves faster and faster as the buzz increases, and there doesn’t seem to be any way to stop it. And that’s good. If your marketing message has become viral, you can expect to see some serious growth in the number of visitors to your site and by extension to the number of goal conversions that you see.

Viral marketing is a good thing. And when you've optimized your social-media participation, you can almost guarantee that your marketing will soon have that viral quality that you're looking for.

Source : Search Engine Optimization Bible by Wiley

Thursday, December 6, 2012

What is SEO Spam?

What Constitutes SEO Spam?

So, if SEO spam is so hard to define, how do you know whether what you’re doing is right or wrong?

Good question. And the answer is that you don’t always know, but there are some guidelines that you can follow that will help you stay out of the spam category. Basic, good web-design practices are your best defense. If you’re handling your web-site search marketing using the guidelines provided by the various search engines you’ll target, you should be in a position not to worry about being classified as a spammer.

Don't do anything that makes you worry that you're doing something wrong.
It sounds like simple advice, but when you think about it, if you’re doing something on your web site that you have to worry is going to get you banned from a search engine, you probably shouldn’t do it. This includes strategies like using hidden text on your web pages, using doorway pages or cloaking your pages, and creating false link structures. Even if you don’t know that these strategies are banned by search engines, when you consider the sneakiness of the strategy, you’ll be able to tell that it’s not likely a strategy that you should use.

Don’t make your web site appear to be something that it’s not.
It’s easy to “put a spin” on something to make it appear more attractive than it really is. People do it all the time where products and services are concerned. But using that same strategy on your web site may get you banned. Creating false link structures is one way you might make your site appear more popular than it really is. The problem with using that strategy is that it won’t take a crawler long to figure out that all of those sites are interconnected.

Don’t trust someone who says that a certain practice is acceptable, if you even suspect that it’s not.
Some unethical SEO people will tell you that it’s okay if you use certain obvious spam techniques as long as you use them correctly. Wrong. Spam is spam. It doesn’t matter how you use it, the search crawler will still see it as spam and you’ll still pay the consequences, while the unethical SEO consultant will take the money and leave you in the crawler’s bad graces.

Source : Search Engine Optimization Bible by Wiley

Thursday, November 22, 2012

SEO Tools for better Internet marketing

SEO Tools

There are certain SEO tools that help you track the number of visits, ROI (return on investment), bounce rate, new online customers, SERP (search engine results page) positions, and so on. These metrics help you monitor an SEO campaign for better optimization. Below is a list of SEO tools that help you find complete details about a website and its promotion.



Traffic Travis : Market Research SEO and PPC optimization tool


Backlink Watch : Backlink Checker

Quantcast : Website Evaluation Tool

SEO For Firefox : Page Ranking Extension for Firefox

Goingup : Web Analytics and SEO Tool

SEO Monitor : SEO Monitoring Tool

Sitemapdoc : Google Sitemap Generator and Editor

Automapit : Sitemap Creation Service

GWebCrawler & Google Sitemap Creator : Source Code Web Indexing Engine

Free Online Sitemap Generator : Online Sitemap Creation

Sitemapxml : XML Sitemap Generator

Brokenlinkcheck : Broken Link Checker

Broken Link Checker : Broken Link Checking Tool

W3C Link Checker : Link Checker


Keyword Pad : Keyword List Generator

Primitive Word Counter : Keyword Density Calculator

Keyword Analyzer : Tool for keyword gathering and result estimation

Xedant Keyword Harvester : Keyword Harvest Tool

Google Keyword Tool : Free Keyword Tool

Keyword Density Analyzer : Keyword Density and Word Depth Calculator

Niche Watch : Niche Keywords Research

Google Suggest Keyword Suggestion : Keywords Suggestion Tool

Free Keyword Suggestion : Keywords vs Search Volume Estimator 

Keyword Suggestion and Keyword Popularity Tool : Keyword Suggestions with Popularity Data

Google Semantics : Firefox Add-on

Alexa Toolbar : Free Alexa Traffic Tool

Search Status : Firefox Toolbar Extension 

Meta Tags : Firefox Sidebar Add-on for SEO

Web Developer : Firefox, Flock, Seamonkey Extension

Web Tools : Ranking and related Tools

Pagerank Lookup : Page Rank Checker

Rank Tracker : Search Engine Ranking Tool

Check Google Pagerank : Positioning Estimation Tool

Google Ranking : Ranking Tool for popular Search Engines

SEO Rank Tool : Ranking and Backlink Tool

Rank Checker : Ranking Checking Tool for Firefox

Gorank : Professional SEO Ranking Tool

Alexa Site Information : Website Monitoring Tools

Sitemeter : Real Time Reporting Tool

Traffic Estimator : Google Traffic Estimation Tool

Social Poster : Social Bookmarking Tool

Google Analytics : Website Traffic Analysis Tool

Social Maker : Social Bookmarking and Promotion Tool

Website Optimizer : Website Testing and Optimization Tool

Web Page Analyzer : Website speed Test

Copyscape : Website Duplicate Content Checker

Search Engine Spider Simulator : Spider Simulation Tool

Web Page Analyser : Reports on the good and bad points of a page in SEO terms

Builtwith : Website Technology Profiler

Xml Sitemaps : Sitemap Generator
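Several of the tools listed above (Primitive Word Counter, Keyword Density Analyzer) compute keyword density — the share of a page's words taken up by a given keyword. The calculation itself is simple enough to sketch; the sample sentence below is invented for illustration:

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` divided by the total word count, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

sample = "SEO tools help you track SEO results and plan SEO campaigns."
print(round(keyword_density(sample, "SEO"), 1))  # 3 of the 11 words
```

Dedicated tools add refinements on top of this (multi-word phrases, stop-word handling, per-section weighting), but the core ratio is the same.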

Thursday, November 8, 2012

How to Handle Google Penguin Algorithm Change

Google Penguin Algorithm Change 

Google Penguin: Putting Link Abusers On Ice

What it is: First announced on April 24, 2012, the Penguin update was "huge," Meyers says. Unlike previous algorithm updates, he adds, Penguin was more punitive, as opposed to simply being designed to improve search quality.

Google named its new algorithm Penguin. Initially, it affected about 3.1 percent of English-language search queries, according to Search Engine Land. Penguin sought to decrease rankings for websites that engaged in dubious link exchanges, built unnatural links, relied on too many identical anchor-text links, and so on. (Anchor text links are hyperlinks that contain a targeted keyword phrase.)
Google's goal: With Penguin, Google is cracking down on a common black hat SEO practice: abusing links to gain search engine rankings. If you paid for links from lots of dubious, low-quality link directories, link exchanges and other sites, you may have felt the Penguin slap. 

What you should do: Penguin has already been updated twice and is likely to be updated again soon, Meyers says. As a result, it's more important than ever to have link quality and diversity. Earn "natural" links from a variety of other quality sites because you've posted compelling, useful content.


Don't focus on getting links from other sites using identical anchor text. "Look at where your links are coming from using Google Webmaster tools and what the anchor text links are," says Ting-Yu Liu, Covario's manager of paid media services. "Try to have at least 60 percent keyword diversification. If you have 80 percent of external sites linking to you with the same anchor text, that's a problem."
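Liu's 60-percent rule of thumb is easy to check once you have exported your inbound-link data from Google Webmaster Tools. A minimal sketch — the anchor texts below are invented, and "diversification" here simply means the share of links that do not use the single most common anchor:

```python
from collections import Counter

def anchor_diversification(anchors):
    """Percentage of inbound links whose anchor text is NOT the most common one."""
    counts = Counter(a.lower() for a in anchors)
    most_common_count = counts.most_common(1)[0][1]
    return 100.0 * (len(anchors) - most_common_count) / len(anchors)

# 8 of 10 links share the same anchor -> only 20% diversification: a red flag
# by the 60-percent guideline quoted above.
anchors = ["cheap widgets"] * 8 + ["Acme home page", "widget reviews"]
score = anchor_diversification(anchors)
print(f"{score:.0f}% diversification")
if score < 60:
    print("Warning: anchor text is too concentrated")
```

With 80 percent of links using identical anchor text, this profile matches exactly the problem case Liu describes.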

How to Handle Google Panda Algorithm Change?

Google Panda: Putting Content Farms Out to Pasture

What it is: In February 2011, Google rolled out a major new algorithm. It was called "Farmer" because it was targeted at demoting high-volume content farms in Google search results. The update eventually became known as Panda, a reference to the name of Google engineer Navneet Panda. Since February 2011, Google Panda has been updated 20 times, Meyers says.

The initial Panda update reportedly affected the rankings of nearly 12 percent of all search results, according to the Search Engine Land blog.

Google's goal: Panda was designed to push down sites that are overly optimized, offer "thin" content and/or operate as content farms, explains Michael Martin, SEO manager at Covario, a global search marketing agency. (A content farm produces large amounts of content specifically to attract traffic from search engines and use those page views to generate easy advertising revenues.)

Meyers gives as an example a pest control service, operating nationwide, which may have created a specific Web page for every U.S. city in which it operates. The content on those pages is nearly identical except for the different geographic locations. With Panda, Google's search technology is better able to identify nearly duplicate content like that, recognize that those pages offer no real value to its users and push that content way down in search result rankings.
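One generic way to detect near-duplicate pages like those in Meyers' pest-control example is to compare overlapping word sequences ("shingles") between pages and compute their Jaccard similarity. This is a standard technique from the duplicate-detection literature, not Google's actual algorithm, and the page texts below are invented:

```python
def shingles(text, k=3):
    """Set of overlapping k-word sequences drawn from the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity: |intersection| / |union| of the two shingle sets."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

page_boston = "Acme Pest Control offers fast friendly pest removal in Boston"
page_denver = "Acme Pest Control offers fast friendly pest removal in Denver"
page_other = "Our bakery sells fresh bread and pastries every morning"

print(round(jaccard(page_boston, page_denver), 2))  # high: near-duplicates
print(round(jaccard(page_boston, page_other), 2))   # zero: unrelated pages
```

Two city pages that differ only in the place name share almost all of their shingles and score near 1.0, which is precisely the "nearly identical except for the geographic locations" pattern Panda is built to catch.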

What you should do: Don't create content simply based on keyword optimization or post thousands of pages with nearly duplicate content. If you do, Google is likely to push down your entire site in its rankings, Meyers advises. Instead, make sure your site's content is as unique as possible and that it adds reader value. Ask yourself: "What does my content do for people who find it?" Does it help them, educate them or engage them in some way?

Sometimes, duplicate content is part of what a company legitimately offers. A large publishing company, for instance, may publish the same article on multiple sites it owns. In those cases, to avoid a Google penalty, publishers should properly identify the parent content and make sure others use rel=canonical to point back to the original content, Martin says. (You can learn more at the Google Webmaster Tools' rel=canonical tutorial.)
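The rel=canonical hint Martin mentions is a single link element placed in the head of each duplicate page, pointing at the original. The URL below is a placeholder:

```html
<!-- In the <head> of each syndicated copy of the article: -->
<link rel="canonical" href="https://www.example.com/original-article" />
```

This tells search engines which copy is the parent content, so ranking signals consolidate on the original instead of the duplicates competing with each other.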