Brief about Robots Exclusion Standard and Guidelines to Create Robots.txt File

Robots.txt File

Recently, one of my friends emailed me a query saying that he didn’t understand the concept of the robots exclusion standard (also known as the robots exclusion protocol) and that he wanted information about how to create a robots.txt file. This is just one example, but there are many SEO professionals who really don’t know about the robots exclusion standard or the robots.txt file.

So, considering all the queries about the robots exclusion standard and the robots.txt file, in this article Seogdk sorts out the information about the robots exclusion protocol and gives guidelines for creating a robots.txt file.

First of all, don’t be confused by the terms robots exclusion standard, robots exclusion protocol, and robots.txt file, because all three refer to the same thing. Basically, these are the guidelines that keep crawlers in line. The robots.txt file is simply a file used to tell robots and crawlers what not to crawl on your website. It is the actual component you will work with: a text-based document that should be placed in the root of your domain, and it essentially contains directions for any crawler that comes to your website about what it is and is not allowed to index.

Every search engine has its own crawler with a specific name, and if you want to see a crawler’s name, just check your web server logs; you will probably find it there. Below is a list of different search engines and their crawler names:

- Google – Googlebot
- Bing – Bingbot
- Yahoo Search – Yahoo! Slurp
- MSN – Msnbot
- Baidu – Baiduspider
- Yandex – YandexBot
- Alexa – ia_archiver
- Ask – Teoma
- Searchsight – SearchSight
- AltaVista – Scooter
- Guruji – GurujiBot
- Goo – Ichiro
- LookSmart – FurlBot
- FyberSearch – FyberSpider
- SiteSell – SBIder

Guidelines to Create Robots.txt File

1. To communicate with a crawler, you need a particular syntax that it can understand. Below is the basic form of that syntax:

User-agent: *
Disallow: /

Both of the above lines are mandatory when you create a robots.txt file.

2. The first line, User-agent:, tells a crawler which user agent you are commanding. The asterisk (*) denotes that all crawlers are covered, but you can also specify a single crawler or even multiple crawlers, as shown in the example below.
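
For example, using one of the crawler names from the list above, the following minimal sketch commands Googlebot alone; since it is the only group in the file, every other crawler remains unrestricted:

User-agent: Googlebot
Disallow: /

This tells Googlebot, and only Googlebot, to ignore all directories on the website.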

3. The second line, Disallow:, tells the crawler what it is not allowed to access. The slash (/) denotes “all directories,” so the previous code example is essentially saying that “all crawlers are to ignore all directories.”

4. When you create a robots.txt file, always remember to include a colon (:) after the User-agent indicator and after the Disallow indicator. The colon tells the crawler that the value it should pay attention to follows.


5. If you want all crawlers to ignore a specific directory, simply mention that directory name, as below:

User-agent: *
Disallow: /private/

You can also take this one step further and tell all crawlers to ignore multiple directories and files, as below:

User-agent: *
Disallow: /private/
Disallow: /public/
Disallow: /program/links.html

This text tells all crawlers to ignore the /private/ and /public/ directories as well as the links.html file inside the /program/ directory.

6. One thing to always keep in mind about crawlers is that they read the robots.txt file from top to bottom, and as soon as they find a guideline that applies to them, they stop reading and begin crawling your website. So be careful about the order of your instructions when you are commanding multiple crawlers with your robots.txt file.

7. The following format is the wrong way to write a robots.txt file:

User-agent: *
Disallow: /private/

User-agent: CrawlerName
Disallow: /private/
Disallow: /program/links.html

First, this text tells all crawlers to ignore the /private/ directory, so every crawler reading the file will automatically skip it. But you have also told one particular crawler, denoted here by CrawlerName, to disallow both the /private/ directory and the links.html file in the /program/ directory. The problem is that the specified crawler will never get that second message, because it has already found the rule that applies to all crawlers, stopped reading, and begun crawling.

8. When you want to command multiple crawlers, you need to begin by naming the specific crawlers you want to control. Only after they have been named should you leave your instructions for all crawlers. Written correctly, the previous code should look like this:

User-agent: CrawlerName
Disallow: /private/
Disallow: /program/links.html

User-agent: *
Disallow: /private/

9. You can view the robots.txt file of any website that has one by adding /robots.txt to the base URL of the website. For example, yourwebsitename.com/robots.txt will display the text file that guides robots for that website.

10. Don’t assume that a blank robots.txt file will keep crawlers away. An empty file tells crawlers that there are no restrictions, so your whole website can be crawled. If you actually want to keep your website out of search engine results, use the Disallow: / form shown in the basic syntax above instead.
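
To make the difference explicit, here is a small sketch: an empty Disallow value disallows nothing, which is how crawlers treat a blank or missing file as well, whereas the Disallow: / form from point 1 blocks everything.

User-agent: *
Disallow: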

Conclusion

From the above information, you can easily create a robots.txt file, and if you have certain pages or links that you want crawlers to ignore, you can achieve this without causing them to ignore the whole website. Additionally, you can find a complete list of crawlers along with the text of the robots exclusion standard document on the Web Robots Pages. So friends, convey your feedback about this article through your comments and emails; till then, enjoy your life.....!!!



Valuable SEM Tips to Reduce Pay Per Click Costs of Your PPC Campaigns

PPC Campaigns

PPC advertising is a very important part of Search Engine Marketing (SEM) as well as online marketing. Every PPC advertiser wants more exposure for their PPC campaigns or product ads at the lowest possible bidding cost. So in this article, Seogdk brings valuable guidelines about PPC advertising to help you reduce the costs of your PPC campaigns.

During the early years of PPC advertising, you could pretty much set a bid for a keyword or phrase and then assume that you didn’t need to monitor it too closely, because at that time PPC was still a relatively uncrowded online advertising method. Nowadays, the competition for PPC keywords and phrases is very high, which means you are likely to spend far more money to achieve the same rank than you needed to in the past.

If you are not careful about your PPC campaign’s budget, you could spend a lot of money without getting better results. So first, build PPC management strategies that reduce the cost of your campaigns while maintaining or even improving your conversion and click-through rates (CTR). Follow these SEM tips to reduce the costs of your PPC campaigns:

Monitoring your PPC Ad Campaigns

When you consider the management of your PPC campaign in the context of reducing its budget, you have many options for cutting costs without decreasing the effectiveness of the campaign. Your first step should be to replace any low-performing ads. Monitoring your ads should be a key part of your PPC campaigns so you can easily identify and replace the ads that are not performing well.

Reduce the Amount of Bid per Keyword

Another option for reducing your PPC costs is to lower your bid per keyword. As mentioned earlier, it is not necessary to compete for the top advertising position, and reducing your keyword bid by even a small amount per click can make a huge difference in the cost of the campaign.

Select Less Demandable but More Targeted Keywords or Phrases

We know that high-demand keywords are very expensive, and with many high-demand keywords the resulting web traffic may not be as well targeted as it could be. Less targeted traffic means fewer conversions and a lower Return on Investment (ROI). So it is far better to choose keywords or phrases that are not in as much demand but are more targeted. You pay less for these keywords and get more qualified traffic from them, which translates into more conversions and a better ROI.

Use ‘Match Types’ Condition

The match type setting helps you control your PPC budget. It is the condition under which you prefer that your PPC ad is shown. For example, if you have chosen the keyword or phrase “SEO Expert” for your PPC advertisement, you can specify that the ad is shown only under certain conditions. There are three main match types that you can use; a quick comparison follows their descriptions below:

1. Broad Match

Broad match occurs when your ad is shown to the broadest possible segment of searchers. It means that any time someone searches for “SEO Expert,” your ad may be shown, no matter whether those two words appear together or whether other words are included in the search term.

2. Phrase Match

Phrase match occurs when your ad is shown for searches that contain the keyword or phrase you have selected in the correct word order, even when other words are included. For instance, instead of showing your ad only for the exact search “SEO Expert,” your ad might also be shown for searches like “hire SEO Expert” or “SEO Expert services.”

3. Exact Match

Exact match takes place when your ad is shown only for searches that contain exactly the words you have chosen. In other words, if you select an exact match for “SEO Expert,” the only time the ad will be shown is when a user searches for “SEO Expert.”
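
As a rough, hypothetical illustration (the sample queries below are invented for this example), here is how the three match types compare for the keyword “SEO Expert”:

Broad match – the ad may show for related searches such as “seo specialist” or “expert seo services”
Phrase match – the ad may show for searches containing the phrase, such as “hire seo expert” or “seo expert services”
Exact match – the ad shows only for the search “seo expert”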

Modify Your PPC Strategy

One of the best ways to reduce your PPC budget is to modify your PPC strategy. Monitor and review your PPC spending versus the number of conversions you achieve, then use this information to make changes to your keywords and ad copy and increase the effectiveness of your PPC campaigns.

Dayparting

This is a very effective method of reducing the cost of your PPC campaign. The term comes from broadcasting, where the day is divided into several parts and each part carries the type of radio or television programming appropriate for that time. In PPC, the strategy requires that you monitor your campaigns to learn which days and which hours of the day they perform best, and then schedule your ads to run only during those times.

In addition, it can help you to reduce the instances of click fraud that you may face. Because your ad is shown only at specific times, it is less available for your competition or other malicious souls to use to drive up your PPC costs.

Geo-Targeting

Geo-targeting is another effective method of reducing the costs of your PPC campaign. It targets your PPC ad at specific traffic based on the geographical location of your business.

Conclusion

From the above information, we conclude that your PPC budget must be monitored and maintained, and that you should keep finding ways to reduce the costs associated with your PPC campaigns. So friends, convey your feedback about this article via comments and emails; till then, enjoy your life…..!!!

