A Brief Guide to the Robots Exclusion Standard and How to Create a Robots.txt File

Recently, one of my friends emailed me to say that he didn't understand the concept of the robots exclusion standard (also called the robots exclusion protocol) and wanted information about how to create a robots.txt file. This is just one example; many SEO professionals really don't know about the robots exclusion standard or the robots.txt file.

So, considering all the queries about the robots exclusion standard and the robots.txt file, in this article Seogdk sorts out the information about the robots exclusion protocol and gives guidelines for creating a robots.txt file.

First of all, don't be confused by the terms robots exclusion standard, robots exclusion protocol, and robots.txt file, because all three refer to the same thing. Basically, these are guidelines to keep crawlers in line. The robots.txt file is simply the file used to tell robots and crawlers what not to crawl on your website, and it is the actual component you will work with. It is a plain text document that should be placed in the root of your domain, and it essentially contains directions for any crawler that comes to your website about what it is and is not allowed to index.

Every search engine has its own crawler with a specific name, and if you want to see those names, just check your web server log; you will probably find them there. Below is a list of search engines and their crawler names:

-Google – Googlebot
-Bing – Bingbot
-Yahoo Search – Yahoo!Slurp
-MSN - Msnbot
-Baidu – Baiduspider
-Yandex – Yandexbot
-Alexa – ia_archiver
-Ask – Teoma
-Searchsight – SearchSight
-AltaVista – Scooter
-Guruji – GurujiBot
-Goo – ichiro
-LookSmart – FurlBot
-FyberSearch – FyberSpider
-SiteSell – SBIder

Guidelines to Create Robots.txt File

1. To communicate with a crawler you need a particular syntax that it can understand. The basic form of the syntax looks like this:

User-agent: *
Disallow: /

Both of these lines are mandatory when you create a robots.txt file; the values after them change depending on what you want to allow.

2. The first line, User-agent:, tells the crawler which user agent you are addressing. The asterisk (*) means that all crawlers are covered, but you can specify a single crawler or even multiple crawlers.

3. The second line, Disallow:, tells the crawler what it is not allowed to access. The slash (/) denotes "all directories," so the previous example is essentially saying that "all crawlers are to ignore all directories."

4. When you create a robots.txt file, always remember to include the colon (:) after the User-agent indicator and after the Disallow indicator. The colon signals that important information follows, to which the crawler should pay attention.
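If you want to double-check what a robots.txt file actually does, Python's standard-library urllib.robotparser can evaluate it for you. Below is a minimal sketch (the crawler names and paths are only illustrative) confirming that the two-line file shown above blocks everything for every crawler:

```python
from urllib import robotparser

# The basic "block everything" file from the example above
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# With "Disallow: /" under "User-agent: *", no crawler may fetch any path
print(rp.can_fetch("Googlebot", "/index.html"))   # False
print(rp.can_fetch("AnyOtherBot", "/about/"))     # False
```

can_fetch returns True or False for a given user agent and path, which makes it a handy way to test a file before uploading it to your server.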

5. If you want all crawlers to ignore a specific directory, simply mention that directory name, as below:

User-agent: *
Disallow: /private/

You can take this one step further and tell all crawlers to ignore multiple directories or files, as below:

User-agent: *
Disallow: /private/
Disallow: /public/
Disallow: /program/links.html

This text tells crawlers to ignore the private directory, the public directory, and the links.html page inside the program directory.
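You can verify that only the listed paths are blocked, again using Python's standard-library urllib.robotparser (the file names here, like notes.html and post.html, are just made up for the check):

```python
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /public/
Disallow: /program/links.html
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The listed paths are blocked for every crawler...
print(rp.can_fetch("Googlebot", "/private/notes.html"))  # False
print(rp.can_fetch("Bingbot", "/program/links.html"))    # False
# ...but everything else on the site is still crawlable
print(rp.can_fetch("Googlebot", "/blog/post.html"))      # True
```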

6. One thing to always keep in mind about crawlers: they read the robots.txt file from top to bottom, and as soon as they find a set of rules that applies to them, they stop reading and begin crawling your website. So be careful about the order of your entries when you are commanding multiple crawlers with your robots.txt file.

7. The following format is the wrong way to write a robots.txt file:

User-agent: *
Disallow: /private/

User-agent: CrawlerName
Disallow: /private/
Disallow: /program/links.html

First, this text tells all crawlers to ignore the private directory, so every crawler reading the file will automatically skip the private files. But you have also told a particular crawler, denoted by CrawlerName, to stay out of both the private directory and the links.html page in the program directory. The problem is that the specified crawler may never get that message, because it has already matched the rule telling all crawlers to ignore the private directory and stopped reading.

8. When you want to command multiple crawlers, begin by naming the crawlers you want to control; only after they have been named should you leave your instructions for all crawlers. Written correctly, the previous code should look like this:

User-agent: CrawlerName
Disallow: /private/
Disallow: /program/links.html

User-agent: *
Disallow: /private/
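To confirm that each crawler gets the rules you intend, you can test the corrected file with Python's standard-library urllib.robotparser (CrawlerName is the placeholder name from the example, and SomeOtherBot is an arbitrary stand-in for any other crawler):

```python
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: CrawlerName
Disallow: /private/
Disallow: /program/links.html

User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The named crawler gets its own, stricter rules...
print(rp.can_fetch("CrawlerName", "/program/links.html"))   # False
# ...while every other crawler only has to skip /private/
print(rp.can_fetch("SomeOtherBot", "/program/links.html"))  # True
print(rp.can_fetch("SomeOtherBot", "/private/page.html"))   # False
```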

9. You can view the robots.txt file of any website that has one by adding /robots.txt to the base URL of the website; the resulting page displays the text file guiding robots for that site.

10. Be careful with a blank robots.txt file: crawlers treat an empty file (or a missing one) as permission to crawl the entire website, not as a request to stay away. If you want to keep your website out of search engine results, you must say so explicitly with User-agent: * and Disallow: /.
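Parsers such as Python's standard-library urllib.robotparser treat an empty robots.txt as imposing no restrictions at all, which a quick check confirms (the path is arbitrary):

```python
from urllib import robotparser

# An empty robots.txt contains no rules, so nothing is blocked
rp = robotparser.RobotFileParser()
rp.parse("".splitlines())

print(rp.can_fetch("Googlebot", "/anything.html"))  # True
```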


From the above information you can easily create a robots.txt file, and if there are certain pages or links you want crawlers to ignore, you can achieve this without making them ignore the whole website. Additionally, you can find a complete list of robots along with the text of the robots exclusion standard document on the Web Robots Pages. So friends, convey your feedback about this article through your comments and emails; till then, enjoy your life.....!!!

Author Bio
Gangadhar Kulkarni
Gangadhar Kulkarni is an internet marketing professional with extensive experience in digital marketing. He is also the founder of Seogdk and Director at DigiTechMantra Solutions, a one-stop shop for all that your website needs, providing cost-effective and efficient content writing and digital marketing services. For more information, catch him on Facebook | Twitter | LinkedIn | G+ | Pinterest

Valuable SEM Tips to Reduce Pay Per Click Costs of Your PPC Campaigns


PPC advertising is a very important part of Search Engine Marketing (SEM) as well as online marketing in general. Every PPC advertiser wants more exposure for their PPC campaigns and product ads at the smallest possible bid cost. So in this article, Seogdk brings valuable guidelines about PPC advertising to help reduce the costs of your PPC campaigns.

During the early years of PPC advertising, you could pretty much set a bid for a keyword or phrase and then assume you didn't need to monitor it too closely, because at that time PPC was one of the most effective online advertising methods on the internet. Nowadays, the competition for PPC keywords and phrases is so high that you are likely to spend far more money to achieve the same rank than you needed to in the past.

If you are not careful about your PPC campaign's budget, you could spend a lot of money without getting better results. So first build PPC management strategies that reduce the cost of your PPC campaigns while maintaining, or even improving, your conversion and click-through rates (CTR). Follow the guidelines and SEM tips below to reduce the PPC costs of your campaigns:

Monitoring your PPC Ad Campaigns

When you consider managing your PPC campaign in the context of a reduced budget, you have many options for cutting costs without decreasing the campaign's effectiveness. Your first step should be to replace any low-performing ads. Monitoring your ads should be a key part of your PPC campaigns so you can easily identify and replace the ads that are not performing well.

Reduce Amount of Bid per Keyword

Another option for reducing your PPC costs is to lower the amount of your bid per keyword. As mentioned earlier, it is not necessary to compete for the top advertising space, and reducing your keyword bid by even a small amount per click can make a huge difference in the cost of the campaign.

Select Less Demandable but More Targeted Keywords or Phrases

High-demand keywords are very expensive, and with many of them the resulting web traffic may not be as well targeted as it could be, which means fewer conversions and a lower Return on Investment (ROI). It is far better to choose keywords or phrases that are not in as much demand but are more targeted: you pay less for these keywords and get more qualified traffic from them, which translates into more conversions and a better ROI.

Use ‘Match Types’ Condition

The match type helps you control your PPC budget: it is the condition under which you prefer that your PPC ad is shown. For example, if you have chosen the keyword phrase "SEO Expert" for your PPC advertisement, you can specify that the ad is shown only under certain conditions. There are three main match types you can use:

1. Broad Match

Broad match occurs when your ad is shown to the broadest possible segment of searchers: any time someone searches for "SEO Expert," your ad may be shown, no matter whether those two words appear together or other words are included with the term.

2. Phrase Match

Phrase match occurs when your ad is shown for searches that contain your keyword phrase in the correct word order, possibly together with other words. For instance, instead of showing your ad only for the search "SEO Expert," it might also be shown for searches like "hire SEO Expert" or "SEO Expert services."

3. Exact Match

Exact match takes place when your ad is shown only for searches that contain exactly the words you have chosen. In other words, if you select an exact match for "SEO Expert," the only time the ad will be shown is when a user searches for "SEO Expert."
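The three match types can be illustrated with a toy Python function. This is only a simplified sketch (real ad platforms also consider plurals, misspellings, synonyms, and other close variants), and the function name matches is made up for this example:

```python
def matches(query, keyword, match_type):
    """Toy model of PPC keyword match types (simplified illustration)."""
    q_words = query.lower().split()
    k_words = keyword.lower().split()
    if match_type == "exact":
        # the search must be exactly the keyword phrase
        return q_words == k_words
    if match_type == "phrase":
        # the keyword words must appear contiguously and in order
        n = len(k_words)
        return any(q_words[i:i + n] == k_words
                   for i in range(len(q_words) - n + 1))
    if match_type == "broad":
        # every keyword word must appear somewhere in the search
        return all(w in q_words for w in k_words)
    raise ValueError("unknown match type: " + match_type)

print(matches("hire SEO expert online", "SEO Expert", "phrase"))  # True
print(matches("expert in SEO", "SEO Expert", "phrase"))           # False
print(matches("expert in SEO", "SEO Expert", "broad"))            # True
print(matches("SEO expert", "SEO Expert", "exact"))               # True
```

Under this model, broad match triggers on the loosest queries and exact match on the fewest, which is why broad match usually consumes the most budget.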

Modify Your PPC Strategy

One of the best ways to reduce your PPC budget is to modify your PPC strategy. Monitor and review the amount of your PPC spending versus the number of conversions you achieve, then use this information to make changes to your keywords and ad copy and increase the effectiveness of your PPC campaigns.


Use Dayparting

Dayparting is a very effective method of reducing the cost of your PPC campaign. The term comes from broadcasting, where it refers to the practice of dividing the day into several parts, during each of which a different type of radio or television programming appropriate to that time is aired. Applied to PPC, the strategy requires that you monitor your campaigns to learn which days and which hours of the day they perform best, and then schedule your ads accordingly.

In addition, this strategy can help reduce the click fraud you may face: because your ad is shown only at specific times, there is less opportunity for competitors or other malicious souls to use it to drive up your PPC costs.


Use Geo-Targeting

Geo-targeting is another effective method of reducing the PPC costs of your campaign. It targets your PPC ad at specific traffic based on the geographical location of your business, so you do not pay for clicks from regions you cannot serve.


From the above information we conclude that your budget must be monitored and maintained, and that you should keep finding ways to reduce the costs associated with your PPC campaigns. So friends, convey your feedback about this article via comments and emails; till then, enjoy your life…..!!!
