Thousands of new blogs are created every day, and it is difficult for search engines to discover these new websites on their own. A robots.txt file tells search engine crawlers which parts of your site they may scan, which helps them find and index your new blog posts. It is the most direct way to communicate with search engine robots. A well-configured robots.txt file helps your posts get indexed, while a badly configured one can drag your blog down: a single mistake in the file can make search engines stop crawling or indexing your blog entirely. So before creating a robots.txt file, study it properly.
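As an illustration, a minimal robots.txt for a typical blog might look like this. The disallowed paths and the sitemap URL below are placeholders, not rules every site needs; adjust them to your own blog's structure:

```
# Apply these rules to all crawlers
User-agent: *

# Keep crawlers out of admin and internal search pages
Disallow: /wp-admin/
Disallow: /search

# Everything else may be crawled
Allow: /

# Point crawlers at the sitemap (replace with your own URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file must be placed at the root of your site (e.g. `https://www.example.com/robots.txt`). Note how easy it is to make the kind of mistake warned about above: a single stray line like `Disallow: /` would block crawlers from your entire blog.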