What is the function of robots.txt?
#1
May I know the function of robots.txt? Any ideas are welcome.
#2
Robots.txt is a text file placed at the root of your website. It lists which pages search engine crawlers are allowed and disallowed to crawl, helping you control crawling activity on your site.
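For illustration, a minimal robots.txt might look like the sketch below; the paths and sitemap URL are hypothetical, and `example.com` stands in for your own domain:

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Sitemap: https://www.example.com/sitemap.xml
```

Here `User-agent: *` applies the rules to all crawlers, `Disallow` blocks a path, and the more specific `Allow` carves out an exception within it.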
#3
A robots.txt is a small file that tells a search engine which pages it may crawl and which pages to ignore.
#4
The main function of the robots.txt file is to control which links search engines may crawl. If we disallow a particular link in robots.txt, it won't be crawled by the search engine.
#5
The robots.txt file is very powerful when you're working on a website's SEO. You put it on your site to tell search engine robots which pages you would like them to visit and which pages you want to hide from search engines.
#6
Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit.
#7
With robots.txt you can control how search engines crawl your pages.
#8
It restricts search engine crawlers from accessing certain parts of your website.
#9
A robots.txt file tells search engine crawlers which files or pages the crawler can or cannot request from your website. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, you should use a noindex directive or password-protect the page.
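As a concrete illustration of how crawlers interpret these rules, Python's standard library ships a robots.txt parser. The rules and URLs below are made up for the example; a real crawler would fetch the file from `https://yoursite.com/robots.txt` instead:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for this example.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The "*" user agent is blocked from /private/ but may fetch anything else.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

Well-behaved crawlers run exactly this kind of check before requesting a page, which is why robots.txt limits crawling but cannot, by itself, keep a page out of the index.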

