  1. #1

    What is disallow in robots.txt file?

    Hlo Friends,
    What is disallow in robots.txt file?

  2. #2
    Disallow is used to block bots from accessing and indexing certain files and/or folders on your website.
    Disallow: /folder/ - This will block bots from crawling anything inside this folder and indexing it. This is useful when you have certain pages or files on your website that you don't want to show up in the SERPs, like dummy pages or pages you are still working on.
    Similarly, Allow: tells bots that they are allowed to crawl and index them.
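    To illustrate, here is a minimal robots.txt sketch (the folder and file names are hypothetical examples, not from any real site):

    ```
    # Applies to all crawlers
    User-agent: *
    # Block crawling of everything under /drafts/
    Disallow: /drafts/
    # Allow one specific page inside the blocked folder
    Allow: /drafts/public-preview.html
    ```

    The file lives at the root of the site (e.g. example.com/robots.txt). Note that Disallow only discourages crawling; a page blocked this way can still appear in results if other sites link to it, so use a noindex meta tag if you need to keep it out of the index entirely.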

  3. #3
    Disallow in the robots.txt file means that the webpages listed under it are not crawled by search engines. It is used to prevent search engine spiders from crawling certain webpages.

  4. #4
    The Disallow command tells robots not to crawl the data on a webpage, so it won't show up in the SERPs.

