
You should use a robots.txt file when you want to block specific parts of your website from search engine crawlers (such as Googlebot or Bingbot), or to guide them on which pages and files to crawl and which to skip. The file is placed in the website's root directory, and its primary purpose is to give instructions to web crawlers.
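For instance, a file served from the root directory at https://example.com/robots.txt (example.com and the /tmp-reports/ folder are placeholders for illustration) could contain:

```txt
# Apply to all crawlers; keep them out of one folder only
User-agent: *
Disallow: /tmp-reports/
```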

Uses of the robots.txt File

  1. To Block Sensitive Pages:
    • If your website has confidential pages (e.g., the admin panel or payment gateway URLs) that you don't want to appear in public search results.
    • Example: /admin/, /login/
  2. To Avoid Duplicate Content:
    • If the site has duplicate pages or URLs that confuse search engines, it is better to block them.
  3. For Testing or Staging Websites:
    • If you have a development or staging website that is not yet live, use robots.txt to block crawlers from it.
  4. To Block Certain File Types:
    • You can block specific file types (e.g., .pdf, .jpg, .css, .js) if you don't want search engines to crawl them.

Example: Block all .pdf files:
User-agent: *
Disallow: /*.pdf$
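The same pattern extends to other file types. Note that the * and $ wildcards are an extension supported by major crawlers such as Googlebot and Bingbot; they are not part of the original robots.txt convention, so behavior can vary for other crawlers.

```txt
# Googlebot only: skip PDF and JPG files
# (* and $ wildcards are a Google/Bing extension, not universal)
User-agent: Googlebot
Disallow: /*.pdf$
Disallow: /*.jpg$
```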

  5. To Optimize Crawl Budget:
    • Search engine crawlers have a limited crawl budget. Through robots.txt you can make sure they crawl only the important pages.
  6. To Block Sections Temporarily:
    • If you are working on a temporary section or folder and don't want search engines to access it yet.
  7. Further Reading:
    • For more detail and depth, read Google's robots.txt documentation and Search Engine Land. Digital Future Academy also teaches SEO and Digital Marketing through offline and online classes in Bhiwani, Hisar, Chandigarh, Rohtak, Karnal, Fatehabad, Hansi, Ambala, and Charkhi Dadri, as well as across Haryana, taught by Mr Ashish Verma.
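For a staging website (point 3 above), the simplest approach is to block everything for every crawler. Keep in mind that robots.txt is only advisory, so password protection is the safer option for a truly private staging environment.

```txt
# Block all compliant crawlers from the entire staging site
User-agent: *
Disallow: /
```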

Robots.txt Example:

User-agent: *
Disallow: /admin/
Disallow: /private-data/
Disallow: /test-page/
Allow: /public-folder/
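You can check rules like the example above programmatically with Python's standard-library urllib.robotparser. This is a sketch: example.com is a placeholder domain, and note that this parser uses simple first-match logic rather than Google's longest-match precedence.

```python
from urllib import robotparser

# The same rules as the example above, fed to the parser as lines
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private-data/
Disallow: /test-page/
Allow: /public-folder/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A disallowed path and an allowed one
print(rp.can_fetch("*", "https://example.com/admin/users"))      # False
print(rp.can_fetch("*", "https://example.com/public-folder/a"))  # True
```

In production you would point the parser at the live file with rp.set_url(...) followed by rp.read() instead of feeding it lines directly.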

Important Notes:

If your website has complex URL structures, or you want to build an SEO-optimized architecture, correct implementation of robots.txt is very important. Also remember that robots.txt controls crawling, not indexing: a disallowed URL can still appear in search results if other sites link to it, so use a noindex meta tag or authentication for pages that must stay out of the index.

When to Use robots.txt (Summary):

1. To Block Sensitive Pages from Crawling
2. To Avoid Duplicate Content Issues
3. For Testing or Staging Websites
4. To Block Specific File Types
5. To Save Your Crawl Budget
6. To Temporarily Block Pages