What Is robots.txt?
The robots.txt file tells search engine crawlers which pages they may and may not access on your website. It lives at the root of your domain (e.g., yoursite.com/robots.txt) and is one of the first files crawlers request. Note that robots.txt is advisory, not enforcement: well-behaved crawlers honor it, but it does not protect private content on its own.
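To see how a crawler interprets these rules, you can parse a robots.txt with Python's standard-library `urllib.robotparser`. This is a minimal sketch: the rules and the yoursite.com URLs are hypothetical, stand-ins for your own site.

```python
import urllib.robotparser

# Hypothetical robots.txt content, as it might appear at yoursite.com/robots.txt.
rules = """\
User-agent: *
Disallow: /admin/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch() answers the same question a polite crawler asks before fetching a URL.
print(parser.can_fetch("*", "https://yoursite.com/admin/settings"))  # False: under /admin/
print(parser.can_fetch("*", "https://yoursite.com/blog/post-1"))     # True: not disallowed
```

In production you would point the parser at the live file with `set_url()` followed by `read()` instead of parsing an inline string.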
Why robots.txt Matters for SEO
A well-configured robots.txt keeps crawlers focused on the pages you want indexed and steers them away from admin areas, duplicate content, and other pages that add no search value.
How to Generate robots.txt
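Because robots.txt is just a plain-text file at the site root, you can generate it with a few lines of script. A minimal sketch, assuming a hypothetical rule set (adjust the agents and paths for your own site):

```python
from pathlib import Path

# Hypothetical rules: (user-agent, list of disallowed path prefixes).
rules = [
    ("*", ["/admin/", "/login/"]),  # block private areas for all crawlers
    ("BadBot", ["/"]),              # block one specific bot entirely
]

lines = []
for agent, disallowed in rules:
    lines.append(f"User-agent: {agent}")
    lines.extend(f"Disallow: {path}" for path in disallowed)
    lines.append("")  # blank line separates groups

Path("robots.txt").write_text("\n".join(lines))
print(Path("robots.txt").read_text())
```

Deploy the resulting file so it is served at yoursite.com/robots.txt; crawlers will not find it anywhere else.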
Common Rules
Allow All Crawlers
```
User-agent: *
Allow: /
```
Block Admin Pages
```
User-agent: *
Disallow: /admin/
Disallow: /login/
```
Block Specific Bots
```
User-agent: BadBot
Disallow: /
```