Two Files Every Website Should Have

When a search engine 'bot' visits your site for the first time, it looks for two files right off the bat: robots.txt and sitemap.xml. Search engines use these files to learn what your website contains, which pages they're allowed to crawl, and which pages to leave alone.

The robots.txt file is like a stoplight. It tells bots where they can crawl and where they cannot. Don't want your images included in Google's index? This is where you tell them. Don't want them crawling your admin section? I don't blame you. Block them.
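As an illustration, a robots.txt covering both of those cases might look like this (the /admin/ folder and the bot names are just examples; check your own folder names and the bots you actually want to block):

```
# Keep every bot out of the admin section
User-agent: *
Disallow: /admin/

# Keep Google's image crawler out entirely
User-agent: Googlebot-Image
Disallow: /
```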

The robots file goes in your root directory and is a simple text file you can edit with Notepad or any text editor. To see if you have one, just visit yoursite.com/robots.txt in your browser (using your own domain, of course). If nothing is found, you do not have a robots file or you put it in the wrong place.

There are two basic parts to the robots file: the bot specification (a User-agent line) and the permissions for that bot (Allow and Disallow lines). Only want to block your site from Yahoo!? You can (although I'm not sure why you would).
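If you want to sanity-check your rules before uploading the file, Python's built-in urllib.robotparser module can evaluate a robots.txt against sample URLs. The rules and paths below are made up for illustration:

```python
# Test robots.txt rules locally with Python's standard library.
from urllib import robotparser

# Example rules: everyone is blocked from /admin/,
# and Google's image bot is blocked from the whole site.
rules = """\
User-agent: *
Disallow: /admin/

User-agent: Googlebot-Image
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The wildcard rule blocks every bot from the admin section...
print(rp.can_fetch("*", "/admin/login.php"))               # False
print(rp.can_fetch("*", "/about.html"))                    # True
# ...while Googlebot-Image is blocked everywhere.
print(rp.can_fetch("Googlebot-Image", "/photos/cat.jpg"))  # False
```

In a real script you would point the parser at your live file with `rp.set_url("https://yoursite.com/robots.txt")` followed by `rp.read()`; parsing the rules from a string, as above, just lets you test without a web server.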


The other file, sitemap.xml, is just what it sounds like. It's a sitemap, or roadmap, of your entire website - complete with links to every page, all bundled into one nice little file. To make this file, visit an online sitemap generator, put in your website address, and it gives you back a downloadable file to put on your website. The only downside to this process is you have to redo it every time your site changes.
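A minimal sitemap.xml looks like the sketch below - the domain and pages are placeholders, and the optional lastmod date tells bots when a page last changed:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/about.html</loc>
  </url>
</urlset>
```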

If you're using WordPress, it's even simpler. Plugins like Yoast SEO (which I recommend) and Google XML Sitemaps will create and update your sitemap file with every change you make.

Lastly, you can add a line to your robots.txt to tell search engines where to find your sitemap file. This is the full URL, so in most cases it will be yoursite.com/sitemap.xml.
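In the robots file, that line looks like this (with yoursite.com standing in for your own domain):

```
Sitemap: https://yoursite.com/sitemap.xml
```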


Having these two files on your website makes it easier for search engines to understand what your site is about, and it can shorten the time it takes to get a new site indexed.