robots.txt and WPMU (10 posts)

  1. tene
    Member
    Posted 16 years ago #

    Thought I'd drop in a plug for Adam Brown's robots.txt plugin; I haven't seen it referenced on this forum yet. Works brilliantly for WPMU, and it's so simple. Very useful for anyone trying to manage robots.txt for subdomains and/or externally mapped domains.

    Plugin: KB Robots.txt

    Maybe this approach would nail the Google Sitemap issue... without hand-to-hand combat with filesystems, permissions, etc.?

  2. drmike
    Member
    Posted 16 years ago #

    We still have a Google Sitemap issue? I thought we solved that months ago.

    You may want to mention what this plugin does, by the way, with respect to WPMu. Gotta admit that I don't like the idea of my users having access to a core file like that.

  3. adamrbrown
    Member
    Posted 16 years ago #

    The plugin lets you control your robots.txt from within the admin interface. I wrote it with WPMU in mind, since it allows each user to block robots from parts of their blog (given that each subdomain can have its own robots.txt file).

    You write what you want robots.txt to say into a text box under Options -> Robots.txt. If somebody (likely a robot) visits yourblog.example.com/robots.txt, then WP serves up whatever you wrote in the option.

    A very simple plugin, really.
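
    For the curious, the mechanism is just: save the text-box contents as an option, then echo that option whenever WordPress itself ends up handling a request for /robots.txt. A rough sketch of the idea (simplified; the option and function names here are made up, not the real ones):

        <?php
        // Illustrative sketch only -- not the real plugin code.

        function myplugin_maybe_serve_robots_txt() {
            // Only act when WordPress itself is answering /robots.txt
            // (i.e. no physical robots.txt file intercepted the request).
            if ( ! is_robots() ) {
                return;
            }

            $text = get_option( 'myplugin_robots_txt' ); // hypothetical option name
            if ( false === $text || '' === $text ) {
                return; // nothing saved -- let WP produce its default output
            }

            header( 'Content-Type: text/plain; charset=utf-8' );
            echo $text;
            exit;
        }
        add_action( 'template_redirect', 'myplugin_maybe_serve_robots_txt' );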

  4. drmike
    Member
    Posted 16 years ago #

    Thanks. I've made a note of it.

    I know this comes up over in wp.com land on occasion as folks ask why they can't drop IPs into a master block list. That would be too easy to abuse. Something like this would work better.

    Of course you would have to watch out for folks blocking themselves ;)

  5. adamrbrown
    Member
    Posted 16 years ago #

    Right, well, by "block" I just mean disallowing in robots.txt; it's not like anybody would actually be prevented from viewing the site.

  6. Ovidiu
    Member
    Posted 16 years ago #

    I use a "real" robots.txt file - curious whether that one or the one "generated" by this plugin would take precedence...

    Also, I'd like to mention that I would prefer only the site admin to be able to edit this plugin's options, with them set sitewide for all blogs; I don't see the necessity of allowing each blog owner access to these settings personally.

  7. drmike
    Member
    Posted 16 years ago #

    Agreed, as I have the login pages and whatnot blocked out. (Well, I think I do. Don't remember if I ever did put that in there.)

  8. adamrbrown
    Member
    Posted 16 years ago #

    As for a "real" robots.txt file, I don't know; I haven't tried. If your .htaccess file uses the !-f condition to keep requests for real files from being rewritten to your WP install, then the actual robots.txt file would be served. But if WP is what answers the URL, the plugin does. It depends on whether your server is directing requests for robots.txt to WP.
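
    For reference, the relevant piece is the !-f / !-d pair in the standard WP rewrite rules; a real WPMU .htaccess is longer, but the effect is the same: requests for files that actually exist on disk never reach WordPress.

        RewriteEngine On
        RewriteBase /

        # If the requested path is a real file or directory, serve it as-is;
        # otherwise hand the request to WordPress.
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]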

    As for your suggestion, that's fine, but it would require a different plugin. The point of this one is to let users control it. Probably the easiest way to have a single robots.txt file control the entire installation is to use the subdirectory installation instead of the subdomain install. But like I said, that's not really the point of this particular plugin.

  9. Ovidiu
    Member
    Posted 16 years ago #

    I try to protect almost all WPMU internal pages from being spidered... I also try to keep print-layout pages and email-this-post pages from being spidered, to avoid having a lot of different pages with the same content indexed.
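
    For example, something along these lines; the print and email-this-post paths depend entirely on which theme and plugins you run, so treat those lines as placeholders:

        User-agent: *
        Disallow: /wp-admin/
        Disallow: /wp-login.php
        Disallow: /wp-includes/
        # placeholder paths -- adjust to your own theme/plugins
        Disallow: /print/
        Disallow: /email/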

  10. drmike
    Member
    Posted 16 years ago #

    Maybe you could hardcode the stuff you want in there directly into the plugin.
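
    Something like a small helper in the plugin that prepends fixed, site-wide rules to whatever each blog owner saved (option and function names are just made up to illustrate):

        // Hypothetical tweak: fixed site-wide rules go in front of the
        // per-blog text saved in the option, so they apply to every blog.
        function myplugin_robots_txt_body() {
            $hardcoded = "User-agent: *\n"
                       . "Disallow: /wp-admin/\n"
                       . "Disallow: /wp-login.php\n";
            $user_text = (string) get_option( 'myplugin_robots_txt' );
            return $hardcoded . "\n" . $user_text;
        }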
