Robots Failing In Orchard
I've been getting to grips with the excellent ASP.NET Orchard, and over the last couple of weeks have fully migrated my old blog to the platform. The issue I'd had with the application restarting turned out to be my hosting provider throttling the memory limit on my Application Pool too aggressively. To their credit, once the issue was discovered they relaxed the limit immediately, and the site has worked perfectly ever since.
The only stupid little problem I've had was with enabling Chad Sharf's Robots module, which serves 'robots.txt' through the pipeline and provides a simple editor for it in the administration interface. Every time I tried to access 'robots.txt' on my site with the module enabled, I got a 404 error.
After several hours pulling my hair out and trawling through the stupidly simple code, the answer hit me.
When I'd originally set up Orchard before uploading to my hosting provider, I'd been using WebMatrix; it turns out that WebMatrix had added a 'robots.txt' for me in the root of the project, which duly got copied over to my hosting provider when I deployed Orchard. This might not seem to be an issue, but Orchard's configuration explicitly turns off all handlers to force all requests through its own routing mechanism:
<handlers accessPolicy="Script">
  <!-- clear all handlers, prevents executing code file extensions, prevents returning any file contents -->
  <clear/>
  <!-- Return 404 for all requests via managed handler. The url routing handler will substitute the mvc request handler when routes match. -->
  <add name="NotFound" path="*" verb="*" type="System.Web.HttpNotFoundHandler" preCondition="integratedMode" requireAccess="Script"/>
</handlers>
The upshot of this was that (counter-intuitively) if a 'robots.txt' file existed on the filesystem, any request for it immediately returned a 404 and never reached Orchard's routing mechanism - simply deleting the file kicked the module into life.
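For anyone who would rather keep a physical 'robots.txt' on disk instead of deleting it, one alternative might be to restore IIS's static-file handler for just that path ahead of Orchard's catch-all NotFound entry. This is only a sketch I haven't verified against Orchard (and it bypasses the Robots module entirely, so the admin editor would have no effect); the "RobotsTxt" handler name is my own invention:

```xml
<handlers accessPolicy="Script, Read">
  <clear/>
  <!-- Hypothetical: serve the physical robots.txt directly from disk.
       Listed before the catch-all so it wins for this path;
       note this bypasses the Robots module's pipeline-served version. -->
  <add name="RobotsTxt" path="robots.txt" verb="GET"
       type="System.Web.StaticFileHandler"
       preCondition="integratedMode" requireAccess="Read"/>
  <!-- Orchard's default: 404 everything else so routing handles it. -->
  <add name="NotFound" path="*" verb="*"
       type="System.Web.HttpNotFoundHandler"
       preCondition="integratedMode" requireAccess="Script"/>
</handlers>
```

Deleting the stray file is still the simpler fix if, like me, you want the module to serve the content.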