Robots.txt file is set to disallow all search engines



My site is serving a robots.txt file that disallows all search engines from indexing it. This was happening even before I provided my own robots.txt. Even when I push a different file that allows all search engines to index the site, the result is the same. I cannot figure out where this rule or file is coming from.
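For reference, the two cases in question look like this. A robots.txt that blocks all crawlers is:

```
User-agent: *
Disallow: /
```

while one that allows all crawlers to index everything is:

```
User-agent: *
Disallow:
```

Whatever file you push, the version that matters is the one actually served at `https://yourdomain.com/robots.txt` (hypothetical URL), so checking that in a browser or with curl shows which copy is winning.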

Any help would be greatly appreciated.




It’s quite possibly coming from your host. What host/platform are you using?


I’m using Heroku currently. I have also written to their support team.


Found it tucked away in one of my app’s directories. My bad. Thank you for the help, sdayman.
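For anyone hitting the same thing: a stray robots.txt elsewhere in the project tree can shadow the one you intend to serve. A quick way to find every copy, run from the project root (a sketch, assuming a Unix-like shell):

```shell
# List every robots.txt anywhere under the current directory
find . -type f -name 'robots.txt'
```

Any copy outside the directory your framework actually serves static files from is a candidate for deletion.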


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.