Robots.txt file is set to disallow all search engines

#1

Hello,

It appears my site is serving a robots.txt file that disallows all search engines from indexing it. This was happening even before I provided my own robots.txt. Even when I push a different file that allows all search engines to index the site, the result is the same. I cannot figure out where this rule or file is coming from.
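For reference, I can't see exactly where the served file comes from, but a blanket block is essentially the first snippet below, and the allow-everything file I keep pushing is essentially the second (my actual files may differ slightly, this is just the standard form):

```
# Blocks every crawler from the entire site
User-agent: *
Disallow: /

# Allows every crawler to index everything
User-agent: *
Disallow:
```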

https://yourfishinglog.com/robots.txt

Any help would be greatly appreciated.

Thanks,

Andrew

#2

It’s quite possibly coming from your host. What host/platform are you using?

#3

I’m using Heroku currently. I have also written to their support team.

#4

Found it tucked away in one of my app's directories. My bad. Thank you for the help, sdayman.
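In case it helps anyone else, here is a minimal Python sketch of the search I ended up doing by hand. It assumes you run it from the project root; it just walks the tree and prints every robots.txt it finds along with its contents:

```python
from pathlib import Path

# Walk the project tree from the current directory and report every
# robots.txt, so a stray copy buried in an app directory is easy to spot.
for path in Path(".").rglob("robots.txt"):
    print(path.resolve())
    print(path.read_text())
```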
