Okay.
So I found a way to use Cloudflare to load balance two WordPress news websites.
What I did
I've replaced the real details with the placeholders below.
Site A: www.site.com, IP 0.0.0.1 - Australia
Site B: balance.site.com, IP 0.0.0.2 - Singapore
www.site.com is my original, well-established WordPress site.
I got myself a second hosting account with a different web host (both are cPanel).
I created a sub-domain on my second server and called this balance.site.com
I then used Cloudflare to create an "A" record for balance.site.com, pointing at the second server's IP (0.0.0.2).
I then migrated a copy of my site www.site.com over to the second server, balance.site.com
Now my site is working on both servers. But I made sure to NOINDEX balance.site.com to avoid duplicate-content issues in the search engines. (At this stage it is basically a staging site running on balance.site.com, hosted on a different server elsewhere - a strategic geo location for my purposes.)
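If anyone wants to copy this setup: one way to enforce the NOINDEX server-wide on site B (just a sketch, assuming Apache with mod_headers, which cPanel hosts typically have) is an X-Robots-Tag header in site B's .htaccess:

```apache
# Site B (balance.site.com) .htaccess
# Assumes Apache with mod_headers enabled (typical on cPanel hosting)
<IfModule mod_headers.c>
    # Tell crawlers not to index anything served directly by this origin
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

(Heads up: this header would also go out if the load balancer routes www traffic to site B, so it has the same catch as the robots.txt problem further down.)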
I then created a Cloudflare load balancer, adding the origins by IP and using Host header names.
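Roughly what the pool looks like (a sketch using the pool/origin shape from the Cloudflare Load Balancing API - the names are my placeholders, so double-check the exact field names against the API docs):

```json
{
  "name": "news-pool",
  "origins": [
    {
      "name": "site-a-au",
      "address": "0.0.0.1",
      "enabled": true,
      "header": { "Host": ["www.site.com"] }
    },
    {
      "name": "site-b-sg",
      "address": "0.0.0.2",
      "enabled": true,
      "header": { "Host": ["balance.site.com"] }
    }
  ]
}
```

The Host header override is the important bit - both servers receive traffic for the same public hostname, so each origin needs to be told which vhost to serve.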
Boom! Loadbalancing working great!
For site sync I used a paid plugin that has some good options and syncs the two sites: posts, plugins, databases, all of it. I found something called wpsynchro.com
THE PROBLEM!
Because I had to set the second site, balance.site.com, to NOINDEX to avoid duplicate-content issues, if the search engines were ever served site B's robots.txt (balance.site.com/robots.txt) through the load balancer, that could get my site deranked from Google.
So close to such a simple load-balancing solution!
So. My questions.
While using Cloudflare load balancing (I don't care about failover), does anyone think or know if there is a way to make sure that only the robots.txt from site A (www.site.com) is being read at all times?
Of course, with this sync plugin I found there is actually a way to tell it not to push/pull a certain file. So that's good: it stops site A's robots.txt from being sent to site B, which would leave site B with the indexable instructions from site A's robots.txt and bring back the duplicate-content issue in the search engines.
So.
Happy to hear from anyone who might be able to tell me if there is a way to ask Cloudflare to read the robots.txt file from site A only, at all times.
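One idea I've been wondering about but haven't tested: putting a small Cloudflare Worker on the hostname that pins /robots.txt to site A and lets everything else fall through to the load balancer. A sketch in TypeScript - note that "origin-a.site.com" is a hypothetical extra DNS record that resolves straight to server A (0.0.0.1), bypassing the load balancer:

```typescript
// Sketch of a Cloudflare Worker that pins /robots.txt to site A.
// ASSUMPTION: "origin-a.site.com" is a hypothetical DNS record that
// resolves only to server A (0.0.0.1), bypassing the load balancer.
const ROBOTS_UPSTREAM = "https://origin-a.site.com/robots.txt";

// Pure routing decision, kept separate so it is easy to reason about:
// returns the pinned upstream URL, or null to fall through to the pool.
function pinnedUrlFor(pathname: string): string | null {
  return pathname === "/robots.txt" ? ROBOTS_UPSTREAM : null;
}

export default {
  async fetch(request: Request): Promise<Response> {
    const pinned = pinnedUrlFor(new URL(request.url).pathname);
    // Fetch the pinned URL, or pass the request through to the balancer.
    return fetch(pinned ?? request);
  },
};
```

I'm honestly not sure how a same-zone subrequest interacts with the load balancer, which is why the sketch fetches from a dedicated hostname instead of www.site.com itself.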
Thanks