I have created several websites, one of which is deployed with only 20 pages. While updating this site and the others I plan to deploy later, I noticed that with the SEO Noindex Website switch turned off, the robots.txt still contains:
User-agent: *
Disallow:
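In case it matters, my understanding of standard robots.txt semantics (nothing specific to the builder) is that an empty Disallow allows everything, so I assumed turning the Noindex switch on would generate the blocking form instead:

```
# Hypothetical output I expected with Noindex ON:
User-agent: *
Disallow: /
```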
I tried toggling the switch back and forth just in case, with the same result every time.
I checked the site.config.json file and found
"metaRobotsIndex": "index, follow",
but nothing similar in the other JSON files.
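For reference, "index, follow" and "noindex, nofollow" are the standard meta robots values, so my guess (purely an assumption about how the switch maps to this key) is that turning Noindex on should flip that line to:

```json
"metaRobotsIndex": "noindex, nofollow",
```

so the value I see at least looks consistent with the switch being off.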
I am trying to get the 20-page site indexed while I perfect it, before getting too carried away with posts (and there will be a lot of them). In the meantime, I keep forgetting to edit the robots.txt on the server, and that is screwing me up with Google.
This happens on every site I have created, across different templates. The deployed site uses the TechNews template.
What can I do to fix this on my end, or is it something you have to do?
Cheers!!