We're evolving to serve you better! This current forum has transitioned to read-only mode. For new discussions, support, and engagement, we've moved to GitHub Discussions.

Problem: SEO Noindex Website switch not working.

  • #8705
    [anonymous]

    I have created several websites; one is deployed with only 20 pages. While updating this site and the others to be deployed later, I noticed that with the SEO Noindex Website switch turned off, the robots.txt still has

    User-agent: *

    I tried toggling the switch back and forth, just in case, with the same results.

    I checked the site.config.json file and found

    "metaRobotsIndex": "index, follow",

    and found nothing similar in the other JSON files.

    I am trying to get the 20-page site indexed while I perfect it, before getting too carried away with posts (and there will be a lot). I keep forgetting to edit robots.txt on the server, and that is causing problems with Google.

    This happens on all the sites I created with different templates. I am using TechNews for the deployed site.

    What can I do to fix this on my end or is it something you have to do?


    Bob

    I’ve checked it several times, and this is how it works:

    When the “Noindex site” option is enabled, the generated robots.txt file contains “Disallow: /”, which means disallow all.
    When the “Noindex site” option is disabled, the generated robots.txt file contains “Disallow:”, which means allow all.

    So for me, it works as it should.
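
    To make the difference concrete, here is a sketch of the two generated files based on the description above (the exact output may vary by template):

        # Noindex site ENABLED – block all crawlers
        User-agent: *
        Disallow: /

        # Noindex site DISABLED – allow all crawlers
        User-agent: *
        Disallow:

    An empty Disallow directive tells crawlers that nothing is disallowed, so the second form permits crawling of the entire site; it is easy to misread the two at a glance.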

    [anonymous]

    I will also add that we use the same syntax and have not observed any issues with Google.

    [anonymous]

    My apologies. I guess I was tired when I looked the other night and did not see the difference.

    Perhaps I went cross-eyed?

    When I created robots.txt files in the past, I just left these lines out.

    Thanks for the explanation.

    Google has complained from time to time; however, I will call that Google's problem.