I think you can use a robots.txt file with the following rules:
Block one folder:

```
User-agent: *
Disallow: /folder/
```

Block one file:

```
User-agent: *
Disallow: /file.html
```
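If you want to sanity-check rules like these before deploying them, Python's standard-library `urllib.robotparser` can evaluate them. This is just a quick sketch; the combined rule set and the `example.com` URLs are illustrative, not from the original post:

```python
import urllib.robotparser

# Hypothetical robots.txt combining both rules from the examples above.
rules = """\
User-agent: *
Disallow: /folder/
Disallow: /file.html
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler identifying as any user agent ("*") is blocked
# from the folder and the single file, but not from other paths.
print(parser.can_fetch("*", "https://example.com/folder/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/file.html"))         # False
print(parser.can_fetch("*", "https://example.com/other.html"))        # True
```

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it does not actually prevent access to the blocked URLs.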
You can find more detailed information here: https://moz.com/learn/seo/robotstxt