How to edit robots.txt

1 month ago
I'm running a NopCommerce 4.20 site.
I need to add some additional Disallow rules to the robots.txt.

I created robots.additions.txt and added the rules. After uploading, I verified the content of this file (using FTP).
Then I opened robots.txt in my browser (https://www.josephiena.nl/robots.txt) to check.
The new rules have NOT been added.
When I went back to robots.additions.txt using FTP, I saw its content had been deleted.

What do I need to do to add the additional rules?
1 month ago
Having a robots.additions.txt file in the root directory of your website is the correct thing to do.
I searched the 4.20 code and there is nothing that would empty the contents or delete that file.
public virtual string PrepareRobotsTextFile()
...
    //load and add robots.txt additions to the end of file.
    var robotsAdditionsFile = _fileProvider.Combine(_fileProvider.MapPath("~/"), "robots.additions.txt");
    if (_fileProvider.FileExists(robotsAdditionsFile))
    {
        var robotsFileContent = _fileProvider.ReadAllText(robotsAdditionsFile, Encoding.UTF8);
        sb.Append(robotsFileContent);
    }


Do you have some type of "Continuous Delivery DevOps" that might be doing it?
1 month ago
Thanks for looking into the code.
No, I don't use any CI/CD. I just uploaded the file using FTP.

I just took another look: I had placed the file in the wwwroot folder, which also contains the DLLs and web.config.
I noticed this folder has another wwwroot folder inside it, which already contains a robots.additions.txt.
I updated that file and FTP'd it again, but nothing happened.

Might be a caching issue. I see we have several customers online right now. Will do a restart later tonight.
1 month ago
I don't know why my post above shows .MapPath("~/") rather than ~/wwwroot. I copy/pasted the code! It is looking in wwwroot:

.Combine(_fileProvider.MapPath("~/wwwroot"), RobotsTxtDefaults.RobotsAdditionsFileName);
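
In terms of what you see over FTP (assuming a standard published site), the layout is roughly:

/            <- application root: DLLs, web.config, App_Data, ...
/wwwroot     <- static files: robots.additions.txt (and robots.custom.txt) belong here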


FYI, note that instead of having "additions" appended to the generated file, you can completely override it using robots.custom.txt (which should also be in the wwwroot folder):

//if robots.custom.txt exists, let's use it instead of hard-coded data below
var robotsFilePath = _fileProvider.Combine(_fileProvider.MapPath("~/wwwroot"), RobotsTxtDefaults.RobotsCustomFileName);
if (_fileProvider.FileExists(robotsFilePath))
    ...
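
Putting the two snippets together, the flow in 4.20 is roughly this (a simplified sketch based on the snippets above, not the verbatim source; the real method generates many more default rules in the else branch):

//simplified sketch of the precedence logic inside PrepareRobotsTextFile()
//(needs System.Text for StringBuilder and Encoding)
var sb = new StringBuilder();

//1. robots.custom.txt in ~/wwwroot wins outright and replaces everything
var customFile = _fileProvider.Combine(_fileProvider.MapPath("~/wwwroot"), "robots.custom.txt");
if (_fileProvider.FileExists(customFile))
{
    sb.Append(_fileProvider.ReadAllText(customFile, Encoding.UTF8));
}
else
{
    //2. otherwise the default hard-coded rules are built here
    //   ...
    //3. and robots.additions.txt (if present) is appended to the end
    var additionsFile = _fileProvider.Combine(_fileProvider.MapPath("~/wwwroot"), "robots.additions.txt");
    if (_fileProvider.FileExists(additionsFile))
        sb.Append(_fileProvider.ReadAllText(additionsFile, Encoding.UTF8));
}
return sb.ToString();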
1 month ago
I now added a robots.txt, a robots.custom.txt and a robots.additions.txt in both wwwroot folders and now it seems to work.

Not sure what file (and location) solved it, but I'm happy now.
Thanks for all the advice.
1 month ago
pmeems wrote:
I now added a robots.txt, a robots.custom.txt and a robots.additions.txt in both wwwroot folders and now it seems to work.

Not sure what file (and location) solved it, but I'm happy now.
Thanks for all the advice.


That's great!
Just be aware that once robots.custom.txt is found in the wwwroot folder, the system won't use robots.additions.txt at all.

robots.additions.txt is a file that lets you extend or append new rules on top of NopCommerce's default robots.txt rules.

Using robots.custom.txt means you have fully custom rules for robots: it overrides NopCommerce's default rules and also renders robots.additions.txt useless.
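
For example, if you do want to take over the whole file, a minimal robots.custom.txt (illustrative only, not a recommendation for your shop) could look like:

User-agent: *
Disallow: /admin
Disallow: /cart
Sitemap: https://www.josephiena.nl/sitemap.xml

With that file in wwwroot, none of the default rules and nothing from robots.additions.txt will appear in the served robots.txt.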