We're evolving to serve you better! This current forum has transitioned to read-only mode. For new discussions, support, and engagement, we've moved to GitHub Discussions.

Publii, Git, and Large Sites

  • #10491
    [anonymous]

    I have a very large site (400+ content pages).

    When trying to sync to GitHub, it fails because the site is too big to push in one commit, or I get an "Error: Connection Reset".

    When trying to sync to GitLab, the upload will fail at about 700/2100 objects to be written and I'll get something like "Error: Bad Request".

    I also appear to be getting desync errors very often in GitLab; however, the logs and status messages just tell me I received a "Connection Reset" or an "Error: Bad Request".

    This is driving me nuts.

    I imagine this is mostly due to the huge size of the site. For this scale of site, is it even possible to use a git backend? Or do you advise only using something like S3?
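One way to stay on a git backend despite the size is to split the upload into several smaller commits and pushes, so no single push carries all 400+ pages at once. Below is a dry-run sketch of that idea; `push_in_batches`, the batch size, and the commit message are invented here for illustration and are not Publii options.

```shell
#!/bin/sh
# Dry-run sketch (not a Publii feature): print the git commands that would
# commit and push a file list in fixed-size batches, so no single push has
# to carry the whole site. "push_in_batches" is a made-up name.
push_in_batches() {
  size="$1"; shift   # number of files per commit/push
  count=0
  files=""
  for f in "$@"; do
    files="$files $f"
    count=$((count + 1))
    if [ "$count" -ge "$size" ]; then
      # Replace "echo" with "eval" (or drop it) to actually run the commands.
      echo "git add$files && git commit -m 'site update batch' && git push"
      files=""
      count=0
    fi
  done
  # Flush the final, possibly smaller, batch.
  if [ -n "$files" ]; then
    echo "git add$files && git commit -m 'site update batch' && git push"
  fi
}

# Example: three pages in batches of two yields two separate pushes.
push_in_batches 2 a.html b.html c.html
```

In practice you would feed it something like `git ls-files --others --modified --exclude-standard` from the exported site folder; each push then stays well under the host's per-request limits.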

    [anonymous]

    We have the same issue, I’ve taken to exporting the site after changes to a folder and uploading that separately via a script:

    # Copy the exported site into the local git clone, then commit and push.
    rsync -aP ~/Documents/Publii/sites/electragirl/electragirl-files/ ~/Documents/Publii/sites/electragirl/electragirl
    cd ~/Documents/Publii/sites/electragirl/electragirl
    git add -A   # -A also stages deletions and dotfiles, which "git add *" misses
    git commit -m "Manual update"
    git push

    It's a kludge, but it works. As far as I can tell, Publii doesn't honour the API rate limit.

    [anonymous]

    Thank you! I got set up with GitHub Desktop, so now I just have to press 2 buttons instead of 1 LOL.

    I didn’t realize Publii does funky stuff with what should just be a simple git push. This is a great workaround.

    [anonymous]


    Maybe it will help with this issue: we have released an alpha version with support for git repositories, which is much more efficient than the current API-based approach: