go - How does Hugo maintain site-wide data, like .Site.AllPages?
I'm looking for bite-sized examples of how Hugo might manage site-wide data, like .Site.AllPages.
Specifically, Hugo seems too fast to be reading in every file and its metadata before it begins generating pages, yet it makes things like .Site.AllPages available to every page -- which has to be the case.
Are Ruby (Jekyll) and Python (Pelican) simply slow, or is there a specific (algorithmic) method Hugo employs to be able to generate pages before everything is ready?
There is no magic, and Hugo does not start rendering until the .Site.Pages
etc. collections are filled and ready.
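The read-everything-first, render-second approach described above can be sketched in Go. This is a minimal, hypothetical illustration (none of these names are Hugo's actual code): phase one reads all content files concurrently into a shared collection, and only then does phase two render, with the full collection available to every page.

```go
package main

import (
	"fmt"
	"sort"
	"sync"
)

// Page is a minimal stand-in for Hugo's page type.
type Page struct {
	Path  string
	Title string
}

// readAll "parses" every content file concurrently and returns the
// complete collection. Nothing is rendered until this returns.
func readAll(paths []string) []Page {
	var (
		mu    sync.Mutex
		pages []Page
		wg    sync.WaitGroup
	)
	for _, p := range paths {
		wg.Add(1)
		go func(p string) {
			defer wg.Done()
			// Stand-in for reading the file and its front matter.
			pg := Page{Path: p, Title: "Title of " + p}
			mu.Lock()
			pages = append(pages, pg)
			mu.Unlock()
		}(p)
	}
	wg.Wait()
	// Deterministic order for the site-wide collection.
	sort.Slice(pages, func(i, j int) bool { return pages[i].Path < pages[j].Path })
	return pages
}

func main() {
	// Phase 1: fill the collection (think .Site.AllPages).
	all := readAll([]string{"content/a.md", "content/b.md"})

	// Phase 2: render; each page can see the whole site.
	for _, p := range all {
		fmt.Printf("rendering %s (site has %d pages)\n", p.Path, len(all))
	}
}
```

The point is that the two phases are strictly ordered; the speed comes from doing the work inside each phase concurrently, not from skipping the first phase.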
Some key points here:
- We have a processing pipeline with concurrent processing wherever we can, so the CPUs should be pretty busy.
- Wherever there is content manipulation (shortcodes, emojis etc.), you will see a hand-crafted parser or replacement function built for speed.
- We care about the "being fast" part, so we have a solid set of benchmarks to reveal performance regressions.
- Hugo is built in Go -- a fast language with a great set of tools (pprof, benchmark support etc.)
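To illustrate the last two points, here is a sketch of how Go's built-in benchmark support can measure a hand-crafted replacement function. The `emojiReplacer` below is a hypothetical stand-in, not Hugo's actual implementation; it shows the general technique of preparing a single `strings.Replacer` once instead of running repeated regex passes, and timing it with the standard `testing` package.

```go
package main

import (
	"fmt"
	"strings"
	"testing"
)

// emojiReplacer is a hypothetical stand-in for a hand-crafted
// replacement function: one prepared Replacer, built once, reused
// for every document.
var emojiReplacer = strings.NewReplacer(
	":smile:", "😄",
	":heart:", "❤️",
)

func main() {
	// testing.Benchmark lets you run a Go benchmark outside `go test`.
	res := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			emojiReplacer.Replace("hello :smile: world :heart:")
		}
	})
	fmt.Println(res) // prints iterations and ns/op; numbers vary by machine
}
```

In a real project this would live in a `_test.go` file as `func BenchmarkEmoji(b *testing.B)` and run under `go test -bench .`, where regressions show up as a jump in ns/op; pprof can then point at the hot spot.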
Some other points that make the hugo server variant faster than a regular hugo build:
- Hugo uses a virtual file system, and we render directly to memory when in server/development mode.
- We have partial-reloading logic in there. So, even though we render on every change, we try to reload and rebuild only the content files that have changed, and we don't reload/rebuild templates if only content changed, etc.
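The partial-reload idea from the list above can be sketched as a simple diff against the last build. This is a deliberately simplified, hypothetical model (Hugo's real logic is more involved): keep what was last built, and on a change event re-render only the files whose content differs, leaving cached templates and unchanged pages alone.

```go
package main

import "fmt"

// rebuildChanged compares each file's current content against what was
// last built and returns only the paths that need re-rendering.
// Hypothetical sketch of the partial-reload idea, not Hugo's code.
func rebuildChanged(lastBuilt, current map[string]string) []string {
	var changed []string
	for path, content := range current {
		if lastBuilt[path] != content {
			changed = append(changed, path)
		}
	}
	return changed
}

func main() {
	lastBuilt := map[string]string{
		"content/a.md": "v1",
		"content/b.md": "v1",
	}
	// The file watcher reports a save of content/a.md.
	current := map[string]string{
		"content/a.md": "v2",
		"content/b.md": "v1",
	}
	fmt.Println(rebuildChanged(lastBuilt, current)) // only content/a.md is rebuilt
}
```

Combined with rendering to an in-memory file system, this keeps the edit-and-refresh loop in server mode much cheaper than a full cold build.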
I'm bep on GitHub, the main developer on Hugo.