I’ve spent a fair amount of time scouring the documentation and source code, but I’m unable to speed up this use case.
We’re using webpack in a huge application, bordering on 1000 components, all of them fairly substantial. Our initial compile times are slow (40s), and our incremental builds are also slow (5s).
Here’s a repository that seeks to emulate a test case for what I’m dealing with: https://github.com/tvararu/webpack-test.
- The entry point is `A.js`, which requires `B.js` and `C.js`.
- `B.js` is tiny and doesn’t have a lot of dependencies.
- `C.js` is monolithic and has thousands of requires.
My expectation is that when using `webpack-dev-server` in the test project, whenever I save `B.js`, webpack should recognize that neither `C.js` nor any of its dependencies have been touched. It should compile `B.js` swiftly (in <10ms), replace it in the cache, and output the compiled `A.js` using the cached version of `C.js` from the initial compile.
However, webpack compiles 3002 hidden modules every time I save `B.js`, leading to a compile time of 960ms. That isn’t bad on its own, but it spirals out of control once you add loaders like `babel`, which is the case in our real project.
I do have a solution: the same test project has a `dll` branch. On that branch, you can run `webpack --config webpack.dll.config.js` to generate two DLLs from `C.js`, which then get leveraged when compiling `A.js`. Whenever you save `B.js`, its DLL gets recompiled; `A.js` notices that one of its DLLs has updated, takes the old DLL of `C.js` and the new DLL of `B.js`, and conjoins them into one quick, happy bundle.
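The `dll` branch boils down to two configs along these lines (a sketch with illustrative names, not the exact files in the repo):

```javascript
// webpack.dll.config.js — build B.js and C.js into standalone DLL bundles
var webpack = require("webpack");

module.exports = {
  entry: {
    B: ["./B.js"],
    C: ["./C.js"]
  },
  output: {
    path: __dirname + "/build",
    filename: "[name].dll.js",
    library: "[name]_dll"
  },
  plugins: [
    // DllPlugin writes a manifest mapping module paths to DLL ids
    new webpack.DllPlugin({
      path: __dirname + "/build/[name]-manifest.json",
      name: "[name]_dll"
    })
  ]
};

// webpack.config.js — the main build then references the prebuilt DLLs
// via DllReferencePlugin instead of re-walking their module graphs:
//
//   plugins: [
//     new webpack.DllReferencePlugin({
//       context: __dirname,
//       manifest: require("./build/C-manifest.json")
//     }),
//     new webpack.DllReferencePlugin({
//       context: __dirname,
//       manifest: require("./build/B-manifest.json")
//     })
//   ]
```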
I could go further on that branch and do directory reads or dependency-graph walks to generate a DLL for every component. In theory, that should make compiles as fast as I’d like. But at that point it seems like I’ve just reimplemented (poorly) what webpack’s caching layer should do on its own, so what’s going on here?
What blindingly obvious thing am I missing?
Thanks for reading. 🙂