Process out of memory – Webpack

After a couple of builds running in watch mode, Webpack crashes because the Node process runs out of memory:

<--- Last few GCs --->

 9223585 ms: Scavenge 1390.0 (1454.7) -> 1390.0 (1454.7) MB, 9.0 / 0 ms (+ 0.7 ms in 1 steps since last GC) [allocation failure] [incremental marking delaying mark-sweep].
 9224727 ms: Mark-sweep 1390.0 (1454.7) -> 1388.3 (1452.7) MB, 1142.7 / 0 ms (+ 9.8 ms in 89 steps since start of marking, biggest step 2.5 ms) [last resort gc].
 9225694 ms: Mark-sweep 1388.3 (1452.7) -> 1388.8 (1454.7) MB, 966.6 / 0 ms [last resort gc].


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x35903c44a49 <JS Object>
    1: walkFunctionDeclaration [<PROJECT_DIR>/node_modules/webpack/lib/Parser.js:~443] [pc=0xa07a14ec8ee] (this=0x59f67991119 <a Parser with map 0x71f2d115d49>,statement=0x3507a80af661 <a Node with map 0x71f2d1157c9>)
    2: walkStatement [<PROJECT_DIR>/node_modules/webpack/lib/Parser.js:~348] [pc=0xa07a06dfc10] (this=0x59f6799111...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Abort trap: 6

Author: Fantashit

16 thoughts on “Process out of memory – Webpack”

  1. @sokra The css-loader documentation says to use a hash for easier debugging with CSS modules. Does that have the same problem?

    You can configure the generated ident with the localIdentName query parameter (default [hash:base64]). Example: css-loader?localIdentName=[path][name]---[local]---[hash:base64:5] for easier debugging.

    We’ve also started to see this problem in dev (out of memory), but our only hash in dev mode is the css one.

    EDIT: Of course it uses a hash by default, so I’m guessing that’s not the problem.
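
    For reference, here’s a sketch of the equivalent config-object form of that option (illustrative only: the exact options shape differs between css-loader versions, and modules must be enabled for localIdentName to apply):

        // Sketch: set an explicit localIdentName for debuggable class names.
        // The default ident already uses [hash:base64].
        module.exports = {
            module: {
                rules: [{
                    test: /\.css$/,
                    use: ['style-loader', {
                        loader: 'css-loader',
                        options: {
                            modules: true,
                            localIdentName: '[path][name]---[local]---[hash:base64:5]'
                        }
                    }]
                }]
            }
        };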

  2. We have the exact same memory-leak problem when using webpack -w on Mac OS X. Could you please suggest the best way to fix or work around this issue? The ngOfficeUIFabric issue seems to deal with a different problem.

    Thanks,
    Deepak

  3. Having the same problem running webpack via webpack-stream in gulp. I use gulp.watch to rerun on code changes, which then just runs

        return gulp.src(tsStartFile)
            .pipe(webpackStream(webPackConfig))
            .on('error', logError)
            .pipe(gulp.dest(outputDir));

    I’d have thought this would run webpack from scratch each time, but after 5 or so builds I get the out-of-memory error and have to restart my gulp task.
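
    For context, here’s a minimal sketch of how a task like this might be wired up (assumptions: the gulp 4 API, a src/**/*.ts watch glob, and made-up values for the names used above):

        // Each watch trigger creates a fresh webpack-stream pipeline,
        // yet memory still accumulates across builds.
        const gulp = require('gulp');
        const webpackStream = require('webpack-stream');

        // Hypothetical stand-ins for the identifiers in the snippet above.
        const tsStartFile = 'src/main.ts';
        const webPackConfig = require('./webpack.config.js');
        const outputDir = 'dist';
        const logError = (err) => console.error(err.message);

        function bundle() {
            return gulp.src(tsStartFile)
                .pipe(webpackStream(webPackConfig))
                .on('error', logError)
                .pipe(gulp.dest(outputDir));
        }

        // gulp 4 style: re-run the bundle on every TypeScript change.
        exports.watch = () => gulp.watch('src/**/*.ts', bundle);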

  4. Another data point here: I have filename hashes disabled and my build isn’t terribly large (it uses Angular 2, but few other dependencies), yet after about an hour of dev work the watch dies with Abort trap: 6.

    I’ve temporarily increased the memory limit, which gives me a lot more time, but I’d rather not be consuming gigabytes just to run a watch command.

    FYI, if anyone is curious about upping the memory limit, I did it like so in my package.json (the additional flags are obviously dependent on your needs):

    {
      "scripts": {
        "start": "node --max_old_space_size=4096 node_modules/webpack-dev-server/bin/webpack-dev-server.js --inline --progress --port 3000 --open"
      }
    }

  5. Not sure if this is going to help anyone, but after updating to webpack 4 we hit the same out-of-memory error. The build itself would often get stuck or become very slow. What very unexpectedly helped us was removing the BundleAnalyzerPlugin.
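
    If you’d rather keep the analyzer than delete it outright, one option is to enable it only on demand. A sketch, assuming the webpack-bundle-analyzer package and an ANALYZE environment variable of your own choosing:

        // Gate BundleAnalyzerPlugin behind an opt-in env var so ordinary
        // watch builds never load it. ANALYZE is a made-up variable name.
        const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

        module.exports = {
            // ...the rest of your webpack 4 config...
            plugins: [
                ...(process.env.ANALYZE ? [new BundleAnalyzerPlugin()] : [])
            ]
        };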

  6. @buildbreakdo A better option is to do this:

    "scripts": {
        "build": "cross-env NODE_OPTIONS=--max_old_space_size=8192 webpack"
    }

    (It uses the cross-env package.)

    The nice thing about using NODE_OPTIONS is that it will affect everything, including any sub-processes that Webpack might spawn.

    (It’s useful for non-Webpack things as well: we’re using it for TypeDoc, which consumes a lot of RAM.)

    This solution did resolve the problem in my project.

  7. Remove max_old_space_size and just add this to your ts-loader options:

        {
            loader: 'ts-loader',
            options: {
                transpileOnly: true // this works for me
            }
        }
