
I've recently been optimizing my deployment strategy and running a few tests along the way. One of them was about deciding how to run processes.

To test it, I created a simple Node.js app that just allocates a placeholder array and waits. I also added a "start" script to package.json, because that's usually how apps are run, especially when there are prerequisite steps like copying files before starting, or when the app is built on a framework that ships with a "serve" script.
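For concreteness, here's a minimal sketch of the kind of test app and package.json I mean (the array size and names below are placeholders, not my exact code):

app.js:

    // Allocate a placeholder array and keep the process alive
    const placeholder = new Array(1_000_000).fill(0);

    // An idle interval keeps the event loop running so memory usage can be observed
    setInterval(() => {}, 60_000);

package.json:

    {
      "name": "memtest",
      "private": true,
      "scripts": {
        "start": "node app.js"
      }
    }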

I spawned 10 instances of my test app with for i in {1..10}; do pnpm run start & done, and saw this:

[screenshot of btop, showing memory usage]

The app uses ~45M of memory while the pnpm run start process uses ~72M. I thought it might not be "real" memory usage, since there could be memory sharing happening in the background. So I ran smem to see the USS, PSS, and RSS values:

[grep'd screenshot of smem, for pnpm run start]
[grep'd screenshot of smem, for the node apps]

Here we can see each pnpm run start uses ~32M of unique memory, while the actual process uses ~7M.
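For anyone reproducing the measurement: an smem invocation along these lines produces that kind of filtered output (illustrative only; -P filters by process command line, -c selects the columns, -k prints human-readable sizes):

    # Unique (USS), proportional (PSS) and resident (RSS) memory,
    # once for the pnpm wrappers and once for the node processes
    smem -k -c "pid command uss pss rss" -P "pnpm run start"
    smem -k -c "pid command uss pss rss" -P "node app.js"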

So, my questions are:

  1. Why does starting a Node app through a run script have such substantial memory overhead compared to starting it directly with node app.js?
  2. Is there a way to fix it (other than not using run scripts)?
  3. Is my method of measurement wrong?

- PS: The same thing also happens when I use npm run instead of pnpm run. I don't think it's specific to pnpm.

- PS: I also tried running 10 of the apps with for i in {1..10}; do node app.js & done, thinking maybe the run script and the app "divide" memory between them for stdout etc., making it look like pnpm is using more memory. But nope: the app still uses only 6M of memory when I spawn it with node app.js.
