Long run in Newman or Postman runner

Hello,

Is it possible, in Newman or the Collection Runner, to run a collection that makes, for example, 1 million requests?

With the standard approach there isn’t: the Node process accumulates data until it crashes with a “JavaScript heap out of memory” error.

I really don’t want to store any data for the report; it’s useless to me. Only global variables need to be kept. I just want to send requests almost indefinitely.

Is there any way to run a long collection in this environment?
Thanks for any kind of help.


Hey @mkolomanski,

Are you able to explain more about the actual use case here? I’m trying to understand the why behind your problem.

What feedback are you looking to get from running 1 million, 500K, or 100K requests? Are you testing for a particular thing? Are you doing some form of load or performance test? Are you seeding a database with a bunch of data?


Hello @danny-dainton,

I fill my database with objects, events, trends, etc.
My collection has 20 requests, but I jump between them plenty of times. Let’s say I jump to the request that creates an event 1 million times.

I would be delighted if there were an option to not store any data for the report and just keep sending requests indefinitely (so the Node process’s memory usage doesn’t grow).
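For context, the jumping is done with postman.setNextRequest in the request’s Tests script, roughly like this (the counter name and request name here are just illustrative):

    // Tests tab of the "Create Event" request (names are illustrative):
    let count = Number(pm.globals.get('eventCount')) || 0;
    if (count < 1000000) {
        pm.globals.set('eventCount', count + 1);
        postman.setNextRequest('Create Event'); // loop back to this same request
    } else {
        postman.setNextRequest(null); // stop the run once the target is reached
    }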

By default, Node only allocates around 1.5 GB of memory, and Newman uses all of it, so it crashes.

You can run Newman as a Node.js library and do something like:

node --max-old-space-size=8192 newman.js # increase the heap to 8 GB
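For reference, a minimal newman.js along those lines might look like this (the collection path, iteration count, and file name are placeholders):

    // newman.js - run Newman as a library instead of via the CLI.
    // Collection path and iteration count below are just examples.
    const newman = require('newman');

    newman.run({
        collection: './my-collection.json', // exported Postman collection
        iterationCount: 100000,             // how many times to loop the collection
        reporters: []                       // no reporters, to keep output minimal
    }, function (err) {
        if (err) { throw err; }
        console.log('collection run complete');
    });

Note that even with no reporters configured, Newman still accumulates the run data in memory, which is why the heap flag is needed at all.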

But don’t get your hopes up. I only managed to get around 100,000 requests to run without increasing the memory, so you may need a lot of memory to finish 1 million. I even tried disabling the CLI reporter, but to no avail.

My first impression is that there is a memory leak somewhere.


Thanks for the tip @vdespa. It’s always something.

I explored the subject thoroughly on GitHub, and many users have the same problem as me, for example: Running 100000 iterations of a single POST call seems to break the reporter · Issue #935 · postmanlabs/newman · GitHub
That issue was created on 3 March 2017 :frowning:

Unfortunately, the problem still isn’t solved.

I raised an issue along similar lines a couple of weeks ago: Newman running in Docker with large number of iterations fails at around 10K requests

In my case, I use a collection of 5 requests with 2,500 iterations, and it fails after about 10K requests. The logs generated by this run to about 10 MB (I can see that in Jenkins), so I don’t think logging is the issue here.
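For reference, that run is roughly this invocation (the file name is a placeholder; -n sets the iteration count):

    newman run my-collection.json -n 2500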


The fact that the issue is that old shows that this use case is not what the tool was made for.

If I were you, I would just create my own script to generate that many requests.


Which language do you recommend for that purpose?

Try Gatling, @mkolomanski.

Since you already have Node.js installed, just use JavaScript for that.

You can easily generate some code from Postman.
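A minimal sketch of such a script, assuming Node 18+ for the built-in fetch (the endpoint and payload are placeholders):

    // send-events.js - fire a large number of POST requests sequentially
    // without keeping any responses in memory.
    const TOTAL = 1000000;

    async function main() {
        for (let i = 0; i < TOTAL; i++) {
            const res = await fetch('https://example.com/events', { // placeholder endpoint
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({ name: 'event-' + i })
            });
            await res.text(); // drain and discard the body so memory stays flat
            if (i % 10000 === 0) { console.log('sent ' + i + ' requests'); }
        }
    }

    main().catch(console.error);

Because nothing is stored per request, memory usage stays flat no matter how many requests you send.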


Why would Newman use 1.5 GB? If it stores all the data, it would be nice to have an option not to do that. We run nowhere near that many tests but still get crashes. Another issue is that it’s EXTREMELY SLOW: the requests take about two minutes combined, but the test runs for hours.

The memory “leak” is due to the fact that Newman stores all the request and response data for reporting purposes.

AFAIK there is no flag for that, but I have seen some related issues on GitHub.

I tried to run 50 collections in parallel using the newman npm library. Each run adds 10 MB of memory usage (a huge amount for a single collection), and each operation adds a further 1-2 MB.
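For anyone reproducing this, the parallel runs were driven by something like the following (the collection paths are placeholders):

    // Kick off 50 newman library runs in parallel (paths are placeholders).
    const newman = require('newman');

    const runs = Array.from({ length: 50 }, (_, i) =>
        new Promise((resolve, reject) => {
            newman.run({
                collection: './collections/coll-' + i + '.json',
                reporters: []
            }, (err, summary) => err ? reject(err) : resolve(summary));
        })
    );

    // Holding on to every run summary is part of what keeps that
    // ~10 MB per run alive until all runs finish.
    Promise.all(runs).then(() => console.log('all runs finished'));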