
I would like to inject N rows from my CSV file via a Gatling feeder. Gatling's default approach is to read and inject one row at a time. However, I cannot find anywhere how to take and inject, e.g., an array into a template.
I came up with a JSON template that uses Gatling Expression Language placeholders for some of the fields. The issue is that I have a JSON array with N elements:

[
  {"myKey": ${value}, "mySecondKey": ${value2}, ...}, 
  {"myKey": ${value}, "mySecondKey": ${value2}, ...},
  {"myKey": ${value}, "mySecondKey": ${value2}, ...},
  {"myKey": ${value}, "mySecondKey": ${value2}, ...}
]

And my CSV:

value,value2,... 
value,value2,... 
value,value2,... 
value,value2,... 
...

I would like to make this as efficient as possible. My data is in a CSV file, so I would like to use the CSV feeder. Also, the file is large, so readRecords is not an option, since I run out of memory.

Is there a way I can put N records into the request body using Gatling?

If I understand your problem correctly, try batch mode, which loads only a chunk of the data at a time: val csvFeeder2 = csv("foo.csv").batch(200).random - Amerousful
What I need is to fill the objects in the array by taking the required number of rows from the CSV file. Gatling's default approach is to read one row at a time. - Forin
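For reference, a minimal sketch of the batch option mentioned in the comment above (the file name and chunk size are placeholders). Note that batch mode only bounds how much of the file is held in memory; each feed call still pops records from the current buffer:

import io.gatling.core.Predef._

// Load the CSV in chunks of 200 records instead of reading the whole
// file up front; .random picks records within the loaded buffer.
val csvFeeder2 = csv("foo.csv").batch(200).random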

1 Answer


From the documentation:

Note

You can also feed multiple records all at once. If so, attribute names will be suffixed. For example, if the columns are named “foo” and “bar” and you’re feeding 2 records at once, you’ll get “foo1”, “bar1”, “foo2” and “bar2” session attributes.

feed(feeder, 2)
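Applied to your template, a minimal sketch feeding 4 records per request (the simulation name, endpoint, base URL, and the column names "foo"/"bar" are assumptions; .batch is combined with it so the large file stays bounded in memory):

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class MultiRecordSimulation extends Simulation {

  // .batch keeps memory usage bounded for a large file;
  // feed(feeder, 4) pops 4 records at once and exposes
  // foo1..foo4 and bar1..bar4 as session attributes.
  val feeder = csv("data.csv").batch

  val scn = scenario("N records per request")
    .feed(feeder, 4)
    .exec(
      http("send 4 records")
        .post("/endpoint") // hypothetical endpoint
        .body(StringBody(
          """[
            |  {"myKey": "${foo1}", "mySecondKey": "${bar1}"},
            |  {"myKey": "${foo2}", "mySecondKey": "${bar2}"},
            |  {"myKey": "${foo3}", "mySecondKey": "${bar3}"},
            |  {"myKey": "${foo4}", "mySecondKey": "${bar4}"}
            |]""".stripMargin
        )).asJson
    )

  setUp(scn.inject(atOnceUsers(1)))
    .protocols(http.baseUrl("http://localhost:8080")) // hypothetical base URL
}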