
In JMeter I'm posting some JSON bundles via HTTP Request samplers and, from the responses, using a JSR223 PostProcessor to extract data and store it in CSV files, one entry per line. For 10 POST requests I'm getting duplicate data in the CSV file. Is there a way to read the CSV files back and remove duplicate lines using JMeter? The CSV files can have almost 200,000 lines.

E.g. the CSV file csvFile1.csv looks like: line1, line2, duplicateline, ... and so on.
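For context, the extraction itself is roughly like the minimal JSR223 PostProcessor sketch below (Groovy); the field names and the file path are hypothetical placeholders for my real setup:

    import groovy.json.JsonSlurper

    // parse the JSON response of the current sampler
    def json = new JsonSlurper().parseText(prev.getResponseDataAsString())

    // build one CSV line per response (hypothetical fields)
    def line = "${json.id},${json.status}"

    // append the line to the CSV file (hypothetical path)
    new File('/path/to/csvFile1.csv') << line + System.lineSeparator()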


1 Answer

  1. You can read the file into a list of lines using readLines():

    new File('/path/to/file').readLines()
    
  2. You can remove the duplicate entries using the unique() method:

    def lines = file.readLines().unique()
    
  3. You can write the unique lines back using a Writer, as shown below.

Putting everything together:

    def file = new File('/path/to/file')
    // read all lines and drop duplicates, keeping the first occurrence of each
    def lines = file.readLines().unique()
    // overwrite the file with the de-duplicated lines
    file.withWriter { writer ->
        lines.each { line ->
            writer.writeLine(line)
        }
    }
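If memory usage with ~200,000 lines is a concern, a possible alternative (a sketch, not part of the original answer) is to stream the file line by line and track already-seen lines in a HashSet, writing unique lines to a second file; the target path is a hypothetical placeholder:

    def source = new File('/path/to/file')
    def target = new File('/path/to/file.unique')   // hypothetical output path
    def seen = new HashSet<String>()

    target.withWriter { writer ->
        source.eachLine { line ->
            // HashSet.add() returns false if the line was already seen
            if (seen.add(line)) {
                writer.writeLine(line)
            }
        }
    }

Either script can be run once at the end of the test, e.g. from a JSR223 Sampler placed in a tearDown Thread Group.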


Just in case: The Groovy Templates Cheat Sheet for JMeter