
I'm creating a synchronous (not sure if that matters) REST API with Spring and MongoDB. Currently, I'm trying to implement a POST endpoint that accepts 2 CSV files with 3 columns (id, name, salary), and I'm testing it with Postman. Whenever I make a POST request, I get this response:

com.mongodb.MongoBulkWriteException: Bulk write operation error on server localhost:27017. Write errors: [BulkWriteError{index=0, code=11000, message='E11000 duplicate key error collection: mongodb-test.employee index: id dup key: { : 0 }', details={}}].

I know for a fact I don't have a duplicate key; here are 2 samples of my CSV files: [screenshot of the two CSV files]

Below are my Controller, EmployeeServiceImpl, and application.properties:

@RestController
public class Controller {
    @Autowired
    private EmployeeServiceImpl service;

    @PostMapping(value = "/employees", consumes = {MediaType.MULTIPART_FORM_DATA_VALUE}, produces = "application/json")
    public ResponseEntity saveEmployee(@RequestParam(value = "files") MultipartFile[] files) throws Exception {
        for (MultipartFile file : files) {
            service.saveEmployee(file);
        }
        return ResponseEntity.status(HttpStatus.CREATED).build();
    }
}
@Transactional
public class EmployeeServiceImpl {

    @Autowired
    private Repository repository;

    Object target;
    Logger logger = LoggerFactory.getLogger(EmployeeServiceImpl.class);

    @Synchronized
    public CompletableFuture<List<Employee>> saveEmployee(MultipartFile file) throws Exception {
        long start = System.currentTimeMillis();
        List<Employee> employees = parseCSVFile(file);
        logger.info("Saving list of employee of size {} records", employees.size(), "" + Thread.currentThread().getName());
        employees = repository.saveAll(employees);
        long end = System.currentTimeMillis();
        logger.info("Total time in millis {}", (end - start));
        return CompletableFuture.completedFuture(employees);
    }
    @Synchronized
    public CompletableFuture<List<Employee>> findAllEmployee() {
        logger.info("Retrieving list of employee by " + Thread.currentThread().getName());
        List<Employee> employees = repository.findAll();
        return CompletableFuture.completedFuture(employees);
    }
    @Synchronized
    private List<Employee> parseCSVFile(final MultipartFile file) throws Exception {
        final List<Employee> employees = new ArrayList<>();
        try {
            try (final BufferedReader br = new BufferedReader(new InputStreamReader(file.getInputStream()))) {
                String line;
                while ((line = br.readLine()) != null) {
                    final String[] data = line.split(",");
                    final Employee employee = new Employee();
                    employee.setName(data[0]);
                    employee.setSalary(data[1]);                 
                    employees.add(employee);
                }
                return employees;
            }
        } catch (final IOException e) {
            logger.error("Failed to parse CSV file {}", e);
            throw new Exception("Failed to parse CSV file {}", e);
        }
    }
}
# MONGODB (MongoProperties)
spring.data.mongodb.uri=mongodb://localhost:27017/mongodb-test
server.port = 8083

Is there anything I'm missing?


2 Answers

0 votes

The error message indicates your code is trying to insert an id of 0 when it's already in the database. Check that you are not trying to insert header data as values.
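If the first row of each file is the id,name,salary header, a sketch of the parsing loop that skips it and maps all three columns could look like this (assuming Employee has, or is given, a setId setter; this is illustrative, not the asker's exact code):

private List<Employee> parseCSVFile(final MultipartFile file) throws IOException {
    final List<Employee> employees = new ArrayList<>();
    try (final BufferedReader br = new BufferedReader(new InputStreamReader(file.getInputStream()))) {
        String line = br.readLine();                 // skip the "id,name,salary" header row
        while ((line = br.readLine()) != null) {
            final String[] data = line.split(",");
            final Employee employee = new Employee();
            employee.setId(data[0]);                 // assumed setter: keep the id column instead of dropping it
            employee.setName(data[1]);
            employee.setSalary(data[2]);
            employees.add(employee);
        }
    }
    return employees;
}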

0 votes

Update: per the MongoDB documentation, if the document contains an _id field, then the save() method is equivalent to an update with the upsert option set to true and the query predicate on the _id field.

Your Employee class needs:

@Document
class Employee {
    @Id
    private String id;

    // getters and setters
}
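For illustration, once the id from the CSV is mapped onto that @Id field, a single save() already behaves the way the quoted documentation describes (a sketch with made-up values, assuming a MongoTemplate bean and the usual setters are available):

Employee employee = new Employee();
employee.setId("0");            // same _id as a document already in the collection
employee.setName("Jane");
employee.setSalary("50000");
mongoTemplate.save(employee);   // upsert: replaces the existing document instead of throwing E11000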

Upsert:

// employeeList is the List<Employee> parsed from the CSV; assumes getId/getName/getSalary getters
BulkOperations bulkOperations = mongoTemplate.bulkOps(BulkMode.UNORDERED, Employee.class);
employeeList.forEach(e -> bulkOperations.upsert(
        Query.query(Criteria.where("_id").is(e.getId())),
        new Update().set("name", e.getName()).set("salary", e.getSalary())));
bulkOperations.execute();
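To call this from EmployeeServiceImpl, a MongoTemplate would need to be injected alongside the repository (sketch under that assumption):

@Autowired
private MongoTemplate mongoTemplate;

With the id column mapped onto the @Id field, re-posting the same CSV files then updates the existing documents instead of failing with the E11000 duplicate key error.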