
I have a schema of the form:

CREATE TABLE definitions (
  id BIGINT(20) AUTO_INCREMENT PRIMARY KEY,
  json LONGTEXT NOT NULL
);

json is what I'd like to return to the client, and it should include the auto-generated id.

I wanted to do the following in a single transaction:

  1. Insert a new row where json does not contain an id; get the auto-increment id from the insert.
  2. Update the same row and replace json with a new object that contains the id.

The Slick documentation shows how to get the auto-increment id from an insert, but I can't figure out how to compose the insert and the follow-up update so that both run in a single transaction.

// TableQuery object for my table
class Definitions(driver: RelationalDriver, tag: Tag) extends ... {
  import driver.api._ 

  // implicit column mapping for Definition
  // (definitionToJson / definitionFromJson are my serialization helpers)
  private implicit val definitionColumnType: BaseColumnType[Definition] =
    MappedColumnType.base[Definition, String](
      definition => definitionToJson(definition),
      json => definitionFromJson(json)
    )

  def id: slick.lifted.Rep[Long] =
    column[Long](
      "id", 
      ColumnOption.PrimaryKey,
      ColumnOption.AutoInc
    )

  def json: slick.lifted.Rep[Definition] =     
    column[Definition]("json")

  override def * = (
    id,
    json
  ) <> (DefinitionRow.tupled, DefinitionRow.unapply)
}

class Dao {

  // ...

  // operation 1: insert row, and get back auto-increment id
  // definitions is an instance of above
  val op1 = (definitions returning definitions.map(_.id)) += json

  // operation 2: find the inserted row and update the object
  val op2 = op1.flatMap(insertId =>
    definitions.filter(_.id === insertId)
               .map(_.json)
               .update(updatedJson(insertId))
  )

  // run both in a transaction
  db.run(op2.transactionally)
}

op2 refuses to compile.

[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection.
[error]   Or you use an unsupported type in a Query (e.g. scala List).
[error]   Required level: slick.lifted.FlatShapeLevel
[error]      Source type: slick.lifted.Rep[Definition]
[error]    Unpacked type: T
[error]      Packed type: G
[error]       val op2 = op1.flatMap(autoIncId =>
[error]         correlationDefinitionSlick.filter(_.id === autoIncId.get)
[error]           .map(_.json).update(json))

To be honest, I don't understand why it can't map the requested type.

EDIT: I got the above to work by redefining the implicit conversion in my DAO, but I don't understand why I needed to do that.
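
For reference, the workaround looks roughly like this; a sketch, assuming the same driver.api._ import, serialization helpers, and the definitions, updatedJson and db values shown above:

class Dao {
  import driver.api._
  // flatMap on a DBIO needs an ExecutionContext in scope
  import scala.concurrent.ExecutionContext.Implicits.global

  // duplicating the column mapping here puts it in scope for the update query
  implicit val definitionColumnType: BaseColumnType[Definition] =
    MappedColumnType.base[Definition, String](
      definition => definitionToJson(definition),
      json => definitionFromJson(json)
    )

  val op1 = (definitions returning definitions.map(_.id)) += json

  val op2 = op1.flatMap(insertId =>
    definitions.filter(_.id === insertId)
               .map(_.json)
               .update(updatedJson(insertId))
  )

  db.run(op2.transactionally)
}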


1 Answer


Slick queries require implicit mappings in scope

You get this compiler error because the implicit column mapping for Definition is declared as private inside the table class, so it is not in scope for the query in your DAO that uses Definition.

Move your code into a container class and declare the implicit Slick mapping outside the table:

class Container(driver: RelationalDriver) {

  import driver.api._

  // implicit column mapping for Definition, now visible to both the table and the DAO
  implicit val definitionColumnType: BaseColumnType[Definition] =
    MappedColumnType.base[Definition, String](
      definition => definitionToJson(definition),
      json => definitionFromJson(json)
    )

  class Definitions(tag: Tag) extends ... {

    def id: slick.lifted.Rep[Long] =
      column[Long](
        "id",
        ColumnOption.PrimaryKey,
        ColumnOption.AutoInc
      )

    def json: slick.lifted.Rep[Definition] =
      column[Definition]("json")

    override def * = (
      id,
      json
    ) <> (DefinitionRow.tupled, DefinitionRow.unapply)
  }

  class Dao {

    // ...

    // operation 1: insert row, and get back auto-increment id
    // definitions is an instance of above
    val op1 = (definitions returning definitions.map(_.id)) += json

    // operation 2: find the inserted row and update the object
    val op2 = op1.flatMap(insertId =>
      definitions.filter(_.id === insertId)
                 .map(_.json)
                 .update(updatedJson(insertId))
    )

    // run both in a transaction
    db.run(op2.transactionally)
  }
}
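
With the mapping in scope for the DAO, the two operations can also be composed into a single action and run transactionally. A minimal sketch, inside the Dao above, assuming the same json, definitions, updatedJson and db values and an implicit ExecutionContext:

import scala.concurrent.ExecutionContext.Implicits.global

// insert the row, then write the generated id back into the stored json, atomically
val insertWithId: DBIO[Int] =
  (for {
    insertId <- (definitions returning definitions.map(_.id)) += json
    rows     <- definitions.filter(_.id === insertId)
                           .map(_.json)
                           .update(updatedJson(insertId))
  } yield rows).transactionally

// usage: db.run(insertWithId) yields a Future[Int] (number of rows updated)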