23 votes

On Play Framework's homepage they claim that "JSON is a first class citizen". I have yet to see the proof of that.

In my project I'm dealing with some pretty complex JSON structures. This is just a very simple example:

{
    "key1": {
        "subkey1": {
            "k1": "value1"
            "k2": [
                "val1",
                "val2"
                "val3"
            ]
        }
    },
    "key2": [
        {
            "j1": "v1",
            "j2": "v2"
        },
        {
            "j1": "x1",
            "j2": "x2"
        }
    ]
}

Now I understand that Play is using Jackson for parsing JSON. I use Jackson in my Java projects and I would do something simple like this:

ObjectMapper mapper = new ObjectMapper();
Map<String, Object> obj = mapper.readValue(jsonString, Map.class);

This would nicely parse my JSON into a Map, which is exactly what I want: string/object pairs, and it would allow me to easily cast an array to an ArrayList.

The same example in Scala/Play would look like this:

val obj: JsValue = Json.parse(jsonString)

This instead gives me a proprietary JsObject type, which is not really what I'm after.

My question is: can I parse a JSON string in Scala/Play into a Map instead of a JsObject just as easily as I would in Java?

Side question: is there a reason why JsObject is used instead of Map in Scala/Play?

My stack: Play Framework 2.2.1 / Scala 2.10.3 / Java 8 64bit / Ubuntu 13.10 64bit

UPDATE: I can see that Travis' answer is upvoted, so I guess it makes sense to everybody, but I still fail to see how that can be applied to solve my problem. Say we have this example (jsonString):

[
    {
        "key1": "v1",
        "key2": "v2"
    },
    {
        "key1": "x1",
        "key2": "x2"
    }
]

Well, according to all the directions, I should now put in all that boilerplate whose purpose I otherwise don't understand:

case class MyJson(key1: String, key2: String)
implicit val MyJsonReads = Json.reads[MyJson]
val result = Json.parse(jsonString).as[List[MyJson]]

Looks good to go, huh? But wait a minute: another element comes along in the array and totally ruins this approach:

[
    {
        "key1": "v1",
        "key2": "v2"
    },
    {
        "key1": "x1",
        "key2": "x2"
    },
    {
        "key1": "y1",
        "key2": {
            "subkey1": "subval1",
            "subkey2": "subval2"
        }
    }
]

The third element no longer matches my defined case class, so I'm back at square one. I use such (and much more complicated) JSON structures in Java every day. Does Scala suggest that I should simplify my JSON to fit its "type safe" policy? Correct me if I'm wrong, but I thought the language should serve the data, not the other way around?

UPDATE2: The solution is to use the Jackson module for Scala (example in my answer).

Did you actually go through the Play tutorial pages on JSON? It doesn't seem like it to me. – m09
@m09: I actually think this is a pretty reasonable question. The type class-based approach to decoding is very different from what you find in a library like Jackson (and in my opinion much nicer, as I've tried to argue below). – Travis Brown
@TravisBrown: Seeing the question and the lack of mention of Reads and JSON paths, I was just afraid that the OP didn't take the time to actually try Play's way of doing things before asking for an exact copy of his past workflow. I do agree that both methods are very different, and I'd argue that it is not an opinion that Play's approach is much better, but a fact ;) – m09
@m09: It's because I read the documentation that I raised this question (playframework.com/documentation/2.2.1/ScalaJson). It seemed to me that there is a lot of extra effort and a proprietary approach just to make it "type safe" or whatever the goal is. – Caballero
@Caballero: Personally I don't see it as "extra effort"—it's more like moving the effort up front, so that the compiler can help you. Of course tastes differ, but I'm occasionally stuck working with Map[String, Object]-plagued libraries and I find the experience pretty miserable at this point. – Travis Brown

5 Answers

33 votes

Scala in general discourages the use of downcasting, and Play Json is idiomatic in this respect. Downcasting is a problem because it makes it impossible for the compiler to help you track the possibility of invalid input or other errors. Once you've got a value of type Map[String, Any], you're on your own—the compiler is unable to help you keep track of what those Any values might be.

You have a couple of alternatives. The first is to use the path operators to navigate to a particular point in the tree where you know the type:

scala> val json = Json.parse(jsonString)
json: play.api.libs.json.JsValue = {"key1": ...

scala> val k1Value = (json \ "key1" \ "subkey1" \ "k1").validate[String]
k1Value: play.api.libs.json.JsResult[String] = JsSuccess(value1,)

This is similar to something like the following:

val json: Map[String, Any] = ???

val k1Value = json("key1")
  .asInstanceOf[Map[String, Any]]("subkey1")
  .asInstanceOf[Map[String, String]]("k1")

But the former approach has the advantage of failing in ways that are easier to reason about. Instead of a potentially difficult-to-interpret ClassCastException, we'd just get a nice JsError value.
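For example, asking for the wrong type at that path yields an error value rather than an exception (a sketch; the exact error message will mention something like "error.expected.jsnumber"):

// Sketch: validating the same path as an Int fails with a JsError value instead of throwing.
val bad = (json \ "key1" \ "subkey1" \ "k1").validate[Int]   // JsError(...)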

Note that we can validate at a point higher in the tree if we know what kind of structure we expect:

scala> println((json \ "key2").validate[List[Map[String, String]]])
JsSuccess(List(Map(j1 -> v1, j2 -> v2), Map(j1 -> x1, j2 -> x2)),)

Both of these Play examples are built on the concept of type classes—and in particular on instances of the Reads type class provided by Play. You can also provide your own type class instances for types that you've defined yourself. This would allow you to do something like the following:

val myObj = json.validate[MyObj].getOrElse(someDefaultValue)

val something = myObj.key1.subkey1.k2(2)

Or whatever. The Play documentation (linked above) provides a good introduction to how to go about this, and you can always ask follow-up questions here if you run into problems.
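For a concrete (if hypothetical) sketch of what that could look like for the JSON at the top of the question (the case class names below are just assumptions, not anything Play prescribes):

import play.api.libs.json._

// Hypothetical model for the nested example; MyObj, Key1 and Subkey1 are made-up names.
case class Subkey1(k1: String, k2: List[String])
case class Key1(subkey1: Subkey1)
case class MyObj(key1: Key1)

// Json.reads derives a Reads from the case class fields; single-field case
// classes are easy to wire up by hand with map. Fields not present in the
// model (like "key2") are simply ignored when reading.
implicit val subkey1Reads: Reads[Subkey1] = Json.reads[Subkey1]
implicit val key1Reads: Reads[Key1]       = (__ \ "subkey1").read[Subkey1].map(Key1(_))
implicit val myObjReads: Reads[MyObj]     = (__ \ "key1").read[Key1].map(MyObj(_))

With those instances in scope, the json.validate[MyObj] call above works as written.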


To address the update in your question, it's possible to change your model to accommodate the different possibilities for key2, and then define your own Reads instance:

import play.api.libs.json._
import play.api.libs.functional.syntax._

case class MyJson(key1: String, key2: Either[String, Map[String, String]])

implicit val MyJsonReads: Reads[MyJson] = {
  val key2Reads: Reads[Either[String, Map[String, String]]] =
    (__ \ "key2").read[String].map(Left(_)) or
    (__ \ "key2").read[Map[String, String]].map(Right(_))

  ((__ \ "key1").read[String] and key2Reads)(MyJson(_, _))
}

Which works like this:

scala> Json.parse(jsonString).as[List[MyJson]].foreach(println)
MyJson(v1,Left(v2))
MyJson(x1,Left(x2))
MyJson(y1,Right(Map(subkey1 -> subval1, subkey2 -> subval2)))

Yes, this is a little more verbose, but it's up-front verbosity that you pay for once (and that provides you with some nice guarantees), instead of a bunch of casts that can result in confusing runtime errors.

It's not for everyone, and it may not be to your taste—that's perfectly fine. You can use the path operators to handle cases like this, or even plain old Jackson. I'd encourage you to give the type class approach a chance, though—there's a steep-ish learning curve, but lots of people (including myself) very strongly prefer it.

27 votes

I've chosen to use the Jackson module for Scala.

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper

val mapper = new ObjectMapper() with ScalaObjectMapper
mapper.registerModule(DefaultScalaModule)
val obj = mapper.readValue[Map[String, Object]](jsonString)
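This assumes jackson-module-scala is on the classpath. Roughly, the sbt dependency looks like the line below (the version is only illustrative; pick one matching the Jackson version your Play release ships with):

// build.sbt: the version number here is an example, not a recommendation.
libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.3.1"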

12 votes

For further reference and in the spirit of simplicity, you can always go for:

Json.parse(jsonString).as[Map[String, JsValue]]

However, this will throw an exception for JSON strings not corresponding to the format (but I assume that goes for the Jackson approach as well). The JsValue can now be processed further like:

jsValueWhichBetterBeAList.as[List[JsValue]]
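For example, against the nested JSON at the top of the question, the navigation could continue roughly like this (a sketch assuming that exact structure):

// Sketch against the question's first example; throws if the shape doesn't match.
val root = Json.parse(jsonString).as[Map[String, JsValue]]
val k2 = (root("key1") \ "subkey1" \ "k2").as[List[String]]  // List(val1, val2, val3)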

I hope the difference between handling Objects and JsValues is not an issue for you (I mention it only because you were complaining about JsValue being proprietary). Obviously, this is a bit like dynamically typed programming in a statically typed language, which usually isn't the way to go (Travis' answer is), but sometimes that's nice to have, I guess.

0 votes

You can simply extract the value of a JsObject and Scala gives you the corresponding map. Example:

val myJson = Json.obj(
  "customerId" -> "xyz",
  "addressId" -> "xyz",
  "firstName" -> "xyz",
  "lastName" -> "xyz",
  "address" -> "xyz"
)

Suppose you have the Json of above type. To convert it into map simply do:

val mapFromJson = myJson.value

This gives you a map of type scala.collection.immutable.HashMap$HashTrieMap.
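The values in that map are still JsValues, so pulling a field back out looks roughly like this (a small sketch based on the object above):

val firstName = mapFromJson("firstName")  // JsString("xyz"), still a JsValue, not a plain String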

0 votes

I would recommend reading up on pattern matching and recursive ADTs in general to better understand why Play Json treats JSON as a "first class citizen".

That being said, many Java-first APIs (like the Google Java libraries) expect JSON deserialized as a Map[String, Object]. While you could write your own function that recursively generates this object with pattern matching (a rough sketch of that is included at the end of this answer), the simplest solution is probably to use the following existing pattern:

import com.google.gson.Gson
import java.util.{Map => JMap, LinkedHashMap}

val gson = new Gson()

def decode(encoded: String): JMap[String, Object] =
  gson.fromJson(encoded, (new LinkedHashMap[String, Object]()).getClass)

The LinkedHashMap is used if you would like to maintain key ordering at the time of deserialization (a HashMap can be used if ordering doesn't matter). Full example here.
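For completeness, the hand-rolled alternative mentioned at the start of this answer could look roughly like the sketch below (an assumption-laden illustration, not part of the original pattern; it pattern matches on Play's JsValue types and converts to Java collections):

import play.api.libs.json._
import scala.collection.JavaConverters._

// Rough sketch: recursively convert a parsed JsValue into plain Java objects
// (java.util.Map / java.util.List / String / java.math.BigDecimal / Boolean / null).
def toJava(js: JsValue): Object = js match {
  case JsNull           => null
  case JsString(s)      => s
  case JsNumber(n)      => n.bigDecimal
  case JsBoolean(b)     => java.lang.Boolean.valueOf(b)
  case JsArray(values)  => values.map(toJava).asJava
  case JsObject(fields) => fields.map { case (k, v) => k -> toJava(v) }.toMap.asJava
  case _                => null  // e.g. JsUndefined
}

val asJavaMap = toJava(Json.parse(jsonString)).asInstanceOf[java.util.Map[String, Object]]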