
I have special characters in my JSON response, which are German umlauts (ä, ö, ü).

I have set the encoding to UTF-8, and this should work, but the umlauts ü, ä, and ö come out garbled in the DataWeave output, which has the datatype com.mulesoft.weave.reader.ByteArraySeekableStream. The input is a byte[].

How can I set my workflow up so the response shows the umlauts and not junk?

DataWeave Input Payload:

{
  "id": 1234567890
  "name": "prod123",
  "desc": "ü and ä and ö"
}

DataWeave Mapping:

%dw 1.0
%input payload application/json
%output application/json encoding="UTF-8"
---
{
    "status": 0,
    "desc": payload.desc
}

JSON Response:

{
  "status": 0,
  "desc": "ü and ä and ö"
}
Are you sure you are reading or logging your JSON response using UTF-8 encoding? Maybe it is properly encoded, but since your output is a byte stream, if you are somehow decoding it with another encoding (such as ASCII), it may look garbled. Also, are you sure the input payload is encoded as UTF-8? – Pierre B.
I have set the content type of the message to application/json; charset=UTF-8. I am also having problems with Chinese characters like 傳. – user3165854

2 Answers


Just remove the encoding property from the %output directive in your DataWeave script and you should get the output you expect. I tested this locally and it works fine. The reason it does not work with UTF-8 is that your input might be encoded with a different encoding.
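For reference, here is a minimal sketch of the same mapping with the encoding property dropped from the %output directive, so the writer falls back to the default encoding:

%dw 1.0
%input payload application/json
%output application/json
---
{
    "status": 0,
    "desc": payload.desc
}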


Do you happen to run an older Windows version? There is a known issue where Mule does not honor the UTF-8 output directive.

Mule support:

You might be running into a known issue where the system property -Dfile.encoding=UTF-8 is ignored in a Windows environment. [...] You can try it using Mule 4.1.3, as it has the fix for the known issue, and let me know if you still have the same issue or not.

You can try setting this system property in your runtime's wrapper.conf file; it might fix your issue.
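As a sketch, assuming a standalone Mule runtime where wrapper.conf lives under $MULE_HOME/conf, the entry would look like the line below; the index (99 here) is arbitrary and must not collide with an existing wrapper.java.additional entry in your file:

# $MULE_HOME/conf/wrapper.conf
wrapper.java.additional.99=-Dfile.encoding=UTF-8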