Edit: Note that the best way to handle Unicode on Oracle is to create the database with the database character set AL32UTF8 and use ordinary varchar2 columns. One of the problems with nchar columns is that Oracle can't use indexes on ordinary char/varchar2 columns when the client sends arguments as nchar by default.
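To illustrate the index problem, here is a minimal JDBC sketch. The connection URL, credentials and the customers table are placeholders, and the behaviour described in the comments is the typical case, not a guarantee:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class NCharBindSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder URL/credentials; assumes a table like
        //   CREATE TABLE customers (name VARCHAR2(100 CHAR))
        // with an ordinary index on the name column.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/mydb", "user", "password");
             PreparedStatement ps = con.prepareStatement(
                "SELECT name FROM customers WHERE name = ?")) {

            // Bound as varchar2: the index on name can be used.
            ps.setString(1, "Språk");

            // Bound as nchar/nvarchar2 instead (setNString, or every setString once
            // -Doracle.jdbc.defaultNChar=true is set): Oracle typically has to convert
            // the varchar2 column for the comparison, and the ordinary index is skipped.
            // ps.setNString(1, "Språk");

            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
}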
Anyway: If you can't convert the database:
First, Unicode literals need to be prefixed with an 'n', like this:
select n'Language - Språk - Język' from dual;
*) 8-bit encodings can't handle this text
Unfortunately, that is not enough.
For some reason, the default behaviour for database clients is to translate all string literals to the database character set,
meaning that values will be changed even before the database gets to see the string.
The clients need some configuration before they can insert Unicode characters into an NCHAR or NVARCHAR2 column:
SQL Plus on Unix
These environment variables set up the Unix environment and sqlplus to use UTF-8 files,
and also configure sqlplus to send string literals as Unicode.
NLS_LANG=AMERICAN_AMERICA.AL32UTF8
LC_CTYPE="en_US.UTF-8"
ORA_NCHAR_LITERAL_REPLACE=true
(en_US.UTF-8 is for Solaris; Linux and other systems may need different strings. Use locale -a to list the supported locales.)
JDBC Driver
Applications using Oracle's JDBC driver need the following system properties defined to send string literals as Unicode.
-Doracle.jdbc.defaultNChar=true
-Doracle.jdbc.convertNcharLiterals=true
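If you can't add JVM flags, the same properties can, as far as I know, also be set programmatically as long as that happens before the first connection is opened. A minimal sketch (connection URL, credentials and the messages table with its NVARCHAR2 column are placeholders):
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class NCharLiteralSketch {
    public static void main(String[] args) throws Exception {
        // Same effect as the -D flags above, provided they are set before
        // the driver opens its first connection.
        System.setProperty("oracle.jdbc.defaultNChar", "true");
        System.setProperty("oracle.jdbc.convertNcharLiterals", "true");

        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/mydb", "user", "password");
             Statement st = con.createStatement()) {

            // With convertNcharLiterals=true the n'...' literal is kept as Unicode;
            // without it, characters the database character set can't represent
            // are typically replaced before the INSERT reaches the database.
            st.executeUpdate(
                "INSERT INTO messages (txt) VALUES (n'Language - Språk - Język')");

            try (ResultSet rs = st.executeQuery("SELECT txt FROM messages")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
}
Note that bind variables (PreparedStatement.setNString) side-step the literal conversion problem entirely, so they are usually the safer choice in application code.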
SQL Developer
Locate sqldeveloper.conf, and add the following lines:
AddVMOption -Doracle.jdbc.defaultNChar=true
AddVMOption -Doracle.jdbc.convertNcharLiterals=true
SQL Plus on Microsoft Windows
I haven't tested whether SQL Plus on Microsoft Windows or Toad handles UTF-8 at all.
sqlplusw.exe may do so, and the following registry settings may do the trick.
NLS_LANG=AMERICAN_AMERICA.AL32UTF8
ORA_NCHAR_LITERAL_REPLACE=true