I've asked a similar question recently, more geared toward dplyr.
See dplyr & monetdb - appropriate syntax for querying schema.table?
As dplyr & MonetDB (according to @Hannes Mühleisen's reply above) don't have a proper way to manage schemas, I resorted to using the native MonetDB.R / DBI functions.
The results have been somewhat problematic in this case too: it seems that, at the moment, MonetDB.R can properly manage only one schema per database.
Let's see some code. I create two schemas, then create a table with the same name in each (standard SQL practice), and try to write data into each.
> conn <- dbConnect(MonetDB.R(), "db.url", user="monetdb", password="monetdb")
> df <- data.frame(i=10,j=20)
> q <- dbGetQuery(conn, "CREATE SCHEMA foo;")
> q <- dbGetQuery(conn, "SET SCHEMA foo;")
> q <- dbGetQuery(conn, "CREATE TABLE mytable (i int, j int);")
> q <- dbWriteTable(conn, "mytable",df,overwrite=TRUE)
> q <- dbGetQuery(conn, "CREATE SCHEMA bar;")
> q <- dbGetQuery(conn, "SET SCHEMA bar;")
> q <- dbGetQuery(conn, "CREATE TABLE mytable (i int, j int);")
At this stage I should have two tables with identical names in two distinct schemas. dbListTables confirms this (it would be nice if it also showed the schemas, but I can live without that):
> dbListTables(conn)
[1] "mytable" "mytable"
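As a side note, it is possible to see which schema each table lives in by querying MonetDB's system catalog directly. This is only a sketch, assuming the sys.tables / sys.schemas catalog layout of current MonetDB releases:

```r
# Sketch: list user tables together with their schemas via the system catalog.
# Assumes MonetDB's sys.tables / sys.schemas system tables.
dbGetQuery(conn,
  "SELECT s.name AS schema_name, t.name AS table_name
     FROM sys.tables t
     JOIN sys.schemas s ON t.schema_id = s.id
    WHERE NOT t.system")
```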
But as soon as I try to write to mytable with a call identical to the one used before, disaster!
> q <- dbWriteTable(conn, "mytable",df,overwrite=TRUE)
Error in .local(conn, statement, ...) :
Unable to execute statement 'INSERT INTO mytable VALUES (10, 20)'.
Server says 'INSERT INTO: no such table 'mytable'' [#42S02].
At this point the DB is left in a messy state and is asking to roll back the transaction.
Let's try with overwrite=FALSE, just to see if anything changes:
> df <- data.frame(i=10,j=20)
> q <- dbGetQuery(conn, "CREATE SCHEMA foo;")
> q <- dbGetQuery(conn, "SET SCHEMA foo;")
> q <- dbGetQuery(conn, "CREATE TABLE mytable (i int, j int);")
> q <- dbWriteTable(conn, "mytable",df,overwrite=TRUE)
> q <- dbGetQuery(conn, "CREATE SCHEMA bar;")
> q <- dbGetQuery(conn, "SET SCHEMA bar;")
> q <- dbGetQuery(conn, "CREATE TABLE mytable (i int, j int);")
> dbListTables(conn)
[1] "mytable" "mytable"
> q <- dbWriteTable(conn, "mytable",df,overwrite=FALSE)
Error in .local(conn, name, value, ...) :
Table mytable already exists. Set overwrite=TRUE if you want
to remove the existing table. Set append=TRUE if you would like to add the new data to the
existing table.
I've hit the mother lode with append=TRUE:
> df <- data.frame(i=10,j=20)
> q <- dbGetQuery(conn, "CREATE SCHEMA foo;")
> q <- dbGetQuery(conn, "SET SCHEMA foo;")
> q <- dbGetQuery(conn, "CREATE TABLE mytable (i int, j int);")
> q <- dbWriteTable(conn, "mytable",df,overwrite=TRUE)
> q <- dbGetQuery(conn, "CREATE SCHEMA bar;")
> q <- dbGetQuery(conn, "SET SCHEMA bar;")
> q <- dbGetQuery(conn, "CREATE TABLE mytable (i int, j int);")
> dbListTables(conn)
[1] "mytable" "mytable"
> q <- dbWriteTable(conn, "mytable",df,append=TRUE)
The latter sequence seems to work, loading the data into the bar schema without problems.
But if you use csvdump=TRUE (as I normally do), you get another error:
Error in .local(conn, statement, ...) :
Unable to execute statement 'COPY 15 RECORDS INTO mytable FROM '/var/folders/m4/yfyjwcpj6rv5nq730bl9vr1h000...'.
Server says 'COPY INTO: no such table 'mytable'' [#42S02].
I could always avoid using csvdump, but I need to write big tables...
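A possible manual route, which I haven't fully verified, is to reproduce what csvdump does by hand but with a schema-qualified target, since MonetDB's COPY INTO accepts fully qualified table names. A minimal sketch, assuming the bar.mytable setup above (the delimiter choices are assumptions):

```r
# Sketch: dump df to a temporary CSV, then bulk-load it with a
# schema-qualified COPY INTO (standard MonetDB bulk-load syntax).
tmp <- tempfile(fileext = ".csv")
write.table(df, tmp, sep = ",", row.names = FALSE, col.names = FALSE)
dbGetQuery(conn, sprintf(
  "COPY INTO bar.mytable FROM '%s' USING DELIMITERS ',','\\n'", tmp))
```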
I've also tried copy_to with dplyr, but I get errors there as well.
Not surprisingly, given the problems with csvdump, I got similar errors using monetdb.read.csv: you get an error if you try to load a CSV when a table with the same name exists in another schema.
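The only workaround I've found so far is a manual one: skip dbWriteTable entirely and send schema-qualified INSERT statements myself, since fully qualified names work regardless of the current schema. A minimal sketch, assuming the foo/bar setup above — clearly too slow for big tables, but it does disambiguate the target:

```r
# Sketch: write df row by row with an explicit schema prefix, bypassing
# dbWriteTable's unqualified INSERT. Slow, but unambiguous about the schema.
for (r in seq_len(nrow(df))) {
  dbGetQuery(conn, sprintf("INSERT INTO bar.mytable VALUES (%d, %d)",
                           df$i[r], df$j[r]))
}
```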
Am I doing something wrong? Is there a better way to do this? Is this a bug?
Any help is welcome.