
Introduction:

I'm trying to insert data with Python/psycopg2 into Postgres in the following format:

(integer, date, integer, customtype[], customtype[], customtype[], customtype[])

However, as I try to insert them, I always get this error:

'"customtype[]" does not exist'


My setup:

I have a dict with the data I need, like so:

data_dict = {'integer1': 1, 'date': datetime.now(),
             'integer2': 2, 'custom1': [(str, double, double), (str, double, double)],
             'custom2': [(str, double, double), (str, double, double), (str, double, double)],
             'custom3': [(str, double, double), (str, double, double), (str, double, double)],
             'custom4': [(str, double, double)]}

(Here `(str, double, double)` stands in for the actual values: a string and two floats per tuple.)

Each custom array can have as many custom tuples as needed.

I've already created a type for these custom tuples (note that composite-type attributes must be named), as such:

"CREATE TYPE customtype AS (label text, x double precision, y double precision)"

And I've created a table with columns of customtype[].
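For concreteness, the table DDL might look like the following sketch; the table and column names here are assumptions, not from the original post:

```python
# Hypothetical DDL matching the dict above; the table name and the
# column names (integer1, custom1, ...) are illustrative only.
TABLE_DDL = """
CREATE TABLE my_table (
    integer1 integer,
    "date"   date,
    integer2 integer,
    custom1  customtype[],
    custom2  customtype[],
    custom3  customtype[],
    custom4  customtype[]
);
"""

print(TABLE_DDL)
```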


What I've tried so far:

query = """INSERT INTO table (column names...) VALUES
            (%(integer1)s, %(date)s, %(integer2)s, 
            %(custom1)s::customtype[], [...]);"""

And:

query = """INSERT INTO table (column names...) VALUES
            (%(integer1)s, %(date)s, %(integer2)s, 
            CAST(%(custom1)s AS customtype[]), [...]);"""

But both options produce the same error.

The final question:

How to insert these record-type arrays in Postgresql with Psycopg2?

I may be misunderstanding how PostgreSQL works entirely; I'm coming from a BigQuery Record/Repeated type background.

P.S.: This is how I'm executing the query:

cursor.execute(query, data_dict)

1 Answer


The problem was that I created the type inside a particular schema of the database.

When referencing a custom type in PostgreSQL that is not on your search_path, you need to qualify the type with the schema it was created in (it is the schema, not the database, that is referenced here; PostgreSQL cannot reference types across databases).

Like so:

(%(dict_name)s)::"schema_name".customtype[]
#or
CAST(%(dict_name)s AS "schema_name".customtype[])

Be careful with the quoting!
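As a rough illustration of what that cast has to parse, here is a pure-Python sketch (no database needed; the schema name `my_schema` and the helper function are hypothetical, and escaping of quotes/commas inside values is deliberately ignored) that renders a list of tuples in the text form PostgreSQL expects for a `customtype[]` literal:

```python
def composite_array_literal(rows):
    """Render a list of (text, float, float) tuples as the text form of a
    Postgres composite array, e.g. {"(a,1.0,2.0)","(b,3.0,4.0)"}.
    NOTE: a real implementation must escape quotes and commas in values;
    this sketch assumes plain values without special characters."""
    records = []
    for label, x, y in rows:
        # Each composite value is written as (v1,v2,v3), quoted inside the array.
        records.append('"(%s,%s,%s)"' % (label, x, y))
    return "{%s}" % ",".join(records)

print(composite_array_literal([("a", 1.0, 2.0), ("b", 3.0, 4.0)]))
# → {"(a,1.0,2.0)","(b,3.0,4.0)"}
```

In practice there is no need to build this string by hand: psycopg2 adapts Python tuples to record syntax, so passing the list of tuples as a parameter and appending `::my_schema.customtype[]` (schema name assumed) in the SQL achieves the same result.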