
I'm trying to find an efficient way to extract specific fields from a Postgres jsonb column.

CREATE TABLE foo (
  id integer,
  data jsonb
);

"data" contains a row with:

{
  "firstname": "bob",
  "lastname": "smith",
  "tags": ["tag0","tag1"]
}

I want to extract a large number of fields from the data column. The following SELECT statement works, but it's cumbersome with many fields, produces very long SQL statements, and I don't know whether it traverses the jsonb value repeatedly for each column:

SELECT data->>'firstname', data->'tags' FROM foo;

I tried this:

SELECT jsonb_path_query(data, '$.[firstname,tags]') FROM foo

but it gave an error message: "syntax error, unexpected '['". This syntax is, in fact, valid JSONPath per https://jsonpath.com/, but it appears that Postgres doesn't implement it.
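For reference, the single-member form of the path does parse in Postgres; it's only the bracketed member list that is rejected. A sketch against the foo table above:

-- Postgres jsonpath accepts one member name per accessor; the
-- bracketed member-name list from jsonpath.com is not in its grammar.
SELECT jsonb_path_query(data, '$.firstname') FROM foo;
SELECT jsonb_path_query(data, '$.tags') FROM foo;

Each call returns the matched jsonb value per row, so this still means one expression per field.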

Is there a way to extract jsonb fields efficiently, both in terms of execution speed and compactness of the SQL query command?

If you want multiple columns in the result, you need to write one expression for each column you want. There is no shortcut. – a_horse_with_no_name

@a_horse_with_no_name I don't need multiple columns in the result. I can live with a json blob with the required fields. It's going to get serialized into a blob in my app server anyway before it gets sent to the browser. – ccleve

1 Answer


Yes, your query will read the complete data column for all rows of foo.

Even if you normalize the data model and turn the JSON attributes into regular columns, PostgreSQL will still read the table row by row; the query just becomes cheaper, because extracting a plain column near the beginning of a row is less work than parsing a jsonb value for every field.
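One way to do that normalization incrementally, assuming Postgres 12 or later, is a stored generated column; the column name here is illustrative:

-- Materialize a frequently used JSON attribute as a real column
-- (Postgres 12+). The expression is evaluated once, on write.
ALTER TABLE foo
  ADD COLUMN firstname text GENERATED ALWAYS AS (data ->> 'firstname') STORED;

-- Queries can then read the plain column, which can also be indexed:
SELECT firstname FROM foo;

This trades extra storage and write cost for cheaper reads of the extracted fields.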

What you are looking for is a column store, but PostgreSQL doesn't have one built in.
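If a single JSON blob containing only the required fields is acceptable (as noted in the comments), one option is to rebuild a smaller object in SQL; a sketch:

-- Return one compact jsonb value per row containing only the
-- fields of interest.
SELECT jsonb_build_object(
         'firstname', data -> 'firstname',
         'tags',      data -> 'tags'
       ) AS slim
FROM foo;

When most keys are wanted and only a few should be dropped, the key-deletion operator (e.g. data - 'lastname') may be the more compact form.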