I have a performance issue with jsonb in Postgres. I have quite a lot of products in my database, and each has a column called technical_details, which stores all the technical details in this shape:
{
  detail: {
    unit: unit_name,
    value: value
  }
}
e.g:
{
  weight: {
    unit: "kg",
    value: 1000
  }
}
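For context, the nested value can be pulled out with the `#>>` path operator (a sketch using the table and column above; `'{weight,value}'` is a text-array path into the jsonb):

```sql
-- Extract the nested value as text; NULL if the path does not exist.
SELECT technical_details #>> '{weight,value}' AS weight_value
FROM products;
```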
Now I would like to make a query like the following:
Product Load (149.4ms)
SELECT "products".* FROM "products"
WHERE (COALESCE(CAST(nullif(technical_details #>> '{weight,value}', '') AS FLOAT), 0) BETWEEN 500 AND 1500)
But as you can see, such a query takes quite a long time. Does anyone have an idea how I could improve its performance?
Thanks so much!
Edit: As Dmitry requested, these are the indexes I have that relate to technical_details:
"index_products_on_technical_details" gin (technical_details)
"index_products_on_technical_details_width" btree ((technical_details ->> 'width'::text))
"index_products_on_technical_details_weight" btree ((technical_details ->> 'weight'::text))
This is the EXPLAIN ANALYZE output:
Seq Scan on products (cost=0.00..926.80 rows=22 width=1288) (actual time=0.100..30.563 rows=498 loops=1)
Filter: ((COALESCE((NULLIF((technical_details #>> '{weight,value}'::text[]), ''::text))::double precision, 0::double precision) >= 500::double precision) AND (COALESCE((NULLIF((technical_details #>> '{weight,value}'::text[]), ''::text))::double precision, 0::double precision) <= 1500::double precision))
Rows Removed by Filter: 3997
Planning time: 0.236 ms
Execution time: 30.740 ms
Comment: Could you show `\d tableName` and the EXPLAIN ANALYZE output? - Dmitry Savinkov