jsonb

PostgreSQL jsonb traversal

谁说我不能喝 submitted on 2019-12-06 06:06:35
I am very new to the PG jsonb field. I have, for example, a jsonb field containing the following:

    { "RootModule": {
        "path": [ 1 ],
        "tags": {
          "ModuleBase1": { "value": 40640, "humanstring": "40640" },
          "ModuleBase2": { "value": 40200, "humanstring": "40200" } },
        "children": {
          "RtuInfoModule": {
            "path": [ 1, 0 ],
            "tags": {
              "in0": { "value": 11172, "humanstring": "11172" },
              "in1": { "value": 25913, "humanstring": "25913" } etc....

Is there a way to query X levels deep and search the "tags" key for a certain key? Say I want "ModuleBase2" and "in1" and I want to get their values? Basically I am looking
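A minimal sketch of one way to do this, not taken from the post: a recursive CTE over jsonb_each() walks the document to any depth, and a second jsonb_each() over every "tags" object found along the way pulls out the wanted keys. The table and column names (modules, config) are assumptions for illustration.

    -- Hypothetical table/column names (modules.config): walk nested objects to any
    -- depth, then return the requested tag keys with their "value" fields.
    WITH RECURSIVE walk(node) AS (
        SELECT config FROM modules
        UNION ALL
        SELECT v.value
        FROM walk, jsonb_each(walk.node) AS v(key, value)
        WHERE jsonb_typeof(v.value) = 'object'
    )
    SELECT t.key, t.value -> 'value' AS value
    FROM walk, jsonb_each(walk.node -> 'tags') AS t(key, value)
    WHERE walk.node ? 'tags'
      AND t.key IN ('ModuleBase2', 'in1');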

Postgres jsonb search in array with greater operator (with jsonb_array_elements)

ぐ巨炮叔叔 submitted on 2019-12-06 04:58:36
I have tried to find a solution but didn't find anything for my case... Here is the database declaration (simplified):

    CREATE TABLE documents (
      document_id int4 NOT NULL GENERATED BY DEFAULT AS IDENTITY,
      data_block jsonb NULL
    );

And this is an example of insert (note that the JSON value has to be passed as a quoted literal):

    INSERT INTO documents (document_id, data_block)
    VALUES (878979, '{"COMMONS": {"DATE": {"value": "2017-03-11"}},
                      "PAYABLE_INVOICE_LINES": [
                        {"AMOUNT": {"value": 52408.53}},
                        {"AMOUNT": {"value": 654.23}} ]}');
    INSERT INTO documents (document_id, data_block)
    VALUES (977656, '{"COMMONS": {"DATE": {"value": "2018-03-11"}},
                      "PAYABLE_INVOICE_LINES":
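The excerpt cuts off before the actual question, but judging by the title it asks how to find documents where at least one array element exceeds a value. A minimal sketch under that assumption, using jsonb_array_elements() inside EXISTS and casting the extracted amount to numeric (the 1000 threshold is made up):

    -- Documents with at least one PAYABLE_INVOICE_LINES entry whose AMOUNT > 1000
    SELECT d.document_id
    FROM documents d
    WHERE EXISTS (
        SELECT 1
        FROM jsonb_array_elements(d.data_block -> 'PAYABLE_INVOICE_LINES') AS line
        WHERE (line -> 'AMOUNT' ->> 'value')::numeric > 1000
    );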

Best PostgreSQL datatype for storing key-value maps?

送分小仙女□ submitted on 2019-12-06 03:17:25
Question: I'd like to store a simple map of key-value strings as a field in my PostgreSQL table. I intend to treat the map as a whole, i.e., always select the entire map, and never query by its keys or values. I've read articles comparing hstore, json and jsonb, but they didn't help me choose which data type best fits my requirements, which are:

- Only key-value pairs, no need for nesting.
- Only strings, no other types and no null.
- Storage efficiency, given my intended use for the field.
- Fast
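A small sketch (not an answer to the datatype question itself) of the whole-map access pattern described above, shown with jsonb as one of the candidates; the table and column names are made up:

    CREATE TABLE settings (
        id    serial PRIMARY KEY,
        props jsonb NOT NULL DEFAULT '{}'
    );

    INSERT INTO settings (props)
    VALUES ('{"theme": "dark", "lang": "en"}');

    -- the map is always read and written as a whole, never by key
    SELECT props FROM settings WHERE id = 1;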

Why can't I query directly on jsonb_array_elements?

孤街醉人 submitted on 2019-12-06 01:49:46
Question: I have data stored as jsonb in a column called "data":

    {"people": [{"name": "Bob", "Occupation": "janitor"}, {"name": "Susan", "Occupation": "CEO"}]}

I can query this via:

    SELECT mydata.pk
    FROM mydata, jsonb_array_elements(mydata.data->'people') AS a
    WHERE (a->>'name') = 'bob'

Why can't I substitute the jsonb_array_elements(...) call directly for "a"?

    SELECT mydata.pk
    FROM mydata
    WHERE (jsonb_array_elements(mydata.data->'people')->>'name') = 'bob'

Instead, I get the following: ERROR: argument of WHERE
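The excerpt stops mid-error, but the usual cause is that set-returning functions such as jsonb_array_elements() are not allowed in WHERE; they have to stay in FROM (an implicit lateral join, as in the working query) or be wrapped in EXISTS. A sketch of the EXISTS form, reusing the names from the question:

    SELECT m.pk
    FROM mydata m
    WHERE EXISTS (
        SELECT 1
        FROM jsonb_array_elements(m.data -> 'people') AS person
        WHERE person ->> 'name' = 'Bob'
    );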

PostgreSQL JSONB - query condition with variable key names

为君一笑 submitted on 2019-12-05 05:25:40
I have gone through various JSONB tutorials:

- https://blog.codeship.com/unleash-the-power-of-storing-json-in-postgres/
- https://www.wagonhq.com/sql-tutorial/values-from-nested-json
- http://schinckel.net/2014/05/25/querying-json-in-postgres/
- http://stormatics.com/howto-use-json-functionality-in-postgresql/

Consider the following example. There is a table called plans with the following columns:

- id (integer, auto-incrementing primary key)
- name (string)
- structure (jsonb)

The structure column holds a regular JSON object with the following structure:

    { "some_unique_id": { "key1": "valueA", //
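The excerpt is cut off, but the title says the key names ("some_unique_id" above) vary from row to row. A minimal sketch of the usual approach under that assumption: jsonb_each() iterates over every top-level key/value pair so the condition can be written against the nested fields instead of a hard-coded key (the "key1" = 'valueA' condition is only illustrative).

    SELECT p.id, p.name, s.key AS unique_id
    FROM plans p,
         jsonb_each(p.structure) AS s(key, value)
    WHERE s.value ->> 'key1' = 'valueA';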

Remove jsonb array element by value

拈花ヽ惹草 submitted on 2019-12-05 04:23:25
I did figure out how to remove a value from an array for a single record, but how do I do it for many of them? The problem is in how I use the subquery, since it has to return only a single element. Maybe my approach is wrong.

Given input: '{"attributes": ["is_new", "is_old"]}'
Expected result: '{"attributes": ["is_old"]}' (remove 'is_new' from the jsonb array)

Real example:

     sku   | properties
    -------+--------------------------------
     nu3_1 | {
           |   "name": "silly_hodgkin",
           |   "type": "food",
           |   "attributes": [
           |     "is_gluten_free",
           |     "is_lactose_free",
           |     "is_new"
           |   ]
           | }
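A minimal sketch of one way to update every row at once, assuming PostgreSQL 9.5+ and a hypothetical products table matching the columns shown: the jsonb "array - text" operator drops matching string elements, and jsonb_set() writes the trimmed array back, so no single-row subquery is needed.

    -- "products" is an assumed table name; sku/properties come from the excerpt
    UPDATE products
    SET properties = jsonb_set(
            properties,
            '{attributes}',
            (properties -> 'attributes') - 'is_new'
        )
    WHERE properties -> 'attributes' ? 'is_new';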

Roughly how fast is JSON -> JSONB column conversion in Postgres 9.4

梦想的初衷 submitted on 2019-12-05 04:05:56
I'm looking to migrate from Postgres 9.3 to 9.4, and I have a lot of data in JSON columns. While that's fine, I wanted to look at migrating to the more efficient column storage that JSONB seems to offer (a really exciting piece of tech!). To actually migrate, I want to know the migration characteristics of something like:

    ALTER TABLE table_with_json
      ALTER COLUMN my_json
      SET DATA TYPE jsonb
      USING my_json::jsonb;

(from this helpful question). Ideally, it would be good to know how long it takes to migrate 1 million and 10 million entries, and how it scales. While I can get these numbers myself, I thought
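Not from the post, but one way to estimate the cost before running the full ALTER TABLE is to time the json-to-jsonb cast over a sample of rows, since the conversion rewrites the table and reparses every value. A sketch to run in psql, reusing the names from the statement above; the sample size is arbitrary:

    \timing on
    -- time the cast itself on a 100k-row sample, discarding the output
    SELECT count(my_json::jsonb)
    FROM (SELECT my_json FROM table_with_json LIMIT 100000) AS sample;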

How to use Jackson 2 in Payara 5?

耗尽温柔 submitted on 2019-12-05 02:42:29
Question: I'm using Jackson 2 with Payara 4 and I would like to use Jackson 2 in Payara 5. Using JAX-RS, I would also like to avoid changing annotations and so on... In Payara 5 the default JSON-B provider is Yasson. Any ideas on how to disable it and use Jackson instead? All comments/ideas are welcome :-) NB: Yasson is very interesting, but handling abstract class or interface serialization/deserialization is a little more complex than adding a Jackson annotation. My current understanding is that it requires

How do I perform a case-insensitive search in a Postgres 9.4 JSONB column?

喜夏-厌秋 submitted on 2019-12-04 23:50:52
Question: I'm using this query to look for data in a table where profile is a JSONB column, and it works, but only if the name matches exactly:

    SELECT * FROM "users" WHERE "profile" @> '{"name":"Super User"}'

Is it possible to have more flexibility, like case insensitivity, wildcards and so on? Something like "Super%" or "super user"?

Answer 1: I found the solution to my problem:

    SELECT * FROM "users" WHERE (profile #>> '{name}') ILIKE 'super %'

I don't know if this performs well enough, but it works.
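A follow-up sketch, not part of the original answer: if the ILIKE scan becomes slow on a large table, one common option is an expression index on the lower-cased value, queried with lower() and LIKE so that a prefix pattern such as 'super %' can use the index (text_pattern_ops makes the prefix match indexable regardless of collation).

    CREATE INDEX users_profile_name_lower_idx
        ON "users" (lower(profile #>> '{name}') text_pattern_ops);

    SELECT *
    FROM "users"
    WHERE lower(profile #>> '{name}') LIKE 'super %';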

SQLAlchemy filter according to nested keys in JSONB

白昼怎懂夜的黑 submitted on 2019-12-04 21:19:47
Question: I have a JSONB field that sometimes has nested keys. Example:

    {"nested_field": {"another URL": "foo", "a simple text": "text"},
     "first_metadata": "plain string",
     "another_metadata": "foobar"}

If I do .filter(TestMetadata.metadata_item.has_key('nested_field')) I get this record. How can I search for the existence of the nested key ("a simple text")?

Answer 1: With SQLAlchemy the following should work for your test string:

    class TestMetadata(Base):
        id = Column(Integer, primary_key=True)
        name = Column
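The answer is cut off, but for reference the check the question asks for is straightforward in plain SQL: apply the ? existence operator one level down. A sketch assuming the model above maps to a test_metadata table with a jsonb column metadata_item:

    SELECT *
    FROM test_metadata
    WHERE metadata_item -> 'nested_field' ? 'a simple text';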