
Database Administrators

Q&A for database professionals who wish to improve their database skills

Latest Questions

0 votes
1 answer
1227 views
Use ENUM in array of objects
Is it possible to set up an ARRAY column with objects/JSON on a TABLE, forcing a parameter of the objects in the ARRAY to adhere to an ENUM, while keeping that object parameter unique? Data examples:
ENUM val1, val2, val3
[{p1: val1, p2: 'something'}, {p1: val2, p2: 'something'}] <-- valid
[{p1: val1, p2: 'something'}, {p1: val4, p2: 'something'}] <-- not valid, val4 is not an ENUM value
[{p1: val1, p2: 'something'}, {p1: val1, p2: 'something else'}] <-- not valid, p1 is not unique
If it is possible, using PostgreSQL and Sequelize, how would I go about setting up the column?
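Postgres cannot declaratively enforce an enum value or per-element uniqueness inside a JSON/array column, so a common workaround (sketched here with hypothetical table names, independent of the Sequelize layer) is to normalize the array elements into a child table:
```
-- Hypothetical names; the array becomes one row per element.
CREATE TYPE p1_enum AS ENUM ('val1', 'val2', 'val3');

CREATE TABLE parent (
    id serial PRIMARY KEY
);

CREATE TABLE parent_item (
    parent_id integer NOT NULL REFERENCES parent (id),
    p1        p1_enum NOT NULL,      -- must be one of the ENUM labels
    p2        text,
    UNIQUE (parent_id, p1)           -- p1 unique within one parent row
);
```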
David Gustavsson (235 rep)
Sep 12, 2017, 02:34 PM • Last activity: Aug 4, 2025, 10:02 PM
1 vote
1 answer
850 views
Remap array columns to single column with value
Here is the sample database schema:
| id |   Critical    | High  |    Low     |
--------------------------------------------
|  1 | {apple, ball} | {cat} | {dog, egg} |
Now I want to remap these to another table, with Critical mapped to 1, High to 2 and Low to 3, like:
| element | priority |
----------------------
| apple   |    1     |
| ball    |    1     |
| cat     |    2     |
| dog     |    3     |
| egg     |    3     |
1, 2, 3 could be changed to column names too. I've written a query to do it for one column at a time:
insert into new_table (element, priority) select unnest(critical), 1 from old_table where id=1;
But I want to do all three columns in this single query, and I want it to perform well. Is UNION the most efficient way?
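One possible single-statement shape, reusing the question's table and column names: unnest each of the three array columns in its own branch of a lateral UNION ALL, so the old row is read once.
```
-- A minimal sketch, assuming the column names critical/high/low from old_table.
INSERT INTO new_table (element, priority)
SELECT u.element, u.priority
FROM   old_table o
CROSS  JOIN LATERAL (
    SELECT unnest(o.critical) AS element, 1 AS priority
    UNION ALL
    SELECT unnest(o.high), 2
    UNION ALL
    SELECT unnest(o.low), 3
) AS u
WHERE  o.id = 1;
```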
PaxPrz (219 rep)
Mar 15, 2021, 04:57 AM • Last activity: Jul 30, 2025, 12:01 PM
5 votes
1 answer
4246 views
Unnest array of arrays
In my parameterized query I have to unnest arrays with 1 dimension and 2 dimensions:
$1: ARRAY['id1', 'id2']
$2: ARRAY[ARRAY['tag1'], ARRAY['tag2']]
I tried this query:
INSERT INTO table (id, tags)
SELECT * FROM UNNEST ($1::text[], $2::text[][])
But I got this error:
> column "tags" is of type text[] but expression is of type text
I want the result of SELECT * FROM UNNEST ... to be 2 rows:
'id1', ARRAY['tag1']
'id2', ARRAY['tag2']
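unnest() over both parameters flattens the 2-D array completely, which is why the tags column ends up seeing bare text. One hedged sketch (hypothetical table name my_table): iterate the outer dimension with generate_subscripts and rebuild each inner array from a one-row slice.
```
-- A minimal sketch, assuming parameters $1 and $2 as in the question.
INSERT INTO my_table (id, tags)
SELECT p.ids[i],
       ARRAY(SELECT unnest(p.tags2d[i:i]))      -- slice row i, re-flatten to a 1-D array
FROM   (SELECT $1::text[] AS ids, $2::text[][] AS tags2d) AS p,
       generate_subscripts(p.ids, 1) AS i;
```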
pietrushka (51 rep)
Jan 24, 2022, 08:30 PM • Last activity: Jul 24, 2025, 12:26 AM
0 votes
2 answers
573 views
Storing arrays of data in a time-series database
I'm building a low-utilization time-series database to capture yearly data points for a set of fewer than 100,000 items. My question has to do with storing arrays of data in a way that is easily queried later. Right now the yearly_visits table looks something like:
visitID MEDIUMINT primary key
userID SMALLINT id of individual submitting yearly data
weight SMALLINT weight of individual
The intake form also contains a checkbox list of favorite colors (via numeric value) from a lookup table. Users can select one or more colors. Should colors be stored in a separate visit_colors table that looks something like:
visitID MEDIUMINT
colorID SMALLINT
Or is there a better way of storing arrays of data in a time series? I haven't written any code yet, so I want to design this in a way that doesn't bite me down the road when I'm asked to query against the color data.
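A sketch of the junction-table option described above, with assumed MySQL-style types matching the question's column names; the composite primary key prevents duplicate color picks per visit, and the second index supports lookups by color.
```
-- A minimal sketch; types and index names are assumptions.
CREATE TABLE visit_colors (
    visitID MEDIUMINT UNSIGNED NOT NULL,
    colorID SMALLINT  UNSIGNED NOT NULL,
    PRIMARY KEY (visitID, colorID),      -- one row per color per visit
    KEY idx_color (colorID, visitID)     -- supports "which visits picked color X?"
);
```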
a coder (208 rep)
Aug 21, 2018, 06:09 PM • Last activity: Jul 10, 2025, 05:06 AM
1 vote
1 answer
61 views
SELECT with array values on WHERE using postgres
I'm using a query to update an object array inside a **jsonb** column. Example data:
[
  {
    "_id": "68696e0a3aab2f9ff9c40679",
    "altura": 1,
    "comprimento": 1,
    "largura": 1,
    "peso": 1,
    "valor": 1
  },
  {
    "_id": "6869744b44829f42ccdbb32c",
    "altura": 2,
    "comprimento": 2,
    "largura": 2,
    "peso": 2,
    "valor": 2
  }
]
Using one ID, this works perfectly:
UPDATE
	objetos o
SET
	itens = o.itens - (
		SELECT
			i.id::int - 1
		FROM
			jsonb_array_elements(o.itens) WITH ORDINALITY i(v, id)
		WHERE
			i.v->'_id' = '6869744b44829f42ccdbb32c'
		LIMIT 1
	)
WHERE
	_id = ${_id}
RETURNING
	_id,
	updated_at;
It deletes an entry containing _id = 6869744b44829f42ccdbb32c.
---
I have tried to delete entries using an ARRAY of ids, for example ['68696e0a3aab2f9ff9c40679', '6869744b44829f42ccdbb32c'], but I get:
~~~none
operator does not exist: jsonb = text
~~~
I'm trying to add this in the WHERE clause:
i.v->'_id' = ANY(ARRAY['68696e0a3aab2f9ff9c40679', '6869744b44829f42ccdbb32c'])
and also IN, but IN does not return any information. How do I compare i.v->'_id' to the elements of an array? Something like:
['68696e0a3aab2f9ff9c40679', '6869744b44829f42ccdbb32c'].includes(i.v->'_id')
References:
1. https://stackoverflow.com/a/10738459/2741415
2. https://dba.stackexchange.com/a/315124/321838
3. https://stackoverflow.com/a/75053441/2741415
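A minimal standalone sketch of the comparison, with made-up ids: the error arises because -> returns jsonb, while ->> returns text, and text can be compared against a text array with = ANY.
```
-- Standalone example; the ids are invented for illustration.
SELECT i.id, i.v
FROM   jsonb_array_elements('[{"_id":"a1"},{"_id":"b2"}]'::jsonb)
       WITH ORDINALITY AS i(v, id)
WHERE  i.v ->> '_id' = ANY (ARRAY['b2', 'c3']);   -- ->> yields text, so = ANY(text[]) type-checks
```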
flourigh (145 rep)
Jul 5, 2025, 07:34 PM • Last activity: Jul 10, 2025, 12:46 AM
1 vote
1 answer
273 views
PostgreSQL - PostGIS ST_SetValues - int array cast to double and back
I am using ST_SetValues with an array defined as ARRAY[[9, 9], [9, 9]]::double precision[][]. If I store a 32-bit integer value in this array, can I retrieve that value exactly after casting from double back to integer? Or is there limited precision for the integer part of the double? I want to store a "packed" RGBA value in a single 32-bit integer there and "unpack" the individual color channels after output in my app. According to the documentation, double should be 64-bit, so in my opinion this should be possible, but maybe I am missing something.
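A quick check of the premise: an IEEE 754 double has a 53-bit mantissa, so every 32-bit integer is represented exactly and survives the round trip.
```
-- Both extremes of int4 round-trip exactly through double precision.
SELECT (2147483647::double precision)::integer    AS max_int4,
       ((-2147483648)::double precision)::integer AS min_int4;
```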
Martin Perry (231 rep)
Nov 6, 2016, 12:53 PM • Last activity: May 20, 2025, 09:01 AM
0 votes
1 answer
304 views
Snowflake - extracting country/ state names - RegEx or Array?
I have a bunch of sloppy geo text in a field. However, the country/state names or codes are relatively clean. It'll say Poland 23, or Illinois Remote, or Frogballs, Germany, or TX, AL, AK. I have a finite list of country names/codes, plus the 50 US state names/codes. I'm trying to figure out the best way to convert the "trash STATENAME trash" into a clean state name or country name. I'm thinking either go the array route, STRTOK_TO_ARRAY(location_field), which converts the string to 'word items' in an array, but I'm not sure of the best function to extract a matching 'item' within an array; ARRAY_CONTAINS() merely returns true/false, not "Poland". Maybe regex is better for this purpose? Something like regexp_like(location_field, country_list|country_list, 'i'). The only issue here is that I only want to match countries/states that are a whole "word" (with a preceding or trailing space), not "AL" for Alabama when it's part of "portugAL", for instance.
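A hedged sketch of the array route, with hypothetical table and column names: tokenize the field, flatten the tokens to rows, and join them against the lookup list, so only whole tokens such as 'AL' match rather than substrings of 'portugAL'.
```
-- Snowflake sketch; locations and geo_lookup are assumed names.
SELECT t.location_field, l.clean_name
FROM   locations t,
       LATERAL FLATTEN(input => STRTOK_TO_ARRAY(t.location_field, ' ,')) f
JOIN   geo_lookup l
       ON UPPER(TRIM(f.value::string)) = UPPER(l.code_or_name);  -- whole-token match only
```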
user45867 (1739 rep)
Jun 5, 2023, 09:32 PM • Last activity: May 14, 2025, 08:00 AM
0 votes
1 answer
360 views
Unstructed data field: Query all values from array of objects by key
I have a table that has a unique ID, and a second column named 'data' that contains simple key/value items like:
"nickname": "value"
"fullName": "value"
"office": "value"
"unity": "value"
and a few more elaborate structured items like:
"address": {
  "city": "value",
  "state": "value"
},
and
"personalVehicle": [
  { "brand": "value", "model": "value", "plate": "value", "color": "value" },
  { "brand": "value", "model": "value", "plate": "value", "color": "value" }
]
where, as you can see, personalVehicle is a key that stores an array of objects, in which every object has its own simple key/value items. I can query specific key values from address for all registries:
SELECT data->'address'->'city' as city FROM person
+------------+
| city       |
|------------|
| "city1"    |
| "city2"    |
| "city3"    |
+------------+
Here is the situation: I can query all info about the vehicles with
SELECT data->'personalVehicle' as vehicles FROM person
+------------------------------------------------------------------------------------------------------------------------------------------------+
| vehicles                                                                                                                                         |
|--------------------------------------------------------------------------------------------------------------------------------------------------|
| [ { "brand": "Toyota", "model": "Corolla", "plate": "AAA-1111", "color": "Red" }, { "brand": "Ford", "model": "Focus", "plate": "ZZZ-9999", "color": "Blue" } ] |
|                                                                                                                                                  |
| [ { "brand": "Hyundai", "model": "Tucson", "plate": "ABC-1212", "color": "Grey" } ]                                                              |
+--------------------------------------------------------------------------------------------------------------------------------------------------+
But I cannot retrieve a specific key for all objects when the objects are inside an array; in that case, I need to specify the index:
SELECT data->'personalVehicle'->0->'model' as model from person
+-------------+
| model       |
|-------------|
| "Toyota"    |
|             |
| "Hyundai"   |
+-------------+
This is the first index of the array, that is, the first car. I need to get the models for all N cars that the person might have. How do I query that without specifying the index?
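One possible way to expand the array, sketched with the table and column names from the question: a lateral call to json_array_elements yields one row per vehicle, and ->> then extracts the wanted key.
```
-- A minimal sketch; one output row per vehicle per person.
SELECT p.id, v.elem ->> 'model' AS model
FROM   person p
CROSS  JOIN LATERAL json_array_elements(p.data -> 'personalVehicle') AS v(elem);
```
People with no personalVehicle entry drop out of this result; a LEFT JOIN LATERAL ... ON true would keep them with a NULL model.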
Jhonatan Cruz (1 rep)
May 28, 2019, 06:02 PM • Last activity: May 8, 2025, 04:06 AM
1 vote
1 answer
834 views
Multi dimensional JSON Array SQL Query
I'm struggling to write an appropriate query for my data ->
{
   "schools":[
      {
         "org_symbol":"School 1",
         "criteria":[
            [
               {
                  "value":"private",
                  "type":"school type"
               },
               {
                  "value":"usa",
                  "type":"country"
               },
               {
                  "value":"english",
                  "type":"language"
               },
               {
                  "value":"1-6",
                  "type":"grades"
               },
               {
                  "value":"Silver",
                  "type":"level"
               }
            ]
         ]
      },
      {
         "org_symbol":"School 2",
         "criteria":[
            [
               {
                  "value":"private",
                  "type":"school type"
               },
               {
                  "value":"usa",
                  "type":"country"
               },
               {
                  "value":"english",
                  "type":"language"
               },
               {
                  "value":"1-6",
                  "type":"grades"
               },
               {
                  "value":"gold",
                  "type":"level"
               }
            ]
         ]
      }
   ]
}
I have this
SELECT distinct on(id) * FROM tribes, json_array_elements(meta::json -> 'attributes') as elem 
WHERE 
( 
    (elem ->> 'type' = 'school type' and elem ->> 'value' = 'private') 
    and (elem ->> 'type' = 'country' and elem ->> 'value' = 'usa') 
    and (elem ->> 'type' = 'language' and elem ->> 'value' = 'english')
    and (elem ->> 'type' = 'grades' and elem ->> 'value' = '1-6')
    and (elem ->> 'type' = 'level' and elem ->> 'value' = 'gold')
  ) ;
but it doesn't return anything. I know I'm indexing correctly (full JSON not included), but I can't seem to get the multi-condition query to work. I need to be able to check where value and type match each item of the criteria. I think I'm close, but I'm really not sure; any help would be greatly appreciated.
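One thing worth noting: every AND branch in that WHERE tests the same elem, which can never have two different type values at once, so the combined predicate can never be true. A hedged sketch of the usual alternative, assuming attributes is a flat array of {type, value} objects, is jsonb containment:
```
-- A sketch, not the asker's exact structure: @> checks that every listed
-- {type, value} pair appears somewhere in the attributes array.
SELECT t.*
FROM   tribes t
WHERE  (t.meta::jsonb -> 'attributes') @> '[
          {"type": "school type", "value": "private"},
          {"type": "country",     "value": "usa"},
          {"type": "language",    "value": "english"},
          {"type": "grades",      "value": "1-6"},
          {"type": "level",       "value": "gold"}
       ]'::jsonb;
```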
chris (11 rep)
Aug 26, 2022, 01:56 AM • Last activity: May 7, 2025, 10:05 PM
2 votes
1 answer
795 views
Filter by list of values pairs
I have a PostgreSQL 9.4 database and a products table. For each product there must be a list of pairs: a price, and the quantity for which this price is actual. The products table may contain millions and billions of records. For this products table I must provide filtering by price, by quantity, and by price+quantity. If there is only a price filter, then a product is in the result list if it has at least one price variant that satisfies the filter. If there is a price+quantity filter, then a product is in the result list only if it has at least one price variant whose price AND quantity satisfy the filters. If I create separate tables
create table prod (id integer primary key);
create table optprice (prod integer, price decimal, q integer);
then with millions of products the query takes a really long time:
select * from prod where id in (select o.prod from optprice o where price between 10 and 500) limit 20;
Planning time: 0.166 ms
Execution time: 867.663 ms
select count(*) from prod where id in (select o.prod from optprice o where price between 10 and 500);
Planning time: 0.166 ms
Execution time: 867.663 ms
Even if I replace the first query with joins, the count query is still too slow:
select count(*) from prod left join optprice on id=optprice.prod where price between 10 and 500 limit 20;
Planning time: 0.149 ms
Execution time: 1478.455 ms
I decided to use PostgreSQL arrays, so each product has a field optprice with something like: {{112.3, 33}, {555.12, 66}, {77.8, 88}}. But I can't understand how I can implement the filtering described earlier. I can implement a separate price or quantity filter, but I can't see how price+quantity filtering is possible here. I could write some function, but if I'm not mistaken, I'd lose the ability to use indexes and the queries would again be too slow. Is it possible to do something like this in PostgreSQL, so that it works relatively fast even on large datasets?
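A hedged sketch of keeping the normalized optprice table but phrasing the combined filter as a correlated EXISTS backed by a composite index, so the price AND quantity conditions are tested against the same price variant:
```
-- A sketch using the question's table names; the quantity bound is invented.
CREATE INDEX optprice_prod_price_q_idx ON optprice (prod, price, q);

SELECT p.*
FROM   prod p
WHERE  EXISTS (
         SELECT 1
         FROM   optprice o
         WHERE  o.prod  = p.id
           AND  o.price BETWEEN 10 AND 500
           AND  o.q     >= 5          -- drop this line for a price-only filter
       )
LIMIT  20;
```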
Moisizz (21 rep)
Sep 15, 2016, 03:13 PM • Last activity: Apr 23, 2025, 06:04 AM
1 vote
2 answers
3510 views
Converting results to array
I have many entities like: user
id | name
-----------
 1 | Joe
 2 | David
 3 | Jane
cars
id| name 
------------
1 | cars1 
2 | cars2 
3 | cars3 
4 | cars4 
5 | cars5 
6 | cars6 
7 | cars7 
8 | cars8 
9 | cars9
cars_data
id | price  | category | uid | car_id
---+--------+----------+-----+-------
1  | 225.00 |	p1 	   |  1  |	1
2  | 451.00 |	p2 	   |  1  |	1
3  | 324.00 |	p2 	   |  1  |	2
4  | 784.00 |	p2 	   |  1  |	3
5  | 724.00 |	p3 	   |  1  |	2
6  | 214.00 |	p1 	   |  2  |	1
7  | 451.00 |	p1 	   |  2  |	2
8  | 926.00 |	p1 	   |  2  |	3
9  | 271.00 |	p2 	   |  2  |	3
10 | 421.00 |	p2 	   |  2  |	4
11 | 684.00 |	p2 	   |  2  |	2
12 | 124.00 |	p3 	   |  2  |	5
13 | 128.00 |	p3 	   |  2  |	1
14 | 741.00 |	p1 	   |  3  |	1
15 | 965.00 |	p1 	   |  3  |	3
16 | 124.00 |	p2 	   |  3  |	4
17 | 415.00 |	p2 	   |  3  |	1
18 | 51.00 	|   p2    |  3  |	2
19 | 965.00 |	p2 	   |  3  |	6
filters
id 	| name 	  |  filter 	     | uid
----+ --------+------------------+-----
1 	| filter1 |	string filters 1 |	1
2 	| filter2 |	string filters 2 |	1
3 	| filter3 |	string filters 3 |	1
4 	| filter3 |	string filters 3 |	1
5 	| filter3 |	string filters 3 |	1
6 	| filter3 |	string filters 3 |	1
7 	| filter3 |	string filters 3 |	1
8 	| filter  |	string filters 1 |	2
9 	| filter5 |	string filters 5 |	2
10 	| filter6 |	string filters 6 |	2
11 	| filter6 |	string filters 6 |	2
12 	| filter6 |	string filters 6 |	2
13 	| filter6 |	string filters 6 |	2
14 	| filter7 |	string filters 7 |	3
15 	| filter8 |	string filters 8 |	3
16 	| filter8 |	string filters 8 |	3
17 	| filter8 |	string filters 8 |	3
18 	| filter8 |	string filters 8 |	3
19 	| filter9 |	string filters 9 |	3
assign_filters
uid | category  | filter_id
----+ ----------+-----------
1   |	p1 	    |1          
1   |	p2 	    |1          
1   |	p2 	    |2          
1   |	p2 	    |3          
1   |	p3 	    |4          
2   |	p1 	    |9          
2   |	p1 	    |8          
2   |	p1 	    |13         
3   |	p2 	    |14         
3   |	p2 	    |16         
3   |	p2 	    |17         
3   |	p3 	    |19         
3   |	p3 	    |18         
3   |	p1 	    |14         
3   |	p1 	    |18
What I want is results like this:
uid | category  | filter_id |car_id  
----+ ----------+-----------+--------
1   |	p1 	    |1          |[1] 
1   |	p2 	    |1          |[1,2,3]
1   |	p2 	    |2          |[1,2,3]
1   |	p2 	    |3          |[1,2,3]
1   |	p3 	    |4          |
2   |	p1 	    |9          |[1,2,3]
2   |	p1 	    |8          |[1,2,3]
2   |	p1 	    |13         |[1,2,3]
3   |	p2 	    |14         |[1,2,3,4,6]
3   |	p2 	    |16         |[1,2,3,4,6]
3   |	p2 	    |17         |[1,2,3,4,6]
3   |	p3 	    |19         |[6,7]
3   |	p3 	    |18         |[6,7]
3   |	p1 	    |14         |[1] 
3   |	p1 	    |18         |[1] 
How can I change the original SQL query to give me the result above? dbfiddle
**update:**
- Each user can define a filter for himself, and each filter is specific to one user.
- Each user can place cars in specific categories in the cars_data table (the categories are [p1, p2, p3]).
- Each user can assign a number of filters to each of their categories.
I need to know which filters each user has used, and on which cars each filter has been applied. For example, user number 1 has placed cars1, cars2, cars3 in category p1. Also, filter1, filter2, filter3 have been assigned to this category. The result I need is this:
uid | category  | filter_id |car_id  
----+ ----------+-----------+--------
1   |	p2 	    |1          |[1,2,3]
1   |	p2 	    |2          |[1,2,3]
1   |	p2 	    |3          |[1,2,3]
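A hedged sketch of one interpretation, assuming each (uid, category, filter_id) row should list the distinct cars the same user has placed in the same category in cars_data:
```
-- A sketch, not a verified reproduction of the expected output above.
SELECT af.uid,
       af.category,
       af.filter_id,
       array_agg(DISTINCT cd.car_id ORDER BY cd.car_id) AS car_id
FROM   assign_filters af
LEFT   JOIN cars_data cd
       ON  cd.uid = af.uid
       AND cd.category = af.category
GROUP  BY af.uid, af.category, af.filter_id
ORDER  BY af.uid, af.category, af.filter_id;
```
With the LEFT JOIN, a category that has no cars aggregates to {NULL}; adding FILTER (WHERE cd.car_id IS NOT NULL) to the array_agg would give NULL instead, which COALESCE could turn into an empty array.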
majid (15 rep)
Nov 1, 2021, 06:35 AM • Last activity: Apr 20, 2025, 04:56 AM
6 votes
2 answers
2112 views
Postgres not returning data on array_agg query as below
The problem arises when there is no data for books in a specific library. Consider the following working scenario. Table library
--------------------------------
| id |    name     |    owner  |
--------------------------------
|  1 |     ABC     |     A     |
|  2 |     DEF     |     D     |
|  3 |     GHI     |     G     |
--------------------------------
Table books
--------------------------------
| id |    title    |  library  |
--------------------------------
|  a |     xxx     |     1     |
|  b |     yyy     |     1     |
|  c |     zzz     |     2     |
--------------------------------
Now when I run a query like the one below:
SELECT library.name, array_agg(b.title) AS book_list FROM library, 
(SELECT title FROM books WHERE books.library = :library_no) as b 
WHERE library.id = :library_no GROUP BY library.id
The query generates output for libraries 1 & 2, but not for library 3. Why, and how can I solve this issue? (I want an empty list when a library has no books.) Required output:
----------------------
| name |    book_list |
----------------------
|  GHI |      {}      |   # or {null}
-----------------------
I've even tried coalesce as below:
SELECT library.name, coalesce(array_agg(b.title), ARRAY[]::VARCHAR[]) AS book_list FROM library, 
(SELECT title FROM books WHERE books.library = :library_no) as b 
WHERE library.id = :library_no GROUP BY library.id
Postgres version: 12
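A minimal sketch of one way around this: the comma join drops the library row entirely when the derived table is empty, while an explicit LEFT JOIN keeps it, and COALESCE plus FILTER turns the aggregated NULL into an empty array.
```
-- A sketch reusing the question's table names and :library_no parameter.
SELECT l.name,
       COALESCE(array_agg(b.title) FILTER (WHERE b.title IS NOT NULL),
                '{}') AS book_list
FROM   library l
LEFT   JOIN books b ON b.library = l.id
WHERE  l.id = :library_no
GROUP  BY l.id, l.name;
```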
PaxPrz (219 rep)
Jan 14, 2021, 03:01 AM • Last activity: Mar 15, 2025, 03:15 PM
1 vote
1 answer
1128 views
In Postgresql, can I create a column of a one-dimension array type with foreign key constraint? If not, how to simulate it?
In essence I have to design a todo table and an item table. I need to preserve item order, as well as allow the client to rearrange the order. If Postgres allowed an array of foreign keys, that would be great; however, based on my research that's not possible, and the consensus is to just use a junction (join) table. I can certainly make a join table work; however, I still need to preserve the order of those items. I know that I could add another column on the join table called 'item order', but with that solution each rearrangement would require n updates for the related items in the join table. The performance cost seems pretty big. However, I'm pretty new to SQL and not sure of the scale of things, or whether there is another solution I might have missed. I would love to hear a better solution. And I guess it goes without saying that if we went with a plain array on the todo table, dropped the join table, and lost referential integrity and cascade on delete, that would not be a good compromise for performance.
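A hedged sketch of the join-table route with a sortable position, using hypothetical table names: with a fractional position, a single move updates only the moved row rather than every sibling.
```
-- Hypothetical schema; todo(id) and item(id) are assumed to exist.
CREATE TABLE todo_item (
    todo_id  bigint  NOT NULL REFERENCES todo (id),
    item_id  bigint  NOT NULL REFERENCES item (id),
    position numeric NOT NULL,            -- fractional, leaves room between neighbours
    PRIMARY KEY (todo_id, item_id),
    UNIQUE (todo_id, position)
);

-- Move item 42 between the neighbours currently at positions 2 and 3:
UPDATE todo_item
SET    position = (2 + 3) / 2.0
WHERE  todo_id = 1 AND item_id = 42;
```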
Qi luo (25 rep)
Jan 20, 2022, 09:38 PM • Last activity: Feb 8, 2025, 11:09 PM
2 votes
1 answer
905 views
Creating a TYPE that is an Array within a Function (or a predefined Array TYPE) in PostgreSQL
I am researching migration of a major system from Oracle to PostgreSQL. Getting into coding functions now, I am looking to see whether there is a compatible object in PostgreSQL where, dynamically within a function, I can create a TYPE that is an array. The syntax in Oracle is: TYPE VAR_STRING IS VARRAY(10) VARCHAR2(30); This creates a 10-element array TYPE named VAR_STRING that holds 10 strings of 30 characters. I am playing with PG 9.6, and even in the Create Type dialog it is not obvious to me how to make a pre-defined TYPE that is an array.
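For comparison, a minimal PL/pgSQL sketch: an array variable can be declared inline, so a separately created array TYPE is usually unnecessary (note there is no built-in 10-element limit as with VARRAY).
```
-- A minimal sketch; the function name is invented.
CREATE OR REPLACE FUNCTION demo() RETURNS void
LANGUAGE plpgsql AS $$
DECLARE
    var_string varchar(30)[];          -- array of varchar(30), unbounded length
BEGIN
    var_string := ARRAY['a', 'b'];
    RAISE NOTICE '%', var_string[2];   -- prints "b"
END;
$$;
```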
user210170 (21 rep)
Jun 4, 2020, 11:58 PM • Last activity: Jan 26, 2025, 07:00 PM
11 votes
1 answer
20636 views
Delete array element by index
Is it possible to delete a Postgres array element by index? (Using Postgres 9.3.) I don't see anything for this in the docs (http://www.postgresql.org/docs/9.3/static/functions-array.html) but perhaps there are other functions I am missing?
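There is no dedicated function for this in 9.3, but slicing works; a minimal sketch removing the element at index 2 (Postgres arrays are 1-based by default):
```
-- Concatenate the parts before and after the unwanted index.
SELECT arr[1:2-1] || arr[2+1:array_length(arr, 1)] AS without_2nd   -- {a,c,d}
FROM  (SELECT ARRAY['a','b','c','d'] AS arr) AS s;
```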
Fawn (313 rep)
Mar 7, 2015, 01:55 AM • Last activity: Jan 22, 2025, 07:19 PM
1 vote
1 answer
1557 views
Mongodb what is the cost of array queries?
As far as I can tell, find queries in MongoDB involving an array appear very expensive. Take as an example the queries given in: https://docs.mongodb.com/manual/tutorial/query-array-of-documents/
db.inventory.insertMany( [
  { item: "journal", instock: [ { warehouse: "A", qty: 5 }, { warehouse: "C", qty: 15 } ] },
  { item: "notebook", instock: [ { warehouse: "C", qty: 5 } ] },
  { item: "paper", instock: [ { warehouse: "A", qty: 60 }, { warehouse: "B", qty: 15 } ] },
  { item: "planner", instock: [ { warehouse: "A", qty: 40 }, { warehouse: "B", qty: 5 } ] },
  { item: "postcard", instock: [ { warehouse: "B", qty: 15 }, { warehouse: "C", qty: 35 } ] }
]);
And the find query:
db.inventory.find( { "instock": { warehouse: "A" } } )
The above example can be described as simply "find all records connected to warehouse A". But what is the cost of the find query (big O)? Are there any indexing optimizations going on here, or is it basically an exhaustive search with a cost of O(N*K), where N is the number of inventory records and K is the number of elements in the instock array of each record? And if so, are there any ways of optimizing this to minimize the cost, say by indexing?
Daniel Valland (425 rep)
Dec 30, 2018, 06:02 PM • Last activity: Jan 14, 2025, 04:05 PM
146 votes
8 answers
475600 views
How to turn JSON array into Postgres array?
I have a column data of type json that holds JSON documents like this: { "name": "foo", "tags": ["foo", "bar"] } I would like to turn the nested tags array into a concatenated string ('foo, bar'). That would be easily possible with the array_to_string() function in theory. However, this function does not accept json input. So I wonder how to turn this JSON array into a Postgres array (type text[])?
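A minimal sketch on a current version (9.4 or later), with a hypothetical table name my_table: json_array_elements_text expands the JSON array into text rows, ARRAY(...) rebuilds a text[], and array_to_string then concatenates it.
```
-- A sketch; the table name is assumed.
SELECT array_to_string(
         ARRAY(SELECT json_array_elements_text(data -> 'tags')),
         ', ') AS tags
FROM   my_table;
```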
Christoph (1653 rep)
Dec 2, 2013, 08:48 PM • Last activity: Dec 16, 2024, 08:18 AM
19 votes
1 answer
41641 views
Postgres query to return JSON object keys as array
Is it possible to return a JSON object's keys as an array of values in PostgreSQL? In JavaScript, this would simply be Object.keys(obj), which returns an array of strings. For example, if I have a table like this:
tbl_items
---------
id bigserial NOT NULL
obj json NOT NULL
And if there's a row like this:
id      obj
-----   -------------------------
123     '{"foo":1,"bar":2}'
How can I have a query return:
id      keys
-----   ------------------
123     '{"foo","bar"}'
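A minimal sketch against the table from the question: json_object_keys returns the keys as a set of text rows, and ARRAY(...) collects them back into a text[] per row.
```
-- One array of keys per row of tbl_items.
SELECT id,
       ARRAY(SELECT json_object_keys(obj)) AS keys
FROM   tbl_items;
```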
Yanick Rochon (1651 rep)
Jan 25, 2016, 02:51 PM • Last activity: Nov 26, 2024, 09:15 PM
1 vote
1 answer
67 views
Finding the distinct values in an array of documents
I've got a database full of documents which each contain a collection of transactions:
[
  {
    "key": 1,
    "data": [
      {
        "trans": 1,
        "uid": 1
      },
      {
        "trans": 2,
        "uid": 1
      }
    ]
  },
  {
    "key": 2,
    "data": [
      {
        "trans": 3,
        "uid": 1
      },
      {
        "trans": 4,
        "uid": 2
      }
    ]
  }
]
I want to create a new field in each of the main documents, which has the unique values of the uid field. I can get partway there using $map:
db.collection.aggregate([
  {
    "$set": {
      "uid": {
        "$map": {
          "input": "$data",
          "as": "trans",
          "in": "$$trans.uid"
        }
      }
    }
  }
])
This gives me:
[
  {
    "key": 1,
    "uid": [1,1],
    "data": [
      {
        "trans": 1,
        "uid": 1
      },
      {
        "trans": 2,
        "uid": 1
      }
    ]
  },
  {
    "key": 2,
    "uid": [1, 2],
    "data": [
      {
        "trans": 3,
        "uid": 1
      },
      {
        "trans": 4,
        "uid": 2
      }
    ]
  }
]
This is close, but I can't seem to figure out the last step: I want to use only the unique values, so the uid for the first document should be [1], not [1, 1]. The distinct() function works across collections, not single documents. I would think that $addToSet would work, but it doesn't operate on arrays, only on the output of $group. I also looked at trying to create a $reduce specification using $setUnion, but I don't know how to promote my numeric value into an array. I can use the $unwind stage with grouping by _id to get the right values for the new field, but I can't figure out how to attach them back to the original objects.
ralmond (13 rep)
Nov 8, 2024, 11:26 PM • Last activity: Nov 9, 2024, 01:10 AM
1 vote
1 answer
107 views
Cardinality of a multirange? How many ranges or gaps are there in a multirange?
How do I obtain the cardinality of a multirange? I'm using range_agg() to aggregate ranges, which merges ranges if they are continuous, or returns what looks like an array of ranges if the ranges are discontinuous. I want to detect whether there are any gaps in the multirange and wanted to use cardinality() for this: a cardinality of 1 would mean no gaps, a cardinality of 2 means 1 gap, etc. However, cardinality is not defined on a multirange, because a multirange is not technically an array:
with x(a) as (
  values
    (daterange(date '2024-01-30', date '2024-03-31', '[)')),
    (daterange(date '2024-04-01', date '2024-04-30', '[)'))
)
select cardinality(range_agg(a)) from x;
returns
ERROR: function cardinality(datemultirange) does not exist
LINE 6: select cardinality(range_agg(a)) from x;
I can probably implement the function myself with unnest() and counting the number of resulting rows, but I'm curious if there are any other good solutions.
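A minimal sketch of the unnest-and-count route mentioned at the end, reusing the example data: unnest splits the multirange back into its component ranges, so the row count plays the role of cardinality.
```
-- Two ranges here, i.e. one gap.
WITH x(a) AS (
    VALUES (daterange(date '2024-01-30', date '2024-03-31', '[)')),
           (daterange(date '2024-04-01', date '2024-04-30', '[)'))
), agg AS (
    SELECT range_agg(a) AS mr FROM x
)
SELECT count(*) AS n_ranges
FROM   agg, unnest(mr);
```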
Colin &#39;t Hart (9455 rep)
Sep 26, 2024, 07:42 AM • Last activity: Sep 26, 2024, 11:28 AM
Showing page 1 of 20 total questions