Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
1 vote · 2 answers · 145 views
How to concatenate multiple JSON arrays into a single array in MariaDB?
I'm working with a MariaDB database where I have a table with two columns: `id` and `values`, where `values` contains JSON arrays. Here is a simplified example of the data:

| id | values      |
|----|-------------|
| 1  | "[1, 2, 3]" |
| 1  | "[5]"       |
| 2  | "[4]"       |

I want to group the rows by `id` and concatenate the JSON arrays in the `values` column into a single array, like this:

| id | values         |
|----|----------------|
| 1  | "[1, 2, 3, 5]" |
| 2  | "[4]"          |

I tried using `JSON_ARRAYAGG`, but it creates an array of arrays, and I can't figure out how to merge the arrays into one. How can I achieve this in MariaDB?

```sql
SELECT id, JSON_ARRAYAGG(values)
FROM REC
GROUP BY id
```

Bonus question: How to output only unique values?
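One possible approach, as a minimal sketch: explode each array into scalar rows first, then aggregate the scalars rather than the arrays. This assumes MariaDB 10.6+ (for `JSON_TABLE`) and integer elements; adjust the column type for other element types.

```sql
-- Sketch: flatten each JSON array with JSON_TABLE, then re-aggregate the scalar elements.
SELECT r.id,
       JSON_ARRAYAGG(jt.elem) AS merged
FROM REC AS r,
     JSON_TABLE(r.`values`, '$[*]' COLUMNS (elem INT PATH '$')) AS jt
GROUP BY r.id;
```

For the bonus question, MariaDB's `JSON_ARRAYAGG` documents a `DISTINCT` modifier, so `JSON_ARRAYAGG(DISTINCT jt.elem)` may be enough; verify on the server version in use.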
Emax
(111 rep)
Dec 16, 2024, 01:48 PM
• Last activity: Aug 5, 2025, 01:04 PM
0 votes · 1 answer · 1227 views
Use ENUM in array of objects
Is it possible to set up an ARRAY column with objects/JSON on a TABLE, forcing a parameter of the objects in the ARRAY to adhere to an ENUM, while keeping that object parameter unique?
Data examples:

```
ENUM val1, val2, val3

[{p1: val1, p2: 'something'}, {p1: val2, p2: 'something'}]      <-- valid
[{p1: val1, p2: 'something'}, {p1: val4, p2: 'something'}]      <-- not valid, val4 not an ENUM value
[{p1: val1, p2: 'something'}, {p1: val1, p2: 'something else'}] <-- not valid, p1 not unique
```
If it is possible, using PostgreSQL and sequelize, how would I go about it, setting up the column?
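On the PostgreSQL side, one hedged sketch is to store the array as `jsonb` (per-element constraints are easier to express there than on a native ARRAY) and enforce both rules with a `CHECK` constraint backed by a validation function. The type, function, and table names below are hypothetical:

```sql
-- Sketch: every element's p1 must be a value of the enum, and no p1 may repeat.
CREATE TYPE p1_enum AS ENUM ('val1', 'val2', 'val3');

CREATE FUNCTION p1_values_valid(arr jsonb) RETURNS boolean
LANGUAGE sql AS $$
  SELECT bool_and(elem->>'p1' = ANY (enum_range(NULL::p1_enum)::text[]))  -- p1 is a known enum value
         AND count(DISTINCT elem->>'p1') = count(*)                       -- p1 values are unique
  FROM jsonb_array_elements(arr) AS elem
$$;

CREATE TABLE things (
  id    serial PRIMARY KEY,
  items jsonb CHECK (p1_values_valid(items))
);
```

On the Sequelize side the column could then be declared as `DataTypes.JSONB`, with the validation living entirely in the database.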
David Gustavsson
(235 rep)
Sep 12, 2017, 02:34 PM
• Last activity: Aug 4, 2025, 10:02 PM
0 votes · 1 answer · 145 views
Json or separate column?
I'm developing a table structure to store physical activity data, where users can save their progress. The activities belong to multiple programs: a beginner program might involve 1000 step counts and 100 pushups, while an intermediate program might involve exercising for 30 days with Saturdays off, and so on. I am confused about whether to store this program info in separate columns or as JSON.
Vidya L
(101 rep)
Jan 19, 2024, 12:56 PM
• Last activity: Jul 26, 2025, 02:01 PM
0 votes · 1 answer · 149 views
Producing JSON in Oracle 12c1
I need to produce a JSON document with several properties from a list of Oracle 12c1 production databases, and I was given read-only access in order to generate this data.
The problem is these databases were created over time and may store VARCHAR2 values in different character sets. My naive attempt at producing JSON is:

```sql
SELECT '{' ||
       '"content":"' || content || '", "' ||
       '"name":"' || name || '"}'
FROM my_table
WHERE ...
```

However, since the columns `content` and `name` may have all kinds of special characters and use non-trivial character sets, the query above may produce invalid JSON documents.
Is there any function that can convert a VARCHAR2 into Unicode from any character set? I think if the string were in Unicode it would be easy to manage. Unfortunately, I don't really know each column's character set in advance.
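One note that may simplify things: within a single Oracle database, all `VARCHAR2` columns share the database character set, so the conversion question is per database rather than per column. For the escaping side, 12c Release 1 lacks the JSON generation functions (`JSON_OBJECT` arrived in 12.2), so a hedged sketch is to escape the JSON special characters manually:

```sql
-- A partial escaper sketch: handles backslash, double quote, and CR/LF only.
-- A complete JSON escaper must also handle the remaining control characters below U+0020.
SELECT '{'
    || '"content":"' || REPLACE(REPLACE(REPLACE(REPLACE(content,
           '\', '\\'), '"', '\"'), CHR(13), '\r'), CHR(10), '\n') || '",'
    || '"name":"'    || REPLACE(REPLACE(REPLACE(REPLACE(name,
           '\', '\\'), '"', '\"'), CHR(13), '\r'), CHR(10), '\n') || '"}'
FROM my_table;
```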
Joe DiNottra
(101 rep)
Oct 26, 2021, 10:42 PM
• Last activity: Jul 24, 2025, 07:02 AM
2 votes · 0 answers · 30 views
Usage of JSON in mySQL
In the project I am working on, there is a **Booking** table.
- There are 8 types of Bookings.
- I have elected to have just one table
- There are 12 common fields
I could store the rest of the fields as columns, but I am thinking of using JSON storage (which I have never used). It will add complexity to the code (in terms of readability) to split the JSON in the JavaScript application for editing and then recombine it.
Can I get MySQL to do the splitting and combining of JSON fields?
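MySQL can do much of this server-side. A minimal sketch, assuming MySQL 5.7+; the table layout and JSON member names are hypothetical:

```sql
-- Hypothetical booking table: the 12 common fields stay as real columns,
-- the type-specific remainder lives in one JSON document.
CREATE TABLE booking (
  id           INT PRIMARY KEY,
  booking_type TINYINT NOT NULL,
  extra        JSON,
  -- expose one JSON member as a virtual generated column so it can be queried and indexed normally
  pickup_time  DATETIME AS (CAST(extra->>'$.pickup_time' AS DATETIME)) VIRTUAL
);

-- read a single member out of the document
SELECT extra->>'$.pickup_time' FROM booking WHERE id = 1;

-- update a single member in place, without round-tripping the whole document through the app
UPDATE booking SET extra = JSON_SET(extra, '$.pickup_time', '2025-07-11 09:00:00') WHERE id = 1;
```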
Rohit Gupta
(2126 rep)
Jul 10, 2025, 01:06 PM
3 votes · 1 answer · 373 views
Why does a Json Path inequality over an array not work?
Given the following `jsonb` array, I want to filter it only for certain objects.

```json
[
  { "id": "123" },
  { "id": "456" },
  { "id": "789" }
]
```

I can use the function `jsonb_path_query_array` with the path `strict $[*] ? (@.id == $idToSearch[*])` and a `variables` parameter of `{"idToSearch":["123","456"]}` to filter by the `id` parameter. The logic being:

* `$[*]` breaks out the array
* `?` is a filter predicate...
* ... where the `id` property is equal to ...
* ... any value in the `idToSearch` array variable.

I get this straightforward result:
```json
[
  { "id": "123" },
  { "id": "456" }
]
```
If, however, I invert the `==` in the path query to `strict $[*] ? (@.id != $idToSearch[*])`, then instead of getting the correct result of

```json
[
  { "id": "789" }
]
```

I just get all the objects again.

As a workaround, I can use a negated equals, `strict $[*] ? (! (@.id == $idToSearch[*]))`, to get the correct result.
**Why is the query not returning the correct result?**
Is this a bug? If so, what's the cause and has it been filed? If it's not a bug, where am I going wrong in the path query, and what would be the correct way to express it?
____
```sql
create table test (
  _id int,
  items jsonb
);

insert into test values (1, '[
  { "id": "123" },
  { "id": "456" },
  { "id": "789" }
]');

select
  jsonb_path_query_array(
    items,
    'strict $[*] ? (@.id != $idToSearch[*])',
    '{"idToSearch":["123","456"]}')
from test;
```
**dbfiddle**
Charlieface
(17545 rep)
Jul 7, 2025, 01:01 PM
• Last activity: Jul 7, 2025, 02:39 PM
0 votes · 1 answer · 177 views
How to backup faunadb collection?
I want to migrate my FaunaDB collection to a different database. How can I download all the data in a collection in JSON format?
Mahesh Ghamaand
(101 rep)
Oct 17, 2022, 10:03 AM
• Last activity: Jun 30, 2025, 11:09 PM
1 vote · 1 answer · 218 views
MariaDB: inconsistent SELECT results involving JSON
When executing the following statements on MariaDB 10.6.14

```sql
SET @json_1 = JSON_OBJECT('id', 'name');
SET @json_result_1 = JSON_OBJECT('test', @json_1);
SELECT @json_result_1;

SET @json_result_2 = JSON_OBJECT('test', JSON_OBJECT('id', 'name'));
SELECT @json_result_2;
```

I receive the following results.

For `@json_result_1`:

```json
{"test": "{\"id\": \"name\"}"}
```

And for `@json_result_2`:

```json
{"test": {"id": "name"}}
```

How come I retrieve different results? Aren't those JSON strings supposed to be internally all treated as TEXT?
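One plausible explanation: a direct `JSON_OBJECT(...)` argument is recognized as a JSON value, while a user variable is just a plain string, so it gets escaped and embedded as a string. A hedged workaround sketch, passing the variable back through a JSON function; behavior can vary by version, so verify on the server in use:

```sql
-- JSON_EXTRACT of the whole document may return it as a JSON value rather than a plain
-- string, so JSON_OBJECT may embed it unescaped. Assumption to verify, not guaranteed.
SET @json_1 = JSON_OBJECT('id', 'name');
SET @json_result_1 = JSON_OBJECT('test', JSON_EXTRACT(@json_1, '$'));
SELECT @json_result_1; -- expected: {"test": {"id": "name"}}
```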
Andreas Hinderberger
(111 rep)
Nov 17, 2023, 04:24 PM
• Last activity: Jun 15, 2025, 03:06 PM
25 votes · 6 answers · 140354 views
MySQL: Return JSON from a standard SQL Query
I have read about JSON objects and the JSON object type. I only want to do a select and have it return JSON. I do not necessarily want to store a JSON object; serialization per se is not my question. The columns are regular Varchar, Int, etc. columns, no JSON objects, "normal" database rows.
Can I do a regular old SELECT and return JSON from MySQL?
Isn't this what FOR JSON in SQL Server and row_to_json in PostgreSQL do?
They seemed ahead in this area, but I didn't want to fool myself.
I found this question from 2016:
https://stackoverflow.com/questions/35324795/mysql-5-7-return-row-as-json-using-new-json-features
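For reference, newer MySQL releases can do this natively. A minimal sketch with hypothetical table and column names (`JSON_OBJECT` is available from 5.7, `JSON_ARRAYAGG` from 5.7.22):

```sql
-- One JSON object per row:
SELECT JSON_OBJECT('id', id, 'name', name, 'age', age) AS js
FROM people;

-- Or the whole result set as a single JSON array (MySQL 5.7.22+):
SELECT JSON_ARRAYAGG(JSON_OBJECT('id', id, 'name', name, 'age', age)) AS js
FROM people;
```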
johnny
(612 rep)
Dec 1, 2017, 08:44 PM
• Last activity: Jun 6, 2025, 07:59 AM
0 votes · 1 answer · 800 views
Performance issues on json field update
I have a PostgreSQL database on which I am experiencing huge latency issues when trying to update a json field using the following query:

```sql
UPDATE collections
SET "layout" = jsonb_set(layout, '{actions}', '[{"actionId":45,"actionName":"my_action"}]')
WHERE id = 1
```

It takes about 30 sec to complete even though it does not seem like a big query. Furthermore, the query seems to block all other incoming queries, even the simplest `SELECT` (on the same collection).
Good to know too: it happens only on our production database. On local databases it's pretty fast (so I'm kind of assuming it has to do with the traffic and locks on that table). It's also instantaneous on a replica of our production database, which makes me think it might be related to some things happening under the hood that lock this query.
If anybody has any hints on what could cause these troubles, it would be much appreciated.
PostgreSQL: v9.5.20, 8GB RAM, 74/256GB disk used
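Not an answer to the root cause, but a diagnostic sketch that may help confirm the lock theory. On 9.5, `pg_stat_activity` still has the boolean `waiting` column (the `wait_event` columns only arrive in 9.6):

```sql
-- Run while the UPDATE is stuck: long-running or waiting sessions are the usual suspects,
-- e.g. an idle-in-transaction session holding a lock on the same row or table.
SELECT pid, state, waiting, now() - xact_start AS xact_age, query
FROM pg_stat_activity
WHERE state <> 'idle'
ORDER BY xact_age DESC NULLS LAST;
```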
Larcin
(1 rep)
Feb 26, 2020, 02:59 PM
• Last activity: May 30, 2025, 03:03 PM
0 votes · 1 answer · 642 views
Query Quill-Delta JSON array in Postgres
I need to store text in Quill-Delta format, which is in the form of a JSON array, e.g.
| id | title | content |
| --- | --- | --- |
| 10 | Hello | [ { "insert": "Hello" }, { "insert": "World" } ]|
| 20 | Image | [ { "insert": { "image": "data:image/j..." } } ]|
Then, I need to query it, e.g. select where the text contains "hello world", such that:

1. if I query `content` for "hello world" I get row 10;
2. if I query `content` for "insert" I get no results;
3. if I query `content` for "image" I get no results.

In practice, it should search only the `insert` key of the array objects, and only where the value is a JSON string (no other types).
Can I achieve this storing the data in Postgres? I read about full-text search over JSONB, but I couldn't come up with a working example.
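A sketch of one way this might work, assuming a `jsonb` column named `content` on a hypothetical `docs` table: aggregate only the `insert` values that are JSON strings, then run full-text search over that text:

```sql
-- Collect only the string-typed "insert" values, skipping objects such as {"image": ...},
-- then match the concatenated text against a tsquery.
SELECT d.id, d.title
FROM docs AS d
CROSS JOIN LATERAL (
  SELECT string_agg(elem->>'insert', ' ') AS txt
  FROM jsonb_array_elements(d.content) AS elem
  WHERE jsonb_typeof(elem->'insert') = 'string'
) AS t
WHERE to_tsvector('english', t.txt) @@ plainto_tsquery('english', 'hello world');
```

For performance, the aggregated text would likely need to be maintained in a separate `tsvector` column via trigger, since an expression index cannot contain a subquery like this.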
Giovanni Lovato
(465 rep)
Feb 21, 2019, 11:33 AM
• Last activity: May 30, 2025, 03:03 AM
0 votes · 2 answers · 290 views
Query nested json columns in postgresql
I want to return Germany and UK from the below json column/string.

```json
{
  "id": 1,
  "name": "phone",
  "components": [
    {
      "id": 1,
      "type": "screen",
      "manufacturers": [
        { "id": 1, "name": "a", "country": "Germany" }
      ]
    },
    {
      "id": 2,
      "type": "keyboard",
      "manufacturers": [
        { "id": 1, "name": "a", "country": "UK" }
      ]
    }
  ]
}
```
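A sketch, assuming the document lives in a `jsonb` column (hypothetically `doc` on a table `products`; for plain `json` the function is `json_array_elements`): unnest both array levels and project the country.

```sql
-- Expands components, then each component's manufacturers; returns Germany and UK.
SELECT m->>'country' AS country
FROM products AS p
CROSS JOIN LATERAL jsonb_array_elements(p.doc->'components') AS c
CROSS JOIN LATERAL jsonb_array_elements(c->'manufacturers') AS m;
```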
Jackie Ndinwa
(1 rep)
Jun 4, 2024, 09:13 AM
• Last activity: May 26, 2025, 09:04 AM
1 vote · 2 answers · 614 views
Sql Server, OPENJSON sql:identity() parse problems
I'm starting to work with the JSON operators in SQL Server 2019, and on an MSDN page I found a chunk talking about the `$.sql:identity()` operator and how that's supposed to give you an integer index into the JSON array you're processing.
The problem is that the example on the MSDN page doesn't work.
This is the code from MSDN:

```sql
DECLARE @array VARCHAR(MAX);
SET @array = '[{"month":"Jan", "temp":10},{"month":"Feb", "temp":12},{"month":"Mar", "temp":15},
{"month":"Apr", "temp":17},{"month":"May", "temp":23},{"month":"Jun", "temp":27}
]';

SELECT * FROM OPENJSON(@array)
WITH (  month VARCHAR(3),
        temp int,
        month_id tinyint '$.sql:identity()') as months
```

but when I try to run it, I get a parsing exception:

> JSON path is not properly formatted. Unexpected character ':' is found at position 5.
Any clues? Is there a working way to get the array index in a json array as you go through it?
Thanks
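A hedged workaround sketch, in case `$.sql:identity()` simply isn't supported by the engine version in use: when `OPENJSON` is called without a `WITH` clause, it returns an implicit `[key]` column, which for an array is the zero-based element index.

```sql
DECLARE @array VARCHAR(MAX) =
  '[{"month":"Jan","temp":10},{"month":"Feb","temp":12},{"month":"Mar","temp":15}]';

-- Without WITH, OPENJSON returns [key] (the array index), [value], and [type].
SELECT CAST(j.[key] AS tinyint)                      AS month_id,
       JSON_VALUE(j.[value], '$.month')              AS [month],
       CAST(JSON_VALUE(j.[value], '$.temp') AS int)  AS temp
FROM OPENJSON(@array) AS j;
```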
user1664043
(379 rep)
Dec 14, 2023, 10:18 PM
• Last activity: May 19, 2025, 05:03 PM
1 vote · 1 answer · 1868 views
Full text search with json field in SQLite
Is it possible to have full text search in SQLite along with using JSON?
In SQLite, JSON is stored as `TEXT` ([JSON1](https://www.sqlite.org/json1.html)). Full text search requires creation of a virtual table ([full text](https://www.sqlitetutorial.net/sqlite-full-text-search/)), but I don't know how to connect those two extensions so that full text search wouldn't search in JSON field names.

```
WRITING (ID, FK_PERSON_ID, BOOK_NAME, TEXT_JSON)
PEOPLE (PERSON_ID, NAME)
```

Additionally, `TEXT_JSON` for a certain person contains the following JSON objects:

```
-- WRITING.ID = 1, WRITING.FK_PERSON_ID = 1, BOOK_NAME = "NAME_1"
{
  "chapter1": "Title1",
  "text1": "This is sample sentence with word text",
  "text2": "This is sample sentence with word text"
}

-- WRITING.ID = 2, WRITING.FK_PERSON_ID = 1, BOOK_NAME = "NAME_101"
{
  "chapter1": "Title2",
  "text1": "This is sample sentence without"
}
```

(the structure of the JSON object can differ)
How should I set up the virtual table for full text search to search a single person's writings' `BOOK_NAME` and the values of all `TEXT_JSON` attributes? Searching the word `text` in the writings of `PERSON_ID = 1` would return only `WRITING.ID = 1`.
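A sketch of one way to wire the two extensions together, assuming FTS5 and the JSON1 functions are both available: maintain an FTS table whose `body` column is just the JSON values, flattened with `json_each` (so key names are never indexed).

```sql
-- FTS index over book names plus all top-level JSON values.
CREATE VIRTUAL TABLE writing_fts USING fts5(book_name, body);

INSERT INTO writing_fts (rowid, book_name, body)
SELECT w.ID,
       w.BOOK_NAME,
       (SELECT group_concat(je.value, ' ') FROM json_each(w.TEXT_JSON) AS je)
FROM WRITING AS w;

-- "word text in writings of person 1" -> only WRITING.ID = 1
SELECT w.ID
FROM writing_fts
JOIN WRITING AS w ON w.ID = writing_fts.rowid
WHERE writing_fts MATCH 'body: text' AND w.FK_PERSON_ID = 1;
```

For nested JSON structures, `json_tree` could replace `json_each`; and in practice the FTS table would need triggers on `WRITING` to stay in sync.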
SundayProgrammer
(11 rep)
May 3, 2020, 12:27 PM
• Last activity: May 10, 2025, 09:03 PM
0 votes · 1 answer · 360 views
Unstructed data field: Query all values from array of objects by key
I have a table that has a unique ID, and a second column named `data` that contains simple key/value items like:

```
"nickname": "value"
"fullName": "value"
"office": "value"
"unity": "value"
```

and a few more elaborately structured items like:

```
"address":
{
    "city": "value",
    "state": "value",
},
```

and

```
"personalVehicle":
[
    {
        "brand": "value",
        "model": "value",
        "plate": "value",
        "color": "value"
    },
    {
        "brand": "value",
        "model": "value",
        "plate": "value",
        "color": "value"
    }
]
```

where, as you can see, `personalVehicle` is a key that stores an array of objects, in which every object has its own simple key/value items.

I can query specific key values from `address` for all rows: `SELECT data->'address'->'city' AS city FROM person`

```
+------------+
| city       |
|------------|
| "city1"    |
| "city2"    |
| "city3"    |
+------------+
```

Here is the situation: I can query all info about the vehicles with `SELECT data->'personalVehicle' AS vehicles FROM person`

```
+----------------------------------------------------------------------------------------------------------------------------------+
| vehicles                                                                                                                           |
|------------------------------------------------------------------------------------------------------------------------------------|
| [ { "brand": "Toyota", "model": "Corolla", "plate": "AAA-1111", "color": "Red" }, { "brand": "Ford", "model": "Focus", "plate": "ZZZ-9999", "color": "Blue" } ] |
|                                                                                                                                    |
| [ { "brand": "Hyundai", "model": "Tucson", "plate": "ABC-1212", "color": "Grey" } ]                                                |
+----------------------------------------------------------------------------------------------------------------------------------+
```

But I cannot retrieve a specific key for all objects when the objects are inside an array; in that case, I need to specify the index: `SELECT data->'personalVehicle'->0->'model' AS model FROM person`

```
+-------------+
| model       |
|-------------|
| "Toyota"    |
|             |
| "Hyundai"   |
+-------------+
```

This is the first index of the array, that is, the first car. I need to get the models for all N cars that a person might have.
How do I do that? Query without specifying the index?
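A sketch, assuming the `data` column is `jsonb` (for plain `json`, the function is `json_array_elements`): unnest the array with a lateral call so every vehicle becomes its own output row, with no index needed.

```sql
-- One output row per vehicle; LEFT JOIN keeps people that have no personalVehicle array.
SELECT p.id, v->>'model' AS model
FROM person AS p
LEFT JOIN LATERAL jsonb_array_elements(p.data->'personalVehicle') AS v ON true;
```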
Jhonatan Cruz
(1 rep)
May 28, 2019, 06:02 PM
• Last activity: May 8, 2025, 04:06 AM
1 vote · 1 answer · 834 views
Multi dimensional JSON Array SQL Query
I'm struggling to write an appropriate query for my data:

```json
{
  "schools": [
    {
      "org_symbol": "School 1",
      "criteria": [
        [
          { "value": "private", "type": "school type" },
          { "value": "usa", "type": "country" },
          { "value": "english", "type": "language" },
          { "value": "1-6", "type": "grades" },
          { "value": "Silver", "type": "level" }
        ]
      ]
    },
    {
      "org_symbol": "School 2",
      "criteria": [
        [
          { "value": "private", "type": "school type" },
          { "value": "usa", "type": "country" },
          { "value": "english", "type": "language" },
          { "value": "1-6", "type": "grades" },
          { "value": "gold", "type": "level" }
        ]
      ]
    }
  ]
}
```
I have this:

```sql
SELECT distinct on(id) * FROM tribes, json_array_elements(meta::json -> 'attributes') as elem
WHERE
(
    (elem ->> 'type' = 'school type' and elem ->> 'value' = 'private')
    and (elem ->> 'type' = 'country' and elem ->> 'value' = 'usa')
    and (elem ->> 'type' = 'language' and elem ->> 'value' = 'english')
    and (elem ->> 'type' = 'grades' and elem ->> 'value' = '1-6')
    and (elem ->> 'type' = 'level' and elem ->> 'value' = 'gold')
);
```

but it doesn't return anything. I know I'm indexing correctly (full JSON not included), but I can't seem to get the multi-criteria query to work. I need to be able to check where value and type match each item in the set of criteria.
I think I'm close but really not sure; any help would be greatly appreciated.
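One likely cause worth checking: each `elem` is a single `{value, type}` object, so one row can never satisfy `type = 'school type' AND type = 'country'` (and so on) simultaneously. A hedged sketch of the usual fix is one `EXISTS` per criterion, shown here for two criteria; the JSON shape follows the `meta -> 'attributes'` path from the query above rather than the full nested document:

```sql
-- Each criterion only needs *some* array element to match, so test them independently.
SELECT DISTINCT ON (id) *
FROM tribes AS t
WHERE EXISTS (SELECT 1 FROM json_array_elements(t.meta::json -> 'attributes') AS e
              WHERE e->>'type' = 'school type' AND e->>'value' = 'private')
  AND EXISTS (SELECT 1 FROM json_array_elements(t.meta::json -> 'attributes') AS e
              WHERE e->>'type' = 'level' AND e->>'value' = 'gold');
```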
chris
(11 rep)
Aug 26, 2022, 01:56 AM
• Last activity: May 7, 2025, 10:05 PM
1 vote · 1 answer · 2399 views
Mysql 5.7 - select where json
I have some JSON data stored in a `filters` column. An example from one row:

```
name     | filters
---------+----------------------------------------------------------
John Doe | [{"Year":2004, "SportID":3}, {"Year":2005, "SportID":27}]
```

Other rows may have more than 2 objects, this is just an example.
I am trying to `select` data using a `where` clause which would iterate the JSON objects and act upon the value. For example, find me records where `Year > 2004` or `SportID = 24`.
It seems like this should work:

```sql
SELECT name, filters FROM my_table
WHERE json_extract(`filters`, "$[*].Year") = 2004;
```

However I get an empty result set. 🙍♂️
Of course I could just return the entire JSON object and parse it in app code, but I'd rather not do that if I don't have to.
**UPDATE:**
As requested, my `show create table`:

```sql
CREATE TABLE `my_table` (
  `userId` int(11) NOT NULL,
  `name` varchar(255) NOT NULL,
  `filters` json DEFAULT NULL,
  PRIMARY KEY (`userId`),
  UNIQUE KEY `my_table_userId_uindex` (`userId`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1
```
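For the equality cases, a hedged sketch that works on 5.7 is `JSON_CONTAINS`, which matches when any array element contains the candidate object. Range predicates such as `Year > 2004` have no clean 5.7 form, since per-element iteration via `JSON_TABLE` only arrives in MySQL 8.0:

```sql
-- True when some element of the filters array contains {"SportID": 27}.
SELECT name, filters
FROM my_table
WHERE JSON_CONTAINS(filters, '{"SportID": 27}');
```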
Brian Brownton
(133 rep)
May 17, 2019, 06:22 PM
• Last activity: May 5, 2025, 09:05 AM
1 vote · 3 answers · 982 views
Cannot import JSON file into MySQL
I am trying to import a JSON file into a MySQL database. It is FAA data of all the airports in the US. https://services6.arcgis.com/ssFJjBXIUyZDrSYZ/arcgis/rest/services/US_Airport/FeatureServer/0/query?where=1%3D1&outFields=*&outSR=4326&f=json
It looks like it has field definitions first, and that is confusing DBeaver and MySQL Workbench. Any ideas on how to get this into a database table? Here is what the first few rows look like; `{"OBJECTID":1` is the start of the first column of the first row of data. How do I get past all the data-definition content at the beginning? It looks like this is a description of the fields:

```json
{"objectIdFieldName":"OBJECTID","uniqueIdField":{"name":"OBJECTID","isSystemMaintained":true},"globalIdFieldName":"","geometryType":"esriGeometryPoint","spatialReference":{"wkid":4326,"latestWkid":4326},"fields":[{"name":"OBJECTID","type":"esriFieldTypeOID","alias":"OBJECTID","sqlType":"sqlTypeOther","domain":null,"defaultValue":null},{"name":"GLOBAL_ID","type":"esriFieldTypeString","alias":"GLOBAL_ID","sqlType":"sqlTypeOther","length":50,"domain":null,"defaultValue":null},{"name":"TYPE_CODE","type":"esriFieldTypeInteger","alias":"TYPE_CODE","sqlType":"sqlTypeOther","domain":null,"defaultValue":null},{"name":"IDENT","type":"esriFieldTypeString","alias":"IDENT","sqlType":"sqlTypeOther","length":254,"domain":null,"defaultValue":null},{"name":"CHANNEL","type":"esriFieldTypeString","alias":"CHANNEL","sqlType":"sqlTypeOther","length":254,"domain":null,"defaultValue":null},{"name":"NAS_USE","type":"esriFieldTypeSmallInteger","alias":"NAS_USE","sqlType":"sqlTypeOther","domain":null,"defaultValue":null},{"name":"CLASS_TXT","type":"esriFieldTypeString","alias":"CLASS_TXT","sqlType":"sqlTypeOther","length":12,"domain":null,"defaultValue":null},{"name":"NAME_TXT","type":"esriFieldTypeString","alias":"NAME_TXT","sqlType":"sqlTypeOther","length":100,"domain":null,"defaultValue":null},{"name":"CITY","type":"esriFieldTypeString","alias":"CITY","sqlType":"sqlTypeOther","length":254,"domain":null,"defaultValue":null},{"name":"STATE","type":"esriFieldTypeString","alias":"STATE","sqlType":"sqlTypeOther","length":254,"domain":null,"defaultValue":null},{"name":"COUNTRY","type":"esriFieldTypeString","alias":"COUNTRY","sqlType":"sqlTypeOther","length":254,"domain":null,"defaultValue":null},{"name":"STATUS","type":"esriFieldTypeString","alias":"STATUS","sqlType":"sqlTypeOther","length":254,"domain":null,"defaultValue":null},{"name":"REMARKS","type":"esriFieldTypeString","alias":"REMARKS","sqlType":"sqlTypeOther","length":254,"domain":null,"defaultValue":null},{"name":"AK_LOW","type":"esriFieldTypeSmallInteger","alias":"AK_LOW","sqlType":"sqlTypeOther","domain":null,"defaultValue":null},{"name":"AK_HIGH","type":"esriFieldTypeSmallInteger","alias":"AK_HIGH","sqlType":"sqlTypeOther","domain":null,"defaultValue":null},{"name":"US_LOW","type":"esriFieldTypeSmallInteger","alias":"US_LOW","sqlType":"sqlTypeOther","domain":null,"defaultValue":null},{"name":"US_HIGH","type":"esriFieldTypeSmallInteger","alias":"US_HIGH","sqlType":"sqlTypeOther","domain":null,"defaultValue":null},{"name":"US_AREA","type":"esriFieldTypeSmallInteger","alias":"US_AREA","sqlType":"sqlTypeOther","domain":null,"defaultValue":null},{"name":"PACIFIC","type":"esriFieldTypeSmallInteger","alias":"PACIFIC","sqlType":"sqlTypeOther","domain":null,"defaultValue":null}],"exceededTransferLimit":true,"features":[{"attributes":{"OBJECTID":1,"GLOBAL_ID":"DECD1B7B-F419-4D71-97AE-25BC91363C1B","TYPE_CODE":4,"IDENT":"ADK","CHANNEL":null,"NAS_USE":1,"CLASS_TXT":"HW/DME","NAME_TXT":"MOUNT MOFFETT","CITY":"ADAK ISLAND","STATE":"AK","COUNTRY":"UNITED STATES","STATUS":"IFR","REMARKS":"DME CHANNEL 087X IS PAIRED WITH VHF FREQ 114.0.||DME UNUSBL 080-105 BYD 27 NM, 105-115, 115-155 BYD 27 NM, 155-225, 225-290 BYD 27 NM, 290-340, 340-055 BYD 20 NM.","AK_LOW":1,"AK_HIGH":1,"US_LOW":0,"US_HIGH":0,"US_AREA":0,"PACIFIC":0},"geometry":{"x":-176.67602422142775,"y":51.87189606056274}},{"attributes":
```
Paul
(153 rep)
May 27, 2022, 08:17 PM
• Last activity: May 4, 2025, 09:04 PM
0 votes · 2 answers · 167 views
Grouping and arranging data using FOR JSON PATH and INCLUDE_NULL_VALUES
I'm having an issue with my T-SQL script: I am working out how to produce the expected output from it.
Here's my T-SQL script:

```sql
DECLARE @user_id VARCHAR(50) = 'UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL'; -- FOR TESTING PURPOSES ONLY
DECLARE @status_t TABLE (status VARCHAR(50));
INSERT INTO @status_t VALUES ('PENDING'),('APPROVED'),('PROCESSING')
SELECT
orders.batch_code, -- FOR DEMONSTRATION PURPOSES ONLY
oa_prod.nested_data AS nested_prod,
cnb.amount_to_pay
FROM
[testDB].[dbo].[Orders] AS orders
RIGHT OUTER JOIN
[testDB].[dbo].[Payment] AS cnb
ON
orders.order_id = cnb.order_id
LEFT OUTER JOIN
[testDB].[dbo].[StatusRef] AS stat_ref
ON
orders.stat_ref_id = stat_ref.stat_ref_id
CROSS APPLY( -- OR OUTER APPLY
SELECT
orders.batch_code,
orders.order_id,
prod.prod_name,
pv_kn.key_name,
pv_kv.value,
CASE WHEN
orders.prod_id IS NOT NULL
THEN
prod.disc_price
WHEN
orders.prod_var_id IS NOT NULL
THEN
prod_var.disc_price
END AS disc_price,
orders.quantity
FROM
[testDB].[dbo].[Product] AS prod
RIGHT OUTER JOIN
[testDB].[dbo].[SubProduct] AS prod_var
ON
prod.prod_id = prod_var.prod_id
LEFT OUTER JOIN
[testDB].[dbo].[SubProductVarKeyValue] AS pv_kv
ON
prod_var.prod_var_id = pv_kv.prod_var_id
LEFT OUTER JOIN
[testDB].[dbo].[SubProductVarKeyNames] AS pv_kn
ON
pv_kv.pv_key_name_id = pv_kn.pv_key_name_id
WHERE
prod.prod_id = orders.prod_id
OR prod_var.prod_var_id = orders.prod_var_id
FOR JSON PATH,
INCLUDE_NULL_VALUES
) AS oa_prod(nested_data) -- it's a syntax analyzer lint conflict ONLY
WHERE
orders.disable = 0
AND cnb.disable = 0
AND orders.user_id = @user_id
AND stat_ref.com_usd_wrds IN (SELECT status FROM @status_t)
ORDER BY
orders.dt_stamp DESC,
orders.batch_code ASC -- To prevent batch_code from being separated
FOR JSON PATH,
INCLUDE_NULL_VALUES
```
Here's my [minimal, reproducible example](https://stackoverflow.com/help/minimal-reproducible-example) using table variables:

```sql
DECLARE @StatusRef TABLE(
stat_ref_id VARCHAR(50) PRIMARY KEY NOT NULL,
com_usd_wrds NVARCHAR(100) NOT NULL
);
DECLARE @Product TABLE(
prod_id VARCHAR(50) PRIMARY KEY NOT NULL,
prod_name VARCHAR(200) NOT NULL,
stock INT NOT NULL,
disc_price DECIMAL(12, 2) NOT NULL
);
DECLARE @SubProduct TABLE(
prod_var_id VARCHAR(50) PRIMARY KEY NOT NULL,
stock INT NOT NULL,
disc_price DECIMAL(12, 2) NOT NULL,
prod_id VARCHAR(50) NOT NULL
);
DECLARE @Orders TABLE(
order_id VARCHAR(50) PRIMARY KEY NOT NULL,
batch_code VARCHAR(50) NULL,
quantity INT NOT NULL,
stat_ref_id VARCHAR(50) NOT NULL,
disable BIT DEFAULT (0) NOT NULL,
dt_stamp DATETIME NOT NULL,
prod_id VARCHAR(50) NULL,
prod_var_id VARCHAR(50) NULL,
user_id VARCHAR(50) NOT NULL
);
DECLARE @Payment TABLE(
amount_to_pay DECIMAL(14, 2) NOT NULL,
order_id VARCHAR(50) NOT NULL,
disable BIT DEFAULT (0) NOT NULL
);
DECLARE @SubProductVarKeyValue TABLE(
value VARCHAR(100) NOT NULL,
prod_var_id VARCHAR(50) NOT NULL,
pv_key_name_id VARCHAR(50) NOT NULL
);
DECLARE @SubProductVarKeyNames TABLE(
pv_key_name_id VARCHAR(50) PRIMARY KEY NOT NULL,
key_name VARCHAR(100) NOT NULL
);
INSERT INTO @StatusRef
VALUES
(
'STAT-REF-1001', -- stat_ref_id
'PENDING' -- com_usd_wrds
),
(
'STAT-REF-1002', -- stat_ref_id
'APPROVED' -- com_usd_wrds
),
(
'STAT-REF-1003', -- stat_ref_id
'PROCESSING' -- com_usd_wrds
);
INSERT INTO @Product
VALUES
(
'PROD-ID-1001', -- prod_id
'iPhone', -- prod_name
0, -- stock | dependent to @SubProduct
0.00 -- disc_price | dependent to @SubProduct
),
(
'PROD-ID-1002', -- prod_id
'Samsung', -- prod_name
0, -- stock | dependent to @SubProduct
0.00 -- disc_price | dependent to @SubProduct
),
(
'PROD-ID-1003', -- prod_id
'Nokia', -- prod_name
75, -- stock
33150.00 -- disc_price
),
(
'PROD-ID-1004', -- prod_id
'Google', -- prod_name
100, -- stock
53509.00 -- disc_price
),
(
'PROD-ID-1005', -- prod_id
'Sony', -- prod_name
0, -- stock | dependent to @SubProduct
0.00 -- disc_price | dependent to @SubProduct
),
(
'PROD-ID-1006', -- prod_id
'Lenovo', -- prod_name
0, -- stock | dependent to @SubProduct
0.00 -- disc_price | dependent to @SubProduct
);
INSERT INTO @SubProduct
VALUES
(
'PROD-VAR-ID-1', -- prod_var_id
25, -- stock
45809.00, -- disc_price
'PROD-ID-1001' -- prod_id
),
(
'PROD-VAR-ID-2', -- prod_var_id
50, -- stock
40209.00, -- disc_price
'PROD-ID-1002' -- prod_id
),
(
'PROD-VAR-ID-3', -- prod_var_id
0, -- stock | dependent to @Product
0.00, -- disc_price | dependent to @Product
'PROD-ID-1003' -- prod_id
),
(
'PROD-VAR-ID-4', -- prod_var_id
0, -- stock | dependent to @Product
0.00, -- disc_price | dependent to @Product
'PROD-ID-1004' -- prod_id
),
(
'PROD-VAR-ID-5', -- prod_var_id
125, -- stock
25809.00, -- disc_price
'PROD-ID-1005' -- prod_id
),
(
'PROD-VAR-ID-6', -- prod_var_id
150, -- stock
49100.00, -- disc_price
'PROD-ID-1006' -- prod_id
);
INSERT INTO @SubProductVarKeyValue
VALUES
(
'new', -- value
'PROD-VAR-ID-1', -- prod_var_id
'PVKN-ID-1' -- pv_key_name_id
),
(
'new', -- value
'PROD-VAR-ID-2', -- prod_var_id
'PVKN-ID-1' -- pv_key_name_id
),
(
'new', -- value
'PROD-VAR-ID-5', -- prod_var_id
'PVKN-ID-1' -- pv_key_name_id
),
(
'new', -- value
'PROD-VAR-ID-6', -- prod_var_id
'PVKN-ID-1' -- pv_key_name_id
)
INSERT INTO @SubProductVarKeyNames
VALUES
(
'PVKN-ID-1', -- pv_key_name_id
'Condition' -- key_name
)
INSERT INTO @Orders
(
order_id,
batch_code,
quantity,
stat_ref_id,
dt_stamp,
prod_id,
prod_var_id,
user_id
)
VALUES
(
'ORDER-2025-04-11-B71D0E2F5D8C', -- order_id
NULL, -- batch_code
1, -- quantity
'STAT-REF-1003', -- stat_ref_id
'2025-04-14 10:17:20.963', -- dt_stamp
NULL, -- prod_id
'PROD-VAR-ID-1', -- prod_var_id
'UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL' -- user_id
),
(
'ORDER-2025-04-11-D95EB033CA40', -- order_id
'BGUID-2025-04-11-6D81B58FAE94', -- batch_code
2, -- quantity
'STAT-REF-1001', -- stat_ref_id
'2025-04-13 09:17:20.963', -- dt_stamp
NULL, -- prod_id
'PROD-VAR-ID-2', -- prod_var_id
'UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL' -- user_id
),
(
'ORDER-2025-04-11-7F04EFA2BB60', -- order_id
'BGUID-2025-04-11-6D81B58FAE94', -- batch_code
2, -- quantity
'STAT-REF-1001', -- stat_ref_id
'2025-04-13 09:17:20.963', -- dt_stamp
'PROD-ID-1003', -- prod_id
NULL, -- prod_var_id
'UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL' -- user_id
),
(
'ORDER-2025-04-10-3F03EAA47686', -- order_id
'BGUID-2025-04-10-20239FD2059F', -- batch_code
1, -- quantity
'STAT-REF-1002', -- stat_ref_id
'2025-04-12 08:17:20.963', -- dt_stamp
'PROD-ID-1004', -- prod_id
NULL, -- prod_var_id
'UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL' -- user_id
),
(
'ORDER-2025-04-10-F4A89E2C4A30', -- order_id
'BGUID-2025-04-10-20239FD2059F', -- batch_code
1, -- quantity
'STAT-REF-1002', -- stat_ref_id
'2025-04-12 08:17:20.963', -- dt_stamp
NULL, -- prod_id
'PROD-VAR-ID-5', -- prod_var_id
'UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL' -- user_id
),
(
'ORDER-2025-04-08-31BD887341FA', -- order_id
NULL, -- batch_code
1, -- quantity
'STAT-REF-1001', -- stat_ref_id
'2025-04-11 07:17:20.963', -- dt_stamp
NULL, -- prod_id
'PROD-VAR-ID-6', -- prod_var_id
'UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL' -- user_id
);
INSERT INTO @Payment
(
amount_to_pay,
order_id
)
VALUES
(
45809.00, -- amount_to_pay
'ORDER-2025-04-11-B71D0E2F5D8C' -- order_id
),
(
146718.00, -- amount_to_pay
'ORDER-2025-04-11-D95EB033CA40' -- order_id
),
(
146718.00, -- amount_to_pay
'ORDER-2025-04-11-7F04EFA2BB60' -- order_id
),
(
79318.00, -- amount_to_pay
'ORDER-2025-04-10-3F03EAA47686' -- order_id
),
(
79318.00, -- amount_to_pay
'ORDER-2025-04-10-F4A89E2C4A30' -- order_id
),
(
49100.00, -- amount_to_pay
'ORDER-2025-04-08-31BD887341FA' -- order_id
);
```
`SELECT * FROM @StatusRef`
OUTPUT:
| stat_ref_id | com_usd_wrds |
|---------------|--------------|
| STAT-REF-1001 | PENDING |
| STAT-REF-1002 | APPROVED |
| STAT-REF-1003 | PROCESSING |
`SELECT * FROM @Product`
OUTPUT:
| prod_id | prod_name | stock | disc_price |
|--------------|------------|-------|-------------|
| PROD-ID-1001 | iPhone | 0 | 0.00 |
| PROD-ID-1002 | Samsung | 0 | 0.00 |
| PROD-ID-1003 | Nokia | 75 | 33150.00 |
| PROD-ID-1004 | Google | 100 | 53509.00 |
| PROD-ID-1005 | Sony | 0 | 0.00 |
| PROD-ID-1006 | Lenovo | 0 | 0.00 |
`SELECT * FROM @SubProduct`
OUTPUT:
| prod_var_id | stock | disc_price | prod_id |
|---------------|-------|------------|--------------|
| PROD-VAR-ID-1 | 25 | 45809.00 | PROD-ID-1001 |
| PROD-VAR-ID-2 | 50 | 40209.00 | PROD-ID-1002 |
| PROD-VAR-ID-3 | 0 | 0.00 | PROD-ID-1003 |
| PROD-VAR-ID-4 | 0 | 0.00 | PROD-ID-1004 |
| PROD-VAR-ID-5 | 125 | 25809.00 | PROD-ID-1005 |
| PROD-VAR-ID-6 | 150 | 49100.00 | PROD-ID-1006 |
`SELECT * FROM @Orders ORDER BY dt_stamp`
OUTPUT:
| order_id | batch_code | quantity | stat_ref_id | disable | dt_stamp | prod_id | prod_var_id | user_id |
|----------|-------------|----------|-------------|---------|----------|----------|-------------|----------|
| ORDER-2025-04-08-31BD887341FA | NULL | 1 | STAT-REF-1001 | 0 | 2025-04-11 07:17:20.963 | NULL | PROD-VAR-ID-6 | UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL |
| ORDER-2025-04-10-3F03EAA47686 | BGUID-2025-04-10-20239FD2059F | 1 | STAT-REF-1002 | 0 | 2025-04-12 08:17:20.963 | PROD-ID-1004 | NULL | UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL |
| ORDER-2025-04-10-F4A89E2C4A30 | BGUID-2025-04-10-20239FD2059F | 1 | STAT-REF-1002 | 0 | 2025-04-12 08:17:20.963 | NULL | PROD-VAR-ID-5 | UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL |
| ORDER-2025-04-11-7F04EFA2BB60 | BGUID-2025-04-11-6D81B58FAE94 | 2 | STAT-REF-1001 | 0 | 2025-04-13 09:17:20.963 | PROD-ID-1003 | NULL | UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL |
| ORDER-2025-04-11-D95EB033CA40 | BGUID-2025-04-11-6D81B58FAE94 | 2 | STAT-REF-1001 | 0 | 2025-04-13 09:17:20.963 | NULL | PROD-VAR-ID-2 | UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL |
| ORDER-2025-04-11-B71D0E2F5D8C | NULL | 1 | STAT-REF-1003 | 0 | 2025-04-14 10:17:20.963 | NULL | PROD-VAR-ID-1 | UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL |
`SELECT * FROM @Payment`
OUTPUT:
| amount_to_pay | order_id | disable |
|---------------|------------|---------|
| 45809.00 | ORDER-2025-04-11-B71D0E2F5D8C | 0 |
| 146718.00 | ORDER-2025-04-11-D95EB033CA40 | 0 |
| 146718.00 | ORDER-2025-04-11-7F04EFA2BB60 | 0 |
| 79318.00 | ORDER-2025-04-10-3F03EAA47686 | 0 |
| 79318.00 | ORDER-2025-04-10-F4A89E2C4A30 | 0 |
| 49100.00      | ORDER-2025-04-08-31BD887341FA | 0 |
`SELECT * FROM @SubProductVarKeyValue`
OUTPUT:
| value | prod_var_id | pv_key_name_id |
|-------------|--------------|---------|
| new | PROD-VAR-ID-1 | PVKN-ID-1 |
| new | PROD-VAR-ID-2 | PVKN-ID-1 |
| new | PROD-VAR-ID-5 | PVKN-ID-1 |
| new | PROD-VAR-ID-6 | PVKN-ID-1 |
`SELECT * FROM @SubProductVarKeyNames`
OUTPUT:
| pv_key_name_id | key_name |
|-------------|--------------|
| PVKN-ID-1 | Condition |
Here's the modified example of the script that's similar to the first one provided:

```sql
DECLARE @user_id VARCHAR(50) = 'UGUID-2025-01-27-14-09-22-1967-ABCDEFGHIJKL'; -- FOR TESTING PURPOSES ONLY
DECLARE @status_t TABLE (status VARCHAR(50));
INSERT INTO @status_t VALUES ('PENDING'),('APPROVED'),('PROCESSING')
SELECT
orders.batch_code, -- FOR DEMONSTRATION PURPOSES ONLY
oa_prod.nested_data AS nested_prod,
cnb.amount_to_pay
FROM
@Orders AS orders
RIGHT OUTER JOIN
@Payment AS cnb
ON
orders.order_id = cnb.order_id
LEFT OUTER JOIN
@StatusRef AS stat_ref
ON
orders.stat_ref_id = stat_ref.stat_ref_id
CROSS APPLY( -- OR OUTER APPLY
SELECT
orders.batch_code,
orders.order_id,
prod.prod_name,
pv_kn.key_name,
pv_kv.value,
CASE WHEN
orders.prod_id IS NOT NULL
THEN
prod.disc_price
WHEN
orders.prod_var_id IS NOT NULL
THEN
prod_var.disc_price
END AS disc_price,
orders.quantity
FROM
@Product AS prod
RIGHT OUTER JOIN
@SubProduct AS prod_var
ON
prod.prod_id = prod_var.prod_id
LEFT OUTER JOIN
@SubProductVarKeyValue AS pv_kv
ON
prod_var.prod_var_id = pv_kv.prod_var_id
LEFT OUTER JOIN
@SubProductVarKeyNames AS pv_kn
ON
pv_kv.pv_key_name_id = pv_kn.pv_key_name_id
WHERE
prod.prod_id = orders.prod_id
OR prod_var.prod_var_id = orders.prod_var_id
FOR JSON PATH,
INCLUDE_NULL_VALUES
) AS oa_prod(nested_data) -- it's a syntax analyzer lint conflict ONLY
WHERE
orders.disable = 0
AND cnb.disable = 0
AND orders.user_id = @user_id
AND stat_ref.com_usd_wrds IN (SELECT status FROM @status_t)
ORDER BY
orders.dt_stamp DESC
--orders.batch_code ASC -- To prevent batch_code from being separated
FOR JSON PATH,
INCLUDE_NULL_VALUES
```
Unfortunately, when I beautify the JSON output, this is what it generates:

```json
[
{
"batch_code": null,
"nested_prod": [
{
"batch_code": null,
"order_id": "ORDER-2025-04-11-B71D0E2F5D8C",
"prod_name": "iPhone",
"key_name": "Condition",
"value": "new",
"disc_price": 45809,
"quantity": 1
}
],
"amount_to_pay": 45809
},
{
"batch_code": "BGUID-2025-04-11-6D81B58FAE94",
"nested_prod": [
{
"batch_code": "BGUID-2025-04-11-6D81B58FAE94",
"order_id": "ORDER-2025-04-11-D95EB033CA40",
"prod_name": "Samsung",
"key_name": "Condition",
"value": "new",
"disc_price": 40209,
"quantity": 2
}
],
"amount_to_pay": 146718
},
{
"batch_code": "BGUID-2025-04-11-6D81B58FAE94",
"nested_prod": [
{
"batch_code": "BGUID-2025-04-11-6D81B58FAE94",
"order_id": "ORDER-2025-04-11-7F04EFA2BB60",
"prod_name": "Nokia",
"key_name": null,
"value": null,
"disc_price": 33150,
"quantity": 2
}
],
"amount_to_pay": 146718
},
{
"batch_code": "BGUID-2025-04-10-20239FD2059F",
"nested_prod": [
{
"batch_code": "BGUID-2025-04-10-20239FD2059F",
"order_id": "ORDER-2025-04-10-3F03EAA47686",
"prod_name": "Google",
"key_name": null,
"value": null,
"disc_price": 53509,
"quantity": 1
}
],
"amount_to_pay": 79318
},
{
"batch_code": "BGUID-2025-04-10-20239FD2059F",
"nested_prod": [
{
"batch_code": "BGUID-2025-04-10-20239FD2059F",
"order_id": "ORDER-2025-04-10-F4A89E2C4A30",
"prod_name": "Sony",
"key_name": "Condition",
"value": "new",
"disc_price": 25809,
"quantity": 1
}
],
"amount_to_pay": 79318
},
{
"batch_code": null,
"nested_prod": [
{
"batch_code": null,
"order_id": "ORDER-2025-04-08-31BD887341FA",
"prod_name": "Lenovo",
"key_name": "Condition",
"value": "new",
"disc_price": 49100,
"quantity": 1
}
],
"amount_to_pay": 49100
}
]
```
However, I need the output to look something like this:

```json
[
{
"batch_code": null,
"nested_prod": [
{
"batch_code": null,
"order_id": "ORDER-2025-04-11-B71D0E2F5D8C",
"prod_name": "iPhone",
"key_name": "Condition",
"value": "new",
"disc_price": 45809,
"quantity": 1
}
],
"amount_to_pay": 45809
},
{
"batch_code": "BGUID-2025-04-11-6D81B58FAE94",
"nested_prod": [
{
"batch_code": "BGUID-2025-04-11-6D81B58FAE94",
"order_id": "ORDER-2025-04-11-D95EB033CA40",
"prod_name": "Samsung",
"key_name": "Condition",
"value": "new",
"disc_price": 40209,
"quantity": 2
},
{
"batch_code": "BGUID-2025-04-11-6D81B58FAE94",
"order_id": "ORDER-2025-04-11-7F04EFA2BB60",
"prod_name": "Nokia",
"key_name": null,
"value": null,
"disc_price": 33150,
"quantity": 2
}
],
"amount_to_pay": 146718
},
{
"batch_code": "BGUID-2025-04-10-20239FD2059F",
"nested_prod": [
{
"batch_code": "BGUID-2025-04-10-20239FD2059F",
"order_id": "ORDER-2025-04-10-3F03EAA47686",
"prod_name": "Google",
"key_name": null,
"value": null,
"disc_price": 53509,
"quantity": 1
},
{
"batch_code": "BGUID-2025-04-10-20239FD2059F",
"order_id": "ORDER-2025-04-10-F4A89E2C4A30",
"prod_name": "Sony",
"key_name": "Condition",
"value": "new",
"disc_price": 25809,
"quantity": 1
}
],
"amount_to_pay": 79318
},
{
"batch_code": null,
"nested_prod": [
{
"batch_code": null,
"order_id": "ORDER-2025-04-08-31BD887341FA",
"prod_name": "Lenovo",
"key_name": "Condition",
"value": "new",
"disc_price": 49100,
"quantity": 1
}
],
"amount_to_pay": 49100
}
]
```
Note: The `amount_to_pay` was pre-computed in the client app, so it should appear only once when duplicate `batch_code`s are grouped.
User orders can cause *variations* in the desirable JSON output shown above.
Is anyone familiar with the issue I'm dealing with?
As much as possible, I prefer not to implement the Common Table Expression
(CTE) approach.
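A hedged sketch of the grouping idea only, not a drop-in replacement for the full query (and using a derived table rather than a CTE): drive the outer `FOR JSON` from one row per batch, treating a `NULL` `batch_code` as its own single-order group, and build `nested_prod` with a correlated `FOR JSON` subquery. The inner subquery is deliberately trimmed to two columns; it would need the product joins from the original. Run in the same batch as the table-variable declarations above:

```sql
-- Group key: batch_code when present, otherwise the order's own id, so NULL batches stay separate.
SELECT g.batch_code,
       (SELECT o.order_id, o.quantity
        FROM @Orders AS o
        WHERE COALESCE(o.batch_code, o.order_id) = g.grp
        FOR JSON PATH, INCLUDE_NULL_VALUES) AS nested_prod,
       g.amount_to_pay
FROM (
    SELECT COALESCE(o.batch_code, o.order_id) AS grp,
           MAX(o.batch_code)      AS batch_code,
           MAX(p.amount_to_pay)   AS amount_to_pay, -- one value per batch, per the note above
           MAX(o.dt_stamp)        AS dt_stamp
    FROM @Orders AS o
    JOIN @Payment AS p ON p.order_id = o.order_id
    WHERE o.disable = 0 AND p.disable = 0
    GROUP BY COALESCE(o.batch_code, o.order_id)
) AS g
ORDER BY g.dt_stamp DESC
FOR JSON PATH, INCLUDE_NULL_VALUES;
```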
DevQt
(165 rep)
Apr 12, 2025, 02:31 AM
• Last activity: Apr 16, 2025, 02:05 AM
1 vote · 2 answers · 1338 views
Postgres row_to_json precision for numeric type
I am using the row_to_json function to convert a table row to a json object.
There are a few columns in the table that are of type `numeric` which contain float values.
When row_to_json returns the JSON, values like 9.98 become 9.98000000000000043 and 6.30 becomes 6.29999999999999982 in these numeric columns.
Is there a way to impose precision in the JSON creation function? There are no other data issues in the table affecting other modules.
Changing the column type to `numeric(12,2)` etc. is not possible, since the table has over 2 billion rows (the table would be locked for 30+ hours) and it would affect other production modules.
**Update:**
I should have made the question a little clearer with the use case.
row_to_json is used in a trigger to audit updates in the DB. The JSON result is stored in another table, and this same audit function is used for multiple tables (100+), so there may be other tables with this same issue.
One solution would be to handle it at the application level when retrieving the JSONs from the audit table, but I would like to see if there is a function that can handle it in the DB itself.
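One database-side sketch: round inside the conversion rather than changing the column type, by casting in a derived row before `row_to_json` sees it. The table and column names are hypothetical, and since a generic audit trigger doesn't know the columns, this only helps where they can be enumerated, so it may not fit the 100-table case:

```sql
-- Rounding via a typmod cast in a subquery; the table keeps its original definition.
SELECT row_to_json(t)
FROM (
  SELECT id, price::numeric(12,2) AS price, qty
  FROM items
) AS t;
```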
Min2j
(19 rep)
Jan 24, 2018, 12:50 PM
• Last activity: Apr 14, 2025, 11:10 PM