Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
0
votes
1
answer
62
views
Oracle - create table with numeric column with (90,2) as size
I have to populate a table in Oracle and I need numeric values with (90,2) as the size for some columns. As NUMBER is not a viable option (its precision is capped at 38), is there any way in which I can store this data in numeric columns? I do not want to use VARCHAR as the datatype for this column.
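For context, a minimal sketch of the limit and of one possible numeric workaround (the CHECK-constraint approach is an assumption, not something from the question):

```sql
-- Precision above 38 is rejected outright:
CREATE TABLE amounts_bad (amount NUMBER(90,2));
-- ORA-01727: numeric precision specifier is out of range (1 to 38)

-- One hedged alternative: an unconstrained NUMBER column (still numeric,
-- though limited to 38 significant digits) with a CHECK enforcing two
-- decimal places
CREATE TABLE amounts (
    amount NUMBER CONSTRAINT amount_two_dp CHECK (amount = ROUND(amount, 2))
);
```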
Mathew Linton
Mar 17, 2025, 10:00 PM
• Last activity: Mar 18, 2025, 03:36 PM
0
votes
1
answer
74
views
Explanation needed for my result
I am new to string handling in SQL Server and was wondering what makes the code below return the result it does. Could I please have a detailed explanation?
**My code:**
Select CAST(CAST(CAST('08' as nvarchar) as varbinary) as varchar) as [result]
**Output:**
| Result |
| ------ |
| 0      |
What is happening here as '08' is converted to nvarchar, then to varbinary, then to varchar?
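A byte-level sketch of the round trip (the intermediate literals are reconstructed here, not taken from the question):

```sql
-- N'08' is UTF-16LE, two bytes per character: '0' = 0x30 0x00, '8' = 0x38 0x00
SELECT CAST(N'08' AS varbinary(8));     -- 0x30003800

-- Casting those bytes to varchar reinterprets them one byte per character:
-- '0', NUL, '8', NUL. The embedded NUL (0x00) truncates display in most
-- clients, so only '0' shows up.
SELECT CAST(0x30003800 AS varchar(8));
```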
varun as
(11 rep)
Sep 21, 2024, 11:45 AM
• Last activity: Sep 28, 2024, 04:58 AM
0
votes
2
answers
431
views
How to convert nvarchar(max) or varchar(max) to XML in SQL Server
I am trying to convert msdb job messages to XML.
I am using Ola Hallengren's backup routines.
When I run the following routine it works amazingly and converts everything.
Now, if I change the number 3333 to 3702, it fails.
I cannot identify what is after character 3701 in the message, but it is preventing me from converting the full message to XML.
If you use Ola's routines you can test on your own system. Let me know if you find out how to change my function so that I can successfully convert nvarchar(max) or varchar(max) to xml.
create or alter FUNCTION dbo.fnCleanString (@InputString VARCHAR(MAX))
RETURNS VARCHAR(MAX)
AS
BEGIN
    DECLARE @CleanedString VARCHAR(MAX) = '';
    DECLARE @Char VARCHAR(1);
    DECLARE @Index INT = 1;
    DECLARE @Len INT = 3333; -- LEN(@InputString);
    WHILE @Index <= @Len
    BEGIN
        SET @Char = SUBSTRING(@InputString, @Index, 1);
        IF (ASCII(@Char) NOT BETWEEN 0 AND 31 OR ASCII(@Char) IN (9, 10, 13))
        BEGIN
            SET @CleanedString += @Char;
        END
        SET @Index += 1;
    END
    RETURN @CleanedString;
END
go
SELECT TOP 50
SysJobs.name,
SysJobs.enabled,
Job.message AS Message_Text,
TRY_CAST(
(SELECT dbo.fnCleanString(Job.message) AS [text()]
FOR XML PATH(''), TYPE
).value('.', 'NVARCHAR(MAX)') AS XML
) AS Message_XML
FROM msdb.dbo.sysjobhistory Job WITH (NOLOCK)
INNER JOIN msdb.dbo.sysjobs SysJobs WITH (NOLOCK)
ON Job.job_id = SysJobs.job_id
WHERE SysJobs.name = 'DatabaseBackup - USER_DATABASES - FULL'
ORDER BY Job.instance_id DESC;
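Before rewriting the function, it may help to see what actually sits at the failing offset. A hedged diagnostic sketch against the same tables (the offsets come from the question; the column aliases are made up):

```sql
-- Inspect the characters around position 3701 and the code point at 3702
SELECT TOP (1)
       SUBSTRING(Job.message, 3697, 10)         AS around_3701,
       UNICODE(SUBSTRING(Job.message, 3702, 1)) AS code_point_3702
FROM msdb.dbo.sysjobhistory AS Job
INNER JOIN msdb.dbo.sysjobs AS SysJobs
        ON Job.job_id = SysJobs.job_id
WHERE SysJobs.name = 'DatabaseBackup - USER_DATABASES - FULL'
ORDER BY Job.instance_id DESC;
```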
Marcello Miorelli
(17274 rep)
May 29, 2024, 09:04 PM
• Last activity: Jul 9, 2024, 03:35 PM
0
votes
2
answers
790
views
Converting ENUM and SET columns with accented characters from latin1 to utf8mb4 on MySQL
I am trying to convert tables from latin1 to utf8mb4 on MySQL 5.7.
Everything works fine except for tables with an ENUM or SET that contains a value with an accented character; two types of errors are generated.
For example:

CREATE TABLE `car` (
  ...
  `color` enum('Rouge','Bleu foncé',...) DEFAULT NULL,
  ...
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
This query does not work:

ALTER TABLE car CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;

#1265 - Data truncated for column 'color' at row 4
On other tables I get this error:
#1291 - Column 'origin' has duplicated value 'Cor?e' in ENUM
Global and session variables are:
+--------------------------+--------------------+
| Variable_name | Value |
+--------------------------+--------------------+
| character_set_client | utf8mb4 |
| character_set_connection | utf8mb4 |
| character_set_database | utf8mb4 |
| character_set_filesystem | binary |
| character_set_results | utf8mb4 |
| character_set_server | utf8mb4 |
| character_set_system | utf8 |
| collation_connection | utf8mb4_unicode_ci |
| collation_database | utf8mb4_unicode_ci |
| collation_server | utf8mb4_unicode_ci |
+--------------------------+--------------------+
PS: The current workaround (sketched below) is to:

1. convert the ENUMs/SETs to VARCHARs
2. convert the table charset
3. convert the VARCHARs back to ENUMs/SETs
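A minimal sketch of that three-step workaround, using the column from the question (the VARCHAR length and the trimmed value list are assumptions):

```sql
-- 1. ENUM -> VARCHAR, keeping the original charset
ALTER TABLE car MODIFY color VARCHAR(20) CHARACTER SET latin1;

-- 2. convert the whole table
ALTER TABLE car CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;

-- 3. VARCHAR -> ENUM, now declared in utf8mb4
ALTER TABLE car MODIFY color ENUM('Rouge','Bleu foncé')
    CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci DEFAULT NULL;
```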
Toto
(93 rep)
Mar 10, 2023, 02:27 AM
• Last activity: Mar 29, 2023, 11:19 PM
0
votes
1
answer
1127
views
Incorrect datetime str_to_date INSERT SELECT
I have never run into such an issue before. I was given a bunch of data and told to compile it into some tables and develop some reports. The users who gathered the data used various different date formats, which has never been an issue in the past for me. So I import all the data into some temp tables and write a SELECT query which converts the different date formats into yyyy-mm-dd dates using str_to_date() so they can be imported into a date column. I like what I see; all the data types and column names match, so I add the SELECT query to an INSERT statement and receive the following:
> Error Code: 1411. Incorrect datetime value: '2012-10-31' for function
> str_to_date.
The column being inserted into is a DATE data type, not a DATETIME. I did try to add the time part to these dates but received exactly the same error message. I have attempted everything I can possibly think of. I am about to export to a CSV file and import it just so I can move on with my life, but something isn't right. I have used functions like str_to_date and to_date (SQL Server) for a long time. Why does the SELECT query return without error or warning while the INSERT INTO ... SELECT throws this error?
I have changed sql_mode, added ALLOW_INVALID_DATES, removed NO_ZERO_DATE and NO_ZERO_IN_DATE. Hell, I set sql_mode = ''! I have looked for ASCII characters in the data. The crazy thing is, if I copy and paste the returned data from the SELECT query into str_to_date(datestring, '%m/%d/%Y') (or whatever the expected format is) and issue an INSERT, it works!
Here is the query that has me rethinking my life choices:
INSERT INTO leases_new (leases_new.id,
leases_new.payment_start_date
)
SELECT lease_import_data_view.id,
case
when UPPER(lease_import_data_view.lease_start_date) like '%EASE%'
then STR_TO_DATE('1899-01-01', '%Y-%m-%d')
when UPPER(lease_import_data_view.lease_start_date) like '%MOL%'
then STR_TO_DATE('2013-12-01', '%Y-%m-%d')
else case
when STR_TO_DATE(lease_import_data_view.lease_start_date, '%m/%d/%Y') is not null
then STR_TO_DATE(lease_import_data_view.lease_start_date, '%m/%d/%Y')
when STR_TO_DATE(lease_import_data_view.lease_start_date, '%c/%e/%Y') is not null
then STR_TO_DATE(lease_import_data_view.lease_start_date, '%c/%e/%Y')
when STR_TO_DATE(lease_import_data_view.lease_start_date, '%m%d%Y') is not null
then STR_TO_DATE(lease_import_data_view.lease_start_date, '%m%d%Y')
when STR_TO_DATE(lease_import_data_view.lease_start_date, '%Y/%m/%d') is not null
then STR_TO_DATE(lease_import_data_view.lease_start_date, '%Y/%m/%d')
when STR_TO_DATE(lease_import_data_view.lease_start_date, '%Y/%c/%e') is not null
then STR_TO_DATE(lease_import_data_view.lease_start_date, '%Y/%c/%e')
when STR_TO_DATE(lease_import_data_view.lease_start_date, '%Y%m%d') is not null
then STR_TO_DATE(lease_import_data_view.lease_start_date, '%Y%m%d')
when STR_TO_DATE(lease_import_data_view.lease_start_date, '%Y-%m-%d') is not null
then STR_TO_DATE(lease_import_data_view.lease_start_date, '%Y-%m-%d')
when STR_TO_DATE(lease_import_data_view.lease_start_date, '%m-%d-%Y') is not null
then STR_TO_DATE(lease_import_data_view.lease_start_date, '%m-%d-%Y')
else STR_TO_DATE('1899-01-01', '%Y-%m-%d')
end
end as payment_start_date
FROM lease_import_data_view
GROUP BY lease_import_data_view.id
Here is the current state of my SQL_MODE:
SELECT @@SQL_MODE;
'STRICT_TRANS_TABLES,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION,ALLOW_INVALID_DATES'
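My reading of the mechanism (an interpretation, not confirmed in the question): in a plain SELECT, a STR_TO_DATE parse failure merely returns NULL with warning 1411, but under STRICT_TRANS_TABLES that same condition is promoted to an error when it occurs while inserting, and the CASE above calls STR_TO_DATE with every format until one matches. A hedged sketch of both the symptom and a guard:

```sql
-- Plain SELECT: returns NULL plus warning 1411
SELECT STR_TO_DATE('2012-10-31', '%m/%d/%Y');
SHOW WARNINGS;  -- 1411: Incorrect datetime value: '2012-10-31' for function str_to_date

-- Guard each format with a cheap pattern test so STR_TO_DATE is only
-- called with a format that matches the input
SELECT CASE
         WHEN '2012-10-31' REGEXP '^[0-9]{4}-[0-9]{2}-[0-9]{2}$'
           THEN STR_TO_DATE('2012-10-31', '%Y-%m-%d')
       END AS parsed;
```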
I will admit, I am fighting off the flu right now so it is very possible I am overlooking something very obvious. Does anyone know what might be going on here? Help with this is greatly appreciated.
waltmagic
(141 rep)
Nov 9, 2022, 11:00 PM
• Last activity: Nov 10, 2022, 01:24 AM
1
vote
0
answers
1233
views
Why can't this aggregate function be converted with pg_upgrade from PostgreSQL 9.5 to PG 14.3?
I am trying to port an old PostgreSQL v9.5 database (or family of databases) to PostgreSQL v14.3 with the pg_upgrade utility.
After many hassles with the old v9.5 binaries and source files (they had to be recompiled to run on the much newer OS and libraries, Fedora 24 -> Fedora 36) and after adding several extensions from postgresql-contrib and from Git, I finally got pg_upgrade --check to complete, saying the databases are compatible.
postgres$ postgresql-setup --initdb
postgres$ pg_upgrade --check -b postgresql-9.5.7/bin -B /usr/bin -d data.old -D data
Performing Consistency Checks
-----------------------------
Checking cluster versions ok
Checking database user is the install user ok
Checking database connection settings ok
Checking for prepared transactions ok
Checking for system-defined composite types in user tables ok
Checking for reg* data types in user tables ok
Checking for contrib/isn with bigint-passing mismatch ok
Checking for user-defined encoding conversions ok
Checking for user-defined postfix operators ok
Checking for tables WITH OIDS ok
Checking for invalid "sql_identifier" user columns ok
Checking for invalid "unknown" user columns ok
Checking for hash indexes ok
Checking for roles starting with "pg_" ok
Checking for presence of required libraries ok
Checking database user is the install user ok
Checking for prepared transactions ok
Checking for new cluster tablespace directories ok
*Clusters are compatible*
But then, running pg_upgrade without --check, it stops here:
$ tail -12 pg_upgrade_dump_308771.log
pg_restore: creating AGGREGATE "foobardb.ads("anyelement")"
pg_restore: while PROCESSING TOC:
pg_restore: from TOC entry 1800; 1255 514033 AGGREGATE ads("anyelement") dbuser
pg_restore: error: could not execute query: ERROR: function array_append(anyarray, anyelement) does not exist
Command was: CREATE AGGREGATE "foobar"."ads"("anyelement") (
SFUNC = "array_append",
STYPE = "anyarray",
INITCOND = '{}',
FINALFUNC = "foobardb"."array_sort_unique"
);
The aggregate was created from the file ads.sql, which contains the function array_sort_unique(ANYARRAY), which takes any array, finds the unique values in it and orders them, and the aggregate function ads(ANYELEMENT), which can be used to aggregate items in GROUP BY queries, returning an ordered array of unique items.
-- Array_agg + unique/Distinct 'sort'
CREATE OR REPLACE FUNCTION array_sort_unique (ANYARRAY) RETURNS ANYARRAY
LANGUAGE SQL
AS $body$
SELECT ARRAY(
SELECT DISTINCT $1[s.i]
FROM generate_series(array_lower($1,1), array_upper($1,1)) AS s(i)
ORDER BY 1
);
$body$;
--https://www.postgresql.org/docs/9.5/xaggr.html
CREATE aggregate ads (ANYELEMENT) --RETURNS ANYARRAY
(
sfunc = array_append,
stype = anyarray,
initcond = '{}',
finalfunc = array_sort_unique
)
pg_upgrade seems to claim there is no array_append(anyarray, anyelement) function, but there is:
PGSQL: array_append(anyarray, anyelement)
And it was working without problems in PGSQL v9.5.
**Any idea why there is an error and how to fix it?** Why is PGSQL v9.5 happy while porting to PGSQL 14.3 fails?
Edit:
Is it because in newer PGSQL *array_append(anyarray, anyelement)* is now *array_append(anycompatiblearray, anycompatible)*? I guess.
PGSQL type system - polymorphic types
**How do I tell pg_upgrade to handle that? Or how do I tackle this problem?**
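If that guess is right, one hedged way forward (an assumption, not verified against the poster's cluster) is to drop the aggregate in the 9.5 cluster before running pg_upgrade, then recreate it on 14 against the new polymorphic family:

```sql
-- On PostgreSQL 14, array_append is (anycompatiblearray, anycompatible),
-- so the state type and the final function use the matching polymorphic types
CREATE OR REPLACE FUNCTION array_sort_unique(anycompatiblearray)
RETURNS anycompatiblearray
LANGUAGE sql
AS $body$
    SELECT ARRAY(SELECT DISTINCT u FROM unnest($1) AS u ORDER BY 1);
$body$;

CREATE AGGREGATE ads(anycompatible) (
    sfunc     = array_append,
    stype     = anycompatiblearray,
    initcond  = '{}',
    finalfunc = array_sort_unique
);
```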
zimon
(23 rep)
Aug 10, 2022, 09:30 AM
• Last activity: Aug 10, 2022, 09:55 AM
0
votes
0
answers
431
views
How to correctly convert from the datetime2 data type to smaller date types and eliminate the HH:mm:SS.000000 padding
I have created a table with a column of the date data type, which corresponds to date data in yyyy-MM-dd format.
The database presents it as if the data type were datetime2(7).
More generally, all the date data types (date, smalldatetime, datetime) are presented as datetime2(7), even datetime2 with less fractional-second precision.
I tried to execute the following query:
SELECT
    CAST('2007-05-08 12:35:29.1234567 +12:15' AS time(2)) AS 'time'
    ,CAST('2007-05-08 12:35:29.1234567 +12:15' AS date) AS 'date'
    ,CAST('2007-05-08 12:35:29.123' AS smalldatetime) AS 'smalldatetime'
    ,CAST('2007-05-08 12:35:29.123' AS datetime) AS 'datetime'
    ,CAST('2007-05-08 12:35:29.1234567 +12:15' AS datetime2(7)) AS 'datetime2'
It displayed the following table:
[screenshot of the query results]
Are there any management settings that restrict the date conversion or, alternatively, any more powerful conversion functions to display and store the data in the correct data types?
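One way to check that a column's declared type really is date, regardless of how the client grid renders it (a minimal sketch; the table name is made up):

```sql
-- sql_variant preserves the base type of the value it wraps, so this reports
-- the actual type independent of client-side display formatting
CREATE TABLE dt_check (d date);
INSERT INTO dt_check VALUES ('2007-05-08');

SELECT d,
       SQL_VARIANT_PROPERTY(CAST(d AS sql_variant), 'BaseType') AS base_type
FROM dt_check;  -- base_type = 'date', no time padding stored
```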
Zeevik.sha
(1 rep)
Aug 9, 2022, 11:25 AM
• Last activity: Aug 9, 2022, 11:28 AM
1
vote
1
answer
324
views
Datapump metadata-only import changes datalength when importing with conversion
A Datapump metadata-only import changes the data length when importing with conversion (the export was done in WE8ISO8859P15 and the import into the AL32UTF8 character set with the AL16UTF16 NCHAR character set).
Example table:
Source-SYSTEM (WE8ISO8859P15):
select OWNER,TABLE_NAME,COLUMN_NAME,DATA_LENGTH from dba_tab_columns where TABLE_NAME='STRING';
OWNER TABLE_NAME COLUMN_NAME DATA_LENGTH
--------------- -------------------- -------------------- -----------
SCHEMA_NAME STRING DE 2000
Target-SYSTEM (AL16UTF16):
select OWNER,TABLE_NAME,COLUMN_NAME,DATA_LENGTH from dba_tab_columns where TABLE_NAME='STRING';
OWNER TABLE_NAME COLUMN_NAME DATA_LENGTH
--------------- -------------------- -------------------- -----------
SCHEMA_NAME STRING DE 4000
The data length of VARCHAR2(2000 CHAR) has been changed to VARCHAR2(4000 CHAR) automatically. What is the rule behind this behavior? Is this documented behavior?
Edit: The issue seems to be related not to Datapump but to the conversion from a single-byte to a multi-byte character set:
Source-SYSTEM (WE8ISO8859P15):
create table test(name varchar2(50), name2 varchar2(5 char), name3 clob);
INSERT INTO test VALUES('Susanne','Test','Hi This is Row one');
select owner,TABLE_NAME, COLUMN_NAME, DATA_LENGTH, CHAR_LENGTH, data_type,char_used FROM ALL_TAB_COLUMNS where TABLE_NAME='TEST';
OWNER        TABLE_NAME  COLUMN_NAME  DATA_LENGTH  CHAR_LENGTH  DATA_TYPE  CHAR_USED
SCHEMA_NAME  TEST        NAME         50           50           VARCHAR2   B
SCHEMA_NAME  TEST        NAME2        5            5            VARCHAR2   C
SCHEMA_NAME  TEST        NAME3        4000         0            CLOB
Target-SYSTEM (AL16UTF16):
create table test(name varchar2(50), name2 varchar2(5 char), name3 clob);
INSERT INTO test VALUES('Susanne','Test','Hi This is Row one');
select owner,TABLE_NAME, COLUMN_NAME, DATA_LENGTH, CHAR_LENGTH, data_type,char_used FROM ALL_TAB_COLUMNS where TABLE_NAME='TEST';
OWNER        TABLE_NAME  COLUMN_NAME  DATA_LENGTH  CHAR_LENGTH  DATA_TYPE  CHAR_USED
SCHEMA_NAME  TEST        NAME         50           50           VARCHAR2   B
SCHEMA_NAME  TEST        NAME2        20           5            VARCHAR2   C
SCHEMA_NAME  TEST        NAME3        4000         0            CLOB
Note: In the old DB (character set WE8ISO8859P15),
DATA_LENGTH = 1 * CHAR_LENGTH for all columns of the VARCHAR2 data type (independent of CHAR_USED).
In the new DB (with the Unicode character set AL32UTF8),
DATA_LENGTH = 4 * CHAR_LENGTH (with a maximum of 4000) for all VARCHAR2 columns with CHAR_USED = C.
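A small sketch that reproduces the note above directly on an AL32UTF8 database (the table name is made up):

```sql
-- With CHAR length semantics, Oracle provisions up to 4 bytes per character
-- in AL32UTF8, so DATA_LENGTH = 4 * CHAR_LENGTH, capped at 4000 bytes
CREATE TABLE len_demo (
    c5 VARCHAR2(5 CHAR),
    b5 VARCHAR2(5 BYTE)
);

SELECT column_name, data_length, char_length, char_used
FROM   user_tab_columns
WHERE  table_name = 'LEN_DEMO';
-- Expected: C5 -> DATA_LENGTH 20, CHAR_USED C
--           B5 -> DATA_LENGTH  5, CHAR_USED B
```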
r0tt
(1078 rep)
Aug 27, 2021, 06:25 AM
• Last activity: Aug 27, 2021, 02:38 PM
2
votes
1
answer
1541
views
How do I convert a TIMESTAMP to a BIGINT in Azure Synapse
What's the best way to convert a TIMESTAMP to a BIGINT in Azure Synapse? I tried this (which works) but it seems clumsy. Is there a better, more concise and efficient way of doing the same thing?
SELECT CURRENT_TIMESTAMP AS CURR_TS,
CONVERT(BIGINT,
CONVERT(CHAR(8), CURRENT_TIMESTAMP, 112) +
REPLACE(CONVERT(CHAR(12), CURRENT_TIMESTAMP, 114), ':', '')) AS CURR_TS_NUM
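For comparison, two shorter forms, hedged: whether FORMAT is available depends on the Synapse pool type, so treat that part as an assumption:

```sql
-- Same yyyyMMddHHmmssfff encoding via FORMAT (if available on your pool)
SELECT CAST(FORMAT(CURRENT_TIMESTAMP, 'yyyyMMddHHmmssfff') AS BIGINT) AS CURR_TS_NUM;

-- A different but often handier encoding: seconds since the Unix epoch
SELECT CAST(DATEDIFF(SECOND, '19700101', CURRENT_TIMESTAMP) AS BIGINT) AS CURR_TS_EPOCH;
```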
Lauren_G
(69 rep)
Mar 9, 2021, 09:34 PM
• Last activity: Mar 10, 2021, 12:05 AM
0
votes
2
answers
2023
views
64 bit hexadecimal AS Date in SQL
I am loading a Qlikview CalData PGO XML file into a SQL table.
Dates in this XML are stored as a 64 bit HEX string and I need to convert it to a date.
How do I do that?
### Example
40e58f2c4153d0f9
should be converted to 18/11/2020 09:11:29
I have seen some sample code within QlikView, which is:
daystart(date($(HEX64CONVERT(ToBeDeleted))))
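Assuming, as the QlikView snippet suggests, that the hex string is the big-endian IEEE 754 image of a Qlik serial date (days since 1899-12-30, Excel-style), a SQL Server sketch of the conversion (the engine choice is an assumption, since the question does not name one):

```sql
-- Reinterpret the 8 bytes as a float (SQL Server's binary->float cast keeps
-- the bit pattern), then treat the value as days-plus-fraction since 1899-12-30
DECLARE @hex    varbinary(8) = 0x40E58F2C4153D0F9;
DECLARE @serial float        = CAST(@hex AS float);   -- ~44153.3829 for this value

SELECT DATEADD(SECOND,
               CAST(ROUND((@serial - FLOOR(@serial)) * 86400, 0) AS int),
               DATEADD(DAY, CAST(FLOOR(@serial) AS int), '18991230')) AS converted;
-- Expected: 2020-11-18 09:11:29
```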
mouliin
(530 rep)
Nov 25, 2020, 12:43 PM
• Last activity: Nov 25, 2020, 02:46 PM