Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
0 votes · 1 answer · 375 views
How to pass a specific day, month and year in the below query?
I have registration data in a table and I want to show the whole month. I found an easy solution with the query below, but I do not know how to pass a specific day, month and year to it.
SELECT AAA.date_field,
IFNULL(BBB.val,0) val
FROM
(
SELECT date_field
FROM
(
SELECT MAKEDATE(YEAR('2014-02-06'),1)
+ INTERVAL (MONTH('2014-02-06')-1) MONTH
+ INTERVAL daynum DAY date_field
FROM
(
SELECT t*10+u daynum FROM
(SELECT 0 t UNION SELECT 1 UNION SELECT 2 UNION SELECT 3) A,
(SELECT 0 u UNION SELECT 1 UNION SELECT 2 UNION SELECT 3
UNION SELECT 4 UNION SELECT 5 UNION SELECT 6 UNION SELECT 7
UNION SELECT 8 UNION SELECT 9) B ORDER BY daynum
) AA
) AA WHERE MONTH(date_field) = MONTH('2014-02-06')
) AAA LEFT JOIN (SELECT date_field,val FROM school) BBB USING (date_field)
Can you please help me with how to pass the day, month and year in the above query?
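One way to avoid repeating the literal date (a sketch, assuming MySQL user variables; the variable name @d is my own) is to set it once and reference it everywhere the literal appeared:

```sql
-- Set the anchor date once; every '2014-02-06' in the original becomes @d.
SET @d := '2014-02-06';

SELECT AAA.date_field, IFNULL(BBB.val, 0) AS val
FROM (
  SELECT date_field
  FROM (
    SELECT MAKEDATE(YEAR(@d), 1)
           + INTERVAL (MONTH(@d) - 1) MONTH
           + INTERVAL daynum DAY AS date_field
    FROM (
      SELECT t * 10 + u AS daynum
      FROM (SELECT 0 t UNION SELECT 1 UNION SELECT 2 UNION SELECT 3) A,
           (SELECT 0 u UNION SELECT 1 UNION SELECT 2 UNION SELECT 3
            UNION SELECT 4 UNION SELECT 5 UNION SELECT 6 UNION SELECT 7
            UNION SELECT 8 UNION SELECT 9) B
    ) AA
  ) AA
  WHERE MONTH(date_field) = MONTH(@d)
) AAA
LEFT JOIN (SELECT date_field, val FROM school) BBB USING (date_field);
```

From application code, the same idea applies with a bound parameter in place of @d.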
Prashant Barve
(101 rep)
Aug 6, 2016, 11:02 AM
• Last activity: Aug 4, 2025, 03:05 PM
-1 votes · 1 answer · 703 views
Select different dates from mysql
I have a database structure like this
ID | title | startTime | endTime
And what I need is to get 3 records from the database, where the first record follows DAY(NOW())-1, the second record has to be like DAY(NOW()) or the first upcoming record, and the third has to be like DAY(NOW())+1.
What I was able to do is this:
SELECT DISTINCT title, FROM_UNIXTIME(startTime) as start FROM tl_calendar_events
WHERE MONTH(FROM_UNIXTIME(startTime)) = MONTH(NOW())
AND DAY(FROM_UNIXTIME(startTime)) >= DAY(NOW())
ORDER BY DAY(FROM_UNIXTIME(startTime)) ASC LIMIT 3;
But this will return the first 3 upcoming records. The startTime and endTime columns are both Unix timestamp fields. So, if the current date is 06-12-2015, the final output of this query would look something like this:
1. Record #1 (startTime - 10-11-2015)
2. Record #2 (startTime - 06-12-2015)
3. Record #3 (startTime - 12-01-2016)
Also, I need some sort of DISTINCT select so that if I happen to have 3 records on the same day I only select one. I figured I could do this with 3 joins but just ended up creating a lot of mess. I hope you understood my question and can provide some help.
**SAMPLE DATA:**
- Event #1 (04-11-2015)
- Event #2 (06-12-2015)
- Event #3 (10-12-2015)
- Event #4 (12-03-2016)
- Event #5 (17-04-2016)
**SAMPLE OUTPUT**
- EVENT #2 (as it is the first one before the current date)
- EVENT #3 (as it is the current date or the first closest match)
- EVENT #4 (as it is the first one to come after event #3)
I hope this will help explain my situation much better.
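A hedged sketch of one possible reading (assuming MySQL and the table/column names from the question): combine three parenthesized subqueries with UNION ALL — the last event before today, the first event from today onward, and the one after that. Same-day duplicates would still need collapsing, e.g. by grouping on DATE(FROM_UNIXTIME(startTime)) first.

```sql
(SELECT title, FROM_UNIXTIME(startTime) AS start   -- last event before today
 FROM tl_calendar_events
 WHERE FROM_UNIXTIME(startTime) < CURDATE()
 ORDER BY startTime DESC LIMIT 1)
UNION ALL
(SELECT title, FROM_UNIXTIME(startTime)            -- first event today or later
 FROM tl_calendar_events
 WHERE FROM_UNIXTIME(startTime) >= CURDATE()
 ORDER BY startTime ASC LIMIT 1)
UNION ALL
(SELECT title, FROM_UNIXTIME(startTime)            -- the next one after that
 FROM tl_calendar_events
 WHERE FROM_UNIXTIME(startTime) >= CURDATE()
 ORDER BY startTime ASC LIMIT 1 OFFSET 1);
```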
Igor Ilic
(107 rep)
Dec 6, 2015, 03:12 PM
• Last activity: Jul 22, 2025, 09:09 AM
2 votes · 1 answer · 147 views
Get minute of the day in MySQL
I'm using this to get the minute of the day from a datetime field:
(HOUR(datetime) * 60) + MINUTE(datetime)
It works, but I am concerned there may be edge cases where it returns the wrong value. Is there any built-in or more bulletproof function, like DAYOFYEAR, that will get me this value?
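MySQL has no dedicated minute-of-day function, but an equivalent single-call expression is TIME_TO_SEC on the time part, integer-divided by 60 (a sketch; `t` and `datetime_col` are hypothetical names):

```sql
-- Seconds since midnight, divided down to whole minutes of the day.
SELECT TIME_TO_SEC(TIME(datetime_col)) DIV 60 AS minute_of_day
FROM t;
```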
You Old Fool
(790 rep)
Jul 11, 2025, 06:15 PM
• Last activity: Jul 11, 2025, 07:19 PM
-2 votes · 2 answers · 87 views
Dayofweek vs weekday
Why does MariaDb use these terms differently?
#drop table dbstructA;
create table dbstructA(datum date);
insert into dbstructA values('2025-01-01');
insert into dbstructA values('2025-01-02');
insert into dbstructA values('2025-01-03');
insert into dbstructA values('2025-01-04');
insert into dbstructA values('2025-01-05');
insert into dbstructA values('2025-01-06');
insert into dbstructA values('2025-01-07');
select date_format(datum,'%a') as day,
dayofweek(datum) as daynum,
weekday(datum) as numday
from dbstructA where datum>='2025-01-01';
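The two functions simply follow different numbering conventions (per the MySQL/MariaDB documentation): DAYOFWEEK uses the ODBC standard, while WEEKDAY is zero-based starting on Monday.

```sql
-- DAYOFWEEK: 1 = Sunday .. 7 = Saturday (ODBC convention)
-- WEEKDAY:   0 = Monday .. 6 = Sunday
SELECT DAYOFWEEK('2025-01-01') AS daynum,  -- Wednesday -> 4
       WEEKDAY('2025-01-01')   AS numday;  -- Wednesday -> 2
```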
MBE
(91 rep)
Jul 4, 2025, 07:42 PM
• Last activity: Jul 5, 2025, 07:52 AM
0 votes · 1 answer · 208 views
Compare table datetime, SQL & SSIS
I am trying to add a step to my SSIS job that checks a table (this table has only one column, a datetime that gets updated daily with a datetime from another table) to see whether the value is greater than midnight or not. How do I set this up with the appropriate query?
This is the table name - ETLTimeCheck
This is the only column in this table- EXEC_END_TIME
I started out with this idea:
SELECT...
FROM [dbo].[ETLTimeCheck]
WHERE [EXEC_END_TIME] > DATEADD(D,0,DATEDIFF(D,0,GETDATE()))
This is supposed to return midnight -> DATEADD(D,0,DATEDIFF(D,0,GETDATE()))
I know there must be an 'IF' statement inside my query.
Can you please help me finish this?
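A sketch of one way to return a flag the SSIS step can branch on (table/column names from the question; the flag alias is my own): rather than an IF statement, a CASE expression returns 1 or 0 directly.

```sql
-- 1 if EXEC_END_TIME is after today's midnight, else 0.
SELECT CASE WHEN [EXEC_END_TIME] > DATEADD(DAY, DATEDIFF(DAY, 0, GETDATE()), 0)
            THEN 1 ELSE 0 END AS RanAfterMidnight
FROM [dbo].[ETLTimeCheck];
```

An Execute SQL Task can capture this single value into an SSIS variable and use it in a precedence constraint.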
WhoIsNotActive
(13 rep)
Oct 20, 2022, 09:13 PM
• Last activity: Jun 22, 2025, 07:06 AM
3 votes · 1 answer · 21077 views
MySQL - Truncated incorrect datetime value on update statement
I have an application which produces log files. These log files include a Timestamp field in the format 2015-08-25T09:35:01 UTC.
(There will be approximately 60 logs produced at 0.25 GB per day.)
I need to import these log files into MySQL for analysis. But I have a problem converting the Timestamp to DateTime.
example:
example:
CREATE TABLE test1 (
  `TIMESTAMP` varchar(25) DEFAULT NULL,
  `EVENT_TIME` datetime DEFAULT NULL
);
INSERT INTO test1 (`TIMESTAMP`)
VALUES
('2015-08-25T09:35:01 UTC'),
('2015-08-25T09:36:01 UTC'),
('2015-08-25T09:37:01 UTC'),
('2015-08-25T09:38:01 UTC'),
('2015-08-25T09:39:01 UTC');
So far so good. I can now run a SELECT query to get the datetime:
SELECT CAST(`TIMESTAMP` AS datetime) FROM test1;
But, if I try to update the table with the datetime format I get an error:
UPDATE test1 SET `EVENT_TIME` = CAST(`TIMESTAMP` AS datetime);
Error Code: 1292. Truncated incorrect datetime value: '2015-08-25T09:35:01 UTC'
Is there a way to do this? I really need the datetime field in the database, so I don't have to do the CAST every time I run a query.
I was also looking to partition the table by date, as there will be a lot of data produced, and so I only want to keep the minimum amount of data, and then drop the oldest partitions once I am done.
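The usual fix for this kind of string is to parse it explicitly with STR_TO_DATE rather than relying on an implicit cast — the trailing " UTC" is consumed by including it as a literal in the format string (a sketch against the test1 table above):

```sql
-- Parse the ISO-ish string explicitly; ' UTC' is matched as a literal suffix.
UPDATE test1
SET `EVENT_TIME` = STR_TO_DATE(`TIMESTAMP`, '%Y-%m-%dT%H:%i:%s UTC');
```

Note this stores the wall-clock value as-is; any UTC-to-local conversion would still be a separate step.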
IGGt
(2276 rep)
Sep 1, 2015, 09:57 AM
• Last activity: Jun 13, 2025, 12:03 PM
3 votes · 3 answers · 83 views
Ensure only one association record with future expiry date in Postgres
#### The Schema
Say I have the following:
table user
id bigint
table promo_codes
user_id bigint
code string
expires_at datetime NOT NULL
invalidated_at datetime
#### The Need
Each user can have only one promo code that expires in the future AND is not deactivated, i.e. expires_at > NOW() AND invalidated_at IS NULL.
#### What I tried
1. I tried adding a partial unique index:
CREATE UNIQUE INDEX only_one_active_promo
ON promo_codes (user_id)
WHERE expires_at > NOW() AND invalidated_at IS NULL
but got:
> PG::InvalidObjectDefinition: ERROR: functions in index predicate must be marked IMMUTABLE
That's because NOW(), CURRENT_TIMESTAMP and pretty much any other date/time function I'm aware of is STABLE at best, but none of them is IMMUTABLE, which is a requirement for creating an index.
2. I looked into adding a check constraint instead. It's okay if the check is only done during promo_code creation, but I didn't figure out how to do that.
3. Rules are another option that I considered, but I have too little knowledge there and am not sure this is the right path actually.
4. At the time of writing this question I'm looking into exclusion constraints which is totally new to me.
Using Postgres 16.3
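A common workaround (a sketch, not a drop-in answer): keep the index predicate immutable by indexing only on invalidated_at IS NULL, and treat expiry as something checked at read time — or have the application explicitly invalidate codes once they expire.

```sql
-- At most one non-invalidated code per user. Expiry is deliberately left out
-- of the predicate, because the predicate must be IMMUTABLE.
CREATE UNIQUE INDEX only_one_active_promo
ON promo_codes (user_id)
WHERE invalidated_at IS NULL;
```

The trade-off: an expired-but-not-invalidated code still blocks a new one until it is marked invalidated, which is stricter than the stated rule.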
Tamer Shlash
(145 rep)
May 25, 2025, 03:54 PM
• Last activity: May 27, 2025, 11:43 PM
0 votes · 1 answer · 519 views
select time intervals from a nested query
I have a table which has the following columns:
id, start_timestamp, device_id
I am trying to get results of all start_timestamps within 5 seconds of a range of generated periodic timestamps.
For example, generating a series of timestamps every 80 seconds, I need to get all 'start_timestamps' that fall within 5 seconds of each of those timestamps.
So something like this query, but I can't figure out the actual query. CTEs might be too expensive, as the table has millions of records.
Appreciate any help, thanks!
```
SELECT *
FROM time_intersections
WHERE start_timestamp **within 5 seconds of each of these:**
(
SELECT *
FROM generate_series(timestamp '2021-03-07',
timestamp '2021-08-16',
interval '80 sec')
)
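A sketch of a sargable formulation (assuming PostgreSQL, per the generate_series in the question): join the series to the table with a BETWEEN band, so an index on start_timestamp can be used for each generated tick.

```sql
SELECT t.*
FROM generate_series(timestamp '2021-03-07',
                     timestamp '2021-08-16',
                     interval '80 sec') AS g(ts)
JOIN time_intersections t
  ON t.start_timestamp BETWEEN g.ts - interval '5 sec'
                           AND g.ts + interval '5 sec';
```

Since the band (10 s) is much narrower than the step (80 s), no row can match two ticks, so no deduplication is needed.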
roy naufal
(187 rep)
Sep 23, 2021, 04:13 PM
• Last activity: Mar 16, 2025, 06:04 PM
5 votes · 1 answer · 10794 views
How to query dates in different timezones?
I have a table and index in a PostgreSQL 10.18 database:
CREATE TABLE some_table (
expires_at timestamptz
);
CREATE INDEX ON some_table(expires_at);
Is there a way to write this query so that it uses the index on expires_at?
SELECT
*
FROM some_table
WHERE
TIMEZONE('America/New_York', expires_at)::date
< TIMEZONE('America/New_York', NOW())::date
LIMIT 5;
America/New_York is just an example; this query is run with different time zones.
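One sargable rewrite (a sketch): instead of converting every stored expires_at, convert the boundary once — compute today's midnight in the target zone, turn it back into a timestamptz, and compare the raw column against it, which lets the plain index on expires_at be used.

```sql
SELECT *
FROM some_table
WHERE expires_at < timezone('America/New_York',             -- local midnight back to timestamptz
                            date_trunc('day',
                              timezone('America/New_York', now())))  -- now() as NY wall-clock time
LIMIT 5;
```

"expires_at's NY date is before today's NY date" is equivalent to "expires_at is before the start of today in NY", which is what this predicate expresses.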
ffox003
(305 rep)
May 20, 2022, 04:48 PM
• Last activity: Mar 14, 2025, 04:21 AM
3 votes · 3 answers · 3082 views
Is timestamptz preferred when timezone logic is performed by the client?
I'm in the process of migrating a web application from SqlServer to PostgreSQL and I'm trying to figure out which type to replace datetime2 with.
The general advice seems to be: always use timestamptz, and never use timestamp. The reasons given tend to be along the lines that timestamp and timestamptz are stored the same regardless (so no performance penalty) and that timestamptz auto-converts to the timezone of the connection. Ignoring timezones altogether in Rails and PostgreSQL | Stack Overflow
Unfortunately my legacy .NET codebase is very inconsistent with datetimes and we usually render in UTC regardless of the user's timezone. More recent code has been using NodaTime and its Instant class, but we rarely have to deal with times, and displaying just the date has been "close enough". My understanding of using NodaTime properly, however, is to convert an Instant to a LocalDateTime as late as possible - and not in the database.
In addition to this, I'm not entirely sure how Postgres knows the correct timezone of the "current user". I know you can set the timezone specifically as a session parameter with SET TIME ZONE 'UTC'; are you expected to do this for every connection, as appropriate for the "current user"? If so, is this reset whenever the connection is retrieved from the connection pool? I also see that Npgsql has the ability to set a timezone in the connection string; presumably this isn't appropriate if it's per user?
All this leads me to think the best option is to use timestamp for all datetimes and use application logic to convert to local datetime. I guess another option is to use timestamptz for all datetimes, force the connection to use UTC in the connection string, and use application logic to convert to local datetime. However, I worry that Postgres will perform extra work doing a no-op conversion between UTC and UTC.
TLDR: Is timestamptz still preferred if the application always inserts/reads UTC and converts to local datetime itself?
berkeleybross
(175 rep)
Apr 8, 2018, 06:11 PM
• Last activity: Mar 3, 2025, 04:03 PM
1 vote · 1 answer · 986 views
Weed out certain rows where timestamp is within interval
Given a table structure:
create table contacts (
  contact_id int(8) not null auto_increment,  -- primary key
  pid int(8) not null,                        -- personal id
  contakt_time datetime,                      -- time of contact
  primary key (contact_id)
)
Where each person can have multiple contacts with the system at any point in time.
What is the best way to filter out contacts by a given person in a certain time frame?
E.g. if:
SELECT * FROM contacts where pid = 1201;
yields:
contact pid contact_time
-----------------------------------
10530 1201 2014-01-04 14:33:00
11475 1201 2014-01-13 18:16:00
13460 1201 2014-02-03 12:23:00
14220 1201 2014-02-10 16:55:10
15599 1201 2014-02-24 21:03:34
16488 1201 2014-03-06 12:45:00
16499 1201 2014-03-06 14:55:00
16900 1201 2014-03-10 11:18:00
19108 1201 2014-04-02 10:40:00
20954 1201 2014-04-22 17:24:00
I would need to remove all contacts except the first one in a given time window - for instance, all contacts happening within a 5-hour window after each contact. In the example above, the result would no longer include contact #16499, as it is too close to contact #16488 (< 5 hours apart).
Thank you for any help
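If "within 5 hours of any earlier contact" is an acceptable reading (the strict "first of each chain" interpretation needs an iterative approach), a NOT EXISTS anti-join is a simple sketch in MySQL:

```sql
-- Keep a contact only if no earlier contact for the same person
-- falls within the preceding 5 hours.
SELECT c.*
FROM contacts c
WHERE c.pid = 1201
  AND NOT EXISTS (
    SELECT 1
    FROM contacts prev
    WHERE prev.pid = c.pid
      AND prev.contakt_time < c.contakt_time
      AND prev.contakt_time > c.contakt_time - INTERVAL 5 HOUR
  );
```

This matches the example above (#16499 is dropped because #16488 precedes it by under 5 hours), but chains of closely spaced contacts behave differently under the two interpretations.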
tschmitty
(11 rep)
Jul 6, 2015, 11:25 AM
• Last activity: Feb 16, 2025, 12:01 PM
2 votes · 2 answers · 1074 views
Calculate Total Working Minutes in MySql
I have designed a table for attendance and here are the general fields:
1. AttendanceMachineLoginId
2. EmpId
3. AttendanceDateTime
Whenever a user comes to the office, he has to record his attendance.
The **first** attendance is considered the **login** and the **second** is considered the **logout**. **Each time, a record is added with the time.** A user can make **multiple entries** (login/logout) in a single day - for example, EmpId 81 has logged in and out two times in the same day.
Now, my aim is to generate a per-day employee report of how many minutes each employee has given to the company. I have just learned that TIMESTAMPDIFF() can provide the minutes, but I am unable to understand how I can apply it to my table. Additionally, I want to ask: are the **table fields** right for the desired report, or do I need to change them?
**Another strategy:** I was also thinking that I should add a minutes column to the table and, whenever a user logs out, calculate the minutes and store them with the logout entry.
**Sample Data:**
INSERT INTO attendancemachinelogin
  (AttendanceMachineLoginId, EmpId, TimeTrackId, AttendanceDateTime, RecordAddDateTime)
VALUES
(0, 81, 315079, '2018-8-15 14:8:46', '2018-08-15 14:09:25'),
(0, 81, 315079, '2018-8-15 14:20:38', '2018-08-15 14:21:17'),
(0, 81, 315079, '2018-8-15 14:21:9', '2018-08-15 14:21:47'),
(0, 81, 315079, '2018-8-15 14:28:37', '2018-08-15 14:29:16'),
(0, 81, 315079, '2018-8-15 14:28:58', '2018-08-15 14:29:36'),
(0, 81, 315079, '2018-8-15 14:36:42', '2018-08-15 14:37:21'),
(0, 81, 315079, '2018-8-15 15:36:34', '2018-08-15 15:37:13'),
(0, 81, 315079, '2018-8-15 15:52:39', '2018-08-15 15:53:17'),
(0, 81, 315079, '2018-8-15 16:5:38', '2018-08-15 16:06:17'),
(0, 81, 315079, '2018-8-15 16:6:50', '2018-08-15 16:07:29'),
(0, 81, 315079, '2018-8-15 16:8:49', '2018-08-15 16:09:29'),
(0, 81, 315079, '2018-8-15 16:18:28', '2018-08-15 16:19:08'),
(0, 81, 315079, '2018-8-15 16:20:49', '2018-08-15 16:21:28'),
(0, 81, 315079, '2018-8-15 16:23:18', '2018-08-15 16:23:58'),
(0, 81, 315079, '2018-8-15 16:24:3', '2018-08-15 16:24:42'),
(0, 81, 315079, '2018-8-15 16:24:47', '2018-08-15 16:25:26'),
(0, 81, 315079, '2018-8-15 16:24:58', '2018-08-15 16:25:37'),
(0, 81, 315079, '2018-8-15 16:25:54', '2018-08-15 16:26:33'),
(0, 81, 315079, '2018-8-15 16:56:47', '2018-08-15 16:57:27'),
(0, 101, 417092, '2018-8-15 17:37:53', '2018-08-15 17:38:32'),
(0, 101, 417092, '2018-8-15 18:4:34', '2018-08-15 18:05:14'),
(0, 101, 417092, '2018-8-15 18:7:43', '2018-08-15 18:08:22'),
(0, 81, 315079, '2018-8-15 18:13:15', '2018-08-15 18:13:54'),
(0, 81, 315079, '2018-8-17 10:50:16', '2018-08-17 10:50:54'),
(0, 101, 417092, '2018-8-17 10:51:54', '2018-08-17 10:52:31'),
(0, 4, 413034, '2018-8-17 11:45:16', '2018-08-17 11:45:54'),
(0, 91, 916086, '2018-8-17 11:59:34', '2018-08-17 12:00:12'),
(0, 81, 315079, '2018-8-17 12:0:19', '2018-08-17 12:00:56'),
(0, 81, 315079, '2018-8-17 15:7:41', '2018-08-17 15:08:17'),
(0, 101, 417092, '2018-8-17 15:9:54', '2018-08-17 15:10:32'),
(0, 101, 417092, '2018-8-17 15:10:9', '2018-08-17 15:10:45'),
(0, 101, 417092, '2018-8-17 15:10:23', '2018-08-17 15:10:59'),
(0, 101, 417092, '2018-8-17 15:10:25', '2018-08-17 15:11:02'),
(0, 101, 417092, '2018-8-17 15:11:6', '2018-08-17 15:11:43'),
(0, 101, 417092, '2018-8-17 15:11:15', '2018-08-17 15:11:52'),
(0, 101, 417092, '2018-8-17 15:11:17', '2018-08-17 15:11:54'),
(0, 81, 315079, '2018-8-17 15:11:32', '2018-08-17 15:12:09'),
(0, 81, 315079, '2018-8-17 15:12:32', '2018-08-17 15:13:09'),
(0, 81, 315079, '2018-8-17 15:35:33', '2018-08-17 15:36:10'),
(0, 81, 315079, '2018-8-17 15:41:58', '2018-08-17 15:42:34'),
(0, 81, 315079, '2018-8-17 15:42:17', '2018-08-17 15:42:54'),
(0, 81, 315079, '2018-8-17 16:8:25', '2018-08-17 16:09:01'),
(0, 81, 315079, '2018-8-17 16:8:32', '2018-08-17 16:09:08'),
(0, 101, 417092, '2018-8-17 16:8:53', '2018-08-17 16:09:30'),
(0, 101, 417092, '2018-8-17 16:9:20', '2018-08-17 16:09:57'),
(0, 4, 413034, '2018-8-17 16:10:16', '2018-08-17 16:10:53'),
(0, 36, 413037, '2018-8-17 16:10:46', '2018-08-17 16:11:23'),
(0, 81, 315079, '2018-8-17 16:22:21', '2018-08-17 16:22:58'),
(0, 101, 417092, '2018-8-17 16:22:45', '2018-08-17 16:23:21'),
(0, 4, 413034, '2018-8-17 16:23:12', '2018-08-17 16:23:49'),
(0, 81, 315079, '2018-8-17 16:23:35', '2018-08-17 16:24:12'),
(0, 81, 315079, '2018-8-17 16:44:4', '2018-08-17 16:44:42'),
(0, 101, 417092, '2018-8-17 16:44:22', '2018-08-17 16:44:58'),
(0, 81, 315079, '2018-8-17 17:6:51', '2018-08-17 17:07:28'),
(0, 101, 417092, '2018-8-17 17:7:8', '2018-08-17 17:07:45'),
(0, 4, 413034, '2018-8-17 17:7:52', '2018-08-17 17:08:28'),
(0, 81, 315079, '2018-8-17 17:9:25', '2018-08-17 17:10:02'),
(0, 101, 417092, '2018-8-17 17:9:46', '2018-08-17 17:10:22'),
(0, 4, 413034, '2018-8-17 17:10:6', '2018-08-17 17:10:42'),
(0, 81, 315079, '2018-8-17 17:10:24', '2018-08-17 17:11:01'),
(0, 81, 315079, '2018-8-17 17:10:39', '2018-08-17 17:11:15'),
(0, 101, 417092, '2018-8-17 17:10:47', '2018-08-17 17:11:24'),
(0, 101, 417092, '2018-8-17 17:10:58', '2018-08-17 17:11:35'),
(0, 81, 315079, '2018-8-17 17:11:10', '2018-08-17 17:11:46'),
(0, 101, 417092, '2018-8-17 17:11:31', '2018-08-17 17:12:09'),
(0, 4, 413034, '2018-8-17 17:40:40', '2018-08-17 17:41:18'),
(0, 101, 417092, '2018-8-17 17:41:23', '2018-08-17 17:41:59'),
(0, 36, 413037, '2018-8-17 17:41:37', '2018-08-17 17:42:14'),
(0, 81, 315079, '2018-8-17 17:42:9', '2018-08-17 17:42:45'),
(0, 3, 213020, '2018-8-17 17:47:34', '2018-08-17 17:48:11'),
(0, 81, 315079, '2018-8-17 17:48:16', '2018-08-17 17:48:52'),
(0, 4, 413034, '2018-8-17 17:48:59', '2018-08-17 17:49:36'),
(0, 4, 413034, '2018-8-17 17:49:59', '2018-08-17 17:50:36'),
(0, 36, 413037, '2018-8-17 17:52:36', '2018-08-17 17:53:13'),
(0, 101, 417092, '2018-8-17 17:52:53', '2018-08-17 17:53:29'),
(0, 6, 213016, '2018-8-17 17:53:30', '2018-08-17 17:54:06'),
(0, 81, 315079, '2018-8-17 17:53:44', '2018-08-17 17:54:20'),
(0, 4, 413034, '2018-8-17 17:54:27', '2018-08-17 17:55:03'),
(0, 3, 213020, '2018-8-17 17:54:49', '2018-08-17 17:55:27'),
(0, 4, 413034, '2018-8-17 17:55:23', '2018-08-17 17:56:00'),
(0, 36, 413037, '2018-8-17 17:58:33', '2018-08-17 17:59:10'),
(0, 101, 417092, '2018-8-17 17:58:47', '2018-08-17 17:59:24'),
(0, 102, 517094, '2018-8-17 17:59:4', '2018-08-17 17:59:40'),
(0, 81, 315079, '2018-8-17 17:59:33', '2018-08-17 18:00:09'),
(0, 4, 413034, '2018-8-17 18:0:16', '2018-08-17 18:00:52'),
(0, 3, 213020, '2018-8-17 18:0:40', '2018-08-17 18:01:17'),
(0, 6, 213016, '2018-8-17 18:1:30', '2018-08-17 18:02:06'),
(0, 36, 413037, '2018-8-17 18:26:24', '2018-08-17 18:27:01'),
(0, 101, 417092, '2018-8-17 18:26:38', '2018-08-17 18:27:14'),
(0, 6, 213016, '2018-8-17 18:27:9', '2018-08-17 18:27:45'),
(0, 81, 315079, '2018-8-17 18:27:24', '2018-08-17 18:28:00'),
(0, 102, 517094, '2018-8-17 18:27:38', '2018-08-17 18:28:14'),
(0, 4, 413034, '2018-8-17 18:28:13', '2018-08-17 18:28:49'),
(0, 81, 315079, '2018-8-17 19:36:49', '2018-08-17 19:37:26'),
(0, 101, 417092, '2018-8-17 19:37:17', '2018-08-17 19:37:54'),
(0, 102, 517094, '2018-8-17 19:37:30', '2018-08-17 19:38:07'),
(0, 36, 413037, '2018-8-17 19:38:13', '2018-08-17 19:38:50'),
(0, 4, 413034, '2018-8-17 19:38:54', '2018-08-17 19:39:32'),
(0, 3, 213020, '2018-8-17 19:39:58', '2018-08-17 19:40:35'),
(0, 101, 417092, '2018-8-18 10:21:26', '2018-08-18 10:22:03'),
(0, 81, 315079, '2018-8-18 10:30:23', '2018-08-18 10:31:09'),
(0, 4, 413034, '2018-8-18 10:31:46', '2018-08-18 10:32:27'),
(0, 102, 517094, '2018-8-18 10:32:15', '2018-08-18 10:32:53'),
(0, 6, 213016, '2018-8-18 10:32:44', '2018-08-18 10:33:22'),
(0, 3, 213020, '2018-8-18 10:33:23', '2018-08-18 10:34:03'),
(0, 81, 315079, '2018-8-18 10:42:49', '2018-08-18 10:43:27'),
(0, 101, 417092, '2018-8-18 10:43:25', '2018-08-18 10:44:03'),
(0, 81, 315079, '2018-8-18 10:48:51', '2018-08-18 10:49:30'),
(0, 102, 517094, '2018-8-18 10:49:9', '2018-08-18 10:49:49'),
(0, 81, 315079, '2018-8-18 10:56:46', '2018-08-18 10:57:25'),
(0, 1, 1211003, '2018-8-18 10:57:0', '2018-08-18 10:57:38'),
(0, 4, 413034, '2018-8-18 10:57:51', '2018-08-18 10:58:38'),
(0, 3, 213020, '2018-8-18 10:58:43', '2018-08-18 10:59:26');
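Assuming MySQL 8+ (for window functions) and the stated convention that odd-numbered rows per employee per day are logins and the following even-numbered rows are logouts, a pairing sketch:

```sql
WITH numbered AS (
  SELECT EmpId,
         DATE(AttendanceDateTime) AS work_day,
         AttendanceDateTime,
         ROW_NUMBER() OVER (PARTITION BY EmpId, DATE(AttendanceDateTime)
                            ORDER BY AttendanceDateTime) AS rn
  FROM attendancemachinelogin
)
SELECT i.EmpId,
       i.work_day,
       SUM(TIMESTAMPDIFF(MINUTE, i.AttendanceDateTime, o.AttendanceDateTime)) AS minutes_worked
FROM numbered i
JOIN numbered o
  ON o.EmpId = i.EmpId
 AND o.work_day = i.work_day
 AND o.rn = i.rn + 1
WHERE i.rn % 2 = 1   -- odd rows are logins; the next row is the matching logout
GROUP BY i.EmpId, i.work_day;
```

This keeps the raw table as-is (no precomputed minutes column needed), but an unmatched final login (an odd row count for a day) is silently dropped and would need separate handling.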
Muhammad Faizan Khan
(143 rep)
Aug 18, 2018, 07:25 AM
• Last activity: Feb 5, 2025, 04:08 PM
1 vote · 1 answer · 394 views
Convert Extended Event action mdmget_TimeStampUTC to datetime
SQL Server Extended Events can include an action of mdmtargetpkg.mdmget_TimeStampUTC. It returns a value such as 132085458320550473. How can that be converted to an actual date?
It doesn't appear to be a valid Unix timestamp, even using microseconds.
@@VERSION:
Microsoft SQL Azure (RTM) - 12.0.2000.8
Jul 3 2019 10:02:53
Copyright (C) 2019 Microsoft Corporation
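The magnitude suggests 100-nanosecond ticks since 1601-01-01 (the Windows FILETIME epoch) — that is an assumption, but it decodes to July 2019, which matches when this event would have fired. A T-SQL sketch that splits days and seconds to avoid int overflow in DATEADD:

```sql
DECLARE @ticks bigint = 132085458320550473;
DECLARE @secs  bigint = @ticks / 10000000;   -- 100-ns ticks -> whole seconds

-- Split into days + leftover seconds so each DATEADD argument fits in an int.
SELECT DATEADD(SECOND, @secs % 86400,
       DATEADD(DAY, @secs / 86400, CAST('16010101' AS datetime2)));
```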
Riley Major
(1965 rep)
Jul 25, 2019, 04:48 PM
• Last activity: Jan 23, 2025, 08:06 PM
0 votes · 1 answer · 2290 views
Pass 'interval' value to date_trunc function in PostgreSQL procedure
I am fetching the **"age"** of **two timestamp columns** (**End and Start**) and formatting it with **to_char** as **'MM'**. So I have the difference in months between the two timestamp columns.
The syntax in the above statement goes like :
i = (select to_char(age(End,Start),'MM'));
interval_value = (select i || ' month');
**Also tried:**
interval_value = i || ' Month'
Now I am passing the value of i into another PostgreSQL function, **date_trunc**:
xyz = (select date_trunc('month',Start::date) + interval_value::text::interval);
The data types for the above variables are:
i numeric :=0
xyz date;
interval_value varchar;
But it doesn't seem to work.
Is there an alternate approach I could use here? The main idea is to get the difference between two timestamps in months, and then pass that difference into the date_trunc function.
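Rather than building the interval via string concatenation, make_interval (PostgreSQL 9.4+) takes a numeric month count directly — a sketch assuming a PL/pgSQL context where `i` holds the month difference:

```sql
-- PL/pgSQL: truncate Start to its month, then add i whole months.
xyz := date_trunc('month', Start::date) + make_interval(months => i::int);

-- Standalone illustration of the same expression:
SELECT date_trunc('month', DATE '2019-01-15') + make_interval(months => 2);
-- -> 2019-03-01 00:00:00
```

This sidesteps the varchar-to-interval cast entirely.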
Pranjal Kaushik
(1 rep)
Mar 18, 2019, 08:28 AM
• Last activity: Dec 27, 2024, 07:02 AM
-1 votes · 1 answer · 105 views
Joining with generate_series for missing dates with derived date column
There are a few similar questions to this (e.g. https://dba.stackexchange.com/questions/72419/filling-in-missing-dates-in-record-set-from-generate-series) but the solution does not appear to work in my case. Essentially I am trying to generate zero entries for dates not present in a series, but I suspect the issue is that I am having to extract the date value from a timestamp. I've used SQL for years but am very new to Postgres - impressed so far, though. I have tried both a left and a right join here but no joy.
Here is a little test case (are sql fiddles still encouraged?):
-- temp test table - works as expected
WITH incomplete_data(payment_date, payment_id) AS (
VALUES
('2024-09-06 11:26:57.509429+01'::timestamp with time zone, 'uuid01')
,('2024-09-06 12:26:57.509429+01', 'uuid02')
,('2024-09-07 07:26:57.509429+01', 'uuid03')
,('2024-09-08 10:26:57.509429+01', 'uuid05')
,('2024-09-08 12:26:57.509429+01', 'uuid08')
,('2024-09-08 14:26:57.509429+01', 'uuid11')
,('2024-09-10 09:26:57.509429+01', 'uuid23')
)
select * from incomplete_data;
-- generated dates - work as expected
select * FROM (
SELECT generate_series(timestamp '2024-01-01'
, timestamp '2024-01-01' + interval '1 year - 1 day'
, interval '1 day')::date
) d(day)
;
-- join - failing to do what I was hoping..
WITH incomplete_data(payment_date, payment_id) AS (
VALUES
('2024-09-06 11:26:57.509429+01'::timestamp with time zone, 'uuid01')
,('2024-09-06 12:26:57.509429+01', 'uuid02')
,('2024-09-07 07:26:57.509429+01', 'uuid03')
,('2024-09-08 10:26:57.509429+01', 'uuid05')
,('2024-09-08 12:26:57.509429+01', 'uuid08')
,('2024-09-08 14:26:57.509429+01', 'uuid11')
,('2024-09-10 09:26:57.509429+01', 'uuid23')
)
select count(payment_id), date_trunc('day',payment_date)::date as time
FROM (
SELECT generate_series(timestamp '2024-01-01'
, timestamp '2024-01-01' + interval '1 year - 1 day'
, interval '1 day')::date
) d(day)
right JOIN incomplete_data p ON date_trunc('day',payment_date) = d.day
where payment_date BETWEEN '2024-09-01T12:55:36.824Z' AND '2024-09-30T13:55:36.824Z'
GROUP BY date_trunc('day',payment_date)
ORDER BY date_trunc('day',payment_date);
count | time
-------+------------
2 | 2024-09-06
1 | 2024-09-07
3 | 2024-09-08
1 | 2024-09-10
(4 rows)
I was hoping to get a row for every day in the month with zeroes for unpopulated days. The background is that this is for populating a grafana query.
Can anyone suggest what I am doing wrong, or am I failing to grasp a bigger issue here? My version is: PostgreSQL 15.9 (Debian 15.9-1.pgdg120+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 12.2.0-14) 12.2.0, 64-bit
**UPDATE**
jjanes answer below helped clarify the sequence of joining and filtering for me - this is the required select:
WITH incomplete_data(payment_date, payment_id) AS (
VALUES
('2024-09-06 11:26:57.509429+01'::timestamp with time zone, 'uuid01')
,('2024-09-06 12:26:57.509429+01', 'uuid02')
,('2024-09-07 07:26:57.509429+01', 'uuid03')
,('2024-09-08 10:26:57.509429+01', 'uuid05')
,('2024-09-08 12:26:57.509429+01', 'uuid08')
,('2024-09-08 14:26:57.509429+01', 'uuid11')
,('2024-09-10 09:26:57.509429+01', 'uuid23')
)
select count(payment_id), d.day as time
FROM (
SELECT generate_series(timestamp '2024-01-01'
, timestamp '2024-01-01' + interval '1 year - 1 day'
, interval '1 day')::date
) d(day)
left JOIN incomplete_data p ON date_trunc('day',payment_date) = d.day
where d.day BETWEEN '2024-09-01T12:55:36.824Z' AND '2024-09-30T13:55:36.824Z'
GROUP BY d.day
ORDER BY d.day
;
cam
(1 rep)
Nov 25, 2024, 03:31 PM
• Last activity: Nov 25, 2024, 06:12 PM
5 votes · 2 answers · 4054 views
Split up interval into year slices
Being a novice Postgres user, I have a table ad in my PostgreSQL 9.5 (x64) database with 87 rows. In addition to other columns, it has two columns, start and end, holding a date-time range like this:
ID Start End
1 2003-06-07 00:00:00 2004-09-30 23:59:59
I need to split the range into one-year windows in separate rows (from the start year to the last year of the interval), stored in the database like this:
ID Start End
1_2003 2003-06-07 00:00:00 2003-12-31 23:59:59
1_2004 2004-01-01 00:00:00 2004-09-30 23:59:59
Using the operator ||'_'|| and the extract() function, I am able to concatenate the ID with the year. Also, this question addresses how to split up an interval in weeks and this one shows how to do the same for days, but neither of them addresses how to split up an interval in years specifically.
I avoided this question because I don't want to go for a stored-procedure-based approach. I am aware that generate_series() returns a series from start and stop parameters, but I am struggling to break the interval at the last day of the year and then re-start from the first day of the year in the next row. I would highly appreciate it if someone could guide me on this.
khajlk
(197 rep)
Aug 5, 2016, 12:22 PM
• Last activity: Nov 21, 2024, 10:52 PM
0
votes
2
answers
151
views
Datetime2 in SQL 2014 Vs SQL 2022
Behaviour of column type 'DateTime2' between SQL 2014 and SQL 2022. This was found when our application's calculation for datetime difference of columns being type DateTime2 went wrong. Tried to find from google if there would be a specific setting in server property for datetime2 type of columns, b...
The behaviour of the column type `datetime2` differs between SQL Server 2014 and SQL Server 2022.
We found this when our application's calculation of the datetime difference between `datetime2` columns went wrong.
I tried to find out from Google whether there is a specific server-level setting for `datetime2` columns, but had no luck.
I did notice the documented "datetime2 improvements", where it is stated that the behaviour is expected.
On SQL 2014, a `datetime2(7)` value had only its first 3 fractional digits populated, whereas on SQL 2022 all 7 digits are populated, so calculations with the `datetime2` fields turn into a mess after upgrading.
Now, there are a lot of columns in many tables defined as `datetime2`, and it would take time to correct all the queries and application code.
Is there any simple fix for this?
Meera K
(81 rep)
Nov 12, 2024, 05:18 AM
• Last activity: Nov 20, 2024, 12:54 PM
15
votes
2
answers
3268
views
Incorrect comparison of datetime and datetime2
I know that it is not a good practice to have an implicit type convert. But this is really unexpected behavior when a lower value can suddenly become higher. declare @LastSelectedDate DATETIME = '2021-11-09 13:52:29.187' declare @LastSelectedDate_1 DATETIME2(7) = '2021-11-09 13:52:29.1866667' SELECT...
I know that it is not good practice to rely on an implicit type conversion. But this is really unexpected behaviour, where a lower value can suddenly become higher.
declare @LastSelectedDate DATETIME = '2021-11-09 13:52:29.187'
declare @LastSelectedDate_1 DATETIME2(7) = '2021-11-09 13:52:29.1866667'
SELECT IIF(@LastSelectedDate_1 > CAST(@LastSelectedDate AS DATETIME2), 1, 0)
SELECT IIF(@LastSelectedDate_1 > @LastSelectedDate, 1, 0)
Is this a bug, or am I missing something? I am using SQL Server 2016.
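If it helps to see where `.1866667` comes from: `datetime` stores the time of day in ticks of 1/300 second, and newer SQL Server versions convert those ticks to `datetime2` at full precision instead of rounding to milliseconds first. The sketch below is my reading of that documented behaviour, with Python standing in for SQL Server, not its actual conversion code.

```python
from fractions import Fraction

# datetime keeps fractional seconds as ticks of 1/300 s
ticks = round(0.187 * 300)     # '.187' is stored as 56 ticks
exact = Fraction(ticks, 300)   # the value those ticks really represent

# old-style conversion: round the ticks back to milliseconds -> .1870000
old_style = round(float(exact), 3)

# new-style conversion: keep the full 1/300 s precision -> .1866667
new_style = round(float(exact), 7)

print(ticks, old_style, new_style)  # 56 0.187 0.1866667
```

So `.187` and `.1866667` are two renderings of the same 56 ticks, which is why the comparison flips depending on which conversion the server applies.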
Artashes Khachatryan
(1533 rep)
Nov 9, 2021, 03:59 PM
• Last activity: Nov 20, 2024, 12:33 PM
4
votes
3
answers
1586
views
Integer number in the 700000s as the days from year 1: how can this be cast in tsql to a date and back if the oldest datetime date is 1753-01-01?
I fell upon an integer format for dates for which I also know the date, but I do not know how to get to that date in TSQL and I also do not know how to get to the integer if I have the date: 700444 -> 1918-10-02 731573 -> 2003-12-24 739479 -> 2025-08-16 Those 6-digit numbers would fit as a counter f...
I came across an integer format for dates. I know which date each integer stands for, but I do not know how to get to that date in T-SQL, nor how to get back to the integer if I have the date:
700444 -> 1918-10-02
731573 -> 2003-12-24
739479 -> 2025-08-16
Those 6-digit numbers would fit as a counter for each day from 0001-01-01 onwards. I sanity-checked that by dividing one of the numbers by 20 (roughly one century's worth of days) and adding the result to 1900, which lands close to the year 2000:
select DATEADD(dd,731573/20,'19000101')
Out:
2000-02-24 00:00:00.000
But I cannot run
select DATEADD(dd,731573/20,'10000101')
, which throws:
> The conversion of a varchar data type to a datetime data type resulted in an out-of-range value.
Microsoft Learn says that the T-SQL `datetime` type only allows dates from 1753-01-01 onwards, see [datetime (Transact-SQL) Microsoft Learn](https://learn.microsoft.com/en-us/sql/t-sql/data-types/datetime-transact-sql?view=sql-server-ver16#description) , thus:
select DATEADD(dd,731573/20,'17530101')
Out:
1853-02-24 00:00:00.000
I cannot add the 731573 to year 1, though. Then I found [What is the significance of 1/1/1753 in SQL Server?](https://stackoverflow.com/questions/3310569/what-is-the-significance-of-1-1-1753-in-sql-server) :
*--(as noted in one of the answers and at [Why should you always write "varchar" with the length in brackets behind it? Often, you get the right outcome without doing so - DBA SE](https://dba.stackexchange.com/q/338742/212659) , use `varchar(length)` instead of just `varchar`)--*
SELECT CONVERT(VARCHAR, DATEADD(DAY,-731572,CAST('2003-12-24' AS DATETIME2)),100)
SELECT CONVERT(VARCHAR(30), DATEADD(DAY,-731572,CAST('2003-12-24' AS DATETIME2)),100)
Out:
Jan 1 0001 12:00AM
This proves that the number *is* the count of days from the first day of year 0001. Now I wonder whether I can get there without casting the datetime column to `datetime2`. My dates are all in the 20th and 21st centuries, so I do not need `datetime2`; I get the data as `datetime` and want to avoid a type conversion.
How can I cast this integer in the seven-hundred-thousands, understood as the counter of days from year 1, to a date, and how can I get from the date back to that integer, without converting the date to `datetime2`?
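As a cross-check outside T-SQL: Python's proleptic-Gregorian ordinal uses exactly this convention (day 1 = 0001-01-01), and `date.fromordinal()`/`date.toordinal()` reproduce all three pairs from the question.

```python
from datetime import date

# day 1 = 0001-01-01, the same convention as the integers in the question
assert date.fromordinal(700444) == date(1918, 10, 2)
assert date.fromordinal(731573) == date(2003, 12, 24)
assert date.fromordinal(739479) == date(2025, 8, 16)

# and the reverse direction
assert date(2003, 12, 24).toordinal() == 731573
print("all pairs match")
```

On this convention 1900-01-01 is day 693596, so one hedged T-SQL possibility that stays inside `datetime`'s range for 20th/21st-century dates would be `DATEADD(DAY, n - 693596, '19000101')` and `DATEDIFF(DAY, '19000101', d) + 693596` for the reverse; verify the offset against your own data before relying on it.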
questionto42
(366 rep)
Apr 17, 2024, 11:00 PM
• Last activity: Oct 9, 2024, 08:27 PM
0
votes
1
answers
453
views
How to summate weekly totals from a postgres date range type per user
I have the following table: ``` Schedules +----+---------+------------+-------+--------------------------+ | id | user_id | project_id | hours | date_range | +----+---------+------------+-------+--------------------------+ | 1 | 1 | 1 | 4 | [2019-05-01, 2019-05-08) | +----+---------+------------+---...
I have the following table:
Schedules
+----+---------+------------+-------+--------------------------+
| id | user_id | project_id | hours | date_range |
+----+---------+------------+-------+--------------------------+
| 1 | 1 | 1 | 4 | [2019-05-01, 2019-05-08) |
+----+---------+------------+-------+--------------------------+
| 2 | 2 | 1 | 8 | [2019-04-20, 2019-05-20) |
+----+---------+------------+-------+--------------------------+
| 3 | 1 | 2 | 2 | [2019-05-05, 2019-05-12) |
+----+---------+------------+-------+--------------------------+
| ...etc | | | | |
+----+---------+------------+-------+--------------------------+
What I was hoping to accomplish with SQL was to get a summation of total hours per week, given a specific date-range input, for example `date_start: 2019-05-05`, `date_end: 2019-05-11`.
For the week of `2019-05-05` to `2019-05-11`, any `Schedules` row whose `date_range` overlaps those dates would be summed by project_id and by user_id.
Example: row 1 in the table above has a `date_range` of `[2019-05-01, 2019-05-08)`, so the dates `2019-05-05`, `2019-05-06` and `2019-05-07` match the `date_start`/`date_end` criterion (`2019-05-08` is omitted because the upper bound `)` is exclusive).
Since `hours` is `4` for that row, its total would be 4 hours * 3 days = 12 hours.
The `date_start`/`date_end` window can be arbitrarily large; if it spanned a year, every `date_range` would be summed up week by week over the whole range.
Potential expected output (open to suggestions):
| week_start | week_end | project_id | user_id | total_hours |
| 2019-05-05 | 2019-05-11 | 1 | 1 | 12 |
| 2019-05-05 | 2019-05-11 | 1 | 2 | 56 |
| etc ... | | | | |
I'm not quite sure how to structure a query that generates the matching `date_range` days and multiplies them out per user.
Can someone point me in the right direction?
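The per-week overlap arithmetic being asked about can be sketched in Python (the `week_totals` helper is hypothetical and the data mirrors the table above); in Postgres the same overlap would come from intersecting each stored range with a generated week.

```python
from datetime import date, timedelta

# (user_id, project_id, hours, [start, end)) -- mirrors the Schedules table
schedules = [
    (1, 1, 4, (date(2019, 5, 1), date(2019, 5, 8))),
    (2, 1, 8, (date(2019, 4, 20), date(2019, 5, 20))),
    (1, 2, 2, (date(2019, 5, 5), date(2019, 5, 12))),
]

def week_totals(week_start, week_end):
    """Total hours per (project_id, user_id) for one week.
    The stored upper bound is exclusive; week_end is inclusive."""
    totals = {}
    for user_id, project_id, hours, (lo, hi) in schedules:
        overlap_start = max(lo, week_start)
        overlap_end = min(hi - timedelta(days=1), week_end)  # hi is exclusive
        days = (overlap_end - overlap_start).days + 1
        if days > 0:
            key = (project_id, user_id)
            totals[key] = totals.get(key, 0) + hours * days
    return totals

print(week_totals(date(2019, 5, 5), date(2019, 5, 11)))
```

This reproduces the 12 and 56 from the expected output; iterating the same helper over consecutive weeks covers an arbitrarily large `date_start`/`date_end` window.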
tr3online
(113 rep)
May 31, 2019, 06:15 AM
• Last activity: Oct 8, 2024, 02:06 PM
Showing page 1 of 20 total questions