Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
0
votes
0
answers
13
views
Database migrations under DynamoDB - Move entries to code, and create a pipeline
We have a DynamoDB-based config. We want to transfer all the configs inside the DDB to a code package. We then want to create a pipeline from that code package to the DDB, which then handles all changes inside the DDB.
I believe this falls under the topic of "database migrations"; one example is Flyway (but that is used for SQL-based DBs).
I wanted to check whether there are existing solutions built for this.
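In the absence of an off-the-shelf tool, the core of such a pipeline is a sync step that diffs the code-defined config against what is currently in the table. A minimal sketch of that diff logic (the `desired`/`current` names and item shapes are illustrative, not from any existing library):

```python
# Sketch of a config-sync planning step: compare the desired config
# (checked into the code package) against the items currently in the
# DynamoDB table (e.g. from a deserialized Scan), and compute the
# PutItem / DeleteItem operations needed to converge the table.

def plan_sync(desired: dict, current: dict):
    """Return (puts, deletes) keyed by config key.

    `desired` and `current` map a config key to its attribute dict.
    """
    # Write every item that is new or whose attributes changed.
    puts = {k: v for k, v in desired.items() if current.get(k) != v}
    # Remove items that no longer exist in the code package.
    deletes = [k for k in current if k not in desired]
    return puts, deletes

# Example: one changed item, one new item, one stale item to remove.
desired = {"rate_limit": {"value": 100}, "region": {"value": "eu-west-1"}}
current = {"rate_limit": {"value": 50}, "old_flag": {"value": True}}
puts, deletes = plan_sync(desired, current)
```

A pipeline step would then apply `puts` and `deletes` via the DynamoDB API; the planning function itself stays pure and easy to test.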
Mooncrater
(101 rep)
Sep 5, 2024, 09:38 AM
0
votes
0
answers
212
views
Is it an option to deploy a specific schema with objects - procedures, functions, views - to multiple databases?
We have a schema `utils` which we would like to deploy with database objects (procedures, functions, views) to multiple databases on different servers via an Azure DevOps pipeline. Is there any option to treat this schema with its objects as a `ddl` file that can be attached to multiple database solutions? Or do we have to add all scripts to all database solutions manually?
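One common pattern, rather than copying the scripts into every database solution, is to keep the `utils` objects in a single DDL script and have one pipeline step fan it out to all targets. A hedged sketch of that fan-out (server and database names are placeholders; the actual execution would use `sqlcmd` or a deployment task in the pipeline):

```python
# Sketch: deploy a single utils.sql DDL script to many (server, database)
# targets from one pipeline step, instead of duplicating the scripts
# in each database solution. All names below are placeholders.

TARGETS = [
    ("sql-prod-01.example.com", "SalesDb"),
    ("sql-prod-02.example.com", "BillingDb"),
]

def deploy_commands(script_path: str, targets):
    """Build one sqlcmd invocation per target database.

    -b makes sqlcmd exit with an error code on failure, so the
    pipeline step fails fast instead of silently continuing.
    """
    return [
        f'sqlcmd -S {server} -d {database} -b -i "{script_path}"'
        for server, database in targets
    ]

cmds = deploy_commands("utils.sql", TARGETS)
for cmd in cmds:
    print(cmd)
```

The target list can live in pipeline variables or a config file, so adding a new database is a one-line change rather than a new solution.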
adam.g
(465 rep)
May 13, 2022, 02:08 PM
• Last activity: May 13, 2022, 02:21 PM
1
votes
1
answers
101
views
Data Pipeline: load multiple tables at once
I am using AWS Data Pipeline to copy my RDS MySQL database to Redshift. I need to create a separate pipeline for each table, and each pipeline creates a new EC2 instance (the process takes time).
**Problem**
Is there any way to load all the database tables into Redshift with a single Data Pipeline, so I can schedule it and sync all my data daily with one pipeline?
NOTE: I have 250+ tables in the DB that need to be synced with the Redshift DB (almost daily). Alternatively, please suggest any better way to do this.
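The "single pipeline" idea usually boils down to one scheduled job that iterates the table list instead of one pipeline per table. A minimal sketch of the driving logic, assuming the MySQL tables have already been exported to S3 (e.g. by AWS DMS or a dump step); the bucket, schema, and IAM role below are placeholders:

```python
# Sketch: generate one Redshift COPY statement per table, so a single
# scheduled job can load all 250+ tables instead of one pipeline each.
# Bucket path and IAM role ARN are illustrative placeholders.

TABLES = ["customers", "orders", "invoices"]  # ...extend to all 250+ tables

def copy_statements(tables,
                    bucket="s3://my-export-bucket",
                    iam_role="arn:aws:iam::123456789012:role/redshift-copy"):
    """Build a COPY statement loading each table's S3 prefix into Redshift."""
    return [
        f"COPY public.{t} FROM '{bucket}/{t}/' "
        f"IAM_ROLE '{iam_role}' FORMAT AS CSV;"
        for t in tables
    ]

for stmt in copy_statements(TABLES):
    print(stmt)
```

A scheduler (one Data Pipeline, a cron job, or a Lambda) can then run these statements against Redshift sequentially or in small batches, avoiding 250 separate pipelines and EC2 instances.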
Muhammad Hashir Anwaar
(121 rep)
Nov 21, 2019, 05:21 AM
• Last activity: Jan 6, 2020, 11:25 AM
2
votes
1
answers
235
views
PipelineDB: How to group stream data into N-minute intervals in a continuous view
How can I group data from a PipelineDB `stream` into N-minute intervals in a `continuous view` select?
PipelineDB's stream receives data about events coming from many remote hosts. I need to group these events by type, IP, and time intervals of 5 minutes, for example, and count them.
So on input I have (very roughly):
CREATE STREAM event (ip varchar(15), type varchar(32));
CREATE CONTINUOUS VIEW AS SELECT ??? FROM event GROUP BY ??? etc...;
INSERT INTO event VALUES
('111.111.111.111', 'page_open'),  -- 22:35
('111.111.111.111', 'page_open'),  -- 22:36
('111.111.111.111', 'page_close'), -- 22:37
('111.111.111.111', 'page_close'), -- 22:42
('222.111.111.111', 'page_open'),  -- 22:42
('222.111.111.111', 'page_open'),  -- 22:43
('222.111.111.111', 'page_close'), -- 22:44
('111.111.111.111', 'page_open');  -- 22:44
arrival_timestamp | ip | type
------------------------------------------------
22:35 | 111.111.111.111 | page_open -- new interval, ends at 22:40
22:36 | 111.111.111.111 | page_open
22:37 | 111.111.111.111 | page_close
22:42 | 111.111.111.111 | page_close -- event falls in next interval, ends at 22:45
22:42 | 222.111.111.111 | page_open
22:43 | 222.111.111.111 | page_open
22:44 | 222.111.111.111 | page_close
22:44 | 111.111.111.111 | page_open
And what the continuous view select should produce:
time | ip | type | count
---------------------------------------------
22:40 | 111.111.111.111 | page_open | 2
22:40 | 111.111.111.111 | page_close | 1
22:45 | 111.111.111.111 | page_open | 1
22:45 | 111.111.111.111 | page_close | 1
22:45 | 222.111.111.111 | page_open | 2
22:45 | 222.111.111.111 | page_close | 1
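The grouping being asked for amounts to: floor each `arrival_timestamp` to its 5-minute bucket, label the row with the bucket's end, then group by (bucket, ip, type) and count. A sketch of that bucketing logic (shown here in Python for clarity; in the view's SQL the bucket would typically be an expression flooring the epoch seconds to a 300-second multiple):

```python
# Sketch of 5-minute interval bucketing: each event is assigned the END
# of its 5-minute window (matching the 22:40 / 22:45 labels above),
# then counted per (bucket, ip, type).
from collections import Counter
from datetime import datetime, timezone

BUCKET = 300  # 5 minutes, in seconds

def bucket_end(ts: datetime) -> datetime:
    """Map a timestamp to the end of its 5-minute interval."""
    epoch = int(ts.timestamp())
    start = epoch - epoch % BUCKET          # floor to the interval start
    return datetime.fromtimestamp(start + BUCKET, tz=timezone.utc)

events = [
    (datetime(2016, 12, 21, 22, 35, tzinfo=timezone.utc), "111.111.111.111", "page_open"),
    (datetime(2016, 12, 21, 22, 36, tzinfo=timezone.utc), "111.111.111.111", "page_open"),
    (datetime(2016, 12, 21, 22, 37, tzinfo=timezone.utc), "111.111.111.111", "page_close"),
    (datetime(2016, 12, 21, 22, 42, tzinfo=timezone.utc), "111.111.111.111", "page_close"),
]

counts = Counter(
    (bucket_end(ts).strftime("%H:%M"), ip, typ) for ts, ip, typ in events
)
```

The same arithmetic in the view would group by something like `floor(extract(epoch from arrival_timestamp) / 300)` together with `ip` and `type`.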
p.s.
Sorry for my english
Zaklinatel
(21 rep)
Dec 21, 2016, 03:59 PM
• Last activity: Dec 23, 2016, 12:43 PM