How can I replicate data from a SQL Server VM in Azure to avoid recovery mode for reporting?
Currently, I'm using SQL Server on an Azure VM (DB B) as a Power BI source. DB B is updated via log shipping from a primary SQL Server (DB A), and each restore locks me out of reading DB B twice an hour. I've considered Power BI caching strategies to help, but I'm not certain that will solve the problem in the long term, and the transition from DirectQuery to Import mode can be a pain.
Requirements/Notes for Suggested Solution(s):
- Not an Enterprise user.
- Changing the log shipping method to another method for updates from DB A is not an option.
- Migrating DB B to Azure SQL Database/managed DB and eliminating SQL Server on Azure VM is not an option.
- The log shipping updates happen at the same times each hour.
- Near(ish) real-time replication would be ideal.
- Transactional replication may not be a solution because each table does not have a primary key.
- Minimizing cost would be ideal.
- Reading from DB B should always be available regardless of data consistency.
- Standing up a third DB, DB C, is an option.
- I'm hesitant about enabling CDC on DB B as the basis for an ETL solution because of limited memory on the DB B VM, and so as not to interfere with the log-shipping process.
- 5 people may send requests to DB B from time to time, but never all at once.
I'm thinking about just standing up DuckDB on a Linux VM and writing some Python scripts to refresh the data the few times an hour when DB B is updated via log shipping.
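To make the DuckDB idea concrete, here is a rough sketch of what such a script might look like. Everything here is an assumption, not a tested design: the restore minutes (:00 and :30), the 5-minute settle delay, the DSN, and the full-refresh-via-CSV approach are all placeholders to adapt to the actual environment. `pyodbc` and `duckdb` are third-party packages.

```python
import csv
import tempfile
from datetime import datetime, timedelta

# Assumed schedule: restores land at :00 and :30, and we wait 5 minutes
# for the restore to settle before pulling data. Adjust to your jobs.
RESTORE_MINUTES = (0, 30)
SETTLE_MINUTES = 5


def next_refresh_time(now):
    """Return the next moment to pull from DB B, shortly after a restore."""
    top = now.replace(minute=0, second=0, microsecond=0)
    candidates = [
        top + timedelta(hours=h, minutes=m + SETTLE_MINUTES)
        for h in (0, 1)
        for m in RESTORE_MINUTES
    ]
    return min(t for t in candidates if t > now)


def refresh_duckdb(tables, mssql_dsn, duck_path="reporting.duckdb"):
    """Full-refresh each table from DB B into a local DuckDB file.

    Placeholder names throughout; pyodbc/duckdb installed separately
    (pip install pyodbc duckdb). A full refresh sidesteps the missing
    primary keys that rule out transactional replication.
    """
    import duckdb  # third-party
    import pyodbc  # third-party

    src = pyodbc.connect(mssql_dsn)
    duck = duckdb.connect(duck_path)
    for table in tables:
        cur = src.cursor()
        cur.execute(f"SELECT * FROM {table}")
        # Stage the rows as CSV, then let DuckDB infer the schema.
        with tempfile.NamedTemporaryFile(
            "w", newline="", suffix=".csv", delete=False
        ) as f:
            writer = csv.writer(f)
            writer.writerow(d[0] for d in cur.description)
            writer.writerows(cur.fetchall())
            path = f.name
        # CREATE OR REPLACE swaps the table in a single statement, so
        # Power BI reads against DuckDB stay available during the load.
        duck.execute(
            f"CREATE OR REPLACE TABLE {table} AS "
            f"SELECT * FROM read_csv_auto('{path}')"
        )
```

The appeal of this shape is that reporting queries never touch DB B directly, so the twice-hourly restore lockout only delays freshness rather than blocking reads.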
What are some of your recommended solutions?
Asked by IamTrying
(11 rep)
Feb 4, 2025, 03:20 AM
Last activity: Feb 4, 2025, 11:33 AM