Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
1
votes
1
answers
1778
views
Mirroring issue with mirror server
I am running two Windows Server 2012 R2 machines with SQL Server 2014 Standard Edition, and I have a strange issue with mirroring.
I configured mirroring for 15 databases, with no witness server. The principal server shows every database as Principal, Synchronized, and the mirror shows them as Mirror, Synchronized / Restoring.
When I open Database Mirroring Monitor, the principal's role is shown as Synchronized, but the mirror's role shows as not connected. I have failed over in both directions and the databases came up on the mirror server normally, and data is replicating. For troubleshooting I have done the steps below.
1. I checked the error log on the principal; nothing related to mirroring is traced there. Most of the errors on the principal are Error: 18456, Severity: 14, State: 38.
2. The errors on the mirror server, for all of the databases, are: Database mirroring connection error 4 'An error occurred while receiving data: '10054(An existing connection was forcibly closed by the remote host.)'.' for 'TCP://server.company.com:5022'. Error: 1474, Severity: 16, State:
3. I tried stopping and starting the endpoint; that did not resolve it.
4. I tried pausing and resuming mirroring; that did not resolve it either.
5. I verified that TCP/IP is enabled on both servers.
6. I tried configuring an alias; that did not resolve it.
7. I can telnet between the servers in both directions, so there seem to be no communication issues.
8. I verified connect rights at the domain level; the same service account is used for both the principal and the mirror.
9. The instances were on SQL Server 2014 SP2; today I updated them to the latest service pack, SP3, and Database Mirroring Monitor still shows the mirror as not connected (a diagnostic sketch follows this list).
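A minimal diagnostic sketch for the endpoint and permission checks above; the endpoint name and the domain account are placeholders for this environment:
-- Endpoint state, role, and port (run on both the principal and the mirror)
SELECT e.name, e.state_desc, e.role_desc, t.port
FROM sys.database_mirroring_endpoints AS e
JOIN sys.tcp_endpoints AS t ON t.endpoint_id = e.endpoint_id;
-- Make sure the partner's service account is allowed to connect to the endpoint
GRANT CONNECT ON ENDPOINT::[Mirroring] TO [DOMAIN\SqlServiceAccount];
-- Per-database mirroring state and partner as seen from this instance
SELECT DB_NAME(database_id) AS database_name, mirroring_role_desc, mirroring_state_desc, mirroring_partner_name
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL;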
Please help with this issue: how can I resolve it?
Nasar Babu Chowdary
(21 rep)
Dec 28, 2018, 10:14 AM
• Last activity: Aug 2, 2025, 04:06 AM
3
votes
1
answers
425
views
SQL Server 2014 Availability Group failed
I have 3 servers on VMware Workstation:
1. Windows Server 2012 R2 as the AD DC - cluster service.
2. Windows Server 2012 R2 as the SQL Server primary replica.
3. Windows Server 2012 R2 as the SQL Server secondary replica (automatic failover).
My availability group fails in this scenario: when I disconnect the primary replica from the network, the cluster does not initiate a failover to the secondary replica, and when I try a manual failover from the secondary replica's instance I get this error:
However, this is not the case when I stop the primary replica from the cluster manager; in that case automatic failover works.
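For reference, a hedged sketch of the checks and commands typically involved when the primary is unreachable; the AG name is a placeholder, and forced failover can lose data, so this is illustrative only:
-- Run on the secondary: how SQL Server sees the cluster and quorum
SELECT cluster_name, quorum_type_desc, quorum_state_desc
FROM sys.dm_hadr_cluster;
-- Replica health as seen from the secondary
SELECT ar.replica_server_name, ars.role_desc, ars.operational_state_desc, ars.connected_state_desc
FROM sys.dm_hadr_availability_replica_states AS ars
JOIN sys.availability_replicas AS ar ON ar.replica_id = ars.replica_id;
-- Planned manual failover (needs quorum and a synchronized secondary):
-- ALTER AVAILABILITY GROUP [MyAG] FAILOVER;
-- Forced failover once quorum has been re-established without the primary (data loss possible):
-- ALTER AVAILABILITY GROUP [MyAG] FORCE_FAILOVER_ALLOW_DATA_LOSS;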

danarj
(169 rep)
Aug 16, 2015, 06:58 AM
• Last activity: Jul 31, 2025, 12:03 PM
0
votes
2
answers
2053
views
Changing startup accounts in an Always On setup
I have a 2-node Always On Availability Group cluster where we are planning to take backups to a file share, which will then be moved to Azure. We have a service account from the Windows team for the backup job, but the startup accounts of the database engine and the SQL Server Agent are local accounts. Using proxies did not help me with the situation, so I am now thinking of providing the service account credentials in SQL Server Configuration Manager.
What I want to know is how do I proceed?
This is what I have planned. Please let me know if I have missed something in the flow
1. Change service account on the Node2 (secondary) - both **SQL Server (_InstanceName_)** (database engine) and **SQL Server Agent (_InstanceName_)** (job engine)
2. Pause synchronization
3. Restart the **SQL Server (_InstanceName_)** service and the **SQL Server Agent (_InstanceName_)**
4. Fail the **AOAG** over to the secondary, then change the account details on the primary (now the secondary)
5. Repeat steps 2 and 3
6. Fail back to the primary again
Please let me know if I am missing any step here
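For reference, a minimal sketch of the T-SQL side of steps 2, 3 (the resume after the restart), and 4; the database, AG, and instance names are placeholders:
-- Step 2: pause data movement for each availability database (run per database)
ALTER DATABASE [MyAgDatabase] SET HADR SUSPEND;
-- Step 3: restart SQL Server (InstanceName) and SQL Server Agent (InstanceName) in Configuration Manager,
-- then resume data movement:
ALTER DATABASE [MyAgDatabase] SET HADR RESUME;
-- Step 4: planned manual failover, run on the secondary that is to become primary
ALTER AVAILABILITY GROUP [MyAG] FAILOVER;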
Thanks!
buddinggeek
(623 rep)
Nov 18, 2016, 03:33 PM
• Last activity: Jul 22, 2025, 06:06 AM
4
votes
1
answers
167
views
Migrating a SQL Server 2014 AG to a new 2017 AG
Our current production server runs SQL Server 2014 with an AlwaysOn AG. We are planning to upgrade it to SQL Server 2017 AlwaysOn.
Due to configuration issues with the present cluster, we cannot reuse the existing cluster.
The production server has 52 databases with a total size of 2 TB. The databases serve an online system, so minimal downtime is our core requirement.
Our initial plan is to go for a side by side approach.
1. Provision three SQL Server 2017 servers
2. Create a new cluster
3. Ship logs from the existing 2014 AG primary server to all the servers in the new 2017 group.
On the switchover day,
1. Disconnect all applications from the existing server.
2. Take a final log backup and ship it.
3. Apply the logs to all servers in the new group WITH NORECOVERY, except WITH RECOVERY on Primary_new (a restore sketch follows this list).
4. Create the AG and add all databases to it.
5. Change all applications to point to the new cluster.
6. Shut down the old cluster.
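A sketch of the restore sequence behind step 3 of the switchover list, with placeholder database and path names:
-- On every 2017 server that will be a secondary: keep the database restoring so it can join the AG
RESTORE LOG [MyDb] FROM DISK = N'\\share\logship\MyDb_final.trn' WITH NORECOVERY;
-- On Primary_new only: bring the database online
RESTORE LOG [MyDb] FROM DISK = N'\\share\logship\MyDb_final.trn' WITH RECOVERY;
-- Then create the AG on Primary_new and join each secondary, e.g.
-- CREATE AVAILABILITY GROUP [MyNewAG] FOR DATABASE [MyDb] REPLICA ON ...;
-- ALTER DATABASE [MyDb] SET HADR AVAILABILITY GROUP = [MyNewAG];   -- on each secondary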
Is this a good approach? Are there any other approaches?
If I am missing anything, please guide me.
Thanks.

user1716729
(693 rep)
Dec 27, 2018, 05:48 PM
• Last activity: Jul 18, 2025, 05:03 PM
1
votes
1
answers
154
views
Access previous records from a table variable without using cursor in SQL Server 2014
In a stored procedure, I have to do some calculations based on the previous record and a running total. Currently I achieve this using a cursor or a table variable, but it is taking a lot of time because the number of records is huge.
How can I achieve this with better performance?
A part of my stored procedure:
DECLARE @HoursTemp TABLE
(
ID [INT],
EmployeeName [VARCHAR](100),
WorkDate [DATE],
RegularHours [DECIMAL],
OvertimeHours [DECIMAL],
NightJob [BIT],
JobId [INT]
)
INSERT INTO @HoursTemp
SELECT ID, EmployeeName, WorkDate, RegularHours,OvertimeHours, NightJob, JobId
FROM @PayrollSheet
GROUP BY WorkDate, EmployeeName , JobIdType, NightJob, RegularHours, OvertimeHours, ID, JobId
ORDER BY WorkDate, EmployeeName, JobIdType, NightJob, JobId
SELECT @count = COUNT(ID) FROM @HoursTemp
WHILE (@i = 8 AND @SumOverTimeHours = 0)
BEGIN
SET @SumOverTimeHours = @OverTimeHour + (@SumOfHours - 8); -- Sum up the overtime hour.
SET @RegularHours = 8 - (@SumOfHours - @RegularHours); -- Get remainder regular hour.
UPDATE @PayrollSheet SET RegularHours = @RegularHours, OvertimeHours = @SumOverTimeHours Where ID = @PayRollID -- Update the temp table.
END
ELSE
BEGIN
--When sum of regular hours is greater than 8, further regular hours of the day will be considered as over time hours.
IF(@SumOfHours >=8)
UPDATE @PayrollSheet SET RegularHours = 0, OvertimeHours = @RegularHours Where ID = @PayRollID --Update the over time hours with regular hour and set regular hour with 0.
END
END
ELSE
BEGIN
SET @SumOfHours = @RegularHours; -- If Dates does not match then reset the regular hours sum variable to current regular hours so it will be summed with next regular hours when dates match.
SET @SumOverTimeHours = 0; -- Sum up Over time hours present in existing records.
--SET @SumOfNightHours = 0; -- Reset OverTime Sum and Night hours sum variable when dates does not match.
END
-- Get counts of record with same work date,Job ID and Night Job true and match it with total records with same work date.
SELECT @RecordsWithWorkDateAndJobID = COUNT(*) from @PayrollSheet where WorkDate = @WorkDate AND JobID = @PayRollJobID AND NightJob = 1 AND EmployeeName = @EmployeeName
SELECT @RecordsWithOnlyWorkDate = COUNT(*) from @PayrollSheet where WorkDate = @WorkDate AND EmployeeName = @EmployeeName --AND NightJob = 1
--If both count matches then night hours are entered in same job on a day. Else night hours are entered in more than one job. Also if sum of night hours is less than 8 then only update night hours.
IF(NOT(@RecordsWithWorkDateAndJobID > 0 AND @RecordsWithOnlyWorkDate > 0 AND @RecordsWithWorkDateAndJobID = @RecordsWithOnlyWorkDate)) --AND @SumOfNightHours < 8
BEGIN
UPDATE @PayrollSheet SET NightJob = 0 Where ID = @PayRollID -- Set regular hours as 0 and night hours as the regular hour.
--SET @SumOfNightHours = @SumOfNightHours + @RegularHours; -- Maintain sum of night hours, so that if it is greater than 8 then night hours will not be added.
END
SET @PreviousDate = @WorkDate --Store previous work date.
SET @PreviousJobID = @PayRollJobID -- Store previous job id.
SET @PreviousEmployeeName = @EmployeeName
-- Fetch next record from temp table.
SELECT @i = @i + 1
END
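A set-based pass usually performs far better than a row-by-row loop here. Below is a minimal sketch using window functions (available in SQL Server 2014): a running total of RegularHours per employee and day, with anything beyond 8 hours moved into OvertimeHours. The split logic is an assumption based on the comments above, not a drop-in replacement for the whole procedure; LAG() can likewise expose the previous row's values without a cursor.
;WITH h AS
(
    SELECT ID,
           RegularHours,
           -- running total of regular hours for the employee/day, in ID order
           SUM(RegularHours) OVER (PARTITION BY EmployeeName, WorkDate
                                   ORDER BY ID
                                   ROWS UNBOUNDED PRECEDING) AS RunningRegular
    FROM @PayrollSheet
)
UPDATE p
SET p.OvertimeHours = CASE WHEN h.RunningRegular <= 8 THEN p.OvertimeHours
                           WHEN h.RunningRegular - h.RegularHours >= 8 THEN p.OvertimeHours + p.RegularHours
                           ELSE p.OvertimeHours + (h.RunningRegular - 8) END,
    p.RegularHours  = CASE WHEN h.RunningRegular <= 8 THEN p.RegularHours
                           WHEN h.RunningRegular - h.RegularHours >= 8 THEN 0
                           ELSE 8 - (h.RunningRegular - h.RegularHours) END
FROM @PayrollSheet AS p
JOIN h ON h.ID = p.ID;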
Rishi Samadhiya
(21 rep)
Feb 23, 2017, 11:46 AM
• Last activity: Jul 18, 2025, 09:04 AM
5
votes
1
answers
149
views
How to calculate required resources from a baseline, and what should be measured?
I have to deal with the following situation:
Currently we have
- 1 cluster with 5 nodes running 10 instances (SQL Server 2008 Enterprise Edition), and
- 1 cluster with 2 nodes running 5 instances (SQL Server 2014 Standard Edition)
in use. All servers are virtual servers running on VMware.
We want to set up a new cluster (SQL Server 2014 Enterprise). In a first step, one instance from the 2008 Enterprise cluster and one instance from the 2014 Standard cluster are to be migrated.
Therefore, my boss asked 2 questions:
1. How many cores do we need (aim: minimize license costs)?
2. How much RAM do we need?
My answer was: "It depends ..." Now I have to deliver hard facts by monitoring the next few weeks. Great! (beware of irony)
My approach for question number 1:
Using perfmon.exe I plan to monitor
- **Processor\% Processor Time** (_total and single cores),
- **Processor\% User Time** (_total and single cores),
- **Processor\% Interrupt Time** (_total and single cores) - is this really necessary? -, and
- **System\Processor Queue Length**.
The question is: where should I get this data from? From the node? From the SQL Server instance?
In the first case it should be easy: the first instance in question - *vsql2008ent-1\instanceX* for the sake of simplicity - is currently running on a node, let's call it *sql2008NodeA*. No other instances or servers should run on this node under normal conditions, so it should not matter where I get the data from, should it? In case of a disaster other instances will run on this node too, but we want a baseline for normal operation.
The second instance - *vsql2014stan-1\instanceY* - shares its node - *sql2014NodeA* - with 2 other instances. In this case I can never be sure how many cores the instance would truly need for smooth operation, right? I can monitor the instance, but what does the result mean? It only shows the CPU resources actually used by this instance. Would more cores have been used if they had been available? So what would the answer to the question above be?
RAM is the other question. Due to several disasters in the past, when all instances landed on the same node, I have set an upper limit on the maximum server memory for each instance. This limit depends on the available memory of the node (currently 100 GB or 120 GB respectively). So how do I monitor this? If all memory is used up, everything seems clear: insufficient memory. If everything is slow: insufficient memory. But how much memory do I really need?
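On the SQL Server side, a minimal sketch of counters that help with the RAM question (how much of max server memory is actually committed, and whether pages live long enough in the buffer pool); the counter object names carry the instance name for named instances, hence the LIKE:
SELECT counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE (object_name LIKE '%Memory Manager%'
       AND counter_name IN ('Target Server Memory (KB)', 'Total Server Memory (KB)'))
   OR (object_name LIKE '%Buffer Manager%'
       AND counter_name = 'Page life expectancy');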
I try to summarize my questions:
1. Where should I get the measures from (node vs. server)?
2. Do I need to monitor the interrupt time, if I want to know the number of cores required?
3. What should I monitor under the given circumstances to answer the question, how much RAM I need (I know: "The more the better.")?
Thank you very much for your help!
Best regards!
D.C.
(53 rep)
Feb 13, 2017, 11:27 AM
• Last activity: Jul 15, 2025, 11:08 AM
1
votes
2
answers
149
views
Upgrade the BI server to 2014 and stay on 2008 R2 for the database engine server
We would be happy to upgrade our SQL Server environments to 2014, but we have a tight budget for next year.
Our BI server (SQL Server 2008 Standard Edition, used only for BI purposes) is located separately from the production database server (SQL Server 2008 Enterprise Edition). The BI server contains our SSIS packages and SSRS reports.
I was wondering whether it's possible to have a configuration where the BI server runs SQL Server 2014 (Standard or Business Intelligence edition) while the database server itself remains on 2008.
Thanks for your answers,
Roni.
Roni Vered
(585 rep)
Dec 10, 2014, 04:18 PM
• Last activity: Jul 15, 2025, 08:06 AM
1
votes
1
answers
51
views
SQL Server 2014 Audit
I'm configuring SQL Server Audit to log all INSERT, UPDATE, DELETE, and SELECT operations. I've tested it, and it records everything I need (time, action, server principal, server instance, database name, object name, statement).
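A minimal sketch of the kind of audit configuration described here; the audit name, file path, database, and principal are placeholders:
-- Server-level audit target
CREATE SERVER AUDIT [DmlAudit] TO FILE (FILEPATH = N'D:\AuditLogs\');
ALTER SERVER AUDIT [DmlAudit] WITH (STATE = ON);
-- Database audit specification capturing SELECT and DML on the whole database
USE [MyDatabase];
CREATE DATABASE AUDIT SPECIFICATION [DmlAuditSpec]
FOR SERVER AUDIT [DmlAudit]
ADD (SELECT, INSERT, UPDATE, DELETE ON DATABASE::[MyDatabase] BY [public])
WITH (STATE = ON);
-- Read the captured events
SELECT event_time, action_id, server_principal_name, database_name, object_name, statement
FROM sys.fn_get_audit_file(N'D:\AuditLogs\*', DEFAULT, DEFAULT);
As far as I know, these audit records carry the statement text and metadata rather than the column values as they were before the change.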
The only thing missing, and that I couldn't find, is a field that stores the values before they are modified by an UPDATE or DELETE. If someone changes them, I need to know what was stored beforehand.
Could someone help me?
Thank you.

FernandoSatonni
(31 rep)
Jul 11, 2025, 05:19 PM
• Last activity: Jul 13, 2025, 03:10 PM
2
votes
1
answers
182
views
Snapshot Folder Location for Replication with Clustered Instance
I am assigned to create transactional replication from a clustered SQL Server instance for reporting purposes. I'm planning to put the Distributor and the Subscriber on the same box.
Regarding the snapshot folder, is it OK to place it on the same box where the Distributor/Subscriber resides? Will there be any problems?
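For reference, a hedged sketch of where the snapshot folder is specified when the publisher is registered at the distributor; with a clustered publisher the folder is normally given as a UNC share so the replication agents can reach it from any node. The server names and path are placeholders:
-- Run at the distributor, after sp_adddistributor / sp_adddistributiondb
EXEC sp_adddistpublisher
     @publisher         = N'ClusteredPublisherInstance',
     @distribution_db   = N'distribution',
     @working_directory = N'\\DistributorBox\ReplData';   -- the snapshot folder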
SQL_NoExpert
(1117 rep)
Sep 11, 2018, 02:24 PM
• Last activity: Jul 6, 2025, 01:06 AM
0
votes
2
answers
212
views
Three tables with some matching data and some not - want data from all three
I have three tables
Costs (47 rows)
Budgets (60 rows)
Committed Costs (12 rows)
All three have the same two fields plus a number:
JobNumber, CostCode, (either costs, budget, or commitedcosts)
They all have JobNumber in common and they all have CostCode in common. However, each also has unique CostCodes. I need to run a query that would get me results from all three tables like this:
JobNumber, CostCode, Costs, Budgets, CommitedCosts
Currently I am using outer joins on the three tables and only getting 47 results, since the Costs table is the first table selected from.
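A minimal sketch of a FULL OUTER JOIN that keeps rows that exist in any of the three tables; the table and column names are taken from the description above (adjust CommittedCosts if the table name really contains a space), and it assumes one row per JobNumber/CostCode in each table:
SELECT  COALESCE(c.JobNumber, b.JobNumber, cc.JobNumber) AS JobNumber,
        COALESCE(c.CostCode,  b.CostCode,  cc.CostCode)  AS CostCode,
        c.Costs,
        b.Budgets,
        cc.CommittedCosts
FROM Costs AS c
FULL OUTER JOIN Budgets AS b
       ON  b.JobNumber = c.JobNumber
       AND b.CostCode  = c.CostCode
FULL OUTER JOIN CommittedCosts AS cc
       ON  cc.JobNumber = COALESCE(c.JobNumber, b.JobNumber)
       AND cc.CostCode  = COALESCE(c.CostCode,  b.CostCode);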
Patrick Bergmann
(1 rep)
Feb 22, 2019, 07:59 PM
• Last activity: Jun 27, 2025, 05:03 PM
1
votes
2
answers
197
views
SQL Server Detailed Backup Report
I need to create a script that gives the report for the following information:
- Backup Job Name
- Databases that have been backed up in the last 24 hours
- Size of each DB backup
- Start time of backup
- End time of backup
- Duration of backup
- Location of the backup files
I still need to include log backup information, the backup size, and the location where the backup files are stored. Could someone point me in the right direction to improve my query? Thanks in advance.
What I have tried:
DECLARE @dbname sysname
SET @dbname = NULL --set this to be whatever dbname you want
SELECT
bup.database_name AS [Database],
bup.server_name AS [Server],
bup.backup_start_date AS [Backup Started],
bup.backup_finish_date AS [Backup Finished]
,CAST((CAST(DATEDIFF(s, bup.backup_start_date, bup.backup_finish_date) AS int))/3600 AS varchar) + ' hours, '
+ CAST((CAST(DATEDIFF(s, bup.backup_start_date, bup.backup_finish_date) AS int))/60 AS varchar)+ ' minutes, '
+ CAST((CAST(DATEDIFF(s, bup.backup_start_date, bup.backup_finish_date) AS int))%60 AS varchar)+ ' seconds'
AS [Total Time]
FROM msdb.dbo.backupset bup
WHERE bup.backup_set_id IN
(SELECT MAX(backup_set_id)
FROM msdb.dbo.backupset
WHERE database_name = ISNULL(@dbname, database_name) --if no dbname, then return all
AND type = 'D' --only interested in the time of last full backup
GROUP BY database_name)
/* COMMENT THE NEXT LINE IF YOU WANT ALL BACKUP HISTORY */
AND bup.database_name IN (SELECT name FROM master.dbo.sysdatabases)
ORDER BY bup.database_name
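For the backup size and the file location, msdb.dbo.backupset can be joined to msdb.dbo.backupmediafamily; a minimal sketch of the extra columns, restricted to the last 24 hours, that can be merged into the query above:
SELECT  bup.database_name,
        bup.type,                                              -- D = full, I = differential, L = log
        bup.backup_start_date,
        bup.backup_finish_date,
        CAST(bup.backup_size / 1048576.0 AS decimal(18, 2)) AS backup_size_mb,
        bmf.physical_device_name                               -- where the backup file was written
FROM msdb.dbo.backupset AS bup
JOIN msdb.dbo.backupmediafamily AS bmf
      ON bmf.media_set_id = bup.media_set_id
WHERE bup.backup_start_date >= DATEADD(HOUR, -24, GETDATE())
ORDER BY bup.database_name, bup.backup_start_date;
The backup job name itself is not recorded in backupset; it would have to come from the Agent history tables (msdb.dbo.sysjobs / sysjobhistory).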
sqllover2020
(73 rep)
Dec 23, 2020, 09:47 PM
• Last activity: Jun 25, 2025, 07:04 PM
1
votes
1
answers
217
views
SQL Server 2014 - UPSERT XML Multiple rows
We have an input like this
334
17.10
marc58
Marc Juppé
Rue
Paris
FR
333333333
22222
4.50
3664
47
MCPU DDA010
mmx
1111
1
6.2
3665
45
MCPU DFZ42
mmy
1112
2
3.2
and we are currently importing data into Sales Table with a query like this:
INSERT INTO Sales (ItemID,Store,Title,SKU,Quantity,Price,OrderID,AmountPaid,UserID,ShippingCosts)
SELECT I.X.value('(Details/ItemID/text())[1]', 'int') as ItemID,
I.X.value('(Details/Store/text())[1]', 'int') as Store,
I.X.value('(Details/Title/text())[1]', 'nvarchar(100)') as Title,
I.X.value('(Details/SKU/text())[1]', 'nvarchar(100)') as SKU,
I.X.value('(Quantity/text())[1]', 'int') as Quantity,
I.X.value('(Price/text())[1]', 'decimal(11,2)') as Price,
O.X.value('(OrderID/text())[1]', 'int') as OrderID,
O.X.value('(AmountPaid/text())[1]', 'decimal(11,2)') as AmountPaid,
O.X.value('(UserID/text())[1]', 'nvarchar(100)') as UserID,
O.X.value('(ShippingCosts/text())[1]', 'decimal(11,2)') as ShippingCosts
FROM @XML.nodes('/Orders/Order') as O(X)
CROSS APPLY O.X.nodes('Lines/Line') as I(X)
but now we need to update records that already exist, checking Lines/Line/LineID.
Since we do not want to use MERGE, we thought it would be easy to solve by adding the usual
IF NOT EXISTS (SELECT 1 FROM Sales WHERE LineID = @LineID)
INSERT INTO Sales () SELECT ....
ELSE
UPDATE Sales SET ItemID = ..., Store = ..., ... WHERE LineID = @LineID
but the value of @LineID is inside the XML and has to be calculated while parsing the XML, and we discovered we have no idea how to get it.
Can you suggest how to get the value of LineID while parsing the XML?
Thanks
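A minimal sketch of one way to do it: shred the XML (including LineID) into a table variable first, then run a separate UPDATE and INSERT against Sales keyed on LineID. It assumes LineID is an element under Line, alongside Details, and that Sales has a LineID column, as the EXISTS check above implies:
DECLARE @Staged TABLE (LineID int PRIMARY KEY, ItemID int, Store int, Title nvarchar(100),
                       SKU nvarchar(100), Quantity int, Price decimal(11,2),
                       OrderID int, AmountPaid decimal(11,2), UserID nvarchar(100),
                       ShippingCosts decimal(11,2));

INSERT INTO @Staged
SELECT I.X.value('(LineID/text())[1]',          'int'),
       I.X.value('(Details/ItemID/text())[1]',  'int'),
       I.X.value('(Details/Store/text())[1]',   'int'),
       I.X.value('(Details/Title/text())[1]',   'nvarchar(100)'),
       I.X.value('(Details/SKU/text())[1]',     'nvarchar(100)'),
       I.X.value('(Quantity/text())[1]',        'int'),
       I.X.value('(Price/text())[1]',           'decimal(11,2)'),
       O.X.value('(OrderID/text())[1]',         'int'),
       O.X.value('(AmountPaid/text())[1]',      'decimal(11,2)'),
       O.X.value('(UserID/text())[1]',          'nvarchar(100)'),
       O.X.value('(ShippingCosts/text())[1]',   'decimal(11,2)')
FROM @XML.nodes('/Orders/Order') AS O(X)
CROSS APPLY O.X.nodes('Lines/Line') AS I(X);

-- existing rows: update in place
UPDATE s SET ItemID = t.ItemID, Store = t.Store, Title = t.Title, SKU = t.SKU,
             Quantity = t.Quantity, Price = t.Price, OrderID = t.OrderID,
             AmountPaid = t.AmountPaid, UserID = t.UserID, ShippingCosts = t.ShippingCosts
FROM Sales AS s JOIN @Staged AS t ON t.LineID = s.LineID;

-- new rows: insert the rest
INSERT INTO Sales (LineID, ItemID, Store, Title, SKU, Quantity, Price, OrderID, AmountPaid, UserID, ShippingCosts)
SELECT t.LineID, t.ItemID, t.Store, t.Title, t.SKU, t.Quantity, t.Price, t.OrderID, t.AmountPaid, t.UserID, t.ShippingCosts
FROM @Staged AS t
WHERE NOT EXISTS (SELECT 1 FROM Sales AS s WHERE s.LineID = t.LineID);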
Joe
(369 rep)
Oct 19, 2017, 06:53 PM
• Last activity: Jun 24, 2025, 02:04 AM
0
votes
1
answers
196
views
MSX Job deployment: Not creating job on certain servers
I am deploying a new index maintenance job through MSX to about 100 servers. I am in the process of manually checking that each job has been created, and I have already discovered that, for some currently unknown reason, specific servers are not receiving the new MSX job. Strangely, these s...

Asher
(313 rep)
May 10, 2019, 06:53 AM
• Last activity: Jun 19, 2025, 08:05 AM
0
votes
1
answers
222
views
Figuring out if SQL Server database needs to be defragmented
I am tasked with determining whether a database in SQL Server 2014 needs to be defragmented. I ran DBCC SHOWCONTIG but am having trouble interpreting the results.
use databasename;
GO
DBCC SHOWCONTIG ('Table');
GO
The results are:
DBCC SHOWCONTIG scanning 'Table' table...
Table: 'Table' (802206008); index ID: 1, database ID: 20
TABLE level scan performed.
- Pages Scanned................................: 2239
- Extents Scanned..............................: 280
- Extent Switches..............................: 279
- Avg. Pages per Extent........................: 8.0
- Scan Density [Best Count:Actual Count].......: 100.00% [280:280]
- Logical Scan Fragmentation ..................: 0.00%
- Extent Scan Fragmentation ...................: 31.79%
- Avg. Bytes Free per Page.....................: 366.8
- Avg. Page Density (full).....................: 95.47%
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
Questions:
1. What do the percentages above mean (e.g. the difference between Logical Scan Fragmentation and Extent Scan Fragmentation)?
2. Which percentage do I need to check, and what value indicates that the table or index needs to be defragmented?
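DBCC SHOWCONTIG is deprecated; the same information, including the percentage normally acted on, is available from sys.dm_db_index_physical_stats. A minimal sketch, with the table name as a placeholder:
SELECT  OBJECT_NAME(ips.[object_id])        AS table_name,
        i.name                              AS index_name,
        ips.index_type_desc,
        ips.avg_fragmentation_in_percent,   -- logical fragmentation
        ips.avg_page_space_used_in_percent, -- page density (NULL in 'LIMITED' mode)
        ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID(N'dbo.YourTable'), NULL, NULL, 'SAMPLED') AS ips
JOIN sys.indexes AS i
      ON i.[object_id] = ips.[object_id] AND i.index_id = ips.index_id;
-- a common rule of thumb: REORGANIZE between roughly 5% and 30% fragmentation, REBUILD above 30%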
cdub
(103 rep)
Jun 26, 2017, 09:08 PM
• Last activity: Jun 11, 2025, 02:04 PM
1
votes
1
answers
2049
views
How to convert a VARCHAR DD/MM/YYYY H:MM:SS AM into datetime and then to a JSON date?
So I know this question is crazy. I just spent days trying to solve this, but I think it's impossible.
I have a table with a CreatedDate column stored as a VARCHAR field:
10/7/2019 7:12:59 AM
What I need to do is convert this date into the kind of date we see in MongoDB or JSON files. It should look like this:
2019-07-04T15:35:41.958UTC-03:00
Is there a way to do this? I tried converting to datetime, I tried substring, left, right. I can force the -03:00 at the end.
SELECT CONVERT(VARCHAR(33), getdate(), 126)
With this I can convert a datetime field, and that's the problem: my column is a VARCHAR.
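A minimal sketch of the two-step conversion, assuming the string is month/day/year (style 101; use 103 for day/month/year) and that the -03:00 offset is a known constant rather than stored data:
DECLARE @created varchar(30) = '10/7/2019 7:12:59 AM';

-- step 1: varchar -> datetime (style 101 = mm/dd/yyyy; use 103 for dd/mm/yyyy)
DECLARE @dt datetime = CONVERT(datetime, @created, 101);

-- step 2: datetime -> ISO 8601 text with the fixed offset appended
SELECT CONVERT(varchar(33), @dt, 126) + '-03:00' AS json_style;

-- or carry the offset in the type itself
SELECT CONVERT(varchar(33), TODATETIMEOFFSET(@dt, '-03:00'), 126) AS json_style_with_offset;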
Racer SQL
(7546 rep)
Nov 11, 2019, 02:45 PM
• Last activity: Jun 10, 2025, 05:04 AM
2
votes
1
answers
693
views
Missing Oracle Schema in SSMA Object Explorer
We are currently working on a project to move all of our disparate DBs (MySQL, Oracle, older versions of MS SQL) to SQL Server 2014. So far pretty much everything has gone smoothly; however, there is one Oracle database (10g 10.2.0.5) that we're trying to migrate using SSMA, and it just doesn't show up in the SSMA object explorer.
I have tried connecting to this database using the schema owner, the sys AND the system users and it just never shows up. ALL the other schemas on this instance show up and we can migrate them successfully.
Has anyone come across this issue before? Could anyone suggest possible reasons for this? I - and the SQL Server DBA - am stumped. In the interim, we're migrating subsets of data using smacked-together Perl scripts, but this is obviously not ideal.
Francois
(96 rep)
Sep 18, 2015, 11:29 AM
• Last activity: Jun 8, 2025, 01:09 AM
1
votes
1
answers
226
views
How to verify DBCC CHECKDB
I am logging the DBCC CHECKDB job output to a table.
When I look at the job log using the command below, I want to verify that all is well.
use msdb
EXEC dbo.sp_help_jobsteplog
@job_name = N'DatabaseIntegrityCheck - USER_DATABASES' ;
Would you say that we are good if we see "Outcome: Succeeded" in the log for the job, or is there something else we should also review?
We are using Ola's job output, but I am also logging it to a table and am just wondering about the output. If we see "Outcome: Succeeded" for the job, is this enough to know that CHECKDB found nothing wrong?
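"Outcome: Succeeded" on the job step is the usual signal (with Ola's DatabaseIntegrityCheck, a CHECKDB error should make the command fail and the step report failure). An additional sanity check is the last-known-good date SQL Server stamps on the database boot page after a clean CHECKDB; a sketch using the undocumented but widely used DBCC DBINFO:
-- Undocumented but widely used; WITH TABLERESULTS returns a rowset
DBCC DBINFO ('YourDatabase') WITH TABLERESULTS;
-- In the output, the row where Field = 'dbi_dbccLastKnownGood' is the finish time
-- of the last CHECKDB that completed without finding corruption.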
cspell
(511 rep)
Aug 31, 2017, 02:19 PM
• Last activity: Jun 7, 2025, 09:04 PM
0
votes
1
answers
495
views
Is it possible for a node to host multiple clusters in relation to Always On AGs

In my example, I have 3 database servers, each with its own DR server, its own availability group, and its own cluster. Is it possible to set up another server which can be part of the 3 clusters and be a secondary replica on each of them?
Edgar Allan Bayron
(1350 rep)
Jul 24, 2019, 04:25 AM
• Last activity: Jun 7, 2025, 03:09 PM
0
votes
1
answers
231
views
Running select filename from dbo.sysfiles returns short foldernames with tilde
I am running the following statement in SQL Server 2014 (on Windows Server 2012 R2):
select filename from dbo.sysfiles where fileid = 1
However it returns:
c:\progra~1\micros~1\mssql1~1.sql\mssql\data\master~1.mdf
I wish for the full path without the tilde (e.g. C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\master.mdf).
I tried upgrading the SQL Server 2014 Express to 2016 Express but it made no difference.
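dbo.sysfiles is a SQL Server 2000-era compatibility view; the current catalog views are sys.database_files and sys.master_files. A minimal sketch is below. Note that if the 8.3 short path is what was recorded when the instance was installed, even these can return the tilde form, in which case the long name has to be resolved outside SQL Server at the file-system level:
-- Current database
SELECT file_id, physical_name
FROM sys.database_files
WHERE file_id = 1;

-- Any database, server-wide
SELECT DB_NAME(database_id) AS database_name, file_id, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('master') AND file_id = 1;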
Any ideas?
Thanks.
Dave
dave
(1 rep)
Feb 8, 2021, 09:45 PM
• Last activity: Jun 6, 2025, 11:05 AM
2
votes
1
answers
1397
views
PowerShell problem when running on a remote server - access is denied
As a follow-up to the question below:
Run a PowerShell script on a different server - from inside a SQL Server job
There you can see that, for me to run a PowerShell script from inside a SQL Server job, I need to create a proxy.
Both the PowerShell script and the proxy script are in that link.
What the PowerShell script does is delete old backup files on a remote server.
The question that I still have is:
For the whole thing to work on Windows Server 2012, I could find no other way than granting Administrator on the remote box to the proxy account, as you can see in the picture below.
Another alternative was to create a PowerShell script that does both backups and deletes, and create a SQL Server job with the following code:
Is this a PowerShell limitation when running on a remote server?
Are there other ways to work around it, without having to grant admin to the proxy account?
PowerShell Basics: Managing Remote Servers & Sessions


Marcello Miorelli
(17274 rep)
Sep 16, 2015, 10:54 AM
• Last activity: Jun 5, 2025, 04:05 PM