Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
5
votes
1
answers
660
views
Reinitialize Table Values in SQL SSDT Unit Testing
I am creating SQL Server Unit Tests. We are testing various stored procedures.
In unit-testing principles, it is good practice to set up a small database table, populate values, tear down (truncate/delete) the tables, and set up again for each test. This way every unit test will have a clean environment to validate sprocs which insert, select, update, delete, etc.
Does anyone know where or how to reinitialize the table values in SQL Server unit testing? Resources are pretty scarce for unit testing in SQL SSDT VS 2017, so I think a lot of people are trying to figure it out.
Feel free to show or add pictures below.
http://www.sqlservercentral.com/articles/Unit+Testing/155651/
http://www.erikhudzik.com/2017/08/23/writing-sql-server-unit-tests-using-visual-studio-nunit-and-sqltest/
Pictures in Visual Studio SSDT:
Also, trying to review this class in SQLDatabaseSetup.cs:
using Microsoft.Data.Tools.Schema.Sql.UnitTesting;

[TestClass()]
public class SqlDatabaseSetup
{
    [AssemblyInitialize()]
    public static void InitializeAssembly(TestContext ctx)
    {
        // Set up the test database based on settings in the
        // configuration file
        SqlDatabaseTestClass.TestService.DeployDatabaseProject();
        SqlDatabaseTestClass.TestService.GenerateData();
    }
}
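One common approach is to put the reset logic in the test class's Pre-Test script so every test starts from a known state. A sketch (the table and column names below are placeholders, not from the question):

```sql
-- Hypothetical Pre-Test script: reset placeholder tables before each test.
DELETE FROM dbo.OrderLines;   -- DELETE (child first) instead of TRUNCATE when FKs exist
DELETE FROM dbo.Orders;

-- Re-seed known values so every test sees the same data.
SET IDENTITY_INSERT dbo.Orders ON;
INSERT INTO dbo.Orders (OrderID, CustomerName) VALUES (1, N'Test Customer');
SET IDENTITY_INSERT dbo.Orders OFF;
```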

user162241
Oct 24, 2018, 05:09 AM
• Last activity: Aug 4, 2025, 09:09 AM
2
votes
1
answers
17509
views
Is there any way to clear all tables using Liquibase?
I'm currently using Liquibase for creating test data in my database before each test run.
I came across a problem where some of the test cases need special data, while others need the tables to be empty.
So I decided to give Liquibase contexts a try and only insert the data I really need for the current test case, plus - in an abstract superclass - all the data every test case needs (user accounts and stuff)...
I would like to clear the tables before each run, however I can't seem to find anything for this case except dropAll(), which seems to be overkill...
I'm not quite sure if it is even possible - because of foreign keys and other constraints - but I wonder if there is a way to delete the content of ALL the tables without dropping them completely?
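For what it's worth, on PostgreSQL the constraint problem can be sidestepped in a single statement, which a Liquibase changeSet could run as raw SQL (table names below are placeholders):

```sql
-- Sketch (PostgreSQL): empty every table in one statement, FKs handled by CASCADE.
-- In Liquibase this could live in a <sql> changeSet guarded by a test context.
TRUNCATE TABLE users, orders, order_items RESTART IDENTITY CASCADE;
```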
gilaras
(123 rep)
Apr 3, 2012, 06:36 AM
• Last activity: Sep 26, 2023, 03:45 PM
0
votes
1
answers
112
views
How to handle an expected exception in base postgres
I want to be able to write a .sql script that will carry out an action which fails, and the script should only report failure if the action _doesn't_ fail.
For example, given this initial table:
create table tbl(x integer, y integer);
I might update this table in a migration with the following:
alter table tbl add constraint unique_tst unique (x, y);
And want to write a test script for that migration, which will be similar to the following:
insert into tbl(x, y) values
(1, 1),
(1, 1)
;
This will fail - which is expected given the constraint - but I'm not sure how to handle that failure in postgres.
Something such as:
if does not fail:
insert into tbl(x, y) values
(1, 1),
(1, 1)
;
return:
failure
But I have no idea if this exists.
Note - I cannot install any extensions for this.
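One extension-free possibility (a sketch, not from the question) is a DO block that traps the expected error and raises only when the error does not occur:

```sql
DO $$
BEGIN
    INSERT INTO tbl(x, y) VALUES (1, 1), (1, 1);
    -- If we get here, the constraint did NOT fire: report failure.
    RAISE EXCEPTION 'expected unique_violation, but insert succeeded';
EXCEPTION
    WHEN unique_violation THEN
        NULL;  -- the expected failure: swallow it, the script succeeds
END
$$;
```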
baxx
(326 rep)
Feb 9, 2023, 11:09 AM
• Last activity: Feb 9, 2023, 06:04 PM
5
votes
2
answers
6314
views
DROP DATABASE statement cannot be used inside a user transaction
Not really sure if this question belongs here, but I hope someone could help me out.
I've made integration tests going all the way down to the database (using mssql localDB). I want each test to run independently with its own data - I want to reseed the database with my fake data before each test runs. I tried to implement it with transactions without success. Here is how I tried to pull it off:
public class TestDbInitializer : DropCreateAlways<MyContext>
{
    public static List<Item> Items;
    public override void Seed(MyContext context)
    {
        Items = new List<Item>();
        // Adding items
        // ..
        Items.ForEach(x => context.Items.Add(x));
        context.SaveChanges();
    }
}
public class BaseTransactionsTests
{
    private TransactionScope _scope;
    [TestInitialize]
    public void Initialize()
    {
        _scope = new TransactionScope();
    }
    [TestCleanup]
    public void Cleanup()
    {
        _scope.Dispose();
    }
}
[TestClass]
public class IntegrationTests : BaseTransactionsTests
{
    private static IDependenciesContainer _container;
    [AssemblyInitialize]
    public static void AssemblyInit(TestContext context)
    {
        Database.SetInitializer(new TestDbInitializer());
        _container = new DependenciesContainer();
        // Registers all my application's dependencies
        _container.RegisterAll();
    }
    [TestInitialize]
    public void Initialize()
    {
        using (var context = new MyContext("TestsDatabase"))
        {
            context.Database.Initialize(true);
        }
    }
    [TestMethod]
    public void TestAddItem()
    {
        var controller = _container.Resolve<ItemsController>();
        var result = controller.AddItem(new Item { Name = "Test" });
        var goodResult = result as OkNegotiatedContentResult<Item>;
        if (goodResult == null)
            Assert.Fail("Bad result");
        using (var context = new MyContext("TestsDatabase"))
        {
            Assert.AreEqual(context.Items.Count(), TestDbInitializer.Items.Count + 1);
        }
    }
}
I use my dependency injector in my tests, registering all dependencies once (AssemblyInitialize).
I created a DB instance for testing, and a specific DropCreateAlways initializer with a fake-data Seed method, which I set as the initializer in the AssemblyInitialize as well.
I want to reseed the database with my fake data before each test run. For that case I implemented the base class which holds a transaction scope.
When I run my tests, the following exception is thrown when Seeding the database in the TestInitialize:
DROP DATABASE statement cannot be used inside a user transaction
How should I deal with it? Moreover, what do you think of my implementation of those integration tests? What could be improved?
S. Peter
(185 rep)
Apr 5, 2016, 05:22 PM
• Last activity: Jan 1, 2023, 09:45 AM
1
votes
0
answers
321
views
Is there a utPLSQL way to mock procedures, functions, or tables?
In tSQLt, you can isolate dependencies for unit testing using procedures like FakeTable to create a dummy version of whatever your test subject depends on, that contains only data you specify. This is just like using Justmock or Moq in .NET unit testing.
The only thing remotely close to this that I've seen for utPLSQL is copying the dependencies to a temp version in the setup procedure, creating the dummy version manually, and then replacing it with the temp version in the teardown, all using EXECUTE IMMEDIATE.
Is there a more convenient way to do this, short of just creating a procedure to do this myself?
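For reference, the rename-and-replace workaround described above looks roughly like this (a sketch; the object names are placeholders):

```sql
-- Setup: stash the real table and create a stub with only the data the test needs.
BEGIN
  EXECUTE IMMEDIATE 'ALTER TABLE dep_table RENAME TO dep_table_backup';
  EXECUTE IMMEDIATE 'CREATE TABLE dep_table AS
                     SELECT * FROM dep_table_backup WHERE 1 = 0';
  EXECUTE IMMEDIATE 'INSERT INTO dep_table (id, val) VALUES (1, ''stub'')';
END;
/
-- Teardown: drop the stub and restore the original.
BEGIN
  EXECUTE IMMEDIATE 'DROP TABLE dep_table';
  EXECUTE IMMEDIATE 'ALTER TABLE dep_table_backup RENAME TO dep_table';
END;
/
```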
JAF
(163 rep)
Dec 14, 2022, 12:05 AM
0
votes
2
answers
429
views
MariaDB 10.6 InnoDB setttings for unit-tests: maximize performance at the expense of persistence
I have a **MariaDB 10.6**; the db is used for phpunit tests.
The database is dropped and recreated each time I run the unit tests.
I would like to maximize the speed of the interactions (read/write) with this db, while I totally don't care about persistence.
I was thinking of converting all tables to the MEMORY engine. I can't find detailed info about this engine in the [mariadb docs](https://mariadb.com/kb/en/memory-storage-engine/), but the [mysql docs](https://dev.mysql.com/doc/refman/8.0/en/memory-storage-engine.html) say that TEXT/BLOB fields are not supported, nor are transactions. So, no way.
I was thinking of converting the tables to TEMPORARY tables, but this was based on the false belief that they are stored in RAM.
I came across the idea of [running InnoDB on RAM](https://ultrahigh.org/2013/03/running-mysqlinnodb-in-memory-for-unit-tests/) by setting --datadir=/dev/shm/mysql/, but elsewhere I read that this is not a good idea because MariaDB won't be able to handle a reboot.
So I think that the best thing to do is to tweak InnoDB settings to maximize performance at the expense of persistence.
I saw [this answer](https://dba.stackexchange.com/questions/167883/optimizing-mysql-5-6-for-unit-tests) but it refers to MySQL 5.6, and a lot has changed in InnoDB since that release.
What settings do you recommend?
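A commonly cited starting point trades durability for speed along these lines (a sketch; verify each option against the MariaDB 10.6 docs before relying on it):

```ini
[mysqld]
innodb_flush_log_at_trx_commit = 0  # don't flush the redo log on every commit
innodb_doublewrite = 0              # skip the doublewrite buffer
sync_binlog = 0                     # no binlog fsync per commit
skip-log-bin                        # or drop the binary log entirely on a test-only server
```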
nulll
(151 rep)
Aug 17, 2022, 07:07 AM
• Last activity: Aug 20, 2022, 05:19 PM
11
votes
2
answers
1195
views
How can I fake inet_client_addr() for unit tests in PostgreSQL?
I have a simple stored procedure whose return value depends on the value of inet_client_addr(). How can I override inet_client_addr() for the purpose of unit tests when testing my stored procedure?
The only solution I've come up with so far is to create a wrapper function around inet_client_addr():
CREATE FUNCTION my_inet_client_addr() RETURNS INET AS $$
SELECT inet_client_addr();
$$ LANGUAGE sql;
Then use that in my function:
CREATE FUNCTION local_connection() RETURNS BOOLEAN AS $$
SELECT my_inet_client_addr() = '127.0.0.1';
$$ LANGUAGE sql;
Then in my unit test, I can re-define my_inet_client_addr():
BEGIN;
SELECT plan(2);
CREATE OR REPLACE FUNCTION my_inet_client_addr() RETURNS INET AS $$
SELECT '127.0.0.1'::INET;
$$ LANGUAGE sql;
SELECT is(local_connection(), TRUE, 'Connection from 127.0.0.1 is local');
CREATE OR REPLACE FUNCTION my_inet_client_addr() RETURNS INET AS $$
SELECT '192.168.1.1'::INET;
$$ LANGUAGE sql;
SELECT is(local_connection(), FALSE, 'Connection from 192.168.1.1 is not local');
ROLLBACK;
Is there any way to accomplish the same without the wrapper function my_inet_client_addr()?
Flimzy
(609 rep)
Jul 8, 2014, 08:02 PM
• Last activity: May 30, 2022, 03:34 PM
1
votes
1
answers
923
views
truncate table in mysql 8.0 is slower than older version
I have a few hundred tests which use MySQL. I was using MySQL 5.7 (Percona) and I changed it to MySQL 8.0. My problem is that before every test I truncate ALL my tables. This was fine with 5.7, but MySQL 8.0 takes 5 seconds to truncate all 60 tables.
This is the mysql config which I use for both versions:
[mysqld]
skip-external-locking
key_buffer_size = 32M
max_allowed_packet = 1G
thread_stack = 512K
thread_cache_size = 8
expire_logs_days = 2
max_binlog_size = 100M
max_connections = 1200
tmp_table_size = 512M
max_heap_table_size = 512M
skip-name-resolve
innodb_buffer_pool_size=256M
innodb_log_buffer_size=32M
innodb_log_file_size=64M
innodb_flush_method=O_DIRECT
innodb_thread_concurrency=64
innodb_write_io_threads=32
innodb_read_io_threads=32
innodb_flush_log_at_trx_commit = 2
wait_timeout = 1800
interactive_timeout = 1800
innodb_file_per_table = 1
P.S.: It's not just TRUNCATE; CREATE TABLE and ALTER TABLE also take a lot more time compared to the previous version.
P.S. 2: I tested this both on my system and on GitHub Actions. Same result for both.
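One workaround sometimes used for per-test cleanup (a sketch; whether it beats TRUNCATE depends on table sizes, since TRUNCATE is DDL while DELETE is plain DML) is a single transaction of DELETEs:

```sql
-- Tables here are placeholders for the ~60 test tables.
SET FOREIGN_KEY_CHECKS = 0;
START TRANSACTION;
DELETE FROM table_a;
DELETE FROM table_b;
-- ...one DELETE per table...
COMMIT;
SET FOREIGN_KEY_CHECKS = 1;
```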
Hassan Khodadadeh
(11 rep)
May 21, 2021, 02:35 PM
• Last activity: May 22, 2021, 10:17 PM
2
votes
2
answers
1097
views
Are SQL unit tests supposed to be so long?
I am writing stored procedures with some non-trivial business logic. I am trying to unit test them, but the actual tests end up being quite long (shorter ones starting at 40-50 LoCs, using commonly 4 different tables), which doesn't seem very "unit". (Admittedly, I format my code in a way where it takes a lot of space.)
In context of "normal" programming languages I've heard the advice to refactor the complex procedure into smaller chunks. But I don't want to do that here because:
1. I don't want to pollute "global namespace" by small routines called from one place only.
2. Passing around tables from and to stored procedures is cumbersome.
3. Custom functions can have negative effects on performance.
Am I wrong about this reasoning?
I am new to unit testing, so perhaps I am just writing my tests wrong?
Is SQL a long-winded language, and are its unit tests thus longer as well?
***
(I am using SQL Server with tSQLt framework, but I believe the question is system-agnostic.)
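For comparison, tSQLt's FakeTable can shrink the setup portion considerably, since a test only has to populate the columns it actually touches (the test class, table, and function names below are placeholders):

```sql
-- Assumes a test class created beforehand with: EXEC tSQLt.NewTestClass 'testOrders';
CREATE PROCEDURE testOrders.[test total is summed per customer]
AS
BEGIN
    -- Replace the real table with an empty, constraint-free copy.
    EXEC tSQLt.FakeTable 'dbo.Orders';

    INSERT INTO dbo.Orders (CustomerID, Amount) VALUES (1, 10), (1, 15);

    DECLARE @actual MONEY = dbo.CustomerTotal(1);

    EXEC tSQLt.AssertEquals @Expected = 25, @Actual = @actual;
END;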
IvanLoginov
(103 rep)
Mar 28, 2021, 08:10 AM
• Last activity: Apr 7, 2021, 02:54 PM
6
votes
3
answers
19710
views
EXPORT specific columns to INSERT statements
Is there a way to export specific columns and a fixed number of rows from a table to INSERT statements?
I am trying to generate these for unit testing based on a large production table so using something like "Generate Scripts" in SQL Server Management Studio is out of the question.
There are a few tools that might be able to do this, but something so trivial in MySQL should not be that complex in SQL Server (at least I would hope not).
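One quick-and-dirty approach in SQL Server (a sketch; the table and column names are placeholders) is to build the INSERT statements with string concatenation and TOP:

```sql
-- Emit INSERT statements for 100 rows and two chosen columns.
SELECT TOP (100)
       'INSERT INTO dbo.Target (Id, Name) VALUES ('
       + CAST(Id AS VARCHAR(20)) + ', N'''
       + REPLACE(Name, '''', '''''') + ''');'   -- escape embedded quotes
FROM dbo.BigProductionTable
ORDER BY Id;
```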
Robin
(333 rep)
Jun 13, 2013, 11:29 AM
• Last activity: Feb 9, 2021, 03:15 PM
0
votes
1
answers
33
views
How to test a function that selects from real tables?
I've got a mega function that everybody is afraid to modify. So each fix adds a dozen new IF lines. To me it feels that this function screams for proper testing.
Instead of the real function I'm including a really simplified example, because I believe that the crux of the challenge is that it's not only a function of its explicit parameters, but also of the state of some of the actual tables in the database.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER FUNCTION [statistic].[daysByProject](
@DateFrom DATE,
@DateTo DATE
)
returns
@ret table(ProjectID INT, IsFine TINYINT, Days smallint)
BEGIN
insert into @ret(ProjectID, IsFine, Days)
select ProjectID, IsFine, Days
from dbo.SomeRealTable
RETURN
END
Is it possible to add tests to such a function? I've even heard of including tests in a transaction, apparently something akin to this
/* alter function */
/**
* Set up test case
* Run
* Raise error if test fails
* Repeat with all test cases
*/
/* Roll back on error */
Could something like that be possible so that (unless explicitly working around this) developers could only modify the function in a way that satisfies the requirements of test cases?
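A minimal version of that transaction pattern, using the simplified function above (the seeded values and dates are placeholders), could look like:

```sql
BEGIN TRANSACTION;

-- Set up test case: known state in the dependency table.
DELETE FROM dbo.SomeRealTable;
INSERT INTO dbo.SomeRealTable (ProjectID, IsFine, Days) VALUES (1, 1, 10);

-- Run, and raise an error if the test fails.
IF NOT EXISTS (SELECT 1
               FROM [statistic].[daysByProject]('2020-01-01', '2020-12-31')
               WHERE ProjectID = 1 AND Days = 10)
    RAISERROR ('daysByProject: expected project 1 with 10 days', 16, 1);

-- Repeat with all test cases ...

ROLLBACK;  -- leave the real table untouched
```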
Džuris
(121 rep)
Dec 1, 2020, 07:39 AM
• Last activity: Dec 9, 2020, 08:52 AM
7
votes
1
answers
1968
views
Test Driven Design for postgres procedures
I am looking to introduce a [test driven design](https://www.red-gate.com/simple-talk/sql/sql-training/test-driven-database-development-why-bother/) style to writing my stored procedures for a postgresql database implementation.
I have seen that [pgTap](https://github.com/theory/pgtap) is a popular unit testing tool for postgres and correctly allows for the tests to be written in SQL and not built and run externally.
In order to go the full distance I would also like to be able to use test doubles (stub, mock, fake, dummy) and maybe even some IDE support for running and refactoring (something like [tSQLt](http://tsqlt.org/)).
Are there currently tools and resources available specifically for supporting TDD in postgres or do people roll their own by building fake tables and data and wrapping everything in transactions that roll back on completion?
What are currently the best practices for approaching TDD in postgres?
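For context, a pgTap test already follows the wrap-everything-in-a-rolled-back-transaction shape mentioned above (the function name and expected value are placeholders):

```sql
BEGIN;
SELECT plan(2);

SELECT has_function('my_proc');
SELECT is(my_proc(1), 42, 'my_proc(1) returns 42');

SELECT * FROM finish();
ROLLBACK;  -- nothing the test created survives
```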
Russell Ormes
(193 rep)
Apr 30, 2018, 05:08 PM
• Last activity: Oct 10, 2020, 02:04 AM
1
votes
0
answers
378
views
Unit Testing ORDS - Oracle Rest Data Services
I've been wanting to develop some unit tests for my ORDS functions as part of my push for better CI/CD - I am running ORDS functions on an Oracle Autonomous Database 19c, and while I'm developing a testing suite for the functions I've developed, I am probably not going about it the right way.
I've done a lot of googling on the topic of specifically testing ORDS functions and there doesn't seem to be much out there that I could find, so I was hoping people on here might have some experience with this :)
I have been introduced to a couple of methods on another question (utPLSQL and a method on SQLDeveloper ) but they seem to only test the code - not the restful services, as well as requiring other setup that I have not done - I just have the code in ORDS. I also don't have all my queries in PL/SQL, some of them are in SQL such as my most basic ORDS function:
-- GET REQUEST
-- gets all client info ordered by name
select * from clients order by name asc
Ideally I would just have some code that ran through the ORDS module of my choice and told me if it passed the tests I define.
I'm currently wondering whether to have a script to:
1. Set up a database with new schema etc and populate it with dummy data
1. Run all the tests
1. Drop all the tables to get back to before
Although that sounds like a hassle, it'll also take more and more time to run, seeing as it'll be doing many REST API calls by definition, while unit tests are supposed to be over and done with very quickly.
Things I'm wanting to test are:
1. Authentication is working properly (should return a 401 error when there is no auth and when auth is incorrect)
1. Insert functions work as intended
1. Select functions work as intended - this is a key one as they are the most complicated functions we have at the minute
1. Delete functions work as intended
I'm quite new to unit testing with SQL, my previous experience being with JUnit and some Python/C++ unit testing libraries, but it does seem like there's an easier way, whether that be something that others have done in my situation, or whether there are steps I should take to make my own code more friendly to the system that already exists.
I'm also definitely open to the idea of having more than one set of tests, that run in different environments if there is a good case to do so. I'm still getting around CI/CD in general with databases so knowing where to start is a big help.
Ash Oldershaw
(121 rep)
Jul 3, 2020, 09:33 AM
• Last activity: Jul 3, 2020, 03:24 PM
44
votes
5
answers
35349
views
Unit testing of stored procedures
I've been considering this for quite a long time now.
The basic question is: how to unit test stored procedures?
I see that I can set up unit tests relatively easily for functions in the classic sense (I mean they get zero or more arguments and return a value). But if I consider a real-life example of a seemingly simple procedure inserting a row somewhere, with a few triggers doing this and that before or after the insert, even defining the boundaries of a 'unit' is quite difficult. Should I test only the INSERT itself? That's fairly straightforward, I think, with relatively low value. Should I test the result of the whole chain of events? Apart from the question whether this is a unit test or not, designing a suitable test can be quite a strenuous job, with lots of additional question marks arising on the way.
And then comes the problem of constantly changing data. In the case of an UPDATE affecting more than just a few rows, every potentially affected row must be included somehow in the test cases. Further difficulties with DELETEs and so on and so on.
So how do you unit test your stored procedures? Is there a threshold in complexity where it gets completely hopeless? What resources are needed for maintenance?
**EDIT** One more small question, based on AlexKuznetsov's answer: Or is there a threshold under which it is completely useless?
András Váczi
(31798 rep)
Jul 18, 2012, 12:08 PM
• Last activity: Feb 13, 2020, 05:18 PM
-1
votes
1
answers
579
views
Automated testing of SQL scripts
I'd like to run some simple tests on SQL scripts in CI to just ensure the scripts are valid SQL (PostgreSQL). While I could run Postgres in Docker or even use an in-memory DB such as Sqlite (it's close enough), that'd require creating the tables etc. first and the people writing these scripts are too lazy to put in all that effort.
I've done a lot of Googling and to my dismay, I wasn't able to find a single tool to do only syntax validation and ensure parsability. Does anyone know of a tool/framework, in any language, that can do this?
lfk
(129 rep)
Nov 12, 2019, 07:45 AM
• Last activity: Nov 12, 2019, 10:52 PM
1
votes
1
answers
113
views
Test-Automation: Diff Data in Oracle Database - what are best practices?
We're re-writing a COBOL application in Java. We want to make sure, the applications behave the same, by validating our unit tests produce the same DB changes.
How can we easily diff the data in the DB to a given master-DB (e.g. the Database in which the COBOL program has written data). We want to do this in an automated way. Bonus: Is it possible to exclude several columns like date columns in a comparison?
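A symmetric-difference query is the usual building block for this in Oracle (a sketch; schema, table, and column names are placeholders, and the date columns are excluded simply by leaving them out of the select list):

```sql
-- Rows written by the Java app that are not in the COBOL master, and vice versa.
(SELECT id, amount, status FROM new_schema.accounts
 MINUS
 SELECT id, amount, status FROM master_schema.accounts)
UNION ALL
(SELECT id, amount, status FROM master_schema.accounts
 MINUS
 SELECT id, amount, status FROM new_schema.accounts);
```

An empty result means the two tables agree on the compared columns.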
user189597
(13 rep)
Sep 10, 2019, 12:10 PM
• Last activity: Sep 10, 2019, 04:41 PM
3
votes
1
answers
2309
views
What does @RC mean in SSDT SQL Server Unit Testing?
In SQL Server unit testing with SSDT, what does @RC stand for? Does it mean Row Count, or Row Column? The team wants to understand; we have been using it for a long time. Good tool which works, just curious.
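For context: in the T-SQL scaffolding that SSDT typically generates for a test, @RC is the variable that captures the procedure's return code, along these lines (a sketch; the procedure name is a placeholder):

```sql
DECLARE @RC INT;
EXEC @RC = dbo.MyProcedureUnderTest @SomeParam = 1;
SELECT @RC AS ReturnCode;  -- fed to the test conditions
```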
user173948
Mar 5, 2019, 11:00 PM
• Last activity: Mar 9, 2019, 11:08 AM
5
votes
1
answers
1131
views
Optimizing MySQL 5.6 for unit tests
We have a MySQL instance for unit tests. Our unit tests work very heavily on MySQL. Therefore, we put the /var/lib/mysql dir in a tmpfs file system, because we don't need persistent data in unit tests.
Our performance peaked when we did this; however, I would like to consult with you what's the best configuration for MySQL in this case.
It's very common to put MySQL in tmpfs (memory) and run non-important stuff like unit tests. The question is: what's the best configuration for InnoDB to run in tmpfs?
These are the configs we have in my.cnf:
max_connections=350
innodb_fast_shutdown=2
innodb_log_file_size=1048576
innodb_doublewrite=0
innodb_flush_method=O_DIRECT
All our databases are in InnoDB. We don't mind losing some of the data.
P.S. You might ask at this point: why not the MEMORY engine? That's because MEMORY has table-level locking only, and part of our unit tests are based on this.
EDIT:
We've finished this project; we based our configuration on this blogpost:
http://jotschi.de/2014/02/03/high-performance-mysql-testdatabase/
confiq
(153 rep)
Mar 22, 2017, 02:45 PM
• Last activity: Aug 29, 2018, 10:55 AM
1
votes
3
answers
3454
views
What circumstances could a SQL DateTime column hold an invalid date?
I have been asked to write a series of tests within SQL.
Many of them are to test whether certain DateTime columns held in the system contain a valid date.
Given that all the fields in question are DateTime columns, and that any changes made to them by end-users go through a GUI front end that already has date validation, not to mention that the SQL connector would throw up an error if an invalid date was passed through anyway.
My question is: under what circumstances could a DateTime column end up holding an invalid date, e.g. 20/40/2018?
We are running SQL Server 2008
Kind regards
Matt
Matt Bartlett
(123 rep)
Aug 10, 2018, 02:33 PM
• Last activity: Aug 21, 2018, 06:07 PM
2
votes
1
answers
2166
views
Write a slow SQLITE query to test timeout
Is it possible to write a 'simple' SQLite query that will take more than 5 seconds to run?
I am writing some unit-tests in my application to log slow queries, to that end, I would like a slow query that I know will take more than 5 or 10 seconds to run, (and create a warning to be raised/logged).
I was thinking of creating 10 tables with 10 entries in each table and doing a join select on all the table.
But I am not sure if that is a good way of testing a slow select, (rather than a large return set).
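One self-contained alternative to maintaining ten joined tables (a sketch; tune the counter bound until it reliably exceeds the 5-10 s target on your hardware) is a cross join over a recursive counter:

```sql
-- Deliberately slow: count the cross product of a generated series with itself.
WITH RECURSIVE n(i) AS (
    SELECT 1
    UNION ALL
    SELECT i + 1 FROM n WHERE i < 20000
)
SELECT count(*) FROM n AS a, n AS b;  -- 20000 * 20000 rows to visit
```

Unlike a join over real tables, the runtime here depends only on the bound, so the test stays predictable.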
Simon Goodman
(145 rep)
Apr 10, 2018, 07:38 PM
• Last activity: Apr 11, 2018, 10:56 AM
Showing page 1 of 20 total questions