Database Administrators
Q&A for database professionals who wish to improve their database skills
Latest Questions
0 votes · 1 answer · 63 views
Is it better for performance to create a separate filegroup for tables that contain blobs?
I am currently working on an application that has approximately 500,000 users. Each user has a profile picture averaging around 32 KB in size. I am using SQL Server and I want to save these profile pictures in a VARBINARY column in a separate table named ProfilePictures (very much like what is described here).
Is it better in terms of performance to create a separate filegroup for this table? Will this improve performance, and if so, how?
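A minimal sketch of what moving such a table to its own filegroup looks like, with hypothetical database, file, and path names (separate filegroups mainly buy separate I/O placement and piecemeal backup/restore; whether they help raw read performance depends on the storage underneath):

ALTER DATABASE MyAppDb ADD FILEGROUP BlobData;
ALTER DATABASE MyAppDb ADD FILE (
    NAME = 'BlobData1',
    FILENAME = 'D:\SqlData\MyAppDb_BlobData1.ndf',
    SIZE = 20GB
) TO FILEGROUP BlobData;

CREATE TABLE dbo.ProfilePictures (
    UserId  int NOT NULL PRIMARY KEY,
    Picture varbinary(max) NOT NULL
) ON BlobData TEXTIMAGE_ON BlobData;  -- TEXTIMAGE_ON places the LOB pages on the new filegroup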
PedroAsking
(3 rep)
Jan 26, 2025, 11:26 AM
• Last activity: Jan 27, 2025, 02:04 PM
0 votes · 1 answer · 759 views
VARCHAR automatically changes to VARBINARY
I am new to MySQL. When I declare a column as VARCHAR, why does it automatically change to VARBINARY? How should I avoid this?
newly created database" class="img-fluid rounded" style="max-width: 100%; height: auto; margin: 10px 0;" loading="lazy">
Database changed
mysql> CREATE TABLE sync(
-> vlan_id INT NOT NULL,
-> vlan_name VARCHAR(90) NOT NULL,
-> vlan_description VARCHAR(100) NOT NULL,
-> PRIMARY KEY(vlan_id)
-> );

Database changed
mysql> SHOW CREATE TABLE sync;
+-------+------------------------------------------------------------+
| Table | Create Table                                               |
+-------+------------------------------------------------------------+
| sync  | CREATE TABLE `sync` (
  `vlan_id` int(11) NOT NULL,
  `vlan_name` varbinary(90) NOT NULL,
  `vlan_description` varbinary(100) NOT NULL,
  PRIMARY KEY (`vlan_id`)
) ENGINE=InnoDB DEFAULT CHARSET=binary |
+-------+------------------------------------------------------------+
1 row in set (0.00 sec)
mysql>
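The DEFAULT CHARSET=binary at the end of the output is the clue: when a column's character set resolves to binary (here inherited from the database the table was created in), MySQL creates character columns as their binary counterparts, so VARCHAR becomes VARBINARY. A minimal sketch of the fix, assuming utf8mb4 is the character set actually wanted and 'mydb' as a hypothetical database name:

-- either fix the database default before creating tables...
ALTER DATABASE mydb CHARACTER SET utf8mb4;

-- ...or state the charset explicitly on the table:
CREATE TABLE sync (
    vlan_id INT NOT NULL,
    vlan_name VARCHAR(90) NOT NULL,
    vlan_description VARCHAR(100) NOT NULL,
    PRIMARY KEY (vlan_id)
) DEFAULT CHARSET = utf8mb4;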
unknown29
(1 rep)
Aug 6, 2019, 03:38 AM
• Last activity: Nov 11, 2024, 10:03 PM
0 votes · 1 answer · 74 views
Explanation needed for my result
I am new to string handling in SQL Server and was wondering what makes the code below return the result it does. Could I please have a detailed explanation?
**My code:**
Select CAST(CAST(CAST('08' as nvarchar) as varbinary) as varchar) as [result]
**Output:**
| Result |
|--------|
| 0      |

What is happening here as '08' is converted to nvarchar, then varbinary, then varchar?
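A sketch of the byte-level mechanics (the intermediate values in the comments follow from '0' being 0x30 and '8' being 0x38 in ASCII, and nvarchar being UTF-16LE):

SELECT CAST(CAST('08' AS nvarchar) AS varbinary);
-- 0x30003800 : each nvarchar character is two bytes, low byte first

SELECT CAST(0x30003800 AS varchar);
-- re-reads the same bytes one per character: '0', CHAR(0), '8', CHAR(0);
-- the embedded zero bytes are non-printable, so the grid shows just '0'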
varun as
(11 rep)
Sep 21, 2024, 11:45 AM
• Last activity: Sep 28, 2024, 04:58 AM
1 vote · 1 answer · 218 views
Rolling up multiple varbinary(max) rows into a single row
My table looks something like this:
|BinFiles|
|--------|
|0x1101ABB1C...|
|0x2202DEE2F...|
|0x3303GHH3I...|
I tried this to get a single line:
SELECT
STUFF((SELECT d.BinFiles
FROM dbo.demo d
WHERE d.ID = dd.ID
FOR XML PATH('')), 1, 1, '') BinFiles
FROM
dbo.demo dd
How can I do this? Or is this even possible for varbinary data (size issues)?
I'm using SQL Server 2019. The varbinary values are chunks of a PDF file (about 1 MB per file). I need to concatenate these rows into a single value to export as a PDF file again.
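FOR XML PATH concatenation is a character-string trick and does not apply to varbinary. A minimal sketch of one common alternative, variable-assignment concatenation, with a hypothetical ChunkNo ordering column (note the engine does not formally guarantee row-by-row assignment order, so this pattern is widely used but worth verifying on your build):

DECLARE @Id int = 1;
DECLARE @pdf varbinary(max) = 0x;

SELECT @pdf = @pdf + d.BinFiles
FROM dbo.demo d
WHERE d.ID = @Id
ORDER BY d.ChunkNo;      -- hypothetical column giving the chunk order

SELECT @pdf AS WholePdf; -- varbinary(max) holds up to 2 GB, plenty for ~1 MB files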
mbielsa
(66 rep)
Sep 12, 2023, 08:47 AM
• Last activity: Sep 27, 2023, 07:04 AM
5 votes · 2 answers · 783 views
Varbinary update attempt
Is it possible to adapt this into an update query that seeks a free three-byte sequence (0x000000) and replaces it with a new sequence, given that the data is split into groups of 3 bytes? The issue is that CAST/REPLACE won't work with undefined data, which is 0x00.
The varbinary data is divided every 3 bytes: data (3 bytes long), data (3 bytes long), and so on.
DECLARE @YourTable table
(
Id INT PRIMARY KEY,
Val VARBINARY(50)
)
INSERT @YourTable
VALUES (1, 0x0329000414000B14000C14000D0F00177800224600467800473C00550F00000000000000000000000000),
(2, 0x0329002637000B14000C14000D0F00177800224600467800473C00550F00000000000000000000000000);
SELECT Id, Triplet
FROM @YourTable T
JOIN (VALUES (1),(4),(7),(10),(13),(16),(19),(22),(25),(28),(31),(34),(37),(40),(43),(46),(49)) Nums(Num) ON Num <= DATALENGTH(T.Val)
CROSS APPLY (VALUES (SUBSTRING(T.Val, Num, 3))) V(Triplet)
WHERE Triplet = 0x000000 and DATALENGTH(Triplet) = 3
What I've tried:
UPDATE x
set x.column = CAST(REPLACE(x.column, 0x000000, 0xFFFFFF) AS VARBINARY)
from Table as x
Works only if the column contains no data.
UPDATE x
set x.attr = CAST(REPLACE(0x000000, 0x000000, 0xFFFFFF)
from Table as x
This one semi-works by replacing the first 3 bytes of the data, but I'd prefer not to lose data in the process.
The goal is to replace a sequence of empty data in the varbinary(50) with a sequence of my choosing, or to select a specific location in the binary (for example, the last plausible triplet position) and write the desired sequence there.
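A sketch that splices the replacement in with SUBSTRING instead of REPLACE, reusing the aligned-triplet scan from the question to find the first free slot (0xFFFFFF stands in for the new sequence; SUBSTRING and + concatenation both work on varbinary):

;WITH FirstFree AS (
    SELECT T.Id, MIN(Num) AS Pos
    FROM @YourTable T
    JOIN (VALUES (1),(4),(7),(10),(13),(16),(19),(22),(25),(28),
                 (31),(34),(37),(40),(43),(46),(49)) Nums(Num)
        ON Num + 2 <= DATALENGTH(T.Val)
    WHERE SUBSTRING(T.Val, Num, 3) = 0x000000
    GROUP BY T.Id
)
UPDATE T
SET Val = SUBSTRING(T.Val, 1, F.Pos - 1)                              -- bytes before the slot
        + 0xFFFFFF                                                    -- the new triplet
        + SUBSTRING(T.Val, F.Pos + 3, DATALENGTH(T.Val) - F.Pos - 2)  -- bytes after it
FROM @YourTable T
JOIN FirstFree F ON F.Id = T.Id;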
Peter
(79 rep)
Mar 13, 2023, 09:01 PM
• Last activity: Mar 14, 2023, 03:18 PM
2 votes · 1 answer · 2028 views
Varbinary pattern search
I'm trying to make a query which goes through varbinary data. The issue is that I can't really finish what I'm trying to achieve. What you should know: the column is varbinary(50), and the patterns that occur have no specific position in the data; a prefix can appear anywhere, as long as each group is 3 bytes (0x000000). The first byte is the prefix; the second and third bytes are the value I want to check against a range. All the data is written like this.
What I've tried:
DECLARE @t TABLE (
val VARBINARY(MAX)
)
INSERT INTO @t SELECT 0x00000100000000000000000000000000000000000000000000000000
INSERT INTO @t SELECT 0x00001000000000000000000000000000000000000000000000000000
INSERT INTO @t SELECT 0x00010000000000000000000000000000000000000000000000000000
INSERT INTO @t SELECT 0x00100000000000000000000000000000000000000000000000000000
INSERT INTO @t SELECT 0x00000f00000000000000000000000000000000000000000000000000
declare @pattern varbinary(max)
declare @pattern2 varbinary(max)
set @pattern = 0x0001
set @pattern2 = @pattern+0xFF
select @pattern,@pattern2
SELECT
*
FROM @t
WHERE val > @pattern AND val < @pattern2
---
This was a total bust: the patterns were accurate only up to 2 symbols; if I used 4 symbols as the pattern, it worked only when the pattern was in a predefined position. I've tried combinations of this and everything below.
WHERE CONVERT(varbinary(2), val) = 0xdata
also this:
select *
from table
where CONVERT(varchar(max),val,2) like '%data%'
This works great for searching exact patterns, but not for ranges; I need some combination of both.
I need something that would detect this on its own while I just specify a start and an end point to look between. The highest number variation would be '26ffff', but limiting it to something like 'ff00' is acceptable for what I'm looking for.
My best guess is 2 defined numbers: the first being the allowed max range and the second a cap, so it doesn't go through every possible outcome. But I would be happy with whatever works.
The data origin is a game server which stores the data like this. There are predefined prefixes, which are the stat type, and the rest of the data is the actual numeric value of the stat. The data is represented in 6-character intervals. Here is a sample of the data stream. It's always 6-6-6-6-6 as long as there's space left to record on, since it's capped at 50 bytes.
0x0329000414000B14000C14000D0F00177800224600467800473C00550F00000000000000000000000000
The groups are always 3 bytes each. My idea is to use the first byte to narrow down the search and then use the second 2 bytes to filter it; I just don't know how to pull it off in an effective way. If the 3-byte pattern is violated the data becomes unreadable, meaning even if you don't need the extra byte you still have to count it, otherwise the data breaks. Example of working data:
0x032900'041400'
Example of broken data:
0x0329'041400'
The only issue I can think of is when the prefix and part of the value both match, for example:
0x262600
Unless the query is specifically ordered to read the data in 3-byte sequence, meaning it knows that the first byte is always a prefix and the other 2 bytes are the value.
Q: Can that be used as an alignment indicator, so that the first non-zero byte after at least 3 zero bytes indicates the start of a group?
A: Yes, but that's unlikely. Although possible, it would be written in order, like:
0x260000'270000'
It wouldn't skip forward past an entire 3-byte group filled with no data. That type of entry would occur only if someone manually inserted it into the DB; the server doesn't make records with gaps like these, as far as I'm aware:
0x260000'000000'270000'
The closest to my desired result as a working query is this, but it's terrible: it works for small ranges, but anything greater would be tedious.
select * from @t
where (CONVERT(varchar(max),val,2) like '%262100%' or
CONVERT(varchar(max),attr,2) like '%262200%' or
etc...)
Goals:
1. Locating the prefix (the first binary data pair).
2. Defining a max value after the prefix; everything above that threshold should be listed in the results. Let's say '26' is the prefix and the highest allowed number after it is '9600', i.e. '269600'. Basically any data that exceeds the pattern '269600' should be detected, for example '269700'.
so the query result would post this:
> select * from table where CONVERT(varchar(max),attr,2) like
> '%269700%'
---
I'm trying to get a query that can cycle through varbinary data from a table and search for a specific result within set parameters, for example 0x263700. The data is divided into 3-byte segments: 1 byte is the header, which will be used as the search pattern; the other 2 bytes are the value I have to range-check. For example, everything from just above '3700' up to 'FFFF' would qualify as a result, so the max plausible result would be '26FFFF'.
This isn't for a specific data stream to be cycled one by one.
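A sketch along those lines, reusing the aligned-triplet unpacking from the earlier attempt: compare the first byte against the prefix and the remaining two bytes against the threshold (varbinary comparisons are byte-wise, so 0x9700 > 0x9600 behaves as expected; 0x26 and 0x9600 stand in for the prefix and cap):

SELECT t.val, v.Triplet
FROM @t t
JOIN (VALUES (1),(4),(7),(10),(13),(16),(19),(22),(25),(28),
             (31),(34),(37),(40),(43),(46),(49)) Nums(Num)
    ON Num + 2 <= DATALENGTH(t.val)
CROSS APPLY (VALUES (SUBSTRING(t.val, Num, 3))) v(Triplet)
WHERE SUBSTRING(v.Triplet, 1, 1) = 0x26      -- the prefix byte
  AND SUBSTRING(v.Triplet, 2, 2) > 0x9600;   -- value above the allowed max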
Peter
(79 rep)
Feb 8, 2023, 03:25 PM
• Last activity: Feb 9, 2023, 10:43 AM
1 vote · 1 answer · 1146 views
LIKE or match by beginning for binary strings in PostgreSQL
I need to match a binary column with its beginning. For text columns it's easy: LIKE 'image/%'. This uses a BTREE index if there is one. Is it possible to do this using an index for a binary column, too?
BTW, I store UTF-8 encoded strings in a binary column, because there is no hard guarantee that all of them are UTF-8. So I store a binary representation (maybe even with \0 (zero) bytes), as I cannot use a UTF-8 text type.
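A sketch of the usual workaround in PostgreSQL: rewrite the prefix match as a half-open range, which a plain btree index on the bytea column can serve. The bytes below spell 'image/' and its successor 'image0' (same prefix with the last byte incremented); the table and column names are hypothetical:

SELECT *
FROM files
WHERE mime >= '\x696d6167652f'   -- 'image/'
  AND mime <  '\x696d61676530';  -- 'image0': smallest value past the prefix

-- a btree index on (mime) can satisfy this range scan:
-- CREATE INDEX files_mime_idx ON files (mime);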
porton
(745 rep)
Feb 5, 2022, 02:54 AM
• Last activity: Dec 25, 2022, 03:00 PM
0 votes · 1 answer · 394 views
Detecting whether a varchar value is already encrypted by EncryptByPassPhrase() in SQL Server
Starting with a varchar(100) column named Password in a table Users with one record in it, after performing an update to encrypt the Password column with EncryptByPassPhrase(), the following varbinary(8000) string is generated:
> 0x0342342565CF0....
...and for simplicity, the string has been abbreviated with '....'.
To be able to store the encrypted varbinary(8000) output in the same Password column, the encryption function is wrapped in convert(varchar(8000), EncryptByPassPhrase(...)).
Running select * from Users; produces this output of its two columns and one record:
> Name     | Password
> ---------+----------
> Benjamin |
As you can see, Password appears blank. However, running select len(Password); reveals a value of 52, so there is definitely a string stored in Password, and it includes non-printable characters.
So, the question is: how can the blank varchar(100) string be tested for its first two varbinary characters being 0x?
The idea is to create a wrapper encrypt function that decides if the field is already encrypted, and does not encrypt it again if it detects that the string passed to it starts with 0x.
Guessed
(153 rep)
Feb 2, 2022, 11:16 PM
• Last activity: Feb 3, 2022, 05:25 PM
1 vote · 1 answer · 815 views
Why can't binary data be inserted/displayed as ones and zeroes?
If I have a column of type binary or varbinary, I imagine the data as a sequence of bits. For example, it makes sense to me that 01001 (as a base 2 number) could be a valid value in a binary(5) column.
Is there a reason why inserting and displaying binary data in this fashion is not straightforward?
For example, why does SSMS convert binary data to hexadecimal instead of displaying a sequence of ones and zeroes (which in my opinion is much easier to reason about)?
Jacob Stamm
(121 rep)
Dec 2, 2021, 06:31 PM
• Last activity: Dec 10, 2021, 09:52 AM
1 vote · 0 answers · 659 views
Speed up reading data from a table
**Problem**
We have a table that has just 44 rows, and *one* of the columns is varbinary(max). This column stores XML files converted to varbinary(max). The average length of data in this particular column is over 8000 characters per row. The average row size is 17 KB (26 KB if you take into consideration the total size of the table: clustered + nonclustered + unused space). The varbinary values are very similar, with only small differences between them. Total table size is small: sp_BlitzIndex reports 1.1 MB; 0.8 MB LOB for the clustered index, 0.1 MB for the nonclustered.
Developers claim that reading the whole table from the application takes up to 20 seconds. They would like to reduce the time it takes to read all rows.
**What I've tried**
For me (in SSMS over VPN) it takes about 3-6 seconds to read the 44 rows from this table. For comparison, other tables of comparable size (~1.1 MB) with thousands of rows take 1-3 seconds.
The first thing I recommended was to replace varbinary(max) with the xml data type. However, the developers don't know how long this will take them to achieve or when it can be done.
Second, I rebuilt the table with (data_compression=page), but it did not compress the column (I used sp_BlitzIndex and sp_spaceused to verify).
**Questions:**
How can we speed up reading data from this table, for the application and for SSMS? Should it be columnstore indexes, converting the column to XML, any other option, or is there anything that needs to be checked on the infrastructure side (and how)?
**Update:**
CREATE TABLE [dbo].[MySlowTable](
[ID] [int] primary key IDENTITY(1,1) NOT NULL,
[CarrierID] [int] NOT NULL,
[MailClass] [int] NULL,
[LabelSize] [varchar](10) NULL,
[LabelType] [varchar](30) NULL,
[Template] [varbinary](max) NOT NULL,
[TemplateName] [varchar](50) NULL)
CREATE NONCLUSTERED INDEX [index_Label] ON [dbo].[MySlowTable] (
[CarrierID],[MailClass],[LabelSize],[LabelType] )
Query that app runs:
SELECT [ID],[CarrierID],[MailClass],[LabelSize],[LabelType],[Template],[TemplateName]
FROM [DB].[dbo].[MySlowTable]
The execution plan is a simple clustered index scan with no warnings. In the properties I see that the only wait is ASYNC_NETWORK_IO, with wait time varying from 5,000 to 38,000 ms (it has now become slower in SSMS over VPN). For tables of comparable size (1.1 MB) with thousands of rows, the ASYNC_NETWORK_IO wait is smaller than for the table in question. I accept that the network over VPN can *sometimes* be slow for my SSMS, but how it can be slow for an Azure App querying a 1 MB table on a DB server that is not geographically far, that is a good question.
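ASYNC_NETWORK_IO means the server is waiting for the client to consume the result set, so shrinking the bytes on the wire is one lever. A sketch using COMPRESS (SQL Server 2016+; the application would need to gunzip the column, e.g. with GZipStream), which should pay off here because the XML payloads are described as very similar and therefore highly compressible:

SELECT [ID], [CarrierID], [MailClass], [LabelSize], [LabelType],
       COMPRESS([Template]) AS TemplateGz,   -- GZip-compressed varbinary(max)
       [TemplateName]
FROM [dbo].[MySlowTable];

-- sanity check of the size reduction, server-side:
SELECT SUM(DATALENGTH([Template]))           AS RawBytes,
       SUM(DATALENGTH(COMPRESS([Template]))) AS CompressedBytes
FROM [dbo].[MySlowTable];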
Aleksey Vitsko
(6195 rep)
Jun 15, 2021, 03:12 PM
• Last activity: Jun 16, 2021, 10:53 PM
3 votes · 3 answers · 772 views
Varbinary startswith query using index
I've got this tree-like key (think object OIDs; it's very similar) to store and index in a table, for which queries want to select subtrees. The most straightforward way to do this on a btree is to use the tree path as the single key and descend the key with a starts-with operation to find the row blocks.
So the obvious representation is VARBINARY(n)*. I cannot think of a way to express starts-with such that SQL Server can figure out to use the index.
The next-most obvious is to pack the data into some VARCHAR with a binary collation and use a LIKE operation; however, I've discovered that LIKE @n + '%' will never use an index, only LIKE 'literal%'. Adding the WITH INDEX hint still scans the index in its entirety because SQL Server just doesn't understand. I really don't want to think about how to escape arguments in SQL so I can build an EXEC. That's just madness and a security disaster waiting to happen.
Expressing this in ordinary recursive tables and querying with recursive SQL is out of the question. The recursive query will power-sink the database server.
I actually have the data in a long string of long values right now and can select my serialization form. As I said, the natural form is VARBINARY, and it really would look like 0x000000100004A010000000D0000A000. The most common query is of the form "give me everything starting with 0x000000100004A01", or in the natural model, the first n values from the string of longs. I could write them in the form a.b.c.d.e.f... but the individual numbers are long and come from clicking on things in the GUI.
Roughly speaking I'm looking at a plausible form of
CREATE TABLE Record (
RecordId BIGINT NOT NULL IDENTITY(1,1),
RecordedDate DATETIME NOT NULL,
RecordClass CHAR(1) NOT NULL,
UserId INT NOT NULL,
ObjectId VARBINARY(384) NOT NULL,
RecordValue NVARCHAR(100) NULL,
OwnerId BIGINT NULL, -- Joins to another table
SubOwnerId BIGINT NULL, -- and yet another table
PRIMARY KEY (RecordId)
)
CREATE INDEX name ON Record(RecordedDate);
CREATE INDEX name ON Record(OwnerId);
CREATE INDEX name ON Record(SubOwnerId);
CREATE INDEX name ON Record(ObjectId);
What is the best way to do a starts-with index?
*I've computed the largest possible n and it's less than 400
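A sketch of the standard sargable rewrite: turn "starts with @prefix" into a half-open range whose upper bound is the prefix with its last byte incremented, so the optimizer can seek the ObjectId index. The prefix value is hypothetical, and the carry case (last byte 0xFF) is left out for brevity:

DECLARE @prefix varbinary(384) = 0x000000100004;   -- hypothetical search prefix
DECLARE @lastByte tinyint =
    CAST(SUBSTRING(@prefix, DATALENGTH(@prefix), 1) AS tinyint);

-- smallest value that no longer starts with @prefix
-- (assumes the last byte is not 0xFF; otherwise drop it and carry)
DECLARE @upper varbinary(384) =
    SUBSTRING(@prefix, 1, DATALENGTH(@prefix) - 1)
    + CAST(@lastByte + 1 AS binary(1));

SELECT RecordId, ObjectId
FROM Record
WHERE ObjectId >= @prefix
  AND ObjectId <  @upper;   -- both bounds are sargable: index seek on ObjectId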
Joshua
(426 rep)
Nov 17, 2020, 04:48 PM
• Last activity: Nov 19, 2020, 06:19 AM
1 vote · 2 answers · 4624 views
How does SSMS display varbinary data?
When you store a .jpg file in a varbinary(max) column and then view the data in SQL Server Management Studio, it appears like "0xFFD8FFE000..."
Is this hex-encoded or something? What is SQL Server doing to generate this string? Can it be converted back to a valid JPG file?
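Yes, this is the raw bytes rendered as hexadecimal; 0xFFD8FFE0 happens to be the standard JPEG/JFIF header. A small demonstration using CONVERT's binary-to-string styles (available since SQL Server 2008); writing the unmodified bytes back to disk (with bcp or a small client program) yields a valid .jpg again:

SELECT CONVERT(varchar(max), 0xFFD8FFE0, 1);    -- '0xFFD8FFE0' (hex with prefix)
SELECT CONVERT(varchar(max), 0xFFD8FFE0, 2);    -- 'FFD8FFE0'   (hex without prefix)

-- and back again: the hex string round-trips to the original bytes
SELECT CONVERT(varbinary(max), 'FFD8FFE0', 2);  -- 0xFFD8FFE0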
Mike W
(121 rep)
Nov 17, 2020, 10:57 PM
• Last activity: Nov 18, 2020, 06:19 PM
6 votes · 3 answers · 7823 views
SQL Server - Varbinary Column - Extremely Slow Statistics Update
I have a table in my database that we use as a filestore; the file itself is stored in a varbinary column. This all seemed to work well until recently, when we noticed one of our instances of this table had essentially "jammed" on an insert statement.
Checking sys.dm_os_waiting_tasks showed that the insert statement had triggered a statistics update, and that this statistics update was taking a *very* long time (17 minutes).
Here's the statement we found running:
SELECT StatMan([SC0], [LC0]) FROM
(SELECT TOP 100 PERCENT CONVERT([varbinary](200),
SUBSTRING ([Data], 1, 100)++substring([Data],
case when LEN([Data])<=200 then 101 else
LEN([Data])-99 end, 100)) AS [SC0],
datalength([Data]) AS [LC0]
FROM [FileSystem].[FileData] WITH
(READUNCOMMITTED) ORDER BY [SC0] ) AS _MS_UPDSTATS_TBL
There are roughly 2000 rows in this table; here's what it looks like:
CREATE TABLE [FileSystem].[FileData]
(
[Id] [uniqueidentifier] NOT NULL CONSTRAINT [DF__FileData__Id__09DE7BCC] DEFAULT (newsequentialid()),
[Data] [varbinary] (max) NULL,
[FileHash] [nvarchar] (4000) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[FileSize] [bigint] NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
ALTER TABLE [FileSystem].[FileData] ADD CONSTRAINT [PK_FileData] PRIMARY KEY CLUSTERED ([Id]) WITH (STATISTICS_NORECOMPUTE=ON) ON [PRIMARY]
GO
We're aware this table is pretty weird, and we are careful to only ever address it with a clustered index seek.
For the moment, we've simply disabled automatic statistics generation for this table, but I'm wondering if that's really best practice. Will performance eventually become a problem without up-to-date statistics (bear in mind, this table is only ever addressed by its clustered index)?
Update:
Okay I'm pretty sure we've worked out what was causing the statistic to be generated:
GO
CREATE PROCEDURE [FileSystem].[FileData_AppendNewData_Easy]
(@fileDataId uniqueidentifier )
WITH EXECUTE AS CALLER AS
BEGIN
declare @testValue varbinary(max);
set @testValue = 0xabcedf012439;
Update FileSystem.FileData
set Data.Write(@testValue, null, null)
where Id = @fileDataId ;
END
This is a simplified version of the procedure that was causing the issue; it seems Data.Write causes some kind of implicit predicate on the data column? Well, that solves that mystery, although I'm still not sure what impact disabling the statistics for this table might have. Can anyone comment on that?
Chris Mawman
(61 rep)
Dec 6, 2012, 02:10 PM
• Last activity: Nov 23, 2019, 02:39 PM
5 votes · 1 answer · 1716 views
GZip existing varbinary(max) column in Sql Server 2008+
I have an existing legacy table that is ~180 GB in size due to a client application storing PDF files as varbinary. I'd like to compress that column for all rows using GZIP to help save space while I create a new solution (I was hoping there was a way to do it in SQL and not have to write client code for this). I see that the COMPRESS function is available in SQL Server 2016, but I need a solution that will work with 2008. Any ideas would be appreciated.
jewnbug97
(197 rep)
Jul 14, 2018, 07:55 PM
• Last activity: Nov 14, 2019, 06:55 PM
0 votes · 2 answers · 7745 views
BLOB or VARBINARY to store simple Strings that are converted into byte[ ]?
I've read that in MySQL, BLOBs are often used to store image files, file path pointers, video and audio files, and other big data objects.
But can BLOBs be used for simple strings that are converted into byte[] (like names, addresses, etc.), or would VARBINARY suffice?
Or does it not really impact the database size and query performance?
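A sketch of the practical difference, with hypothetical table and column names: short byte strings fit VARBINARY(n), which stays in-row and can be indexed in full, while BLOB columns need a prefix length in any index:

CREATE TABLE contacts (
    id        INT PRIMARY KEY AUTO_INCREMENT,
    name_enc  VARBINARY(255) NOT NULL,  -- short byte[] values such as encrypted names
    photo     BLOB NULL,                -- large payloads that are rarely filtered on
    INDEX idx_name (name_enc),          -- full-value index is allowed on VARBINARY
    INDEX idx_photo (photo(32))         -- BLOB indexes require a prefix length
);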
Jae Bin
(39 rep)
Nov 6, 2018, 07:45 AM
• Last activity: Oct 26, 2019, 03:03 AM
0 votes · 1 answer · 528 views
How can I decrypt data on a replicated MSSQL database?
I have a database which uses a symmetric key to encrypt a credit card field. There is a trigger on the credit_card field which runs every time it is updated; it encrypts the contents, saves them to another field called credit_card_encrypted, and then wipes the contents of credit_card.
I have recently set up replication on the table, to another instance of SQL Server.
Everything works fine, except that I am unable to decrypt the data on the replicated server.
I have imported the certificate and the key, and I have created a symmetric key; however, whenever I try to decrypt the credit cards, it just returns NULL.
I tried manually adding a value to the credit_card field on the replication server and encrypting it by hand, and I was actually able to decrypt that successfully.
So why can I not decrypt the data that is replicated?
Also, when I compare the varbinary data of the encrypted card, the contents look exactly the same on the main server and the replication server.
e.g.:
0x000CE24D0120A349PPL29D6BFE70C54E0100000064E3380AA4EC04A4B4E959535798696E81502A063617B21CFD75FAFF93866D47603543A5D6EBECDF5F8C0D23D8CCL982
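A likely-relevant detail, sketched with hypothetical names: a symmetric key created on two servers only yields identical key material (and can therefore decrypt each other's data) if it was created with the same ALGORITHM, KEY_SOURCE, and IDENTITY_VALUE on both. A key created without those options gets random material, and decryption on the other server returns NULL:

CREATE SYMMETRIC KEY CreditCardKey              -- hypothetical name
WITH ALGORITHM = AES_256,
     KEY_SOURCE = 'same secret phrase as on the publisher',
     IDENTITY_VALUE = 'same identity phrase as on the publisher'
ENCRYPTION BY CERTIFICATE CreditCardCert;       -- hypothetical certificate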
cleverpaul
(201 rep)
Jul 8, 2019, 01:58 AM
• Last activity: Jul 8, 2019, 02:29 PM
4 votes · 1 answer · 5820 views
Insert String Value To Binary(64) Column
I am attempting to use a stored procedure to insert data into a table. The DDL of my table is like this:
CREATE TABLE [dbo].[data1](
[ID] [int] IDENTITY(1,1) NOT NULL,
[fname] [varchar](100) NOT NULL,
[lname] [varchar](100) NOT NULL,
[phone] [varchar](10) NULL,
[company] [varchar](100) NOT NULL,
[email] [varchar](100) NOT NULL,
[pass] [binary](64) NOT NULL,
[registrationdate] [datetime] NULL,
PRIMARY KEY CLUSTERED
(
[ID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET IDENTITY_INSERT [dbo].[data1] ON
GO
INSERT [dbo].[data1] ([ID], [fname], [lname], [phone], [company], [email], [pass], [registrationdate]) VALUES (1, N'Bob', N'Tomato', N'5555555555', N'Test Company', N'test1234@test.com', 0x477265656E0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000, NULL)
GO
SET IDENTITY_INSERT [dbo].[data1] OFF
GO
And this is the stored procedure that I am attempting to execute:
Declare @FName varchar(100) = 'aa'
Declare @Lname varchar(100) = 'bb'
Declare @Phone varchar(100) = 'cc'
Declare @Company varchar(100) = 'dd'
Declare @Email varchar(100) = 'ee'
Declare @Pass binary(64) = 'redsocks'
INSERT INTO dbo.appinfo (fname, lname, phone, company, email, pass)
VALUES (@FName, @Lname, @Phone, @Company, @Email, CONVERT(VARBINARY(64), @Pass, 0))
but I get this error:
> Msg 257, Level 16, State 3, Line 20
> Implicit conversion from data type varchar to binary is not allowed. Use the CONVERT function to run this query.

Is this statement not converting the value appropriately? CONVERT(VARBINARY(64), @Pass, 0)
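The error is raised by the DECLARE line, not the INSERT: a varchar literal cannot be implicitly assigned to a binary(64) variable, so the CONVERT in the VALUES clause is never reached. A minimal sketch of a fix (the other variables stay as declared in the question; HASHBYTES is mentioned only as the usual pattern for real passwords):

Declare @Pass varchar(100) = 'redsocks'

INSERT INTO dbo.appinfo (fname, lname, phone, company, email, pass)
VALUES (@FName, @Lname, @Phone, @Company, @Email, CONVERT(binary(64), @Pass))

-- for real passwords, hashing is typical: HASHBYTES('SHA2_512', @Pass)
-- returns 64 bytes, which fits binary(64) exactly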
MitchMahoney
(43 rep)
Nov 23, 2018, 03:50 AM
• Last activity: Nov 23, 2018, 04:02 AM
5 votes · 2 answers · 5181 views
Please enlighten me about FILESTREAM and BLOB files
This is a really confusing topic for me.
I can understand what a BLOB is; I can easily use it.
I can understand what FILESTREAM is. I can easily implement it on a database, back the database up, and restore it along with the folder created to store these files. Things like these, I think, I understand well.
What I don't get is:
1. What's the difference between inserting the .PDF (for example) using just a VARBINARY(MAX) column in the database, and storing this .PDF in a FILESTREAM database? I know that with the BLOB varbinary(max) the .PDF is INSIDE the database. If I physically delete the .PDF, I can still restore it using the value stored inside the varbinary(max) column, right?
2. When I insert this .PDF file in a filestream, I can see it inside the folder (the one created for the filegroup). If I insert 100 files, I can see 100 files inside the filestream folder, but when I run DELETE FROM and delete 100 rows in the database, I can still see these files in the filestream folder. Why does this happen? I thought that, to maintain consistency, these files would be deleted too. (See the sketch below.)
3. In both cases, after inserting the .PDF file inside the BLOB VARBINARY(MAX) or FILESTREAM, do I still need the original file? As it is inside the database, I can easily recover it.
**I'm using .PDF as an example.**
I can use and implement both approaches, but it's still not clear to me what happens with them. To me it's the same thing, but one is stored outside the database and the other one inside.
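On point 2: deleted FILESTREAM files are removed lazily by a background garbage collector, only after the deleting transaction commits and the relevant log records have been processed, not at DELETE time. A sketch of how to nudge it along (sp_filestream_force_garbage_collection exists in SQL Server 2012+; the database name is hypothetical):

-- after deleting rows, the files linger until FILESTREAM GC runs
CHECKPOINT;
EXEC sp_filestream_force_garbage_collection @dbname = N'DocumentStore';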
Racer SQL
(7546 rep)
Jul 3, 2018, 03:45 PM
• Last activity: Jul 3, 2018, 05:47 PM
1 vote · 1 answer · 315 views
Exporting Stored PDF into an Email
Just a quick question on whether something is possible to do.
Database: MSSQL
We have a customer that we generate emails for based on sales (e.g. salesperson X sold Y amount of stock). Against these orders, the main system generates a PDF stored as a VARBINARY field in a table.
Is it possible for this VARBINARY to be converted into a physical file (as an attachment or similar) within an email?
I have done similar things with images stored in the database, but that is a lot different from PDF.
This is a third-party application I'm working with, so I can't make changes to their system.
Thanks for reading,
Joshua
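One route, sketched with hypothetical names and paths: export the bytes to a file with BCP (a binary export needs a format file so BCP writes the raw value without length prefixes), then attach the file with Database Mail. This assumes xp_cmdshell and Database Mail are enabled:

DECLARE @cmd varchar(4000) =
    'bcp "SELECT PdfData FROM SalesDb.dbo.OrderDocs WHERE OrderId = 42" ' +
    'queryout C:\Temp\Order42.pdf -S MyServer -T -f C:\Temp\rawbinary.fmt';
EXEC master..xp_cmdshell @cmd;

EXEC msdb.dbo.sp_send_dbmail
    @profile_name     = 'SalesMail',
    @recipients       = 'manager@example.com',
    @subject          = 'Order 42',
    @body             = 'PDF attached.',
    @file_attachments = 'C:\Temp\Order42.pdf';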
Joshua Smith
(21 rep)
Mar 8, 2017, 04:52 PM
• Last activity: May 1, 2018, 03:34 PM
0 votes · 2 answers · 1185 views
Store Mesh in database
I hope this is the correct forum, but I will ask anyway:
I have lots of mesh data I want to store. A mesh is actually triangles. I hope it's not too much of a beginner question, but wouldn't the geometry type of SQL Server be the best way to store that?
Specifically, I'm interested in storing FBX files that are created in other programs, sometimes in different formats such as Autodesk Revit, and then converted to FBX. My other option is to use the file system, which is risky and not managed.
Or I could use a binary field, but I understand that storing large binary data in a database rarely ends well.
This question is not about how to actually load the data (I know that could be a problem) but about which SQL datatype I should use.
I was planning to make the FBX files very small so that, for example, I can have lookup tables of types. Example: a door will always be a single FBX, and the table would have columns FBXType and FBXData, with rows like 1 (door), 2 (furniture).
At the moment it's just querying, so I can build AssetBundles for Unity. Then I was thinking that if it is geometry, maybe I can use the smart indexes and functions like intersect that are available for the geometry type.
The data will be changing; that is the reason I want it in a database. However, updates will probably come in deliveries (one day of the month everything is updated) and probably as new rows (there will be another column, IsDeleted); it could even be a new table, but that is not a must. If splitting by type, my colleague says only a few FBX files will be large (building skeletons), but even this is not sure because we can split by floor. Most FBX files will be 5 MB or less.
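A sketch of that lookup-table layout, with hypothetical names: the raw FBX bytes fit varbinary(max), while a separate geometry column can carry a simplified footprint so the spatial indexes and methods like STIntersects remain usable:

CREATE TABLE dbo.FbxAsset (
    FbxAssetId int IDENTITY(1,1) PRIMARY KEY,
    FbxType    int NOT NULL,              -- 1 = door, 2 = furniture, ...
    FbxData    varbinary(max) NOT NULL,   -- the FBX file itself (mostly <= 5 MB)
    Footprint  geometry NULL,             -- optional simplified shape for spatial queries
    IsDeleted  bit NOT NULL DEFAULT 0
);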
user114472
(1 rep)
Jan 7, 2017, 02:02 PM
• Last activity: Aug 27, 2017, 10:31 AM
Showing page 1 of 20