
Database Administrators

Q&A for database professionals who wish to improve their database skills

Latest Questions

26 votes
1 answers
27622 views
How to preserve the original order of elements in an unnested array?
Given the string:

> 'I think that PostgreSQL is nifty'

I would like to operate on the individual words found within that string. Essentially, I have a separate dictionary table from which I can get word details, and I would like to join an unnested array of that string on this dictionary. So far I have:

    select word, meaning, partofspeech
    from unnest(string_to_array('I think that PostgreSQL is nifty',' ')) as word
    from table t
    join dictionary d on t.word = d.wordname;

This accomplishes the fundamentals of what I was hoping to do, but it does not preserve the original word order. Related question: PostgreSQL unnest() with element number
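One way to keep the order, sketched under the assumption of PostgreSQL 9.4 or later (and reusing the `dictionary`/`wordname` names from the question), is `WITH ORDINALITY`, which tags each unnested element with its position:

```sql
-- WITH ORDINALITY (PostgreSQL 9.4+) numbers each element as it is unnested,
-- so the original word order can be restored with ORDER BY.
SELECT w.word, d.meaning, d.partofspeech
FROM unnest(string_to_array('I think that PostgreSQL is nifty', ' '))
     WITH ORDINALITY AS w(word, ord)
JOIN dictionary d ON d.wordname = w.word
ORDER BY w.ord;
```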
swasheck (10755 rep)
Oct 19, 2012, 08:00 PM • Last activity: May 15, 2024, 08:26 PM
1 votes
1 answers
582 views
Oracle 12.2 shows high parse per execution ratio for unknown lob$ query
In an AWR report I see the following (most likely recursive) query, which happens very often and is parsed each time:

    select obj#, intcol#, ts#, file#, block#, property from lob$ where lobj#=:1

The AWR entry for SQL ordered by parse calls:

    Parse Calls  Executions  % Total Parses  SQL Id         SQL Module  SQL Text
    411,384      411,414     39.62           6y55dxn24t86q              select obj#, intcol#, ts#, fil...

Any idea which component is generating that and why it is not a prepared statement? (Since our app sets "module" everywhere, we know it's not directly executed by us.) We do use LOBs heavily, so this could be related to reorganisation or maybe Data Guard?
eckes (1456 rep)
Jul 3, 2019, 01:18 PM • Last activity: Apr 21, 2024, 12:05 AM
0 votes
2 answers
2318 views
Extract/parse info between delimiters
I have a column where there are 10 "fields" separated by the pipe character; it looks like this:

> texthere | somemoretext | sometinghere | moreinfo | evenmorehere | etc | etc |

I've been trying to extract what's between the pipes. I can do the first two, but after that my brain gets stuck in an inception loop and I can't wrap my head around it. E.g., the table name is MyTable and the column is MyColumn:

    SELECT TOP 10 MyTable.ItemsID, MyTable.MyColumn,
    (SELECT SUBSTRING(MyTable.MyColumn, 1, CHARINDEX('|', MyColumn) - 1)) as Pos1,
    (SELECT SUBSTRING(MyTable.MyColumn, CHARINDEX('|', MyColumn) + 1, CHARINDEX('|', MyColumn))) as Pos2
    FROM MyTable

I get what I need for positions 1 and 2, but I'm not sure how to do the rest. I'm thinking I need to get the index of my 3rd pipe, but my brain freezes. I tried variations of:

    SELECT SUBSTRING(MyTable.MyColumn, CHARINDEX(SUBSTRING(MyTable.MyColumn, CHARINDEX('|', MyColumn) + 1, CHARINDEX('|', MyColumn)))) as Pos3

I've just looked into the REPLACE and PARSENAME functions. With REPLACE I could replace the pipe characters with periods so that PARSENAME can parse the dot-delimited values. However, I discovered that PARSENAME is limited to 4 values. I got nice results using STRING_SPLIT; however, this function requires compatibility level 130 and I can't alter the DB. My interactions with it are through APIs.
SELECT value
FROM STRING_SPLIT ((select mytable.mycolumn from dbo.mytable), '|');
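For servers below compatibility level 130, one commonly used workaround is the XML-cast split; a sketch reusing the question's `MyTable`/`MyColumn` names (it assumes the column contains no raw `&`, `<`, or `>` characters, which would need escaping first):

```sql
-- Convert 'a|b|c' into '<r>a</r><r>b</r><r>c</r>', cast to xml,
-- then pick each element by position; extend the pattern up to Pos10.
SELECT t.ItemsID,
       a.x.value('(/r)[1]', 'varchar(4000)') AS Pos1,
       a.x.value('(/r)[2]', 'varchar(4000)') AS Pos2,
       a.x.value('(/r)[3]', 'varchar(4000)') AS Pos3
FROM MyTable t
CROSS APPLY (SELECT CAST('<r>' + REPLACE(t.MyColumn, '|', '</r><r>') + '</r>' AS xml) AS x) a;
```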
bravo4one (101 rep)
Apr 4, 2022, 10:37 PM • Last activity: Apr 18, 2024, 12:04 AM
2 votes
2 answers
304 views
Using TRY_PARSE in SQL Server 2008
I have a script in SQL Server 2012 like this:
Declare @ParamSPKDateFrom varchar(50), @SPKDateFrom date

Select @SPKDateFrom = TRY_PARSE(@ParamSPKDateFrom AS date USING 'id-ID')
Can I convert this script to **SQL Server 2008**?
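TRY_PARSE was introduced in SQL Server 2012, so it is not available on 2008. A sketch of one fallback (assuming the server has the Indonesian language installed and the input uses dd/mm/yyyy, which style 103 matches):

```sql
-- Validate with ISDATE under the matching language setting, then CONVERT
-- with an explicit style; invalid input yields NULL instead of an error.
DECLARE @ParamSPKDateFrom varchar(50) = '15/02/2023',
        @SPKDateFrom date;

SET LANGUAGE Indonesian;  -- assumption: affects how ISDATE reads the string
SELECT @SPKDateFrom = CASE WHEN ISDATE(@ParamSPKDateFrom) = 1
                           THEN CONVERT(date, @ParamSPKDateFrom, 103)
                      END;
```

Unlike TRY_PARSE, ISDATE is not culture-precise, so edge cases should be tested.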
jubey simanjuntak (21 rep)
Feb 15, 2023, 12:14 PM • Last activity: Mar 9, 2023, 03:39 PM
1 votes
1 answers
867 views
Filtering out duplicate domains from URL column using Postgres full-text search parsers
I have a PostgreSQL database containing pages and links downloaded by a web crawler, with the following tables:

    pages
    ----------
    id: Integer (primary key)
    url: String (unique)
    title: String
    text: String
    html: String
    last_visit: DateTime
    word_pos: TSVECTOR

    links
    ----------
    id: Integer (primary key)
    source: String
    target: String
    link_text: String
    UNIQUE(source, target)

    crawls
    ---------
    id: Integer (primary key)
    query: String

    crawl_results
    -------------
    id: Integer (primary key)
    score: Integer (constraint 0<=score<=1)
    crawl_id: Integer (foreign key, crawls.id)
    page_id: Integer (foreign key, pages.id)

The source and target fields in the links table contain URLs. I am running the following query to extract scored links from the top-ranking search results, for pages that haven't been fetched yet:

    WITH top_results AS
        (SELECT page_id, score FROM crawl_results WHERE crawl_id=$1 ORDER BY score LIMIT 100)
    SELECT top_results.score, l.target
    FROM top_results
    JOIN pages p ON top_results.page_id=p.id
    JOIN links l ON p.url=l.source
    WHERE NOT EXISTS (SELECT pp.id FROM pages pp WHERE l.target=pp.url)

However, ***I would like to filter these results so that only one row is returned for a given domain (the one with the lowest score)***. So for instance, if I get (0.3, 'http://www.foo.com/bar') and (0.8, 'http://www.foo.com/zor'), I only want the first, because it has the same domain foo.com and the lower score. I was able to find documentation for the built-in full-text search parsers, which can parse URLs and extract the hostname. For instance, I can extract the hostname from a URL as follows:

    SELECT token FROM ts_parse('default', 'http://www.foo.com') WHERE tokid = 6;
        token
    -------------
     www.foo.com
    (1 row)

However, I can't figure out how I would integrate this into the above query to filter out duplicate domains from the results. And because this is in the docs for "testing and debugging text search", I don't know whether this use of ts_parse() is even how the URL parser is intended to be used in practice. ***How would I use the host parser in my query above to return one row per domain? Also, how would I appropriately index the links table for host and url lookups?***
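One way to express "one row per domain, lowest score", sketched with PostgreSQL's `DISTINCT ON` and a regex `substring()` instead of `ts_parse()` (the regex is an assumption about the URL shapes involved):

```sql
-- DISTINCT ON keeps the first row per domain; the ORDER BY makes that
-- first row the one with the lowest score.
WITH top_results AS (
    SELECT page_id, score FROM crawl_results
    WHERE crawl_id = $1 ORDER BY score LIMIT 100
)
SELECT DISTINCT ON (substring(l.target FROM '://([^/]+)'))
       top_results.score, l.target
FROM top_results
JOIN pages p ON top_results.page_id = p.id
JOIN links l ON p.url = l.source
WHERE NOT EXISTS (SELECT 1 FROM pages pp WHERE l.target = pp.url)
ORDER BY substring(l.target FROM '://([^/]+)'), top_results.score;
```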
J. Taylor (379 rep)
Apr 6, 2019, 08:34 AM • Last activity: Jun 27, 2022, 10:00 AM
0 votes
0 answers
121 views
Saving JSON files in SSMS renders them unparsable?
I have a query that outputs JSON, i.e.,

    select 1 AS 'test' for json path, without_array_wrapper

After clicking on the result, which is {"test":1}, SSMS 18.10 opens a file with the JSON output. If I save this file as test.json and attempt to load it elsewhere (either in an editor like Atom, in Python, etc.), I get an error. In comparison, if I copy the output and paste it into an empty test.json file, it works just fine. What is going on here?
Rushabh Mehta (251 rep)
Mar 1, 2022, 02:40 PM
1 votes
0 answers
332 views
Scalar function parses and runs in SSMS, incorrect syntax in ScriptDOM
I'm preparing the database migration from compatibility level 140 (SQL Server 2017) to level 150 (SQL Server 2019). As part of the preparation, Data Migration Assistant (DMA) has been run, and it found a syntax error in one of the scalar functions.

    CREATE OR ALTER FUNCTION dbo.Trim(@String nvarchar(MAX), @NullIfEmpty tinyint)
    RETURNS nvarchar(MAX)
    AS
    BEGIN
        RETURN NULLIF(TRIM(CHAR(32) + CHAR(160) + CHAR(10) + CHAR(13) + CHAR(9) FROM @String),
                      IIF(@NullIfEmpty = 1, '', NULL))
    END

I can parse and run this function in both compatibility levels 140 and 150 without issues. But when I've tried parsing it with ScriptDOM (tutorial by Mala Mahadevan) or other tools that I assume are using ScriptDOM (the SQL Prompt formatter, Data Migration Assistant), I've got the same error.

**PowerShell:**

    1 parsing error(s):
    {
      "Number": 46010,
      "Offset": 123,
      "Line": 5,
      "Column": 10,
      "Message": "Incorrect syntax near NULLIF."
    }

**Data Migration Assistant:**

    {
      "Name": "Dtatest",
      "Databases": [
        {
          "ServerName": "MyServer",
          "Name": "DtaTest",
          "CompatibilityLevel": "CompatLevel140",
          "SizeMB": 16.0,
          "Status": "Completed",
          "ServerVersion": "15.0.2000.5",
          "AssessmentRecommendations": [
            {
              "CompatibilityLevel": "CompatLevel140",
              "Category": "Compatibility",
              "Severity": "Error",
              "ChangeCategory": "BreakingChange",
              "RuleId": "Microsoft.Rules.Data.Upgrade.UR00001",
              "Title": "Syntax issue on the source server",
              "Impact": "While parsing the schema on the source database, one or more syntax issues were found. Syntax issues on the source database indicate that some objects contain unsupported syntax due to which all assessment rules were not run on the object.",
              "Recommendation": "Review the list of objects and issues reported, fix the syntax errors, and re-run assessment before migrating this database.",
              "MoreInfo": "",
              "ImpactedObjects": [
                {
                  "Name": "[dbo].[Trim]",
                  "ObjectType": "Object",
                  "ImpactDetail": "Object [dbo].[Trim] has syntax errors. Incorrect syntax near NULLIF.. Error number 46010. For more details, please see: Line 5, Column 10.",
                  "SuggestedFixes": []
                }
              ]
            },
            {
              "CompatibilityLevel": "CompatLevel150",
              "Category": "Compatibility",
              "Severity": "Error",
              "ChangeCategory": "BreakingChange",
              "RuleId": "Microsoft.Rules.Data.Upgrade.UR00001",
              "Title": "Syntax issue on the source server",
              "Impact": "While parsing the schema on the source database, one or more syntax issues were found. Syntax issues on the source database indicate that some objects contain unsupported syntax due to which all assessment rules were not run on the object.",
              "Recommendation": "Review the list of objects and issues reported, fix the syntax errors, and re-run assessment before migrating this database.",
              "MoreInfo": "",
              "ImpactedObjects": [
                {
                  "Name": "[dbo].[Trim]",
                  "ObjectType": "Object",
                  "ImpactDetail": "Object [dbo].[Trim] has syntax errors. Incorrect syntax near NULLIF.. Error number 46010. For more details, please see: Line 5, Column 10.",
                  "SuggestedFixes": []
                }
              ]
            }
          ],
          "ServerEdition": "Developer Edition (64-bit)"
        }
      ],
      "ServerInstances": [
        {
          "ServerName": "localhost",
          "Version": "15.0.2000.5",
          "Status": "Completed",
          "AssessmentRecommendations": []
        }
      ],
      "SourcePlatform": "SqlOnPrem",
      "Status": "Completed",
      "StartedOn": "2022-02-02T07:34:30.8110417+00:00",
      "EndedOn": "2022-02-02T07:34:42.9071256+00:00",
      "EvaluateFeatureRecommendations": false,
      "EvaluateCompatibilityIssues": true,
      "EvaluateFeatureParity": false,
      "TargetPlatform": "SqlServerWindows2019",
      "DmaVersion": {}
    }

**SQL Prompt:**

SQL Prompt formatting error

If it makes any difference, the function is marked as *is_inlineable* in the *sys.sql_modules* DMV. Until now I thought the SQL engine used ScriptDOM as well, but I'm getting different results. Any idea why?

**The whole repro script for the DMA:**

    CREATE DATABASE DtaTest
    ALTER DATABASE DtaTest SET COMPATIBILITY_LEVEL = 140
    GO
    USE DtaTest
    GO
    CREATE OR ALTER FUNCTION dbo.Trim(@String NVARCHAR(MAX), @NullIfEmpty TINYINT)
    RETURNS NVARCHAR(MAX)
    AS
    BEGIN
        RETURN NULLIF(TRIM(CHAR(32) + CHAR(160) + CHAR(10) + CHAR(13) + CHAR(9) FROM @String),
                      IIF(@NullIfEmpty = 1, '', NULL))
    END
    GO
    CREATE TABLE dbo.DummyTable
    (
        Id tinyint PRIMARY KEY
    )
Zikato (5724 rep)
Feb 2, 2022, 07:54 AM • Last activity: Feb 2, 2022, 08:03 AM
1 votes
3 answers
18374 views
Oracle alertlog with WARNING: too many parse errors
The SQLID is not available. Any ideas on how to find the source of the problem?

    9913 WARNING: too many parse errors, count=414884 SQL hash=0x896ff002
    9914 PARSE ERROR: ospid=24108, error=6550 for statement:
    9915 2021-08-24T11:01:17.843825+02:00
    9916 begin work
    9917 Additional information: hd=0x24c179460 phd=0x24c17a888 flg=0x28 cisid=222 sid=222 ciuid=222 uid=222 sqlid=6jknqnu4qzw02
    9918 ...Current username=SCHEMA_NAME
    9919 ...Application: xxxx.exe Action:
    9920 2021-08-24T11:03:07.555279+02:00
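Since the alert log does report a hash and a sqlid, one possible starting point (a sketch; these views only help while the statement is still cached) is to look the statement up in `v$sql`:

```sql
-- 0x896ff002 from the alert log is the statement's hash_value in v$sql.
SELECT sql_id, parsing_schema_name, module, action, sql_text
FROM   v$sql
WHERE  hash_value = TO_NUMBER('896ff002', 'xxxxxxxx');
```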
r0tt (1078 rep)
Sep 1, 2021, 10:00 AM • Last activity: Sep 2, 2021, 11:36 AM
1 votes
0 answers
101 views
Parsing calculated members to StrToMember() function
The first 9 calculated members in my script below define start and end date strings ('StartDateStr' and 'EndDateStr') that I then pass into a StrToMember() function within my SELECT command, but it throws the following error:

> Query (27, 22) The Dimension '[StartDateStr]' was not found in the cube when the string, [StartDateStr], was parsed.

These calculated members resolve as follows:

- StartDateStr = [Date].[Day].[Day].&[2020-07-22T00:00:00]
- EndDateStr = [Date].[Day].[Day].&[2021-07-20T00:00:00]

If I place these strings directly in my SELECT command (which defeats the point of them being dynamic parameters), the query works fine. I've tried passing the calculated members into my SELECT command without the StrToMember() function as well, but it throws the same error. I've also tried renaming my calculated members (e.g. '[Date].[Day].[Day].StartDateStr') and passing only the variable (member) portion of the dimension, hierarchy, and level reference. Can I not use the StrToMember() function this way, or pass calculated members like this, or does my script have some other syntactic issue I'm missing? I've hit a bit of a dead-end after an extensive Google search, so I would greatly appreciate any help. Thanks
WITH
    MEMBER TodayStr AS FORMAT(NOW(),'yyyy-MM-dd')
    MEMBER TodayDate AS CDATE(CSTR(LEFT(TodayStr,4)+'-'+LEFT(RIGHT(TodayStr,5),2)+'-'+RIGHT(TodayStr,2)))
    MEMBER No1 AS WEEKDAY(TodayDate,1) + 3
    MEMBER No2 AS 7
    MEMBER Mod AS No1 - INT(No1 / No2) * No2
    MEMBER EndDate AS IIF(WEEKDAY(TodayDate) = 4,TodayDate - Mod - 8,TodayDate - Mod - 1)
    MEMBER EndDateStr AS '[Date].[Day].[Day].&['+RIGHT(LEFT(EndDate,10),4)+'-'+RIGHT(LEFT(EndDate,5),2)+'-'+LEFT(EndDate,2)+'T00:00:00]'
    MEMBER StartDate AS EndDate - (7 * 52) + 1
    MEMBER StartDateStr AS '[Date].[Day].[Day].&['+RIGHT(LEFT(StartDate,10),4)+'-'+RIGHT(LEFT(StartDate,5),2)+'-'+LEFT(StartDate,2)+'T00:00:00]'
    MEMBER [Measures].[FilteredHrs] AS
        IIF([Measures].[Roster Actual Sum Hours Nett] = 24, [Measures].[Roster Actual Sum Hours Nett], NULL)
SELECT 
    NON EMPTY 
        {[Measures].[FilteredHrs]} ON COLUMNS,
    NON EMPTY 
        {
                [Staff].[StaffNumber].[StaffNumber].ALLMEMBERS*
                [Date].[Roster Week].[Roster Week].ALLMEMBERS*
                [Date].[Day].[Day].ALLMEMBERS*
                [Pay Type].[Pay Type].[Pay Type].ALLMEMBERS
        }
     ON ROWS
FROM 
(
    SELECT
        {STRTOMEMBER(StartDateStr):STRTOMEMBER(EndDateStr)} ON COLUMNS
    FROM [Model]
)
Kopite833 (11 rep)
Jul 27, 2021, 10:47 PM
1 votes
1 answers
637 views
I am confused about when Oracle database won't do parsing
I am confused about when an Oracle database won't do parsing. In the AWR report there is a metric called "execute to parse"; when it increases, it means more SQL is executed without parsing. But the Oracle documentation describes it thus: "When an application issues a SQL statement, the application makes a parse call to the database to prepare the statement for execution." It seems that every time a SQL statement is issued, parsing will be called. So I am wondering: when won't Oracle parse, making "execute to parse" a larger number? Or have I just misunderstood? What the Oracle documentation says is:

SQL Parsing

*The first stage of SQL processing is parsing. The parsing stage involves separating the pieces of a SQL statement into a data structure that other routines can process. The database parses a statement when instructed by the application, which means that only the application, and not the database itself, can reduce the number of parses. When an application issues a SQL statement, the application makes a parse call to the database to prepare the statement for execution.*

https://docs.oracle.com/database/121/TGSQL/tgsql_sqlproc.htm#TGSQL178

So if "an application issues a SQL statement, the application makes a parse call", then how can applications "reduce the number of parses"?
Jason Yang (11 rep)
Apr 3, 2021, 12:55 PM • Last activity: Apr 4, 2021, 01:56 PM
0 votes
1 answers
367 views
How can I measure the impact of queries with errors being rejected by SQL Server?
After configuring XE on November 23, 2020 to capture some errors, I counted 2,163,665 occurrences of the same error coming from requests of a legacy app (which probably won't be corrected any time soon). The errors can be classified under the 1st and 2nd steps of [Processing a SQL Statement](https://learn.microsoft.com/en-us/sql/odbc/reference/processing-a-sql-statement?view=sql-server-ver15) :

> 1. The DBMS first parses the SQL statement. It breaks the statement up
> into individual words, called tokens, makes sure that the statement
> has a valid verb and valid clauses, and so on. Syntax errors and
> misspellings can be detected in this step.
> 2. The DBMS validates the statement. It checks the statement against
> the system catalog. Do all the tables named in the statement exist
> in the database? Do all of the columns exist and are the column
> names unambiguous? Does the user have the required privileges to
> execute the statement? Certain semantic errors can be detected in
> this step.

The doc also says that **Parsing a SQL statement does not require access to the database and can be done very quickly**, but the amount of requests being fired at the server made me think I should verify. I'd like to measure the resource-consumption impact caused by those wrong requests being rejected at those steps. Is there a way to do so?
Ronaldo (6017 rep)
Feb 22, 2021, 10:04 AM • Last activity: Feb 22, 2021, 01:29 PM
4 votes
1 answers
11643 views
XML parsing in PostgreSQL
I have a problem with parsing a simple XML file: 1 IT 1 2 3 4 5 Pražská platba kartou. I parsed this XML file with the following code:

    DO $$
    DECLARE
        myxml xml;
    BEGIN
        myxml := XMLPARSE(DOCUMENT convert_from(pg_read_binary_file('MyData.xml'), 'UTF8'));
        DROP TABLE IF EXISTS my;
        CREATE TABLE my AS
        SELECT (xpath('//ID', x))::text AS ID,
               (xpath('data-set/@Name', x))::text AS Name,
               (xpath('//ID_CUSTOMER', x))::text AS id_customer,
               (xpath('//Adress', x))::text AS Adress,
               (xpath('//Desc', x))::text AS tgen
        FROM unnest(xpath('//data-set', myxml)) x;
    END$$;

    select * from my

Unfortunately, this parse gives me only the first single row in the result. I need to create a table where all records land in their own rows:

    Rows1 - ID 1, ID_CUSTOMER 1, Adress Pražská, Desc Platba kartou
    Rows2 - ID 1, ID_CUSTOMER 2, Adress Pražská, Desc Platba kartou
    Rows3 - ID 1, ID_CUSTOMER 3, Adress Pražský, Desc Platba kartou
    Rows4 - ........
    Rows5 - ........

Thank you for your tips.
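A possible alternative, sketched for PostgreSQL 10 or later: `XMLTABLE` emits one output row per node matched by its row expression, unlike `xpath()`, which collapses all matches into a single array. The row path `'//data-set/record'` and the column element names are assumptions, since the XML markup was lost from the question:

```sql
-- Replace the CREATE TABLE ... AS SELECT inside the DO block with this;
-- each matched <record> becomes one row in the result.
CREATE TABLE my AS
SELECT x.*
FROM XMLTABLE('//data-set/record'
              PASSING myxml
              COLUMNS id          int  PATH 'ID',
                      id_customer int  PATH 'ID_CUSTOMER',
                      adress      text PATH 'Adress',
                      descr       text PATH 'Desc') AS x;
```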
michal (103 rep)
Nov 13, 2016, 09:51 AM • Last activity: Jan 19, 2021, 02:04 PM
0 votes
1 answers
64 views
Why are CASE-expressions in the list documenting operator precedence?
In the list here: https://mariadb.com/kb/en/operator-precedence/ case-expressions are put at the same precedence level as BETWEEN, between the NOT operator and the comparison operators. However, case-expressions always begin with CASE and end with END, and all subexpressions are also delimited by the CASE keywords. They're like parenthetical expressions, so I don't understand why case-expressions are on this list. Is there an SQL expression that would be parsed differently if the case-expression precedence was set higher or lower? To give an example, with 2 + 3 * 4, we get different results when parentheses are used in these 2 ways: 2 + (3 * 4) and (2 + 3) * 4. This question is about how it's impossible to do the same with CASE. One can't substitute the use of + for CASE and show 2 different uses of parentheses such that the result differs between them. To compare with other RDBMSes, neither [SQLite](https://sqlite.org/lang_expr.html#operators) nor [PostgreSQL](https://www.postgresql.org/docs/current/sql-syntax-lexical.html#SQL-PRECEDENCE-TABLE) include CASE in their operator precedence lists.
JoL (289 rep)
Jul 25, 2020, 03:43 PM • Last activity: Jul 31, 2020, 03:43 PM
0 votes
2 answers
92 views
Convert Text Column Web Service Data to Columns
I have a database table which stores web transactions for a fundraising donation page. One of the columns in the table stores a bunch of concatenated data in the format 'data1=value1 data2=value2 data3=value3 moredata4=value3...'. The pairs represent fields/columns from an external database integrated through a REST service. God knows why it's stored this way, but I need to parse this column into multiple columns, and I cannot figure out how to do this in an efficient and clean way. Something to note: not all records/rows in the table have a value for every item in the combined column, so you might see 'data1=value1 data2= data3=data3...'. Also, not all 'fields/columns' in the concatenated pairs start with 'data'; some might say 'tranamt=50 coupon=2bb2', for example. Also to note, the 'space'-looking character between pairs is actually chr(10), not a space. I'd really appreciate anyone's help on this. I'm stumped. I've researched this endlessly on Stack Exchange and Google and have not been able to find anything that helps with this unique scenario. I've split delimited columns before, but not with a 'field/column' and value pair. I'm also new to Oracle. This is in an Oracle DB. Thanks!
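Since the keys are known up front, one sketch is Oracle's six-argument `REGEXP_SUBSTR`, which can return just the capture group after each `key=` up to the next chr(10) (the key and table/column names here are examples):

```sql
-- The 6th argument (available since 11g) selects capture group 1,
-- i.e. everything between 'key=' and the next chr(10).
SELECT REGEXP_SUBSTR(raw_col, 'tranamt=([^' || CHR(10) || ']*)', 1, 1, NULL, 1) AS tranamt,
       REGEXP_SUBSTR(raw_col, 'coupon=([^'  || CHR(10) || ']*)', 1, 1, NULL, 1) AS coupon
FROM my_table;  -- raw_col / my_table are placeholder names
```

Keys missing from a row (or present with an empty value, like 'data2=') simply come back as NULL.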
Tyler Hahn (3 rep)
Nov 15, 2019, 07:36 AM • Last activity: Nov 15, 2019, 09:55 AM
1 votes
1 answers
618 views
Parsing URL Links
I have a large data set of over 10k+ rows, and I'm trying to parse the URL links that people have clicked on. Here is the table:

    dbo.email_list

    UserID Cliked_Linked
    101012 https:// amz/profile_center?qp= 8eb6cbf33cfaf2bf0f51
    052469 htpps:// lago/center=age_gap=email_address=caipaingn4535=English_USA
    046894 https://itune/fr/unsub_email&utm=packing_345=campaign_6458_linkname=ghostrider

So I tried this code:

    UPDATE email_list
    set Clicked_Link = REVERSE(SUBSTRING(REVERSE(Cliked_Link), CHARINDEX('.', REVERSE(ColumnName)) + 1, 999))

Unfortunately this didn't work. The goal is to split the link at each '=' sign and have anything between the equal signs land in its own column. This is the result I hope to have:

    UserID COL_1                              COL_2                 COL_3                  COL_4
    101012 https:// amz/profile_center?qp     8eb6cbf33cfaf2bf0f51  NaN
    052469 htpps:// lago/center               email_addres          caipaingn4535          English_USA
    046894 https://itune/fr/unsub_email&utm   packing_345           campaign_6458_linknam  ghostrider
learning (49 rep)
Sep 27, 2019, 02:57 PM • Last activity: Sep 27, 2019, 03:29 PM
0 votes
2 answers
1769 views
SQL SERVER: How to get the stored procedure text EXCLUDING comments?
I need to find all the stored procs that use transactions, as I want to enable transaction abort in those procedures. However (I didn't do this; it's inherited), many of the stored procedures contain testing procedures within comment blocks, and most of the tests contain transaction blocks. I am only interested in changing stored procs that actually use transactions. And I want to be able to monitor when stored procs are updated, so that I can make sure that this flag is set:

    SET XACT_ABORT ON;

Addendum: based on comments, here are some examples from my system.

    /*
    -- clean up after tests
    BEGIN TRANSACTION
        EXEC dbo.AR_Cleanup_MoveEqualAndOppositeSBPLiabilities
    ROLLBACK TRANSACTION
    */

    /*
    Use case: 147
    BEGIN TRANSACTION
    ....
    */
Display name (103 rep)
Jul 29, 2019, 03:42 PM • Last activity: Jul 30, 2019, 01:12 PM
1 votes
1 answers
54 views
pgsql Return Partial Text Field Ending In A Whole Word
I found this PHP code that returns a sub-string ending with a whole word (i.e., separated by a space):

    if ( strlen( $body ) 0 ) $body = substr( $body , 0 , $rpos ); return $body;

Is it possible to achieve the same result using pgsql's string manipulation functions?
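PostgreSQL can do this with plain string functions; a sketch (the 80-character limit and the table/column names are examples):

```sql
-- Truncate to 80 chars, then cut back to the last space so the result
-- ends on a whole word; strings within the limit are returned as-is.
SELECT CASE
         WHEN length(body) <= 80 THEN body
         ELSE left(left(body, 80),
                   80 - position(' ' IN reverse(left(body, 80))))
       END AS trimmed
FROM articles;  -- placeholder table
```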
Ron Piggott (187 rep)
May 21, 2019, 03:18 PM • Last activity: May 21, 2019, 06:16 PM
1 votes
0 answers
1749 views
How do I parse Address into Street Number, Street Name, Type, Unit Number?
How do I parse an address into street number, name, type, and unit number? Is there a function which will do this?

    2280 MEYERS AVE
    ALDERGROVE AVE & S VINEWOOD ST
    SIMPSON WAY & N HALE AVE
    412 E WASHINGTON AVE #3
    AVOCADO AVE & LEMON ST
    572 N TULIP ST
    1030 HAWAII PL
    500 N MIDWAY DR A-E, ADMIN
user173948
Mar 18, 2019, 06:31 PM
0 votes
1 answers
344 views
SQL Server 2017 - Extracting source and target column names within a string containing column mappings
I have the following format string stored in text (there could be any number of columns):

    col1_source|col1_target;col2_source|col2_target;col3_source|col3_target;...

I'm trying to come up with an elegant way of extracting and isolating all the xxx_source column names and all the xxx_target column names, so I can store them in variables and get the following end result:

    @Source_Columns = 'col1_source,col2_source,col3_source'
    @Target_Columns = 'col1_target,col2_target,col3_target'

At the end of the day, I'd like to perform SELECTs on my source and target columns to perform data compares. This is what I've achieved so far, but I find it's just too complex for nothing (with a table-valued function):

    CREATE FUNCTION [dbo].[UF_miscParseStringToTable]
    (
        @list nvarchar(MAX),
        @sep varchar(8)
    )
    RETURNS @ts table
    (
        [ID] int identity,
        [value] nvarchar(MAX)
    )
    AS
    BEGIN
        -- Parameters check
        if ((@sep is null) or (datalength(@sep) = 0)) return
        if (left(@sep, 1) <> '%') set @sep = '%' + @sep
        if (right(@sep, 1) <> '%') set @sep = @sep + '%'

        -- Find first sep
        declare @i int
        set @i = patindex(@sep, @list)

        -- Acc values
        while (@i > 0)
        begin
            insert into @ts ([value]) values (rtrim(left(@list, @i - 1)))
            set @list = ltrim(right(rtrim(@list), len(@list) + 3 - (@i + len(@sep))))
            set @i = patindex(@sep, @list)
        end
        set @list = rtrim(@list)

        -- Insert last value, if any
        if (@list <> '') insert into @ts (value) values (@list)
        return
    END

The function above basically takes my mapping string and converts it to a list of column names in a table (see the query logic below):

    DECLARE @Delim varchar(1) = '|'
    DECLARE @Mapping varchar(max) = 'col1_source|col1_target;col2_source|col2_target;col3_source|col3_target'
    DECLARE @String varchar(max) = REPLACE(@Mapping, ';', @Delim)

    SELECT * FROM dbo.UF_miscParseStringToTable(@String, @Delim)

The above query yields the following table:

    ID | value
    1  | col1_source
    2  | col1_target
    3  | col2_source
    4  | col2_target
    5  | col3_source
    6  | col3_target

I could perhaps do a join on the column indexes, but I'm finding it difficult to isolate my source and target fields so that I can perform data comparisons between them. In addition, I'd like to avoid performing an extra join to a table if I don't have to. Here are the results desired (to be able to perform the following):

    SELECT col1_source, col2_source, col3_source FROM mytable;
    SELECT col1_target, col2_target, col3_target FROM mytable;

Any help or ideas would be great!

Shawn
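If the parsing function from the question is available, the odd/even pattern of its ID column can separate sources from targets, and `STUFF ... FOR XML PATH('')` can glue each side back together (a sketch; it assumes every source|target pair is complete):

```sql
-- After REPLACE, the string is one '|'-separated list where odd positions
-- are source columns and even positions are target columns.
DECLARE @Mapping varchar(max) =
    'col1_source|col1_target;col2_source|col2_target;col3_source|col3_target';
DECLARE @Source_Columns varchar(max), @Target_Columns varchar(max);

SELECT @Source_Columns = STUFF((
           SELECT ',' + value
           FROM dbo.UF_miscParseStringToTable(REPLACE(@Mapping, ';', '|'), '|')
           WHERE ID % 2 = 1 ORDER BY ID
           FOR XML PATH('')), 1, 1, ''),
       @Target_Columns = STUFF((
           SELECT ',' + value
           FROM dbo.UF_miscParseStringToTable(REPLACE(@Mapping, ';', '|'), '|')
           WHERE ID % 2 = 0 ORDER BY ID
           FOR XML PATH('')), 1, 1, '');
```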
shawnyshawny (13 rep)
Feb 21, 2019, 01:40 PM • Last activity: Feb 21, 2019, 04:19 PM
-2 votes
2 answers
959 views
How to convert a multi-row result set into a single-row text separated by commas
I'm using this query:

    select Name, valueType from mytable where ID = 1

and getting this table: (screenshot of the result set)

But I need to convert the result set into something like: 'idGasto int, noTicket string, fechaFact string, ..., etc.' Do you have any suggestions?
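If the server supports it, this aggregation is a single statement; a sketch in SQL Server syntax (STRING_AGG needs SQL Server 2017+; PostgreSQL has the same function but spells concatenation `||`):

```sql
-- Collapse all (Name, valueType) rows into one comma-separated line.
SELECT STRING_AGG(Name + ' ' + valueType, ', ') AS schema_line
FROM mytable
WHERE ID = 1;
```

On older SQL Server versions the same result is usually built with the STUFF ... FOR XML PATH('') trick.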
E.Rawrdríguez.Ophanim (311 rep)
Sep 12, 2018, 07:56 PM • Last activity: Sep 13, 2018, 03:43 AM
Showing page 1 of 20 total questions