
Unix & Linux Stack Exchange

Q&A for users of Linux, FreeBSD and other Unix-like operating systems

Latest Questions

5 votes
1 answer
3191 views
sqlite3 command line - how to set mode and import in one step
I need to be able to do this via command line in one step:

lab-1:/etc/scripts# sqlite3 test.db
SQLite version 3.8.10.2 2015-05-20 18:17:19
Enter ".help" for usage hints.
sqlite> .mode csv ;
sqlite> .import /tmp/test.csv users
sqlite> select * from users;
John,Doe,au,0,"",1,5555,91647fs59222,audio
sqlite> .quit

I've tried the following:

lab-1:/etc/scripts# sqlite3 test.db ".mode csv ; .import /tmp/deleteme.csv users"

and

lab-1:/etc/scripts# sqlite3 test.db ".mode csv .import /tmp/deleteme.csv users"

I don't get errors but I also don't end up with any data in the users table. Any tips would be appreciated.
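A minimal sketch of two ways around the single-argument limitation (test.db and the CSV path are taken from the question): recent sqlite3 builds accept each dot-command as a separate shell argument, and every build reads dot-commands from standard input.

# each dot-command as its own argument (newer sqlite3 builds)
sqlite3 test.db ".mode csv" ".import /tmp/test.csv users"

# or feed the dot-commands on standard input (works with older builds too)
printf '.mode csv\n.import /tmp/test.csv users\n' | sqlite3 test.db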
dot (755 rep)
May 23, 2018, 06:46 PM • Last activity: Jun 12, 2025, 08:19 PM
2 votes
3 answers
2013 views
Convert SQLite CSV output to JSON
I want to format SQLite output in JSON format from the command line. Currently, I have CSV output that looks like this:

label1,value1
label2,value2
label3,value3
...

Now I'd like to have it formatted like this:

{'label1' : 'value1', 'label2': 'value2', ... }

Thanks!
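A minimal single-pass sketch with awk (db.sqlite and the query are placeholders; it emits double-quoted keys and values, since the single quotes shown above would not be valid JSON, and it assumes the labels and values themselves contain no commas or quotes):

sqlite3 -csv db.sqlite "select label, value from t;" |
awk -F, '
    BEGIN { printf "{" ; sep = "" }
    { printf "%s\"%s\": \"%s\"", sep, $1, $2 ; sep = ", " }
    END { print "}" }'

For what it's worth, sqlite3 releases 3.33.0 and later can also emit JSON natively with .mode json, though as an array of row objects rather than one flat object.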
michelemarcon (3593 rep)
Jan 13, 2016, 03:45 PM • Last activity: Jun 12, 2025, 08:10 PM
1 vote
1 answer
57 views
Open reference from Zotero using rofi / dmenu
I have a number of collections in Zotero (a bibliography manager). I would like to be able to open the associated PDFs using rofi or dmenu, and have a list of entries to choose from of the following form:

[COLLECTION][AUTHORS]TITLE.pdf

Has anyone created a script for this?
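Not a full answer, but a minimal sketch of the selection-and-open plumbing, assuming attachments live under the default ~/Zotero/storage directory; building the [COLLECTION][AUTHORS]TITLE.pdf labels would additionally need a query against zotero.sqlite, which is omitted here:

#!/bin/bash
# list candidate PDFs, let rofi present them, open whatever is picked
find ~/Zotero/storage -type f -name '*.pdf' |
    rofi -dmenu -i -p 'zotero' |
    xargs -r xdg-open

Replacing rofi -dmenu -i -p 'zotero' with dmenu gives the dmenu variant.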
Rastapopoulos (1869 rep)
Nov 7, 2024, 01:48 PM • Last activity: Nov 8, 2024, 11:59 AM
1 vote
1 answer
100 views
Why doesn't my sqlite3 history work after I run a dotnet core web app?
Over the past year or two I've been using sqlite3 a lot for web development projects. Invariably I would stumble upon a problem where suddenly the history of commands would no longer work (when pressing the up arrow you get command history, just like you do in bash).

**FYI** I'm running Ubuntu 22.04.3 LTS (6.5.0-21-generic) and .NET Core 8.0.200

**What I've Tried** I couldn't figure this out.

* I examined my ~/.sqlite_history file and it looked normal.
* Checked for a $SQLITE_HISTORY env variable, but I don't have this and it still works (more below)
* Updated to the latest version of sqlite3 (3.45.1 2024-01-30)
* I've researched quite a few issues related to this, but still couldn't figure out why suddenly the history didn't work.

**Edit - Tried Console App, Also Causes Problem**

1. Opened a new terminal
2. Opened a sqlite3 db and tested the history command -- it works.
3. Exited sqlite3 back to the command line (same terminal)
4. Ran a .NET Core console app which outputs "Hello, World" to the console and exits (started with dotnet run).
5. Opened the same sqlite3 db from step 2. Tried history. History no longer works in this terminal window.

**Finally Discovered What Causes the Problem** I finally discovered why sqlite3 history doesn't work. However, I don't understand why this causes an issue. To reproduce the problem I have to:

1. Start the dotnet core web app. Example: ($ dotnet run --project myWebApi)
2. Stop the webapi (CTRL-C)
3. Start sqlite3 to examine my db -- at this point I have no history (pressing the up arrow provides no results)

**A Clue To The Issue** If I open a new terminal and run sqlite3 then I do have history again. The dotnet core app is running as me and I don't think there is anything odd happening there, but I'm not sure. Do you know why this is happening?

The solution ended up being:

$ tput init

or at the sqlite command line:

> .shell tput init
raddevus (123 rep)
Mar 7, 2024, 04:21 PM • Last activity: Oct 16, 2024, 11:52 AM
10 votes
1 answer
3312 views
How can SQLite command ".import" read directly from standard input?
How would you modify function csv_to_sqlite so that [sqlite3](https://sqlite.org/cli.html) command [.import](https://sqlite.org/cli.html#importing_csv_files) reads directly from standard input instead of from a temporary named pipe?
#!/bin/bash

function csv_to_sqlite() {
  local database_file_name="$1"
  local table_name="$2"
  local temp_input_fifo=$(mktemp -u)
  mkfifo $temp_input_fifo
  sqlite3 -csv $database_file_name ".import $temp_input_fifo $table_name" &
  cat > $temp_input_fifo
  rm $temp_input_fifo
}

database_file_name=$1
table_name=$2

csv_to_sqlite "$database_file_name" "$table_name"
$ printf "col1,col2,col3\na,1,2.6\nb,2,5.4\n" | ./csv_to_sqlite test.sqlite test
$ sqlite3 -csv -header test.sqlite "SELECT * FROM test"
col1,col2,col3
a,1,2.6
b,2,5.4
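For reference, a minimal sketch of the approach usually suggested, assuming a sqlite3 build whose .import opens ordinary paths so /dev/stdin works (that assumption is the crux of the question, so verify it against your version):

#!/bin/bash
# import CSV rows arriving on standard input; no named pipe needed
function csv_to_sqlite() {
  local database_file_name="$1"
  local table_name="$2"
  sqlite3 -csv "$database_file_name" ".import /dev/stdin $table_name"
}

csv_to_sqlite "$1" "$2"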
Derek Mahar (567 rep)
Mar 25, 2021, 09:46 PM • Last activity: Sep 4, 2024, 09:54 PM
27 votes
1 answer
14584 views
SQLite3 command line: how do you cancel a command?
I made a mistake writing a command at the SQLite command prompt, which I now want to abort. This is how my command line looks:

sqlite> select * from todos'
   ...> '
   ...> ;^C

In this case, likely because I have opened a quote, I can't even hit ENTER to run the command; I just get a line continuation. That would still be less than ideal anyway, because I would have to run bad code and cause an error just to regain control of the prompt. How can I cancel the line/command and get returned back to a prompt?
the_velour_fog (12760 rep)
Jun 21, 2016, 08:33 AM • Last activity: Jul 11, 2024, 07:38 AM
1 vote
1 answer
86 views
Add list of urls to the Qutebrowser’s sqlite3 history
### General overview

I have a history.csv CSV file of URLs formatted like this:

;;2024-03-30T12:00:00;2024-03-30T12:00:00

(The two _2024-03-30T12:00:00_ timestamps are the same.) The goal is to get these URLs into the history of Qutebrowser, which is stored in SQLite.

### What I did

#### Prerequisite

First I closed Qutebrowser.

#### Opening the DB

I ran:

sqlite3 ~/.local/share/qutebrowser/history.sqlite

#### Adding the new URLs

In the sqlite3 shell I did:

.mode csv
.separator ';'
.import urls.txt History

Things seemed good, except for a few badly formatted lines like:

history.csv:11671: unescaped " character
history.csv:11671: unescaped " character
history.csv:11772: expected 4 columns but found 6 - extras ignored

#### Verification

I verified the changes took place with:

SELECT * FROM History;

And yes, I saw the URLs.

#### Closing

Finally I closed the console with:

.quit

### The problem

When I went back to Qutebrowser and tried to open some of the URLs to see if it worked, it didn't: the URLs seem not to have been imported.

### The question

What should I do to import URLs into Qutebrowser's history, either the way I described or any other way?

## Update

The links I added appear when I use the :history command inside Qutebrowser, but they do not appear in history completion. So it's not a question of the database itself.
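A hedged hypothesis based on the update: qutebrowser's completion widget is fed from a separate CompletionHistory table in the same database, so rows inserted only into History never show up in completion. The table and column names below are assumptions; check them with .schema before running anything:

-- copy the imported rows into the completion table
-- (assumed schema: CompletionHistory(url, title, last_atime), History(url, title, atime, redirect))
INSERT INTO CompletionHistory (url, title, last_atime)
SELECT url, title, atime FROM History;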
fauve (1529 rep)
Mar 31, 2024, 12:59 PM • Last activity: Mar 31, 2024, 11:06 PM
0 votes
2 answers
55 views
places.sqlite buggy? seems to behave abnormally. How to get $LAST_VISITED_URL?
I posted here. Here is my last-url.sh:
#!/bin/bash

rm -rf /tmp/db.sqlite
cp ~/.mozilla/firefox/*.default-esr/places.sqlite /tmp/db.sqlite
LAST_URL=$(sqlite3 /tmp/db.sqlite "select url from moz_places order by last_visit_date desc limit 1;")

echo "$LAST_URL"
My script keeps giving me a stackoverflow.com URL instead of the youtube.com one, even if I reload the page. Where is the issue hiding? I have only one xxxxxx.default-esr profile.

[Screenshot: Firefox history SQLite database; testing examples: the "google test" is a brand-new visited URL, the others were refreshed pages]

Now I ask myself how Firefox creates the History tab (it reflects the real history): which file, or which SQL request on places.sqlite, is it using?
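One plausible cause (an assumption, not confirmed by the question): Firefox keeps places.sqlite in WAL mode, so the most recent visits live in places.sqlite-wal until a checkpoint, and copying only the main database file misses them. A sketch that brings the WAL along:

#!/bin/bash
# copy the database together with its write-ahead log so recent visits are visible
rm -f /tmp/db.sqlite /tmp/db.sqlite-wal /tmp/db.sqlite-shm
cp ~/.mozilla/firefox/*.default-esr/places.sqlite /tmp/db.sqlite
cp ~/.mozilla/firefox/*.default-esr/places.sqlite-wal /tmp/db.sqlite-wal 2>/dev/null
sqlite3 /tmp/db.sqlite "select url from moz_places order by last_visit_date desc limit 1;"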
r01_mage (1 rep)
Mar 6, 2024, 10:01 PM • Last activity: Mar 9, 2024, 01:42 AM
0 votes
1 answer
333 views
SQLite3 randomly changes file permissions and owner
I use sqlite3 for a PHP app. The functions I use are from the SQLite3 extension built in to PHP. I do normal CRUD actions, including resetting the DB for testing; this includes dropping tables and reinserting dummy data. The DB file has permissions set with sudo chmod a+rw and ownership set with sudo chown $USER:www-data. This all works fine. But at least once per day I will get a "no write" error, and when I look at the DB, the permissions and ownership have changed: the user and group change to $USER:$USER and the file permissions change to -rw-r--r--. What could be causing this behavior? OS: Debian 12
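One common explanation (an assumption; nothing in the question confirms it): some process running as $USER re-creates the file, for example a script that deletes and rebuilds the database, and a newly created file takes that process's user, group, and umask (-rw-r--r-- is exactly the default 022 umask result). A mitigation sketch: keep the database in its own setgid directory so new files at least inherit the www-data group. The path is hypothetical:

sudo mkdir -p /var/lib/myapp                 # hypothetical location for the DB
sudo chown $USER:www-data /var/lib/myapp
sudo chmod 2775 /var/lib/myapp               # setgid: new files inherit group www-data

SQLite also creates journal/WAL files next to the database, so the directory must be writable by www-data in any case.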
Vinn (236 rep)
Jan 20, 2024, 03:05 PM • Last activity: Jan 20, 2024, 11:27 PM
0 votes
1 answer
124 views
Pipe sql query to find command
OS: RockyLinux 8.5. This command works wonderfully:

sqlite3 files.db "select file from A;" | rsync -R -av --files-from=/dev/stdin /SOURCE /DESTINATION/Out

The result without the pipe:

sqlite3 files.db "select file from A;"
file1.txt
file2.txt
file3.txt

I'm trying to use the same method but pipe to a find command:

sqlite3 files.db "select file from A;" | find -type f -name /dev/sdin

It does not return any values.
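find does not read names from standard input, and -name /dev/sdin (note the typo for /dev/stdin, which would not work here either) is just a literal pattern that matches nothing. A minimal sketch that feeds each row to find instead, reusing the paths from the rsync example:

sqlite3 files.db "select file from A;" |
while IFS= read -r name; do
    find /SOURCE -type f -name "$name"
done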
Gary Schermer (1 rep)
Nov 15, 2023, 03:01 AM • Last activity: Nov 16, 2023, 12:01 AM
2 votes
2 answers
2091 views
Parsing SQLite output
Using the -line option I get this output from SQLite:

row1_column1 = valueA
row1_column2 = valueB

row2_column1 = valueC
row2_column2 = valueD

So there is one line for each column value, and result rows are separated by a blank line. I need to get this output into an array or list containing:

valueA,valueB
valueC,valueD

Additionally, non-numeric (!) values shall be enclosed by ' and '. Is there any simple way to do this? Thanks for any hint!
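A minimal awk sketch (the database and query are placeholders): strip the "column = " prefix from each line, quote non-numeric values, and emit one comma-joined line per blank-line-terminated row:

sqlite3 -line db.sqlite "select * from t;" | awk '
    NF == 0 { if (row != "") print row; row = ""; next }       # blank line: row complete
    {
        v = $0
        sub(/^ *[^=]* = /, "", v)                              # keep only the value
        if (v !~ /^-?[0-9]+(\.[0-9]+)?$/) v = "\047" v "\047"  # quote non-numeric
        row = (row == "" ? v : row "," v)
    }
    END { if (row != "") print row }'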
me.at.coding (3169 rep)
Sep 9, 2012, 03:30 PM • Last activity: Nov 11, 2023, 08:37 PM
4 votes
3 answers
329 views
check patterns that don't exist in sqlite
I explained a similar situation with plain text files in https://unix.stackexchange.com/questions/29624/grep-huge-number-of-patterns-from-huge-file/30144. Many people there said I should, so now I'm migrating my data to a sqlite database: I have a file from which I extract about 10,000 patterns. Then I check if the database doesn't contain such patterns. If it doesn't, I need to save them externally in a file for further processing:

for id in $(grep ^[0-9] keys); do
    if [[ -z $(sqlite3 db.sqlite "select id from main where id = $id") ]]; then
        echo $id >> file
    fi
done

Since I'm new to SQL, I couldn't find a simple way to do this. Also, this loop is useless as it is 20 times slower than what I achieved with awk at the mentioned URL. Since the database is huge, keeps growing, and I run this loop very frequently, is it possible to make this faster?
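A minimal sketch of the usual set-based fix, replacing ~10,000 sqlite3 invocations with one: load the candidate ids into a temporary table and let a single query report the missing ones (file names taken from the question; /tmp/candidate_ids is a scratch file):

grep '^[0-9]' keys > /tmp/candidate_ids
sqlite3 db.sqlite <<'EOF' >> file
CREATE TEMP TABLE candidates (id TEXT);
.import /tmp/candidate_ids candidates
SELECT id FROM candidates WHERE id NOT IN (SELECT id FROM main);
EOF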
admirabilis (4792 rep)
Jan 31, 2012, 12:54 AM • Last activity: Nov 11, 2023, 01:26 PM
3 votes
4 answers
2240 views
Generating unique IDs for json content indexing
I am looking for effective and simple ID generation for the following content using a bash script:

{"name": "John", "surname": "Gates", "country": "Germany", "age": "20", "height": "180"}
{"name": "John1", "surname": "Gates", "country": "Germany", "age": "20", "height": "180"}
{"name": "John2", "surname": "Gates", "country": "Germany", "age": "20", "height": "180"}
{"name": "John3", "surname": "Gates", "country": "Germany", "age": "20", "height": "180"}

{"id": "XXX", "name": "John", "surname": "Gates", "country": "Germany", "age": "20", "height": "180"}
{"id": "XXX", "name": "John1", "surname": "Gates", "country": "Germany", "age": "20", "height": "180"}
{"id": "XXX", "name": "John2", "surname": "Gates", "country": "Germany", "age": "20", "height": "180"}
{"id": "XXX", "name": "John3", "surname": "Gates", "country": "Germany", "age": "20", "height": "180"}

I will have approximately 5,000,000 similar records and I want to generate repeatable, predictable IDs. As I am constrained by time, I need to process the file into an SQLite database on a Linux machine within a 20-minute window. MD5 and SHA1 are too expensive to use, unless I can do something like GNU parallel on 16 threads of an AMD Ryzen 1900X CPU to get it done in a few minutes? With MD5, I managed to calculate 28,000 IDs in 1 min 45 seconds. With SHA1 it took me 2 min 3 seconds. I was thinking about creating the ID very simply:

JohnGatesGermany20180
John1GatesGermany20180
John2GatesGermany20180
John3GatesGermany20180

What could you recommend where the following requirements have to be met:

- bash
- Linux
- 5,000,000 records to process
- under 20 minutes
- the id has to be the same for the same json lines

Performed tests:

#!/usr/local/bin/bash
while IFS= read -r line
do
    uuid=$(uuidgen -s --namespace @dns --name "www.example.com")
done > test3.txt < testfile1.txt

$ time bash script.sh

real 12m49.396s
user 12m23.219s
sys 4m1.417s
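The dominant cost in the test above is forking uuidgen once per line; any single-process pass is orders of magnitude faster. A hedged sketch of the simple concatenated ID the question proposes, using jq (an assumed dependency):

# one process for all 5,000,000 lines: prepend a deterministic id built from the fields
jq -c '{id: (.name + .surname + .country + .age + .height)} + .' testfile1.txt > with_ids.txt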
Anna (53 rep)
Aug 2, 2018, 08:43 PM • Last activity: Nov 10, 2023, 04:42 PM
0 votes
0 answers
153 views
how to install missing python modules offline without managers
I have a device running an embedded OS (Custom Linux buildroot). It is offline and has **no** package manager (apt, yum, et al.) and Python has several missing modules, one of which is key for our application to run. Python also does not have pip either (which makes a kind of sense, it being an offline system). **I cannot** rebuild the OS since it is currently running and mission-critical systems have to stay up. All I need to do is add the Python sqlite3 module to this thing and *poof*, application is in business. On my development box, I have tried doing a
pip freeze > reqs.txt
to see what modules I am using, but none of the Python default modules are listed, most notably sqlite3. Can I just copy the module directly from my Python modules folder to the device's Python modules folder and be done, or is this going to make a mess? Forgive the seemingly noob question, but I've never played with this kind of offline scenario when Python isn't fully realized on the machine., and given that I can't have the app throwing errors on the running box, I'm leery of just tinkering.
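Worth spelling out: sqlite3 is part of Python's standard library and is backed by the compiled _sqlite3 C extension, which is why pip freeze never lists it. A quick way to locate the pieces a copy would involve (exploratory; exact paths differ per build):

# on the development box: where do the sqlite3 module and its C extension live?
python3 -c 'import sqlite3, _sqlite3; print(sqlite3.__file__); print(_sqlite3.__file__)'

The printed paths are a pure-Python package (sqlite3/) and a shared object (_sqlite3*.so); a straight copy can only work if the target's Python version, architecture, and libsqlite3 match.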
WhiteRau (255 rep)
Sep 20, 2023, 08:01 PM • Last activity: Sep 28, 2023, 03:34 PM
0 votes
1 answer
62 views
Sqlite table creation
Is it possible to create tables based on a column's data? I currently have a table named Exchange which contains numerous columns:
ID:NAME:PRICE
1:Stick:12
2:Stone:20
3:Water:1
4:Water:1
But I want to create numerous tables based on the NAME column, selecting the rows that contain each name and moving them to their respective tables. So in the above example, the result would be 3 new tables: Stick, Stone & Water. The table Water would be:
1:Water:1
2:Water:1
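A minimal sketch of the usual pattern, driven from the shell so it covers every distinct name (the database file name exchange.db is a placeholder; note it keeps the original ID values rather than renumbering them as in the example above):

#!/bin/bash
db=exchange.db
sqlite3 "$db" "SELECT DISTINCT NAME FROM Exchange;" |
while IFS= read -r n; do
    # one table per distinct name, filled with that name's rows
    sqlite3 "$db" "CREATE TABLE \"$n\" AS SELECT * FROM Exchange WHERE NAME = '$n';"
done

Deleting the moved rows afterwards (DELETE FROM Exchange WHERE NAME = '...') would complete the "move".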
sql (1 rep)
Feb 25, 2023, 02:10 PM • Last activity: Feb 25, 2023, 03:28 PM
74 votes
6 answers
72052 views
Is there any (good) SQLite GUI for Linux?
I'm looking for a SQLite graphical administration utility for Linux but I can't seem to find any (I found an extension for Firefox, but I'm not a user of that browser). Is there any that you know of?
Jimmy (743 rep)
May 14, 2012, 10:51 AM • Last activity: Feb 15, 2023, 09:27 AM
0 votes
1 answer
9751 views
Migrate nextcloud sqlite database to mysql inside docker
How do I migrate SQLite to MariaDB inside a Docker image? I installed the docker-compose example from the Nextcloud readme (the base version with Apache), but that one still uses SQLite, even though a db docker image is created as well; it is just not used. How do I migrate the SQLite data to the created db docker image? I tried:

docker-compose exec --user www-data app php occ db:convert-type --all-apps mysql nextcloud 127.0.0.1 nextcloud

which then asks for the database password, but the password I initially set in docker-compose.yml doesn't work.
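One hedged observation: inside the app container, 127.0.0.1 refers to that container itself, not to the database container; compose services reach each other by service name (assumed here to be db, as in the sample compose file). A sketch of the same conversion pointed at the right host, with the password passed explicitly:

docker-compose exec --user www-data app php occ db:convert-type \
    --all-apps --password "$MYSQL_PASSWORD" mysql nextcloud db nextcloud

The user, password, and database must match the MYSQL_* variables given to the db service in docker-compose.yml.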
rubo77 (30435 rep)
Aug 13, 2018, 08:37 AM • Last activity: Dec 14, 2022, 09:02 PM
0 votes
1 answer
2848 views
Bash script for sqlite commands
I wish to run a bash script that asks for a variable that is then used in a sqlite query. I have no real experience in scripting; anyway, I've tried something like the following, but it doesn't work. It doesn't even give an error, it just doesn't show anything.
#!/bin/bash
echo name	
read name	
sqlite3 /arch.db << 'EOF'
.headers on
select type, number, address from documents where name = '$name';
EOF
I'd appreciate any help.
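The likely culprit (a standard bash pitfall rather than anything sqlite-specific): quoting the here-document delimiter ('EOF') disables variable expansion, so the query literally searches for the string $name. A minimal sketch with the delimiter unquoted:

#!/bin/bash
read -r -p "name " name
sqlite3 /arch.db <<EOF
.headers on
select type, number, address from documents where name = '$name';
EOF

Note that interpolating $name straight into SQL breaks (and is injectable) if the input contains a quote; for untrusted input, sanitize the value first.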
M.I. (1 rep)
May 8, 2021, 04:03 PM • Last activity: Aug 16, 2022, 07:06 PM
1 vote
1 answer
4121 views
What exactly is being cached when opening/querying a SQLite database
I was asked to improve existing code to query SQLite databases. The original code made a lot of separate calls to the database and filtered the results in Python. Instead, I opted to re-write the database creation and put the filtering logic in the SQL query. I then ran benchmarks on databases of different sizes. Comparing with the original implementation, I found that the average query time over n=3 runs of a query was a lot faster in the new implementation (3 s vs. 46 **minutes**).

I suspected that this was a caching issue, but I wasn't sure of its origin. Between every query I closed the database connection, deleted any lingering Python variables, and ran gc, but the out-of-this-world speed persisted. Then I found that it was likely the system that was caching something. Indeed, when I clear the system's cache after every iteration with echo 3 > /proc/sys/vm/drop_caches, the performance is much more in line with what I expected (a 2-5x speed increase compared to an 80,000x speed increase).

The almost philosophical issue that I have now is what I should report as an improvement: the cached performance (as-is) or the non-cached performance (explicitly dropping the cache before queries). (I'll likely report both, but I am still curious about what is being cached.) I think it comes down to the question of what is actually being cached. In other words: does the caching represent a real-world scenario or not? I would think that if the database or its indices are cached, then the fast default performance is a good representation of the real world, as it would be applicable to new, unseen queries. However, if specific queries are cached instead, then the cached performance does not reflect on unseen queries.

Note: this might be an unimportant detail, but I have found that the impact of this caching is especially noticeable when using fts5 virtual tables!

Tl;dr: when the system is caching queries to SQLite, what exactly is it caching, and does that positively impact new, unseen queries?

If it matters: Ubuntu 20.04 with sqlite3.
Bram Vanroy (183 rep)
Aug 7, 2022, 12:02 PM • Last activity: Aug 7, 2022, 12:30 PM
0 votes
1 answer
1367 views
/var keeps filling due to yum cache
The /var/cache directory keeps filling up frequently due to yum metadata on a CentOS 7.5 server. Most of the space is used by the files below. I have set keepcache to 0 in /etc/yum.conf as well, but the issue is still not resolved. Could someone please shed some light on how to rectify this?
[4.0K]  centos7-x86_64-updates
[   0]  cachecookie
[ 52M]  filelists.xml.gz
[4.0K]  gen
[535M]  filelists.xml
[252M]  filelists.xml.sqlite
[2.7G]  other.xml
[2.8G]  other.xml.sqlite
[201M]  primary.xml
[255M]  primary.xml.sqlite
[716M]  other.xml.gz
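keepcache=0 only controls downloaded package files; the repository metadata shown above is kept regardless. A hedged cleanup sketch (the cron schedule is illustrative):

# drop the cached repository metadata now
yum clean metadata            # or "yum clean all" to drop cached packages too

# keep it from piling up again, e.g. with a daily cron entry (run as root):
echo '0 3 * * * root /usr/bin/yum clean metadata' > /etc/cron.d/yum-clean-metadata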
subash ct (1 rep)
May 5, 2022, 06:33 AM • Last activity: May 5, 2022, 07:24 AM
Showing page 1 of 20