
Unix & Linux Stack Exchange

Q&A for users of Linux, FreeBSD and other Unix-like operating systems

Latest Questions

2 votes
1 answer
5012 views
What does "is set-group-ID on execution - ignored" mean, and why am I unable to uncompress a file because of it?
I'm trying to uncompress a file using

```sh
gunzip GCF_000746645.1_ASM74664v1_genomic.fna.gz
```

... but I get the following error:

```none
gzip: GCF_000746645.1_ASM74664v1_genomic.fna.gz is set-group-ID on execution - ignored
```

I've learned that "set-group-ID on execution" refers to something related to permissions on the server, but I'm not entirely sure how, or what I should do about it.
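A minimal sketch of the usual fix, assuming the setgid bit on the file can simply be cleared: gzip declines to process files carrying setuid/setgid bits, and the bit shows up as an `s` (or `S`) in the group-execute slot of `ls -l`.

```sh
# show the mode: a setgid file has "s" or "S" in the group-execute slot
ls -l GCF_000746645.1_ASM74664v1_genomic.fna.gz

# clear the set-group-ID bit, then decompress as usual
chmod g-s GCF_000746645.1_ASM74664v1_genomic.fna.gz
gunzip GCF_000746645.1_ASM74664v1_genomic.fna.gz
```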
user452473 (21 rep)
Jan 23, 2021, 12:14 AM • Last activity: May 29, 2025, 08:07 AM
1 vote
1 answer
5811 views
zcat / gzip error while piping out
I am using zcat to output the contents of a rather large .gz compressed text file, piping the output to grep to search for a specific string:

```sh
dylan@xaelah:/media/dylan/ExtHD$ zcat wpaPasswords2.gz | grep baconisdelicious
```

The command runs for a while and then exits with the following error:

```none
gzip: wpaPasswords2.gz: unexpected end of file
```

I have also tried piping the output to tail. With default options it completes and prints ten lines, but they are not the last ten lines of the compressed file. If I pipe the output to `tail -n 1`:

```sh
dylan@xaelah:/media/dylan/ExtHD$ zcat wpaPasswords2.gz | tail -n 1
```

I get the same error message:

```none
gzip: wpaPasswords2.gz: unexpected end of file
```

While the command runs I watch htop; the bottleneck is I/O and the machine is not running out of RAM, so I am unclear how to determine the cause of this issue. I have checked the logs and have not found anything of value.

The reason I am doing all of this is that I am going to combine several very large .gz files while sorting them and removing duplicates, and I want to make sure my process will work before running my scripts over gigs and gigs of data. Any advice would be appreciated, thanks!
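Two non-destructive checks worth running here (a sketch; `salvaged.txt` is just an illustrative name): `gzip -t` verifies the archive without writing anything, and zcat can still salvage everything up to the point of damage.

```sh
# verify integrity: decompress and discard, reporting any damage
gzip -t wpaPasswords2.gz

# salvage the readable prefix: keep output up to the point of failure
zcat wpaPasswords2.gz > salvaged.txt 2>/dev/null
wc -l salvaged.txt
```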
Dylan (1048 rep)
Mar 10, 2015, 05:17 AM • Last activity: May 8, 2025, 09:01 PM
4 votes
1 answer
2189 views
Best compression for operating system image
I have an operating system image of size 2.5G and a device with limited space, so I was looking for the best possible compression. Below are the commands I tried and the sizes they produced:

1. tar with gzip: `tar c Os.img | gzip --best > Os.tar.gz` returned an image of 1.3G.
2. xz only: `xz -z -v -k Os.img` returned an image of 1021M.
3. xz with -9: `xz -z -v -9 -k Os.img` returned an image of 950M.
4. tar with xz and -9: `tar cv Os.img | xz -9 -k > Os.tar.xz` returned an image of 950M.
5. xz with -9 and -e: `xz -z -v -9 -k -e Os.img` returned an image of 949M.
6. lrzip: `lrzip -z -v Os.img` returned an image of 729M.

Is there any other, better solution (a command-line tool is preferable) for the compression?
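One more candidate worth benchmarking (an assumption, not a guaranteed win on this particular image): zstd at its highest levels, whose decompression is also far faster than xz or lrzip.

```sh
# --ultra unlocks levels above 19; --long widens the match window,
# which often helps on disk images; -k keeps the input file
# (decompression then also needs --long=31)
zstd --ultra -22 --long=31 -k Os.img -o Os.img.zst
```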
Sharvin26 (307 rep)
Mar 12, 2019, 02:44 PM • Last activity: Apr 26, 2025, 09:47 AM
240 votes
9 answers
497759 views
tar: Removing leading `/' from member names
```sh
root@server # tar fcz bkup.tar.gz /home/foo/
tar: Removing leading `/' from member names
```

How can I solve this problem and keep the `/` on file names?
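For reference, GNU tar's `-P`/`--absolute-names` flag keeps the leading `/`, with the usual caveat that extracting such an archive then writes to absolute paths:

```sh
tar fczP bkup.tar.gz /home/foo/
# equivalent long form:
tar --absolute-names -czf bkup.tar.gz /home/foo/
```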
superuser (2501 rep)
Dec 23, 2012, 12:47 PM • Last activity: Apr 1, 2025, 07:53 PM
1 vote
1 answer
102 views
Parallel processing of single huge .bz2 or .gz file
I would like to use GNU Parallel to process a huge .gz or .bz2 file. I know I can do:

```sh
bzcat huge.bz2 | parallel --pipe ...
```

But it would be nice if there was a way, similar to `--pipe-part`, to read multiple parts of the file in parallel. One option is to decompress the file:

```sh
bzcat huge.bz2 > huge
parallel --pipe-part -a huge ...
```

but huge.bz2 is huge, and I would much prefer decompressing it multiple times to storing it uncompressed.
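Not a true analogue of `--pipe-part`'s seek-based splitting, but a sketch that at least parallelizes the decompression side, assuming lbzip2 is installed: it decompresses ordinary .bz2 files with multiple threads before handing the stream on (the trailing `...` stands for your own command, as in the question).

```sh
# multi-threaded bzip2 decompression feeding parallel's round-robin pipe
lbzip2 -dc huge.bz2 | parallel --pipe ...
```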
Ole Tange (37348 rep)
Mar 28, 2025, 11:58 AM • Last activity: Mar 29, 2025, 10:33 AM
2 votes
2 answers
75 views
Recovery of a compressed image not possible due to lack of space - general understanding of compression methods
I used gzip to compress a disk image, which is still quite huge:

```sh
dd if=/dev/sda2 bs=1M | gzip -c -9 > sda2.dd.img.gz
```

Then I changed the partitioning of the drive because I wanted to install Linux. When trying to decompress the image and write it back to the former (smaller) partition with

```sh
gunzip sda2.dd.img.gz > /dev/sda2
```

(using `>` instead of dd, as described here), its content is first written to another file instead of directly to the partition as I expected. The problem: the partition holding the zipped file is too small for that. I could use external media, but that would take more time; I would prefer to write the decompressed data to the partition directly. So I decided to get more information about compression itself first, but it left me even more confused (see my first response here). Can anyone give a hint, please? Thanks in advance.
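A sketch of the streaming approach: the missing piece above is `-c`. Without it, gunzip replaces the .gz file with a decompressed sibling and leaves the redirection empty; with it, the data streams straight into the partition and no intermediate file is needed.

```sh
gunzip -c sda2.dd.img.gz > /dev/sda2
# equivalently, via dd for block-sized writes:
gunzip -c sda2.dd.img.gz | dd of=/dev/sda2 bs=1M
```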
Nisang Marc Pfannkuchen (21 rep)
Feb 24, 2025, 06:26 PM • Last activity: Feb 26, 2025, 08:43 PM
0 votes
0 answers
58 views
tar extracted file with bad output
I tarred one file with:

```bash
tar cf My-tarball.tar path/to/file.txt
```

Then compressed it:

```bash
gzip My-tarball.tar
```

But when I decompress and extract it:

```bash
gunzip My-tarball.tar.gz
tar -xf My-tarball.tar
```

the file is in a bad format; in Vim it shows a lot of:

```txt
^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^
```

and so on. I'm really worried that I've lost my file; I have no backup, which was stupid of me. What am I missing? What can I do?

**P.S.:** the whole process was run on an NTFS disk mounted with lowntfs-3g, if that helps. Thanks in advance.
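Before writing anything else to that disk, a few non-destructive checks may help narrow down where the damage is (a sketch; `/tmp/scratch` is just an illustrative path):

```sh
# list the members and their stored sizes without extracting
tar -tvf My-tarball.tar

# extract into a scratch directory instead of over the original path
mkdir -p /tmp/scratch
tar -xf My-tarball.tar -C /tmp/scratch
file /tmp/scratch/path/to/file.txt   # NUL-filled output often shows as "data"
```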
BenjamimCS (11 rep)
Feb 17, 2025, 10:24 PM • Last activity: Feb 18, 2025, 01:57 AM
20 votes
8 answers
139725 views
tar: Unexpected EOF in archive
I was attempting to untar a .tar.gz file, but came across this error:

```none
gzip: stdin: unexpected end of file
tar: Unexpected EOF in archive
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now
```

The .tar.gz file contains a .tar file, which when untarred results in:

```none
tar: Unexpected EOF in archive
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now
```

I tried both `--ignore-zeros` and `--ignore-failed-read`, but neither worked. Is there any way I could extract this file even if it is corrupted? The file type in question: .tar.gz: Gzip Compressed Data, from a UNIX system.
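A salvage sketch (the archive name is a placeholder): decompress as far as the stream allows and feed tar from the pipe, so every member that is complete before the point of truncation still gets extracted.

```sh
# tar will error out at the truncation point, but files listed
# before the error have been extracted intact
gunzip -c archive.tar.gz | tar -xvf -
```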
iLinux85 (391 rep)
Nov 5, 2012, 11:54 AM • Last activity: Feb 10, 2025, 08:32 PM
5 votes
3 answers
6813 views
How to check what compression type an RPM package uses?
I recently realized that RPM packages shifted from gzip compression to xz compression a few years ago. I need to check the compression type of an RPM package that I have. I also need to check what compression type my system assumes when it tries to unpack an RPM file.
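The payload compressor is recorded in the RPM header, so it can be queried without unpacking anything; on very old packages the tag may be absent, in which case gzip was the historical default (an assumption worth verifying on your distribution).

```sh
rpm -qp --qf '%{PAYLOADCOMPRESSOR}\n' package.rpm   # prints e.g. "gzip" or "xz"
```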
Alchemist (591 rep)
Aug 31, 2016, 07:07 PM • Last activity: Jan 31, 2025, 05:54 PM
247 votes
4 answers
99373 views
Why are tar archive formats switching to xz compression to replace bzip2 and what about gzip?
More and more tar archives use the xz format, based on LZMA2, for compression instead of the traditional bzip2 (bz2) compression. In fact, *kernel.org* made a late "*Good-bye bzip2*" announcement on 27th Dec. 2013, indicating kernel sources would from that point on be released in both tar.gz and tar.xz format - and on the main page of the website what's directly offered is in tar.xz. Are there any specific reasons explaining why this is happening, and what is the relevance of gzip in this context?
user44370
Jan 6, 2014, 06:39 PM • Last activity: Jan 21, 2025, 02:31 PM
1 vote
1 answer
66 views
Recover mis-gzipped folder / directory & files
I needed to compress/archive a folder, so I ran the following command:

```sh
gzip -v --rsyncable --fast -r myFolder/ -c > myFolderArchive.gz
```

...foolishly thinking this was going to do just what I thought it would: create an archive of `myFolder` and its files recursively. It even had a nice output:

```none
./myFolder/file1 ... 80%
./myFolder/file2 ... 20%
...
```

Opening the archive later, however, I only see a single file in it. A quick search led me to understand my mistake: gzip (or I guess, myself) has taken every file, compressed it, and concatenated the results one by one into a single file, losing all file/directory structure. In the meantime, I've `rm -r`'d the original folder. All I have now is myFolderArchive.gz.

Would anyone see a way to take that archive and potentially reconstruct the original set of files from myFolderArchive.gz's content, now that it's all mixed into a single gzipped file? I do still have access to the original disk (for a limited time) and could potentially attempt to recover at least the original directory structure (filesystem is ext4). Technically, the content/data itself is in myFolderArchive.gz; it would "just" need to be sliced right...
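A recovery sketch, not a guaranteed fix: `gzip -r -c` emits one gzip member per input file, back to back, and by default each member header stores the original file name (without its directory). Slicing the archive at each member magic (`1f 8b 08`) and running `gunzip -N` on the slices may therefore restore names and contents; false positives are possible wherever those bytes occur inside compressed data.

```sh
#!/bin/bash
# find candidate member offsets, then slice and decompress each one
offsets=$(LC_ALL=C grep -abo $'\x1f\x8b\x08' myFolderArchive.gz | cut -d: -f1)
total=$(stat -c %s myFolderArchive.gz)
i=0 prev=""
for off in $offsets $total; do
  if [ -n "$prev" ]; then
    dd if=myFolderArchive.gz of="member$i.gz" bs=1 skip="$prev" \
       count=$((off - prev)) 2>/dev/null
    gunzip -N "member$i.gz" || echo "member$i: not a real member boundary"
    i=$((i + 1))
  fi
  prev=$off
done
```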
Undo (113 rep)
Dec 22, 2024, 04:32 AM • Last activity: Dec 23, 2024, 05:59 AM
1 vote
0 answers
31 views
Extract and merge multiple tar.gz
I have multiple tar.gz archives; the size of each archive is approximately 40 GB:

```none
v1.0-trainval01_blobs.tgz
v1.0-trainval02_blobs.tgz
...
v1.0-trainval10_blobs.tgz
```

I can unpack each archive and get the following directory structure:

```none
v1.0-trainval01_blobs
  samples
  sweeps
v1.0-trainval02_blobs
  samples
  sweeps
...
v1.0-trainval10_blobs
  samples
  sweeps
```

This unpacking takes many hours, even days. But this is not enough! Next I have to merge the samples and sweeps folders:

```none
v1.0-trainval_all_blobs
  samples
  sweeps
```

And this is a time-consuming operation again... Can I unpack the contents of all the tar.gz archives into v1.0-trainval_all_blobs with a single command?
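A sketch, assuming each archive contains a single top-level directory as shown: extract everything into one target and strip that directory, so the samples and sweeps trees merge as they are unpacked. Note this is still one pass of decompression per archive; the time goes into the data volume, not the number of commands.

```sh
mkdir -p v1.0-trainval_all_blobs
for f in v1.0-trainval*_blobs.tgz; do
  # --strip-components=1 drops the per-archive top-level folder
  tar -xzf "$f" -C v1.0-trainval_all_blobs --strip-components=1
done
```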
Ars ML (11 rep)
Nov 27, 2024, 08:46 AM
2 votes
3 answers
331 views
Find positions of keyword in gzipped file
I'm trying to identify the parts of a MySQL DB that take the most space in gzipped dumps that I need to frequently save and restore, so that I can archive some tables partially. I suspect that different tables compress better or worse depending on their content, so their uncompressed size is not very relevant here. Is there any way I can map CREATE TABLE or INSERT statements in the DB dump to file positions in the .sql.gz file? Something like `gunzip` where `{0}` is the current position in the db.sql.gz file that gunzip is reading from.
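Compressed offsets are hard to get out of a plain gzip stream, but the compressed size per table can be measured another way (a sketch; it needs scratch space for the uncompressed dump pieces): split the dump at each CREATE TABLE and compress each piece separately.

```sh
# split at every CREATE TABLE statement; -z drops empty pieces
zcat db.sql.gz | csplit -z -f table_ - '/^CREATE TABLE/' '{*}'

# report the compressed size of each piece, largest first
for f in table_*; do
  printf '%s\t%s\n' "$(gzip -c "$f" | wc -c)" "$f"
done | sort -rn
```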
Anton Duzenko (149 rep)
Oct 9, 2024, 01:05 PM • Last activity: Oct 11, 2024, 03:24 PM
17 votes
8 answers
29596 views
How to gzip 100 GB files faster with high compression?
We have 100+ GB files on a Linux machine, and when trying to gzip one with the command below, gzip takes a minimum of 1-2 hours to complete:

```sh
gzip file.txt
```

Is there a way to make gzip run faster while keeping the same level of compression?

***

CPU: Intel(R) Core(TM) i3-2350M CPU @ 2.30 GHz
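A common route (assuming installing an extra package is an option) is pigz, a parallel gzip implementation: same output format, same default compression level, spread over all cores.

```sh
# -p sets the thread count; output is an ordinary file.txt.gz
pigz -p 4 file.txt
```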
Ravi (409 rep)
Dec 10, 2020, 06:25 PM • Last activity: Oct 10, 2024, 08:41 PM
0 votes
1 answer
65 views
Is it possible to compress a tar ball with gzip/bzip2/xz after the tar ball file has been created?
If we create a tarball with the following command:

```sh
tar -cvf Docs.tar $HOME/Documents/*
```

then, after the tarball has been created, is it possible to use gzip, bzip2, xz, or some other compression utility to compress the tar file? I know we can pass the `--gzip`, `--bzip2`, or `--xz` option along with `-cvf` while creating the tar, but what if that was not done, and compression is to be applied after the tar is created? Is it possible? If yes, then how?
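Yes; a tarball is an ordinary file, so any compressor applies after the fact. A minimal sketch:

```sh
gzip Docs.tar    # produces Docs.tar.gz
# or, for better ratios:
xz Docs.tar      # produces Docs.tar.xz
```

Each tool removes the original by default; pass `-k` to keep the uncompressed tar as well.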
KDM (116 rep)
Oct 2, 2024, 01:50 PM • Last activity: Oct 2, 2024, 02:15 PM
0 votes
1 answer
119 views
"Permission denied" error when using sudo for writing image to /dev/sdb on Debian
I am confused as to why I am getting the following error:

```sh
$ sudo zcat firmware.A20-OLinuXino-Lime2.img.gz partition.img.gz > /dev/sdb
bash: /dev/sdb: Permission denied
```

Shouldn't `sudo` be able to run any command? Why can't it run this one? I am attempting to install Debian on my A20-OLinuXino-Lime2. Thank you so much for any support you can provide.
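The redirection is performed by the unprivileged shell before sudo ever runs, which is why root privileges don't help here. Two standard workarounds, sketched:

```sh
# run the whole command line, including the redirection, as root
sudo sh -c 'zcat firmware.A20-OLinuXino-Lime2.img.gz partition.img.gz > /dev/sdb'

# or keep zcat unprivileged and let tee do the privileged write
zcat firmware.A20-OLinuXino-Lime2.img.gz partition.img.gz | sudo tee /dev/sdb > /dev/null
```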
SpreadingKindness (23 rep)
Sep 6, 2024, 01:15 AM • Last activity: Sep 6, 2024, 05:55 AM
23 votes
3 answers
40172 views
Use gzip to compress the files in a directory except for already existing .gz files
I have a directory of logs that I would like to compress with a scheduled gzip job. The issue is that I don't want to recompress the logs I've already compressed. I tried `ls | grep -v gz | gzip`, but that doesn't seem to work. Is there a way to do this? Basically, I want to gzip every file in the directory that does not end in .gz.
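The pipeline fails because gzip compresses the files named on its command line (or its stdin as raw data), not names fed to it as text. A sketch with find, which selects the non-.gz files and passes them as arguments:

```sh
find . -maxdepth 1 -type f ! -name '*.gz' -exec gzip {} +
```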
jabbajac (335 rep)
Oct 10, 2014, 03:35 PM • Last activity: Aug 28, 2024, 07:47 PM
6 votes
6 answers
4301 views
Estimate compressibility of file
Is there a quick and dirty way of estimating gzip-compressibility of a file without having to fully compress it with gzip? I could, in bash, do

```sh
bc <<<"scale=2;$(gzip -c file | wc -c)/$(wc -c <file)"
```

This gives me the compression factor without having to write the .gz file to disk; this way I can avoid replacing a file on disk with its .gz version if the resultant disk space savings do not justify the trouble. But with this approach the file is indeed fully put through gzip; it's just that the output is piped to wc rather than written to disk. Is there a way to get a rough compressibility estimate for a file without having gzip work on all of its contents?
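A rough sketch: compress only a fixed-size sample and extrapolate. This reads a fraction of the data, at the cost of misestimating files whose character changes later on; 10 MiB is an arbitrary choice.

```sh
sample=$((10 * 1024 * 1024))
compressed=$(head -c "$sample" file | gzip -c | wc -c)
bc <<<"scale=2;$compressed/$sample"
```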
iruvar (17005 rep)
Sep 16, 2014, 04:48 PM • Last activity: Aug 22, 2024, 12:45 AM
0 votes
0 answers
140 views
vmlinuz to vmlinux ERROR
```sh
$ file vmlinuz
vmlinuz: Linux kernel x86 boot executable bzImage, version 4.14.244 (root@d0ea4514eda5) #1 SMP Thu Aug 31 01:23:02 PDT 2023, RO-rootFS, swap_dev 0x3, Normal VGA
```

I tried to use extract_vmlinux and vmlinux-to-elf to extract vmlinux from vmlinuz, but they report the following errors, respectively:

```none
$ vmlinux-to-elf vmlinuz vmlinux
Traceback (most recent call last):
  File "/usr/local/bin/vmlinux-to-elf", line 63, in <module>
    ElfSymbolizer(
  File "/usr/local/lib/python3.8/dist-packages/vmlinux_to_elf/elf_symbolizer.py", line 44, in __init__
    kallsyms_finder = KallsymsFinder(file_contents, bit_size)
  File "/usr/local/lib/python3.8/dist-packages/vmlinux_to_elf/kallsyms_finder.py", line 177, in __init__
    self.find_linux_kernel_version()
  File "/usr/local/lib/python3.8/dist-packages/vmlinux_to_elf/kallsyms_finder.py", line 225, in find_linux_kernel_version
    raise ValueError('No version string found in this kernel')
ValueError: No version string found in this kernel
```

```none
$ ./extract_vmlinux vmlinuz > vmlinux
extract_vmlinux: Cannot find vmlinux.
```

Then I tried manual extraction:

```none
$ od -A d -t x1 vmlinuz | grep 'fd 37 7a 58 5a 00'
3254032 fd 37 7a 58 5a 00 44 65 73 74 69 6e 61 74 69 6f

$ dd if=vmlinuz of=vmlinuz_unxz bs=1 skip=3254032
116928+0 records in
116928+0 records out
116928 bytes (117 kB, 114 KiB) copied, 1.10249 s, 106 kB/s

$ xz -d vmlinuz_unxz
xz: vmlinuz_unxz: Compressed data is corrupt
```

What went wrong? Any suggestions for extracting vmlinux? Thank you!
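One observation worth checking (a hypothesis, not a certain diagnosis): the bytes right after the matched magic, `44 65 73 74 69 6e 61 74 69 6f`, decode to the ASCII text "Destinatio", which does not look like an xz stream header, so that hit may be a false positive inside string data. A sketch for scanning more broadly; note also that `od | grep` misses any magic that straddles od's 16-byte line boundary, which `grep -abo` avoids.

```sh
# scan for all embedded signatures at once, if binwalk is available
binwalk vmlinuz

# or search for the other compressor magics with exact byte offsets
LC_ALL=C grep -abo $'\x1f\x8b\x08' vmlinuz              # gzip
LC_ALL=C grep -abo $'\x28\xb5\x2f\xfd' vmlinuz          # zstd
LC_ALL=C grep -abo $'\xfd\x37\x7a\x58\x5a\x00' vmlinuz  # xz, all hits
```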
pipik (1 rep)
Aug 16, 2024, 02:14 AM • Last activity: Aug 16, 2024, 02:31 AM
1 vote
0 answers
172 views
Recoverably recompress gzip files into zstd, preserving original checksums?
I need to archive a lot of gzip-compressed data. The problem is that, compared to zstd, gzip is wasteful both in compression ratio and in the CPU time required to decompress. Because of that, I want to recompress the data into zstd. Unfortunately, I need to be able to reconstruct the SHA/MD5 checksums of the original compressed gzip files, in order to prove their origin. Is that possible? If the gzip algorithm were deterministic it would be trivial, but I don't have access to information about which version of gzip was used, what the compression level was, etc.
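Bit-exact regeneration of a .gz generally requires the exact gzip implementation, level, and header fields, which are unknown here; tools such as precomp try to detect the original deflate parameters so the stream can be rebuilt, but success is not guaranteed. A workaround sketch that preserves provenance instead of reconstruction (`manifest.sha256` is an illustrative name): record the original digests before recompressing and keep the manifest alongside the zstd files.

```sh
for f in *.gz; do
  sha256sum "$f" >> manifest.sha256           # digest of the original .gz
  zcat "$f" | zstd -19 -T0 -o "${f%.gz}.zst"  # recompress the payload
done
```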
d33tah (1381 rep)
Aug 12, 2024, 02:35 PM