
Unix & Linux Stack Exchange

Q&A for users of Linux, FreeBSD and other Unix-like operating systems

Latest Questions

2 votes
0 answers
72 views
Want access to root directory from remote computer via Dolphin or Gnome Files app
I am setting up a new computer in my home. Both my old computer and my new one run flavors of Ubuntu Linux: the new one Kubuntu 24.04.1, the old one Ubuntu 24.04.1. There are certain files on the old system that I want to copy to the new one, and I have the sshd_config files set to permit access by my username with ChrootDirectory /.

I find the following situation occurring: running command-line sftp, I can connect to the other computer and successfully do cd /. Running FileZilla, I can also reach the root directory of the remote. However, with Dolphin, though I can make an sftp://myname@remote connection, after entering the password I can't go above /home/myname. As an experiment I made the reverse connection, from the old computer to the new one, and there too I cannot get above my home directory.

How can I make this work? In particular, if FileZilla and command-line sftp manage to do so, why can't Dolphin or Gnome Files?
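For reference, a minimal sketch of the kind of sshd_config stanza described above (hypothetical; the username and the internal-sftp subsystem choice are assumptions, not taken from the question):

```
# /etc/ssh/sshd_config on the machine being copied FROM (sketch)
Subsystem sftp internal-sftp
Match User myname
    ChrootDirectory /    # "/" effectively disables the chroot jail
```

With ChrootDirectory / the server imposes no restriction at all, so a client that still lands in /home/myname is choosing its own starting directory; one thing worth ruling out is the URL Dolphin is given (sftp://myname@remote/ with a trailing slash versus sftp://myname@remote).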
Steve Cohen (519 rep)
Jan 27, 2025, 11:02 PM
2 votes
1 answer
2529 views
Locate and manage files on extern hard drive through FileZilla over SSH
I have installed Debian Server on one of my servers and plugged an external hard drive into it through USB (sdb1, 2, and 5 in the image below); it currently holds all my files. I previously used this WD Red drive (which is in a cabinet right now) in a Synology NAS, and I have not formatted it and don't want to.

Now I want to manage these files through FileZilla over SSH. I can't boot into Synology's operating system on my server for unknown reasons, so I installed Debian Server instead. I know I could install Xpenology on a USB drive and boot from there, but I don't know how, so Debian is the first choice at the moment. FreeNAS is a no-go since it requires 8 GB of RAM, and my server only has 1 to 1.5 GB.

How can I access my files through FileZilla over SSH? Where can I find the location of the files on the external hard drive? Is mount the correct command for this, and how do I use it?
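As a sketch of the mount workflow asked about (the device name and mount point are assumptions based on the sdb1/2/5 mentioned above):

```shell
lsblk -f                          # list partitions and their filesystems
sudo mkdir -p /mnt/wdred
sudo mount /dev/sdb5 /mnt/wdred   # sdb5 assumed to be the data partition
ls /mnt/wdred                     # files are then reachable over SFTP under /mnt/wdred
```

Note that Synology volumes often sit on Linux software RAID or LVM even with a single disk, so if a plain mount of the partition fails, assembling the array first (e.g. with mdadm --assemble --scan) may be needed.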
Airikr (121 rep)
Feb 7, 2016, 09:55 PM • Last activity: Jan 7, 2025, 03:07 PM
0 votes
1 answer
128 views
Inadvertently deleted a directory using FileZilla. How can I get it back?
I was using FileZilla to try to transfer some files. I made one mouse-click in the wrong place and wound up deleting a directory with over 14,000 files that I very much did not want deleted. I'm sure FileZilla is doing something like rm -rf and not moving the directory to "trash". I have a backup with IDrive, but their restore function is also failing. I have a call in to them, so there is hope in that direction, but is there anything I can do to restore it locally?

**UPDATE (1 day later):** Never mind, the files weren't deleted. The mouse click in FileZilla had moved the directory under the one above it in the list. I thought I had checked this with the find command, but I must have done it wrong. At least we've re-established that there's nothing to be done if you really did delete the folder, and that there's no point in using FileZilla unless you're managing lots of servers. The plain Ubuntu Files app can handle my needs, though the feature is a bit hidden; someone suggested I use it, but it turned out not to be necessary.
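As a postscript, a quick way to double-check whether a directory was really deleted or merely moved, as happened here, is to search the whole tree by name (the paths below are hypothetical stand-ins):

```shell
mkdir -p /tmp/demo/parent/photos     # stand-in for the "moved" directory
find /tmp/demo -type d -name photos  # prints every matching directory anywhere below /tmp/demo
```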
Steve Cohen (519 rep)
Jan 4, 2025, 03:27 AM • Last activity: Jan 6, 2025, 12:41 AM
1 vote
0 answers
65 views
Show directory owner id instead of name via FTP (Filezilla)
I am setting up the directory permissions of a fresh Debian 12 server. I have added my user (uid=1000) to the www-data group. My commands are as below:
sudo chown -R 1000:www-data public_html/
sudo chmod -R g+s public_html/
But when I go to FileZilla I see user www-data and not 1000 www-data. Is there any way I can show the ID and not the name? Thanks in advance.
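The name-vs-ID distinction is just presentation: tools resolve the numeric IDs to names for display, and FileZilla shows whatever the server sends in its directory listing. The local analogue of the two views is ls with and without -n (the file name below is hypothetical):

```shell
touch /tmp/demo.txt
ls -l  /tmp/demo.txt   # owner and group shown as resolved names
ls -ln /tmp/demo.txt   # -n: owner and group shown as raw numeric UID/GID
```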
Prithviraj Mitra (111 rep)
Jun 15, 2024, 09:08 AM • Last activity: Jun 15, 2024, 09:58 PM
0 votes
3 answers
742 views
Is there any way to upload large amounts of files to Google Drive with a decent speed besides Filezilla Pro and RClone?
I signed up for Google Drive's Premium 2 TB plan for my backups and needed a way to send more than 1 million files there. The web interface is very problematic for this purpose: if the transfer has any errors, identifying what's online and what's not would be very difficult. So I started to look for some safe way to send files there.

First I found google-drive-ftp-adapter, but as I was trying to install it I got an error that I couldn't solve: *"This app is blocked. This app tried to access sensitive info in your Google Account. To keep your account safe Google blocked this access."* The project has no activity, so I guessed this was some move by Google that wasn't addressed by the maintainers.

Then I tried to add Google Drive to Gnome's online accounts. It did mount the drive, but when I tried to upload the files the speed was ridiculous: around 15 KB/s, which would take something like 10000 hours for the files (around 543 GB), and I got errors even on the first files.

After that I bought FileZilla Pro, which connects to Google Drive, and it worked fine. The speed is around 15 MB/s, which is not ideal (around 10 hours of upload), but that's much better than 10000 and it's doable. FileZilla also has the "Failed transfers" tab that lets me see the transfers that had an error, and I just redo them.

I wanted a GUI tool, but I also gave rclone a chance. It proved to work really well and I could make it reach 32 MB/s, which brought the time down to 5 hours. The drawback is that if any transfer has an error I have to cherry-pick it out of a log file to retransfer it, which is more inconvenient than FileZilla. But none of these solutions felt like my endgame; each one has its drawbacks. I would really prefer a FOSS solution, preferably with a GUI, that simply lets me choose the dirs I want to copy, redo the errors easily, and is fast enough.

GUI tools are preferred, but I would consider a command-line tool if it's really efficient, lets me choose what I want, and has decent logging while copying, preferably visual. Does anyone have any other solution besides FileZilla Pro and rclone? How would you send 1 million+ files from several different folders to Google Drive?

**Edit** I really had underestimated the speed. It's been copying files for 17 hours and there's only 368 GB copied. With rclone. 🙄

**Edit2** Underestimated? That's a very far understatement. The upload finally concluded, and guess what: 3d 1h 13m 42.3s to upload 463.169 GiB*. Now, OK, I know about small-file overhead and such, but this level of overhead is not acceptable by any standard. I really have to find some way to speed things up to an acceptable level. Searching for a solution I saw that TeraBox has a feature called "cloud decompression". Will give it a try. *I know the value differs from what I stated above; that's because I had already uploaded part of the files in a previous run.

**Edit3** TeraBox does have cloud decompression. They offer 1 TB free, their paid plan (2 TB) is as cheap as it gets (U$3.49/month), and it can decompress online. Unfortunately, decompression of files larger than 12 GiB isn't supported, and to divide 530 GiB into 12 GiB packages I would have to cherry-pick files in the sub-dirs, which is a lot of work. Impractical. So this solution is only applicable if we either want to do that 3-day upload again or maintain subdirs as packages. No FTP client, paid or free, supports TeraBox the way FileZilla supports Google Drive, but they do have desktop apps for all major OSes (Windows, Mac, Linux) and a mobile app for Android. To quote U2: "And I still haven't found what I'm looking for" :D
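On the rclone error-handling drawback mentioned above: rclone copy is idempotent, so re-running the identical command re-attempts only files that are missing or differ, which avoids cherry-picking failures out of the log. A sketch (the remote name and paths are hypothetical):

```shell
rclone copy /data/backups gdrive:backups \
    --transfers 16 --checkers 16 \
    --progress \
    --log-file rclone.log --log-level INFO
# re-running the same command later retries anything that failed
```

Google Drive also rate-limits file creations on the server side, which may explain the multi-day upload of a million small files more than any client choice.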
Nelson Teixeira (470 rep)
May 9, 2024, 07:16 AM • Last activity: May 15, 2024, 02:22 AM
2 votes
4 answers
13863 views
Why can't I access to my ftp server with my local user account?
First of all, I'm using a CentOS 6 server, PuTTY and wordpress.org. I followed the instructions from this link to set up vsftpd and FTP on the CentOS server. In the vsftpd.conf file, these were the changes I made (all of them uncommented):

anonymous_enable=NO
local_enable=YES
chroot_local_user=YES

I then restarted the vsftpd service, and for iptables I enabled input and output for port 21. After entering the user account name and password for ftp://domain.com, it seems the server is not recognizing my username and password. They are the same credentials I have been using to log in to the CentOS server.

Then I found something on Google about getsebool. It mentioned that ftp_home_directory is turned off and I needed to turn it on with setsebool -P. OK, now I am able to connect using ftp in PuTTY, but not in the web browser or FileZilla.
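The SELinux boolean referred to above is usually spelled ftp_home_dir; a sketch of checking and setting it, under the assumption that SELinux is what is blocking home-directory access:

```shell
getsebool -a | grep ftp             # list the FTP-related booleans and their state
sudo setsebool -P ftp_home_dir on   # -P makes the change persist across reboots
```

If plain ftp in a terminal then works but browsers and FileZilla still fail, the remaining difference is typically passive mode: passive data connections use additional ports that iptables must also allow, beyond port 21.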
Voltic (83 rep)
Mar 26, 2014, 08:44 PM • Last activity: Sep 14, 2023, 02:04 PM
1 vote
0 answers
5321 views
How to get more details in log for lftp?
I'm fighting with logging on to an FTP server with lftp, and it simply hangs as shown below, despite my having tried various things. The text below is all I get from lftp:

$ lftp
lftp :~> debug -o log.txt -c -t 9
lftp :~> set ftp:ssl-force true
lftp :~> set ssl:verify-certificate no
lftp :~> set ftp:use-feat false
lftp :~> connect ftp.dataforsyningen.dk -p 990
lftp ftp.dataforsyningen.dk:~> login MadsSkjern
Password:
lftp MadsSkjern@ftp.dataforsyningen.dk:~> ls
`ls' at 0 [TLS negotiation...]

After ten minutes it is still hanging there without having timed out, so I abort with ctrl+c. I tried enabling logging at the highest level, 9 (source). But the text below is all I get (the log includes my aborting with ctrl+c):

$ cat log.txt
2022-11-28 18:03:10 ftp.dataforsyningen.dk ---- Resolving host address...
2022-11-28 18:03:10 ftp.dataforsyningen.dk ---- IPv6 is not supported or configured
2022-11-28 18:03:10 ftp.dataforsyningen.dk ---- 1 address found: 188.64.158.165
2022-11-28 18:03:16 ftp.dataforsyningen.dk ---- Connecting to ftp.dataforsyningen.dk (188.64.158.165) port 990
2022-11-28 18:04:16 ---- Closing control socket

What is going on? Did they really write a piece of software that doesn't log any more detail than this? Or what do I need to do to make it print more details? For comparison, the FileZilla log for a similar connect attempt is 157 lines at the highest log level.

**Version and environment details**

LFTP | Version 4.9.2
Libraries used: GnuTLS 3.7.3, idn2 2.3.2, Readline 8.1, zlib 1.2.11
Ubuntu 22.04.3 LTS
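One detail worth noting about the transcript above: port 990 conventionally carries implicit TLS, while connect with ftp:ssl-force still speaks the explicit-TLS FTP dialect (connect first, then negotiate), which can hang at [TLS negotiation...] exactly like this. A hedged sketch of forcing implicit TLS in lftp, using the same host:

```
set ssl:verify-certificate no
open ftps://ftp.dataforsyningen.dk:990   # the ftps:// scheme selects implicit TLS
```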
Mads Skjern (1005 rep)
Nov 28, 2022, 05:29 PM • Last activity: Sep 1, 2023, 06:31 PM
2 votes
2 answers
17239 views
How to fix FTP permission issue?
I have set up an FTP server and user, but it seems I'm unable to upload or edit any file, even though the directory has full 777 permissions. I can't even upload files to the user's root folder.

Server OS: Ubuntu
Client OS: Windows
FTP server / client: FileZilla

Log:
Status:	Connection established, waiting for welcome message...
Status:	Insecure server, it does not support FTP over TLS.
Status:	Server does not support non-ASCII characters.
Status:	Logged in
Status:	Starting download of /var/www/html/wp/staged/wp-content/themes/Newspaper/Newspaper/woocommerce/single-product.php
Status:	File transfer successful, transferred 1,193 bytes in 1 second
Status:	Starting download of /var/www/html/wp/staged/wp-content/themes/Newspaper/Newspaper/woocommerce/single-product.php
Status:	File transfer successful, transferred 1,193 bytes in 1 second
Status:	Starting upload of C:\Users\User\AppData\Local\Temp\fz3temp-2\single-product.php
Command:	PASV
Response:	227 Entering Passive Mode (165,227,173,119,117,244).
Command:	STOR single-product.php
Response:	550 Permission denied.
Error:	Critical file transfer error
File permissions and group membership are shown in screenshots (not reproduced here).

vsftpd.conf:
# Standalone mode
listen=YES
max_clients=200
max_per_ip=4
# Access rights
anonymous_enable=YES
local_enable=NO
write_enable=YES
anon_upload_enable=YES
anon_mkdir_write_enable=NO
anon_other_write_enable=NO
# Security
anon_world_readable_only=NO
connect_from_port_20=YES
hide_ids=YES
pasv_min_port=50000
pasv_max_port=60000
# Features
xferlog_enable=YES
ls_recurse_enable=NO
ascii_download_enable=NO
async_abor_enable=YES
# Performance
one_process_model=YES
idle_session_timeout=120
data_connection_timeout=300
accept_timeout=60
connect_timeout=60
anon_max_rate=50000
anon_mkdir_write_enable=NO
anon_other_write_enable=NO

#Userlist

userlist_deny=NO
userlist_enable=YES
userlist_file=/etc/vsftpd.allowed_users
vsftpd.allowed_users:
ftpuser
Any idea what's going on here?

Update: I have changed the permission of the folder that holds the file to 777, and it still doesn't work. The parent folder's permissions are shown in a screenshot (not reproduced here). Log:
Status:	Logged in
Status:	Starting download of /var/www/html/wp/staged/wp-content/themes/Newspaper/Newspaper/woocommerce/single-product.php
Status:	File transfer successful, transferred 1,193 bytes in 1 second
Status:	Starting upload of C:\Users\User\AppData\Local\Temp\fz3temp-2\single-product.php
Command:	PASV
Response:	227 Entering Passive Mode (*xxxxxxxxxxxx*).
Command:	STOR single-product.php
Response:	550 Permission denied.
Error:	Critical file transfer error
Status:	Retrieving directory listing of "/var/www/html/wp/staged/wp-content/themes/Newspaper/Newspaper"...
Status:	Directory listing of "/var/www/html/wp/staged/wp-content/themes/Newspaper/Newspaper" successful
Status:	Disconnected from server
Status:	Connection closed by server
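One thing that stands out in the vsftpd.conf above is anonymous_enable=YES together with local_enable=NO: if logins are happening anonymously (local_enable=NO disables real-user logins), then STOR is governed by the anon_* options and by which Unix user owns the target directory, not by its 777 mode. Purely as a sketch, assuming the intent is authenticated uploads by ftpuser rather than anonymous ones, the relevant lines would look like:

```
# sketch only: switch from anonymous to local-user logins
anonymous_enable=NO
local_enable=YES
write_enable=YES
```

After which a 550 would depend on ftpuser's permissions along the whole path, and on vsftpd's chroot rules if chrooting is enabled.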
yoni349 (131 rep)
Apr 3, 2020, 02:42 PM • Last activity: Mar 17, 2023, 06:54 PM
4 votes
1 answer
8375 views
Is there any FileZilla Server alternative (GUI based) in Linux?
I'm wondering, is there any alternative to FileZilla Server on Linux?

* I'm looking for a server with a GUI manager that offers nearly zero configuration, so that I can get it running and managed within 5 minutes the way I can with FileZilla Server, while PureFTPd needs at least 2 hours, and managing users and groups is really painful, especially from a flat file.
* I tried Pure-Admin (a GTK manager for PureFTPd) and found it really stupid compared to the FileZilla GUI manager.
mbnoimi (181 rep)
May 9, 2013, 01:52 AM • Last activity: Mar 7, 2023, 07:44 PM
2 votes
0 answers
1996 views
When I run FileZilla in Ubuntu terminal on Windows 10 I get this Error: Unable to initialize GTK+, is DISPLAY set properly?
I am trying to upload some files from my desktop to my DigitalOcean droplet, and to transfer the files I needed to install FileZilla. After installing the app in my Ubuntu 20.04 terminal on Windows 10 and running the command filezilla to launch it, it throws this error:
Unable to init server: Could not connect: Connection refused ,14:44:20: 
Error: Unable to initialize GTK+, is DISPLAY set properly?
I know similar questions have been asked here, but I have tried all those solutions and they didn't help in my case.
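FileZilla is a GTK application, so inside WSL it needs an X server running on the Windows side (e.g. VcXsrv or X410) and a DISPLAY variable pointing at it; the error above is GTK failing to find one. A sketch, assuming such a server is already running; the WSL2 line extracts the Windows host's IP from /etc/resolv.conf:

```shell
# WSL1: the X server is reachable on localhost
export DISPLAY=:0
# WSL2: the Windows host is reachable at the nameserver address
export DISPLAY="$(awk '/^nameserver/ {print $2; exit}' /etc/resolv.conf):0"
filezilla &
```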
Godda (121 rep)
Apr 12, 2022, 03:17 PM • Last activity: Jan 31, 2023, 10:57 AM
2 votes
1 answer
2042 views
Unable to transfer files (put) using ftp or filezilla from windows to ubuntu VM
I've set up an Ubuntu 64-bit VM (16.04) using Oracle VirtualBox. Using ftp, I am able to connect to the VM, and I am also able to list (ls) the contents of the folder. However, I am unable to put files to the VM:

C:\>ftp x.x.x.x
Connected to x.x.x.x.
220 (vsFTPd 3.0.3)
200 Always in UTF8 mode.
User (x.x.x.x:(none)): user
331 Please specify the password.
Password:
230 Login successful.
ftp> quote pasv
227 Entering Passive Mode (...)
ftp> put trnsfr.txt
200 PORT command successful. Consider using PASV.
550 Permission denied.

On Ubuntu, I set permissions on the folder to rwx for owner, group and all. I then modified /etc/vsftpd.conf as follows:

pasv_enable=YES
pasv_min_port=30000
pasv_max_port=30100
port_enable=yes
pasv_address=x.x.x.x

allowed data connections and restarted the vsftpd daemon:

iptables -I INPUT -p tcp --destination-address 30000:30100 -j ACCEPT
/etc/init.d/vsftpd restart

but the permission error remains. I then tried using FileZilla: the connection is successful but again files cannot be transferred. With default settings in vsftpd:

Response: 550 Permission denied.
Error: Critical file transfer error

With the settings in vsftpd.conf as above, the following is displayed in FileZilla:

Error: The data connection could not be established: WSAEADDRNOTAVAIL - Cannot assign requested address

I believe the problem has to do with the data connection, but I am not sure how to resolve it.
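Two details in the attempt above are worth flagging. First, iptables has no --destination-address option for ports; a TCP port range is matched with --dport, so the rule that was probably intended is (a sketch):

```shell
sudo iptables -I INPUT -p tcp --dport 30000:30100 -j ACCEPT
```

Second, pasv_address must be an address the client can actually reach; with VirtualBox NAT networking, advertising the wrong address in the PASV reply can produce exactly the WSAEADDRNOTAVAIL-style data-connection failure FileZilla shows.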
Helen Reeves (121 rep)
Jul 10, 2017, 09:30 AM • Last activity: Dec 14, 2022, 12:00 PM
0 votes
1 answer
905 views
Convert RSA pair to pem filezilla compatible key on linux
I have a pair of keys generated using ssh-keygen -t rsa -b 4096 -f ~/.ssh/keys/my_key -C "blah@gmail.com". This yielded two files, my_key and my_key.pub. Now I need to convert that pair to a .pem key that is *FileZilla*-compatible (to connect over sftp). I already tried something like ssh-keygen -f my_key -m 'PEM' -e > my_key.pem, but *FileZilla* kept complaining "It doesn't contain a private key". I am running *Ubuntu 22.04 x64*. Please advise.
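The -e option used above exports the public key, which is why FileZilla complains about a missing private key. The private key itself can be rewritten in PEM format with -p -m PEM; a sketch, assuming an unpassphrased key (file names are illustrative):

```shell
cp my_key my_key.pem                             # keep the original untouched
ssh-keygen -p -f my_key.pem -m PEM -N '' -P ''   # rewrite the private key in PEM format
head -1 my_key.pem                               # -----BEGIN RSA PRIVATE KEY-----
```

FileZilla's Settings → SFTP "Add key file" dialog can also convert keys itself, which avoids the command line entirely.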
Enissay (103 rep)
Jul 12, 2022, 08:50 PM • Last activity: Jul 15, 2022, 07:12 AM
2 votes
1 answer
303 views
Linux server: How does memory get allocated to parallel processes?
I'm a complete newbie, so please excuse my ignorance and/or potentially wrong terminology. I'm using an Ubuntu server through ssh for brain image processing (one command with several programs; execution takes ~4-5 hours per brain), which I run through the Terminal. Since the server has limited storage (~200 GB) and the brain data are big (2-3 GB input, 500 MB output), I'm constantly downloading processed data and uploading new to-be-processed data using FileZilla.

The brain image processing is quite RAM-intensive and has failed several times due to memory issues, so I'm now doing the two procedures (procedure 1 = brain image processing vs. procedure 2 = uploading/downloading) separately and manually, i.e., when I'm doing one, I won't do the other at the same time. But I was wondering if there's a more efficient way of doing this while still ensuring that the brain image processing doesn't fail. In a nutshell, I would like procedure 1 to take up as much RAM as it needs, with the "rest" being allocated to procedure 2. I'm currently assigning procedure 1 all 8 cores, but it only uses all 8 every so often (because of how the program is written).

Is there a way to achieve this, ideally one that still allows me to use FileZilla (because it's so fast and simple, though I'm not opposed to uploading/downloading through the Terminal)? For example, might it be the case that whichever process I start first takes "precedence" and just takes whatever memory it needs at a given point in time, with any other processes taking what's left? Or how does RAM get allocated between concurrently running processes (especially if started from different software, if that matters)? I hope all of this made sense. Thanks in advance!
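For context on the question above: Linux does not pre-allocate RAM per process; each process takes memory as it requests it, and only under pressure does the kernel reclaim caches or, in the worst case, kill a process. Start order gives no precedence. One way to guarantee the processing job wins is to cap the transfer side with cgroups rather than scheduling by hand; a sketch using systemd-run (the command and limit are hypothetical, and this assumes transfers are done from the shell rather than FileZilla, plus a distribution with cgroup v2 delegation):

```shell
# cap the transfer at 1 GiB of RAM; the processing job keeps the rest
systemd-run --user --scope -p MemoryMax=1G \
    sftp -r server:/data/processed ./
```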
Jana (29 rep)
Feb 6, 2022, 11:59 AM • Last activity: Feb 13, 2022, 08:23 PM
1 vote
1 answer
2028 views
FileZilla works but sftp doesn't, how so?
My webhost allows me to create FTP accounts, and I have successfully used them with FileZilla to store backup files from my computer on those remote FTP accounts. Now I am trying to upgrade from the FileZilla way of doing things to the more easily automatable command-line way. So I do something like this in my terminal:

sftp -P 21 myftpusername@myurl.net

(I know the port is 21 because my webhost told me so.) But to my surprise, the terminal gets stuck on this command and outputs nothing; as of now it has been stuck for 10 minutes, so I'm not holding my breath for it to eventually work out. Usually I experience the opposite: the pure-terminal way of doing things is more efficient than the nice-UI way, because the nice UI is nothing but a wrapper around the command line. Any help appreciated.
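Port 21 is FTP, not SSH: sftp is an SSH subsystem, so pointing it at an FTP port leaves it waiting indefinitely for an SSH banner, which matches the hang described above. FileZilla was most likely speaking plain FTP all along. For a scriptable equivalent over FTP, an FTP-capable CLI client can be used instead; a sketch with lftp (credentials and file name are the hypothetical ones from the question):

```shell
lftp -u myftpusername ftp://myurl.net -e "put backup.tar.gz; bye"
```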
Ewan Delanoy (255 rep)
Sep 14, 2016, 12:52 PM • Last activity: Aug 16, 2021, 01:03 AM
0 votes
1 answer
137 views
file send to 000webhost by filezilla
Status: Connection established, waiting for welcome message...
Status: Initializing TLS...
Status: Verifying certificate...
Status: TLS connection established.
Status: Server does not support non-ASCII characters.
Status: Logged in
Status: Retrieving directory listing...
Command: PWD
Response: 257 "/" is your current location
Command: TYPE I
Response: 200 TYPE is now 8-bit binary
Command: PASV
Response: 227 Entering Passive Mode (145,14,144,27,202,206).
Command: MLSD
Response: 150 Connecting to port 31039
Error: Connection timed out after 20 seconds of inactivity
Error: Failed to retrieve directory listing

I was able to log in to my web server (000webhost.com); that's what I got first. But when everything was loading, the connection timed out. My internet is working well, so why am I having this issue? I searched the internet for a while; there are lots of questions like this, but none of them were helpful. I even watched some tutorials on YouTube.
Game Stakes (23 rep)
Apr 2, 2021, 11:21 AM • Last activity: Apr 2, 2021, 03:24 PM
-1 votes
1 answer
4662 views
Temporary failure resolving 'http.kali.org'
E: Failed to fetch http://http.kali.org/kali/pool/main/g/gcc-10/gcc-10-base324-1_amd64.deb
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?

I am trying to install FileZilla on Kali and I keep getting this error. Any suggestions?
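"Temporary failure resolving" is a DNS error rather than an apt problem, so once name resolution works the usual sequence is just the refresh the message suggests (a sketch):

```shell
ping -c1 http.kali.org     # confirm DNS and network connectivity first
sudo apt update
sudo apt install filezilla
```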
cope (3 rep)
Apr 7, 2020, 04:24 PM • Last activity: Apr 7, 2020, 06:17 PM
-2 votes
1 answer
16134 views
How to install FileZilla3 on kali linux?
I have an FTP server on my Windows machine, and I've been trying to install filezilla in order to connect to the FTP server with it. I went to the filezilla official website and downloaded the software for Linux 64-bit. The file I downloaded has a .tar.bz2 format, so I extracted it using the tar -xvf command as follows:

tar -xvf Downloads/FileZilla3.tar.bz2

Now I have a directory labeled FileZilla3 which contains three other directories that match the system directories in the root directory (shown in screenshots, not reproduced here). How do I install this software from here?
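The official FileZilla tarball is a prebuilt bundle, so "installing" amounts to placing it somewhere and making its launcher reachable from the PATH; a common sketch (the chosen paths are a matter of taste, not a requirement):

```shell
sudo mv FileZilla3 /opt/filezilla3
sudo ln -s /opt/filezilla3/bin/filezilla /usr/local/bin/filezilla
filezilla    # launch from any terminal
```

On Kali, installing the distribution package with apt instead would handle dependencies and updates automatically.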
mmdz (5 rep)
Mar 27, 2020, 07:34 PM • Last activity: Mar 29, 2020, 02:42 PM
0 votes
1 answer
78 views
Apache and mysqladmin showing blank
Hi, I have been running a website for over 2 years, and since last week it has suddenly been showing an out-of-memory issue. I tried to allocate more memory, but now the Apache and mysqladmin pages are blank and I get the errors below.

Apache (error.log):
[Tue Jan 28 13:33:24.324500 2020] [ssl:warn] [pid 6908:tid 652] AH01909: www.example.com:443:0 server certificate does NOT include an ID which matches the server name
[Tue Jan 28 13:33:24.355750 2020] [core:warn] [pid 6908:tid 652] AH00098: pid file C:/xampp/apache/logs/httpd.pid overwritten -- Unclean shutdown of previous Apache run?
[Tue Jan 28 13:33:24.449498 2020] [ssl:warn] [pid 6908:tid 652] AH01909: www.example.com:443:0 server certificate does NOT include an ID which matches the server name
[Tue Jan 28 13:33:24.652622 2020] [mpm_winnt:notice] [pid 6908:tid 652] AH00455: Apache/2.4.33 (Win32) OpenSSL/1.0.2o PHP/7.0.30 configured -- resuming normal operations
[Tue Jan 28 13:33:24.652622 2020] [mpm_winnt:notice] [pid 6908:tid 652] AH00456: Apache Lounge VC14 Server built: Mar 29 2018 11:38:15
[Tue Jan 28 13:33:24.652622 2020] [core:notice] [pid 6908:tid 652] AH00094: Command line: 'c:\\xampp\\apache\\bin\\httpd.exe -d C:/xampp/apache'
[Tue Jan 28 13:33:24.668248 2020] [mpm_winnt:notice] [pid 6908:tid 652] AH00418: Parent: Created child process 1572
[Tue Jan 28 13:33:25.371369 2020] [ssl:warn] [pid 1572:tid 620] AH01909: www.example.com:443:0 server certificate does NOT include an ID which matches the server name
[Tue Jan 28 13:33:25.480743 2020] [ssl:warn] [pid 1572:tid 620] AH01909: www.example.com:443:0 server certificate does NOT include an ID which matches the server name
[Tue Jan 28 13:33:25.511993 2020] [mpm_winnt:notice] [pid 1572:tid 620] AH00354: Child: Starting 150 worker threads.
[Tue Jan 28 13:33:26.168240 2020] [:error] [pid 1572:tid 2004] [client 185.153.196.13:57688] PHP Fatal error:  Allowed memory size of 2097152 bytes exhausted (tried to allocate 32768 bytes) in C:\\xampp\\htdocs\\wp-includes\\formatting.php on line 2818
[Tue Jan 28 13:33:26.168240 2020] [:error] [pid 1572:tid 2004] [client 185.153.196.13:57688] PHP Fatal error:  Allowed memory size of 2097152 bytes exhausted (tried to allocate 32768 bytes) in C:\\xampp\\htdocs\\wp-includes\\version.php on line 1
[Tue Jan 28 13:34:31.138496 2020] [:error] [pid 1572:tid 2004] [client 115.64.88.169:58561] PHP Fatal error:  Allowed memory size of 2097152 bytes exhausted (tried to allocate 4096 bytes) in C:\\xampp\\htdocs\\wp-includes\\formatting.php on line 2841, referer: http://www.surgicalguide.com.au/wp-admin/users.php 
[Tue Jan 28 13:34:31.262618 2020] [:error] [pid 1572:tid 2004] [client 115.64.88.169:58561] PHP Fatal error:  Allowed memory size of 2097152 bytes exhausted (tried to allocate 4096 bytes) in C:\\xampp\\htdocs\\wp-includes\\pomo\\mo.php on line 323, referer: http://www.surgicalguide.com.au/wp-admin/users.php 
[Tue Jan 28 13:34:45.007539 2020] [:error] [pid 1572:tid 1992] [client 115.64.88.169:49827] PHP Fatal error:  Allowed memory size of 2097152 bytes exhausted (tried to allocate 4096 bytes) in C:\\xampp\\htdocs\\wp-includes\\formatting.php on line 3214
[Tue Jan 28 13:34:45.020551 2020] [:error] [pid 1572:tid 1992] [client 115.64.88.169:49827] PHP Fatal error:  Allowed memory size of 2097152 bytes exhausted (tried to allocate 4096 bytes) in C:\\xampp\\htdocs\\wp-includes\\pomo\\mo.php on line 315
mysql_error.log:
2020-01-28 13:33:22 1854 InnoDB: Warning: Using innodb_additional_mem_pool_size is DEPRECATED. This option may be removed in future releases, together with the option innodb_use_sys_malloc and with the InnoDB's internal memory allocator.
2020-01-28 13:33:22 6228 [Note] InnoDB: innodb_empty_free_list_algorithm has been changed to legacy because of small buffer pool size. In order to use backoff, increase buffer pool at least up to 20MB.

2020-01-28 13:33:23 6228 [Note] InnoDB: Using mutexes to ref count buffer pool pages
2020-01-28 13:33:23 6228 [Note] InnoDB: The InnoDB memory heap is disabled
2020-01-28 13:33:23 6228 [Note] InnoDB: Mutexes and rw_locks use Windows interlocked functions
2020-01-28 13:33:23 6228 [Note] InnoDB: _mm_lfence() and _mm_sfence() are used for memory barrier
2020-01-28 13:33:23 6228 [Note] InnoDB: Compressed tables use zlib 1.2.3
2020-01-28 13:33:23 6228 [Note] InnoDB: Using generic crc32 instructions
2020-01-28 13:33:23 6228 [Note] InnoDB: Initializing buffer pool, size = 16.0M
2020-01-28 13:33:23 6228 [Note] InnoDB: Completed initialization of buffer pool
2020-01-28 13:33:23 6228 [Note] InnoDB: Highest supported file format is Barracuda.
2020-01-28 13:33:23 6228 [Note] InnoDB: The log sequence number 12213792 in ibdata file do not match the log sequence number 100572373 in the ib_logfiles!
2020-01-28 13:33:23 6228 [Note] InnoDB: Restoring possible half-written data pages from the doublewrite buffer...
2020-01-28 13:33:23 6228 [Note] InnoDB: 128 rollback segment(s) are active.
2020-01-28 13:33:23 6228 [Note] InnoDB: Waiting for purge to start
2020-01-28 13:33:23 6228 [Note] InnoDB:  Percona XtraDB (http://www.percona.com)  5.6.38-83.0 started; log sequence number 100572373
2020-01-28 13:33:23 10980 [Note] InnoDB: Dumping buffer pool(s) not yet started
2020-01-28 13:33:23 6228 [Note] Plugin 'FEEDBACK' is disabled.
2020-01-28 13:33:23 6228 [Note] Server socket created on IP: '::'.
2020-01-28 13:33:24 6228 [Note] c:\xampp\mysql\bin\mysqld.exe: ready for connections.
Version: '10.1.32-MariaDB'  socket: ''  port: 3306  mariadb.org binary distribution
Please help!
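The PHP fatal errors in the Apache log above say "Allowed memory size of 2097152 bytes", which is exactly 2M: it looks as though the memory edit set memory_limit to 2M rather than raising it. Purely as a sketch (the value is typical, not prescriptive), the relevant php.ini line would be:

```
; C:\xampp\php\php.ini
memory_limit = 256M    ; "2M" would reproduce the 2097152-byte limit in the log above
```

Apache must be restarted afterwards for the change to take effect.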
user392523 (1 rep)
Jan 28, 2020, 02:45 AM • Last activity: Jan 28, 2020, 03:24 AM
1 vote
0 answers
167 views
Cannot transfer data to Linux VM through FileZilla
I want to transfer a file from my local computer to my Linux (CentOS) VM through FileZilla. The file should go to this path on the Linux VM: /etc/pki/ca-trust/source/anchors. But when I try to upload the file into this path, this failure occurs:

Error: /etc/pki/ca-trust/source/anchors/test.docx: open for write: permission denied
Error: File transfer failed

Can someone tell me where my mistake is? Please bear with me, I am new to Linux.
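/etc/pki/... is root-owned, and the SFTP session runs as the unprivileged login user, hence the permission denied. A common workaround is to upload to a writable location and then move the file into place with sudo over SSH; a sketch (the user and host names are hypothetical):

```shell
# upload to a writable location first (FileZilla or scp)
scp test.docx user@centos-vm:/tmp/
# then move it into place with root privileges
ssh user@centos-vm 'sudo mv /tmp/test.docx /etc/pki/ca-trust/source/anchors/'
```

If the file is meant to be a trusted CA certificate, CentOS additionally requires running update-ca-trust afterwards.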
zenubeh (19 rep)
Nov 11, 2019, 07:52 AM
1 vote
0 answers
120 views
I can not use ssh to connect to the debian sshd
I have a Debian server. On my Mac I use FileZilla to connect to it over ssh, but I get the error below:

Error: Disconnected: No supported authentication methods available (server sent: publickey)
Error: Can not connect to server

Below is the Debian sshd's config:

Host *
PasswordAuthentication yes
SendEnv LANG LC_*
HashKnownHosts yes
GSSAPIAuthentication yes
PermitRootLogin yes

It seems I cannot use a password to connect. Why?

---

**EDIT-01** On my Mac, when I connect I get a permission error:
$ ssh root@192.168.64.2
root@192.168.64.2: Permission denied (publickey).
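The file quoted above (Host *, SendEnv, HashKnownHosts) is client-side ssh_config syntax; the server reads /etc/ssh/sshd_config, and "server sent: publickey" suggests password authentication is disabled there. A sketch of the server-side lines to check, assuming password logins (including root's) are actually wanted, with the security trade-off that implies:

```
# /etc/ssh/sshd_config (server side)
PasswordAuthentication yes
PermitRootLogin yes    # needed for root password logins; a security trade-off
```

The daemon must then be restarted, e.g. with systemctl restart ssh on Debian.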
244boy (685 rep)
Oct 16, 2019, 06:00 PM • Last activity: Oct 17, 2019, 06:41 AM
Showing page 1 of 20 total questions