ftp wget large amount of files

General support questions
Morris
Posts: 1
Joined: 2015/08/20 13:10:37

ftp wget large amount of files

Post by Morris » 2015/08/20 13:16:11

Hey guys, just wondered if you could help me.

I've recently started a VPS and have been in the process of transferring stuff over to it. I run a game server that uses a file-based saving system, and I have a folder with over 300k account files in it. Instead of it taking hours to download them via FTP, I thought I could do it from my PuTTY session with a command such as "wget -r ftp://user:pass@ip/dir/to/source", which does work with some of my folders and files, but not with this folder.
On my VPS it just creates a folder with a .listing file in it and no account files. I was wondering if anyone had an idea why, and what the correct command is to get all 300k files from my old host to my new VPS.

Thanks in advance

gerald_clark
Posts: 10642
Joined: 2005/08/05 15:19:54
Location: Northern Illinois, USA

Re: ftp wget large amount of files

Post by gerald_clark » 2015/08/20 14:21:50

Use scp or rsync.

TrevorH
Site Admin
Posts: 33215
Joined: 2009/09/24 10:40:56
Location: Brighton, UK

Re: ftp wget large amount of files

Post by TrevorH » 2015/08/20 15:49:57

Or tar it up and transfer the tar file then untar it.
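That might look something like this (paths and the hostname are placeholders; the demo stays inside /tmp so the pack/unpack steps run locally):

```shell
# Create a sample directory standing in for the 300k account files.
mkdir -p /tmp/tar_demo/accounts
echo "save data" > /tmp/tar_demo/accounts/player1.dat

# On the old host: pack the whole directory into one compressed archive.
# Transferring one big file is far faster than 300k tiny ones,
# because the per-file overhead is paid once instead of 300k times.
tar -czf /tmp/tar_demo/accounts.tar.gz -C /tmp/tar_demo accounts

# Transfer the single archive to the VPS, e.g.:
#   scp user@old-host:/tmp/accounts.tar.gz /local/dest/

# On the VPS: unpack it.
mkdir -p /tmp/tar_demo/restored
tar -xzf /tmp/tar_demo/accounts.tar.gz -C /tmp/tar_demo/restored
```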
The future appears to be RHEL or Debian. I think I'm going Debian.
Info for USB installs on http://wiki.centos.org/HowTos/InstallFromUSBkey
CentOS 5 and 6 are dead, do not use them.
Use the FAQ Luke
