How to copy dirs from my host to my PC
I have a better idea: go to www.web2ftp.com and grab it as a zip with that site.
Alright, if you have a command line, run: ls ../
There should be a directory named www, htdocs, or http. Now type ls ../directorynamehere
If the output is the same as a plain old ls command, you've found the right folder. On to the next step:
tar -cvf mycopy.tar ../directorynamehere
Now go to your web browser and replace the filename of your script in the URL with mycopy.tar. This will download the archive. Then use an archive utility to extract it (see the extraction sketch after the walkthrough below).
example: http://www.example.com/code.php?cmd=ls ../
output: etc htdocs.tar mail public_ftp public_html tmp www
http://www.example.com/code.php?cmd=ls ../www
output: myporn index.html code.php links.html
http://www.example.com/code.php?cmd=ls
output: myporn index.html code.php links.html
http://www.example.com/code.php?cmd=tar -cvf mycopy.tar ../www
output: ../www/myporn/goatse.jpg ../www/myporn/tubgirl.jpg ../www/index.html ../www/code.php ../www/links.html
http://www.example.com/mycopy.tar
output: [download prompt comes up, save as]
http://www.example.com/code.php?cmd=rm mycopy.tar
all done!
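For reference, here is the same walkthrough as plain shell commands, plus the extraction step on your own PC once the download finishes. This is just a sketch: the directory name www and the archive name mycopy.tar are taken from the example above, so swap in whatever your host actually uses.

# on the host (through your script's cmd parameter or a real shell)
ls ../                         # find the web root (www, htdocs, http, ...)
ls ../www                      # confirm it matches a plain ls
tar -cvf mycopy.tar ../www     # pack the whole directory into one archive

# on your PC, after downloading mycopy.tar through the browser
tar -xvf mycopy.tar            # unpack the archive into the current directory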
Before you go and download that giant tar archive, I'd recommend compressing it with gzip or bzip2. tar is an archiving utility, not a compression utility (note, however, that the -z flag will do the compression for you, but it just calls gzip, and it isn't supported by every version of tar). For the highest compression, use bzip2 -9 [file]. This will take a little while if the tar archive is big. If the server does not have bzip2 available, use gzip -9 [file]. Again, be prepared to wait a bit if the tar archive is big.
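A minimal sketch of the compressed variant, assuming bzip2 is installed on the host and the archive is still named mycopy.tar as in the example; if only gzip is there, swap the tool and the extension as noted. The -j and -z extraction flags work on GNU tar and most modern versions, but not on every old tar.

# on the host: compress the archive before downloading it
bzip2 -9 mycopy.tar            # produces mycopy.tar.bz2 (or: gzip -9 mycopy.tar -> mycopy.tar.gz)

# on your PC, after downloading the compressed file
tar -xjf mycopy.tar.bz2        # -j runs it through bzip2 (use tar -xzf for a .tar.gz)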