These two commands have been replaced by secure versions:
Formerly: ftp of public data:
Now more common: ftp of private data:
ftp is now more often used in read-write mode for remote private data (needs password).
e.g. Your web hosting company uses a UNIX server. You periodically upload your web site (edited in Windows) onto it with a private ftp.
There are many graphical drag-and-drop ftp clients, and even programs that present the remote site as a full read-write Windows drive.
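For example, a single-file upload from the command line might look like this (hostname and filename are placeholder values; ftp prompts for the username and password after connecting):

ftp hosting.example.com
ftp> cd public_html
ftp> put index.html
ftp> bye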
You may or may not be able to remotely ssh or sftp to:
student.computing.dcu.ie = 18.104.22.168 (on a CA subnet)
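If they are enabled, the connection would look something like this (username is a placeholder):

ssh username@student.computing.dcu.ie      # remote command-line login
sftp username@student.computing.dcu.ie     # secure file transfer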
Running UNIX GUI applications:
I use the following two to run a Windows GUI with a UNIX command-line underneath. My files on the UNIX server appear as just another read-write Windows drive. I can use Windows apps to edit them. And I have a UNIX command-line always open on which I can run scripts to process them:
Say you are working on an offline copy of a website of 10,000 files. You make changes to 137 of the 10,000 files. You want to upload the 137 changed files, but not the entire 10,000. Dragging and dropping the 137 files to their correct destinations could be quite tedious.
For repetitive tasks, drag-and-drop is not a better interface than being able to write automated scripts (this will be a theme of this course). You can write ftp scripts ("macros") and call them from Shell scripts:
cd public_html
lcd /users/local/humphrys/local/history-copy
mkdir ./Flanagan/NMI.bird
put ./Flanagan/NMI.bird/IMAG1024.jpg ./Flanagan/NMI.bird/IMAG1024.jpg
put ./Flanagan/NMI.bird/IMAG1025.jpg ./Flanagan/NMI.bird/IMAG1025.jpg
put ./Flanagan/the.bird.html ./Flanagan/the.bird.html
put blog.html blog.html
cd changes directory on the remote site.
lcd changes directory on the local site.
mkdir makes remote directories.
put puts various files into various destinations.
The above ftp script is automatically built by a program.
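For illustration only, here is a sketch of how such a script could be generated and run from a Shell script. It is not the actual program used here: the hostname, paths and the use of ~/.netrc for the login are placeholders, and it only emits "put" lines for files changed since the last upload (new remote directories would still need "mkdir" lines).

#!/bin/sh
# Sketch: build an ftp "macro" like the one above and run it in batch mode.
LOCAL=/users/local/humphrys/local/history-copy   # local copy of the site
REMOTE=public_html                               # remote web directory
HOST=ftp.example.com                             # placeholder hostname

cd "$LOCAL" || exit 1

{
  echo "cd $REMOTE"
  echo "lcd $LOCAL"
  # one "put" line for every file changed since the last upload
  # (.last-upload is a timestamp file; "touch .last-upload" once to initialise it)
  find . -type f -newer .last-upload | while read f
  do
    echo "put $f $f"
  done
} > /tmp/upload.ftp

ftp -i "$HOST" < /tmp/upload.ftp    # login details taken from ~/.netrc
touch .last-upload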
lynx -reload -source URL
cat DATA | lynx -reload -source -post_data URL
wget -q -O - URL
UserAgent="Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)" wget -q -O - -U "$UserAgent" URL
wget --spider --force-html -i file.html
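These non-interactive clients are what make web access scriptable. For example, a sketch (the URL and search string are made-up values) that fetches a page and checks it for a string - the kind of job that could run from a Shell script or cron:

# Sketch: fetch a page non-interactively and search it for a string.
if wget -q -O - "http://www.example.com/status.html" | grep -q "error"
then
    echo "Problem reported on the status page."
fi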
Idea: Your files are "on the network" somewhere. You can access them and make changes to them from anywhere. All copies stay in synch.
This is what you actually have within DCU (can move from terminal to terminal, accessing files at central server). The idea is that you would have this at home (and when travelling etc.) as well.
Simplest solution - 1 copy of files
More complex solution - 2 copies of files:
- Work on machine which has synchronised mirror of server files.
- Have to keep copies in synch.
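One common way to keep the two copies in synch (a sketch, not necessarily the tool assumed here; hostname, username and paths are placeholders) is rsync:

# Push local changes to the mirror on the server, over ssh.
rsync -av ~/website/ username@server.example.com:public_html/
# Run with the arguments reversed to pull the server copy down instead.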