Dr. Mark Humphrys

School of Computing. Dublin City University.



Remote and Network Computing



telnet and ftp (and their successors)

These two commands (and their secure successors) have been for decades the two fundamental commands of the Internet / remote computing.

  1. telnet (host) - Login to remote host
  2. ftp (host) - Transfer files to/from remote host
With telnet you get a command-line; with ftp you get a read-write file system.

Origin:

  1. telnet, 1971.
  2. ftp, 1971.

These two commands have been replaced by secure versions:

  1. telnet -> ssh
  2. ftp -> sftp / scp (both use ssh) or ftps (uses ssl)
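
For example, to log in and transfer files securely (a sketch; the host and username are placeholders):

  ssh user@host.example.com                 # remote command-line login
  sftp user@host.example.com                # interactive file transfer
  scp report.txt user@host.example.com:     # copy one file to the remote home directory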

Formerly: ftp of public data:
Read-only ftp (File Transfer Protocol) was what people used to publish files and archives online before the Web (http - Hypertext Transfer Protocol).
This is rarely used any more, though some browsers can still read public files through it.

Now more common: ftp of private data:
ftp is now more often used in read-write mode for remote private data (needs password).
e.g. Your web hosting company uses a UNIX server. You periodically upload your web site (edited in Windows) onto it with a private ftp.
There are many graphical drag-and-drop ftp clients, and even programs that mount the site as a full Windows drive.



DCU remote access for students

You may or may not be able to remotely ssh or sftp to:

student.computing.dcu.ie = 136.206.11.245 (CA, on servers subnet)
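
If it is enabled, logging in would look like this (the username is a placeholder):

  ssh yourusername@student.computing.dcu.ie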


How to log in to Linux at DCU




remote email

Use the POP3 or IMAP protocol to talk to a mail server:
  mailhost.computing.dcu.ie
  mail.dcu.ie
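
For example, curl can speak POP3 and IMAP from the command-line. A sketch, assuming the account details are yours and that these servers accept remote TLS connections:

  curl --user "username:password" "pop3s://mail.dcu.ie/"
  curl --user "username:password" "imaps://mailhost.computing.dcu.ie/INBOX?NEW"

The first lists the messages in a POP3 mailbox; the second searches an IMAP folder for new mail.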


Accessing UNIX remotely and from Windows

Running UNIX GUI applications:


I use the following two to run a Windows GUI with a UNIX command-line underneath. My files on the UNIX server appear as just another read-write Windows drive. I can use Windows apps to edit them. And I have a UNIX command-line always open on which I can run scripts to process them:



FTP scripting

FTP scripting is an essential tool for website maintenance.

Say you are working on an offline copy of a website of 10,000 files. You make changes to 137 of the 10,000 files. You want to upload those 137 files, but not the entire 10,000. Dragging and dropping the 137 files to their correct destinations would be quite tedious.

For repetitive tasks like this, drag-and-drop is a worse interface than automated scripts (this will be a theme of this course). You can write ftp scripts ("macros") and call them from shell scripts, as in the sketch below:
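
For example, put the ftp commands in a file (here called upload.ftp; the host, account and filenames are placeholders):

  user myuser mypassword
  binary
  cd public_html
  put index.html
  put news/today.html
  quit

and run it with:

  ftp -n ftp.example.com < upload.ftp

The -n flag suppresses auto-login, so the scripted user command can log in instead.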



HTTP scripting

You can also do HTTP GET or POST scripting from the command-line.
Some tools that do this:


  1. lynx
    • does HTTP GET:
        lynx -reload -source URL
      
    • does HTTP POST:
        cat DATA | lynx -reload -source -post_data URL
      

  2. wget
    • does HTTP GET:
        wget -q -O - URL
      
    • Sites that block scripts:
      If a site won't let a script see its content, you can set the User-Agent to pretend to be a browser:
        UserAgent="Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)"
        wget -q -O - -U "$UserAgent" URL
      
    • To do the "Link checker" Java practical in shell script, something like:
        wget --spider --force-html -i file.html
      

  3. curl
    • dumps to command line by default
    • -s for silent mode
    • does HTTP GET
    • does HTTP POST
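    • For example (a sketch; the URL and form data are placeholders):
        curl -s URL
        curl -s --data "name=value" URL
      
      The first does a GET; the second sends a form-encoded POST body.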



Working remotely

Idea: Your files are "on the network" somewhere. You can access them and make changes to them from anywhere. All copies stay in synch.

This is what you actually have within DCU (can move from terminal to terminal, accessing files at central server). The idea is that you would have this at home (and when travelling etc.) as well.


Simplest solution - 1 copy of files
  1. Read files from the server, and copy changed files back to the server as you go along. You can do this with ftp now, but you really need broadband to work with remote files, and high-speed broadband for large files.

More complex solution - 2 copies of files. Work on a machine that has a synchronised mirror of the server files. The two copies have to be kept in synch.

  1. Synchronise over the network.
    Read files from the server at the start of a session, and copy changed files back at the end (see the rsync sketch after this list).
    e.g. Say you have an always-on broadband modem:
    1. When you leave the office, set the synchronise program running against your home machine. By the time you get home, the files are synchronised. Work on them locally.
    2. When going back to the office, start the synch program again. By the time you get in, the files are synchronised again.

    Or:

  2. Physically bring a laptop (or flash drive / external hard disk) to/from work to synchronise.
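
A common tool for option 1, not named in these notes, is rsync over ssh, which copies only the files that have changed. A minimal sketch, assuming a hypothetical user, host and directory layout:

  rsync -av ~/website/ user@home.example.com:website/
  rsync -av user@home.example.com:website/ ~/website/

The first pushes changes from work to home; the second pulls them back.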




