This web page attempts to cover these issues and to present up-to-date information on the current status of security issues as they relate to the WWW. The page includes pointers to sources of additional information, mailing lists, tools, etc.
This web page is evolving over the course of the COMP290 seminar on WWW/Mosaic. Because of this evolution, portions of the page may be in a state of flux and under construction.
Additional sources of general information, which I have only recently found (4-16-95) and still need to organize:
(Table of Contents)
2: Existing Authentication Access Mechanisms
Authentication access mechanisms are the basic methods used by CERN HTTP
and NCSA HTTP to provide control over document access.
2.1: Mosaic/NCSA Authentication
The current methods in NCSA Mosaic for X, version 2.0, and NCSA HTTP for restricting
access to documents and providing authentication are based on several criteria:
Passwords have no correspondence to individuals' UNIX passwords on specific systems. These passwords are uuencoded as they are passed across the network; this is a reversible encoding, not encryption, and so is not a secure method of end-to-end transmission.
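The point can be demonstrated in a few lines of Python. HTTP Basic-style credentials are passed through a reversible Base64/uuencode-style transformation, so anyone capturing the request can recover the password with no key at all (the user name and password below are made up for illustration):

```python
import base64

# Basic-style authentication sends "user:password" in a reversible
# Base64 encoding -- this is an encoding, NOT encryption.
credentials = "wwwuser:secret"  # hypothetical credentials
encoded = base64.b64encode(credentials.encode("ascii")).decode("ascii")
header = "Authorization: Basic " + encoded

# An eavesdropper on the wire can reverse the encoding trivially.
recovered = base64.b64decode(encoded).decode("ascii")
print(header)
print(recovered)
```

Any host on the network path between browser and server can perform the same decoding.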
There are two levels at which authentication can work: per-directory and per-server. An excerpt from the Mosaic User Authentication Tutorial: "Per-directory authentication means that users with write access to part of the filesystem that is being served can control access to their files as they wish. They need not have root access on the system or write access to the server's primary config files."
Per-directory access control is maintained by using a file named ".htaccess" that resides in each directory. The server reads this file when attempting to access a document in that directory or in any of its subdirectories. The access control specification given on a per-directory basis may be overridden by the global access configuration file; see below.
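A ".htaccess" file restricting a directory to password-holding users might look like the following sketch (the file paths and realm name are illustrative, not from any actual installation):

```
AuthUserFile /usr/local/etc/httpd/conf/.htpasswd
AuthGroupFile /dev/null
AuthName ByPassword
AuthType Basic

<Limit GET>
require valid-user
</Limit>
```

Here AuthUserFile names the password file, AuthName labels the protection realm presented to the user, and the Limit section requires any valid user for GET requests.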
Per-server access control is determined by a global access configuration file, specified by the AccessConfig variable in the configuration files. For examples of directives to use when writing a global access control file, see the Resource Configuration File.
In addition to restricting access, server features may also be disabled in specified directories. For example, users' home directories may not be safe places in which to run all server features, so it may be prudent to disable server-side includes there. A server-side include lets httpd create documents on the fly by parsing files for information such as the current date, last modification date, size, etc. This parsing is performed as the server's user, root, not as the actual owner of the document.
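In the global access configuration file, such a restriction could be expressed with a Directory section that limits the Options in users' directory trees. This is a hypothetical fragment; the path is an example, not a real installation's:

```
# Allow directory indexes in users' trees, but no server-side
# includes and no CGI execution.
<Directory /usr/local/etc/httpd/htdocs/users>
Options Indexes
</Directory>
```

Because Includes and ExecCGI are absent from the Options list, include parsing and script execution are unavailable below that directory.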
See the Mosaic User Authentication Tutorial for additional information on providing access to WWW documents using NCSA HTTP. This tutorial walks the user through the access configuration setup and provides many useful examples.
For additional information see:
For more information (although Mosaic's documentation is much better) see:
NCSA has proposed a PGP/PEM Encryption Scheme for use with NCSA HTTP. The NCSA
proposal uses encryption/decryption routines which are external to both the Mosaic
browser and the HTTP server. The browser and server will call these routines to
encrypt and decrypt their communications and thus provide
user authentication and a secure transport method. These programs have "hooks" for
use with PGP ("Pretty Good Privacy") or RIPEM. Both methods use RSA encryption. Note
that the U.S. government has strict rules on the exportation of these technologies.
Persons who wish to use Mosaic and HTTP with PEM or PGP encryption will need to
communicate beforehand and find a tamper-proof way to exchange their public keys.
See "NCSA httpd/Mosaic: Using PGP/PEM Authorization" for installation details for both the browser and the server.
3.1.2: Secure NCSA HTTPD
These are pointers to on-line documentation for Secure NCSA HTTPD, which was developed jointly by Enterprise Integration Technologies, RSA Data Security, Inc., and NCSA. Secure NCSA HTTPD uses S-HTTP, see Section 3.6, to allow secure commercial transactions to take place through the Web.
Detailed information may be found at:
There is also a proposal for a RIPEM-based HTTP Authorization scheme. This has already been implemented by
3.3: The IETF HTTP Security Working Group
The HTTP Security
Working Group is a proposed group that has not yet been ratified by the IETF. Because
it is still a proposed group, its charter, goals, and associated web pages are currently
under construction. Its basic goal, however, is to develop requirements
and specifications for the provision of security services to HTTP. This
working group should be watched for future developments in secure HTTP.
Shen is a
security scheme proposed by CERN.
Shen provides three separate security-related mechanisms.
Based on RSA's public-key cryptography, this protocol is currently implemented in the Netscape browser and the Netsite Commerce server. It allows a user to transfer credit card and other personal information from any Netscape browser to a Netsite server.
Bank of America is using Netscape to provide real-time online credit card authorizations. First Data Card Services Electronic Funds Services (EFS), the world's largest credit card payment processor, is providing the same services using Netscape. MCI is also providing a similar service: a secure online shopping mall.
For additional information see:
EIT also has presentations on the web in MS PowerPoint for Windows 3.0 format; they claim these should be portable to Macintosh PowerPoint.
Additional information can be obtained by mailing firstname.lastname@example.org or examining the following links:
David M. Kristol, of AT&T Bell Laboratories has proposed an extension to HTTP
which will provide security through the use of "wrappers." Wrapping is just
what it sounds like: the body of the WWW communication is wrapped by adding
headers and footers and is often encoded. Enough information is present in
the headers for the recipient to decode the message. For more information, see
A proposed Extension Mechanism for HTTP.
SimpleMD5 is another specification of a secure authentication scheme built on top of HTTP/1.0. This scheme does not provide a secure transfer mechanism or encryption of body content. It is designed primarily to facilitate a secure access authentication mechanism.
This scheme is based on a "challenge-response" mechanism using a "nonce" value: a value used for a single transaction, on which both the server and client must agree. The checksum of the password combined with the nonce value is what crosses the network, so the password itself is never sent in the clear.
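The general shape of such a nonce-based exchange can be sketched in Python. The field layout and separator below are illustrative, not the actual SimpleMD5 wire format:

```python
import hashlib
import secrets

def make_nonce():
    # Server generates a fresh one-time value for each transaction.
    return secrets.token_hex(16)

def digest(password, nonce):
    # Both sides compute the MD5 checksum over the shared password
    # and the nonce; only this checksum crosses the network.
    return hashlib.md5((password + ":" + nonce).encode("ascii")).hexdigest()

# Server side: issue the challenge.
nonce = make_nonce()

# Client side: respond with the checksum of password + nonce.
response = digest("secret", nonce)

# Server side: verify against its own copy of the password.
assert response == digest("secret", nonce)
```

Because the nonce changes per transaction, a captured response cannot simply be replayed against a later challenge.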
This model has been proposed by Spyglass, Inc. Spyglass has also proposed an Enhanced Mosaic Security Framework. Enhanced Mosaic is designed to work with S-HTTP. For additional information see:
The designers' most important goal is to implement security without using patented or export-restricted software. This is a CERN-supported proposal, although it is not entirely clear from the web page whose idea it is.
4: Secure Browsers
4.1: Secure NCSA Mosaic
Secure NCSA Mosaic was developed by EIT in cooperation with RSA and the NCSA. Additional sources of information include:
5: Firewalls and WWW Proxies
WWW proxy servers run on firewall machines to provide WWW access to clients that reside inside the firewall. Security firewalls are designed to restrict direct interaction between hosts behind the firewall and all other outside hosts on the Internet; see the sources listed below for more information on firewalls.
The CERN HTTP server may be configured to run as a proxy server. An additional benefit of proxy servers is that they may be set up to provide caching, which can significantly cut network traffic costs and provide faster response times.
If you are unable to run a CERN HTTP proxy server, NCSA HTTP may be modified to run behind firewalls using the SOCKS package. Normally NCSA HTTP requires a direct Internet connection to run correctly. SOCKS is a package designed to run Internet clients from behind firewalls without interfering with the firewall's security requirements. The SOCKS package includes a modified version of Mosaic for X 2.0. This version is not supported by NCSA.
Proxies can be set up to connect to other proxies: an inner proxy server may connect to the outside world through an outer proxy server. This may be a security hole. For additional information, see Authentication Between Clients and Proxies.
Proxy Related Papers from the WWW '94 Conference Proceedings:
Sources of additional information: (should be a mosaic file kicking around somewhere)
Robots, webcrawlers, and spiders are programs that automatically traverse the web, collecting index and cataloging information. These programs can be quite useful; however, poorly written "depth first" searching robots have the potential to bring servers to a standstill by recursively downloading information from CGI scripts that present an infinite number of links. Robots can also overload servers by producing "rapid-fire" requests for information. The suggested retrieval rate for a robot is no more than one document per minute. For additional information, see Guidelines for Robot Writers or Ethical Web Agents.
It is often useful to protect certain areas of the server from access by robots, or to prevent robots from accessing the server entirely. The method used to protect certain areas from robots is to provide a single file, available via HTTP at the local URL "/robots.txt". This file, maintained by the system administrator, specifies the access policy for the server. Policies can be set on a per-robot (or user-agent) basis, or may use wildcards to exclude all robots. An example "/robots.txt" file follows (note: our installation does not actually use one of these files):
# robots.txt for http://www.cs.unc.com/

# Allows arachnophilia to look everywhere it wants
User-agent: arachnophilia
Disallow:

# Disallows any other user-agent from accessing the /tmp directory tree
User-agent: *
Disallow: /tmp/
This approach can be easily implemented on existing servers, and the access policy can be determined through a single document retrieval. Robots can follow the protocol of accessing the "/robots.txt" file to determine whether they should continue to access the server. Additional sources of information:
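A robot written in modern Python could honor such a policy with the standard urllib.robotparser module. This sketch parses the example policy above directly; a real robot would first fetch /robots.txt from the server:

```python
from urllib.robotparser import RobotFileParser

# The example /robots.txt policy, as a list of lines.
policy = [
    "# robots.txt for http://www.cs.unc.com/",
    "User-agent: arachnophilia",
    "Disallow:",
    "",
    "User-agent: *",
    "Disallow: /tmp/",
]

rp = RobotFileParser()
rp.parse(policy)

# arachnophilia may go anywhere; all other agents must avoid /tmp/.
print(rp.can_fetch("arachnophilia", "http://www.cs.unc.com/tmp/x"))   # True
print(rp.can_fetch("otherbot", "http://www.cs.unc.com/tmp/x"))        # False
print(rp.can_fetch("otherbot", "http://www.cs.unc.com/index.html"))   # True
```

A well-behaved robot checks can_fetch() before every retrieval and skips any URL the policy disallows.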
The W3Launch tool is designed to provide user support for collections of free-ware teaching software accessible via ftp. WWW will provide an interactive learning environment in which teachers and students can access the software, run it using WWW, and read additional support material. W3Launch, according to the author, allows for the combination of "hypermedia support material with software menuing and launching." Need to figure out exactly what the author means by this.
Please e-mail me at email@example.com if you have any problems with this page.