Network Working Group
Request for Comments: 369
NIC: 6801
J. Pickens
UCSB COMPUTER SYSTEMS LABORATORY
25 July 1972

EVALUATION OF ARPANET SERVICES

January through March, 1972

ABSTRACT

RFC #302, Exercising the ARPANET, described a group organized at UCSB to investigate the network resources. The stated goals were to develop problem solving capability and, in the process, produce helpful criticism for the nodes investigated. This report summarizes the group's experiences and findings and suggests network refinements to improve user satisfaction.

The group's encounter with ARPANET included many unexpected problems and difficulties. Most worthy of mention are software heterogeneity and inadequate documentation.

From this first-hand experience the group has formulated criteria for ease of use of network resources. The report presents these criteria as well as suggestions for improved documentation, better utilization of current resources, and a plea for regular usage of inter-personal communications facilities. Individual sites have been graded on reliability, response, and friendliness. Comments regarding specific sites have been included to help in adapting to the needs of uninitiated users.

Despite problems encountered in the initial nine week exposure, enough was learned of ARPANET resources to enable the group to write useful software. Programs to effect automatic login, file transfer, and interprocess communication have been written and put to use.

TABLE OF CONTENTS

      BACKGROUND
         Approach
         Goals
      THE SURVEY
         Extent and Duration
         Statistical Results
      CRITIQUE OF ARPANET SERVICES
         A Site Measurement Parameter, "Friendliness"
         Software Critique
         Community Spirit
         Economics
      SUGGESTIONS FOR IMPROVEMENT
         Software
         Community Spirit
      CONCLUSION
      APPENDIX A
         Sample of Survey Questionnaire
      APPENDIX B
         Grades and Comments for Specific Sites

BACKGROUND

Approach

The test group was organized from Electrical Engineering graduate students in Computer Science. Within the group was represented a substantial degree of experience with high level languages and time sharing systems (such as the Dartmouth BASIC and UCSB mathematical graphics systems). However, no one had experience in exercising the ARPANET, and few knew what resources the ARPANET represented. After two weeks of presentations from Jim White and Roland Bryan, the group was turned loose for open experimentation.

Enthusiasm was high as each group managed to locate and decode the login procedures for various nodes and began to learn how to use the available resources. In fact, half of the weekly seminar time was devoted to sharing learned experiences and procedures. Interest, however, lagged somewhat as the quarter progressed due to poor network site reliability, few active nodes, and hard-to-locate documentation (only five out of fourteen students remained active after the first quarter).

Goals

The primary goal of the group was to learn how to use and to evaluate network resources. It was decided to be fair but direct in evaluating each site, including UCSB. Since the level of networking experience was initially low, the evaluation criteria were dictated mostly by gut feeling.

At the conclusion of the first quarter's effort, a questionnaire was given to the students (a sample of which is included in Appendix A).

The group response is summarized for overall performance below. Data for individual sites is presented in Appendix B. Some of the questions asked were the following:

Estimate the percentage of time spent in various trouble states
Estimate the mean time to failure
Describe personal experience with the network
Suggest improvements
Grade the investigated nodes on the factors of reliability, response, and friendliness

THE SURVEY

Extent and Duration

During the period in which the major effort was expended (January-March, 1972) relatively few nodes were active. Experimentation, therefore, concentrated most heavily on UCSB, BBN-TENEX, MIT-MULTICS, and SRI-ARC. Minor investigation was performed of HARV-10, UCLA-NMC, and UCLA-CCN. The remaining sites were either inactive or inaccessible for lack of documentation.

Activity included the following:

      Game playing (e.g., chess, life, and doctor at BBN-TENEX)
      Text and file manipulation (e.g., COL, NLS, TECO)
      Inter-personal communication (LINK and SNDMSG)
      On line compilation (e.g., TENEX FORTRAN, MULTICS PL/1).

Statistical Results

Figure 1 below summarizes the overall response to the questionnaire given to the group after nine weeks' experience with the ARPANET. Individual exposure varied from ten to sixty hours, and twelve students responded. Each survey item is presented as a group average (sum/12) and is supplemented with a low and a high value to show the range of response. The questions were slightly ambiguous in that they failed to distinguish between node inactivity and local NCP inactivity. Also, some figures may reflect individual students' inadequacy in understanding local and foreign procedures. Nevertheless, the data is interesting as a look into uninitiated user experience.

Figure 1

   Survey Item                                Average    Low    High
   
   % of time unable to log in any site         12.4%     2%     25%
   % of time unable to log into desired site   35.7      20     75
   % of time foreign site suddenly crashes     13        5      50
   % of time local site suddenly crashes       12.5      5      25
   % of time trouble free operation            35        0      80
   Approximate mean-time-between-failure       1 hr     5 min   2 hrs

   TOTAL TIME INVESTED                         32.3 hrs 10 hrs  60 hrs
   
First to be noted is that, considering the entire ARPANET complex, no one estimated the mean-time-between-failure at more than two hours! Secondly, the average figure for "trouble free" operation was 35%, a figure untenable for regular usage. In all fairness, however, some sites were much more "trouble free" than others, and individuals tend to define the term by the level of their own competence and experience, which explains the high of 80% and the low of 0%.

CRITIQUE OF ARPANET SERVICES

A Site Measurement Parameter, "Friendliness"

Much discussed by the group was the concept of "friendliness", especially as it applies to on-line systems. The following definition of friendliness is offered, based on direct network experience.

Friendliness is:

Concise, complete, and available documentation.
Easy system usage (e.g., a minimum number of keystrokes for login; system and job status readily available).
Easy-to-reach help, both on-line people and on-line files.
No message overkill (as sometimes unexpectedly occurs during login).
Reasonable reliability and response time.
Concise but informative error diagnostics.

The reader can probably think of more criteria, but these were the outstanding points of friendliness generated specifically by the group's experience.

Software Critique

1) Initial experimentation concentrated on login procedures, canned scenarios (e.g., Abhay K. Bhushan's ARPANET scenario, RFC #254), game playing, and inter-personal communication. As the effort continued, attempts were made to solve problems at various nodes. One student, for example, programmed a Newton-Raphson root finder in PL/1 at MIT-MULTICS; a blackbody problem in FORTRAN at BBN-TENEX and MIT-MULTICS, and in PL/1 at MIT-MULTICS; and a Discrete Fourier Transform in BASIC at BBN-TENEX. It is the group's conclusion that small problems can be written in a half hour, entered and edited in fifteen minutes, and debugged in another fifteen minutes. For small problems the current ARPANET software resources are quite adequate.
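
To give a feel for the scale of these "small problems", a minimal Newton-Raphson root finder is sketched below. This is only an illustration in modern Python, not a reconstruction of the students' PL/1, FORTRAN, or BASIC programs.

   # Minimal Newton-Raphson sketch: iterate x_{n+1} = x_n - f(x_n)/f'(x_n).
   # Illustrative only; the 1972 exercises were written in PL/1, FORTRAN,
   # and BASIC on remote ARPANET hosts, not in Python.

   def newton_raphson(f, f_prime, x0, tol=1e-10, max_iter=50):
       """Return an approximate root of f, starting from the guess x0."""
       x = x0
       for _ in range(max_iter):
           fx = f(x)
           if abs(fx) < tol:
               return x
           x = x - fx / f_prime(x)   # one Newton step
       raise RuntimeError("no convergence within max_iter iterations")

   # Example: the positive root of x**2 - 2, i.e., sqrt(2).
   root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
   print(root)   # approximately 1.4142135623730951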

2) By far the most annoying difficulty was obtaining adequate documentation. The resource notebook was found to be interesting but of limited utility.

3) Information about each node's NCP, which was requested in February, 1972, is still unavailable.

4) Significant variations in procedures were found in executing similar tasks on different nodes. Consider, for example, the wide variety of text editors with unique file naming, editing, and manipulation commands (TENEX, TECO, COL, NLS...). Consider, too, the wide variety of compilation, load, and execute procedures (RJE for UCSB; edit, save, compile, save, load, execute for TENEX systems). Even more disparate are the "executive level" commands with all their varieties (TENEX's "Control-C", UCLA-NMC's "X", UCSB's "RESET" ... all of which return to the "top level"). Software heterogeneity is a stumbling block to the user.

5) Residents of large nodes are hard pressed to find problems which should be solved outside of the local environment. With UCSB's mathematical graphics on-line system and direct access to batch, the group experienced apprehensive twinges spending hours on the network solving problems which could be solved in minutes locally.

Community Spirit

1) Individuals sometimes got the impression (erroneously it is hoped) that some researchers in the ARPA community had little desire to consult and/or help. On the other hand, others bent over backwards in giving assistance. The group had hoped for a more consistent response.

2) There was difficulty in locating the source of responsibility for resource development. It seemed to the seminar group that the complete distribution of responsibility negated incentive to locate, document, and create useful network resources.

Economics

Network economics, at levels above as well as at the communications level, are a big user problem; e.g., if distributed computing is allowed, then distributed billing is a necessity. It is frustrating to watch accounts randomly die at different nodes and then have to spend weeks in monetary renovation. This problem was experienced with a site which (a) randomly changed passwords and then (b) eliminated its free account. There is also a problem with double connect charges, e.g., $4.00 per hour at UCSB to sign on to BBN-TENEX at $8.00 per hour, which totals $12.00 per hour!

SUGGESTIONS FOR IMPROVEMENT

In spite of the many difficulties and frustrations, the class was impressed with the potential of ARPANET and produced several suggestions for improvement.

Software

1) Working groups should be organized to define problems which require the use of a significant set of the network resources.

2) The ARPANET represents a great resource already, even with TELNET as the only operational protocol. More effort should be put into utilizing what currently exists. Two illustrative examples follow:

a) By combining the resources represented by UCSB's OLS and UCSB's TELNET, user programs were created to sign on automatically to the various sites. Thus a network user need know only the sign-on procedure for UCSB; all settings of local/remote echo, character/line at a time, upper/lower case, etc. are taken care of automatically by the pre-written user programs.

b) Combining the resources of the TELNET protocol, PL/1 subroutine calls to the UCSB NCP, and 360 O/S multi-programming, a group of students created a batch-fed command language in PL/1 to communicate via TELNET with foreign sites. This program has been used successfully to investigate file transfer (NIC files are regularly copied onto 8-1/2 x 11" white printer paper, and cards will soon be transferred to I4-TENEX), interprocess communication (a program was started at BBN-TENEX to be used as a subroutine locally; plans exist to initiate and monitor a chess game between BBN-TENEX and SU-AI), and data transfer (pre-formatted files of data have been transferred from UCLA-NMC to UCSB; UCLA-NMC will soon make available survey and measurement data a la the TELNET protocol and through direct ICP!). More details of this program will be available in a future report.
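
The following is a rough sketch, in modern Python rather than the original PL/1, of the kind of batch-fed command language described in (a) and (b) above: a script of simple commands drives a TELNET-style connection, signing on and issuing commands on the user's behalf. The command verbs, host name, prompts, and login sequence here are hypothetical illustrations, not the actual UCSB program's interface.

   # Sketch of a batch-fed command interpreter driving a remote host over a
   # TELNET-style TCP connection.  Commands and host names are hypothetical;
   # a real TELNET client would also answer option negotiation (IAC
   # sequences), which is omitted here for brevity.
   import socket

   def run_script(lines):
       """Interpret CONNECT / EXPECT / SEND / CLOSE commands, one per line."""
       conn = None
       for line in lines:
           verb, _, arg = line.strip().partition(" ")
           if verb == "CONNECT":               # CONNECT host:port
               host, port = arg.split(":")
               conn = socket.create_connection((host, int(port)), timeout=30)
           elif verb == "EXPECT":              # EXPECT <prompt text>
               buf = b""
               while arg.encode() not in buf:  # read until the prompt appears
                   chunk = conn.recv(1024)
                   if not chunk:
                       raise ConnectionError("remote host closed connection")
                   buf += chunk
           elif verb == "SEND":                # SEND <text>  (CR LF appended)
               conn.sendall(arg.encode() + b"\r\n")
           elif verb == "CLOSE":
               conn.close()
               conn = None

   # Hypothetical sign-on scenario, in the spirit of the automatic login
   # programs described in example (a):
   script = [
       "CONNECT example-tenex-host:23",
       "EXPECT login:",
       "SEND guest",
       "EXPECT password:",
       "SEND arpa",
       "SEND SYSTAT",
       "CLOSE",
   ]
   # run_script(script)   # would drive the (hypothetical) remote host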

3) Documentation: A self-sufficient mini-user-manual (MINIMAN) should exist for each site and also for each network-wide function, such as the FORTRAN compilers. The MINIMAN would be similar in some respects to the Resource Notebook, but would be more oriented toward helping the user get up and running. A site-dependent MINIMAN would contain the following:

Sign on procedure
Simple file manipulation and editing commands
Compilation and execution instructions
TELNET access
Brief (!) summary of programs and subroutines
Directions on how to get help.

Overall documentation of hardware, software, and human resources should be more complete. A documentation questionnaire should perhaps be circulated to authors of network programs, including the authors of Network Control Programs. Merging information from the questionnaire with the Resource Notebook would facilitate the construction of a cross-referenced resource-location index. Such an index, perhaps on-line, would aid the network user in locating both software and hardware. Whatever the final scheme, more planning is required to ease the user's battle with documentation. The recent effort in this direction by Marshall D. Abrams, entitled "Serving Remote Users on the ARPANET" (NIC 10606, RFC #364), is well timed and should be thoroughly considered.

4) Finally, high level subroutine calls to each NCP, such as those offered by UCSB, should be universally available.
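
The report does not specify the UCSB subroutine calls themselves, so the sketch below only illustrates, under invented names, the general shape such a high-level interface might take: open a connection to a named host, send and receive data, and close, with the underlying protocol details hidden from the user program.

   # Hypothetical illustration of "high-level subroutine calls" to a network
   # control program: the user program names a host and port and gets plain
   # open/send/receive/close calls.  All names here are invented; this is
   # not the UCSB NCP interface.
   import socket

   class NetConnection:
       def __init__(self, host, port):
           # "Open": establish the connection on the caller's behalf.
           self._sock = socket.create_connection((host, port), timeout=30)

       def send_line(self, text):
           # "Send": transmit one line of text, terminated by CR LF.
           self._sock.sendall(text.encode() + b"\r\n")

       def receive(self, max_bytes=1024):
           # "Receive": return whatever the remote host sent next.
           return self._sock.recv(max_bytes).decode(errors="replace")

       def close(self):
           # "Close": release the connection.
           self._sock.close()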

Community Spirit

1) Networks have great though unexploited potential for inter-personal communication. The communication resources (NIC's JOURNAL and NLS; TENEX's SNDMSG and LINK; UCLA-NMC's S_.MSG:C, to name a few) are used today only by the proficient few, but should be utilized regularly by all. Two symptoms of the current state of network communications, from the group's point of view, are that most procedural information was shared verbally in class and that many problems in locating documentation were solved by a last resort to that old standby, the telephone. Improved communications will stimulate cooperation on joint projects.

2) Names and interests of programmers/researchers willing to cooperate on joint projects and corresponding "blue sky" lists of software projects should be maintained.

3) A network NEWS and NOTES should be published to inform and advise network participants of new resources and procedural modifications. Care must be taken, however, to keep this document concise (i.e., avoid "message over-kill"). Perhaps a one page flier published weekly would meet this need.

4) A network consulting center should be created, perhaps at the existing NIC, which would specialize in non-partisan matching of network users to network resources.

5) A strong potential of the network is in Computer Science education. Being exposed to many varieties of computer systems helps the student/user avoid the narrowness of experience and opinion which sometimes exists in centers of learning and computing. In this respect the TIP user is probably the most benefited as, for little investment in local resources, many styles of systems are at his "finger-tips". Yet even for service nodes, the network represents an inexpensive extension to local educational resources. Current efforts to tap the educational value of ARPANET should be encouraged and extended.

CONCLUSION

Existing site surveys measure and evaluate the performance of IMP hardware, host hardware, and host NCP programs, but little has been done to evaluate software performance. The UCSB EE 210 graduate students attempted a primitive first pass evaluation of network resources in the period between January and March 1972. Out of this effort have come definitions and criteria which would be useful to other individuals or agencies in developing evaluation schemes on the USER protocol level. To this end, it is hoped that this report is useful.

APPENDIX A - Sample Student Questionnaire

ARPANET

   Grade Given:  A=Excellent                 Evaluation by:
                 F=Bad
   
   -------------------------------------------------------------------
   SITE | RELIABILITY| RESPONSE | FRIENDLINESS | # HOURS  | COMMENTS |
        |            |          |              |   USED   |          |
   -----|------------|----------|--------------|----------|----------|
        |            |          |              |          |          |
        |            |          |              |          |          |

ARPANET Evaluation

-- Indicate % of your sessions which were in the following categories:

              %               State
         +--------+-------------------------------------------+
         |        |  Unable to Log in to any site.            |
         |--------|-------------------------------------------|
         |        |  Unable to Log in to Desired site.        |
         |--------|-------------------------------------------|
         |        |  Foreign site suddenly crashes.           |
         |--------|-------------------------------------------|
         |        |  Local site crashes.                      |
         |--------|-------------------------------------------|
         |        |  Trouble free operation.                  |
         |--------|-------------------------------------------|
         |        |  Other                                    |
         +--------+-------------------------------------------+

-- Considering the performance of the local host, communication network, and remote hosts, estimate the mean time to failure of ARPANET:

         Mean-Time-Between-Failure=___________

-- What was your total time invested in the ARPANET this quarter?

         Total Time Invested=___________
   
-- Describe your overall experience with the ARPANET (e.g., rise and fall of personal interest, factors involved, etc.).

-- What suggestions for changes or improvements or new capabilities do you have to make to ARPANET hosts?

(Use back side or other paper for these questions if necessary)

APPENDIX B - Specific Sites, Grades and Comments

The following grades and comments are based on the two to four most representative questionnaire responses for each site. Reliability, Response, and Friendliness are averaged grades and reflect subjective criticism. Total Time Invested is the sum of the experimentation times reported by individual respondents. It is hoped that future evaluations might be more specific and complete than the current efforts, yet the value of these initial efforts should not be underestimated.

Grades:

         A=Excellent
   
         F=Bad
                                                    Total Time
   Site        Reliability  Response  Friendliness   Invested
   --------------------------------------------------------------
   BBN-TENEX       A            A         A             71 hours
   UCSB            B            B+        B-            36
   SRI-ARC         B            B         A             75
   HARV-10         C            A-        B             14
   UCLA-NMC        C-           C         D             14
   MIT-MULTICS     C-           D         C+            82
   --------------------------------------------------------------

Group Comments

      Site:  BBN-TENEX
         Very popular site
         Doctor, Life, and Chess are stimulating and easy-to-use games
         Operators are very helpful
         Account problems kept site from being useful
         BASIC is well-written and easy to use
         FORTRAN is difficult to use because of the many steps to
         create-compile-execute.
      
      Site:  UCSB
         There are many problems with old keyboards
         TELNET diagnostics are poor
         Online help files are sorely lacking
         Graphics are necessary for full utility
         Operator would not reload NCP when down
         List of TELNET site names is not current or complete
      
      Site:  SRI-ARC
         Good documentation exists on NLS specifics, but general
           overview is lacking
         Inter-console link is convenient and often used.
         NLS-JOURNAL is useful but requires significant training
         Online perusal is difficult at terminals with small display faces.
      
      Site:  HARV-10
         Operator is readily available
         FORTRAN is straightforward
         Easy to use editor
         Couldn't get operator to put BASIC on.
      
      Site:  UCLA-NMC
         Self-explanatory ABACUS program is not self-explanatory
         System often disappears
         Hard to get past LOG ON* without TIMEOUT GOODBYE
         Message system is well organized.
      
      Site:  UCLA-CCN
         Always up, but nothing can be done (HELP is not supported)
         When RJS is executed, there is no response until correct signon
         procedure is entered (spurious death indication).
      
      Site:  MIT-MULTICS
         Response is very slow
         Automatic logout of autonomous user is excruciatingly painful
         Text editor is very easy and helpful
         PL/1 and FORTRAN are easy to use.
     
        [This RFC was put into machine readable form for entry]
     [into the online RFC archives by Hélène Morin, Viagénie 12/99]