This is a memoir of my experience with FP6000#4 in Regina at Saskatchewan Power Corporation. It covers a period from spring 1963 to late 1965.
Roger D. Moore Feb 2018
Don Peters (who had a Stanford MBA in EDP) accepted a position with Saskatchewan Power Corp as head of the Computer Services department. In addition to his experience selling RCA computers, he was familiar with Stanford University's operation of a BALGOL-based computer service.
These experiences led to several strong requests to FP:
1] Algol compiler
2] AVR for magnetic tape rather than the ML&W scheme
3] provision for compile and execute of Algol programs
4] CPU timer to allow billing computer users
5] ?? magnetic drum
1] Don managed to persuade Don Ritchie (head of the FP computer division) that I was capable of writing an Algol 60 compiler. At this time I was part of the Stanford University SUBALGOL project, which was essentially completed. I was hired on an 18-month contract as half of the SPC post-sales support team (EDP expert Lionel Albert was the other half). The intent was that the Algol compiler would be used for engineering activities at SPC.
2] The magnetic tape unit allocation proposed in Marcotty, Longstaff and Williams was:
When referring to peripheral units in a program, the programmer numbers his units of a particular type starting from zero. Suppose, for example, that the program named "BILL" uses four magnetic tape units. The programmer will number these units 0, 1, 2 and 3. If BILL is to be run on an FP6000 system with six tape units which are numbered 9 to 14 and if, at the time when BILL is input, units 9 and 12 of the system are already being used by current programs in the system, then EXECUTIVE will allot units 10, 11, 13 and 14 to BILL. EXECUTIVE will type out a message to the operator giving the correspondence between BILL'S unit numbers and the system unit numbers. Every time BILL refers to his magnetic tape unit 0, system unit 10 will be used. Because of this arrangement for the numbering of peripheral units, all peripheral transfers are handled by EXECUTIVE. This method also shields the programmer from a lot of the work in organizing peripheral transfers. If BILL is run on another occasion, the allocation of actual units to the units 0 to 3 in BILL may be quite different, but, since the change from the program's unit numbers to the system unit number is performed automatically by EXECUTIVE, no alterations have to be made to BILL.
This scheme involves what Don Peters thought to be an excessive amount of operator fiddling. It was really only feasible given a tape unit in which the address used by the computer could be easily changed by an address dial (IBM 727/729) or address plug (IBM 2314). The Burroughs unit lacked this feature.
The scheme which was eventually adopted for use on FP6000s with tape was what IBM referred to as Automatic Volume Recognition. Any magnetic tape mounted on an unallocated tape unit was required to have a label as the initial record. Mounting a completely blank tape could cause a runaway read operation, which the operator could abort by taking the unit offline.
My vague recollection is that EXECUTIVE only read the tape label if there was an outstanding tape open instruction. In practice there were two common tape-open I/O operations: mode 64 opened a file and checked its label; mode 384 found a scratch tape and labelled it. Mode 128 was a rarely used variant of 64 which insisted on the absence of a write ring.
Mode 64 looked for three words (12 characters) of programmer identification and a one-word reel number (binary). Mode 384 wrote the four words used in modes 64 and 128 plus a file serial number, retention cycle and date written. The original intent was that mode 384 would find a tape where the sum of retention cycle and date written was less than the current date. This may have changed to "retention cycle = zero".
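As an aside, the scratch-tape test itself is simple. The following check, written in C purely for illustration (the date representation is assumed to be a plain day count), states both versions of the rule:

#include <stdio.h>

/* Hypothetical reconstruction of the mode 384 scratch-tape test. */
int is_scratch_original(int retention_cycle, int date_written, int today)
{
    return date_written + retention_cycle < today;   /* retention has expired */
}

int is_scratch_later(int retention_cycle)
{
    return retention_cycle == 0;                      /* possible later rule */
}

int main(void)
{
    printf("%d\n", is_scratch_original(30, 100, 150)); /* 1: eligible */
    printf("%d\n", is_scratch_later(0));               /* 1: eligible */
    return 0;
}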
I don't know exactly what Don Peters had in mind for compile and go operations. The EXECUTIVE architects were oriented towards the commands described in ML&W. Deletion of extraneous characters from commands (e.g. "LOAD #BILL ON X." became "LOAD #BILL X.") meant that almost all commands could fit in 18 characters. It was agreed that cards which began with "uparrow STAR" were non-existent at SPC. The resulting scheme was such that a card with the trigger characters in columns one and two was a control card. Readying an unallocated card reader caused EXEC to read the first card and check for a control card. A control card could contain up to four commands.
If a plausible command was found, it was parsed and executed. "LOAD #JACK." was expanded to "LOAD #JACK <command source>." where source was the unit number of the device from which the command was read.
Successful completion of the EXEC activity initiated by the command led to an attempt to find a successor on the same card or the following card. Completion of some commands such as GIVE or SUSP[end] was immediate; others such as LOAD, GO etc. were only complete when the program reached a situation where an INSTRUCT #JACK message would be printed (see manual section 6.2, page 2). Before printing the INSTRUCT message, EXEC attempted to find another command. A successor command could be found either twenty card columns beyond the previous command or on the following card. A freshly loaded program needed a GO command to begin execution. For Algol, Autocoder and Fortran the entry point determined the output device: 23 paper tape; 24 card punch; 25 magnetic tape; 26 drum.
For compile and go, the source program was followed by LOAD and GO commands. For compilation to drum I believe LOAD #JACK 18. (18 was the external drum address) was used. The LOAD command looked for a file named JACK and tried to load it.
I am a little fuzzy on details of LOAD but I may have some points right. A LOAD #JACK. command from the card reader assumed the next card was the start of the object program (request slip). LOAD #JACK 18. tried to load from a drum file named JACK. LOAD #JACK (tape unit number). searched the tape for the start of the JACK object program; this was used for a system tape containing several programs. I cannot remember the syntax which caused EXECUTIVE to search unallocated tape units to find an output tape from Algol/Fortran with a semi-compiled JACK.
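A rough sketch in C of how a control card might have been split into commands. The trigger characters in columns one and two, the four 20-column fields, and the 18 usable characters after the trigger are my reading of the description above, not a confirmed layout; '^' and '*' merely stand in for the actual trigger characters.

#include <stdio.h>
#include <string.h>

/* Split an 80-column card image into up to four command fields.
   Assumed layout (my reconstruction): columns 1-2 hold the trigger
   characters, the first command occupies columns 3-20, and further
   commands occupy columns 21-40, 41-60 and 61-80. */
int split_control_card(const char card[81], char cmds[4][21])
{
    int n = 0;
    if (card[0] != '^' || card[1] != '*')     /* "uparrow STAR" trigger */
        return 0;                             /* not a control card */
    for (int f = 0; f < 4; f++) {
        int start = (f == 0) ? 2 : f * 20;    /* field start (0-origin) */
        int len   = (f == 0) ? 18 : 20;
        char buf[21];
        memcpy(buf, card + start, len);
        buf[len] = '\0';
        int e = len;                          /* trim trailing blanks */
        while (e > 0 && buf[e - 1] == ' ') e--;
        buf[e] = '\0';
        if (e == 0) break;                    /* empty field ends the card */
        strcpy(cmds[n++], buf);
    }
    return n;
}

int main(void)
{
    char card[81];
    memset(card, ' ', 80); card[80] = '\0';
    memcpy(card, "^*LOAD #JACK 18.", 16);
    memcpy(card + 20, "GO #JACK.", 9);
    char cmds[4][21];
    int n = split_control_card(card, cmds);
    for (int i = 0; i < n; i++)
        printf("command %d: %s\n", i + 1, cmds[i]);
    return 0;
}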
The computer operators used command cards for some purposes which were not anticipated. A common operating requirement was converting a labelled tape into a scratch tape. Assuming the tape reel was mounted on unit 22, the sequence: GIVE #LABL 22. GO #LABL 20. would rewrite the tape label with a retention period of zero while preserving the volume serial number. LABL would release the tape drive when it was done. LABL also had an entry point for processing a completely blank tape. A console ALTER command was required to supply the volume serial number.
Certain other utility programs could be loaded and perhaps executed with command cards.
Don Peters was interested in how much CPU time was used by a program so that some department could be billed. FP6000#4 included a CPU timer which incremented while a program was running. A timing pulse which occurred during a hesitation or when the machine was in Executive was ignored. This enhancement of ignoring hesitations was imitated by IBM when the 370 CPU TIMER was introduced. One problem with this kind of billing scheme is that it requires keeping records of completed programs. I don't recall this being done in my time in Regina.
The base language was from the 1960 report rather than the 1962 revised report. My hope was for relatively good performance using a simple single-pass compiler. As the Fortran compiler was also single pass, the GPL loader program already had some facilities needed by a single-pass compiler.
1] Some interworking with the Fortran project was desirable. The calling sequence for the Fortran/Algol compilers allowed a main program written in one language to call a subroutine written in the other. This calling sequence was capable of supporting Jensen's device in an Algol-only environment.
2] Jeannetta Peters, a consultant to SPC, insisted that Fortran-like READ and PRINT statements had to be provided. They are characterized by phrases to gather data for output and distribute data for input. (The Fortran manual's Input/Output chapter refers to these phrases as a "list".) The format argument was a <character expression> (defined below). If a program contained a PRINT/READ statement, the Algol compiler had to ensure that the program request slip requested a line printer/card reader. The Fortran I/O package did the heavy lifting for PRINT/READ.
3] I had just finished working on SUBALGOL where some external subroutine extensions to BALGOL were used to process characters. I decided to add a character type to FP6000 Algol. The rules were influenced by the hardware limitations of the FP6000 and a desire to keep performance relatively high.
4] Some of the more awkward features of Algol 60 were supported. Calculated call by name, which is sometimes called Jensen's device, was supported. (Part of the compiler test suite was a call of Merner/Knuth's GPS to compute the Nth prime.) Another awkward feature supported was GO TO an undefined switch designator (4.3.5).
New declaration:
<character segment> ::= <character identifier> [ <upper bound>] | <character identifier> , <character segment>
<character list> ::= <character segment> | <character list> , <character segment>
<character declaration>::= CHARACTER <character list>
Upper bound of character array was restricted to 127. This allowed use of the 064 looping instruction to increment a character address and decrement/test a 7 bit count.
Character expression
<character expression> ::= <character identifier> | <character identifier> [<subscript expression> , <result length>] | <character constant>
The zero-origin subscript was the starting position; both arguments were integer expressions. Thus the FP6000 ALGOL statement dest := source[index, length] is equivalent to the C string.h call memcpy(dest, source+index, length).
The other uses of character expression were as procedure arguments and in comparisons.
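To make the substring semantics concrete, here is a tiny C illustration of the zero-origin [start, length] selection; it is only the memcpy parallel mentioned above, not FP6000 code:

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* FP6000 Algol:  DEST := SOURCE[6, 5];
       zero-origin start 6, result length 5 */
    char source[] = "HELLO WORLD";
    char dest[6];
    memcpy(dest, source + 6, 5);   /* the character-expression selection */
    dest[5] = '\0';
    printf("%s\n", dest);          /* prints WORLD */
    return 0;
}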
Most of the compiler was written while I was resident in Regina. SPC supplied the key punching. During a short visit to Palo Alto I had a chance to debug the Floyd production tables used to translate a token stream to postfix notation. A Burroughs B5500 ALGOL program was used to develop these tables. This work was completed on 22 Nov 1963.
FP leased most of the former head office building of Victory Aircraft in Malton. This building is also known as the "A.V. Roe Main Administration Building". It was torn down sometime after Boeing Canada closed the plant in 2005. I arrived in Malton, Ontario in June 1964 with lots of cards. Initially the only machine for software testing was FP6000#6. It had a card reader intended for SPC (probably the 800 card / minute model) and paper tape in and out plus a line printer. Thus object programs were all on paper tape.
#6 was used for some hardware development. For a while it had a special peripheral connected which was used to exercise a device invented by Gord Lang. It was called a "modem" and I had no idea what it did.
Later #4 and #5 became available for some software testing. Both machines were also used for debugging tape and drum control hardware. The compiler was finished in November 1964 about a month before my contract expired. At this time meetings to plan the incorporation of I. P. Sharp Associates Limited were in progress. I relocated from Malton to Regina in December and was there when the machine was being installed. In December my status changed from FP employee in post-sales support to IPSA employee providing consulting services to SPC.
Some performance problems were encountered on the EDP side. The sort program was a read-backwards polyphase sort which used from three to five tape units (usually four). The initial pass of a polyphase sort creates ordered runs of records on the work tapes. Subsequent passes merge these runs until a single run remains. In theory it is desirable to give the initial pass lots of storage so that the length of the initial runs increases; this in turn reduces the number of passes required. The implementation used by the original programmers moved whole records around more than was required.
In forming these runs the records buffered in storage are in two classes: those with a key which allows them to be included in the current run, and those with a key which prevents inclusion in the current run. My solution was to maintain two vectors of pointers: one vector for the current run, another for the excess. A newly read input record was usually incorporated into one of the two groups (a special case was an immediate copy to the work tape): a pointer to the new record was inserted into one of the two vectors, its place determined by a binary search. This process was still of N-squared duration, but the base time was the time to move a single word with the block move instruction (two core cycles = 5µs). This provided satisfactory performance. A sketch of the scheme follows.
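The sketch below, written in C, is my reconstruction of the idea rather than the original code; the record layout, buffer size and input values are invented. The point is that the binary-search insertion shifts one-word pointers, never whole records, and that records ineligible for the current run are parked in the excess vector until the run ends.

#include <stdio.h>
#include <limits.h>

#define CAPACITY 8                    /* records held in core (illustrative) */

typedef struct { int key; } Record;   /* payload omitted for brevity */

/* Insert a pointer into a vector kept sorted by key: binary search for
   the slot, then a block move of pointers (not records) to open it. */
static void insert_ptr(Record **vec, int *n, Record *rec)
{
    int lo = 0, hi = *n;
    while (lo < hi) {
        int mid = (lo + hi) / 2;
        if (vec[mid]->key < rec->key) lo = mid + 1; else hi = mid;
    }
    for (int i = *n; i > lo; i--) vec[i] = vec[i - 1];
    vec[lo] = rec;
    (*n)++;
}

int main(void)
{
    static const int input[] = { 42, 7, 99, 13, 5, 87, 20, 64, 3, 55, 11, 70 };
    const int nin = sizeof input / sizeof input[0];
    Record pool[CAPACITY];
    Record *cur[CAPACITY], *excess[CAPACITY];
    int freeslot[CAPACITY], nfree = CAPACITY;
    int ncur = 0, nexcess = 0, run = 1, last_out = INT_MIN, in = 0;

    for (int i = 0; i < CAPACITY; i++) freeslot[i] = i;

    while (in < nin || ncur > 0 || nexcess > 0) {
        /* read input records until the buffer pool is full */
        while (nfree > 0 && in < nin) {
            Record *r = &pool[freeslot[--nfree]];
            r->key = input[in++];
            if (r->key >= last_out) insert_ptr(cur, &ncur, r);
            else                    insert_ptr(excess, &nexcess, r);
        }
        if (ncur == 0) {                     /* current run exhausted */
            printf("--- end of run %d ---\n", run++);
            for (int i = 0; i < nexcess; i++) cur[i] = excess[i];
            ncur = nexcess; nexcess = 0; last_out = INT_MIN;
            continue;
        }
        /* "write" the smallest eligible record to the work tape */
        Record *out = cur[0];
        for (int i = 1; i < ncur; i++) cur[i - 1] = cur[i];
        ncur--;
        freeslot[nfree++] = (int)(out - pool);
        last_out = out->key;
        printf("run %d: %d\n", run, out->key);
    }
    printf("--- end of run %d ---\n", run);
    return 0;
}

With the twelve sample keys and a pool of eight records, the sketch produces a first run of ten records and a second run of two, which is the usual benefit of this kind of run formation over simply sorting each bufferload.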
The machine was installed in December. Business programmers had written PGEN programs in summer at Malton. With realistic data volumes, performance of PGEN object programs became an issue. PGEN numbers were stored in decimal which made calculations rather slow. There may have been other problems.
I proposed using Algol for EDP. The Algol compiler already had a limited capability for handling strings (see character expression above). The compiler was extended to provide double-precision integers.
A subroutine library for EDP was added. String functions: conversion between string and integer in both directions.
My recollection is that the conversion routines were fairly primitive. A decimal field would have been a fixed length (character expression). Leading blanks were probably tolerated. I cannot remember exactly how negative numbers were represented (leading minus or eleven over-punch on the final digit). Perhaps absolute value was applied before conversion. In practice an Algol integer often represented a number in dollars and cents. I cannot remember if the software acknowledged this (leaving a blank in the decimal point position would have solved most of the problem).
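A hedged reconstruction, in C, of roughly what the string-to-integer direction may have looked like; the field width, blank tolerance and leading-minus convention are the uncertain points noted above.

#include <stdio.h>

/* Convert a fixed-length decimal field to an integer.  Leading blanks
   are tolerated; a leading minus sign is accepted (whether SPC used a
   leading minus or an eleven over-punch is uncertain, as noted). */
long field_to_int(const char *field, int width)
{
    long value = 0;
    int i = 0, negative = 0;
    while (i < width && field[i] == ' ') i++;          /* skip leading blanks */
    if (i < width && field[i] == '-') { negative = 1; i++; }
    for (; i < width; i++)
        if (field[i] >= '0' && field[i] <= '9')
            value = value * 10 + (field[i] - '0');
    return negative ? -value : value;
}

int main(void)
{
    /* e.g. a dollars-and-cents amount of $123.45 held as "  12345" */
    printf("%ld\n", field_to_int("  12345", 7));   /* 12345 */
    printf("%ld\n", field_to_int("   -987", 7));   /* -987  */
    return 0;
}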
Punched card oriented routines to add/remove zone punches above digit punches (a sketch follows this list):
1] Apply a zone punch to a single digit. POP
2] Retrieve the zone punch from a single column destructively. PIP
3] Remove all zone punches from a character argument.
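The following C fragment is only an illustration of the idea behind POP and PIP, using the common punched-card convention in which an eleven-zone over-punch on digits 1-9 yields the letters J-R; the actual FP6000 character codes and calling conventions were certainly different.

#include <stdio.h>

/* Illustrative only: POP applies the eleven-zone punch to a digit,
   PIP removes it and reports whether one was present. */
char pop_apply_zone(char digit)
{
    if (digit >= '1' && digit <= '9')
        return (char)('J' + (digit - '1'));  /* 1->J ... 9->R */
    return digit;
}

char pip_remove_zone(char column, int *had_zone)
{
    if (column >= 'J' && column <= 'R') {
        *had_zone = 1;
        return (char)('1' + (column - 'J'));
    }
    *had_zone = 0;
    return column;
}

int main(void)
{
    int zone;
    char c = pop_apply_zone('5');            /* 'N': 11-zone over a 5 */
    char d = pip_remove_zone(c, &zone);
    printf("%c\n", c);
    printf("%c zone=%d\n", d, zone);         /* 5 zone=1 */
    return 0;
}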
I/O functions
The FP6000 tape convention established by Lionel Albert was fixed-length records, usually catenated into a block of several records.
Read/Write blocked records
Read/Write unblocked tape
Read an eighty column card
Tapes were assumed to be labelled. Exceptions to this were handled with the GIVE #JACK 2x. command.
To quote from the FP6000 programming manual: in some cases the programmer must allow for a 9-word area which consists of one mode word, one reply word and 7 words associated with the label. The nine-word area takes the form:
Word 1 Mode
2 Reply information
3-5 Programmers identification
6 Reel number
7 File serial number
8 Retention cycle
9 Date written
For the two open input modes, words 3-6 were input fields. Executive searched for a tape with a label which matched these four words. Words 7-9 were read from the label.
For open labelled output tape, Executive searched for a tape with expired retention (and a write ring). Words 3-9 were written to tape as part of the new label.
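Written out as a C structure purely for illustration (FP6000 words were 24 bits; here each word is shown as an int and the three identification words as twelve characters):

/* Illustration of the 9-word tape-open area described above.
   Word numbering follows the manual excerpt (word 1 = mode). */
struct tape_open_area {
    int  mode;                /* word 1: 64, 128 or 384                    */
    int  reply;               /* word 2: reply information from Executive  */
    char identification[12];  /* words 3-5: programmer's identification    */
    int  reel_number;         /* word 6: reel number (binary)              */
    int  file_serial;         /* word 7: file serial number                */
    int  retention_cycle;     /* word 8: retention cycle                   */
    int  date_written;        /* word 9: date written                      */
};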
Tape marks were handled rather differently than in the IBM world. A major difference from 360 practice was the support for intermediate tape marks in the middle of a file. This involved use of what ICL called "qualifier blocks". The three situations where a tape mark was used in the FP6000 were
end of file: TM EOF qualifier
end of reel: TM EOR qualifier
intermediate tape mark: TM ITM qualifier (included three user words) TM {trailing TM was for read backwards reasons}
Ferranti-Packard used a five-word qualifier (ICL expanded this to twenty). The initial word was some standard value (I think it was a word of all ones, which in hindsight would be a poor choice). The second word indicated the type: EOF/EOR/ITM. My recollection is that EOF/EOR made no use of the last three words. For ITM the contents of the last three words were user defined.
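Again as a C illustration only (word widths and the marker value are assumptions):

/* The five-word Ferranti-Packard qualifier block which followed a tape mark. */
enum qualifier_type { QUAL_EOF, QUAL_EOR, QUAL_ITM };

struct qualifier_block {
    int marker;         /* word 1: standard value (possibly all ones)          */
    int type;           /* word 2: EOF, EOR or ITM                             */
    int user_words[3];  /* words 3-5: user defined for ITM, unused for EOF/EOR */
};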
My recollection is that only one application at Regina used ITM. Remember that an EDP program was rather cramped in the number of tape drives used (maximum of four). The tape to line printer utility expected ITM, EOR and EOF qualifiers. When the program encountered an ITM, the three user words (twelve characters) were printed on the TTY33 console, which suspended the program. The computer operator interpreted this message as a request to mount a specific form on the printer and resume execution of the utility via a GO command.
Routines used by Algol programmer
Initialize buffers
Open tape unit
Read/Write
Write EOF/EOR/ITM
Read qualifier block
BEGIN
CHARACTER LINE[121];
reserve line space LINE[0,1] is carriage control, LINE[1,120] is line image
INTEGER ARRAY PRTBUF[-8 : 5*31];
reserve a buffer of 5*31 words
where 5 is blocking factor; 31 is record size (4 char/word)
first nine words were for housekeeping etc (see below)
INTEGER ARRAY LABELWORK[1:9];
work area for open
MOVEWORDS( LABELWORK[3], 'MY PRINT TAPE',3); LABELWORK[6] ← 0;
tape label is irrelevant but needed. Systems analyst might specify value.
OPENOUTPUT(unitnumber, LABELWORK);
unitnumber is integer in 0-7 range which pairs PRTBUF with opened tape unit
OPENWRITEBLOCKED ( PRTBUF, PRTBUF, 31, unitnumber );
name of buffer is repeated to indicate single buffering.
double buffering required declaration of two buffer arrays.
31 is logical record length in words
unitnumber is tape unit (chosen by programmer) range is 0-7.
LINE ← '0HELLO WORLD VIA EDP EXTERNAL PROCEDURES' ;
leading zero is carriage control (single space after print),
unequal length move was blank padded
WRITEBLOCKED (PRTBUF, LINE) ;
output one record.
CLOSE (PRTBUF) ;
force out partial block, write end-of-file stuff, release tape unit
The tape routines required the Algol programmer to declare one or two buffer arrays in an outer block. Two buffers were required for the double-buffered versions of a read/write routine. Tape operations were usually overlapped with computation. My memory is dim on how many callable routines existed. Read and write buffers differed only in the -4 (mode) word; unblocked was a special case where the blocking factor equals one. I am also vague about how unit numbers were assigned, but I guessed at one possibility in the HELLO WORLD example.
The nine words which preceded the actual buffer locations were used for various purposes
-4 thru -1 held the executive control area
-4 mode ;
-3 reply from Exec;
-2 block size (normally 5*31 for our example);
-1 pointer to buf[1] needed by Executive
0 held drum address for unblocked/unbuffered read/write.
Other words required by the read/write routines were
-8 pointer to other buffer (or self)
-7 record length
-6 position within current block for next record
-5 unit number (I cannot remember how these were recorded).
I have guessed a little about the contents of words -7, -6 and -5. The Algol programmer could ignore everything except the rarely used drum address word. The length of the buffer could be calculated from the Fortran-compatible array header. http://www.chilton-computing.org.uk/acl/literature/manuals/p001.htm#c22p4 There were other subroutines as well.
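One way to write the layout down, as C offset definitions for illustration; as noted, the contents of words -7 to -5 are partly guessed.

/* Housekeeping words preceding buf[1] in a read/write buffer array. */
#define BUF_OTHER      (-8)  /* pointer to the other buffer (or to itself)    */
#define BUF_RECLEN     (-7)  /* logical record length in words                */
#define BUF_POSITION   (-6)  /* position within current block for next record */
#define BUF_UNIT       (-5)  /* program's tape unit number                    */
#define BUF_MODE       (-4)  /* Executive control area: mode word             */
#define BUF_REPLY      (-3)  /* Executive control area: reply from Exec       */
#define BUF_BLOCKSIZE  (-2)  /* Executive control area: block size (5*31)     */
#define BUF_DATAPTR    (-1)  /* Executive control area: pointer to buf[1]     */
#define BUF_DRUM        (0)  /* drum address for unblocked/unbuffered I/O     */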
The Algol programmer had some responsibilities regarding unusual conditions: after a call to the external read/write procedure, checking for a non-zero reply word was required. There are some potential problems with requiring the Algol programmer to look at buffer[-3]. I believe this word could be set by an interrupt when the tape operation finished. My recollection is that synchronization was achieved by delivering an integer result from the read/write procedures.
When a tape-mark result is returned, there are no more buffered records awaiting delivery to the ALGOL program. The ALGOL program had to read the five-word qualifier block to ascertain its meaning. For end of file, setting some internal flag to indicate that EOF was encountered was sufficient. When the program completed, the tape unit would be closed.
For end of reel, some work was required. Some sequence such as:
CLOSE(INBUFFER);
LABELWORK[6] := LABELWORK[6] + 1;
OPENINPUT(unitnumber, LABELWORK);
then repeat the failed read operation (blindly assuming the result is zero)
For write operations, end of tape was processed in a similar fashion. The ALGOL program had to write a tape mark and a five-word EOR qualifier block.
The ALGOL program could write an ITM sequence (see the remarks on the tape to print utility). I cannot remember any application which expected to read them, but it would have been feasible.
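Putting the pieces together, the read side of a master-file program may have looked roughly like the following C sketch. The routine names mirror the Algol external procedures above, but their exact signatures are my invention and the externals are stand-ins, so this is a shape rather than working code.

#include <stdio.h>

enum { RESULT_DATA = 0, RESULT_TAPEMARK = 1 };
enum { QUAL_EOF, QUAL_EOR, QUAL_ITM };

/* Stand-ins for the Algol library routines described above. */
extern int  READBLOCKED(int *buffer, char *record);
extern void READQUALIFIER(int *buffer, int *type, int user_words[3]);
extern void CLOSE(int *buffer);
extern void OPENINPUT(int unit, int *labelwork);

void process_record(const char *record);

void read_master_file(int unit, int *buffer, int *labelwork)
{
    char record[124];                     /* 31 words of 4 characters */
    int  finished = 0;

    while (!finished) {
        if (READBLOCKED(buffer, record) == RESULT_DATA) {
            process_record(record);
            continue;
        }
        int type, user[3];
        READQUALIFIER(buffer, &type, user);
        switch (type) {
        case QUAL_EOF:                    /* end of file: stop reading */
            finished = 1;
            break;
        case QUAL_EOR:                    /* end of reel: open next reel */
            CLOSE(buffer);
            labelwork[5] += 1;            /* word 6: bump the reel number */
            OPENINPUT(unit, labelwork);
            break;                        /* next iteration repeats the read */
        case QUAL_ITM:                    /* intermediate tape mark */
            /* user[] carries twelve characters of application data */
            break;
        }
    }
}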
As SPC switched their customer billing from IBM 604/1401, some intermittent problems with the magnetic tape system became obtrusive. The usual problem was that a tape was unreadable or unwritable and so a program had to be cancelled. These problems were addressed in three different ways:
1] Optional checkpoint facility for some long duration programs
2] Executive changes
3] Identification and correction of a minor design error in the tape control unit.
I introduced a checkpoint facility for use in certain long duration programs with multiple reel master files. CA51 which was the main customer billing program was the initial user. I don't know if other programs eventually used it. A checkpoint was written at the beginning of a master file reel other than the first. It was preceded by some kind of ITM sequence (I cannot remember details). The checkpoint records on tape were followed by a similar ITM sequence to facilitate skipping over a checkpoint which was not needed. I cannot remember details of checkpoint writing and use. (It probably used the 154 and 155 Executive instructions).
In hindsight I made a grave error when I wrote the code to skip past a checkpoint. I modified the external procedure to read blocked tape so that it ignored an over-length block. This allowed me to write a simple Algol loop to ignore the checkpoint blocks. My error was that the procedure had previously been quite intolerant of an over-length block and aborted the program with some kind of console error message. Some later programmer had a misunderstanding about the blocking factor and thus managed to discard some fairly important historical data. (The data was reconstructed by a fair bit of rerunning of programs.) MORAL: beware of changing the specifications of a subroutine which is used by other people. It would have been rather easy for me to create a short Autocoder procedure to skip forward to the next tape mark.
As SPC had some tape problems, the ability to restart CA51 (the principal customer billing program) from a checkpoint saved quite a bit of rerun time and was probably worthwhile. In 1974, when I was consulting at National Revenue Taxation (the income tax group), I mentioned this to some programmers. They remarked that their master file was ninety reels of tape and they expected to encounter at least one unreadable reel per master file update. As the number of records written to a single reel was artificially constrained (rather than writing to the EOT mark), it was possible for NRT to reconstruct a single reel of the master file.
The magnitude of the problem was not well understood. Some program runs were aborted when Executive's attempts at error recovery failed. The frequency of minor errors was not known. As a first step, Executive was modified to print a three-character message when recovering from a tape error. These messages were sometimes so frequent as to interfere with input operations on the TTY33 console. A new command to suppress/enable these messages was introduced.
The preceding minor change required knowledge of how to modify Executive and exactly how it did certain things. Executive was programmed in a rather primitive language called ASSEMBLER. It allowed addresses to be referenced by a symbolic name (codeword). With the help of my apprentice Bob? Jackson, in one hour I wrote an Algol/Autocoder program which read the source tape for Executive. If a line contained a codeword, an output record was written with the codeword in a standard position and the line image to the right. These output records were then ordered with the standard SORT utility and then printed. We then had a cross-reference for Executive which not only supplied the reference but also indicated how the symbol was used. Favourable experiences with this primitive cross-reference influenced the design of IPSCOBOL: the compiler-generated cross-reference had a (Fetch, Store, Unknown) code for every reference to a symbol.
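The original was a one-hour Algol/Autocoder job; the C sketch below only illustrates the idea of emitting one record per symbol per line, with the symbol in a fixed leading field and the line image to the right, ready for the standard SORT utility. Treating every alphabetic token as a codeword is my simplification.

#include <stdio.h>
#include <ctype.h>
#include <string.h>

/* For each source line read from standard input, emit one output record
   per symbolic name found: the name in a fixed leading field, then the
   original line image.  Sorting the output yields a cross-reference. */
int main(void)
{
    char line[256];
    while (fgets(line, sizeof line, stdin)) {
        line[strcspn(line, "\n")] = '\0';
        for (int i = 0; line[i] != '\0'; ) {
            if (isalpha((unsigned char)line[i])) {
                char name[32];
                int n = 0;
                while (isalnum((unsigned char)line[i]) && n < 31)
                    name[n++] = line[i++];
                name[n] = '\0';
                printf("%-12s %s\n", name, line);   /* codeword, then image */
            } else {
                i++;
            }
        }
    }
    return 0;
}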
Now that we understood where the Executive magnetic tape routines were located, it was fairly easy to modify them. The next modification was to add a diagnostic facility to the 175 (magnetic tape I/O) instruction. The user supplied a special register word to be sent to the peripheral controller; Executive read the special register when the operation completed. The only censorship of the operation was datum/limit processing on the address/count pair from the 175 instruction. This allowed an Algol program to experiment with different error recovery procedures. I cannot remember exactly what change was made for read error recovery. I know that some other systems move the tape a few records backwards and then retry the read. (This assumes that the error was caused by crud on the tape which will be removed or relocated by the back and forth motion.) For write errors, experimentation was helpful. For the raw tape used at SPC a defective stretch of tape detected during a write operation was several inches long. Increasing the length of tape which was erased before retrying the write gave a noticeable improvement in reliability.
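I cannot reconstruct the exact recovery procedure, but the shape of the write-error experiment was roughly as follows (in C; write_block, backspace_block and erase_gap are invented stand-ins for operations issued through the 175 instruction, and the inch counts are made up):

/* Hypothetical shape of the write-error recovery experiment. */
extern int  write_block(int unit, const int *block, int words);  /* 0 = success */
extern void backspace_block(int unit);            /* back up over the failed block */
extern void erase_gap(int unit, int inches);      /* erase tape before rewriting */

#define MAX_RETRIES 10

int write_with_recovery(int unit, const int *block, int words)
{
    int erase_inches = 4;                  /* starting erase length (invented) */
    for (int attempt = 0; attempt <= MAX_RETRIES; attempt++) {
        if (write_block(unit, block, words) == 0)
            return 0;                      /* block written successfully */
        backspace_block(unit);
        /* A defective stretch of raw tape could be several inches long,
           so erase a progressively longer gap before retrying. */
        erase_gap(unit, erase_inches);
        erase_inches *= 2;
    }
    return -1;                             /* give up; operator intervention */
}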
The subtle design error in the MTU concerned the detection of tape marks. My recollection is that the MTU sometimes hallucinated a tape mark when reading forward. A pair of JK flip-flops which were supposed to perform a shift malfunctioned. The design error was that the second JK FF was triggered by a pulse which was slightly delayed. The JK inputs to the second FF therefore saw the new value of the first FF rather than the previous value, as in a proper shift register. This speeding up was attributed to aging of capacitors in the second FF, but there was never any attempt to swap cards around to verify this. The solution was to slightly delay the trigger pulse to the first FF so that both JK FFs switched at the same time.
The main criticism of storage allocation in the FP6000 was that all of the unused storage was at high addresses above the last program loaded. When a program other than the most recently loaded completed, other programs had to be moved. In theory this move should have taken about a tenth of a second or less, as the move rate was 2E5 words/second. In practice it was several seconds, as the scheme which Executive used to ensure cessation of I/O was a little clumsy. As the condition was rather rare at SPC (the program being deleted was usually the most recently loaded), this flaw was never corrected in my time. With a different pattern of computer usage (e.g. a George 3 environment) this might have been a problem.
Typical mix of programs at SPC in the daytime included a lot of short duration compile and go for either program testing or engineering applications. There might be a tape only EDP program running as well. Some short duration EDP programs which generated error lists requiring immediate clerical attention were also run in daytime. Evening runs were four-tape master file update or sort concurrent with a utility (tape to print or card to tape).
The tape AVR system worked fairly well. It was supported by a tape library in which it was impossible to check out or own a blank tape. A special project could only own a tape which contained data. Having fungible scratch tapes was an important part of AVR. AVR on input generally worked well. One exception was a monthly master file update where the systems analyst had given the input and output master files the same name. This kept the computer operators hopping on reel changes.
When I left SPC, production programs were loaded from cards. A program could take from 200 to a thousand cards. Loading at 800 cpm was fairly quick, but the 200 cpm reader sometimes had to be used. Miklos Barabas thought this was a waste of computer time and asked my successor as system programmer to fix this. The solution was a utility program which read an object program and wrote it to a named drum file.
One matter of concern to Miklos was toleration of certain equipment outages. Regina was three days away by air freight from Boston, where the Analex printer was made. The only other magnetic tape EDP computer in town was an IBM 1410 operated by the Government of Saskatchewan. The two installations agreed that this was a shared problem, and so both wrote some software in anticipation of a printer failure. SPC could read a tape from the 1410 and print it; the 1410 shop could print an FP6000-generated tape. Although these programs were tested, they were not needed in my time.
Bernard Cherny was the chief technician. He managed to find substitute transistors to repair circuit cards, as the original transistors were no longer made. He also found a master machinist who worked for a truck garage. When the ball bearings for the Burroughs/Machines De La Rue Bull 300 cpm card punch failed, SPC was somewhat inconvenienced. The machinist made bushings to replace the failed bearings. Aside from the need to pause every 15 minutes to lubricate them, the bushings worked well. When the oil pump on the Analex printer failed, Berny discovered that a Cummins diesel used the same pump and bought one for a third the cost of the Analex replacement.
In spring 1965, the cabinet minister responsible for SPC was supposed to drop by the SPC computer centre to see the FP6000 through the glass viewing wall. This was a time when the computer was normally unused. Don Peters requested a dummy program to make the FP6000 look busy. I wrote a short program which wrote nonsense blocks to tape until end-of-tape was encountered. It then executed the end-of-reel procedure and started another tape. As tapes were opened with a retention cycle of zero, this could go on forever with one or two tape drives. A similar program was written to read the tapes. This kept two of the tape drives spinning. I tried to gild the lily by cloning the two programs to spin four tape drives. This didn't work, as the lowest priority program stalled.
I inquired about this and learned that Executive tape channels were scheduled in a more primitive fashion than the CPU. If an attempt to start a tape operation failed due to an "all channels busy" signal, the program was suspended but remembered. When a tape channel became available, the remembered program was executed regardless of priority. Only one program could be remembered. Therefore my test bed with four programs and two channels stalled. As this scenario was quite rare at SPC, Executive was never fixed.
A PDP8 was installed in Gas Dispatch. It had some simple electronics to accept data from a Bristol pulse duration modulation telemetry system. Although the scheme was fairly error resistant, PDM seemed an extremely inefficient use of bandwidth. It was quite simple, so demodulation and demultiplexing were easily performed by software. The PDP8 was connected by a four-storey-long twisted pair to a new peripheral controller on the FP6000. Data could be sent back and forth via a half-duplex protocol. I helped modify the PDP8 software to improve reliability. Someone else had already provided some support software on the FP6000 end. One use at the FP6000 end was a small permanently resident program which recorded some historical temperature data on the drum.