Tuesday, September 8, 2020

The Beast of Business (1968) and The International Society for the Abolition of Data Processing Machines

Harvey Matusow was an odd character. In the 1960s, after his confusing stint as a communist turned FBI informer turned whistleblower, he found himself involved in the underground scenes in New York and, later, London. It was in London that he founded the International Society for the Abolition of Data Processing Machines.


In support of this organization, Matusow published The Beast of Business in 1968. The book begins by explaining that "abolition" doesn't exactly mean abolition:
The purpose of the International Society for the Abolition of Data Processing Machines is not to do away with positive uses of the machine, or of the computer in society. After all, the computer does have a good, healthy and constructive function in mathematics and the other sciences, and in solving certain problems facing, for instance, scientists, architects, and engineers. 
But when the uses of the computer involve business or government, and the individual is tyrannized, then we make our stand. ... The ISFADPM is not opposed to [computer] use in the laboratory, assembling mathematical formulae, working for the physicist, the chemist, the engineer. But when the government starts to use the computer for ends (which are planned) such as filing banks of information, complete dossiers on every individual in a country, you end up with the kind of Big Brother State predicted by George Orwell in 1984.
Matusow draws a bright line between using computers in the hard sciences and using them in the soft sciences. He was not necessarily alone here -- privacy concerns over computers in the social sciences had been raised years earlier. Then the tone quickly changes, with a manifesto warning against the loss of personal freedoms via obedience to technology. (This manifesto is vaguely reminiscent of another, more famous one.)
Manifesto
Of the International Society for the Abolition of Data Processing Machines

You are a member of society. That society is made up of free citizens and free people. These free people through free elections have determined that their free institutions will continue to exist. They have declared that, if necessary, they would defend their right to live with their freedom, in their free way, and continue to enjoy the benefits and privileges which are guaranteed to citizens in free countries.

It is upon you and your friends, faced with an over-technological society, that the future of your country rests. Your country and your society have been placed in jeopardy: their defence is possible only if you face up to the challenge presented to you by the abuse of the computer.

In the transition from non-computerised life to life in a Big-Brother, computerised society you may at first feel confused. It is the purpose of this book to help you over these rough spots as rapidly as possible and to lay the foundations for your successful career as a Computer Fighter.

Making a good Computer Fighter is no different from making a good citizen. The rules are the same - Know Your Own Job, Know your enemy and be ready to step into the job of the man ahead of you.
Most of the book is presented as a series of anecdotes, none of them sourced, many of which sound like urban legends. An example: "A major computer installation in Denmark ran into trouble in the summer of 1968. It seemed to go haywire and was giving out all sorts of mis-information. Upon investigation, specialists found the computer's ailment was SUNSTROKE." (I was unable to find any further information about this incident.) 

One of the final sections of the book is a "guerrilla warfare manual for striking back at the computer", which includes the following helpful tip: "Women going into a room with a bank of computers are advised to wear a lot of the cheapest perfume they can find - filling the room with an odour which can affect many allergic computers."

Given Matusow's general eccentricity, I'm guessing he wrote most of this with a straight face. It's a weird and interesting time capsule of the early confusion and fear caused by computers. 

Tuesday, March 10, 2020

Boris Beizer, "Software Testing Techniques", 1983

Boris Beizer was an influential figure in the field of software testing. His book Software Testing Techniques, across three editions, is often cited. When the first edition was published in 1983, security was still focused more on operating systems and database integrity than on applications, so it is perhaps ahead of its time that the topic appears in the index at all.


In one passage, Beizer recommends modeling "High-level control functions within an operating system. Transitions between user states, supervisor's states, and so on. Security handling of records, permission for read/write/modify privileges, priority interrupt and transitions between interrupt states and levels, recovery issues and the safety state of records and/or processes with respect to recording recovery data." So he's still talking about the operating system.

But that's not the only place he talks security. On page 17, in the chapter "The Taxonomy of Bugs", he notes that "Gratuitous enhancements [in functionality] can, if they increase the system's complexity, accumulate into a fertile compost heap that breeds future bugs, and they can burrow holes that can be converted into system security breaches."

Chapter 7, Data Validation and Syntax Testing, is devoted to the hazards of failure to sanitize inputs. He opens with:
I think one of the worst cop-outs ever invented by the computer industry is "garbage-in equals garbage-out". We know when to use that one! When a program of ours screws up in a nasty way. People are inconvenienced by the host of subsidiary problems that result. A big investigation is launched and it's discovered that an operator made a mistake, an improper tape was mounted, or the source data was inconsistent, or something like that. That's the time to put on the Guru's mantle, shake your head from side to side, disclaim all guilt, and mutter, "What do you expect? Garbage-in equals garbage-out." 
Do we have the right to say that to the families of passengers on an airliner that crashes? Will you offer that explanation for the failure of the intensive care unit's monitor system? How about a nuclear reactor meltdown, a supertanker run aground, or a war? GIGO is no explanation for anything except our failure to install good data-validation checks, or worse, our failure to test the system's tolerance for bad data. The point is that garbage shouldn't get in at all -- not in the first place or in the last place. Every system must contend with a bewildering array of internal and external garbage, and if you don't think the world is hostile, how do you plan to cope with alpha particles?
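As a modern aside, the boundary check Beizer is arguing for looks something like the following minimal sketch (Python; the record layout and field constraints are my own illustration, not anything from the book):

```python
# A minimal sketch of boundary-layer data validation in the spirit of
# Beizer's chapter: reject garbage before it gets in. The record layout
# (name, age, account) is hypothetical, not from the book.

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    name = record.get("name")
    if not isinstance(name, str) or not 0 < len(name) <= 100:
        errors.append("name: must be a non-empty string of at most 100 characters")
    age = record.get("age")
    if not isinstance(age, int) or not 0 <= age <= 150:
        errors.append("age: must be an integer between 0 and 150")
    account = record.get("account")
    if not isinstance(account, str) or not account.isdigit() or len(account) != 10:
        errors.append("account: must be a 10-digit numeric string")
    return errors

def process(record: dict) -> None:
    problems = validate_record(record)
    if problems:
        # Garbage stops here, with a diagnostic -- it never reaches the
        # business logic, so "garbage-in equals garbage-out" never applies.
        raise ValueError("; ".join(problems))
    # ... real processing would go here ...

process({"name": "Ada", "age": 36, "account": "0123456789"})  # passes validation
```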
Of course input validation is fundamental today. But Beizer is not finished predicting the future:
There are a few malicious users in every population -- infuriating people, professional Blue Meanies who delight in doing strange things to the systems they use. Years ago they'd pound the sides of vending machines for free sodas. Their sons and daughters invented the blue-box used to get free long-distance and international telephone calls. Now they're tired of probing the nuances of their home video games and they're out to attack computers. They're out to get you. Some of them are even programmers. They are persistent and systematic. A few hours of attack by one of them is worse than years of ordinary use and bugs found by chance. And there are so many of them; so many of them and so few of us.
Then there is crime. It's estimated that computer criminals (using mostly hokey inputs) are raking in hundreds of millions of dollars annually. Some criminals could be doing it from a telephone booth in Arkansas with an acoustic coupled programmable calculator. Every piece of bad data unknowingly accepted by a system, every crash-causing input sequence, is a chink in the system's armor that knowledgeable criminals can use to penetrate, corrupt and eventually suborn the system to their own purposes. And don't think the system's too complicated for them. They have your listings, and your documentation, and the data dictionary, and whatever else they need. There aren't many of them, but they are smart, highly motivated, and possibly organized.
Beizer was an interesting and challenging individual by many accounts. His writing style makes the material more entertaining than the title suggests.

Wednesday, January 29, 2020

Paul Armer, "Privacy Aspects of the Cashless and Checkless Society", April 1968


Paul Armer of RAND testified before the Senate Subcommittee on Administrative Practice and Procedure, on the topic of privacy in financial systems, in February 1968. He notes that while the controversial Federal Data Center had become politically nonviable by this time, the automation of the financial industry was proceeding rapidly, bringing its own set of privacy issues. He identifies four main factors.
Since I am not concerned here with the implications for the financial world but rather for privacy, let me focus on four attributes of the various possible future systems-- attributes important to privacy. The first is the percentage of financial transactions actually recorded; the second, the amount of detail about a transaction that is recorded and subsequently computerized; the third, the degree to which this information is centralized; the fourth, how rapidly the information is transmitted to the central computer.
Why are these particular parameters important to the privacy issue? The percentage of transactions recorded is obviously important because it determines how complete the picture is. The same is true for the amount of detail recorded about a transaction. The amount of detail actually computerized is important because if we visualize future computer systems with remote terminals connected to the computer, only the information actually in the central computer will be available electronically at the remote terminals. Information written on a piece of paper such as a credit card sales slip or check and not in the computer could only be obtained by someone eyeballing the piece of paper.
Lastly, the rapidity of transmission is important chiefly when we think of such a system being used for personal surveillance. The value of information about an individual's whereabouts declines rapidly with time.
The extreme case, in which all transactions go through the system and all the details are recorded (who, what, where, when, and how) and then sent immediately to a single center, obviously represents the greatest threat to privacy. Such a system would know where we are and what financial activities we are involved in every time we so much as buy a candy bar or pass through a toll station. Now it is unlikely that we will get to this extreme situation in the near future, if ever. But how fast are we moving towards it, even if we may never reach that limit?
The prediction that electronic transactions could leak physical location data seems right on.

Also present is the era's recurring concern with data centralization as a threat to privacy:
[C]entralization of data is usually a concomitant of computer use. The payoff to successful snooping is much greater when all the facts are stored in one place. Though most of the data to complete a dossier on every citizen already exists in the hands of the government today, it is normally so dispersed that the cost of collecting it and assembling it would be very high.
Today we have the opposite problem: we have no idea where all our data is.

Thursday, January 16, 2020

Stockton Gaines and Norman Shapiro, "Some Security Principles and Their Application To Computer Security" (1978)

Presented at Foundations of Secure Computation, an October 1977 conference at the Georgia Institute of Technology. Gaines and Shapiro begin by adapting basic security principles -- barriers, detection, concealment -- to computers. They also consider deterrence in light of an intruder's motives, with an eye toward making intrusion too costly to be worth the trouble.
It is important to consider security from the point of view of the potential violator. He may seek to obtain information of value to him or to modify information that somebody else will use because there is some expected value to him as a consequence of the modification. He may be dissuaded from doing so because he estimates that the costs are unacceptable. The first cost is the direct cost in time, effort, and money of carrying out his plans. Both strong protection mechanisms and concealment mechanisms, such as cryptography, may impose unacceptable costs in one or more of these measures. In addition, detection and apprehension may have costs associated with them that are uncertain to the violator but whose deterrence value may be substantial. The violator may be deterred by the social stigma associated with the detection or by the penalties which may follow as a consequence of detection.
Next they consider contemporary computer security, comprising authentication, authorization ("access control mechanisms"), and the operating system, as well as the system hardware. The idea of holes in an application's security had not yet caught on.
To understand the state of computer security today and how it might be enhanced, we first analyze computer systems from a system point of view. A person attempting to use a computer system either by submitting a job or accessing the computer through a terminal must identify himself to the computer and then be authenticated. ... Other aspects of a computer system which are relevant to security are the hardware itself and the operating and management procedures for the computer.
... When the question of the security of information stored in a computer system was first raised, over a decade ago, it was immediately discovered that from a security point of view operating systems were full of flaws (and many of them still are today). In some systems, these flaws were so serious that it was possible for a user to gain control of the operating system, that is, to have code prepared by the user executed as if it were the supervisor code. Furthermore, flaws in the operating system, once discovered, turned out to be easy to exploit. ...
The initial reaction to the discovery of the weakness of computer system security was to try to correct the flaws. This meant rewriting the access control code so that it would work correctly and trying to rework those parts of systems for which flaws were due to bad design. Such efforts did not work out well; systems so enhanced were shown to have many flaws remaining. Because these flaws were easy to exploit, covering up only a few of them did not appear to be very advantageous. As a result of these failures, recent research has been directed to finding new operating system designs which take security into account in a fundamental way during the design process. ...
They note the relative absence of detection mechanisms for security, and close with a recommendation for what would today be recognized as logging:
There are currently few examples of the use of detection in computers. One that is frequently used in systems providing remote access via terminals is to report to the user at each log-in the time of his previous log-in. Thus providing him the opportunity to notice if this report differs from what he remembers. This technique may cause the detection of unauthorized use of the account. There are some weaknesses. For instance, users who repeatedly see the log-in message reporting time of last use soon fail to read this information. ... 
We have already remarked that the notion of detection has received very little attention. As an example of the kinds of techniques that might be used, we will consider one idea -- that a record be maintained of all accesses to a file owned by an individual that are made by others, or of only those accesses that are made by others when he is not logged into the system, and that this information be reported to him. Ultimately, the user may provide a list of those he expects to access his files and the report may consist of information concerning all accesses by those not on that list. If such information is stored in a way that a violator cannot get at it, then the information can be relied upon a great deal of the time. One way of recording information so that it cannot be destroyed is to write it on a tape that has no backspace provisions. ...
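Their scheme maps almost directly onto modern audit logging with an allowlist. A minimal sketch of the reporting idea (Python; the record layout and names are illustrative, not from the paper):

```python
# A minimal sketch of the access-reporting scheme Gaines and Shapiro
# describe: record accesses to a user's files by others, and report any
# access by someone not on the owner's expected list. The AccessRecord
# layout and function names are my own illustration.

from dataclasses import dataclass

@dataclass
class AccessRecord:
    file_owner: str  # user who owns the file
    accessor: str    # user who touched the file
    filename: str
    timestamp: str   # e.g. "1978-03-01T14:02:00"

def unexpected_accesses(log: list[AccessRecord],
                        owner: str,
                        expected: set[str]) -> list[AccessRecord]:
    """Return accesses to the owner's files by anyone not on the expected list."""
    return [rec for rec in log
            if rec.file_owner == owner
            and rec.accessor != owner
            and rec.accessor not in expected]

# Example: report accesses to alice's files by anyone other than bob.
log = [
    AccessRecord("alice", "bob", "notes.txt", "1978-03-01T14:02:00"),
    AccessRecord("alice", "mallory", "payroll.dat", "1978-03-01T14:05:00"),
]
for rec in unexpected_accesses(log, "alice", expected={"bob"}):
    print(f"unexpected access: {rec.accessor} read {rec.filename} at {rec.timestamp}")
```

Their aside about writing the record to a tape with no backspace provisions is the same instinct behind today's append-only audit logs.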
Interesting and worth a read. The PDF of the entire conference proceedings is linked above. I also found a hardback copy for a few bucks on Amazon.

Tuesday, January 7, 2020

The Computer and Invasion of Privacy (1966): Part II

July 28, 1966: Burton E. Squires comments on "artificial intelligence". (Obsolete spellings of "programer" and "programing" are in the original source.)
One often hears the remark that computers can do only what they are told to do. While this may be essentially true, it is practically false. Such a statement completely ignores the speed and complexity of problems that can be handled by a modern digital computer. It is, in a sense, like saying that an automobile won't take you anywhere you can't walk. Now, an automobile is about 15 times faster than walking. A modern computer is about a million times faster than paper and pencil. 
However, a computer is a machine, not an animate being. As an automobile performs no better than the skill of its designers and driver, a computer performs no better than the skill of its engineers and programers. Some of these programers are extremely skillful and sophisticated. They can write programs which give the computer a kind of "artificial intelligence." In such programs the computer is allowed to operate in a simulated random manner, to evaluate the effects of these random operations, and to modify its own operating program. As a result, the computer can literally write its own programs for the direct solution of a difficult problem. By such means it is sometimes said that computers "learn." After a short time even the programer has little knowledge of what the machine is actually doing, and he may be unable to predict the future behavior of the machine. The machine is able to learn because it was programed to do so according to a specific learning theory. In this way the intellect of the programer is still operating. 
Computers now under construction will be able to process pictures as readily as present computers process linguistic information. A whole new era of information handling is upon us. It is quite reasonable to speculate that within the next 10 years computer terminals will be as commonplace as color television sets are today. There can be little doubt that the establishment of a Federal data center could bring greater economy and efficiency to Government operations. It could do much more. It could make available to the executive branch immediate and up-to-date information summaries on all aspects of our national, business, and personal lives. Whether this can be done without violating the rights of individuals seems difficult at best and unlikely at least.
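The learning loop Squires describes -- vary at random, evaluate the effect, keep what improves -- is recognizable today as random-mutation hill climbing. A minimal sketch (Python; the objective function is an arbitrary example, not anything from the testimony):

```python
# A minimal sketch of the learning loop Squires describes: operate in a
# simulated random manner, evaluate the effect, and keep changes that
# improve performance. The objective here (maximizing a simple function)
# is an arbitrary illustration.

import random

def hill_climb(score, candidate, steps=1000, step_size=0.1):
    """Repeatedly perturb the candidate at random, keeping improvements."""
    best, best_score = candidate, score(candidate)
    for _ in range(steps):
        trial = best + random.uniform(-step_size, step_size)  # random variation
        trial_score = score(trial)                            # evaluate its effect
        if trial_score > best_score:                          # keep it if better
            best, best_score = trial, trial_score
    return best

# Example: "learn" the value that maximizes -(x - 3)^2, i.e. x = 3.
result = hill_climb(lambda x: -(x - 3) ** 2, candidate=0.0)
print(round(result, 2))  # typically close to 3.0
```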

Saturday, January 4, 2020

The Computer and Invasion of Privacy (1966)

In 1965, a National (or sometimes Federal) Data Center was proposed, which would permit centralized management of statistical data collected by various U.S. federal agencies. The goal was wider availability and better correlation of statistical population data for social science researchers. The evolution and ultimate failure of the proposal is documented by Rebecca S. Kraus.

The House Committee on Government Operations heard testimony on July 26-28, 1966, about the proposed Center and its potential privacy risks. Some remarks from Paul Baran are in line with modern security practice.

If the computer industry is to avoid external regulation, then it behooves everyone who is involved with time-shared systems handling potentially sensitive information to start working, or a[t] least thinking, about the problem of privacy. The computer industry should take the initiative and the responsibility of building in the needed safeguards itself before "Big Brother" is forced to do it himself...

To be more specific, what safeguards do I envision? Of course, we do not know all the answers yet. But, clearly, there are steps that we should be considering, including:
Provision for minimal cryptographic-type protection to all communications lines that carry potentially embarrassing data -- not super-duper unbreakable cryptography, just some minimal, reversible, logical operations upon the data stream to make the eavesdropper's job so difficult that it isn't worth his time...
Never store file data in the complete "clear." Perform some simple -- but key controllable -- operation on the data so that a simple access to storage will not dump stored data out into the clear. 
Make random external audits of file operating programs a standard practice to insure that no programmer has intentionally or inadvertently slipped in a "secret door" to permit a remote point access [to] information to which he is not entitled by sending in a "password."
When the day comes when individual file systems are interconnected on a widespread basis, let us have studied the problem sufficiently so that we can create sensible, precise ground rules on cross-system interrogation access.
Provide mechanisms to detect abnormal informational requests. That is, if a particular file is receiving an excessive number of inquiries or there is an unusual number of cross-file inquiries coming from one source, flag the request to a human operator.
Build in provisions to verify and record the source of requests for information interrogations.
Audit information requests and inform authorities of suspected misuse of the system.
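Baran's suggestion to flag excessive or unusual inquiry patterns for a human operator is, in modern terms, rate-based anomaly detection. A minimal sketch (Python; the thresholds and query format are my own assumptions):

```python
# A minimal sketch of the safeguard Baran describes: flag a file receiving
# an excessive number of inquiries, or a source making an unusual number of
# cross-file inquiries, for review by a human operator. Thresholds and the
# (source, filename) query format are illustrative assumptions.

from collections import Counter

def flag_abnormal(queries, per_file_limit=100, per_source_file_limit=10):
    """queries: iterable of (source, filename) pairs. Returns flags for review."""
    queries = list(queries)
    by_file = Counter(f for _, f in queries)
    files_per_source = {}
    for src, f in queries:
        files_per_source.setdefault(src, set()).add(f)

    flags = []
    for f, n in by_file.items():
        if n > per_file_limit:
            flags.append(f"file {f!r}: {n} inquiries exceeds limit of {per_file_limit}")
    for src, files in files_per_source.items():
        if len(files) > per_source_file_limit:
            flags.append(f"source {src!r}: queried {len(files)} distinct files")
    return flags  # in Baran's scheme, these go to a human operator

# Example: a single source sweeping many files gets flagged.
queries = [("terminal-7", f"dossier-{i}") for i in range(25)]
print(flag_abnormal(queries))
```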
The testimony then turns to aggregated and anonymized ("statistical") information versus personally identifiable information, and to how personal information might be protected by legally mandated system design.
Mr. HORTON. [I]s it possible technically to design a system so that only statistical information could be utilized or be furnished and thus protect the so-called individual information?
Mr. BARAN. If you say I know all the questions I want to ask in the future, perhaps. But if you don't, that means you have to keep the information in raw form. This is the most efficient way of keeping it. 
Mr. HORTON. I am assuming you are an expert in this field (the field of computers and what they can do). I am asking you from a technical standpoint whether or not it is possible -- in other words, could we pass a law that would require the construction of a computer that would only produce statistical information that would be foolproof insofar as individual information was concerned?
Mr. BARAN. "Foolproof" is a rough word. I think we could build safeguards to make it difficult. How effective they are, I think, requires a level of detail that we have not examined yet. 
Mr. HORTON. The point I am trying to make is that I think any law Congress would enact to safeguard the right of individuals in this area would depend to a large measure upon the state of the art. 
Mr. BARAN. That is right. 
Mr. HORTON. With regard to the technical aspects, I do not think we have sufficient information to protect the private individual in the computerized systems. 
Mr. BARAN. That is right. The technical art is changing very rapidly in computers. The speed of the computer is going up tremendously. The cost is coming down. The size of the memories is expanding very rapidly. As we look to the future we could probably see increases of size of computers -- perhaps on the order of 10,000 times as powerful as today's computers.


Friday, January 3, 2020

Harold D. Lasswell, "The Social Consequences of Automation", May 1958

Modern technology has developed a repertory of devices capable of penetrating barriers of privacy. Some of these relate to behavior such as the recording microphone or concealed photography; others refer to the inner life (narcosis-inducing devices, lie detection polygraph machines, etc.). ... In the past, the efficiency of police networks suffered from shortage of personnel. The installation of automatically monitored surveillance instruments makes it possible to penetrate the remaining barriers to privacy, and to redouble the pressures toward cautious conformity, not only to lawful prescriptions but to the informal prescriptions laid down by "Mrs. Grundy." The potentiality exists of monitoring not only prisons, schools, offices, plants, barracks, training, and recreational fields, but of surveying traffic flows, etc.

Limits to this process may be wanted in order to maintain areas of individual privacy and freedom. If so, it will be necessary to stop drifting and take the positive step of drawing up, adopting, and administering codes of freedom.



