Wednesday, January 29, 2020

Paul Armer, "Privacy Aspects of the Cashless and Checkless Society", April 1968


Paul Armer of RAND testified before the Senate Subcommittee on Administrative Practice and Procedure, on the topic of privacy in financial systems, in February 1968. He notes that while the controversial Federal Data Center had become politically nonviable by this time, the automation of the financial industry was proceeding rapidly, bringing its own set of privacy issues. He identifies four main factors.
Since I am not concerned here with the implications for the financial world but rather for privacy, let me focus on four attributes of the various possible future systems-- attributes important to privacy. The first is the percentage of financial transactions actually recorded; the second, the amount of detail about a transaction that is recorded and subsequently computerized; the third, the degree to which this information is centralized; the fourth, how rapidly the information is transmitted to the central computer.
Why are these particular parameters important to the privacy issue? The percentage of transactions recorded is obviously important because it determines how complete the picture is. The same is true for the amount of detail recorded about a transaction. The amount of detail actually computerized is important because if we visualize future computer systems with remote terminals connected to the computer, only the information actually in the central computer will be available electronically at the remote terminals. Information written on a piece of paper such as a credit card sales slip or check and not in the computer could only be obtained by someone eyeballing the piece of paper.
Lastly, the rapidity of transmission is important chiefly when we think of such a system being used for personal surveillance. The value of information about an individual's whereabouts declines rapidly with time.
The extreme case, in which all transactions go through the system and all the details are recorded (who, what, where, when, and how) and then sent immediately to a single center, obviously represents the greatest threat to privacy. Such a system would know where we are and what financial activities we are involved in every time we so much as buy a candy bar or pass through a toll station. Now it is unlikely that we will get to this extreme situation in the near future, if ever. But how fast are we moving towards it, even if we may never reach that limit?
The prediction that electronic transactions could leak physical location data seems right on.

Still present is that contemporary concern with data centralization as a threat to privacy:
[C]entralization of data is usually a concomitant of computer use. The payoff to successful snooping is much greater when all the facts are stored in one place. Though most of the data to complete a dossier on every citizen already exists in the hands of the government today, it is normally so dispersed that the cost of collecting it and assembling it would be very high.
Today we have the opposite problem: we have no idea where all our data is.

Thursday, January 16, 2020

Stockton Gaines and Norman Shapiro, "Some Security Principles and Their Application To Computer Security" (1978)

Presented at Foundations of Secure Computation, an October 1977 conference at the Georgia Institute of Technology. Gaines and Shapiro begin by adapting basic security principles -- barriers, detection, concealment -- to computers. They also consider deterrence in light of an intruder's motives, with an eye toward making intrusion too costly to be worth the trouble.
It is important to consider security from the point of view of the potential violator. He may seek to obtain information of value to him or to modify information that somebody else will use because there is some expected value to him as a consequence of the modification. He may be dissuaded from doing so because he estimates that the costs are unacceptable. The first cost is the direct cost in time, effort, and money of carrying out his plans. Both strong protection mechanisms and concealment mechanisms, such as cryptography, may impose unacceptable costs in one or more of these measures. In addition, detection and apprehension may have costs associated with them that are uncertain to the violator but whose deterrence value may be substantial. The violator may be deterred by the social stigma associated with the detection or by the penalties which may follow as a consequence of detection.
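The deterrence argument above is, in effect, an expected-value calculation from the violator's side. A toy sketch (all names and numbers here are illustrative assumptions, not from the paper):

```python
# Toy expected-value model of the deterrence argument: the violator
# weighs the value of the information against the direct cost of the
# attack and the uncertain cost of being detected and penalized.
# All parameter names and figures are illustrative assumptions.

def expected_payoff(value_of_info, direct_cost, p_detection, penalty_if_caught):
    """Expected net gain to a would-be violator.

    Deterrence succeeds when this is <= 0: strong protection and
    concealment raise direct_cost; detection mechanisms raise
    p_detection; stigma and legal penalties raise penalty_if_caught.
    """
    return value_of_info - direct_cost - p_detection * penalty_if_caught

deterred = expected_payoff(1000, 400, 0.5, 2000) <= 0  # -400: attack not worth it
```

Any one of the three levers -- cost, detection probability, or penalty -- can tip the sum negative, which is why the paper treats them as complementary.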
Next they consider contemporary computer security, comprising authentication, authorization ("access control mechanisms"), and the operating system, as well as the system hardware. The idea of holes in an application's security has not yet caught on.
To understand the state of computer security today and how it might be enhanced, we first analyze computer systems from a system point of view. A person attempting to use a computer system either by submitting a job or accessing the computer through a terminal must identify himself to the computer and then be authenticated. ... Other aspects of a computer system which are relevant to security are the hardware itself and the operating and management procedures for the computer.
... When the question of the security of information stored in a computer system was first raised, over a decade ago, it was immediately discovered that from a security point of view operating systems were full of flaws (and many of them still are today). In some systems, these flaws were so serious that it was possible for a user to gain control of the operating system, that is, to have code prepared by the user executed as if it were the supervisor code. Furthermore, flaws in the operating system, once discovered, turned out to be easy to exploit. ...
The initial reaction to the discovery of the weakness of computer system security was to try to correct the flaws. This meant rewriting the access control code so that it would work correctly and trying to rework those parts of systems for which flaws were due to bad design. Such efforts did not work out well; systems so enhanced were shown to have many flaws remaining. Because these flaws were easy to exploit, covering up only a few of them did not appear to be very advantageous. As a result of these failures, recent research has been directed to finding new operating system designs which take security into account in a fundamental way during the design process. ...
They note the relative absence of detection mechanisms for security, and close with a recommendation for what would today be recognized as logging:
There are currently few examples of the use of detection in computers. One that is frequently used in systems providing remote access via terminals is to report to the user at each log-in the time of his previous log-in. Thus providing him the opportunity to notice if this report differs from what he remembers. This technique may cause the detection of unauthorized use of the account. There are some weaknesses. For instance, users who repeatedly see the log-in message reporting time of last use soon fail to read this information. ... 
We have already remarked that the notion of detection has received very little attention. As an example of the kinds of techniques that might be used, we will consider one idea -- that a record be maintained of all accesses to a file owned by an individual that are made by others, or of only those accesses that are made by others when he is not logged into the system, and that this information be reported to him. Ultimately, the user may provide a list of those he expects to access his files and the report may consist of information concerning all accesses by those not on that list. If such information is stored in a way that a violator cannot get at it, then the information can be relied upon a great deal of the time. One way of recording information so that it cannot be destroyed is to write it on a tape that has no backspace provisions. ...
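The access-reporting scheme they describe is straightforward to sketch. A minimal version, assuming an in-memory log (a real system, as they note, would write to media that cannot be altered, such as a tape with no backspace):

```python
# Sketch of Gaines and Shapiro's detection-by-reporting idea: record
# every access to a user's files made by someone else, then report
# accesses by anyone not on the owner's expected-accessor list.
# All names here are illustrative.

access_log = []  # append-only in spirit: entries are never modified

def record_access(owner, accessor, filename, timestamp):
    """Log accesses to a user's files made by others."""
    if accessor != owner:
        access_log.append((owner, accessor, filename, timestamp))

def report_unexpected(owner, expected_accessors):
    """Accesses to owner's files by anyone not on the expected list."""
    return [entry for entry in access_log
            if entry[0] == owner and entry[1] not in expected_accessors]

record_access("alice", "alice", "notes.txt", "09:00")     # self-access: not logged
record_access("alice", "bob", "notes.txt", "09:05")
record_access("alice", "mallory", "ledger.dat", "09:10")

suspicious = report_unexpected("alice", expected_accessors={"bob"})
# suspicious == [("alice", "mallory", "ledger.dat", "09:10")]
```

This is recognizably an audit log with an allowlist filter -- the same shape as modern file-access auditing.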
Interesting and worth a read. The PDF of the entire conference proceedings is linked above. I also found a hardback copy for a few bucks on Amazon.

Tuesday, January 7, 2020

The Computer and Invasion of Privacy (1966): Part II

July 28, 1966: Burton E. Squires comments on "artificial intelligence". (Obsolete spellings of "programer" and "programing" are in the original source.)
One often hears the remark that computers can do only what they are told to do. While this may be essentially true, it is practically false. Such a statement completely ignores the speed and complexity of problems that can be handled by a modern digital computer. It is, in a sense, like saying that an automobile won't take you anywhere you can't walk. Now, an automobile is about 15 times faster than walking. A modern computer is about a million times faster than paper and pencil. 
However, a computer is a machine, not an animate being. As an automobile performs no better than the skill of its designers and driver, a computer performs no better than the skill of its engineers and programers. Some of these programers are extremely skillful and sophisticated. They can write programs which give the computer a kind of "artificial intelligence." In such programs the computer is allowed to operate in a simulated random manner, to evaluate the effects of these random operations, and to modify its own operating program. As a result, the computer can literally write its own programs for the direct solution of a difficult problem. By such means it is sometimes said that computers "learn." After a short time even the programer has little knowledge of what the machine is actually doing, and he may be unable to predict the future behavior of the machine. The machine is able to learn because it was programed to do so according to a specific learning theory. In this way the intellect of the programer is still operating. 
Computers now under construction will be able to process pictures as readily as present computers process linguistic information. A whole new era of information handling is upon us. It is quite reasonable to speculate that within the next 10 years computer terminals will be as commonplace as color television sets are today. There can be little doubt that the establishment of a Federal data center could bring greater economy and efficiency to Government operations. It could do much more. It could make available to the executive branch immediate and up-to-date information summaries on all aspects of our national, business, and personal lives. Whether this can be done without violating the rights of individuals seems difficult at best and unlikely at least.
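Squires's description of programs that "operate in a simulated random manner," evaluate the results, and modify themselves is what we would now call random-mutation hill climbing. A toy sketch, with an illustrative objective function of my own choosing:

```python
# Random-mutation hill climbing: randomly perturb a candidate
# solution and keep the change only if it scores better. This is a
# minimal form of the "learning" programs Squires describes; the
# objective function below is an illustrative assumption.

import random

def hill_climb(evaluate, candidate, steps=1000, rng=None):
    rng = rng or random.Random(0)          # seeded for reproducibility
    best_score = evaluate(candidate)
    for _ in range(steps):
        trial = [x + rng.uniform(-0.1, 0.1) for x in candidate]  # random perturbation
        score = evaluate(trial)
        if score > best_score:             # keep only improvements
            candidate, best_score = trial, score
    return candidate, best_score

# Maximize -(x - 3)^2, whose optimum is at x = 3.
solution, score = hill_climb(lambda v: -(v[0] - 3) ** 2, [0.0])
```

After enough iterations the candidate drifts close to the optimum even though no step was planned in advance -- which is exactly why, as Squires says, the programmer soon "has little knowledge of what the machine is actually doing."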

Saturday, January 4, 2020

The Computer and Invasion of Privacy (1966)

In 1965, a National (or sometimes, Federal) Data Center was proposed, which would permit centralized management of statistical data collected by various U.S. federal agencies. The goal was wider availability and better correlation of statistical population data for social science researchers. The evolution and ultimate failure of the proposal are documented by Rebecca S. Kraus.

The House Committee on Government Operations heard testimony on July 26 - 28, 1966 about the proposed Center and its potential privacy risks. Some remarks from Paul Baran are in line with modern security practice.

If the computer industry is to avoid external regulation, then it behooves everyone who is involved with time-shared systems handling potentially sensitive information to start working, or a[t] least thinking, about the problem of privacy. The computer industry should take the initiative and the responsibility of building in the needed safeguards itself before "Big Brother" is forced to do it himself...

To be more specific, what safeguards do I envision? Of course, we do not know all the answers yet. But, clearly, there are steps that we should be considering, including:
Provision for minimal cryptographic-type protection to all communications lines that carry potentially embarrassing data -- not super-duper unbreakable cryptography, just some minimal, reversible, logical operations upon the data stream to make the eavesdropper's job so difficult that it isn't worth his time...
Never store file data in the complete "clear." Perform some simple -- but key controllable -- operation on the data so that a simple access to storage will not dump stored data out into the clear. 
Make random external audits of file operating programs a standard practice to insure that no programmer has intentionally or inadvertently slipped in a "secret door" to permit a remote point access [to] information to which he is not entitled by sending in a "password."
When the day comes when individual file systems are interconnected on a widespread basis, let us have studied the problem sufficiently so that we can create sensible, precise ground rules on cross-system interrogation access.
Provide mechanisms to detect abnormal informational requests. That is, if a particular file is receiving an excessive number of inquiries or there is an unusual number of cross-file inquiries coming from one source, flag the request to a human operator.
Build in provisions to verify and record the source of requests for information interrogations.
Audit information requests and inform authorities of suspected misuse of the system.
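Two of these safeguards are concrete enough to sketch. The keyed XOR below is exactly the sort of "minimal, reversible, logical operation" Baran has in mind -- emphatically not strong cryptography, just enough that a raw storage dump or a casual wiretap yields nothing legible. The threshold-based flagging corresponds to his abnormal-request detection. All names and thresholds are illustrative assumptions:

```python
# Hedged sketches of two of Baran's proposed safeguards. The XOR
# keystream is a minimal reversible transform, NOT real encryption;
# the request counter flags abnormal inquiry patterns for a human
# operator. Names and thresholds are illustrative.

from collections import Counter
from itertools import cycle

def keyed_transform(data: bytes, key: bytes) -> bytes:
    """Reversible keyed XOR; applying it twice restores the data."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

# "Never store file data in the complete clear":
stored = keyed_transform(b"balance: $1,200", b"secret")
restored = keyed_transform(stored, b"secret")   # == b"balance: $1,200"

def flag_excessive(requests, threshold=3):
    """Sources whose inquiry count exceeds the threshold."""
    counts = Counter(source for source, _file in requests)
    return {src for src, n in counts.items() if n > threshold}

requests = [("term-7", f"file-{i}") for i in range(5)] + [("term-2", "file-1")]
flagged = flag_excessive(requests)   # {"term-7"}: refer to a human operator
```

Baran's other points -- external audits for "secret doors," source verification, cross-system access rules -- are procedural rather than mechanical, and map onto today's code review, authentication, and federation policy.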
This is followed by some discussion on aggregated and anonymized ("statistical") information vs. personally-identifiable information, and how personal information might be protected by legally-mandated system design.
Mr. HORTON. [I]s it possible technically to design a system so that only statistical information could be utilized or be furnished and thus protect the so-called individual information?  
Mr. BARAN. If you say I know all the questions I want to ask in the future, perhaps. But if you don't, that means you have to keep the information in raw form. This is the most efficient way of keeping it. 
Mr. HORTON. I am assuming you are an expert in this field (the field of computers and what they can do). I am asking you from a technical standpoint whether or not it is possible -- in other words, could we pass a law that would require the construction of a computer that would only produce statistical information that would be foolproof insofar as individual information was concerned?
Mr. BARAN. "Foolproof" is a rough word. I think we could build safeguards to make it difficult. How effective they are, I think, requires a level of detail that we have not examined yet. 
Mr. HORTON. The point I am trying to make is that I think any law Congress would enact to safeguard the right of individuals in this area would depend to a large measure upon the state of the art. 
Mr. BARAN. That is right. 
Mr. HORTON. With regard to the technical aspects, I do not think we have sufficient information to protect the private individual in the computerized systems. 
Mr. BARAN. That is right. The technical art is changing very rapidly in computers. The speed of the computer is going up tremendously. The cost is coming down. The size of the memories is expanding very rapidly. As we look to the future we could probably see increases of size of computers -- perhaps on the order of 10,000 times as powerful as today's computers.


Friday, January 3, 2020

Harold D. Lasswell, "The Social Consequences of Automation", May 1958

Modern technology has developed a repertory of devices capable of penetrating barriers of privacy. Some of these relate to behavior such as the recording microphone or concealed photography; others refer to the inner life (narcosis-inducing devices, lie detection polygraph machines, etc.). ... In the past, the efficiency of police networks suffered from shortage of personnel. The installation of automatically monitored surveillance instruments makes it possible to penetrate the remaining barriers to privacy, and to redouble the pressures toward cautious conformity, not only to lawful prescriptions but to the informal prescriptions laid down by "Mrs. Grundy." The potentiality exists of monitoring not only prisons, schools, offices, plants, barracks, training, and recreational fields, but of surveying traffic flows, etc.

Limits to this process may be wanted in order to maintain areas of individual privacy and freedom. If so, it will be necessary to stop drifting and take the positive step of drawing up, adopting, and administering codes of freedom.


Source

