
16 September 2012. Related NY Times article:

http://www.nytimes.com/2012/09/16/technology/in-microsofts-new-browser-the-privacy-light-is-already-on.html

Developers at Apache, a popular Web server, have also objected, saying that Microsoft’s default setting may not convey a user’s specific intent. They have introduced an update that may cause some sites to ignore what they view as a “presumptive” do-not-track flag.
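
As described, the Apache update works at the request-header level: when the client identifies itself as Internet Explorer 10, a server-side rule discards the DNT header before applications ever see it, on the reasoning that a default-on flag conveys no specific user choice. The following is a hypothetical Python (WSGI middleware) rendering of that logic, offered only as an illustration and not as Apache's actual configuration; the user-agent string matched ("MSIE 10.0") is an assumption.

    # Hypothetical WSGI middleware sketching the kind of rule described above:
    # drop the DNT header when the browser is known to set it by default.
    class StripPresumptiveDNT:
        def __init__(self, app):
            self.app = app  # the wrapped WSGI application

        def __call__(self, environ, start_response):
            user_agent = environ.get("HTTP_USER_AGENT", "")
            if "MSIE 10.0" in user_agent:
                # IE10 ships with DNT: 1 enabled by default; treat it as not
                # expressing a specific intent and remove it from the request.
                environ.pop("HTTP_DNT", None)
            return self.app(environ, start_response)

Any WSGI application could be wrapped as app = StripPresumptiveDNT(app); downstream code would then behave as if the presumptive flag had never been sent.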

Industry representatives say privacy advocates have skewed the conversation from the outset, by using Big Brother-y terms like “do not track,” when they view the choice for consumers as between seeing relevant ads or generic ads. They add that the process shouldn’t scare consumers. To create interest-based ads, they say, ad networks and analytics companies assign people anonymous code numbers and simply record things like the sites they visit and the search terms they enter.
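
A minimal sketch of the record-keeping the industry describes, assuming only what the paragraph states: a randomly assigned code number keyed to visited sites and search terms, with no name or account attached. The site names and interest categories below are hypothetical.

    import secrets
    from collections import Counter

    # Hypothetical mapping from visited sites to interest categories.
    SITE_CATEGORIES = {
        "swimgear.example": "swimming",
        "poolnews.example": "swimming",
        "carparts.example": "autos",
    }

    class InterestProfile:
        """An 'anonymous' profile: a random code number plus observed behavior."""

        def __init__(self):
            self.code = secrets.token_hex(8)   # the anonymous code number
            self.interests = Counter()
            self.search_terms = []

        def record_visit(self, site):
            category = SITE_CATEGORIES.get(site)
            if category:
                self.interests[category] += 1

        def record_search(self, term):
            self.search_terms.append(term)

        def top_interest(self):
            return self.interests.most_common(1)[0][0] if self.interests else None

    profile = InterestProfile()
    profile.record_visit("swimgear.example")
    profile.record_visit("poolnews.example")
    profile.record_search("lap swimming technique")
    print(profile.code, profile.top_interest())   # e.g. "3f2a9c... swimming"

The code number, not the person, is what the ad network follows from site to site.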

“Somebody knowing anonymously that a number that they can follow on the Web likes swimming, to me that is not a privacy breach,” says Chris Mejia, director of the ad technology group at the Interactive Advertising Bureau, an industry association.

But Jonathan Mayer, a graduate student in computer science at Stanford who has studied online tracking, says there is cause for deeper concern. For him, the issue is not behavior-based ads, but the data-mining necessary to produce them.

For instance, Mr. Mayer says, consumers may not be aware that when they visit a site, dozens of entities, like analytics companies and data aggregators, may be operating on that page, collecting online information about them, and amassing those details for advertising purposes. Moreover, he says, those entities could potentially have access to screen names or e-mail addresses that might be used to re-identify people.
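
A minimal sketch of the re-identification risk Mr. Mayer describes, assuming (hypothetically) that one of the entities on a page has logged a screen name or e-mail address alongside the same tracking ID that keys the "anonymous" browsing history. All identifiers and records below are invented for illustration.

    import hashlib

    # "Anonymous" browsing history keyed by a tracking ID.
    browsing_history = {
        "id_7f3c": ["clinic.example/depression", "jobs.example/resumes"],
    }

    # A widget or login form elsewhere logged a hashed e-mail next to the same ID.
    observed_logins = {
        "id_7f3c": hashlib.sha256(b"jane.doe@example.com").hexdigest(),
    }

    # A broker's list mapping hashed e-mails back to people.
    broker_list = {
        hashlib.sha256(b"jane.doe@example.com").hexdigest(): "Jane Doe",
    }

    # Joining the three tables re-identifies the "anonymous" profile.
    for tracking_id, pages in browsing_history.items():
        email_hash = observed_logins.get(tracking_id)
        person = broker_list.get(email_hash, "unknown")
        print(person, "visited", pages)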

14 September 2012

Machinic Bypasses of Personal Anonymity

Related:

Tubes: A Journey to the Center of the Internet, by Andrew Blum, 2012
Secrets of Computer Espionage: Tactics and Countermeasures, Joel McNamara, 2003
The Design Philosophy of the DARPA Internet Protocols, David D. Clark, 1988
Transport Protocol Specification, ISO, 1984
A Protocol for Packet Network Intercommunication, Vinton Cerf and Robert Kahn, 1974

Previous:

http://cryptome.org/2012/08/quixotic-anon-priv-sec.htm
http://cryptome.org/2012/06/comms-folly.htm
http://cryptome.org/cryptome-anon.htm


Cryptome comments on Hacking the Future: Privacy, Identity and Anonymity on the Web, by Cole Stryker

The distinction between personal anonymity, consumer anonymity and machine anonymity is worth pondering. Data profilers will readily offer personal anonymity in exchange for access to the data needed to profile an anonymous consumer. Who a person is matters less than buyer behavior; the name on a bank account is less important than the payments made from the account. It is well established that Internet data aggregators seek the patterns, pathways and networks of individuals more than their individual identities. An algorithmic, machinic ID is assigned to the data for processing and use.

Behind the promises of personal anonymity is the reality of machinic identity, which underlies every device that transmits and receives data. Much machinic processing takes place without humans and thus has no need for personal identity -- personal anonymity is automatically provided by exclusion from the processing.

Every digital device can be traced and profiled by its activities. The hardware used for digital communications, for example, exchanges signals among devices that are necessarily uniquely identified by type, location and capabilities. A personal device performs its service without needing information about the person using it -- personal identity is useless for machinic performance. The chain from mouse or other input device to computer to LAN to cable to ISP to manifold ISPs, nodes and networks -- to and from -- operates in the background without personal ID, much like the automated controls of structures and infrastructures. This signaling is trivially anonymous unless converted to personal nonymity for non-machinic purposes.
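
A minimal sketch of that point: a stable machinic identifier can be derived entirely from device-level attributes, with no field for a person anywhere in it. The attribute names and values below are hypothetical stand-ins for what hardware and protocols actually expose.

    import hashlib
    import json

    def machinic_id(device):
        """Derive a stable identifier from device attributes alone."""
        canonical = json.dumps(device, sort_keys=True).encode()
        return hashlib.sha256(canonical).hexdigest()[:16]

    # Hypothetical attributes of the kind a network observer can see:
    # hardware address, device type, capabilities, point of attachment.
    device = {
        "mac": "00:1a:2b:3c:4d:5e",
        "type": "laptop",
        "capabilities": ["802.11n", "bluetooth"],
        "attachment": "isp-node-17",
    }

    print(machinic_id(device))   # same device, same attributes -> same ID

Note what is absent: no name, account or other personal field is needed for the identifier to be stable and traceable.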

Machines and physical networks leak signal, very faintly or grossly. Some of the signal exists only for machinic performance, some is anonymous content for directing the signal, and some is nonymous content for the sending and receiving parties. Only the last raises the issue of personal identity, and it is this part which allows protection of identity. The machinic parts and their distinctive leakiness may be used to profile a pattern of behavior by a user whose personal identity is otherwise concealed. Tracing an input-device user to a particular location and to a receiver of that input, and profiling the exchanges, can be done machinically without any need to breach personal anonymity or comsec at either end.
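
A minimal sketch of that kind of machinic profiling, using only metadata an observer on the path can see (timestamps and packet sizes) and never touching content or identity. The traffic samples and size "signatures" below are hypothetical; real traffic analysis uses statistical matching rather than exact comparison.

    # Hypothetical per-connection metadata: (seconds since start, packet size).
    # No payload and no identity, only what the wire leaks.
    observed = [(0.00, 583), (0.21, 1460), (0.22, 1460), (0.45, 310)]

    # Hypothetical "signatures": packet-size patterns known sites produce.
    signatures = {
        "mail.example": [583, 1460, 1460, 310],
        "news.example": [420, 900, 220],
    }

    sizes = [size for _, size in observed]
    for site, pattern in signatures.items():
        if sizes == pattern:
            print("traffic pattern matches", site)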

For this reason it is fair to describe the Internet, like other communications systems, as a concealed spying system -- concealed from those who (have been induced to) believe that user location, identity and content protection come first. Analogic machinic performance requirements necessarily bypass digital security, and weaknesses in analogic systems cannot be fully corrected by digital means, which at best can only camouflage and divert.

As with communications security, the most effective attacks on a secure system are never revealed; instead, camouflage, diversions and ruses are promulgated to conceal them. Controversy about password protection, encryption, privacy policy, anonymity and governmental, commercial and institutional spying on Internet users may well be an orchestrated diversion from the way machinic tracking bypasses consumer and citizen efforts at protection.

Examples of diversion are anonymizing services, encryption and privacy policies, which provide some protection on the digital surface but none on the analogic subsurface.