Standard user model

From Wikipedia, the free encyclopedia

A standard user model is a standardized data model of a service's end user. In theory, it permits a wide range of adaptive infrastructure, especially software, to adapt to a single human user's characteristics, e.g. preferred language, visual needs such as large print or color-blindness adaptation, and sound volume.
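A minimal sketch of such a model might look like the following. This is purely illustrative: the field names and the adaptation rule are assumptions for the sake of the example, not part of any actual standard.

```python
from dataclasses import dataclass

@dataclass
class UserModel:
    """Hypothetical standard user model record (illustrative fields only)."""
    preferred_language: str = "en"
    large_print: bool = False
    color_blind: bool = False
    volume_percent: int = 50

def adapt_font_size(model: UserModel, base_pt: int = 12) -> int:
    """Double the base font size for users who have requested large print."""
    return base_pt * 2 if model.large_print else base_pt

# A piece of adaptive software consults the model rather than asking the user again.
user = UserModel(preferred_language="de", large_print=True)
print(adapt_font_size(user))  # prints 24
```

The point of the structure is that any service holding the record can adapt itself without re-querying the user, which is also precisely why such records raise the privacy concerns discussed below.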

Privacy violation

In practice, however, it is often merely an excuse to collect a great deal of personal data and violate privacy. Few applications make full use of the data gathered in standard user models, and some suggest that doing so is never economically feasible. Thus, information gathered about a disability may be used to deny someone compensation for an injury caused by their failure to see or do something, rather than being used to make that failure less likely by adapting the infrastructure or software to their use. The fields of adaptive software and personal differences analysis, themselves part of systems design engineering, concern themselves with the degree to which infrastructure can be made flexible.


As of 2003, the most common examples of standard user models were driver's licenses, passports, MedicAlert data, insurance applications, credit applications, and credit histories. With the exception of the medical applications, these are typically used to offer, deny, or price a service, not to improve the quality of the aid offered.

An example of a more adaptation-oriented standard user model is the Windows Live ID, which is used to adapt MSN Messenger and Microsoft-allied web sites to user preferences. However, Microsoft has been criticized for security and privacy management problems in the past, and its model is unlikely to be adopted for direct use by governments or for other mandatory purposes.

Identity cards

National identity cards are in use in many nations, but these typically contain no more identifying information than a passport. That is changing with the War on Terrorism and the increasing use of biometrics, which expand the standard user model for identity cards beyond border crossings, airport security, and secure-building requirements, and may soon come to include profiles of individuals' political views or religion, encoded in the vast storage capacity of such new cards.

This gives rise to concerns about privacy and the potential for a carceral state to emerge wherever such cards are required for citizens to access infrastructure that they previously used anonymously. In theory, every page read on the World Wide Web, every note written by email, every line of instant messaging, and even every telephone conversation, telephone number, or street address could be recorded as part of a standard user model.

Identity theft

This might be useful to the user, but it could also be devastating if it ended up in the hands of someone who used it for identity theft, and in particular for frame-ups. There is no reliable way to prevent all such abuses in any scheme, particularly if common criminal identifiers such as fingerprints or DNA are included in the standard model: a single security failure could put an entire population's biometric data into the hands of individuals or agencies willing to use it to frame anyone whose political opinions or friends they disliked. Simply by replicating the DNA or printing the fingerprints at life size, and planting these traces of the person whose identity was stolen at a crime scene, literally anyone could be framed for any crime at all.

As with digital photography and digital audio techniques, the signs of forgery would become indistinguishable from traces of the actual individual being framed, and fingerprint and DNA evidence would become unreliable in principle, just as photos, audiotapes, and even videotapes (given sufficient motivation to create a forgery) already are.

The future

Standard user models for hospital and military use nonetheless appear very likely to expand, and may come to be used more widely in the future, regardless of the privacy risks of such centralized databases.
