Trusted Computing

From Wikipedia, the free encyclopedia
Revision as of 17:07, 8 February 2007


Trusted Computing (commonly abbreviated TC) is a technology developed and promoted by the Trusted Computing Group (TCG). The term is taken from the field of trusted systems and has a specialized meaning. "Trusted computing" means that the computer will consistently behave in specific ways, and those behaviors will be enforced by hardware and software. Note that "trusted" does not imply any specific behavior - security and consistency are the most important attributes of trust. In this technical sense, "trusted" does not necessarily have the same definition as "trustworthy", because "trustworthiness" generally implies more than just security and consistency.

Trusted Computing is controversial. Advocates of the technology (like the International Data Corporation [3], the Enterprise Strategy Group [4] and Endpoint Technologies Associates [5]) claim that it will make computers safer, less prone to viruses and malware, and thus more reliable from an end-user perspective. Further, they state that Trusted Computing will allow computers and servers to offer improved computer security over that which is currently available. Opponents (like the Electronic Frontier Foundation and the Free Software Foundation) believe that trust in the underlying companies is not deserved and that the technology puts too much power and control into the hands of those who design systems and software. They also believe that it potentially forces consumers to lose anonymity in their online interactions, as well as mandating technologies that many have no pressing need for. Finally, TC is seen as a possible enabler for future versions of document and copy protection - which are of value to corporate and other users in many markets but which, to critics, raise concerns about undue censorship.

Some prominent security experts[1][2] have spoken out against Trusted Computing, as they believe it will provide computer manufacturers and software authors with increased control to impose restrictions on what users are able to do with their computers. There are concerns that TC would have (or may even covertly be intended to have) a large anti-competitive effect on the free software market, private software development, and the IT market in general. Some, such as Richard Stallman, have suggested the backronym treacherous computing for these reasons.[6] Regardless of the debate and the form of the final products, major influences in computing, such as chip manufacturers Intel and AMD, computer manufacturers such as Dell and Apple, and systems software developers such as Microsoft, plan to include TC in coming generations of products.[3][4][5] The U.S. Army requires that every new small PC it purchases come with a Trusted Platform Module (TPM).[6][7] According to the International Data Corporation, by 2010 essentially all portable PCs and the vast majority of desktops will include a TPM chip.[8]

The nature of trust

Security experts define a trusted system as one which is required to be trusted for the security of a larger system to hold. For example, the United States Department of Defense's definition of a trusted system is one which could break security policy if it misbehaved; i.e., "a system that you have chosen to trust, possibly out of necessity." Cryptographer Bruce Schneier observes, "A 'trusted' computer does not mean a computer that is trustworthy." By these definitions, a hard drive controller must be trusted by its users to genuinely save to the drive, in every case, the data it is meant to save, and a secure website must be trusted to be secure, because a user cannot verify this for themselves. Trust in security parlance is always a kind of compromise or weakness—sometimes inevitable, but never desirable as such. As another analogy, your best friend cannot share your medical records, since he or she does not have them. On the other hand, your doctor can, and does (legal issues with doing so aside). It is possible that you trust your doctor and think he or she is a fine person; it is also possible that there is only one doctor in your town, so you are forced to trust him or her.

The main controversy around trusted computing concerns this meaning of trust. The Trusted Computing Group describes "technical trust" as follows: "an entity can be trusted if it always behaves in the expected manner for the intended purpose". Critics characterize a trusted system as one you are forced to trust rather than one which is particularly trustworthy.

There is also concern among critics that it will not always be possible to examine the hardware component on which Trusted Computing relies: the Trusted Platform Module, the hardware system in which the core 'root' of trust in the platform has to lie. If not implemented correctly, it presents a security risk to overall platform integrity and protected data. The specifications, as published by the Trusted Computing Group, are open and available for anyone to review. However, the final implementations by commercial vendors will not necessarily be subjected to the same review process.

A final concern is that the world of cryptography can often move quickly, and that hardware implementations of algorithms might create an inadvertent obsolescence.

While proponents claim that trusted computing increases security, critics counter that not only will security not be helped, but trusted computing will facilitate mandatory digital rights management (DRM), harm privacy, and impose other restrictions on users. Trusting networked computers to controlling authorities rather than to individuals may create digital imprimaturs. Contrast trusted computing with secure computing in which anonymity, not disclosure, is the main concern. Advocates of secure computing argue that the additional security can be achieved without relinquishing control of computers from users to superusers.

Proponents of trusted computing argue that privacy complaints have been addressed in the existing specifications - possibly as a result of criticism of early versions of the specifications. The Trusted Platform Module can be disabled; however, services that require remote attestation will not work without an enabled TPM.

Key Concepts

Trusted computing encompasses five key technology concepts, all of which are required for a fully trusted system.

  1. Endorsement Key
  2. Secure Input and Output
  3. Memory curtaining / Protected execution
  4. Sealed storage
  5. Remote attestation

Endorsement Key

"The endorsement key is a 2,048-bit RSA public and private key pair, which is created randomly on the chip at manufacture time and cannot be changed. The private key never leaves the chip, while the public key is used for attestation and for encryption of sensitive data sent to the chip, as occurs during the TPM_TakeOwnership command."

(David Safford, [7]) This key is used to allow the execution of secure transactions: every TPM is required to sign a random number, using a particular protocol created by the Trusted Computing Group (the Direct Anonymous Attestation protocol), in order to prove its compliance with the TCG standard and to prove its identity; this makes it impossible for a software TPM emulator to start a secure transaction with a 'trusted' entity. The TPM is designed to prevent the extraction of this key by hardware analysis.
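The challenge-response idea behind the endorsement key can be sketched as follows. This is a deliberately simplified, hypothetical illustration: a real TPM signs with a 2,048-bit RSA private key and uses the full Direct Anonymous Attestation protocol, whereas this standard-library sketch models "a secret only the chip holds" with an HMAC key.

```python
import hashlib
import hmac
import os

# Stand-in for the secret held only inside the chip. (A real TPM uses a
# 2,048-bit RSA private key created at manufacture time; HMAC over a
# secret key is a standard-library stand-in for "signing".)
CHIP_SECRET = os.urandom(32)

def tpm_sign(nonce: bytes) -> bytes:
    """'Inside the chip': respond to a challenge. The secret never leaves."""
    return hmac.new(CHIP_SECRET, nonce, hashlib.sha256).digest()

def verifier_check(nonce: bytes, response: bytes) -> bool:
    """Verifier confirms the response could only come from the genuine chip."""
    expected = hmac.new(CHIP_SECRET, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = os.urandom(16)                          # the random challenge
assert verifier_check(nonce, tpm_sign(nonce))   # genuine chip passes
assert not verifier_check(nonce, b"\x00" * 32)  # an emulator without the key fails
```

Because the response is bound to a fresh random nonce, recording an old response and replaying it later does a software emulator no good.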

Secure I/O

Secure input and output (I/O) refers to a protected path between the computer user and the software with which they believe they are interacting. On current computer systems there are many ways for malicious software to intercept data as it travels between a user and a software process - for example, keyboard loggers and screen-scrapers. Secure I/O provides a hardware- and software-protected, verified channel, using checksums to confirm that the software used to do the I/O has not been tampered with. Malicious software injecting itself into this path could then be identified.
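The checksum idea mentioned above can be sketched in a few lines: a known-good hash of an I/O component is compared against the hash of the code actually present, so tampering on the input path is detected. The "driver" bytes and names here are invented placeholders; a real system would hash the installed binaries.

```python
import hashlib

# Hypothetical integrity check in the spirit of secure I/O: before
# trusting a component on the keyboard-to-application path, compare its
# checksum against a known-good value recorded when the system was clean.
KNOWN_GOOD = hashlib.sha256(b"pristine keyboard driver v1.0").hexdigest()

def verify_io_component(binary: bytes) -> bool:
    """Return True only if the I/O software is byte-for-byte untampered."""
    return hashlib.sha256(binary).hexdigest() == KNOWN_GOOD

assert verify_io_component(b"pristine keyboard driver v1.0")                      # clean path
assert not verify_io_component(b"pristine keyboard driver v1.0" + b"+keylogger")  # tampered
```

Any byte injected into the component, such as an embedded keylogger, changes the hash and fails the check.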

Memory curtaining

Memory curtaining extends the current memory protection techniques to provide full isolation of sensitive areas of memory — for example locations containing cryptographic keys. Even the operating system doesn't have full access to curtained memory, so the information would be secure from an intruder who took control of the OS.

Sealed storage

Sealed storage protects private information by encrypting it with a key derived from the software and hardware being used. This means the data can be read only by the same combination of software and hardware. For example, a user who keeps an unlicensed song on their computer would be unable to play it. Currently, a user can locate the song, listen to it, send it to someone else, play it in the software of their choice, or back it up (in some cases, using circumvention software such as hymn to decrypt it). Alternatively, the user may use software to modify the operating system's DRM routines so that it leaks the song data once, say, a temporary license has been acquired. With sealed storage, the song is securely encrypted so that only the unmodified and untampered music player on his or her computer can play it.
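The sealing idea can be sketched as deriving the encryption key from measurements of the running software and hardware, so a changed platform derives a different key and cannot unseal the data. This is an illustrative sketch, not the TPM's actual sealing algorithm: the measurement strings are invented, and XOR stands in for a real cipher.

```python
import hashlib

# Derive the sealing key from measurements of the software/hardware stack,
# so only the same combination can re-derive it.
def seal_key(measurements: list[bytes]) -> bytes:
    h = hashlib.sha256()
    for m in measurements:
        h.update(m)
    return h.digest()

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher (stand-in for a real one); XOR is its own inverse."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

platform = [b"BIOS v1.2", b"OS build 5042", b"player v3"]
tampered = [b"BIOS v1.2", b"OS build 5042", b"modified player"]

sealed = xor_crypt(b"song data", seal_key(platform))
assert xor_crypt(sealed, seal_key(platform)) == b"song data"   # same stack unseals
assert xor_crypt(sealed, seal_key(tampered)) != b"song data"   # changed software gets garbage
```

Swapping in a modified player changes one measurement, the derived key, and therefore the decrypted bytes; the data never unseals for the tampered stack.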

Remote attestation

Remote attestation allows changes to the user's computer to be detected by authorized parties. That way, software companies can avoid users tampering with their software to circumvent technological protection measures. It works by having the hardware generate a certificate stating what software is currently running. The computer can then present this certificate to a remote party to show that its software hasn't been tampered with.

Remote attestation is usually combined with public-key encryption so that the information sent can only be read by the programs that presented and requested the attestation, and not by an eavesdropper, such as the computer owner.

To take the song example again, the user's music player software could send the song to other machines, but only if they could attest that they were running a secure copy of the music player software. Combined with the other technologies, this provides a more secured path for the music: secure I/O prevents the user from recording it as it is heard on the speakers, memory curtaining prevents it from being dumped to regular disk files as it is being worked on, sealed storage curtails unauthorized access to it when saved to the hard drive, and remote attestation protects it from unauthorized software even when it is used on other computers.

Applications for Trusted Computing

Protecting hard-drive data after theft

The Enterprise and Ultimate editions of Windows Vista make use of a Trusted Platform Module to facilitate BitLocker Drive Encryption.[9] The Trusted Platform Module is used to securely bootstrap and access decryption keys for volume-level hard drive encryption. This is done via the Trusted Platform Module's Platform Configuration Registers. As the computer starts up, a series of validations occur on the BIOS, the master boot record, the boot sector and so on, until the decryption keys can be retrieved from the Trusted Platform Module and used to decrypt the hard drive as needed. This use of the TPM mitigates some attacks on the data of a stolen or lost laptop, such as simply plugging the hard drive into a different system, booting a different operating system, or attempting to modify the boot code.
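The chain of validations during startup can be modeled as the TPM's "extend" operation folding each boot stage into a Platform Configuration Register: PCR_new = SHA1(PCR_old || SHA1(stage)). This is a simplified model, and the stage contents below are placeholders, but it shows why the decryption keys are released only after an unmodified boot sequence.

```python
import hashlib

# Simplified model of the measured-boot chain behind BitLocker's TPM use.
def extend(pcr: bytes, stage: bytes) -> bytes:
    """TPM extend: fold the hash of a boot stage into the register."""
    return hashlib.sha1(pcr + hashlib.sha1(stage).digest()).digest()

def measure_boot(stages: list[bytes]) -> bytes:
    pcr = b"\x00" * 20                     # PCRs reset to zeros at power-on
    for stage in stages:
        pcr = extend(pcr, stage)
    return pcr

good_boot = [b"BIOS", b"master boot record", b"boot sector"]
# The same boot path always yields the same PCR value, so the TPM can be
# configured to release the volume decryption key only for this value.
assert measure_boot(good_boot) == measure_boot(good_boot)
# Any modified stage changes the final PCR, and the key stays locked.
assert measure_boot([b"BIOS", b"evil MBR", b"boot sector"]) != measure_boot(good_boot)
```

Because each extend chains over the previous register value, no later stage can erase the evidence that an earlier stage was tampered with.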

The Enforcer is a Linux Security Module designed to improve integrity of a computer running Linux by ensuring no tampering of the file system. It can interact with 'trusted' hardware to provide higher levels of assurance for software and sensitive data. The Enforcer can also work with the TPM to store the secret to an encrypted loopback file system, and unmount this file system when a tampered file is detected; the secret will not be accessible to mount the loopback file system until the machine has been rebooted with untampered files. This allows sensitive data to be protected from an attacker.

Possible applications for Trusted Computing

Digital rights management

Trusted Computing would allow companies to create an almost unbreakable DRM system. An example is downloading a music file. Remote attestation could be used so that the music file would refuse to play except on a specific music player that enforces the record company's rules. Sealed storage would prevent the user from opening the file with another player or another computer. The music would be played in curtained memory, which would prevent the user from making an unrestricted copy of the file while it's playing, and secure I/O would prevent capturing what is being sent to the sound system.

Tackling cheating in on-line games

Trusted computing could be used to combat cheating in multiplayer on-line games. Some players modify their game copy in order to gain unfair advantages in the game; remote attestation, secure I/O and memory curtaining could be used to verify that all players connected to a server were running an unmodified copy of the software.

Protection from identity theft

Trusted Computing could be used to prevent identity theft. Take, for example, online banking. Remote attestation could be used when the user connects to the bank's server, which would serve the page only if it could produce the correct certificates. The user can then send his encrypted account number and PIN with some assurance that the information is private to him and the bank.

Protection from viruses and spyware

Digital signature of software will allow users to identify applications modified by third parties that could add spyware to the software. For example, a website offers a modified version of a popular instant messenger that contains spyware as a drive-by download. The operating system could notice the lack of a valid signature for these versions and inform the user that the program has been modified. Trusted computing could also stop attacks by viruses. However, Microsoft has denied that this functionality will be present in its NGSCB architecture. Trusted computing could also be used by antivirus vendors to write antivirus software that couldn't be corrupted by virus attacks.

Protection of biometric authentication data

biometrics ATM in South Korea

Biometric devices used for authentication could use trusted computing technologies (memory curtaining, secure I/O) to assure the user that no spyware installed on his or her PC is able to steal sensitive biometric data. The theft of this data could be extremely harmful to the user because, while a user can change a password if he or she knows that the password is no longer secure, a user cannot change the data generated by a biometric device.[8]

Verification of remote computation for grid computing

Trusted computing could be used to guarantee that participants in a grid computing system are returning the results they claim to be returning, rather than forging them. This would allow large-scale simulations (say, a climate simulation) to be run without expensive redundant computations to guarantee that malicious hosts are not undermining the results to achieve the conclusion they want.[10]

Disputes and criticism of trusted computing

Opponents of trusted computing point out that the security features that protect computers from viruses and attackers also restrict the actions of their owners. They argue that this makes new anti-competitive techniques possible, which may hurt the people who buy trusted computers.

The Cambridge cryptographer Ross Anderson has great concerns that "TC can support remote censorship [...] In general, digital objects created using TC systems remain under the control of their creators, rather than under the control of the person who owns the machine on which they happen to be stored (as at present) [...] So someone who writes a paper that a court decides is defamatory can be compelled to censor it — and the software company that wrote the word processor could be ordered to do the deletion if she refuses. Given such possibilities, we can expect TC to be used to suppress everything from pornography to writings that criticise political leaders." He goes on to state that:

"[...] software suppliers can make it much harder for you to switch to their competitors' products. At a simple level, Word could encrypt all your documents using keys that only Microsoft products have access to; this would mean that you could only read them using Microsoft products, not with any competing word processor."
"The [...] most important benefit for Microsoft is that TC will dramatically increase the costs of switching away from Microsoft products (such as Office) to rival products (such as OpenOffice). For example, a law firm that wants to change from Office to OpenOffice right now merely has to install the software, train the staff and convert their existing files. In five years' time, once they have received TC-protected documents from perhaps a thousand different clients, they would have to get permission (in the form of signed digital certificates) from each of these clients in order to migrate their files to a new platform. The law firm won't in practice want to do this, so they will be much more tightly locked in, which will enable Microsoft to hike its prices."

Anderson summarizes the case by saying "The fundamental issue is that whoever controls the TC infrastructure will acquire a huge amount of power. Having this single point of control is like making everyone use the same bank, or the same accountant, or the same lawyer. There are many ways in which this power could be abused."

The Free Software Foundation

One of the biggest criticisms of the Trusted Computing Group comes from the Free Software Foundation, the foundation responsible for developing the GNU operating system (used widely in its GNU/Linux variant, often referred to simply as Linux) and for work on several other technology-related issues. The Free Software Foundation frequently refers to Trusted Computing as Treacherous Computing in its articles. Its website provides an example animated short that explains negative aspects of Trusted Computing.[9]

Irrespective of an individual's stance on trusted computing, it is foreseeable that Trusted Computing will become a major battle for computer freedoms and liberties within the next 10 to 15 years.

Users can't change software

Consider a user's private diary: sealed storage protects the diary from malicious programs like viruses, but it does not distinguish between those and useful programs, like ones that might be used to convert the diary to a new format, or provide new methods for searching within the diary. A user who wanted to switch to a competing diary program might find that it would be impossible for that new program to read the old diary, as the information would be "locked in" to the old program. It could also make it impossible for the user to read or modify his or her diary except as specifically permitted by the diary software. If he or she were using diary software with no edit or delete option, then it could be impossible to change or delete previous entries.

Remote attestation could cause other problems. Currently web sites can be visited using a number of web browsers, though certain websites may be formatted (intentionally or not) such that some browsers cannot decipher their code. Some browsers have found a way to get around that problem by emulating other browsers. For example, when Microsoft's MSN website briefly refused to serve pages to non-Microsoft browsers, users could access those sites by instructing their browsers to emulate a Microsoft browser. Remote attestation could make this kind of emulation irrelevant, as sites like MSN could demand a certificate stating the user was actually running an Internet Explorer browser.

Users don't control information they receive

One of the early motivations behind trusted computing was a desire by media and software corporations for stricter Digital Rights Management (DRM): technology to prevent users from freely sharing and using potentially copyrighted or private files without explicit permission. Microsoft has announced a DRM technology, PVP-OPM, that it says will make use of hardware encryption.

Trusted computing can be used for DRM. An example could be downloading a music file from a band: the band's record company could come up with rules for how the band's music can be used. For example, they might want the user to play the file only three times a day without paying additional money. Also, they could use remote attestation to only send their music to a music player that enforces their rules: sealed storage would prevent the user from opening the file with another player that did not enforce the restrictions. Memory curtaining would prevent the user from making an unrestricted copy of the file while it's playing, and secure output would prevent capturing what is sent to the sound system.

Once digital recordings are converted to analog signals, the (possibly degraded) signals could be recorded by conventional means, such as by connecting an audio recorder to the card instead of speakers, or by recording the speaker sounds with a microphone. Even trusted computing cannot defeat the analog hole.

Without remote attestation, this problem would not exist. The user could simply download the song with a player that did not enforce the DRM restrictions, or one that lets him convert the song to a normal "unrestricted" format such as MP3 or Vorbis.

Censorship

The use of 'trusted' applications whose source is not public (so that the user does not know what the application actually does) makes new forms of censorship possible. Moreover, these applications could be designed to alter their functionality after a certain date, or to 'upgrade' themselves without being authorized to do so by the user. For example, a newspaper could require its readers to download a 'trusted' application in order to access its articles. This program could force the user to read the 'latest version' of an article, effectively enabling the author of the article to deny access to older versions. In this way, the newspaper editor could "rewrite history" by changing or deleting articles. Even if an old version of an article is present on the user's hard disk, the program could refuse to read it. Censorship on the web could also be implemented using 'trusted computing': a 'trusted' browser could 'upgrade' itself and deny the user access to certain web sites present in a blacklist written by the browser's author.

Users don't control their data

One commonly stated criticism of Trusted Computing is that sealed storage could prevent users from moving sealed files to a new computer. This limitation might exist either through poor software design or through deliberate limitations placed by content creators. The migration section of the TPM specification requires that it be impossible to move certain kinds of files except to a computer with the identical make and model of security chip. If an old model of chip is no longer produced, it becomes impossible to move the data to a new machine at all; the data is forced to die along with the old computer.

Moreover, critics are concerned that TPM is technically capable of forcing spyware onto users, with e.g. music files only enabled on machines that attest to informing an artist or record company every time the song is played.

Loss of Internet anonymity

Because a TC-equipped computer is able to uniquely attest to its own identity, it will be possible for vendors and others who possess the ability to use the attestation feature to zero in on the identity of the user of TC-enabled software with a high degree of certainty.

Such a capability is contingent on the reasonable chance that the user at some time provides user-identifying information, whether voluntarily or indirectly. One common way that information can be obtained and linked is when a user registers a computer just after purchase. Another common way is when a user provides identifying information to the website of an affiliate of the vendor.

While proponents of TC point out that online purchases and credit transactions could potentially be more secure as a result of the remote attestation capability, this may cause the computer user to lose expectations of anonymity when using the Internet.

Critics point out that this could have a chilling effect on political free speech, the ability of journalists to use anonymous sources, whistleblowing, political blogging and other areas where the public needs protection from retaliation through anonymity.

In response to privacy concerns, researchers developed direct anonymous attestation which allows a client to perform attestation while limiting the amount of identifying information that is provided to the verifier.

The question of practicality

It has also been compellingly argued that many of the assumptions which underlie TC are impractical in the real world.

Any hardware component, including the TC hardware itself, has the potential to fail, or be upgraded and replaced. A user might rightfully conclude that the mere possibility of being irrevocably cut off from access to his or her own information, or to years' worth of expensive work-products, with no opportunity for recovery of that information, is unacceptable. Legal restrictions on the use and dissemination of information, or mandating its reliable storage for a period of time that may extend many years into the future, may also, it has been argued, preclude the practical application of TC technology in many of the ways now contemplated. The concept of basing ownership or usage restrictions upon the verifiable identity of a particular piece of computing hardware may be perceived by the consumer as inadequately answering the question, "what do I do when it breaks?"

Technical issues

Trusted Computing requires that all software and hardware vendors follow the technical specifications released by the Trusted Computing Group in order to allow interoperability between different trusted software stacks. However, even now there are interoperability problems between the TrouSerS trusted software stack (released as open-source software by IBM) and HP's stack (as explained in the TrouSerS FAQ). Another problem is that the technical specifications are still changing, so it is unclear which is the 'standard' implementation of the trusted stack.

Suggestion for Owner Override

Some opponents of TC think that most of these problems come up because trusted computing "protects" programs against everything, even the owner and whatever software he might be running. They've proposed allowing the owner of the computer to override these protections. This is called owner override.

Activating owner override would allow the computer to use the secure I/O path to make sure the owner is physically present and then to bypass restrictions. Such an override would allow remote attestation to a user's specification, e.g., to create certificates that say Internet Explorer is running even if a different browser is used. Instead of preventing software change, remote attestation would indicate when the software has been changed without the owner's permission.

Trusted Computing Group members have refused to implement owner override.[10] Proponents of trusted computing believe that owner override defeats the trust in other computers, since remote attestation can be forged by the owner. Owner override offers the security and enforcement benefits to a machine owner, but does not allow him to trust other computers, because their owners could waive rules or restrictions on their own computers. Under this scenario, once data is sent to someone else's computer, whether it be a diary, a DRM music file, or a joint project, that other person controls what security, if any, their computer will enforce on their copy of those data. This has the potential to undermine the applications of trusted computing to enforce digital rights management, control cheating in online games, and attest to remote computations for grid computing.

According to the Electronic Frontier Foundation, one of the fundamental premises behind trusted computing is that the owner and his software cannot be trusted.[11] It is assumed that the user will — through negligence or willful intent — take actions that may result in compromising his own system. For example, an IT administrator could not ensure that notebook computers are running a specified operating system. An alternative approach would be to require a key, provided with the computer, in order to engage the override. This would require that the override be authorized by the owner rather than by merely anyone with physical access to the computer, and it might satisfy the IT administrator, but it would also allow users to connect insecure programs to the network. It would also reduce the usefulness of Trusted Computing in digital rights management or other applications that require remote attestation against the will of the computer owner.

Hardware and Software support for TPMs

  • Some Apple computers with Intel processors include a TPM module.
  • Since 2004, most major manufacturers have shipped systems (usually laptops) that have included Trusted Platform Modules, with associated BIOS support.[12] In accordance with the TCG specifications, the user must enable the Trusted Platform Module before it can be used.
  • The Linux kernel has included trusted computing support since version 2.6.13, and there are several projects to implement trusted computing for Linux. In January 2005, members of Gentoo Linux's "crypto herd" announced their intention of providing support for TC - in particular support for the Trusted Platform Module.[13] There is also a TCG-compliant software stack for Linux named TrouSerS, released under an open source license.
  • Some limited form of trusted computing can be implemented on current versions of Microsoft Windows with third-party software.
  • The Intel Classmate PC (a competitor to the One Laptop Per Child) includes a Trusted Platform Module[14]

See also

References

  1. ^ ZDNet
  2. ^ Schneier
  3. ^ "TPMs [Trusted Platform Modules] from various semiconductor vendors are included on enterprise desktop and notebook systems from Dell and other vendors" [1]
  4. ^ "Among other things, Apple uses the hardware component of Trusted Computing, known as the Trusted Platform Module (TPM), to verify that the company's PowerPC-to-Intel interpreter only works on authentic Apple hardware." [http://www.securityfocus.com/brief/270 Apple makes trusted computing cool (Security Focus)]
  5. ^ "Windows Vista provides a set of services for applications that use TPM technologies." [2]
  6. ^ U.S. Army requires trusted computing
  7. ^ strategic goal n. 3 , "deliver a joint netcentric information that enables warfighter decision superiority" , October 2006
  8. ^ Microsoft's leaner approach to Vista security
  9. ^ "AES-CBC + Elephant diffuser: A Disk Encryption Algorithm for Windows Vista" (PDF). Microsoft TechNet.
  10. ^ Innovations for Grid Security From Trusted Computing.
  11. ^ Schoen, Seth (2003). "Trusted Computing:Examples of Abuse of Remote Attestation:Part 4.Computer Owner as Adversary?" (PDF). Trusted Computing: Promise and Risk. Retrieved 2006-03-13.
  12. ^ Tony McFadden (March 26, 2006). "TPM Matrix". Retrieved 2006-05-05.
  13. ^ "Trusted Gentoo". Gentoo Weekly Newsletter. January 31, 2005. Retrieved 2006-05-05.
  14. ^ Intel (December 6, 2006). "Product Brief: Classmate PC" (PDF). Retrieved 2007-01-13.

External links

official sites

  • Trusted Computing Group (TCG) — Trusted computing standards body, previously known as the TCPA.
  • Trusted Mobile Platform - a set of specifications that define security features for mobile devices, jointly developed by IBM, Intel, and NTT DoCoMo.
  • TCG products page: information on TCG members' TCG-related products and services

software that uses trusted computing

opponents of trusted computing

proponents of trusted computing

Other
