Object (computer science)


In software development, an object is an entity that has state, behavior, and identity.[1]: 78  An object can model some part of reality or can be an invention of the design process whose collaborations with other such objects serve as the mechanisms that provide some higher-level behavior. Put another way, an object represents an individual, identifiable item, unit, or entity, either real or abstract, with a well-defined role in the problem domain.[1]: 76 
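
A minimal sketch in TypeScript can make the three characteristics concrete (the class name Counter is illustrative, not taken from the cited sources): a field holds state, a method defines behavior, and two objects built identically are still distinct identities.

```typescript
// A minimal sketch of an object with state, behavior, and identity.
class Counter {
  private count = 0; // state

  increment(): void { // behavior
    this.count += 1;
  }

  value(): number {
    return this.count;
  }
}

const a = new Counter();
const b = new Counter();
a.increment();

console.log(a.value()); // 1
console.log(b.value()); // 0 -- same class, same initial state, different object
console.log(a === b);   // false -- identity is independent of state and class
```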

A programming language can be classified based on its support for objects. A language that provides an encapsulation construct for state, behavior, and identity is classified as object-based. If the language also provides polymorphism and inheritance, it is classified as object-oriented. A language that supports creating an object from a class is classified as class-based, while a language that supports object creation via a template object is classified as prototype-based.
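
As a hedged illustration of these distinctions (the names Shape, Square, and template are hypothetical), the TypeScript below shows inheritance and polymorphism, the hallmarks of the object-oriented classification, alongside prototype-based creation from a template object via Object.create:

```typescript
// Object-oriented: inheritance and polymorphism on top of encapsulation.
class Shape {
  area(): number {
    return 0;
  }
}

class Square extends Shape { // inheritance: Square reuses and extends Shape
  constructor(private side: number) {
    super();
  }
  area(): number { // polymorphism: overrides the inherited behavior
    return this.side * this.side;
  }
}

const shapes: Shape[] = [new Shape(), new Square(3)];
for (const s of shapes) {
  console.log(s.area()); // 0, then 9 -- dispatch follows the object's actual class
}

// Prototype-based: a new object is created from a template object, no class needed.
const template = {
  greet(): string {
    return "hello";
  },
};
const clone = Object.create(template); // clone delegates lookups to template
console.log(clone.greet()); // "hello"
```

TypeScript is class-based in its surface syntax, but it compiles to JavaScript's prototype-based object model, which is why both styles can be shown in one language.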

The concept of an object is used in many different software contexts.

See also

  • Actor model – Model of concurrent computation
  • Business object – Entity within a multi-tiered software application
  • Object lifetime – Time period between the creation and destruction of an object-oriented programming instance
  • Object copying – Techniques for copying an object in object-oriented programming
  • Semantic Web – Extension of the Web to facilitate data exchange

References

  1. Booch, Grady; Maksimchuk, Robert; Engle, Michael; Young, Bobbi; Conallen, Jim; Houston, Kelli (2007). Object-Oriented Analysis and Design with Applications (3rd ed.). Addison-Wesley Professional. ISBN 0-201-89551-X.
  2. Oppel, Andy (2005). SQL Demystified. McGraw Hill. p. 7. ISBN 0-07-226224-9.