Essential systems analysis

From Wikipedia, the free encyclopedia

Essential systems analysis is a methodology for performing structured systems analysis based on the concept of event partitioning, published in 1984 by Stephen M. McMenamin and John F. Palmer.[1]

The essence of a system is "its required behavior independent of the technology used to implement the system".[2] It is a model of what the system must do, ideally saying nothing about how it will do it.[2]

The methodology[1] proposes that finding the true requirements for an information system entails developing an essential model of the system, based on the concept of a perfect internal technology composed of:

  • a perfect memory, which is infinitely fast and large, and
  • a perfect processor, which is infinitely powerful and fast.
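The idea of a perfect internal technology can be illustrated informally. The sketch below is not from McMenamin and Palmer's text; the event names, data shapes, and handler functions are hypothetical, chosen only to show the pattern: memory and processing are treated as ideal, and the system's required behavior is partitioned into one response per external event.

```python
# Illustrative sketch only (hypothetical example, not the methodology's notation):
# an essential model assumes a "perfect memory" with no capacity or speed
# limits, and describes *what* response each external event requires,
# not *how* it is implemented.

# Perfect memory: modelled here as a plain dict, assumed unbounded.
perfect_memory = {"orders": []}

# Event partitioning: one response function per external event.
def on_order_placed(memory, order):
    """Required response to the event 'customer places an order'."""
    memory["orders"].append(order)

def on_orders_queried(memory, customer):
    """Required response to the event 'customer asks for their orders'."""
    return [o for o in memory["orders"] if o["customer"] == customer]

# Events arrive from the environment; responses update or read the
# perfect memory. Nothing here constrains storage or performance.
on_order_placed(perfect_memory, {"customer": "Ada", "item": "widget"})
print(on_orders_queried(perfect_memory, "Ada"))
```

In a real essential model these responses would appear as processes on a data flow diagram and the perfect memory as data stores; the point of the idealization is that no process or store carries implementation detail.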

It was later adapted by Edward Yourdon to develop Modern Structured Analysis.[3]

The main result was a new, more systematic way to develop data flow diagrams, the most characteristic tool of structured analysis.

References

  1. ^ a b McMenamin, Stephen M.; Palmer, John F. (1984). Essential systems analysis. Yourdon Press. ISBN 978-0-917072-30-7.
  2. ^ a b Yourdon, Edward (2006). Just enough structured analysis. Ed Yourdon.
  3. ^ Yourdon, Edward (1989). Modern structured analysis. Englewood Cliffs, N.J.: Yourdon Press. ISBN 0-13-598624-9. OCLC 17877629.