Talk:Subsumption architecture


I've just deleted this

This AI model may have been invented by Alexandre Parodi, and was first used in a real robot as part of the FMC/CEL (now United Defense) AVTB/AGVT program in 1984[citation needed].
  • J. J. Nitao and A. M. Parodi (1986), "A Real-Time Reflexive Pilot for an Autonomous Land Vehicle", IEEE Control Systems, vol. 6, no. 1, February, pp. 14–23.

Here is the actual quote about Nitao and Parodi's architecture from that paper. Anyone who actually reads any paper on subsumption architecture will see immediately how far off this is.

Currently the FMC architecture consists of a Planner, Observer, Mapmaker and Pilot, and the vehicle control subsystems. Included in the Observer is the sensor subsystem... The system is now mainly hierarchical, but will become more heterarchical as each expert shares information from its area of responsibility with the other subsystems. (p. 14)

This sounds like a cross between GOFAI and blackboard architectures. Arguably, SA is like the latter, but all the main points of SA are missed except for those shared with any modular system. Also, the publications came out the same year, so presumably the two systems were in development at the same time. There is no evidence this work was known in 1984.--Jaibe 22:47, 30 October 2006 (UTC)

disambiguating modules

Someone deleted the modules link and called it disambiguating. I looked at the ambiguous modules page, and at least half of the links there are relevant to the use of modularity in AI, so I think it makes sense to link to the whole page. Unless someone wants to start a special page on modules in AI, but I don't know what that would have beyond modularity in programming, maths and mind.--Jaibe 22:26, 29 November 2006 (UTC)

Feedback is given mainly through the environment.

What does this mean: "Feedback is given mainly through the environment."? Few internal sensors? —Preceding unsigned comment added by 91.89.243.196 (talk) 18:13, 5 November 2007 (UTC)

It means that, basically, the robot has no knowledge before exploring: it has no built-in map of the area, for example, so it must explore and use the environment to learn what is going on around it. It has a set of rules but not facts. Ολίβια (talk) 14:30, 11 October 2008 (UTC)
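The "rules but not facts" point can be sketched in code: a purely reactive controller keeps no map or world model, only condition-action rules evaluated against fresh sensor readings each cycle. This is an illustrative sketch, not anything from the article; the behaviour names and sensor fields are invented.

```python
# Minimal sketch of a reactive controller with no stored world model.
# The only "knowledge" is the rules below; everything about the world
# (what is nearby, whether we hit something) comes from the environment
# via sensors on every cycle. All names here are hypothetical.

def reactive_step(sensors):
    """Choose an action from the current sensor snapshot alone."""
    if sensors["bumper"]:            # already touching an obstacle
        return "reverse"
    if sensors["range_cm"] < 30:     # obstacle close ahead
        return "turn"
    return "forward"                 # default behaviour

# One cycle: read the environment, react. No map is built or consulted.
readings = {"bumper": False, "range_cm": 20}
print(reactive_step(readings))
```

Note that the controller is stateless between cycles, which is exactly why its feedback has to come "through the environment" rather than from internal bookkeeping.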

this is upside-down

When I read this I have to constantly swap "lower" with "higher" in my head. The condition that gets checked first is on top; thus, the more abstract, longer-term goal-oriented production rules are on the bottom: if a then b, else if c then d, else... etc. It's upside-down. Kevin Baastalk 14:12, 29 December 2009 (UTC)
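The ordering complaint above can be made concrete: written as the if/else chain the comment describes, the first condition checked wins, so the most reactive, highest-priority rules sit at the top of the chain while the abstract, goal-directed ones fall through to the bottom. A minimal sketch of such a priority arbiter, with invented behaviour names (not from any particular subsumption implementation):

```python
# Sketch of a priority arbiter as an if/else chain: the first matching
# condition wins, so the reactive rules are checked first (top of the
# chain) and the long-term goal-directed rule is the final fallback.
# Behaviour names and sensor fields are hypothetical.

def arbitrate(sensors):
    # Checked first: collision avoidance, the most reactive rule.
    if sensors["range_cm"] < 30:
        return "avoid"
    # Checked next: wander when there is nothing to avoid.
    if sensors["bored"]:
        return "wander"
    # Bottom of the chain: abstract, longer-term goal-directed behaviour,
    # reached only when no more urgent condition fired.
    return "seek_goal"
```

In the usual subsumption diagrams this fallback behaviour is drawn as a layer, which is why reading the article alongside the if/else form can feel inverted.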

India Education Program course assignment

This article was the subject of an educational assignment at College Of Engineering Pune supported by Wikipedia Ambassadors through the India Education Program during the 2011 Q3 term. Further details are available on the course page.

The above message was substituted from {{IEP assignment}} by PrimeBOT (talk) on 20:04, 1 February 2023 (UTC)