redirects from Regression Tree
Yet does not really discuss regression trees (which are an analytical technique not a procedural technique) at all. If I wanted to find out what a regression tree or a classification tree was I would not find this article particularly helpful. — Preceding unsigned comment added by 18.104.22.168 (talk) 00:21, 21 December 2011 (UTC)
automatic creation???
All of the following had no connection at all with the article; it is totally out of context. What has it got to do with the subject? If it is a derivative of the subject, it should have its own encyclopedic entry showing the full path of its development and explanation, with an introduction giving context and meaning. If it can't be explained, it can't be accepted:
"Creation of decision nodes
Three popular rules are applied in the automatic creation of classification trees. The Gini rule splits off a single group of as large a size as possible, whereas the entropy and twoing rules find multiple groups comprising as close to half the samples as possible. Both algorithms proceed recursively down the tree until stopping criteria are met.
The Gini rule is typically used by programs that build ('induce') decision trees using the CART algorithm. Entropy (or information gain) is used by programs that are based on the C4.5 algorithm. A brief comparison of these two criteria can be seen under Decision tree formulae.
More information on automatically building ('inducing') decision trees can be found under Decision tree learning."
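For concreteness, the Gini and entropy criteria quoted above can be sketched as impurity functions over the class proportions at a node. This is a minimal illustration of my own (the function names are mine, not from the article):

```python
import math

def gini_impurity(proportions):
    # Gini impurity: 1 minus the sum of squared class proportions.
    return 1 - sum(p ** 2 for p in proportions)

def entropy(proportions):
    # Shannon entropy in bits; classes with proportion 0 contribute nothing.
    return sum(-p * math.log2(p) for p in proportions if p > 0)

# Both criteria score a pure node as 0 and a 50/50 node as maximal.
print(gini_impurity([1.0, 0.0]), entropy([1.0, 0.0]))  # 0.0 0.0
print(gini_impurity([0.5, 0.5]), entropy([0.5, 0.5]))  # 0.5 1.0
```

A splitting rule then picks the attribute whose split most reduces the weighted impurity of the child nodes relative to the parent.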
- You might be right, but you cannot simply erase the contents if they are valid information. See WP:PRESERVE. --Antonielly (talk) 15:39, 17 November 2008 (UTC)
- I believe that this article has a problem, since the word "decision tree" has different meanings in operations research and in machine learning, where we also call it a "classification tree". Unfortunately "classification tree" (a term which, I think, is not used in operations research) redirects to here. I think that "classification tree" should redirect to Decision tree learning, while this page has to either start with a paragraph describing the ambiguity, or at least the ambiguity of the term should be made clear(er) in the second paragraph. Most references at the end of the article actually refer to classification trees and are completely unrelated to the other information on the page. Janez Demsar (talk) 22:38, 21 July 2009 (UTC)
Mention where to acquire the software used to generate the images in this Wikipedia article
- Seems like the images were made with Insight Tree from http://www.visionarytools.com/ Buddelkiste (talk) 11:07, 22 March 2012 (UTC)
Disadvantages to decision trees
This article talks about the advantages of using decision trees, but shouldn't it also include the disadvantages? User:noneforall October 14, 2007
RE: Nothing said about decision trees ... Suggested Fix
I think the note below is right on -- I'm amazed this entry hasn't been fixed.
My thought on a fix is that most of the decision tree entry needs to be moved to a more specific category, maybe decision tree learning. The data mining form of decision tree learning could be linked from a corrected decision tree page, with a parenthetical note about the confusion over terminology. Influence diagrams and decision analysis need to be referenced. Etc. I agree... there's no definition of what the diagrams mean: their spacing, colors, numbers, etc. I'm going to add a template and flag this article. Maybe it will get fixed then. RCanine 14:21, 6 April 2007 (UTC)
Note
Someone should probably point out the Z criterion (sqrt(positive weight * negative weight)), which is used by AdaBoost (Schapire and Singer). Earlier, it was analyzed by Kearns and Mansour (IIRC) in the case where example weights are uniform, and they cited Quinlan as first proposing it.
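As a sketch of that criterion, here is my own illustration, assuming the domain-partitioning form from Schapire and Singer, where W+ and W- are the total weights of positive and negative examples reaching each branch of a candidate split:

```python
import math

def z_criterion(branches):
    # Z = 2 * sum over branches of sqrt(W_plus * W_minus); a split that
    # separates the classes well drives Z towards 0, so lower is better.
    return 2 * sum(math.sqrt(w_pos * w_neg) for w_pos, w_neg in branches)

# Hypothetical two-way split over examples with total weight 1:
# left branch gets (0.4 positive, 0.1 negative), right gets (0.1, 0.4).
print(z_criterion([(0.4, 0.1), (0.1, 0.4)]))  # roughly 0.8
```

A perfectly separating split (each branch purely one class) would give Z = 0, while a split that leaves both branches mixed 50/50 gives the worst score of 1.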
"...is a white box model" - Ahahahaha! The hilarity of the mental processes that led anyone to think up the concept of a "white box" has brightened my day.
Nothing said about decision trees as a decision aid
I'm shocked that there is no mention of decision trees as a decision aid - where the expected values of various choices are calculated. This is what I understand as a Decision Tree - the stuff about their use in data mining is only of secondary importance to my mind.
For example, a factory manager has to decide whether to invest in product A or product B (she cannot do both due to budget constraints). Product A is estimated to require two million pounds (or dollars if you like) of R&D investment, but there is only a 50% chance of the research being successful and a product being obtained. It will then have a 30% chance of making a $5M profit, a 40% chance of making a $10M profit, and a 30% chance of not selling at all and making a loss of $1M for the manufacturing costs. Product B, on the other hand, will cost $3M in R&D but has an 80% chance of making a $4M profit and a 20% chance of a $2M loss. If the company has a policy of maximising expected values, which should she go for?
This is just an example off the top of my head, but a more domestic example is of someone deciding whether to rent or buy their own house, along with a capital gain or loss depending on where house prices go and what the cost of renovation (or "fixing up", I think, in AmEng) will be.
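Folding back that factory example gives a concrete answer; here is a minimal sketch of the calculation (figures in millions, treating pounds and dollars interchangeably as the example itself does):

```python
def expected_value(outcomes):
    # Expected value of a chance node: sum of probability * payoff.
    return sum(p * v for p, v in outcomes)

# Product A: $2M R&D, 50% chance the research succeeds; if it does,
# 30% -> $5M profit, 40% -> $10M profit, 30% -> -$1M loss.
ev_a = 0.5 * expected_value([(0.3, 5), (0.4, 10), (0.3, -1)]) - 2

# Product B: $3M R&D, then 80% -> $4M profit, 20% -> -$2M loss.
ev_b = expected_value([(0.8, 4), (0.2, -2)]) - 3

print(f"EV(A) = {ev_a:.1f}M, EV(B) = {ev_b:.1f}M")
# An expected-value maximiser picks Product A (about +0.6M versus -0.2M).
```

This "rollback" from the leaves to the root, taking expectations at chance nodes and maxima at decision nodes, is exactly the decision-aid use of trees the comment is asking the article to cover.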
The most important part of the article has been left out!
I'd also like to add that the highly mathematical formal description of decision trees is not going to be understood by most readers. Articles like this need to start with a very simple example that everyone can understand. --22.214.171.124 15:08, 6 August 2006 (UTC)
Machine Learning
Decision trees are also important in machine learning, not just management science. It would be good to see this distinction elaborated on in the article. There also needs to be more elaboration (or links to other articles) on constructing decision trees; mentioning ID3 and C4.5 is a start. Also, what about the example provided? How is the threshold value of 70 chosen for humidity? This seems wrong.
Addition
I tried to add a piece of decision tree software to the list, as it was in keeping with the other links; why would informavores not qualify for entry on this page? —The preceding unsigned comment was added by Louharris (talk • contribs) 09:29, 3 April 2007 (UTC).
Example Confusing
I've done a few decision trees and I think the example given is confusing, especially without clarification on which colors mean what. I think perhaps a simpler example would be a nice way to start, in order to illustrate the principle. fsiler 19:36, 30 July 2007 (UTC)
A probability tree in maths is not a decision tree, it needs its own article
Probability tree redirects here but needs its own entry, as in maths it's something different: a diagram illustrating the possible outcomes of a series of events. It isn't a decision tree, because you can't decide the steps; they occur as the result of chance, e.g. a coin toss. The secondary school maths curriculum of numerous English-speaking countries, such as New Zealand, specifies this usage, and students will come to Wikipedia looking for information on them. Strayan (talk) 06:30, 30 June 2008 (UTC)
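For illustration, the kind of chance-only tree that curriculum covers can be sketched as follows (a minimal example of my own; every branching is a random event, not a choice):

```python
from itertools import product

# Two tosses of a fair coin: each leaf's probability is the product of
# the branch probabilities along its path from the root.
branch_prob = {"H": 0.5, "T": 0.5}
leaves = {"".join(path): branch_prob[path[0]] * branch_prob[path[1]]
          for path in product("HT", repeat=2)}

print(leaves)  # HH, HT, TH and TT each come out at 0.25
assert abs(sum(leaves.values()) - 1.0) < 1e-12  # leaves exhaust all outcomes
```

There is no node at which anyone decides anything, which is exactly why this diagram is a probability tree rather than a decision tree.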
Decision trees in data mining and business intelligence
A section should be added on decision tree software like that available in SAS Enterprise Miner. A good resource is Barry de Ville's Decision Trees for Business Intelligence and Data Mining: Using SAS Enterprise Miner. SAS Institute, Inc., Cary, NC, 2006. La9rsemar (talk) 18:30, 2 September 2009 (UTC)
- Please read my comment above. I guess you are confusing decision trees from operations research with decision trees from data mining, which are described on the page Decision tree learning. Janez Demsar (talk) 20:17, 4 September 2009 (UTC)
Online examples
Surely the online examples given aren't really the same as what is described in the article. Decision trees are used to help make a decision, not to help with navigation. —Preceding unsigned comment added by 126.96.36.199 (talk) 13:31, 30 January 2010 (UTC)
The text starts with 10 000 000 when the tree shows 33 333, and it explains a bold line going from nodes 1, 3, 5 and so on when numbers of that kind appear nowhere. The text should be removed and a new explanation written. —Preceding unsigned comment added by 188.8.131.52 (talk) 13:34, 21 April 2011 (UTC)
Decision tree software
- It could be Expression Tree, available here http://www.public.asu.edu/~kirkwood/DAStuff/programs/et.htm 184.108.40.206 (talk) —Preceding undated comment added 09:38, 25 April 2013 (UTC)
Article contains lots of material not relating to decision trees
Unless anyone has good reasons for objecting, I intend to delete the non-decision-tree material, and other material which appears to be personal research. I think the following should be deleted: the flow diagram (appears to be personal research), influence diagrams, utility preferences (the text does not connect them with decision trees), and references to AI and genetic algorithms (which are not referenced in the text as far as I can see), plus minor cleaning up. 220.127.116.11 (talk) 22:00, 20 April 2013 (UTC)