Talk:Database normalization

WikiProject Computing (Start-class): This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks. This article has been rated as Start-class on Wikipedia's content assessment scale; it has not yet received a rating on the project's importance scale.
WikiProject Databases (Start-class, inactive): This article is within the scope of WikiProject Databases, a project which is currently considered to be inactive. This article has been rated as Start-class on Wikipedia's content assessment scale.

Archives: 1 2 3

Karnaugh map

How would I know if I had reached the most normalised/optimised stage? Is there any tool, like a Karnaugh map, that lists all permutations/combinations? Anwar (talk) 13:01, 22 May 2008 (UTC)[reply]

I don't know, but for small schemas (say, 20 relations or fewer) this is easy to see. A related issue is that there may be implicit dependencies that have not been marked explicitly in the database schema; determining those is not so easy, see e.g. [1] Rp (talk) 08:49, 7 July 2009 (UTC)[reply]
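There is no Karnaugh-map-style exhaustive tool in common use, but the standard algorithmic check is the attribute-closure test: a relation is in BCNF iff the left side of every nontrivial functional dependency is a superkey. A minimal sketch (the relation and dependencies below are invented for illustration, not from the discussion):

```python
def closure(attrs, fds):
    """Closure of a set of attributes under functional dependencies.

    fds is a list of (lhs, rhs) pairs of frozensets of attribute names.
    """
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            # If we already have all of lhs, we can derive rhs as well.
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return frozenset(result)

def is_bcnf(relation, fds):
    """BCNF check: every nontrivial FD X -> Y must have X as a superkey."""
    for lhs, rhs in fds:
        if rhs <= lhs:  # trivial dependency, ignore
            continue
        if closure(lhs, fds) != frozenset(relation):
            return False
    return True

# Example: R(A, B, C) with A -> B and B -> C.
# B -> C violates BCNF because B is not a superkey.
fds = [(frozenset("A"), frozenset("B")), (frozenset("B"), frozenset("C"))]
print(is_bcnf(set("ABC"), fds))  # False
```

The same closure routine also finds the implicit (derived) dependencies mentioned above: X → Y holds exactly when Y is contained in the closure of X.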

Trade-off

The article does not explain the trade-offs suffered with normalisation. For instance, a highly normalised database needs more tables, each stripped to the bare minimum. So serving a single query would require pulling data from several tables. This costs time and money. In a way, the business process is not optimised (though the database is!). Anwar (talk) 13:18, 22 May 2008 (UTC)[reply]

Do you have an article or a computing paper where this is explained, so we can incorporate it into the article? --Enric Naval (talk) 11:43, 24 May 2008 (UTC)[reply]
Normalization, and thus pulling data from several tables, doesn't cost significantly more time. In fact, a database that is not normalized will take much more CPU time and disk I/O to process, requiring more resources as query load increases. Then your business will either need to purchase more hardware, or re-architect its database and code base, and trust me, that costs a lot more. Common sense always dictates that you "do it right the first time". 71.231.132.75 (talk) 14:38, 24 December 2009 (UTC)[reply]

Denormalization

The statement, "It has never been proven that this denormalization itself provides any increase in performance, or if the concurrent removal of data constraints is what increases the performance." needs a citation to back up that argument, or it should be revised or removed. Volomike (talk) 12:53, 17 January 2009 (UTC)[reply]

It's wrong. Denormalization doesn't remove data constraints. It just means that instead of maintaining two tables, you maintain their join. This can be faster if you often need that join. Rp (talk) 21:13, 1 July 2009 (UTC)[reply]
PS the article should bother to explain this. Rp (talk) 21:19, 1 July 2009 (UTC)[reply]
Agreed. 99.60.1.164 (talk) 17:13, 22 August 2009 (UTC)[reply]
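The "maintain their join" point above can be sketched with sqlite3 (invented schema, for illustration only): the denormalized table stores the precomputed join, so reads need no JOIN, but one logical update now rewrites many physical rows.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Denormalized: the customer/order join is stored as one table,
    -- so the customer's name is repeated on every order row.
    CREATE TABLE order_flat (order_id INTEGER PRIMARY KEY,
                             customer_id INTEGER,
                             customer_name TEXT,
                             total REAL);
    INSERT INTO order_flat VALUES
        (10, 1, 'Ada', 5.0), (11, 1, 'Ada', 7.5), (12, 2, 'Grace', 3.0);
""")

# Reads are join-free: the join has already been materialized.
rows = con.execute("SELECT customer_name, total FROM order_flat").fetchall()

# ...but renaming one customer rewrites every row carrying the old name.
cur = con.execute(
    "UPDATE order_flat SET customer_name = 'Ada L.' WHERE customer_id = 1")
print(cur.rowcount)  # 2 rows rewritten for one logical change
```

Note the constraints question: nothing in this flat table stops two rows for customer 1 from carrying different names, which is the anomaly normalization rules out by construction.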

Request for normalization

Please help with http://strategy.wikimedia.org/wiki/Proposal:Assessment_content and http://strategy.wikimedia.org/wiki/Proposal_talk:Assessment_content#Normalization_of_assessment_items_.28questions.29_in_database 99.60.1.164 (talk) 01:37, 23 August 2009 (UTC)[reply]

Please note that [2] contains a list per question with each element containing two timestamps, so this is definitely a 6NF-level problem. An easier, and perhaps more important sub-requirement is the review system in [3]; in particular, this example outlines a sub-schema related to
a selection of text or a url (to a permanent article version or diff, etc.) could be an item for which multiple, randomly-selected reviewers chosen for their stated familiarity with a topic area would be selected. Those reviewers could be shown that text or url (perhaps as part of a list of such items) in a securely authenticated and captcha-ed channel. They would be asked to vote on the accuracy of the item, and have the opportunity to fully explain their votes in comments. If a statistically significant number of votes are in agreement, then the item could be approved as to veracity or rejected. When the votes are not in agreement, then additional voter(s) would perform a tie-breaking function. Each voter's track record in terms of agreement with other voters could be recorded secretly and used to (1) weight their vote to nullify defective voters, and/or (2) used to select whether the voter is allowed to become a tie-breaker.
The fields required to support that need to be added to [4]. 99.35.130.5 (talk) 18:00, 10 September 2009 (UTC)[reply]
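For what the "6NF-level" remark implies in practice: 6NF-style temporal designs typically store one attribute per table, keyed by the item plus a validity interval, which matches "each element containing two timestamps". A minimal sqlite3 sketch (the table and column names are assumptions, not taken from the linked proposal):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- 6NF-style: one non-key attribute per table,
    -- with a half-open [valid_from, valid_to) validity interval.
    CREATE TABLE question_text (question_id INTEGER,
                                valid_from TEXT,
                                valid_to TEXT,
                                body TEXT,
                                PRIMARY KEY (question_id, valid_from));
    INSERT INTO question_text VALUES
        (1, '2009-08-01', '2009-08-15', 'First wording'),
        (1, '2009-08-15', '9999-12-31', 'Revised wording');
""")

# The wording in force on a given date is found by interval lookup.
row = con.execute("""
    SELECT body FROM question_text
    WHERE question_id = 1 AND valid_from <= ? AND ? < valid_to
""", ('2009-08-20', '2009-08-20')).fetchone()
print(row[0])  # Revised wording
```

Each further timestamped attribute (status, reviewer assignment, vote tally) would get its own narrow table of the same shape.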

ugly tables

this article has the ugliest tables i've seen —Preceding unsigned comment added by 67.187.187.128 (talk) 00:42, 27 November 2009 (UTC)[reply]

You noticed that too? 71.231.132.75 (talk) 14:40, 24 December 2009 (UTC)[reply]