Good science (IMHO) should include at least the following:
1. Ask questions (impertinent if necessary)
2. Think critically (especially regarding assumptions)
3. Tell the truth (all that is relevant even if somewhat unpalatable)
Who does science? Well, everybody does.
So why is science today falling into disrepute? (Witness the "don't know, don't care" T-shirts, etc.)
1. Scientific reports can be bought
2. Scientific reports can be ordered (political agendas)
3. Scientists can 'weld' everything to a pet theory (world view)
4. Scientists tend to have big egos which contribute to a reluctance to admit error
5. Scientific institutions (even the Nobel Prize Committee), like any other organisation such as a company or political party, by definition embody 'self preservation'; as such they have an 'image' to protect, and they do so vigorously, even at the expense of hiding uncomfortable truths.
All of the above of course turns up here! What I mean is that even wide publication, peer review and mainstream scientific community acceptance do not guarantee that good science is always at the top of the agenda.
Vh mby (talk) 00:54, 9 November 2010 (UTC)
So what is meant by 'truth'? Simply "stuff we know well enough to use with confidence"; it is, in the end, what constitutes our knowledge base. Does that mean it is 'absolutely true', or is there any room for 'uncertainty'? For us mere mortals there is always some uncertainty. The concept of 'absolute' truth, while not very useful, is indicated by the nature of truth itself, i.e. it is exclusive of all other proposals; we cannot have that level of certainty because we can never say we know everything. However knowledge, our truth, does exist, because it is how we got from the Wright Flyer to the Saturn V, or from the Sopwith Camel to Concorde. Engineering (applied science) is based on knowledge which is considered to be true; it is also an essential component of law, ethics, business, community life, family life, information systems, communications, etc. So how may we separate or distinguish this 'truth' from theory, conjecture or speculation?
Let me informally define a FACT as a VERIFIED OBSERVATION.
We now classify observations as either verified or unverified. Unverified contrary observations may be held in tension without destroying our idea (theory); however, it takes only one verified contrary observation to FALSIFY our theory. To establish truth we obviously need more verified observations (facts) than unverified observations. However, even the process of verification can be somewhat 'subjective'.
So now we must include and deal with the problem of UNCERTAINTY. It relates both to the observed facts and to the truth we want to establish; only then can we say we have a working definition of truth. When a number of individuals with relevant training and/or experience agree on the facts surrounding an issue, and collectively act on or use that knowledge, and this is repeated over a period of time by different individuals, we usually conclude we have the truth. However, in keeping with the well-established UNCERTAINTY PRINCIPLE, a small uncertainty always remains. I suggest we overcome this very small uncertainty with... FAITH! It is simply the best expression we have for what we all do when we need to TAKE ACTION based on our perception of what is TRUE. The safety of bungee jumping or aerobatic flight is soundly based on accepted truth about materials, processes, training, etc., but let me suggest that both require the exercise of a considerable degree of faith. So our 'formula' for truth becomes:
TRUTH = FACTS + FAITH (FAITH > 0 and FACTS >> 0)
Without facts your truth is just belief; without faith your truth would have to be absolute, which is impossible. Between these two extremes is a sliding scale of (% facts) vs (% faith), but both are always present.
Truth as defined thus has some interesting characteristics:
1. Since everyone does 'science', anyone may discover 'truth', and so we may conclude that no one actually owns or has exclusive rights to truth.
2. When we have found a truth in these terms we may conclude that all contrary proposals must be considered false, until at least one verifiable contrary fact is observed. This is the concept of falsification, which is necessary for good science.
3. There is no valid alternative to the search for truth and the process of verification or falsification.
4. The search for truth and the process of establishing it are independent of the subject matter and therefore apply to any area of knowledge. The only requirement is that verifiable observations (facts) may be established. Even the distinction between natural and supernatural has no bearing on this process.
Vh mby Gyroman (talk) 02:32, 5 November 2016 (UTC)
Ever since the discovery of the structure of DNA we have known that life may be characterised as 'information rich'. However, like the term complexity, 'information' of the type found in DNA is not very well defined, which may also have something to do with the fact that 'life' itself is not very well defined either. It is a crucial definition for any 'information rich' type of complexity, which by the way also includes man-made objects. The main focus of the literature on information is the quality of the signal, not its meaning (see Shannon). Physicists and cosmologists refer to information in the universe, meaning the event history of every particle in it. This, however, is not 'signal' or 'semantic' information of the type found in DNA and human design. This distinction is the basis of the SETI program, which postulates that only intelligence can be the source of semantic information. Intelligence is associated with life, and we know that intelligent life like ourselves is distinguished by the objects we make and the language we use in communication. In fact, the very tools required to make intergalactic transmissions must be specified by semantic information, so we may conclude that there is no difference between the semantic information in an intelligently designed object and that found in DNA, which is also a specification for a living creature.
This implies a single definition of semantic information should suffice for all occurrences of it. First, let us describe semantic information correctly:
All Semantic Information is a NON Repeating, NON Random, ORDERED Set
Like trying to define life, semantic information has some universal characteristics which are helpful in closing in on a definition.
1. It is expressed in symbolic codes which come from a finite alphabet
2. The codes are grouped according to a grammatical language
3. There is a method of writing
4. There is a method of reading
5. It has a meaning and a purpose
Since methods for writing and reading imply physical structures with a purpose, they must also be specified by semantic information! Which means the proper definition of 'semantic information' implies it is recursive: it cannot exist without prior semantic information. This presents a whole bag full of problems for both the naturalist and the atheist.
Following prior work on the subject we may understand the following:
Semantic Information is a symbolic representation of a material reality.
Ref: Dr Werner Gitt, "In the Beginning was Information"
This is at the very centre of the biggest problem in science today, and will no doubt continue to be so well into the future. It is the reason for the foregoing and pertinent to what follows. You only have to look at all the articles and talk pages on the subjects Entropy, Introduction to Entropy and Entropy (disambiguation) to appreciate there IS a problem. It is not because we do not know the answer, either. However, to answer why it is the case you must first answer the question: do you really want to know the truth? The search for truth is basic to all science, but it can never be assumed to actually apply. You and you alone must decide what is true.
The proper definition of entropy is a well known and well documented truth, which I will try to show here, but one that a rather large and influential group of individuals does not really want you to fully understand. So, from the Cambridge Encyclopedia of Technology: "Entropy: a measure of disorder". But according to Emeritus Professor of Chemistry Frank Lambert, "Entropy is not disorder". And there you have the problem.
You may see many attempts to show 'order' to be a subjective or analogous, purely human judgment which is not rigorous or measurable, and so not scientifically valid. Well, let me assure you the writers of the Cambridge Encyclopedia were well aware of that requirement. An ordered state is properly, and most simply, just an improbable state.
We have to start with equilibrium: the state in which molecular behaviour has become so evened out that nothing much happens. It is quite simple: every molecule in any system (defined by a boundary) has both a momentum and a position. You must understand that just as it is possible (but very unlikely) that all the energy will be concentrated in one atom while all the others are stationary, it is just as unlikely that all the molecules will be packed into, say, one tenth of the volume, leaving a vacuum in the rest. Each such instantaneous snapshot of the system is called a microstate. These states are unlikely or improbable (even though all possible microstates of the system are equally likely) because each microstate belongs to a group of similar arrangements having the same distribution of momentum and position, which we call a macrostate. The number of microstates differs between macrostates, and this is what creates a probability distribution over macrostates: the more likely macrostate simply has a greater number of microstates, and the least likely has the smallest number. This all means that, of all the possible microstates a system may take, it will by simple probability tend to be found in a macrostate that has the highest number of microstates.
The number of microstates in a given macrostate is obtained by identifying each atom and then counting the number of possible (logical) arrangements obtained by exchanging every atom/particle with every other atom/particle of the same momentum/position. In the case of a single atom having all the energy in a system of N atoms (the other N-1 atoms stationary), the number of microstates in that macrostate is simply N, because there are N atoms which could be the one with all the energy. Whereas if the energy is spread so that each of the N atoms carries a distinct share, the number of microstates is N! (every permutation of the atoms among the N distinct energies): clearly much more likely. The log of this very large number is the basis of the absolute entropy of that particular state. The momentum (speed) and position of each particle will be statistically distributed over the whole system, in accordance with the Maxwell–Boltzmann distribution. The actual momentum and position of any single molecule is independent of which molecule it is, but the count of microstates must cover all the logical possibilities (called the logical phase space).
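The two counts just described follow from the standard multinomial formula for the multiplicity of a macrostate, W = N!/(n0! n1! ...), where n_i is the number of atoms sharing energy level i. A minimal Python sketch (the function name and the N = 6 example are mine, for illustration only):

```python
from math import factorial
from collections import Counter

def multiplicity(levels):
    """Microstates W of a macrostate: W = N! / (n0! * n1! * ...),
    where n_i counts the atoms sharing energy level i."""
    w = factorial(len(levels))
    for n_i in Counter(levels).values():
        w //= factorial(n_i)
    return w

N = 6
concentrated = [N] + [0] * (N - 1)   # one atom holds all the energy
spread = list(range(N))              # every atom holds a distinct share

print(multiplicity(concentrated))    # N  = 6
print(multiplicity(spread))          # N! = 720
```

The spread-out macrostate has N!/N = (N-1)! times as many microstates as the concentrated one, so by simple counting it is the state the system is overwhelmingly more likely to be found in.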
There is a terrible confusion between entropy and information, which is most aptly resolved in the article "Heat" from the Encyclopaedia Britannica, 1965, as follows:
"The similarity of the theory of heat to the theory of information also is striking in many other ways. The second law of thermodynamics states that entropy always increases in any spontaneous change; in the limit, entropy remains constant if the change takes place reversibly; and it never decreases spontaneously. Similarly, information always decreases as the result of being communicated; in the limit it remains constant as the communication becomes perfect; ie when no randomness such as electrical noise is introduced in the act of communication; but information never increases as the result of communication. Thus entropy and information are strictly isomorphic quantities, though differing in sign, the first increasing and the second decreasing when randomness occurs." Randomness obviously includes random mutation of information encoded in DNA during reproduction. The statement is thus most pertinent to the theory of evolution as reproduction is just one form of communication.
Why does the confusion persist? Naturalists have an agenda based on an unshakable belief that mass and energy are all that exist, which requires that Mass + Energy = Information, thus allowing a theory that life could make itself: that nothing more than the basic laws of the universe, acting over a very long time, was required to do it. It is equivalent to the old 19th-century belief that information is free. The simple truth about entropy reveals the untenable nature of that postulate, which was effectively falsified by Leo Szilard in 1929!
Gyroman (talk) 00:10, 9 January 2017 (UTC)
This is the 'bellwether' indicator of the nature of the problem with entropy: complexity is a lost word because entropy is a lost word. It is an essential definition for a true comparison of states of matter consistent with the entropy of those states. With a definition of complexity derived from absolute entropy, it is independent of what the matter comprises (atoms, molecules, objects or assemblies). The process of scientific discovery (seeking the truth, i.e. knowledge) about a subject often requires the classification and arranging of things or concepts according to logical patterns or rules. When a basic rule is obtained from the data it is possible to predict characteristics of missing elements or project to unknown data, which guides future searches. Examples abound in physics, chemistry, cosmology, biology, etc. When we say something is more complex than something else, we are really saying it is more rare or difficult to achieve. As such, a general definition of complexity is pivotal to the investigation of any postulate involving a natural origin of, or rise in, complexity (i.e. evolution). The following are of particular relevance:
1. The formation of stars and galaxies from cosmic gas (H & He) and dust
2. The abiogenesis of life from non-living matter.
3. The continued evolution of living species by natural selection.
All these postulate a reduction in entropy and a corresponding increase in complexity, and so depend upon an unambiguous definition of this term. In particular, whether or not something is living should not make any difference. Because the entropy of a system is the quantitative measure of the time decay of order in that system, it must relate to the reduction in complexity. While one effect is the dissipation of energy and the evening out of properties like temperature and pressure, its statistical basis, from Ludwig Boltzmann, reveals that absolute entropy is based on the probability that a particular arrangement may exist.
Improbability is the inverse of probability; for one arrangement of a system it is simply the total number of equally likely arrangements the system can take. Because the Boltzmann equation gives entropy as the natural logarithm of the number of microstates (the number of possible arrangements), it automatically includes logical improbability as well as physical improbability: not just the number of physical arrangements a system can take but, if you identify each particle, all the possible permutations within each physical arrangement. Broadly, physical arrangements are called macrostates while logical arrangements are called microstates.
To illustrate, imagine tossing ten numbered children's building bricks into a 10 x 10 grid tray, and assume they are constrained to align with the grid by a funnel. The probability that they will end up in a neat row across the centre is small, so let us talk about its inverse, or improbability: this is simply the total number of possible arrangements, of which this is just one. Ignoring which brick lands in which cell, the number of ways to occupy 10 of the 100 cells is nCr = C(100,10) ≈ 1.73e13; this represents the physical improbability of this state. It gives the average number of throws per occurrence of this arrangement (exactly) if you throw the bricks an infinite number of times. Now let us examine the improbability of the bricks falling in readable (upright) order from 0 to 9. Distinguishing the numbered bricks multiplies the count by 10! (ordering), giving P(100,10) ≈ 6.28e19, and each cubic brick can land on any of its 6 faces in any of 4 rotations (24 orientations, only one of them readable), multiplying it again by 24^10 ≈ 6.3e13, for a total logical improbability of about 4e33, for just 10 bricks. Note how the logical improbability multiplies with the physical improbability to produce the total improbability. Now replace our bricks with atoms and our grid with molecular arrangements, and it is clear that the Boltzmann equation for the entropy of a system, at a statistical level, provides a direct measure of 'complexity' applicable to any physical arrangement of matter.
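The brick arithmetic above can be checked directly with Python's math module (the variable names are mine; the 24-orientation factor assumes a cubic brick with 6 faces and 4 rotations per face):

```python
from math import comb, factorial, perm

# Physical improbability: which 10 of the 100 grid cells are occupied,
# ignoring which brick is which.
physical = comb(100, 10)             # ~1.73e13

# Distinguish the ten numbered bricks: 10! orderings of the occupied cells.
numbered = physical * factorial(10)  # = perm(100, 10), ~6.28e19

# Each cube can land on 6 faces with 4 rotations each; only one of the
# 24 orientations is upright and readable.
logical = numbered * 24 ** 10        # ~4.0e33

print(f"{physical:.2e} {numbered:.2e} {logical:.2e}")
```

The ordering factor recovers the ordered-placement count exactly, since C(100,10) × 10! = P(100,10).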
FAITH & RELIGION
Faith simply means what we exhibit whenever we act without complete knowledge. Since no one can claim complete knowledge about anything, we all must express this kind of faith in all we do. So even Richard Dawkins may be said to have faith in these terms! This 'rational' faith is therefore based on the concept of truth as expressed above. Rational faith is simply the acceptance of the minimum uncertainty which it is either not possible, uneconomic or unnecessary to try to eliminate. The constraint on rational faith, that it is employed whenever we have decided to act, i.e. when necessary, implies that it is essential for our livelihood or life concerns (even if it is only to relax in a deckchair and trust it will not collapse).
Religion, on the other hand, is largely cultural, based on beliefs (without necessarily concerning oneself about the truth or otherwise of those beliefs) and traditions. It does play its part in the fabric of most societies, and is used by many individuals to fill a need for, let us say, telling their story about their experience of the deep mysteries of life. No need to go on here. The main point, of course, is that religion as such is not science (as defined above), and religious followers, whatever their persuasion, have no mandate or right or even need to teach their beliefs as 'truth'. People of religious persuasion do have a right to hold their beliefs; however, we all have an obligation to speak the truth, particularly to those who are dependent on us or on our position (be it as parent, teacher or scientist). This is a moral imperative, or ethical issue, completely independent of our religious beliefs.
In reality we all need truth (above definition: knowledge) and we all exercise faith (rational), but religion is (or should be) optional.
Vh mby (talk) 01:09, 9 November 2010 (UTC)