User:Vh mby

SCIENCE

Good science (IMHO) should include at least the following:
1. Ask questions (impertinent ones if necessary)
2. Think critically (especially regarding assumptions)
3. Tell the truth (all that is relevant, even if somewhat unpalatable)

Who does science? Well, everybody does.

So why is science today falling into disrepute? ("don't know, don't care" T-shirts, etc.)

1. Scientific reports can be bought
2. Scientific reports can be ordered (political agendas)
3. Scientists can 'weld' everything to a pet theory (world view)
4. Scientists tend to have big egos, which contribute to a reluctance to admit error
5. Scientific institutions (even the Nobel Prize Committee), like any other organisation such as companies and political parties, by definition embody 'self-preservation'; as such they have an 'image' to protect, and they do so vigorously, even at the expense of hiding uncomfortable truths.

All of the above of course turns up here! What I mean is that even wide publication, peer review and mainstream scientific community acceptance do not guarantee that good science is always at the top of the agenda.
Vh mby (talk) 00:54, 9 November 2010 (UTC)

TRUTH

So what is meant by 'truth'? Simply "stuff we know well enough to use with confidence"; it is, in the end, what constitutes our knowledge base. Does that mean it is 'absolutely true', or is there any room for 'uncertainty'? For us mere mortals, no: there is always some uncertainty. The concept of 'absolute' truth, while not very useful, is indicated by the nature of truth itself, i.e. it is exclusive of all other proposals; we cannot have that level of certainty because we can never say we know everything. However, knowledge, our truth, does exist, because it is how we got from the Wright Flyer to the Saturn V, or from the Sopwith Camel to Concorde. Engineering (applied science) is based on knowledge which is considered to be true; it is also an essential component of law, ethics, business, community life, family life, information systems, communications, etc. So how may we separate or distinguish this 'truth' from theory, conjecture or speculation?

Let me informally define a FACT as a VERIFIED OBSERVATION.

We now classify observations as either verified or unverified. Unverified contrary observations may be held in tension without destroying our idea (theory); however, it only takes one verified contrary observation to FALSIFY our theory. To establish truth we obviously need more verified observations (facts) than unverified ones. However, even the process of verification can be somewhat 'subjective'.

So now we must include and deal with the problem of UNCERTAINTY. It relates both to the observed facts and to the truth we want to establish; only then can we say we have a working definition of truth. When a number of individuals with relevant training and/or experience agree on the facts surrounding an issue, collectively act on or use that knowledge, and this is repeated over a period of time by different individuals, we usually conclude that we have the truth. However, in keeping with the well-established UNCERTAINTY PRINCIPLE, a small uncertainty always remains. I suggest we overcome this very small uncertainty with FAITH! It is simply the best expression we have for what we all do when we need to TAKE ACTION based on our perception of what is TRUE. The safety of bungee jumping or aerobatic flight is soundly based on accepted truth about materials, processes, training, etc., but let me suggest that both require the exercise of a considerable degree of faith. So our 'formula' for truth becomes:

         TRUTH = FACTS + FAITH   (FAITH > 0 and FACTS >> 0)

Without facts your truth is just belief; without faith your truth becomes absolute, and impossible. Between these two extremes is a sliding scale of (% facts) vs (% faith), but both are always present.

Truth thus defined has some interesting characteristics:
1. Since everyone does 'science', anyone may discover 'truth', and so we may conclude that no one actually owns or has exclusive rights to truth.
2. When we have found a truth in these terms we may conclude that all contrary proposals must be considered false, until at least one verifiable contrary fact is observed. This is the concept of falsification, which is necessary for good science.
3. There is no valid alternative to the search for truth and the process of verification or falsification.
4. The search for and process of establishing truth is independent of the subject matter and therefore applies to any area of knowledge. The only requirement is that verifiable observations (facts) may be established. Even the distinction between natural and supernatural has no bearing on this process.

Vh mby Gyroman (talk) 02:32, 5 November 2016 (UTC)

INFORMATION

Ever since the discovery of the structure of DNA we have known that life may be characterized as 'information rich'. However, like the term 'complexity', the kind of 'information' found in DNA is not very well defined, which may have something to do with the fact that 'life' itself is not very well defined either. It is a crucial definition for any 'information rich' type of complexity, which, by the way, also includes man-made objects. The main focus of the literature on information is the quality of the signal, not its meaning (see Shannon). Physicists and cosmologists refer to information in the universe, meaning the event history of every particle in it. This, however, is not 'signal' or 'semantic' information of the type found in DNA and human design. This distinction is the basis of the SETI program, which postulates that only intelligence can be the source of semantic information. Intelligence is associated with life, and we know that intelligent life like ourselves is distinguished by the objects we make and the language we use in communication. In fact, the very tools required to make intergalactic transmissions must be specified by semantic information, so we may conclude that there is no difference between the semantic information in an intelligently designed object and that found in DNA, which is also a specification for a living creature.

This implies that a single definition of semantic information should suffice for all occurrences of it. First, let us describe semantic information correctly:

All Semantic Information is a NON-Repeating, NON-Random, ORDERED Set

Like trying to define life, semantic information has some universal characteristics which are helpful in closing in on a definition.
1. It is expressed in symbolic codes which come from a finite alphabet
2. The codes are grouped according to a grammatical language
3. There is a method of writing
4. There is a method of reading
5. It has a meaning and a purpose

Since methods for writing and reading imply physical structures with a purpose, they must also be specified by semantic information! This means the proper definition of 'semantic information' implies that it is recursive: it cannot exist without prior semantic information, which presents a whole bag full of problems for both the naturalist and the atheist.

Following prior work on the subject, we may understand the following:
Semantic information is a symbolic representation of a material reality.
Ref: Werner Gitt, In the Beginning was Information.

Vh mby (talk) 02:44, 16 April 2009 (UTC)

COMPLEXITY

Although this is a very commonly used word, it is remarkably poorly defined, and that is an indicator of a much bigger problem with another word, 'entropy'. Scientifically speaking, it is an essential word for a true comparison of states of matter consistent with the 'entropy' of those states. A definition of complexity derived from the absolute entropy of a state of matter is universal, giving the required independence from all the specific details of the system (atoms/molecules/objects, assembly method, purpose, etc.). The process of scientific discovery (seeking the truth, i.e. knowledge) about a subject often requires classifying and arranging things or concepts according to logical patterns or rules. When a basic rule is obtained from the data it becomes possible to predict the characteristics of missing elements or to project to unknown data, which guides future searches. Examples abound in physics, chemistry, cosmology, biology, etc. When we say something is more complex than something else we are really saying it is rarer or more difficult to achieve. As such, a general definition of complexity is pivotal to the investigation of any postulate involving a natural origin of, or rise in, complexity (i.e. evolution). The following are of particular relevance:


1. The formation of stars and galaxies from cosmic gas (H & He) and dust.
2. The abiogenesis of life from non-living matter.
3. The continued evolution of living species by natural selection.


All of these postulate a reduction in entropy and a corresponding increase in complexity, and so depend upon an unambiguous definition of this term. In particular, whether or not something is living should make no difference. Because the entropy of a system is the quantitative measure of the time decay of order in that system, it must relate to the reduction in complexity. While one effect is the dissipation of energy and the evening out of properties like temperature and pressure, its statistical basis, from Ludwig Boltzmann, reveals that absolute entropy is based on the probability that a particular arrangement may exist.

Improbability is the inverse of probability; here it means the total number of possible arrangements a system can take. The significance of the Boltzmann equation, with entropy as the natural logarithm of the number of microstates (the number of possible arrangements), is that it automatically includes logical improbability as well as physical improbability: not just the number of physical arrangements a system can take but, if you identify each particle, all the possible permutations within each physical arrangement. Broadly, physical arrangements are called macrostates while logical arrangements are called microstates.
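
To make the macrostate/microstate distinction concrete, here is a small sketch (in Python, not part of the original text): two labelled particles dropped into four cells. The 'physical' count ignores which particle is which; the 'logical' count does not. The specific numbers (4 cells, 2 particles) are chosen purely for illustration.

  # Toy illustration: physical (macrostate) vs logical (microstate) counting.
  from math import comb, perm

  cells, particles = 4, 2

  physical = comb(cells, particles)  # which cells are occupied, identity ignored: C(4,2) = 6
  logical = perm(cells, particles)   # occupied cells AND which particle sits where: P(4,2) = 12

  print(physical, logical)           # 6 12
  # The logical count exceeds the physical count by the permutation factor 2! = 2,
  # mirroring 'all the possible permutations within each physical arrangement' above.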


To illustrate, imagine tossing ten children's building bricks into a 10 x 10 grid tray. Assume they are constrained to align with the grid by a funnel. The probability that they will end up in a neat row across the centre is small, so let's talk about its inverse, or improbability: this is simply the total number of possible arrangements, of which this is just one. It is given by the familiar nPr, the number of permutations of 10 in 100 = 6.28e19. This represents the physical improbability of this state; it gives the average occurrence of this arrangement (exactly) if you throw the bricks an infinite number of times. Now let's number the bricks and examine the improbability of them falling in readable (upright) order from 0 to 9. This last state is a logical improbability, which = 6.28e19 x 10! (ordering) x 10^4 (4 orientations) x 10^6 (6 faces) = 2.27e36, for just 10 bricks; note that the logical improbability multiplies with the physical improbability to produce the total improbability. Now replace our bricks with atoms and our tray with molecular arrangements, and it is clear that the Boltzmann equation for the entropy of a system, at a statistical level, provides a direct measure of 'complexity' applicable to any physical arrangement of matter, and we may conclude that:

COMPLEXITY = IMPROBABILITY
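
The arithmetic in the brick example above can be checked with a short calculation. The following sketch (Python, not part of the original text) simply reproduces, to rounding, the figures as quoted: P(100,10) for the physical improbability, then the 10!, 10^4 and 10^6 factors for the logical improbability, and finally the natural logarithm of that count, as a Boltzmann-style entropy would take.

  # Reproducing the brick-tray figures quoted above (factors taken as stated in the text).
  from math import perm, factorial, log

  cells, bricks = 100, 10

  physical = perm(cells, bricks)                            # 100!/90! ~ 6.28e19
  logical = physical * factorial(bricks) * 10**4 * 10**6    # ~ 2.28e36

  print(f"physical improbability ~ {physical:.2e}")
  print(f"total (logical) improbability ~ {logical:.2e}")
  print(f"ln(W) = {log(logical):.1f}")                      # log of the count, as in S = k ln W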

Vh mby (talk) 00:53, 9 November 2010 (UTC) Gyroman (talk) 12:06, 17 September 2016 (UTC), Gyroman (talk) 10:30, 12 February 2018 (UTC)

ENTROPY

This is at the very centre of the biggest problem in science today and will no doubt continue to be so well into the future. It is the reason for the foregoing and pertinent to what follows. You only have to look at all the articles and talk pages on the subjects Entropy, Introduction to Entropy and Entropy (disambiguation) to appreciate that there IS a problem. It is not because we do not know the answer, either. However, to answer why this is the case you must first answer the question: do you really want to know the truth? The search for truth is basic to all science, but it can never be assumed to actually apply. You and you alone must decide what is true.

The proper definition of entropy is a well-known and well-documented truth, which I will try to show here, but one that a rather large and influential group of individuals does not really want you to fully understand. The Cambridge Encyclopedia of Technology says "Entropy: a measure of disorder", but according to Emeritus Professor of Chemistry Frank Lambert, "Entropy is not disorder", and there you have the problem.

You may see many attempts to show 'order' to be a subjective or analogous, purely human judgment which is not rigorous or measurable and so not scientifically valid. Well, let me assure you the writers of the Cambridge Encyclopedia were well aware of that requirement. An ordered state is, properly and most simply, just an improbable state.

We have to start with equilibrium: the concept where molecular behaviour becomes so evened out that nothing much happens. It is quite simple: every molecule in any system (defined by a boundary) has both a momentum and a position. You must understand that, just as it is possible (but very unlikely) that all the energy will be concentrated in one atom while all the others are stationary, it is just as unlikely that all the molecules will be packed into, say, one tenth of the volume, leaving a vacuum in the rest. Both of these unlikely states, viewed as an instantaneous snapshot, are called microstates. They are unlikely or improbable (even though all possible microstates of the system are equally likely) because each microstate belongs to a group of similar arrangements having the same distribution of momentum and position, which we call a macrostate. The number of microstates differs between macrostates, and this is what creates a probability distribution of macrostates: the more likely macrostate simply has a greater number of microstates, and the least likely has the smallest number. This all means that, of all the possible microstates a system may take, it will, by simple probability, tend to be found in a macrostate that has the highest number of microstates.
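
The claim that a system is overwhelmingly likely to be found in the macrostate with the most microstates can be illustrated with a deliberately simple model (a sketch in Python; the left/right box-half model and the value N = 100 are illustrative assumptions, not from the text above): each of N molecules sits independently in the left or right half of a box, a macrostate is the count on the left, and its microstate count is the binomial coefficient C(N, k).

  # N molecules, each in the left or right half of a box; macrostate = number on the left.
  from math import comb

  N = 100                      # number of molecules (illustrative)
  total = 2**N                 # all equally likely microstates

  for k in (0, 10, 25, 50):
      W = comb(N, k)           # microstates belonging to the macrostate "k on the left"
      print(f"k={k:3d}  microstates={W:.3e}  probability={W / total:.3e}")

  # The evenly spread macrostate (k = 50) has enormously more microstates than the
  # "all on one side" macrostate (k = 0), so the system is essentially always found near it.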

The number of microstates in a given macrostate is obtained by identifying each atom and then counting the number of possible (logical) arrangements, exchanging every atom/particle with every other atom/particle of the same momentum/position. In the case of a single atom having all the energy in a system of N atoms (the other N-1 atoms being stationary), the number of microstates in that macrostate is simply N, because there are N atoms which could be the one with all the energy. Whereas if the energy of the system is equally divided among all N atoms, the number of microstates is N!, clearly much more likely. The logarithm of this very large number is the basis of the absolute entropy of that particular state. The fact that the momentum (speed) and position of each particle will be statistically distributed over the whole system in accordance with the Normal Distribution is a necessary requirement. The actual momentum and position of any single molecule is independent of which molecule it is, but the count of microstates must cover all the logical possibilities (called the logical phase space).
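
Putting numbers to the comparison in the previous paragraph: the sketch below (Python, not from the original text) follows the counting convention described above, taking W = N microstates for the "one atom has all the energy" macrostate and W = N! for the "energy shared equally" macrostate, and then applies Boltzmann's relation S = k ln W. The choice of N as one mole of atoms is an illustrative assumption.

  # Comparing the two macrostates described above via S = k ln W.
  from math import lgamma, log

  k_B = 1.380649e-23           # Boltzmann constant, J/K
  N = 6.022e23                 # one mole of atoms (illustrative scale)

  ln_W_concentrated = log(N)   # W = N: any one of the N atoms could hold all the energy
  ln_W_shared = lgamma(N + 1)  # W = N! (the text's counting); ln(N!) via log-gamma

  print(f"S(one atom has all the energy) ~ {k_B * ln_W_concentrated:.3e} J/K")
  print(f"S(energy shared by all atoms)  ~ {k_B * ln_W_shared:.3e} J/K")
  # The equally shared macrostate has vastly more microstates, hence far higher entropy.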

There is a terrible confusion between entropy and information, which is most aptly resolved in the article "Heat" from the Encyclopedia Britannica (1965), as follows:

"The similarity of the theory of heat to the theory of information also is striking in many other ways. The second law of thermodynamics states that entropy always increases in any spontaneous change; in the limit, entropy remains constant if the change takes place reversibly; and it never decreases spontaneously. Similarly, information always decreases as the result of being communicated; in the limit it remains constant as the communication becomes perfect; ie when no randomness such as electrical noise is introduced in the act of communication; but information never increases as the result of communication. Thus entropy and information are strictly isomorphic quantities, though differing in sign, the first increasing and the second decreasing when randomness occurs." Randomness obviously includes random mutation of information encoded in DNA during reproduction. The statement is thus most pertinent to the theory of evolution as reproduction is just one form of communication.

Why does the confusion persist? Naturalists have an agenda based on an unshakable belief that mass and energy are all that exist, which requires that Mass + Energy = Information, thus allowing a theory that life could make itself: that nothing more than the basic laws of the universe, acting over a very long time, was required to do it. It is equivalent to the old 19th-century belief that information is free. The simple truth about entropy reveals the untenable nature of that postulate, which was effectively falsified by Leo Szilard in 1929!
Gyroman (talk) 00:10, 9 January 2017 (UTC)

FAITH & RELIGION

Faith simply means what we exhibit whenever we act with incomplete knowledge. Since no one can claim complete knowledge about anything, we all must express this kind of faith in all we do. So even Richard Dawkins may be said to have faith in these terms! This 'rational' faith is therefore based on the concept of truth as expressed above. Rational faith is simply the acceptance of the minimum uncertainty which it is either not possible, uneconomic or unnecessary to try to eliminate. The constraint on rational faith is that it is employed whenever we have decided to act, i.e. when necessary, which implies that it is essential for our livelihood or life concerns (even if it is only to relax in a deckchair and trust that it will not collapse).
Religion, on the other hand, is largely cultural, based on beliefs (without necessarily concerning oneself with the truth or otherwise of those beliefs) and traditions. It does play its part in the fabric of most societies and is used by many individuals to fill a need for, let's say, telling their story about their experience of the deep mysteries of life, etc. No need to go on here. The main point, of course, is that religion as such is not science (as defined above), and religious followers, whatever their persuasion, have no mandate or right or even need to teach their beliefs as 'truth'. People of religious persuasion do have a right to hold their beliefs; however, we all have an obligation to speak the truth, particularly to those who are dependent on us or on our position (be it as parent, teacher or scientist). This is a moral imperative or ethical issue completely independent of our religious beliefs.

In reality we all need truth (by the above definition, verified knowledge), we all exercise faith (rational), but religion is (or should be) optional.
Gyroman (talk) 13:06, 1 June 2018 (UTC) Vh mby (talk) 01:09, 9 November 2010 (UTC) Gyroman (talk) 10:13, 12 February 2018 (UTC)