Talk:BASIC/Archive 2

Structured/unstructured examples

It's just occurred to me that, although their dialects differ, the structured/unstructured examples are almost identical in terms of structure. In fact both are well structured. The only difference is that one implements the structures using IF/THEN and GOTO whereas the other implements the same structures using DO/LOOP.

We really need the unstructured example to be less structured to make the point. -- Derek Ross 16:37, 18 Nov 2003 (UTC)

Or do we, actually? Biased as I am (I did the recent modifications to the code example), as I see it, the example illustrates 1) the better structuring support you get from structured BASIC, and, 2) equally important, that a programmer who cares about easily identifiable structure in her/his code will often be able to impose this whether or not the language of choice/force supports it. --Wernher 03:37, 20 Nov 2003 (UTC)
That said, it could be fun to add an example stub that goes to the extreme of what you said, i.e. a quite incomprehensible (but correct) BASIC stub :-) --W.

What is the difference between "structured" and "unstructured"? I was kind of offended, having grown up using Applesoft, that it was called "unstructured". Imho the example is really bad too. REPEAT$, STRING$, UCASE$ are functions specific to the language, and do not demonstrate "non-structuredness". The only real difference in the program is the DO:LOOP structure, which is accomplished just as easily with GOTO statements. It would help if someone defined what exactly is meant by "structured" (Applesoft has GOSUB, ON X GOTO, and DEF FN for functions). BASIC interpreters without line numbers just interpret them internally. Is that really enough to call the language "structured" vs. "unstructured"? Line numbers?

(Additionally, it is my opinion that data declaration is actually more of a step backward in terms of BASIC as a language.)--Ben 08:27, 18 Jun 2004 (UTC)

Firstly, you are right about the examples: both of them are structured (in the sense that Dijkstra defined) despite the fact that the first one uses GOTO statements and line numbers. We really need a better example of unstructuredness.

Secondly, the simplest definition of structuredness that I have seen is this. When a programmer rewrites the program as a flowchart, a structured program produces a flowchart with no crossing lines; an unstructured program doesn't. No crossing lines is good because it implies that the original program flow is easier to follow and thus the original program is easier to debug or to change at a later date. BASIC doesn't force programmers to program in this way any more than C does but it is certainly possible (and recommended) that they do so.
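For illustration, here is the kind of genuinely unstructured fragment the article could use (a hypothetical example, invented here, in the old line-numbered style). Its flowchart cannot be drawn without crossing lines, because the shared exit code sits physically between the two loops:

10 REM print 1 to 10, then the even numbers up to 10
20 LET I = 1
30 PRINT I
40 LET I = I + 1
50 IF I <= 10 THEN GOTO 30
60 GOTO 100
70 PRINT "DONE"
80 END
100 LET I = 2
110 IF I > 10 THEN GOTO 70
120 PRINT I
130 LET I = I + 2
140 GOTO 110

A structured rewrite would simply place the second loop directly after the first and the "DONE" message at the end, so that every flowchart arrow flows downward without crossing.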

Structured programming also implies local variables. Unfortunately, early versions of BASIC (including Applesoft) did not support this concept and thus all variables were global. This has little or no effect on small programs but it does cause problems for larger programs as the number of variables that the programmer has to remember grows. It eventually becomes a near certainty that the programmer will create a new variable with the same name as an old one, leading to problems, if not to disaster. This is not nearly so likely to happen with local variables, which are cleared whenever their particular procedure has finished and so can have the same names as local variables in other procedures.
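A sketch of the hazard and the cure (QBasic-style syntax assumed; the code is invented for illustration):

REM All variables global: the subroutine clobbers the caller's I,
REM so the outer loop body runs once instead of three times.
10 FOR I = 1 TO 3
20 GOSUB 100
30 NEXT I
40 END
100 FOR I = 1 TO 10
110 PRINT I
120 NEXT I
130 RETURN

REM With procedures and local variables the same name cannot collide;
REM the i below is cleared when the SUB finishes and leaves any caller's i alone.
SUB PrintTen
    FOR i = 1 TO 10
        PRINT i
    NEXT i
END SUB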

Lack of data declaration leads to a similar problem in that, as a program becomes larger, it becomes more and more likely that it will contain a mis-spelled variable name. This does not cause a problem when data declaration is compulsory, since BASIC complains that the programmer is using a new variable which has not been declared. However it does cause a problem when data declaration is optional since BASIC creates a new variable where one was not intended, leading to minor problems if the programmer is lucky and major problems if not. That is why data declaration is definitely a step forward. A nice thing about some dialects of BASIC (including Visual BASIC) is that you can choose whether you want compulsory declaration or not via the OPTION EXPLICIT command. Naturally it should always be turned on when writing large programs. -- Derek Ross | Talk 14:50, 2004 Jun 18 (UTC)
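A minimal sketch of the OPTION EXPLICIT behaviour described above (Visual Basic-style syntax; the variable names are invented):

OPTION EXPLICIT            ' compulsory declaration switched on

DIM total AS INTEGER
total = 100
totl = total + 1           ' misspelling: rejected as an undeclared variable;
                           ' without OPTION EXPLICIT this line would silently
                           ' create a new variable and the bug would go unnoticed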

The way you are talking about it, it sounds to me like "structured programming" and "unstructured programming" are based on a 'style' of programming rather than the language? It doesn't seem like you can say Applesoft is unstructured, but you can say that the style often used is unstructured. Earlier BASIC languages (like Applesoft) often tempted the programmer to use an unstructured style, but there was nothing preventing them from using a structured style. Your example, while you make a good point, seems to be more concerned with ease of use than with the workings of the language. The only thing I can think of that would give the programmer a hard time when the program is running is that having less control over the space that variables use could prove difficult if the program stored other data in the same area of memory. Declaring the variables ahead of time could help the programmer use the memory the way they want to. Other than that, those just seem to be gripes. BASIC isn't object-oriented, so local variables and global variables don't matter to the finished product (right?), and neither does data declaration, except possibly in the example I gave. If there is a BASIC language without functions or gosubs, then I would call that 'unstructured'. Applesoft, I still think, is a structured language (now that I know more about it); it's just that it would be more difficult to program in a structured way. But it could be done, and the resulting compiled program and flowchart would be identical. --Ben 05:44, 30 Aug 2004 (UTC)

You're quite right, Ben. It is a style of programming rather than anything to do with the language. We shouldn't say that a particular language is structured or unstructured -- only that people use it in an unstructured way. What people normally mean when they talk about a "structured" language is one that makes it easier to program in a structured style, or perhaps -- like Pascal -- makes it more difficult to program in an unstructured style. But as you say in your final comment, the results will depend more on the style of programming than the programming language. -- Derek Ross | Talk 06:33, 2004 Oct 4 (UTC)
Is object-oriented programming something you can only do in an object-oriented language? Is C++ really object-oriented, or is Smalltalk the only object-oriented language?
You can do structured programming in a non-structured language. Some languages were specifically designed to support structured programming. Structured programming, as an idea, was the notion that a program should be made out of (standardised) structures. A structured programming language is a language which has built-in structures for structured programming (REPEAT-UNTIL) and support for user-defined structures (SUB, FUNCTION). Some BASICs had neither built-in structures nor support for user-defined structures, and, in a pattern we've seen repeated many times, drew the criticism of the structured-programming bigots.
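In concrete terms, the two kinds of support might look like this (QBasic-style syntax assumed):

REM Built-in structure: a post-tested loop, BASIC's equivalent of REPEAT-UNTIL
DO
    INPUT "Enter a positive number: ", n
LOOP UNTIL n > 0

REM User-defined structure: a named FUNCTION, invoked as PRINT Square(n)
FUNCTION Square (x)
    Square = x * x
END FUNCTION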
Since this is an article about a language, rather than an article about a style, I think that it is appropriate to have an example where the style is identical, and only the language elements differ. 150.101.166.15 (talk) 08:12, 25 February 2008 (UTC)
It is, of course, possible to do structured programming in a non-structured language. More surprisingly, it is also possible to program in an object-oriented style using a non-OO language. Bertrand Meyer's tome, Object-Oriented Software Construction (second edition), contains advice on how to do so for various non-OO languages. Although BASIC is not one of the languages he discusses there, it is relatively easy to adapt the advice given for FORTRAN IV to a modern structured BASIC. The results are reasonable too.
I can see how you could do it for a 1960s BASIC too, but it would be like teaching a dog to walk on its hind legs: entertaining but of little practical value. There's just too much work involved in emulating the structuring, the local variables and the OO methods. -- Derek Ross | Talk 16:14, 25 February 2008 (UTC)

Article credits to Nupedia/Fedorow

This was snipped from the main page:

Based on an article originally written for Nupedia by Peter Fedorow <fedorowpATyahoo.com>.

chocolateboy 09:33, 2 Jun 2004 (UTC)

This is not a vanity credit. It is actually a legal requirement. This article is based on a Nupedia article written by Peter Fedorow and we are only allowed to use it if Peter is credited in the article, since, unlike you, me and the rest of the Wikipedia editor/authors, he doesn't show up in the Page History as an author. I will replace it in the article. -- Derek Ross | Talk 16:54, 2 Jun 2004 (UTC)

I would favor a policy here that forbids Wikipedia content that requires such a notice. How much Nupedia content remains in this article, anyway? - Bevo 18:11, 2 Jun 2004 (UTC)

More than you might expect. The overall structure of the article is unchanged and most of the work done on the article has been in the form of additions rather than modifications. Paragraphs from the Nupedia article still exist practically unchanged. Even where changes have been made they may well have consisted of moving things around rather than completely rewriting them. -- Derek Ross | Talk

Apparently this article needs to be added to the list kept at http://en.wikipedia.org/wiki/Wikipedia:Nupedia_and_Wikipedia . In reviewing other Nupedia sourced articles, I don't see the exact pattern of attribution that is used here (especially the email address inclusion). I also see that the links to the original Nupedia articles are now deadends. Maybe we just need some uniformity of attribution style. - Bevo 19:48, 2 Jun 2004 (UTC)

POV statement

"such elitism is a recurring theme in the computer industry." I think that this statement is unessesarily POV. This is not a statement of truth, it's a statment indicating that someone (some author of this article I assume) feels that elitism is a recurring theme in the computer industry. It's an undisputed fact that goto statements can cause unnessessary confusion, the paper by Dijkstra simply pointed out why goto statements should be avoided in favor of higher level syntax. You probably won't find anyone these days that still disagrees with Dijkstra. This whole sentence needs to be scraped in my opinion.

Almost immediately after its release, computer professionals started deriding BASIC as too slow and too simple;² such elitism is a recurring theme in the computer industry.

Should be rephrased:

After its release, several highly respected computer professionals, notably Edsger W. Dijkstra, expressed their opinion that the use of goto statements, which existed in BASIC, promoted poor programming practices. Some also derided BASIC as too slow and too simple.²

Let me know what you all think. I think this rewording makes the sentence much less POV. If no one objects, I will add it to the article shortly. マイケル 20:18, Aug 20, 2004 (UTC)

Your first replacement statement is true of all 1960s programming languages, so if it's worth putting in the BASIC article, it's worth putting in the others too. As for your second statement, Dijkstra derided just about every language that he knew about for one reason or another. So perhaps we should add that into every other language that he mentioned in his SIGPLAN article too. If you don't agree, it would be nice to hear why you think that BASIC should be an exception. -- Derek Ross | Talk 15:43, 2004 Aug 23 (UTC)
That's not the point. The point is there was a POV statement. I rephrased it so that it was not POV. If you think it could use some adjusting, fine, but I think it is worth mentioning what critics had to say about BASIC. It seems to me that the statement was included in the article in the first place because someone had confused two separate papers written by Dijkstra: one which outlines many problems he saw with languages in general, and a more in-depth paper which dealt only with the use of goto statements. Now, I personally think that it would be better to outline more specific criticisms of BASIC. Dijkstra's statements as currently sourced do not constitute the bulk of serious criticisms, imho. I don't think the second paper he wrote was meant to be taken as seriously as some people take it. The first, involving goto statements, clearly was meant to be taken seriously, though. Now, I'm sure he wasn't the only one criticizing BASIC, so including only a reference to his criticism of BASIC, and then claiming criticism of BASIC is elitism which runs rampant, is completely and utterly POV. It's irrelevant whether elitism did or did not exist in the computer industry; elitism exists everywhere, in any industry. Also, to point out a case of elitism as criticism and then dismiss all criticism as elitism is a straw man argument. In any case, I was more concerned with removing the POV statement, which I did. If you can find a better way to phrase the criticism section, you are more than welcome to have at it. マイケル 18:57, Aug 23, 2004 (UTC)
Fair enough (I actually agreed with your point and that was why I didn't bring it up -- I just wanted to change the way that you were tackling it) but the POV could have been most easily fixed by removing the clause about elitism. It wasn't really necessary to repeat vaguely negative information (which appears elsewhere in the article in more detailed form in any case). All that does is change the POV and we actually want to remove it. -- Derek Ross | Talk 00:12, 2004 Aug 24 (UTC)
One more thing. I think that Dijkstra was criticizing the teaching of BASIC as a beginner's language, because it got beginners into bad programming habits. So, that should probably be mentioned to keep it in context. マイケル 18:59, Aug 23, 2004 (UTC)
You should note that BASIC was not particularly intended as a beginners language. In their book "Back to BASIC", the designers make it clear that it was intended as a language for use by all the students at Dartmouth, including those non-technical humanities-type students who nevertheless wished to do simple programming without having to learn more about computers than absolutely necessary. -- Derek Ross | Talk 00:12, 2004 Aug 24 (UTC)
Also, the reason it is worth including Dijkstra's criticism of goto statements is the following:
Dijkstra believed that the use of goto statements encouraged poor programming practices. He later stated his belief that people who learn programming via BASIC pick up such poor programming practices that they can never be good programmers. It is my impression that goto statements were the root of his second criticism. I could be wrong of course, but this seems logical. I think my new rephrasing of this section is less POV than the previous version, and provides better context for these reasons. I agree it could probably use a little more work, but it is certainly an improvement, since it is no longer POV. マイケル 19:12, Aug 23, 2004 (UTC)
I agree that the reason Dijkstra criticised BASIC was that it led its users "into bad programming habits". However, having read a fair bit of the man's writing, I would hazard a guess that although he disliked them, it wasn't so much the BASIC GOTOs that he was thinking of (since they were a feature of nearly every programming language at the time); it was more the BASIC online IDE (extremely unusual at the time) that was the problem, since like most modern IDEs, it encourages thoughtless coding. Dijkstra was very much in favour of mathematical design of the algorithm followed by coding of the program. He disliked the mindless "code, test, code, test, code, test, etc." style of programming which IDEs encourage, and this, in my opinion, is the reason why he said that BASIC programmers had been "mentally mutilated". To some extent I would agree with him. However I think that the mutilation is just as likely if you start programming via an IDE in FORTRAN, ALGOL or C, since it is the IDE which encourages the really bad programming habits, not so much the language constructs. I doubt that Dijkstra liked that aspect of the modern programming paradigm at all. -- Derek Ross | Talk 00:12, 2004 Aug 24 (UTC)

backronym, not acronym, right?

From the Jargon File:

Note: the name is commonly parsed as Beginner's All-purpose Symbolic Instruction Code, but this is a backronym. BASIC was originally named Basic, simply because it was a simple and basic programming language. Because most programming language names were in fact acronyms, BASIC was often capitalized just out of habit or to be silly. No acronym for BASIC originally existed or was intended (as one can verify by reading texts through the early 1970s). Later, around the mid-1970s, people began to make up backronyms for BASIC because they weren't sure. Beginner's All-purpose Symbolic Instruction Code is the one that caught on.

func(talk) 12:26, 3 Oct 2004 (UTC)

Whoops, never mind. The most recent version of the Jargon file has apparently changed its mind:

BASIC stands for “Beginner's All-purpose Symbolic Instruction Code”. Earlier versions of this entry claiming this was a later backronym were incorrect. [1]

It's a bit annoying that they don't cite any sources... (but then again, neither does Wikipedia most of the time).

func(talk) 04:15, 4 Oct 2004 (UTC)

<sigh>, Forget the Jargon file -- Use The Source, Luke. In this case the appropriate source is the original Dartmouth BASIC manual written in October 1964 which states on page 2 that BASIC stands for Beginners All-purpose Symbolic Instruction Code. -- Derek Ross | Talk 06:03, 2004 Oct 4 (UTC)

The weird thing about this is that during the last 4 or 5 years or so, I have heard at least 6 unrelated people in completely different contexts refer to BASIC as a backronym or state that it isn't a proper acronym. This is an odd thing for there to be a sort-of urban legend about. func(talk) 13:44, 4 Oct 2004 (UTC)

Well, there are three types of acronymic item: those where the name was created from a phrase which already existed, like NATO; those where the word was created at the same time as its acronymic phrase, like BASIC; and those where the phrase was created for an existing word like ACME which is supposedly an acronym for American Company Making Everything but which is actually a Greek word meaning highest point. The first two types are commonly thought of as true acronyms, natural or contrived, whereas the last type is a backronym since the true origin of its word is unrelated to its acronymic phrase. I can see that people might argue that the last two types were similar though, so perhaps that explains why some people don't think that BASIC is a proper acronym. -- Derek Ross | Talk 14:51, 2004 Oct 4 (UTC)

After performing some google-hit tests, I think this might be notable enough an error to comment on in the article. What do you think about adding something like this somewhere:
Several versions of the popular Jargon File once claimed that BASIC is a backronym created in the 1970s, (recent versions have corrected this). Evidence from the original Dartmouth BASIC manual (1964) show this to not be true, but numerous online dictionaries and reference works on the Internet have now proliferated the earlier Jargon File's error.
func(talk) 16:18, 4 Oct 2004 (UTC)
Sounds good to me. - Bevo 17:27, 4 Oct 2004 (UTC)
Yes, that seems reasonable. A small quibble -- please change "show this to not be true" to "show this to be untrue". It sounds better. -- Derek Ross | Talk 17:52, 2004 Oct 4 (UTC)

Preformatted text's column widths

To rationalize my recent edit, where I reinserted one horizontal space character between the columns of the preformatted (ASCII, nonproportional font) 'Relational operators' section: I just did it from the -eh- conviction that a definition table should have noticeably more space between its columns than between intracolumn 'definee-definiter' pairs. In this case, four spaces vs. two is the minimum, I think. --Wernher 17:55, 18 Dec 2004 (UTC)

Upper index bounds

DIM myIntArray(100) AS INTEGER
Depending on the dialect of BASIC and the use of the OPTION BASE statement, indices can range from myIntArray(0) to myIntArray(100), from myIntArray(1) to myIntArray(100), or from myIntArray(LowInteger) to myIntArray(HighInteger).

I'm no BASIC expert, but wouldn't this declare an array indexed between 0 (or 1) and 99? 66.92.237.111 19:47, 20 Feb 2005 (UTC)

I can assure you that it wouldn't. -- Derek Ross | Talk 05:05, Feb 21, 2005 (UTC)
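To spell out why (a sketch; OPTION BASE and explicit-bounds support vary by dialect), the number in the DIM is the upper index, not the element count:

REM Default lower bound 0: 101 elements, myIntArray(0) through myIntArray(100)
DIM myIntArray(100) AS INTEGER

REM After OPTION BASE 1 the same DIM gives 100 elements, (1) through (100).
REM Dialects with explicit bounds allow any range at all:
DIM yearTotal(1980 TO 1999) AS INTEGER   ' 20 elements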

My 2 cents on a couple of points

First off, I saw an early-1980s documentary a few years back which contained an interview with K&K. One of them (I can't remember which) stated that a large number of students were having difficulty learning "programming concepts" when FORTRAN was their first computer language. So they (K&K) originally created BASIC as a student's learning language.

They go into more detail about this in their book Back to BASIC. Yes, one of the aims was to use it as an easy-to-learn first language for beginner Comp. Sci. students but it was also intended as an easy-to-use language for art and humanities students who would never be expected to learn anything else. Remember that at that time FORTRAN could not handle text (unless encoded as numbers) which made it extremely difficult to use for artistic purposes. -- Derek Ross | Talk 01:26, May 16, 2005 (UTC)

P.S. Prior to BASIC, almost every computer science curriculum made sure the student knew some COBOL and FORTRAN, so learning BASIC first was originally seen as a good thing. There are people in the field who believe that once your mind has been infected with BASIC, you're forever screwed as a programmer. (I don't believe this, by the way.)

Glad to hear it. The big reason why BASIC was supposed to screw you up was because of the IDE which supposedly led to thoughtless "code first, design later" programs. Nowadays all languages have IDEs which allow you to "code first, design later" so I guess that we are all screwed!

BASIC is still evolving, and one of the biggest changes happened when "Visual BASIC 6" morphed into "Visual BASIC .NET". Prior to .NET it was possible to DIM an array from any number to any higher number. When Microsoft developed Visual Studio .NET, they wanted to be able to easily link object files produced by one language with object files produced by any other language. Since vanilla C and C++ require all arrays to be dimmed from subscript zero, BASIC would be forced to do the same. (Other operating systems don't pass a pointer to an array but rather a pointer to an array descriptor; this can be handy if you want to do run-time bounds checking. I don't know if MS considered this, but they probably realized there are more lines of C/C++ code in production and decided it was easier to force the BASIC developers to change rather than the C/C++ developers -- but this is just conjecture on my part. On the flip side, lots of .NET code is only compiled to MSIL (Microsoft Intermediate Language) rather than x86 binary, and maybe their decision is related to this.)
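For the record, the change looks roughly like this (the identifier names are invented):

' VB6: any lower bound was legal
Dim sales(1995 To 2004) As Integer
sales(1998) = 42

' VB.NET: the lower bound must be 0, so the old bound becomes a manual offset
Dim sales(9) As Integer            ' indices 0 through 9
sales(1998 - 1995) = 42            ' the element for year 1998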

--Neilrieck 22:40, 2005 May 15 (UTC)

I know what you mean. This C-like array thing really puzzles me. But it's got nothing to do with object code compatibility. It's easy to write a compiler so that it translates an array definition like DIM A[5 TO 7, -15 TO 8] into DIM A[0 TO 2, 0 TO 23] and automatically adds or subtracts a constant offset to any references that appear. In fact even with the C compiler or the VB.NET one, it has to translate DIM A[0 TO 2, 0 TO 23] into A[0 TO 71] and fix up all the references so that they point to the right element. So there is no technical reason why they couldn't have carried on with arbitrary array bases in VB.NET. A properly written compiler would still produce object code which was totally compatible with any of the other .NET languages whether the object language is x86 or MSIL. -- Derek Ross | Talk 01:26, May 16, 2005 (UTC)
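To make the arithmetic explicit, here is the remapping such a compiler (or a source-to-source preprocessor) would perform -- a constant offset per dimension:

' Source declaration:  DIM A(5 TO 7, -15 TO 8)    ' 3 x 24 elements
' Zero-based rewrite:  DIM A0(0 TO 71)            ' one flat block of 72
' Every reference A(i, j) becomes A0((i - 5) * 24 + (j + 15))
' For example, A(6, 0) -> A0((6 - 5) * 24 + (0 + 15)) = A0(39)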

Paul Vick, a Technical Lead for Visual Basic .NET, gives a somewhat disappointing explanation in his Panopticon Central blog. According to him, although the .NET CLR (Common Language Runtime) had two types of arrays to suit both the VB and the C camps (one that allowed arbitrary lower bounds, one optimized for zero bounds), the VB.NET team chose to use the zero-based array model for reasons of interoperability... -- Branco Medeiros

That's rather a sad explanation. Even now it would be possible to write a separate preprocessor which went through a VB.NET project remapping all arrays with arbitrary lower bounds to zero lower bound equivalents. Why wasn't that done as an integral part of the IDE ? Even the VB6 -> VB.NET convertor utility doesn't do it and the excuse given on that blog won't wash for it since it's a source to source convertor. -- Derek Ross | Talk 2 July 2005 18:49 (UTC)

Delphi is a Pascal dialect and lets you not only set the bounds on the index of an array but also use a newly defined ordinal type to do so. The latest Delphi can compile to .NET, so what problems would be caused by allowing VB.NET to do the same? Kjwhitefoot 21:06:11, 2005-09-06 (UTC)

Note on move

I undid Eyu100 (talk · contribs)'s move of this article to BASIC (programming language) because: (a) this is a fairly major article and so such a move should be discussed beforehand, and (b) because Eyu100 did not bother to fix the many links pointing to the old title (i.e., almost all of them), nor any of the broken redirects. -- Hadal 04:40, 17 Jun 2005 (UTC)

Agreed -- I was just about to do the same. Note also that the title as it stands is in conformance with Wikipedia established practice -- see Ada programming language, C programming language, Lisp programming language, Python programming language. --FOo 04:42, 17 Jun 2005 (UTC)

Missing dialect

Just noticed there's a missing dialect reference, though there is a wikipedia node already, BBC_BASIC.

It's included in List of BASIC dialects and List of BASIC dialects by platform. This article isn't trying to include every dialect as that would be a duplication of those lists.-gadfium 19:38, 26 August 2005 (UTC)

Backronym

The statement read "it was a backronym", but was later reversed. This is:

A. Completely irrelevant to the subject at hand ("what does BASIC stand for").

B. It says it is, then says it isn't. Need we be subjected to this sentence that debates with itself?

This entire sentence belongs on the talk page. It is irrelevant to the subject at hand, and CERTAINLY does not belong in the INTRODUCTION. Think of it from the point of view of someone who knows nothing about Basic. How does this side argument reflect anything useful about the language?

Respectfully, Ross, you guys have discussed this issue to death. I think it is more appropriate to discuss why this needs to be IN than why it needs to be OUT.

Also, I deleted the comment about "not being based on Ogden's Basic book" because I am the one who said that. Now I'm taking it out. Basic is not a soft drink, either, but I don't see how that denial belongs at the top of the page. --Samiam95124 21:15, 20 October 2005 (UTC)

Essential Reading

Someone else here mentioned the book "Back to Basic" by John G. Kemeny and Thomas E. Kurtz, the originators of basic. This book goes a LONG WAY towards resolving many of the myths perpetuated about Basic (that it was always interpreted, etc.). It comes from the horse's mouth, so to speak, and it gives some surprising insights into the Basic language. For example, did you know that QBasic and True Basic are fairly complete implementations of ANSI full Basic? Unfortunately, John Kemeny passed away in 1992.

This book is highly recommended reading for people here adding to the history section of this page.

--Samiam95124 21:48, 9 November 2005 (UTC)

The correct title is Back To BASIC: The History, Corruption, and Future of the Language. If you're talking about Dartmouth BASIC and its direct descendants, BASIC should always be capitalized, as it is an acronym, and it distinguishes the language from BASIC-like languages such as Microsoft's Visual Basic (a trade name). I agree, Kemeny and Kurtz's book should be required reading for anyone interested in BASIC programming and all who deride the language because their only exposure to it was a crippled, interpreted microcomputer version. Quicksilver 19:22, 11 November 2005 (UTC)