
Talk:Metacompiler/Archive 1

From Wikipedia, the free encyclopedia

Edit to CWIC Example

I edited the CWIC example to show the tree crawling of the generator pattern. I changed pattern to unparse_rule, the term used in describing TREEMETA pattern matching; I think it is more descriptive. If you follow the generator logic you see it loads a register when it matches an atomic element by default, if the preceding transforms are complete and all other possibilities have been matched. If not, loadreg would return failure and the generator would then fail. The previous example failed in traversing the tree, and failed to show the tree-crawling ability of the unparse/pattern rule.

--Steamerandy (talk) 23:34, 15 September 2014 (UTC)

What real world compilers are we talking about?

QUESTION: what real compilers are we talking about? When I hear meta compiler I think of META I, META II, TREEMETA, META 5(?), and CWIC. There are other implementations. In the following I am referring to the above compilers. Could we maybe get a list?

If that is the case we are on the same track here. If not ... here's what I think anyway.

Meta Compilers are Parsing expression grammars

The meta compilers I listed above all use PEG[1] parsing grammars.

These are programming languages not compiler generators. They do not take a language description and a machine description and produce a compiler. Once you have a working compiler it may be possible to just change the machine description to generate code for a different computer.

A Meta compiler - compiling itself is insignificant

Someone seems to think that because these languages compiled themselves it makes them special. Well, I do not agree. That is simply a side effect of what they are, not really a defining attribute. They really do not compile themselves: there is far more code in the run-time library. The meta compiler CWIC is object oriented. The part of CWIC that generates code is actually a full implementation of LISP 2, which Erwin Book was a developer of at SDC. In CWIC a variable could hold any type of data; objects carried their type with them, so a variable could contain an integer at one point and a list at another.

Run-time library requirements included a dynamic memory manager and a symbol table manager. Functions supporting parsing are various string compares. Backtracking is a tricky piece of code. There are various flavors of string compare functions that compare a string to the input stream: some advance the stream and some do not; some negate the compare condition on return. Objects are dynamic and, like C++ dynamic objects, have to be initialized: symbols, integers, strings, nodes, floating point numbers, decimal fixed point numbers, lists, and trees. Symbols have to be entered into the symbol table, and removed when backtracking, and objects created must be undone. Then there are conversions (string to floating point, integer, and decimal object arithmetic), an extensive set of file handlers (backtracking involved the file library as well), and more conversion routines for printing (list file output).

All these run-time support routines have to be written. There are far, far fewer lines of source code in the hand-coded meta language than in the library functions, and almost nil time spent debugging the hand-coded rules. Run-time debugging: hundreds of hours. Around 1,800 hours were spent on the SLIC run time. CWIC had MOL/360, which was intended for writing the run-time libraries. META II ran on an interpreter so was fairly simple.
But META II was not a viable compiler writing tool. It is very similar to yacc in code generating functionality.

META I, the first metacompiler, was hand compiled and used to compile META II. Would META I not be a metacompiler?

I can, and have, hand compiled a meta language into code. We are talking two-digit man hours, under 20, hand compiling metalanguage source code to assembly and debugging it. I have spent many man-weeks implementing and debugging the support library.

Bootstrapping actually is a learning process. If you cannot hand code the meta compiler, how are you going to generate code to implement it? You have to figure out what to generate. That is a lot easier if you already have working code. You are getting the support library working as you bootstrap the language, and it only needs to be done in steps. Hand compiling is the easy part. Writing and debugging the run-time is the hard part.

Response to "A Meta compiler - compiling itself is insignificant"

You make no sense and I think you are talking gibberish. And you are wrong. The defining feature of a metacompiler is that it's written in its own metadefinitional form and compiles/translates itself to a working/executable version of itself. That's what makes it a *meta*compiler. You keep misdefining the prefix "meta-" ... it can also mean "above" or "beyond", as in metaphysics or metadata. It represents one abstraction level above the topic it's about. There is physics, which addresses our physical reality, and then there's metaphysics, which addresses whatever is beyond (or above) our ordinary reality. Metadata is data which describes other data, providing information about the data which is not contained in the data itself and thus can be considered above it in an abstract hierarchical sense. And a metacompiler can be used to define compilers, and compiler compilers, and thus is an abstract definition of the parsing and translation process which is above the usual compilers and compiler generators oriented to a specific task. Metacompiler is to compiler as metaphysics is to physics. Metacompiler is to compiler as metadata is to data. Some compiler-compilers are metacompilers. Not all compiler-compilers are metacompilers. And metacompilers can be very useful for other tasks besides compiling, such as source-to-source transformation. As such, they are a way to translate problems stated in domain-specific languages into working program code. Yes, there are other ways, such as the PEG approach you keep touting, the parsing expression grammar approach, but from my experience that's a pretty lame approach, at least in the ways I've seen it used. A simple parser with stack, as generated by a run-of-the-mill metacompiler, is far superior from what I've seen. PEGs seem to be a quick-and-dirty kluge, invented only recently, in 2004. My 2 cents.
It's obvious that you are ignoring the Wiki article: Metadata is defined there as data about data. A metalanguage is a language used to make statements about statements in another language. The key word here is about. Sorry if you are somewhere off in the beyond? There are no references given in the article to substantiate that a metacompiler must translate itself. In all the statements I have made about what a metacompiler is, I have linked to supporting wiki articles or given valid references. "Metacompiler is to compiler as metadata is to data" Nonsense!! Metalanguage is to language as metadata is to data. Is not a metacompiler a metalanguage compiler?
"a metacompiler can be used to define compilers, and compiler compilers, and thus is an abstract definition of the parsing and translation process which is above the usual compilers and compiler generators". Now this is exactly what I think should be the defining attribute. A metacompiler takes as input a definition of the parsing and translation process and produces a compiler. Or better: A metacompiler takes as input a metalanguage, formal specification of a formal language, of the parsing and translation process and produces a compiler for that language.
We are mostly in agreement, except on a metacompiler compiling itself being the defining factor. It detracts from what they are. Being able to compile themselves is a natural outcome of what they are. They take a formal description of a language and produce a compiler for that language. Compiler compiler originally meant a compiler that produced a runnable compiler, not some part of a compiler. But yacc changed that.
I suspect we may be talking about two different things. You seem to be hung up on the fact that Schorre named his metacompiler "Meta", and thus you keep going on about "Meta -space- compilers"... a metacompiler is a concept straight out of compiler history... and by the way, surprise! not all metacompilers are made the way Schorre made his. Forth has a long and honorable history of booting itself onto different platforms using a metacompiler written in Forth. I also worked with someone who made a clever metacompiler based on punched card input images -- it was for a Chemical Engineering simulator that had started out using punched cards for input. He called his punched card thingy a "metacompiler", which it was, and the Forth people call their implementation a "metacompiler", which it is. Get off this hang up over what Schorre called his metacompiler. -- Damon Simms (talk) 14:41, 7 October 2014 (UTC)
I am not hung up on Schorre's metacompilers. It's just that all references here seem to connect to Schorre's work. The reason I asked for a list is that, if that is the case, they use a parsing expression grammar. The meta compilers of Schorre were described as BNF-like. BNF is a generative grammar. Schorre's meta compilers are not really BNF-like except in the most general sense that they are rule-based languages. Schorre's meta languages are analytic grammars precisely defining the order of tests in recognizing language constructs. In Schorre's meta languages: X = A | B | C; is interpreted to mean look for A; if A fails, look for B; if B fails, look for C; if C fails, X fails. If A is successful, no further testing is tried and X is successful. It is not a generative grammar!!! That is the description of a parsing expression grammar as described in Ford's paper. You seem to be hung up on Ford's implementation descriptions. Maybe that is the reason you do not understand the cc example. cc is a work in progress. Right now I am working mostly on the runtime library. The cc source has not been compiled. It is hand compiled into x86 code. I am using C++ naked functions. The syntax language does not use a stack frame; ebp is used as a backtrack block pointer. Have you ever looked at CWIC? I am not sure what is in the ACM paper. I got my manuals from Erwin Book. There are more pages than in the ACM publication. I wrote SLIC, which is an extended version of CWIC, and it was used to write a COBOL compiler. I have some of the SLIC code. I lost the working version when my son used the mag tape it was on to decorate the house. He was 4 at the time. Really no matter, the DEC System-10s are long gone. Is it that you do not understand the cc code because you are hung up on generative languages like BNF? An analytical syntax avoids backtracking by factoring. Backtracking is used in error recovery. What exactly do you think is wrong, other than an extra right paren? Fixed.
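The ordered-choice reading of X = A | B | C; described above can be sketched in Python. This is a hypothetical illustration of the semantics, not Schorre's or Ford's implementation; lit and alt are made-up helper names:

```python
def lit(ch):
    """Test for one literal character; success returns the new position."""
    return lambda s, pos: pos + 1 if pos < len(s) and s[pos] == ch else None

def alt(*parsers):
    """Ordered choice: try each alternative in the order written; first success wins."""
    def parse(s, pos):
        for p in parsers:
            r = p(s, pos)
            if r is not None:        # success: later alternatives are never tried
                return r
        return None                  # every alternative failed, so the rule fails
    return parse

X = alt(lit('A'), lit('B'), lit('C'))    # X = A | B | C;
# X("B", 0) == 1 (matched, input advanced); X("D", 0) is None (X fails)
```

Unlike a generative grammar, there is no ambiguity to resolve: the order the programmer wrote determines the order of the tests.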
I think that is the problem I have been looking for, in the wrong place. The hand compiled assembly is more up to date. The parser is working and generating the correct trees. No matter what metacompilers we are talking about, I still do not think compiling themselves is what defines them.
"A metacompiler is defined by a set of grammar productions defining itself" — there is a problem here then: none of Schorre's work can be included here.
A production or production rule in computer science is a rewrite rule specifying a symbol substitution that can be recursively performed to generate new symbol sequences.
In formal language theory, a grammar (when the context is not given, often called a formal grammar for clarity) is a set of production rules for strings in a formal language. The rules describe how to form strings from the language's alphabet that are valid according to the language's syntax.
The syntax languages of META II, TREEMETA and CWIC are not generative rules. They are analytic grammar rules. NOT GENERATIVE. If you cannot understand the difference between analytical and generative, maybe you need to go back to school. You are the one who wrote the original article. Right? Then it is you who does not know what you are talking about. Those are the classic meta compilers and they do not fit your description.

--Steamerandy (talk) 23:13, 7 October 2014 (UTC)

A MetaCompiler is:

Meta is used to mean about or describing, as in metadata (about data). A language that is used to describe other languages is a metalanguage. English can be considered a metalanguage. BNF, Backus–Naur Form, is a formal metalanguage originally used to define ALGOL 60. BNF is a weak metalanguage, for it describes only the syntax and says nothing about the semantics or meaning. English, on the other hand, is powerful, yet its informality prohibits its translation into computer programs.

A meta-compiler is a program that reads a metalanguage program as input and translates that program into a set of instructions. If the input program is a complete description of a formal programming language, the translation is a compiler for the language.

More explanation of why a meta compiler is a PEG

The reason I say they are parsing expression grammars is simply that when you write a grammar rule, it is executed in the order you write it. In the meta language a parsing rule is a conditional expression. You as the compiler writer control the order of parsing. If you write as one would in BNF it won't work. For example, the following left recursive rule:

 expr = expr | expr '+' term;

will be an infinite loop. Rules are made up of tests and control structures with occasional calls to code productions. A rule is a test: a function that returns success or failure. As a programmer you are programming tests against the input stream. And like most programming languages, it does exactly what you tell it to do. That may not be what you intended. For illustration, say we compile expr above, as a meta compiler would do, to a C++ naked asm function. It would generate code (not the comments) exactly following the rule testing sequence as written. (Inside _asm{}, C++ // commenting is used for better explanation.)

 static char plus_OPR[]  = "+";

 __declspec(naked) void expr () {  _asm { //  = expr | expr '+' term;
	call	expr		// expr   <<<<<<======== Bad news here.
	je	l2
	call	expr		// | expr 
	jne	l2
	push	offset plus_OPR	// '+'
	call	_CmpStr
	jne	l2
	call	term		// term
 l2:	ret			// ;
 }}				// Hand compile time as fast as I could type it. Extra time for comments!!

In the above example generated code, the first thing expr does is call itself: a never-ending recursive loop. Not really, as it would quickly run out of stack and generate an exception. This specific blunder can easily be detected at compile time. These languages do not interpret your intentions. They simply and rightfully compile the expression as written. The programmer decides how the source is to be parsed, right or wrong.
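The same blunder can be demonstrated in a few lines of Python (an illustrative sketch, not generated metacompiler code): the rule's first test is a call to itself before any input is consumed, so the recursion never bottoms out.

```python
def expr_left_recursive():
    # expr = expr | expr '+' term;  -- the very first test is a call to expr
    # itself, before any input is consumed, so the recursion never terminates.
    return expr_left_recursive()

try:
    expr_left_recursive()
    looped = False
except RecursionError:               # Python's analogue of running out of stack
    looped = True
```

After running this, looped is True: like the asm version, the function dies by exhausting the stack rather than by parsing anything.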

The programmer has control of the parsing order. expr in CWIC would be written as:

 expr = term (('+':ADD|'-':SUB) expr !2 |.empty);

The :ADD, :SUB and !2 are tree generation elements of CWIC. Above, expr is defined as a term followed by a plus or minus and an expr, or just a term by the empty alternate. After recognizing term, the parser function then looks for a (+|-). The grouping says that if neither a + nor a - is recognized, the .empty alternate is taken. .empty is always successful, allowing a single term to be recognized. But if a + or - is recognized, a term must follow. If the second term is not recognized it would be a backtrack failure: not only would term fail, but all rules back to a backtrack point, probably the whole statement containing the errored expression.

Generating:

__declspec(naked) void expr () {  _asm { 	//  = term (('+':ADD|'-':SUB) expr !2 |.empty);
	call	term		// = term 
	je	l2		// success | failure returned in processor status z flag
l1:	ret			// returns failure
l2:	push	offset plus_OPR	// (('+'
	call	_CmpStr		// matches a string, skipping white space
	jne	l3
	push	offset ADD_Node	//  :ADD
	jmp	l5
l3:	push	offset dash_OPR	// | '-'
	call	_CmpStr		//
	je	l4
	cmp	al,al
	ret			//  return success (.empty alternate)
l4:	push	offset SUB_Node	//  :SUB
l5:	call	_node
	call	expr		// expr) 
	je	l6
	mov	esp,ebp		// long backtrack fail
	ret			// return
l6:	mov	ecx,2
	jmp	_tree		// !2 
}}// With all the generated tags l1: l2: etc. this took a bit longer as I was trying to make it readable.

The right recursion generates a right handed tree. Given a+b-c would produce: ADD[a,SUB[b,c]]

     ADD
    /   \
   a    SUB
       /    \
      b      c
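The right-recursive tree building can be sketched as a Python function (an illustrative analogue of the CWIC rule, not CWIC code; here a "term" is simply a single identifier character, and a nested list stands in for the node tree):

```python
def parse_expr_right(tokens, pos=0):
    """expr = term (('+':ADD | '-':SUB) expr !2 | .empty);  right recursion."""
    term = tokens[pos]            # a "term" here is just one identifier character
    pos += 1
    if pos < len(tokens) and tokens[pos] in "+-":
        node = "ADD" if tokens[pos] == "+" else "SUB"
        right, pos = parse_expr_right(tokens, pos + 1)   # recurse on expr
        return [node, term, right], pos                  # !2 builds a 2-branch tree
    return term, pos              # .empty alternate: a lone term succeeds

tree, _ = parse_expr_right(list("a+b-c"))
# tree == ['ADD', 'a', ['SUB', 'b', 'c']], i.e. ADD[a,SUB[b,c]]
```

The recursion on expr after the operator is what pushes each new operation to the right side of the tree.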

It can be written to generate a left handed tree using the zero or more $ operator:

 expr = term $(('+':ADD|'-':SUB) term !2);

Given the same expression a+b-c would generate a left handed tree, SUB[ADD[a,b],c]

        SUB
       /   \
    ADD     c
   /   \
  a     b
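The $ loop version can be sketched the same way (again a hypothetical Python analogue, not CWIC code): the loop folds each new operator onto the tree built so far, which is what makes the result left-handed.

```python
def parse_expr_left(tokens, pos=0):
    """expr = term $(('+':ADD | '-':SUB) term !2);  left-associative loop."""
    tree = tokens[pos]
    pos += 1
    while pos < len(tokens) and tokens[pos] in "+-":     # $ : zero or more
        node = "ADD" if tokens[pos] == "+" else "SUB"
        tree = [node, tree, tokens[pos + 1]]             # !2 folds leftward
        pos += 2
    return tree

# parse_expr_left(list("a+b-c")) == ['SUB', ['ADD', 'a', 'b'], 'c'],
# i.e. SUB[ADD[a,b],c]
```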

Both of the above generate a tree. The programmer has programmed control of parsing and production. A list can be generated if desired:

 expr = <term $(('+'|'-':MNS) term !1)>:SUM!1;

Produces:

  SUM[[a,b,MNS[c]]]

       SUM
        |
       [ ]
      / |  \
    a   b  MNS
            |
            c
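The list-building form can likewise be sketched in Python (a hypothetical analogue; the < ... > list operator is modeled by accumulating into a Python list):

```python
def parse_sum(tokens, pos=0):
    """expr = <term $(('+' | '-':MNS) term !1)> :SUM !1;  collects a list."""
    items = [tokens[pos]]                    # < ... > gathers results into a list
    pos += 1
    while pos < len(tokens) and tokens[pos] in "+-":
        term = tokens[pos + 1]
        # '+' passes the term through; '-' wraps it in a one-branch MNS tree (!1)
        items.append(term if tokens[pos] == "+" else ["MNS", term])
        pos += 2
    return ["SUM", items]                    # :SUM !1 hangs the list under SUM

# parse_sum(list("a+b-c")) == ['SUM', ['a', 'b', ['MNS', 'c']]],
# i.e. SUM[[a,b,MNS[c]]]
```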

Concerning the "meta compiler": I think that their documentation describing the parser language as BNF-like has confused the issue. I have been using these meta compiler concepts for a lot of years. Never used yacc. Why would I? It's primitive. The first compiler I wrote, based on CWIC (Compiler for Writing and Implementing Compilers), was SLIC (System of Languages for Writing Compilers), which I extended by adding languages to do in-sequence optimizations and produce code for any computer. Other compilers I worked on were implemented in PASCAL and/or assembly on early microcomputers. A lot of concepts I used writing compilers in PASCAL were from my experience with SLIC. That brings me to my point: everybody I know that has used one of the meta compilers has worked on them. That's the nature of these beasts. That makes for a possible conflict of interest to be raised. I think that the compiler technology should not be lost. The secret classification that was slapped on CWIC by the U.S. government in the early 70s is probably the reason this technology isn't well known today.

The meta compilers do not use a context free grammar. The programmer has control. The grammar is a recursive descent LR grammar.

As an example, here is the meta description of a meta compiler. Meta compilers are recursive descent top-down compilers. There is a top level driving rule. In the following example, program is that top rule. This is no different than main in C or C++. Here program is equivalent to main.

/*

  CC  (COMPILER COMPILERS)

   My compiler driver starts with program.

*/

program =  $ declarations;

declarations =	"#" dirictive
		| comment
		| global		DECLAR(*1)                 
		|(id (grammar	PARSER(*1)
		    | sequencer	GENERATOR(*1)
		    | optimizer	ISO(*1)
		    | pseudo_op	PRODUCTION(*1) 
		    | emitor_op	MACHOP(*1)))
		|| ERRORX("!Grammar error") garbol);

grammar =	( ':'  class	:CLASS   // parse character class rule
		| ".." token	:TOKEN   // parse token rule
		| "==" syntax	:BCKTRAK // backtrack grammar rule
		| '='  syntax	:SYNTAX  // this grammar can long fail.
		)';'!2			 // Combine name and rule tree
			$(-break -"/*" .any);
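The top-level driving rule program = $ declarations; amounts to a loop that repeats the declarations rule until the input is exhausted. A hypothetical Python sketch (declaration here is a made-up stand-in for the real alternatives directive | comment | global | ...; it just accepts anything up to a ';'):

```python
def declaration(src, pos):
    # Hypothetical stand-in for the real alternatives (directive | comment | ...):
    # here a "declaration" is anything up to and including a ';'.
    end = pos
    while end < len(src) and src[end] != ';':
        end += 1
    return end + 1 if end < len(src) else None   # None signals failure

def program(src):
    """program = $ declarations;  -- the top-level driving rule, like main()."""
    pos = 0
    while pos < len(src):            # $ : repeat declarations until input is used up
        nxt = declaration(src, pos)
        if nxt is None:              # a failed declaration stops the driver
            return False
        pos = nxt
    return True

# program("a;b;") == True, program("a") == False (no terminating ';')
```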

--Steamerandy (talk) 19:39, 29 September 2014 (UTC)

Proposed Change to MetaCompiler

Metacompilers are a special class of Syntax-directed compiler compiler, using Parsing expression grammar rules combined with semantic production rules or elements.[1][2][3][4]

Meta is used to mean about or describing, as in metadata (about data). A language that is used to describe other languages is a metalanguage. English can be considered a metalanguage. BNF, Backus–Naur Form, is a formal metalanguage originally used to define ALGOL 60. BNF is a weak metalanguage, for it describes only the syntax and says nothing about the semantics or meaning. English, on the other hand, is powerful, yet its informality prohibits its translation into computer programs.

A meta-compiler is a program that reads a metalanguage program as input and translates that program into a set of instructions. If the input program is a complete description of a formal programming language, the translation is a compiler for the language.

A metacompiler is, itself, usually defined in a metalanguage. This metalanguage can then be compiled by a metacompiler, usually itself, or hand compiled into an existing computer language. The step of first writing a metacompiler by hand compiling it into an existing language is called bootstrapping. That is, the programmer pretends to be the metacompiler, parsing its rules and writing the code the metacompiler would generate, if it existed.

"Looks like I was programming before you were born."
"You youngsters just think you know everything!!"
I'm as old as you, dumbshit
(you left those arrogant quotes on my User Talk page as part of a rant about how superior you are)
-- Damon Simms (talk) 12:34, 7 October 2014 (UTC)
I seriously doubt you understand this topic and associated concepts. Any changes you make to the Metacompiler Wikipedia page I will consider an act of ignorant vandalism, and will report it as such. Damon Simms (talk) 13:41, 7 October 2014 (UTC)

aha

Ok, now I'm beginning to understand the source of your confusion, Andy.

You keep talking about "meta compilers" -- but this article is about metacompilers.

"Metacompiler" is a concept in Computer Science, not a particular instance or set of instantiations.

You apparently are very familiar with the Meta-II work of Schorre. That is one instance of a metacompiler. You seem to think we are talking about derivatives of Schorre's work which could all be considered "meta compilers", that is, compiler compiler systems based on that Meta-II family.

Well, we're not. Metacompilers are the generic abstract concept, whose main feature, by definition, is that they are written in their own metadefinitional form and can translate a metadefinitional description of themselves, written in that metadefinitional form, into an operational executable version of themselves. Screw the Meta-II work of Schorre and his cohorts, forget all about it. You seem hung up on it. That's not what we're talking about here.

I know of at least 3 other versions of metacompilers that are totally different from Meta-II and its descendants, totally orthogonal to that whole approach. And yet they are still metacompilers, not "meta compilers"; they have nothing, nothing, nothing to do with Meta-I or Meta-II or TreeMeta or CWIC or anything else like that. And yet they still fit the bill for being metacompilers -- by the definition I have given.

Once again. This is about metacompilers, not meta compilers.

Forget about Meta, cast it into Hell... for God's sake, learn Forth.

And if you try changing this page to your limited wrong myopic views, we will consider it to be vandalism. This is an article about a Computer Science concept, not a particular implementation of that concept. -- Damon Simms (talk) 16:53, 7 October 2014 (UTC)

Well then this article should be deleted. There are no references other than to Schorre's work. Neighbors is META II, and the rest of the references are to Schorre. And as I pointed out, your text excludes any analytical parsing languages!!
"Metacompiler" is a concept in Computer Science, not a particular instance or set of instantiations
Where is the reference?
Aha: the McGraw-Hill Dictionary of Scientific and Technical Terms definition of metacompiler:

A Metacompiler is a compiler that is used chiefly to construct compilers for other programming languages.[5]

Nothing about having to compile itself in that definition. It's way down on the list of attributes. There were several metacompilers that were not used to compile themselves. An assembly coded version was used because it ran faster than the code produced by the self compiled version and was sufficient for the task at hand. Another compiler was not compiled by itself because it was used to study its recursive descent properties; in that case the extra monitoring code in the assembly coded metacompiler would not have been generated by the compiler compiling itself. Metacompilers were meant to produce compilers for other languages. The first metacompiler was meant to produce ALGOL-like compilers. All metacompilers have been developed as general compiler compiler tools that are used to define and produce programming languages other than themselves. The present definition as found in the McGraw-Hill Technical Terms dictionary is saying that they are meant to construct compilers. Compilers are produced by metacompilers. Forth includes a compiler, as does Lisp, and each is able to compile its own language. That is why Lisp is not a metacompiler. Forth is not a metacompiler for the same reason.

Metacompilers are meant to develop languages other than themselves. Every real metacompiler is able to define multiple languages, not just itself. Who was the brainless nincompoop that decided to call Forth a metacompiler? Nobody is claiming Lisp to be a metacompiler, and it compiles itself identically the same way that Forth does.

--Steamerandy (talk) 23:40, 7 October 2014 (UTC) --Steamerandy (talk) 23:21, 7 October 2014 (UTC)

What's going on

Andy, do you have a problem with reading? Or understanding what you read? Dyslexia perhaps? Or are you intentionally misreading or ignoring what seems to be clearly written?

First of all, you keep confusing the items I provided you about the Forth metacompilers. These are 4 separate items:

Forth -- is a computer language, sometimes implemented with its own metacompiler
the Forth Metacompiler -- is a metacompiler, completely separate from the Meta family of metacompilers... it makes use of the unique characteristics of the Forth language for its implementation
MetaStic -- is a metacompiler derived from the Forth metacompiler which was created to implement Elastic, which is not the same as MetaStic; the first version of MetaStic was the bootstrap version of the MetaStic metacompiler which would only compile itself, clearly stated on that page; it was used to compile successively stronger versions of itself until it was able to implement Elastic
Elastic -- is a subsystem of code, originally written in assembly language but later re-written in the MetaStic metalanguage so there could be a Forth version, so it would be more portable... from the MetaStic documentation:
"Elastic was a subroutine threaded system, hence the STIC as acronym of Subroutine Threaded Interpreter/Compiler. What the ELA was for, i can't remember anymore. Even though MetaStic is not subroutine threaded any more, the "Stic" part of the name just stuck."

Why you keep muddling the 4 of them and claiming they are the same thing, I do not know, but I'm beginning to suspect you are trying to avoid the truth that other systems of metacompilers exist. Either that, or you can't read or you are unable to deduce what is obvious and apparent.

Next, I don't know why you keep going on about generative grammars vs. analytical parsers etc. etc. I don't think metacompilers have to make a commitment to either one, and frankly I don't care. When I say:

"A metacompiler is defined by a set of grammar productions defining itself"

you seem to think I'm talking about generative grammars and thus excluding Meta-II. I assume you are misreading the term "grammar production".

A grammar production is a computer science term referring to rewrite rules (which you would have understood if you had followed the links). This is exactly how the Meta-II family of metacompilers are written: a set of grammar productions. You yourself have included some of them here:

expr = term (('+':ADD|'-':SUB) expr !2 |.empty);

That's a grammar production. I said "set" of grammar productions because (1) they have to be unique in the set, and (2) they are unordered, that is, you don't have to specify what order they get executed, that is implicit in the grammar itself.

Note: ok, I changed the word "productions" to "production rules" in that spot in the article, maybe that clarifies things a little better -- Damon Simms (talk) 14:51, 9 October 2014 (UTC)

For further clarification: (from the Context-free grammar article)

In formal language theory, a context-free grammar (CFG) is a formal grammar in which every production rule is of the form
V → w
where V is a single nonterminal symbol, and w is a string of terminals and/or nonterminals (w can be empty).

The Meta-II family of metacompilers are context-free grammars as far as I know, but metacompilers in general don't have to be. I myself have written a bootstrap version of Meta-II which was a finite-state grammar (less powerful than context-free), but it didn't require a stack like the usual Meta-II implementations. It worked on the basis of nested rules to handle expression nesting, could only process nesting as deep as I had provided rules (usually around 3 or 4 deep), but was sufficient for light-weight parsing. And that limited balls-cut-off version of Meta-II which I implemented I believe is equivalent to the Regex parsers and that PEG crap you go on about. Which means for real metacompiler usage it sucks donkey balls.

But because it was a metacompiler, I could write successively more powerful versions of itself, in its own language, which eventually (within 3 generations of itself) became a full-blown CFG metacompiler with non-limited nesting. (I say non-limited, because theoretically it was unlimited, but due to the reality of the depth of the stack, CFG grammars are always limited by how much stack memory is allocated).

But the point is, the statement in this article is correct. There are 4 levels of grammar in formal theory, and a metacompiler could be implemented using any of them, or could be any combination of the 4. But you gotta define the fucking grammar somehow, and usually, usually, that is done by providing a set of rewrite rules in the grammar which the metacompiler accepts. The metacompiler productions (rewrite rules) define the grammar accepted by the metacompiler. I believe some researchers (Neighbors perhaps?) have even produced metacompilers that accept Type-0 grammars, also known as Unrestricted Grammars or Universal Rewriting Rules. Powerful stuff.

By the way -- have you read or done the Neighbors Metacompilers Tutorial yet?? I think you should.

Neighbors clearly states:

What the 1964 META II does is generate top-down recursive-descent parsers with Wirth-Weber code ejection... [but] that long-winded definition is too complex and obtuse. Ironically as a definition it is essentially obsolete because it doesn't really capture the idea of what a metacompiler is or how it was achieved in META II. META II can reproduce its own code from a description. After working with these systems for over 35 years I still find that amazing! I hope you will find it amazing too.

So clearly Dr. Neighbors finds the fact that a metacompiler can reproduce itself "amazing".

The fact that a metacompiler can compile itself is very significant, contrary to your myopic self-inflated opinions. As Neighbors shows in the tutorial (and it is a major, important part of the tutorial), a metacompiler can extend itself by compiling a newer, improved version of itself, just by changing its syntax rules. That is hot! That is amazing! You may not see it. But just because you don't get it, doesn't mean it isn't so.

Do the fucking tutorial!

As for using blogs as reference points, yes, I know the Wikipedia rules. That's for the articles. Does not apply here to the discussions on the talk pages if used to prove a point. You asked for references where "metacompiler" is used as an accepted term of art in computer science. So I gave you several places where people refer to metacompilers as an accepted thing, a concept independent of Schorre et al, even though many of them (not all) are based on the work of Schorre and Neighbors.

Neighbors, if I understand his work correctly, expanded the use of metacompilers to create compilers for domain-specific languages for specific user-defined domains. So, for example, you could have a DSL (domain-specific language) that knows about Supermarkets, and can be used to write all kinds of software related to Supermarkets -- inventory control, shelf-space algorithms, cash register transactions, price tag laser scanning, and so on, and ties it all together into an integrated system. And that's a big advance in software engineering. All due to the fact that you can have a metacompiler modify its own behavior by reprocessing itself to create more or less powerful versions of itself in order to process new domains of knowledge. Yeah, supermarkets may not be glamorous as a domain, but if you're a supermarket owner who is losing thousands of dollars every month because your fucking computer people can't get their act together and fix the goddamned software, then the ability to produce better code becomes a very big fucking deal. The military, I believe, is one group that has embraced Domain Analysis and Domain-Specific Languages as a way to help solve the ongoing software crisis. The military has lots of domains that require software management, just think of their inventory control problems alone. If you have inventory software that can be tweaked to handle the requirements for an Army fort or a Navy ship, then you will save them tons of money in development and testing. After all, the basic processes for inventory control, whether for a supermarket or a fort or a ship, are all basically the same. It's just in a couple of cases you have stuff that can blow up.

Look, Andy, the only reason I keep responding to you is I think you could do great work. You seem to have the enthusiasm and love for the work, and that's important. Usually I'm gruff in defending this stuff because I am tired of people, mostly know-it-all Mr. Young Guys, coming on here and dissing metacompilers because they didn't learn about them in school and they never heard of the Schorre Meta-II work. I'm not familiar with Forth and their metacompiler, so I never posted anything here about that -- I was waiting for the Forth experts to show up here and add their own stuff to the article. They are not shy. Maybe I should learn Forth and their metacompiler so I can talk and write about them more knowledgeably.

Hey, I really do hope you make your cc compiler-compiler, and I'm still good on the offer to help you with that if I can. As I suggested elsewhere here, go through the Neighbors Tutorial; you can use the little toy metacompiler in the tutorial to bootstrap your own work.

I've got some other things I gotta do now, including doing the Neighbors Tutorial myself. I also have to finish my new little stripped-down metacompiler so I can post it online like Dr. Neighbors has done with his work.

Good luck, man.

Coda

I have also reverted some of your responses where you have changed my responses to you. What you did muddled the discussion and made some of my answers unreadable. And frankly, I'm beginning to suspect your motivations.

In the response section you thought fit to hack up, I had already answered your latest response with one of my own, respectfully following it, clearly labeled "Response 2" (along with accompanying "Coda" and "Coda 2"), and clearly demarked with my signature.

Please confine yourself to putting your responses to what I say after the section I wrote, and please always sign your entries with your signature, otherwise people reading this discussion will assume I said it. If you need to, copy and paste what I said into your section, indent or quote it if need be to show it was what I said, and then you can comment after it about what I had written -- but please restrict editing to your own signed comment section and please do not change what I have written.

Damon Simms (talk) 11:03, 9 October 2014 (UTC)


Confusion on the part of some participants

See RFC where it is supposed to be, at the top of TALK --Steamerandy (talk) 21:21, 9 October 2014 (UTC)

Response

Hey Andy -- if you bothered to check the first reference on the Metacompiler page, you would see Dr. Neighbors' thesis project is built on metacompiler technology, and that website has a complete tutorial on metacompilers. Here are some other references:

http://www.bayfronttechnologies.com/mc_tutorial.html -- a metacompiler tutorial, not a Meta-II tutorial

http://dictionary.reference.com/browse/cwic -- "One of the early metacompilers."

http://www.forthfreak.net/index.cgi?MetaCompiler

"meta: (greek) A prefix, meaning 'one level of description higher'"
"A metacompiler is a compiler which processes its own source code, resulting in an executable version of itself. Many Forth-systems have been written as metacompilers."

http://www.forthfreak.net/index.cgi?MetaStic

"MetaStic is the reimplementation of Elastic (which was written in assembly language) as metacompiler. It differs from other Forth metacompilers in that respect that it does not implement a Forth system, which can be extended to become the meta compiler. It rather implements a meta compiler, which can be extended to become Forth. The minimal system (the kernel) can metacompile without any additions."

http://archive.org/stream/gpglmodelinterac00bean/gpglmodelinterac00bean_djvu.txt -- "GPGL: A Model Interactive, General Purpose Graphic Language ", James Dale Beans, Thesis, Naval Postgraduate School, December, 1971.

"More recently, graphic languages have also used metacompilers, compilers, interpreters, and subroutine calls."

http://archive.org/stream/graftrangraphice00elki/graftrangraphice00elki_djvu.txt -- GRAPHTRAN: Graphic Extensions to Fortran, David R. Elkins, Naval Postgraduate School, December, 1972.

"Kulsrud further proposed that the GPGL compiler be constructed through the use of a meta-compiler in order that it may be implemented on various hardware configurations"

http://onfoodandcoding.blogspot.com/2013/02/meta-ii-early-meta-compiler.html

"So now we can answer the question: "What's a meta-compiler?" A meta-compiler is a compiler-compiler written in its own source language. So when the meta-compiler reads its own description, it generates exactly its own object code, a fixed-point if you want to think of it that way. It's a neat trick, and clearly an easier way to build a compiler-compiler"

And as I've related elsewhere, I know of other implementations of metacompilers in private industry that are based on completely different approaches from the recursive-descent parsers of the Meta family of compilers.

"Metacompiler" is an accepted term of art in Computer Science and compiler technology, independent from the work of Schorre et al.

As a matter of fact, I believe Schorre et al named their first metacompiler "Meta" in the same way Microsoft named their word processor "Word". But theirs is not the only approach and is not the defining technology, although it was an important contribution.

Damon Simms (talk) 03:21, 8 October 2014 (UTC)

Yes, you have answered the question. But you're not going to like it:
Dr. Neighbors' bayfronttechnologies site is all about META II. I conversed with him. He is really into data mining.
http://dictionary.reference.com/browse/cwic -- "One of the early metacompilers."
I already used CWIC: Schorre's META compiler combined with Erwin Book's LISP 2
The Forth compiler link leads to an interesting statement:
MetaStic is the reimplementation of Elastic (which was written in assembly language) as metacompiler.
GEMS (Graphical Experimental Meta System) [Ref. 12] developed at the Stanford Linear Accelerator Center is a system which facilitates economical experimentation with graphical systems with a linguistic base and provides device independence. A graphical system defined utilizing GEMS can function interactively or in slave mode. Also, the capability exists to create a system which allows for recognition and/or generation of pictures. A graphical system is implemented by defining its components utilizing a simple precedence translator writing system. GEMS is implemented on an IBM 360/91 in PL/1 as three language preprocessors using SIMPLE [Ref. 13] and a comprehensive procedure library for accessing data structures.
written in PL/1 ???
The two references that are not to a Schorre compiler are not written in themselves.
So we are in agreement then. A metacompiler can be written in a different language than itself. It doesn't have to be able to compile itself. Those were your links.
Check this: the article says a metalanguage is a language used to make statements about statements in another language. Follow the metalanguage link. I do not think that is correct. A metalanguage can certainly make statements about itself. From everything I have ever seen, a metacompiler takes as input a metalanguage. What else could it take? But doesn't that mean one article must be wrong?
You have given no reference that claims a metacompiler compiles itself, except for one that also says it was a reimplementation of one written in assembly language.
Score 2 for not self compiling and 0 for self compiling.

Damon, you have no valid references proving that "The feature that sets a metacompiler apart from a standard compiler-compiler is that a metacompiler is written in its own language and translates itself." is the so-called defining feature that sets a metacompiler apart from a standard compiler-compiler. In fact you have given two that say otherwise. The score is still 2 to nothing, my favor.

--Steamerandy (talk) 20:06, 8 October 2014 (UTC)

--Steamerandy (talk) 08:37, 8 October 2014 (UTC)

The original meaning of Metaphysics was misunderstood

In origin Metaphysics was just the title of one of the principal works of Aristotle; it was so named (by Andronicus of Rhodes) simply because in the customary ordering of the works of Aristotle it was the book following Physics; it thus meant nothing more than "[the book that comes] after [the book entitled] Physics"

Stop misconstruing the meaning of meta as commonly used in computer science and science.

Metalanguage language used to talk about language
Metatheory theories about theories
Metalogic study of the properties of logical systems

So in what way does metacompiler fit in? It doesn't. It is a contraction of "metalanguage compiler": a compiler that compiles a metalanguage. — --Steamerandy (talk) 19:52, 15 October 2014 (UTC)

Response 2

You're wrong again. And what about the Forth reference? You seem to have ignored it.

http://www.forthfreak.net/index.cgi?MetaCompiler

"meta: (greek) A prefix, meaning 'one level of description higher'"
"A metacompiler is a compiler which processes its own source code, resulting in an executable version of itself. Many Forth-systems have been written as metacompilers."

The Forth people implemented Forth as a metacompiler because "metacompiler" is a general concept in computer science which they deemed useful and could apply. Forth, because of its metacompiler, is easy to bootstrap onto new hardware. But being a metacompiler was not Forth's original purpose. Forth was invented by an astronomer for use as a control language on process control computers used to control telescopes, where the ability to bootstrap a usable language onto new hardware was an advantage, because new hardware may not come with any usable software and you have to write your own.

As for MetaStic, you obviously didn't read carefully. Elastic was originally written in assembly. MetaStic is a re-implementation of Elastic via the metacompiler/Forth route. MetaStic bootstraps itself into the desired target using Forth as the implementation language (because it's good for making metacompilers), through the usual recursive process:

"It differs from other Forth metacompilers in that respect that it does not implement a Forth system, which can be extended to become the meta compiler. It rather implements a meta compiler, which can be extended to become Forth. The minimal system (the kernel) can metacompile without any additions."
"be aware that the resulting executable can't be used for much, except for compiling its own source, until stages 2..4 become available."
That is, it only compiles itself... until you use it to compile stronger and stronger versions of itself to the point that it can be used to create compilers for other languages -- this is one of the defining features of a metacompiler. It processes itself through successive levels of bootstrapping into becoming more powerful systems.

And for some reason you've locked onto PEG (parsing expression grammars), which, if you check the PEG Talk page, you will see is a fairly recent invention (some would say re-invention) and is in some dispute. From what I read there, many think it's stupid. It seems to be a limited redefinition of theory and capabilities we've had for many years. Some think it's an unnecessary rehash. So why are you pushing this stupid new formalism?

And I really don't think Dr. Neighbors is into data mining. He's the creator of Domain Analysis and the concept of domain-specific languages. The metadefinitional form of a metacompiler, like Meta-II, is itself a domain-specific language, a processable language for describing translator writing systems (compilers). He uses Meta-II as his metacompiler base, on which he builds stronger versions of metacompilers for doing all kinds of things, including processing domain-specific languages. The concept of "metacompiler" is an important adjunct to his work, not solely the Schorre family of Meta metacompilers, but metacompilers used to parse and generate a wide range of systems, both analytical and generative.

In graduate school I took a class in Hardware Description Languages in 1978, and we studied metacompilers as a way to bootstrap up new compilers for hardware description languages for new hardware. This was related to the VHSIC project for computer-aided chip design. The idea was to describe your chip functionality and interactions using a descriptive language which could then be compiled into a full-blown layout design.

Also, the reference you make to the GEMS system is nonsensical and misleading. I didn't say it was a metacompiler or implemented with a metacompiler. I was giving references in the academic literature where they refer to the use of metacompilers, among other tools, for processing graphics languages, which were kind of a big thing back in the late 1960's early 70's. As it said: "More recently, graphic languages have also used metacompilers, compilers, interpreters, and subroutine calls." Showing that the academic community is aware of metacompilers as one of a number of tools for computer language processing. By the way, computer graphics is another of my areas of interest, so I have tried to keep up with all that's going on there.

At some point I may teach Metacompilers as a class at my local Community Colleges. Sure, I'll teach the Meta family of metacompilers -- my own metacompiler is based on derivatives of that work. But there are other alternatives out there, and they should be explored too. What originally appeared on this Wikipedia page would make a good starting point for a class outline. But not if you muck it up with a bunch of BS about PEG's and other stuff that is unrelated, or too specific to a project that died 50 years ago. What about a metacompiler today based totally on objects? What about metacompilers solely based on macro expansion in assembly language, or based on GPM?

You say you don't want to see the work of Meta, TreeMeta, and CWIC be forgotten. I agree, it's not taught as much as it should be, hardly at all. Neither are metacompilers in general. I suspect because they just make the job of writing compilers so much easier. Instead, compiler classes focus on parsers and parser generation, hardly ever on semantic translation, because going on and on about parsers has lots of literature you can refer to, and you can always publish another paper about some new parser algorithm or technique (like PEG perhaps). I don't want to see Schorre's work forgotten, but I also don't want to see the other metacompilers forgotten and lost in the laziness of academia. By the way, have you noticed that? It's a problem in Artificial Intelligence too. Once a problem is solved, academia moves on. Why? I suspect because you can't get papers published (and thus tenure) by investigating and writing about solved problems. Metacompilers are a solution. Plus there's that whole thing about all TreeMeta descendants becoming classified/secret technology.

Metacompilers are a general concept in Computer Science. You seem intent on making it just about your little pet obsession. Why? What are you getting out of this? I get it, you think I'm wrong and don't know what I'm talking about. But why do you want to deny others the recognition they deserve? Many people have worked at developing metacompiler technology as a wonderful hammer for a lot of different nails. Have you done any work in this area in recent years? Or are you just resting on your laurels and ranting and trying to get people to acknowledge what a smart boy you are, based on stuff you did 40 years ago?

You want to be a player? Then get off your ass and do something. Make your cc compiler-compiler. I would really like to see that.

I'll even give you any support I can. Do you currently have a working metacompiler? You know you can use the metacompiler in Dr. Neighbors' tutorial as a starting place to bootstrap your own if you don't have one. Here's a link you should check out:

http://onfoodandcoding.blogspot.com/2013/02/meta-ii-early-meta-compiler.html

Now begone. Go and do good work.

Damon Simms (talk) 10:34, 8 October 2014 (UTC)

response

Thank you very much, but we have not reached a resolution.

First: the definitions of meta that more closely apply from an online dictionary:

a prefix added to the name of a subject and designating another subject that analyzes the original one but at a more abstract, higher level: metaphilosophy; metalinguistics.

a prefix added to the name of something that consciously references or comments upon its own subject or features: a meta-painting of an artist painting a canvas.

In computer science we have:

Metadata, meaning data about data.

A metalanguage is language or symbols used when language itself is being discussed or examined.[1] In logic and linguistics, a metalanguage is a language used to make statements about statements in another language (the object language).

I think it should not exclude itself. But as it stands, it does. And it excludes a metacompiler taking a metalanguage description of itself and compiling itself. I do not exclude a metacompiler compiling itself. I just do not think that is such a big deal.

I think the defining difference is that a metacompiler produces a compiler completely from a metalanguage source. I cannot speak for other compiler building tools, but the few I have looked at do not do much in the way of code generation, and usually do not generate a parser but a table that is used by a table-driven parser. In the Schorre line of metacompilers you didn't have very powerful code generators in the early ones. TREEMETA added the tree building and unparse rules that allowed some local optimization. TREEMETA could recognize tree patterns, x=x+1 in tree form STORE[x,ADD[x,1]], and generate special-case code. CWIC was a giant leap from TREEMETA. However, CWIC lost generality. The previous ones output text; CWIC output binary object code directly. It had very little in the way of text output capabilities.
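To make the unparse-rule idea concrete, here is a rough Python sketch, not TREEMETA or CWIC code; the tree representation, function names, and instruction mnemonics are all my own invented illustrations. It shows how a generator can special-case the tree for x = x + 1 while falling through to a general case for other trees:

```python
# Sketch (in Python, not TREEMETA) of unparse-rule-style tree dispatch.
# Trees are nested tuples: x = x + 1 parses to ('STORE', 'x', ('ADD', 'x', 1)).
# Mnemonics (INC, LD, ST, ADD, SUB) are illustrative only.

def gen_store(tree):
    """Unparse STORE[var, expr]; special-case STORE[x, ADD[x, 1]]."""
    op, var, expr = tree
    assert op == 'STORE'
    # Pattern match the special case: x = x + 1 -> one increment instruction.
    if isinstance(expr, tuple) and expr[0] == 'ADD' \
            and expr[1] == var and expr[2] == 1:
        return ['INC ' + var]
    # General case: evaluate the expression, then store the result.
    return gen_expr(expr) + ['ST ' + var]

def gen_expr(expr):
    """Unparse an expression node into (illustrative) stack code."""
    if not isinstance(expr, tuple):              # atomic operand
        return ['LD ' + str(expr)]
    op, left, right = expr
    mnemonic = {'ADD': 'ADD', 'SUB': 'SUB'}[op]
    return gen_expr(left) + gen_expr(right) + [mnemonic]

print(gen_store(('STORE', 'x', ('ADD', 'x', 1))))    # special case: ['INC x']
print(gen_store(('STORE', 'y', ('ADD', 'a', 'b'))))  # general case
```

The point the sketch tries to capture is that the unparse rule crawls the tree and chooses code by shape, rather than emitting the same template for every store.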

I am not arguing for this to be about Schorre's work. I asked if that were the case because that was all that was referenced. Your reference to the grammars used by metacompilers is to a generative grammar, which excludes Schorre's metacompilers. I do not think Ford's paper does anything except give an old idea a name. Maybe not even that, since we already have analytical grammars described. Really, none of the named parsing algorithms define the Schorre META grammars. A Schorre grammar is not recursive unless the language has recursive constructs such as parenthesized expressions or is block structured. Their grammar input is a straightforward set of logical relations expressed as equations. It is a top-down definition of a language. Anyway, I accept that this is not about just Schorre's work. But it should not exclude it by defining a metacompiler's grammar as being a generative grammar. This should be changed:

A metacompiler is defined by a set of grammar productions

Or maybe formal grammar needs to be changed. Whatever. Why is a metacompiler restricted to a generative grammar when META II, TREEMETA and CWIC do not use one?

Analytic grammars

Though there is a tremendous body of literature on parsing algorithms, most of these algorithms assume that the language to be parsed is initially described by means of a generative formal grammar, and that the goal is to transform this generative grammar into a working parser. Strictly speaking, a generative grammar does not in any way correspond to the algorithm used to parse a language, and various algorithms have different restrictions on the form of production rules that are considered well-formed.

An alternative approach is to formalize the language in terms of an analytic grammar in the first place, which more directly corresponds to the structure and semantics of a parser for the language. Examples of analytic grammar formalisms include the following:

Maybe this is more acceptable than PEGs. My point is that they are not using a generative grammar.
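The distinction can be made concrete with a toy Python sketch of my own (the grammar S -> aS | a and both function names are invented illustrations, not taken from any of these compilers). A generative grammar's productions rewrite a start symbol outward into the strings of the language; an analytic rule instead tests an input string directly:

```python
# Toy illustration of generative vs. analytic grammars.
# Generative: rewrite the start symbol S outward into strings.
# Grammar (illustrative): S -> 'a' S | 'a'   (the language a, aa, aaa, ...)

def generate(depth):
    """Enumerate the strings the grammar derives within `depth` rewrites."""
    strings, frontier = [], ['S']
    for _ in range(depth):
        next_forms = []
        for form in frontier:
            next_forms.append(form.replace('S', 'aS', 1))  # apply S -> a S
            strings.append(form.replace('S', 'a', 1))      # apply S -> a
        frontier = next_forms
    return strings

# Analytic: a rule that directly tests whether an input is in the language.
def accepts(s):
    return len(s) >= 1 and set(s) == {'a'}

print(generate(3))                       # ['a', 'aa', 'aaa']
print(accepts('aa'), accepts('ab'))      # True False
```

Same language both ways, but only the analytic form corresponds directly to a parser, which is the point being argued about the Schorre compilers.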

--Steamerandy (talk) 22:17, 8 October 2014 (UTC)

Coda

You said:

"[This article] has conflicting description with the well known metacompilers and would exclude them as they are analytical parser languages not using a generative parser as stated in the article. All references given are to the analytical parsers, None of the metacompilers talked about here have a generative syntax description language."

I don't see how I exclude them, if you're referring to the Meta family of metacompilers. I was the one who first added the Meta-II and TreeMeta links as references. I was the one who added the link to Dr. Neighbors' thesis. I was the one who added the historical context, citing the work of Schorre et al in this field. I was the one who first discussed the descendants of TreeMeta, like CWIC, being removed from access due to being made classified for security reasons (which was later mysteriously removed from the article). And I first excluded the work on Forth because, frankly, they can be demented zealots about their parochial view of things, not unlike yourself, and they might want to exclude the work of others, just as you are trying to do.

Is there some metacompiler technology or system we're missing here? Please, provide references or links. Or any kind of information. As far as I can tell, you're the one who keeps wanting to exclude information, until it's just about your own little kingdom: the "Meta compilers", referring to a research project 50 years ago, as opposed to the all-embracing concept of "metacompilers", which hopefully will continue to bloom. You would exclude some very interesting work that others have done. Shame on you.

I have tried to make this description of metacompilers as broad as possible, because it is a general concept in Computer Science and I want to be able to embrace all implementations of metacompilers. You seem to want to restrict this to just your pet area of expertise. You keep interpreting everything I have written through the tunnel vision view of your own past experience, which though vast, may only be half-vast.

Or perhaps you are trying to get recognition for compiler technology that you really like but isn't really metacompiler technology. What do I have to do? Call up Schorre and get him to defend the concept here? Call up Neighbors and get him to defend the general concept of metacompilers? Call up SDC to see if they agree with what is defined here?

You seem intent on imposing your own narrow-minded view of what is said here.

I say put up or shut up. Produce a metacompiler. Then I'll believe you. C'mon. Make that cc puppy come alive.

Damon Simms (talk) 13:04, 8 October 2014 (UTC)

Coda 2

You also said:

"A metacompiler is defined by a set of grammar productions defining itself" defines a meta compiler as using a generative grammar. A great many meta compilers use analytical grammars.

Really? Are you arguing just for the sake of argument? You seem to be misinterpreting everything I say, and ignoring what doesn't fit your neat little world view.

I don't care if a metacompiler generates code by squirting out rat poop, I don't even care if it uses PEG or Regex or APL for its parsing strategy. If it's really a metacompiler, I'll embrace it.

It almost seems like you're just trying to prove me wrong on something, anything. You keep making up crap to try to show that I don't know what I'm talking about. You keep dragging in extraneous crap to prove your world view is the right and correct one. I don't care. What pisses me off though is that you're willing to discard the hard work others have done on metacompilers just to prove yourself right. Shame on you.

Believe me, I'm ignorant on a lot of topics. I make mistakes in what I say and write, and Computer Science formal theory is one of my weak spots. But I understand enough formal theory to get the implications. And I work at spreading the use and deployment of metacompilers because I believe they are a great technology and a great solution. I'm not trying to boost my ego, as I suspect is your motive. I want the whole world using metacompilers. I don't give a rat's ass if it's Schorre Meta, or Forth, or ObjCC. But limiting what "metacompiler" means to some long dead project seems counter to that goal.

Please stop trying to limit this topic to your own prejudiced view.

And wrong. Your wrong view. Most everything I've seen you write on here is garbled, a mish-mash, full of spelling and grammar errors, incomplete thoughts and sentences. Your examples are often full of errors. I also think you are confusing topics, mixing in extraneous crap which adds nothing to the discussion. I think it's an indication you're here just to promote yourself. You did big things a long time ago related to a long dead project. Big whoop. You seem more interested in generating heat than light. You seem to think it's ok to just come in and start insulting people. You have a high IQ (so you say) and you seem to think that gives you the right to impose your view on others, even when they point out that you make no sense, or are wrong. When shown the truth, or references that prove you wrong, you change your story and your complaints. What gives? Are you all talk and no walk? I've met people like you, hell I've worked with people like you, all wide-eyed theory and BS and tales of past glory. Mostly babbling nonsense or regurgitated pablum. But when the rubber hits the road, they don't deliver, they can't deliver.

Make that cc compiler-compiler, prove me wrong.

Damon Simms (talk) 13:43, 8 October 2014 (UTC)

Coda 2 response 1
"A metacompiler is defined by a set of grammar productions defining itself" defines a meta compiler as using a generative grammar. A great many meta compilers use analytical grammars
Really? Are you arguing just for the sake of argument? You seem to be misinterpreting everything I say, and ignoring what doesn't fit your neat little world view.
I don't care if a metacompiler generates code by squirting out rat poop, I don't even care if it uses PEG or Regex or APL for its parsing strategy. If it's really a metacompiler, I'll embrace it.

That's just showing your ignorance. I wasn't talking about code production. Read very carefully before making a dumb response. OK?

Specifically I am talking about this grammar link. Follow it and read the full description. I would have thought it obvious that Schorre's compilers use analytic grammars. Here is the line again:

A metacompiler is defined by a set of grammar production rules

AGAIN -- Follow the formal grammar link. But here it is. I copied it here for all to see.

In formal language theory, a grammar (when the context is not given, often called a formal grammar for clarity) is a set of production rules for strings in a formal language. The rules describe how to form strings from the language's alphabet that are valid according to the language's syntax.

AGAIN: "The rules describe how to form strings from the language's alphabet that are valid according to the language's syntax."

You see it yet? That line excludes any metacompiler not using a generative grammar. That means META II, TREEMETA, etc. do not fit this mold.

All of Schorre's metacompilers, including CWIC, use analytical grammars. This subject contradicts itself, either by including Schorre's works or by restricting metacompilers to generative grammars.

I think this is more of a problem with the grammar article. But this is all a part of computer science and needs cleaning up. It should not have to be fixed here. It would take explaining that both are viable.

OK. So I am nitpicking. The damn "." is out of place in the program.

Further down the grammar topic page you find "Analytic grammars"

Though there is a tremendous body of literature on parsing algorithms, most of these algorithms assume that the language to be parsed is initially described by means of a generative formal grammar, and that the goal is to transform this generative grammar into a working parser. Strictly speaking, a generative grammar does not in any way correspond to the algorithm used to parse a language, and various algorithms have different restrictions on the form of production rules that are considered well-formed.
An alternative approach is to formalize the language in terms of an analytic grammar in the first place, which more directly corresponds to the structure and semantics of a parser for the language. Examples of analytic grammar formalisms include the following:

I favor the analytic grammar side. It is a top-down approach.

We start at the top:

program = $statements;

A program is made up of a series of statements. Simple, easy to read and understand. $ means 0 or more of the following. It might also be written as statements*

statements = type1
           | type2
           | type3
           | type4;

So now the language is made up of four types of statements.

Take the syntax for an arithmetic expression:

In BNF, a generative production language:

<expr> ::= <term> | <expr> <addop> <term>

The analytical rule:

expr = term (('+'|'-') expr| --);

The -- means look for nothing; we are successful. So an expr is a term followed by a + or -, or by nothing. If we do find a + or -, then we need to look for another expr.

The top down design approach has been the most successful approach to systems design.
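The analytic rule maps almost line for line onto a recursive-descent parser. Here is a minimal Python sketch of my own (not output from any Schorre compiler; terms are simplified to single letters or digits):

```python
# Recursive-descent sketch of:  expr = term (('+'|'-') expr | --);
# Tokens are single characters; a term is a single letter or digit.
# The '| --' alternative is the "match nothing and succeed" branch.

def parse_expr(s, i=0):
    """Parse an expr starting at s[i]; return the index just past it."""
    i = parse_term(s, i)
    if i < len(s) and s[i] in '+-':   # ('+'|'-') expr branch
        return parse_expr(s, i + 1)
    return i                          # -- branch: look for nothing, succeed

def parse_term(s, i):
    if i < len(s) and s[i].isalnum():
        return i + 1
    raise SyntaxError('term expected at position %d' % i)

def is_expr(s):
    """A string is a valid expr iff parsing consumes all of it."""
    try:
        return parse_expr(s) == len(s)
    except SyntaxError:
        return False

print(is_expr('a+b-c'))   # True
print(is_expr('a+'))      # False: '+' must be followed by an expr
```

Each rule in the grammar becomes one function, which is exactly the sense in which the analytic rule directly corresponds to the structure of the parser.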

Believe me, I'm ignorant on a lot of topics. I make mistakes in what I say and write, and Computer Science formal theory is one of my weak spots. But I understand enough formal theory to get the implications. And I work at spreading the use and deployment of metacompilers because I believe they are a great technology and a great solution. I'm not trying to boost my ego, as I suspect is your motive. I want the whole world using metacompilers. I don't give a rat's ass if it's Schorre Meta, or Forth, or ObjCC. But limiting what "metacompiler" means to some long dead project seems counter to that goal.
Please stop trying to limit this topic to your own prejudiced view.

I hope you understand my concern with the grammar link. You need to stop taking this as an assault on yourself. Though it kind of works that way when your prejudiced concept of yourself keeps getting in the way of understanding what I am trying to say. Writing is not one of my strong suits. My thing has always been math, science, and computers. My sophomore year in high school I aced the higher math SAT test, getting 100% of the answers right, off their percentile scale. That is really what got me through high school. I did real well in math, science, electronics and gymnastics. I have a terrible memory. All the subjects that required memorization suffered; history was probably my worst. I retain things I do and the like, but I just cannot read something and remember it. I have to use it to remember.


You seem more interested in generating heat than light. You seem to think it's ok to just come in and start insulting people. You have a high IQ (so you say) and you seem to think that gives you the right to impose your view on others, even when they point out that you make no sense, or are wrong. When shown the truth, or references that prove you wrong, you change your story and your complaints. What gives? Are you all talk and no walk? I've met people like you, hell I've worked with people like you, all wide-eyed theory and BS and tales of past glory. Mostly babbling nonsense or regurgitated pablum. But when the rubber hits the road, they don't deliver, they can't deliver.

Do you not know that heat and light are the same thing? Photons (electromagnetic waves).

I did not say anything insulting until you started it. The post history does not lie. You called me a dumbshit.

When shown the truth, or references that prove you wrong, you change your story and your complaints.

That is not the case. I specifically asked what compilers were being talked about here and got no answer. So when I proposed to change the article toward what appeared in the topic and in references you posted, you called me a dumbshit.

The two complaints I still have I always had. I conceded that there are metacompilers other than the Schorre types. But as yet you have not come up with one valid metacompiler that disproves my point that they do not necessarily compile themselves.

And when I point out that your own references contradict your position, you come back like a blowfish, all puffed up, trying to cut me down. I wasn't the normal nerd in school, as the bullies found out.

--Steamerandy (talk) 03:05, 10 October 2014 (UTC)
