From Wikipedia, the free encyclopedia

contact: mdunlavey AT

Quick bio: PhD in Information and Computer Science, Georgia Tech, '77. Thesis in machine vision, done off-site at the M.I.T. A.I. Lab. Taught C.S. at Boston College, 1980–84. I have done numerous consulting assignments over the years and helped start up many companies, including Bachman Information Systems. Employed for over 10 years at Pharsight Corp., building software products for data analysis in the pharmaceutical industry.

I live in Needham, Mass., with my wife Mary. Four kids - Brian in Safford, AZ, Elizabeth Cashman in Madison, CT, Robby and Michael (adopted from Korea) still living at home.

Early on, I got interested in the intersection of formal computer science and information theory. The business of software has a lot of folklore, and I wanted to find out which parts of it had good scientific grounding, which parts were not well founded, and where the holes were in which new ideas could spring up. This led to my book and a number of articles.

  • Dunlavey, “Lightweight Coding of Structurally Varying Dialogs”, in T.C.N. Graham and P. Palanque (Eds.): DSVIS 2008, LNCS 5136, pp. 149–154, 2008.
  • Dunlavey, “Performance tuning with instruction-level cost derived from call-stack sampling”, ACM SIGPLAN Notices 42(8), August 2007, pp. 4–8.
  • Dunlavey, “Simulation of finite state machines in a quantum computer”, arXiv:quant-ph/9807026v1, July 1998.
  • Dunlavey, “Performance Tuning: Slugging It Out!”, Dr. Dobb’s Journal 18(12), November 1993, pp. 18–26.
  • Dunlavey, “Differential Evaluation: a Cache-based Technique for Incremental Update of Graphical Displays of Structures”, Software Practice and Experience 23(8):871–893, August 1993.


Topics:
  • Linguistic Method
  • Differential Execution
  • Performance analysis

Subpages:
  • /Difex Article
  • /Linguistic Method Article


Proof of Differential Execution:

Correctness of Differential Execution depends on a synchronization property. Consider the contents-defining procedure as a sequence of statements. For simplicity, each statement is one of: a simple statement S, which reads one value and writes one value; an IF(test) body ENDIF statement; or a call to any procedure that follows the same rules.

Procedure calls need not be considered separately, because any successful execution terminates, and in any such execution the calls can be "flattened" by expanding them in place far enough that no pass in the finite series of passes performs a procedure call.

Consider any two consecutive execution passes, called the first and the second. For any consecutive sequence of statements s, we must show Sync(s): the number of values written during s on the first pass equals the number of values read during s on the second pass. This in turn depends on the sequence of statements leading up to s also being synchronized.

As the procedure executes, there are two global flags: W, meaning “is writing”, and R, meaning “is reading”. Statement S reads a value if R is true, and writes a value (any value) if W is true. Statement IF(test) reads a boolean (0/1) value test′ if R is true, and writes the value of test if W is true. It then sets R = (R ∧ test′) and W = (W ∧ test). If (R ∨ W) is true, it executes the body; otherwise it skips it. The ENDIF statement restores the prior values of R and W.
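The flag semantics above can be sketched in Python. This is a minimal illustration, assuming a list plays the role of the sequential file; the names DE, s, and if_ are my own shorthand for this sketch, not from the articles:

```python
class DE:
    """One differential-execution pass over a sequential file of values.

    'old' is the sequence written on the previous pass (None on the
    first pass); 'new' collects the values written on this pass."""
    def __init__(self, old=None):
        self.old = old if old is not None else []
        self.new = []                 # values written on this pass
        self.pos = 0                  # read position within self.old
        self.R = old is not None      # "is reading": a previous pass exists
        self.W = True                 # "is writing"

    def _read(self):
        v = self.old[self.pos]
        self.pos += 1
        return v

    def s(self, value):
        """Simple statement S: reads one value if R, writes one value if W.
        Returns the previous-pass value (or None), so a caller can
        compare old and new contents."""
        prev = self._read() if self.R else None
        if self.W:
            self.new.append(value)
        return prev

    def if_(self, test, body):
        """IF(test) body ENDIF: reads test' if R, writes test if W,
        then runs body under R∧test' and W∧test; ENDIF restores R, W."""
        test_prev = bool(self._read()) if self.R else False
        if self.W:
            self.new.append(bool(test))
        R0, W0 = self.R, self.W
        self.R, self.W = R0 and test_prev, W0 and bool(test)
        if self.R or self.W:
            body()
        self.R, self.W = R0, W0       # ENDIF restores R and W


# Two consecutive passes of a tiny contents-defining procedure:
def contents(de, n):
    de.if_(n > 0, lambda: de.s(n))

p1 = DE()                 # first pass: R is false, so writes only
contents(p1, 5)           # p1.new == [True, 5]
p2 = DE(p1.new)           # second pass reads what the first wrote
contents(p2, 0)           # p2.new == [False]; p2.pos == 2
```

On the first pass there is no old file, so R is false throughout and only writes occur; on each later pass the old file is read back in step with the new one being written.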

Proof is by induction on procedure length.

 w′(s) = the number of values written prior to sequence s on the first pass.
 r′(s) = the number of values read prior to sequence s on the second pass.
 Sync′(s) ≡ (w′(s) = r′(s))
 w(s) = the number of values written during sequence s on the first pass.
 r(s) = the number of values read during sequence s on the second pass.
 Sync(s) ≡ Sync′(s) ⇒ (w(s) = r(s))
 w(S) = r(S) = 1
 w(IF(test) s2 ENDIF) = 1 + test × w(s2)
 r(IF(test) s2 ENDIF) = 1 + test′ × r(s2)
where test′ is the boolean value read by the IF statement on the second pass, and test and test′ are treated as 0 or 1 in arithmetic.

Prove: ∀(P)Sync(P)

Base case: Show Sync(P) where P = Λ, the empty sequence
 w′(P) = r′(P) = w(P) = r(P) = 0
 therefore Sync′(P) ∧ Sync(P)

Inductive case: Show ∀(s1,s2)(Sync(s1) ∧ Sync(s2)) ⇒ Sync(P)
where (P = s1; S) ∨ (P = s1; IF(test) s2 ENDIF)
 Assume (Sync(s1) ∧ Sync(s2))
 Show Sync(P)

 w′(P) = w′(s1) = 0; r′(P) = r′(s1) = 0;
 therefore Sync′(s1) ∧ Sync′(P)

Case P = s1; S
 w(P) = w′(s1) + w(s1) + 1
 r(P) = r′(s1) + r(s1) + 1
 Is w(P) = r(P) ?
 Is w′(s1) + w(s1) + 1 = r′(s1) + r(s1) + 1 ?
 Is w(s1) + 1 = r(s1) + 1 ? because Sync′(s1)
 Yes, because Sync′(s1) ∧ Sync(s1)
 Sync(P) because Sync′(P) ∧ (w(P) = r(P))

Case P = s1; IF(test) s2 ENDIF
 test′ = test, because Sync′(s1) ∧ Sync(s1) imply that the read position on the second pass has advanced exactly as far as the write position had on the first pass, so the IF reads back the very test value that was written there; this is simply the behavior of sequential files.
(This is the cornerstone of the proof.)
 Case test = test′ = false
  w(P) = w′(s1) + w(s1) + 1 because test = false
  r(P) = r′(s1) + r(s1) + 1 because test′ = false
  Sync(P) by same reasoning as case above
 Case test = test′ = true
  w′(s2) = w′(s1) + w(s1) + 1
  r′(s2) = r′(s1) + r(s1) + 1
  w′(s2) = r′(s2) i.e. Sync′(s2) because Sync′(s1) ∧ Sync(s1)
  w(P) = w′(s1) + w(s1) + 1 + w(s2) because test = true
  r(P) = r′(s1) + r(s1) + 1 + r(s2) because test′ = true
  Is w(P) = r(P) ?
  Is w′(s1) + w(s1) + 1 + w(s2) = r′(s1) + r(s1) + 1 + r(s2) ?
  Is w(s1) + 1 + w(s2) = r(s1) + 1 + r(s2) ? because Sync′(s1)
  Is 1 + w(s2) = 1 + r(s2) ? because Sync′(s1) ∧ Sync(s1)
  Is w(s2) = r(s2) ?
  Yes, because Sync′(s2) ∧ Sync(s2)
  Sync(P) because Sync′(P) ∧ (w(P) = r(P))
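The property the proof establishes can also be checked empirically. Below is a hypothetical brute-force test in Python (the representation of programs as nested lists and the helper names run and rand_prog are mine, for this sketch only): it runs randomly generated nested-IF programs for two passes, with the tests free to change between passes, and confirms that the reads of the second pass exactly consume the writes of the first.

```python
import random

def run(prog, old, env):
    """One execution pass. 'prog' is a list of "S" and ("IF", i, body)
    statements; 'old' is the value sequence written by the previous pass
    (None on the first pass); env[i] supplies each test value.
    Returns (values written, count of values read)."""
    new, st = [], {"pos": 0, "R": old is not None, "W": True}
    def read():
        v = old[st["pos"]]
        st["pos"] += 1
        return v
    def exec_seq(seq):
        for stmt in seq:
            if stmt == "S":
                if st["R"]:
                    read()            # S reads one value if R
                if st["W"]:
                    new.append(0)     # S writes one value if W
            else:
                _, i, body = stmt
                test = env[i]
                tprev = bool(read()) if st["R"] else False
                if st["W"]:
                    new.append(test)  # IF writes its test value if W
                R0, W0 = st["R"], st["W"]
                st["R"], st["W"] = R0 and tprev, W0 and test
                if st["R"] or st["W"]:
                    exec_seq(body)
                st["R"], st["W"] = R0, W0   # ENDIF restores R and W
    exec_seq(prog)
    return new, st["pos"]

def rand_prog(depth, rng):
    """Random sequence of S and nested IF statements."""
    return [("IF", rng.randrange(4), rand_prog(depth - 1, rng))
            if depth > 0 and rng.random() < 0.5 else "S"
            for _ in range(rng.randint(1, 3))]

rng = random.Random(0)
for _ in range(500):
    prog = rand_prog(3, rng)
    env1 = [rng.random() < 0.5 for _ in range(4)]   # tests on pass 1
    env2 = [rng.random() < 0.5 for _ in range(4)]   # tests on pass 2
    written, _ = run(prog, None, env1)    # first pass
    _, reads = run(prog, written, env2)   # second pass
    assert reads == len(written)          # Sync(P)
```

Because test′ always equals the test value written at the same position on the previous pass (the cornerstone above), R on the second pass retraces W on the first, so the counts agree no matter how the tests change between passes.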