|This is a Wikipedia user page.
This is not an encyclopedia article. If you find this page on any site other than Wikipedia, you are viewing a mirror site. Be aware that the page may be outdated and that the user to whom this page belongs may have no personal affiliation with any site other than Wikipedia itself.
contact: mdunlavey AT pharsight.com
Quick bio: PhD in Information and Computer Science, Georgia Tech, '77. Thesis in machine vision, done off-site at the M.I.T. A.I. Lab. Taught C.S. at Boston College, 1980-84. Have done numerous consulting assignments over the years, and helped start up many companies, including Bachman Information Systems. Employed for over 10 years at Pharsight Corp., building software products for data analysis in the pharmaceutical industry.
I live in Needham, Mass., with my wife Mary. Four kids - Brian in Safford, AZ, Elizabeth Cashman in Madison, CT, Robby and Michael (adopted from Korea) still living at home.
Early on, I got interested in the intersection between formal computer science and information theory. This business of software has a lot of folklore, and I wanted to find out which parts of it had good scientific reasons, which were not well-founded, and where were the holes where new ideas could spring up. This led to my book and a number of articles.
- Dunlavey, "Lightweight Coding of Structurally Varying Dialogs", T.C.N. Graham and P. Palanque (Eds.): DSVIS 2008, LNCS 5136, pp. 149-154, 2008.
- Dunlavey, "Performance tuning with instruction-level cost derived from call-stack sampling", ACM SIGPLAN Notices 42(8), August 2007, pp. 4-8.
- Dunlavey, "Simulation of finite state machines in a quantum computer", arXiv:quant-ph/9807026v1, July 1998.
- Dunlavey, "Building Better Applications: a Theory of Efficient Software Development", International Thomson Publishing, ISBN 0-442-01740-5, 1994.
- Dunlavey, "Performance Tuning: Slugging It Out!", Dr. Dobb's Journal 18(12), November 1993, pp. 18-26.
- Dunlavey, "Differential Evaluation: a Cache-based Technique for Incremental Update of Graphical Displays of Structures", Software: Practice and Experience 23(8):871-893, August 1993.
Proof of Differential Execution:
Correctness of Differential Execution depends on a synchronization property. Consider the contents-defining procedure as a sequence of statements. For simplicity, each statement is one of: a simple statement S, which reads and writes one value; an IF(test) body ENDIF statement; or a call to a procedure that follows the same rules.
Procedure calls need not be considered further: any successful execution terminates, so in any finite series of passes the calls can be "flattened" by expanding them in place far enough that no pass ever performs a procedure call.
Consider any two consecutive execution passes, called the first and the second. For any consecutive sequence of statements s, we need to show Sync(s): that the number of values written during s on the first pass equals the number read during s on the second pass. This depends on the sequence of statements leading up to s (call it s′) also being synchronized.
As the procedure is executing, there are two global flags, W meaning “is writing”, and R meaning “is reading”. Statement S reads a value if R is true, and it writes a value (any value) if W is true. Statement IF(test) reads a boolean(0,1) value (test’) if R is true, and it writes the test value if W is true. Then it sets R = (R ∧ test’), and W = (W ∧ test). If (R ∨ W) is true, it executes the body, otherwise it skips it. The ENDIF statement restores the values of R and W.
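As a concrete illustration of these rules, a minimal differential-execution engine can keep the previous pass's history in a queue and maintain the R and W flags exactly as described. The sketch below is illustrative Python, not the author's code; the class and method names are invented here:

```python
from collections import deque

class DE:
    """Minimal sketch of a differential-execution engine.
    `old` holds the history written on the previous pass (the file
    being read); `new` collects the history written on this pass."""
    def __init__(self):
        self.old, self.new = deque(), deque()
        self.R, self.W = False, True       # first pass: write only

    def begin_pass(self):
        # The history written last pass becomes the one read this pass.
        self.old, self.new = self.new, deque()
        self.R, self.W = bool(self.old), True

    def S(self, value):
        """Simple statement: reads one value if R, writes one if W.
        A real engine would update the display when prev != value."""
        prev = self.old.popleft() if self.R else None
        if self.W:
            self.new.append(value)
        return prev

    def IF(self, test):
        """IF(test): read test' if R, write test if W, then mask the
        flags.  Returns (run_body, saved); the caller guards the body
        with run_body = (R or W) and later calls ENDIF(saved)."""
        saved = (self.R, self.W)
        tprime = bool(self.old.popleft()) if self.R else False
        if self.W:
            self.new.append(bool(test))
        self.R, self.W = self.R and tprime, self.W and bool(test)
        return (self.R or self.W), saved

    def ENDIF(self, saved):
        self.R, self.W = saved             # ENDIF restores the flags

# A tiny contents-defining procedure with one conditional line:
de = DE()
def contents(de, show_extra):
    de.S("title")
    run, saved = de.IF(show_extra)
    if run:
        de.S("extra line")
    de.ENDIF(saved)

de.begin_pass(); contents(de, False)   # pass 1: writes only
de.begin_pass(); contents(de, True)    # pass 2: reads pass 1's history
```

Note that when test goes from true to false, W becomes false while R may stay true, so the body is still executed in a read-only "erasing" mode; this is why the body executes whenever R ∨ W holds, not merely when test is true.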
Proof is by induction on procedure length.
1. w′(s) = the number of values written prior to sequence s on the first pass.
2. r′(s) = the number of values read prior to sequence s on the second pass.
3. Sync′(s) ≡ (w′(s) = r′(s)) [I.e. prior to s, the program is synchronized.]
4. w(s) = the number of values written during sequence s on the first pass.
5. r(s) = the number of values read during sequence s on the second pass.
6. Sync(s) ≡ Sync′(s) ⇒ (w(s) = r(s)) [I.e. through s the program is synchronized if its being synchronized prior to s implies s reads and writes the same amount.]
7. w(S) = r(S) = 1 [I.e. a single atomic statement S always reads and writes 1.]
8. w(IF(test) s2 ENDIF) = 1 + test × w(s2) [I.e. an IF statement writes 1 and, optionally, w(s2)]
9. r(IF(test) s2 ENDIF) = 1 + test’ × r(s2) [I.e. an IF statement reads 1 and, optionally, r(s2)].
10. Initial(s) ≡ w′(s) = r′(s) = 0 [I.e. s is Initial if nothing is written or read before it.]
where test′ is the boolean value read by the IF statement on the second pass.
Prove: ∀(P)Initial(P)⇒Sync(P) [I.e. show that all Initial programs have the synchronization property. Show by induction.]
Base case: Show Sync(P) where P = Λ [I.e. show it's true for the empty program.]
w′(P) = r′(P) = w(P) = r(P) = 0 [I.e. the empty program reads and writes nothing.]
therefore Sync′(P) ∧ Sync(P) [By definitions 3 and 6]
Inductive case: Show ∀(s1,s2)(Sync(s1) ∧ Sync(s2)) ⇒ Sync(P)
where (P = s1; S) ∨ (P = s1; IF(test) s2 ENDIF)
[I.e. show it's true for program P if you get P by appending a simple statement S or an IF s2 statement for which it's true to a shorter program s1 for which it's true.]
Assume (Sync(s1) ∧ Sync(s2)) [Make the assumption that it's true for s1 and s2]
Show Sync(P) [and then show it's true for P]
[First of all, since P is a whole program and s1 is its leading portion, the number of reads and writes prior to P and s1 are zero.]
w′(P) = w′(s1) = 0; r′(P) = r′(s1) = 0;
therefore Sync′(s1) ∧ Sync′(P) [By definition 3]
[First consider the simple case, where s1 is extended by a simple statement S:]
Case P = s1; S
w(P) = w′(s1) + w(s1) + 1 [By definitions 1, 4, and 7]
r(P) = r′(s1) + r(s1) + 1 [By definitions 2, 5, and 7]
[Take note of lines below because they will be referred to in the next case]
Is w(P) = r(P) ? [Asking because we want to show Sync(P) by defn. 6 and we've already shown Sync′(P)]
Is w′(s1) + w(s1) + 1 = r′(s1) + r(s1) + 1 ? [Restating the question.]
Is w(s1) + 1 = r(s1) + 1 ? [Because Sync′(s1) and defn. 3]
Yes, because Sync′(s1) ∧ Sync(s1) [By defn. 6]
Sync(P) because Sync′(P) ∧ (w(P) = r(P)) [By defn. 6]
[Now consider the case where s1 is extended by an IF statement whose body s2 has the Sync property.]
Case P = s1; IF(test) s2 ENDIF
test′ = test because Sync′(s1) ∧ Sync(s1) and behavior of sequential files.
[This is the cornerstone of the proof. For a sequential file, the nth byte read equals the nth byte written.]
Case test = test′ = false
w(P) = w′(s1) + w(s1) + 1 because test = false [Defn. 8]
r(P) = r′(s1) + r(s1) + 1 because test′ = false [Defn. 9]
Sync(P) [By same reasoning as case noted above]
Case test = test′ = true
w′(s2) = w′(s1) + w(s1) + 1 [Defn. 8]
r′(s2) = r′(s1) + r(s1) + 1 [Defn. 9]
w′(s2) = r′(s2) [I.e. Sync′(s2) because Sync′(s1) ∧ Sync(s1) (defns. 3 & 6)]
w(P) = w′(s1) + w(s1) + 1 + w(s2) because test = true [Defn. 8]
r(P) = r′(s1) + r(s1) + 1 + r(s2) because test′ = true [Defn. 9]
[Show Sync(P) similar to case noted above except it also removes s2]
Is w(P) = r(P) ? [Ask question needed to show Sync(P)]
Is w′(s1) + w(s1) + 1 + w(s2) = r′(s1) + r(s1) + 1 + r(s2) ? [Restate the above question.]
Is w(s1) + 1 + w(s2) = r(s1) + 1 + r(s2) ? because Sync′(s1) [Defn. 3]
Is 1 + w(s2) = 1 + r(s2) ? because Sync′(s1) ∧ Sync(s1) [Defn. 6]
w(s2) = r(s2) because Sync′(s2) ∧ Sync(s2) [Defn. 6]
w(P) = r(P) [Above question is answered in the affirmative]
Sync(P) because Sync′(P) ∧ (w(P) = r(P)) [Defn. 6]
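The theorem can also be spot-checked empirically. The sketch below (illustrative code, not from the paper) executes a program of nested IFs for several passes with changing test values, counting reads and writes per the rules above; on every pass the number of values read equals the number of values written on the previous pass, as Sync requires:

```python
def run_pass(program, old, env):
    """One execution pass.  `program` is a list of ('S', value) and
    ('IF', test_name, body) statements, `old` is the history written
    on the previous pass, and `env` maps each test name to its current
    value.  Returns (new_history, number_of_values_read)."""
    new, reads = [], [0]

    def seq(stmts, R, W, it):
        for s in stmts:
            if s[0] == 'S':                     # simple statement
                if R:
                    next(it); reads[0] += 1     # read one value
                if W:
                    new.append(s[1])            # write one value
            else:                               # IF(test) body ENDIF
                _, name, body = s
                t = bool(env[name])
                tp = bool(next(it)) if R else False
                if R:
                    reads[0] += 1               # read test'
                if W:
                    new.append(t)               # write test
                if (R and tp) or (W and t):     # execute body if R or W
                    seq(body, R and tp, W and t, it)
                # returning from seq restores R and W, i.e. ENDIF

    seq(program, bool(old), True, iter(old))
    return new, reads[0]

# S; IF(t1) [ S; IF(t2) [ S ] ]; S  -- with tests that flip each pass
prog = [('S', 'a'),
        ('IF', 't1', [('S', 'b'),
                      ('IF', 't2', [('S', 'c')])]),
        ('S', 'd')]

hist = []
for env in [{'t1': True,  't2': False},
            {'t1': False, 't2': True},
            {'t1': True,  't2': True}]:
    new, nread = run_pass(prog, hist, env)
    assert nread == len(hist)   # reads this pass = writes last pass
    hist = new
```

Each pass consumes its predecessor's history exactly, even as the IF bodies appear and disappear, which is the synchronization property the induction establishes.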