Planning and discovery, maybe?
Draft of 2009.02.02 ☛ 2015.03.18 ☛ 2016.07.22
May include: philosophy ↘ engineering ↗ &c.
Still trying to put my finger on something bothering me. Very subjective, no doubt ill-considered… but still there and not quite stated clearly enough.
This is something about business, project management, planning and implementation: About how a certain class of manager views the specification of goals and the assumption that goals met create business value, and how those managers deal with the real people whose work it is to connect the two idealizations (goal, value) to one another by applying their experience, insight, and ability to communicate.
In my experience, software developers are appropriate to the task; “computer programmers” cannot as a rule reliably deliver value from their work.
This is something about pedagogy, graduate training, the Academy and specialization: About how grant applications are written years before monies are acquired; how “real” academic projects are spelled out in grant applications as if foresight were perfect and exploration were rational, while the work is done by substitutable and inexperienced students and young faculty; how “homework” projects and evaluations are treated as if individual people could learn in a vacuum of reading and self-direction and wordy lecture, as if textbooks were helpful without conversation; as if the cost, utility, quality and duration of scholarship were all perfectly fungible with one another, perfectly liquid… subject to insignificant exchange costs not worthy of note.
In my experience, students learn when they work collectively on a shared goal, supporting one another and discovering and sharing their non-overlapping skills along the way: when they “cheat”. “Stars” who cannot explain their work, who cannot collaborate, who disdain “cheating” (by the standards of most modern Honor Pledges and tenure review committees) and instead sit quietly by themselves doing what their massive insight has revealed is the path to what you (mere people) need… these folks cannot as a rule reliably deliver value from their work.
This is something about the theory and practice of artificial intelligence, operations research, machine learning, and metaheuristics: About the unwillingness or inability to treat techniques prescriptively except as a vehicle for promoting one’s own research or personal bias; about the strangely persistent shortfall in communicating the utility of those thousand variant methods, from linear programming to fictitious play to genetic programming to graphical model learning, any one of which might potentially answer questions, identify patterns, and help people invent software or physical engineering designs; about a culture of “practitioners” who cannot be bothered to learn enough theory to explain why their approach is sufficient for their particular tasks, and a separate culture of “theorists” who cannot be bothered to learn enough of best practice to explain why their approach is necessary for any task.
In my experience, the average time an algorithm is expected to run may be of interest, but as far as my particular problem is concerned it has no bearing until I have run it for a while: until I’ve seen some results, seen how it’s going, sussed out what “kind” of problem this specific instance is, and learned whether knowing “how long” it will take is worth anything next to seeing any answer at all. I do work, I create stuff, to better understand the path from idealized goal to realized value. Speed, accuracy, ease of use, understandability: these are things I try to measure, not assume beforehand for some combination of problem and approach, and I want information with which to update my assessments as quickly and accurately as possible. Because for some strange reason I am unable to tell beforehand how difficult an interesting instance of a problem will be, even with the most familiar approach.
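Here is a minimal sketch of what I mean, in Python. The `evaluate` and `propose` functions are hypothetical stand-ins for whatever problem and heuristic you actually have; the point is only that reporting best-so-far results while the thing runs lets you judge this instance by what it does, not by an average-case estimate made beforehand.

```python
import random
import time

def anytime_search(evaluate, propose, budget_seconds=5.0, report_every=0.5):
    """Run a simple stochastic hill-climb, recording best-so-far as it goes.

    The search method isn't the point; the intermediate trace is. It lets you
    watch *this* instance and revise your assessment while it runs.
    """
    best = propose(None)              # start from some arbitrary candidate
    best_score = evaluate(best)
    start = last_report = time.time()
    trace = [(0.0, best_score)]

    while time.time() - start < budget_seconds:
        candidate = propose(best)     # small incremental change to the incumbent
        score = evaluate(candidate)
        if score > best_score:
            best, best_score = candidate, score
        now = time.time()
        if now - last_report >= report_every:
            trace.append((now - start, best_score))
            last_report = now

    return best, best_score, trace

# Toy usage: maximize a noisy one-dimensional function (made up for illustration).
if __name__ == "__main__":
    target = 37.2
    evaluate = lambda x: -abs(x - target) + random.gauss(0, 0.01)
    propose = lambda x: random.uniform(0, 100) if x is None else x + random.gauss(0, 1.0)
    best, score, trace = anytime_search(evaluate, propose, budget_seconds=1.0)
    for elapsed, s in trace:
        print(f"{elapsed:5.2f}s  best-so-far = {s:.3f}")
```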
I have a great deal of both practical experience and theoretical backing in these matters, and all it has done for me (your mileage may vary) is make me more uncertain about my prejudices, and yours, all the time.
On average, doing something small, immediately, is better than talking a long time about the many things you could do, about potentialities and limits and average behavior. And perhaps better than doing “just anything” is considering the small set of simple incremental improvements, selecting the one that seems likely to provide the most value for that scale of effort, and trying it.
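A sketch of that selection step, with made-up candidates and made-up numbers (nothing here is prescriptive; the estimates exist to be revised once you’ve actually tried something):

```python
from dataclasses import dataclass

@dataclass
class Improvement:
    """One small, concrete change we could make right now."""
    name: str
    estimated_value: float   # rough guess, in whatever units matter to us
    estimated_effort: float  # rough guess, in hours

def pick_next_step(candidates):
    """Pick the small improvement with the best value-for-effort guess.

    These numbers are guesses, not facts; the point is to try the winner
    immediately and let the result revise the next round of guesses.
    """
    return max(candidates, key=lambda c: c.estimated_value / c.estimated_effort)

candidates = [
    Improvement("cache the slow query", estimated_value=3.0, estimated_effort=2.0),
    Improvement("add a progress report to the batch job", estimated_value=2.0, estimated_effort=0.5),
    Improvement("rewrite the whole subsystem", estimated_value=10.0, estimated_effort=80.0),
]

print(pick_next_step(candidates).name)  # -> "add a progress report to the batch job"
```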
In too many domains we conflate rationality with rigor, and treat the straightest path between them as a recipe for success. But isn’t “rationality” an intentionally bounded thought process? a strategy of dismissing alternatives as greedily and thoroughly as possible?
But I don’t want to spend my time with rigorous people. They’re fucking annoying, when you get right down to it. When I’m actually trying to solve a problem, I would prefer to collaborate with ten experienced people (some “theorists”, some “practitioners”) who can speak quickly and approximately, and explore oh so many alternatives. I want people who can use simple, stupid, non-optimal tools all of us poor fools can understand… but who in using those tools discover many paths by which we might collectively trace our way—any goddamned way, as long as we arrive—from our immediate goal to our desired value.
Because value trumps method.
And value (as I’ve said) is something that may not be rationally predictable. Value comes along the way; it emerges. Value in so many cases is contingent on multiple scales of experience, long-term and short-term, on constantly revised and discarded models, on alternative hypotheses easily exchanged. Achieving value depends on my tools, my inclination, my habit. On what I’ve done so far.
And all these change from person to person, from problem to problem. From moment to moment. In my experience, they change on a shorter time scale than any—any—problem-solving method can accommodate, whether it’s a business project, a thesis or grant, or a single “simple” application of a heuristic to an instance.
Something deep is missing out there.