Although I will say that I thoroughly enjoyed the new Iron Man movie, that's not what this blog is about. This blog is about software. Wha?! I think the software industry as a whole has become fairly stagnant in recent years. It needs to be invigorated with not necessarily some new blood, but at least new thinking.
Most of the people in the software industry, even if they agree with my sentiments about new thinking, seem to feel that the answer is AJAX or Web Services or Web 2.0 (which is largely a buzzword) or some combination of these things. That's where I disagree. The problem is precisely that they've labelled one or more of these things as "the solution". This is a problem because labelling "the solution" is a very different thing than labelling "a solution".
Labelling "a solution" means you've examined a specific problem and resolved it in the present tense. Labelling "a solution" leaves open the possibility that it may not be the best solution available. Labelling "the solution" means you've decided arbitrarily that it's the best solution for not only this problem but any similar problems in the future and it's the only solution you'll use. You've closed the door on any future discussion of the possibility of alternative solutions. You no longer examine the problem at hand and simply apply the golden hammer as prescribed.
Peter Sommerlad has this to say about design patterns:
Sommerlad: I feel guilty as an author of many patterns and supporter of the pattern community because I've come to the conclusion that -in general- Design Patterns (DP) are bad for software design.
Aha! See, I was right! Oh wait... did I say that out loud? Anyway... no, I'm not just going to stop at confirmation bias, there's a lot more to this.
Sommerlad: You might ask why the splendidly successful Design Patterns book [ GoF ] is bad for software design? ... in the early days of OO programming, only guru level people were actually designing working OO systems and the average programmer was stuck with BASIC, Pascal or C.
Note that the three languages described as available to the average programmer were all procedural languages (though there is an OO iteration of Pascal used in Delphi, and C had already evolved into C++). One thing I will say here, though, is that, although Sommerlad may (or may not) disagree, object orientation is simply not required to create encapsulation (cohesion and loose coupling). There was quite a lot of good encapsulation and cohesion done with ColdFusion 5, and there's an awful lot of very poor encapsulation and high coupling being done today with ColdFusion 8.
Sommerlad: Those were the people that invented the architectures that later became popular Design Patterns. The gurus were able to consciously think about their design decisions or already had the experience to decide between good or worse designs by a gut feeling. In addition they came from a time when ... principles of simplicity, abstraction, structure, encapsulation, coherence and decoupling were well known ... at least by the people considered capable of good software design.
Today, Design Patterns allows average developers to design OO systems and get them working that would have been beyond their design capabilities before. This sounds like a great thing, but the relative lack of expertise or brilliance can easily result in bigger software design disasters with DP applied than without.
Most of the Design Patterns in the GoF book are about introducing flexibility by indirection and inheritance. This is great when you use it to reduce code size and simplify logic by applying polymorphism, but in the hand of the uninitiated Design Patterns are a tool for overengineering and introducing unnecessary complexity. A feeble designer that cannot decide on a system property will use Design Patterns to postpone too many decisions, will speculate about features never needed and will lay a heavy burden on implementers and maintainers of the system.
This reminds me in particular of something I've said before, and I'll say it again here. In the world of web software in particular, we've become accustomed to a situation that the users of other software would never accept. It is a situation I believe we should not accept either, for a very specific reason: it causes major headaches for us as programmers and monetary losses that affect the bottom line for our companies. The users of standard desktop software would never accept software that required programming to install it. Imagine the horror of your relatively computer-illiterate aunt upon reading installation instructions for Microsoft Office that began "step 1 - Press the Start button and select Run - enter REGEDT32 and press enter ..." Yet this is, in essence, what has become standard to ask of people who've purchased our web-based software.
As a matter of fact, as much as I dislike Eclipse, this is one place where its authors got it right. Eclipse is one of the few applications, even on the desktop, for which installation is one step: unzip. Though even the average desktop application at a minimum provides an installation "wizard" that guides the user through the installation process, insulating them so that the software does all its own programming work.
But the situation of requiring programming work to install software is actually even worse for web software than for desktop software. And this is where it comes down to the bottom line for companies with regard to "total cost of ownership". It is significantly more expensive to own a typical web application that is advertised as "easy to modify". Why? Because when web applications are modified in the way most of ours are modified (by editing someone else's code), they become much harder to upgrade later, when the original authors release newer versions.
This is a part of the "heavy burden" on maintainers Sommerlad is talking about. How many times have you seen a company that purchased a web application that was "almost" what they wanted, made "minor changes", and was then stuck with that system for the foreseeable future, unable to upgrade because they couldn't take the time to merge their modifications into the new version and test it?
To date, the onTap framework is the only framework I'm aware of that actually addresses that issue, and it does so specifically because I've taken a Tony Stark approach to development which I'll talk about in a minute.
Sommerlad: Not only the GoF book is a reason for this situation, but also its use in training and education by teachers inexperienced in OO programming. Often the drawbacks of a Design Pattern are not explained well enough by the GoF or are omitted by readers or teachers, since DP are perceived as the OO design panacea. One reason for the obscurity of some of the issues with DPs lie in the aged form of the GoF style. Most modern pattern books provide a style that more clearly shows the problem and forces resolved and the downsides of the solution. Another issue is the often exclusive focus on the original 23 Design Patterns without showing students the breadth of pattern literature where better solutions for their design problems might be presented.
Human beings are notoriously biased. I will admit that I haven't been studying the field for very long, but I believe all software engineers really should study cognitive science, because it can help us to understand when and how our thinking consistently fails us.
For example, I know that my perception of a block of code is very different if it's my code than if it's someone else's. In particular, I'm apt to perceive that poor conventions in my code address specific problems, while perceiving that poor conventions in someone else's code exist because the other person has bad habits. More specifically, see my recent comment on some code in Luis Majano's ColdBox framework. This is an example of the actor-observer bias, which put simply goes something like this: "if others do it, it's their fault - if I do it, it's not my fault, it's because of my circumstances". I suspect that the actor-observer bias is actually a product of the general availability heuristic - we perceive others as having "bad habits" and ourselves as having "bad circumstances" because, while our own circumstances (and solutions) are consciously available to us, it takes a little digging to uncover other people's reasons. Since we don't (or really can't) dig to get at the reasons for others, we generally perceive them as being less rational (among other things). I can think of several comedians off the top of my head who've been quite successful with routines built around pointing out the stupidity of others, such as George Carlin, Gallagher and Bill Engvall. Plus we've also got cognitive dissonance ensuring that we'll have a more positive view of ourselves and our own abilities most of the time (depression notwithstanding).
While I can recognize this bias in my writing, that doesn't mean I'm immune to it. In fact I'm certain I do more of it than I recognize - we all do; that's why it's a "human" bias. These biases affect everyone, including the researchers who study them. Humans are just not very objective... but not because we're "faulty" in any real way -- quite the opposite. We lack objectivity because these biases, while making our perception inaccurate, also help to ensure our survival. In an evolutionary sense, the traits that are most advantageous to survival in a population are the ones that generally become dominant over time. So we lack objectivity precisely because these are rather literally "healthy illusions" to have. Both Richard Wiseman in his book The Luck Factor and Martin Seligman in his book Learned Optimism, two eminent scientists, have shown both statistically and through experimentation that the majority of people are overconfident and that, in spite of the occasional pitfall, overconfidence is actually an advantage to our survival, health and success in life.
But we haven't actually evolved to write software. That is to say, we've not evolved such that the traits that are advantageous in software engineering have become dominant in the population. That's a good thing for us engineers! If we evolved to be better software engineers as a species, then you and I might be out of a job. :)
Indeed our biases negatively influence most of our software. For one thing, we suffer from a rather nasty endowment effect. What is the endowment effect? Simply put, once you own something, you overvalue it. Granted it's not always applicable, but much of the time (if not most of the time) it's pretty accurate. And it applies equally to ideas (or ideals) as it does to physical possessions like cars or homes, which is why software engineers sometimes sound a lot like religious zealots. Note the ardent fervor of the open-source Ubuntu zealot as he verbally vivisects the casual user of Microsoft Office for his indifference to the EVIL corporate empire and its Sith master Darth Gates. Of course Microsoft flunkies are often equally insular. And as such the techno-cultural war in Middle America resembles, at least in spirit, the never-ending wars in the Middle East... or alternatively the trilogy war in Clerks 2.
But just as endowment influences our subscription to Windows or anti-Windows camps, it also influences everything else we do in software engineering. What this means is that once we've decided on "the solution" as I mentioned before, we significantly overvalue "the solution" because it's "the solution" and not "a solution". And this is precisely what Sommerlad was talking about with regard to design patterns -- the tendency they have to become golden hammers. When Sommerlad mentions that people often omit any discussion of the drawbacks of a pattern, he's also describing, at least in part, the effect of confirmation bias.
A while back Matt Woodward also posted a good and reasonably thorough blog post in which he talked about the same phenomenon. In short, there've been some general grumblings over the use of the "Gateway" design pattern in ColdFusion. Some ColdFusion programmers want a gateway to return a collection (an array) of fully instantiated objects, which is common practice in Java.
Herein lies a huge problem, both for people coming to ColdFusion from Java and for ColdFusion programmers interested in learning how things are done in Java, because simply moving code from Java to ColdFusion line by line is a bad idea. Why? Because ColdFusion isn't Java - its strengths and weaknesses are very different from Java's. For starters, instantiating objects in Java is pretty efficient, but instantiating an object (a CFC instance) in ColdFusion is much less so. So where Java programmers can get away with the inefficiency of instantiating several hundred objects returned from a search all at once (even when only a handful of them will be used), the same technique is dreadfully, painfully slow in ColdFusion.
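To make the cost difference concrete, here's a minimal sketch contrasting the two gateway styles. The component, table and column names here are hypothetical illustrations, not code from any particular framework:

```cfml
<!--- Hypothetical sketch: a UserGateway with a made-up "users" table.
      The Java-style method pays the cost of a CFC instantiation per
      row; the query-style method returns the recordset directly and
      defers (or avoids) object creation entirely. --->
<cfcomponent displayname="UserGateway">

    <!--- Java-style: one CFC instance per row (slow in ColdFusion) --->
    <cffunction name="getUsersAsObjects" access="public" returntype="array">
        <cfset var qUsers = "">
        <cfset var users = arrayNew(1)>
        <cfquery name="qUsers" datasource="#variables.dsn#">
            SELECT id, name, email FROM users
        </cfquery>
        <cfloop query="qUsers">
            <!--- createObject() + init() on every row is the expensive part --->
            <cfset arrayAppend(users, createObject("component", "User").init(
                qUsers.id, qUsers.name, qUsers.email))>
        </cfloop>
        <cfreturn users>
    </cffunction>

    <!--- Query-style: hand back the recordset; instantiate a User
          only for the handful of rows that actually need behavior --->
    <cffunction name="getUsers" access="public" returntype="query">
        <cfset var qUsers = "">
        <cfquery name="qUsers" datasource="#variables.dsn#">
            SELECT id, name, email FROM users
        </cfquery>
        <cfreturn qUsers>
    </cffunction>

</cfcomponent>
```

With a few hundred rows returned from a search, the first method performs hundreds of createObject() calls before the caller sees a single result; the second performs none.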
Conversely, where Java often forces you to use try-catch all over the place - causing Java developers to add try-catch statements out of habit rather than for a reason - ColdFusion is much more sane in that regard. That habit is why, if you look at any of the code produced by Paul Hastings, you'll notice he's littered nearly every method in every CFC he's published with a gratuitous try-catch, which serves no purpose other than to clutter the code, slow him down and (albeit marginally) create overhead for the server. He's endowed with the arbitrary and silly notion that Java is a "better" language, and as a result his bias for Java informs every decision; he doesn't stop to consider the strengths and weaknesses of the current environment. (I honestly wonder why he bothers with ColdFusion at all -- he seems to hate it so much, especially for having been a Team Macromedia member and now an Adobe Community Experts member.)
So ColdFusion has its own set of strengths and weaknesses separate from Java and yet in part due to endowment, ColdFusion programmers often don't respect this fact. They use ColdFusion as though it's synonymous with Java, which makes for at best mediocre ColdFusion development.
Matt actually went further to illustrate the endowment effect, showing how ColdFusion developers have acquired a relatively unique interpretation of "DAO" and "Gateway", and how, when these patterns were originally described, not only did they not mention anything (anything at all) about databases, their descriptions were virtually synonymous. Indeed, the original GoF book, which Sommerlad now says is "bad for software design", said very little about the actual code. We tend to think of a "DAO" as something very specific, but in the parlance of classical design patterns it was actually very unspecific. Not only was it unspecific, there were strong reasons why it was unspecific. A design pattern isn't about practical application, it's about concept -- the art of the abstract, far removed from the very specific ways that we as programmers typically approach patterns.
It's a question of mindset. The GoF book was written for engineers -- but the end result has been mechanics trying to use it. There's nothing wrong with mechanics, or even with being a mechanic; it's just a different job and it requires a different kind of thinking. A mechanic receives specific tools that perform specific tasks, and he performs those tasks. He has books with specific regulations for already designed and manufactured equipment. The engineer works on the car at the other end, before anything has even been discussed with the manufacturing division. When the engineer is doing his job it's not a nut or a bolt or a shock absorber -- to the engineer, it's a weight and a force and a desired outcome. Where the mechanic has to be concerned with fuel distribution, ignition and transmission, the engineer is free to swap out the internal combustion engine for batteries, allowing for the straight-line acceleration that let an electric car beat a gas-powered Formula 3 racecar in this MythBusters episode.
The mechanic deals with specifics of manufacture because the car is already built and rarely is the driver able and willing to pay the immense sums needed to make major modifications to an already assembled vehicle. The engineer deals with generalities of purpose. It's a totally different problem. The mechanic is asked to "fix the engine" - the engineer is asked to "produce a top speed of 200mph". Those two jobs require totally different ways of thinking.
The intent of software of course is to take the hassle out of our daily lives by automating tasks that we currently perform manually and by allowing us to perform tasks that we previously couldn't, through the application of similar automation. As such, the objective of software can't really be met via the mechanic mindset. The mechanic would be asked to "fix the shopping cart". The engineer would be asked to "reduce the workload of our order fulfillment department by automating their paperwork". The mechanic task doesn't achieve anything new, it only perpetuates the system that exists. The engineering task saves the company money and creates opportunities for the order fulfillment department to be more productive and do other things to help the company's success.
The best software always has been and always will be created by engineers (and especially those engineers who take the time to understand cognitive science and human factors issues like the magical number 7 plus or minus 2). The best software will be developed by engineers who think like the archetypal engineer Tony Stark.
We should all strive to think like Iron Man, but what exactly does that mean? (spoilers follow)
- Think Options: In the beginning of the film, Stark is captured by a group of terrorists who demand that he build a missile system for them. His initial reaction is one of despair - utter depression. He's convinced that he's going to die and refuses to do any work. Then a fellow prisoner says "this is a very important week for you". At that point he begins planning his escape and, out of a box of scraps, he builds the arc reactor that he wedges into his chest both to keep the shrapnel out of his heart and to power his first prototype of the armor. The terrorists did bring him equipment, but for the most part he used scraps. He didn't sit around lamenting the equipment he didn't have. Like Tom Hanks's character in Cast Away, he looked at the objects he had available and he considered their individual properties - what were they good for? If you broke them or melted them down, or strapped them to something else, what else could they be good for? In the early days of software development, everything was new, and everyone had to think this way, because there weren't already handily predesigned tools to solve every problem. These days the problem is reversed - the "solutions" are too readily available and we fall into the trap of applying them without really thinking through the solution to determine if it's the best we can do (and it often isn't). Like Stark at the beginning of his captivity, we become stuck in the thinking that our hands are tied by these tools we have, forgetting that we made them in the first place.
- Don't Take No For An Answer: This is actually rather closely related to the admonition to think of options. Tony says "we should look into Arc Reactor technology again," and is met with the response, "we knew when we built it that it wasn't practical... we only built it to shut up the hippies!" He doesn't let that distract him from his purpose. When someone says something can't be done, it means they don't know how to do it, nothing more.
Several years ago, when ColdFusion MX was first released, I ended up going through a hair-pulling session involving the cfstoredproc tags. I mentioned it on the CF-Talk list and Sean Corfield replied that "it can't be done because JDBC doesn't support named parameters". He was right about JDBC support for named parameters -- he was wrong about it not being doable, and in fact you can use JDBC (2.0) to do it. Over the past few years there's been apparently on-again, off-again support for dbvarname, which boggles my mind, because I had actually fixed the problem myself not long after I complained about it. While I didn't launch a major campaign over it (because most of us aren't using a lot of stored procedures), I did let people know. Anyway, it's still baked into the onTap framework's SQL abstraction tools.
The SQL abstraction tools in and of themselves represent a large body of work which Ben Forta claimed to be impossible when he said "true DBMS portability is unattainable". While it's true that not all databases support triggers, for example, that's outside the realm of the CF application, and you can certainly write the CF application to account for the same sorts of things the triggers might otherwise handle. Ultimately, most of what Forta describes as the challenges of database portability are covered by the onTap framework's SQL abstraction tools with (now that they've been optimized) minimal overhead. I also take issue with Forta's comment that switching platforms isn't common -- it's what Hermes Conrad might call "technically correct... the best kind of correct", meaning it ignores the fact that many of us (myself in particular) are designing software to be used by others and are unwilling to sacrifice customers simply because they chose a different database platform.
Far from being a burden, the SQL abstraction tools in the onTap framework actually allow me much greater flexibility with regard to database interaction, letting me quickly and easily build very complex queries that would otherwise be very challenging to read and understand, and to do it in a way so straightforward that it's virtually self-documenting. The use of not merely and/or keywords in searches, but internationalized and/or keywords, is a prime example. I'm able to do this with one line of code, and afterward not only is the code both plenty efficient and eminently legible, it's also SQL-injection proofed, because unlike ObjectBreeze, the onTap SQL tools use cfqueryparam. It also means I never have to type a cfqueryparam tag myself, which means both that the sqltype is automated (which reduces coupling in the application) and that the amount of code I have to write is reduced in general. I was doing ORM before Reactor or Transfer were a twinkle in anyone's eye -- how's that for something that's (according to Ben) "not doable and not worth doing"?
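To illustrate the idea (this is a hypothetical sketch of the principle, not the actual onTap API - the function name, arguments and "?" placeholder convention are all my own invention here), a keyword parser can split a search string on localized and/or operator words and emit one parameterized comparison per term, so the rendering layer binds every value through cfqueryparam in one place and calling code never types the tag:

```cfml
<!--- Hypothetical sketch: build a parameterized WHERE fragment from a
      keyword search string with localized AND/OR words. Returns a
      struct with .sql (placeholders only, no user input) and .params
      (values for the rendering layer to bind via cfqueryparam). --->
<cffunction name="keywordWhere" returntype="struct" output="false">
    <cfargument name="search" type="string" required="true">
    <cfargument name="column" type="string" required="true">
    <!--- localized operator words, e.g. "et" / "ou" for French --->
    <cfargument name="andWord" type="string" default="and">
    <cfargument name="orWord" type="string" default="or">
    <cfset var result = structNew()>
    <cfset var words = listToArray(arguments.search, " ")>
    <cfset var i = 0>
    <cfset result.sql = "">
    <cfset result.params = arrayNew(1)>
    <cfloop from="1" to="#arrayLen(words)#" index="i">
        <cfif words[i] eq arguments.andWord>
            <cfset result.sql = listAppend(result.sql, "AND", " ")>
        <cfelseif words[i] eq arguments.orWord>
            <cfset result.sql = listAppend(result.sql, "OR", " ")>
        <cfelse>
            <!--- each search term becomes a bound LIKE comparison --->
            <cfset result.sql = listAppend(result.sql,
                arguments.column & " LIKE ?", " ")>
            <cfset arrayAppend(result.params, "%" & words[i] & "%")>
        </cfif>
    </cfloop>
    <cfreturn result>
</cffunction>
```

Called with a French search like keywordWhere("chats et chiens", "description", "et", "ou"), the sketch yields "description LIKE ? AND description LIKE ?" plus a params array, keeping the raw user input out of the SQL string entirely.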
Stark Ent. Employee: the technology doesn't exist...
Obadiah Stane: Tony Stark built one IN A CAVE! With a BOX OF SCRAPS!
Stark Ent. Employee: well... I'm not Tony Stark
Okay, I've beat this one to death, moving on...
- Think Small: There are actually two parts to this proposition. Thinking small is not about the size of the application or its features. Thinking small ultimately is about agile software - the ability to switch gears quickly. The application itself may be very large, but it's built in very small, encapsulated pieces. The arc reactor in Tony's chest is no more than four inches across. Each individual piece of the suit is tiny, performing only one very specific function - it's only when the individual pieces are put together that the larger suit functions as a whole. The SQL abstraction tools in the onTap framework that allow me to create internationalized and/or keyword search support (something everyone should do, but nobody does) are only possible because each part of the system deals with a very, very small aspect of the SQL that's being generated. It's a single comparison in a where clause, or it's a single join to another table, etc.
Though I also apply this to my work in a more literal respect. If I'm working on a template that's approaching 1000 lines of code, I generally feel something's gone wrong and immediately start looking for ways to make it smaller. If I'm generating code with a bean generator like Illudium PU-36, or using any kind of file-merging tool as standard operating procedure during my work (something I've heard described as SOP for applications created using bean generators), then something is definitely horribly wrong. All of these things - code generation, file merging, humongous templates - create headaches and are a major hassle to maintain, and all of them are totally avoidable; if you learn the techniques that allow you to avoid them, you'll discover they're the same techniques that allow you to do some amazing things like the SQL abstraction system I described before.
The smaller the code you write (physically), the easier it will be to modify and maintain. Try to think of the code you write like a stirrup. The stirrup is considered one of the most important inventions in human history -- and it involves a very minimal amount of material. In fact, if the stirrup were much larger or more complex, it likely would have been either cumbersome or fragile, either of which would have made it much less useful and prevented its relatively rapid and widespread adoption.
- Turn It Around: Don't think of "failures". Think instead of learning experiences. If a particular project "fails", it is still an opportunity to utilize the strengths and weaknesses of the experiment for other things. In the film, Tony is working on a "simple flight stabilizer, it's perfectly harmless" - at which point he tests the stabilizer and it throws him bodily across the room. That stabilizer later becomes the repulsor he uses to incapacitate terrorists and destroy his company's stolen weapons in the east. Never assume that because something isn't working as expected it doesn't have value - and never assume that because something is working as expected it shouldn't be improved. Has everything I've done worked the way I hoped? No. At one point during the early development of the onTap framework I developed a fairly rudimentary blog for myself using text files instead of a database to store both the entries and the comments. That project might have actually worked if I'd had, at the time, a better understanding of either Verity or XML (both of which I understand better now), but although that blog was taken down, I learned some valuable things about file access and developing facades from that experiment.
Actually, more specifically, one of the things I did with the file management was to create a singular "file" component that can read and write multiple types of files (wddx, text, zip, etc.) through a single interface. And if I had anything to impart to ColdBox at the moment, from what little I know of it at present, it would be that they should have used that same approach for the IOC plugin. Right now the latest ColdBox distribution says there are two options for the IOC setting in the XML config file - "ColdSpring" and "LightWire". That's great! Assuming that's all there ever is... The reason is that the IOC plugin CFC has those strings hard-coded into its methods, instead of using a facade that would select the appropriate IOC adapter, based on an interface (or an abstract parent class), from a directory - the same way the framework automatically selects the appropriate event handler CFC. Then there would be an IOC/ColdSpring component and an IOC/LightWire component. This would provide the flexibility intended by the GoF in their OO design patterns, where it currently doesn't exist in ColdBox IOC selection.
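A convention-based facade along those lines might look something like this sketch. To be clear, the paths, component names and init() contract here are illustrative assumptions of mine, not ColdBox's actual API:

```cfml
<!--- Hypothetical sketch of a convention-based IOC facade: the
      component path is built from the config value, so supporting a
      new container means dropping one new CFC into a hypothetical
      ioc/adapters directory - no hard-coded name checks in the
      facade's methods. --->
<cfcomponent displayname="IocFacade">

    <cffunction name="getAdapter" access="public" returntype="any">
        <!--- the string from the XML config, e.g. "ColdSpring" --->
        <cfargument name="adapterName" type="string" required="true">
        <!--- "ColdSpring" -> ioc.adapters.ColdSpring --->
        <cfset var adapterPath = "ioc.adapters." & arguments.adapterName>
        <!--- each adapter would extend a common AbstractAdapter base
              (or implement a shared interface), so the facade can rely
              on one contract regardless of which container is in use --->
        <cfreturn createObject("component", adapterPath).init()>
    </cffunction>

</cfcomponent>
```

The point of the design is that the facade never needs to know the list of supported containers; the directory of adapter CFCs is the list, which is the same open-for-extension flexibility the framework already gets from its convention-based event handler lookup.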
Some of you may be thinking that I'm horribly conceited because I'm citing examples from my own work and comparing myself to Iron Man. I'm not saying these things to be conceited - my own achievements are simply the ones I'm most familiar with, they're the ones that spring to mind first when making these sorts of comparisons.
Am I conceited? I do generally speaking find myself to be five minutes into the future with regard to the ColdFusion community - I had polymorphic OO code in ColdFusion 4, I was doing ORM before there were "objects", I was doing much of what's in Spry before there was Spry etc. I've always disliked XML configuration files and preferred Convention over Configuration (CoC) which has become more popular lately with ColdBox and now Fusebox adopting convention-based architecture, and everyone doing auto-wiring in ColdSpring and LightWire. It's typically been only after I've been doing something for a while that someone else in the ColdFusion community popularizes it - Reactor, Transfer, Spry, etc. Both the Fusebox and Mach-II frameworks have recently changed or added features that closely resemble suggestions I'd made several years ago and were shot down at the time. Am I conceited for noticing that my ideas were "ahead of their time"?
Although I may sometimes feel a bit bitter about the way things have turned out, I don't post these things to inflate my ego or to say "I told you so". I don't consider myself an absolute authority or even an expert on most subjects. While I do consider myself to be a very good software engineer and when I post these blogs there is always some hope that it will encourage more people to consider the framework, ultimately I post these things because I love what I do and I want to make people think and possibly spur debate in the hopes that everyone might benefit from it (including myself).
So long story short, think like Iron Man! :)