Friday, December 21, 2007

Critical Thinking about Software Technology

For all that we work in an industry full of well-educated and presumably intelligent people, it still amazes me at times just how much we, as an industry, regularly act like lemmings. By way of example, consider the kinds of responses you would typically have found to the following statements if you’d encountered them online in the last five to seven years:
  • J2EE is overly expensive, overused, over-hyped, and too complicated. The DTO pattern is overused and a sign of a failed architectural design.

  • Agile is well intentioned but too hard to implement and doesn’t really work in all development environments.

  • Dynamically typed languages like Ruby are good for a niche market but aren’t suitable for large-scale programs.

  • Getters and setters are overused and a sign of bad or thoughtless OO design.


Perhaps I’m being too negative. After all, it’s easy to complain about something. Let’s try some affirmative statements instead:
  • EJB 3.0 is well designed and ready for prime time. Developers should consider incorporating it into their designs.

  • JSF is ready to take off and become the next major web-development framework.

  • Web services are a growing technology and definitely on the must-use list for distributed applications.


Before proceeding, I’d like to pause for a moment and point out that I’m not actually endorsing any of the statements above. That’s not really the point. (Actually, I deliberately picked a mix of statements that I agree with and disagree with.)

The point is that nearly all of the above statements, at one time or another, are or were virtually guaranteed to start a flame war. What strikes me as odd, though, is that many of the people repeating the arguments for or against a given technology often haven’t tried it. Personally, I believe that we tend to look for consensus on the Internet. After all, not many people have the luxury of spending months trying out a technology before they pass judgment on it. However, in my experience, the Internet doesn’t really show a representative sample. When I’ve spoken at conferences, I or a colleague will often poll the audience on their previous experience with whatever we’re talking about. How many people are developing for the web? According to my unscientific observations, about half. How many people have actually used a relatively new technology like Flex? My bet is that fewer than 10% of the people will raise their hands when I ask that in a couple of months. Ruby? Probably very few. Yet if you just read blogs, technical articles, vendor websites, conference speakers’ notes, and the like, it would seem that nobody is still using “old” technologies like Struts.

I also believe that there are many valid arguments against all of the above statements. However, they’re only valid when delivered by people who have thought critically about the issue, taken their own personal experiences and observations into account, and left room for the possibility that their experiences probably don’t represent all of the possible perspectives on the issue. (This applies to me too. I encourage anyone reading this to take what I’m saying with a grain of salt. After all, I’m just some guy writing a blog too.)

So, here is one of my New Year’s resolutions: I will think critically about all of the new technologies, ideas, and buzzwords in the coming year. Just because someone is excited doesn’t make something a brilliant idea, no matter how smart that individual is or how great his or her reputation. Similarly, I will not bash someone else’s ideas simply because everyone else is down on them. I, like everyone else in my industry, am paid to think. I am not paid for my charm, wit, or dashing good looks (which is just as well). Therefore, when I fail to think for myself and arrive at my own conclusions, I am overpaid.