The “Web Application” Myth

Christian Heilmann is dead-on in this post. It’s a long one, but worth reading. Here’s my favorite bit:

What is an application? To me, it is a tool that allows people to reach a certain goal in the most effective fashion. What matters is not what language or technology you build it in. What matters most is:

  • that it is the right tool for the right audience,
  • that it does what is expected of it and not more,
  • that it is safe to use,
  • that it works in the environment it is most used in,
  • that it can be easily maintained without dependencies that only a few people know how to use,
  • that it is built with components that are reliable to use and not an “alpha” or “beta” or “not production ready” experimental technology,
  • that we know how to maintain the thing, how to add new functionality and, above all, how to fix security issues in the future without replacing it as a whole.

These are the things we should concentrate on. To work out what form this “application” should take, we need a mix of skills among the people working on the product:

  • researchers,
  • designers,
  • UX people,
  • content writers,
  • trainers to show people how to use the tool and how to put content into it afterwards, and
  • yes, of course, developers.

And this is the scary part: this costs money and a lot of effort. It also means we have to think about communication and about building teams that are good at bouncing ideas off one another and finding a good consensus. And it means the product will take longer to build.

All of this is anathema to people who have to show off to venture capital companies and stakeholders. We have to move faster, we have to be better. Fewer people, more products, quicker iterations, more features. It doesn’t matter what the product does: the most important thing is to show that it evolves and changes constantly.