Another place where people are starting to get it.
Perls of wisdom in a sea of site mismanagement
(David Walker, SMH)
In short, Berk has been reporting on what sites are actually doing, rather than describing the idealised world portrayed by technology vendors and integrators. His core complaint: site management system vendors are creating generic solutions that actually increase the cost of running a site. Meanwhile, most businesses either have very simple needs that require only cheap, simple systems, or have specific needs that generic solutions handle poorly. That means the vendors’ ideal of a generic site-management system “is completely wrong”, Berk says. “The development overhead is very, very high – and for 90 per cent of the problems, that’s too much overhead.”
So what should most organisations do? “Use the tools that are simple and cheapest,” he says.
What sort of tools does Berk have in mind? Perl scripts, for instance. A tiny technical team armed with Perl scripts and an Oracle database ran the first sites he worked on back in the mid-1990s. Berk recalls his fascination as he saw larger and larger teams implementing more and more complex platforms in the late 1990s and early 2000s to achieve essentially the same result.
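To make the “simple tools” idea concrete, here is a minimal sketch of the kind of tiny site-generation script Berk is talking about — my own hypothetical illustration, not his actual code, and in Python rather than Perl for readability. The `TEMPLATE`, `PAGES`, and `build` names are all made up; the point is that a page template plus a content table plus a loop can replace a lot of CMS machinery.

```python
# Hypothetical sketch: a "quick and dirty" static page generator.
# One template, one table of page content, one loop -- the kind of
# small targeted tool that covers most simple site-management needs.
from string import Template
from pathlib import Path

# A single shared page template; $title and $body are filled per page.
TEMPLATE = Template("""<html>
<head><title>$title</title></head>
<body>
<h1>$title</h1>
$body
</body>
</html>""")

# The "content management system": a plain dictionary of page data.
PAGES = {
    "index.html": {"title": "Home", "body": "<p>Welcome to the site.</p>"},
    "about.html": {"title": "About", "body": "<p>Who we are.</p>"},
}

def build(outdir="site"):
    """Render every page in PAGES into the output directory."""
    out = Path(outdir)
    out.mkdir(exist_ok=True)
    for name, fields in PAGES.items():
        (out / name).write_text(TEMPLATE.substitute(fields))

if __name__ == "__main__":
    build()
```

A couple of good developers can extend something like this (pull the content from a database, add navigation) far faster and more cheaply than they can configure a generic enterprise platform to produce the same pages.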
I wrote about this here on my blog a few months ago. I used to be a big fan of a nice, well-thought-out, generic content management system. After having worked on several projects of that sort, my view has turned completely. Except in VERY SPECIFIC situations where the group doing the project really needs it, and is already structured for and mentally comfortable with the full separation of content and presentation, building a generalized content management system is just courting disaster.
At my last position, in late 2002, I was brought on to take over a content management project that was having lots of trouble. After investigating the situation, my first recommendation was to stop the approach completely, and instead build quick, small, targeted, cheap, and easy systems that would meet the specific content management needs that were on the table — not try to solve larger problems that were mostly imaginary, or consolidate for the sake of consolidation.
I was overruled.
So we tried to define the big system as best we could, and did a damn good job, I think. But it proved impossible for the tech team to implement in the time allotted, especially since they had to develop it using the wrong technologies (mandated to them from above). What resulted was a horrible mess that we were forced to use, because by then we had no choice.
A small, quickly crafted custom application, built by one or two good developers, could have blown away the system we ended up with: one that had the data model we wanted, but used the generic interface provided by one of the big enterprise systems instead of the one we had defined.
It isn’t just a matter of maturing technology, it is a matter of being smart and picking the right tools for the job, and not trying to solve bigger problems than you need to.
There may be disadvantages to “quick and dirty” solutions, in that they eventually pile up and cause spaghetti-type problems… but all in all, they often end up being much more cost-effective than going all out on massive “enterprise” solutions that try to do everything.
With a fraction of the money my company spent on various failed content management solutions over the last few years, it could have kept a small army of HTML people employed manually updating the sites. Yes, it would have been manual. But the end results would have been just as good, and the company would have saved a lot of money. Some quick and dirty automation tools would have helped even more. But the larger systems… unless a specific need is there… boondoggle.
Good to see more places are learning it is time to be smart about such things.