Two points might be mixed here:
- reproducibility / documentation (in the sense of recording what was done in a given analysis)
- reusable, modular code and its documentation (in the sense of code documentation for software development)
The first point is part of good practice in all scientific fields and could keep us debating for a long time; documentation is where the two points overlap, and there are plenty of nice ideas coming from literate programming (R's Sweave, for example).
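To make the literate-programming idea concrete, here is a minimal Sweave-style sketch (a `.Rnw` file mixing LaTeX prose with R chunks; the chunk label and variable names are hypothetical):

```latex
\documentclass{article}
\begin{document}

We filtered the expression matrix and report how many genes survived.

% An R code chunk: executed when the document is built with Sweave,
% so the reported numbers always match the code that produced them.
<<filter-genes, echo=TRUE>>=
expr <- matrix(rnorm(200), nrow = 20)   # placeholder data
keep <- rowMeans(expr) > 0
n_kept <- sum(keep)
@

% Inline result: \Sexpr{} splices a computed value into the prose.
After filtering, \Sexpr{n_kept} genes were retained.

\end{document}
```

Because the analysis code and the write-up live in one source file, rerunning `Sweave()` regenerates the document with results guaranteed to come from the code shown, which is exactly the reproducibility point above.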
A significant fraction of bioinformaticians are not software developers at heart. This is not a judgment, as it has both good and bad aspects, but one bad aspect is that tools from software development that would help with reproducibility are largely unknown, or looked down upon.
Regarding the second point, bioinformatics work can often be placed on a plane with two axes:
- Help answer questions of biological interest in a given dataset
- Develop tools that help answer questions of biological interest (this can be theoretical work, or implementation work).
[Edit: Lars posted while I was writing - Russ Altman's blog refers to something similar. However, I disagree with the wording because of the material published in the Journal of Computational Biology: the journal focuses on methods, to quote them, "The Journal publishes peer-reviewed papers focusing on novel, cutting-edge methods in computational biology and bioinformatics."]
Developing tools can occur during some of the larger projects, but as was nicely said in other answers, developing good modularity and extensive documentation does not come without a cost. If biology is the main interest, modularity, reusability, and maintainability go down the list of priorities.
Functionality is something that commercial software development wants to freeze as early as possible, because late changes cost a lot; on the other hand, when doing research one often does not know exactly what will be found and what will be most useful. Also, software development shops aiming at support and extended development cycles have adopted practices such as strong guidelines for writing code (naming conventions for variables, documentation rules) and a preferred development methodology; in fact, distinct people can be working on software design while others implement that design.
As long as there is no penalty for writing unmaintainable code (we all understand what selection pressure means, right? ;-)), such practices will not be widely adopted. Other answers mention that getting funding for support would fix this; I do not think it would do so completely without a competitive advantage for doing so. The metric of success for grants in the short term is typically publication(s) (the more prestigious the journal(s) the better), and possibly the number of citations in the long term. Well-supported software would help the citation count... if only software were cited whenever it is used (personally I can't complain too much, but take the biopython project for example: 10 years of existence and continued support for twice ~50 citations; I'd be happy to hear from the biopython folks how many times it was downloaded in the meantime).
Finally, I do not think the hypothesis holds that the relatively poor state of software in some places is attributable to the non-commercial nature of the work, or to the fact that "high-level" languages are used. There are many open source projects that are handled impressively well, and there is plenty of unmaintainable code in industry written in "low-level" languages.
To get more modular and reusable code:
- train people in software design and software engineering (or apply selective pressure and only hire people with those skills).
- cite more software (the number of citations can eventually mean something to funding agencies).
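For illustration, here is a minimal Python sketch of what that training buys: a small, documented, testable unit (the `gc_content` name and behavior are hypothetical, not from any particular project) instead of the same logic inlined in an analysis script with hard-coded paths:

```python
def gc_content(seq):
    """Return the fraction of G and C bases in a DNA sequence.

    Reusable by design: no hard-coded file paths, documented
    inputs and outputs, and testable in isolation.
    """
    seq = seq.upper()
    if not seq:
        raise ValueError("empty sequence")
    return (seq.count("G") + seq.count("C")) / len(seq)


if __name__ == "__main__":
    # Usage example: a caller imports the function instead of
    # copy-pasting the counting logic into each new script.
    print(gc_content("ATGCGC"))
```

The point is not the function itself but the habit: once logic lives in a named, documented unit, it can be imported, tested, and cited as part of a package rather than rewritten for every analysis.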