Everybody's doing them. Installing a major Microsoft Windows application can involve copying dozens if not hundreds of Dynamic-Link Libraries (DLLs). Apple's MacOS has, not one, but two architectures for building shared code: the Shared Library Manager (SLM) and the Code Fragment Manager (CFM).
In theory, shared libraries are supposed to offer the following advantages:

* They save disk space, since only one copy of the code needs to be installed.
* They save RAM, since applications running at the same time can share a single loaded copy of the code.
* They allow different applications to communicate with each other through common code.
* They make plug-in architectures possible, so applications can be extended with dynamically-loaded code.
In practice, the first three advantages are illusory. Plug-ins are a great idea, and have been augmenting the power of applications since the early days of XCMDs for HyperCard, right up to today's add-ons for Photoshop. But they don't require different applications to share a common copy of the same code. Indeed, the notion that applications should share code turns out to be a thoroughly bad one.
Version conflicts are the single biggest bugbear with shared libraries. They occur in variations on the same basic theme:

* You install a new application, which replaces a shared library with a newer version, and existing applications that depend on the behaviour of the older version stop working.
* You install a new application, which overwrites a shared library with an older version, and existing applications that depend on the newer version stop working.
(Thanks to those folks who pointed out that the MacOS shared-library architectures include a sophisticated version-control system that Microsoft Windows DLLs lack. So if different applications need different versions of a library loaded at the same time, they can have them. But then, if each application has its own version of the library, you're not sharing code, are you?)
To add to the fun, replace the phrase "existing applications" above with "code in other shared libraries". When you release an application, you expect to be able to test the code that you will be including in the package, to make sure it all works together. But what happens when customers can individually update various libraries that your code depends on, with newer (or worse still, older!) versions that might come with other applications? Even if you wanted to support this, the number of possible combinations of versions that they might end up with is more than you could ever hope to test!
This just brings to mind an old engineering adage:
In any system, the complexity arises not so much from the number of parts, as from the number of potential interactions between them.
To make this clearer, just consider the number of possible interactions between pairs of components. If you have n components, then the number of pairs is n * (n - 1) / 2. Thus, if you have five components, then you have 10 possible combinations of pairs. If you have fifteen components, that's 105 pairs, and for fifty components, it's 1225 pairs! In short, the complexity is going up roughly as the square of the number of components--double the number of components, and you get four times the complexity.
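The numbers above are just the pair-counting formula at work; a few lines of Python (purely illustrative, not from the original article) confirm them:

```python
def pair_count(n):
    """Number of distinct pairs among n components: n * (n - 1) / 2."""
    return n * (n - 1) // 2

for n in (5, 15, 50):
    print(n, "components ->", pair_count(n), "possible pairs")
# 5 -> 10, 15 -> 105, 50 -> 1225
```

Doubling n roughly quadruples the result, which is the quadratic growth the article describes.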
As every engineer knows, complexity has implications for both reliability and cost. In computing, user-support costs are becoming particularly acute: measured over the lifetime of a personal computer, support costs already exceed the initial purchase price of the machine several times over. Now, people are starting to realize that shared libraries may account for a large chunk of these support costs.
Deinstallation of applications with shared libraries is a big headache. How can you be sure that a library you're removing isn't being used by some other application that you want to keep? Deinstallers are a thriving category of commercial products, yet none of them is quite perfect, as users of DLL-ridden Microsoft Windows systems know only too well.
RAM and disk space are simply not an issue any more. RAM costs only a few dollars per megabyte, and disk can now be measured in cents per megabyte. Imagine you have ten different applications, each making use of a common library a megabyte in size. Not sharing the library means each application carries its own copy of the code, so you end up using ten megabytes of RAM and disk space instead of one. That extra cost is still cheap compared to the cost of the support headaches caused by shared libraries.
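The arithmetic of that trade-off is simple enough to sketch (the figures are the article's illustrative ones, not measurements):

```python
apps = 10       # applications using the common library
lib_mb = 1.0    # size of the library, in megabytes

shared = lib_mb             # one shared copy on disk and in RAM
duplicated = apps * lib_mb  # one private copy per application

extra_mb = duplicated - shared
print("Extra space without sharing:", extra_mb, "MB")
# 9 MB -- a few dollars of RAM and pennies of disk at late-1990s prices
```

Nine megabytes of duplication is the entire price paid for eliminating every version conflict among those ten applications.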
Inter-process communication does not require shared libraries at all, only a shared protocol. Such communication schemes commonly work like this:

* The sending process encodes its message as data in an agreed-upon format.
* The message is passed to the receiving process via some transport mechanism (shared memory, a pipe, a network connection).
* The receiving process decodes the message using its own, entirely separate code.
As you can see, there is no need for different communicating clients to share any code at all, only data. And any issues of version conflicts at the protocol level are precisely the same whether you use shared libraries or not, as are the solutions to those issues.
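A minimal sketch of the point, in Python: two endpoints agree only on a data format (JSON here, chosen for illustration), and each side encodes and decodes with its own independent code. For brevity the sketch runs both ends in one process over a socket pair.

```python
import json
import socket

# The only thing the two sides share is the message format -- pure data.
a, b = socket.socketpair()

# "Client" side: encode a request as agreed-format data and send it.
a.sendall(json.dumps({"op": "add", "args": [2, 3]}).encode() + b"\n")

# "Server" side: parse the data with its own code and send a reply.
request = json.loads(b.recv(1024).decode())
result = sum(request["args"]) if request["op"] == "add" else None
b.sendall(json.dumps({"result": result}).encode() + b"\n")

# "Client" side again: decode the reply independently.
reply = json.loads(a.recv(1024).decode())
print(reply["result"])  # 5

a.close()
b.close()
```

If one side were rewritten in another language tomorrow, nothing on the other side would change -- which is exactly why shared code is unnecessary here, and a shared protocol is sufficient.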
Libraries that come with the operating system should not be visible as separate "shared libraries" at all. As a rule, third-party vendors should not ship these libraries with their products. The only releases should come from the OS vendor, and a small number of major updates to several parts of the system at once is clearly preferable to lots of incremental updates to individual parts coming out independently (in accordance with the principle of minimizing the number of potential interactions between parts).
In other words, the more monolithic the operating system appears to third-party programmers and users, the more reliably it can be expected to work.
Conclusion: shared libraries shouldn't be for sharing. Use them for dynamically-loaded plug-ins, by all means, but make sure each application uses its own copy of the code!
Created by Lawrence D'Oliveiro 1998 July 23, last updated 1998 August 4.