Revision as of 10:33, 29 October 2003
DLL-hell is a colorful term for any problem that stems from the difficulty of managing the Dynamic Link Libraries (DLLs) installed on a particular copy of an operating system. This includes conflicts between different versions of a library, the difficulty of obtaining a large number of such libraries, and the accumulation of many unnecessary copies of different versions (which can cause both disk space and performance problems).
Generally, the concept of DLLs means that many applications can share the same DLL file. However, in many cases, an application may introduce a changed version of a DLL which is already present on the system, either overwriting the old copy (which can, in turn, break compatibility with other applications) or installing another copy, which wastes disk space and memory and slows program loading, since it takes more time to locate the right DLL among many.
As time goes by, the DLL-hell problem tends only to worsen, since software that installed unnecessary DLLs rarely removes them when it is uninstalled. The eventual result can be thousands of mysterious DLL files, some of which are necessary for the system to function normally while others merely waste space, with no way to distinguish between them.
DLL-hell as described above is very common on Microsoft Windows systems, which have limited facilities for system file management and library versioning (and existing programs often ignore the few facilities that do exist). Many DLL-hell problems are less likely on Unix-like systems: numerous standards and conventions allow multiple versions of a library to be installed simultaneously, the library search mechanisms are efficient (so load times are not strongly affected), and most systems have package managers that automate DLL management, including updating a library and removing a version once it is no longer needed.
However, even on systems with standard locations for dynamic libraries and careful versioning schemes, problems can still arise. For example, the GNOME desktop environment, common on Linux and other Unix-like systems, is notorious for bundling an enormous number of libraries. Even though versioned dynamic linking largely solves the runtime issues, the sheer number of libraries (which on some systems must be retrieved and installed separately) is in itself a burden. Additionally, whilst the system provides facilities for installing side-by-side versions of the runtime libraries, it provides little assistance for side-by-side installation of the header files required for compiling the programs. These issues, however, mainly inconvenience developers and people who compile their software from source, rather than end users, who usually download precompiled software from their distribution provider. Distributions tend to use package managers that significantly reduce these problems for end users.
As an aside, the problem might well be called "so-hell" on Linux systems, because shared libraries there carry the .so ("shared object") suffix rather than Microsoft's .dll. In practice the problems are less frequent on a Linux distribution with a well-controlled automatic update mechanism, such as RPM, but they can occur when additional software is installed manually. Problems can arise with the main C and C++ runtime library, glibc, which ideally should be the latest version but may not work with software compiled with older versions of the C compiler; this parallels the Windows issues involving msvcrt30.dll, msvcrt40.dll and so on. Because glibc interfaces directly with the kernel, it also depends on the kernel version, again much like the Windows system DLLs. Just as it would be unwise to mix Windows 2000 with pieces of Windows XP (or worse, 9x) by mixing the system DLLs, the same sort of mixing should not be expected to work on Linux. The big difference between Linux and Windows is that on Linux the source code is available, or can be obtained, so it is always possible, and often fairly simple, to recompile an application against the current kernel and glibc, using the corresponding C compiler and running make. With a closed-source system such as Windows, one must rely on support from the application's vendor, who may in turn receive little cooperation from Microsoft.
Several measures are known to avoid DLL-hell; for optimal results they should be used together:
- Ship the operating system with a capable package management system that can track DLL dependencies, and establish the convention that installing through the package manager is good style while manual installation is bad style.
- Have a central authority distribute each library. Changes to the library can be submitted to this authority; thus, it can make sure compatibility is preserved in the developed branches. If some older software is incompatible with the current library, the authority can provide a compatibility interface for it, or bundle the old version as a distinct package.
- If software developers need to customize a library, and the main library release is unlikely to incorporate their changes, they can link statically against their own version, or create a new package for the library under a different name.
- Proper software design is paramount. DLLs are best suited to modularizing a system's components and to third-party libraries; their use is not imperative in every case. For example, if an application needs a library that will not be used anywhere else, that library can be linked statically, with no space penalty and with a speed gain.
The same four principles apply to Linux and other operating systems, and in most cases have already been put in place by the supplier of the distribution (Red Hat, SuSE, Debian, etc.).