Navigation in and access to the contents of digital audio archives have become increasingly important topics in Information Retrieval. Both private and commercial music collections are growing in size as well as in acceptance within the user community. Content-based approaches relying on signal-processing techniques have been used in Music Information Retrieval for some time to represent the acoustic characteristics of pieces of music, which may then serve collection organisation or retrieval tasks. However, music is not defined by acoustic characteristics alone, but also, sometimes even to a large degree, by its lyrics. A song's lyrics may provide more information to search for, or may be more representative of specific musical genres than the acoustic content, e.g. 'love songs' or 'Christmas carols'. We therefore suggest an improved indexing of audio files via two modalities: combinations of audio features and song lyrics can be used to organise audio collections and to display them via map-based interfaces. Specifically, we use Self-Organising Maps as visualisation and interface metaphor. Separate maps are created and linked to provide a multi-modal view of an audio collection. Moreover, we introduce quality measures for the quantitative validation of cluster spread across the resulting multiple topographic mappings provided by the Self-Organising Maps.
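The general idea can be sketched as follows: each modality (audio features, lyrics features) yields its own Self-Organising Map, and a song is linked across maps through its best-matching unit on each. The sketch below is a minimal illustrative implementation, not the paper's actual system; the grid size, decay schedules, and function names (`train_som`, `bmu_index`) are assumptions chosen for brevity.

```python
import numpy as np

def train_som(data, grid_w=4, grid_h=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small Self-Organising Map on `data` (n_samples x n_features).

    Minimal sketch: grid size and exponential decay schedules are
    illustrative assumptions, not the parameters used in the paper.
    """
    rng = np.random.default_rng(seed)
    n_units = grid_w * grid_h
    weights = rng.random((n_units, data.shape[1]))
    # 2-D grid coordinates of each map unit, used by the neighbourhood kernel
    coords = np.array([(i // grid_w, i % grid_w) for i in range(n_units)], float)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighbourhood radius
        for x in data:
            # best-matching unit: the unit whose weight vector is closest to x
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))  # Gaussian neighbourhood
            weights += lr * h[:, None] * (x - weights)
    return weights

def bmu_index(weights, x):
    """Map a feature vector to its best-matching unit on a trained map."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))
```

Given two trained maps, one per modality, a song's position on the audio map and on the lyrics map can then be compared via `bmu_index`, which is the linkage the multi-modal view relies on; how compactly a group of songs clusters on each map is what the quality measures quantify.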