This work is licensed under a Creative Commons
License. This copyright applies to the Music Ontology Specification
and accompanying documentation and does not apply to Music Ontology data formats,
ontology terms, or technology. Regarding underlying technology, Music Ontology
relies heavily on W3C's RDF
technology, an open Web standard
that can be freely used by anyone.
The visual layout and structure of the
specification were adapted from the FOAF
Vocabulary Specification by Dan Brickley and Libby Miller and the SIOC Ontology Specification by Uldis Bojar and John G. Breslin. This document was automatically generated using the OntoSpec script.
Abstract
The Music Ontology Specification provides the main concepts and properties for describing music (i.e. artists, albums, tracks, but also performances,
arrangements, etc.) on the Semantic Web.
This document contains a detailed description of the Music Ontology.
Status of This Document
NOTE: This section describes the status of this document at the time
of its publication. Other documents may supersede this document.
This specification is an evolving document. This document is generated
from a machine-readable Music Ontology expressed in RDF/XML, using a specification template.
The authors welcome suggestions on the Music Ontology and this document. This document may be updated or added to based on implementation experience, but no commitment is made
by the authors regarding future updates.
The Internet changed the music industry. At first, sharing systems like Napster
allowed people to share any song they had on their computer with millions of other people.
That new reality changed the music industry's landscape for good, and many legal battles followed.
However, an even bigger change followed a couple of years later.
Communities like MySpace started to appear.
With millions of regular users, such communities helped garage bands and obscure musicians create
their musical niche: the long tail of the music industry.
This second change is more profound than the first one:
now any musician can reach their audience by sharing their work on the Web.
In the meantime, a free database called MusicBrainz appeared, archiving
millions of relations between artists, albums and tracks; music recommendation services like
Pandora or Last.FM started to appear; and Apple started
to sell individual tracks for $1 on iTunes.
At that point, the music industry of the eighties, led by blockbusters, had completely changed.
The Music Ontology is an attempt to provide a vocabulary for linking a wide range of music-related information,
and to provide a democratic mechanism for doing so.
Anybody can publish Music Ontology data and link it with existing data, in order to help
create a music-related web of data.
For example, John Doe may publish some information about a performance he saw last night (like the fact
that he was there, and a review). Mary Doe may publish the fact that she attended the same performance, that
she recorded it using her cell-phone, and that the corresponding item is available in her podcast.
The Music Ontology provides a vocabulary to express information ranging from this example to:
A particular arrangement of the Trout Quintet by Franz Schubert was interpreted in this performance.
This work was performed ten times, but only two of these performances were recorded.
Ten takes of this particular track have been recorded, each with a particular microphone location.
"Come as You Are" by Nirvana was released on a single and the "Nevermind" album.
During this gig, the band played ten songs. During the last one (a cover of "Eight Days a Week"),
the drummer from the support band joined them to play.
The Music Ontology is divided into three levels of expressiveness, from the simplest to the most complex.
Everything is clustered around the following categories:
Level 1: aims at providing a vocabulary for simple editorial information (tracks/artists/releases, etc.; see the example after this list)
Level 2: aims at providing a vocabulary for expressing the music creation workflow (composition, arrangement,
performance, recording, etc.)
Level 3: aims at providing a vocabulary for complex event decomposition, to express, for example, what happened during
a particular performance, what is the melody line of a particular work, etc.
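For instance, a minimal Level 1 description, sketched in Turtle (the instance URIs are illustrative; the classes and properties are the ones described later in this document):

@prefix mo: <http://purl.org/ontology/mo/> .
@prefix dc: <http://purl.org/dc/elements/1.1/> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .

# A group, a record, and one of its tracks (simple editorial information)
:nirvana a mo:MusicGroup ;
    foaf:name "Nirvana" .

:nevermind a mo:Record ;
    dc:title "Nevermind" ;
    foaf:maker :nirvana ;
    mo:track :come_as_you_are .

:come_as_you_are a mo:Track ;
    dc:title "Come as You Are" ;
    mo:track_number "3" .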
Terminology and Notation
The keywords "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
"SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this
document are to be interpreted as described in RFC 2119 [RFC 2119].
Namespace URIs of the general form "http://www.example.com/" represent some application-dependent
or context-dependent URI as defined in RFC 2396 [RFC 2396].
The XML Namespace URI that MUST be used by implementations of this specification is: http://purl.org/ontology/mo/
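In Turtle, for example, this corresponds to the prefix declaration:

@prefix mo: <http://purl.org/ontology/mo/> .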
An alphabetical index of Music Ontology terms, by class (categories or types), by
property and by individuals. All the terms are hyperlinked to their detailed description for quick reference.
External Ontologies Used by the Music Ontology at a Glance
An alphabetical index of the external ontology terms used by the Music Ontology, by class (categories or types), by
property and by individuals. All the terms are hyperlinked to their detailed description for quick reference.
A list of terms used within the level 1 of the Music Ontology,
therefore covering simple editorial information (tracks, releases, artists, groups, etc.).
A list of terms used within the level 2 of the Music Ontology,
therefore covering information about the music creation workflow (Composition, Performance, Arrangement, Recording, etc.).
A list of terms used within the level 3 of the Music Ontology,
therefore covering event decomposition ("this particular performer was playing that particular instrument at that particular time").
The Music Ontology definitions presented here are written using
a computer language (RDF/OWL) that makes it easy for software to
process some basic facts about the terms in the Music Ontology, and
consequently about the things described in Music Ontology documents. A Music Ontology
document, unlike a traditional Web page, can be combined with other
Music Ontology documents to create a unified database of information.
The following diagrams present the hierarchy of the Music Ontology classes. They show the interactions between the Music Ontology classes and the classes of other ontologies.
This specification serves as the Music Ontology "namespace document". As such
it describes the Music Ontology and the terms (RDF classes and properties) that
constitute it, so that Semantic
Web applications can use those terms in a variety of RDF-compatible
document formats and applications.
This document presents the Music Ontology as a Semantic Web vocabulary or Ontology.
The Music Ontology is straightforward, pragmatic and designed to allow
simultaneous deployment and extension, and is therefore intended for
widescale use.
Evolution and
Extension of the Music Ontology
The Music Ontology is identified by the namespace URI
'http://purl.org/ontology/mo/'.
Revisions and extensions of the Music Ontology are conducted through edits to the namespace document, which by convention is published on the Web at the namespace
URI.
The properties and types defined here provide some basic
concepts for use in Music Ontology descriptions. Other vocabularies (e.g. the
Dublin Core metadata elements for
simple bibliographic description, FOAF, etc.) can also be mixed in with the Music Ontology terms, as can local
extensions. The Music Ontology is designed to be extended, and modules may be added
at a later date.
Music Ontology Modules
Music Ontology modules may be used to extend the ontology and avoid making the base ontology too complex.
A list of available modules is available on the Wiki.
Time, TimeLine and Event
The parts of the Music Ontology related to the production process of a particular piece of music (composition, performance, arrangement,...) as well as the parts dealing with time-related information are based on three external ontologies. The Music Ontology provides RDFS wrappers for the main classes, properties and individuals of these three ontologies.
The first ontology is OWL-Time. Three terms of this ontology are used by the Music Ontology: TemporalThing, Instant and Interval.
However, the kind of temporal information we may want to express goes a bit beyond OWL-Time, so we use an extension of it, developed at the Centre for Digital Music, Queen Mary, University of London: the TimeLine ontology. Indeed, we may want to express instants and intervals on multiple "timelines" (a timeline being a coherent backbone for temporal things): the one backing a particular audio file, the one behind an audio/video stream, or the physical one, backing a musical performance. Two classes of timelines are defined: PhysicalTimeLine (an instance of it being universaltimeline, which is the one on which we may address "the 13th of October, 2006"), and RelativeTimeLine (instances of this class may back audio signals, and we may address things such as "between 2 and 3 seconds" on them).
There is only one way of addressing temporal things per class of timeline. On a physical timeline, a point is identified by an xsd:dateTime (through the beginsAtDateTime property), and a duration by an xsd:duration (through the durationXSD property). On a relative timeline, a point P is identified by the duration of the interval [0,P], and this duration is expressed as an xsd:duration (through the beginsAtDuration property). A duration is identified by durationXSD.
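As an illustrative sketch (the tl: namespace URI and the tl:Interval class are assumed from the TimeLine ontology; the instance names are made up), an interval starting 2 seconds into the timeline backing a signal could be written:

@prefix tl: <http://purl.org/NET/c4dm/timeline.owl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

:signal_timeline a tl:RelativeTimeLine .

:chorus a tl:Interval ;
    tl:onTimeLine :signal_timeline ;            # defined on the signal's timeline
    tl:beginsAtDuration "PT2S"^^xsd:duration ;  # starts 2 seconds after point 0
    tl:durationXSD "PT30S"^^xsd:duration .      # and lasts 30 seconds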
In order to express knowledge about the production process of a piece of music, we use the Event ontology, also developed at the Centre for Digital Music. Events are seen as a way of arbitrarily classifying a space/time region. We have the possibility of attaching to these events: agents (active participants in the event, like a performer, a sound engineer, ...), factors (passive things having a role in the event, like a musical instrument, a musical score, ...) and products (things produced by the event, such as a sound, a musical work, ...). A key feature of this ontology is also to allow the "splitting" of events, through the transitive sub_event property. Using events, we may express: this musician was playing this instrument at this given time.
In the current version of the Music Ontology, the main sub-classes of Event are Performance, Recording, Arrangement and Composition. However, given its abstract definition, we can describe lots of other things using this class: results of feature extraction, beat tracking, segmentation of songs, etc.
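A sketch of such an event description (the event: namespace URI is assumed; the hasAgent/hasFactor property names follow the mentions elsewhere in this document):

@prefix event: <http://purl.org/NET/c4dm/event.owl#> .
@prefix mo: <http://purl.org/ontology/mo/> .

:performance a mo:Performance ;
    event:hasAgent :flutist ;    # active participant: the performer
    event:hasFactor :flute ;     # passive thing with a role: the instrument
    event:hasProduct :sound .    # what the event produced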
Music Creation Workflow
In order to describe music-related events, we describe the
workflow that begins with the creation of a musical work and ends with
its release on a particular record. This is our main description
paradigm, and was first used in the music production ontology
developed at C4DM.
In the "easy" case (non-electronic music), We can describe this
workflow within two boundaries: the simplest one and the most expressive one.
The simplest one consider the existence of 4 concepts within this workflow:
MusicalWork (the musical work itself), Performance (the event
corresponding to an actual performance
of the work), Signal (recording the performance as a signal),
and MusicalManifestation
(the release of this signal on a particular record).
The most expressive one considers the existence of seven concepts:
Composition (the event leading to the creation of a musical work),
MusicalWork, Performance, Sound (the physical sound
produced by the performance), Recording (the event representing the
transduction from a physical sound to a signal, through
the use of a microphone), Signal, and MusicalManifestation.
Thus, we could imagine other ontologies plugged on top of MO, in order
to represent the cognition of a sound (related to Sound),
the different types of microphones that can be used (related to
Recording), and so on.
In order to switch from the simplest workflow to the most expressive
one, we define a single shortcut property:
recorded_as, directly linking a Performance to a Signal.
This property MUST be present in every case, so that simple
queries can still access the simple information.
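As a sketch, the two descriptions may coexist as follows (prefix declarations omitted; instance names are made up; event:hasProduct is assumed from the Event ontology):

# The simple path: the mandatory shortcut
:performance a mo:Performance ;
    mo:performance_of :work ;
    mo:recorded_as :signal .

# The expressive path, for advanced descriptions
:performance event:hasProduct :sound .
:sound a mo:Sound ;
    mo:recorded_in :recording .
:recording a mo:Recording ;
    mo:produced_signal :signal .
:signal a mo:Signal .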
The Music Ontology and Standards
It is important to understand that the Music Ontology as specified in this document is not a standard
in the sense of ISO Standardisation,
or that associated with W3C Process.
The Music Ontology depends heavily on W3C's standards work, specifically on XML, XML Namespaces, RDF, and OWL.
All the Music Ontology documents must be well-formed RDF/XML documents.
This specification contributes an ontology, the "Music Ontology", to the Semantic
Web, specifying it using W3C's Resource
Description Framework (RDF). As such, the Music Ontology adopts by reference
a syntax (using XML), a data model (RDF graphs) and a mathematically
grounded definition for the rules that underpin the RDF design.
The Music Ontology is an application of the Resource Description Framework (RDF) because the subject area we're describing (music: artists, albums and tracks) has so many competing requirements that a standalone format would not capture them, or would lead to trying to describe these requirements in a number of incompatible formats. By using RDF, the Music Ontology gains a powerful extensibility mechanism, allowing Music-Ontology-based descriptions to be mixed with claims made in any other RDF vocabulary.
It is for that reason that the RDF document examples at the beginning of this document suggested using the FOAF ontology to describe a Person and their relations with other Persons (artists or non-artists), and using the Music Ontology to specify that this Person is also an Artist. That way, we can re-use other ontologies, such as the Relationship ontology, in conjunction with the FOAF ontology, to describe different relationships between Artists.
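A sketch of this mixing (prefix declarations omitted; the typing of the person as mo:MusicArtist is an assumption consistent with the description above):

:john a foaf:Person , mo:MusicArtist ;  # a Person who is also an Artist
    foaf:name "John Doe" ;
    foaf:knows :mary .                  # a FOAF relation to another Person

:mary a foaf:Person ;
    foaf:name "Mary Doe" .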
The Music Ontology as an ontology cannot incorporate everything we might want to talk about that is related to music: artists (people), albums and tracks. Instead of covering all topics within the Music Ontology itself, we describe the basic topics and build into a larger framework - RDF - that allows us to take advantage of work elsewhere on more specific description vocabularies.
RDF provides the Music Ontology with a way to mix together different descriptive vocabularies in a consistent way. Vocabularies can be created by different communities and groups as appropriate and mixed together as required, without needing any centralized agreement on how terms from different vocabularies can be written down in XML or N3.
Check the Ontology namespaces referenced section to find some ontologies that can be used in conjunction with the Music Ontology.
This mixing happens in two ways: firstly, RDF provides an underlying model of (typed) objects and their attributes or relationships. mo:Album is an example of a type of object (a "class"), while mo:compilation_of and mo:has_track are examples of a relationship and an attribute of an mo:Album; in RDF we call these "properties". Any vocabulary described in RDF shares this basic model, which is discernible in the syntax for RDF, and which removes one level of confusion in understanding a given vocabulary, making it simpler to comprehend and therefore reuse a vocabulary that you have not written yourself. This is the minimal self-documentation that RDF gives you.
Secondly, there are mechanisms for saying which RDF properties are connected to which classes, and how different classes are related to each other, using RDF Schema and OWL. These can be quite general (all RDF properties by default come from an rdf:Resource, for example) or very specific and precise (for example by using OWL constructs). This is another form of self-documentation, which allows you to connect different vocabularies together as you please.
In summary then, RDF is self-documenting in ways which enable the creation and combination of vocabularies in a devolved manner. This is particularly important for an ontology describing a domain like music, which connects to many other domains of interest that it would be impossible (as well as suboptimal) for a single group to describe adequately in non-geological time.
RDF is usually written using the XML or N3 syntaxes. If you want to process the data, you will need to use one of the many RDF toolkits available, such as Jena (Java) or Redland (C).
More information about RDF can be found in the RDF Primer.
The Music Ontology cross-reference: Classes, Properties and Individuals
AbstractTimeLine
- Abstract timelines may be used as a backbone for Scores, Works, ...
This allows TimeLine maps to relate works to a given performance (this note was played at this time).
No coordinate systems are defined for these timelines. Their structure is implicitly defined
by the relations between the time objects defined on them (e.g. this note is before this note, which is
before this silence, which is at the same time as this note).
Composition
- A composition event.
It takes the composer as agent.
It produces a MusicalWork, a MusicalExpression (when the initial "product" is a score, for example), or both.
Event
-
An event: a way of arbitrarily classifying a space/time region.
An event has agents (active entities contributing to the event -- a performer, a composer, an engineer, ...),
factors (passive entities contributing to the event -- flute, score, ...),
and products (things produced by the event -- sound, signal, score, ...). For
example, we may describe as Events: performances, composition events, recordings, arrangements,
creation of a musical group, separation of a musical group,
but also sounds, signals, notes (in a score)...
Festival
- A festival - musical/artistic event lasting several days, like Glastonbury, Rock Am Ring...
We might decompose this event (which is in fact just a classification of the space/time region related to
a particular festival), using hasSubEvent, into several performances at different places and times.
Genre
- Any taxonomy can be plugged in here. You can either define a genre by yourself, like this:
:mygenre a mo:Genre; dc:title "electro rock".
Or you can refer to a DBpedia genre (such as http://dbpedia.org/resource/Baroque_music), allowing Semantic Web
clients to easily access detailed structured information about the genre you are referring to.
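For example, the following sketch links a performance to that DBpedia genre (mo:genre associates an event to a genre, as described below):

:myperformance a mo:Performance ;
    mo:genre <http://dbpedia.org/resource/Baroque_music> .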
Instrument
- Any of various devices or contrivances that can be used to produce musical tones or sound.
Any taxonomy can be used to subsume this concept. The default one is the one extracted by Ivan Herman
from the Musicbrainz instrument taxonomy, conforming to SKOS. This concept holds a seeAlso link
towards this taxonomy.
Instrumentation
- Instrumentation deals with the techniques of writing music for a specific instrument,
including the limitations of the instrument, playing techniques and idiomatic handling of the instrument.
Movement
- A movement is a self-contained part of a musical work. While individual or selected movements from a composition are sometimes performed separately, a performance of the complete work requires all the movements to be performed in succession.
Often a composer attempts to interrelate the movements thematically, or sometimes in more subtle ways, in order that the individual
movements exert a cumulative effect. In some forms, composers sometimes link the movements, or ask for them to be played without a
pause between them.
MusicalExpression
- The intellectual or artistic realization of a work in the form of alpha-numeric, musical, or choreographic notation, sound, etc., or any combination of such forms.
For example:
Work #1 Franz Schubert's Trout quintet
* Expression #1 the composer's score
* Expression #2 sound issued from the performance by the Amadeus Quartet and Hephzibah Menuhin on piano
* Expression #3 sound issued from the performance by the Cleveland Quartet and Yo-Yo Ma on the cello
* . . . .
The Music Ontology defines the following sub-concepts of MusicalExpression, which should be used instead of MusicalExpression itself: Score (the
result of an arrangement), Sound (produced during a performance), and Signal. However, it is possible to stick to FRBR and bypass the workflow
mechanism this ontology defines by using the core FRBR properties on such objects. But it is often better to use events to interconnect such
expressions (allowing one to go deeply into the production process: "this performer was playing this particular instrument at that
particular time").
MusicalItem
- A single exemplar of a musical expression.
For example, it could be a single exemplar of a CD. This is normally a single object (a CD) possessed by somebody.
From the FRBR final report: The entity defined as item is a concrete entity. It is in many instances a single physical object (e.g., a copy of a one-volume monograph, a single audio cassette, etc.). There are instances, however, where the entity defined as item comprises more than one physical object (e.g., a monograph issued as two separately bound volumes, a recording issued on three separate compact discs, etc.).
In terms of intellectual content and physical form, an item exemplifying a manifestation is normally the same as the manifestation itself. However, variations may occur from one item to another, even when the items exemplify the same manifestation, where those variations are the result of actions external to the intent of the producer of the manifestation (e.g., damage occurring after the item was produced, binding performed by a library, etc.).
MusicalManifestation
-
This entity is related to the edition/production/publication of a musical expression. Musical manifestations are closely related to the music industry and its terms, concepts, definitions and methods (production, publication, etc.).
From the FRBR final report: The entity defined as manifestation encompasses a wide range of materials, including manuscripts, books, periodicals, maps, posters, sound recordings, films, video recordings, CD-ROMs, multimedia kits, etc. As an entity, manifestation represents all the physical objects that bear the same characteristics, in respect to both intellectual content and physical form.
Work #1 J. S. Bach's Six suites for unaccompanied cello
* Expression #1 sound issued during the performance by Janos Starker recorded in 1963 and 1965
o Manifestation #1 recordings released on 33 1/3 rpm sound discs in 1965 by Mercury
o Manifestation #2 recordings re-released on compact disc in 1991 by Mercury
* Expression #2 sound issued during the performances by Yo-Yo Ma recorded in 1983
o Manifestation #1 recordings released on 33 1/3 rpm sound discs in 1983 by CBS Records
o Manifestation #2 recordings re-released on compact disc in 1992 by CBS Records
Changes that occur deliberately or even inadvertently in the production process that affect the copies result, strictly speaking, in a new manifestation. A manifestation resulting from such a change may be identified as a particular "state" or "issue" of the publication.
Changes that occur to an individual copy after the production process is complete (e.g., the loss of a page, rebinding, etc.) are not considered to result in a new manifestation. That copy is simply considered to be an exemplar (or item) of the manifestation that deviates from the copy as produced.
With the entity defined as manifestation we can describe the physical characteristics of a set of items and the characteristics associated with the production and distribution of that set of items that may be important factors in enabling users to choose a manifestation appropriate to their physical needs and constraints, and to identify and acquire a copy of that manifestation.
Defining manifestation as an entity also enables us to draw relationships between specific manifestations of a work. We can use the relationships between manifestations to identify, for example, the specific publication that was used to create a microreproduction.
MusicalWork
- Distinct intellectual or artistic musical creation.
From the FRBR final report: A work is an abstract entity; there is no single material object one can point to as the work. We recognize the work through individual realizations or expressions of the work, but the work itself exists only in the commonality of
content between and among the various expressions of the work. When we speak of Homer's Iliad as a work, our point of reference is not a particular recitation or text of the work, but the intellectual creation that lies behind all the various expressions of the work.
OriginMap
- This timeline map represents the relation between the physical timeline and a
continuous timeline whose origin is at a given point on the physical timeline
(e.g. "the timeline backing this signal corresponds
to the physical timeline: point 0 on this timeline corresponds to the
20th of December at 5pm").
Performance
- A performance event.
It might include as agents performers, engineers, conductors, or even listeners.
It might include as factors a score, a MusicalWork, musical instruments.
It might produce a sound:-)
Recording
- A recording event.
Takes a sound as a factor to produce a signal (analog or digital).
The location of such events (if any) is the actual location of the corresponding
microphone or the "recording device".
RelativeTimeLine
- A semi-infinite continuous timeline. Instances of RelativeTimeLine can
back audio/video signals, sounds. Such timelines can
be linked to a physical time line using the OriginMap.
Score
- Here, we are dealing with the informational object (the MusicalExpression), not the actually "published" score.
This may be, for example, the product of an arrangement process.
Show
- A show: a musical event lasting several days, in a particular venue. Examples can be
"The Magic Flute" at the Opera Bastille in August 2005, or a musical in the West End...
TimeLine
-
A time line -- a coherent "backbone" for addressing points and intervals.
We can consider the timeline backing an audio/video signal, the one
corresponding to the "physical" time, or even the one backing a score.
Here, we consider that the timeline is *also* its coordinate system, for
simplification purposes. In the DL version of the timeline ontology,
coordinate systems are defined through restrictions on the way to
address time points/intervals on a timeline.
TimeLineMap
- Two timelines can be related, such as the one backing a continuous signal and
the one backing the digitized signal. This sort of relation is expressed through an instance
of a TimeLineMap (e.g. "the timeline backing this signal corresponds
to the physical timeline: point 0 on this timeline corresponds to the
20th of December at 5pm").
atDuration - Places a time point on an abstract timeline by expressing its distance from
the point 0 as an xsd:duration (example: this instant is at 2s after 0 --> "PT2S")
available_as - Relates a musical manifestation to a musical item (this album, and my particular CD). By using
this property, there is no assumption on whether the full content is available on the linked item.
To be explicit about this, you can use a sub-property, such as mo:item (the full manifestation
is available on that item) or mo:preview (only a part of the manifestation is available on
that item).
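A sketch using this property and its sub-properties (the item and the URL are made up):

:album a mo:Record ;
    mo:item :my_cd ;                                   # the full manifestation is on this item
    mo:preview <http://www.example.com/snippet.ogg> .  # only a part of it is available here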
beginsAtDuration - Links an interval on a semi-infinite continuous timeline to its
start point, addressed using xsd:duration (duration between 0 and the
start point).
bpm - Indicates the BPM of a MusicalWork or a particular Performance.
Beats per minute: the pace of music measured by the number of beats occurring in 60 seconds.
composed_in - Associates a MusicalWork to the Composition event pertaining
to its creation. For example, I might use this property to associate
the Magic Flute to its composition event, occurring during 1782 and having as
a mo:composer Mozart.
composer - Associates a composition event to the actual composer. For example,
this property could link the event corresponding to the composition of the
Magic Flute in 1782 to Mozart himself (who obviously has a FOAF profile:-) ).
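Sketched in Turtle, the Magic Flute example used in these two descriptions might read (instance URIs are illustrative):

:magic_flute a mo:MusicalWork ;
    mo:composed_in :composition_event .

:composition_event a mo:Composition ;
    mo:composer :mozart ;
    mo:produced_work :magic_flute .  # the inverse direction, see produced_work below

:mozart foaf:name "Wolfgang Amadeus Mozart" .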
discography - Used to link an artist to an online discography of their musical works. The discography should provide a summary of each released musical work of the artist.
djmix_of - Indicates that all (or most of) the tracks of a musical work or the expression of a musical work were mixed together from all (or most of) the tracks from another musical work or the expression of a musical work to form a so-called DJ-Mix.
The tracks might have been altered by pitching (so that the tempo of one track matches the tempo of the following track) and fading (so that one track blends in smoothly with the other). If the tracks have been more substantially altered, the "mo:remix" relationship type is more appropriate.
djmixed - Used to relate an artist to a musical work or the expression of a musical work that they djmixed.
The artist usually selected the tracks, chose their sequence, and slightly changed them by fading (so that one track blends in smoothly with the other) or pitching (so that the tempo of one track matches the tempo of the following track). This applies to a 'Mixtape' in which all tracks were DJ-mixed together into one single long track.
djmixed_by - Used to relate a work or the expression of a work to an artist who djmixed it.
The artist usually selected the tracks, chose their sequence, and slightly changed them by fading (so that one track blends in smoothly with the other) or pitching (so that the tempo of one track matches the tempo of the following track). This applies to a 'Mixtape' in which all tracks were DJ-mixed together into one single long track.
download - This property can be used to link from a person to the website where they make their works available, or from
a manifestation (a track or an album, for example) to a web page where it is available for
download.
It is better to use one of the three sub-properties instead of this one in order to specify whether the
content can be accessed for free (mo:freedownload), if it is just free preview material (mo:previewdownload), or
if it can be accessed for some money (mo:paiddownload) (this includes links to the Amazon store, for example).
This property MUST be used only if the content is just available through a web page (holding, for example
a Flash application) - it is better to link to actual content directly through the use of mo:available_as and
mo:Stream, mo:Torrent or mo:ED2K, etc. That way, Semantic Web user agents that don't know how to read HTML, let
alone rip streams from Flash applications, can still access the audio content.
free_download - This property can be used to link from a person to the website where they make their works available, or from
a manifestation (a track or an album, for example) to a web page where it is available for free
download.
This property MUST be used only if the content is just available through a web page (holding, for example
a Flash application) - it is better to link to actual content directly through the use of mo:available_as and
mo:Stream, mo:Torrent or mo:ED2K, etc. That way, Semantic Web user agents that don't know how to read HTML, let
alone rip streams from Flash applications, can still access the audio content.
genre - Associates an event (like a performance or a recording) to a particular musical genre.
Future versions of this property may also include works and scores in its domain.
image - Indicates a pictorial image (JPEG, GIF, PNG, etc.) of a musical work, the expression of a musical work, the manifestation of a work or the exemplar of a manifestation.
intervalDuring - One of Allen's relations. Specifies that an interval occurs during another.
It is really handy to express things like "it happened on the 15th of August, but I do not remember exactly when".
isrc - The ISRC (International Standard Recording Code) is the international identification system for sound recordings and music video recordings.
Each ISRC is a unique and permanent identifier for a specific recording which can be permanently encoded into a product as its digital fingerprint.
Encoded ISRCs provide the means to automatically identify recordings for royalty payments.
item - Relates a musical manifestation to a musical item (this album, and my particular cd) holding the
entire manifestation, and not just a part of it.
key - Indicates the key used by the musicians during a performance, or the key of a MusicalWork.
Any of 24 major or minor diatonic scales that provide the tonal framework for a piece of music.
maker - Relates a manifestation to an agent who contributed to create it.
This property might be used for a weak assertion of such a relationship. In case we want
to attach a more concrete role (this agent performed, or was the composer, etc.), we must
use mo:Performer, mo:MusicalWork/mo:Composition, etc. This indeed allows one to specify where a
particular agent took part in the actual workflow.
mashup_of - Indicates that musical works or the expressions of a musical work were mashed up on this album or track.
This means that two musical works or the expressions of a musical work by different artists are mixed together, over each other, or otherwise combined into a single musical work (usually by a third artist, the remixer).
onTimeLine - Links an instant or an interval to the timeline it is defined on (e.g. "1970 is defined on the
timeline universaltimeline", or "the interval between 0 and 2 minutes is defined on the
timeline backing this sound and this signal").
paid_download - Provides a link from an artist to a web page where all of that artist's musical work is available for some money,
or a link from a manifestation (record/track, for example) to a web page providing a paid access to this manifestation.
performance_of - Associates a Performance to a musical work or an arrangement that is being used as a factor in it.
For example, I might use this property to attach the Magic Flute musical work to
a particular Performance.
performed_in - Associates a MusicalWork or a Score to Performances in which they were
a factor. For example, I might use this property in order to
associate the Magic Flute to a particular performance at the Opera
Bastille last year.
preview - Relates a musical manifestation to a musical item (this album, and my particular cd), which holds
a preview of the manifestation (e.g. one track for an album, or a snippet for a track).
preview_download - This property can be used to link from a person to the website where they make previews of their works available, or from
a manifestation (a track or an album, for example) to a web page where a preview download is available.
This property MUST be used only if the content is just available through a web page (holding, for example
a Flash application) - it is better to link to actual content directly through the use of mo:available_as and
mo:Stream, mo:Torrent or mo:ED2K, etc. That way, Semantic Web user agents that don't know how to read HTML, let
alone rip streams from Flash applications, can still access the audio content.
produced_score - Associates an arrangement event to a score product (score here does not refer to a published score, but rather
to an abstract arrangement of a particular work).
produced_work - Associates a composition event to the produced MusicalWork. For example,
this property could link the event corresponding to the composition of the
Magic Flute in 1782 to the Magic Flute musical work itself. This musical work
can then be used in particular performances.
puid - Links a signal to the PUIDs associated with it, that is, PUIDs computed from MusicalItems (mo:AudioFile)
derived from this signal.
PUIDs (Portable Unique IDentifier) are the IDs used in the
proprietary MusicDNS AudioFingerprinting system which is operated by MusicIP.
Using PUIDs, one can (with some luck) identify the Signal object associated with a particular audio file, therefore allowing
access to further information (on which release is this track featured? etc.). Using some more metadata, one can identify
the particular Track corresponding to the audio file (a track on a particular release).
recorded_as - This is a shortcut property, allowing one to bypass all the Sound/Recording steps. This property
directly links a Performance to the recorded Signal. This is recommended for "normal"
users. However, advanced users wanting to express things such as the location of the microphone will
have to create this shortcut as well as the whole workflow, in order to let the "normal" users access
simply the, well, simple information:-) .
recorded_in - Associates a physical Sound to a Recording event where it is being used
in order to produce a signal. For example, I might use this property to
associate the sound produced by a particular performance of the Magic Flute
to a given recording, done using my cell-phone.
recording_of - Associates a Recording event to a physical Sound being recorded.
For example, I might use this property to
associate a given recording, done using my cell phone, to the
sound produced by a particular performance of the Magic Flute.
remaster_of - This relates two musical works or the expressions of a musical work, where one is a remaster of the other.
A remaster is a new version made for release from source recordings that were earlier released separately. This is usually done to improve the audio quality or adjust for more modern playback equipment. The process generally doesn't involve changing the music in any artistically important way. It may, however, result in tracks that are a few seconds longer or shorter.
remix_of - Used to relate a musical work to another musical work of which it is a remix: a substantially altered version produced by mixing together individual tracks or segments of the original source work.
remixed - Used to relate an artist to a musical work or the expression of a musical work that they remixed.
This involves taking just one other musical work and using audio editing to make it sound like a significantly different, but usually still recognisable, song. It can be used to link an artist to a single song that they remixed, or to an entire musical work that they remixed.
remixer - Used to relate a musical work or the expression of a musical work to an artist who remixed it.
This involves taking just one other musical work and using audio editing to make it sound like a significantly different, but usually still recognisable, song. It can be used to link an artist to a single song that they remixed, or to an entire musical work that they remixed.
sample_rate - Associates a digital signal to its sample rate. It might be easier to express it this way instead of
defining a timeline map:-) Range is xsd:float.
similar_to - A similarity relationship between two objects (so far, either an agent, a signal or a genre, but
this could grow).
This relationship is pretty general and doesn't make any assumptions about how the similarity claim
was derived.
Such similarity statements can come from a range of different sources (Musicbrainz similarities between
artists, or coming from some automatic content analysis).
However, the origin of such statements should be kept using a named graph approach - and ultimately, the
documents providing such statements should attach some metadata to themselves (confidence of the claim, etc.).
sub_event - Allows one to link an event to a sub-event. A sub-event might be an event split by time,
space, agents, factors... This property can be used to express things such as "during
this performance, this person was playing this instrument at this particular time", through
the creation of a sub-event occurring at this given time and having as agent the person and
as factor the instrument.
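A sketch of such a decomposition (the event: prefix is assumed as above; instance names are made up):

:performance a mo:Performance ;
    event:sub_event :solo .

:solo event:hasAgent :jane ;      # the person playing
    event:hasFactor :violin ;     # the instrument she was playing
    event:time :solo_interval .   # when, within the performance, this happened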
tempo - Rate of speed or pace of music. Tempo markings are traditionally given in Italian;
common markings include: grave (solemn; very, very slow); largo (broad; very slow);
adagio (quite slow); andante (a walking pace); moderato (moderate); allegro (fast; cheerful);
vivace (lively); presto (very fast); accelerando (getting faster); ritardando (getting slower);
and a tempo (in time; returning to the original pace).
tribute_to - Indicates a musical work or the expression of a musical work that is a tribute to an artist - normally consisting of music being composed by the artist but performed by other artists.
trmid - Indicates the TRMID of a track.
TRM IDs come from MusicBrainz' old audio fingerprinting system.
TRM (TRM Recognizes Music) IDs are (somewhat) unique IDs that represent
the audio signature of a musical piece (see AudioFingerprint).
wikipedia - Used to link a work, an expression of a work, a manifestation of a work,
a person, an instrument or a musical genre to its corresponding Wikipedia page.
The full URL should be used, not just the WikiName.
compilation - Collection of previously released manifestations of a musical expression by one or more artists.
This is a type of MusicalManifestation defined by the music industry.
promotion - A giveaway musical work or the expression of a musical work intended to promote an upcoming official musical work or the expression of a musical work.
spoken word - Spoken word is a form of music or artistic performance in which lyrics, poetry, or stories are spoken rather than sung.
Spoken-word is often done with a musical background, but emphasis is kept on the speaker.
This is a type of MusicalManifestation defined by the music industry.
universal time line - The "canonical" physical timeline, on which points/intervals are addressed through UTC.
(Remember: we conflate timelines and coordinate systems here, as we
choose one canonical coordinate system per timeline.)
"Functional Requirements for Bibliographic Records - Final Report", 1998 by International Federation of Library Associations and Institutions, ISBN 3-598-11382-X. (http://www.ifla.org/VII/s13/frbr/frbr.htm.)
[RFC 2119] S. Bradner,
"Key words for use in RFCs to Indicate Requirement Levels," RFC 2119, Harvard University, March 1997. (http://www.ietf.org/rfc/rfc2119.txt.)
[RFC 2396] T. Berners-Lee, et al., "Uniform Resource Identifiers
(URI): Generic Syntax," RFC 2396, Internet Engineering Task Force, August 1998. (http://www.ietf.org/rfc/rfc2396.txt)
Change Log
2008-02-05: Revision 1.13 of the Music Ontology Specification
FIXED: wrong bio namespace
ADDED: owl:imports statements, to make the Music Ontology load fine in Protege
ADDED: OWL typing for all properties
ADDED: XSD types as ranges for literal properties
2007-08-10: Revision 1.12 of the Music Ontology Specification
ADDED: mo:availableAs subPropertyOf frbr:exemplar
ADDED: mo:Torrent as medium (bittorrent items)
ADDED: mo:ED2K as medium (edonkey items)
ADDED: mo:usesScore (Performance --> Score (particular arrangement of a work))
ADDED: mo:preview_download (Agent or Manifestation to Document)
ADDED: mo:download, super-property of mo:previewdownload, mo:paiddownload and mo:freedownload
ADDED: mo:AudioFile (as a medium)
ADDED: two sub-properties of mo:availableAs -- mo:item (full manifestation available) and mo:preview (only a part of the manifestation is available)
ADDED: wrapper for foaf:maker, foaf:made, foaf:Agent, foaf:Person, foaf:Group, foaf:member
ADDED: mo:isrc (international standard recording code)
ADDED: mo:encodes (musical item --> signal), allowing one to associate, for example, an item to a lower-resolution version of the master signal (issued from a Recording event)
ADDED: mo:genre, subproperty of event:hasFactor
ADDED: mo:level, an annotation property to specify whether a term belongs to level 1, 2 or 3
ADDED: mo:sampled_version, inverse of sampled_version_of
ADDED: produced_score (Arrangement --> Score)
MODIFIED: mo:usesSound to mo:recording_of (still an owl:sameAs statement between these two for backward compatibility's sake)
MODIFIED: mo:usedInRecording to mo:recorded_in (same)
MODIFIED: mo:producesSignal to mo:produced_signal (same)
MODIFIED: mo:producesSound to mo:produced_sound (same)
MODIFIED: mo:usesWork to mo:performance_of (same)
MODIFIED: mo:productOfComposition to mo:composed_in (same)
MODIFIED: mo:usedInPerformance to mo:performed_in (same)
MODIFIED: mo:producesWork to mo:produced_work (same)
MODIFIED: mo:paiddownload to mo:paid_download (same)
MODIFIED: mo:freedownload to mo:free_download (same)
MODIFIED: mo:signalTime to mo:time (same)
MODIFIED: mo:sampledVersionOf to mo:sampled_version_of (same)
MODIFIED: mo:has_track to mo:track (same)
MODIFIED: mo:trackNum to mo:track_number (same)
MODIFIED: mo:releaseType to mo:release_type (same)
MODIFIED: mo:releaseStatus to mo:release_status (same)
MODIFIED: mo:eventHomePage to mo:event_homepage (same)
MODIFIED: mo:hasManifestation to mo:manifestation (same)
MODIFIED: mo:publishedAs to mo:published_as (same)
MODIFIED: mo:movementNum to mo:movement_number (same)
MODIFIED: mo:publicationOf to mo:publication_of (same)
MODIFIED: mo:publishingLocation to mo:publishing_location (same)
MODIFIED: mo:sampleRate to mo:sample_rate (same)
MODIFIED: mo:recordedAs to mo:recorded_as (same)
MODIFIED: mo:image subproperty of foaf:depiction
MODIFIED: changed comments and domain of freedownload and paiddownload
MODIFIED: changed comments on mo:MusicalWork
MODIFIED: changed comments on availableAs
MODIFIED: mo:encoding has MusicalItem as a domain
DELETED: mo:creatorOf - use foaf:made instead
DELETED: mo:pitch and mo:timbre - this should go to the audio features ontology
DELETED: mo:stream_url (it should be the identifier of the resource, e.g. rtsp://...)
DELETED: sub concepts of mo:Instrument - instead, a link to Ivan Herman's musical instrument SKOS taxonomy
DELETED: sub concepts of mo:Genre - instead, proper documentation of this concept
2007-03-21: Revision 1.11 of the Music Ontology Specification
ADDED: mo:Festival and mo:Show
ADDED: mo:eventHomePage
ADDED: records (inverse of recordedAs) and publicationOf (inverse of
publishedAs)
REMOVED: mo:record, use mo:publishedAs instead (which goes from
Signal,Score,Lyrics,Libretto to
Record,PublishedScore,PublishedLyrics,PublishedLibretto)
REMOVED: member_of, use foaf:member, now a MusicGroup is a foaf:Group
REMOVED: realization_of, and inverse, use a Performance, or a Performance
and a Recording
REMOVED: mo:duration, use event:time instead (a Work cannot have a
duration, btw). It seems like dc:format may be used for this as well (see
http://dublincore.org/documents/format-element/)
REMOVED: second occurrence of tempo
REMOVED: recordingstudio, use the place of a recording event instead
REMOVED: composer, use dc:creator instead
REMOVED: composed, use Composition
REMOVED: performed/performer/orchestrated/... use Performance / performer /
performed/orchestrator/... (subProperties of hasAgent)
REMOVED: provided_instrumentation / provider_of_instrumentation, use
Arrangement
REMOVED: wrote_lyrics, just use dc:creator on the Lyrics instance, and
this as a factor of the performance - same thing for Libretto
REMOVED: arranged, use Arrangement
REMOVED: instrument, trying to model a ternary predicate... use a
Performance
Changed: Release status is just linked to the Manifestation, not to the
Work or the Expression
Changed: bpm/tempo/key has as a domain both MusicalWork (unchanged) and Performance
(this is a factor of the performance, not of the "sound")
Changed: has_movement was pointing towards a bad URI
Changed: pitch/timbre has as a domain Sound (pitch for MusicalWork?)
Changed: djmixed has as a range MusicalManifestation (DJs work with
actual vinyls/tracks...)
Changed: second occurrence of djmixed to djmixed_by
Changed: remix_of goes from MusicalManifestation to MusicalManifestation
(this track is a remix of this other track?)
Changed: same thing for medley_of (this track is a medley of this track,
this track, this track)
2007-02-12: Revision 1.03 of the Music Ontology Specification
Changed the range of mo:key from rdfs:Literal to http://purl.org/NET/c4dm/music.owl#Key
Added property mo:opus
Added mo:MusicalWork in the domain of: mo:bpm, mo:duration, mo:key, mo:pitch
Added class mo:Movement
Added property mo:has_movement
Added property mo:movementNum
Added property mo:tempo
Added mo:Instrument in the range of mo:title
Removed mo:MusicalWork and mo:MusicalManifestation from, and added mo:MusicalManifestation to, the domain of the properties mo:publisher, mo:producer, mo:engineer, mo:conductor, mo:arranger, mo:sampler, mo:compiler
Removed mo:MusicalWork and mo:MusicalManifestation from, and added mo:MusicalManifestation to, the range of the properties mo:published, mo:produced, mo:engineering, mo:conducted, mo:arranged, mo:sampled, mo:compiled
2007-01-30: Revision 1.02 of the Music Ontology Specification
2007-01-06: Revision 1.01 of the Music Ontology Specification
2006-12-21: Initial version of the Music Ontology Specification