Tunebot

Music search engine

Tunebot is a music search engine developed by the Interactive Audio Lab at Northwestern University. Users can search its database by humming or singing a melody into a microphone, playing the melody on a virtual keyboard, or typing some of the lyrics, allowing them to identify a song they know only by its tune.

Searching techniques

Tunebot is a query-by-humming system. It compares a sung query to a database of musical themes using the pitch intervals between successive notes, so a user can sing in a different key than the target recording and still produce a match. The intervals are left unquantized to accommodate tunings other than the standard A = 440 Hz, since few singers have perfect pitch.
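
A minimal sketch of this interval representation, assuming each sung note has already been reduced to a fundamental frequency in Hz (the function name and pipeline are illustrative, not Tunebot's actual code):

    import math

    def pitch_intervals(frequencies_hz):
        # Convert note frequencies to a log-frequency (semitone) scale,
        # then take differences between consecutive notes. The intervals
        # are fractional, so out-of-tune or non-A440 singing is preserved
        # rather than snapped to the nearest semitone.
        semitones = [12 * math.log2(f) for f in frequencies_hz]
        return [b - a for a, b in zip(semitones, semitones[1:])]

    # The same melody sung in a different key yields the same intervals.
    melody     = [440.0, 493.9, 523.3]        # A4, B4, C5
    transposed = [f * 1.125 for f in melody]  # the whole melody shifted up
    print(pitch_intervals(melody))      # roughly [2.0, 1.0]
    print(pitch_intervals(transposed))  # same values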

In addition to pitch intervals, Tunebot compares a query with potential targets using the rhythmic ratios between successive note lengths. Because only ratios are used, the tempo of the performance does not affect the rhythmic similarity measure.
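
This tempo invariance can be sketched the same way, assuming note durations are given in seconds (again an illustrative helper, not Tunebot's code):

    def rhythm_ratios(durations_s):
        # Represent rhythm as the ratio of each note's length to the
        # previous note's length. A tempo change scales every duration
        # by the same factor, so the ratios are unchanged.
        return [b / a for a, b in zip(durations_s, durations_s[1:])]

    print(rhythm_ratios([0.50, 0.50, 1.00]))  # [1.0, 2.0]
    print(rhythm_ratios([0.40, 0.40, 0.80]))  # same rhythm at a faster tempo: [1.0, 2.0]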

Queries and targets are then matched with a weighted string alignment algorithm that combines the note intervals and rhythmic ratios.
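
One way to realize such a weighted alignment is a dynamic-programming edit distance over per-note (interval, ratio) pairs; the cost function and weights below are placeholder assumptions rather than Tunebot's published parameters:

    def note_cost(q, t, w_pitch=1.0, w_rhythm=0.5):
        # Weighted mismatch cost between a query transition q and a target
        # transition t, each a (pitch_interval, rhythm_ratio) pair.
        return w_pitch * abs(q[0] - t[0]) + w_rhythm * abs(q[1] - t[1])

    def align(query, target, gap=1.0, w_pitch=1.0, w_rhythm=0.5):
        # Global string alignment (Needleman-Wunsch style): lower scores
        # mean the sung query matches the target melody more closely.
        n, m = len(query), len(target)
        d = [[0.0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            d[i][0] = i * gap
        for j in range(1, m + 1):
            d[0][j] = j * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d[i][j] = min(
                    d[i - 1][j - 1] + note_cost(query[i - 1], target[j - 1], w_pitch, w_rhythm),
                    d[i - 1][j] + gap,  # note missing from the target
                    d[i][j - 1] + gap,  # extra note in the query
                )
        return d[n][m]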

Database

The database consists of unaccompanied melodies sung by contributors (a cappella). Contributors log into the website and sing their examples to the system. Each of these recordings is associated with a corresponding song on Amazon. A sung query is compared to these examples. A cappella sung examples are used as search keys because it is much easier to compare one unaccompanied vocal (the sung query) to another (an example search key) than it is to compare an unaccompanied vocal to a full band recording, which may contain guitar, drums, other singers, sound effects, etc.
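
In outline, a search then amounts to ranking every sung key by its alignment distance to the query. The sketch below reuses the hypothetical align helper above; the record fields are illustrative:

    def search(query_transitions, database):
        # Each database entry is assumed to hold the key's (interval, ratio)
        # transition sequence plus the song it links to on Amazon.
        scored = [(align(query_transitions, entry["transitions"]), entry["song"])
                  for entry in database]
        scored.sort(key=lambda pair: pair[0])
        return [song for _, song in scored]  # best matches first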

Distinguishing features

Tunebot learns from user input, and its results improve as each user submits more queries. Since no one sings perfectly in tune every time, the search engine must account for each user's tendencies. By choosing a song from the list of ranked results, users tell Tunebot which answer was correct. Tunebot then pairs that song with the user's query, analyzes the differences, and runs a genetic algorithm that tweaks the parameters controlling how the system compares that user's queries to the targets. For instance, if a user sings rhythms inaccurately, the rhythmic component of the comparison is weighted less heavily in future queries.
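
A heavily simplified sketch of such a tuning step is shown below, using a mutation-only genetic algorithm over the pitch and rhythm weights of the hypothetical align helper above. The fitness definition and all constants are assumptions for illustration; the article does not describe Tunebot's actual parameterization:

    import random

    def fitness(weights, confirmed_pairs, distractor_keys):
        # Count how often the user-confirmed target is closer to the query
        # than the best-scoring distractor key under these weights.
        w_pitch, w_rhythm = weights
        hits = 0
        for query, target in confirmed_pairs:
            right = align(query, target, w_pitch=w_pitch, w_rhythm=w_rhythm)
            wrong = min(align(query, key, w_pitch=w_pitch, w_rhythm=w_rhythm)
                        for key in distractor_keys)
            if right < wrong:
                hits += 1
        return hits

    def evolve_weights(confirmed_pairs, distractor_keys, generations=20, pop_size=16):
        # Keep the better half of the population each generation and refill
        # it with mutated copies, nudging the weights toward settings that
        # reproduce the user's confirmed matches.
        population = [(random.uniform(0.1, 2.0), random.uniform(0.1, 2.0))
                      for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=lambda w: fitness(w, confirmed_pairs, distractor_keys),
                            reverse=True)
            survivors = population[:pop_size // 2]
            mutants = [(max(0.01, wp + random.gauss(0, 0.1)),
                        max(0.01, wr + random.gauss(0, 0.1)))
                       for wp, wr in survivors]
            population = survivors + mutants
        return max(population, key=lambda w: fitness(w, confirmed_pairs, distractor_keys))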
