=Paper= {{Paper |id=Vol-1521/paper3 |storemode=property |title=Listener-aware Music Search and Recommendation |pdfUrl=https://ceur-ws.org/Vol-1521/paper3.pdf |volume=Vol-1521 |dblpUrl=https://dblp.org/rec/conf/pkdd/Schedl15a }}
                    Listener-aware Music Search and
                            Recommendation

                                         Markus Schedl

                            Department of Computational Perception
                            Johannes Kepler University, Linz, Austria
                                 http://www.cp.jku.at

        Abstract. Ubiquitous systems for music search, retrieval, and recommendation
        have recently been receiving considerable attention, both in academia and in
        industry. This is evidenced not least by the emergence of novel music streaming
        services and the resulting availability of millions of music pieces, easily
        accessible at the user’s fingertips, anywhere and anytime. In this
        keynote, I will report on two research directions we are currently pursuing in this
        context: (i) mining and analyzing social media data to improve music browsing
        and recommendation and (ii) exploiting sensor data for automatic music playlist
        modification on smart devices. As for (i), I will elaborate on the extraction and
        annotation of music listening events from social media, in particular Twitter and
        Last.fm, on the analysis of the data with respect to artist and song popularity
        and “mainstreaminess” of a country or population, and on exploiting this data to
        adapt music recommendation algorithms to user characteristics. Regarding (ii),
        I will detail our insights into the extent to which we can predict the
        context-specific music taste (e.g., genre or artist) of the listener from a variety
        of sensor data. Furthermore, an analysis of the considered user-centric feature
        categories (location, time, weather, activity, etc.) and their usefulness for this
        prediction will be provided. I will showcase our work using two prototype ap-
        plications: Music Tweet Map1 (MTM) and Mobile Music Genius2 (MMG). The
        former is a visualization and exploration tool for music listening behavior ex-
        tracted from microblogs. In addition to simple metadata-based search, it allows
        its users to browse music by time, location, similar artists (using a social similar-
        ity measure), artist and genre charts, and induced topics (by clustering according
        to tags). The latter is an intelligent mobile music player that learns which
        kind of music a listener prefers in which situation or context, and adapts the
        playlist accordingly.
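
To make the “mainstreaminess” notion mentioned above concrete, the following is a minimal sketch of one plausible formulation, namely the cosine similarity between a listener’s (or country’s) artist play-count distribution and the global one. The function name and the cosine formulation are illustrative assumptions and not necessarily the exact measure used in the reported work.

```python
from math import sqrt

def mainstreaminess(user_counts, global_counts):
    """Cosine similarity between a listener's (or country's) artist
    play-count distribution and the global distribution.

    Illustrative formulation only; the measure actually used in the
    work may be defined differently."""
    artists = set(user_counts) | set(global_counts)
    dot = sum(user_counts.get(a, 0) * global_counts.get(a, 0) for a in artists)
    norm_u = sqrt(sum(v * v for v in user_counts.values()))
    norm_g = sqrt(sum(v * v for v in global_counts.values()))
    if norm_u == 0 or norm_g == 0:
        return 0.0
    return dot / (norm_u * norm_g)
```

Under this formulation, a listener whose play counts mirror the global charts scores close to 1, while a listener with entirely non-charting artists scores 0.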


Acknowledgement
This work is supported by the EU-FP7 project no. 601166 (“PHENICX”) and by the
Austrian Science Fund (FWF): P25655.
    Copyright © 2015 by the paper’s authors. Copying permitted only for private and academic
    purposes. In: M. Atzmueller, F. Lemmerich (Eds.): Proceedings of the 6th International
    Workshop on Mining Ubiquitous and Social Environments (MUSE), co-located with ECML PKDD
    2015. Published at http://ceur-ws.org

1 http://www.cp.jku.at/projects/MusicTweetMap
2 http://www.cp.jku.at/projects/MMG