Mopidy-2.0.0/ 0000775 0001750 0001750 00000000000 12660436443 013170 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/PKG-INFO 0000664 0001750 0001750 00000010635 12660436443 014272 0 ustar jodal jodal 0000000 0000000 Metadata-Version: 1.1
Name: Mopidy
Version: 2.0.0
Summary: Music server with MPD and Spotify support
Home-page: http://www.mopidy.com/
Author: Stein Magnus Jodal
Author-email: stein.magnus@jodal.no
License: Apache License, Version 2.0
Description: ******
Mopidy
******
Mopidy is an extensible music server written in Python.
Mopidy plays music from local disk, Spotify, SoundCloud, Google Play Music, and
more. You edit the playlist from any phone, tablet, or computer using a range
of MPD and web clients.
**Stream music from the cloud**
Vanilla Mopidy only plays music from your local disk and radio streams.
Through extensions, Mopidy can play music from cloud services like Spotify,
SoundCloud, and Google Play Music. With Mopidy's extension support, backends
for new music sources can be easily added.
**Mopidy is just a server**
Mopidy is a Python application that runs in a terminal or in the background on
Linux computers or Macs that have network connectivity and audio output. Out of
the box, Mopidy is an MPD and HTTP server. Additional frontends for controlling
Mopidy can be installed from extensions.
**Everybody uses their favorite client**
You and the people around you can all connect your favorite MPD or web clients
to the Mopidy server to search for music and manage the playlist together. With
MPD clients available for all popular operating systems and web clients running
in any browser, you can control the music from any phone, tablet, or computer.
**Mopidy on Raspberry Pi**
The Raspberry Pi is a popular device to run Mopidy on, either using Raspbian or
Arch Linux. It is quite slow, but it is very affordable. In fact, the
Kickstarter funded Gramofon: Modern Cloud Jukebox project used Mopidy on a
Raspberry Pi to prototype the Gramofon device. Mopidy is also a major building
block in the Pi Musicbox integrated audio jukebox system for Raspberry Pi.
**Mopidy is hackable**
Mopidy's extension support and its Python, JSON-RPC, and JavaScript APIs make
Mopidy perfect for building your own hacks. In one project, a Raspberry Pi was
embedded in an old cassette player. The buttons and volume control are wired up
with GPIO on the Raspberry Pi and are used to control playback through a custom
Mopidy extension. The cassettes have NFC tags used to select playlists from
Spotify.
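For illustration, a rough sketch of talking to a running Mopidy instance over
its HTTP JSON-RPC API (this assumes the HTTP extension is enabled on the
default port 6680, and uses the ``requests`` library purely for brevity)::

    import json
    import requests

    rpc_call = {'jsonrpc': '2.0', 'id': 1,
                'method': 'core.playback.get_state'}
    response = requests.post(
        'http://localhost:6680/mopidy/rpc',
        data=json.dumps(rpc_call),
        headers={'Content-Type': 'application/json'})
    print(response.json()['result'])  # e.g. 'stopped'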
To get started with Mopidy, check out
`the installation docs `_.
- `Documentation `_
- `Discussion forum `_
- `Source code `_
- `Issue tracker `_
- IRC: ``#mopidy`` at `irc.freenode.net `_
- Announcement list: `mopidy@googlegroups.com `_
- Twitter: `@mopidy `_
.. image:: https://img.shields.io/pypi/v/Mopidy.svg?style=flat
:target: https://pypi.python.org/pypi/Mopidy/
:alt: Latest PyPI version
.. image:: https://img.shields.io/pypi/dm/Mopidy.svg?style=flat
:target: https://pypi.python.org/pypi/Mopidy/
:alt: Number of PyPI downloads
.. image:: https://img.shields.io/travis/mopidy/mopidy/develop.svg?style=flat
:target: https://travis-ci.org/mopidy/mopidy
:alt: Travis CI build status
.. image:: https://img.shields.io/coveralls/mopidy/mopidy/develop.svg?style=flat
:target: https://coveralls.io/r/mopidy/mopidy?branch=develop
:alt: Test coverage
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: No Input/Output (Daemon)
Classifier: Intended Audience :: End Users/Desktop
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: MacOS :: MacOS X
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: Python :: 2.7
Classifier: Topic :: Multimedia :: Sound/Audio :: Players
Mopidy-2.0.0/tasks.py 0000664 0001750 0001750 00000002174 12505224626 014667 0 ustar jodal jodal 0000000 0000000 import sys
from invoke import run, task
@task
def docs(watch=False, warn=False):
if watch:
return watcher(docs)
run('make -C docs/ html', warn=warn)
@task
def test(path=None, coverage=False, watch=False, warn=False):
if watch:
return watcher(test, path=path, coverage=coverage)
path = path or 'tests/'
cmd = 'py.test'
if coverage:
cmd += ' --cov=mopidy --cov-report=term-missing'
cmd += ' %s' % path
run(cmd, pty=True, warn=warn)
@task
def lint(watch=False, warn=False):
if watch:
return watcher(lint)
run('flake8', warn=warn)
@task
def update_authors():
# Keep authors in the order of appearance and use awk to filter out dupes
run("git log --format='- %aN <%aE>' --reverse | awk '!x[$0]++' > AUTHORS")
def watcher(task, *args, **kwargs):
while True:
run('clear')
kwargs['warn'] = True
task(*args, **kwargs)
try:
run(
'inotifywait -q -e create -e modify -e delete '
'--exclude ".*\.(pyc|sw.)" -r docs/ mopidy/ tests/')
except KeyboardInterrupt:
sys.exit()
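# Usage sketch (editor's note): with the Invoke tool from dev-requirements.txt
# installed, these tasks are meant to be run from the repository root, for
# example "invoke test --coverage", "invoke lint", or "invoke docs --watch".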
Mopidy-2.0.0/setup.py 0000664 0001750 0001750 00000003421 12635122146 014674 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import re
from setuptools import find_packages, setup
def get_version(filename):
with open(filename) as fh:
metadata = dict(re.findall("__([a-z]+)__ = '([^']+)'", fh.read()))
return metadata['version']
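# Illustration (editor's note, hypothetical line): given
#     __version__ = '2.0.0'
# in mopidy/__init__.py, the regex above collects all such dunder assignments
# into a dict, e.g. {'version': '2.0.0'}, and get_version() returns '2.0.0'.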
setup(
name='Mopidy',
version=get_version('mopidy/__init__.py'),
url='http://www.mopidy.com/',
license='Apache License, Version 2.0',
author='Stein Magnus Jodal',
author_email='stein.magnus@jodal.no',
description='Music server with MPD and Spotify support',
long_description=open('README.rst').read(),
packages=find_packages(exclude=['tests', 'tests.*']),
zip_safe=False,
include_package_data=True,
install_requires=[
'Pykka >= 1.1',
'requests >= 2.0',
'setuptools',
'tornado >= 2.3',
],
extras_require={'http': []},
entry_points={
'console_scripts': [
'mopidy = mopidy.__main__:main',
],
'mopidy.ext': [
'http = mopidy.http:Extension',
'local = mopidy.local:Extension',
'file = mopidy.file:Extension',
'm3u = mopidy.m3u:Extension',
'mpd = mopidy.mpd:Extension',
'softwaremixer = mopidy.softwaremixer:Extension',
'stream = mopidy.stream:Extension',
],
},
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: No Input/Output (Daemon)',
'Intended Audience :: End Users/Desktop',
'License :: OSI Approved :: Apache Software License',
'Operating System :: MacOS :: MacOS X',
'Operating System :: POSIX :: Linux',
'Programming Language :: Python :: 2.7',
'Topic :: Multimedia :: Sound/Audio :: Players',
],
)
Mopidy-2.0.0/dev-requirements.txt 0000664 0001750 0001750 00000000611 12575004517 017224 0 ustar jodal jodal 0000000 0000000 # Automate tasks
invoke
# Build documentation
sphinx
# Check code style, errors, etc
flake8
flake8-import-order
# Mock dependencies in tests
mock
responses
# Test runners
pytest
pytest-capturelog
pytest-cov
pytest-xdist
tox
# Check that MANIFEST.in matches Git repo contents before making a release
check-manifest
# To make wheel packages
wheel
# Securely upload packages to PyPI
twine
Mopidy-2.0.0/AUTHORS 0000664 0001750 0001750 00000006041 12660436420 014234 0 ustar jodal jodal 0000000 0000000 - Stein Magnus Jodal
- Johannes Knutsen
- Thomas Adamcik
- Kristian Klette
- Martins Grunskis
- Henrik Olsson
- Antoine Pierlot-Garcin
- John Bäckstrand
- Fred Hatfull
- Erling Børresen
- David Caruso
- Christian Johansen
- Matt Bray
- Trygve Aaberge
- Wouter van Wijk
- Jeremy B. Merrill
- Adam Rigg
- Ernst Bammer
- Nick Steel
- Zan Dobersek
- Thomas Refis
- Janez Troha
- Tobias Sauerwein
- Alli Witheford
- Alexandre Petitjean
- Terje Larsen
- Javier Domingo Cansino
- Pavol Babincak
- Javier Domingo
- Lasse Bigum
- David Eisner
- Pål Ruud
- Thomas Kemmer
- Paul Connolley
- Luke Giuliani
- Colin Montgomerie
- Simon de Bakker
- Arnaud Barisain-Monrose
- Nathan Harper
- Pierpaolo Frasa
- Thomas Scholtes
- Sam Willcocks
- Ignasi Fosch
- Arjun Naik
- Christopher Schirner
- Dmitry Sandalov
- Lukas Vogel
- Thomas Amland
- Deni Bertovic
- Ali Ukani
- Dirk Groenen
- John Cass
- Laura Barber
- Jakab Kristóf
- Ronald Zielaznicki
- Wojciech Wnętrzak
- Camilo Nova
- Dražen Lučanin
- Naglis Jonaitis
- Kyle Heyne
- Tom Roth
- Mark Greenwood
- Stein Karlsen
- Dejan Prokić
- Eric Jahn
- Mikhail Golubev
- Danilo Bargen
- Bjørnar Snoksrud
- Giorgos Logiotatidis
- Ben Evans
- vrs01
- Cadel Watson
- Loïck Bonniot
- Gustaf Hallberg
- kozec
- Jelle van der Waa
- Alex Malone
- Daniel Hahler
- Bryan Bennett
Mopidy-2.0.0/Mopidy.egg-info/ 0000775 0001750 0001750 00000000000 12660436443 016123 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/Mopidy.egg-info/PKG-INFO 0000664 0001750 0001750 00000010635 12660436442 017224 0 ustar jodal jodal 0000000 0000000 Metadata-Version: 1.1
Name: Mopidy
Version: 2.0.0
Summary: Music server with MPD and Spotify support
Home-page: http://www.mopidy.com/
Author: Stein Magnus Jodal
Author-email: stein.magnus@jodal.no
License: Apache License, Version 2.0
Description: ******
Mopidy
******
Mopidy is an extensible music server written in Python.
Mopidy plays music from local disk, Spotify, SoundCloud, Google Play Music, and
more. You edit the playlist from any phone, tablet, or computer using a range
of MPD and web clients.
**Stream music from the cloud**
Vanilla Mopidy only plays music from your local disk and radio streams.
Through extensions, Mopidy can play music from cloud services like Spotify,
SoundCloud, and Google Play Music. With Mopidy's extension support, backends
for new music sources can be easily added.
**Mopidy is just a server**
Mopidy is a Python application that runs in a terminal or in the background on
Linux computers or Macs that have network connectivity and audio output. Out of
the box, Mopidy is an MPD and HTTP server. Additional frontends for controlling
Mopidy can be installed from extensions.
**Everybody uses their favorite client**
You and the people around you can all connect your favorite MPD or web clients
to the Mopidy server to search for music and manage the playlist together. With
MPD clients available for all popular operating systems and web clients running
in any browser, you can control the music from any phone, tablet, or computer.
**Mopidy on Raspberry Pi**
The Raspberry Pi is a popular device to run Mopidy on, either using Raspbian or
Arch Linux. It is quite slow, but it is very affordable. In fact, the
Kickstarter funded Gramofon: Modern Cloud Jukebox project used Mopidy on a
Raspberry Pi to prototype the Gramofon device. Mopidy is also a major building
block in the Pi Musicbox integrated audio jukebox system for Raspberry Pi.
**Mopidy is hackable**
Mopidy's extension support and its Python, JSON-RPC, and JavaScript APIs make
Mopidy perfect for building your own hacks. In one project, a Raspberry Pi was
embedded in an old cassette player. The buttons and volume control are wired up
with GPIO on the Raspberry Pi and are used to control playback through a custom
Mopidy extension. The cassettes have NFC tags used to select playlists from
Spotify.
To get started with Mopidy, check out
`the installation docs `_.
- `Documentation `_
- `Discussion forum `_
- `Source code `_
- `Issue tracker `_
- IRC: ``#mopidy`` at `irc.freenode.net `_
- Announcement list: `mopidy@googlegroups.com `_
- Twitter: `@mopidy `_
.. image:: https://img.shields.io/pypi/v/Mopidy.svg?style=flat
:target: https://pypi.python.org/pypi/Mopidy/
:alt: Latest PyPI version
.. image:: https://img.shields.io/pypi/dm/Mopidy.svg?style=flat
:target: https://pypi.python.org/pypi/Mopidy/
:alt: Number of PyPI downloads
.. image:: https://img.shields.io/travis/mopidy/mopidy/develop.svg?style=flat
:target: https://travis-ci.org/mopidy/mopidy
:alt: Travis CI build status
.. image:: https://img.shields.io/coveralls/mopidy/mopidy/develop.svg?style=flat
:target: https://coveralls.io/r/mopidy/mopidy?branch=develop
:alt: Test coverage
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: No Input/Output (Daemon)
Classifier: Intended Audience :: End Users/Desktop
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: MacOS :: MacOS X
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: Python :: 2.7
Classifier: Topic :: Multimedia :: Sound/Audio :: Players
Mopidy-2.0.0/Mopidy.egg-info/entry_points.txt 0000664 0001750 0001750 00000000436 12660436442 021423 0 ustar jodal jodal 0000000 0000000 [console_scripts]
mopidy = mopidy.__main__:main
[mopidy.ext]
file = mopidy.file:Extension
http = mopidy.http:Extension
local = mopidy.local:Extension
m3u = mopidy.m3u:Extension
mpd = mopidy.mpd:Extension
softwaremixer = mopidy.softwaremixer:Extension
stream = mopidy.stream:Extension
Mopidy-2.0.0/Mopidy.egg-info/not-zip-safe 0000664 0001750 0001750 00000000001 12660436442 020350 0 ustar jodal jodal 0000000 0000000
Mopidy-2.0.0/Mopidy.egg-info/dependency_links.txt 0000664 0001750 0001750 00000000001 12660436442 022170 0 ustar jodal jodal 0000000 0000000
Mopidy-2.0.0/Mopidy.egg-info/top_level.txt 0000664 0001750 0001750 00000000007 12660436442 020651 0 ustar jodal jodal 0000000 0000000 mopidy
Mopidy-2.0.0/Mopidy.egg-info/requires.txt 0000664 0001750 0001750 00000000077 12660436442 020526 0 ustar jodal jodal 0000000 0000000 Pykka >= 1.1
requests >= 2.0
setuptools
tornado >= 2.3
[http]
Mopidy-2.0.0/Mopidy.egg-info/SOURCES.txt 0000664 0001750 0001750 00000022336 12660436443 020015 0 ustar jodal jodal 0000000 0000000 .mailmap
.travis.yml
AUTHORS
LICENSE
MANIFEST.in
README.rst
dev-requirements.txt
setup.cfg
setup.py
tasks.py
tox.ini
Mopidy.egg-info/PKG-INFO
Mopidy.egg-info/SOURCES.txt
Mopidy.egg-info/dependency_links.txt
Mopidy.egg-info/entry_points.txt
Mopidy.egg-info/not-zip-safe
Mopidy.egg-info/requires.txt
Mopidy.egg-info/top_level.txt
docs/Makefile
docs/audio.rst
docs/authors.rst
docs/changelog.rst
docs/codestyle.rst
docs/command.rst
docs/conf.py
docs/config.rst
docs/contributing.rst
docs/devenv.rst
docs/extensiondev.rst
docs/glossary.rst
docs/index.rst
docs/releasing.rst
docs/requirements.txt
docs/running.rst
docs/service.rst
docs/sponsors.rst
docs/troubleshooting.rst
docs/versioning.rst
docs/_static/mopidy.png
docs/api/architecture.rst
docs/api/audio.rst
docs/api/backend.rst
docs/api/commands.rst
docs/api/config.rst
docs/api/core.rst
docs/api/ext.rst
docs/api/frontend.rst
docs/api/http-server.rst
docs/api/http.rst
docs/api/httpclient.rst
docs/api/index.rst
docs/api/js.rst
docs/api/mixer.rst
docs/api/models.rst
docs/api/zeroconf.rst
docs/clients/http.rst
docs/clients/mpd-client-gmpc.png
docs/clients/mpd-client-mpad.jpg
docs/clients/mpd-client-mpdroid.jpg
docs/clients/mpd-client-mpod.jpg
docs/clients/mpd-client-ncmpcpp.png
docs/clients/mpd-client-sonata.png
docs/clients/mpd.rst
docs/clients/mpris.rst
docs/clients/rompr.png
docs/clients/ubuntu-sound-menu.png
docs/clients/upnp.rst
docs/ext/api_explorer.png
docs/ext/backends.rst
docs/ext/file.rst
docs/ext/frontends.rst
docs/ext/http.rst
docs/ext/local.rst
docs/ext/local_images.jpg
docs/ext/m3u.rst
docs/ext/material_webclient.png
docs/ext/mixers.rst
docs/ext/mobile.png
docs/ext/moped.png
docs/ext/mopidy_party.png
docs/ext/mopify.jpg
docs/ext/mopster.png
docs/ext/mpd.rst
docs/ext/musicbox_webclient.png
docs/ext/simple_webclient.png
docs/ext/softwaremixer.rst
docs/ext/spotmop.jpg
docs/ext/stream.rst
docs/ext/web.rst
docs/installation/arch.rst
docs/installation/debian.rst
docs/installation/index.rst
docs/installation/osx.rst
docs/installation/raspberrypi.rst
docs/installation/raspberrypi2.jpg
docs/installation/source.rst
docs/modules/index.rst
docs/modules/local.rst
docs/modules/mpd.rst
extra/desktop/mopidy.desktop
extra/mopidyctl/mopidyctl
extra/mopidyctl/mopidyctl.8
extra/systemd/mopidy.service
mopidy/__init__.py
mopidy/__main__.py
mopidy/backend.py
mopidy/commands.py
mopidy/compat.py
mopidy/exceptions.py
mopidy/ext.py
mopidy/httpclient.py
mopidy/listener.py
mopidy/mixer.py
mopidy/zeroconf.py
mopidy/audio/__init__.py
mopidy/audio/actor.py
mopidy/audio/constants.py
mopidy/audio/listener.py
mopidy/audio/scan.py
mopidy/audio/tags.py
mopidy/audio/utils.py
mopidy/config/__init__.py
mopidy/config/default.conf
mopidy/config/keyring.py
mopidy/config/schemas.py
mopidy/config/types.py
mopidy/config/validators.py
mopidy/core/__init__.py
mopidy/core/actor.py
mopidy/core/history.py
mopidy/core/library.py
mopidy/core/listener.py
mopidy/core/mixer.py
mopidy/core/playback.py
mopidy/core/playlists.py
mopidy/core/tracklist.py
mopidy/file/__init__.py
mopidy/file/backend.py
mopidy/file/ext.conf
mopidy/file/library.py
mopidy/http/__init__.py
mopidy/http/actor.py
mopidy/http/ext.conf
mopidy/http/handlers.py
mopidy/http/data/clients.html
mopidy/http/data/favicon.ico
mopidy/http/data/mopidy.css
mopidy/http/data/mopidy.js
mopidy/http/data/mopidy.min.js
mopidy/internal/__init__.py
mopidy/internal/deprecation.py
mopidy/internal/deps.py
mopidy/internal/encoding.py
mopidy/internal/formatting.py
mopidy/internal/gi.py
mopidy/internal/http.py
mopidy/internal/jsonrpc.py
mopidy/internal/log.py
mopidy/internal/network.py
mopidy/internal/path.py
mopidy/internal/playlists.py
mopidy/internal/process.py
mopidy/internal/timer.py
mopidy/internal/validation.py
mopidy/internal/versioning.py
mopidy/internal/xdg.py
mopidy/local/__init__.py
mopidy/local/actor.py
mopidy/local/commands.py
mopidy/local/ext.conf
mopidy/local/json.py
mopidy/local/library.py
mopidy/local/playback.py
mopidy/local/search.py
mopidy/local/storage.py
mopidy/local/translator.py
mopidy/m3u/__init__.py
mopidy/m3u/backend.py
mopidy/m3u/ext.conf
mopidy/m3u/playlists.py
mopidy/m3u/translator.py
mopidy/models/__init__.py
mopidy/models/fields.py
mopidy/models/immutable.py
mopidy/models/serialize.py
mopidy/mpd/__init__.py
mopidy/mpd/actor.py
mopidy/mpd/dispatcher.py
mopidy/mpd/exceptions.py
mopidy/mpd/ext.conf
mopidy/mpd/session.py
mopidy/mpd/tokenize.py
mopidy/mpd/translator.py
mopidy/mpd/uri_mapper.py
mopidy/mpd/protocol/__init__.py
mopidy/mpd/protocol/audio_output.py
mopidy/mpd/protocol/channels.py
mopidy/mpd/protocol/command_list.py
mopidy/mpd/protocol/connection.py
mopidy/mpd/protocol/current_playlist.py
mopidy/mpd/protocol/mount.py
mopidy/mpd/protocol/music_db.py
mopidy/mpd/protocol/playback.py
mopidy/mpd/protocol/reflection.py
mopidy/mpd/protocol/status.py
mopidy/mpd/protocol/stickers.py
mopidy/mpd/protocol/stored_playlists.py
mopidy/mpd/protocol/tagtype_list.py
mopidy/softwaremixer/__init__.py
mopidy/softwaremixer/ext.conf
mopidy/softwaremixer/mixer.py
mopidy/stream/__init__.py
mopidy/stream/actor.py
mopidy/stream/ext.conf
tests/__init__.py
tests/dummy_audio.py
tests/dummy_backend.py
tests/dummy_mixer.py
tests/test_commands.py
tests/test_exceptions.py
tests/test_ext.py
tests/test_help.py
tests/test_httpclient.py
tests/test_mixer.py
tests/test_version.py
tests/audio/__init__.py
tests/audio/test_actor.py
tests/audio/test_listener.py
tests/audio/test_scan.py
tests/audio/test_tags.py
tests/audio/test_utils.py
tests/backend/__init__.py
tests/backend/test_backend.py
tests/backend/test_listener.py
tests/config/__init__.py
tests/config/test_config.py
tests/config/test_defaults.py
tests/config/test_schemas.py
tests/config/test_types.py
tests/config/test_validator.py
tests/core/__init__.py
tests/core/test_actor.py
tests/core/test_events.py
tests/core/test_history.py
tests/core/test_library.py
tests/core/test_listener.py
tests/core/test_mixer.py
tests/core/test_playback.py
tests/core/test_playlists.py
tests/core/test_tracklist.py
tests/data/blank.flac
tests/data/blank.mp3
tests/data/blank.ogg
tests/data/blank.wav
tests/data/comment-ext.m3u
tests/data/comment.m3u
tests/data/empty-ext.m3u
tests/data/empty.m3u
tests/data/encoding-ext.m3u
tests/data/encoding.m3u
tests/data/file1.conf
tests/data/file2.conf
tests/data/file3.conf
tests/data/file4.conf
tests/data/one-ext.m3u
tests/data/one.m3u
tests/data/song1.flac
tests/data/song1.mp3
tests/data/song1.ogg
tests/data/song1.wav
tests/data/song2.flac
tests/data/song2.mp3
tests/data/song2.ogg
tests/data/song2.wav
tests/data/song3.flac
tests/data/song3.mp3
tests/data/song3.ogg
tests/data/song3.wav
tests/data/song4.wav
tests/data/two-ext.m3u
tests/data/conf1.d/file1.conf
tests/data/conf1.d/file2.conf
tests/data/conf2.d/file1.conf
tests/data/conf2.d/file2.conf.disabled
tests/data/local/library.json.gz
tests/data/scanner/empty.wav
tests/data/scanner/example.log
tests/data/scanner/plain.txt
tests/data/scanner/playlist.m3u
tests/data/scanner/sample.mp3
tests/data/scanner/advanced/song1.mp3
tests/data/scanner/advanced/song2.mp3
tests/data/scanner/advanced/song3.mp3
tests/data/scanner/advanced/subdir1/song4.mp3
tests/data/scanner/advanced/subdir1/song5.mp3
tests/data/scanner/advanced/subdir1/subsubdir/song8.mp3
tests/data/scanner/advanced/subdir1/subsubdir/song9.mp3
tests/data/scanner/advanced/subdir2/song6.mp3
tests/data/scanner/advanced/subdir2/song7.mp3
tests/data/scanner/empty/.gitignore
tests/data/scanner/image/test.png
tests/data/scanner/simple/song1.mp3
tests/data/scanner/simple/song1.ogg
tests/file/__init__.py
tests/file/conftest.py
tests/file/test_browse.py
tests/file/test_lookup.py
tests/http/__init__.py
tests/http/test_events.py
tests/http/test_handlers.py
tests/http/test_server.py
tests/internal/__init__.py
tests/internal/test_deps.py
tests/internal/test_encoding.py
tests/internal/test_http.py
tests/internal/test_jsonrpc.py
tests/internal/test_path.py
tests/internal/test_playlists.py
tests/internal/test_validation.py
tests/internal/test_xdg.py
tests/internal/network/__init__.py
tests/internal/network/test_connection.py
tests/internal/network/test_lineprotocol.py
tests/internal/network/test_server.py
tests/internal/network/test_utils.py
tests/local/__init__.py
tests/local/test_json.py
tests/local/test_library.py
tests/local/test_playback.py
tests/local/test_search.py
tests/local/test_tracklist.py
tests/local/test_translator.py
tests/m3u/__init__.py
tests/m3u/test_playlists.py
tests/m3u/test_translator.py
tests/models/test_fields.py
tests/models/test_legacy.py
tests/models/test_models.py
tests/mpd/__init__.py
tests/mpd/test_actor.py
tests/mpd/test_commands.py
tests/mpd/test_dispatcher.py
tests/mpd/test_exceptions.py
tests/mpd/test_status.py
tests/mpd/test_tokenizer.py
tests/mpd/test_translator.py
tests/mpd/protocol/__init__.py
tests/mpd/protocol/test_audio_output.py
tests/mpd/protocol/test_authentication.py
tests/mpd/protocol/test_channels.py
tests/mpd/protocol/test_command_list.py
tests/mpd/protocol/test_connection.py
tests/mpd/protocol/test_current_playlist.py
tests/mpd/protocol/test_idle.py
tests/mpd/protocol/test_mount.py
tests/mpd/protocol/test_music_db.py
tests/mpd/protocol/test_playback.py
tests/mpd/protocol/test_reflection.py
tests/mpd/protocol/test_regression.py
tests/mpd/protocol/test_status.py
tests/mpd/protocol/test_stickers.py
tests/mpd/protocol/test_stored_playlists.py
tests/stream/__init__.py
tests/stream/test_library.py
tests/stream/test_playback.py Mopidy-2.0.0/extra/ 0000775 0001750 0001750 00000000000 12660436443 014313 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/extra/desktop/ 0000775 0001750 0001750 00000000000 12660436443 015764 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/extra/desktop/mopidy.desktop 0000664 0001750 0001750 00000000365 12505224626 020660 0 ustar jodal jodal 0000000 0000000 [Desktop Entry]
Type=Application
Version=1.0
Name=Mopidy
Comment=Music server with support for MPD and HTTP clients
Icon=audio-x-generic
TryExec=mopidy
Exec=mopidy
Terminal=true
Categories=AudioVideo;Audio;Player;ConsoleOnly;
StartupNotify=true
Mopidy-2.0.0/extra/mopidyctl/ 0000775 0001750 0001750 00000000000 12660436443 016317 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/extra/mopidyctl/mopidyctl 0000775 0001750 0001750 00000001054 12505224626 020245 0 ustar jodal jodal 0000000 0000000 #!/bin/sh
SELF=$(basename $0)
DAEMON="/usr/bin/mopidy"
DAEMON_USER="mopidy"
CONFIG_FILES="/usr/share/mopidy/conf.d:/etc/mopidy/mopidy.conf"
CMD="$DAEMON --config $CONFIG_FILES $@"
if [ $# -eq 0 ]; then
echo "Usage: $SELF [options]" 1>&2
echo "Examples:" 1>&2
echo " $SELF --help" 1>&2
echo " $SELF config" 1>&2
echo " $SELF local scan" 1>&2
exit 1
fi
if [ $(id -u) -ne 0 ]; then
echo "$SELF must be run as root" 1>&2
exit 2
fi
echo "Running \"$CMD\" as user $DAEMON_USER" 1>&2
su -s /bin/sh -c "$CMD" -- $DAEMON_USER
Mopidy-2.0.0/extra/mopidyctl/mopidyctl.8 0000664 0001750 0001750 00000001075 12505224626 020413 0 ustar jodal jodal 0000000 0000000 .\" Manpage for mopidyctl
.TH "MOPIDYCTL" "8" "October 11, 2014" "1.0" "mopidyctl"
.SH NAME
mopidyctl \- manage the Mopidy music server system service
.SH SYNOPSIS
.B mopidyctl
[any mopidy(1) option]
.SH DESCRIPTION
The \fBmopidyctl\fP command runs \fBmopidy\fP subcommands in the
same environment as the Mopidy system service is running in. That is, as the
same user and with the same config as the Mopidy system service is using.
.SH OPTIONS
mopidyctl(8) takes the same options as mopidy(1).
.SH SEE ALSO
mopidy(1)
.SH COPYRIGHT
2014, Stein Magnus Jodal and contributors
Mopidy-2.0.0/extra/systemd/ 0000775 0001750 0001750 00000000000 12660436443 016003 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/extra/systemd/mopidy.service 0000664 0001750 0001750 00000000526 12505224626 020665 0 ustar jodal jodal 0000000 0000000 [Unit]
Description=Mopidy music server
After=avahi-daemon.service
After=dbus.service
After=network.target
After=nss-lookup.target
After=pulseaudio.service
After=remote-fs.target
After=sound.target
[Service]
User=mopidy
ExecStart=/usr/bin/mopidy --config /usr/share/mopidy/conf.d:/etc/mopidy/mopidy.conf
[Install]
WantedBy=multi-user.target
Mopidy-2.0.0/.travis.yml 0000664 0001750 0001750 00000001505 12660436420 015275 0 ustar jodal jodal 0000000 0000000 sudo: required
dist: trusty
language: python
python:
- "2.7_with_system_site_packages"
env:
- TOX_ENV=py27
- TOX_ENV=py27-tornado23
- TOX_ENV=py27-tornado31
- TOX_ENV=docs
- TOX_ENV=flake8
before_install:
- "sudo sed -i '/127.0.1.1/d' /etc/hosts" # Workaround tornadoweb/tornado#1573
- "sudo apt-get update -qq"
- "sudo apt-get install -y gir1.2-gst-plugins-base-1.0 gir1.2-gstreamer-1.0 graphviz-dev gstreamer1.0-plugins-good gstreamer1.0-plugins-bad python-gst-1.0"
install:
- "pip install tox"
script:
- "tox -e $TOX_ENV"
after_success:
- "if [ $TOX_ENV == 'py27' ]; then pip install coveralls; coveralls; fi"
branches:
except:
- debian
notifications:
irc:
channels:
- "irc.freenode.org#mopidy"
on_success: change
on_failure: change
use_notice: true
skip_join: true
Mopidy-2.0.0/mopidy/ 0000775 0001750 0001750 00000000000 12660436443 014471 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/backend.py 0000664 0001750 0001750 00000030532 12660436420 016430 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
from mopidy import listener, models
logger = logging.getLogger(__name__)
class Backend(object):
"""Backend API
If the backend has problems during initialization it should raise
:exc:`mopidy.exceptions.BackendError` with a descriptive error message.
This will make Mopidy print the error message and exit so that the user can
fix the issue.
:param config: the entire Mopidy configuration
:type config: dict
:param audio: actor proxy for the audio subsystem
:type audio: :class:`pykka.ActorProxy` for :class:`mopidy.audio.Audio`
"""
#: Actor proxy to an instance of :class:`mopidy.audio.Audio`.
#:
#: Should be passed to the backend constructor as the kwarg ``audio``,
#: which will then set this field.
audio = None
#: The library provider. An instance of
#: :class:`~mopidy.backend.LibraryProvider`, or :class:`None` if
#: the backend doesn't provide a library.
library = None
#: The playback provider. An instance of
#: :class:`~mopidy.backend.PlaybackProvider`, or :class:`None` if
#: the backend doesn't provide playback.
playback = None
#: The playlists provider. An instance of
#: :class:`~mopidy.backend.PlaylistsProvider`, or :class:`None` if
#: the backend doesn't provide playlists.
playlists = None
#: List of URI schemes this backend can handle.
uri_schemes = []
# Because the providers are marked as pykka_traversable, we can't get() them
# from another actor, and we need helper methods to check whether the providers
# are set or None.
def has_library(self):
return self.library is not None
def has_library_browse(self):
return self.has_library() and self.library.root_directory is not None
def has_playback(self):
return self.playback is not None
def has_playlists(self):
return self.playlists is not None
def ping(self):
"""Called to check if the actor is still alive."""
return True
class LibraryProvider(object):
"""
:param backend: backend the controller is a part of
:type backend: :class:`mopidy.backend.Backend`
"""
pykka_traversable = True
root_directory = None
"""
:class:`mopidy.models.Ref.directory` instance with a URI and name set
representing the root of this library's browse tree. URIs must
use one of the schemes supported by the backend, and name should
be set to a human friendly value.
*MUST be set by any class that implements* :meth:`LibraryProvider.browse`.
"""
def __init__(self, backend):
self.backend = backend
def browse(self, uri):
"""
See :meth:`mopidy.core.LibraryController.browse`.
If you implement this method, make sure to also set
:attr:`root_directory`.
*MAY be implemented by subclass.*
"""
return []
def get_distinct(self, field, query=None):
"""
See :meth:`mopidy.core.LibraryController.get_distinct`.
*MAY be implemented by subclass.*
Default implementation will simply return an empty set.
Note that backends should always return an empty set for unexpected
field types.
"""
return set()
def get_images(self, uris):
"""
See :meth:`mopidy.core.LibraryController.get_images`.
*MAY be implemented by subclass.*
Default implementation will simply call lookup and try to use the
album art for any tracks returned. Most extensions should replace this
with something smarter or simply return an empty dictionary.
"""
result = {}
for uri in uris:
image_uris = set()
for track in self.lookup(uri):
if track.album and track.album.images:
image_uris.update(track.album.images)
result[uri] = [models.Image(uri=u) for u in image_uris]
return result
def lookup(self, uri):
"""
See :meth:`mopidy.core.LibraryController.lookup`.
*MUST be implemented by subclass.*
"""
raise NotImplementedError
def refresh(self, uri=None):
"""
See :meth:`mopidy.core.LibraryController.refresh`.
*MAY be implemented by subclass.*
"""
pass
def search(self, query=None, uris=None, exact=False):
"""
See :meth:`mopidy.core.LibraryController.search`.
*MAY be implemented by subclass.*
.. versionadded:: 1.0
The ``exact`` param which replaces the old ``find_exact``.
"""
pass
class PlaybackProvider(object):
"""
:param audio: the audio actor
:type audio: actor proxy to an instance of :class:`mopidy.audio.Audio`
:param backend: the backend
:type backend: :class:`mopidy.backend.Backend`
"""
pykka_traversable = True
def __init__(self, audio, backend):
self.audio = audio
self.backend = backend
def pause(self):
"""
Pause playback.
*MAY be reimplemented by subclass.*
:rtype: :class:`True` if successful, else :class:`False`
"""
return self.audio.pause_playback().get()
def play(self):
"""
Start playback.
*MAY be reimplemented by subclass.*
:rtype: :class:`True` if successful, else :class:`False`
"""
return self.audio.start_playback().get()
def prepare_change(self):
"""
Indicate that a URI change is about to happen.
*MAY be reimplemented by subclass.*
It is extremely unlikely it makes sense for any backends to override
this. For most practical purposes it should be considered an internal
call between backends and core that backend authors should not touch.
"""
self.audio.prepare_change().get()
def translate_uri(self, uri):
"""
Convert custom URI scheme to real playable URI.
*MAY be reimplemented by subclass.*
This is very likely the *only* thing you need to override as a backend
author. Typically this is where you convert any Mopidy-specific URI
to a real URI and then return it. If you can't convert the URI, just
return :class:`None`.
:param uri: the URI to translate
:type uri: string
:rtype: string or :class:`None` if the URI could not be translated
"""
return uri
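# Illustrative sketch (editor's addition; the 'example' scheme and the
# _client attribute are hypothetical): a backend override typically looks
# roughly like this:
#
#     def translate_uri(self, uri):
#         track_id = uri.split(':', 1)[1]
#         return self._client.get_stream_url(track_id)  # or None on failure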
def change_track(self, track):
"""
Switch to the provided track.
*MAY be reimplemented by subclass.*
It is unlikely it makes sense for any backends to override
this. For most practical purposes it should be considered an internal
call between backends and core that backend authors should not touch.
The default implementation will call :meth:`translate_uri` which
is what you want to implement.
:param track: the track to play
:type track: :class:`mopidy.models.Track`
:rtype: :class:`True` if successful, else :class:`False`
"""
uri = self.translate_uri(track.uri)
if uri != track.uri:
logger.debug(
'Backend translated URI from %s to %s', track.uri, uri)
if not uri:
return False
self.audio.set_uri(uri).get()
return True
def resume(self):
"""
Resume playback at the same time position playback was paused.
*MAY be reimplemented by subclass.*
:rtype: :class:`True` if successful, else :class:`False`
"""
return self.audio.start_playback().get()
def seek(self, time_position):
"""
Seek to a given time position.
*MAY be reimplemented by subclass.*
:param time_position: time position in milliseconds
:type time_position: int
:rtype: :class:`True` if successful, else :class:`False`
"""
return self.audio.set_position(time_position).get()
def stop(self):
"""
Stop playback.
*MAY be reimplemented by subclass.*
Should not be used for tracking if tracks have been played or when we
are done playing them.
:rtype: :class:`True` if successful, else :class:`False`
"""
return self.audio.stop_playback().get()
def get_time_position(self):
"""
Get the current time position in milliseconds.
*MAY be reimplemented by subclass.*
:rtype: int
"""
return self.audio.get_position().get()
class PlaylistsProvider(object):
"""
A playlist provider exposes a collection of playlists, methods to
create/change/delete playlists in this collection, and lookup of any
playlist the backend knows about.
:param backend: backend the controller is a part of
:type backend: :class:`mopidy.backend.Backend` instance
"""
pykka_traversable = True
def __init__(self, backend):
self.backend = backend
def as_list(self):
"""
Get a list of the currently available playlists.
Returns a list of :class:`~mopidy.models.Ref` objects referring to the
playlists. In other words, no information about the playlists' content
is given.
:rtype: list of :class:`mopidy.models.Ref`
.. versionadded:: 1.0
"""
raise NotImplementedError
def get_items(self, uri):
"""
Get the items in a playlist specified by ``uri``.
Returns a list of :class:`~mopidy.models.Ref` objects referring to the
playlist's items.
If a playlist with the given ``uri`` doesn't exist, it returns
:class:`None`.
:rtype: list of :class:`mopidy.models.Ref`, or :class:`None`
.. versionadded:: 1.0
"""
raise NotImplementedError
def create(self, name):
"""
Create a new empty playlist with the given name.
Returns a new playlist with the given name and a URI, or :class:`None`
on failure.
*MUST be implemented by subclass.*
:param name: name of the new playlist
:type name: string
:rtype: :class:`mopidy.models.Playlist` or :class:`None`
"""
raise NotImplementedError
def delete(self, uri):
"""
Delete playlist identified by the URI.
*MUST be implemented by subclass.*
:param uri: URI of the playlist to delete
:type uri: string
"""
raise NotImplementedError
def lookup(self, uri):
"""
Look up the playlist with the given URI in both the set of playlists and in
any other playlist source.
Returns the playlist or :class:`None` if not found.
*MUST be implemented by subclass.*
:param uri: playlist URI
:type uri: string
:rtype: :class:`mopidy.models.Playlist` or :class:`None`
"""
raise NotImplementedError
def refresh(self):
"""
Refresh the playlists in :attr:`playlists`.
*MUST be implemented by subclass.*
"""
raise NotImplementedError
def save(self, playlist):
"""
Save the given playlist.
The playlist must have an ``uri`` attribute set. To create a new
playlist with a URI, use :meth:`create`.
Returns the saved playlist or :class:`None` on failure.
*MUST be implemented by subclass.*
:param playlist: the playlist to save
:type playlist: :class:`mopidy.models.Playlist`
:rtype: :class:`mopidy.models.Playlist` or :class:`None`
"""
raise NotImplementedError
class BackendListener(listener.Listener):
"""
Marker interface for recipients of events sent by the backend actors.
Any Pykka actor that mixes in this class will receive calls to the methods
defined here when the corresponding events happen in a backend actor. This
interface is used both for looking up what actors to notify of the events,
and for providing default implementations for those listeners that are not
interested in all events.
Normally, only the Core actor should mix in this class.
"""
@staticmethod
def send(event, **kwargs):
"""Helper to allow calling of backend listener events"""
listener.send(BackendListener, event, **kwargs)
def playlists_loaded(self):
"""
Called when playlists are loaded or refreshed.
*MAY* be implemented by actor.
"""
pass
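# --- Illustrative sketch (editor's addition, not part of Mopidy) -------------
# A hypothetical, minimal backend showing how the classes above fit together.
# Real backends also mix in pykka.ThreadingActor and are instantiated by their
# extension; the 'example' URI scheme and both class names are made up here.


class _SketchLibraryProvider(LibraryProvider):

    root_directory = models.Ref.directory(uri='example:root', name='Example')

    def browse(self, uri):
        # A real backend would return models.Ref objects below the given URI.
        return []

    def lookup(self, uri):
        # A real backend would return the matching models.Track objects.
        return []


class _SketchBackend(Backend):

    uri_schemes = ['example']

    def __init__(self, config, audio):
        super(_SketchBackend, self).__init__()
        self.audio = audio
        self.library = _SketchLibraryProvider(backend=self)
        self.playback = PlaybackProvider(audio=audio, backend=self)
        self.playlists = None  # This sketch does not provide playlists.
# ------------------------------------------------------------------------------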
Mopidy-2.0.0/mopidy/audio/ 0000775 0001750 0001750 00000000000 12660436443 015572 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/audio/utils.py 0000664 0001750 0001750 00000006335 12660436420 017306 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy import httpclient
from mopidy.internal.gi import Gst
def calculate_duration(num_samples, sample_rate):
"""Determine duration of samples using GStreamer helper for precise
math."""
return Gst.util_uint64_scale(num_samples, Gst.SECOND, sample_rate)
def create_buffer(data, timestamp=None, duration=None):
"""Create a new GStreamer buffer based on provided data.
Mainly intended to keep gst imports out of non-audio modules.
.. versionchanged:: 2.0
``capabilites`` argument was removed.
"""
if not data:
raise ValueError('Cannot create buffer without data')
buffer_ = Gst.Buffer.new_wrapped(data)
if timestamp is not None:
buffer_.pts = timestamp
if duration is not None:
buffer_.duration = duration
return buffer_
def millisecond_to_clocktime(value):
"""Convert a millisecond time to internal GStreamer time."""
return value * Gst.MSECOND
def clocktime_to_millisecond(value):
"""Convert an internal GStreamer time to millisecond time."""
return value // Gst.MSECOND
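# Illustration (editor's note): with a 44100 Hz sample rate,
# calculate_duration(44100, 44100) equals Gst.SECOND; likewise
# millisecond_to_clocktime(1000) == Gst.SECOND, and
# clocktime_to_millisecond() reverses that conversion.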
def supported_uri_schemes(uri_schemes):
"""Determine which URIs we can actually support from provided whitelist.
:param uri_schemes: list/set of URIs to check support for.
:type uri_schemes: list or set of URI scheme strings.
:rtype: set of URI schemes we can support via this GStreamer install.
"""
supported_schemes = set()
registry = Gst.Registry.get()
for factory in registry.get_feature_list(Gst.ElementFactory):
for uri in factory.get_uri_protocols():
if uri in uri_schemes:
supported_schemes.add(uri)
return supported_schemes
def setup_proxy(element, config):
"""Configure a GStreamer element with proxy settings.
:param element: element to setup proxy in.
:type element: :class:`Gst.GstElement`
:param config: proxy settings to use.
:type config: :class:`dict`
"""
if not hasattr(element.props, 'proxy') or not config.get('hostname'):
return
element.set_property('proxy', httpclient.format_proxy(config, auth=False))
element.set_property('proxy-id', config.get('username'))
element.set_property('proxy-pw', config.get('password'))
class Signals(object):
"""Helper for tracking gobject signal registrations"""
def __init__(self):
self._ids = {}
def connect(self, element, event, func, *args):
"""Connect a function + args to signal event on an element.
Each event may only be handled by one callback in this implementation.
"""
assert (element, event) not in self._ids
self._ids[(element, event)] = element.connect(event, func, *args)
def disconnect(self, element, event):
"""Disconnect whatever handler we have for an element+event pair.
Does nothing if the handler has already been removed.
"""
signal_id = self._ids.pop((element, event), None)
if signal_id is not None:
element.disconnect(signal_id)
def clear(self):
"""Clear all registered signal handlers."""
for element, event in self._ids.keys():
element.disconnect(self._ids.pop((element, event)))
Mopidy-2.0.0/mopidy/audio/__init__.py 0000664 0001750 0001750 00000000434 12626555165 017711 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
# flake8: noqa
from .actor import Audio
from .listener import AudioListener
from .constants import PlaybackState
from .utils import (
calculate_duration, create_buffer, millisecond_to_clocktime,
supported_uri_schemes)
Mopidy-2.0.0/mopidy/audio/tags.py 0000664 0001750 0001750 00000012164 12660436420 017101 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import collections
import datetime
import logging
import numbers
from mopidy import compat
from mopidy.internal import log
from mopidy.internal.gi import GLib, Gst
from mopidy.models import Album, Artist, Track
logger = logging.getLogger(__name__)
def convert_taglist(taglist):
"""Convert a :class:`Gst.TagList` to plain Python types.
Knows how to convert:
- Dates
- Buffers
- Numbers
- Strings
- Booleans
Unknown types will be ignored and trace logged. Tag keys are all strings
defined as part of GStreamer under GstTagList_.
.. _GstTagList: https://developer.gnome.org/gstreamer/stable/\
gstreamer-GstTagList.html
:param taglist: A GStreamer taglist to be converted.
:type taglist: :class:`Gst.TagList`
:rtype: dictionary of tag keys with a list of values.
"""
result = collections.defaultdict(list)
for n in range(taglist.n_tags()):
tag = taglist.nth_tag_name(n)
for i in range(taglist.get_tag_size(tag)):
value = taglist.get_value_index(tag, i)
if isinstance(value, GLib.Date):
date = datetime.date(
value.get_year(), value.get_month(), value.get_day())
result[tag].append(date.isoformat().decode('utf-8'))
if isinstance(value, Gst.DateTime):
result[tag].append(value.to_iso8601_string().decode('utf-8'))
elif isinstance(value, bytes):
result[tag].append(value.decode('utf-8', 'replace'))
elif isinstance(value, (compat.text_type, bool, numbers.Number)):
result[tag].append(value)
else:
logger.log(
log.TRACE_LOG_LEVEL,
'Ignoring unknown tag data: %r = %r', tag, value)
# TODO: dict(result) to not leak the defaultdict, or just use setdefault?
return result
# TODO: split based on "stream" and "track" based conversion? i.e. handle data
# from radios in its own helper instead?
def convert_tags_to_track(tags):
"""Convert our normalized tags to a track.
:param tags: dictionary of tag keys with a list of values
:type tags: :class:`dict`
:rtype: :class:`mopidy.models.Track`
"""
album_kwargs = {}
track_kwargs = {}
track_kwargs['composers'] = _artists(tags, Gst.TAG_COMPOSER)
track_kwargs['performers'] = _artists(tags, Gst.TAG_PERFORMER)
track_kwargs['artists'] = _artists(tags, Gst.TAG_ARTIST,
'musicbrainz-artistid',
'musicbrainz-sortname')
album_kwargs['artists'] = _artists(
tags, Gst.TAG_ALBUM_ARTIST, 'musicbrainz-albumartistid')
track_kwargs['genre'] = '; '.join(tags.get(Gst.TAG_GENRE, []))
track_kwargs['name'] = '; '.join(tags.get(Gst.TAG_TITLE, []))
if not track_kwargs['name']:
track_kwargs['name'] = '; '.join(tags.get(Gst.TAG_ORGANIZATION, []))
track_kwargs['comment'] = '; '.join(tags.get('comment', []))
if not track_kwargs['comment']:
track_kwargs['comment'] = '; '.join(tags.get(Gst.TAG_LOCATION, []))
if not track_kwargs['comment']:
track_kwargs['comment'] = '; '.join(tags.get(Gst.TAG_COPYRIGHT, []))
track_kwargs['track_no'] = tags.get(Gst.TAG_TRACK_NUMBER, [None])[0]
track_kwargs['disc_no'] = tags.get(Gst.TAG_ALBUM_VOLUME_NUMBER, [None])[0]
track_kwargs['bitrate'] = tags.get(Gst.TAG_BITRATE, [None])[0]
track_kwargs['musicbrainz_id'] = tags.get('musicbrainz-trackid', [None])[0]
album_kwargs['name'] = tags.get(Gst.TAG_ALBUM, [None])[0]
album_kwargs['num_tracks'] = tags.get(Gst.TAG_TRACK_COUNT, [None])[0]
album_kwargs['num_discs'] = tags.get(Gst.TAG_ALBUM_VOLUME_COUNT, [None])[0]
album_kwargs['musicbrainz_id'] = tags.get('musicbrainz-albumid', [None])[0]
album_kwargs['date'] = tags.get(Gst.TAG_DATE, [None])[0]
if not album_kwargs['date']:
datetime = tags.get(Gst.TAG_DATE_TIME, [None])[0]
if datetime is not None:
album_kwargs['date'] = datetime.split('T')[0]
# Clear out any empty values we found
track_kwargs = {k: v for k, v in track_kwargs.items() if v}
album_kwargs = {k: v for k, v in album_kwargs.items() if v}
# Only bother with album if we have a name to show.
if album_kwargs.get('name'):
track_kwargs['album'] = Album(**album_kwargs)
return Track(**track_kwargs)
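# Illustration (editor's note, hypothetical tag data): for normalized tags such
# as {'title': ['Song'], 'artist': ['Artist'], 'album': ['Album'],
# 'track-number': [1]}, convert_tags_to_track() returns a Track with
# name='Song', a single Artist(name='Artist'), track_no=1, and an attached
# Album(name='Album').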
def _artists(
tags, artist_name, artist_id=None, artist_sortname=None):
# Name missing, don't set artist
if not tags.get(artist_name):
return None
# One artist name and either id or sortname, include all available fields
if len(tags[artist_name]) == 1 and \
(artist_id in tags or artist_sortname in tags):
attrs = {'name': tags[artist_name][0]}
if artist_id in tags:
attrs['musicbrainz_id'] = tags[artist_id][0]
if artist_sortname in tags:
attrs['sortname'] = tags[artist_sortname][0]
return [Artist(**attrs)]
# Multiple artists; provide artists with name only to avoid ambiguity.
return [Artist(name=name) for name in tags[artist_name]]
Mopidy-2.0.0/mopidy/audio/constants.py 0000664 0001750 0001750 00000000536 12575004517 020162 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
class PlaybackState(object):
"""
Enum of playback states.
"""
#: Constant representing the paused state.
PAUSED = 'paused'
#: Constant representing the playing state.
PLAYING = 'playing'
#: Constant representing the stopped state.
STOPPED = 'stopped'
Mopidy-2.0.0/mopidy/audio/scan.py 0000664 0001750 0001750 00000020442 12660436420 017065 0 ustar jodal jodal 0000000 0000000 from __future__ import (
absolute_import, division, print_function, unicode_literals)
import collections
import time
from mopidy import exceptions
from mopidy.audio import tags as tags_lib, utils
from mopidy.internal import encoding
from mopidy.internal.gi import Gst, GstPbutils
# GST_ELEMENT_FACTORY_LIST:
_DECODER = 1 << 0
_AUDIO = 1 << 50
_DEMUXER = 1 << 5
_DEPAYLOADER = 1 << 8
_PARSER = 1 << 6
# GST_TYPE_AUTOPLUG_SELECT_RESULT:
_SELECT_TRY = 0
_SELECT_EXPOSE = 1
_Result = collections.namedtuple(
'Result', ('uri', 'tags', 'duration', 'seekable', 'mime', 'playable'))
# TODO: replace with a scan(uri, timeout=1000, proxy_config=None)?
class Scanner(object):
"""
Helper to get tags and other relevant info from URIs.
:param timeout: timeout for scanning a URI in ms
:param proxy_config: dictionary containing proxy config strings.
:type timeout: int
"""
def __init__(self, timeout=1000, proxy_config=None):
self._timeout_ms = int(timeout)
self._proxy_config = proxy_config or {}
def scan(self, uri, timeout=None):
"""
Scan the given uri collecting relevant metadata.
:param uri: URI of the resource to scan.
:type uri: string
:param timeout: timeout for scanning a URI in ms. Defaults to the
``timeout`` value used when creating the scanner.
:type timeout: int
:return: A named tuple containing
``(uri, tags, duration, seekable, mime, playable)``.
``tags`` is a dictionary of lists for all the tags we found.
``duration`` is the length of the URI in milliseconds, or
:class:`None` if the URI has no duration. ``seekable`` is a boolean
indicating whether a seek would succeed.
"""
timeout = int(timeout or self._timeout_ms)
tags, duration, seekable, mime = None, None, None, None
pipeline, signals = _setup_pipeline(uri, self._proxy_config)
try:
_start_pipeline(pipeline)
tags, mime, have_audio = _process(pipeline, timeout)
duration = _query_duration(pipeline)
seekable = _query_seekable(pipeline)
finally:
signals.clear()
pipeline.set_state(Gst.State.NULL)
del pipeline
return _Result(uri, tags, duration, seekable, mime, have_audio)
# Turns out it's _much_ faster to just create a new pipeline for every scan, as
# decodebins and other elements don't seem to take well to being reused.
def _setup_pipeline(uri, proxy_config=None):
src = Gst.Element.make_from_uri(Gst.URIType.SRC, uri)
if not src:
raise exceptions.ScannerError('GStreamer can not open: %s' % uri)
typefind = Gst.ElementFactory.make('typefind')
decodebin = Gst.ElementFactory.make('decodebin')
pipeline = Gst.ElementFactory.make('pipeline')
for e in (src, typefind, decodebin):
pipeline.add(e)
src.link(typefind)
typefind.link(decodebin)
if proxy_config:
utils.setup_proxy(src, proxy_config)
signals = utils.Signals()
signals.connect(typefind, 'have-type', _have_type, decodebin)
signals.connect(decodebin, 'pad-added', _pad_added, pipeline)
signals.connect(decodebin, 'autoplug-select', _autoplug_select)
return pipeline, signals
def _have_type(element, probability, caps, decodebin):
decodebin.set_property('sink-caps', caps)
struct = Gst.Structure.new_empty('have-type')
struct.set_value('caps', caps.get_structure(0))
element.get_bus().post(Gst.Message.new_application(element, struct))
def _pad_added(element, pad, pipeline):
sink = Gst.ElementFactory.make('fakesink')
sink.set_property('sync', False)
pipeline.add(sink)
sink.sync_state_with_parent()
pad.link(sink.get_static_pad('sink'))
if pad.query_caps().is_subset(Gst.Caps.from_string('audio/x-raw')):
# Probably won't happen due to the autoplug-select fix, but let's play it
# safe until we've tested more.
struct = Gst.Structure.new_empty('have-audio')
element.get_bus().post(Gst.Message.new_application(element, struct))
def _autoplug_select(element, pad, caps, factory):
if factory.list_is_type(_DECODER | _AUDIO):
struct = Gst.Structure.new_empty('have-audio')
element.get_bus().post(Gst.Message.new_application(element, struct))
if not factory.list_is_type(_DEMUXER | _DEPAYLOADER | _PARSER):
return _SELECT_EXPOSE
return _SELECT_TRY
def _start_pipeline(pipeline):
result = pipeline.set_state(Gst.State.PAUSED)
if result == Gst.StateChangeReturn.NO_PREROLL:
pipeline.set_state(Gst.State.PLAYING)
def _query_duration(pipeline, timeout=100):
# 1. Try and get a duration, return if success.
# 2. Some formats need to play some buffers before duration is found.
# 3. Wait for a duration change event.
# 4. Try and get a duration again.
success, duration = pipeline.query_duration(Gst.Format.TIME)
if success and duration >= 0:
return duration // Gst.MSECOND
result = pipeline.set_state(Gst.State.PLAYING)
if result == Gst.StateChangeReturn.FAILURE:
return None
gst_timeout = timeout * Gst.MSECOND
bus = pipeline.get_bus()
bus.timed_pop_filtered(gst_timeout, Gst.MessageType.DURATION_CHANGED)
success, duration = pipeline.query_duration(Gst.Format.TIME)
if success and duration >= 0:
return duration // Gst.MSECOND
return None
def _query_seekable(pipeline):
query = Gst.Query.new_seeking(Gst.Format.TIME)
pipeline.query(query)
return query.parse_seeking()[1]
def _process(pipeline, timeout_ms):
bus = pipeline.get_bus()
tags = {}
mime = None
have_audio = False
missing_message = None
types = (
Gst.MessageType.ELEMENT |
Gst.MessageType.APPLICATION |
Gst.MessageType.ERROR |
Gst.MessageType.EOS |
Gst.MessageType.ASYNC_DONE |
Gst.MessageType.TAG
)
timeout = timeout_ms
previous = int(time.time() * 1000)
while timeout > 0:
message = bus.timed_pop_filtered(timeout * Gst.MSECOND, types)
if message is None:
break
elif message.type == Gst.MessageType.ELEMENT:
if GstPbutils.is_missing_plugin_message(message):
missing_message = message
elif message.type == Gst.MessageType.APPLICATION:
if message.get_structure().get_name() == 'have-type':
mime = message.get_structure().get_value('caps').get_name()
if mime and (
mime.startswith('text/') or mime == 'application/xml'):
return tags, mime, have_audio
elif message.get_structure().get_name() == 'have-audio':
have_audio = True
elif message.type == Gst.MessageType.ERROR:
error = encoding.locale_decode(message.parse_error()[0])
if missing_message and not mime:
caps = missing_message.get_structure().get_value('detail')
mime = caps.get_structure(0).get_name()
return tags, mime, have_audio
raise exceptions.ScannerError(error)
elif message.type == Gst.MessageType.EOS:
return tags, mime, have_audio
elif message.type == Gst.MessageType.ASYNC_DONE:
if message.src == pipeline:
return tags, mime, have_audio
elif message.type == Gst.MessageType.TAG:
taglist = message.parse_tag()
# Note that this will only keep the last tag.
tags.update(tags_lib.convert_taglist(taglist))
now = int(time.time() * 1000)
timeout -= now - previous
previous = now
raise exceptions.ScannerError('Timeout after %dms' % timeout_ms)
if __name__ == '__main__':
import os
import sys
from mopidy.internal import path
scanner = Scanner(5000)
for uri in sys.argv[1:]:
if not Gst.uri_is_valid(uri):
uri = path.path_to_uri(os.path.abspath(uri))
try:
result = scanner.scan(uri)
for key in ('uri', 'mime', 'duration', 'playable', 'seekable'):
print('%-20s %s' % (key, getattr(result, key)))
print('tags')
for tag, value in result.tags.items():
print('%-20s %s' % (tag, value))
except exceptions.ScannerError as error:
print('%s: %s' % (uri, error))
Mopidy-2.0.0/mopidy/audio/listener.py 0000664 0001750 0001750 00000006304 12660436420 017767 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy import listener
class AudioListener(listener.Listener):
"""
Marker interface for recipients of events sent by the audio actor.
Any Pykka actor that mixes in this class will receive calls to the methods
defined here when the corresponding events happen in the audio actor. This
interface is used both for looking up what actors to notify of the events,
and for providing default implementations for those listeners that are not
interested in all events.
"""
@staticmethod
def send(event, **kwargs):
"""Helper to allow calling of audio listener events"""
listener.send(AudioListener, event, **kwargs)
def reached_end_of_stream(self):
"""
Called whenever the end of the audio stream is reached.
*MAY* be implemented by actor.
"""
pass
def stream_changed(self, uri):
"""
Called whenever the audio stream changes.
*MAY* be implemented by actor.
:param string uri: URI the stream has started playing.
"""
pass
def position_changed(self, position):
"""
Called whenever the position of the stream changes.
*MAY* be implemented by actor.
:param int position: Position in milliseconds.
"""
pass
def state_changed(self, old_state, new_state, target_state):
"""
Called after the playback state has changed.
Will be called for both immediate and async state changes in GStreamer.
The target state is used when we should be in the target state but
temporarily need to switch to another state. A typical example of this
is buffering. When this happens an event with
`old=PLAYING, new=PAUSED, target=PLAYING` will be emitted. Once we have
caught up, an `old=PAUSED, new=PLAYING, target=None` event will be
generated.
Regular state changes will not have target state set as they are final
states which should be stable.
*MAY* be implemented by actor.
:param old_state: the state before the change
:type old_state: string from :class:`mopidy.core.PlaybackState` field
:param new_state: the state after the change
:type new_state: string from :class:`mopidy.core.PlaybackState` field
:param target_state: the intended state
:type target_state: string from :class:`mopidy.core.PlaybackState`
field or :class:`None` if this is a final state.
"""
pass
def tags_changed(self, tags):
"""
Called whenever the current audio stream's tags change.
This event signals that some track metadata has been updated. This can
be metadata such as artists, titles, organization, or details about the
actual audio such as bit-rates, numbers of channels etc.
For the available tag keys please refer to GStreamer documentation for
tags.
*MAY* be implemented by actor.
:param tags: The tags that have just been updated.
:type tags: :class:`set` of strings
"""
pass
Mopidy-2.0.0/mopidy/audio/actor.py 0000664 0001750 0001750 00000073035 12660436420 017257 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
import os
import threading
import pykka
from mopidy import exceptions
from mopidy.audio import tags as tags_lib, utils
from mopidy.audio.constants import PlaybackState
from mopidy.audio.listener import AudioListener
from mopidy.internal import deprecation, process
from mopidy.internal.gi import GObject, Gst, GstPbutils
logger = logging.getLogger(__name__)
# This logger is only meant for debug logging of low level GStreamer info such
# as callbacks, events, messages, and direct interaction with GStreamer such as
# set_state() on a pipeline.
gst_logger = logging.getLogger('mopidy.audio.gst')
_GST_STATE_MAPPING = {
Gst.State.PLAYING: PlaybackState.PLAYING,
Gst.State.PAUSED: PlaybackState.PAUSED,
Gst.State.NULL: PlaybackState.STOPPED,
}
# TODO: expose this as a property on audio?
class _Appsrc(object):
"""Helper class for dealing with appsrc based playback."""
def __init__(self):
self._signals = utils.Signals()
self.reset()
def reset(self):
"""Reset the helper.
Should be called whenever the source changes and we are not setting up
a new appsrc.
"""
self.prepare(None, None, None, None)
def prepare(self, caps, need_data, enough_data, seek_data):
"""Store info we will need when the appsrc element gets installed."""
self._signals.clear()
self._source = None
self._caps = caps
self._need_data_callback = need_data
self._seek_data_callback = seek_data
self._enough_data_callback = enough_data
def configure(self, source):
"""Configure the supplied source for use.
Should be called whenever we get a new appsrc.
"""
source.set_property('caps', self._caps)
source.set_property('format', b'time')
source.set_property('stream-type', b'seekable')
source.set_property('max-bytes', 1 << 20) # 1MB
source.set_property('min-percent', 50)
if self._need_data_callback:
self._signals.connect(source, 'need-data', self._on_signal,
self._need_data_callback)
if self._seek_data_callback:
self._signals.connect(source, 'seek-data', self._on_signal,
self._seek_data_callback)
if self._enough_data_callback:
self._signals.connect(source, 'enough-data', self._on_signal, None,
self._enough_data_callback)
self._source = source
def push(self, buffer_):
if self._source is None:
return False
if buffer_ is None:
gst_logger.debug('Sending appsrc end-of-stream event.')
result = self._source.emit('end-of-stream')
return result == Gst.FlowReturn.OK
else:
result = self._source.emit('push-buffer', buffer_)
return result == Gst.FlowReturn.OK
def _on_signal(self, element, clocktime, func):
# This shim is used to ensure we always return true, and also handles
# the fact that not all the callbacks have a time argument.
if clocktime is None:
func()
else:
func(utils.clocktime_to_millisecond(clocktime))
return True
# TODO: expose this as a property on audio when #790 gets further along.
class _Outputs(Gst.Bin):
def __init__(self):
Gst.Bin.__init__(self)
# TODO gst1: Set 'outputs' as the Bin name for easier debugging
self._tee = Gst.ElementFactory.make('tee')
self.add(self._tee)
ghost_pad = Gst.GhostPad.new('sink', self._tee.get_static_pad('sink'))
self.add_pad(ghost_pad)
# Add an always connected fakesink which respects the clock so the tee
# doesn't fail even if we don't have any outputs.
fakesink = Gst.ElementFactory.make('fakesink')
fakesink.set_property('sync', True)
self._add(fakesink)
def add_output(self, description):
# XXX This only works for pipelines not in use until #790 gets done.
try:
output = Gst.parse_bin_from_description(
description, ghost_unlinked_pads=True)
except GObject.GError as ex:
logger.error(
'Failed to create audio output "%s": %s', description, ex)
raise exceptions.AudioException(bytes(ex))
self._add(output)
logger.info('Audio output set to "%s"', description)
def _add(self, element):
queue = Gst.ElementFactory.make('queue')
self.add(element)
self.add(queue)
queue.link(element)
self._tee.link(queue)
class SoftwareMixer(object):
pykka_traversable = True
def __init__(self, mixer):
self._mixer = mixer
self._element = None
self._last_volume = None
self._last_mute = None
self._signals = utils.Signals()
def setup(self, element, mixer_ref):
self._element = element
self._mixer.setup(mixer_ref)
def teardown(self):
self._signals.clear()
self._mixer.teardown()
def get_volume(self):
return int(round(self._element.get_property('volume') * 100))
def set_volume(self, volume):
self._element.set_property('volume', volume / 100.0)
self._mixer.trigger_volume_changed(self.get_volume())
def get_mute(self):
return self._element.get_property('mute')
def set_mute(self, mute):
self._element.set_property('mute', bool(mute))
self._mixer.trigger_mute_changed(self.get_mute())
class _Handler(object):
def __init__(self, audio):
self._audio = audio
self._element = None
self._pad = None
self._message_handler_id = None
self._event_handler_id = None
def setup_message_handling(self, element):
self._element = element
bus = element.get_bus()
bus.add_signal_watch()
self._message_handler_id = bus.connect('message', self.on_message)
def setup_event_handling(self, pad):
self._pad = pad
self._event_handler_id = pad.add_probe(
Gst.PadProbeType.EVENT_BOTH, self.on_pad_event)
def teardown_message_handling(self):
bus = self._element.get_bus()
bus.remove_signal_watch()
bus.disconnect(self._message_handler_id)
self._message_handler_id = None
def teardown_event_handling(self):
self._pad.remove_probe(self._event_handler_id)
self._event_handler_id = None
def on_message(self, bus, msg):
if msg.type == Gst.MessageType.STATE_CHANGED:
if msg.src != self._element:
return
old_state, new_state, pending_state = msg.parse_state_changed()
self.on_playbin_state_changed(old_state, new_state, pending_state)
elif msg.type == Gst.MessageType.BUFFERING:
self.on_buffering(msg.parse_buffering(), msg.get_structure())
elif msg.type == Gst.MessageType.EOS:
self.on_end_of_stream()
elif msg.type == Gst.MessageType.ERROR:
error, debug = msg.parse_error()
self.on_error(error, debug)
elif msg.type == Gst.MessageType.WARNING:
error, debug = msg.parse_warning()
self.on_warning(error, debug)
elif msg.type == Gst.MessageType.ASYNC_DONE:
self.on_async_done()
elif msg.type == Gst.MessageType.TAG:
taglist = msg.parse_tag()
self.on_tag(taglist)
elif msg.type == Gst.MessageType.ELEMENT:
if GstPbutils.is_missing_plugin_message(msg):
self.on_missing_plugin(msg)
elif msg.type == Gst.MessageType.STREAM_START:
self.on_stream_start()
def on_pad_event(self, pad, pad_probe_info):
event = pad_probe_info.get_event()
if event.type == Gst.EventType.SEGMENT:
self.on_segment(event.parse_segment())
return Gst.PadProbeReturn.OK
def on_playbin_state_changed(self, old_state, new_state, pending_state):
gst_logger.debug(
'Got STATE_CHANGED bus message: old=%s new=%s pending=%s',
old_state.value_name, new_state.value_name,
pending_state.value_name)
if new_state == Gst.State.READY and pending_state == Gst.State.NULL:
# XXX: We're not called on the last state change when going down to
# NULL, so we rewrite the second to last call to get the expected
# behavior.
new_state = Gst.State.NULL
pending_state = Gst.State.VOID_PENDING
if pending_state != Gst.State.VOID_PENDING:
return # Ignore intermediate state changes
if new_state == Gst.State.READY:
return # Ignore READY state as it's GStreamer specific
new_state = _GST_STATE_MAPPING[new_state]
old_state, self._audio.state = self._audio.state, new_state
target_state = _GST_STATE_MAPPING.get(self._audio._target_state)
if target_state is None:
# XXX: Workaround for #1430, to be fixed properly by #1222.
logger.debug('Race condition happened. See #1222 and #1430.')
return
if target_state == new_state:
target_state = None
logger.debug('Audio event: state_changed(old_state=%s, new_state=%s, '
'target_state=%s)', old_state, new_state, target_state)
AudioListener.send('state_changed', old_state=old_state,
new_state=new_state, target_state=target_state)
if new_state == PlaybackState.STOPPED:
logger.debug('Audio event: stream_changed(uri=None)')
AudioListener.send('stream_changed', uri=None)
if 'GST_DEBUG_DUMP_DOT_DIR' in os.environ:
Gst.debug_bin_to_dot_file(
self._audio._playbin, Gst.DebugGraphDetails.ALL, 'mopidy')
def on_buffering(self, percent, structure=None):
if structure is not None and structure.has_field('buffering-mode'):
buffering_mode = structure.get_enum(
'buffering-mode', Gst.BufferingMode)
if buffering_mode == Gst.BufferingMode.LIVE:
return # Live sources stall in paused.
level = logging.getLevelName('TRACE')
if percent < 10 and not self._audio._buffering:
self._audio._playbin.set_state(Gst.State.PAUSED)
self._audio._buffering = True
level = logging.DEBUG
if percent == 100:
self._audio._buffering = False
if self._audio._target_state == Gst.State.PLAYING:
self._audio._playbin.set_state(Gst.State.PLAYING)
level = logging.DEBUG
gst_logger.log(
level, 'Got BUFFERING bus message: percent=%d%%', percent)
def on_end_of_stream(self):
gst_logger.debug('Got EOS (end of stream) bus message.')
logger.debug('Audio event: reached_end_of_stream()')
self._audio._tags = {}
AudioListener.send('reached_end_of_stream')
def on_error(self, error, debug):
error_msg = str(error).decode('utf-8')
debug_msg = debug.decode('utf-8')
gst_logger.debug(
'Got ERROR bus message: error=%r debug=%r', error_msg, debug_msg)
gst_logger.error('GStreamer error: %s', error_msg)
# TODO: is this needed?
self._audio.stop_playback()
def on_warning(self, error, debug):
error_msg = str(error).decode('utf-8')
debug_msg = debug.decode('utf-8')
gst_logger.warning('GStreamer warning: %s', error_msg)
gst_logger.debug(
'Got WARNING bus message: error=%r debug=%r', error_msg, debug_msg)
def on_async_done(self):
gst_logger.debug('Got ASYNC_DONE bus message.')
def on_tag(self, taglist):
tags = tags_lib.convert_taglist(taglist)
gst_logger.debug('Got TAG bus message: tags=%r', dict(tags))
# Postpone emitting tags until stream start.
if self._audio._pending_tags is not None:
self._audio._pending_tags.update(tags)
return
# TODO: Add proper tests for only emitting changed tags.
unique = object()
changed = []
for key, value in tags.items():
# Update any tags that changed, and store changed keys.
if self._audio._tags.get(key, unique) != value:
self._audio._tags[key] = value
changed.append(key)
if changed:
logger.debug('Audio event: tags_changed(tags=%r)', changed)
AudioListener.send('tags_changed', tags=changed)
def on_missing_plugin(self, msg):
desc = GstPbutils.missing_plugin_message_get_description(msg)
debug = GstPbutils.missing_plugin_message_get_installer_detail(msg)
gst_logger.debug(
'Got missing-plugin bus message: description=%r', desc)
logger.warning('Could not find a %s to handle media.', desc)
if GstPbutils.install_plugins_supported():
logger.info('You might be able to fix this by running: '
'gst-installer "%s"', debug)
# TODO: store the missing plugins installer info in a file so we can
# can provide a 'mopidy install-missing-plugins' if the system has the
# required helper installed?
def on_stream_start(self):
gst_logger.debug('Got STREAM_START bus message')
uri = self._audio._pending_uri
logger.debug('Audio event: stream_changed(uri=%r)', uri)
AudioListener.send('stream_changed', uri=uri)
# Emit any postponed tags that we got after about-to-finish.
tags, self._audio._pending_tags = self._audio._pending_tags, None
self._audio._tags = tags
if tags:
logger.debug('Audio event: tags_changed(tags=%r)', tags.keys())
AudioListener.send('tags_changed', tags=tags.keys())
def on_segment(self, segment):
gst_logger.debug(
'Got SEGMENT pad event: '
'rate=%(rate)s format=%(format)s start=%(start)s stop=%(stop)s '
'position=%(position)s', {
'rate': segment.rate,
'format': Gst.Format.get_name(segment.format),
'start': segment.start,
'stop': segment.stop,
'position': segment.position
})
position_ms = segment.position // Gst.MSECOND
logger.debug('Audio event: position_changed(position=%r)', position_ms)
AudioListener.send('position_changed', position=position_ms)
# TODO: create a player class which replaces the actors internals
class Audio(pykka.ThreadingActor):
"""
Audio output through `GStreamer `_.
"""
#: The GStreamer state mapped to :class:`mopidy.audio.PlaybackState`
state = PlaybackState.STOPPED
#: The software mixing interface :class:`mopidy.audio.actor.SoftwareMixer`
mixer = None
def __init__(self, config, mixer):
super(Audio, self).__init__()
self._config = config
self._target_state = Gst.State.NULL
self._buffering = False
self._tags = {}
self._pending_uri = None
self._pending_tags = None
self._playbin = None
self._outputs = None
self._queue = None
self._about_to_finish_callback = None
self._handler = _Handler(self)
self._appsrc = _Appsrc()
self._signals = utils.Signals()
if mixer and self._config['audio']['mixer'] == 'software':
self.mixer = SoftwareMixer(mixer)
def on_start(self):
self._thread = threading.current_thread()
try:
self._setup_preferences()
self._setup_playbin()
self._setup_outputs()
self._setup_audio_sink()
except GObject.GError as ex:
logger.exception(ex)
process.exit_process()
def on_stop(self):
self._teardown_mixer()
self._teardown_playbin()
def _setup_preferences(self):
# TODO: move out of audio actor?
# Fix for https://github.com/mopidy/mopidy/issues/604
registry = Gst.Registry.get()
jacksink = registry.find_feature('jackaudiosink', Gst.ElementFactory)
if jacksink:
jacksink.set_rank(Gst.Rank.SECONDARY)
def _setup_playbin(self):
playbin = Gst.ElementFactory.make('playbin')
playbin.set_property('flags', 2) # GST_PLAY_FLAG_AUDIO
# TODO: turn into config values...
playbin.set_property('buffer-size', 5 << 20) # 5MB
playbin.set_property('buffer-duration', 5 * Gst.SECOND)
self._signals.connect(playbin, 'source-setup', self._on_source_setup)
self._signals.connect(playbin, 'about-to-finish',
self._on_about_to_finish)
self._playbin = playbin
self._handler.setup_message_handling(playbin)
def _teardown_playbin(self):
self._handler.teardown_message_handling()
self._handler.teardown_event_handling()
self._signals.disconnect(self._playbin, 'about-to-finish')
self._signals.disconnect(self._playbin, 'source-setup')
self._playbin.set_state(Gst.State.NULL)
def _setup_outputs(self):
# We don't want to use outputs for regular testing, so just install
# an unsynced fakesink when someone asks for a 'testoutput'.
if self._config['audio']['output'] == 'testoutput':
self._outputs = Gst.ElementFactory.make('fakesink')
else:
self._outputs = _Outputs()
try:
self._outputs.add_output(self._config['audio']['output'])
except exceptions.AudioException:
process.exit_process() # TODO: move this up the chain
self._handler.setup_event_handling(
self._outputs.get_static_pad('sink'))
def _setup_audio_sink(self):
audio_sink = Gst.ElementFactory.make('bin', 'audio-sink')
# Queue element to buy us time between the about-to-finish event and
# the actual switch, i.e. the switch can block for longer thanks
# to this queue.
# TODO: See if settings should be set to minimize latency. Previous
# setting breaks appsrc, and settings before that broke on a few
# systems. So leave the default to play it safe.
queue = Gst.ElementFactory.make('queue')
if self._config['audio']['buffer_time'] > 0:
queue.set_property(
'max-size-time',
self._config['audio']['buffer_time'] * Gst.MSECOND)
audio_sink.add(queue)
audio_sink.add(self._outputs)
if self.mixer:
volume = Gst.ElementFactory.make('volume')
audio_sink.add(volume)
queue.link(volume)
volume.link(self._outputs)
self.mixer.setup(volume, self.actor_ref.proxy().mixer)
else:
queue.link(self._outputs)
ghost_pad = Gst.GhostPad.new('sink', queue.get_static_pad('sink'))
audio_sink.add_pad(ghost_pad)
self._playbin.set_property('audio-sink', audio_sink)
self._queue = queue
def _teardown_mixer(self):
if self.mixer:
self.mixer.teardown()
def _on_about_to_finish(self, element):
if self._thread == threading.current_thread():
logger.error(
'about-to-finish in actor, aborting to avoid deadlock.')
return
gst_logger.debug('Got about-to-finish event.')
if self._about_to_finish_callback:
logger.debug('Running about-to-finish callback.')
self._about_to_finish_callback()
def _on_source_setup(self, element, source):
gst_logger.debug(
'Got source-setup signal: element=%s', source.__class__.__name__)
if source.get_factory().get_name() == 'appsrc':
self._appsrc.configure(source)
else:
self._appsrc.reset()
utils.setup_proxy(source, self._config['proxy'])
def set_uri(self, uri):
"""
Set URI of audio to be played.
You *MUST* call :meth:`prepare_change` before calling this method.
:param uri: the URI to play
:type uri: string
"""
# XXX: Hack to work around an issue on Mac OS X where volume level
# does not persist between track changes. mopidy/mopidy#886
if self.mixer is not None:
current_volume = self.mixer.get_volume()
else:
current_volume = None
self._pending_uri = uri
self._pending_tags = {}
self._playbin.set_property('uri', uri)
if self.mixer is not None and current_volume is not None:
self.mixer.set_volume(current_volume)
def set_appsrc(
self, caps, need_data=None, enough_data=None, seek_data=None):
"""
Switch to using appsrc for getting audio to be played.
You *MUST* call :meth:`prepare_change` before calling this method.
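A minimal sketch (the caps string and the ``my_*`` callbacks below are
illustrative, not part of Mopidy)::

    audio.prepare_change()
    audio.set_appsrc(
        'audio/x-raw,format=S16LE,rate=44100,channels=2',
        need_data=my_need_data,
        enough_data=my_enough_data,
        seek_data=my_seek_data)
    audio.start_playback()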
:param caps: GStreamer caps string describing the audio format to
expect
:type caps: string
:param need_data: callback for when appsrc needs data
:type need_data: callable which takes data length hint in ms
:param enough_data: callback for when appsrc has enough data
:type enough_data: callable
:param seek_data: callback for when data from a new position is needed
to continue playback
:type seek_data: callable which takes time position in ms
"""
self._appsrc.prepare(
Gst.Caps.from_string(caps), need_data, enough_data, seek_data)
uri = 'appsrc://'
self._pending_uri = uri
self._playbin.set_property('uri', uri)
def emit_data(self, buffer_):
"""
Call this to deliver raw audio data to be played.
If the buffer is :class:`None`, the end-of-stream token is put on the
playbin. We will get a GStreamer message when the stream playback
reaches the token, and can then do any end-of-stream related tasks.
Note that the URI must be set to ``appsrc://`` for this to work.
Returns :class:`True` if data was delivered.
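A minimal sketch (``raw_pcm_bytes`` is assumed to match the caps given to
:meth:`set_appsrc`)::

    buf = Gst.Buffer.new_wrapped(raw_pcm_bytes)
    audio.emit_data(buf)
    audio.emit_data(None)  # signals end of stream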
:param buffer_: buffer to pass to appsrc
:type buffer_: :class:`Gst.Buffer` or :class:`None`
:rtype: boolean
"""
return self._appsrc.push(buffer_)
def emit_end_of_stream(self):
"""
Put an end-of-stream token on the playbin. This is typically used in
combination with :meth:`emit_data`.
We will get a GStreamer message when the stream playback reaches the
token, and can then do any end-of-stream related tasks.
.. deprecated:: 1.0
Use :meth:`emit_data` with a :class:`None` buffer instead.
"""
deprecation.warn('audio.emit_end_of_stream')
self._appsrc.push(None)
def set_about_to_finish_callback(self, callback):
"""
Configure audio to use an about-to-finish callback.
This should be used to achieve gapless playback. For this to work the
callback *MUST* call :meth:`set_uri` with the new URI to play and
block until this call has been made. :meth:`prepare_change` is not
needed before :meth:`set_uri` in this one special case.
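A minimal sketch, assuming ``audio`` is an actor proxy for this class and
``get_next_uri()`` is your own helper for picking the next track::

    def on_about_to_finish():
        # Block until the new URI has been set on the playbin.
        audio.set_uri(get_next_uri()).get()

    audio.set_about_to_finish_callback(on_about_to_finish)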
:param callable callback: Callback to run when we need the next URI.
"""
self._about_to_finish_callback = callback
def get_position(self):
"""
Get position in milliseconds.
:rtype: int
"""
success, position = self._playbin.query_position(Gst.Format.TIME)
if not success:
# TODO: take state into account for this and possibly also return
# None as the unknown value instead of zero?
logger.debug('Position query failed')
return 0
return utils.clocktime_to_millisecond(position)
def set_position(self, position):
"""
Set position in milliseconds.
:param position: the position in milliseconds
:type position: int
:rtype: :class:`True` if successful, else :class:`False`
"""
# TODO: double check seek flags in use.
gst_position = utils.millisecond_to_clocktime(position)
gst_logger.debug('Sending flushing seek: position=%r', gst_position)
# Send seek event to the queue not the playbin. The default behavior
# for bins is to forward this event to all sinks, which results in
# duplicate seek events making it to appsrc. Since elements are not
# allowed to act on the seek event, only modify it, this should be safe
# to do.
result = self._queue.seek_simple(
Gst.Format.TIME, Gst.SeekFlags.FLUSH, gst_position)
return result
def start_playback(self):
"""
Notify GStreamer that it should start playback.
:rtype: :class:`True` if successful, else :class:`False`
"""
return self._set_state(Gst.State.PLAYING)
def pause_playback(self):
"""
Notify GStreamer that it should pause playback.
:rtype: :class:`True` if successful, else :class:`False`
"""
return self._set_state(Gst.State.PAUSED)
def prepare_change(self):
"""
Notify GStreamer that we are about to change state of playback.
This function *MUST* be called before changing URIs or doing
changes like updating data that is being pushed. The reason for this
is that GStreamer will reset all its state when it changes to
:attr:`Gst.State.READY`.
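A typical call sequence (the URI below is illustrative)::

    audio.prepare_change()
    audio.set_uri('file:///home/alice/music/song.flac')
    audio.start_playback()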
"""
return self._set_state(Gst.State.READY)
def stop_playback(self):
"""
Notify GStreamer that it should stop playback.
:rtype: :class:`True` if successful, else :class:`False`
"""
self._buffering = False
return self._set_state(Gst.State.NULL)
def wait_for_state_change(self):
"""Block until any pending state changes are complete.
Should only be used by tests.
"""
self._playbin.get_state(timeout=Gst.CLOCK_TIME_NONE)
def enable_sync_handler(self):
"""Enable manual processing of messages from bus.
Should only be used by tests.
"""
def sync_handler(bus, message):
self._handler.on_message(bus, message)
return Gst.BusSyncReply.DROP
bus = self._playbin.get_bus()
bus.set_sync_handler(sync_handler)
def _set_state(self, state):
"""
Internal method for setting the raw GStreamer state.
.. digraph:: gst_state_transitions
graph [rankdir="LR"];
node [fontsize=10];
"NULL" -> "READY"
"PAUSED" -> "PLAYING"
"PAUSED" -> "READY"
"PLAYING" -> "PAUSED"
"READY" -> "NULL"
"READY" -> "PAUSED"
:param state: State to set playbin to. One of: `Gst.State.NULL`,
`Gst.State.READY`, `Gst.State.PAUSED` and `Gst.State.PLAYING`.
:type state: :class:`Gst.State`
:rtype: :class:`True` if successful, else :class:`False`
"""
self._target_state = state
result = self._playbin.set_state(state)
gst_logger.debug(
'Changing state to %s: result=%s', state.value_name,
result.value_name)
if result == Gst.StateChangeReturn.FAILURE:
logger.warning(
'Setting GStreamer state to %s failed', state.value_name)
return False
# TODO: at this point we could already emit stopped event instead
# of faking it in the message handling when result=OK
return True
# TODO: bake this into setup appsrc perhaps?
def set_metadata(self, track):
"""
Set track metadata for currently playing song.
Only needs to be called by sources such as ``appsrc`` which do not
already inject tags in playbin, e.g. when using :meth:`emit_data` to
deliver raw audio data to GStreamer.
:param track: the current track
:type track: :class:`mopidy.models.Track`
"""
taglist = Gst.TagList.new_empty()
artists = [a for a in (track.artists or []) if a.name]
def set_value(tag, value):
gobject_value = GObject.Value()
gobject_value.init(GObject.TYPE_STRING)
gobject_value.set_string(value)
taglist.add_value(Gst.TagMergeMode.REPLACE, tag, gobject_value)
# Default to blank data to trick shoutcast into clearing any previous
# values it might have.
# TODO: Verify if this works at all, likely it doesn't.
set_value(Gst.TAG_ARTIST, ' ')
set_value(Gst.TAG_TITLE, ' ')
set_value(Gst.TAG_ALBUM, ' ')
if artists:
set_value(Gst.TAG_ARTIST, ', '.join([a.name for a in artists]))
if track.name:
set_value(Gst.TAG_TITLE, track.name)
if track.album and track.album.name:
set_value(Gst.TAG_ALBUM, track.album.name)
gst_logger.debug(
'Sending TAG event for track %r: %r',
track.uri, taglist.to_string())
event = Gst.Event.new_tag(taglist)
# TODO: check if we get this back on our own bus?
self._playbin.send_event(event)
def get_current_tags(self):
"""
Get the currently playing media's tags.
If no tags have been found, or nothing is playing, this returns an empty
dictionary. For each set of tags we collect, a tags_changed event is
emitted with the keys of the changed tags. After such an event, users may
call this function to get the updated values.
:rtype: {key: [values]} dict for the current media.
"""
# TODO: should this be a (deep) copy? most likely yes
# TODO: should we return None when stopped?
# TODO: support only fetching keys we care about?
return self._tags
Mopidy-2.0.0/mopidy/config/ 0000775 0001750 0001750 00000000000 12660436443 015736 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/config/types.py 0000664 0001750 0001750 00000021031 12614502604 017441 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
import re
import socket
from mopidy import compat
from mopidy.config import validators
from mopidy.internal import log, path
def decode(value):
if isinstance(value, compat.text_type):
return value
# TODO: only unescape \n \t and \\?
return value.decode('string-escape').decode('utf-8')
def encode(value):
if not isinstance(value, compat.text_type):
return value
for char in ('\\', '\n', '\t'): # TODO: more escapes?
value = value.replace(char, char.encode('unicode-escape'))
return value.encode('utf-8')
class ExpandedPath(bytes):
def __new__(cls, original, expanded):
return super(ExpandedPath, cls).__new__(cls, expanded)
def __init__(self, original, expanded):
self.original = original
class DeprecatedValue(object):
pass
class ConfigValue(object):
"""Represents a config key's value and how to handle it.
Normally you will only be interacting with sub-classes for config values
that encode deserialization behavior and/or validation.
Each config value should be used for the following actions:
1. Deserializing from a raw string and validating, raising ValueError on
failure.
2. Serializing a value back to a string that can be stored in a config.
3. Formatting a value to a printable form (useful for masking secrets).
:class:`None` values should not be deserialized, serialized or formatted,
the code interacting with the config should simply skip None config values.
"""
def deserialize(self, value):
"""Cast raw string to appropriate type."""
return value
def serialize(self, value, display=False):
"""Convert value back to string for saving."""
if value is None:
return b''
return bytes(value)
class Deprecated(ConfigValue):
"""Deprecated value
Used for ignoring old config values that are no longer in use, but should
not cause the config parser to crash.
"""
def deserialize(self, value):
return DeprecatedValue()
def serialize(self, value, display=False):
return DeprecatedValue()
class String(ConfigValue):
"""String value.
Is decoded as utf-8 and \\n \\t escapes should work and be preserved.
"""
def __init__(self, optional=False, choices=None):
self._required = not optional
self._choices = choices
def deserialize(self, value):
value = decode(value).strip()
validators.validate_required(value, self._required)
if not value:
return None
validators.validate_choice(value, self._choices)
return value
def serialize(self, value, display=False):
if value is None:
return b''
return encode(value)
class Secret(String):
"""Secret string value.
Is decoded as utf-8 and \\n \\t escapes should work and be preserved.
Should be used for passwords, auth tokens etc. Will mask value when being
displayed.
"""
def __init__(self, optional=False, choices=None):
self._required = not optional
self._choices = None # Choices doesn't make sense for secrets
def serialize(self, value, display=False):
if value is not None and display:
return b'********'
return super(Secret, self).serialize(value, display)
class Integer(ConfigValue):
"""Integer value."""
def __init__(
self, minimum=None, maximum=None, choices=None, optional=False):
self._required = not optional
self._minimum = minimum
self._maximum = maximum
self._choices = choices
def deserialize(self, value):
validators.validate_required(value, self._required)
if not value:
return None
value = int(value)
validators.validate_choice(value, self._choices)
validators.validate_minimum(value, self._minimum)
validators.validate_maximum(value, self._maximum)
return value
class Boolean(ConfigValue):
"""Boolean value.
Accepts ``1``, ``yes``, ``true``, and ``on`` with any casing as
:class:`True`.
Accepts ``0``, ``no``, ``false``, and ``off`` with any casing as
:class:`False`.
"""
true_values = ('1', 'yes', 'true', 'on')
false_values = ('0', 'no', 'false', 'off')
def __init__(self, optional=False):
self._required = not optional
def deserialize(self, value):
validators.validate_required(value, self._required)
if not value:
return None
if value.lower() in self.true_values:
return True
elif value.lower() in self.false_values:
return False
raise ValueError('invalid value for boolean: %r' % value)
def serialize(self, value, display=False):
if value:
return b'true'
else:
return b'false'
class List(ConfigValue):
"""List value.
Supports elements split by commas or newlines. Newlines take precedence and
empty list items will be filtered out.
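Example (illustrative)::

    List().deserialize(b'spotify, local')  # => ('spotify', 'local')

A value spread over multiple lines, with one item per line, gives the same
result.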
"""
def __init__(self, optional=False):
self._required = not optional
def deserialize(self, value):
if b'\n' in value:
values = re.split(r'\s*\n\s*', value)
else:
values = re.split(r'\s*,\s*', value)
values = (decode(v).strip() for v in values)
values = filter(None, values)
validators.validate_required(values, self._required)
return tuple(values)
def serialize(self, value, display=False):
if not value:
return b''
return b'\n ' + b'\n '.join(encode(v) for v in value if v)
class LogColor(ConfigValue):
def deserialize(self, value):
validators.validate_choice(value.lower(), log.COLORS)
return value.lower()
def serialize(self, value, display=False):
if value.lower() in log.COLORS:
return value.lower()
return b''
class LogLevel(ConfigValue):
"""Log level value.
Expects one of ``critical``, ``error``, ``warning``, ``info``, ``debug``,
or ``all``, with any casing.
"""
levels = {
b'critical': logging.CRITICAL,
b'error': logging.ERROR,
b'warning': logging.WARNING,
b'info': logging.INFO,
b'debug': logging.DEBUG,
b'all': logging.NOTSET,
}
def deserialize(self, value):
validators.validate_choice(value.lower(), self.levels.keys())
return self.levels.get(value.lower())
def serialize(self, value, display=False):
lookup = dict((v, k) for k, v in self.levels.items())
if value in lookup:
return lookup[value]
return b''
class Hostname(ConfigValue):
"""Network hostname value."""
def __init__(self, optional=False):
self._required = not optional
def deserialize(self, value, display=False):
validators.validate_required(value, self._required)
if not value.strip():
return None
try:
socket.getaddrinfo(value, None)
except socket.error:
raise ValueError('must be a resolvable hostname or valid IP')
return value
class Port(Integer):
"""Network port value.
Expects an integer in the range 0-65535; zero tells the kernel to simply
allocate a port for us.
"""
# TODO: consider probing if port is free or not?
def __init__(self, choices=None, optional=False):
super(Port, self).__init__(
minimum=0, maximum=2 ** 16 - 1, choices=choices, optional=optional)
class Path(ConfigValue):
"""File system path
The following expansions of the path will be done:
- ``~`` to the current user's home directory
- ``$XDG_CACHE_DIR`` according to the XDG spec
- ``$XDG_CONFIG_DIR`` according to the XDG spec
- ``$XDG_DATA_DIR`` according to the XDG spec
- ``$XDG_MUSIC_DIR`` according to the XDG spec
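Example (illustrative; the expanded result depends on the environment)::

    Path().deserialize(b'~/music')
    # => an ExpandedPath that behaves like b'/home/alice/music', but
    #    remembers the original b'~/music' for use by serialize()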
"""
def __init__(self, optional=False):
self._required = not optional
def deserialize(self, value):
value = value.strip()
expanded = path.expand_path(value)
validators.validate_required(value, self._required)
validators.validate_required(expanded, self._required)
if not value or expanded is None:
return None
return ExpandedPath(value, expanded)
def serialize(self, value, display=False):
if isinstance(value, compat.text_type):
raise ValueError('paths should always be bytes')
if isinstance(value, ExpandedPath):
return value.original
return value
Mopidy-2.0.0/mopidy/config/schemas.py 0000664 0001750 0001750 00000007711 12575004517 017737 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import collections
from mopidy.config import types
def _did_you_mean(name, choices):
"""Suggest most likely setting based on levenshtein."""
if not choices:
return None
name = name.lower()
candidates = [(_levenshtein(name, c), c) for c in choices]
candidates.sort()
if candidates[0][0] <= 3:
return candidates[0][1]
return None
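# Example (illustrative), using keys from the audio config section as choices:
#
#     _did_you_mean('mixxer', ['mixer', 'mixer_volume', 'output', 'buffer_time'])
#
# returns 'mixer', since its Levenshtein distance from the misspelling is 1,
# which is within the cut-off of 3.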
def _levenshtein(a, b):
"""Calculates the Levenshtein distance between a and b."""
n, m = len(a), len(b)
if n > m:
return _levenshtein(b, a)
current = range(n + 1)
for i in range(1, m + 1):
previous, current = current, [i] + [0] * n
for j in range(1, n + 1):
add, delete = previous[j] + 1, current[j - 1] + 1
change = previous[j - 1]
if a[j - 1] != b[i - 1]:
change += 1
current[j] = min(add, delete, change)
return current[n]
class ConfigSchema(collections.OrderedDict):
"""Logical group of config values that correspond to a config section.
Schemas are set up by assigning config keys with config values to
instances. Once set up, :meth:`deserialize` can be called with a dict of
values to process. For convenience we also provide a :meth:`format` method
that can be used for converting the values to a dict that can be printed, and
:meth:`serialize` for converting the values to a form suitable for
persistence.
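A minimal sketch (the section name and keys are illustrative)::

    schema = ConfigSchema('example')
    schema['enabled'] = types.Boolean()
    schema['timeout'] = types.Integer(optional=True, minimum=0)
    values, errors = schema.deserialize({'enabled': 'true', 'timeout': ''})
    # values == {'enabled': True, 'timeout': None}; errors == {}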
"""
def __init__(self, name):
super(ConfigSchema, self).__init__()
self.name = name
def deserialize(self, values):
"""Validates the given ``values`` using the config schema.
Returns a tuple with cleaned values and errors.
"""
errors = {}
result = {}
for key, value in values.items():
try:
result[key] = self[key].deserialize(value)
except KeyError: # not in our schema
errors[key] = 'unknown config key.'
suggestion = _did_you_mean(key, self.keys())
if suggestion:
errors[key] += ' Did you mean %s?' % suggestion
except ValueError as e: # deserialization failed
result[key] = None
errors[key] = str(e)
for key in self.keys():
if isinstance(self[key], types.Deprecated):
result.pop(key, None)
elif key not in result and key not in errors:
result[key] = None
errors[key] = 'config key not found.'
return result, errors
def serialize(self, values, display=False):
"""Converts the given ``values`` to a format suitable for persistence.
If ``display`` is :class:`True` secret config values, like passwords,
will be masked out.
Returns a dict of config keys and values."""
result = collections.OrderedDict()
for key in self.keys():
if key in values:
result[key] = self[key].serialize(values[key], display)
return result
class MapConfigSchema(object):
"""Schema for handling multiple unknown keys with the same type.
Does not sub-class :class:`ConfigSchema`, but implements the same
serialize/deserialize interface.
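A minimal sketch, mirroring how the ``loglevels`` section is set up (the
logger name is illustrative)::

    schema = MapConfigSchema('loglevels', types.LogLevel())
    values, errors = schema.deserialize({'mopidy.audio': 'debug'})
    # values == {'mopidy.audio': logging.DEBUG}; errors == {}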
"""
def __init__(self, name, value_type):
self.name = name
self._value_type = value_type
def deserialize(self, values):
errors = {}
result = {}
for key, value in values.items():
try:
result[key] = self._value_type.deserialize(value)
except ValueError as e: # deserialization failed
result[key] = None
errors[key] = str(e)
return result, errors
def serialize(self, values, display=False):
result = collections.OrderedDict()
for key in sorted(values.keys()):
result[key] = self._value_type.serialize(values[key], display)
return result
Mopidy-2.0.0/mopidy/config/default.conf 0000664 0001750 0001750 00000000735 12660436420 020231 0 ustar jodal jodal 0000000 0000000 [core]
cache_dir = $XDG_CACHE_DIR/mopidy
config_dir = $XDG_CONFIG_DIR/mopidy
data_dir = $XDG_DATA_DIR/mopidy
max_tracklist_length = 10000
[logging]
color = true
console_format = %(levelname)-8s %(message)s
debug_format = %(levelname)-8s %(asctime)s [%(process)d:%(threadName)s] %(name)s\n %(message)s
debug_file = mopidy.log
config_file =
[audio]
mixer = software
mixer_volume =
output = autoaudiosink
buffer_time =
[proxy]
scheme =
hostname =
port =
username =
password =
Mopidy-2.0.0/mopidy/config/validators.py 0000664 0001750 0001750 00000002513 12505224626 020455 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
# TODO: add validate regexp?
def validate_required(value, required):
"""Validate that ``value`` is set if ``required``
Normally called in :meth:`~mopidy.config.types.ConfigValue.deserialize` on
the raw string, _not_ the converted value.
"""
if required and not value:
raise ValueError('must be set.')
def validate_choice(value, choices):
"""Validate that ``value`` is one of the ``choices``
Normally called in :meth:`~mopidy.config.types.ConfigValue.deserialize`.
"""
if choices is not None and value not in choices:
names = ', '.join(repr(c) for c in choices)
raise ValueError('must be one of %s, not %s.' % (names, value))
def validate_minimum(value, minimum):
"""Validate that ``value`` is at least ``minimum``
Normally called in :meth:`~mopidy.config.types.ConfigValue.deserialize`.
"""
if minimum is not None and value < minimum:
raise ValueError('%r must be larger than %r.' % (value, minimum))
def validate_maximum(value, maximum):
"""Validate that ``value`` is at most ``maximum``
Normally called in :meth:`~mopidy.config.types.ConfigValue.deserialize`.
"""
if maximum is not None and value > maximum:
raise ValueError('%r must be smaller than %r.' % (value, maximum))
Mopidy-2.0.0/mopidy/config/__init__.py 0000664 0001750 0001750 00000023673 12660436420 020055 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import io
import itertools
import logging
import os.path
import re
from mopidy import compat
from mopidy.compat import configparser
from mopidy.config import keyring
from mopidy.config.schemas import * # noqa
from mopidy.config.types import * # noqa
from mopidy.internal import path, versioning
logger = logging.getLogger(__name__)
_core_schema = ConfigSchema('core')
_core_schema['cache_dir'] = Path()
_core_schema['config_dir'] = Path()
_core_schema['data_dir'] = Path()
# MPD supports at most 10k tracks, some clients segfault when this is exceeded.
_core_schema['max_tracklist_length'] = Integer(minimum=1, maximum=10000)
_logging_schema = ConfigSchema('logging')
_logging_schema['color'] = Boolean()
_logging_schema['console_format'] = String()
_logging_schema['debug_format'] = String()
_logging_schema['debug_file'] = Path()
_logging_schema['config_file'] = Path(optional=True)
_loglevels_schema = MapConfigSchema('loglevels', LogLevel())
_logcolors_schema = MapConfigSchema('logcolors', LogColor())
_audio_schema = ConfigSchema('audio')
_audio_schema['mixer'] = String()
_audio_schema['mixer_track'] = Deprecated()
_audio_schema['mixer_volume'] = Integer(optional=True, minimum=0, maximum=100)
_audio_schema['output'] = String()
_audio_schema['visualizer'] = Deprecated()
_audio_schema['buffer_time'] = Integer(optional=True, minimum=1)
_proxy_schema = ConfigSchema('proxy')
_proxy_schema['scheme'] = String(optional=True,
choices=['http', 'https', 'socks4', 'socks5'])
_proxy_schema['hostname'] = Hostname(optional=True)
_proxy_schema['port'] = Port(optional=True)
_proxy_schema['username'] = String(optional=True)
_proxy_schema['password'] = Secret(optional=True)
# NOTE: if multiple outputs ever become a thing, we will need something like LogLevelConfigSchema
# _outputs_schema = config.AudioOutputConfigSchema()
_schemas = [
_core_schema, _logging_schema, _loglevels_schema, _logcolors_schema,
_audio_schema, _proxy_schema]
_INITIAL_HELP = """
# For further information about options in this file see:
# http://docs.mopidy.com/
#
# The initial commented out values reflect the defaults as of:
# %(versions)s
#
# Available options and defaults might have changed since then,
# run `mopidy config` to see the current effective config and
# `mopidy --version` to check the current version.
"""
def read(config_file):
"""Helper to load config defaults in same way across core and extensions"""
with io.open(config_file, 'rb') as filehandle:
return filehandle.read()
def load(files, ext_schemas, ext_defaults, overrides):
config_dir = os.path.dirname(__file__)
defaults = [read(os.path.join(config_dir, 'default.conf'))]
defaults.extend(ext_defaults)
raw_config = _load(files, defaults, keyring.fetch() + (overrides or []))
schemas = _schemas[:]
schemas.extend(ext_schemas)
return _validate(raw_config, schemas)
def format(config, ext_schemas, comments=None, display=True):
schemas = _schemas[:]
schemas.extend(ext_schemas)
return _format(config, comments or {}, schemas, display, False)
def format_initial(extensions_data):
config_dir = os.path.dirname(__file__)
defaults = [read(os.path.join(config_dir, 'default.conf'))]
defaults.extend(d.extension.get_default_config() for d in extensions_data)
raw_config = _load([], defaults, [])
schemas = _schemas[:]
schemas.extend(d.extension.get_config_schema() for d in extensions_data)
config, errors = _validate(raw_config, schemas)
versions = ['Mopidy %s' % versioning.get_version()]
extensions_data = sorted(
extensions_data, key=lambda d: d.extension.dist_name)
for data in extensions_data:
versions.append('%s %s' % (
data.extension.dist_name, data.extension.version))
header = _INITIAL_HELP.strip() % {'versions': '\n# '.join(versions)}
formatted_config = _format(
config=config, comments={}, schemas=schemas,
display=False, disable=True).decode('utf-8')
return header + '\n\n' + formatted_config
def _load(files, defaults, overrides):
parser = configparser.RawConfigParser()
# TODO: simply return path to config file for defaults so we can load it
# all in the same way?
logger.info('Loading config from builtin defaults')
for default in defaults:
if isinstance(default, compat.text_type):
default = default.encode('utf-8')
parser.readfp(io.BytesIO(default))
# Load config from a series of config files
files = [path.expand_path(f) for f in files]
for name in files:
if os.path.isdir(name):
for filename in os.listdir(name):
filename = os.path.join(name, filename)
if os.path.isfile(filename) and filename.endswith('.conf'):
_load_file(parser, filename)
else:
_load_file(parser, name)
# If there have been parse errors, there is a Python bug that causes the
# values to be lists; this little trick coerces them into strings.
parser.readfp(io.BytesIO())
raw_config = {}
for section in parser.sections():
raw_config[section] = dict(parser.items(section))
logger.info('Loading config from command line options')
for section, key, value in overrides:
raw_config.setdefault(section, {})[key] = value
return raw_config
def _load_file(parser, filename):
if not os.path.exists(filename):
logger.debug(
'Loading config from %s failed; it does not exist', filename)
return
if not os.access(filename, os.R_OK):
logger.warning(
'Loading config from %s failed; read permission missing',
filename)
return
try:
logger.info('Loading config from %s', filename)
with io.open(filename, 'rb') as filehandle:
parser.readfp(filehandle)
except configparser.MissingSectionHeaderError as e:
logger.warning('%s does not have a config section, not loaded.',
filename)
except configparser.ParsingError as e:
linenos = ', '.join(str(lineno) for lineno, line in e.errors)
logger.warning(
'%s has errors, line %s has been ignored.', filename, linenos)
except IOError:
# TODO: if this is the initial load of logging config we might not
# have a logger at this point, we might want to handle this better.
logger.debug('Config file %s not found; skipping', filename)
def _validate(raw_config, schemas):
# Get validated config
config = {}
errors = {}
sections = set(raw_config)
for schema in schemas:
sections.discard(schema.name)
values = raw_config.get(schema.name, {})
result, error = schema.deserialize(values)
if error:
errors[schema.name] = error
if result:
config[schema.name] = result
for section in sections:
logger.debug('Ignoring unknown config section: %s', section)
return config, errors
def _format(config, comments, schemas, display, disable):
output = []
for schema in schemas:
serialized = schema.serialize(
config.get(schema.name, {}), display=display)
if not serialized:
continue
output.append(b'[%s]' % bytes(schema.name))
for key, value in serialized.items():
if isinstance(value, types.DeprecatedValue):
continue
comment = bytes(comments.get(schema.name, {}).get(key, ''))
output.append(b'%s =' % bytes(key))
if value is not None:
output[-1] += b' ' + value
if comment:
output[-1] += b' ; ' + comment.capitalize()
if disable:
output[-1] = re.sub(r'^', b'#', output[-1], flags=re.M)
output.append(b'')
return b'\n'.join(output).strip()
def _preprocess(config_string):
"""Convert a raw config into a form that preserves comments etc."""
results = ['[__COMMENTS__]']
counter = itertools.count(0)
section_re = re.compile(r'^(\[[^\]]+\])\s*(.+)$')
blank_line_re = re.compile(r'^\s*$')
comment_re = re.compile(r'^(#|;)')
inline_comment_re = re.compile(r' ;')
def newlines(match):
return '__BLANK%d__ =' % next(counter)
def comments(match):
if match.group(1) == '#':
return '__HASH%d__ =' % next(counter)
elif match.group(1) == ';':
return '__SEMICOLON%d__ =' % next(counter)
def inlinecomments(match):
return '\n__INLINE%d__ =' % next(counter)
def sections(match):
return '%s\n__SECTION%d__ = %s' % (
match.group(1), next(counter), match.group(2))
for line in config_string.splitlines():
line = blank_line_re.sub(newlines, line)
line = section_re.sub(sections, line)
line = comment_re.sub(comments, line)
line = inline_comment_re.sub(inlinecomments, line)
results.append(line)
return '\n'.join(results)
def _postprocess(config_string):
"""Converts a preprocessed config back to original form."""
flags = re.IGNORECASE | re.MULTILINE
result = re.sub(r'^\[__COMMENTS__\](\n|$)', '', config_string, flags=flags)
result = re.sub(r'\n__INLINE\d+__ =(.*)$', ' ;\g<1>', result, flags=flags)
result = re.sub(r'^__HASH\d+__ =(.*)$', '#\g<1>', result, flags=flags)
result = re.sub(r'^__SEMICOLON\d+__ =(.*)$', ';\g<1>', result, flags=flags)
result = re.sub(r'\n__SECTION\d+__ =(.*)$', '\g<1>', result, flags=flags)
result = re.sub(r'^__BLANK\d+__ =$', '', result, flags=flags)
return result
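# Illustrative round trip for these two helpers (assuming a simple config
# string without a trailing newline):
#
#     raw = '# a comment\n[audio]\noutput = autoaudiosink ; default output'
#     assert _postprocess(_preprocess(raw)) == raw
#
# _preprocess rewrites comments, blank lines and inline comments into
# placeholder keys that configparser will keep, and _postprocess turns the
# placeholders back into their original form.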
class Proxy(collections.Mapping):
def __init__(self, data):
self._data = data
def __getitem__(self, key):
item = self._data.__getitem__(key)
if isinstance(item, dict):
return Proxy(item)
return item
def __iter__(self):
return self._data.__iter__()
def __len__(self):
return self._data.__len__()
def __repr__(self):
return b'Proxy(%r)' % self._data
Mopidy-2.0.0/mopidy/config/keyring.py 0000664 0001750 0001750 00000012701 12614502604 017751 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
logger = logging.getLogger(__name__)
try:
import dbus
except ImportError:
dbus = None
from mopidy import compat
# XXX: Hack to work around an introspection bug caused by gnome-keyring, should be
# fixed by version 3.5 per:
# https://git.gnome.org/browse/gnome-keyring/commit/?id=5dccbe88eb94eea9934e2b7
if dbus:
EMPTY_STRING = dbus.String('', variant_level=1)
else:
EMPTY_STRING = ''
FETCH_ERROR = (
'Fetching passwords from your keyring failed. Any passwords '
'stored in the keyring will not be available.')
def fetch():
if not dbus:
logger.debug('%s (dbus not installed)', FETCH_ERROR)
return []
try:
bus = dbus.SessionBus()
except dbus.exceptions.DBusException as e:
logger.debug('%s (%s)', FETCH_ERROR, e)
return []
if not bus.name_has_owner('org.freedesktop.secrets'):
logger.debug(
'%s (org.freedesktop.secrets service not running)', FETCH_ERROR)
return []
service = _service(bus)
session = service.OpenSession('plain', EMPTY_STRING)[1]
items, locked = service.SearchItems({'service': 'mopidy'})
if not locked and not items:
return []
if locked:
# There is a chance we can unlock without prompting the users...
items, prompt = service.Unlock(locked)
if prompt != '/':
_prompt(bus, prompt).Dismiss()
logger.debug('%s (Keyring is locked)', FETCH_ERROR)
return []
result = []
secrets = service.GetSecrets(items, session, byte_arrays=True)
for item_path, values in secrets.items():
session_path, parameters, value, content_type = values
attrs = _item_attributes(bus, item_path)
result.append((attrs['section'], attrs['key'], bytes(value)))
return result
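# Illustrative: a successful fetch() might return something like
# [(b'spotify', b'password', b's3cret')], which mopidy.config.load() then
# applies as overrides on top of the values read from the config files.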
def set(section, key, value):
"""Store a secret config value for a given section/key.
Returns :class:`True` if storage succeeded, otherwise :class:`False`.
"""
if not dbus:
logger.debug('Saving %s/%s to keyring failed. (dbus not installed)',
section, key)
return False
try:
bus = dbus.SessionBus()
except dbus.exceptions.DBusException as e:
logger.debug('Saving %s/%s to keyring failed. (%s)', section, key, e)
return False
if not bus.name_has_owner('org.freedesktop.secrets'):
logger.debug(
'Saving %s/%s to keyring failed. '
'(org.freedesktop.secrets service not running)',
section, key)
return False
service = _service(bus)
collection = _collection(bus)
if not collection:
return False
if isinstance(value, compat.text_type):
value = value.encode('utf-8')
session = service.OpenSession('plain', EMPTY_STRING)[1]
secret = dbus.Struct((session, '', dbus.ByteArray(value),
'plain/text; charset=utf8'))
label = 'mopidy: %s/%s' % (section, key)
attributes = {'service': 'mopidy', 'section': section, 'key': key}
properties = {'org.freedesktop.Secret.Item.Label': label,
'org.freedesktop.Secret.Item.Attributes': attributes}
try:
item, prompt = collection.CreateItem(properties, secret, True)
except dbus.exceptions.DBusException as e:
# TODO: catch IsLocked errors etc.
logger.debug('Saving %s/%s to keyring failed. (%s)', section, key, e)
return False
if prompt == '/':
return True
_prompt(bus, prompt).Dismiss()
logger.debug('Saving secret %s/%s failed. (Keyring is locked)',
section, key)
return False
def _service(bus):
return _interface(bus, '/org/freedesktop/secrets',
'org.freedesktop.Secret.Service')
# NOTE: depending on versions and setup 'default' might not exist, so try and
# use it but fall back to the 'login' collection, and finally the 'session' one
# if all else fails. We should probably create a keyring/collection setting
# that allows users to set this so they have control over where their secrets
# get stored.
def _collection(bus):
for name in 'aliases/default', 'collection/login', 'collection/session':
path = '/org/freedesktop/secrets/' + name
if _collection_exists(bus, path):
break
else:
return None
return _interface(bus, path, 'org.freedesktop.Secret.Collection')
# NOTE: Hack to probe if a given collection actually exists. Needed to work
# around an introspection bug in setting passwords for non-existent aliases.
def _collection_exists(bus, path):
try:
item = _interface(bus, path, 'org.freedesktop.DBus.Properties')
item.Get('org.freedesktop.Secret.Collection', 'Label')
return True
except dbus.exceptions.DBusException:
return False
# NOTE: We could call prompt.Prompt('') to unlock the keyring when it is not
# '/', but we would then also have to arrange to set up signals to wait until
# this has been completed. So for now we just dismiss the prompt and expect
# keyrings to be unlocked.
def _prompt(bus, path):
return _interface(bus, path, 'Prompt')
def _item_attributes(bus, path):
item = _interface(bus, path, 'org.freedesktop.DBus.Properties')
result = item.Get('org.freedesktop.Secret.Item', 'Attributes')
return dict((bytes(k), bytes(v)) for k, v in result.items())
def _interface(bus, path, interface):
obj = bus.get_object('org.freedesktop.secrets', path)
return dbus.Interface(obj, interface)
Mopidy-2.0.0/mopidy/models/ 0000775 0001750 0001750 00000000000 12660436443 015754 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/models/immutable.py 0000664 0001750 0001750 00000016252 12660436420 020306 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import copy
import itertools
import weakref
from mopidy.internal import deprecation
from mopidy.models.fields import Field
class ImmutableObject(object):
"""
Superclass for immutable objects whose fields can only be modified via the
constructor.
This version of this class has been retained to avoid breaking any clients
relying on its behavior. Internally in Mopidy we now use
:class:`ValidatedImmutableObject` for type safety and its much smaller
memory footprint.
:param kwargs: kwargs to set as fields on the object
:type kwargs: any
"""
# Any sub-classes that don't set slots won't be affected by the base using
# slots as they will still get an instance dict.
__slots__ = ['__weakref__']
def __init__(self, *args, **kwargs):
for key, value in kwargs.items():
if not self._is_valid_field(key):
raise TypeError(
'__init__() got an unexpected keyword argument "%s"' % key)
self._set_field(key, value)
def __setattr__(self, name, value):
if name.startswith('_'):
object.__setattr__(self, name, value)
else:
raise AttributeError('Object is immutable.')
def __delattr__(self, name):
if name.startswith('_'):
object.__delattr__(self, name)
else:
raise AttributeError('Object is immutable.')
def _is_valid_field(self, name):
return hasattr(self, name) and not callable(getattr(self, name))
def _set_field(self, name, value):
if value == getattr(self.__class__, name):
self.__dict__.pop(name, None)
else:
self.__dict__[name] = value
def _items(self):
return self.__dict__.iteritems()
def __repr__(self):
kwarg_pairs = []
for key, value in sorted(self._items()):
if isinstance(value, (frozenset, tuple)):
if not value:
continue
value = list(value)
kwarg_pairs.append('%s=%s' % (key, repr(value)))
return '%(classname)s(%(kwargs)s)' % {
'classname': self.__class__.__name__,
'kwargs': ', '.join(kwarg_pairs),
}
def __hash__(self):
hash_sum = 0
for key, value in self._items():
hash_sum += hash(key) + hash(value)
return hash_sum
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
return all(a == b for a, b in itertools.izip_longest(
self._items(), other._items(), fillvalue=object()))
def __ne__(self, other):
return not self.__eq__(other)
def copy(self, **values):
"""
.. deprecated:: 1.1
Use :meth:`replace` instead.
"""
deprecation.warn('model.immutable.copy')
return self.replace(**values)
def replace(self, **kwargs):
"""
Replace the fields in the model and return a new instance
Examples::
# Returns a track with a new name
Track(name='foo').replace(name='bar')
# Return an album with a new number of tracks
Album(num_tracks=2).replace(num_tracks=5)
:param kwargs: kwargs to set as fields on the object
:type kwargs: any
:rtype: instance of the model with replaced fields
"""
other = copy.copy(self)
for key, value in kwargs.items():
if not self._is_valid_field(key):
raise TypeError(
'replace() got an unexpected keyword argument "%s"' % key)
other._set_field(key, value)
return other
def serialize(self):
data = {}
data['__model__'] = self.__class__.__name__
for key, value in self._items():
if isinstance(value, (set, frozenset, list, tuple)):
value = [
v.serialize() if isinstance(v, ImmutableObject) else v
for v in value]
elif isinstance(value, ImmutableObject):
value = value.serialize()
if not (isinstance(value, list) and len(value) == 0):
data[key] = value
return data
class _ValidatedImmutableObjectMeta(type):
"""Helper that initializes fields, slots and memoizes instance creation."""
def __new__(cls, name, bases, attrs):
fields = {}
for base in bases: # Copy parent fields over to our state
fields.update(getattr(base, '_fields', {}))
for key, value in attrs.items(): # Add our own fields
if isinstance(value, Field):
fields[key] = '_' + key
value._name = key
attrs['_fields'] = fields
attrs['_instances'] = weakref.WeakValueDictionary()
attrs['__slots__'] = list(attrs.get('__slots__', [])) + fields.values()
return super(_ValidatedImmutableObjectMeta, cls).__new__(
cls, name, bases, attrs)
def __call__(cls, *args, **kwargs): # noqa: N805
instance = super(_ValidatedImmutableObjectMeta, cls).__call__(
*args, **kwargs)
return cls._instances.setdefault(weakref.ref(instance), instance)
class ValidatedImmutableObject(ImmutableObject):
"""
Superclass for immutable objects whose fields can only be modified via the
constructor. Fields should be :class:`Field` instances to ensure type
safety in our models.
Note that since these models can not be changed, we heavily memoize them
to save memory. So constructing a class with the same arguments twice will
give you the same instance twice.
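Example (illustrative)::

    track1 = Track(uri='dummy:a', name='foo')
    track2 = Track(uri='dummy:a', name='foo')
    track1 is track2  # => True, both names refer to the same instance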
"""
__metaclass__ = _ValidatedImmutableObjectMeta
__slots__ = ['_hash']
def __hash__(self):
if not hasattr(self, '_hash'):
hash_sum = super(ValidatedImmutableObject, self).__hash__()
object.__setattr__(self, '_hash', hash_sum)
return self._hash
def _is_valid_field(self, name):
return name in self._fields
def _set_field(self, name, value):
object.__setattr__(self, name, value)
def _items(self):
for field, key in self._fields.items():
if hasattr(self, key):
yield field, getattr(self, key)
def replace(self, **kwargs):
"""
Replace the fields in the model and return a new instance
Examples::
# Returns a track with a new name
Track(name='foo').replace(name='bar')
# Return an album with a new number of tracks
Album(num_tracks=2).replace(num_tracks=5)
Note that internally we memoize heavily to keep memory usage down given
our overly repetitive data structures. So you might get an existing
instance if it contains the same values.
:param kwargs: kwargs to set as fields on the object
:type kwargs: any
:rtype: instance of the model with replaced fields
"""
if not kwargs:
return self
other = super(ValidatedImmutableObject, self).replace(**kwargs)
if hasattr(self, '_hash'):
object.__delattr__(other, '_hash')
return self._instances.setdefault(weakref.ref(other), other)
Mopidy-2.0.0/mopidy/models/__init__.py 0000664 0001750 0001750 00000025175 12660436420 020072 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy import compat
from mopidy.models import fields
from mopidy.models.immutable import ImmutableObject, ValidatedImmutableObject
from mopidy.models.serialize import ModelJSONEncoder, model_json_decoder
__all__ = [
'ImmutableObject', 'Ref', 'Image', 'Artist', 'Album', 'Track', 'TlTrack',
'Playlist', 'SearchResult', 'model_json_decoder', 'ModelJSONEncoder',
'ValidatedImmutableObject']
class Ref(ValidatedImmutableObject):
"""
Model to represent URI references with a human friendly name and type
attached. This is intended for use as a lightweight object "free" of metadata
that can be passed around instead of using full blown models.
:param uri: object URI
:type uri: string
:param name: object name
:type name: string
:param type: object type
:type type: string
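Illustrative example of creating a typed reference via the helper
classmethods below (the URI is hypothetical)::
Ref.track(uri='local:track:song.mp3', name='song')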
"""
#: The object URI. Read-only.
uri = fields.URI()
#: The object name. Read-only.
name = fields.String()
#: The object type, e.g. "artist", "album", "track", "playlist",
#: "directory". Read-only.
type = fields.Identifier() # TODO: consider locking this down.
# type = fields.Field(choices=(ALBUM, ARTIST, DIRECTORY, PLAYLIST, TRACK))
#: Constant used for comparison with the :attr:`type` field.
ALBUM = 'album'
#: Constant used for comparison with the :attr:`type` field.
ARTIST = 'artist'
#: Constant used for comparison with the :attr:`type` field.
DIRECTORY = 'directory'
#: Constant used for comparison with the :attr:`type` field.
PLAYLIST = 'playlist'
#: Constant used for comparison with the :attr:`type` field.
TRACK = 'track'
@classmethod
def album(cls, **kwargs):
"""Create a :class:`Ref` with ``type`` :attr:`ALBUM`."""
kwargs['type'] = Ref.ALBUM
return cls(**kwargs)
@classmethod
def artist(cls, **kwargs):
"""Create a :class:`Ref` with ``type`` :attr:`ARTIST`."""
kwargs['type'] = Ref.ARTIST
return cls(**kwargs)
@classmethod
def directory(cls, **kwargs):
"""Create a :class:`Ref` with ``type`` :attr:`DIRECTORY`."""
kwargs['type'] = Ref.DIRECTORY
return cls(**kwargs)
@classmethod
def playlist(cls, **kwargs):
"""Create a :class:`Ref` with ``type`` :attr:`PLAYLIST`."""
kwargs['type'] = Ref.PLAYLIST
return cls(**kwargs)
@classmethod
def track(cls, **kwargs):
"""Create a :class:`Ref` with ``type`` :attr:`TRACK`."""
kwargs['type'] = Ref.TRACK
return cls(**kwargs)
class Image(ValidatedImmutableObject):
"""
:param string uri: URI of the image
:param int width: Optional width of image or :class:`None`
:param int height: Optional height of image or :class:`None`
"""
#: The image URI. Read-only.
uri = fields.URI()
#: Optional width of the image or :class:`None`. Read-only.
width = fields.Integer(min=0)
#: Optional height of the image or :class:`None`. Read-only.
height = fields.Integer(min=0)
class Artist(ValidatedImmutableObject):
"""
:param uri: artist URI
:type uri: string
:param name: artist name
:type name: string
:param sortname: artist name for sorting
:type sortname: string
:param musicbrainz_id: MusicBrainz ID
:type musicbrainz_id: string
"""
#: The artist URI. Read-only.
uri = fields.URI()
#: The artist name. Read-only.
name = fields.String()
#: Artist name for better sorting, e.g. with articles stripped
sortname = fields.String()
#: The MusicBrainz ID of the artist. Read-only.
musicbrainz_id = fields.Identifier()
class Album(ValidatedImmutableObject):
"""
:param uri: album URI
:type uri: string
:param name: album name
:type name: string
:param artists: album artists
:type artists: list of :class:`Artist`
:param num_tracks: number of tracks in album
:type num_tracks: integer or :class:`None` if unknown
:param num_discs: number of discs in album
:type num_discs: integer or :class:`None` if unknown
:param date: album release date (YYYY or YYYY-MM-DD)
:type date: string
:param musicbrainz_id: MusicBrainz ID
:type musicbrainz_id: string
:param images: album image URIs
:type images: list of strings
.. deprecated:: 1.2
The ``images`` field is deprecated.
Use :meth:`mopidy.core.LibraryController.get_images` instead.
"""
#: The album URI. Read-only.
uri = fields.URI()
#: The album name. Read-only.
name = fields.String()
#: A set of album artists. Read-only.
artists = fields.Collection(type=Artist, container=frozenset)
#: The number of tracks in the album. Read-only.
num_tracks = fields.Integer(min=0)
#: The number of discs in the album. Read-only.
num_discs = fields.Integer(min=0)
#: The album release date. Read-only.
date = fields.Date()
#: The MusicBrainz ID of the album. Read-only.
musicbrainz_id = fields.Identifier()
#: The album image URIs. Read-only.
#:
#: .. deprecated:: 1.2
#: Use :meth:`mopidy.core.LibraryController.get_images` instead.
images = fields.Collection(type=compat.string_types, container=frozenset)
class Track(ValidatedImmutableObject):
"""
:param uri: track URI
:type uri: string
:param name: track name
:type name: string
:param artists: track artists
:type artists: list of :class:`Artist`
:param album: track album
:type album: :class:`Album`
:param composers: track composers
:type composers: list of :class:`Artist`
:param performers: track performers
:type performers: list of :class:`Artist`
:param genre: track genre
:type genre: string
:param track_no: track number in album
:type track_no: integer or :class:`None` if unknown
:param disc_no: disc number in album
:type disc_no: integer or :class:`None` if unknown
:param date: track release date (YYYY or YYYY-MM-DD)
:type date: string
:param length: track length in milliseconds
:type length: integer or :class:`None` if there is no duration
:param bitrate: bitrate in kbit/s
:type bitrate: integer
:param comment: track comment
:type comment: string
:param musicbrainz_id: MusicBrainz ID
:type musicbrainz_id: string
:param last_modified: Represents last modification time
:type last_modified: integer or :class:`None` if unknown
"""
#: The track URI. Read-only.
uri = fields.URI()
#: The track name. Read-only.
name = fields.String()
#: A set of track artists. Read-only.
artists = fields.Collection(type=Artist, container=frozenset)
#: The track :class:`Album`. Read-only.
album = fields.Field(type=Album)
#: A set of track composers. Read-only.
composers = fields.Collection(type=Artist, container=frozenset)
#: A set of track performers. Read-only.
performers = fields.Collection(type=Artist, container=frozenset)
#: The track genre. Read-only.
genre = fields.String()
#: The track number in the album. Read-only.
track_no = fields.Integer(min=0)
#: The disc number in the album. Read-only.
disc_no = fields.Integer(min=0)
#: The track release date. Read-only.
date = fields.Date()
#: The track length in milliseconds. Read-only.
length = fields.Integer(min=0)
#: The track's bitrate in kbit/s. Read-only.
bitrate = fields.Integer(min=0)
#: The track comment. Read-only.
comment = fields.String()
#: The MusicBrainz ID of the track. Read-only.
musicbrainz_id = fields.Identifier()
#: Integer representing when the track was last modified. Exact meaning
#: depends on source of track. For local files this is the modification
#: time in milliseconds since Unix epoch. For other backends it could be an
#: equivalent timestamp or simply a version counter.
last_modified = fields.Integer(min=0)
class TlTrack(ValidatedImmutableObject):
"""
A tracklist track. Wraps a regular track and its tracklist ID.
The use of :class:`TlTrack` allows the same track to appear multiple times
in the tracklist.
This class also accepts its parameters as positional arguments. Both
arguments must be provided, and they must appear in the order they are
listed here.
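For example (illustrative)::
tl_track = TlTrack(17, Track(name='foo'))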
This class also supports iteration, so you can extract its values like this::
(tlid, track) = tl_track
:param tlid: tracklist ID
:type tlid: int
:param track: the track
:type track: :class:`Track`
"""
#: The tracklist ID. Read-only.
tlid = fields.Integer(min=0)
#: The track. Read-only.
track = fields.Field(type=Track)
def __init__(self, *args, **kwargs):
if len(args) == 2 and len(kwargs) == 0:
kwargs['tlid'] = args[0]
kwargs['track'] = args[1]
args = []
super(TlTrack, self).__init__(*args, **kwargs)
def __iter__(self):
return iter([self.tlid, self.track])
class Playlist(ValidatedImmutableObject):
"""
:param uri: playlist URI
:type uri: string
:param name: playlist name
:type name: string
:param tracks: playlist's tracks
:type tracks: list of :class:`Track` elements
:param last_modified:
playlist's modification time in milliseconds since Unix epoch
:type last_modified: int
"""
#: The playlist URI. Read-only.
uri = fields.URI()
#: The playlist name. Read-only.
name = fields.String()
#: The playlist's tracks. Read-only.
tracks = fields.Collection(type=Track, container=tuple)
#: The playlist modification time in milliseconds since Unix epoch.
#: Read-only.
#:
#: Integer, or :class:`None` if unknown.
last_modified = fields.Integer(min=0)
# TODO: def insert(self, pos, track): ... ?
@property
def length(self):
"""The number of tracks in the playlist. Read-only."""
return len(self.tracks)
class SearchResult(ValidatedImmutableObject):
"""
:param uri: search result URI
:type uri: string
:param tracks: matching tracks
:type tracks: list of :class:`Track` elements
:param artists: matching artists
:type artists: list of :class:`Artist` elements
:param albums: matching albums
:type albums: list of :class:`Album` elements
"""
#: The search result URI. Read-only.
uri = fields.URI()
#: The tracks matching the search query. Read-only.
tracks = fields.Collection(type=Track, container=tuple)
#: The artists matching the search query. Read-only.
artists = fields.Collection(type=Artist, container=tuple)
#: The albums matching the search query. Read-only.
albums = fields.Collection(type=Album, container=tuple)
Mopidy-2.0.0/mopidy/models/fields.py 0000664 0001750 0001750 00000012056 12660436420 017573 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy import compat
class Field(object):
"""
Base field for use in
:class:`~mopidy.models.immutable.ValidatedImmutableObject`. These fields
are responsible for type checking and other data sanitation in our models.
For simplicity fields use the Python descriptor protocol to store the
values in the instance dictionary. Also note that fields are mutable if
the object they are attached to allows it.
Default values will be validated with the exception of :class:`None`.
:param default: default value for field
:param type: if set the field value must be of this type
:param choices: if set the field value must be one of these
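Illustrative sketch of declaring a field on a model (the model name
below is hypothetical)::
class MyModel(ValidatedImmutableObject):
name = Field(type=compat.string_types, default='unknown')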
"""
def __init__(self, default=None, type=None, choices=None):
self._name = None # Set by ValidatedImmutableObjectMeta
self._choices = choices
self._default = default
self._type = type
if self._default is not None:
self.validate(self._default)
def validate(self, value):
"""Validate and possibly modify the field value before assignment"""
if self._type and not isinstance(value, self._type):
raise TypeError('Expected %s to be a %s, not %r' %
(self._name, self._type, value))
if self._choices and value not in self._choices:
raise TypeError('Expected %s to be one of %s, not %r' %
(self._name, self._choices, value))
return value
def __get__(self, instance, owner):
if not instance:
return self
return getattr(instance, '_' + self._name, self._default)
def __set__(self, instance, value):
if value is not None:
value = self.validate(value)
if value is None or value == self._default:
self.__delete__(instance)
else:
setattr(instance, '_' + self._name, value)
def __delete__(self, instance):
if hasattr(instance, '_' + self._name):
delattr(instance, '_' + self._name)
class String(Field):
"""
Specialized :class:`Field` which is wired up for bytes and unicode.
:param default: default value for field
"""
def __init__(self, default=None):
# TODO: normalize to unicode?
# TODO: only allow unicode?
# TODO: disallow empty strings?
super(String, self).__init__(type=compat.string_types, default=default)
class Date(String):
"""
:class:`Field` for storing ISO 8601 dates as a string.
Supported formats are ``YYYY-MM-DD``, ``YYYY-MM`` and ``YYYY``, currently
not validated.
:param default: default value for field
"""
pass # TODO: make this check for YYYY-MM-DD, YYYY-MM, YYYY using strptime.
class Identifier(String):
"""
:class:`Field` for storing ASCII values such as GUIDs or other identifiers.
Values will be interned.
:param default: default value for field
"""
def validate(self, value):
return compat.intern(str(super(Identifier, self).validate(value)))
class URI(Identifier):
"""
:class:`Field` for storing URIs.
Values will be interned, currently not validated.
:param default: default value for field
"""
pass # TODO: validate URIs?
class Integer(Field):
"""
:class:`Field` for storing integer numbers.
:param default: default value for field
:param min: field value must be larger or equal to this value when set
:param max: field value must be smaller or equal to this value when set
"""
def __init__(self, default=None, min=None, max=None):
self._min = min
self._max = max
super(Integer, self).__init__(
type=compat.integer_types, default=default)
def validate(self, value):
value = super(Integer, self).validate(value)
if self._min is not None and value < self._min:
raise ValueError('Expected %s to be at least %d, not %d' %
(self._name, self._min, value))
if self._max is not None and value > self._max:
raise ValueError('Expected %s to be at most %d, not %d' %
(self._name, self._max, value))
return value
class Collection(Field):
"""
:class:`Field` for storing collections of a given type.
:param type: all items stored in the collection must be of this type
:param container: the type to store the items in
"""
def __init__(self, type, container=tuple):
super(Collection, self).__init__(type=type, default=container())
def validate(self, value):
if isinstance(value, compat.string_types):
raise TypeError('Expected %s to be a collection of %s, not %r'
% (self._name, self._type.__name__, value))
for v in value:
if not isinstance(v, self._type):
raise TypeError('Expected %s to be a collection of %s, not %r'
% (self._name, self._type.__name__, value))
return self._default.__class__(value) or None
Mopidy-2.0.0/mopidy/models/serialize.py 0000664 0001750 0001750 00000002236 12575004517 020316 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import json
from mopidy.models import immutable
_MODELS = ['Ref', 'Artist', 'Album', 'Track', 'TlTrack', 'Playlist']
class ModelJSONEncoder(json.JSONEncoder):
"""
Automatically serialize Mopidy models to JSON.
Usage::
>>> import json
>>> json.dumps({'a_track': Track(name='name')}, cls=ModelJSONEncoder)
'{"a_track": {"__model__": "Track", "name": "name"}}'
"""
def default(self, obj):
if isinstance(obj, immutable.ImmutableObject):
return obj.serialize()
return json.JSONEncoder.default(self, obj)
def model_json_decoder(dct):
"""
Automatically deserialize Mopidy models from JSON.
Usage::
>>> import json
>>> json.loads(
... '{"a_track": {"__model__": "Track", "name": "name"}}',
... object_hook=model_json_decoder)
{u'a_track': Track(artists=[], name=u'name')}
"""
if '__model__' in dct:
from mopidy import models
model_name = dct.pop('__model__')
if model_name in _MODELS:
return getattr(models, model_name)(**dct)
return dct
Mopidy-2.0.0/mopidy/stream/ 0000775 0001750 0001750 00000000000 12660436443 015764 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/stream/__init__.py 0000664 0001750 0001750 00000001551 12505224626 020073 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import os
import mopidy
from mopidy import config, ext
class Extension(ext.Extension):
dist_name = 'Mopidy-Stream'
ext_name = 'stream'
version = mopidy.__version__
def get_default_config(self):
conf_file = os.path.join(os.path.dirname(__file__), 'ext.conf')
return config.read(conf_file)
def get_config_schema(self):
schema = super(Extension, self).get_config_schema()
schema['protocols'] = config.List()
schema['metadata_blacklist'] = config.List(optional=True)
schema['timeout'] = config.Integer(
minimum=1000, maximum=1000 * 60 * 60)
return schema
def validate_environment(self):
pass
def setup(self, registry):
from .actor import StreamBackend
registry.add('backend', StreamBackend)
Mopidy-2.0.0/mopidy/stream/ext.conf 0000664 0001750 0001750 00000000177 12575004517 017436 0 ustar jodal jodal 0000000 0000000 [stream]
enabled = true
protocols =
http
https
mms
rtmp
rtmps
rtsp
timeout = 5000
metadata_blacklist =
Mopidy-2.0.0/mopidy/stream/actor.py 0000664 0001750 0001750 00000012716 12660436420 017450 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import fnmatch
import logging
import re
import time
import pykka
from mopidy import audio as audio_lib, backend, exceptions, stream
from mopidy.audio import scan, tags
from mopidy.compat import urllib
from mopidy.internal import http, playlists
from mopidy.models import Track
logger = logging.getLogger(__name__)
class StreamBackend(pykka.ThreadingActor, backend.Backend):
def __init__(self, config, audio):
super(StreamBackend, self).__init__()
self._scanner = scan.Scanner(
timeout=config['stream']['timeout'],
proxy_config=config['proxy'])
self._session = http.get_requests_session(
proxy_config=config['proxy'],
user_agent='%s/%s' % (
stream.Extension.dist_name, stream.Extension.version))
blacklist = config['stream']['metadata_blacklist']
self._blacklist_re = re.compile(
r'^(%s)$' % '|'.join(fnmatch.translate(u) for u in blacklist))
self._timeout = config['stream']['timeout']
self.library = StreamLibraryProvider(backend=self)
self.playback = StreamPlaybackProvider(audio=audio, backend=self)
self.playlists = None
self.uri_schemes = audio_lib.supported_uri_schemes(
config['stream']['protocols'])
if 'file' in self.uri_schemes and config['file']['enabled']:
logger.warning(
'The stream/protocols config value includes the "file" '
'protocol. "file" playback is now handled by Mopidy-File. '
'Please remove it from the stream/protocols config.')
self.uri_schemes -= {'file'}
class StreamLibraryProvider(backend.LibraryProvider):
def lookup(self, uri):
if urllib.parse.urlsplit(uri).scheme not in self.backend.uri_schemes:
return []
if self.backend._blacklist_re.match(uri):
logger.debug('URI matched metadata lookup blacklist: %s', uri)
return [Track(uri=uri)]
_, scan_result = _unwrap_stream(
uri, timeout=self.backend._timeout, scanner=self.backend._scanner,
requests_session=self.backend._session)
if scan_result:
track = tags.convert_tags_to_track(scan_result.tags).replace(
uri=uri, length=scan_result.duration)
else:
logger.warning('Problem looking up %s.', uri)
track = Track(uri=uri)
return [track]
class StreamPlaybackProvider(backend.PlaybackProvider):
def translate_uri(self, uri):
if urllib.parse.urlsplit(uri).scheme not in self.backend.uri_schemes:
return None
if self.backend._blacklist_re.match(uri):
logger.debug('URI matched metadata lookup blacklist: %s', uri)
return uri
unwrapped_uri, _ = _unwrap_stream(
uri, timeout=self.backend._timeout, scanner=self.backend._scanner,
requests_session=self.backend._session)
return unwrapped_uri
# TODO: cleanup the return value of this.
def _unwrap_stream(uri, timeout, scanner, requests_session):
"""
Get a stream URI from a playlist URI, ``uri``.
Unwraps nested playlists until something that's not a playlist is found or
the ``timeout`` is reached.
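Returns a ``(uri, scan_result)`` two-tuple; either element may be
:class:`None` if unwrapping, scanning or downloading fails.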
"""
original_uri = uri
seen_uris = set()
deadline = time.time() + timeout
while time.time() < deadline:
if uri in seen_uris:
logger.info(
'Unwrapping stream from URI (%s) failed: '
'playlist referenced itself', uri)
return None, None
else:
seen_uris.add(uri)
logger.debug('Unwrapping stream from URI: %s', uri)
try:
scan_timeout = deadline - time.time()
if scan_timeout < 0:
logger.info(
'Unwrapping stream from URI (%s) failed: '
'timed out in %sms', uri, timeout)
return None, None
scan_result = scanner.scan(uri, timeout=scan_timeout)
except exceptions.ScannerError as exc:
logger.debug('GStreamer failed scanning URI (%s): %s', uri, exc)
scan_result = None
if scan_result is not None:
if scan_result.playable or (
not scan_result.mime.startswith('text/') and
not scan_result.mime.startswith('application/')
):
logger.debug(
'Unwrapped potential %s stream: %s', scan_result.mime, uri)
return uri, scan_result
download_timeout = deadline - time.time()
if download_timeout < 0:
logger.info(
'Unwrapping stream from URI (%s) failed: timed out in %sms',
uri, timeout)
return None, None
content = http.download(
requests_session, uri, timeout=download_timeout)
if content is None:
logger.info(
'Unwrapping stream from URI (%s) failed: '
'error downloading URI %s', original_uri, uri)
return None, None
uris = playlists.parse(content)
if not uris:
logger.debug(
'Failed parsing URI (%s) as playlist; found potential stream.',
uri)
return uri, None
# TODO Test streams and return first that seems to be playable
logger.debug(
'Parsed playlist (%s) and found new URI: %s', uri, uris[0])
uri = uris[0]
Mopidy-2.0.0/mopidy/__init__.py 0000664 0001750 0001750 00000000545 12660436420 016601 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, print_function, unicode_literals
import platform
import sys
import warnings
if not (2, 7) <= sys.version_info < (3,):
sys.exit(
'ERROR: Mopidy requires Python 2.7, but found %s.' %
platform.python_version())
warnings.filterwarnings('ignore', 'could not open display')
__version__ = '2.0.0'
Mopidy-2.0.0/mopidy/local/ 0000775 0001750 0001750 00000000000 12660436443 015563 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/local/playback.py 0000664 0001750 0001750 00000000502 12575004517 017716 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy import backend
from mopidy.local import translator
class LocalPlaybackProvider(backend.PlaybackProvider):
def translate_uri(self, uri):
return translator.local_uri_to_file_uri(
uri, self.backend.config['local']['media_dir'])
Mopidy-2.0.0/mopidy/local/translator.py 0000664 0001750 0001750 00000003114 12660436420 020320 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
import os
import urllib
from mopidy import compat
from mopidy.internal import path
logger = logging.getLogger(__name__)
def local_uri_to_file_uri(uri, media_dir):
"""Convert local track or directory URI to file URI."""
return path_to_file_uri(local_uri_to_path(uri, media_dir))
def local_uri_to_path(uri, media_dir):
"""Convert local track or directory URI to absolute path."""
if (
not uri.startswith('local:directory:') and
not uri.startswith('local:track:')):
raise ValueError('Invalid URI.')
file_path = path.uri_to_path(uri).split(b':', 1)[1]
return os.path.join(media_dir, file_path)
def local_track_uri_to_path(uri, media_dir):
# Deprecated version to keep old versions of Mopidy-Local-Sqlite working.
return local_uri_to_path(uri, media_dir)
def path_to_file_uri(abspath):
"""Convert absolute path to file URI."""
# Re-export internal method for use by Mopidy-Local-* extensions.
return path.path_to_uri(abspath)
def path_to_local_track_uri(relpath):
"""Convert path relative to :confval:`local/media_dir` to local track
URI."""
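# Example (illustrative): path_to_local_track_uri(b'foo bar.mp3')
# returns 'local:track:foo%20bar.mp3'.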
if isinstance(relpath, compat.text_type):
relpath = relpath.encode('utf-8')
return 'local:track:%s' % urllib.quote(relpath)
def path_to_local_directory_uri(relpath):
"""Convert path relative to :confval:`local/media_dir` directory URI."""
if isinstance(relpath, compat.text_type):
relpath = relpath.encode('utf-8')
return 'local:directory:%s' % urllib.quote(relpath)
Mopidy-2.0.0/mopidy/local/storage.py 0000664 0001750 0001750 00000000512 12575504731 017600 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
import os
logger = logging.getLogger(__name__)
def check_dirs_and_files(config):
if not os.path.isdir(config['local']['media_dir']):
logger.warning(
'Local media dir %s does not exist.' %
config['local']['media_dir'])
Mopidy-2.0.0/mopidy/local/__init__.py 0000664 0001750 0001750 00000016653 12660436420 017702 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
import os
import mopidy
from mopidy import config, ext, models
logger = logging.getLogger(__name__)
class Extension(ext.Extension):
dist_name = 'Mopidy-Local'
ext_name = 'local'
version = mopidy.__version__
def get_default_config(self):
conf_file = os.path.join(os.path.dirname(__file__), 'ext.conf')
return config.read(conf_file)
def get_config_schema(self):
schema = super(Extension, self).get_config_schema()
schema['library'] = config.String()
schema['media_dir'] = config.Path()
schema['data_dir'] = config.Deprecated()
schema['playlists_dir'] = config.Deprecated()
schema['tag_cache_file'] = config.Deprecated()
schema['scan_timeout'] = config.Integer(
minimum=1000, maximum=1000 * 60 * 60)
schema['scan_flush_threshold'] = config.Integer(minimum=0)
schema['scan_follow_symlinks'] = config.Boolean()
schema['excluded_file_extensions'] = config.List(optional=True)
return schema
def setup(self, registry):
from .actor import LocalBackend
from .json import JsonLibrary
LocalBackend.libraries = registry['local:library']
registry.add('backend', LocalBackend)
registry.add('local:library', JsonLibrary)
def get_command(self):
from .commands import LocalCommand
return LocalCommand()
class Library(object):
"""
Local library interface.
Extensions that wish to provide an alternate local library storage backend
need to sub-class this class and install and configure it with an
extension. Both scanning and library calls will use the active local
library.
:param config: Config dictionary
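A minimal sketch of an alternate library implementation (the class and
method bodies below are hypothetical)::
class MyLibrary(Library):
name = 'mylib'
def lookup(self, uri):
return []
# Registered from the providing extension's setup() method:
# registry.add('local:library', MyLibrary)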
"""
ROOT_DIRECTORY_URI = 'local:directory'
"""
URI of the local backend's root directory.
This constant should be used by libraries implementing the
:meth:`Library.browse` method.
"""
#: Name of the local library implementation, must be overridden.
name = None
#: Feature marker to indicate that you want :meth:`add()` calls to be
#: called with optional arguments tags and duration.
add_supports_tags_and_duration = False
def __init__(self, config):
self._config = config
def browse(self, uri):
"""
Browse directories and tracks at the given URI.
The URI for the root directory is a constant available at
:attr:`Library.ROOT_DIRECTORY_URI`.
:param string uri: URI to browse.
:rtype: List of :class:`~mopidy.models.Ref` tracks and directories.
"""
raise NotImplementedError
def get_distinct(self, field, query=None):
"""
List distinct values for a given field from the library.
:param string field: One of ``artist``, ``albumartist``, ``album``,
``composer``, ``performer``, ``date`` or ``genre``.
:param dict query: Query to use for limiting results, see
:meth:`search` for details about the query format.
:rtype: set of values corresponding to the requested field type.
"""
return set()
def get_images(self, uris):
"""
Lookup the images for the given URIs.
The default implementation will simply call :meth:`lookup` and
try and use the album art for any tracks returned. Most local
libraries should replace this with something smarter or simply
return an empty dictionary.
:param list uris: list of URIs to find images for
:rtype: {uri: tuple of :class:`mopidy.models.Image`}
"""
result = {}
for uri in uris:
image_uris = set()
tracks = self.lookup(uri)
# local libraries may return single track
if isinstance(tracks, models.Track):
tracks = [tracks]
for track in tracks:
if track.album and track.album.images:
image_uris.update(track.album.images)
result[uri] = [models.Image(uri=u) for u in image_uris]
return result
def load(self):
"""
(Re)load any tracks stored in memory, if any; otherwise just return the
number of tracks currently available. Will be called at startup for both
library and update use cases, so if you plan to store tracks in memory,
this is when they should be (re)loaded.
:rtype: :class:`int` representing number of tracks in library.
"""
return 0
def lookup(self, uri):
"""
Lookup the given URI.
:param string uri: track URI
:rtype: list of :class:`~mopidy.models.Track` (or single
:class:`~mopidy.models.Track` for backward compatibility)
"""
raise NotImplementedError
# TODO: remove uris, replacing it with support in query language.
# TODO: remove exact, replacing it with support in query language.
def search(self, query=None, limit=100, offset=0, exact=False, uris=None):
"""
Search the library for tracks where ``field`` contains ``values``.
:param dict query: one or more queries to search for
:param int limit: maximum number of results to return
:param int offset: offset into result set to use.
:param bool exact: whether to look for exact matches
:param uris: zero or more URI roots to limit the search to
:type uris: list of strings or :class:`None`
:rtype: :class:`~mopidy.models.SearchResult`
"""
raise NotImplementedError
# TODO: add file browsing support.
# Remaining methods are used for the update process.
def begin(self):
"""
Prepare library for accepting updates. Exactly what this means is
highly implementation dependent. This must, however, return an iterator
that generates all tracks in the library for efficient scanning.
:rtype: :class:`~mopidy.models.Track` iterator
"""
raise NotImplementedError
def add(self, track, tags=None, duration=None):
"""
Add the given track to library. Optional args will only be added if
:attr:`add_supports_tags_and_duration` has been set.
:param track: Track to add to the library
:type track: :class:`~mopidy.models.Track`
:param tags: All the tags the scanner found for the media. See
:mod:`mopidy.audio.utils` for details about the tags.
:type tags: dictionary of tag keys with a list of values.
:param duration: Duration of media in milliseconds or :class:`None` if
unknown
:type duration: :class:`int` or :class:`None`
"""
raise NotImplementedError
def remove(self, uri):
"""
Remove the given track from the library.
:param str uri: URI to remove from the library.
"""
raise NotImplementedError
def flush(self):
"""
Called for every n-th track indicating that work should be committed.
Sub-classes are free to ignore these hints.
:rtype: Boolean indicating if state was flushed.
"""
return False
def close(self):
"""
Close any resources used for updating, commit outstanding work etc.
"""
pass
def clear(self):
"""
Clear out whatever data storage is used by this backend.
:rtype: Boolean indicating if state was cleared.
"""
return False
Mopidy-2.0.0/mopidy/local/commands.py 0000664 0001750 0001750 00000015460 12660436420 017737 0 ustar jodal jodal 0000000 0000000 from __future__ import (
absolute_import, division, print_function, unicode_literals)
import logging
import os
import time
from mopidy import commands, compat, exceptions
from mopidy.audio import scan, tags
from mopidy.internal import path
from mopidy.local import translator
logger = logging.getLogger(__name__)
MIN_DURATION_MS = 100 # Shortest length of track to include.
def _get_library(args, config):
libraries = dict((l.name, l) for l in args.registry['local:library'])
library_name = config['local']['library']
if library_name not in libraries:
logger.error('Local library %s not found', library_name)
return None
logger.debug('Using %s as the local library', library_name)
return libraries[library_name](config)
class LocalCommand(commands.Command):
def __init__(self):
super(LocalCommand, self).__init__()
self.add_child('scan', ScanCommand())
self.add_child('clear', ClearCommand())
class ClearCommand(commands.Command):
help = 'Clear local media files from the local library.'
def run(self, args, config):
library = _get_library(args, config)
if library is None:
return 1
prompt = '\nAre you sure you want to clear the library? [y/N] '
if compat.input(prompt).lower() != 'y':
print('Clearing library aborted.')
return 0
if library.clear():
print('Library successfully cleared.')
return 0
print('Unable to clear library.')
return 1
class ScanCommand(commands.Command):
help = 'Scan local media files and populate the local library.'
def __init__(self):
super(ScanCommand, self).__init__()
self.add_argument('--limit',
action='store', type=int, dest='limit', default=None,
help='Maximum number of tracks to scan')
self.add_argument('--force',
action='store_true', dest='force', default=False,
help='Force rescan of all media files')
def run(self, args, config):
media_dir = config['local']['media_dir']
scan_timeout = config['local']['scan_timeout']
flush_threshold = config['local']['scan_flush_threshold']
excluded_file_extensions = config['local']['excluded_file_extensions']
excluded_file_extensions = tuple(
bytes(file_ext.lower()) for file_ext in excluded_file_extensions)
library = _get_library(args, config)
if library is None:
return 1
file_mtimes, file_errors = path.find_mtimes(
media_dir, follow=config['local']['scan_follow_symlinks'])
logger.info('Found %d files in media_dir.', len(file_mtimes))
if file_errors:
logger.warning('Encountered %d errors while scanning media_dir.',
len(file_errors))
for name in file_errors:
logger.debug('Scan error %r for %r', file_errors[name], name)
num_tracks = library.load()
logger.info('Checking %d tracks from library.', num_tracks)
uris_to_update = set()
uris_to_remove = set()
uris_in_library = set()
for track in library.begin():
abspath = translator.local_track_uri_to_path(track.uri, media_dir)
mtime = file_mtimes.get(abspath)
if mtime is None:
logger.debug('Missing file %s', track.uri)
uris_to_remove.add(track.uri)
elif mtime > track.last_modified or args.force:
uris_to_update.add(track.uri)
uris_in_library.add(track.uri)
logger.info('Removing %d missing tracks.', len(uris_to_remove))
for uri in uris_to_remove:
library.remove(uri)
for abspath in file_mtimes:
relpath = os.path.relpath(abspath, media_dir)
uri = translator.path_to_local_track_uri(relpath)
if b'/.' in relpath:
logger.debug('Skipped %s: Hidden directory/file.', uri)
elif relpath.lower().endswith(excluded_file_extensions):
logger.debug('Skipped %s: File extension excluded.', uri)
elif uri not in uris_in_library:
uris_to_update.add(uri)
logger.info(
'Found %d tracks which need to be updated.', len(uris_to_update))
logger.info('Scanning...')
uris_to_update = sorted(uris_to_update, key=lambda v: v.lower())
uris_to_update = uris_to_update[:args.limit]
scanner = scan.Scanner(scan_timeout)
progress = _Progress(flush_threshold, len(uris_to_update))
for uri in uris_to_update:
try:
relpath = translator.local_track_uri_to_path(uri, media_dir)
file_uri = path.path_to_uri(os.path.join(media_dir, relpath))
result = scanner.scan(file_uri)
if not result.playable:
logger.warning('Failed %s: No audio found in file.', uri)
elif result.duration < MIN_DURATION_MS:
logger.warning('Failed %s: Track shorter than %dms',
uri, MIN_DURATION_MS)
else:
mtime = file_mtimes.get(os.path.join(media_dir, relpath))
track = tags.convert_tags_to_track(result.tags).replace(
uri=uri, length=result.duration, last_modified=mtime)
if library.add_supports_tags_and_duration:
library.add(
track, tags=result.tags, duration=result.duration)
else:
library.add(track)
logger.debug('Added %s', track.uri)
except exceptions.ScannerError as error:
logger.warning('Failed %s: %s', uri, error)
if progress.increment():
progress.log()
if library.flush():
logger.debug('Progress flushed.')
progress.log()
library.close()
logger.info('Done scanning.')
return 0
class _Progress(object):
def __init__(self, batch_size, total):
self.count = 0
self.batch_size = batch_size
self.total = total
self.start = time.time()
def increment(self):
self.count += 1
return self.batch_size and self.count % self.batch_size == 0
def log(self):
duration = time.time() - self.start
if self.count >= self.total or not self.count:
logger.info('Scanned %d of %d files in %ds.',
self.count, self.total, duration)
else:
remainder = duration / self.count * (self.total - self.count)
logger.info('Scanned %d of %d files in %ds, ~%ds left.',
self.count, self.total, duration, remainder)
Mopidy-2.0.0/mopidy/local/search.py 0000664 0001750 0001750 00000020626 12575004517 017406 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy.models import SearchResult
def find_exact(tracks, query=None, limit=100, offset=0, uris=None):
"""
Filter a list of tracks where ``field`` is ``values``.
:param list tracks: a list of :class:`~mopidy.models.Track`
:param dict query: one or more field/value pairs to search for
:param int limit: maximum number of results to return
:param int offset: offset into result set to use.
:param uris: zero or more URI roots to limit the search to
:type uris: list of strings or :class:`None`
:rtype: :class:`~mopidy.models.SearchResult`
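Example (illustrative)::
find_exact(tracks, query={'artist': ['Nirvana']}, limit=10)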
"""
# TODO Only return results within URI roots given by ``uris``
if query is None:
query = {}
_validate_query(query)
for (field, values) in query.items():
# FIXME this is bound to be slow for large libraries
for value in values:
if field == 'track_no':
q = _convert_to_int(value)
else:
q = value.strip()
def uri_filter(t):
return q == t.uri
def track_name_filter(t):
return q == t.name
def album_filter(t):
return q == getattr(getattr(t, 'album', None), 'name', None)
def artist_filter(t):
return filter(lambda a: q == a.name, t.artists)
def albumartist_filter(t):
return any([
q == a.name for a in getattr(t.album, 'artists', [])])
def composer_filter(t):
return any([q == a.name for a in getattr(t, 'composers', [])])
def performer_filter(t):
return any([q == a.name for a in getattr(t, 'performers', [])])
def track_no_filter(t):
return q == t.track_no
def genre_filter(t):
return (t.genre and q == t.genre)
def date_filter(t):
return q == t.date
def comment_filter(t):
return q == t.comment
def any_filter(t):
return (uri_filter(t) or
track_name_filter(t) or
album_filter(t) or
artist_filter(t) or
albumartist_filter(t) or
composer_filter(t) or
performer_filter(t) or
track_no_filter(t) or
genre_filter(t) or
date_filter(t) or
comment_filter(t))
if field == 'uri':
tracks = filter(uri_filter, tracks)
elif field == 'track_name':
tracks = filter(track_name_filter, tracks)
elif field == 'album':
tracks = filter(album_filter, tracks)
elif field == 'artist':
tracks = filter(artist_filter, tracks)
elif field == 'albumartist':
tracks = filter(albumartist_filter, tracks)
elif field == 'composer':
tracks = filter(composer_filter, tracks)
elif field == 'performer':
tracks = filter(performer_filter, tracks)
elif field == 'track_no':
tracks = filter(track_no_filter, tracks)
elif field == 'genre':
tracks = filter(genre_filter, tracks)
elif field == 'date':
tracks = filter(date_filter, tracks)
elif field == 'comment':
tracks = filter(comment_filter, tracks)
elif field == 'any':
tracks = filter(any_filter, tracks)
else:
raise LookupError('Invalid lookup field: %s' % field)
if limit is None:
tracks = tracks[offset:]
else:
tracks = tracks[offset:offset + limit]
# TODO: add local:search:
return SearchResult(uri='local:search', tracks=tracks)
def search(tracks, query=None, limit=100, offset=0, uris=None):
"""
Filter a list of tracks where ``field`` is like ``values``.
:param list tracks: a list of :class:`~mopidy.models.Track`
:param dict query: one or more field/value pairs to search for
:param int limit: maximum number of results to return
:param int offset: offset into result set to use.
:param uris: zero or more URI roots to limit the search to
:type uris: list of strings or :class:`None`
:rtype: :class:`~mopidy.models.SearchResult`
"""
# TODO Only return results within URI roots given by ``uris``
if query is None:
query = {}
_validate_query(query)
for (field, values) in query.items():
# FIXME this is bound to be slow for large libraries
for value in values:
if field == 'track_no':
q = _convert_to_int(value)
else:
q = value.strip().lower()
def uri_filter(t):
return bool(t.uri and q in t.uri.lower())
def track_name_filter(t):
return bool(t.name and q in t.name.lower())
def album_filter(t):
return bool(t.album and t.album.name and
q in t.album.name.lower())
def artist_filter(t):
return bool(filter(
lambda a: bool(a.name and q in a.name.lower()), t.artists))
def albumartist_filter(t):
return any([a.name and q in a.name.lower()
for a in getattr(t.album, 'artists', [])])
def composer_filter(t):
return any([a.name and q in a.name.lower()
for a in getattr(t, 'composers', [])])
def performer_filter(t):
return any([a.name and q in a.name.lower()
for a in getattr(t, 'performers', [])])
def track_no_filter(t):
return q == t.track_no
def genre_filter(t):
return bool(t.genre and q in t.genre.lower())
def date_filter(t):
return bool(t.date and t.date.startswith(q))
def comment_filter(t):
return bool(t.comment and q in t.comment.lower())
def any_filter(t):
return (uri_filter(t) or
track_name_filter(t) or
album_filter(t) or
artist_filter(t) or
albumartist_filter(t) or
composer_filter(t) or
performer_filter(t) or
track_no_filter(t) or
genre_filter(t) or
date_filter(t) or
comment_filter(t))
if field == 'uri':
tracks = filter(uri_filter, tracks)
elif field == 'track_name':
tracks = filter(track_name_filter, tracks)
elif field == 'album':
tracks = filter(album_filter, tracks)
elif field == 'artist':
tracks = filter(artist_filter, tracks)
elif field == 'albumartist':
tracks = filter(albumartist_filter, tracks)
elif field == 'composer':
tracks = filter(composer_filter, tracks)
elif field == 'performer':
tracks = filter(performer_filter, tracks)
elif field == 'track_no':
tracks = filter(track_no_filter, tracks)
elif field == 'genre':
tracks = filter(genre_filter, tracks)
elif field == 'date':
tracks = filter(date_filter, tracks)
elif field == 'comment':
tracks = filter(comment_filter, tracks)
elif field == 'any':
tracks = filter(any_filter, tracks)
else:
raise LookupError('Invalid lookup field: %s' % field)
if limit is None:
tracks = tracks[offset:]
else:
tracks = tracks[offset:offset + limit]
# TODO: add local:search:
return SearchResult(uri='local:search', tracks=tracks)
def _validate_query(query):
for (_, values) in query.items():
if not values:
raise LookupError('Missing query')
for value in values:
if not value:
raise LookupError('Missing query')
def _convert_to_int(string):
try:
return int(string)
except ValueError:
return object()
Mopidy-2.0.0/mopidy/local/json.py 0000664 0001750 0001750 00000015517 12614502604 017107 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import collections
import gzip
import json
import logging
import os
import re
import sys
import tempfile
import mopidy
from mopidy import compat, local, models
from mopidy.internal import encoding, timer
from mopidy.local import search, storage, translator
logger = logging.getLogger(__name__)
# TODO: move to load and dump in models?
def load_library(json_file):
if not os.path.isfile(json_file):
logger.info(
'No local library metadata cache found at %s. Please run '
'`mopidy local scan` to index your local music library. '
'If you do not have a local music collection, you can disable the '
'local backend to hide this message.',
json_file)
return {}
try:
with gzip.open(json_file, 'rb') as fp:
return json.load(fp, object_hook=models.model_json_decoder)
except (IOError, ValueError) as error:
logger.warning(
'Loading JSON local library failed: %s',
encoding.locale_decode(error))
return {}
def write_library(json_file, data):
data['version'] = mopidy.__version__
directory, basename = os.path.split(json_file)
# TODO: cleanup directory/basename.* files.
tmp = tempfile.NamedTemporaryFile(
prefix=basename + '.', dir=directory, delete=False)
try:
with gzip.GzipFile(fileobj=tmp, mode='wb') as fp:
json.dump(data, fp, cls=models.ModelJSONEncoder,
indent=2, separators=(',', ': '))
os.rename(tmp.name, json_file)
finally:
if os.path.exists(tmp.name):
os.remove(tmp.name)
class _BrowseCache(object):
encoding = sys.getfilesystemencoding()
splitpath_re = re.compile(r'([^/]+)')
def __init__(self, uris):
self._cache = {
local.Library.ROOT_DIRECTORY_URI: collections.OrderedDict()}
for track_uri in uris:
path = translator.local_track_uri_to_path(track_uri, b'/')
parts = self.splitpath_re.findall(
path.decode(self.encoding, 'replace'))
track_ref = models.Ref.track(uri=track_uri, name=parts.pop())
# Look for our parents backwards as this is faster than having to
# do a complete search for each add.
parent_uri = None
child = None
for i in reversed(range(len(parts))):
directory = '/'.join(parts[:i + 1])
uri = translator.path_to_local_directory_uri(directory)
# First dir we process is our parent
if not parent_uri:
parent_uri = uri
# We found ourselves and we exist, done.
if uri in self._cache:
if child:
self._cache[uri][child.uri] = child
break
# Initialize ourselves, store child if present, and add
# ourselves as child for next loop.
self._cache[uri] = collections.OrderedDict()
if child:
self._cache[uri][child.uri] = child
child = models.Ref.directory(uri=uri, name=parts[i])
else:
# Loop completed, so final child needs to be added to root.
if child:
self._cache[
local.Library.ROOT_DIRECTORY_URI][child.uri] = child
# If no parent was set we belong in the root.
if not parent_uri:
parent_uri = local.Library.ROOT_DIRECTORY_URI
self._cache[parent_uri][track_uri] = track_ref
def lookup(self, uri):
return self._cache.get(uri, {}).values()
class JsonLibrary(local.Library):
name = 'json'
def __init__(self, config):
self._tracks = {}
self._browse_cache = None
self._media_dir = config['local']['media_dir']
self._json_file = os.path.join(
local.Extension.get_data_dir(config), b'library.json.gz')
storage.check_dirs_and_files(config)
def browse(self, uri):
if not self._browse_cache:
return []
return self._browse_cache.lookup(uri)
def load(self):
logger.debug('Loading library: %s', self._json_file)
with timer.time_logger('Loading tracks'):
library = load_library(self._json_file)
self._tracks = dict((t.uri, t) for t in library.get('tracks', []))
with timer.time_logger('Building browse cache'):
self._browse_cache = _BrowseCache(sorted(self._tracks.keys()))
return len(self._tracks)
def lookup(self, uri):
try:
return [self._tracks[uri]]
except KeyError:
return []
def get_distinct(self, field, query=None):
if field == 'track':
def distinct(track):
return {track.name}
elif field == 'artist':
def distinct(track):
return {a.name for a in track.artists}
elif field == 'albumartist':
def distinct(track):
album = track.album or models.Album()
return {a.name for a in album.artists}
elif field == 'album':
def distinct(track):
album = track.album or models.Album()
return {album.name}
elif field == 'composer':
def distinct(track):
return {a.name for a in track.composers}
elif field == 'performer':
def distinct(track):
return {a.name for a in track.performers}
elif field == 'date':
def distinct(track):
return {track.date}
elif field == 'genre':
def distinct(track):
return {track.genre}
else:
return set()
distinct_result = set()
search_result = search.search(self._tracks.values(), query, limit=None)
for track in search_result.tracks:
distinct_result.update(distinct(track))
return distinct_result - {None}
def search(self, query=None, limit=100, offset=0, uris=None, exact=False):
tracks = self._tracks.values()
if exact:
return search.find_exact(
tracks, query=query, limit=limit, offset=offset, uris=uris)
else:
return search.search(
tracks, query=query, limit=limit, offset=offset, uris=uris)
def begin(self):
return compat.itervalues(self._tracks)
def add(self, track):
self._tracks[track.uri] = track
def remove(self, uri):
self._tracks.pop(uri, None)
def close(self):
write_library(self._json_file, {'tracks': self._tracks.values()})
def clear(self):
try:
os.remove(self._json_file)
return True
except OSError:
return False
Mopidy-2.0.0/mopidy/local/library.py 0000664 0001750 0001750 00000003250 12575004517 017577 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
from mopidy import backend, local, models
logger = logging.getLogger(__name__)
class LocalLibraryProvider(backend.LibraryProvider):
"""Proxy library that delegates work to our active local library."""
root_directory = models.Ref.directory(
uri=local.Library.ROOT_DIRECTORY_URI, name='Local media')
def __init__(self, backend, library):
super(LocalLibraryProvider, self).__init__(backend)
self._library = library
self.refresh()
def browse(self, uri):
if not self._library:
return []
return self._library.browse(uri)
def get_distinct(self, field, query=None):
if not self._library:
return set()
return self._library.get_distinct(field, query)
def get_images(self, uris):
if not self._library:
return {}
return self._library.get_images(uris)
def refresh(self, uri=None):
if not self._library:
return 0
num_tracks = self._library.load()
logger.info('Loaded %d local tracks using %s',
num_tracks, self._library.name)
def lookup(self, uri):
if not self._library:
return []
tracks = self._library.lookup(uri)
if tracks is None:
logger.debug('Failed to lookup %r', uri)
return []
if isinstance(tracks, models.Track):
tracks = [tracks]
return tracks
def search(self, query=None, uris=None, exact=False):
if not self._library:
return None
return self._library.search(query=query, uris=uris, exact=exact)
Mopidy-2.0.0/mopidy/local/ext.conf 0000664 0001750 0001750 00000000350 12660436420 017223 0 ustar jodal jodal 0000000 0000000 [local]
enabled = true
library = json
media_dir = $XDG_MUSIC_DIR
scan_timeout = 1000
scan_flush_threshold = 100
scan_follow_symlinks = false
excluded_file_extensions =
.directory
.html
.jpeg
.jpg
.log
.nfo
.png
.txt
Mopidy-2.0.0/mopidy/local/actor.py 0000664 0001750 0001750 00000002121 12505224626 017235 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
import pykka
from mopidy import backend
from mopidy.local import storage
from mopidy.local.library import LocalLibraryProvider
from mopidy.local.playback import LocalPlaybackProvider
logger = logging.getLogger(__name__)
class LocalBackend(pykka.ThreadingActor, backend.Backend):
uri_schemes = ['local']
libraries = []
def __init__(self, config, audio):
super(LocalBackend, self).__init__()
self.config = config
storage.check_dirs_and_files(config)
libraries = dict((l.name, l) for l in self.libraries)
library_name = config['local']['library']
if library_name in libraries:
library = libraries[library_name](config)
logger.debug('Using %s as the local library', library_name)
else:
library = None
logger.warning('Local library %s not found', library_name)
self.playback = LocalPlaybackProvider(audio=audio, backend=self)
self.library = LocalLibraryProvider(backend=self, library=library)
Mopidy-2.0.0/mopidy/commands.py 0000664 0001750 0001750 00000036263 12660436420 016651 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, print_function, unicode_literals
import argparse
import collections
import contextlib
import logging
import os
import signal
import sys
import pykka
from mopidy import config as config_lib, exceptions
from mopidy.audio import Audio
from mopidy.core import Core
from mopidy.internal import deps, process, timer, versioning
from mopidy.internal.gi import GLib
logger = logging.getLogger(__name__)
_default_config = []
for base in GLib.get_system_config_dirs() + [GLib.get_user_config_dir()]:
_default_config.append(os.path.join(base, b'mopidy', b'mopidy.conf'))
DEFAULT_CONFIG = b':'.join(_default_config)
def config_files_type(value):
return value.split(b':')
def config_override_type(value):
try:
section, remainder = value.split(b'/', 1)
key, value = remainder.split(b'=', 1)
return (section.strip(), key.strip(), value.strip())
except ValueError:
raise argparse.ArgumentTypeError(
'%s must have the format section/key=value' % value)
class _ParserError(Exception):
def __init__(self, message):
self.message = message
class _HelpError(Exception):
pass
class _ArgumentParser(argparse.ArgumentParser):
def error(self, message):
raise _ParserError(message)
class _HelpAction(argparse.Action):
def __init__(self, option_strings, dest=None, help=None):
super(_HelpAction, self).__init__(
option_strings=option_strings,
dest=dest or argparse.SUPPRESS,
default=argparse.SUPPRESS,
nargs=0,
help=help)
def __call__(self, parser, namespace, values, option_string=None):
raise _HelpError()
class Command(object):
"""Command parser and runner for building trees of commands.
This class provides a wrapper around :class:`argparse.ArgumentParser`
for handling this type of command line application in a better way than
argparse's own sub-parser handling.
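A hedged usage sketch (the ``hello`` command below is hypothetical, not
part of Mopidy's actual command tree)::
class HelloCommand(Command):
help = 'Print a greeting.'
def run(self, args, config):
print('hello')
return 0
root = Command()
root.add_child('hello', HelloCommand())
args = root.parse(['hello'])
args.command.run(args, config={})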
"""
help = None
#: Help text to display in help output.
def __init__(self):
self._children = collections.OrderedDict()
self._arguments = []
self._overrides = {}
def _build(self):
actions = []
parser = _ArgumentParser(add_help=False)
parser.register('action', 'help', _HelpAction)
for args, kwargs in self._arguments:
actions.append(parser.add_argument(*args, **kwargs))
parser.add_argument('_args', nargs=argparse.REMAINDER,
help=argparse.SUPPRESS)
return parser, actions
def add_child(self, name, command):
"""Add a child parser to consider using.
:param name: name to use for the sub-command that is being added.
:type name: string
:param command: the child command to add
:type command: :class:`Command`
"""
self._children[name] = command
def add_argument(self, *args, **kwargs):
"""Add an argument to the parser.
This method takes all the same arguments as the
:class:`argparse.ArgumentParser` version of this method.
"""
self._arguments.append((args, kwargs))
def set(self, **kwargs):
"""Override a value in the finaly result of parsing."""
self._overrides.update(kwargs)
def exit(self, status_code=0, message=None, usage=None):
"""Optionally print a message and exit."""
print('\n\n'.join(m for m in (usage, message) if m))
sys.exit(status_code)
def format_usage(self, prog=None):
"""Format usage for current parser."""
actions = self._build()[1]
prog = prog or os.path.basename(sys.argv[0])
return self._usage(actions, prog) + '\n'
def _usage(self, actions, prog):
formatter = argparse.HelpFormatter(prog)
formatter.add_usage(None, actions, [])
return formatter.format_help().strip()
def format_help(self, prog=None):
"""Format help for current parser and children."""
actions = self._build()[1]
prog = prog or os.path.basename(sys.argv[0])
formatter = argparse.HelpFormatter(prog)
formatter.add_usage(None, actions, [])
if self.help:
formatter.add_text(self.help)
if actions:
formatter.add_text('OPTIONS:')
formatter.start_section(None)
formatter.add_arguments(actions)
formatter.end_section()
subhelp = []
for name, child in self._children.items():
child._subhelp(name, subhelp)
if subhelp:
formatter.add_text('COMMANDS:')
subhelp.insert(0, '')
return formatter.format_help() + '\n'.join(subhelp)
def _subhelp(self, name, result):
actions = self._build()[1]
if self.help or actions:
formatter = argparse.HelpFormatter(name)
formatter.add_usage(None, actions, [], '')
formatter.start_section(None)
formatter.add_text(self.help)
formatter.start_section(None)
formatter.add_arguments(actions)
formatter.end_section()
formatter.end_section()
result.append(formatter.format_help())
for childname, child in self._children.items():
child._subhelp(' '.join((name, childname)), result)
def parse(self, args, prog=None):
"""Parse command line arguments.
Will recursively parse commands until a final parser is found or an
error occurs. In the case of errors we will print a message and exit.
Otherwise, any overrides are applied and the current parser stored
in the command attribute of the return value.
:param args: list of arguments to parse
:type args: list of strings
:param prog: name to use for program
:type prog: string
:rtype: :class:`argparse.Namespace`
"""
prog = prog or os.path.basename(sys.argv[0])
try:
return self._parse(
args, argparse.Namespace(), self._overrides.copy(), prog)
except _HelpError:
self.exit(0, self.format_help(prog))
def _parse(self, args, namespace, overrides, prog):
overrides.update(self._overrides)
parser, actions = self._build()
try:
result = parser.parse_args(args, namespace)
except _ParserError as e:
self.exit(1, e.message, self._usage(actions, prog))
if not result._args:
for attr, value in overrides.items():
setattr(result, attr, value)
delattr(result, '_args')
result.command = self
return result
child = result._args.pop(0)
if child not in self._children:
usage = self._usage(actions, prog)
self.exit(1, 'unrecognized command: %s' % child, usage)
return self._children[child]._parse(
result._args, result, overrides, ' '.join([prog, child]))
def run(self, *args, **kwargs):
"""Run the command.
Must be implemented by sub-classes that are not simply an intermediate
in the command namespace.
"""
raise NotImplementedError
@contextlib.contextmanager
def _actor_error_handling(name):
try:
yield
except exceptions.BackendError as exc:
logger.error(
'Backend (%s) initialization error: %s', name, exc.message)
except exceptions.FrontendError as exc:
logger.error(
'Frontend (%s) initialization error: %s', name, exc.message)
except exceptions.MixerError as exc:
logger.error(
'Mixer (%s) initialization error: %s', name, exc.message)
except Exception:
logger.exception('Got un-handled exception from %s', name)
# TODO: move out of this utility class
class RootCommand(Command):
def __init__(self):
super(RootCommand, self).__init__()
self.set(base_verbosity_level=0)
self.add_argument(
'-h', '--help',
action='help', help='Show this message and exit')
self.add_argument(
'--version', action='version',
version='Mopidy %s' % versioning.get_version())
self.add_argument(
'-q', '--quiet',
action='store_const', const=-1, dest='verbosity_level',
help='less output (warning level)')
self.add_argument(
'-v', '--verbose',
action='count', dest='verbosity_level', default=0,
help='more output (repeat up to 3 times for even more)')
self.add_argument(
'--save-debug-log',
action='store_true', dest='save_debug_log',
help='save debug log to "./mopidy.log"')
self.add_argument(
'--config',
action='store', dest='config_files', type=config_files_type,
default=DEFAULT_CONFIG, metavar='FILES',
help='config files to use, colon separated, later files override')
self.add_argument(
'-o', '--option',
action='append', dest='config_overrides',
type=config_override_type, metavar='OPTIONS',
help='`section/key=value` values to override config options')
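# Illustrative invocations of the options defined above (paths and values
# are made up):
#
#     mopidy -v --config /etc/mopidy.conf:~/.config/mopidy/mopidy.conf
#     mopidy -o mpd/port=6601 -o audio/mixer=none
#
# Later config files override earlier ones, and each -o override is applied
# on top of the loaded configuration.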
def run(self, args, config):
def on_sigterm(loop):
logger.info('GLib mainloop got SIGTERM. Exiting...')
loop.quit()
loop = GLib.MainLoop()
GLib.unix_signal_add(
GLib.PRIORITY_DEFAULT, signal.SIGTERM, on_sigterm, loop)
mixer_class = self.get_mixer_class(config, args.registry['mixer'])
backend_classes = args.registry['backend']
frontend_classes = args.registry['frontend']
exit_status_code = 0
try:
mixer = None
if mixer_class is not None:
mixer = self.start_mixer(config, mixer_class)
if mixer:
self.configure_mixer(config, mixer)
audio = self.start_audio(config, mixer)
backends = self.start_backends(config, backend_classes, audio)
core = self.start_core(config, mixer, backends, audio)
self.start_frontends(config, frontend_classes, core)
logger.info('Starting GLib mainloop')
loop.run()
except (exceptions.BackendError,
exceptions.FrontendError,
exceptions.MixerError):
logger.info('Initialization error. Exiting...')
exit_status_code = 1
except KeyboardInterrupt:
logger.info('Interrupted. Exiting...')
except Exception:
logger.exception('Uncaught exception')
finally:
loop.quit()
self.stop_frontends(frontend_classes)
self.stop_core()
self.stop_backends(backend_classes)
self.stop_audio()
if mixer_class is not None:
self.stop_mixer(mixer_class)
process.stop_remaining_actors()
return exit_status_code
def get_mixer_class(self, config, mixer_classes):
logger.debug(
'Available Mopidy mixers: %s',
', '.join(m.__name__ for m in mixer_classes) or 'none')
if config['audio']['mixer'] == 'none':
logger.debug('Mixer disabled')
return None
selected_mixers = [
m for m in mixer_classes if m.name == config['audio']['mixer']]
if len(selected_mixers) != 1:
logger.error(
'Did not find unique mixer "%s". Alternatives are: %s',
config['audio']['mixer'],
', '.join([m.name for m in mixer_classes] + ['none']))
process.exit_process()
return selected_mixers[0]
def start_mixer(self, config, mixer_class):
logger.info('Starting Mopidy mixer: %s', mixer_class.__name__)
with _actor_error_handling(mixer_class.__name__):
mixer = mixer_class.start(config=config).proxy()
try:
mixer.ping().get()
return mixer
except pykka.ActorDeadError as exc:
logger.error('Actor died: %s', exc)
return None
def configure_mixer(self, config, mixer):
volume = config['audio']['mixer_volume']
if volume is not None:
mixer.set_volume(volume)
logger.info('Mixer volume set to %d', volume)
else:
logger.debug('Mixer volume left unchanged')
def start_audio(self, config, mixer):
logger.info('Starting Mopidy audio')
return Audio.start(config=config, mixer=mixer).proxy()
def start_backends(self, config, backend_classes, audio):
logger.info(
'Starting Mopidy backends: %s',
', '.join(b.__name__ for b in backend_classes) or 'none')
backends = []
for backend_class in backend_classes:
with _actor_error_handling(backend_class.__name__):
with timer.time_logger(backend_class.__name__):
backend = backend_class.start(
config=config, audio=audio).proxy()
backends.append(backend)
# Block until all on_starts have finished, letting them run in parallel
for backend in backends[:]:
try:
backend.ping().get()
except pykka.ActorDeadError as exc:
backends.remove(backend)
logger.error('Actor died: %s', exc)
return backends
def start_core(self, config, mixer, backends, audio):
logger.info('Starting Mopidy core')
return Core.start(
config=config, mixer=mixer, backends=backends, audio=audio).proxy()
def start_frontends(self, config, frontend_classes, core):
logger.info(
'Starting Mopidy frontends: %s',
', '.join(f.__name__ for f in frontend_classes) or 'none')
for frontend_class in frontend_classes:
with _actor_error_handling(frontend_class.__name__):
with timer.time_logger(frontend_class.__name__):
frontend_class.start(config=config, core=core)
def stop_frontends(self, frontend_classes):
logger.info('Stopping Mopidy frontends')
for frontend_class in frontend_classes:
process.stop_actors_by_class(frontend_class)
def stop_core(self):
logger.info('Stopping Mopidy core')
process.stop_actors_by_class(Core)
def stop_backends(self, backend_classes):
logger.info('Stopping Mopidy backends')
for backend_class in backend_classes:
process.stop_actors_by_class(backend_class)
def stop_audio(self):
logger.info('Stopping Mopidy audio')
process.stop_actors_by_class(Audio)
def stop_mixer(self, mixer_class):
logger.info('Stopping Mopidy mixer')
process.stop_actors_by_class(mixer_class)
class ConfigCommand(Command):
help = 'Show currently active configuration.'
def __init__(self):
super(ConfigCommand, self).__init__()
self.set(base_verbosity_level=-1)
def run(self, config, errors, schemas):
print(config_lib.format(config, schemas, errors))
return 0
class DepsCommand(Command):
help = 'Show dependencies and debug information.'
def __init__(self):
super(DepsCommand, self).__init__()
self.set(base_verbosity_level=-1)
def run(self):
print(deps.format_dependency_list())
return 0
Mopidy-2.0.0/mopidy/mpd/ 0000775 0001750 0001750 00000000000 12660436443 015251 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/mpd/translator.py 0000664 0001750 0001750 00000013564 12653464377 020036 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import datetime
import logging
import re
from mopidy.models import TlTrack
from mopidy.mpd.protocol import tagtype_list
logger = logging.getLogger(__name__)
# TODO: special handling of local:// uri scheme
normalize_path_re = re.compile(r'[^/]+')
def normalize_path(path, relative=False):
parts = normalize_path_re.findall(path or '')
if not relative:
parts.insert(0, '')
return '/'.join(parts)
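# Illustrative examples (values follow from the regex above):
#
#     normalize_path('foo//bar/')                -> '/foo/bar'
#     normalize_path('foo//bar/', relative=True) -> 'foo/bar'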
def track_to_mpd_format(track, position=None, stream_title=None):
"""
Format track for output to MPD client.
:param track: the track
:type track: :class:`mopidy.models.Track` or :class:`mopidy.models.TlTrack`
:param position: track's position in playlist
:type position: integer
:param stream_title: the current stream's title
:type stream_title: string
:rtype: list of two-tuples
"""
if isinstance(track, TlTrack):
(tlid, track) = track
else:
(tlid, track) = (None, track)
if not track.uri:
logger.warning('Ignoring track without uri')
return []
result = [
('file', track.uri),
('Time', track.length and (track.length // 1000) or 0),
('Artist', concat_multi_values(track.artists, 'name')),
('Album', track.album and track.album.name or ''),
]
if stream_title is not None:
result.append(('Title', stream_title))
if track.name:
result.append(('Name', track.name))
else:
result.append(('Title', track.name or ''))
if track.date:
result.append(('Date', track.date))
if track.album is not None and track.album.num_tracks is not None:
result.append(('Track', '%d/%d' % (
track.track_no or 0, track.album.num_tracks)))
else:
result.append(('Track', track.track_no or 0))
if position is not None and tlid is not None:
result.append(('Pos', position))
result.append(('Id', tlid))
if track.album is not None and track.album.musicbrainz_id is not None:
result.append(('MUSICBRAINZ_ALBUMID', track.album.musicbrainz_id))
if track.album is not None and track.album.artists:
result.append(
('AlbumArtist', concat_multi_values(track.album.artists, 'name')))
musicbrainz_ids = concat_multi_values(
track.album.artists, 'musicbrainz_id')
if musicbrainz_ids:
result.append(('MUSICBRAINZ_ALBUMARTISTID', musicbrainz_ids))
if track.artists:
musicbrainz_ids = concat_multi_values(track.artists, 'musicbrainz_id')
if musicbrainz_ids:
result.append(('MUSICBRAINZ_ARTISTID', musicbrainz_ids))
if track.composers:
result.append(
('Composer', concat_multi_values(track.composers, 'name')))
if track.performers:
result.append(
('Performer', concat_multi_values(track.performers, 'name')))
if track.genre:
result.append(('Genre', track.genre))
if track.disc_no:
result.append(('Disc', track.disc_no))
if track.last_modified:
datestring = datetime.datetime.utcfromtimestamp(
track.last_modified // 1000).isoformat()
result.append(('Last-Modified', datestring + 'Z'))
if track.musicbrainz_id is not None:
result.append(('MUSICBRAINZ_TRACKID', track.musicbrainz_id))
if track.album and track.album.uri:
result.append(('X-AlbumUri', track.album.uri))
if track.album and track.album.images:
images = ';'.join(i for i in track.album.images if i != '')
result.append(('X-AlbumImage', images))
result = [element for element in result if _has_value(*element)]
return result
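# Illustrative sketch: for a TlTrack with tlid 7 wrapping a track with
# uri='dummy:a', name='A' and length=90000, called with position=0, the
# result is roughly
#
#     [('file', 'dummy:a'), ('Time', 90), ('Title', 'A'),
#      ('Pos', 0), ('Id', 7)]
#
# after empty tagtypes such as Artist and Album have been filtered out by
# _has_value().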
def _has_value(tagtype, value):
"""
Determine whether to add the tagtype to the output or not.
:param tagtype: the MPD tagtype
:type tagtype: string
:param value: the tag value
:rtype: bool
"""
if tagtype in tagtype_list.TAGTYPE_LIST:
return bool(value)
return True
def concat_multi_values(models, attribute):
"""
Format Mopidy model values for output to MPD client.
:param models: the models
:type models: array of :class:`mopidy.models.Artist`,
:class:`mopidy.models.Album` or :class:`mopidy.models.Track`
:param attribute: the attribute to use
:type attribute: string
:rtype: string
"""
# Don't sort the values. MPD doesn't appear to (or if it does, it's not
# strictly alphabetical). If we just use them in the order in which they
# come in, then the musicbrainz ids have a higher chance of staying in sync.
return ';'.join(
getattr(m, attribute)
for m in models if getattr(m, attribute, None) is not None
)
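# Illustrative sketch: two artists named 'ABBA' and 'Beatles' concatenate to
# 'ABBA;Beatles', while models missing the requested attribute (e.g. no
# musicbrainz_id) are simply skipped.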
def tracks_to_mpd_format(tracks, start=0, end=None):
"""
Format list of tracks for output to MPD client.
Optionally limit output to the slice ``[start:end]`` of the list.
:param tracks: the tracks
:type tracks: list of :class:`mopidy.models.Track` or
:class:`mopidy.models.TlTrack`
:param start: position of first track to include in output
:type start: int (positive or negative)
:param end: position after last track to include in output
:type end: int (positive or negative) or :class:`None` for end of list
:rtype: list of lists of two-tuples
"""
if end is None:
end = len(tracks)
tracks = tracks[start:end]
positions = range(start, end)
assert len(tracks) == len(positions)
result = []
for track, position in zip(tracks, positions):
formatted_track = track_to_mpd_format(track, position)
if formatted_track:
result.append(formatted_track)
return result
def playlist_to_mpd_format(playlist, *args, **kwargs):
"""
Format playlist for output to MPD client.
Arguments as for :func:`tracks_to_mpd_format`, except the first one.
"""
return tracks_to_mpd_format(playlist.tracks, *args, **kwargs)
Mopidy-2.0.0/mopidy/mpd/tokenize.py 0000664 0001750 0001750 00000006272 12505224626 017456 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import re
from mopidy.mpd import exceptions
WORD_RE = re.compile(r"""
^
(\s*) # Leading whitespace not allowed, capture it to report.
([a-z][a-z0-9_]*) # A command name
(?:\s+|$) # trailing whitespace or EOS
(.*) # Possibly a remainder to be parsed
""", re.VERBOSE)
# Quotes matching is an unrolled version of "(?:[^"\\]|\\.)*"
PARAM_RE = re.compile(r"""
^ # Leading whitespace is not allowed
(?:
([^%(unprintable)s"']+) # ord(char) <= 0x20, not ", not '
| # or
"([^"\\]*(?:\\.[^"\\]*)*)" # anything surrounded by quotes
)
(?:\s+|$) # trailing whitespace or EOS
(.*) # Possibly a remainder to be parsed
""" % {'unprintable': ''.join(map(chr, range(0x21)))}, re.VERBOSE)
BAD_QUOTED_PARAM_RE = re.compile(r"""
^
"[^"\\]*(?:\\.[^"\\]*)* # start of a quoted value
(?: # followed by:
("[^\s]) # non-escaped quote, followed by non-whitespace
| # or
([^"]) # anything that is not a quote
)
""", re.VERBOSE)
UNESCAPE_RE = re.compile(r'\\(.)') # Backslash escapes any following char.
def split(line):
"""Splits a line into tokens using same rules as MPD.
- Lines may not start with whitespace
- Tokens are split by an arbitrary amount of spaces or tabs
- First token must match `[a-z][a-z0-9_]*`
- Remaining tokens can be unquoted or quoted tokens.
- Unquoted tokens consist of all printable characters except double quotes,
single quotes, spaces and tabs.
- Quoted tokens are surrounded by a matching pair of double quotes.
- The closing quote must be followed by space, tab or end of line.
- Any value is allowed inside a quoted token, including double quotes,
assuming they are correctly escaped.
- Backslash inside a quoted token is used to escape the following
character.
For examples see the tests for this function.
"""
if not line.strip():
raise exceptions.MpdNoCommand('No command given')
match = WORD_RE.match(line)
if not match:
raise exceptions.MpdUnknownError('Invalid word character')
whitespace, command, remainder = match.groups()
if whitespace:
raise exceptions.MpdUnknownError('Letter expected')
result = [command]
while remainder:
match = PARAM_RE.match(remainder)
if not match:
msg = _determine_error_message(remainder)
raise exceptions.MpdArgError(msg, command=command)
unquoted, quoted, remainder = match.groups()
result.append(unquoted or UNESCAPE_RE.sub(r'\g<1>', quoted))
return result
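# Illustrative examples (expected values follow from the rules above):
#
#     split('status')                        -> ['status']
#     split('find "artist" "Foo \\"Bar\\""') -> ['find', 'artist', 'Foo "Bar"']
#     split(' status')                       raises MpdUnknownError
#     split('')                              raises MpdNoCommand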
def _determine_error_message(remainder):
"""Helper to emulate MPD errors."""
# Following checks are simply to match MPD error messages:
match = BAD_QUOTED_PARAM_RE.match(remainder)
if match:
if match.group(1):
return 'Space expected after closing \'"\''
else:
return 'Missing closing \'"\''
return 'Invalid unquoted character'
Mopidy-2.0.0/mopidy/mpd/dispatcher.py 0000664 0001750 0001750 00000025660 12647257461 017770 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
import re
import pykka
from mopidy.mpd import exceptions, protocol, tokenize
logger = logging.getLogger(__name__)
protocol.load_protocol_modules()
class MpdDispatcher(object):
"""
The MPD session feeds the MPD dispatcher with requests. The dispatcher
finds the correct handler, processes the request and sends the response
back to the MPD session.
"""
_noidle = re.compile(r'^noidle$')
def __init__(self, session=None, config=None, core=None, uri_map=None):
self.config = config
self.authenticated = False
self.command_list_receiving = False
self.command_list_ok = False
self.command_list = []
self.command_list_index = None
self.context = MpdContext(
self, session=session, config=config, core=core, uri_map=uri_map)
def handle_request(self, request, current_command_list_index=None):
"""Dispatch incoming requests to the correct handler."""
self.command_list_index = current_command_list_index
response = []
filter_chain = [
self._catch_mpd_ack_errors_filter,
self._authenticate_filter,
self._command_list_filter,
self._idle_filter,
self._add_ok_filter,
self._call_handler_filter,
]
return self._call_next_filter(request, response, filter_chain)
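# Illustrative sketch of the filter chain: for a plain 'status' request the
# call stack is roughly
#
#     _catch_mpd_ack_errors_filter
#       -> _authenticate_filter
#         -> _command_list_filter
#           -> _idle_filter
#             -> _add_ok_filter
#               -> _call_handler_filter   # runs the protocol handler
#
# and the handler's response bubbles back up, gaining a trailing 'OK' in
# _add_ok_filter unless an ACK error was raised along the way.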
def handle_idle(self, subsystem):
# TODO: validate against mopidy/mpd/protocol/status.SUBSYSTEMS
self.context.events.add(subsystem)
subsystems = self.context.subscriptions.intersection(
self.context.events)
if not subsystems:
return
response = []
for subsystem in subsystems:
response.append('changed: %s' % subsystem)
response.append('OK')
self.context.subscriptions = set()
self.context.events = set()
self.context.session.send_lines(response)
def _call_next_filter(self, request, response, filter_chain):
if filter_chain:
next_filter = filter_chain.pop(0)
return next_filter(request, response, filter_chain)
else:
return response
# Filter: catch MPD ACK errors
def _catch_mpd_ack_errors_filter(self, request, response, filter_chain):
try:
return self._call_next_filter(request, response, filter_chain)
except exceptions.MpdAckError as mpd_ack_error:
if self.command_list_index is not None:
mpd_ack_error.index = self.command_list_index
return [mpd_ack_error.get_mpd_ack()]
# Filter: authenticate
def _authenticate_filter(self, request, response, filter_chain):
if self.authenticated:
return self._call_next_filter(request, response, filter_chain)
elif self.config['mpd']['password'] is None:
self.authenticated = True
return self._call_next_filter(request, response, filter_chain)
else:
command_name = request.split(' ')[0]
command = protocol.commands.handlers.get(command_name)
if command and not command.auth_required:
return self._call_next_filter(request, response, filter_chain)
else:
raise exceptions.MpdPermissionError(command=command_name)
# Filter: command list
def _command_list_filter(self, request, response, filter_chain):
if self._is_receiving_command_list(request):
self.command_list.append(request)
return []
else:
response = self._call_next_filter(request, response, filter_chain)
if (self._is_receiving_command_list(request) or
self._is_processing_command_list(request)):
if response and response[-1] == 'OK':
response = response[:-1]
return response
def _is_receiving_command_list(self, request):
return (
self.command_list_receiving and request != 'command_list_end')
def _is_processing_command_list(self, request):
return (
self.command_list_index is not None and
request != 'command_list_end')
# Filter: idle
def _idle_filter(self, request, response, filter_chain):
if self._is_currently_idle() and not self._noidle.match(request):
logger.debug(
'Client sent us %s, only %s is allowed while in '
'the idle state', repr(request), repr('noidle'))
self.context.session.close()
return []
if not self._is_currently_idle() and self._noidle.match(request):
return [] # noidle was called before idle
response = self._call_next_filter(request, response, filter_chain)
if self._is_currently_idle():
return []
else:
return response
def _is_currently_idle(self):
return bool(self.context.subscriptions)
# Filter: add OK
def _add_ok_filter(self, request, response, filter_chain):
response = self._call_next_filter(request, response, filter_chain)
if not self._has_error(response):
response.append('OK')
return response
def _has_error(self, response):
return response and response[-1].startswith('ACK')
# Filter: call handler
def _call_handler_filter(self, request, response, filter_chain):
try:
response = self._format_response(self._call_handler(request))
return self._call_next_filter(request, response, filter_chain)
except pykka.ActorDeadError as e:
logger.warning('Tried to communicate with dead actor.')
raise exceptions.MpdSystemError(e)
def _call_handler(self, request):
tokens = tokenize.split(request)
# TODO: check that blacklist items are valid commands?
blacklist = self.config['mpd'].get('command_blacklist', [])
if tokens and tokens[0] in blacklist:
logger.warning(
'MPD client used blacklisted command: %s', tokens[0])
raise exceptions.MpdDisabled(command=tokens[0])
try:
return protocol.commands.call(tokens, context=self.context)
except exceptions.MpdAckError as exc:
if exc.command is None:
exc.command = tokens[0]
raise
def _format_response(self, response):
formatted_response = []
for element in self._listify_result(response):
formatted_response.extend(self._format_lines(element))
return formatted_response
def _listify_result(self, result):
if result is None:
return []
if isinstance(result, set):
return self._flatten(list(result))
if not isinstance(result, list):
return [result]
return self._flatten(result)
def _flatten(self, the_list):
result = []
for element in the_list:
if isinstance(element, list):
result.extend(self._flatten(element))
else:
result.append(element)
return result
def _format_lines(self, line):
if isinstance(line, dict):
return ['%s: %s' % (key, value) for (key, value) in line.items()]
if isinstance(line, tuple):
(key, value) = line
return ['%s: %s' % (key, value)]
return [line]
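# Illustrative sketch: a handler result like
#
#     [('volume', 100), {'repeat': 0}, 'OK']
#
# is flattened and formatted by _format_response() into
#
#     ['volume: 100', 'repeat: 0', 'OK']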
class MpdContext(object):
"""
This object is passed as the first argument to all MPD command handlers to
give the command handlers access to important parts of Mopidy.
"""
#: The current :class:`MpdDispatcher`.
dispatcher = None
#: The current :class:`mopidy.mpd.MpdSession`.
session = None
#: The MPD password
password = None
#: The Mopidy core API. An instance of :class:`mopidy.core.Core`.
core = None
#: The active subsystems that have pending events.
events = None
#: The subsystems that we want to be notified about in idle mode.
subscriptions = None
_uri_map = None
def __init__(self, dispatcher, session=None, config=None, core=None,
uri_map=None):
self.dispatcher = dispatcher
self.session = session
if config is not None:
self.password = config['mpd']['password']
self.core = core
self.events = set()
self.subscriptions = set()
self._uri_map = uri_map
def lookup_playlist_uri_from_name(self, name):
"""
Helper function to retrieve a playlist from its unique MPD name.
"""
return self._uri_map.playlist_uri_from_name(name)
def lookup_playlist_name_from_uri(self, uri):
"""
Helper function to retrieve the unique MPD playlist name from its uri.
"""
return self._uri_map.playlist_name_from_uri(uri)
def browse(self, path, recursive=True, lookup=True):
"""
Browse the contents of a given directory path.
Returns a sequence of two-tuples ``(path, data)``.
If ``recursive`` is true, it returns results for all entries in the
given path.
If ``lookup`` is true and the ``path`` is to a track, the returned
``data`` is a future which will contain the results from looking up
the URI with :meth:`mopidy.core.LibraryController.lookup`. If
``lookup`` is false and the ``path`` is to a track, the returned
``data`` will be a :class:`mopidy.models.Ref` for the track.
For all entries that are not tracks, the returned ``data`` will be
:class:`None`.
"""
path_parts = re.findall(r'[^/]+', path or '')
root_path = '/'.join([''] + path_parts)
uri = self._uri_map.uri_from_name(root_path)
if uri is None:
for part in path_parts:
for ref in self.core.library.browse(uri).get():
if ref.type != ref.TRACK and ref.name == part:
uri = ref.uri
break
else:
raise exceptions.MpdNoExistError('Not found')
root_path = self._uri_map.insert(root_path, uri)
if recursive:
yield (root_path, None)
path_and_futures = [(root_path, self.core.library.browse(uri))]
while path_and_futures:
base_path, future = path_and_futures.pop()
for ref in future.get():
path = '/'.join([base_path, ref.name.replace('/', '')])
path = self._uri_map.insert(path, ref.uri)
if ref.type == ref.TRACK:
if lookup:
# TODO: can we lookup all the refs at once now?
yield (path, self.core.library.lookup(uris=[ref.uri]))
else:
yield (path, ref)
else:
yield (path, None)
if recursive:
path_and_futures.append(
(path, self.core.library.browse(ref.uri)))
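# Illustrative sketch (names are made up): with recursive=True and
# lookup=False, browsing 'Local media/Album 1' yields the directory itself
# followed by one two-tuple per entry, roughly
#
#     ('/Local media/Album 1', None)
#     ('/Local media/Album 1/Track 1', <Ref.track for the track's URI>)
#     ('/Local media/Album 1/Subdir', None)
#
# and then recurses into 'Subdir'.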
Mopidy-2.0.0/mopidy/mpd/__init__.py 0000664 0001750 0001750 00000002132 12660436420 017353 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import os
import mopidy
from mopidy import config, ext
class Extension(ext.Extension):
dist_name = 'Mopidy-MPD'
ext_name = 'mpd'
version = mopidy.__version__
def get_default_config(self):
conf_file = os.path.join(os.path.dirname(__file__), 'ext.conf')
return config.read(conf_file)
def get_config_schema(self):
schema = super(Extension, self).get_config_schema()
schema['hostname'] = config.Hostname()
schema['port'] = config.Port()
schema['password'] = config.Secret(optional=True)
schema['max_connections'] = config.Integer(minimum=1)
schema['connection_timeout'] = config.Integer(minimum=1)
schema['zeroconf'] = config.String(optional=True)
schema['command_blacklist'] = config.List(optional=True)
schema['default_playlist_scheme'] = config.String()
return schema
def validate_environment(self):
pass
def setup(self, registry):
from .actor import MpdFrontend
registry.add('frontend', MpdFrontend)
Mopidy-2.0.0/mopidy/mpd/uri_mapper.py 0000664 0001750 0001750 00000005122 12660436420 017761 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import re
# TODO: refactor this into a generic mapper that does not know about browse
# or playlists and then use one instance for each case?
class MpdUriMapper(object):
"""
Maintains the mappings between uniquified MPD names and URIs.
"""
#: The Mopidy core API. An instance of :class:`mopidy.core.Core`.
core = None
_invalid_browse_chars = re.compile(r'[\n\r]')
_invalid_playlist_chars = re.compile(r'[/]')
def __init__(self, core=None):
self.core = core
self._uri_from_name = {}
self._browse_name_from_uri = {}
self._playlist_name_from_uri = {}
def _create_unique_name(self, name, uri):
stripped_name = self._invalid_browse_chars.sub(' ', name)
name = stripped_name
i = 2
while name in self._uri_from_name:
if self._uri_from_name[name] == uri:
return name
name = '%s [%d]' % (stripped_name, i)
i += 1
return name
def insert(self, name, uri, playlist=False):
"""
Create a unique and MPD compatible name that maps to the given URI.
"""
name = self._create_unique_name(name, uri)
self._uri_from_name[name] = uri
if playlist:
self._playlist_name_from_uri[uri] = name
else:
self._browse_name_from_uri[uri] = name
return name
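# Illustrative sketch: inserting two different URIs under the same name
#
#     mapper.insert('Greatest Hits', 'a:1')   # -> 'Greatest Hits'
#     mapper.insert('Greatest Hits', 'b:2')   # -> 'Greatest Hits [2]'
#
# so every MPD-visible name stays unique while still mapping back to its
# own URI.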
def uri_from_name(self, name):
"""
Return the uri for the given MPD name.
"""
return self._uri_from_name.get(name)
def refresh_playlists_mapping(self):
"""
Maintain the mapping between playlists and the unique playlist names used
by MPD.
"""
if self.core is None:
return
for playlist_ref in self.core.playlists.as_list().get():
if not playlist_ref.name:
continue
name = self._invalid_playlist_chars.sub('|', playlist_ref.name)
self.insert(name, playlist_ref.uri, playlist=True)
def playlist_uri_from_name(self, name):
"""
Helper function to retrieve a playlist URI from its unique MPD name.
"""
if name not in self._uri_from_name:
self.refresh_playlists_mapping()
return self._uri_from_name.get(name)
def playlist_name_from_uri(self, uri):
"""
Helper function to retrieve the unique MPD playlist name from its URI.
"""
if uri not in self._playlist_name_from_uri:
self.refresh_playlists_mapping()
return self._playlist_name_from_uri[uri]
Mopidy-2.0.0/mopidy/mpd/ext.conf 0000664 0001750 0001750 00000000336 12660436420 016715 0 ustar jodal jodal 0000000 0000000 [mpd]
enabled = true
hostname = 127.0.0.1
port = 6600
password =
max_connections = 20
connection_timeout = 60
zeroconf = Mopidy MPD server on $hostname
command_blacklist = listall,listallinfo
default_playlist_scheme = m3u
Mopidy-2.0.0/mopidy/mpd/exceptions.py 0000664 0001750 0001750 00000010126 12660436420 017777 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy.exceptions import MopidyException
class MpdAckError(MopidyException):
"""See fields on this class for available MPD error codes"""
ACK_ERROR_NOT_LIST = 1
ACK_ERROR_ARG = 2
ACK_ERROR_PASSWORD = 3
ACK_ERROR_PERMISSION = 4
ACK_ERROR_UNKNOWN = 5
ACK_ERROR_NO_EXIST = 50
ACK_ERROR_PLAYLIST_MAX = 51
ACK_ERROR_SYSTEM = 52
ACK_ERROR_PLAYLIST_LOAD = 53
ACK_ERROR_UPDATE_ALREADY = 54
ACK_ERROR_PLAYER_SYNC = 55
ACK_ERROR_EXIST = 56
error_code = 0
def __init__(self, message='', index=0, command=None):
super(MpdAckError, self).__init__(message, index, command)
self.message = message
self.index = index
self.command = command
def get_mpd_ack(self):
"""
MPD error code format::
ACK [%(error_code)i@%(index)i] {%(command)s} description
"""
return 'ACK [%i@%i] {%s} %s' % (
self.__class__.error_code, self.index, self.command, self.message)
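# For example (illustrative), MpdNoExistError('No such song',
# command='playid').get_mpd_ack() returns:
#
#     ACK [50@0] {playid} No such song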
class MpdArgError(MpdAckError):
error_code = MpdAckError.ACK_ERROR_ARG
class MpdPasswordError(MpdAckError):
error_code = MpdAckError.ACK_ERROR_PASSWORD
class MpdPermissionError(MpdAckError):
error_code = MpdAckError.ACK_ERROR_PERMISSION
def __init__(self, *args, **kwargs):
super(MpdPermissionError, self).__init__(*args, **kwargs)
assert self.command is not None, 'command must be given explicitly'
self.message = 'you don\'t have permission for "%s"' % self.command
class MpdUnknownError(MpdAckError):
error_code = MpdAckError.ACK_ERROR_UNKNOWN
class MpdUnknownCommand(MpdUnknownError):
def __init__(self, *args, **kwargs):
super(MpdUnknownCommand, self).__init__(*args, **kwargs)
assert self.command is not None, 'command must be given explicitly'
self.message = 'unknown command "%s"' % self.command
self.command = ''
class MpdNoCommand(MpdUnknownCommand):
def __init__(self, *args, **kwargs):
kwargs['command'] = ''
super(MpdNoCommand, self).__init__(*args, **kwargs)
self.message = 'No command given'
class MpdNoExistError(MpdAckError):
error_code = MpdAckError.ACK_ERROR_NO_EXIST
class MpdExistError(MpdAckError):
error_code = MpdAckError.ACK_ERROR_EXIST
class MpdSystemError(MpdAckError):
error_code = MpdAckError.ACK_ERROR_SYSTEM
class MpdInvalidPlaylistName(MpdAckError):
error_code = MpdAckError.ACK_ERROR_ARG
def __init__(self, *args, **kwargs):
super(MpdInvalidPlaylistName, self).__init__(*args, **kwargs)
self.message = ('playlist name is invalid: playlist names may not '
'contain slashes, newlines or carriage returns')
class MpdNotImplemented(MpdAckError):
error_code = 0
def __init__(self, *args, **kwargs):
super(MpdNotImplemented, self).__init__(*args, **kwargs)
self.message = 'Not implemented'
class MpdInvalidTrackForPlaylist(MpdAckError):
# NOTE: This is a custom error for Mopidy that does not exist in MPD.
error_code = 0
def __init__(self, playlist_scheme, track_scheme, *args, **kwargs):
super(MpdInvalidTrackForPlaylist, self).__init__(*args, **kwargs)
self.message = (
'Playlist with scheme "%s" can\'t store track scheme "%s"' %
(playlist_scheme, track_scheme))
class MpdFailedToSavePlaylist(MpdAckError):
# NOTE: This is a custom error for Mopidy that does not exist in MPD.
error_code = 0
def __init__(self, backend_scheme, *args, **kwargs):
super(MpdFailedToSavePlaylist, self).__init__(*args, **kwargs)
self.message = 'Backend with scheme "%s" failed to save playlist' % (
backend_scheme)
class MpdDisabled(MpdAckError):
# NOTE: This is a custom error for Mopidy that does not exist in MPD.
error_code = 0
def __init__(self, *args, **kwargs):
super(MpdDisabled, self).__init__(*args, **kwargs)
assert self.command is not None, 'command must be given explicitly'
self.message = '"%s" has been disabled in the server' % self.command
Mopidy-2.0.0/mopidy/mpd/session.py 0000664 0001750 0001750 00000003310 12660436420 017276 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
from mopidy.internal import formatting, network
from mopidy.mpd import dispatcher, protocol
logger = logging.getLogger(__name__)
class MpdSession(network.LineProtocol):
"""
The MPD client session. Keeps track of a single client session. Any
requests from the client are passed on to the MPD request dispatcher.
"""
terminator = protocol.LINE_TERMINATOR
encoding = protocol.ENCODING
delimiter = r'\r?\n'
def __init__(self, connection, config=None, core=None, uri_map=None):
super(MpdSession, self).__init__(connection)
self.dispatcher = dispatcher.MpdDispatcher(
session=self, config=config, core=core, uri_map=uri_map)
def on_start(self):
logger.info('New MPD connection from [%s]:%s', self.host, self.port)
self.send_lines(['OK MPD %s' % protocol.VERSION])
def on_line_received(self, line):
logger.debug('Request from [%s]:%s: %s', self.host, self.port, line)
response = self.dispatcher.handle_request(line)
if not response:
return
logger.debug(
'Response to [%s]:%s: %s', self.host, self.port,
formatting.indent(self.terminator.join(response)))
self.send_lines(response)
def on_event(self, subsystem):
self.dispatcher.handle_idle(subsystem)
def decode(self, line):
try:
return super(MpdSession, self).decode(line)
except ValueError:
logger.warning(
'Stopping actor due to unescaping error, data '
'supplied by client was not valid.')
self.stop()
def close(self):
self.stop()
Mopidy-2.0.0/mopidy/mpd/actor.py 0000664 0001750 0001750 00000005440 12660436420 016731 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
import logging
import pykka
from mopidy import exceptions, listener, zeroconf
from mopidy.core import CoreListener
from mopidy.internal import encoding, network, process
from mopidy.mpd import session, uri_mapper
logger = logging.getLogger(__name__)
_CORE_EVENTS_TO_IDLE_SUBSYSTEMS = {
'track_playback_paused': None,
'track_playback_resumed': None,
'track_playback_started': None,
'track_playback_ended': None,
'playback_state_changed': 'player',
'tracklist_changed': 'playlist',
'playlists_loaded': 'stored_playlist',
'playlist_changed': 'stored_playlist',
'playlist_deleted': 'stored_playlist',
'options_changed': 'options',
'volume_changed': 'mixer',
'mute_changed': 'output',
'seeked': 'player',
'stream_title_changed': 'playlist',
}
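# Illustrative sketch: a core 'volume_changed' event is forwarded to idle
# clients as the 'mixer' subsystem, while events mapped to None (e.g.
# 'track_playback_started') do not trigger any idle notification.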
class MpdFrontend(pykka.ThreadingActor, CoreListener):
def __init__(self, config, core):
super(MpdFrontend, self).__init__()
self.hostname = network.format_hostname(config['mpd']['hostname'])
self.port = config['mpd']['port']
self.uri_map = uri_mapper.MpdUriMapper(core)
self.zeroconf_name = config['mpd']['zeroconf']
self.zeroconf_service = None
self._setup_server(config, core)
def _setup_server(self, config, core):
try:
network.Server(
self.hostname, self.port,
protocol=session.MpdSession,
protocol_kwargs={
'config': config,
'core': core,
'uri_map': self.uri_map,
},
max_connections=config['mpd']['max_connections'],
timeout=config['mpd']['connection_timeout'])
except IOError as error:
raise exceptions.FrontendError(
'MPD server startup failed: %s' %
encoding.locale_decode(error))
logger.info('MPD server running at [%s]:%s', self.hostname, self.port)
def on_start(self):
if self.zeroconf_name:
self.zeroconf_service = zeroconf.Zeroconf(
name=self.zeroconf_name,
stype='_mpd._tcp',
port=self.port)
self.zeroconf_service.publish()
def on_stop(self):
if self.zeroconf_service:
self.zeroconf_service.unpublish()
process.stop_actors_by_class(session.MpdSession)
def on_event(self, event, **kwargs):
if event not in _CORE_EVENTS_TO_IDLE_SUBSYSTEMS:
logger.warning(
'Got unexpected event: %s(%s)', event, ', '.join(kwargs))
else:
self.send_idle(_CORE_EVENTS_TO_IDLE_SUBSYSTEMS[event])
def send_idle(self, subsystem):
if subsystem:
listener.send(session.MpdSession, subsystem)
Mopidy-2.0.0/mopidy/mpd/protocol/ 0000775 0001750 0001750 00000000000 12660436443 017112 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/mpd/protocol/playback.py 0000664 0001750 0001750 00000034541 12660436420 021254 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy.core import PlaybackState
from mopidy.internal import deprecation
from mopidy.mpd import exceptions, protocol
@protocol.commands.add('consume', state=protocol.BOOL)
def consume(context, state):
"""
*musicpd.org, playback section:*
``consume {STATE}``
Sets consume state to ``STATE``, ``STATE`` should be 0 or
1. When consume is activated, each song played is removed from
playlist.
"""
context.core.tracklist.set_consume(state)
@protocol.commands.add('crossfade', seconds=protocol.UINT)
def crossfade(context, seconds):
"""
*musicpd.org, playback section:*
``crossfade {SECONDS}``
Sets crossfading between songs.
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('mixrampdb')
def mixrampdb(context, decibels):
"""
*musicpd.org, playback section:*
``mixrampdb {deciBels}``
Sets the threshold at which songs will be overlapped. Like crossfading but
doesn't fade the track volume, just overlaps. The songs need to have
MixRamp tags added by an external tool. 0dB is the normalized maximum
volume so use negative values, I prefer -17dB. In the absence of mixramp
tags crossfading will be used. See http://sourceforge.net/projects/mixramp
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('mixrampdelay', seconds=protocol.UINT)
def mixrampdelay(context, seconds):
"""
*musicpd.org, playback section:*
``mixrampdelay {SECONDS}``
Additional time subtracted from the overlap calculated by mixrampdb. A
value of "nan" disables MixRamp overlapping and falls back to
crossfading.
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('next')
def next_(context):
"""
*musicpd.org, playback section:*
``next``
Plays next song in the playlist.
*MPD's behaviour when affected by repeat/random/single/consume:*
Given a playlist of three tracks numbered 1, 2, 3, and a currently
playing track ``c``. ``next_track`` is defined as the track that
will be played upon calls to ``next``.
Tests performed on MPD 0.15.4-1ubuntu3.
====== ====== ====== ======= ===== ===== ===== =====
Inputs next_track
------------------------------- ------------------- -----
repeat random single consume c = 1 c = 2 c = 3 Notes
====== ====== ====== ======= ===== ===== ===== =====
T T T T 2 3 EOPL
T T T . Rand Rand Rand [1]
T T . T Rand Rand Rand [4]
T T . . Rand Rand Rand [4]
T . T T 2 3 EOPL
T . T . 2 3 1
T . . T 3 3 EOPL
T . . . 2 3 1
. T T T Rand Rand Rand [3]
. T T . Rand Rand Rand [3]
. T . T Rand Rand Rand [2]
. T . . Rand Rand Rand [2]
. . T T 2 3 EOPL
. . T . 2 3 EOPL
. . . T 2 3 EOPL
. . . . 2 3 EOPL
====== ====== ====== ======= ===== ===== ===== =====
- When end of playlist (EOPL) is reached, the current track is
unset.
- [1] When *random* and *single* is combined, ``next`` selects
a track randomly at each invocation, and not just the next track
in an internal prerandomized playlist.
- [2] When *random* is active, ``next`` will skip through
all tracks in the playlist in random order, and finally EOPL is
reached.
- [3] *single* has no effect in combination with *random*
alone, or *random* and *consume*.
- [4] When *random* and *repeat* is active, EOPL is never
reached, but the playlist is played again, in the same random
order as the first time.
"""
return context.core.playback.next().get()
@protocol.commands.add('pause', state=protocol.BOOL)
def pause(context, state=None):
"""
*musicpd.org, playback section:*
``pause {PAUSE}``
Toggles pause/resumes playing, ``PAUSE`` is 0 or 1.
*MPDroid:*
- Calls ``pause`` without any arguments to toggle pause.
"""
if state is None:
deprecation.warn('mpd.protocol.playback.pause:state_arg')
playback_state = context.core.playback.get_state().get()
if (playback_state == PlaybackState.PLAYING):
context.core.playback.pause().get()
elif (playback_state == PlaybackState.PAUSED):
context.core.playback.resume().get()
elif state:
context.core.playback.pause().get()
else:
context.core.playback.resume().get()
@protocol.commands.add('play', songpos=protocol.INT)
def play(context, songpos=None):
"""
*musicpd.org, playback section:*
``play [SONGPOS]``
Begins playing the playlist at song number ``SONGPOS``.
The original MPD server resumes from the paused state on ``play``
without arguments.
*Clarifications:*
- ``play "-1"`` when playing is ignored.
- ``play "-1"`` when paused resumes playback.
- ``play "-1"`` when stopped with a current track starts playback at the
current track.
- ``play "-1"`` when stopped without a current track, e.g. after playlist
replacement, starts playback at the first track.
*BitMPC:*
- issues ``play 6`` without quotes around the argument.
"""
if songpos is None:
return context.core.playback.play().get()
elif songpos == -1:
return _play_minus_one(context)
try:
tl_track = context.core.tracklist.slice(songpos, songpos + 1).get()[0]
return context.core.playback.play(tl_track).get()
except IndexError:
raise exceptions.MpdArgError('Bad song index')
def _play_minus_one(context):
playback_state = context.core.playback.get_state().get()
if playback_state == PlaybackState.PLAYING:
return # Nothing to do
elif playback_state == PlaybackState.PAUSED:
return context.core.playback.resume().get()
current_tl_track = context.core.playback.get_current_tl_track().get()
if current_tl_track is not None:
return context.core.playback.play(current_tl_track).get()
tl_tracks = context.core.tracklist.slice(0, 1).get()
if tl_tracks:
return context.core.playback.play(tl_tracks[0]).get()
return # Fail silently
@protocol.commands.add('playid', tlid=protocol.INT)
def playid(context, tlid):
"""
*musicpd.org, playback section:*
``playid [SONGID]``
Begins playing the playlist at song ``SONGID``.
*Clarifications:*
- ``playid "-1"`` when playing is ignored.
- ``playid "-1"`` when paused resumes playback.
- ``playid "-1"`` when stopped with a current track starts playback at the
current track.
- ``playid "-1"`` when stopped without a current track, e.g. after playlist
replacement, starts playback at the first track.
"""
if tlid == -1:
return _play_minus_one(context)
tl_tracks = context.core.tracklist.filter({'tlid': [tlid]}).get()
if not tl_tracks:
raise exceptions.MpdNoExistError('No such song')
return context.core.playback.play(tl_tracks[0]).get()
@protocol.commands.add('previous')
def previous(context):
"""
*musicpd.org, playback section:*
``previous``
Plays previous song in the playlist.
*MPD's behaviour when affected by repeat/random/single/consume:*
Given a playlist of three tracks numbered 1, 2, 3, and a currently
playing track ``c``. ``previous_track`` is defined as the track
that will be played upon ``previous`` calls.
Tests performed on MPD 0.15.4-1ubuntu3.
====== ====== ====== ======= ===== ===== =====
Inputs previous_track
------------------------------- -------------------
repeat random single consume c = 1 c = 2 c = 3
====== ====== ====== ======= ===== ===== =====
T T T T Rand? Rand? Rand?
T T T . 3 1 2
T T . T Rand? Rand? Rand?
T T . . 3 1 2
T . T T 3 1 2
T . T . 3 1 2
T . . T 3 1 2
T . . . 3 1 2
. T T T c c c
. T T . c c c
. T . T c c c
. T . . c c c
. . T T 1 1 2
. . T . 1 1 2
. . . T 1 1 2
. . . . 1 1 2
====== ====== ====== ======= ===== ===== =====
- If :attr:`time_position` of the current track is 15s or more,
``previous`` should do a seek to time position 0.
"""
return context.core.playback.previous().get()
@protocol.commands.add('random', state=protocol.BOOL)
def random(context, state):
"""
*musicpd.org, playback section:*
``random {STATE}``
Sets random state to ``STATE``, ``STATE`` should be 0 or 1.
"""
context.core.tracklist.set_random(state)
@protocol.commands.add('repeat', state=protocol.BOOL)
def repeat(context, state):
"""
*musicpd.org, playback section:*
``repeat {STATE}``
Sets repeat state to ``STATE``, ``STATE`` should be 0 or 1.
"""
context.core.tracklist.set_repeat(state)
@protocol.commands.add('replay_gain_mode')
def replay_gain_mode(context, mode):
"""
*musicpd.org, playback section:*
``replay_gain_mode {MODE}``
Sets the replay gain mode. One of ``off``, ``track``, ``album``.
Changing the mode during playback may take several seconds, because
the new settings does not affect the buffered data.
This command triggers the options idle event.
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('replay_gain_status')
def replay_gain_status(context):
"""
*musicpd.org, playback section:*
``replay_gain_status``
Prints replay gain options. Currently, only the variable
``replay_gain_mode`` is returned.
"""
return 'off' # TODO
@protocol.commands.add('seek', songpos=protocol.UINT, seconds=protocol.UINT)
def seek(context, songpos, seconds):
"""
*musicpd.org, playback section:*
``seek {SONGPOS} {TIME}``
Seeks to the position ``TIME`` (in seconds) of entry ``SONGPOS`` in
the playlist.
*Droid MPD:*
- issues ``seek 1 120`` without quotes around the arguments.
"""
tl_track = context.core.playback.get_current_tl_track().get()
if context.core.tracklist.index(tl_track).get() != songpos:
play(context, songpos)
context.core.playback.seek(seconds * 1000).get()
@protocol.commands.add('seekid', tlid=protocol.UINT, seconds=protocol.UINT)
def seekid(context, tlid, seconds):
"""
*musicpd.org, playback section:*
``seekid {SONGID} {TIME}``
Seeks to the position ``TIME`` (in seconds) of song ``SONGID``.
"""
tl_track = context.core.playback.get_current_tl_track().get()
if not tl_track or tl_track.tlid != tlid:
playid(context, tlid)
context.core.playback.seek(seconds * 1000).get()
@protocol.commands.add('seekcur')
def seekcur(context, time):
"""
*musicpd.org, playback section:*
``seekcur {TIME}``
Seeks to the position ``TIME`` within the current song. If prefixed by
'+' or '-', then the time is relative to the current playing position.
"""
if time.startswith(('+', '-')):
position = context.core.playback.get_time_position().get()
position += protocol.INT(time) * 1000
context.core.playback.seek(position).get()
else:
position = protocol.UINT(time) * 1000
context.core.playback.seek(position).get()
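# Illustrative sketch: 'seekcur "+10"' seeks 10 seconds forward from the
# current position, 'seekcur "-10"' seeks 10 seconds back, and 'seekcur "30"'
# seeks to 30 seconds into the current track.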
@protocol.commands.add('setvol', volume=protocol.INT)
def setvol(context, volume):
"""
*musicpd.org, playback section:*
``setvol {VOL}``
Sets volume to ``VOL``, the range of volume is 0-100.
*Droid MPD:*
- issues ``setvol 50`` without quotes around the argument.
"""
# NOTE: we use INT as clients can pass in +N etc.
value = min(max(0, volume), 100)
success = context.core.mixer.set_volume(value).get()
if not success:
raise exceptions.MpdSystemError('problems setting volume')
@protocol.commands.add('single', state=protocol.BOOL)
def single(context, state):
"""
*musicpd.org, playback section:*
``single {STATE}``
Sets single state to ``STATE``, ``STATE`` should be 0 or 1. When
single is activated, playback is stopped after current song, or
song is repeated if the ``repeat`` mode is enabled.
"""
context.core.tracklist.set_single(state)
@protocol.commands.add('stop')
def stop(context):
"""
*musicpd.org, playback section:*
``stop``
Stops playing.
"""
context.core.playback.stop()
@protocol.commands.add('volume', change=protocol.INT)
def volume(context, change):
"""
*musicpd.org, playback section:*
``volume {CHANGE}``
Changes volume by amount ``CHANGE``.
Note: ``volume`` is deprecated, use ``setvol`` instead.
"""
if change < -100 or change > 100:
raise exceptions.MpdArgError('Invalid volume value')
old_volume = context.core.mixer.get_volume().get()
if old_volume is None:
raise exceptions.MpdSystemError('problems setting volume')
new_volume = min(max(0, old_volume + change), 100)
success = context.core.mixer.set_volume(new_volume).get()
if not success:
raise exceptions.MpdSystemError('problems setting volume')
Mopidy-2.0.0/mopidy/mpd/protocol/audio_output.py 0000664 0001750 0001750 00000003664 12575004517 022214 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy.mpd import exceptions, protocol
@protocol.commands.add('disableoutput', outputid=protocol.UINT)
def disableoutput(context, outputid):
"""
*musicpd.org, audio output section:*
``disableoutput {ID}``
Turns an output off.
"""
if outputid == 0:
success = context.core.mixer.set_mute(False).get()
if not success:
raise exceptions.MpdSystemError('problems disabling output')
else:
raise exceptions.MpdNoExistError('No such audio output')
@protocol.commands.add('enableoutput', outputid=protocol.UINT)
def enableoutput(context, outputid):
"""
*musicpd.org, audio output section:*
``enableoutput {ID}``
Turns an output on.
"""
if outputid == 0:
success = context.core.mixer.set_mute(True).get()
if not success:
raise exceptions.MpdSystemError('problems enabling output')
else:
raise exceptions.MpdNoExistError('No such audio output')
@protocol.commands.add('toggleoutput', outputid=protocol.UINT)
def toggleoutput(context, outputid):
"""
*musicpd.org, audio output section:*
``toggleoutput {ID}``
Turns an output on or off, depending on the current state.
"""
if outputid == 0:
mute_status = context.core.mixer.get_mute().get()
success = context.core.mixer.set_mute(not mute_status).get()
if not success:
raise exceptions.MpdSystemError('problems toggling output')
else:
raise exceptions.MpdNoExistError('No such audio output')
@protocol.commands.add('outputs')
def outputs(context):
"""
*musicpd.org, audio output section:*
``outputs``
Shows information about all outputs.
"""
muted = 1 if context.core.mixer.get_mute().get() else 0
return [
('outputid', 0),
('outputname', 'Mute'),
('outputenabled', muted),
]
Mopidy-2.0.0/mopidy/mpd/protocol/reflection.py 0000664 0001750 0001750 00000006000 12575004517 021610 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy.mpd import exceptions, protocol
from mopidy.mpd.protocol import tagtype_list
@protocol.commands.add('config', list_command=False)
def config(context):
"""
*musicpd.org, reflection section:*
``config``
Dumps configuration values that may be interesting for the client. This
command is only permitted to "local" clients (connected via UNIX domain
socket).
"""
raise exceptions.MpdPermissionError(command='config')
@protocol.commands.add('commands', auth_required=False)
def commands(context):
"""
*musicpd.org, reflection section:*
``commands``
Shows which commands the current user has access to.
"""
command_names = set()
for name, handler in protocol.commands.handlers.items():
if not handler.list_command:
continue
if context.dispatcher.authenticated or not handler.auth_required:
command_names.add(name)
return [
('command', command_name) for command_name in sorted(command_names)]
@protocol.commands.add('decoders')
def decoders(context):
"""
*musicpd.org, reflection section:*
``decoders``
Print a list of decoder plugins, followed by their supported
suffixes and MIME types. Example response::
plugin: mad
suffix: mp3
suffix: mp2
mime_type: audio/mpeg
plugin: mpcdec
suffix: mpc
*Clarifications:*
- ncmpcpp asks for decoders the first time you open the browse view. By
returning nothing and OK instead of a not implemented error, we avoid
"Not implemented" showing up in the ncmpcpp interface, and we get the
list of playlists without having to enter the browse interface twice.
"""
return # TODO
@protocol.commands.add('notcommands', auth_required=False)
def notcommands(context):
"""
*musicpd.org, reflection section:*
``notcommands``
Shows which commands the current user does not have access to.
"""
command_names = set(['config', 'kill']) # No permission to use
for name, handler in protocol.commands.handlers.items():
if not handler.list_command:
continue
if not context.dispatcher.authenticated and handler.auth_required:
command_names.add(name)
return [
('command', command_name) for command_name in sorted(command_names)]
@protocol.commands.add('tagtypes')
def tagtypes(context):
"""
*musicpd.org, reflection section:*
``tagtypes``
Shows a list of available song metadata.
"""
return [
('tagtype', tagtype) for tagtype in tagtype_list.TAGTYPE_LIST
]
@protocol.commands.add('urlhandlers')
def urlhandlers(context):
"""
*musicpd.org, reflection section:*
``urlhandlers``
Gets a list of available URL handlers.
"""
return [
('handler', uri_scheme)
for uri_scheme in context.core.get_uri_schemes().get()]
Mopidy-2.0.0/mopidy/mpd/protocol/current_playlist.py 0000664 0001750 0001750 00000033667 12660436420 023101 0 ustar jodal jodal 0000000 0000000 from __future__ import absolute_import, unicode_literals
from mopidy.compat import urllib
from mopidy.internal import deprecation
from mopidy.mpd import exceptions, protocol, translator
@protocol.commands.add('add')
def add(context, uri):
"""
*musicpd.org, current playlist section:*
``add {URI}``
Adds the file ``URI`` to the playlist (directories add recursively).
``URI`` can also be a single file.
*Clarifications:*
- ``add ""`` should add all tracks in the library to the current playlist.
"""
if not uri.strip('/'):
return
# If we have a URI, just try to add it directly without bothering with
# jumping through browse...
if urllib.parse.urlparse(uri).scheme != '':
if context.core.tracklist.add(uris=[uri]).get():
return
try:
uris = []
for path, ref in context.browse(uri, lookup=False):
if ref:
uris.append(ref.uri)
except exceptions.MpdNoExistError as e:
e.message = 'directory or file not found'
raise
if not uris:
raise exceptions.MpdNoExistError('directory or file not found')
context.core.tracklist.add(uris=uris).get()
@protocol.commands.add('addid', songpos=protocol.UINT)
def addid(context, uri, songpos=None):
"""
*musicpd.org, current playlist section:*
``addid {URI} [POSITION]``
Adds a song to the playlist (non-recursive) and returns the song id.
``URI`` is always a single file or URL. For example::
addid "foo.mp3"
Id: 999
OK
*Clarifications:*
- ``addid ""`` should return an error.
"""
if not uri:
raise exceptions.MpdNoExistError('No such song')
length = context.core.tracklist.get_length()
if songpos is not None and songpos > length.get():
raise exceptions.MpdArgError('Bad song index')
tl_tracks = context.core.tracklist.add(
uris=[uri], at_position=songpos).get()
if not tl_tracks:
raise exceptions.MpdNoExistError('No such song')
return ('Id', tl_tracks[0].tlid)
@protocol.commands.add('delete', songrange=protocol.RANGE)
def delete(context, songrange):
"""
*musicpd.org, current playlist section:*
``delete [{POS} | {START:END}]``
Deletes a song from the playlist.
"""
start = songrange.start
end = songrange.stop
if end is None:
end = context.core.tracklist.get_length().get()
tl_tracks = context.core.tracklist.slice(start, end).get()
if not tl_tracks:
raise exceptions.MpdArgError('Bad song index', command='delete')
for (tlid, _) in tl_tracks:
context.core.tracklist.remove({'tlid': [tlid]})
@protocol.commands.add('deleteid', tlid=protocol.UINT)
def deleteid(context, tlid):
"""
*musicpd.org, current playlist section:*
``deleteid {SONGID}``
Deletes the song ``SONGID`` from the playlist
"""
tl_tracks = context.core.tracklist.remove({'tlid': [tlid]}).get()
if not tl_tracks:
raise exceptions.MpdNoExistError('No such song')
@protocol.commands.add('clear')
def clear(context):
"""
*musicpd.org, current playlist section:*
``clear``
Clears the current playlist.
"""
context.core.tracklist.clear()
@protocol.commands.add('move', songrange=protocol.RANGE, to=protocol.UINT)
def move_range(context, songrange, to):
"""
*musicpd.org, current playlist section:*
``move [{FROM} | {START:END}] {TO}``
Moves the song at ``FROM`` or range of songs at ``START:END`` to
``TO`` in the playlist.
"""
start = songrange.start
end = songrange.stop
if end is None:
end = context.core.tracklist.get_length().get()
context.core.tracklist.move(start, end, to)
@protocol.commands.add('moveid', tlid=protocol.UINT, to=protocol.UINT)
def moveid(context, tlid, to):
"""
*musicpd.org, current playlist section:*
``moveid {FROM} {TO}``
Moves the song with ``FROM`` (songid) to ``TO`` (playlist index) in
the playlist. If ``TO`` is negative, it is relative to the current
song in the playlist (if there is one).
"""
tl_tracks = context.core.tracklist.filter({'tlid': [tlid]}).get()
if not tl_tracks:
raise exceptions.MpdNoExistError('No such song')
position = context.core.tracklist.index(tl_tracks[0]).get()
context.core.tracklist.move(position, position + 1, to)
@protocol.commands.add('playlist')
def playlist(context):
"""
*musicpd.org, current playlist section:*
``playlist``
Displays the current playlist.
.. note::
Do not use this, instead use ``playlistinfo``.
"""
deprecation.warn('mpd.protocol.current_playlist.playlist')
return playlistinfo(context)
@protocol.commands.add('playlistfind')
def playlistfind(context, tag, needle):
"""
*musicpd.org, current playlist section:*
``playlistfind {TAG} {NEEDLE}``
Finds songs in the current playlist with strict matching.
"""
if tag == 'filename':
tl_tracks = context.core.tracklist.filter({'uri': [needle]}).get()
if not tl_tracks:
return None
position = context.core.tracklist.index(tl_tracks[0]).get()
return translator.track_to_mpd_format(tl_tracks[0], position=position)
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('playlistid', tlid=protocol.UINT)
def playlistid(context, tlid=None):
"""
*musicpd.org, current playlist section:*
``playlistid {SONGID}``
Displays a list of songs in the playlist. ``SONGID`` is optional
and specifies a single song to display info for.
"""
if tlid is not None:
tl_tracks = context.core.tracklist.filter({'tlid': [tlid]}).get()
if not tl_tracks:
raise exceptions.MpdNoExistError('No such song')
position = context.core.tracklist.index(tl_tracks[0]).get()
return translator.track_to_mpd_format(tl_tracks[0], position=position)
else:
return translator.tracks_to_mpd_format(
context.core.tracklist.get_tl_tracks().get())
@protocol.commands.add('playlistinfo')
def playlistinfo(context, parameter=None):
"""
*musicpd.org, current playlist section:*
``playlistinfo [[SONGPOS] | [START:END]]``
Displays a list of all songs in the playlist, or if the optional
argument is given, displays information only for the song
``SONGPOS`` or the range of songs ``START:END``.
*ncmpc and mpc:*
- uses negative indexes, like ``playlistinfo "-1"``, to request
the entire playlist
"""
if parameter is None or parameter == '-1':
start, end = 0, None
else:
tracklist_slice = protocol.RANGE(parameter)
start, end = tracklist_slice.start, tracklist_slice.stop
tl_tracks = context.core.tracklist.get_tl_tracks().get()
if start and start > len(tl_tracks):
raise exceptions.MpdArgError('Bad song index')
if end and end > len(tl_tracks):
end = None
return translator.tracks_to_mpd_format(tl_tracks, start, end)
@protocol.commands.add('playlistsearch')
def playlistsearch(context, tag, needle):
"""
*musicpd.org, current playlist section:*
``playlistsearch {TAG} {NEEDLE}``
Searches case-sensitively for partial matches in the current
playlist.
*GMPC:*
- uses ``filename`` and ``any`` as tags
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('plchanges', version=protocol.INT)
def plchanges(context, version):
"""
*musicpd.org, current playlist section:*
``plchanges {VERSION}``
Displays changed songs currently in the playlist since ``VERSION``.
To detect songs that were deleted at the end of the playlist, use
``playlistlength`` returned by status command.
*MPDroid:*
- Calls ``plchanges "-1"`` two times per second to get the entire playlist.
"""
# XXX Naive implementation that returns all tracks as changed
tracklist_version = context.core.tracklist.get_version().get()
if version < tracklist_version:
return translator.tracks_to_mpd_format(
context.core.tracklist.get_tl_tracks().get())
elif version == tracklist_version:
# A version match could indicate this is just a metadata update, so
# check for a stream ref and let the client know about the change.
stream_title = context.core.playback.get_stream_title().get()
if stream_title is None:
return None
tl_track = context.core.playback.get_current_tl_track().get()
position = context.core.tracklist.index(tl_track).get()
return translator.track_to_mpd_format(
tl_track, position=position, stream_title=stream_title)
@protocol.commands.add('plchangesposid', version=protocol.INT)
def plchangesposid(context, version):
"""
*musicpd.org, current playlist section:*
``plchangesposid {VERSION}``
Displays changed songs currently in the playlist since ``VERSION``.
This function only returns the position and the id of the changed
song, not the complete metadata. This is more bandwidth efficient.
To detect songs that were deleted at the end of the playlist, use
``playlistlength`` returned by status command.
"""
# XXX Naive implementation that returns all tracks as changed
if int(version) != context.core.tracklist.get_version().get():
result = []
for (position, (tlid, _)) in enumerate(
context.core.tracklist.get_tl_tracks().get()):
result.append(('cpos', position))
result.append(('Id', tlid))
return result
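# Editorial example (not part of the original source): for a three-track
# tracklist with tlids 7, 8 and 9, any stale VERSION makes plchangesposid
# return [('cpos', 0), ('Id', 7), ('cpos', 1), ('Id', 8), ('cpos', 2),
# ('Id', 9)], which is rendered as alternating "cpos: N" / "Id: N" lines in
# the MPD response.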
@protocol.commands.add(
'prio', priority=protocol.UINT, position=protocol.RANGE)
def prio(context, priority, position):
"""
*musicpd.org, current playlist section:*
``prio {PRIORITY} {START:END...}``
Set the priority of the specified songs. A higher priority means that
it will be played first when "random" mode is enabled.
A priority is an integer between 0 and 255. The default priority of new
songs is 0.
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('prioid')
def prioid(context, *args):
"""
*musicpd.org, current playlist section:*
``prioid {PRIORITY} {ID...}``
Same as prio, but address the songs with their id.
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('rangeid', tlid=protocol.UINT, songrange=protocol.RANGE)
def rangeid(context, tlid, songrange):
"""
*musicpd.org, current playlist section:*
``rangeid {ID} {START:END}``
Specifies the portion of the song that shall be played. START and END
are offsets in seconds (fractional seconds allowed); both are optional.
Omitting both (i.e. sending just ":") means "remove the range, play
everything". A song that is currently playing cannot be manipulated
this way.
.. versionadded:: 0.19
New in MPD protocol version 0.19
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('shuffle', songrange=protocol.RANGE)
def shuffle(context, songrange=None):
"""
*musicpd.org, current playlist section:*
``shuffle [START:END]``
Shuffles the current playlist. ``START:END`` is optional and
specifies a range of songs.
"""
if songrange is None:
start, end = None, None
else:
start, end = songrange.start, songrange.stop
context.core.tracklist.shuffle(start, end)
@protocol.commands.add('swap', songpos1=protocol.UINT, songpos2=protocol.UINT)
def swap(context, songpos1, songpos2):
"""
*musicpd.org, current playlist section:*
``swap {SONG1} {SONG2}``
Swaps the positions of ``SONG1`` and ``SONG2``.
"""
tracks = context.core.tracklist.get_tracks().get()
song1 = tracks[songpos1]
song2 = tracks[songpos2]
del tracks[songpos1]
tracks.insert(songpos1, song2)
del tracks[songpos2]
tracks.insert(songpos2, song1)
# TODO: do we need a tracklist.replace()
context.core.tracklist.clear()
with deprecation.ignore('core.tracklist.add:tracks_arg'):
context.core.tracklist.add(tracks=tracks).get()
@protocol.commands.add('swapid', tlid1=protocol.UINT, tlid2=protocol.UINT)
def swapid(context, tlid1, tlid2):
"""
*musicpd.org, current playlist section:*
``swapid {SONG1} {SONG2}``
Swaps the positions of ``SONG1`` and ``SONG2`` (both song ids).
"""
tl_tracks1 = context.core.tracklist.filter({'tlid': [tlid1]}).get()
tl_tracks2 = context.core.tracklist.filter({'tlid': [tlid2]}).get()
if not tl_tracks1 or not tl_tracks2:
raise exceptions.MpdNoExistError('No such song')
position1 = context.core.tracklist.index(tl_tracks1[0]).get()
position2 = context.core.tracklist.index(tl_tracks2[0]).get()
swap(context, position1, position2)
@protocol.commands.add('addtagid', tlid=protocol.UINT)
def addtagid(context, tlid, tag, value):
"""
*musicpd.org, current playlist section:*
``addtagid {SONGID} {TAG} {VALUE}``
Adds a tag to the specified song. Editing song tags is only possible
for remote songs. This change is volatile: it may be overwritten by
tags received from the server, and the data is gone when the song gets
removed from the queue.
.. versionadded:: 0.19
New in MPD protocol version 0.19
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('cleartagid', tlid=protocol.UINT)
def cleartagid(context, tlid, tag):
"""
*musicpd.org, current playlist section:*
``cleartagid {SONGID} [TAG]``
Removes tags from the specified song. If TAG is not specified, then all
tag values will be removed. Editing song tags is only possible for
remote songs.
.. versionadded:: 0.19
New in MPD protocol version 0.19
"""
raise exceptions.MpdNotImplemented # TODO
Mopidy-2.0.0/mopidy/mpd/protocol/__init__.py
"""
This is Mopidy's MPD protocol implementation.
This is partly based upon the `MPD protocol documentation
`_, which is a useful resource, but it is
rather incomplete with regard to data formats, both for requests and
responses. Thus, we have had to talk a great deal with the original `MPD
server `_ using telnet to get the details we need to
implement our own MPD server which is compatible with the numerous existing
`MPD clients `_.
"""
from __future__ import absolute_import, unicode_literals
import inspect
from mopidy.mpd import exceptions
#: The MPD protocol uses UTF-8 for encoding all data.
ENCODING = 'UTF-8'
#: The MPD protocol uses ``\n`` as line terminator.
LINE_TERMINATOR = '\n'
#: The MPD protocol version is 0.19.0.
VERSION = '0.19.0'
def load_protocol_modules():
"""
The protocol modules must be imported to get them registered in
:attr:`commands`.
"""
from . import ( # noqa
audio_output, channels, command_list, connection, current_playlist,
mount, music_db, playback, reflection, status, stickers,
stored_playlists)
def INT(value): # noqa: N802
"""Converts a value that matches [+-]?\d+ into and integer."""
if value is None:
raise ValueError('None is not a valid integer')
# TODO: check for whitespace via value != value.strip()?
return int(value)
def UINT(value): # noqa: N802
"""Converts a value that matches \d+ into an integer."""
if value is None:
raise ValueError('None is not a valid integer')
if not value.isdigit():
raise ValueError('Only positive numbers are allowed')
return int(value)
def BOOL(value): # noqa: N802
"""Convert the values 0 and 1 into booleans."""
if value in ('1', '0'):
return bool(int(value))
raise ValueError('%r is not 0 or 1' % value)
def RANGE(value): # noqa: N802
"""Convert a single integer or range spec into a slice
``n`` should become ``slice(n, n+1)``
``n:`` should become ``slice(n, None)``
``n:m`` should become ``slice(n, m)`` and ``m > n`` must hold
"""
if ':' in value:
start, stop = value.split(':', 1)
start = UINT(start)
if stop.strip():
stop = UINT(stop)
if start >= stop:
raise ValueError('End must be larger than start')
else:
stop = None
else:
start = UINT(value)
stop = start + 1
return slice(start, stop)
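# Editorial examples (not part of the original source) of how the argument
# converters above behave:
#
#     UINT('5')    -> 5            UINT('-1')  -> ValueError
#     BOOL('1')    -> True         BOOL('2')   -> ValueError
#     RANGE('3')   -> slice(3, 4)
#     RANGE('3:')  -> slice(3, None)
#     RANGE('3:5') -> slice(3, 5)  # end is exclusive; '5:3' raises ValueError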
class Commands(object):
"""Collection of MPD commands to expose to users.
Normally used through the global instance which command handlers have been
installed into.
"""
def __init__(self):
self.handlers = {}
# TODO: consider removing auth_required and list_command in favour of
# additional command instances to register in?
def add(self, name, auth_required=True, list_command=True, **validators):
"""Create a decorator that registers a handler and validation rules.
Additional keyword arguments are treated as converters/validators to
apply to tokens converting them to proper Python types.
Requirements for valid handlers:
- must accept a context argument as the first arg.
- may not use variable keyword arguments, ``**kwargs``.
- may use variable arguments ``*args`` *or* a mix of required and
optional arguments.
Decorator returns the unwrapped function so that tests etc. can call the
functions directly with correctly typed Python values instead of strings.
:param string name: Name of the command being registered.
:param bool auth_required: If authorization is required.
:param bool list_command: If command should be listed in reflection.
"""
def wrapper(func):
if name in self.handlers:
raise ValueError('%s already registered' % name)
args, varargs, keywords, defaults = inspect.getargspec(func)
defaults = dict(zip(args[-len(defaults or []):], defaults or []))
if not args and not varargs:
raise TypeError('Handler must accept at least one argument.')
if len(args) > 1 and varargs:
raise TypeError(
'*args may not be combined with regular arguments')
if not set(validators.keys()).issubset(args):
raise TypeError('Validator for non-existent arg passed')
if keywords:
raise TypeError('**kwargs are not permitted')
def validate(*args, **kwargs):
if varargs:
return func(*args, **kwargs)
try:
callargs = inspect.getcallargs(func, *args, **kwargs)
except TypeError:
raise exceptions.MpdArgError(
'wrong number of arguments for "%s"' % name)
for key, value in callargs.items():
default = defaults.get(key, object())
if key in validators and value != default:
try:
callargs[key] = validators[key](value)
except ValueError:
raise exceptions.MpdArgError('incorrect arguments')
return func(**callargs)
validate.auth_required = auth_required
validate.list_command = list_command
self.handlers[name] = validate
return func
return wrapper
def call(self, tokens, context=None):
"""Find and run the handler registered for the given command.
If the handler was registered with any converters/validators they will
be run before calling the real handler.
:param list tokens: List of tokens to process
:param context: MPD context.
:type context: :class:`~mopidy.mpd.dispatcher.MpdContext`
"""
if not tokens:
raise exceptions.MpdNoCommand()
if tokens[0] not in self.handlers:
raise exceptions.MpdUnknownCommand(command=tokens[0])
return self.handlers[tokens[0]](context, *tokens[1:])
#: Global instance to install commands into
commands = Commands()
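# Editorial usage sketch (hypothetical handler, not part of the original
# source): handlers register themselves on the global ``commands`` instance
# and are dispatched by token list, with validators converting string tokens
# before the handler runs:
#
#     @commands.add('echo', times=UINT)
#     def echo(context, message, times=1):
#         return [('message', message)] * times
#
#     commands.call(['echo', 'hello', '2'], context=None)
#
# Here the UINT validator turns the token '2' into the integer 2 before
# ``echo`` is called.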
Mopidy-2.0.0/mopidy/mpd/protocol/tagtype_list.py
from __future__ import unicode_literals
TAGTYPE_LIST = [
'Artist',
'ArtistSort',
'Album',
'AlbumArtist',
'AlbumArtistSort',
'Title',
'Track',
'Name',
'Genre',
'Date',
'Composer',
'Performer',
'Disc',
'MUSICBRAINZ_ARTISTID',
'MUSICBRAINZ_ALBUMID',
'MUSICBRAINZ_ALBUMARTISTID',
'MUSICBRAINZ_TRACKID',
'X-AlbumUri',
'X-AlbumImage',
]
Mopidy-2.0.0/mopidy/mpd/protocol/stored_playlists.py
from __future__ import absolute_import, division, unicode_literals
import datetime
import logging
import re
import warnings
from mopidy.compat import urllib
from mopidy.mpd import exceptions, protocol, translator
logger = logging.getLogger(__name__)
def _check_playlist_name(name):
if re.search('[/\n\r]', name):
raise exceptions.MpdInvalidPlaylistName()
@protocol.commands.add('listplaylist')
def listplaylist(context, name):
"""
*musicpd.org, stored playlists section:*
``listplaylist {NAME}``
Lists the files in the playlist ``NAME.m3u``.
Output format::
file: relative/path/to/file1.flac
file: relative/path/to/file2.ogg
file: relative/path/to/file3.mp3
"""
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')
return ['file: %s' % t.uri for t in playlist.tracks]
@protocol.commands.add('listplaylistinfo')
def listplaylistinfo(context, name):
"""
*musicpd.org, stored playlists section:*
``listplaylistinfo {NAME}``
Lists songs in the playlist ``NAME.m3u``.
Output format:
Standard track listing, with fields: file, Time, Title, Date,
Album, Artist, Track
"""
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')
return translator.playlist_to_mpd_format(playlist)
@protocol.commands.add('listplaylists')
def listplaylists(context):
"""
*musicpd.org, stored playlists section:*
``listplaylists``
Prints a list of the playlist directory.
After each playlist name the server sends its last modification
time as attribute ``Last-Modified`` in ISO 8601 format. To avoid
problems due to clock differences between clients and the server,
clients should not compare this value with their local clock.
Output format::
playlist: a
Last-Modified: 2010-02-06T02:10:25Z
playlist: b
Last-Modified: 2010-02-06T02:11:08Z
*Clarifications:*
- ncmpcpp 0.5.10 segfaults if we return 'playlist: ' on a line, so we must
ignore playlists without names, which isn't very useful anyway.
"""
last_modified = _get_last_modified()
result = []
for playlist_ref in context.core.playlists.as_list().get():
if not playlist_ref.name:
continue
name = context.lookup_playlist_name_from_uri(playlist_ref.uri)
result.append(('playlist', name))
result.append(('Last-Modified', last_modified))
return result
# TODO: move to translators?
def _get_last_modified(last_modified=None):
"""Formats last modified timestamp of a playlist for MPD.
Time in UTC with second precision, formatted in the ISO 8601 format, with
the "Z" time zone marker for UTC. For example, "1970-01-01T00:00:00Z".
"""
if last_modified is None:
# If unknown, assume the playlist is modified
dt = datetime.datetime.utcnow()
else:
dt = datetime.datetime.utcfromtimestamp(last_modified / 1000.0)
dt = dt.replace(microsecond=0)
return '%sZ' % dt.isoformat()
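# Editorial worked example (not part of the original source): a playlist last
# modified at 1265422225000 ms after the Unix epoch (2010-02-06 02:10:25 UTC)
# is formatted as '2010-02-06T02:10:25Z', matching the listplaylists sample
# output above; when the timestamp is unknown, the current UTC time is used.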
@protocol.commands.add('load', playlist_slice=protocol.RANGE)
def load(context, name, playlist_slice=slice(0, None)):
"""
*musicpd.org, stored playlists section:*
``load {NAME} [START:END]``
Loads the playlist into the current queue. Playlist plugins are
supported. A range may be specified to load only a part of the
playlist.
*Clarifications:*
- ``load`` appends the given playlist to the current playlist.
- MPD 0.17.1 does not support open-ended ranges, i.e. without end
specified, for the ``load`` command, even though MPD's general range docs
allows open-ended ranges.
- MPD 0.17.1 does not fail if the specified range is outside the playlist,
in either or both ends.
"""
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')
with warnings.catch_warnings():
warnings.filterwarnings('ignore', 'tracklist.add.*"tracks".*')
context.core.tracklist.add(playlist.tracks[playlist_slice]).get()
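# Editorial example (not part of the original source): ``load "party" "0:10"``
# looks up the playlist named "party" and appends only its first ten tracks
# (playlist_slice becomes slice(0, 10)) to the tracklist, while a plain
# ``load "party"`` appends the whole playlist via the default slice(0, None).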
@protocol.commands.add('playlistadd')
def playlistadd(context, name, track_uri):
"""
*musicpd.org, stored playlists section:*
``playlistadd {NAME} {URI}``
Adds ``URI`` to the playlist ``NAME.m3u``.
``NAME.m3u`` will be created if it does not exist.
"""
_check_playlist_name(name)
uri = context.lookup_playlist_uri_from_name(name)
old_playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not old_playlist:
# Create new playlist with this single track
lookup_res = context.core.library.lookup(uris=[track_uri]).get()
tracks = [
track
for uri_tracks in lookup_res.values()
for track in uri_tracks]
_create_playlist(context, name, tracks)
else:
# Add track to existing playlist
lookup_res = context.core.library.lookup(uris=[track_uri]).get()
new_tracks = [
track
for uri_tracks in lookup_res.values()
for track in uri_tracks]
new_playlist = old_playlist.replace(
tracks=list(old_playlist.tracks) + new_tracks)
saved_playlist = context.core.playlists.save(new_playlist).get()
if saved_playlist is None:
playlist_scheme = urllib.parse.urlparse(old_playlist.uri).scheme
uri_scheme = urllib.parse.urlparse(track_uri).scheme
raise exceptions.MpdInvalidTrackForPlaylist(
playlist_scheme, uri_scheme)
def _create_playlist(context, name, tracks):
"""
Creates new playlist using backend appropriate for the given tracks
"""
uri_schemes = set([urllib.parse.urlparse(t.uri).scheme for t in tracks])
for scheme in uri_schemes:
new_playlist = context.core.playlists.create(name, scheme).get()
if new_playlist is None:
logger.debug(
"Backend for scheme %s can't create playlists", scheme)
continue # Backend can't create playlists at all
new_playlist = new_playlist.replace(tracks=tracks)
saved_playlist = context.core.playlists.save(new_playlist).get()
if saved_playlist is not None:
return # Created and saved
else:
continue # Failed to save using this backend
# Can't use backend appropriate for passed URI schemes, use default one
default_scheme = context.dispatcher.config[
'mpd']['default_playlist_scheme']
new_playlist = context.core.playlists.create(name, default_scheme).get()
if new_playlist is None:
# If even the backend for the default playlist scheme can't save it, give up
logger.warning("MPD's default backend can't create playlists")
raise exceptions.MpdFailedToSavePlaylist(default_scheme)
new_playlist = new_playlist.replace(tracks=tracks)
saved_playlist = context.core.playlists.save(new_playlist).get()
if saved_playlist is None:
uri_scheme = urllib.parse.urlparse(new_playlist.uri).scheme
raise exceptions.MpdFailedToSavePlaylist(uri_scheme)
@protocol.commands.add('playlistclear')
def playlistclear(context, name):
"""
*musicpd.org, stored playlists section:*
``playlistclear {NAME}``
Clears the playlist ``NAME.m3u``.
The playlist will be created if it does not exist.
"""
_check_playlist_name(name)
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
playlist = context.core.playlists.create(name).get()
# Just replace tracks with empty list and save
playlist = playlist.replace(tracks=[])
if context.core.playlists.save(playlist).get() is None:
raise exceptions.MpdFailedToSavePlaylist(
urllib.parse.urlparse(uri).scheme)
@protocol.commands.add('playlistdelete', songpos=protocol.UINT)
def playlistdelete(context, name, songpos):
"""
*musicpd.org, stored playlists section:*
``playlistdelete {NAME} {SONGPOS}``
Deletes ``SONGPOS`` from the playlist ``NAME.m3u``.
"""
_check_playlist_name(name)
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')
try:
# Convert tracks to list and remove requested
tracks = list(playlist.tracks)
tracks.pop(songpos)
except IndexError:
raise exceptions.MpdArgError('Bad song index')
# Replace tracks and save playlist
playlist = playlist.replace(tracks=tracks)
saved_playlist = context.core.playlists.save(playlist).get()
if saved_playlist is None:
raise exceptions.MpdFailedToSavePlaylist(
urllib.parse.urlparse(uri).scheme)
@protocol.commands.add(
'playlistmove', from_pos=protocol.UINT, to_pos=protocol.UINT)
def playlistmove(context, name, from_pos, to_pos):
"""
*musicpd.org, stored playlists section:*
``playlistmove {NAME} {SONGID} {SONGPOS}``
Moves ``SONGID`` in the playlist ``NAME.m3u`` to the position
``SONGPOS``.
*Clarifications:*
- The second argument is not a ``SONGID`` as used elsewhere in the protocol
documentation, but just the ``SONGPOS`` to move *from*, i.e.
``playlistmove {NAME} {FROM_SONGPOS} {TO_SONGPOS}``.
"""
if from_pos == to_pos:
return
_check_playlist_name(name)
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')
try:
# Convert tracks to list and perform move
tracks = list(playlist.tracks)
track = tracks.pop(from_pos)
tracks.insert(to_pos, track)
except IndexError:
raise exceptions.MpdArgError('Bad song index')
# Replace tracks and save playlist
playlist = playlist.replace(tracks=tracks)
saved_playlist = context.core.playlists.save(playlist).get()
if saved_playlist is None:
raise exceptions.MpdFailedToSavePlaylist(
urllib.parse.urlparse(uri).scheme)
@protocol.commands.add('rename')
def rename(context, old_name, new_name):
"""
*musicpd.org, stored playlists section:*
``rename {NAME} {NEW_NAME}``
Renames the playlist ``NAME.m3u`` to ``NEW_NAME.m3u``.
"""
_check_playlist_name(old_name)
_check_playlist_name(new_name)
old_uri = context.lookup_playlist_uri_from_name(old_name)
if not old_uri:
raise exceptions.MpdNoExistError('No such playlist')
old_playlist = context.core.playlists.lookup(old_uri).get()
if not old_playlist:
raise exceptions.MpdNoExistError('No such playlist')
new_uri = context.lookup_playlist_uri_from_name(new_name)
if new_uri and context.core.playlists.lookup(new_uri).get():
raise exceptions.MpdExistError('Playlist already exists')
# TODO: should we purge the mapping in an else?
# Create copy of the playlist and remove original
uri_scheme = urllib.parse.urlparse(old_uri).scheme
new_playlist = context.core.playlists.create(new_name, uri_scheme).get()
new_playlist = new_playlist.replace(tracks=old_playlist.tracks)
saved_playlist = context.core.playlists.save(new_playlist).get()
if saved_playlist is None:
raise exceptions.MpdFailedToSavePlaylist(uri_scheme)
context.core.playlists.delete(old_playlist.uri).get()
@protocol.commands.add('rm')
def rm(context, name):
"""
*musicpd.org, stored playlists section:*
``rm {NAME}``
Removes the playlist ``NAME.m3u`` from the playlist directory.
"""
_check_playlist_name(name)
uri = context.lookup_playlist_uri_from_name(name)
if not uri:
raise exceptions.MpdNoExistError('No such playlist')
context.core.playlists.delete(uri).get()
@protocol.commands.add('save')
def save(context, name):
"""
*musicpd.org, stored playlists section:*
``save {NAME}``
Saves the current playlist to ``NAME.m3u`` in the playlist
directory.
"""
_check_playlist_name(name)
tracks = context.core.tracklist.get_tracks().get()
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
# Create new playlist
_create_playlist(context, name, tracks)
else:
# Overwrite existing playlist
new_playlist = playlist.replace(tracks=tracks)
saved_playlist = context.core.playlists.save(new_playlist).get()
if saved_playlist is None:
raise exceptions.MpdFailedToSavePlaylist(
urllib.parse.urlparse(uri).scheme)
Mopidy-2.0.0/mopidy/mpd/protocol/mount.py
from __future__ import absolute_import, unicode_literals
from mopidy.mpd import exceptions, protocol
@protocol.commands.add('mount')
def mount(context, path, uri):
"""
*musicpd.org, mounts and neighbors section:*
``mount {PATH} {URI}``
Mount the specified remote storage URI at the given path. Example::
mount foo nfs://192.168.1.4/export/mp3
.. versionadded:: 0.19
New in MPD protocol version 0.19
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('unmount')
def unmount(context, path):
"""
*musicpd.org, mounts and neighbors section:*
``unmount {PATH}``
Unmounts the specified path. Example::
unmount foo
.. versionadded:: 0.19
New in MPD protocol version 0.19
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('listmounts')
def listmounts(context):
"""
*musicpd.org, mounts and neighbors section:*
``listmounts``
Queries a list of all mounts. By default, this contains just the
configured music_directory. Example::
listmounts
mount:
storage: /home/foo/music
mount: foo
storage: nfs://192.168.1.4/export/mp3
OK
.. versionadded:: 0.19
New in MPD protocol version 0.19
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('listneighbors')
def listneighbors(context):
"""
*musicpd.org, mounts and neighbors section:*
``listneighbors``
Queries a list of "neighbors" (e.g. accessible file servers on the
local net). Items on that list may be used with the mount command.
Example::
listneighbors
neighbor: smb://FOO
name: FOO (Samba 4.1.11-Debian)
OK
.. versionadded:: 0.19
New in MPD protocol version 0.19
"""
raise exceptions.MpdNotImplemented # TODO
Mopidy-2.0.0/mopidy/mpd/protocol/connection.py
from __future__ import absolute_import, unicode_literals
from mopidy.mpd import exceptions, protocol
@protocol.commands.add('close', auth_required=False)
def close(context):
"""
*musicpd.org, connection section:*
``close``
Closes the connection to MPD.
"""
context.session.close()
@protocol.commands.add('kill', list_command=False)
def kill(context):
"""
*musicpd.org, connection section:*
``kill``
Kills MPD.
"""
raise exceptions.MpdPermissionError(command='kill')
@protocol.commands.add('password', auth_required=False)
def password(context, password):
"""
*musicpd.org, connection section:*
``password {PASSWORD}``
This is used for authentication with the server. ``PASSWORD`` is
simply the plaintext password.
"""
if password == context.password:
context.dispatcher.authenticated = True
else:
raise exceptions.MpdPasswordError('incorrect password')
@protocol.commands.add('ping', auth_required=False)
def ping(context):
"""
*musicpd.org, connection section:*
``ping``
Does nothing but return ``OK``.
"""
pass
Mopidy-2.0.0/mopidy/mpd/protocol/channels.py
from __future__ import absolute_import, unicode_literals
from mopidy.mpd import exceptions, protocol
@protocol.commands.add('subscribe')
def subscribe(context, channel):
"""
*musicpd.org, client to client section:*
``subscribe {NAME}``
Subscribe to a channel. The channel is created if it does not exist
already. The name may consist of alphanumeric ASCII characters plus
underscore, dash, dot and colon.
"""
# TODO: match channel against [A-Za-z0-9:._-]+
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('unsubscribe')
def unsubscribe(context, channel):
"""
*musicpd.org, client to client section:*
``unsubscribe {NAME}``
Unsubscribe from a channel.
"""
# TODO: match channel against [A-Za-z0-9:._-]+
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('channels')
def channels(context):
"""
*musicpd.org, client to client section:*
``channels``
Obtain a list of all channels. The response is a list of "channel:"
lines.
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('readmessages')
def readmessages(context):
"""
*musicpd.org, client to client section:*
``readmessages``
Reads messages for this client. The response is a list of "channel:"
and "message:" lines.
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('sendmessage')
def sendmessage(context, channel, text):
"""
*musicpd.org, client to client section:*
``sendmessage {CHANNEL} {TEXT}``
Send a message to the specified channel.
"""
# TODO: match channel against [A-Za-z0-9:._-]+
raise exceptions.MpdNotImplemented # TODO
Mopidy-2.0.0/mopidy/mpd/protocol/music_db.py
from __future__ import absolute_import, unicode_literals
import functools
import itertools
from mopidy.internal import deprecation
from mopidy.models import Track
from mopidy.mpd import exceptions, protocol, translator
_SEARCH_MAPPING = {
'album': 'album',
'albumartist': 'albumartist',
'any': 'any',
'artist': 'artist',
'comment': 'comment',
'composer': 'composer',
'date': 'date',
'file': 'uri',
'filename': 'uri',
'genre': 'genre',
'performer': 'performer',
'title': 'track_name',
'track': 'track_no'}
_LIST_MAPPING = {
'title': 'track',
'album': 'album',
'albumartist': 'albumartist',
'artist': 'artist',
'composer': 'composer',
'date': 'date',
'genre': 'genre',
'performer': 'performer'}
_LIST_NAME_MAPPING = {
'track': 'Title',
'album': 'Album',
'albumartist': 'AlbumArtist',
'artist': 'Artist',
'composer': 'Composer',
'date': 'Date',
'genre': 'Genre',
'performer': 'Performer'}
def _query_from_mpd_search_parameters(parameters, mapping):
query = {}
parameters = list(parameters)
while parameters:
# TODO: does it matter that this is now case insensitive
field = mapping.get(parameters.pop(0).lower())
if not field:
raise exceptions.MpdArgError('incorrect arguments')
if not parameters:
raise ValueError
value = parameters.pop(0)
if value.strip():
query.setdefault(field, []).append(value)
return query
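# Editorial example (not part of the original source): with _SEARCH_MAPPING,
# the token list ['Artist', 'ABBA', 'album', 'Gold: Greatest Hits'] becomes
# the query {'artist': ['ABBA'], 'album': ['Gold: Greatest Hits']}. An
# unknown field name raises MpdArgError, and a field without a value raises
# ValueError, which the callers turn into an MPD error or an empty response.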
def _get_field(field, search_results):
return list(itertools.chain(*[getattr(r, field) for r in search_results]))
_get_albums = functools.partial(_get_field, 'albums')
_get_artists = functools.partial(_get_field, 'artists')
_get_tracks = functools.partial(_get_field, 'tracks')
def _album_as_track(album):
return Track(
uri=album.uri,
name='Album: ' + album.name,
artists=album.artists,
album=album,
date=album.date)
def _artist_as_track(artist):
return Track(
uri=artist.uri,
name='Artist: ' + artist.name,
artists=[artist])
@protocol.commands.add('count')
def count(context, *args):
"""
*musicpd.org, music database section:*
``count {TAG} {NEEDLE}``
Counts the number of songs and their total playtime in the db
matching ``TAG`` exactly.
*GMPC:*
- use multiple tag-needle pairs to make more specific searches.
"""
try:
query = _query_from_mpd_search_parameters(args, _SEARCH_MAPPING)
except ValueError:
raise exceptions.MpdArgError('incorrect arguments')
results = context.core.library.search(query=query, exact=True).get()
result_tracks = _get_tracks(results)
return [
('songs', len(result_tracks)),
('playtime', sum(t.length for t in result_tracks if t.length) / 1000),
]
@protocol.commands.add('find')
def find(context, *args):
"""
*musicpd.org, music database section:*
``find {TYPE} {WHAT}``
Finds songs in the db that are exactly ``WHAT``. ``TYPE`` can be any
tag supported by MPD, or one of the two special parameters - ``file``
to search by full path (relative to database root), and ``any`` to
match against all available tags. ``WHAT`` is what to find.
*GMPC:*
- also uses ``find album "[ALBUM]" artist "[ARTIST]"`` to list album
tracks.
*ncmpc:*
- capitalizes the type argument.
*ncmpcpp:*
- also uses the search type "date".
- uses "file" instead of "filename".
"""
try:
query = _query_from_mpd_search_parameters(args, _SEARCH_MAPPING)
except ValueError:
return
with deprecation.ignore('core.library.search:empty_query'):
results = context.core.library.search(query=query, exact=True).get()
result_tracks = []
if ('artist' not in query and
'albumartist' not in query and
'composer' not in query and
'performer' not in query):
result_tracks += [_artist_as_track(a) for a in _get_artists(results)]
if 'album' not in query:
result_tracks += [_album_as_track(a) for a in _get_albums(results)]
result_tracks += _get_tracks(results)
return translator.tracks_to_mpd_format(result_tracks)
@protocol.commands.add('findadd')
def findadd(context, *args):
"""
*musicpd.org, music database section:*
``findadd {TYPE} {WHAT}``
Finds songs in the db that are exactly ``WHAT`` and adds them to
current playlist. Parameters have the same meaning as for ``find``.
"""
try:
query = _query_from_mpd_search_parameters(args, _SEARCH_MAPPING)
except ValueError:
return
results = context.core.library.search(query=query, exact=True).get()
with deprecation.ignore('core.tracklist.add:tracks_arg'):
# TODO: for now just use tracks, as otherwise we would have to look up
# the tracks we just got from the search.
context.core.tracklist.add(tracks=_get_tracks(results)).get()
@protocol.commands.add('list')
def list_(context, *args):
"""
*musicpd.org, music database section:*
``list {TYPE} [ARTIST]``
Lists all tags of the specified type. ``TYPE`` should be ``album``,
``artist``, ``albumartist``, ``date``, or ``genre``.
``ARTIST`` is an optional parameter when type is ``album``,
``date``, or ``genre``. This filters the result list by an artist.
*Clarifications:*
The musicpd.org documentation for ``list`` is far from complete. The
command also supports the following variant:
``list {TYPE} {QUERY}``
Where ``QUERY`` applies to all ``TYPE``. ``QUERY`` is one or more pairs
of a field name and a value. If the ``QUERY`` consists of more than one
pair, the pairs are AND-ed together to find the result. Examples of
valid queries and what they should return:
``list "artist" "artist" "ABBA"``
List artists where the artist name is "ABBA". Response::
Artist: ABBA
OK
``list "album" "artist" "ABBA"``
Lists albums where the artist name is "ABBA". Response::
Album: More ABBA Gold: More ABBA Hits
Album: Absolute More Christmas
Album: Gold: Greatest Hits
OK
``list "artist" "album" "Gold: Greatest Hits"``
Lists artists where the album name is "Gold: Greatest Hits".
Response::
Artist: ABBA
OK
``list "artist" "artist" "ABBA" "artist" "TLC"``
Lists artists where the artist name is "ABBA" *and* "TLC". Should
never match anything. Response::
OK
``list "date" "artist" "ABBA"``
Lists dates where artist name is "ABBA". Response::
Date:
Date: 1992
Date: 1993
OK
``list "date" "artist" "ABBA" "album" "Gold: Greatest Hits"``
Lists dates where artist name is "ABBA" and album name is "Gold:
Greatest Hits". Response::
Date: 1992
OK
``list "genre" "artist" "The Rolling Stones"``
Lists genres where artist name is "The Rolling Stones". Response::
Genre:
Genre: Rock
OK
*ncmpc:*
- capitalizes the field argument.
"""
params = list(args)
if not params:
raise exceptions.MpdArgError('incorrect arguments')
field = params.pop(0).lower()
field = _LIST_MAPPING.get(field)
if field is None:
raise exceptions.MpdArgError('incorrect arguments')
query = None
if len(params) == 1:
if field != 'album':
raise exceptions.MpdArgError('should be "Album" for 3 arguments')
if params[0].strip():
query = {'artist': params}
else:
try:
query = _query_from_mpd_search_parameters(params, _LIST_MAPPING)
except exceptions.MpdArgError as e:
e.message = 'not able to parse args'
raise
except ValueError:
return
name = _LIST_NAME_MAPPING[field]
result = context.core.library.get_distinct(field, query)
return [(name, value) for value in result.get()]
@protocol.commands.add('listall')
def listall(context, uri=None):
"""
*musicpd.org, music database section:*
``listall [URI]``
Lists all songs and directories in ``URI``.
Do not use this command. Do not manage a client-side copy of MPD's
database. That is fragile and adds huge overhead. It will break with
large databases. Instead, query MPD whenever you need something.
.. warning:: This command is disabled by default in Mopidy installs.
"""
result = []
for path, track_ref in context.browse(uri, lookup=False):
if not track_ref:
result.append(('directory', path))
else:
result.append(('file', track_ref.uri))
if not result:
raise exceptions.MpdNoExistError('Not found')
return result
@protocol.commands.add('listallinfo')
def listallinfo(context, uri=None):
"""
*musicpd.org, music database section:*
``listallinfo [URI]``
Same as ``listall``, except it also returns metadata info in the
same format as ``lsinfo``.
Do not use this command. Do not manage a client-side copy of MPD's
database. That is fragile and adds huge overhead. It will break with
large databases. Instead, query MPD whenever you need something.
.. warning:: This command is disabled by default in Mopidy installs.
"""
result = []
for path, lookup_future in context.browse(uri):
if not lookup_future:
result.append(('directory', path))
else:
for tracks in lookup_future.get().values():
for track in tracks:
result.extend(translator.track_to_mpd_format(track))
return result
@protocol.commands.add('listfiles')
def listfiles(context, uri=None):
"""
*musicpd.org, music database section:*
``listfiles [URI]``
Lists the contents of the directory URI, including files that are not
recognized by MPD. URI can be a path relative to the music directory or
an URI understood by one of the storage plugins. The response contains
at least one line for each directory entry with the prefix "file: " or
"directory: ", and may be followed by file attributes such as
"Last-Modified" and "size".
For example, "smb://SERVER" returns a list of all shares on the given
SMB/CIFS server; "nfs://servername/path" obtains a directory listing
from the NFS server.
.. versionadded:: 0.19
New in MPD protocol version 0.19
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('lsinfo')
def lsinfo(context, uri=None):
"""
*musicpd.org, music database section:*
``lsinfo [URI]``
Lists the contents of the directory ``URI``.
When listing the root directory, this currently returns the list of
stored playlists. This behavior is deprecated; use
``listplaylists`` instead.
MPD returns the same result, including both playlists and the files and
directories located at the root level, for ``lsinfo``, ``lsinfo ""``, and
``lsinfo "/"`` alike.
"""
result = []
for path, lookup_future in context.browse(uri, recursive=False):
if not lookup_future:
result.append(('directory', path.lstrip('/')))
else:
for tracks in lookup_future.get().values():
if tracks:
result.extend(translator.track_to_mpd_format(tracks[0]))
if uri in (None, '', '/'):
result.extend(protocol.stored_playlists.listplaylists(context))
return result
@protocol.commands.add('rescan')
def rescan(context, uri=None):
"""
*musicpd.org, music database section:*
``rescan [URI]``
Same as ``update``, but also rescans unmodified files.
"""
return {'updating_db': 0} # TODO
@protocol.commands.add('search')
def search(context, *args):
"""
*musicpd.org, music database section:*
``search {TYPE} {WHAT} [...]``
Searches for any song that contains ``WHAT``. Parameters have the same
meaning as for ``find``, except that search is not case sensitive.
*GMPC:*
- uses the undocumented field ``any``.
- searches for multiple words like this::
search any "foo" any "bar" any "baz"
*ncmpc:*
- capitalizes the field argument.
*ncmpcpp:*
- also uses the search type "date".
- uses "file" instead of "filename".
"""
try:
query = _query_from_mpd_search_parameters(args, _SEARCH_MAPPING)
except ValueError:
return
with deprecation.ignore('core.library.search:empty_query'):
results = context.core.library.search(query).get()
artists = [_artist_as_track(a) for a in _get_artists(results)]
albums = [_album_as_track(a) for a in _get_albums(results)]
tracks = _get_tracks(results)
return translator.tracks_to_mpd_format(artists + albums + tracks)
@protocol.commands.add('searchadd')
def searchadd(context, *args):
"""
*musicpd.org, music database section:*
``searchadd {TYPE} {WHAT} [...]``
Searches for any song that contains ``WHAT`` in tag ``TYPE`` and adds
them to current playlist.
Parameters have the same meaning as for ``find``, except that search is
not case sensitive.
"""
try:
query = _query_from_mpd_search_parameters(args, _SEARCH_MAPPING)
except ValueError:
return
results = context.core.library.search(query).get()
with deprecation.ignore('core.tracklist.add:tracks_arg'):
# TODO: for now just use tracks, as otherwise we would have to look up
# the tracks we just got from the search.
context.core.tracklist.add(_get_tracks(results)).get()
@protocol.commands.add('searchaddpl')
def searchaddpl(context, *args):
"""
*musicpd.org, music database section:*
``searchaddpl {NAME} {TYPE} {WHAT} [...]``
Searches for any song that contains ``WHAT`` in tag ``TYPE`` and adds
them to the playlist named ``NAME``.
If a playlist by that name doesn't exist it is created.
Parameters have the same meaning as for ``find``, except that search is
not case sensitive.
"""
parameters = list(args)
if not parameters:
raise exceptions.MpdArgError('incorrect arguments')
playlist_name = parameters.pop(0)
try:
query = _query_from_mpd_search_parameters(parameters, _SEARCH_MAPPING)
except ValueError:
return
results = context.core.library.search(query).get()
uri = context.lookup_playlist_uri_from_name(playlist_name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
playlist = context.core.playlists.create(playlist_name).get()
tracks = list(playlist.tracks) + _get_tracks(results)
playlist = playlist.replace(tracks=tracks)
context.core.playlists.save(playlist)
@protocol.commands.add('update')
def update(context, uri=None):
"""
*musicpd.org, music database section:*
``update [URI]``
Updates the music database: find new files, remove deleted files,
update modified files.
``URI`` is a particular directory or song/file to update. If you do
not specify it, everything is updated.
Prints ``updating_db: JOBID`` where ``JOBID`` is a positive number
identifying the update job. You can read the current job id in the
``status`` response.
"""
return {'updating_db': 0} # TODO
# TODO: add at least reflection tests before adding NotImplemented version
# @protocol.commands.add('readcomments')
def readcomments(context, uri):
"""
*musicpd.org, music database section:*
``readcomments [URI]``
Read "comments" (i.e. key-value pairs) from the file specified by
"URI". This "URI" can be a path relative to the music directory or a
URL in the form "file:///foo/bar.ogg".
This command may be used to list metadata of remote files (e.g. URI
beginning with "http://" or "smb://").
The response consists of lines in the form "KEY: VALUE". Comments with
suspicious characters (e.g. newlines) are ignored silently.
The meaning of these depends on the codec, and not all decoder plugins
support it. For example, on Ogg files, this lists the Vorbis comments.
"""
pass
Mopidy-2.0.0/mopidy/mpd/protocol/stickers.py
from __future__ import absolute_import, unicode_literals
from mopidy.mpd import exceptions, protocol
@protocol.commands.add('sticker', list_command=False)
def sticker(context, action, field, uri, name=None, value=None):
"""
*musicpd.org, sticker section:*
``sticker list {TYPE} {URI}``
Lists the stickers for the specified object.
``sticker find {TYPE} {URI} {NAME}``
Searches the sticker database for stickers with the specified name,
below the specified directory (``URI``). For each matching song, it
prints the ``URI`` and that one sticker's value.
``sticker get {TYPE} {URI} {NAME}``
Reads a sticker value for the specified object.
``sticker set {TYPE} {URI} {NAME} {VALUE}``
Adds a sticker value to the specified object. If a sticker item
with that name already exists, it is replaced.
``sticker delete {TYPE} {URI} [NAME]``
Deletes a sticker value from the specified object. If you do not
specify a sticker name, all sticker values are deleted.
"""
# TODO: check that action in ('list', 'find', 'get', 'set', 'delete')
# TODO: check name/value matches with action
raise exceptions.MpdNotImplemented # TODO
Mopidy-2.0.0/mopidy/mpd/protocol/status.py
from __future__ import absolute_import, unicode_literals
import pykka
from mopidy.core import PlaybackState
from mopidy.mpd import exceptions, protocol, translator
#: Subsystems that can be registered with idle command.
SUBSYSTEMS = [
'database', 'mixer', 'options', 'output', 'player', 'playlist',
'stored_playlist', 'update']
@protocol.commands.add('clearerror')
def clearerror(context):
"""
*musicpd.org, status section:*
``clearerror``
Clears the current error message in status (this is also
accomplished by any command that starts playback).
"""
raise exceptions.MpdNotImplemented # TODO
@protocol.commands.add('currentsong')
def currentsong(context):
"""
*musicpd.org, status section:*
``currentsong``
Displays the song info of the current song (same song that is
identified in status).
"""
tl_track = context.core.playback.get_current_tl_track().get()
stream_title = context.core.playback.get_stream_title().get()
if tl_track is not None:
position = context.core.tracklist.index(tl_track).get()
return translator.track_to_mpd_format(
tl_track, position=position, stream_title=stream_title)
@protocol.commands.add('idle', list_command=False)
def idle(context, *subsystems):
"""
*musicpd.org, status section:*
``idle [SUBSYSTEMS...]``
Waits until there is a noteworthy change in one or more of MPD's
subsystems. As soon as there is one, it lists all changed systems
in a line in the format ``changed: SUBSYSTEM``, where ``SUBSYSTEM``
is one of the following:
- ``database``: the song database has been modified after update.
- ``update``: a database update has started or finished. If the
database was modified during the update, the database event is
also emitted.
- ``stored_playlist``: a stored playlist has been modified,
renamed, created or deleted
- ``playlist``: the current playlist has been modified
- ``player``: the player has been started, stopped or seeked
- ``mixer``: the volume has been changed
- ``output``: an audio output has been enabled or disabled
- ``options``: options like repeat, random, crossfade, replay gain
While a client is waiting for idle results, the server disables
timeouts, allowing a client to wait for events as long as MPD runs.
The idle command can be canceled by sending the command ``noidle``
(no other commands are allowed). MPD will then leave idle mode and
print results immediately; might be empty at this time.
If the optional ``SUBSYSTEMS`` argument is used, MPD will only send
notifications when something changed in one of the specified
subsystems.
"""
# TODO: test against valid subsystems
if not subsystems:
subsystems = SUBSYSTEMS
for subsystem in subsystems:
context.subscriptions.add(subsystem)
active = context.subscriptions.intersection(context.events)
if not active:
context.session.prevent_timeout = True
return
response = []
context.events = set()
context.subscriptions = set()
for subsystem in active:
response.append('changed: %s' % subsystem)
return response
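# Editorial protocol sketch (hypothetical session, not part of the original
# source): a client sending ``idle player mixer`` gets no reply until one of
# those subsystems changes; after a volume change the server answers:
#
#     changed: mixer
#     OK
#
# Sending ``noidle`` while waiting clears the subscriptions and ends the idle
# state without reporting any events (see noidle below).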
@protocol.commands.add('noidle', list_command=False)
def noidle(context):
"""See :meth:`_status_idle`."""
if not context.subscriptions:
return
context.subscriptions = set()
context.events = set()
context.session.prevent_timeout = False
@protocol.commands.add('stats')
def stats(context):
"""
*musicpd.org, status section:*
``stats``
Displays statistics.
- ``artists``: number of artists
- ``albums``: number of albums
- ``songs``: number of songs
- ``uptime``: daemon uptime in seconds
- ``db_playtime``: sum of all song times in the db
- ``db_update``: last db update in UNIX time
- ``playtime``: time length of music played
"""
return {
'artists': 0, # TODO
'albums': 0, # TODO
'songs': 0, # TODO
'uptime': 0, # TODO
'db_playtime': 0, # TODO
'db_update': 0, # TODO
'playtime': 0, # TODO
}
@protocol.commands.add('status')
def status(context):
"""
*musicpd.org, status section:*
``status``
Reports the current status of the player and the volume level.
- ``volume``: 0-100 or -1
- ``repeat``: 0 or 1
- ``single``: 0 or 1
- ``consume``: 0 or 1
- ``playlist``: 31-bit unsigned integer, the playlist version
number
- ``playlistlength``: integer, the length of the playlist
- ``state``: play, stop, or pause
- ``song``: playlist song number of the current song stopped on or
playing
- ``songid``: playlist songid of the current song stopped on or
playing
- ``nextsong``: playlist song number of the next song to be played
- ``nextsongid``: playlist songid of the next song to be played
- ``time``: total time elapsed (of current playing/paused song)
- ``elapsed``: Total time elapsed within the current song, but with
higher resolution.
- ``bitrate``: instantaneous bitrate in kbps
- ``xfade``: crossfade in seconds
- ``audio``: ``sampleRate:bits:channels``
- ``updating_db``: job id
- ``error``: if there is an error, returns message here
*Clarifications based on experience implementing*
- ``volume``: can also be -1 if no output is set.
- ``elapsed``: Higher resolution means time in seconds with three
decimal places for millisecond precision.
"""
tl_track = context.core.playback.get_current_tl_track()
futures = {
'tracklist.length': context.core.tracklist.get_length(),
'tracklist.version': context.core.tracklist.get_version(),
'mixer.volume': context.core.mixer.get_volume(),
'tracklist.consume': context.core.tracklist.get_consume(),
'tracklist.random': context.core.tracklist.get_random(),
'tracklist.repeat': context.core.tracklist.get_repeat(),
'tracklist.single': context.core.tracklist.get_single(),
'playback.state': context.core.playback.get_state(),
'playback.current_tl_track': tl_track,
'tracklist.index': context.core.tracklist.index(tl_track.get()),
'playback.time_position': context.core.playback.get_time_position(),
}
pykka.get_all(futures.values())
result = [
('volume', _status_volume(futures)),
('repeat', _status_repeat(futures)),
('random', _status_random(futures)),
('single', _status_single(futures)),
('consume', _status_consume(futures)),
('playlist', _status_playlist_version(futures)),
('playlistlength', _status_playlist_length(futures)),
('xfade', _status_xfade(futures)),
('state', _status_state(futures)),
]
# TODO: add nextsong and nextsongid
if futures['playback.current_tl_track'].get() is not None:
result.append(('song', _status_songpos(futures)))
result.append(('songid', _status_songid(futures)))
if futures['playback.state'].get() in (
PlaybackState.PLAYING, PlaybackState.PAUSED):
result.append(('time', _status_time(futures)))
result.append(('elapsed', _status_time_elapsed(futures)))
result.append(('bitrate', _status_bitrate(futures)))
return result
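# Editorial example (not part of the original source): while paused on the
# second track, the returned list might look like
# [('volume', 80), ('repeat', 0), ('random', 0), ('single', 0),
#  ('consume', 0), ('playlist', 4), ('playlistlength', 3), ('xfade', 0),
#  ('state', 'pause'), ('song', 1), ('songid', 2), ('time', '61:180'),
#  ('elapsed', '61.234'), ('bitrate', 320)],
# which is serialized as "key: value" lines in the MPD response.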
def _status_bitrate(futures):
current_tl_track = futures['playback.current_tl_track'].get()
if current_tl_track is None:
return 0
if current_tl_track.track.bitrate is None:
return 0
return current_tl_track.track.bitrate
def _status_consume(futures):
if futures['tracklist.consume'].get():
return 1
else:
return 0
def _status_playlist_length(futures):
return futures['tracklist.length'].get()
def _status_playlist_version(futures):
return futures['tracklist.version'].get()
def _status_random(futures):
return int(futures['tracklist.random'].get())
def _status_repeat(futures):
return int(futures['tracklist.repeat'].get())
def _status_single(futures):
return int(futures['tracklist.single'].get())
def _status_songid(futures):
current_tl_track = futures['playback.current_tl_track'].get()
if current_tl_track is not None:
return current_tl_track.tlid
else:
return _status_songpos(futures)
def _status_songpos(futures):
return futures['tracklist.index'].get()
def _status_state(futures):
state = futures['playback.state'].get()
if state == PlaybackState.PLAYING:
return 'play'
elif state == PlaybackState.STOPPED:
return 'stop'
elif state == PlaybackState.PAUSED:
return 'pause'
def _status_time(futures):
return '%d:%d' % (
futures['playback.time_position'].get() // 1000,
_status_time_total(futures) // 1000)
def _status_time_elapsed(futures):
return '%.3f' % (futures['playback.time_position'].get() / 1000.0)
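# Editorial worked example (not part of the original source): with a playback
# position of 61234 ms and a current track length of 180000 ms, _status_time
# returns '61:180' (elapsed and total in whole seconds) and
# _status_time_elapsed returns '61.234' (seconds with millisecond precision).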
def _status_time_total(futures):
current_tl_track = futures['playback.current_tl_track'].get()
if current_tl_track is None:
return 0
elif current_tl_track.track.length is None:
return 0
else:
return current_tl_track.track.length
def _status_volume(futures):
volume = futures['mixer.volume'].get()
if volume is not None:
return volume
else:
return -1
def _status_xfade(futures):
return 0 # Not supported
Mopidy-2.0.0/mopidy/mpd/protocol/command_list.py
from __future__ import absolute_import, unicode_literals
from mopidy.mpd import exceptions, protocol
@protocol.commands.add('command_list_begin', list_command=False)
def command_list_begin(context):
"""
*musicpd.org, command list section:*
To facilitate faster adding of files etc. you can pass a list of
commands all at once using a command list. The command list begins
with ``command_list_begin`` or ``command_list_ok_begin`` and ends
with ``command_list_end``.
It does not execute any commands until the list has ended. The
return value is whatever the return for a list of commands is. On
success for all commands, ``OK`` is returned. If a command fails,
no more commands are executed and the appropriate ``ACK`` error is
returned. If ``command_list_ok_begin`` is used, ``list_OK`` is
returned for each successful command executed in the command list.
"""
context.dispatcher.command_list_receiving = True
context.dispatcher.command_list_ok = False
context.dispatcher.command_list = []
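# Editorial protocol sketch (hypothetical session, not part of the original
# source): a client batching commands might send
#
#     command_list_ok_begin
#     clear
#     status
#     command_list_end
#
# Nothing is executed until ``command_list_end`` arrives; each successful
# command then contributes a ``list_OK`` line, and the whole batch ends with
# ``OK`` (or an ``ACK`` error for the first failing command).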
@protocol.commands.add('command_list_end', list_command=False)
def command_list_end(context):
"""See :meth:`command_list_begin()`."""
# TODO: batch consecutive add commands
if not context.dispatcher.command_list_receiving:
raise exceptions.MpdUnknownCommand(command='command_list_end')
context.dispatcher.command_list_receiving = False
(command_list, context.dispatcher.command_list) = (
context.dispatcher.command_list, [])
(command_list_ok, context.dispatcher.command_list_ok) = (
context.dispatcher.command_list_ok, False)
command_list_response = []
for index, command in enumerate(command_list):
response = context.dispatcher.handle_request(
command, current_command_list_index=index)
command_list_response.extend(response)
if (command_list_response and
command_list_response[-1].startswith('ACK')):
return command_list_response
if command_list_ok:
command_list_response.append('list_OK')
return command_list_response
@protocol.commands.add('command_list_ok_begin', list_command=False)
def command_list_ok_begin(context):
"""See :meth:`command_list_begin()`."""
context.dispatcher.command_list_receiving = True
context.dispatcher.command_list_ok = True
context.dispatcher.command_list = []
Mopidy-2.0.0/mopidy/__main__.py
from __future__ import absolute_import, print_function, unicode_literals
import logging
import os
import signal
import sys
from mopidy.internal.gi import Gst # noqa: Import to initialize
try:
# Make GObject's mainloop the event loop for python-dbus
import dbus.mainloop.glib
dbus.mainloop.glib.threads_init()
dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)
except ImportError:
pass
import pykka.debug
from mopidy import commands, config as config_lib, ext
from mopidy.internal import encoding, log, path, process, versioning
logger = logging.getLogger(__name__)
def main():
log.bootstrap_delayed_logging()
logger.info('Starting Mopidy %s', versioning.get_version())
signal.signal(signal.SIGTERM, process.sigterm_handler)
# Windows does not have signal.SIGUSR1
if hasattr(signal, 'SIGUSR1'):
signal.signal(signal.SIGUSR1, pykka.debug.log_thread_tracebacks)
try:
registry = ext.Registry()
root_cmd = commands.RootCommand()
config_cmd = commands.ConfigCommand()
deps_cmd = commands.DepsCommand()
root_cmd.set(extension=None, registry=registry)
root_cmd.add_child('config', config_cmd)
root_cmd.add_child('deps', deps_cmd)
extensions_data = ext.load_extensions()
for data in extensions_data:
if data.command: # TODO: check isinstance?
data.command.set(extension=data.extension)
root_cmd.add_child(data.extension.ext_name, data.command)
args = root_cmd.parse(sys.argv[1:])
config, config_errors = config_lib.load(
args.config_files,
[d.config_schema for d in extensions_data],
[d.config_defaults for d in extensions_data],
args.config_overrides)
create_core_dirs(config)
create_initial_config_file(args, extensions_data)
verbosity_level = args.base_verbosity_level
if args.verbosity_level:
verbosity_level += args.verbosity_level
log.setup_logging(config, verbosity_level, args.save_debug_log)
extensions = {
'validate': [], 'config': [], 'disabled': [], 'enabled': []}
for data in extensions_data:
extension = data.extension
# TODO: factor out all of this to a helper that can be tested
if not ext.validate_extension_data(data):
config[extension.ext_name] = {'enabled': False}
config_errors[extension.ext_name] = {
'enabled': 'extension disabled by self check.'}
extensions['validate'].append(extension)
elif not config[extension.ext_name]['enabled']:
config[extension.ext_name] = {'enabled': False}
config_errors[extension.ext_name] = {
'enabled': 'extension disabled by user config.'}
extensions['disabled'].append(extension)
elif config_errors.get(extension.ext_name):
config[extension.ext_name]['enabled'] = False
config_errors[extension.ext_name]['enabled'] = (
'extension disabled due to config errors.')
extensions['config'].append(extension)
else:
extensions['enabled'].append(extension)
log_extension_info([d.extension for d in extensions_data],
extensions['enabled'])
# Config and deps commands are simply special cased for now.
if args.command == config_cmd:
schemas = [d.config_schema for d in extensions_data]
return args.command.run(config, config_errors, schemas)
elif args.command == deps_cmd:
return args.command.run()
check_config_errors(config, config_errors, extensions)
if not extensions['enabled']:
logger.error('No extension enabled, exiting...')
sys.exit(1)
# Read-only config from here on, please.
proxied_config = config_lib.Proxy(config)
if args.extension and args.extension not in extensions['enabled']:
logger.error(
'Unable to run command provided by disabled extension %s',
args.extension.ext_name)
return 1
for extension in extensions['enabled']:
try:
extension.setup(registry)
except Exception:
# TODO: it would be nice to have a transactional registry. Sadly this is
# a bit tricky since our current API hands out a mutable list. We might,
# however, be able to replace it with a collections.Sequence to provide a
# read-only view.
logger.exception('Extension %s failed during setup, this might'
' have left the registry in a bad state.',
extension.ext_name)
# Anything that wants to exit after this point must use
# mopidy.internal.process.exit_process, since actors may already have been
# started.
try:
return args.command.run(args, proxied_config)
except NotImplementedError:
print(root_cmd.format_help())
return 1
except KeyboardInterrupt:
pass
except Exception as ex:
logger.exception(ex)
raise
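# Illustrative sketch, not part of the original file: how the command tree
# built in main() maps to the command line. The ``local scan`` example
# assumes the bundled Mopidy-Local extension is installed and enabled.
#
#     mopidy                 -> RootCommand (starts the server)
#     mopidy config          -> ConfigCommand.run(config, errors, schemas)
#     mopidy deps            -> DepsCommand.run()
#     mopidy <ext> <command> -> a command registered by an extension,
#                               e.g. ``mopidy local scan``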
def create_core_dirs(config):
path.get_or_create_dir(config['core']['cache_dir'])
path.get_or_create_dir(config['core']['config_dir'])
path.get_or_create_dir(config['core']['data_dir'])
def create_initial_config_file(args, extensions_data):
"""Initialize whatever the last config file is with defaults"""
config_file = args.config_files[-1]
if os.path.exists(path.expand_path(config_file)):
return
try:
default = config_lib.format_initial(extensions_data)
path.get_or_create_file(config_file, mkdir=False, content=default)
logger.info('Initialized %s with default config', config_file)
except IOError as error:
logger.warning(
'Unable to initialize %s with default config: %s',
config_file, encoding.locale_decode(error))
def log_extension_info(all_extensions, enabled_extensions):
# TODO: distinguish disabled vs blocked by env?
enabled_names = set(e.ext_name for e in enabled_extensions)
disabled_names = set(e.ext_name for e in all_extensions) - enabled_names
logger.info(
'Enabled extensions: %s', ', '.join(enabled_names) or 'none')
logger.info(
'Disabled extensions: %s', ', '.join(disabled_names) or 'none')
def check_config_errors(config, errors, extensions):
fatal_errors = []
extension_names = {}
all_extension_names = set()
for state in extensions:
extension_names[state] = set(e.ext_name for e in extensions[state])
all_extension_names.update(extension_names[state])
for section in sorted(errors):
if not errors[section]:
continue
if section not in all_extension_names:
logger.warning('Found fatal %s configuration errors:', section)
fatal_errors.append(section)
elif section in extension_names['config']:
del errors[section]['enabled']
logger.warning('Found %s configuration errors, the extension '
'has been automatically disabled:', section)
else:
continue
for field, msg in errors[section].items():
logger.warning(' %s/%s %s', section, field, msg)
if extensions['config']:
logger.warning('Please fix the extension configuration errors or '
'disable the extensions to silence these messages.')
if fatal_errors:
logger.error('Please fix fatal configuration errors, exiting...')
sys.exit(1)
if __name__ == '__main__':
sys.exit(main())
Mopidy-2.0.0/mopidy/http/ 0000775 0001750 0001750 00000000000 12660436443 015450 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/http/__init__.py 0000664 0001750 0001750 00000002665 12505224626 017566 0 ustar jodal jodal 0000000 0000000
from __future__ import absolute_import, unicode_literals
import logging
import os
import mopidy
from mopidy import config as config_lib, exceptions, ext
logger = logging.getLogger(__name__)
class Extension(ext.Extension):
dist_name = 'Mopidy-HTTP'
ext_name = 'http'
version = mopidy.__version__
def get_default_config(self):
conf_file = os.path.join(os.path.dirname(__file__), 'ext.conf')
return config_lib.read(conf_file)
def get_config_schema(self):
schema = super(Extension, self).get_config_schema()
schema['hostname'] = config_lib.Hostname()
schema['port'] = config_lib.Port()
schema['static_dir'] = config_lib.Path(optional=True)
schema['zeroconf'] = config_lib.String(optional=True)
return schema
def validate_environment(self):
try:
import tornado.web # noqa
except ImportError as e:
raise exceptions.ExtensionError('tornado library not found', e)
def setup(self, registry):
from .actor import HttpFrontend
from .handlers import make_mopidy_app_factory
HttpFrontend.apps = registry['http:app']
HttpFrontend.statics = registry['http:static']
registry.add('frontend', HttpFrontend)
registry.add('http:app', {
'name': 'mopidy',
'factory': make_mopidy_app_factory(
registry['http:app'], registry['http:static']),
})
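# Illustrative sketch, not part of this module: other extensions hook into
# the HTTP frontend the same way, by registering an ``http:app`` (a request
# handler factory) or an ``http:static`` (a directory of files) in their own
# setup(). The extension name and path below are hypothetical.
#
#     def setup(self, registry):
#         registry.add('http:static', {
#             'name': self.ext_name,
#             'path': os.path.join(os.path.dirname(__file__), 'static'),
#         })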
Mopidy-2.0.0/mopidy/http/handlers.py 0000664 0001750 0001750 00000016076 12575004517 017632 0 ustar jodal jodal 0000000 0000000
from __future__ import absolute_import, unicode_literals
import functools
import logging
import os
import socket
import tornado.escape
import tornado.ioloop
import tornado.web
import tornado.websocket
import mopidy
from mopidy import core, models
from mopidy.internal import encoding, jsonrpc
logger = logging.getLogger(__name__)
def make_mopidy_app_factory(apps, statics):
def mopidy_app_factory(config, core):
return [
(r'/ws/?', WebSocketHandler, {
'core': core,
}),
(r'/rpc', JsonRpcHandler, {
'core': core,
}),
(r'/(.+)', StaticFileHandler, {
'path': os.path.join(os.path.dirname(__file__), 'data'),
}),
(r'/', ClientListHandler, {
'apps': apps,
'statics': statics,
}),
]
return mopidy_app_factory
def make_jsonrpc_wrapper(core_actor):
inspector = jsonrpc.JsonRpcInspector(
objects={
'core.get_uri_schemes': core.Core.get_uri_schemes,
'core.get_version': core.Core.get_version,
'core.history': core.HistoryController,
'core.library': core.LibraryController,
'core.mixer': core.MixerController,
'core.playback': core.PlaybackController,
'core.playlists': core.PlaylistsController,
'core.tracklist': core.TracklistController,
})
return jsonrpc.JsonRpcWrapper(
objects={
'core.describe': inspector.describe,
'core.get_uri_schemes': core_actor.get_uri_schemes,
'core.get_version': core_actor.get_version,
'core.history': core_actor.history,
'core.library': core_actor.library,
'core.mixer': core_actor.mixer,
'core.playback': core_actor.playback,
'core.playlists': core_actor.playlists,
'core.tracklist': core_actor.tracklist,
},
decoders=[models.model_json_decoder],
encoders=[models.ModelJSONEncoder]
)
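# Illustrative sketch, not part of the original source: given the mapping
# above, a JSON-RPC 2.0 request posted to /rpc (or sent over the WebSocket)
# is dispatched to the matching core controller method. The track URI below
# is made up.
#
#     --> {"jsonrpc": "2.0", "id": 1,
#          "method": "core.tracklist.add",
#          "params": {"uris": ["local:track:example.mp3"]}}
#     <-- {"jsonrpc": "2.0", "id": 1, "result": [...]}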
def _send_broadcast(client, msg):
# We could check for client.ws_connection, but we don't really
# care why the broadcast failed, we just want the rest of them
# to succeed, so catch everything.
try:
client.write_message(msg)
except Exception as e:
error_msg = encoding.locale_decode(e)
logger.debug('Broadcast of WebSocket message to %s failed: %s',
client.request.remote_ip, error_msg)
# TODO: should this do the same cleanup as the on_message code?
class WebSocketHandler(tornado.websocket.WebSocketHandler):
# XXX This set is shared by all WebSocketHandler objects. This isn't
# optimal, but there's currently no use case for having more than one of
# these anyway.
clients = set()
@classmethod
def broadcast(cls, msg):
if hasattr(tornado.ioloop.IOLoop, 'current'):
loop = tornado.ioloop.IOLoop.current()
else:
loop = tornado.ioloop.IOLoop.instance() # Fallback for pre 3.0
# This can be called from outside the Tornado ioloop, so we need to
# safely cross the thread boundary by adding a callback to the loop.
for client in cls.clients:
# One callback per client keeps the time we hold up the loop short.
# NOTE: add_callback() before Tornado 3.0 does not support *args or **kwargs.
loop.add_callback(functools.partial(_send_broadcast, client, msg))
def initialize(self, core):
self.jsonrpc = make_jsonrpc_wrapper(core)
def open(self):
if hasattr(self, 'set_nodelay'):
# New in Tornado 3.1
self.set_nodelay(True)
else:
self.stream.socket.setsockopt(
socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
self.clients.add(self)
logger.debug(
'New WebSocket connection from %s', self.request.remote_ip)
def on_close(self):
self.clients.discard(self)
logger.debug(
'Closed WebSocket connection from %s',
self.request.remote_ip)
def on_message(self, message):
if not message:
return
logger.debug(
'Received WebSocket message from %s: %r',
self.request.remote_ip, message)
try:
response = self.jsonrpc.handle_json(
tornado.escape.native_str(message))
if response and self.write_message(response):
logger.debug(
'Sent WebSocket message to %s: %r',
self.request.remote_ip, response)
except Exception as e:
error_msg = encoding.locale_decode(e)
logger.error('WebSocket request error: %s', error_msg)
if self.ws_connection:
# Tornado 3.2+ checks if self.ws_connection is None before
# using it, but not older versions.
self.close()
def check_origin(self, origin):
# Allow cross-origin WebSocket connections, as Tornado did by default
# before 4.0.
return True
def set_mopidy_headers(request_handler):
request_handler.set_header('Cache-Control', 'no-cache')
request_handler.set_header(
'X-Mopidy-Version', mopidy.__version__.encode('utf-8'))
class JsonRpcHandler(tornado.web.RequestHandler):
def initialize(self, core):
self.jsonrpc = make_jsonrpc_wrapper(core)
def head(self):
self.set_extra_headers()
self.finish()
def post(self):
data = self.request.body
if not data:
return
logger.debug(
'Received RPC message from %s: %r', self.request.remote_ip, data)
try:
self.set_extra_headers()
response = self.jsonrpc.handle_json(
tornado.escape.native_str(data))
if response and self.write(response):
logger.debug(
'Sent RPC message to %s: %r',
self.request.remote_ip, response)
except Exception as e:
logger.error('HTTP JSON-RPC request error: %s', e)
self.write_error(500)
def set_extra_headers(self):
set_mopidy_headers(self)
self.set_header('Accept', 'application/json')
self.set_header('Content-Type', 'application/json; charset=utf-8')
class ClientListHandler(tornado.web.RequestHandler):
def initialize(self, apps, statics):
self.apps = apps
self.statics = statics
def get_template_path(self):
return os.path.dirname(__file__)
def get(self):
set_mopidy_headers(self)
names = set()
for app in self.apps:
names.add(app['name'])
for static in self.statics:
names.add(static['name'])
names.discard('mopidy')
self.render('data/clients.html', apps=sorted(list(names)))
class StaticFileHandler(tornado.web.StaticFileHandler):
def set_extra_headers(self, path):
set_mopidy_headers(self)
class AddSlashHandler(tornado.web.RequestHandler):
@tornado.web.addslash
def prepare(self):
return super(AddSlashHandler, self).prepare()
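# Illustrative sketch, not part of the original source: every response that
# goes through set_mopidy_headers() above carries headers roughly like:
#
#     Cache-Control: no-cache
#     X-Mopidy-Version: 2.0.0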
Mopidy-2.0.0/mopidy/http/ext.conf 0000664 0001750 0001750 00000000157 12441116635 017115 0 ustar jodal jodal 0000000 0000000
[http]
enabled = true
hostname = 127.0.0.1
port = 6680
static_dir =
zeroconf = Mopidy HTTP server on $hostname
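# Illustrative example, not part of the shipped ext.conf: to reach the web
# interface from other machines, a user would override these defaults in
# their own mopidy.conf, for instance:
#
#   [http]
#   hostname = ::
#   port = 6680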
Mopidy-2.0.0/mopidy/http/actor.py 0000664 0001750 0001750 00000013500 12660436420 017124 0 ustar jodal jodal 0000000 0000000
from __future__ import absolute_import, unicode_literals
import json
import logging
import os
import threading
import pykka
import tornado.httpserver
import tornado.ioloop
import tornado.netutil
import tornado.web
import tornado.websocket
from mopidy import exceptions, models, zeroconf
from mopidy.core import CoreListener
from mopidy.http import handlers
from mopidy.internal import encoding, formatting, network
logger = logging.getLogger(__name__)
class HttpFrontend(pykka.ThreadingActor, CoreListener):
apps = []
statics = []
def __init__(self, config, core):
super(HttpFrontend, self).__init__()
self.hostname = network.format_hostname(config['http']['hostname'])
self.port = config['http']['port']
tornado_hostname = config['http']['hostname']
if tornado_hostname == '::':
tornado_hostname = None
try:
logger.debug('Starting HTTP server')
sockets = tornado.netutil.bind_sockets(self.port, tornado_hostname)
self.server = HttpServer(
config=config, core=core, sockets=sockets,
apps=self.apps, statics=self.statics)
except IOError as error:
raise exceptions.FrontendError(
'HTTP server startup failed: %s' %
encoding.locale_decode(error))
self.zeroconf_name = config['http']['zeroconf']
self.zeroconf_http = None
self.zeroconf_mopidy_http = None
def on_start(self):
logger.info(
'HTTP server running at [%s]:%s', self.hostname, self.port)
self.server.start()
if self.zeroconf_name:
self.zeroconf_http = zeroconf.Zeroconf(
name=self.zeroconf_name,
stype='_http._tcp',
port=self.port)
self.zeroconf_mopidy_http = zeroconf.Zeroconf(
name=self.zeroconf_name,
stype='_mopidy-http._tcp',
port=self.port)
self.zeroconf_http.publish()
self.zeroconf_mopidy_http.publish()
def on_stop(self):
if self.zeroconf_http:
self.zeroconf_http.unpublish()
if self.zeroconf_mopidy_http:
self.zeroconf_mopidy_http.unpublish()
self.server.stop()
def on_event(self, name, **data):
on_event(name, **data)
def on_event(name, **data):
event = data
event['event'] = name
message = json.dumps(event, cls=models.ModelJSONEncoder)
handlers.WebSocketHandler.broadcast(message)
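# Illustrative sketch, not part of the original source: a CoreListener event
# such as track_playback_started reaches connected WebSocket clients as a
# JSON object with the event name folded into the payload, roughly:
#
#     {"event": "track_playback_started", "tl_track": {...}}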
class HttpServer(threading.Thread):
name = 'HttpServer'
def __init__(self, config, core, sockets, apps, statics):
super(HttpServer, self).__init__()
self.config = config
self.core = core
self.sockets = sockets
self.apps = apps
self.statics = statics
self.app = None
self.server = None
def run(self):
self.app = tornado.web.Application(self._get_request_handlers())
self.server = tornado.httpserver.HTTPServer(self.app)
self.server.add_sockets(self.sockets)
tornado.ioloop.IOLoop.instance().start()
logger.debug('Stopped HTTP server')
def stop(self):
logger.debug('Stopping HTTP server')
tornado.ioloop.IOLoop.instance().add_callback(
tornado.ioloop.IOLoop.instance().stop)
def _get_request_handlers(self):
request_handlers = []
request_handlers.extend(self._get_app_request_handlers())
request_handlers.extend(self._get_static_request_handlers())
request_handlers.extend(self._get_mopidy_request_handlers())
logger.debug(
'HTTP routes from extensions: %s',
formatting.indent('\n'.join(
'%r: %r' % (r[0], r[1]) for r in request_handlers)))
return request_handlers
def _get_app_request_handlers(self):
result = []
for app in self.apps:
try:
request_handlers = app['factory'](self.config, self.core)
except Exception:
logger.exception('Loading %s failed.', app['name'])
continue
result.append((
r'/%s' % app['name'],
handlers.AddSlashHandler
))
for handler in request_handlers:
handler = list(handler)
handler[0] = '/%s%s' % (app['name'], handler[0])
result.append(tuple(handler))
logger.debug('Loaded HTTP extension: %s', app['name'])
return result
def _get_static_request_handlers(self):
result = []
for static in self.statics:
result.append((
r'/%s' % static['name'],
handlers.AddSlashHandler
))
result.append((
r'/%s/(.*)' % static['name'],
handlers.StaticFileHandler,
{
'path': static['path'],
'default_filename': 'index.html'
}
))
logger.debug('Loaded static HTTP extension: %s', static['name'])
return result
def _get_mopidy_request_handlers(self):
# Serve either the bundled Mopidy web client or a user-defined static_dir
static_dir = self.config['http']['static_dir']
if static_dir and not os.path.exists(static_dir):
logger.warning(
'Configured http/static_dir %s does not exist. '
'Falling back to default HTTP handler.', static_dir)
static_dir = None
if static_dir:
return [(r'/(.*)', handlers.StaticFileHandler, {
'path': self.config['http']['static_dir'],
'default_filename': 'index.html',
})]
else:
return [(r'/', tornado.web.RedirectHandler, {
'url': '/mopidy/',
'permanent': False,
})]
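# Illustrative sketch, not part of the original source: with the default
# config the resulting URL space looks roughly like this (the extension app
# name is an example):
#
#     /mopidy/ws   -> WebSocketHandler (JSON-RPC over WebSocket)
#     /mopidy/rpc  -> JsonRpcHandler (JSON-RPC over HTTP POST)
#     /mopidy/...  -> bundled web client static files
#     /<app>/...   -> routes returned by an extension's http:app factory
#     /            -> redirect to /mopidy/ (or served from http/static_dir)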
Mopidy-2.0.0/mopidy/http/data/ 0000775 0001750 0001750 00000000000 12660436443 016361 5 ustar jodal jodal 0000000 0000000 Mopidy-2.0.0/mopidy/http/data/mopidy.min.js 0000664 0001750 0001750 00000074461 12505224626 021012 0 ustar jodal jodal 0000000 0000000
/*! Mopidy.js v0.5.0 - built 2015-01-31
* http://www.mopidy.com/
* Copyright (c) 2015 Stein Magnus Jodal and contributors
* Licensed under the Apache License, Version 2.0 */
!function(a){if("object"==typeof exports)module.exports=a();else if("function"==typeof define&&define.amd)define(a);else{var b;"undefined"!=typeof window?b=window:"undefined"!=typeof global?b=global:"undefined"!=typeof self&&(b=self),b.Mopidy=a()}}(function(){var a;return function b(a,c,d){function e(g,h){if(!c[g]){if(!a[g]){var i="function"==typeof require&&require;if(!h&&i)return i(g,!0);if(f)return f(g,!0);throw new Error("Cannot find module '"+g+"'")}var j=c[g]={exports:{}};a[g][0].call(j.exports,function(b){var c=a[g][1][b];return e(c?c:b)},j,j.exports,b,a,c,d)}return c[g].exports}for(var f="function"==typeof require&&require,g=0;g0)for(d=0;e>d;++d)c[d](a,b);else setTimeout(function(){throw b.message=a+" listener threw error: "+b.message,b},0)}function b(a){if("function"!=typeof a)throw new TypeError("Listener is not function");return a}function c(a){return a.supervisors||(a.supervisors=[]),a.supervisors}function d(a,b){return a.listeners||(a.listeners={}),b&&!a.listeners[b]&&(a.listeners[b]=[]),b?a.listeners[b]:a.listeners}function e(a){return a.errbacks||(a.errbacks=[]),a.errbacks}function f(f){function h(b,c,d){try{c.listener.apply(c.thisp||f,d)}catch(g){a(b,g,e(f))}}return f=f||{},f.on=function(a,e,f){return"function"==typeof a?c(this).push({listener:a,thisp:e}):void d(this,a).push({listener:b(e),thisp:f})},f.off=function(a,b){var f,g,h,i;if(!a){f=c(this),f.splice(0,f.length),g=d(this);for(h in g)g.hasOwnProperty(h)&&(f=d(this,h),f.splice(0,f.length));return f=e(this),void f.splice(0,f.length)}if("function"==typeof a?(f=c(this),b=a):f=d(this,a),!b)return void f.splice(0,f.length);for(h=0,i=f.length;i>h;++h)if(f[h].listener===b)return void f.splice(h,1)},f.once=function(a,b,c){var d=function(){f.off(a,d),b.apply(this,arguments)};f.on(a,d,c)},f.bind=function(a,b){var c,d,e;if(b)for(d=0,e=b.length;e>d;++d){if("function"!=typeof a[b[d]])throw new Error("No such method "+b[d]);this.on(b[d],a[b[d]],a)}else for(c in a)"function"==typeof a[c]&&this.on(c,a[c],a);return a},f.emit=function(a){var b,e,f=c(this),i=g.call(arguments);for(b=0,e=f.length;e>b;++b)h(a,f[b],i);for(f=d(this,a).slice(),i=g.call(arguments,1),b=0,e=f.length;e>b;++b)h(a,f[b],i)},f.errback=function(a){this.errbacks||(this.errbacks=[]),this.errbacks.push(b(a))},f}var g=Array.prototype.slice;return{createEventEmitter:f,aggregate:function(a){var b=f();return a.forEach(function(a){a.on(function(a,c){b.emit(a,c)})}),b}}})},{}],3:[function(a,b){function c(){}var d=b.exports={};d.nextTick=function(){var a="undefined"!=typeof window&&window.setImmediate,b="undefined"!=typeof window&&window.postMessage&&window.addEventListener;if(a)return function(a){return window.setImmediate(a)};if(b){var c=[];return window.addEventListener("message",function(a){var b=a.source;if((b===window||null===b)&&"process-tick"===a.data&&(a.stopPropagation(),c.length>0)){var d=c.shift();d()}},!0),function(a){c.push(a),window.postMessage("process-tick","*")}}return function(a){setTimeout(a,0)}}(),d.title="browser",d.browser=!0,d.env={},d.argv=[],d.on=c,d.addListener=c,d.once=c,d.off=c,d.removeListener=c,d.removeAllListeners=c,d.emit=c,d.binding=function(){throw new Error("process.binding is not supported")},d.cwd=function(){return"/"},d.chdir=function(){throw new Error("process.chdir is not supported")}},{}],4:[function(b,c){!function(a){"use strict";a(function(a){var b=a("./makePromise"),c=a("./Scheduler"),d=a("./env").asap;return b({scheduler:new c(d)})})}("function"==typeof 
a&&a.amd?a:function(a){c.exports=a(b)})},{"./Scheduler":5,"./env":17,"./makePromise":19}],5:[function(b,c){!function(a){"use strict";a(function(){function a(a){this._async=a,this._running=!1,this._queue=this,this._queueLen=0,this._afterQueue={},this._afterQueueLen=0;var b=this;this.drain=function(){b._drain()}}return a.prototype.enqueue=function(a){this._queue[this._queueLen++]=a,this.run()},a.prototype.afterQueue=function(a){this._afterQueue[this._afterQueueLen++]=a,this.run()},a.prototype.run=function(){this._running||(this._running=!0,this._async(this.drain))},a.prototype._drain=function(){for(var a=0;a>>0,j=i,k=[],l=0;i>l;++l)if(f=b[l],void 0!==f||l in b){if(e=a._handler(f),e.state()>0){h.become(e),a._visitRemaining(b,l,e);break}e.visit(h,c,d)}else--j;return 0===j&&h.reject(new RangeError("any(): array must not be empty")),g}function e(b,c){function d(a){this.resolved||(k.push(a),0===--n&&(l=null,this.resolve(k)))}function e(a){this.resolved||(l.push(a),0===--f&&(k=null,this.reject(l)))}var f,g,h,i=a._defer(),j=i._handler,k=[],l=[],m=b.length>>>0,n=0;for(h=0;m>h;++h)g=b[h],(void 0!==g||h in b)&&++n;for(c=Math.max(c,0),f=n-c+1,n=Math.min(c,n),c>n?j.reject(new RangeError("some(): array must contain at least "+c+" item(s), but had "+n)):0===n&&j.resolve(k),h=0;m>h;++h)g=b[h],(void 0!==g||h in b)&&a._handler(g).visit(j,d,e,j.notify);return i}function f(b,c){return a._traverse(c,b)}function g(b,c){var d=s.call(b);return a._traverse(c,d).then(function(a){return h(d,a)})}function h(b,c){for(var d=c.length,e=new Array(d),f=0,g=0;d>f;++f)c[f]&&(e[g++]=a._handler(b[f]).value);return e.length=g,e}function i(a){return p(a.map(j))}function j(c){var d=a._handler(c);return 0===d.state()?o(c).then(b.fulfilled,b.rejected):(d._unreport(),b.inspect(d))}function k(a,b){return arguments.length>2?q.call(a,m(b),arguments[2]):q.call(a,m(b))}function l(a,b){return arguments.length>2?r.call(a,m(b),arguments[2]):r.call(a,m(b))}function m(a){return function(b,c,d){return n(a,void 0,[b,c,d])}}var n=c(a),o=a.resolve,p=a.all,q=Array.prototype.reduce,r=Array.prototype.reduceRight,s=Array.prototype.slice;return a.any=d,a.some=e,a.settle=i,a.map=f,a.filter=g,a.reduce=k,a.reduceRight=l,a.prototype.spread=function(a){return this.then(p).then(function(b){return a.apply(this,b)})},a}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a(b)})},{"../apply":7,"../state":20}],9:[function(b,c){!function(a){"use strict";a(function(){function a(){throw new TypeError("catch predicate must be a function")}function b(a,b){return c(b)?a instanceof b:b(a)}function c(a){return a===Error||null!=a&&a.prototype instanceof Error}function d(a){return("object"==typeof a||"function"==typeof a)&&null!==a}function e(a){return a}return function(c){function f(a,c){return function(d){return b(d,c)?a.call(this,d):j(d)}}function g(a,b,c,e){var f=a.call(b);return d(f)?h(f,c,e):c(e)}function h(a,b,c){return i(a).then(function(){return b(c)})}var i=c.resolve,j=c.reject,k=c.prototype["catch"];return c.prototype.done=function(a,b){this._handler.visit(this._handler.receiver,a,b)},c.prototype["catch"]=c.prototype.otherwise=function(b){return arguments.length<2?k.call(this,b):"function"!=typeof b?this.ensure(a):k.call(this,f(arguments[1],b))},c.prototype["finally"]=c.prototype.ensure=function(a){return"function"!=typeof a?this:this.then(function(b){return g(a,this,e,b)},function(b){return g(a,this,j,b)})},c.prototype["else"]=c.prototype.orElse=function(a){return this.then(void 0,function(){return a})},c.prototype["yield"]=function(a){return 
this.then(function(){return a})},c.prototype.tap=function(a){return this.then(a)["yield"](this)},c}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a()})},{}],10:[function(b,c){!function(a){"use strict";a(function(){return function(a){return a.prototype.fold=function(b,c){var d=this._beget();return this._handler.fold(function(c,d,e){a._handler(c).fold(function(a,c,d){d.resolve(b.call(this,c,a))},d,this,e)},c,d._handler.receiver,d._handler),d},a}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a()})},{}],11:[function(b,c){!function(a){"use strict";a(function(a){var b=a("../state").inspect;return function(a){return a.prototype.inspect=function(){return b(a._handler(this))},a}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a(b)})},{"../state":20}],12:[function(b,c){!function(a){"use strict";a(function(){return function(a){function b(a,b,d,e){return c(function(b){return[b,a(b)]},b,d,e)}function c(a,b,e,f){function g(f,g){return d(e(f)).then(function(){return c(a,b,e,g)})}return d(f).then(function(c){return d(b(c)).then(function(b){return b?c:d(a(c)).spread(g)})})}var d=a.resolve;return a.iterate=b,a.unfold=c,a}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a()})},{}],13:[function(b,c){!function(a){"use strict";a(function(){return function(a){return a.prototype.progress=function(a){return this.then(void 0,void 0,a)},a}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a()})},{}],14:[function(b,c){!function(a){"use strict";a(function(a){function b(a,b,d,e){return c.setTimer(function(){a(d,e,b)},b)}var c=a("../env"),d=a("../TimeoutError");return function(a){function e(a,c,d){b(f,a,c,d)}function f(a,b){b.resolve(a)}function g(a,b,c){var e="undefined"==typeof a?new d("timed out after "+c+"ms"):a;b.reject(e)}return a.prototype.delay=function(a){var b=this._beget();return this._handler.fold(e,a,void 0,b._handler),b},a.prototype.timeout=function(a,d){var e=this._beget(),f=e._handler,h=b(g,a,d,e._handler);return this._handler.visit(f,function(a){c.clearTimer(h),this.resolve(a)},function(a){c.clearTimer(h),this.reject(a)},f.notify),e},a}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a(b)})},{"../TimeoutError":6,"../env":17}],15:[function(b,c){!function(a){"use strict";a(function(a){function b(a){throw a}function c(){}var d=a("../env").setTimer,e=a("../format");return function(a){function f(a){a.handled||(n.push(a),k("Potentially unhandled rejection ["+a.id+"] "+e.formatError(a.value)))}function g(a){var b=n.indexOf(a);b>=0&&(n.splice(b,1),l("Handled previous rejection ["+a.id+"] "+e.formatObject(a.value)))}function h(a,b){m.push(a,b),null===o&&(o=d(i,0))}function i(){for(o=null;m.length>0;)m.shift()(m.shift())}var j,k=c,l=c;"undefined"!=typeof console&&(j=console,k="undefined"!=typeof j.error?function(a){j.error(a)}:function(a){j.log(a)},l="undefined"!=typeof j.info?function(a){j.info(a)}:function(a){j.log(a)}),a.onPotentiallyUnhandledRejection=function(a){h(f,a)},a.onPotentiallyUnhandledRejectionHandled=function(a){h(g,a)},a.onFatalRejection=function(a){h(b,a.value)};var m=[],n=[],o=null;return a}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a(b)})},{"../env":17,"../format":18}],16:[function(b,c){!function(a){"use strict";a(function(){return function(a){return a.prototype["with"]=a.prototype.withThis=function(a){var b=this._beget(),c=b._handler;return c.receiver=a,this._handler.chain(c,a),b},a}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a()})},{}],17:[function(b,c){(function(d){!function(a){"use strict";a(function(a){function 
b(){return"undefined"!=typeof d&&null!==d&&"function"==typeof d.nextTick}function c(){return"function"==typeof MutationObserver&&MutationObserver||"function"==typeof WebKitMutationObserver&&WebKitMutationObserver}function e(a){function b(){var a=c;c=void 0,a()}var c,d=document.createTextNode(""),e=new a(b);e.observe(d,{characterData:!0});var f=0;return function(a){c=a,d.data=f^=1}}var f,g="undefined"!=typeof setTimeout&&setTimeout,h=function(a,b){return setTimeout(a,b)},i=function(a){return clearTimeout(a)},j=function(a){return g(a,0)};if(b())j=function(a){return d.nextTick(a)};else if(f=c())j=e(f);else if(!g){var k=a,l=k("vertx");h=function(a,b){return l.setTimer(b,a)},i=l.cancelTimer,j=l.runOnLoop||l.runOnContext}return{setTimer:h,clearTimer:i,asap:j}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a(b)})}).call(this,b("FWaASH"))},{FWaASH:3}],18:[function(b,c){!function(a){"use strict";a(function(){function a(a){var c="object"==typeof a&&null!==a&&a.stack?a.stack:b(a);return a instanceof Error?c:c+" (WARNING: non-Error used)"}function b(a){var b=String(a);return"[object Object]"===b&&"undefined"!=typeof JSON&&(b=c(a,b)),b}function c(a,b){try{return JSON.stringify(a)}catch(c){return b}}return{formatError:a,formatObject:b,tryStringify:c}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a()})},{}],19:[function(b,c){(function(b){!function(a){"use strict";a(function(){return function(a){function c(a,b){this._handler=a===u?b:d(a)}function d(a){function b(a){e.resolve(a)}function c(a){e.reject(a)}function d(a){e.notify(a)}var e=new w;try{a(b,c,d)}catch(f){c(f)}return e}function e(a){return J(a)?a:new c(u,new x(r(a)))}function f(a){return new c(u,new x(new A(a)))}function g(){return ab}function h(){return new c(u,new w)}function i(a,b){var c=new w(a.receiver,a.join().context);return new b(u,c)}function j(a){return l(T,null,a)}function k(a,b){return l(O,a,b)}function l(a,b,d){function e(c,e,g){g.resolved||m(d,f,c,a(b,e,c),g)}function f(a,b,c){k[a]=b,0===--j&&c.become(new z(k))}for(var g,h="function"==typeof b?e:f,i=new w,j=d.length>>>0,k=new Array(j),l=0;l0?b(c,f.value,e):(e.become(f),n(a,c+1,f))}else b(c,d,e)}function n(a,b,c){for(var d=b;dc&&a._unreport()}}function p(a){return"object"!=typeof a||null===a?f(new TypeError("non-iterable passed to race()")):0===a.length?g():1===a.length?e(a[0]):q(a)}function q(a){var b,d,e,f=new w;for(b=0;b0||"function"!=typeof b&&0>e)return new this.constructor(u,d);var f=this._beget(),g=f._handler;return d.chain(g,d.receiver,a,b,c),f},c.prototype["catch"]=function(a){return this.then(void 0,a)},c.prototype._beget=function(){return i(this._handler,this.constructor)},c.all=j,c.race=p,c._traverse=k,c._visitRemaining=n,u.prototype.when=u.prototype.become=u.prototype.notify=u.prototype.fail=u.prototype._unreport=u.prototype._report=U,u.prototype._state=0,u.prototype.state=function(){return this._state},u.prototype.join=function(){for(var a=this;void 0!==a.handler;)a=a.handler;return a},u.prototype.chain=function(a,b,c,d,e){this.when({resolver:a,receiver:b,fulfilled:c,rejected:d,progress:e})},u.prototype.visit=function(a,b,c,d){this.chain(Z,a,b,c,d)},u.prototype.fold=function(a,b,c,d){this.when(new I(a,b,c,d))},S(u,v),v.prototype.become=function(a){a.fail()};var Z=new v;S(u,w),w.prototype._state=0,w.prototype.resolve=function(a){this.become(r(a))},w.prototype.reject=function(a){this.resolved||this.become(new A(a))},w.prototype.join=function(){if(!this.resolved)return this;for(var a=this;void 0!==a.handler;)if(a=a.handler,a===this)return 
this.handler=D();return a},w.prototype.run=function(){var a=this.consumers,b=this.handler;this.handler=this.handler.join(),this.consumers=void 0;for(var c=0;c0?c(d.value):b(d.value)}return{pending:a,fulfilled:c,rejected:b,inspect:d}})}("function"==typeof a&&a.amd?a:function(a){c.exports=a()})},{}],21:[function(b,c){!function(a){"use strict";a(function(a){function b(a,b,c,d){var e=x.resolve(a);return arguments.length<2?e:e.then(b,c,d)}function c(a){return new x(a)}function d(a){return function(){for(var b=0,c=arguments.length,d=new Array(c);c>b;++b)d[b]=arguments[b];return y(a,this,d)}}function e(a){for(var b=0,c=arguments.length-1,d=new Array(c);c>b;++b)d[b]=arguments[b+1];return y(a,this,d)}function f(){return new g}function g(){function a(a){d._handler.resolve(a)}function b(a){d._handler.reject(a)}function c(a){d._handler.notify(a)}var d=x._defer();this.promise=d,this.resolve=a,this.reject=b,this.notify=c,this.resolver={resolve:a,reject:b,notify:c}}function h(a){return a&&"function"==typeof a.then}function i(){return x.all(arguments)}function j(a){return b(a,x.all)}function k(a){return b(a,x.settle)}function l(a,c){return b(a,function(a){return x.map(a,c)})}function m(a,c){return b(a,function(a){return x.filter(a,c)})}var n=a("./lib/decorators/timed"),o=a("./lib/decorators/array"),p=a("./lib/decorators/flow"),q=a("./lib/decorators/fold"),r=a("./lib/decorators/inspect"),s=a("./lib/decorators/iterate"),t=a("./lib/decorators/progress"),u=a("./lib/decorators/with"),v=a("./lib/decorators/unhandledRejection"),w=a("./lib/TimeoutError"),x=[o,p,q,s,t,r,u,n,v].reduce(function(a,b){return b(a)},a("./lib/Promise")),y=a("./lib/apply")(x);return b.promise=c,b.resolve=x.resolve,b.reject=x.reject,b.lift=d,b["try"]=e,b.attempt=e,b.iterate=x.iterate,b.unfold=x.unfold,b.join=i,b.all=j,b.settle=k,b.any=d(x.any),b.some=d(x.some),b.race=d(x.race),b.map=l,b.filter=m,b.reduce=d(x.reduce),b.reduceRight=d(x.reduceRight),b.isPromiseLike=h,b.Promise=x,b.defer=f,b.TimeoutError=w,b})}("function"==typeof a&&a.amd?a:function(a){c.exports=a(b)})},{"./lib/Promise":4,"./lib/TimeoutError":6,"./lib/apply":7,"./lib/decorators/array":8,"./lib/decorators/flow":9,"./lib/decorators/fold":10,"./lib/decorators/inspect":11,"./lib/decorators/iterate":12,"./lib/decorators/progress":13,"./lib/decorators/timed":14,"./lib/decorators/unhandledRejection":15,"./lib/decorators/with":16}],22:[function(a,b){function c(a){return this instanceof c?(this._console=this._getConsole(a||{}),this._settings=this._configure(a||{}),this._backoffDelay=this._settings.backoffDelayMin,this._pendingRequests={},this._webSocket=null,d.createEventEmitter(this),this._delegateEvents(),void(this._settings.autoConnect&&this.connect())):new c(a)}var d=a("bane"),e=a("../lib/websocket/"),f=a("when");c.ConnectionError=function(a){this.name="ConnectionError",this.message=a},c.ConnectionError.prototype=Object.create(Error.prototype),c.ConnectionError.prototype.constructor=c.ConnectionError,c.ServerError=function(a){this.name="ServerError",this.message=a},c.ServerError.prototype=Object.create(Error.prototype),c.ServerError.prototype.constructor=c.ServerError,c.WebSocket=e.Client,c.when=f,c.prototype._getConsole=function(a){if("undefined"!=typeof a.console)return a.console;var b="undefined"!=typeof console&&console||{};return b.log=b.log||function(){},b.warn=b.warn||function(){},b.error=b.error||function(){},b},c.prototype._configure=function(a){var b="undefined"!=typeof document&&"https:"===document.location.protocol?"wss://":"ws://",c="undefined"!=typeof 
document&&document.location.host||"localhost";return a.webSocketUrl=a.webSocketUrl||b+c+"/mopidy/ws",a.autoConnect!==!1&&(a.autoConnect=!0),a.backoffDelayMin=a.backoffDelayMin||1e3,a.backoffDelayMax=a.backoffDelayMax||64e3,"undefined"==typeof a.callingConvention&&this._console.warn("Mopidy.js is using the default calling convention. The default will change in the future. You should explicitly specify which calling convention you use."),a.callingConvention=a.callingConvention||"by-position-only",a},c.prototype._delegateEvents=function(){this.off("websocket:close"),this.off("websocket:error"),this.off("websocket:incomingMessage"),this.off("websocket:open"),this.off("state:offline"),this.on("websocket:close",this._cleanup),this.on("websocket:error",this._handleWebSocketError),this.on("websocket:incomingMessage",this._handleMessage),this.on("websocket:open",this._resetBackoffDelay),this.on("websocket:open",this._getApiSpec),this.on("state:offline",this._reconnect)},c.prototype.connect=function(){if(this._webSocket){if(this._webSocket.readyState===c.WebSocket.OPEN)return;this._webSocket.close()}this._webSocket=this._settings.webSocket||new c.WebSocket(this._settings.webSocketUrl),this._webSocket.onclose=function(a){this.emit("websocket:close",a)}.bind(this),this._webSocket.onerror=function(a){this.emit("websocket:error",a)}.bind(this),this._webSocket.onopen=function(){this.emit("websocket:open")}.bind(this),this._webSocket.onmessage=function(a){this.emit("websocket:incomingMessage",a)}.bind(this)},c.prototype._cleanup=function(a){Object.keys(this._pendingRequests).forEach(function(b){var d=this._pendingRequests[b];delete this._pendingRequests[b];var e=new c.ConnectionError("WebSocket closed");e.closeEvent=a,d.reject(e)}.bind(this)),this.emit("state:offline")},c.prototype._reconnect=function(){this.emit("reconnectionPending",{timeToAttempt:this._backoffDelay}),setTimeout(function(){this.emit("reconnecting"),this.connect()}.bind(this),this._backoffDelay),this._backoffDelay=2*this._backoffDelay,this._backoffDelay>this._settings.backoffDelayMax&&(this._backoffDelay=this._settings.backoffDelayMax)},c.prototype._resetBackoffDelay=function(){this._backoffDelay=this._settings.backoffDelayMin},c.prototype.close=function(){this.off("state:offline",this._reconnect),this._webSocket.close()},c.prototype._handleWebSocketError=function(a){this._console.warn("WebSocket error:",a.stack||a)},c.prototype._send=function(a){switch(this._webSocket.readyState){case c.WebSocket.CONNECTING:return f.reject(new c.ConnectionError("WebSocket is still connecting"));case c.WebSocket.CLOSING:return f.reject(new c.ConnectionError("WebSocket is closing"));case c.WebSocket.CLOSED:return f.reject(new c.ConnectionError("WebSocket is closed"));default:var b=f.defer();return a.jsonrpc="2.0",a.id=this._nextRequestId(),this._pendingRequests[a.id]=b.resolver,this._webSocket.send(JSON.stringify(a)),this.emit("websocket:outgoingMessage",a),b.promise}},c.prototype._nextRequestId=function(){var a=-1;return function(){return a+=1}}(),c.prototype._handleMessage=function(a){try{var b=JSON.parse(a.data);b.hasOwnProperty("id")?this._handleResponse(b):b.hasOwnProperty("event")?this._handleEvent(b):this._console.warn("Unknown message type received. Message was: "+a.data)}catch(c){if(!(c instanceof SyntaxError))throw c;this._console.warn("WebSocket message parsing failed. Message was: "+a.data)}},c.prototype._handleResponse=function(a){if(!this._pendingRequests.hasOwnProperty(a.id))return void this._console.warn("Unexpected response received. 
Message was:",a);var b,d=this._pendingRequests[a.id];delete this._pendingRequests[a.id],a.hasOwnProperty("result")?d.resolve(a.result):a.hasOwnProperty("error")?(b=new c.ServerError(a.error.message),b.code=a.error.code,b.data=a.error.data,d.reject(b),this._console.warn("Server returned error:",a.error)):(b=new Error("Response without 'result' or 'error' received"),b.data={response:a},d.reject(b),this._console.warn("Response without 'result' or 'error' received. Message was:",a))},c.prototype._handleEvent=function(a){var b=a.event,c=a;delete c.event,this.emit("event:"+this._snakeToCamel(b),c)},c.prototype._getApiSpec=function(){return this._send({method:"core.describe"}).then(this._createApi.bind(this))["catch"](this._handleWebSocketError)},c.prototype._createApi=function(a){var b="by-position-or-by-name"===this._settings.callingConvention,c=function(a){return function(){var c={method:a};return 0===arguments.length?this._send(c):b?arguments.length>1?f.reject(new Error("Expected zero arguments, a single array, or a single object.")):Array.isArray(arguments[0])||arguments[0]===Object(arguments[0])?(c.params=arguments[0],this._send(c)):f.reject(new TypeError("Expected an array or an object.")):(c.params=Array.prototype.slice.call(arguments),this._send(c))}.bind(this)}.bind(this),d=function(a){var b=a.split(".");return b.length>=1&&"core"===b[0]&&(b=b.slice(1)),b},e=function(a){var b=this;return a.forEach(function(a){a=this._snakeToCamel(a),b[a]=b[a]||{},b=b[a]}.bind(this)),b}.bind(this),g=function(b){var f=d(b),g=this._snakeToCamel(f.slice(-1)[0]),h=e(f.slice(0,-1));h[g]=c(b),h[g].description=a[b].description,h[g].params=a[b].params}.bind(this);Object.keys(a).forEach(g),this.emit("state:online")},c.prototype._snakeToCamel=function(a){return a.replace(/(_[a-z])/g,function(a){return a.toUpperCase().replace("_","")})},b.exports=c},{"../lib/websocket/":1,bane:2,when:21}]},{},[22])(22)}); Mopidy-2.0.0/mopidy/http/data/favicon.ico 0000664 0001750 0001750 00000013555 12441116635 020506 0 ustar jodal jodal 0000000 0000000 PNG
[binary PNG image data for favicon.ico omitted (tEXt comment: "Created with GIMP")]