Release v1.1.0

This commit is contained in: commit 57670e3a2f

.mailmap | 5
@@ -23,4 +23,7 @@ Ignasi Fosch <natx@y10k.ws> <ifosch@serenity-2.local>
Christopher Schirner <christopher@hackerspace-bamberg.de> <schinken@hackerspace-bamberg.de>
Laura Barber <laura.c.barber@gmail.com> <artzii.laura@gmail.com>
John Cass <john.cass77@gmail.com>
Ronald Zielaznicki <zielaznickiz@g.cofc.edu>
Ronald Zielaznicki <zielaznickizm@g.cofc.edu> <zielaznickiz@g.cofc.edu>
Kyle Heyne <kyleheyne@gmail.com>
Tom Roth <rawdlite@googlemail.com>
Eric Jahn <ejahn@newstore.com>
.travis.yml | 14

@@ -1,8 +1,18 @@
sudo: false

language: python

python:
  - "2.7_with_system_site_packages"

addons:
  apt:
    sources:
      - mopidy-stable
    packages:
      - graphviz-dev
      - mopidy

env:
  - TOX_ENV=py27
  - TOX_ENV=py27-tornado23
@@ -11,10 +21,6 @@ env:
  - TOX_ENV=flake8

install:
  - "wget -O - http://apt.mopidy.com/mopidy.gpg | sudo apt-key add -"
  - "sudo wget -O /etc/apt/sources.list.d/mopidy.list http://apt.mopidy.com/mopidy.list"
  - "sudo apt-get update || true"
  - "sudo apt-get install mopidy graphviz-dev"
  - "pip install tox"

script:
AUTHORS | 16

@@ -52,4 +52,18 @@
- John Cass <john.cass77@gmail.com>
- Laura Barber <laura.c.barber@gmail.com>
- Jakab Kristóf <jaksi07c8@gmail.com>
- Ronald Zielaznicki <zielaznickiz@g.cofc.edu>
- Ronald Zielaznicki <zielaznickizm@g.cofc.edu>
- Wojciech Wnętrzak <w.wnetrzak@gmail.com>
- Camilo Nova <camilo.nova@gmail.com>
- Dražen Lučanin <kermit666@gmail.com>
- Naglis Jonaitis <njonaitis@gmail.com>
- Kyle Heyne <kyleheyne@gmail.com>
- Tom Roth <rawdlite@googlemail.com>
- Mark Greenwood <fatgerman@gmail.com>
- Stein Karlsen <karlsen.stein@gmail.com>
- Dejan Prokić <dejanp@nordeus.eu>
- Eric Jahn <ejahn@newstore.com>
- Mikhail Golubev <qsolo825@gmail.com>
- Danilo Bargen <mail@dbrgn.ch>
- Bjørnar Snoksrud <bjornar@snoksrud.no>
- Giorgos Logiotatidis <seadog@sealabs.net>
@@ -10,9 +10,11 @@ flake8-import-order

# Mock dependencies in tests
mock
responses

# Test runners
pytest
pytest-capturelog
pytest-cov
pytest-xdist
tox
@@ -1,8 +1,8 @@
.. _concepts:

*************************
Architecture and concepts
*************************
************
Architecture
************

The overall architecture of Mopidy is organized around multiple frontends and
backends. The frontends use the core API. The core actor makes multiple backends
@@ -1,8 +1,8 @@
.. _audio-api:

*********
Audio API
*********
*********************************
:mod:`mopidy.audio` --- Audio API
*********************************

.. module:: mopidy.audio
    :synopsis: Thin wrapper around the parts of GStreamer we use
@@ -1,8 +1,8 @@
.. _backend-api:

***********
Backend API
***********
*************************************
:mod:`mopidy.backend` --- Backend API
*************************************

.. module:: mopidy.backend
    :synopsis: The API implemented by backends
@@ -1,8 +1,8 @@
.. _commands-api:

************
Commands API
************
***************************************
:mod:`mopidy.commands` --- Commands API
***************************************

.. automodule:: mopidy.commands
    :synopsis: Commands API for Mopidy CLI.
@@ -1,8 +1,8 @@
.. _config-api:

**********
Config API
**********
***********************************
:mod:`mopidy.config` --- Config API
***********************************

.. automodule:: mopidy.config
    :synopsis: Config API for config loading and validation
@@ -1,79 +1,253 @@
.. _core-api:

********
Core API
********
*******************************
:mod:`mopidy.core` --- Core API
*******************************

.. module:: mopidy.core
    :synopsis: Core API for use by frontends

The core API is the interface that is used by frontends like
:mod:`mopidy.http` and :mod:`mopidy.mpd`. The core layer is inbetween the
frontends and the backends.
:mod:`mopidy.http` and :mod:`mopidy.mpd`. The core layer is in between the
frontends and the backends. Don't forget that you will be accessing core
as a Pykka actor. If you are only interested in being notified about changes
in core see :class:`~mopidy.core.CoreListener`.

.. versionchanged:: 1.1
    All core API calls are now type checked.

.. versionchanged:: 1.1
    All backend return values are now type checked.
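As the intro above notes, frontends reach core through a Pykka actor proxy rather than by calling it directly. The following is a minimal sketch of what that looks like from a frontend's point of view, assuming ``core`` is the actor proxy Mopidy passes to a frontend's constructor; the file URI is purely illustrative::

    # ``core`` is a Pykka actor proxy for mopidy.core.Core. Every call
    # returns a Pykka future; call .get() to block until the result arrives.
    uri_schemes = core.get_uri_schemes().get()
    print('Backends handle URI schemes: %s' % ', '.join(uri_schemes))

    # Calls nest through the controllers exposed on core:
    core.tracklist.add(uris=['file:///music/example.ogg']).get()  # hypothetical URI
    core.playback.play().get()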
.. autoclass:: mopidy.core.Core
    :members:

.. attribute:: tracklist

Playback controller
===================
Manages everything related to the list of tracks we will play.
See :class:`~mopidy.core.TracklistController`.

Manages playback, with actions like play, pause, stop, next, previous,
seek, and volume control.
.. attribute:: playback

.. autoclass:: mopidy.core.PlaybackState
    :members:
Manages playback state and the current playing track.
See :class:`~mopidy.core.PlaybackController`.

.. autoclass:: mopidy.core.PlaybackController
    :members:
.. attribute:: library

Manages the music library, e.g. searching and browsing for music.
See :class:`~mopidy.core.LibraryController`.

.. attribute:: playlists

Manages stored playlists. See :class:`~mopidy.core.PlaylistsController`.

.. attribute:: mixer

Manages volume and muting. See :class:`~mopidy.core.MixerController`.

.. attribute:: history

Keeps record of what tracks have been played.
See :class:`~mopidy.core.HistoryController`.

.. automethod:: get_uri_schemes

.. automethod:: get_version


Tracklist controller
====================

Manages everything related to the tracks we are currently playing.

.. autoclass:: mopidy.core.TracklistController
    :members:

Manipulating
------------

.. automethod:: mopidy.core.TracklistController.add
.. automethod:: mopidy.core.TracklistController.remove
.. automethod:: mopidy.core.TracklistController.clear
.. automethod:: mopidy.core.TracklistController.move
.. automethod:: mopidy.core.TracklistController.shuffle

Current state
-------------

.. automethod:: mopidy.core.TracklistController.get_tl_tracks
.. automethod:: mopidy.core.TracklistController.index
.. automethod:: mopidy.core.TracklistController.get_version

.. automethod:: mopidy.core.TracklistController.get_length
.. automethod:: mopidy.core.TracklistController.get_tracks

.. automethod:: mopidy.core.TracklistController.slice
.. automethod:: mopidy.core.TracklistController.filter

Future state
------------

.. automethod:: mopidy.core.TracklistController.get_eot_tlid
.. automethod:: mopidy.core.TracklistController.get_next_tlid
.. automethod:: mopidy.core.TracklistController.get_previous_tlid

.. automethod:: mopidy.core.TracklistController.eot_track
.. automethod:: mopidy.core.TracklistController.next_track
.. automethod:: mopidy.core.TracklistController.previous_track

Options
-------

.. automethod:: mopidy.core.TracklistController.get_consume
.. automethod:: mopidy.core.TracklistController.set_consume
.. automethod:: mopidy.core.TracklistController.get_random
.. automethod:: mopidy.core.TracklistController.set_random
.. automethod:: mopidy.core.TracklistController.get_repeat
.. automethod:: mopidy.core.TracklistController.set_repeat
.. automethod:: mopidy.core.TracklistController.get_single
.. automethod:: mopidy.core.TracklistController.set_single


History controller
==================
Playback controller
===================

Keeps record of what tracks have been played.
.. autoclass:: mopidy.core.PlaybackController

.. autoclass:: mopidy.core.HistoryController
    :members:
Playback control
----------------

.. automethod:: mopidy.core.PlaybackController.play
.. automethod:: mopidy.core.PlaybackController.next
.. automethod:: mopidy.core.PlaybackController.previous
.. automethod:: mopidy.core.PlaybackController.stop
.. automethod:: mopidy.core.PlaybackController.pause
.. automethod:: mopidy.core.PlaybackController.resume
.. automethod:: mopidy.core.PlaybackController.seek

Playlists controller
====================
Current track
-------------

Manages persistence of playlists.
.. automethod:: mopidy.core.PlaybackController.get_current_tl_track
.. automethod:: mopidy.core.PlaybackController.get_current_track
.. automethod:: mopidy.core.PlaybackController.get_stream_title
.. automethod:: mopidy.core.PlaybackController.get_time_position

.. autoclass:: mopidy.core.PlaylistsController
    :members:
Playback states
---------------

.. automethod:: mopidy.core.PlaybackController.get_state
.. automethod:: mopidy.core.PlaybackController.set_state

.. class:: mopidy.core.PlaybackState

.. attribute:: STOPPED
    :annotation: = 'stopped'
.. attribute:: PLAYING
    :annotation: = 'playing'
.. attribute:: PAUSED
    :annotation: = 'paused'

Library controller
==================

Manages the music library, e.g. searching for tracks to be added to a playlist.
.. class:: mopidy.core.LibraryController

.. autoclass:: mopidy.core.LibraryController
    :members:
.. automethod:: mopidy.core.LibraryController.browse
.. automethod:: mopidy.core.LibraryController.search
.. automethod:: mopidy.core.LibraryController.lookup
.. automethod:: mopidy.core.LibraryController.refresh
.. automethod:: mopidy.core.LibraryController.get_images
.. automethod:: mopidy.core.LibraryController.get_distinct

Playlists controller
====================

.. class:: mopidy.core.PlaylistsController

Fetching
--------

.. automethod:: mopidy.core.PlaylistsController.as_list
.. automethod:: mopidy.core.PlaylistsController.get_items
.. automethod:: mopidy.core.PlaylistsController.lookup
.. automethod:: mopidy.core.PlaylistsController.refresh

Manipulating
------------

.. automethod:: mopidy.core.PlaylistsController.create
.. automethod:: mopidy.core.PlaylistsController.save
.. automethod:: mopidy.core.PlaylistsController.delete

Mixer controller
================

Manages volume and muting.
.. class:: mopidy.core.MixerController

.. autoclass:: mopidy.core.MixerController
    :members:
.. automethod:: mopidy.core.MixerController.get_mute
.. automethod:: mopidy.core.MixerController.set_mute
.. automethod:: mopidy.core.MixerController.get_volume
.. automethod:: mopidy.core.MixerController.set_volume

Core listener
=============
History controller
==================

.. class:: mopidy.core.HistoryController

.. automethod:: mopidy.core.HistoryController.get_history
.. automethod:: mopidy.core.HistoryController.get_length

Core events
===========

.. autoclass:: mopidy.core.CoreListener
    :members:
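To complement the event listing above, here is a rough sketch of how a frontend typically receives core events: a frontend actor mixes in :class:`mopidy.core.CoreListener` and implements only the event methods it cares about. The listener method names below are from the documented API, while ``ExampleFrontend`` itself is just an illustration::

    import pykka

    from mopidy import core


    class ExampleFrontend(pykka.ThreadingActor, core.CoreListener):
        """Hypothetical frontend that logs what core is doing."""

        def __init__(self, config, core):
            super(ExampleFrontend, self).__init__()
            self.core = core  # actor proxy for mopidy.core.Core

        def track_playback_started(self, tl_track):
            # Called by CoreListener when playback of a track starts.
            print('Started: %s' % (tl_track.track.name,))

        def volume_changed(self, volume):
            print('Volume is now %d' % volume)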
Deprecated API features
=======================

.. warning::
    Though these features still work, they are slated to go away in the next
    major Mopidy release.

Core
----

.. autoattribute:: mopidy.core.Core.version
.. autoattribute:: mopidy.core.Core.uri_schemes

TracklistController
-------------------

.. autoattribute:: mopidy.core.TracklistController.tl_tracks
.. autoattribute:: mopidy.core.TracklistController.tracks
.. autoattribute:: mopidy.core.TracklistController.version
.. autoattribute:: mopidy.core.TracklistController.length

.. autoattribute:: mopidy.core.TracklistController.consume
.. autoattribute:: mopidy.core.TracklistController.random
.. autoattribute:: mopidy.core.TracklistController.repeat
.. autoattribute:: mopidy.core.TracklistController.single

PlaylistsController
-------------------

.. automethod:: mopidy.core.PlaybackController.get_mute
.. automethod:: mopidy.core.PlaybackController.get_volume

.. autoattribute:: mopidy.core.PlaybackController.current_tl_track
.. autoattribute:: mopidy.core.PlaybackController.current_track
.. autoattribute:: mopidy.core.PlaybackController.state
.. autoattribute:: mopidy.core.PlaybackController.time_position
.. autoattribute:: mopidy.core.PlaybackController.mute
.. autoattribute:: mopidy.core.PlaybackController.volume

LibraryController
-----------------

.. automethod:: mopidy.core.LibraryController.find_exact

PlaybackController
------------------

.. automethod:: mopidy.core.PlaylistsController.filter
.. automethod:: mopidy.core.PlaylistsController.get_playlists

.. autoattribute:: mopidy.core.PlaylistsController.playlists
@@ -1,8 +1,8 @@
.. _ext-api:

*************
Extension API
*************
**********************************
:mod:`mopidy.ext` -- Extension API
**********************************

If you want to learn how to make Mopidy extensions, read :ref:`extensiondev`.
@@ -25,6 +25,8 @@ For details on how to make a Mopidy extension, see the :ref:`extensiondev`
guide.


.. _static-web-client:

Static web client example
=========================
@@ -4,9 +4,6 @@
HTTP JSON-RPC API
*****************

.. module:: mopidy.http
    :synopsis: The HTTP frontend APIs

The :ref:`ext-http` extension makes Mopidy's :ref:`core-api` available using
JSON-RPC over HTTP using HTTP POST and WebSockets. We also provide a JavaScript
wrapper, called :ref:`Mopidy.js <mopidy-js>`, around the JSON-RPC over
@@ -65,14 +62,9 @@ JSON-RPC 2.0 messages can be recognized by checking for the key named
please refer to the `JSON-RPC 2.0 spec
<http://www.jsonrpc.org/specification>`_.

All methods (not attributes) in the :ref:`core-api` are made available through
JSON-RPC calls over the WebSocket. For example,
:meth:`mopidy.core.PlaybackController.play` is available as the JSON-RPC method
``core.playback.play``.

The core API's attributes are made available through setters and getters. For
example, the attribute :attr:`mopidy.core.PlaybackController.current_track` is
available as the JSON-RPC method ``core.playback.get_current_track``.
All methods in the :ref:`core-api` are made available through JSON-RPC calls
over the WebSocket. For example, :meth:`mopidy.core.PlaybackController.play` is
available as the JSON-RPC method ``core.playback.play``.

Example JSON-RPC request::
docs/api/httpclient.rst | 9 (new file)

@@ -0,0 +1,9 @@
.. _httpclient-helper:

************************************************
:mod:`mopidy.httpclient` --- HTTP Client helpers
************************************************

.. automodule:: mopidy.httpclient
    :synopsis: HTTP client helpers for Mopidy and its extensions.
    :members:
@@ -4,26 +4,57 @@
API reference
*************

.. note:: What is public?
.. note::

    Only APIs documented here are public and open for use by Mopidy
    extensions.


.. toctree::
    :glob:
Concepts
========

    concepts
.. toctree::

    architecture
    models
    backends


Basics
======

.. toctree::

    core
    audio
    mixer
    frontends
    commands
    frontend
    backend
    ext
    config
    zeroconf


Web/JavaScript
==============

.. toctree::

    http-server
    http
    js


Audio
=====

.. toctree::

    audio
    mixer


Utilities
=========

.. toctree::

    commands
    config
    httpclient
    zeroconf
@@ -21,9 +21,9 @@ available at:

You may need to adjust hostname and port for your local setup.

Thus, if you use Mopidy to host your web client, as described above, you can
load the latest version of Mopidy.js by adding the following script tag to your
HTML file:
Thus, if you use Mopidy to host your web client, as described in
:ref:`static-web-client`, you can load the latest version of Mopidy.js by
adding the following script tag to your HTML file:

.. code-block:: html
@@ -189,13 +189,10 @@ you've hooked up an errback (more on that a bit later) to the promise returned
from the call, the errback will be called with a ``Mopidy.ConnectionError``
instance.

All methods in Mopidy's :ref:`core-api` are available via Mopidy.js. The core
API attributes are *not* available, but that shouldn't be a problem as we've
added (undocumented) getters and setters for all of them, so you can access the
attributes as well from JavaScript. For example, the
:attr:`mopidy.core.PlaybackController.state` attribute is available in
JSON-RPC as the method ``core.playback.get_state`` and in Mopidy.js as
``mopidy.playback.getState()``.
All methods in Mopidy's :ref:`core-api` are available via Mopidy.js. For
example, the :meth:`mopidy.core.PlaybackController.get_state` method is
available in JSON-RPC as the method ``core.playback.get_state`` and in
Mopidy.js as ``mopidy.playback.getState()``.

Both the WebSocket API and the JavaScript API are based on introspection of the
core Python API. Thus, they will always be up to date and immediately reflect
@@ -218,8 +215,7 @@ by looking at the method's ``description`` and ``params`` attributes:

JSON-RPC 2.0 limits method parameters to be sent *either* by-position or
by-name. Combinations of both, like we're used to from Python, aren't supported
by JSON-RPC 2.0. To further limit this, Mopidy.js currently only supports
passing parameters by-position.
by JSON-RPC 2.0.

Obviously, you'll want to get a return value from many of your method calls.
Since everything is happening across the WebSocket and maybe even across the
@@ -272,8 +268,9 @@ passing it as the second argument to ``done()``:
    .done(printCurrentTrack, console.error.bind(console));

If you don't hook up an error handler function and never call ``done()`` on the
promise object, when.js will log warnings to the console that you have
unhandled errors. In general, unhandled errors will not go silently missing.
promise object, warnings will be logged to the console complaining that you
have unhandled errors. In general, unhandled errors will not go silently
missing.

The promise objects returned by Mopidy.js adhere to the `CommonJS Promises/A
<http://wiki.commonjs.org/wiki/Promises/A>`_ standard. We use the
@@ -1,8 +1,8 @@
.. _mixer-api:

***************
Audio mixer API
***************
***************************************
:mod:`mopidy.mixer` --- Audio mixer API
***************************************

.. module:: mopidy.mixer
    :synopsis: The audio mixer API
@@ -1,14 +1,14 @@
***********
Data models
***********
************************************
:mod:`mopidy.models` --- Data models
************************************

These immutable data models are used for all data transfer within the Mopidy
backends and between the backends and the MPD frontend. All fields are optional
and immutable. In other words, they can only be set through the class
constructor during instance creation.
constructor during instance creation. Additionally, fields are type checked.

If you want to modify a model, use the
:meth:`~mopidy.models.ImmutableObject.copy` method. It accepts keyword
:meth:`~mopidy.models.ImmutableObject.replace` method. It accepts keyword
arguments for the parts of the model you want to change, and copies the rest of
the data from the model you call it on. Example::

@@ -16,7 +16,7 @@ the data from the model you call it on. Example::
    >>> track1 = Track(name='Christmas Carol', length=171)
    >>> track1
    Track(artists=[], length=171, name='Christmas Carol')
    >>> track2 = track1.copy(length=37)
    >>> track2 = track1.replace(length=37)
    >>> track2
    Track(artists=[], length=37, name='Christmas Carol')
    >>> track1
@@ -75,7 +75,31 @@ Data model helpers
==================

.. autoclass:: mopidy.models.ImmutableObject
    :members:

.. autoclass:: mopidy.models.ValidatedImmutableObject
    :members: replace

Data model (de)serialization
----------------------------

.. autofunction:: mopidy.models.model_json_decoder

.. autoclass:: mopidy.models.ModelJSONEncoder

.. autofunction:: mopidy.models.model_json_decoder
Data model field types
----------------------

.. autoclass:: mopidy.models.fields.Field

.. autoclass:: mopidy.models.fields.String

.. autoclass:: mopidy.models.fields.Identifier

.. autoclass:: mopidy.models.fields.URI

.. autoclass:: mopidy.models.fields.Date

.. autoclass:: mopidy.models.fields.Integer

.. autoclass:: mopidy.models.fields.Collection
@@ -1,8 +1,8 @@
.. _zeroconf-api:

************
Zeroconf API
************
***************************************
:mod:`mopidy.zeroconf` --- Zeroconf API
***************************************

.. module:: mopidy.zeroconf
    :synopsis: Helper for publishing of services on Zeroconf
@@ -4,6 +4,219 @@ Changelog

This changelog is used to track all major changes to Mopidy.

v1.1.0 (2015-08-09)
===================

Mopidy 1.1 is here!

Since the release of 1.0, we've closed or merged approximately 55 issues and
pull requests through about 400 commits by a record high 20 extraordinary
people, including 14 newcomers. That's fewer issues and commits than in the 1.0
release, but even more contributors, and a doubling of the number of newcomers.
Thanks to :ref:`everyone <authors>` who has :ref:`contributed <contributing>`!

As we promised with the release of Mopidy 1.0, any extension working with
Mopidy 1.0 should continue working with all Mopidy 1.x releases. However, this
release brings much stronger enforcement of our documented APIs. If an
extension doesn't use the APIs properly, it may no longer work. The advantage
of this change is that Mopidy is now more robust against errors in extensions,
and also provides vastly better error messages when extensions misbehave. This
should make it easier to create quality extensions.

The major features of Mopidy 1.1 are:

- Validation of the arguments to all core API methods, as well as all responses
  from backends and all data model attributes.

- New bundled backend, Mopidy-File. It is similar to Mopidy-Local, but allows
  you to browse and play music from local disk without running a scan to index
  the music first. The drawback is that it doesn't support searching.

- The Mopidy-MPD server should now be up to date with the 0.19 version of the
  MPD protocol.

Dependencies
------------

- Mopidy now requires Requests.

- Heads up: Porting from GStreamer 0.10 to 1.x and support for running Mopidy
  with Python 3.4+ is not far off on our roadmap.

Core API
--------

- **Deprecated:** Calling the following methods with ``kwargs`` is being
  deprecated. (PR: :issue:`1090`)

  - :meth:`mopidy.core.LibraryController.search`
  - :meth:`mopidy.core.PlaylistsController.filter`
  - :meth:`mopidy.core.TracklistController.filter`
  - :meth:`mopidy.core.TracklistController.remove`

- Updated core controllers to handle backend exceptions in all calls that rely
  on multiple backends. (Issue: :issue:`667`)

- Update core methods to do strict input checking. (Fixes: :issue:`700`)

- Add ``tlid`` alternatives to methods that take ``tl_track`` and also add
  ``get_{eot,next,previous}_tlid`` methods as lightweight alternatives to the
  ``tl_track`` versions of the calls; see the sketch after this list.
  (Fixes: :issue:`1131` PR: :issue:`1136`, :issue:`1140`)

- Add :meth:`mopidy.core.PlaybackController.get_current_tlid`.
  (Part of: :issue:`1137`)

- Update core to handle backend crashes and bad data. (Fixes: :issue:`1161`)

- Add :confval:`core/max_tracklist_length` config and limitation. (Fixes:
  :issue:`997` PR: :issue:`1225`)

- Added ``playlist_deleted`` event. (Fixes: :issue:`996`)
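To illustrate the ``tlid`` based calls mentioned in the list above, here is a small sketch of how a frontend might use them instead of the heavier ``tl_track`` variants. It assumes ``core`` is the usual actor proxy handed to a frontend, and that :meth:`~mopidy.core.PlaybackController.play` accepts a ``tlid`` keyword as described in the bullet above::

    # Ask core which tracklist entry will play next, as a plain tlid (an
    # integer) instead of a full TlTrack model. Each call returns a Pykka
    # future, so .get() blocks until core has answered.
    next_tlid = core.tracklist.get_next_tlid().get()

    if next_tlid is not None:
        # Jump straight to that entry using the tlid variant of play().
        core.playback.play(tlid=next_tlid).get()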
Models
------

- Added type checks and other sanity checks to model construction and
  serialization. (Fixes: :issue:`865`)

- Memory usage for models has been greatly improved. We now have a lower
  overhead per instance by using slots, interning identifiers, and automatically
  reusing instances. The test data set this was developed against, a library of
  ~14,000 tracks, went from needing ~75MB to ~17MB. (Fixes: :issue:`348`)

- Added :attr:`mopidy.models.Artist.sortname` field that is mapped to the
  ``musicbrainz-sortname`` tag. (Fixes: :issue:`940`)

Configuration
-------------

- Add new configurations to set base directories to be used by Mopidy and
  Mopidy extensions: :confval:`core/cache_dir`, :confval:`core/config_dir`, and
  :confval:`core/data_dir`. (Fixes: :issue:`843`, PR: :issue:`1232`)

Extension support
-----------------

- Add new methods to the :class:`~mopidy.ext.Extension` class for getting cache,
  config and data directories specific to your extension:

  - :meth:`mopidy.ext.Extension.get_cache_dir`
  - :meth:`mopidy.ext.Extension.get_config_dir`
  - :meth:`mopidy.ext.Extension.get_data_dir`

  Extensions should use these methods so that the correct directories are used
  both when Mopidy is run by a regular user and when run as a system service.
  (Fixes: :issue:`843`, PR: :issue:`1232`) A usage sketch follows at the end of
  this section.

- Add :func:`mopidy.httpclient.format_proxy` and
  :func:`mopidy.httpclient.format_user_agent`. (Part of: :issue:`1156`)

- It is now possible to import :mod:`mopidy.backends` without having GObject or
  GStreamer installed. In other words, a lot of backend extensions should now
  be able to run tests in a virtualenv with global site-packages disabled. This
  removes a lot of potential error sources. (Fixes: :issue:`1068`, PR:
  :issue:`1115`)
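As promised above, here is a brief sketch of how an extension might use the new directory helpers. It is illustrative only: ``Mopidy-SoundSpot`` is the fictional extension used elsewhere in the Mopidy docs, the ``store_state`` method is invented for the example, and the exact paths depend on whether Mopidy runs as a regular user or as a system service::

    import os

    from mopidy import ext


    class Extension(ext.Extension):
        # Other required Extension methods (e.g. config handling) omitted.
        dist_name = 'Mopidy-SoundSpot'
        ext_name = 'soundspot'
        version = '1.0.0'

        def setup(self, registry):
            pass  # register frontends/backends here

        def store_state(self, config, state):
            # get_data_dir() returns a per-extension data directory, e.g.
            # ~/.local/share/mopidy/soundspot when run as a regular user.
            data_dir = self.get_data_dir(config)
            with open(os.path.join(data_dir, 'state.json'), 'w') as fh:
                fh.write(state)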
Local backend
-------------

- Filter out :class:`None` from
  :meth:`~mopidy.backend.LibraryProvider.get_distinct` results. All returned
  results should be strings. (Fixes: :issue:`1202`)

Stream backend
--------------

- Move stream playlist parsing from GStreamer to the stream backend. (Fixes:
  :issue:`671`)

File backend
------------

The :ref:`Mopidy-File <ext-file>` backend is a new bundled backend. It is
similar to Mopidy-Local since it works with local files, but it differs in a
few key ways:

- Mopidy-File lets you browse your media files by their file hierarchy.

- It supports multiple media directories, all exposed under the "Files"
  directory when you browse your library with e.g. an MPD client.

- There is no index of the media files, like the JSON or SQLite files used by
  Mopidy-Local. Thus there is no need to scan the music collection before
  starting Mopidy. Everything is read from the file system when needed, and
  changes to the file system are thus immediately visible in Mopidy clients.

- Because there is no index, there is no support for search.

Our long term plan is to keep this very simple file backend in Mopidy, as it
has a well defined and limited scope, while splitting the more feature rich
Mopidy-Local extension out to an independent project. (Fixes: :issue:`1004`,
PR: :issue:`1207`)

M3U backend
-----------

- Support loading UTF-8 encoded M3U files with the ``.m3u8`` file extension.
  (PR: :issue:`1193`)

MPD frontend
------------

- The MPD command ``count`` now ignores tracks with no length, which would
  previously cause a :exc:`TypeError`. (PR: :issue:`1192`)

- Concatenate multiple artists, composers and performers using the "A;B" format
  instead of "A, B". This is a part of updating our protocol implementation to
  match MPD 0.19. (PR: :issue:`1213`)

- Add "not implemented" skeletons of new commands in the MPD protocol version
  0.19:

  - Current playlist:

    - ``rangeid``
    - ``addtagid``
    - ``cleartagid``

  - Mounts and neighbors:

    - ``mount``
    - ``unmount``
    - ``listmounts``
    - ``listneighbors``

  - Music DB:

    - ``listfiles``

- Track data now includes the ``Last-Modified`` field if set on the track model.
  (Fixes: :issue:`1218`, PR: :issue:`1219`)

- Implement the ``tagtypes`` MPD command. (PR: :issue:`1235`)

- Exclude empty tags fields from metadata output. (Fixes: :issue:`1045`, PR:
  :issue:`1235`)

- Implement protocol extensions to output Album URIs and Album Images when
  outputting track data to clients. (PR: :issue:`1230`)

- The MPD commands ``lsinfo`` and ``listplaylists`` are now implemented using
  the :meth:`~mopidy.core.PlaylistsController.as_list` method, which retrieves
  a lot less data and is thus much faster than the deprecated
  :meth:`~mopidy.core.PlaylistsController.get_playlists`. The drawback is that
  the ``Last-Modified`` timestamp is not available through this method, and the
  timestamps in the MPD command responses are now always set to the current
  time.

Internal changes
----------------

- Tests have been cleaned up to stop using deprecated APIs where feasible.
  (Partial fix: :issue:`1083`, PR: :issue:`1090`)


v1.0.8 (2015-07-22)
===================
docs/conf.py | 10

@@ -15,6 +15,7 @@ sys.path.insert(0, os.path.abspath(os.path.dirname(__file__) + '/../'))


class Mock(object):

    def __init__(self, *args, **kwargs):
        pass

@@ -46,7 +47,6 @@ class Mock(object):
        return Mock()

MOCK_MODULES = [
    'cherrypy',
    'dbus',
    'dbus.mainloop',
    'dbus.mainloop.glib',
@@ -60,12 +60,6 @@ MOCK_MODULES = [
    'pykka.actor',
    'pykka.future',
    'pykka.registry',
    'pylast',
    'ws4py',
    'ws4py.messaging',
    'ws4py.server',
    'ws4py.server.cherrypyserver',
    'ws4py.websocket',
]
for mod_name in MOCK_MODULES:
    sys.modules[mod_name] = Mock()
@@ -101,7 +95,7 @@ master_doc = 'index'
project = 'Mopidy'
copyright = '2009-2015, Stein Magnus Jodal and contributors'

from mopidy.utils.versioning import get_version
from mopidy.internal.versioning import get_version
release = get_version()
version = '.'.join(release.split('.')[:2])
@@ -57,6 +57,48 @@ Core configuration values

Mopidy's core has the following configuration values that you can change.


Core configuration
------------------

.. confval:: core/cache_dir

    Path to base directory for storing cached data.

    When running Mopidy as a regular user, this should usually be
    ``$XDG_CACHE_DIR/mopidy``, i.e. :file:`~/.cache/mopidy`.

    When running Mopidy as a system service, this should usually be
    :file:`/var/cache/mopidy`.

.. confval:: core/config_dir

    Path to base directory for config files.

    When running Mopidy as a regular user, this should usually be
    ``$XDG_CONFIG_DIR/mopidy``, i.e. :file:`~/.config/mopidy`.

    When running Mopidy as a system service, this should usually be
    :file:`/etc/mopidy`.

.. confval:: core/data_dir

    Path to base directory for persistent data files.

    When running Mopidy as a regular user, this should usually be
    ``$XDG_DATA_DIR/mopidy``, i.e. :file:`~/.local/share/mopidy`.

    When running Mopidy as a system service, this should usually be
    :file:`/var/lib/mopidy`.

.. confval:: core/max_tracklist_length

    Max length of the tracklist. Defaults to 10000.

    The original MPD server only supports 10000 tracks in the tracklist. Some
    MPD clients will crash if this limit is exceeded.


Audio configuration
-------------------
@@ -325,13 +325,6 @@ For each successful build, Travis submits code coverage data to `coveralls.io
<https://coveralls.io/r/mopidy/mopidy>`_. If you're out of work, coveralls might
help you find areas in the code which could need better test coverage.

In addition, we run a Jenkins CI server at https://ci.mopidy.com/ that runs all
tests on multiple platforms (Ubuntu, OS X, x86, arm) for every commit we push
to the ``develop`` branch in the main Mopidy repo on GitHub. Thus, new code
isn't tested by Jenkins before it is merged into the ``develop`` branch, which
is a bit late, but good enough to get broad testing before new code is
released.


.. _code-linting:
@@ -57,6 +57,28 @@ Provides a backend for browsing the Internet radio channels from the `Dirble
<http://dirble.com/>`_ directory.


Mopidy-dLeyna
=============

https://github.com/tkem/mopidy-dleyna

Provides a backend for playing music from Digital Media Servers using
the `dLeyna <http://01.org/dleyna>`_ D-Bus interface.

Mopidy-File
===========

Bundled with Mopidy. See :ref:`ext-file`.

Mopidy-Grooveshark
==================

https://github.com/camilonova/mopidy-grooveshark

Provides a backend for playing music from `Grooveshark
<http://grooveshark.com/>`_.


Mopidy-GMusic
=============
docs/ext/file.rst | 47 (new file)

@@ -0,0 +1,47 @@
.. _ext-file:

************
Mopidy-File
************

Mopidy-File is an extension for playing music from your local music archive.
It is bundled with Mopidy and enabled by default.
It allows you to browse through your local file system.
Only files that are considered playable will be shown.

This backend handles URIs starting with ``file:``.


Configuration
=============

See :ref:`config` for general help on configuring Mopidy.

.. literalinclude:: ../../mopidy/file/ext.conf
    :language: ini

.. confval:: file/enabled

    If the file extension should be enabled or not.

.. confval:: file/media_dirs

    A list of directories to be browsable.
    Optionally the path can be followed by ``|`` and a name that will be shown
    for that path.

.. confval:: file/show_dotfiles

    Whether to show hidden files and directories that start with a dot.
    Default is false.

.. confval:: file/follow_symlinks

    Whether to follow symbolic links found in :confval:`file/media_dirs`.
    Directories and files that are outside the configured directories will not
    be shown. Default is false.

.. confval:: file/metadata_timeout

    Number of milliseconds before giving up scanning a file and moving on to
    the next file. Reducing the value might speed up the directory listing,
    but can lead to some tracks not being shown.
docs/ext/mopster.png | BIN (new binary file, not shown; 80 KiB)
@@ -159,6 +159,21 @@ A web extension for changing settings. Used by the Pi MusicBox distribution
for Raspberry Pi, but also usable for other projects.


Mopster
=======

https://github.com/cowbell/mopster

Simple web client hosted online, written in Ember.js and styled using basic
Bootstrap, by Wojciech Wnętrzak.

.. image:: /ext/mopster.png
    :width: 1275
    :height: 628

To use, just visit http://mopster.cowbell-labs.com/.


Other web clients
=================
@@ -6,7 +6,7 @@ Extension development

Mopidy started as simply an MPD server that could play music from Spotify.
Early on, Mopidy got multiple "frontends" to expose Mopidy to more than just MPD
clients: for example the scrobbler frontend that scrobbles your listening
clients: for example the scrobbler frontend that scrobbles your listening
history to your Last.fm account, the MPRIS frontend that integrates Mopidy into the
Ubuntu Sound Menu, and the HTTP server and JavaScript player API making web
based Mopidy clients possible. In Mopidy 0.9 we added support for multiple
@@ -75,10 +75,10 @@ the readme of `cookiecutter-mopidy-ext
Example README.rst
==================

The README file should quickly explain what the extension does, how to install
it, and how to configure it. It should also contain a link to a tarball of the
latest development version of the extension. It's important that this link ends
with ``#egg=Mopidy-Something-dev`` for installation using
The README file should quickly explain what the extension does, how to install
it, and how to configure it. It should also contain a link to a tarball of the
latest development version of the extension. It's important that this link ends
with ``#egg=Mopidy-Something-dev`` for installation using
``pip install Mopidy-Something==dev`` to work.

.. code-block:: rst
@@ -230,8 +230,8 @@ The root of your Python package should have an ``__version__`` attribute with a
class named ``Extension`` which inherits from Mopidy's extension base class,
:class:`mopidy.ext.Extension`. This is the class referred to in the
``entry_points`` part of ``setup.py``. Any imports of other files in your
extension, outside of Mopidy and its core requirements, should be kept inside
methods. This ensures that this file can be imported without raising
extension, outside of Mopidy and its core requirements, should be kept inside
methods. This ensures that this file can be imported without raising
:exc:`ImportError` exceptions for missing dependencies, etc.

The default configuration for the extension is defined by the
@@ -245,7 +245,7 @@ change them. The exception is if the config value has security implications; in
that case you should default to the most secure configuration. Leave any
configurations that don't have meaningful defaults blank, like ``username``
and ``password``. In the example below, we've chosen to maintain the default
config as a separate file named ``ext.conf``. This makes it easy to include the
config as a separate file named ``ext.conf``. This makes it easy to include the
default config in documentation without duplicating it.

This is ``mopidy_soundspot/__init__.py``::
@@ -413,11 +413,11 @@ examples, see the :ref:`http-server-api` docs or explore with

Running an extension
====================

Once your extension is ready to go, to see it in action you'll need to register
it with Mopidy. Typically this is done by running ``python setup.py install``
from your extension's Git repo root directory. While developing your extension
and to avoid doing this every time you make a change, you can instead run
``python setup.py develop`` to effectively link Mopidy directly with your
Once your extension is ready to go, to see it in action you'll need to register
it with Mopidy. Typically this is done by running ``python setup.py install``
from your extension's Git repo root directory. While developing your extension
and to avoid doing this every time you make a change, you can instead run
``python setup.py develop`` to effectively link Mopidy directly with your
development files.


@@ -434,9 +434,12 @@ Use of Mopidy APIs
==================

When writing an extension, you should only use APIs documented at
:ref:`api-ref`. Other parts of Mopidy, like :mod:`mopidy.utils`, may change at
any time and are not something extensions should use.
:ref:`api-ref`. Other parts of Mopidy, like :mod:`mopidy.internal`, may change
at any time and are not something extensions should use.

Mopidy performs type checking to help catch extension bugs. This applies to
both frontend calls into core and return values from backends. Additionally,
model fields always get validated to further guard against bad data.
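As a concrete illustration of the model validation mentioned above, constructing a model with a field of the wrong type is rejected at creation time. This is a rough sketch; the exact exception type and message may vary between Mopidy versions::

    from mopidy.models import Track

    ok = Track(name='Christmas Carol', length=171)

    # With the 1.1 field validation, passing the wrong type is rejected at
    # construction time (typically with a TypeError or ValueError):
    try:
        Track(length='three minutes')
    except (TypeError, ValueError) as exc:
        print('Rejected: %s' % exc)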
Logging in extensions
=====================
@@ -471,3 +474,76 @@ Is much better than::

If you want to turn on debug logging for your own extension, but not for
everything else due to the amount of noise, see the docs for the
:confval:`loglevels/*` config section.


Making HTTP requests from extensions
====================================

Many Mopidy extensions need to make HTTP requests to use some web API. Here are
a few recommendations to those extensions.

Proxies
-------

If you make HTTP requests please make sure to respect the :ref:`proxy configs
<proxy-config>`, so that all the requests you make go through the proxy
configured by the Mopidy user. To make this easier for extension developers,
the helper function :func:`mopidy.httpclient.format_proxy` was added in Mopidy
1.1. This function returns the proxy settings `formatted the way Requests
expects <http://www.python-requests.org/en/latest/user/advanced/#proxies>`__.
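For example, assuming the user has a proxy configured in ``mopidy.conf``, the helper turns Mopidy's proxy config section into a single proxy URL string; the hostname shown in the comment is purely illustrative::

    from mopidy import httpclient

    # ``config`` is the config dict passed to your backend/frontend.
    proxy = httpclient.format_proxy(config['proxy'])
    # ``proxy`` is now something like 'http://proxy.example.com:8080', or
    # None if the user has not configured a proxy.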
User-Agent strings
------------------

When you make HTTP requests, it's helpful for debugging and usage analysis if
the client identifies itself with a proper User-Agent string. In Mopidy 1.1, we
added the helper function :func:`mopidy.httpclient.format_user_agent`. Here's
an example of how to use it::

    >>> from mopidy import httpclient
    >>> import mopidy_soundspot
    >>> httpclient.format_user_agent('%s/%s' % (
    ...     mopidy_soundspot.Extension.dist_name, mopidy_soundspot.__version__))
    u'Mopidy-SoundSpot/2.0.0 Mopidy/1.0.7 Python/2.7.10'

Example using Requests sessions
-------------------------------

Most Mopidy extensions that make HTTP requests use the `Requests
<http://www.python-requests.org/>`_ library to do so. When using Requests, the
most convenient way to make sure the proxy and User-Agent header is set
properly is to create a Requests session object and use that object to make all
your HTTP requests::

    from mopidy import httpclient

    import requests

    import mopidy_soundspot


    def get_requests_session(proxy_config, user_agent):
        proxy = httpclient.format_proxy(proxy_config)
        full_user_agent = httpclient.format_user_agent(user_agent)

        session = requests.Session()
        session.proxies.update({'http': proxy, 'https': proxy})
        session.headers.update({'user-agent': full_user_agent})

        return session


    # ``mopidy_config`` is the config object passed to your frontend/backend
    # constructor
    session = get_requests_session(
        proxy_config=mopidy_config['proxy'],
        user_agent='%s/%s' % (
            mopidy_soundspot.Extension.dist_name,
            mopidy_soundspot.__version__))

    response = session.get('http://example.com')
    # Now do something with ``response`` and/or make further requests using the
    # ``session`` object.

For further details, see Requests' docs on `session objects
<http://www.python-requests.org/en/latest/user/advanced/#session-objects>`__.
@@ -6,7 +6,7 @@ Glossary

backend
    A part of Mopidy providing music library, playlist storage and/or
    playback capability to the :term:`core`. Mopidy have a backend for each
    playback capability to the :term:`core`. Mopidy has a backend for each
    music store or music service it supports. See :ref:`backend-api` for
    details.
@@ -96,6 +96,7 @@ Extensions
    :maxdepth: 2

    ext/local
    ext/file
    ext/m3u
    ext/stream
    ext/http
@@ -173,3 +173,20 @@ More info about this issue can be found in `this post

Please note that if you're running Xbian or another XBMC distribution these
instructions might vary for your system.


Appendix C: Installation on XBian
=================================

Similar to the Raspbmc issue outlined in Appendix B, it's not possible to
install Mopidy on XBian without first resolving a dependency problem between
``gstreamer0.10-plugins-good`` and ``libtag1c2a``. More information can be
found in `this post
<https://github.com/xbianonpi/xbian/issues/378#issuecomment-37723392>`_.

Run the following commands to remedy this and then install Mopidy as normal::

    cd /tmp
    wget http://apt.xbian.org/pool/stable/rpi-wheezy/l/libtag1c2a/libtag1c2a_1.7.2-1_armhf.deb
    sudo dpkg -i libtag1c2a_1.7.2-1_armhf.deb
    rm libtag1c2a_1.7.2-1_armhf.deb
@@ -1,9 +1,23 @@
************************************
:mod:`mopidy.local` -- Local backend
************************************
*************************************
:mod:`mopidy.local` --- Local backend
*************************************

For details on how to use Mopidy's local backend, see :ref:`ext-local`.

.. automodule:: mopidy.local
    :synopsis: Local backend


Local library API
=================

.. autoclass:: mopidy.local.Library
    :members:


Translation utils
=================

.. automodule:: mopidy.local.translator
    :synopsis: Translators for local library extensions
    :members:
@@ -1,6 +1,6 @@
*******************************
:mod:`mopidy.mpd` -- MPD server
*******************************
********************************
:mod:`mopidy.mpd` --- MPD server
********************************

For details on how to use Mopidy's MPD server, see :ref:`ext-mpd`.
@@ -71,6 +71,14 @@ Current playlist
    :members:


Mounts and neighbors
--------------------

.. automodule:: mopidy.mpd.protocol.mount
    :synopsis: MPD protocol: mounts and neighbors
    :members:


Music database
--------------
@@ -47,8 +47,7 @@ Creating releases

#. Push to GitHub::

    git push
    git push --tags
    git push --follow-tags

#. Upload the previously built and tested sdist and bdist_wheel packages to
   PyPI::
@@ -13,6 +13,20 @@ When Mopidy says ``MPD server running at [127.0.0.1]:6600`` it's ready to
accept connections by any MPD client. Check out our non-exhaustive
:doc:`/clients/mpd` list to find recommended clients.

Updating the library
====================

To update the library, e.g. after audio files have changed, run::

    mopidy local scan

Afterwards, to refresh the library (which is for now only available
through the API) it is necessary to run::

    curl -d '{"jsonrpc": "2.0", "id": 1, "method": "core.library.refresh"}' http://localhost:6680/mopidy/rpc

This makes the changes in the library visible to the clients.
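If you prefer Python over ``curl``, the same JSON-RPC call can be made with Requests. This is a sketch only; it assumes the HTTP frontend is enabled on the default ``localhost:6680`` address used above::

    import requests

    # Same payload as the curl command above: a JSON-RPC 2.0 request that
    # invokes core.library.refresh over Mopidy's HTTP frontend.
    payload = {
        'jsonrpc': '2.0',
        'id': 1,
        'method': 'core.library.refresh',
    }
    response = requests.post('http://localhost:6680/mopidy/rpc', json=payload)
    print(response.json())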
Stopping Mopidy
===============

@@ -20,12 +20,6 @@ for free. We use their services for the following sites:

- Mailgun for sending emails from the Discourse forum.

- Hosting of the Jenkins CI server at https://ci.mopidy.com.

- Hosting of a Linux worker for https://ci.mopidy.com.

- Hosting of a Windows worker for https://ci.mopidy.com.

- CDN hosting at http://dl.mopidy.com, which is used to distribute Pi Musicbox
  images.
@@ -2,7 +2,6 @@ from __future__ import absolute_import, print_function, unicode_literals

import platform
import sys
import textwrap
import warnings


@@ -11,23 +10,8 @@ if not (2, 7) <= sys.version_info < (3,):
        'ERROR: Mopidy requires Python 2.7, but found %s.' %
        platform.python_version())

try:
    import gobject  # noqa
except ImportError:
    print(textwrap.dedent("""
        ERROR: The gobject Python package was not found.

        Mopidy requires GStreamer (and GObject) to work. These are C libraries
        with a number of dependencies themselves, and cannot be installed with
        the regular Python tools like pip.

        Please see http://docs.mopidy.com/en/latest/installation/ for
        instructions on how to install the required dependencies.
    """))
    raise


warnings.filterwarnings('ignore', 'could not open display')


__version__ = '1.0.8'
__version__ = '1.1.0'
@ -4,8 +4,23 @@ import logging
|
||||
import os
|
||||
import signal
|
||||
import sys
|
||||
import textwrap
|
||||
|
||||
try:
|
||||
import gobject # noqa
|
||||
except ImportError:
|
||||
print(textwrap.dedent("""
|
||||
ERROR: The gobject Python package was not found.
|
||||
|
||||
Mopidy requires GStreamer (and GObject) to work. These are C libraries
|
||||
with a number of dependencies themselves, and cannot be installed with
|
||||
the regular Python tools like pip.
|
||||
|
||||
Please see http://docs.mopidy.com/en/latest/installation/ for
|
||||
instructions on how to install the required dependencies.
|
||||
"""))
|
||||
raise
|
||||
|
||||
import gobject
|
||||
gobject.threads_init()
|
||||
|
||||
try:
|
||||
@ -26,7 +41,7 @@ sys.argv[1:] = []
|
||||
|
||||
|
||||
from mopidy import commands, config as config_lib, ext
|
||||
from mopidy.utils import encoding, log, path, process, versioning
|
||||
from mopidy.internal import encoding, log, path, process, versioning
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@ -51,21 +66,23 @@ def main():
|
||||
root_cmd.add_child('config', config_cmd)
|
||||
root_cmd.add_child('deps', deps_cmd)
|
||||
|
||||
installed_extensions = ext.load_extensions()
|
||||
extensions_data = ext.load_extensions()
|
||||
|
||||
for extension in installed_extensions:
|
||||
ext_cmd = extension.get_command()
|
||||
if ext_cmd:
|
||||
ext_cmd.set(extension=extension)
|
||||
root_cmd.add_child(extension.ext_name, ext_cmd)
|
||||
for data in extensions_data:
|
||||
if data.command: # TODO: check isinstance?
|
||||
data.command.set(extension=data.extension)
|
||||
root_cmd.add_child(data.extension.ext_name, data.command)
|
||||
|
||||
args = root_cmd.parse(mopidy_args)
|
||||
|
||||
create_file_structures_and_config(args, installed_extensions)
|
||||
create_file_structures_and_config(args, extensions_data)
|
||||
check_old_locations()
|
||||
|
||||
config, config_errors = config_lib.load(
|
||||
args.config_files, installed_extensions, args.config_overrides)
|
||||
args.config_files,
|
||||
[d.config_schema for d in extensions_data],
|
||||
[d.config_defaults for d in extensions_data],
|
||||
args.config_overrides)
|
||||
|
||||
verbosity_level = args.base_verbosity_level
|
||||
if args.verbosity_level:
|
||||
@ -75,8 +92,11 @@ def main():
|
||||
|
||||
extensions = {
|
||||
'validate': [], 'config': [], 'disabled': [], 'enabled': []}
|
||||
for extension in installed_extensions:
|
||||
if not ext.validate_extension(extension):
|
||||
for data in extensions_data:
|
||||
extension = data.extension
|
||||
|
||||
# TODO: factor out all of this to a helper that can be tested
|
||||
if not ext.validate_extension_data(data):
|
||||
config[extension.ext_name] = {'enabled': False}
|
||||
config_errors[extension.ext_name] = {
|
||||
'enabled': 'extension disabled by self check.'}
|
||||
@ -94,12 +114,13 @@ def main():
|
||||
else:
|
||||
extensions['enabled'].append(extension)
|
||||
|
||||
log_extension_info(installed_extensions, extensions['enabled'])
|
||||
log_extension_info([d.extension for d in extensions_data],
|
||||
extensions['enabled'])
|
||||
|
||||
# Config and deps commands are simply special cased for now.
|
||||
if args.command == config_cmd:
|
||||
return args.command.run(
|
||||
config, config_errors, installed_extensions)
|
||||
schemas = [d.config_schema for d in extensions_data]
|
||||
return args.command.run(config, config_errors, schemas)
|
||||
elif args.command == deps_cmd:
|
||||
return args.command.run()
|
||||
|
||||
@ -119,10 +140,19 @@ def main():
|
||||
return 1
|
||||
|
||||
for extension in extensions['enabled']:
|
||||
extension.setup(registry)
|
||||
try:
|
||||
extension.setup(registry)
|
||||
except Exception:
|
||||
# TODO: would be nice to have a transactional registry. But sadly this
# is a bit tricky since our current API is giving out a mutable
# list. We might however be able to replace this with a
# collections.Sequence to provide a read-only view.
logger.exception('Extension %s failed during setup, this might'
' have left the registry in a bad state.',
extension.ext_name)
|
||||
|
||||
# Anything that wants to exit after this point must use
|
||||
# mopidy.utils.process.exit_process as actors can have been started.
|
||||
# mopidy.internal.process.exit_process as actors can have been started.
|
||||
try:
|
||||
return args.command.run(args, proxied_config)
|
||||
except NotImplementedError:
|
||||
@ -136,7 +166,7 @@ def main():
|
||||
raise
|
||||
|
||||
|
||||
def create_file_structures_and_config(args, extensions):
|
||||
def create_file_structures_and_config(args, extensions_data):
|
||||
path.get_or_create_dir(b'$XDG_DATA_DIR/mopidy')
|
||||
path.get_or_create_dir(b'$XDG_CONFIG_DIR/mopidy')
|
||||
|
||||
@ -146,7 +176,7 @@ def create_file_structures_and_config(args, extensions):
|
||||
return
|
||||
|
||||
try:
|
||||
default = config_lib.format_initial(extensions)
|
||||
default = config_lib.format_initial(extensions_data)
|
||||
path.get_or_create_file(config_file, mkdir=False, content=default)
|
||||
logger.info('Initialized %s with default config', config_file)
|
||||
except IOError as error:
|
||||
|
||||
@ -13,10 +13,10 @@ import gst.pbutils # noqa
|
||||
import pykka
|
||||
|
||||
from mopidy import exceptions
|
||||
from mopidy.audio import playlists, utils
|
||||
from mopidy.audio import icy, utils
|
||||
from mopidy.audio.constants import PlaybackState
|
||||
from mopidy.audio.listener import AudioListener
|
||||
from mopidy.utils import process
|
||||
from mopidy.internal import deprecation, process
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
@ -26,8 +26,7 @@ logger = logging.getLogger(__name__)
|
||||
# set_state on a pipeline.
|
||||
gst_logger = logging.getLogger('mopidy.audio.gst')
|
||||
|
||||
playlists.register_typefinders()
|
||||
playlists.register_elements()
|
||||
icy.register()
|
||||
|
||||
_GST_STATE_MAPPING = {
|
||||
gst.STATE_PLAYING: PlaybackState.PLAYING,
|
||||
@ -36,7 +35,9 @@ _GST_STATE_MAPPING = {
|
||||
|
||||
|
||||
class _Signals(object):
|
||||
|
||||
"""Helper for tracking gobject signal registrations"""
|
||||
|
||||
def __init__(self):
|
||||
self._ids = {}
|
||||
|
||||
@ -65,7 +66,9 @@ class _Signals(object):
|
||||
|
||||
# TODO: expose this as a property on audio?
|
||||
class _Appsrc(object):
|
||||
|
||||
"""Helper class for dealing with appsrc based playback."""
|
||||
|
||||
def __init__(self):
|
||||
self._signals = _Signals()
|
||||
self.reset()
|
||||
@ -132,6 +135,7 @@ class _Appsrc(object):
|
||||
|
||||
# TODO: expose this as a property on audio when #790 gets further along.
|
||||
class _Outputs(gst.Bin):
|
||||
|
||||
def __init__(self):
|
||||
gst.Bin.__init__(self, 'outputs')
|
||||
|
||||
@ -202,6 +206,7 @@ class SoftwareMixer(object):
|
||||
|
||||
|
||||
class _Handler(object):
|
||||
|
||||
def __init__(self, audio):
|
||||
self._audio = audio
|
||||
self._element = None
|
||||
@ -370,6 +375,7 @@ class _Handler(object):
|
||||
|
||||
# TODO: create a player class which replaces the actors internals
|
||||
class Audio(pykka.ThreadingActor):
|
||||
|
||||
"""
|
||||
Audio output through `GStreamer <http://gstreamer.freedesktop.org/>`_.
|
||||
"""
|
||||
@ -582,6 +588,7 @@ class Audio(pykka.ThreadingActor):
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`emit_data` with a :class:`None` buffer instead.
|
||||
"""
|
||||
deprecation.warn('audio.emit_end_of_stream')
|
||||
self._appsrc.push(None)
|
||||
|
||||
def set_about_to_finish_callback(self, callback):
|
||||
|
||||
@ -2,6 +2,7 @@ from __future__ import absolute_import, unicode_literals
|
||||
|
||||
|
||||
class PlaybackState(object):
|
||||
|
||||
"""
|
||||
Enum of playback states.
|
||||
"""
|
||||
|
||||
63
mopidy/audio/icy.py
Normal file
@ -0,0 +1,63 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import gobject
|
||||
|
||||
import pygst
|
||||
pygst.require('0.10')
|
||||
import gst # noqa
|
||||
|
||||
|
||||
class IcySrc(gst.Bin, gst.URIHandler):
|
||||
__gstdetails__ = ('IcySrc',
|
||||
'Src',
|
||||
'HTTP src wrapper for icy:// support.',
|
||||
'Mopidy')
|
||||
|
||||
srcpad_template = gst.PadTemplate(
|
||||
'src', gst.PAD_SRC, gst.PAD_ALWAYS,
|
||||
gst.caps_new_any())
|
||||
|
||||
__gsttemplates__ = (srcpad_template,)
|
||||
|
||||
def __init__(self):
|
||||
super(IcySrc, self).__init__()
|
||||
self._httpsrc = gst.element_make_from_uri(gst.URI_SRC, 'http://')
|
||||
try:
|
||||
self._httpsrc.set_property('iradio-mode', True)
|
||||
except TypeError:
|
||||
pass
|
||||
self.add(self._httpsrc)
|
||||
|
||||
self._srcpad = gst.GhostPad('src', self._httpsrc.get_pad('src'))
|
||||
self.add_pad(self._srcpad)
|
||||
|
||||
@classmethod
|
||||
def do_get_type_full(cls):
|
||||
return gst.URI_SRC
|
||||
|
||||
@classmethod
|
||||
def do_get_protocols_full(cls):
|
||||
return [b'icy', b'icyx']
|
||||
|
||||
def do_set_uri(self, uri):
if uri.startswith('icy://'):
return self._httpsrc.set_uri(b'http://' + uri[len('icy://'):])
elif uri.startswith('icyx://'):
return self._httpsrc.set_uri(b'https://' + uri[len('icyx://'):])
else:
return False

def do_get_uri(self):
uri = self._httpsrc.get_uri()
if uri.startswith('http://'):
return b'icy://' + uri[len('http://'):]
else:
return b'icyx://' + uri[len('https://'):]

def register():
# Only register icy if the gst install can't handle it on its own.
if not gst.element_make_from_uri(gst.URI_SRC, 'icy://'):
gobject.type_register(IcySrc)
gst.element_register(
IcySrc, IcySrc.__name__.lower(), gst.RANK_MARGINAL)

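Stripped of the GStreamer plumbing, the new element just rewrites the URI scheme and hands the request to the ordinary HTTP source. An illustrative, self-contained sketch of that mapping (not part of the diff):

def _translate_icy_uri(uri):
    # Mirrors IcySrc.do_set_uri() above: icy:// is plain HTTP, icyx:// is HTTPS.
    if uri.startswith('icy://'):
        return 'http://' + uri[len('icy://'):]
    if uri.startswith('icyx://'):
        return 'https://' + uri[len('icyx://'):]
    return None

assert _translate_icy_uri('icy://example.com/stream') == 'http://example.com/stream'
assert _translate_icy_uri('icyx://example.com/stream') == 'https://example.com/stream'
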
@ -4,6 +4,7 @@ from mopidy import listener
|
||||
|
||||
|
||||
class AudioListener(listener.Listener):
|
||||
|
||||
"""
|
||||
Marker interface for recipients of events sent by the audio actor.
|
||||
|
||||
|
||||
@ -1,419 +0,0 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import io
|
||||
|
||||
import gobject
|
||||
|
||||
import pygst
|
||||
pygst.require('0.10')
|
||||
import gst # noqa
|
||||
|
||||
from mopidy.compat import configparser
|
||||
|
||||
try:
|
||||
import xml.etree.cElementTree as elementtree
|
||||
except ImportError:
|
||||
import xml.etree.ElementTree as elementtree
|
||||
|
||||
|
||||
# TODO: make detect_FOO_header reusable in general mopidy code.
|
||||
# i.e. give it just a "peek" like function.
|
||||
def detect_m3u_header(typefind):
|
||||
return typefind.peek(0, 7).upper() == b'#EXTM3U'
|
||||
|
||||
|
||||
def detect_pls_header(typefind):
|
||||
return typefind.peek(0, 10).lower() == b'[playlist]'
|
||||
|
||||
|
||||
def detect_xspf_header(typefind):
|
||||
data = typefind.peek(0, 150)
|
||||
if b'xspf' not in data.lower():
|
||||
return False
|
||||
|
||||
try:
|
||||
data = io.BytesIO(data)
|
||||
for event, element in elementtree.iterparse(data, events=(b'start',)):
|
||||
return element.tag.lower() == '{http://xspf.org/ns/0/}playlist'
|
||||
except elementtree.ParseError:
|
||||
pass
|
||||
return False
|
||||
|
||||
|
||||
def detect_asx_header(typefind):
|
||||
data = typefind.peek(0, 50)
|
||||
if b'asx' not in data.lower():
|
||||
return False
|
||||
|
||||
try:
|
||||
data = io.BytesIO(data)
|
||||
for event, element in elementtree.iterparse(data, events=(b'start',)):
|
||||
return element.tag.lower() == 'asx'
|
||||
except elementtree.ParseError:
|
||||
pass
|
||||
return False
|
||||
|
||||
|
||||
def parse_m3u(data):
|
||||
# TODO: convert non URIs to file URIs.
|
||||
found_header = False
|
||||
for line in data.readlines():
|
||||
if found_header or line.startswith(b'#EXTM3U'):
|
||||
found_header = True
|
||||
else:
|
||||
continue
|
||||
if not line.startswith(b'#') and line.strip():
|
||||
yield line.strip()
|
||||
|
||||
|
||||
def parse_pls(data):
|
||||
# TODO: convert non URIs to file URIs.
|
||||
try:
|
||||
cp = configparser.RawConfigParser()
|
||||
cp.readfp(data)
|
||||
except configparser.Error:
|
||||
return
|
||||
|
||||
for section in cp.sections():
|
||||
if section.lower() != 'playlist':
|
||||
continue
|
||||
for i in range(cp.getint(section, 'numberofentries')):
|
||||
yield cp.get(section, 'file%d' % (i + 1))
|
||||
|
||||
|
||||
def parse_xspf(data):
|
||||
try:
|
||||
# Last element will be root.
|
||||
for event, element in elementtree.iterparse(data):
|
||||
element.tag = element.tag.lower() # normalize
|
||||
except elementtree.ParseError:
|
||||
return
|
||||
|
||||
ns = 'http://xspf.org/ns/0/'
|
||||
for track in element.iterfind('{%s}tracklist/{%s}track' % (ns, ns)):
|
||||
yield track.findtext('{%s}location' % ns)
|
||||
|
||||
|
||||
def parse_asx(data):
|
||||
try:
|
||||
# Last element will be root.
|
||||
for event, element in elementtree.iterparse(data):
|
||||
element.tag = element.tag.lower() # normalize
|
||||
except elementtree.ParseError:
|
||||
return
|
||||
|
||||
for ref in element.findall('entry/ref[@href]'):
|
||||
yield ref.get('href', '').strip()
|
||||
|
||||
for entry in element.findall('entry[@href]'):
|
||||
yield entry.get('href', '').strip()
|
||||
|
||||
|
||||
def parse_urilist(data):
|
||||
for line in data.readlines():
|
||||
if not line.startswith('#') and gst.uri_is_valid(line.strip()):
|
||||
yield line
|
||||
|
||||
|
||||
def playlist_typefinder(typefind, func, caps):
|
||||
if func(typefind):
|
||||
typefind.suggest(gst.TYPE_FIND_MAXIMUM, caps)
|
||||
|
||||
|
||||
def register_typefind(mimetype, func, extensions):
|
||||
caps = gst.caps_from_string(mimetype)
|
||||
gst.type_find_register(mimetype, gst.RANK_PRIMARY, playlist_typefinder,
|
||||
extensions, caps, func, caps)
|
||||
|
||||
|
||||
def register_typefinders():
|
||||
register_typefind('audio/x-mpegurl', detect_m3u_header, [b'm3u', b'm3u8'])
|
||||
register_typefind('audio/x-scpls', detect_pls_header, [b'pls'])
|
||||
register_typefind('application/xspf+xml', detect_xspf_header, [b'xspf'])
|
||||
# NOTE: seems we can't use video/x-ms-asf which is the correct mime for asx
|
||||
# as it is shared with asf for streaming videos :/
|
||||
register_typefind('audio/x-ms-asx', detect_asx_header, [b'asx'])
|
||||
|
||||
|
||||
class BasePlaylistElement(gst.Bin):
|
||||
"""Base class for creating GStreamer elements for playlist support.
|
||||
|
||||
This element performs the following steps:
|
||||
|
||||
1. Initializes src and sink pads for the element.
|
||||
2. Collects data from the sink until EOS is reached.
|
||||
3. Passes the collected data to :meth:`convert` to get a list of URIs.
|
||||
4. Passes the list of URIs to :meth:`handle`, default handling is to pass
|
||||
the URIs to the src element as a uri-list.
|
||||
5. If handle returned true, the EOS consumed and nothing more happens, if
|
||||
it is not consumed it flows on to the next element downstream, which is
|
||||
likely our uri-list consumer which needs the EOS to know we are done
|
||||
sending URIs.
|
||||
"""
|
||||
|
||||
sinkpad_template = None
|
||||
"""GStreamer pad template to use for sink, must be overriden."""
|
||||
|
||||
srcpad_template = None
|
||||
"""GStreamer pad template to use for src, must be overriden."""
|
||||
|
||||
ghost_srcpad = False
|
||||
"""Indicates if src pad should be ghosted or not."""
|
||||
|
||||
def __init__(self):
|
||||
"""Sets up src and sink pads plus behaviour."""
|
||||
super(BasePlaylistElement, self).__init__()
|
||||
self._data = io.BytesIO()
|
||||
self._done = False
|
||||
|
||||
self.sinkpad = gst.Pad(self.sinkpad_template)
|
||||
self.sinkpad.set_chain_function(self._chain)
|
||||
self.sinkpad.set_event_function(self._event)
|
||||
self.add_pad(self.sinkpad)
|
||||
|
||||
if self.ghost_srcpad:
|
||||
self.srcpad = gst.ghost_pad_new_notarget('src', gst.PAD_SRC)
|
||||
else:
|
||||
self.srcpad = gst.Pad(self.srcpad_template)
|
||||
self.add_pad(self.srcpad)
|
||||
|
||||
def convert(self, data):
|
||||
"""Convert the data we have colleted to URIs.
|
||||
|
||||
:param data: collected data buffer
|
||||
:type data: :class:`io.BytesIO`
|
||||
:returns: iterable or generator of URIs
|
||||
"""
|
||||
raise NotImplementedError
|
||||
|
||||
def handle(self, uris):
|
||||
"""Do something useful with the URIs.
|
||||
|
||||
:param uris: list of URIs
|
||||
:type uris: :type:`list`
|
||||
:returns: boolean indicating if EOS should be consumed
|
||||
"""
|
||||
# TODO: handle unicode uris which we can get out of elementtree
|
||||
self.srcpad.push(gst.Buffer('\n'.join(uris)))
|
||||
return False
|
||||
|
||||
def _chain(self, pad, buf):
|
||||
if not self._done:
|
||||
self._data.write(buf.data)
|
||||
return gst.FLOW_OK
|
||||
return gst.FLOW_EOS
|
||||
|
||||
def _event(self, pad, event):
|
||||
if event.type == gst.EVENT_NEWSEGMENT:
|
||||
return True
|
||||
|
||||
if event.type == gst.EVENT_EOS:
|
||||
self._done = True
|
||||
self._data.seek(0)
|
||||
if self.handle(list(self.convert(self._data))):
|
||||
return True
|
||||
|
||||
# Ensure we handle remaining events in a sane way.
|
||||
return pad.event_default(event)
|
||||
|
||||
|
||||
class M3uDecoder(BasePlaylistElement):
|
||||
__gstdetails__ = ('M3U Decoder',
|
||||
'Decoder',
|
||||
'Convert .m3u to text/uri-list',
|
||||
'Mopidy')
|
||||
|
||||
sinkpad_template = gst.PadTemplate(
|
||||
'sink', gst.PAD_SINK, gst.PAD_ALWAYS,
|
||||
gst.caps_from_string('audio/x-mpegurl'))
|
||||
|
||||
srcpad_template = gst.PadTemplate(
|
||||
'src', gst.PAD_SRC, gst.PAD_ALWAYS,
|
||||
gst.caps_from_string('text/uri-list'))
|
||||
|
||||
__gsttemplates__ = (sinkpad_template, srcpad_template)
|
||||
|
||||
def convert(self, data):
|
||||
return parse_m3u(data)
|
||||
|
||||
|
||||
class PlsDecoder(BasePlaylistElement):
|
||||
__gstdetails__ = ('PLS Decoder',
|
||||
'Decoder',
|
||||
'Convert .pls to text/uri-list',
|
||||
'Mopidy')
|
||||
|
||||
sinkpad_template = gst.PadTemplate(
|
||||
'sink', gst.PAD_SINK, gst.PAD_ALWAYS,
|
||||
gst.caps_from_string('audio/x-scpls'))
|
||||
|
||||
srcpad_template = gst.PadTemplate(
|
||||
'src', gst.PAD_SRC, gst.PAD_ALWAYS,
|
||||
gst.caps_from_string('text/uri-list'))
|
||||
|
||||
__gsttemplates__ = (sinkpad_template, srcpad_template)
|
||||
|
||||
def convert(self, data):
|
||||
return parse_pls(data)
|
||||
|
||||
|
||||
class XspfDecoder(BasePlaylistElement):
|
||||
__gstdetails__ = ('XSPF Decoder',
|
||||
'Decoder',
|
||||
'Convert .pls to text/uri-list',
|
||||
'Mopidy')
|
||||
|
||||
sinkpad_template = gst.PadTemplate(
|
||||
'sink', gst.PAD_SINK, gst.PAD_ALWAYS,
|
||||
gst.caps_from_string('application/xspf+xml'))
|
||||
|
||||
srcpad_template = gst.PadTemplate(
|
||||
'src', gst.PAD_SRC, gst.PAD_ALWAYS,
|
||||
gst.caps_from_string('text/uri-list'))
|
||||
|
||||
__gsttemplates__ = (sinkpad_template, srcpad_template)
|
||||
|
||||
def convert(self, data):
|
||||
return parse_xspf(data)
|
||||
|
||||
|
||||
class AsxDecoder(BasePlaylistElement):
|
||||
__gstdetails__ = ('ASX Decoder',
|
||||
'Decoder',
|
||||
'Convert .asx to text/uri-list',
|
||||
'Mopidy')
|
||||
|
||||
sinkpad_template = gst.PadTemplate(
|
||||
'sink', gst.PAD_SINK, gst.PAD_ALWAYS,
|
||||
gst.caps_from_string('audio/x-ms-asx'))
|
||||
|
||||
srcpad_template = gst.PadTemplate(
|
||||
'src', gst.PAD_SRC, gst.PAD_ALWAYS,
|
||||
gst.caps_from_string('text/uri-list'))
|
||||
|
||||
__gsttemplates__ = (sinkpad_template, srcpad_template)
|
||||
|
||||
def convert(self, data):
|
||||
return parse_asx(data)
|
||||
|
||||
|
||||
class UriListElement(BasePlaylistElement):
|
||||
__gstdetails__ = ('URIListDemuxer',
|
||||
'Demuxer',
|
||||
'Convert a text/uri-list to a stream',
|
||||
'Mopidy')
|
||||
|
||||
sinkpad_template = gst.PadTemplate(
|
||||
'sink', gst.PAD_SINK, gst.PAD_ALWAYS,
|
||||
gst.caps_from_string('text/uri-list'))
|
||||
|
||||
srcpad_template = gst.PadTemplate(
|
||||
'src', gst.PAD_SRC, gst.PAD_ALWAYS,
|
||||
gst.caps_new_any())
|
||||
|
||||
ghost_srcpad = True # We need to hook this up to our internal decodebin
|
||||
|
||||
__gsttemplates__ = (sinkpad_template, srcpad_template)
|
||||
|
||||
def __init__(self):
|
||||
super(UriListElement, self).__init__()
|
||||
self.uridecodebin = gst.element_factory_make('uridecodebin')
|
||||
self.uridecodebin.connect('pad-added', self.pad_added)
|
||||
# Limit to anycaps so we get a single stream out, letting other
|
||||
# elements downstream figure out actual muxing
|
||||
self.uridecodebin.set_property('caps', gst.caps_new_any())
|
||||
|
||||
def pad_added(self, src, pad):
|
||||
self.srcpad.set_target(pad)
|
||||
pad.add_event_probe(self.pad_event)
|
||||
|
||||
def pad_event(self, pad, event):
|
||||
if event.has_name('urilist-played'):
|
||||
error = gst.GError(gst.RESOURCE_ERROR, gst.RESOURCE_ERROR_FAILED,
|
||||
b'Nested playlists not supported.')
|
||||
message = b'Playlists pointing to other playlists is not supported'
|
||||
self.post_message(gst.message_new_error(self, error, message))
|
||||
return 1 # GST_PAD_PROBE_OK
|
||||
|
||||
def handle(self, uris):
|
||||
struct = gst.Structure('urilist-played')
|
||||
event = gst.event_new_custom(gst.EVENT_CUSTOM_UPSTREAM, struct)
|
||||
self.sinkpad.push_event(event)
|
||||
|
||||
# TODO: hookup about to finish and errors to rest of URIs so we
|
||||
# round robin, only giving up once all have been tried.
|
||||
# TODO: uris could be empty.
|
||||
self.add(self.uridecodebin)
|
||||
self.uridecodebin.set_state(gst.STATE_READY)
|
||||
self.uridecodebin.set_property('uri', uris[0])
|
||||
self.uridecodebin.sync_state_with_parent()
|
||||
return True # Make sure we consume the EOS that triggered us.
|
||||
|
||||
def convert(self, data):
|
||||
return parse_urilist(data)
|
||||
|
||||
|
||||
class IcySrc(gst.Bin, gst.URIHandler):
|
||||
__gstdetails__ = ('IcySrc',
|
||||
'Src',
|
||||
'HTTP src wrapper for icy:// support.',
|
||||
'Mopidy')
|
||||
|
||||
srcpad_template = gst.PadTemplate(
|
||||
'src', gst.PAD_SRC, gst.PAD_ALWAYS,
|
||||
gst.caps_new_any())
|
||||
|
||||
__gsttemplates__ = (srcpad_template,)
|
||||
|
||||
def __init__(self):
|
||||
super(IcySrc, self).__init__()
|
||||
self._httpsrc = gst.element_make_from_uri(gst.URI_SRC, 'http://')
|
||||
try:
|
||||
self._httpsrc.set_property('iradio-mode', True)
|
||||
except TypeError:
|
||||
pass
|
||||
self.add(self._httpsrc)
|
||||
|
||||
self._srcpad = gst.GhostPad('src', self._httpsrc.get_pad('src'))
|
||||
self.add_pad(self._srcpad)
|
||||
|
||||
@classmethod
|
||||
def do_get_type_full(cls):
|
||||
return gst.URI_SRC
|
||||
|
||||
@classmethod
|
||||
def do_get_protocols_full(cls):
|
||||
return [b'icy', b'icyx']
|
||||
|
||||
def do_set_uri(self, uri):
|
||||
if uri.startswith('icy://'):
|
||||
return self._httpsrc.set_uri(b'http://' + uri[len('icy://'):])
|
||||
elif uri.startswith('icyx://'):
|
||||
return self._httpsrc.set_uri(b'https://' + uri[len('icyx://'):])
|
||||
else:
|
||||
return False
|
||||
|
||||
def do_get_uri(self):
|
||||
uri = self._httpsrc.get_uri()
|
||||
if uri.startswith('http://'):
|
||||
return b'icy://' + uri[len('http://'):]
|
||||
else:
|
||||
return b'icyx://' + uri[len('https://'):]
|
||||
|
||||
|
||||
def register_element(element_class):
|
||||
gobject.type_register(element_class)
|
||||
gst.element_register(
|
||||
element_class, element_class.__name__.lower(), gst.RANK_MARGINAL)
|
||||
|
||||
|
||||
def register_elements():
|
||||
register_element(M3uDecoder)
|
||||
register_element(PlsDecoder)
|
||||
register_element(XspfDecoder)
|
||||
register_element(AsxDecoder)
|
||||
register_element(UriListElement)
|
||||
|
||||
# Only register icy if gst install can't handle it on it's own.
|
||||
if not gst.element_make_from_uri(gst.URI_SRC, 'icy://'):
|
||||
register_element(IcySrc)
|
||||
@ -10,7 +10,7 @@ import gst.pbutils # noqa
|
||||
|
||||
from mopidy import exceptions
|
||||
from mopidy.audio import utils
|
||||
from mopidy.utils import encoding
|
||||
from mopidy.internal import encoding
|
||||
|
||||
_missing_plugin_desc = gst.pbutils.missing_plugin_message_get_description
|
||||
|
||||
@ -22,6 +22,7 @@ _RAW_AUDIO = gst.Caps(b'audio/x-raw-int; audio/x-raw-float')
|
||||
|
||||
# TODO: replace with a scan(uri, timeout=1000, proxy_config=None)?
|
||||
class Scanner(object):
|
||||
|
||||
"""
|
||||
Helper to get tags and other relevant info from URIs.
|
||||
|
||||
@ -73,7 +74,8 @@ def _setup_pipeline(uri, proxy_config=None):
|
||||
decodebin = gst.element_factory_make('decodebin2')
|
||||
|
||||
pipeline = gst.element_factory_make('pipeline')
|
||||
pipeline.add_many(src, typefind, decodebin)
|
||||
for e in (src, typefind, decodebin):
|
||||
pipeline.add(e)
|
||||
gst.element_link_many(src, typefind, decodebin)
|
||||
|
||||
if proxy_config:
|
||||
@ -180,7 +182,7 @@ if __name__ == '__main__':
|
||||
|
||||
import gobject
|
||||
|
||||
from mopidy.utils import path
|
||||
from mopidy.internal import path
|
||||
|
||||
gobject.threads_init()
|
||||
|
||||
|
||||
@ -8,7 +8,7 @@ import pygst
|
||||
pygst.require('0.10')
|
||||
import gst # noqa
|
||||
|
||||
from mopidy import compat
|
||||
from mopidy import compat, httpclient
|
||||
from mopidy.models import Album, Artist, Track
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
@ -65,15 +65,21 @@ def supported_uri_schemes(uri_schemes):
|
||||
return supported_schemes
|
||||
|
||||
|
||||
def _artists(tags, artist_name, artist_id=None):
def _artists(tags, artist_name, artist_id=None, artist_sortname=None):
# Name missing, don't set artist
if not tags.get(artist_name):
return None
# One artist name and id, provide artist with id.
if len(tags[artist_name]) == 1 and artist_id in tags:
return [Artist(name=tags[artist_name][0],
musicbrainz_id=tags[artist_id][0])]
# Multiple artist, provide artists without id.
# One artist name and either id or sortname, include all available fields
if len(tags[artist_name]) == 1 and \
(artist_id in tags or artist_sortname in tags):
attrs = {'name': tags[artist_name][0]}
if artist_id in tags:
attrs['musicbrainz_id'] = tags[artist_id][0]
if artist_sortname in tags:
attrs['sortname'] = tags[artist_sortname][0]
return [Artist(**attrs)]

# Multiple artists, provide artists with name only to avoid ambiguity.
return [Artist(name=name) for name in tags[artist_name]]

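A hypothetical tag dictionary makes the new behaviour concrete; the values below are made up, and the call mirrors the updated convert_tags_to_track() call site in this diff:

# Illustrative input for the updated _artists() helper:
tags = {
    'artist': ['The Beatles'],
    'musicbrainz-sortname': ['Beatles, The'],
}
# _artists(tags, 'artist', 'musicbrainz-artistid', 'musicbrainz-sortname')
# is now expected to return [Artist(name='The Beatles', sortname='Beatles, The')];
# before this change the sortname tag was simply ignored.
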
@ -91,8 +97,9 @@ def convert_tags_to_track(tags):
|
||||
|
||||
track_kwargs['composers'] = _artists(tags, gst.TAG_COMPOSER)
|
||||
track_kwargs['performers'] = _artists(tags, gst.TAG_PERFORMER)
|
||||
track_kwargs['artists'] = _artists(
|
||||
tags, gst.TAG_ARTIST, 'musicbrainz-artistid')
|
||||
track_kwargs['artists'] = _artists(tags, gst.TAG_ARTIST,
|
||||
'musicbrainz-artistid',
|
||||
'musicbrainz-sortname')
|
||||
album_kwargs['artists'] = _artists(
|
||||
tags, gst.TAG_ALBUM_ARTIST, 'musicbrainz-albumartistid')
|
||||
|
||||
@ -142,11 +149,7 @@ def setup_proxy(element, config):
|
||||
if not hasattr(element.props, 'proxy') or not config.get('hostname'):
return

proxy = "%s://%s:%d" % (config.get('scheme', 'http'),
config.get('hostname'),
config.get('port', 80))

element.set_property('proxy', proxy)
element.set_property('proxy', httpclient.format_proxy(config, auth=False))
element.set_property('proxy-id', config.get('username'))
element.set_property('proxy-pw', config.get('password'))

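The hand-built proxy string is replaced by mopidy.httpclient.format_proxy(). A rough usage sketch, assuming a typical proxy config dict; the exact return value is an assumption, not something shown in this diff:

from mopidy import httpclient

proxy_config = {'scheme': 'http', 'hostname': 'proxy.example.com', 'port': 8080}
# auth=False keeps credentials out of the URI, matching the old behaviour of
# passing username/password separately via the proxy-id/proxy-pw properties.
# Expected to yield something like 'http://proxy.example.com:8080'.
print(httpclient.format_proxy(proxy_config, auth=False))
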
@ -1,9 +1,15 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import logging
|
||||
|
||||
from mopidy import listener, models
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class Backend(object):
|
||||
|
||||
"""Backend API
|
||||
|
||||
If the backend has problems during initialization it should raise
|
||||
@ -57,8 +63,13 @@ class Backend(object):
|
||||
def has_playlists(self):
|
||||
return self.playlists is not None
|
||||
|
||||
def ping(self):
|
||||
"""Called to check if the actor is still alive."""
|
||||
return True
|
||||
|
||||
|
||||
class LibraryProvider(object):
|
||||
|
||||
"""
|
||||
:param backend: backend the controller is a part of
|
||||
:type backend: :class:`mopidy.backend.Backend`
|
||||
@ -151,6 +162,7 @@ class LibraryProvider(object):
|
||||
|
||||
|
||||
class PlaybackProvider(object):
|
||||
|
||||
"""
|
||||
:param audio: the audio actor
|
||||
:type audio: actor proxy to an instance of :class:`mopidy.audio.Audio`
|
||||
@ -231,6 +243,9 @@ class PlaybackProvider(object):
|
||||
:rtype: :class:`True` if successful, else :class:`False`
|
||||
"""
|
||||
uri = self.translate_uri(track.uri)
|
||||
if uri != track.uri:
|
||||
logger.debug(
|
||||
'Backend translated URI from %s to %s', track.uri, uri)
|
||||
if not uri:
|
||||
return False
|
||||
self.audio.set_uri(uri).get()
|
||||
@ -283,6 +298,7 @@ class PlaybackProvider(object):
|
||||
|
||||
|
||||
class PlaylistsProvider(object):
|
||||
|
||||
"""
|
||||
A playlist provider exposes a collection of playlists, methods to
|
||||
create/change/delete playlists in this collection, and lookup of any
|
||||
@ -394,6 +410,7 @@ class PlaylistsProvider(object):
|
||||
|
||||
|
||||
class BackendListener(listener.Listener):
|
||||
|
||||
"""
|
||||
Marker interface for recipients of events sent by the backend actors.
|
||||
|
||||
|
||||
@ -2,6 +2,7 @@ from __future__ import absolute_import, print_function, unicode_literals
|
||||
|
||||
import argparse
|
||||
import collections
|
||||
import contextlib
|
||||
import logging
|
||||
import os
|
||||
import sys
|
||||
@ -10,10 +11,12 @@ import glib
|
||||
|
||||
import gobject
|
||||
|
||||
import pykka
|
||||
|
||||
from mopidy import config as config_lib, exceptions
|
||||
from mopidy.audio import Audio
|
||||
from mopidy.core import Core
|
||||
from mopidy.utils import deps, process, timer, versioning
|
||||
from mopidy.internal import deps, process, timer, versioning
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@ -38,7 +41,9 @@ def config_override_type(value):
|
||||
|
||||
|
||||
class _ParserError(Exception):
|
||||
pass
|
||||
|
||||
def __init__(self, message):
|
||||
self.message = message
|
||||
|
||||
|
||||
class _HelpError(Exception):
|
||||
@ -46,11 +51,13 @@ class _HelpError(Exception):
|
||||
|
||||
|
||||
class _ArgumentParser(argparse.ArgumentParser):
|
||||
|
||||
def error(self, message):
|
||||
raise _ParserError(message)
|
||||
|
||||
|
||||
class _HelpAction(argparse.Action):
|
||||
|
||||
def __init__(self, option_strings, dest=None, help=None):
|
||||
super(_HelpAction, self).__init__(
|
||||
option_strings=option_strings,
|
||||
@ -64,6 +71,7 @@ class _HelpAction(argparse.Action):
|
||||
|
||||
|
||||
class Command(object):
|
||||
|
||||
"""Command parser and runner for building trees of commands.
|
||||
|
||||
This class provides a wrapper around :class:`argparse.ArgumentParser`
|
||||
@ -225,7 +233,26 @@ class Command(object):
|
||||
raise NotImplementedError
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def _actor_error_handling(name):
|
||||
try:
|
||||
yield
|
||||
except exceptions.BackendError as exc:
|
||||
logger.error(
|
||||
'Backend (%s) initialization error: %s', name, exc.message)
|
||||
except exceptions.FrontendError as exc:
|
||||
logger.error(
|
||||
'Frontend (%s) initialization error: %s', name, exc.message)
|
||||
except exceptions.MixerError as exc:
|
||||
logger.error(
|
||||
'Mixer (%s) initialization error: %s', name, exc.message)
|
||||
except Exception:
|
||||
logger.exception('Got un-handled exception from %s', name)
|
||||
|
||||
|
||||
# TODO: move out of this utility class
|
||||
class RootCommand(Command):
|
||||
|
||||
def __init__(self):
|
||||
super(RootCommand, self).__init__()
|
||||
self.set(base_verbosity_level=0)
|
||||
@ -270,9 +297,11 @@ class RootCommand(Command):
|
||||
mixer = None
|
||||
if mixer_class is not None:
|
||||
mixer = self.start_mixer(config, mixer_class)
|
||||
if mixer:
|
||||
self.configure_mixer(config, mixer)
|
||||
audio = self.start_audio(config, mixer)
|
||||
backends = self.start_backends(config, backend_classes, audio)
|
||||
core = self.start_core(mixer, backends, audio)
|
||||
core = self.start_core(config, mixer, backends, audio)
|
||||
self.start_frontends(config, frontend_classes, core)
|
||||
loop.run()
|
||||
except (exceptions.BackendError,
|
||||
@ -316,16 +345,15 @@ class RootCommand(Command):
|
||||
return selected_mixers[0]
|
||||
|
||||
def start_mixer(self, config, mixer_class):
|
||||
try:
|
||||
logger.info('Starting Mopidy mixer: %s', mixer_class.__name__)
|
||||
logger.info('Starting Mopidy mixer: %s', mixer_class.__name__)
|
||||
with _actor_error_handling(mixer_class.__name__):
|
||||
mixer = mixer_class.start(config=config).proxy()
|
||||
self.configure_mixer(config, mixer)
|
||||
return mixer
|
||||
except exceptions.MixerError as exc:
|
||||
logger.error(
|
||||
'Mixer (%s) initialization error: %s',
|
||||
mixer_class.__name__, exc.message)
|
||||
raise
|
||||
try:
|
||||
mixer.ping().get()
|
||||
return mixer
|
||||
except pykka.ActorDeadError as exc:
|
||||
logger.error('Actor died: %s', exc)
|
||||
return None
|
||||
|
||||
def configure_mixer(self, config, mixer):
|
||||
volume = config['audio']['mixer_volume']
|
||||
@ -346,22 +374,26 @@ class RootCommand(Command):
|
||||
|
||||
backends = []
|
||||
for backend_class in backend_classes:
|
||||
try:
|
||||
with _actor_error_handling(backend_class.__name__):
|
||||
with timer.time_logger(backend_class.__name__):
|
||||
backend = backend_class.start(
|
||||
config=config, audio=audio).proxy()
|
||||
backends.append(backend)
|
||||
except exceptions.BackendError as exc:
|
||||
logger.error(
|
||||
'Backend (%s) initialization error: %s',
|
||||
backend_class.__name__, exc.message)
|
||||
raise
|
||||
backends.append(backend)
|
||||
|
||||
# Block until all on_starts have finished, letting them run in parallel
|
||||
for backend in backends[:]:
|
||||
try:
|
||||
backend.ping().get()
|
||||
except pykka.ActorDeadError as exc:
|
||||
backends.remove(backend)
|
||||
logger.error('Actor died: %s', exc)
|
||||
|
||||
return backends
|
||||
|
||||
def start_core(self, mixer, backends, audio):
|
||||
def start_core(self, config, mixer, backends, audio):
|
||||
logger.info('Starting Mopidy core')
|
||||
return Core.start(mixer=mixer, backends=backends, audio=audio).proxy()
|
||||
return Core.start(
|
||||
config=config, mixer=mixer, backends=backends, audio=audio).proxy()
|
||||
|
||||
def start_frontends(self, config, frontend_classes, core):
|
||||
logger.info(
|
||||
@ -369,14 +401,9 @@ class RootCommand(Command):
|
||||
', '.join(f.__name__ for f in frontend_classes) or 'none')
|
||||
|
||||
for frontend_class in frontend_classes:
|
||||
try:
|
||||
with _actor_error_handling(frontend_class.__name__):
|
||||
with timer.time_logger(frontend_class.__name__):
|
||||
frontend_class.start(config=config, core=core)
|
||||
except exceptions.FrontendError as exc:
|
||||
logger.error(
|
||||
'Frontend (%s) initialization error: %s',
|
||||
frontend_class.__name__, exc.message)
|
||||
raise
|
||||
|
||||
def stop_frontends(self, frontend_classes):
|
||||
logger.info('Stopping Mopidy frontends')
|
||||
@ -408,8 +435,8 @@ class ConfigCommand(Command):
|
||||
super(ConfigCommand, self).__init__()
|
||||
self.set(base_verbosity_level=-1)
|
||||
|
||||
def run(self, config, errors, extensions):
|
||||
print(config_lib.format(config, extensions, errors))
|
||||
def run(self, config, errors, schemas):
|
||||
print(config_lib.format(config, schemas, errors))
|
||||
return 0
|
||||
|
||||
|
||||
|
||||
@ -11,10 +11,17 @@ from mopidy.compat import configparser
|
||||
from mopidy.config import keyring
|
||||
from mopidy.config.schemas import * # noqa
|
||||
from mopidy.config.types import * # noqa
|
||||
from mopidy.utils import path, versioning
|
||||
from mopidy.internal import path, versioning
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
_core_schema = ConfigSchema('core')
|
||||
_core_schema['cache_dir'] = Path()
|
||||
_core_schema['config_dir'] = Path()
|
||||
_core_schema['data_dir'] = Path()
|
||||
# MPD supports at most 10k tracks, some clients segfault when this is exceeded.
|
||||
_core_schema['max_tracklist_length'] = Integer(minimum=1, maximum=10000)
|
||||
|
||||
_logging_schema = ConfigSchema('logging')
|
||||
_logging_schema['color'] = Boolean()
|
||||
_logging_schema['console_format'] = String()
|
||||
@ -43,8 +50,9 @@ _proxy_schema['password'] = Secret(optional=True)
|
||||
# NOTE: if multiple outputs ever comes something like LogLevelConfigSchema
|
||||
# _outputs_schema = config.AudioOutputConfigSchema()
|
||||
|
||||
_schemas = [_logging_schema, _loglevels_schema, _logcolors_schema,
|
||||
_audio_schema, _proxy_schema]
|
||||
_schemas = [
|
||||
_core_schema, _logging_schema, _loglevels_schema, _logcolors_schema,
|
||||
_audio_schema, _proxy_schema]
|
||||
|
||||
_INITIAL_HELP = """
|
||||
# For further information about options in this file see:
|
||||
@ -65,41 +73,40 @@ def read(config_file):
|
||||
return filehandle.read()
|
||||
|
||||
|
||||
def load(files, extensions, overrides):
# Helper to get configs, as the rest of our config system should not need
# to know about extensions.
def load(files, ext_schemas, ext_defaults, overrides):
config_dir = os.path.dirname(__file__)
defaults = [read(os.path.join(config_dir, 'default.conf'))]
defaults.extend(e.get_default_config() for e in extensions)
defaults.extend(ext_defaults)
raw_config = _load(files, defaults, keyring.fetch() + (overrides or []))

schemas = _schemas[:]
schemas.extend(e.get_config_schema() for e in extensions)
schemas.extend(ext_schemas)
return _validate(raw_config, schemas)

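For reference, the matching call site from mopidy/__main__.py earlier in this diff, showing how callers now pass pre-extracted schemas and default config snippets instead of Extension objects:

config, config_errors = config_lib.load(
    args.config_files,
    [d.config_schema for d in extensions_data],
    [d.config_defaults for d in extensions_data],
    args.config_overrides)
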
def format(config, extensions, comments=None, display=True):
|
||||
# Helper to format configs, as the rest of our config system should not
|
||||
# need to know about extensions.
|
||||
def format(config, ext_schemas, comments=None, display=True):
|
||||
schemas = _schemas[:]
|
||||
schemas.extend(e.get_config_schema() for e in extensions)
|
||||
schemas.extend(ext_schemas)
|
||||
return _format(config, comments or {}, schemas, display, False)
|
||||
|
||||
|
||||
def format_initial(extensions):
|
||||
def format_initial(extensions_data):
|
||||
config_dir = os.path.dirname(__file__)
|
||||
defaults = [read(os.path.join(config_dir, 'default.conf'))]
|
||||
defaults.extend(e.get_default_config() for e in extensions)
|
||||
defaults.extend(d.extension.get_default_config() for d in extensions_data)
|
||||
raw_config = _load([], defaults, [])
|
||||
|
||||
schemas = _schemas[:]
|
||||
schemas.extend(e.get_config_schema() for e in extensions)
|
||||
schemas.extend(d.extension.get_config_schema() for d in extensions_data)
|
||||
|
||||
config, errors = _validate(raw_config, schemas)
|
||||
|
||||
versions = ['Mopidy %s' % versioning.get_version()]
|
||||
for extension in sorted(extensions, key=lambda ext: ext.dist_name):
|
||||
versions.append('%s %s' % (extension.dist_name, extension.version))
|
||||
extensions_data = sorted(
|
||||
extensions_data, key=lambda d: d.extension.dist_name)
|
||||
for data in extensions_data:
|
||||
versions.append('%s %s' % (
|
||||
data.extension.dist_name, data.extension.version))
|
||||
|
||||
header = _INITIAL_HELP.strip() % {'versions': '\n# '.join(versions)}
|
||||
formatted_config = _format(
|
||||
@ -264,6 +271,7 @@ def _postprocess(config_string):
|
||||
|
||||
|
||||
class Proxy(collections.Mapping):
|
||||
|
||||
def __init__(self, data):
|
||||
self._data = data
|
||||
|
||||
|
||||
@ -1,3 +1,9 @@
|
||||
[core]
|
||||
cache_dir = $XDG_CACHE_DIR/mopidy
|
||||
config_dir = $XDG_CONFIG_DIR/mopidy
|
||||
data_dir = $XDG_DATA_DIR/mopidy
|
||||
max_tracklist_length = 10000
|
||||
|
||||
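These new [core] defaults become available to the rest of Mopidy through the loaded config mapping; a hypothetical read, assuming the defaults above are in effect:

# Hypothetical access once the config has been loaded and validated:
max_length = config['core']['max_tracklist_length']  # 10000 by default
cache_dir = config['core']['cache_dir']              # expanded from $XDG_CACHE_DIR/mopidy
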
[logging]
|
||||
color = true
|
||||
console_format = %(levelname)-8s %(message)s
|
||||
|
||||
@ -38,6 +38,7 @@ def _levenshtein(a, b):
|
||||
|
||||
|
||||
class ConfigSchema(collections.OrderedDict):
|
||||
|
||||
"""Logical group of config values that correspond to a config section.
|
||||
|
||||
Schemas are set up by assigning config keys with config values to
|
||||
@ -47,6 +48,7 @@ class ConfigSchema(collections.OrderedDict):
|
||||
:meth:`serialize` for converting the values to a form suitable for
|
||||
persistence.
|
||||
"""
|
||||
|
||||
def __init__(self, name):
|
||||
super(ConfigSchema, self).__init__()
|
||||
self.name = name
|
||||
@ -95,6 +97,7 @@ class ConfigSchema(collections.OrderedDict):
|
||||
|
||||
|
||||
class MapConfigSchema(object):
|
||||
|
||||
"""Schema for handling multiple unknown keys with the same type.
|
||||
|
||||
Does not sub-class :class:`ConfigSchema`, but implements the same
|
||||
|
||||
@ -6,7 +6,7 @@ import socket
|
||||
|
||||
from mopidy import compat
|
||||
from mopidy.config import validators
|
||||
from mopidy.utils import log, path
|
||||
from mopidy.internal import log, path
|
||||
|
||||
|
||||
def decode(value):
|
||||
@ -25,6 +25,7 @@ def encode(value):
|
||||
|
||||
|
||||
class ExpandedPath(bytes):
|
||||
|
||||
def __new__(cls, original, expanded):
|
||||
return super(ExpandedPath, cls).__new__(cls, expanded)
|
||||
|
||||
@ -37,6 +38,7 @@ class DeprecatedValue(object):
|
||||
|
||||
|
||||
class ConfigValue(object):
|
||||
|
||||
"""Represents a config key's value and how to handle it.
|
||||
|
||||
Normally you will only be interacting with sub-classes for config values
|
||||
@ -65,6 +67,7 @@ class ConfigValue(object):
|
||||
|
||||
|
||||
class Deprecated(ConfigValue):
|
||||
|
||||
"""Deprecated value
|
||||
|
||||
Used for ignoring old config values that are no longer in use, but should
|
||||
@ -79,10 +82,12 @@ class Deprecated(ConfigValue):
|
||||
|
||||
|
||||
class String(ConfigValue):
|
||||
|
||||
"""String value.
|
||||
|
||||
Is decoded as utf-8 and \\n \\t escapes should work and be preserved.
|
||||
"""
|
||||
|
||||
def __init__(self, optional=False, choices=None):
|
||||
self._required = not optional
|
||||
self._choices = choices
|
||||
@ -102,6 +107,7 @@ class String(ConfigValue):
|
||||
|
||||
|
||||
class Secret(String):
|
||||
|
||||
"""Secret string value.
|
||||
|
||||
Is decoded as utf-8 and \\n \\t escapes should work and be preserved.
|
||||
@ -109,6 +115,7 @@ class Secret(String):
|
||||
Should be used for passwords, auth tokens etc. Will mask value when being
|
||||
displayed.
|
||||
"""
|
||||
|
||||
def __init__(self, optional=False, choices=None):
|
||||
self._required = not optional
|
||||
self._choices = None # Choices doesn't make sense for secrets
|
||||
@ -120,6 +127,7 @@ class Secret(String):
|
||||
|
||||
|
||||
class Integer(ConfigValue):
|
||||
|
||||
"""Integer value."""
|
||||
|
||||
def __init__(
|
||||
@ -141,6 +149,7 @@ class Integer(ConfigValue):
|
||||
|
||||
|
||||
class Boolean(ConfigValue):
|
||||
|
||||
"""Boolean value.
|
||||
|
||||
Accepts ``1``, ``yes``, ``true``, and ``on`` with any casing as
|
||||
@ -173,11 +182,13 @@ class Boolean(ConfigValue):
|
||||
|
||||
|
||||
class List(ConfigValue):
|
||||
|
||||
"""List value.
|
||||
|
||||
Supports elements split by commas or newlines. Newlines take precedence and
|
||||
empty list items will be filtered out.
|
||||
"""
|
||||
|
||||
def __init__(self, optional=False):
|
||||
self._required = not optional
|
||||
|
||||
@ -198,6 +209,7 @@ class List(ConfigValue):
|
||||
|
||||
|
||||
class LogColor(ConfigValue):
|
||||
|
||||
def deserialize(self, value):
|
||||
validators.validate_choice(value.lower(), log.COLORS)
|
||||
return value.lower()
|
||||
@ -209,6 +221,7 @@ class LogColor(ConfigValue):
|
||||
|
||||
|
||||
class LogLevel(ConfigValue):
|
||||
|
||||
"""Log level value.
|
||||
|
||||
Expects one of ``critical``, ``error``, ``warning``, ``info``, ``debug``,
|
||||
@ -235,6 +248,7 @@ class LogLevel(ConfigValue):
|
||||
|
||||
|
||||
class Hostname(ConfigValue):
|
||||
|
||||
"""Network hostname value."""
|
||||
|
||||
def __init__(self, optional=False):
|
||||
@ -252,18 +266,21 @@ class Hostname(ConfigValue):
|
||||
|
||||
|
||||
class Port(Integer):
|
||||
|
||||
"""Network port value.
|
||||
|
||||
Expects an integer in the range 0-65535; zero tells the kernel to simply
|
||||
allocate a port for us.
|
||||
"""
|
||||
# TODO: consider probing if port is free or not?
|
||||
|
||||
def __init__(self, choices=None, optional=False):
|
||||
super(Port, self).__init__(
|
||||
minimum=0, maximum=2 ** 16 - 1, choices=choices, optional=optional)
|
||||
|
||||
|
||||
class Path(ConfigValue):
|
||||
|
||||
"""File system path
|
||||
|
||||
The following expansions of the path will be done:
|
||||
@ -278,6 +295,7 @@ class Path(ConfigValue):
|
||||
|
||||
- ``$XDG_MUSIC_DIR`` according to the XDG spec
|
||||
"""
|
||||
|
||||
def __init__(self, optional=False):
|
||||
self._required = not optional
|
||||
|
||||
|
||||
@ -2,6 +2,7 @@ from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import collections
|
||||
import itertools
|
||||
import logging
|
||||
|
||||
import pykka
|
||||
|
||||
@ -14,8 +15,11 @@ from mopidy.core.mixer import MixerController
|
||||
from mopidy.core.playback import PlaybackController
|
||||
from mopidy.core.playlists import PlaylistsController
|
||||
from mopidy.core.tracklist import TracklistController
|
||||
from mopidy.utils import versioning
|
||||
from mopidy.utils.deprecation import deprecated_property
|
||||
from mopidy.internal import versioning
|
||||
from mopidy.internal.deprecation import deprecated_property
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class Core(
|
||||
@ -23,32 +27,28 @@ class Core(
|
||||
mixer.MixerListener):
|
||||
|
||||
library = None
|
||||
"""The library controller. An instance of
|
||||
:class:`mopidy.core.LibraryController`."""
|
||||
"""An instance of :class:`~mopidy.core.LibraryController`"""
|
||||
|
||||
history = None
|
||||
"""The playback history controller. An instance of
|
||||
:class:`mopidy.core.HistoryController`."""
|
||||
"""An instance of :class:`~mopidy.core.HistoryController`"""
|
||||
|
||||
mixer = None
|
||||
"""The mixer controller. An instance of
|
||||
:class:`mopidy.core.MixerController`."""
|
||||
"""An instance of :class:`~mopidy.core.MixerController`"""
|
||||
|
||||
playback = None
|
||||
"""The playback controller. An instance of
|
||||
:class:`mopidy.core.PlaybackController`."""
|
||||
"""An instance of :class:`~mopidy.core.PlaybackController`"""
|
||||
|
||||
playlists = None
|
||||
"""The playlists controller. An instance of
|
||||
:class:`mopidy.core.PlaylistsController`."""
|
||||
"""An instance of :class:`~mopidy.core.PlaylistsController`"""
|
||||
|
||||
tracklist = None
|
||||
"""The tracklist controller. An instance of
|
||||
:class:`mopidy.core.TracklistController`."""
|
||||
"""An instance of :class:`~mopidy.core.TracklistController`"""
|
||||
|
||||
def __init__(self, mixer=None, backends=None, audio=None):
|
||||
def __init__(self, config=None, mixer=None, backends=None, audio=None):
|
||||
super(Core, self).__init__()
|
||||
|
||||
self._config = config
|
||||
|
||||
self.backends = Backends(backends)
|
||||
|
||||
self.library = LibraryController(backends=self.backends, core=self)
|
||||
@ -134,6 +134,7 @@ class Core(
|
||||
|
||||
|
||||
class Backends(list):
|
||||
|
||||
def __init__(self, backends):
|
||||
super(Backends, self).__init__(backends)
|
||||
|
||||
@ -148,10 +149,15 @@ class Backends(list):
|
||||
return b.actor_ref.actor_class.__name__
|
||||
|
||||
for b in backends:
|
||||
has_library = b.has_library().get()
|
||||
has_library_browse = b.has_library_browse().get()
|
||||
has_playback = b.has_playback().get()
|
||||
has_playlists = b.has_playlists().get()
|
||||
try:
|
||||
has_library = b.has_library().get()
|
||||
has_library_browse = b.has_library_browse().get()
|
||||
has_playback = b.has_playback().get()
|
||||
has_playlists = b.has_playlists().get()
|
||||
except Exception:
|
||||
self.remove(b)
|
||||
logger.exception('Fetching backend info for %s failed',
|
||||
b.actor_ref.actor_class.__name__)
|
||||
|
||||
for scheme in b.uri_schemes.get():
|
||||
assert scheme not in backends_by_scheme, (
|
||||
|
||||
@ -1,15 +1,32 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import collections
|
||||
import contextlib
|
||||
import logging
|
||||
import operator
|
||||
import urlparse
|
||||
|
||||
import pykka
|
||||
from mopidy import compat, exceptions, models
|
||||
from mopidy.internal import deprecation, validation
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@contextlib.contextmanager
def _backend_error_handling(backend, reraise=None):
try:
yield
except exceptions.ValidationError as e:
logger.error('%s backend returned bad data: %s',
backend.actor_ref.actor_class.__name__, e)
except Exception as e:
if reraise and isinstance(e, reraise):
raise
logger.exception('%s backend caused an exception.',
backend.actor_ref.actor_class.__name__)
|
||||
|
||||
|
||||
class LibraryController(object):
|
||||
pykka_traversable = True
|
||||
|
||||
@ -38,8 +55,8 @@ class LibraryController(object):
|
||||
Browse directories and tracks at the given ``uri``.
|
||||
|
||||
``uri`` is a string which represents some directory belonging to a
|
||||
backend. To get the intial root directories for backends pass None as
|
||||
the URI.
|
||||
backend. To get the initial root directories for backends pass
|
||||
:class:`None` as the URI.
|
||||
|
||||
Returns a list of :class:`mopidy.models.Ref` objects for the
|
||||
directories and tracks at the given ``uri``.
|
||||
@ -67,15 +84,36 @@ class LibraryController(object):
|
||||
.. versionadded:: 0.18
|
||||
"""
|
||||
if uri is None:
|
||||
backends = self.backends.with_library_browse.values()
|
||||
unique_dirs = {b.library.root_directory.get() for b in backends}
|
||||
return sorted(unique_dirs, key=operator.attrgetter('name'))
|
||||
return self._roots()
|
||||
elif not uri.strip():
|
||||
return []
|
||||
validation.check_uri(uri)
|
||||
return self._browse(uri)
|
||||
|
||||
def _roots(self):
|
||||
directories = set()
|
||||
backends = self.backends.with_library_browse.values()
|
||||
futures = {b: b.library.root_directory for b in backends}
|
||||
for backend, future in futures.items():
|
||||
with _backend_error_handling(backend):
|
||||
root = future.get()
|
||||
validation.check_instance(root, models.Ref)
|
||||
directories.add(root)
|
||||
return sorted(directories, key=operator.attrgetter('name'))
|
||||
|
||||
def _browse(self, uri):
|
||||
scheme = urlparse.urlparse(uri).scheme
|
||||
backend = self.backends.with_library_browse.get(scheme)
|
||||
|
||||
if not backend:
|
||||
return []
|
||||
return backend.library.browse(uri).get()
|
||||
|
||||
with _backend_error_handling(backend):
|
||||
result = backend.library.browse(uri).get()
|
||||
validation.check_instances(result, models.Ref)
|
||||
return result
|
||||
|
||||
return []
|
||||
|
||||
def get_distinct(self, field, query=None):
|
||||
"""
|
||||
@ -86,18 +124,25 @@ class LibraryController(object):
|
||||
recommended to use this method.
|
||||
|
||||
:param string field: One of ``track``, ``artist``, ``albumartist``,
|
||||
``album``, ``composer``, ``performer``, ``date``or ``genre``.
|
||||
``album``, ``composer``, ``performer``, ``date`` or ``genre``.
|
||||
:param dict query: Query to use for limiting results, see
|
||||
:meth:`search` for details about the query format.
|
||||
:rtype: set of values corresponding to the requested field type.
|
||||
|
||||
.. versionadded:: 1.0
|
||||
"""
|
||||
futures = [b.library.get_distinct(field, query)
|
||||
for b in self.backends.with_library.values()]
|
||||
validation.check_choice(field, validation.DISTINCT_FIELDS)
|
||||
query is None or validation.check_query(query) # TODO: normalize?
|
||||
|
||||
result = set()
|
||||
for r in pykka.get_all(futures):
|
||||
result.update(r)
|
||||
futures = {b: b.library.get_distinct(field, query)
|
||||
for b in self.backends.with_library.values()}
|
||||
for backend, future in futures.items():
|
||||
with _backend_error_handling(backend):
|
||||
values = future.get()
|
||||
if values is not None:
|
||||
validation.check_instances(values, compat.text_type)
|
||||
result.update(values)
|
||||
return result
|
||||
|
||||
def get_images(self, uris):
|
||||
@ -110,20 +155,31 @@ class LibraryController(object):
|
||||
Unknown URIs or URIs the corresponding backend couldn't find anything
|
||||
for will simply return an empty list for that URI.
|
||||
|
||||
:param list uris: list of URIs to find images for
|
||||
:param uris: list of URIs to find images for
|
||||
:type uris: list of string
|
||||
:rtype: {uri: tuple of :class:`mopidy.models.Image`}
|
||||
|
||||
.. versionadded:: 1.0
|
||||
"""
|
||||
futures = [
|
||||
backend.library.get_images(backend_uris)
|
||||
validation.check_uris(uris)
|
||||
|
||||
futures = {
|
||||
backend: backend.library.get_images(backend_uris)
|
||||
for (backend, backend_uris)
|
||||
in self._get_backends_to_uris(uris).items() if backend_uris]
|
||||
in self._get_backends_to_uris(uris).items() if backend_uris}
|
||||
|
||||
results = {uri: tuple() for uri in uris}
|
||||
for r in pykka.get_all(futures):
|
||||
for uri, images in r.items():
|
||||
results[uri] += tuple(images)
|
||||
for backend, future in futures.items():
|
||||
with _backend_error_handling(backend):
|
||||
if future.get() is None:
|
||||
continue
|
||||
validation.check_instance(future.get(), collections.Mapping)
|
||||
for uri, images in future.get().items():
|
||||
if uri not in uris:
|
||||
raise exceptions.ValidationError(
|
||||
'Got unknown image URI: %s' % uri)
|
||||
validation.check_instances(images, models.Image)
|
||||
results[uri] += tuple(images)
|
||||
return results
|
||||
|
||||
def find_exact(self, query=None, uris=None, **kwargs):
|
||||
@ -132,11 +188,12 @@ class LibraryController(object):
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`search` with ``exact`` set.
|
||||
"""
|
||||
deprecation.warn('core.library.find_exact')
|
||||
return self.search(query=query, uris=uris, exact=True, **kwargs)
|
||||
|
||||
def lookup(self, uri=None, uris=None):
|
||||
"""
|
||||
Lookup the given URI.
|
||||
Lookup the given URIs.
|
||||
|
||||
If the URI expands to multiple tracks, the returned list will contain
|
||||
them all.
|
||||
@ -146,7 +203,7 @@ class LibraryController(object):
|
||||
:param uris: track URIs
|
||||
:type uris: list of string or :class:`None`
|
||||
:rtype: list of :class:`mopidy.models.Track` if uri was set or
|
||||
a {uri: list of :class:`mopidy.models.Track`} if uris was set.
|
||||
{uri: list of :class:`mopidy.models.Track`} if uris was set.
|
||||
|
||||
.. versionadded:: 1.0
|
||||
The ``uris`` argument.
|
||||
@ -154,33 +211,36 @@ class LibraryController(object):
|
||||
.. deprecated:: 1.0
|
||||
The ``uri`` argument. Use ``uris`` instead.
|
||||
"""
|
||||
none_set = uri is None and uris is None
|
||||
both_set = uri is not None and uris is not None
|
||||
if sum(o is not None for o in [uri, uris]) != 1:
|
||||
raise ValueError('Exactly one of "uri" or "uris" must be set')
|
||||
|
||||
if none_set or both_set:
|
||||
raise ValueError("One of 'uri' or 'uris' must be set")
|
||||
uris is None or validation.check_uris(uris)
|
||||
uri is None or validation.check_uri(uri)
|
||||
|
||||
if uri:
|
||||
deprecation.warn('core.library.lookup:uri_arg')
|
||||
|
||||
if uri is not None:
|
||||
uris = [uri]
|
||||
|
||||
futures = {}
|
||||
result = {}
|
||||
backends = self._get_backends_to_uris(uris)
|
||||
results = {u: [] for u in uris}
|
||||
|
||||
# TODO: lookup(uris) to backend APIs
|
||||
for backend, backend_uris in backends.items():
|
||||
for u in backend_uris or []:
|
||||
futures[u] = backend.library.lookup(u)
|
||||
for backend, backend_uris in self._get_backends_to_uris(uris).items():
|
||||
for u in backend_uris:
|
||||
futures[(backend, u)] = backend.library.lookup(u)
|
||||
|
||||
for u in uris:
|
||||
if u in futures:
|
||||
result[u] = futures[u].get()
|
||||
else:
|
||||
result[u] = []
|
||||
for (backend, u), future in futures.items():
|
||||
with _backend_error_handling(backend):
|
||||
result = future.get()
|
||||
if result is not None:
|
||||
validation.check_instances(result, models.Track)
|
||||
results[u] = result
|
||||
|
||||
if uri:
|
||||
return result[uri]
|
||||
return result
|
||||
return results[uri]
|
||||
return results
|
||||
|
||||
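The uri/uris handling above reduces to an exactly-one-of check plus a switch on the return shape ({uri: tracks} for uris, a flat track list for the deprecated uri form). A standalone illustration of the check, not part of the diff:

def _require_exactly_one(uri, uris):
    # Same rule as the validation added to lookup(): exactly one of the two
    # arguments may be provided.
    if sum(o is not None for o in [uri, uris]) != 1:
        raise ValueError('Exactly one of "uri" or "uris" must be set')

_require_exactly_one('local:track:a.mp3', None)  # passes
# _require_exactly_one(None, None) or passing both would raise ValueError
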
def refresh(self, uri=None):
|
||||
"""
|
||||
@ -189,25 +249,27 @@ class LibraryController(object):
|
||||
:param uri: directory or track URI
|
||||
:type uri: string
|
||||
"""
|
||||
if uri is not None:
|
||||
backend = self._get_backend(uri)
|
||||
if backend:
|
||||
backend.library.refresh(uri).get()
|
||||
else:
|
||||
futures = [b.library.refresh(uri)
|
||||
for b in self.backends.with_library.values()]
|
||||
pykka.get_all(futures)
|
||||
uri is None or validation.check_uri(uri)
|
||||
|
||||
futures = {}
|
||||
backends = {}
|
||||
uri_scheme = urlparse.urlparse(uri).scheme if uri else None
|
||||
|
||||
for backend_scheme, backend in self.backends.with_playlists.items():
|
||||
backends.setdefault(backend, set()).add(backend_scheme)
|
||||
|
||||
for backend, backend_schemes in backends.items():
|
||||
if uri_scheme is None or uri_scheme in backend_schemes:
|
||||
futures[backend] = backend.library.refresh(uri)
|
||||
|
||||
for backend, future in futures.items():
|
||||
with _backend_error_handling(backend):
|
||||
future.get()
|
||||
|
||||
def search(self, query=None, uris=None, exact=False, **kwargs):
|
||||
"""
|
||||
Search the library for tracks where ``field`` contains ``values``.
|
||||
|
||||
.. deprecated:: 1.0
|
||||
Previously, if the query was empty, and the backend could support
|
||||
it, all available tracks were returned. This has not changed, but
|
||||
it is strongly discouraged. No new code should rely on this
|
||||
behavior.
|
||||
|
||||
If ``uris`` is given, the search is limited to results from within the
|
||||
URI roots. For example passing ``uris=['file:']`` will limit the search
|
||||
to the local backend.
|
||||
@ -216,51 +278,83 @@ class LibraryController(object):
|
||||
|
||||
# Returns results matching 'a' in any backend
|
||||
search({'any': ['a']})
|
||||
search(any=['a'])
|
||||
|
||||
# Returns results matching artist 'xyz' in any backend
|
||||
search({'artist': ['xyz']})
|
||||
search(artist=['xyz'])
|
||||
|
||||
# Returns results matching 'a' and 'b' and artist 'xyz' in any
|
||||
# backend
|
||||
search({'any': ['a', 'b'], 'artist': ['xyz']})
|
||||
search(any=['a', 'b'], artist=['xyz'])
|
||||
|
||||
# Returns results matching 'a' if within the given URI roots
|
||||
# "file:///media/music" and "spotify:"
|
||||
search({'any': ['a']}, uris=['file:///media/music', 'spotify:'])
|
||||
search(any=['a'], uris=['file:///media/music', 'spotify:'])
|
||||
|
||||
# Returns results matching artist 'xyz' and 'abc' in any backend
|
||||
search({'artist': ['xyz', 'abc']})
|
||||
|
||||
:param query: one or more queries to search for
|
||||
:type query: dict
|
||||
:param uris: zero or more URI roots to limit the search to
|
||||
:type uris: list of strings or :class:`None`
|
||||
:type uris: list of string or :class:`None`
|
||||
:param exact: if the search should use exact matching
|
||||
:type exact: :class:`bool`
|
||||
:rtype: list of :class:`mopidy.models.SearchResult`
|
||||
|
||||
.. versionadded:: 1.0
|
||||
The ``exact`` keyword argument, which replaces :meth:`find_exact`.
|
||||
|
||||
.. deprecated:: 1.0
|
||||
Previously, if the query was empty, and the backend could support
|
||||
it, all available tracks were returned. This has not changed, but
|
||||
it is strongly discouraged. No new code should rely on this
|
||||
behavior.
|
||||
|
||||
.. deprecated:: 1.1
|
||||
Providing the search query via ``kwargs`` is no longer supported.
|
||||
"""
|
||||
query = _normalize_query(query or kwargs)
|
||||
|
||||
uris is None or validation.check_uris(uris)
|
||||
query is None or validation.check_query(query)
|
||||
validation.check_boolean(exact)
|
||||
|
||||
if kwargs:
|
||||
deprecation.warn('core.library.search:kwargs_query')
|
||||
|
||||
if not query:
|
||||
deprecation.warn('core.library.search:empty_query')
|
||||
|
||||
futures = {}
|
||||
for backend, backend_uris in self._get_backends_to_uris(uris).items():
|
||||
futures[backend] = backend.library.search(
|
||||
query=query, uris=backend_uris, exact=exact)
|
||||
|
||||
# Some of our tests check for LookupError to catch bad queries. This is
|
||||
# silly and should be replaced with query validation before passing it
|
||||
# to the backends.
|
||||
reraise = (TypeError, LookupError)
|
||||
|
||||
results = []
|
||||
for backend, future in futures.items():
|
||||
try:
|
||||
results.append(future.get())
|
||||
with _backend_error_handling(backend, reraise=reraise):
|
||||
result = future.get()
|
||||
if result is not None:
|
||||
validation.check_instance(result, models.SearchResult)
|
||||
results.append(result)
|
||||
except TypeError:
|
||||
backend_name = backend.actor_ref.actor_class.__name__
|
||||
logger.warning(
|
||||
'%s does not implement library.search() with "exact" '
|
||||
'support. Please upgrade it.', backend_name)
|
||||
return [r for r in results if r]
|
||||
|
||||
return results
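As a hedged illustration of the non-deprecated calling convention documented above (query as a dict, optional URI roots, the new ``exact`` flag), assuming direct in-process access to the core API:

# Sketch only; 'core' is an assumed in-process handle to the core API.
results = core.library.search(
    {'artist': ['xyz'], 'any': ['a']},
    uris=['file:'],      # limit the search to the file backend's URI root
    exact=False)
for search_result in results:    # each item is a mopidy.models.SearchResult
    print(len(search_result.tracks))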
|
||||
|
||||
|
||||
def _normalize_query(query):
|
||||
broken_client = False
|
||||
# TODO: this breaks if query is not a dictionary like object...
|
||||
for (field, values) in query.items():
|
||||
if isinstance(values, basestring):
|
||||
broken_client = True
|
||||
|
||||
@ -4,6 +4,7 @@ from mopidy import listener
|
||||
|
||||
|
||||
class CoreListener(listener.Listener):
|
||||
|
||||
"""
|
||||
Marker interface for recipients of events sent by the core actor.
|
||||
|
||||
@ -122,6 +123,17 @@ class CoreListener(listener.Listener):
|
||||
"""
|
||||
pass
|
||||
|
||||
def playlist_deleted(self, uri):
|
||||
"""
|
||||
Called whenever a playlist is deleted.
|
||||
|
||||
*MAY* be implemented by actor.
|
||||
|
||||
:param uri: the URI of the deleted playlist
|
||||
:type uri: string
|
||||
"""
|
||||
pass
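A minimal sketch of a frontend actor reacting to this event; the class name and log message are invented:

import logging

import pykka

from mopidy import core


class MyFrontend(pykka.ThreadingActor, core.CoreListener):
    # Hypothetical frontend actor; only the events it implements are acted on.

    def playlist_deleted(self, uri):
        logging.getLogger(__name__).info('Playlist %s was deleted', uri)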
|
||||
|
||||
def options_changed(self):
|
||||
"""
|
||||
Called whenever an option is changed.
|
||||
|
||||
@ -1,11 +1,27 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import contextlib
|
||||
import logging
|
||||
|
||||
from mopidy import exceptions
|
||||
from mopidy.internal import validation
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def _mixer_error_handling(mixer):
|
||||
try:
|
||||
yield
|
||||
except exceptions.ValidationError as e:
|
||||
logger.error('%s mixer returned bad data: %s',
|
||||
mixer.actor_ref.actor_class.__name__, e)
|
||||
except Exception:
|
||||
logger.exception('%s mixer caused an exception.',
|
||||
mixer.actor_ref.actor_class.__name__)
|
||||
|
||||
|
||||
class MixerController(object):
|
||||
pykka_traversable = True
|
||||
|
||||
@ -19,8 +35,15 @@ class MixerController(object):
|
||||
|
||||
The volume scale is linear.
|
||||
"""
|
||||
if self._mixer is not None:
|
||||
return self._mixer.get_volume().get()
|
||||
if self._mixer is None:
|
||||
return None
|
||||
|
||||
with _mixer_error_handling(self._mixer):
|
||||
volume = self._mixer.get_volume().get()
|
||||
volume is None or validation.check_integer(volume, min=0, max=100)
|
||||
return volume
|
||||
|
||||
return None
|
||||
|
||||
def set_volume(self, volume):
|
||||
"""Set the volume.
|
||||
@ -31,10 +54,17 @@ class MixerController(object):
|
||||
|
||||
Returns :class:`True` if call is successful, otherwise :class:`False`.
|
||||
"""
|
||||
validation.check_integer(volume, min=0, max=100)
|
||||
|
||||
if self._mixer is None:
|
||||
return False
|
||||
else:
|
||||
return self._mixer.set_volume(volume).get()
|
||||
return False # TODO: 2.0 return None
|
||||
|
||||
with _mixer_error_handling(self._mixer):
|
||||
result = self._mixer.set_volume(volume).get()
|
||||
validation.check_instance(result, bool)
|
||||
return result
|
||||
|
||||
return False
|
||||
|
||||
def get_mute(self):
|
||||
"""Get mute state.
|
||||
@ -42,8 +72,15 @@ class MixerController(object):
|
||||
:class:`True` if muted, :class:`False` unmuted, :class:`None` if
|
||||
unknown.
|
||||
"""
|
||||
if self._mixer is not None:
|
||||
return self._mixer.get_mute().get()
|
||||
if self._mixer is None:
|
||||
return None
|
||||
|
||||
with _mixer_error_handling(self._mixer):
|
||||
mute = self._mixer.get_mute().get()
|
||||
mute is None or validation.check_instance(mute, bool)
|
||||
return mute
|
||||
|
||||
return None
|
||||
|
||||
def set_mute(self, mute):
|
||||
"""Set mute state.
|
||||
@ -52,7 +89,13 @@ class MixerController(object):
|
||||
|
||||
Returns :class:`True` if call is successful, otherwise :class:`False`.
|
||||
"""
|
||||
validation.check_boolean(mute)
|
||||
if self._mixer is None:
|
||||
return False
|
||||
else:
|
||||
return self._mixer.set_mute(bool(mute)).get()
|
||||
return False # TODO: 2.0 return None
|
||||
|
||||
with _mixer_error_handling(self._mixer):
|
||||
result = self._mixer.set_mute(bool(mute)).get()
|
||||
validation.check_instance(result, bool)
|
||||
return result
|
||||
|
||||
return False
|
||||
|
||||
@ -2,12 +2,11 @@ from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import logging
|
||||
import urlparse
|
||||
import warnings
|
||||
|
||||
from mopidy import models
|
||||
from mopidy.audio import PlaybackState
|
||||
from mopidy.core import listener
|
||||
from mopidy.utils.deprecation import deprecated_property
|
||||
|
||||
from mopidy.internal import deprecation, validation
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@ -48,7 +47,7 @@ class PlaybackController(object):
|
||||
"""
|
||||
self._current_tl_track = value
|
||||
|
||||
current_tl_track = deprecated_property(get_current_tl_track)
|
||||
current_tl_track = deprecation.deprecated_property(get_current_tl_track)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_current_tl_track` instead.
|
||||
@ -62,16 +61,26 @@ class PlaybackController(object):
|
||||
|
||||
Returns a :class:`mopidy.models.Track` or :class:`None`.
|
||||
"""
|
||||
tl_track = self.get_current_tl_track()
|
||||
if tl_track is not None:
|
||||
return tl_track.track
|
||||
return getattr(self.get_current_tl_track(), 'track', None)
|
||||
|
||||
current_track = deprecated_property(get_current_track)
|
||||
current_track = deprecation.deprecated_property(get_current_track)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_current_track` instead.
|
||||
"""
|
||||
|
||||
def get_current_tlid(self):
|
||||
"""
|
||||
Get the currently playing or selected TLID.
|
||||
|
||||
Extracted from :meth:`get_current_tl_track` for convenience.
|
||||
|
||||
Returns a :class:`int` or :class:`None`.
|
||||
|
||||
.. versionadded:: 1.1
|
||||
"""
|
||||
return getattr(self.get_current_tl_track(), 'tlid', None)
|
||||
|
||||
def get_stream_title(self):
|
||||
"""Get the current stream title or :class:`None`."""
|
||||
return self._stream_title
|
||||
@ -98,12 +107,14 @@ class PlaybackController(object):
|
||||
"PAUSED" -> "PLAYING" [ label="resume" ]
|
||||
"PAUSED" -> "STOPPED" [ label="stop" ]
|
||||
"""
|
||||
validation.check_choice(new_state, validation.PLAYBACK_STATES)
|
||||
|
||||
(old_state, self._state) = (self.get_state(), new_state)
|
||||
logger.debug('Changing state: %s -> %s', old_state, new_state)
|
||||
|
||||
self._trigger_playback_state_changed(old_state, new_state)
|
||||
|
||||
state = deprecated_property(get_state, set_state)
|
||||
state = deprecation.deprecated_property(get_state, set_state)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_state` and :meth:`set_state` instead.
|
||||
@ -117,7 +128,7 @@ class PlaybackController(object):
|
||||
else:
|
||||
return 0
|
||||
|
||||
time_position = deprecated_property(get_time_position)
|
||||
time_position = deprecation.deprecated_property(get_time_position)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_time_position` instead.
|
||||
@ -129,8 +140,7 @@ class PlaybackController(object):
|
||||
Use :meth:`core.mixer.get_volume()
|
||||
<mopidy.core.MixerController.get_volume>` instead.
|
||||
"""
|
||||
warnings.warn(
|
||||
'playback.get_volume() is deprecated', DeprecationWarning)
|
||||
deprecation.warn('core.playback.get_volume')
|
||||
return self.core.mixer.get_volume()
|
||||
|
||||
def set_volume(self, volume):
|
||||
@ -139,11 +149,10 @@ class PlaybackController(object):
|
||||
Use :meth:`core.mixer.set_volume()
|
||||
<mopidy.core.MixerController.set_volume>` instead.
|
||||
"""
|
||||
warnings.warn(
|
||||
'playback.set_volume() is deprecated', DeprecationWarning)
|
||||
deprecation.warn('core.playback.set_volume')
|
||||
return self.core.mixer.set_volume(volume)
|
||||
|
||||
volume = deprecated_property(get_volume, set_volume)
|
||||
volume = deprecation.deprecated_property(get_volume, set_volume)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`core.mixer.get_volume()
|
||||
@ -158,7 +167,7 @@ class PlaybackController(object):
|
||||
Use :meth:`core.mixer.get_mute()
|
||||
<mopidy.core.MixerController.get_mute>` instead.
|
||||
"""
|
||||
warnings.warn('playback.get_mute() is deprecated', DeprecationWarning)
|
||||
deprecation.warn('core.playback.get_mute')
|
||||
return self.core.mixer.get_mute()
|
||||
|
||||
def set_mute(self, mute):
|
||||
@ -167,10 +176,10 @@ class PlaybackController(object):
|
||||
Use :meth:`core.mixer.set_mute()
|
||||
<mopidy.core.MixerController.set_mute>` instead.
|
||||
"""
|
||||
warnings.warn('playback.set_mute() is deprecated', DeprecationWarning)
|
||||
deprecation.warn('core.playback.set_mute')
|
||||
return self.core.mixer.set_mute(mute)
|
||||
|
||||
mute = deprecated_property(get_mute, set_mute)
|
||||
mute = deprecation.deprecated_property(get_mute, set_mute)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`core.mixer.get_mute()
|
||||
@ -272,17 +281,37 @@ class PlaybackController(object):
|
||||
self.set_state(PlaybackState.PAUSED)
|
||||
self._trigger_track_playback_paused()
|
||||
|
||||
def play(self, tl_track=None):
|
||||
def play(self, tl_track=None, tlid=None):
|
||||
"""
|
||||
Play the given track, or if the given track is :class:`None`, play the
|
||||
currently active track.
|
||||
Play the given track, or if both tl_track and tlid are
|
||||
:class:`None`, play the currently active track.
|
||||
|
||||
Note that the track **must** already be in the tracklist.
|
||||
|
||||
:param tl_track: track to play
|
||||
:type tl_track: :class:`mopidy.models.TlTrack` or :class:`None`
|
||||
:param tlid: TLID of the track to play
|
||||
:type tlid: :class:`int` or :class:`None`
|
||||
"""
|
||||
self._play(tl_track, on_error_step=1)
|
||||
if sum(o is not None for o in [tl_track, tlid]) > 1:
|
||||
raise ValueError('At most one of "tl_track" and "tlid" may be set')
|
||||
|
||||
tl_track is None or validation.check_instance(tl_track, models.TlTrack)
|
||||
tlid is None or validation.check_integer(tlid, min=0)
|
||||
|
||||
if tl_track:
|
||||
deprecation.warn('core.playback.play:tl_track_kwarg', pending=True)
|
||||
|
||||
self._play(tl_track=tl_track, tlid=tlid, on_error_step=1)
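A hedged sketch of the new TLID-based calling style; 'core' is an assumed in-process handle to the core API. Passing ``tlid`` avoids the pending deprecation of the ``tl_track`` argument:

# Sketch only; 'core' is an assumed in-process handle to the core API.
tlids = [tl_track.tlid for tl_track in core.tracklist.get_tl_tracks()]
if tlids:
    core.playback.play(tlid=tlids[0])    # play a specific tracklist entry by TLID

core.playback.play()                     # no arguments: resume/continue as before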
|
||||
|
||||
def _play(self, tl_track=None, tlid=None, on_error_step=1):
|
||||
if tl_track is None and tlid is not None:
|
||||
for tl_track in self.core.tracklist.get_tl_tracks():
|
||||
if tl_track.tlid == tlid:
|
||||
break
|
||||
else:
|
||||
tl_track = None
|
||||
|
||||
def _play(self, tl_track=None, on_error_step=1):
|
||||
if tl_track is None:
|
||||
if self.get_state() == PlaybackState.PAUSED:
|
||||
return self.resume()
|
||||
@ -319,8 +348,11 @@ class PlaybackController(object):
|
||||
backend.playback.change_track(tl_track.track).get() and
|
||||
backend.playback.play().get())
|
||||
except TypeError:
|
||||
logger.error('%s needs to be updated to work with this '
|
||||
'version of Mopidy.', backend)
|
||||
logger.error(
|
||||
'%s needs to be updated to work with this '
|
||||
'version of Mopidy.',
|
||||
backend.actor_ref.actor_class.__name__)
|
||||
logger.debug('Backend exception', exc_info=True)
|
||||
|
||||
if success:
|
||||
self.core.tracklist._mark_playing(tl_track)
|
||||
@ -370,6 +402,13 @@ class PlaybackController(object):
|
||||
:type time_position: int
|
||||
:rtype: :class:`True` if successful, else :class:`False`
|
||||
"""
|
||||
validation.check_integer(time_position)
|
||||
|
||||
if time_position < 0:
|
||||
logger.debug(
|
||||
'Client seeked to negative position. Seeking to zero.')
|
||||
time_position = 0
|
||||
|
||||
if not self.core.tracklist.tracks:
|
||||
return False
|
||||
|
||||
|
||||
@ -1,18 +1,31 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import contextlib
|
||||
import logging
|
||||
import urlparse
|
||||
|
||||
import pykka
|
||||
|
||||
from mopidy import exceptions
|
||||
from mopidy.core import listener
|
||||
from mopidy.models import Playlist
|
||||
from mopidy.utils.deprecation import deprecated_property
|
||||
|
||||
from mopidy.internal import deprecation, validation
|
||||
from mopidy.models import Playlist, Ref
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def _backend_error_handling(backend, reraise=None):
|
||||
try:
|
||||
yield
|
||||
except exceptions.ValidationError as e:
|
||||
logger.error('%s backend returned bad data: %s',
|
||||
backend.actor_ref.actor_class.__name__, e)
|
||||
except Exception as e:
|
||||
if reraise and isinstance(e, reraise):
|
||||
raise
|
||||
logger.exception('%s backend caused an exception.',
|
||||
backend.actor_ref.actor_class.__name__)
|
||||
|
||||
|
||||
class PlaylistsController(object):
|
||||
pykka_traversable = True
|
||||
|
||||
@ -33,14 +46,19 @@ class PlaylistsController(object):
|
||||
.. versionadded:: 1.0
|
||||
"""
|
||||
futures = {
|
||||
b.actor_ref.actor_class.__name__: b.playlists.as_list()
|
||||
for b in set(self.backends.with_playlists.values())}
|
||||
backend: backend.playlists.as_list()
|
||||
for backend in set(self.backends.with_playlists.values())}
|
||||
|
||||
results = []
|
||||
for backend_name, future in futures.items():
|
||||
for b, future in futures.items():
|
||||
try:
|
||||
results.extend(future.get())
|
||||
with _backend_error_handling(b, reraise=NotImplementedError):
|
||||
playlists = future.get()
|
||||
if playlists is not None:
|
||||
validation.check_instances(playlists, Ref)
|
||||
results.extend(playlists)
|
||||
except NotImplementedError:
|
||||
backend_name = b.actor_ref.actor_class.__name__
|
||||
logger.warning(
|
||||
'%s does not implement playlists.as_list(). '
|
||||
'Please upgrade it.', backend_name)
|
||||
@ -61,10 +79,20 @@ class PlaylistsController(object):
|
||||
|
||||
.. versionadded:: 1.0
|
||||
"""
|
||||
validation.check_uri(uri)
|
||||
|
||||
uri_scheme = urlparse.urlparse(uri).scheme
|
||||
backend = self.backends.with_playlists.get(uri_scheme, None)
|
||||
if backend:
|
||||
return backend.playlists.get_items(uri).get()
|
||||
|
||||
if not backend:
|
||||
return None
|
||||
|
||||
with _backend_error_handling(backend):
|
||||
items = backend.playlists.get_items(uri).get()
|
||||
items is None or validation.check_instances(items, Ref)
|
||||
return items
|
||||
|
||||
return None
|
||||
|
||||
def get_playlists(self, include_tracks=True):
|
||||
"""
|
||||
@ -80,6 +108,8 @@ class PlaylistsController(object):
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`as_list` and :meth:`get_items` instead.
|
||||
"""
|
||||
deprecation.warn('core.playlists.get_playlists')
|
||||
|
||||
playlist_refs = self.as_list()
|
||||
|
||||
if include_tracks:
|
||||
@ -87,13 +117,13 @@ class PlaylistsController(object):
|
||||
# Use the playlist name from as_list() because it knows about any
|
||||
# playlist folder hierarchy, which lookup() does not.
|
||||
return [
|
||||
playlists[r.uri].copy(name=r.name)
|
||||
playlists[r.uri].replace(name=r.name)
|
||||
for r in playlist_refs if playlists[r.uri] is not None]
|
||||
else:
|
||||
return [
|
||||
Playlist(uri=r.uri, name=r.name) for r in playlist_refs]
|
||||
|
||||
playlists = deprecated_property(get_playlists)
|
||||
playlists = deprecation.deprecated_property(get_playlists)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`as_list` and :meth:`get_items` instead.
|
||||
@ -115,22 +145,23 @@ class PlaylistsController(object):
|
||||
:type name: string
|
||||
:param uri_scheme: use the backend matching the URI scheme
|
||||
:type uri_scheme: string
|
||||
:rtype: :class:`mopidy.models.Playlist`
|
||||
:rtype: :class:`mopidy.models.Playlist` or :class:`None`
|
||||
"""
|
||||
if uri_scheme in self.backends.with_playlists:
|
||||
backends = [self.backends.with_playlists[uri_scheme]]
|
||||
else:
|
||||
backends = self.backends.with_playlists.values()
|
||||
|
||||
for backend in backends:
|
||||
try:
|
||||
playlist = backend.playlists.create(name).get()
|
||||
except Exception:
|
||||
playlist = None
|
||||
# Workaround for playlist providers that return None from create()
|
||||
if not playlist:
|
||||
continue
|
||||
listener.CoreListener.send('playlist_changed', playlist=playlist)
|
||||
return playlist
|
||||
with _backend_error_handling(backend):
|
||||
result = backend.playlists.create(name).get()
|
||||
if result is None:
|
||||
continue
|
||||
validation.check_instance(result, Playlist)
|
||||
listener.CoreListener.send('playlist_changed', playlist=result)
|
||||
return result
|
||||
|
||||
return None
|
||||
|
||||
def delete(self, uri):
|
||||
"""
|
||||
@ -142,10 +173,19 @@ class PlaylistsController(object):
|
||||
:param uri: URI of the playlist to delete
|
||||
:type uri: string
|
||||
"""
|
||||
validation.check_uri(uri)
|
||||
|
||||
uri_scheme = urlparse.urlparse(uri).scheme
|
||||
backend = self.backends.with_playlists.get(uri_scheme, None)
|
||||
if backend:
|
||||
if not backend:
|
||||
return None # TODO: error reporting to user
|
||||
|
||||
with _backend_error_handling(backend):
|
||||
backend.playlists.delete(uri).get()
|
||||
# TODO: error detection and reporting to user
|
||||
listener.CoreListener.send('playlist_deleted', uri=uri)
|
||||
|
||||
# TODO: return value?
|
||||
|
||||
def filter(self, criteria=None, **kwargs):
|
||||
"""
|
||||
@ -155,15 +195,12 @@ class PlaylistsController(object):
|
||||
|
||||
# Returns track with name 'a'
|
||||
filter({'name': 'a'})
|
||||
filter(name='a')
|
||||
|
||||
# Returns track with URI 'xyz'
|
||||
filter({'uri': 'xyz'})
|
||||
filter(uri='xyz')
|
||||
|
||||
# Returns track with name 'a' and URI 'xyz'
|
||||
filter({'name': 'a', 'uri': 'xyz'})
|
||||
filter(name='a', uri='xyz')
|
||||
|
||||
:param criteria: one or more criteria to match by
|
||||
:type criteria: dict
|
||||
@ -172,8 +209,13 @@ class PlaylistsController(object):
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`as_list` and filter yourself.
|
||||
"""
|
||||
deprecation.warn('core.playlists.filter')
|
||||
|
||||
criteria = criteria or kwargs
|
||||
matches = self.playlists
|
||||
validation.check_query(
|
||||
criteria, validation.PLAYLIST_FIELDS, list_values=False)
|
||||
|
||||
matches = self.playlists # TODO: stop using self playlists
|
||||
for (key, value) in criteria.iteritems():
|
||||
matches = filter(lambda p: getattr(p, key) == value, matches)
|
||||
return matches
|
||||
@ -189,11 +231,18 @@ class PlaylistsController(object):
|
||||
"""
|
||||
uri_scheme = urlparse.urlparse(uri).scheme
|
||||
backend = self.backends.with_playlists.get(uri_scheme, None)
|
||||
if backend:
|
||||
return backend.playlists.lookup(uri).get()
|
||||
else:
|
||||
if not backend:
|
||||
return None
|
||||
|
||||
with _backend_error_handling(backend):
|
||||
playlist = backend.playlists.lookup(uri).get()
|
||||
playlist is None or validation.check_instance(playlist, Playlist)
|
||||
return playlist
|
||||
|
||||
return None
|
||||
|
||||
# TODO: there is an inconsistency between library.refresh(uri) and this
|
||||
# call, not sure how to sort this out.
|
||||
def refresh(self, uri_scheme=None):
|
||||
"""
|
||||
Refresh the playlists in :attr:`playlists`.
|
||||
@ -206,16 +255,26 @@ class PlaylistsController(object):
|
||||
:param uri_scheme: limit to the backend matching the URI scheme
|
||||
:type uri_scheme: string
|
||||
"""
|
||||
if uri_scheme is None:
|
||||
futures = [b.playlists.refresh()
|
||||
for b in self.backends.with_playlists.values()]
|
||||
pykka.get_all(futures)
|
||||
# TODO: check: uri_scheme is None or uri_scheme?
|
||||
|
||||
futures = {}
|
||||
backends = {}
|
||||
playlists_loaded = False
|
||||
|
||||
for backend_scheme, backend in self.backends.with_playlists.items():
|
||||
backends.setdefault(backend, set()).add(backend_scheme)
|
||||
|
||||
for backend, backend_schemes in backends.items():
|
||||
if uri_scheme is None or uri_scheme in backend_schemes:
|
||||
futures[backend] = backend.playlists.refresh()
|
||||
|
||||
for backend, future in futures.items():
|
||||
with _backend_error_handling(backend):
|
||||
future.get()
|
||||
playlists_loaded = True
|
||||
|
||||
if playlists_loaded:
|
||||
listener.CoreListener.send('playlists_loaded')
|
||||
else:
|
||||
backend = self.backends.with_playlists.get(uri_scheme, None)
|
||||
if backend:
|
||||
backend.playlists.refresh().get()
|
||||
listener.CoreListener.send('playlists_loaded')
|
||||
|
||||
def save(self, playlist):
|
||||
"""
|
||||
@ -239,11 +298,23 @@ class PlaylistsController(object):
|
||||
:type playlist: :class:`mopidy.models.Playlist`
|
||||
:rtype: :class:`mopidy.models.Playlist` or :class:`None`
|
||||
"""
|
||||
validation.check_instance(playlist, Playlist)
|
||||
|
||||
if playlist.uri is None:
|
||||
return
|
||||
return # TODO: log this problem?
|
||||
|
||||
uri_scheme = urlparse.urlparse(playlist.uri).scheme
|
||||
backend = self.backends.with_playlists.get(uri_scheme, None)
|
||||
if backend:
|
||||
if not backend:
|
||||
return None
|
||||
|
||||
# TODO: we let AssertionError error through due to legacy tests :/
|
||||
with _backend_error_handling(backend, reraise=AssertionError):
|
||||
playlist = backend.playlists.save(playlist).get()
|
||||
listener.CoreListener.send('playlist_changed', playlist=playlist)
|
||||
playlist is None or validation.check_instance(playlist, Playlist)
|
||||
if playlist:
|
||||
listener.CoreListener.send(
|
||||
'playlist_changed', playlist=playlist)
|
||||
return playlist
|
||||
|
||||
return None
|
||||
|
||||
@ -1,14 +1,12 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import collections
|
||||
import logging
|
||||
import random
|
||||
|
||||
from mopidy import compat
|
||||
from mopidy import exceptions
|
||||
from mopidy.core import listener
|
||||
from mopidy.models import TlTrack
|
||||
from mopidy.utils.deprecation import deprecated_property
|
||||
|
||||
from mopidy.internal import deprecation, validation
|
||||
from mopidy.models import TlTrack, Track
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@ -30,7 +28,7 @@ class TracklistController(object):
|
||||
"""Get tracklist as list of :class:`mopidy.models.TlTrack`."""
|
||||
return self._tl_tracks[:]
|
||||
|
||||
tl_tracks = deprecated_property(get_tl_tracks)
|
||||
tl_tracks = deprecation.deprecated_property(get_tl_tracks)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_tl_tracks` instead.
|
||||
@ -40,7 +38,7 @@ class TracklistController(object):
|
||||
"""Get tracklist as list of :class:`mopidy.models.Track`."""
|
||||
return [tl_track.track for tl_track in self._tl_tracks]
|
||||
|
||||
tracks = deprecated_property(get_tracks)
|
||||
tracks = deprecation.deprecated_property(get_tracks)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_tracks` instead.
|
||||
@ -50,7 +48,7 @@ class TracklistController(object):
|
||||
"""Get length of the tracklist."""
|
||||
return len(self._tl_tracks)
|
||||
|
||||
length = deprecated_property(get_length)
|
||||
length = deprecation.deprecated_property(get_length)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_length` instead.
|
||||
@ -70,7 +68,7 @@ class TracklistController(object):
|
||||
self.core.playback._on_tracklist_change()
|
||||
self._trigger_tracklist_changed()
|
||||
|
||||
version = deprecated_property(get_version)
|
||||
version = deprecation.deprecated_property(get_version)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_version` instead.
|
||||
@ -94,11 +92,12 @@ class TracklistController(object):
|
||||
:class:`False`
|
||||
Tracks are not removed from the tracklist.
|
||||
"""
|
||||
validation.check_boolean(value)
|
||||
if self.get_consume() != value:
|
||||
self._trigger_options_changed()
|
||||
return setattr(self, '_consume', value)
|
||||
|
||||
consume = deprecated_property(get_consume, set_consume)
|
||||
consume = deprecation.deprecated_property(get_consume, set_consume)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_consume` and :meth:`set_consume` instead.
|
||||
@ -122,7 +121,7 @@ class TracklistController(object):
|
||||
:class:`False`
|
||||
Tracks are played in the order of the tracklist.
|
||||
"""
|
||||
|
||||
validation.check_boolean(value)
|
||||
if self.get_random() != value:
|
||||
self._trigger_options_changed()
|
||||
if value:
|
||||
@ -130,7 +129,7 @@ class TracklistController(object):
|
||||
random.shuffle(self._shuffled)
|
||||
return setattr(self, '_random', value)
|
||||
|
||||
random = deprecated_property(get_random, set_random)
|
||||
random = deprecation.deprecated_property(get_random, set_random)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_random` and :meth:`set_random` instead.
|
||||
@ -158,12 +157,12 @@ class TracklistController(object):
|
||||
:class:`False`
|
||||
The tracklist is played once.
|
||||
"""
|
||||
|
||||
validation.check_boolean(value)
|
||||
if self.get_repeat() != value:
|
||||
self._trigger_options_changed()
|
||||
return setattr(self, '_repeat', value)
|
||||
|
||||
repeat = deprecated_property(get_repeat, set_repeat)
|
||||
repeat = deprecation.deprecated_property(get_repeat, set_repeat)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_repeat` and :meth:`set_repeat` instead.
|
||||
@ -189,11 +188,12 @@ class TracklistController(object):
|
||||
:class:`False`
|
||||
Playback continues after current song.
|
||||
"""
|
||||
validation.check_boolean(value)
|
||||
if self.get_single() != value:
|
||||
self._trigger_options_changed()
|
||||
return setattr(self, '_single', value)
|
||||
|
||||
single = deprecated_property(get_single, set_single)
|
||||
single = deprecation.deprecated_property(get_single, set_single)
|
||||
"""
|
||||
.. deprecated:: 1.0
|
||||
Use :meth:`get_single` and :meth:`set_single` instead.
|
||||
@ -201,18 +201,52 @@ class TracklistController(object):
|
||||
|
||||
# Methods
|
||||
|
||||
def index(self, tl_track):
|
||||
def index(self, tl_track=None, tlid=None):
|
||||
"""
|
||||
The position of the given track in the tracklist.
|
||||
|
||||
If neither *tl_track* nor *tlid* is given, we return the index of
|
||||
the currently playing track.
|
||||
|
||||
:param tl_track: the track to find the index of
|
||||
:type tl_track: :class:`mopidy.models.TlTrack`
|
||||
:type tl_track: :class:`mopidy.models.TlTrack` or :class:`None`
|
||||
:param tlid: TLID of the track to find the index of
|
||||
:type tlid: :class:`int` or :class:`None`
|
||||
:rtype: :class:`int` or :class:`None`
|
||||
|
||||
.. versionadded:: 1.1
|
||||
The *tlid* parameter
|
||||
"""
|
||||
try:
|
||||
return self._tl_tracks.index(tl_track)
|
||||
except ValueError:
|
||||
return None
|
||||
tl_track is None or validation.check_instance(tl_track, TlTrack)
|
||||
tlid is None or validation.check_integer(tlid, min=0)
|
||||
|
||||
if tl_track is None and tlid is None:
|
||||
tl_track = self.core.playback.get_current_tl_track()
|
||||
|
||||
if tl_track is not None:
|
||||
try:
|
||||
return self._tl_tracks.index(tl_track)
|
||||
except ValueError:
|
||||
pass
|
||||
elif tlid is not None:
|
||||
for i, tl_track in enumerate(self._tl_tracks):
|
||||
if tl_track.tlid == tlid:
|
||||
return i
|
||||
return None
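A short hedged example of the extended index() signature; 'core' is an assumed in-process handle to the core API:

# Sketch only; 'core' is an assumed in-process handle to the core API.
pos = core.tracklist.index()               # index of the currently playing track, or None
pos_of_tlid = core.tracklist.index(tlid=3)  # position of TLID 3, or None if not found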
|
||||
|
||||
def get_eot_tlid(self):
|
||||
"""
|
||||
The TLID of the track that will be played after the given track.
|
||||
|
||||
Not necessarily the same TLID as returned by :meth:`get_next_tlid`.
|
||||
|
||||
:rtype: :class:`int` or :class:`None`
|
||||
|
||||
.. versionadded:: 1.1
|
||||
"""
|
||||
|
||||
current_tl_track = self.core.playback.get_current_tl_track()
|
||||
return getattr(self.eot_track(current_tl_track), 'tlid', None)
|
||||
|
||||
def eot_track(self, tl_track):
|
||||
"""
|
||||
@ -224,6 +258,8 @@ class TracklistController(object):
|
||||
:type tl_track: :class:`mopidy.models.TlTrack` or :class:`None`
|
||||
:rtype: :class:`mopidy.models.TlTrack` or :class:`None`
|
||||
"""
|
||||
deprecation.warn('core.tracklist.eot_track', pending=True)
|
||||
tl_track is None or validation.check_instance(tl_track, TlTrack)
|
||||
if self.get_single() and self.get_repeat():
|
||||
return tl_track
|
||||
elif self.get_single():
|
||||
@ -234,6 +270,23 @@ class TracklistController(object):
|
||||
# shared.
|
||||
return self.next_track(tl_track)
|
||||
|
||||
def get_next_tlid(self):
|
||||
"""
|
||||
The tlid of the track that will be played if calling
|
||||
:meth:`mopidy.core.PlaybackController.next()`.
|
||||
|
||||
For normal playback this is the next track in the tracklist. If repeat
|
||||
is enabled the next track can loop around the tracklist. When random is
|
||||
enabled this should be a random track, all tracks should be played once
|
||||
before the tracklist repeats.
|
||||
|
||||
:rtype: :class:`int` or :class:`None`
|
||||
|
||||
.. versionadded:: 1.1
|
||||
"""
|
||||
current_tl_track = self.core.playback.get_current_tl_track()
|
||||
return getattr(self.next_track(current_tl_track), 'tlid', None)
|
||||
|
||||
def next_track(self, tl_track):
|
||||
"""
|
||||
The track that will be played if calling
|
||||
@ -248,34 +301,51 @@ class TracklistController(object):
|
||||
:type tl_track: :class:`mopidy.models.TlTrack` or :class:`None`
|
||||
:rtype: :class:`mopidy.models.TlTrack` or :class:`None`
|
||||
"""
|
||||
deprecation.warn('core.tracklist.next_track', pending=True)
|
||||
tl_track is None or validation.check_instance(tl_track, TlTrack)
|
||||
|
||||
if not self.get_tl_tracks():
|
||||
if not self._tl_tracks:
|
||||
return None
|
||||
|
||||
if self.get_random() and not self._shuffled:
|
||||
if self.get_repeat() or not tl_track:
|
||||
logger.debug('Shuffling tracks')
|
||||
self._shuffled = self.get_tl_tracks()
|
||||
self._shuffled = self._tl_tracks[:]
|
||||
random.shuffle(self._shuffled)
|
||||
|
||||
if self.get_random():
|
||||
try:
|
||||
if self._shuffled:
|
||||
return self._shuffled[0]
|
||||
except IndexError:
|
||||
return None
|
||||
return None
|
||||
|
||||
if tl_track is None:
|
||||
return self.get_tl_tracks()[0]
|
||||
next_index = 0
|
||||
else:
|
||||
next_index = self.index(tl_track) + 1
|
||||
|
||||
next_index = self.index(tl_track) + 1
|
||||
if self.get_repeat():
|
||||
next_index %= len(self.get_tl_tracks())
|
||||
|
||||
try:
|
||||
return self.get_tl_tracks()[next_index]
|
||||
except IndexError:
|
||||
next_index %= len(self._tl_tracks)
|
||||
elif next_index >= len(self._tl_tracks):
|
||||
return None
|
||||
|
||||
return self._tl_tracks[next_index]
|
||||
|
||||
def get_previous_tlid(self):
|
||||
"""
|
||||
Returns the TLID of the track that will be played if calling
|
||||
:meth:`mopidy.core.PlaybackController.previous()`.
|
||||
|
||||
For normal playback this is the previous track in the tracklist. If
|
||||
random and/or consume is enabled it should return the current track
|
||||
instead.
|
||||
|
||||
:rtype: :class:`int` or :class:`None`
|
||||
|
||||
.. versionadded:: 1.1
|
||||
"""
|
||||
current_tl_track = self.core.playback.get_current_tl_track()
|
||||
return getattr(self.previous_track(current_tl_track), 'tlid', None)
|
||||
|
||||
def previous_track(self, tl_track):
|
||||
"""
|
||||
Returns the track that will be played if calling
|
||||
@ -289,6 +359,9 @@ class TracklistController(object):
|
||||
:type tl_track: :class:`mopidy.models.TlTrack` or :class:`None`
|
||||
:rtype: :class:`mopidy.models.TlTrack` or :class:`None`
|
||||
"""
|
||||
deprecation.warn('core.tracklist.previous_track', pending=True)
|
||||
tl_track is None or validation.check_instance(tl_track, TlTrack)
|
||||
|
||||
if self.get_repeat() or self.get_consume() or self.get_random():
|
||||
return tl_track
|
||||
|
||||
@ -297,30 +370,35 @@ class TracklistController(object):
|
||||
if position in (None, 0):
|
||||
return None
|
||||
|
||||
return self.get_tl_tracks()[position - 1]
|
||||
# Since we know we are not at zero we have to be somewhere in the range
|
||||
# 1 - len(tracks), thus 'position - 1' will always be within the list.
|
||||
return self._tl_tracks[position - 1]
|
||||
|
||||
def add(self, tracks=None, at_position=None, uri=None, uris=None):
|
||||
"""
|
||||
Add the track or list of tracks to the tracklist.
|
||||
Add tracks to the tracklist.
|
||||
|
||||
If ``uri`` is given instead of ``tracks``, the URI is looked up in the
|
||||
library and the resulting tracks are added to the tracklist.
|
||||
|
||||
If ``uris`` is given instead of ``tracks``, the URIs are looked up in
|
||||
the library and the resulting tracks are added to the tracklist.
|
||||
If ``uris`` is given instead of ``uri`` or ``tracks``, the URIs are
|
||||
looked up in the library and the resulting tracks are added to the
|
||||
tracklist.
|
||||
|
||||
If ``at_position`` is given, the tracks placed at the given position in
|
||||
the tracklist. If ``at_position`` is not given, the tracks are appended
|
||||
to the end of the tracklist.
|
||||
If ``at_position`` is given, the tracks are inserted at the given
|
||||
position in the tracklist. If ``at_position`` is not given, the tracks
|
||||
are appended to the end of the tracklist.
|
||||
|
||||
Triggers the :meth:`mopidy.core.CoreListener.tracklist_changed` event.
|
||||
|
||||
:param tracks: tracks to add
|
||||
:type tracks: list of :class:`mopidy.models.Track`
|
||||
:param at_position: position in tracklist to add track
|
||||
:type tracks: list of :class:`mopidy.models.Track` or :class:`None`
|
||||
:param at_position: position in tracklist to add tracks
|
||||
:type at_position: int or :class:`None`
|
||||
:param uri: URI for tracks to add
|
||||
:type uri: string
|
||||
:type uri: string or :class:`None`
|
||||
:param uris: list of URIs for tracks to add
|
||||
:type uris: list of string or :class:`None`
|
||||
:rtype: list of :class:`mopidy.models.TlTrack`
|
||||
|
||||
.. versionadded:: 1.0
|
||||
@ -329,21 +407,38 @@ class TracklistController(object):
|
||||
.. deprecated:: 1.0
|
||||
The ``tracks`` and ``uri`` arguments. Use ``uris``.
|
||||
"""
|
||||
assert tracks is not None or uri is not None or uris is not None, \
|
||||
'tracks, uri or uris must be provided'
|
||||
if sum(o is not None for o in [tracks, uri, uris]) != 1:
|
||||
raise ValueError(
|
||||
'Exactly one of "tracks", "uri" or "uris" must be set')
|
||||
|
||||
tracks is None or validation.check_instances(tracks, Track)
|
||||
uri is None or validation.check_uri(uri)
|
||||
uris is None or validation.check_uris(uris)
|
||||
validation.check_integer(at_position or 0)
|
||||
|
||||
if tracks:
|
||||
deprecation.warn('core.tracklist.add:tracks_arg')
|
||||
|
||||
if uri:
|
||||
deprecation.warn('core.tracklist.add:uri_arg')
|
||||
|
||||
if tracks is None:
|
||||
if uri is not None:
|
||||
tracks = self.core.library.lookup(uri=uri)
|
||||
elif uris is not None:
|
||||
tracks = []
|
||||
track_map = self.core.library.lookup(uris=uris)
|
||||
for uri in uris:
|
||||
tracks.extend(track_map[uri])
|
||||
uris = [uri]
|
||||
|
||||
tracks = []
|
||||
track_map = self.core.library.lookup(uris=uris)
|
||||
for uri in uris:
|
||||
tracks.extend(track_map[uri])
|
||||
|
||||
tl_tracks = []
|
||||
max_length = self.core._config['core']['max_tracklist_length']
|
||||
|
||||
for track in tracks:
|
||||
if self.get_length() >= max_length:
|
||||
raise exceptions.TracklistFull(
|
||||
'Tracklist may contain at most %d tracks.' % max_length)
|
||||
|
||||
tl_track = TlTrack(self._next_tlid, track)
|
||||
self._next_tlid += 1
|
||||
if at_position is not None:
|
||||
@ -381,41 +476,35 @@ class TracklistController(object):
|
||||
|
||||
# Returns tracks with TLIDs 1, 2, 3, or 4 (tracklist ID)
|
||||
filter({'tlid': [1, 2, 3, 4]})
|
||||
filter(tlid=[1, 2, 3, 4])
|
||||
|
||||
# Returns track with IDs 1, 5, or 7
|
||||
filter({'id': [1, 5, 7]})
|
||||
filter(id=[1, 5, 7])
|
||||
|
||||
# Returns track with URIs 'xyz' or 'abc'
|
||||
filter({'uri': ['xyz', 'abc']})
|
||||
filter(uri=['xyz', 'abc'])
|
||||
|
||||
# Returns tracks with ID 1 and URI 'xyz'
|
||||
filter({'id': [1], 'uri': ['xyz']})
|
||||
filter(id=[1], uri=['xyz'])
|
||||
|
||||
# Returns track with a matching ID (1, 3 or 6) and a matching URI
|
||||
# ('xyz' or 'abc')
|
||||
filter({'id': [1, 3, 6], 'uri': ['xyz', 'abc']})
|
||||
filter(id=[1, 3, 6], uri=['xyz', 'abc'])
|
||||
# Returns tracks with a matching TLID (1, 3 or 6) and a
|
||||
# matching URI ('xyz' or 'abc')
|
||||
filter({'tlid': [1, 3, 6], 'uri': ['xyz', 'abc']})
|
||||
|
||||
:param criteria: one or more criteria to match by
|
||||
:type criteria: dict of (string, list) pairs
|
||||
:rtype: list of :class:`mopidy.models.TlTrack`
|
||||
|
||||
.. deprecated:: 1.1
|
||||
Providing the criteria via ``kwargs``.
|
||||
"""
|
||||
if kwargs:
|
||||
deprecation.warn('core.tracklist.filter:kwargs_criteria')
|
||||
|
||||
criteria = criteria or kwargs
|
||||
tlids = criteria.pop('tlid', [])
|
||||
validation.check_query(criteria, validation.TRACKLIST_FIELDS)
|
||||
validation.check_instances(tlids, int)
|
||||
|
||||
matches = self._tl_tracks
|
||||
for (key, values) in criteria.items():
|
||||
if (not isinstance(values, collections.Iterable) or
|
||||
isinstance(values, compat.string_types)):
|
||||
# Fail hard if anyone is using the <0.17 calling style
|
||||
raise ValueError('Filter values must be iterable: %r' % values)
|
||||
if key == 'tlid':
|
||||
matches = [ct for ct in matches if ct.tlid in values]
|
||||
else:
|
||||
matches = [
|
||||
ct for ct in matches if getattr(ct.track, key) in values]
|
||||
matches = [
|
||||
ct for ct in matches if getattr(ct.track, key) in values]
|
||||
if tlids:
|
||||
matches = [ct for ct in matches if ct.tlid in tlids]
|
||||
return matches
|
||||
|
||||
def move(self, start, end, to_position):
|
||||
@ -436,6 +525,7 @@ class TracklistController(object):
|
||||
|
||||
tl_tracks = self._tl_tracks
|
||||
|
||||
# TODO: use validation helpers?
|
||||
assert start < end, 'start must be smaller than end'
|
||||
assert start >= 0, 'start must be at least zero'
|
||||
assert end <= len(tl_tracks), \
|
||||
@ -462,8 +552,14 @@ class TracklistController(object):
|
||||
:param criteria: one or more criteria to match by
|
||||
:type criteria: dict
|
||||
:rtype: list of :class:`mopidy.models.TlTrack` that were removed
|
||||
|
||||
.. deprecated:: 1.1
|
||||
Providing the criteria via ``kwargs``.
|
||||
"""
|
||||
tl_tracks = self.filter(criteria, **kwargs)
|
||||
if kwargs:
|
||||
deprecation.warn('core.tracklist.remove:kwargs_criteria')
|
||||
|
||||
tl_tracks = self.filter(criteria or kwargs)
|
||||
for tl_track in tl_tracks:
|
||||
position = self._tl_tracks.index(tl_track)
|
||||
del self._tl_tracks[position]
|
||||
@ -484,6 +580,7 @@ class TracklistController(object):
|
||||
"""
|
||||
tl_tracks = self._tl_tracks
|
||||
|
||||
# TODO: use validation helpers?
|
||||
if start is not None and end is not None:
|
||||
assert start < end, 'start must be smaller than end'
|
||||
|
||||
@ -512,6 +609,7 @@ class TracklistController(object):
|
||||
:type end: int
|
||||
:rtype: :class:`mopidy.models.TlTrack`
|
||||
"""
|
||||
# TODO: validate slice?
|
||||
return self._tl_tracks[start:end]
|
||||
|
||||
def _mark_playing(self, tl_track):
|
||||
@ -528,13 +626,13 @@ class TracklistController(object):
|
||||
def _mark_played(self, tl_track):
|
||||
"""Internal method for :class:`mopidy.core.PlaybackController`."""
|
||||
if self.consume and tl_track is not None:
|
||||
self.remove(tlid=[tl_track.tlid])
|
||||
self.remove({'tlid': [tl_track.tlid]})
|
||||
return True
|
||||
return False
|
||||
|
||||
def _trigger_tracklist_changed(self):
|
||||
if self.get_random():
|
||||
self._shuffled = self.get_tl_tracks()
|
||||
self._shuffled = self._tl_tracks[:]
|
||||
random.shuffle(self._shuffled)
|
||||
else:
|
||||
self._shuffled = []
|
||||
|
||||
@ -2,6 +2,7 @@ from __future__ import absolute_import, unicode_literals
|
||||
|
||||
|
||||
class MopidyException(Exception):
|
||||
|
||||
def __init__(self, message, *args, **kwargs):
|
||||
super(MopidyException, self).__init__(message, *args, **kwargs)
|
||||
self._message = message
|
||||
@ -20,11 +21,19 @@ class BackendError(MopidyException):
|
||||
pass
|
||||
|
||||
|
||||
class CoreError(MopidyException):
|
||||
|
||||
def __init__(self, message, errno=None):
|
||||
super(CoreError, self).__init__(message, errno)
|
||||
self.errno = errno
|
||||
|
||||
|
||||
class ExtensionError(MopidyException):
|
||||
pass
|
||||
|
||||
|
||||
class FindError(MopidyException):
|
||||
|
||||
def __init__(self, message, errno=None):
|
||||
super(FindError, self).__init__(message, errno)
|
||||
self.errno = errno
|
||||
@ -42,5 +51,16 @@ class ScannerError(MopidyException):
|
||||
pass
|
||||
|
||||
|
||||
class TracklistFull(CoreError):
|
||||
|
||||
def __init__(self, message, errno=None):
|
||||
super(TracklistFull, self).__init__(message, errno)
|
||||
self.errno = errno
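Since core.tracklist.add() raises this error once the configured ``core/max_tracklist_length`` is exceeded, client code may want to catch it. A hedged sketch, with 'core' assumed as an in-process handle to the core API and an invented URI:

from mopidy import exceptions

# Sketch only; 'core' and the URI are assumptions for illustration.
try:
    core.tracklist.add(uris=['file:///music/a.flac'])
except exceptions.TracklistFull as error:
    print('Could not add track: %s' % error)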
|
||||
|
||||
|
||||
class AudioException(MopidyException):
|
||||
pass
|
||||
|
||||
|
||||
class ValidationError(ValueError):
|
||||
pass
|
||||
|
||||
128
mopidy/ext.py
@ -2,16 +2,25 @@ from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import collections
|
||||
import logging
|
||||
import os
|
||||
|
||||
import pkg_resources
|
||||
|
||||
from mopidy import config as config_lib, exceptions
|
||||
from mopidy.internal import path
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
_extension_data_fields = ['extension', 'entry_point', 'config_schema',
|
||||
'config_defaults', 'command']
|
||||
|
||||
ExtensionData = collections.namedtuple('ExtensionData', _extension_data_fields)
|
||||
|
||||
|
||||
class Extension(object):
|
||||
|
||||
"""Base class for Mopidy extensions"""
|
||||
|
||||
dist_name = None
|
||||
@ -51,6 +60,42 @@ class Extension(object):
|
||||
schema['enabled'] = config_lib.Boolean()
|
||||
return schema
|
||||
|
||||
def get_cache_dir(self, config):
|
||||
"""Get or create cache directory for the extension.
|
||||
|
||||
:param config: the Mopidy config object
|
||||
:return: string
|
||||
"""
|
||||
assert self.ext_name is not None
|
||||
cache_dir_path = bytes(os.path.join(config['core']['cache_dir'],
|
||||
self.ext_name))
|
||||
path.get_or_create_dir(cache_dir_path)
|
||||
return cache_dir_path
|
||||
|
||||
def get_config_dir(self, config):
|
||||
"""Get or create configuration directory for the extension.
|
||||
|
||||
:param config: the Mopidy config object
|
||||
:return: string
|
||||
"""
|
||||
assert self.ext_name is not None
|
||||
config_dir_path = bytes(os.path.join(config['core']['config_dir'],
|
||||
self.ext_name))
|
||||
path.get_or_create_dir(config_dir_path)
|
||||
return config_dir_path
|
||||
|
||||
def get_data_dir(self, config):
|
||||
"""Get or create data directory for the extension.
|
||||
|
||||
:param config: the Mopidy config object
|
||||
:returns: string
|
||||
"""
|
||||
assert self.ext_name is not None
|
||||
data_dir_path = bytes(os.path.join(config['core']['data_dir'],
|
||||
self.ext_name))
|
||||
path.get_or_create_dir(data_dir_path)
|
||||
return data_dir_path
|
||||
|
||||
def get_command(self):
|
||||
"""Command to expose to command line users running ``mopidy``.
|
||||
|
||||
@ -88,14 +133,7 @@ class Extension(object):
|
||||
the ``frontend`` and ``backend`` registry keys.
|
||||
|
||||
This method can also be used for other setup tasks not involving the
|
||||
extension registry. For example, to register custom GStreamer
|
||||
elements::
|
||||
|
||||
def setup(self, registry):
|
||||
from .mixer import SoundspotMixer
|
||||
gobject.type_register(SoundspotMixer)
|
||||
gst.element_register(
|
||||
SoundspotMixer, 'soundspotmixer', gst.RANK_MARGINAL)
|
||||
extension registry.
|
||||
|
||||
:param registry: the extension registry
|
||||
:type registry: :class:`Registry`
|
||||
@ -104,6 +142,7 @@ class Extension(object):
|
||||
|
||||
|
||||
class Registry(collections.Mapping):
|
||||
|
||||
"""Registry of components provided by Mopidy extensions.
|
||||
|
||||
Passed to the :meth:`~Extension.setup` method of all extensions. The
|
||||
@ -153,55 +192,100 @@ def load_extensions():
|
||||
for entry_point in pkg_resources.iter_entry_points('mopidy.ext'):
|
||||
logger.debug('Loading entry point: %s', entry_point)
|
||||
extension_class = entry_point.load(require=False)
|
||||
extension = extension_class()
|
||||
extension.entry_point = entry_point
|
||||
installed_extensions.append(extension)
|
||||
|
||||
try:
|
||||
if not issubclass(extension_class, Extension):
|
||||
raise TypeError # issubclass raises TypeError on non-class
|
||||
except TypeError:
|
||||
logger.error('Entry point %s did not contain a valid extension '
|
||||
'class: %r', entry_point.name, extension_class)
|
||||
continue
|
||||
|
||||
try:
|
||||
extension = extension_class()
|
||||
config_schema = extension.get_config_schema()
|
||||
default_config = extension.get_default_config()
|
||||
command = extension.get_command()
|
||||
except Exception:
|
||||
logger.exception('Setup of extension from entry point %s failed, '
|
||||
'ignoring extension.', entry_point.name)
|
||||
continue
|
||||
|
||||
installed_extensions.append(ExtensionData(
|
||||
extension, entry_point, config_schema, default_config, command))
|
||||
|
||||
logger.debug(
|
||||
'Loaded extension: %s %s', extension.dist_name, extension.version)
|
||||
|
||||
names = (e.ext_name for e in installed_extensions)
|
||||
names = (ed.extension.ext_name for ed in installed_extensions)
|
||||
logger.debug('Discovered extensions: %s', ', '.join(names))
|
||||
return installed_extensions
|
||||
|
||||
|
||||
def validate_extension(extension):
|
||||
def validate_extension_data(data):
|
||||
"""Verify extension's dependencies and environment.
|
||||
|
||||
:param data: extension data to check
|
||||
:returns: whether the extension should be run
|
||||
"""
|
||||
|
||||
logger.debug('Validating extension: %s', extension.ext_name)
|
||||
logger.debug('Validating extension: %s', data.extension.ext_name)
|
||||
|
||||
if extension.ext_name != extension.entry_point.name:
|
||||
if data.extension.ext_name != data.entry_point.name:
|
||||
logger.warning(
|
||||
'Disabled extension %(ep)s: entry point name (%(ep)s) '
|
||||
'does not match extension name (%(ext)s)',
|
||||
{'ep': extension.entry_point.name, 'ext': extension.ext_name})
|
||||
{'ep': data.entry_point.name, 'ext': data.extension.ext_name})
|
||||
return False
|
||||
|
||||
try:
|
||||
extension.entry_point.require()
|
||||
data.entry_point.require()
|
||||
except pkg_resources.DistributionNotFound as ex:
|
||||
logger.info(
|
||||
'Disabled extension %s: Dependency %s not found',
|
||||
extension.ext_name, ex)
|
||||
data.extension.ext_name, ex)
|
||||
return False
|
||||
except pkg_resources.VersionConflict as ex:
|
||||
if len(ex.args) == 2:
|
||||
found, required = ex.args
|
||||
logger.info(
|
||||
'Disabled extension %s: %s required, but found %s at %s',
|
||||
extension.ext_name, required, found, found.location)
|
||||
data.extension.ext_name, required, found, found.location)
|
||||
else:
|
||||
logger.info('Disabled extension %s: %s', extension.ext_name, ex)
|
||||
logger.info(
|
||||
'Disabled extension %s: %s', data.extension.ext_name, ex)
|
||||
return False
|
||||
|
||||
try:
|
||||
extension.validate_environment()
|
||||
data.extension.validate_environment()
|
||||
except exceptions.ExtensionError as ex:
|
||||
logger.info(
|
||||
'Disabled extension %s: %s', extension.ext_name, ex.message)
|
||||
'Disabled extension %s: %s', data.extension.ext_name, ex.message)
|
||||
return False
|
||||
except Exception:
|
||||
logger.exception('Validating extension %s failed with an exception.',
|
||||
data.extension.ext_name)
|
||||
return False
|
||||
|
||||
if not data.config_schema:
|
||||
logger.error('Extension %s does not have a config schema, disabling.',
|
||||
data.extension.ext_name)
|
||||
return False
|
||||
elif not isinstance(data.config_schema.get('enabled'), config_lib.Boolean):
|
||||
logger.error('Extension %s does not have the required "enabled" config'
|
||||
' option, disabling.', data.extension.ext_name)
|
||||
return False
|
||||
|
||||
for key, value in data.config_schema.items():
|
||||
if not isinstance(value, config_lib.ConfigValue):
|
||||
logger.error('Extension %s config schema contains an invalid value'
|
||||
' for the option "%s", disabling.',
|
||||
data.extension.ext_name, key)
|
||||
return False
|
||||
|
||||
if not data.config_defaults:
|
||||
logger.error('Extension %s does not have a default config, disabling.',
|
||||
data.extension.ext_name)
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
32
mopidy/file/__init__.py
Normal file
@ -0,0 +1,32 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import logging
|
||||
import os
|
||||
|
||||
import mopidy
|
||||
from mopidy import config, ext
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class Extension(ext.Extension):
|
||||
|
||||
dist_name = 'Mopidy-File'
|
||||
ext_name = 'file'
|
||||
version = mopidy.__version__
|
||||
|
||||
def get_default_config(self):
|
||||
conf_file = os.path.join(os.path.dirname(__file__), 'ext.conf')
|
||||
return config.read(conf_file)
|
||||
|
||||
def get_config_schema(self):
|
||||
schema = super(Extension, self).get_config_schema()
|
||||
schema['media_dirs'] = config.List(optional=True)
|
||||
schema['show_dotfiles'] = config.Boolean(optional=True)
|
||||
schema['follow_symlinks'] = config.Boolean(optional=True)
|
||||
schema['metadata_timeout'] = config.Integer(optional=True)
|
||||
return schema
|
||||
|
||||
def setup(self, registry):
|
||||
from .backend import FileBackend
|
||||
registry.add('backend', FileBackend)
|
||||
21
mopidy/file/backend.py
Normal file
@ -0,0 +1,21 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import logging
|
||||
|
||||
import pykka
|
||||
|
||||
from mopidy import backend
|
||||
from mopidy.file import library
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class FileBackend(pykka.ThreadingActor, backend.Backend):
|
||||
uri_schemes = ['file']
|
||||
|
||||
def __init__(self, config, audio):
|
||||
super(FileBackend, self).__init__()
|
||||
self.library = library.FileLibraryProvider(backend=self, config=config)
|
||||
self.playback = backend.PlaybackProvider(audio=audio, backend=self)
|
||||
self.playlists = None
|
||||
8
mopidy/file/ext.conf
Normal file
@ -0,0 +1,8 @@
|
||||
[file]
|
||||
enabled = true
|
||||
media_dirs =
|
||||
$XDG_MUSIC_DIR|Music
|
||||
~/|Home
|
||||
show_dotfiles = false
|
||||
follow_symlinks = false
|
||||
metadata_timeout = 1000
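Each ``media_dirs`` entry is a ``path|name`` pair, which _get_media_dirs() below splits on the first ``|``. A hedged example of overriding it in a user's mopidy.conf; the paths and names are made up:

[file]
enabled = true
media_dirs =
    /srv/music|Server music
    ~/Downloads/audio|Downloads
show_dotfiles = false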
|
||||
149
mopidy/file/library.py
Normal file
@ -0,0 +1,149 @@
|
||||
from __future__ import unicode_literals
|
||||
|
||||
import logging
|
||||
import operator
|
||||
import os
|
||||
import sys
|
||||
import urllib2
|
||||
|
||||
from mopidy import backend, exceptions, models
|
||||
from mopidy.audio import scan, utils
|
||||
from mopidy.internal import path
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
FS_ENCODING = sys.getfilesystemencoding()
|
||||
|
||||
|
||||
class FileLibraryProvider(backend.LibraryProvider):
|
||||
"""Library for browsing local files."""
|
||||
|
||||
# TODO: get_images that can pull from metadata and/or .folder.png etc?
|
||||
# TODO: handle playlists?
|
||||
|
||||
@property
|
||||
def root_directory(self):
|
||||
if not self._media_dirs:
|
||||
return None
|
||||
elif len(self._media_dirs) == 1:
|
||||
uri = path.path_to_uri(self._media_dirs[0]['path'])
|
||||
else:
|
||||
uri = 'file:root'
|
||||
return models.Ref.directory(name='Files', uri=uri)
|
||||
|
||||
def __init__(self, backend, config):
|
||||
super(FileLibraryProvider, self).__init__(backend)
|
||||
self._media_dirs = list(self._get_media_dirs(config))
|
||||
self._follow_symlinks = config['file']['follow_symlinks']
|
||||
self._show_dotfiles = config['file']['show_dotfiles']
|
||||
self._scanner = scan.Scanner(
|
||||
timeout=config['file']['metadata_timeout'])
|
||||
|
||||
def browse(self, uri):
|
||||
logger.debug('Browsing files at: %s', uri)
|
||||
result = []
|
||||
local_path = path.uri_to_path(uri)
|
||||
|
||||
if local_path == 'root':
|
||||
return list(self._get_media_dirs_refs())
|
||||
|
||||
if not self._is_in_basedir(os.path.realpath(local_path)):
|
||||
logger.warning(
|
||||
'Rejected attempt to browse path (%s) outside dirs defined '
|
||||
'in file/media_dirs config.', uri)
|
||||
return []
|
||||
|
||||
for dir_entry in os.listdir(local_path):
|
||||
child_path = os.path.join(local_path, dir_entry)
|
||||
uri = path.path_to_uri(child_path)
|
||||
|
||||
if not self._show_dotfiles and dir_entry.startswith(b'.'):
|
||||
continue
|
||||
|
||||
if os.path.islink(child_path) and not self._follow_symlinks:
|
||||
logger.debug('Ignoring symlink: %s', uri)
|
||||
continue
|
||||
|
||||
if not self._is_in_basedir(os.path.realpath(child_path)):
|
||||
logger.debug('Ignoring symlink to outside base dir: %s', uri)
|
||||
continue
|
||||
|
||||
name = dir_entry.decode(FS_ENCODING, 'replace')
|
||||
if os.path.isdir(child_path):
|
||||
result.append(models.Ref.directory(name=name, uri=uri))
|
||||
elif os.path.isfile(child_path) and self._is_audio_file(uri):
|
||||
result.append(models.Ref.track(name=name, uri=uri))
|
||||
|
||||
result.sort(key=operator.attrgetter('name'))
|
||||
return result
|
||||
|
||||
def lookup(self, uri):
|
||||
logger.debug('Looking up file URI: %s', uri)
|
||||
local_path = path.uri_to_path(uri)
|
||||
|
||||
if not self._is_in_basedir(local_path):
|
||||
logger.warning('Ignoring URI outside base dir: %s', local_path)
|
||||
return []
|
||||
|
||||
try:
|
||||
result = self._scanner.scan(uri)
|
||||
track = utils.convert_tags_to_track(result.tags).copy(
|
||||
uri=uri, length=result.duration)
|
||||
except exceptions.ScannerError as e:
|
||||
logger.warning('Failed looking up %s: %s', uri, e)
|
||||
track = models.Track(uri=uri)
|
||||
|
||||
if not track.name:
|
||||
filename = os.path.basename(local_path)
|
||||
name = urllib2.unquote(filename).decode(FS_ENCODING, 'replace')
|
||||
track = track.copy(name=name)
|
||||
|
||||
return [track]
|
||||
|
||||
def _get_media_dirs(self, config):
|
||||
for entry in config['file']['media_dirs']:
|
||||
media_dir = {}
|
||||
media_dir_split = entry.split('|', 1)
|
||||
local_path = path.expand_path(
|
||||
media_dir_split[0].encode(FS_ENCODING))
|
||||
|
||||
if not local_path:
|
||||
logger.warning('Failed expanding path (%s) from '
|
||||
'file/media_dirs config value.',
|
||||
media_dir_split[0])
|
||||
continue
|
||||
elif not os.path.isdir(local_path):
|
||||
logger.warning('%s is not a directory', local_path)
|
||||
continue
|
||||
|
||||
media_dir['path'] = local_path
|
||||
if len(media_dir_split) == 2:
|
||||
media_dir['name'] = media_dir_split[1]
|
||||
else:
|
||||
# TODO Mpd client should accept / in dir name
|
||||
media_dir['name'] = media_dir_split[0].replace(os.sep, '+')
|
||||
|
||||
yield media_dir
|
||||
|
||||
def _get_media_dirs_refs(self):
|
||||
for media_dir in self._media_dirs:
|
||||
yield models.Ref.directory(
|
||||
name=media_dir['name'],
|
||||
uri=path.path_to_uri(media_dir['path']))
|
||||
|
||||
def _is_audio_file(self, uri):
|
||||
try:
|
||||
result = self._scanner.scan(uri)
|
||||
if result.playable:
|
||||
logger.debug('Playable file: %s', result.uri)
|
||||
else:
|
||||
logger.debug('Unplayable file: %s (not audio)', result.uri)
|
||||
return result.playable
|
||||
except exceptions.ScannerError as e:
|
||||
logger.debug('Unplayable file: %s (%s)', uri, e)
|
||||
return False
|
||||
|
||||
def _is_in_basedir(self, local_path):
|
||||
return any(
|
||||
path.is_path_inside_base_dir(local_path, media_dir['path'])
|
||||
for media_dir in self._media_dirs)
|
||||
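The ``file/media_dirs`` entries consumed by ``_get_media_dirs()`` above are plain strings, either a path or ``path|name``. A small illustrative sketch of how an entry is split; the config values here are hypothetical:

import os

entries = ['~/Music|Music', '/srv/media']  # hypothetical file/media_dirs values

for entry in entries:
    parts = entry.split('|', 1)
    # The optional part after '|' becomes the display name; otherwise the
    # path itself is used, with os.sep replaced by '+'.
    name = parts[1] if len(parts) == 2 else parts[0].replace(os.sep, '+')
    print('path=%r name=%r' % (parts[0], name))
# path='~/Music' name='Music'
# path='/srv/media' name='+srv+media'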
@ -16,7 +16,7 @@ import tornado.websocket
|
||||
from mopidy import exceptions, models, zeroconf
|
||||
from mopidy.core import CoreListener
|
||||
from mopidy.http import handlers
|
||||
from mopidy.utils import encoding, formatting, network
|
||||
from mopidy.internal import encoding, formatting, network
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@ -12,7 +12,7 @@ import tornado.websocket
|
||||
|
||||
import mopidy
|
||||
from mopidy import core, models
|
||||
from mopidy.utils import encoding, jsonrpc
|
||||
from mopidy.internal import encoding, jsonrpc
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
@ -157,6 +157,7 @@ def set_mopidy_headers(request_handler):
|
||||
|
||||
|
||||
class JsonRpcHandler(tornado.web.RequestHandler):
|
||||
|
||||
def initialize(self, core):
|
||||
self.jsonrpc = make_jsonrpc_wrapper(core)
|
||||
|
||||
@ -191,6 +192,7 @@ class JsonRpcHandler(tornado.web.RequestHandler):
|
||||
|
||||
|
||||
class ClientListHandler(tornado.web.RequestHandler):
|
||||
|
||||
def initialize(self, apps, statics):
|
||||
self.apps = apps
|
||||
self.statics = statics
|
||||
@ -212,6 +214,7 @@ class ClientListHandler(tornado.web.RequestHandler):
|
||||
|
||||
|
||||
class StaticFileHandler(tornado.web.StaticFileHandler):
|
||||
|
||||
def set_extra_headers(self, path):
|
||||
set_mopidy_headers(self)
|
||||
|
||||
|
||||
52 mopidy/httpclient.py (new file)
@ -0,0 +1,52 @@
from __future__ import unicode_literals

import platform

import mopidy

"Helpers for configuring HTTP clients used in Mopidy extensions."


def format_proxy(proxy_config, auth=True):
    """Convert a Mopidy proxy config to the commonly used proxy string format.

    Outputs ``scheme://host:port``, ``scheme://user:pass@host:port`` or
    :class:`None` depending on the proxy config provided.

    You can also opt out of getting the basic auth by setting ``auth`` to
    :class:`False`.

    .. versionadded:: 1.1
    """
    if not proxy_config.get('hostname'):
        return None

    port = proxy_config.get('port', 80)
    if port < 0:
        port = 80

    if proxy_config.get('username') and proxy_config.get('password') and auth:
        template = '{scheme}://{username}:{password}@{hostname}:{port}'
    else:
        template = '{scheme}://{hostname}:{port}'

    return template.format(scheme=proxy_config.get('scheme') or 'http',
                           username=proxy_config.get('username'),
                           password=proxy_config.get('password'),
                           hostname=proxy_config['hostname'], port=port)


def format_user_agent(name=None):
    """Construct a User-Agent suitable for use in client code.

    This will identify use by the provided ``name`` (which should be on the
    format ``dist_name/version``), Mopidy version and Python version.

    .. versionadded:: 1.1
    """
    parts = ['Mopidy/%s' % (mopidy.__version__),
             '%s/%s' % (platform.python_implementation(),
                        platform.python_version())]
    if name:
        parts.insert(0, name)
    return ' '.join(parts)
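A brief usage sketch for the two helpers above; the proxy values and extension name are made up:

from mopidy import httpclient

proxy_config = {'scheme': 'http', 'hostname': 'proxy.lan', 'port': 8080,
                'username': 'alice', 'password': 'secret'}

httpclient.format_proxy(proxy_config)
# 'http://alice:secret@proxy.lan:8080'
httpclient.format_proxy(proxy_config, auth=False)
# 'http://proxy.lan:8080'

httpclient.format_user_agent('Mopidy-SomeExtension/0.1')
# e.g. 'Mopidy-SomeExtension/0.1 Mopidy/1.1.0 CPython/2.7.9'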
100 mopidy/internal/deprecation.py (new file)
@ -0,0 +1,100 @@
|
||||
from __future__ import unicode_literals
|
||||
|
||||
import contextlib
|
||||
import re
|
||||
import warnings
|
||||
|
||||
# Messages used in deprecation warnings are collected here so we can target
|
||||
# them easily when ignoring warnings.
|
||||
_MESSAGES = {
|
||||
# Deprecated features in mpd:
|
||||
'mpd.protocol.playback.pause:state_arg':
|
||||
'The use of pause command w/o the PAUSE argument is deprecated.',
|
||||
'mpd.protocol.current_playlist.playlist':
|
||||
'Do not use this, instead use playlistinfo',
|
||||
|
||||
# Deprecated features in audio:
|
||||
'audio.emit_end_of_stream': 'audio.emit_end_of_stream() is deprecated',
|
||||
|
||||
# Deprecated features in core library:
|
||||
'core.library.find_exact': 'library.find_exact() is deprecated',
|
||||
'core.library.lookup:uri_arg':
|
||||
'library.lookup() "uri" argument is deprecated',
|
||||
'core.library.search:kwargs_query':
|
||||
'library.search() with "kwargs" as query is deprecated',
|
||||
'core.library.search:empty_query':
|
||||
'library.search() with empty "query" argument deprecated',
|
||||
|
||||
# Deprecated features in core playback:
|
||||
'core.playback.get_mute': 'playback.get_mute() is deprecated',
|
||||
'core.playback.set_mute': 'playback.set_mute() is deprecated',
|
||||
'core.playback.get_volume': 'playback.get_volume() is deprecated',
|
||||
'core.playback.set_volume': 'playback.set_volume() is deprecated',
|
||||
'core.playback.play:tl_track_kwargs':
|
||||
'playback.play() with "tl_track" argument is pending deprecation, use '
|
||||
'"tlid" instead',
|
||||
|
||||
# Deprecated features in core playlists:
|
||||
'core.playlists.filter': 'playlists.filter() is deprecated',
|
||||
'core.playlists.get_playlists': 'playlists.get_playlists() is deprecated',
|
||||
|
||||
# Deprecated features in core tracklist:
|
||||
'core.tracklist.add:tracks_arg':
|
||||
'tracklist.add() "tracks" argument is deprecated',
|
||||
'core.tracklist.add:uri_arg':
|
||||
'tracklist.add() "uri" argument is deprecated',
|
||||
'core.tracklist.filter:kwargs_criteria':
|
||||
'tracklist.filter() with "kwargs" as criteria is deprecated',
|
||||
'core.tracklist.remove:kwargs_criteria':
|
||||
'tracklist.remove() with "kwargs" as criteria is deprecated',
|
||||
|
||||
'core.tracklist.eot_track':
|
||||
'tracklist.eot_track() is pending deprecation, use '
|
||||
'tracklist.get_eot_tlid()',
|
||||
'core.tracklist.next_track':
|
||||
'tracklist.next_track() is pending deprecation, use '
|
||||
'tracklist.get_next_tlid()',
|
||||
'core.tracklist.previous_track':
|
||||
'tracklist.previous_track() is pending deprecation, use '
|
||||
'tracklist.get_previous_tlid()',
|
||||
|
||||
'models.immutable.copy':
|
||||
'ImmutableObject.copy() is deprecated, use ImmutableObject.replace()',
|
||||
}
|
||||
|
||||
|
||||
def warn(msg_id, pending=False):
|
||||
if pending:
|
||||
category = PendingDeprecationWarning
|
||||
else:
|
||||
category = DeprecationWarning
|
||||
warnings.warn(_MESSAGES.get(msg_id, msg_id), category)
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def ignore(ids=None):
|
||||
with warnings.catch_warnings():
|
||||
if isinstance(ids, basestring):
|
||||
ids = [ids]
|
||||
|
||||
if ids:
|
||||
for msg_id in ids:
|
||||
msg = re.escape(_MESSAGES.get(msg_id, msg_id))
|
||||
warnings.filterwarnings('ignore', msg, DeprecationWarning)
|
||||
else:
|
||||
warnings.filterwarnings('ignore', category=DeprecationWarning)
|
||||
yield
|
||||
|
||||
|
||||
def deprecated_property(
|
||||
getter=None, setter=None, message='Property is deprecated'):
|
||||
|
||||
# During development, this is a convenient place to add logging, emit
|
||||
# warnings, or ``assert False`` to ensure you are not using any of the
|
||||
# deprecated properties.
|
||||
#
|
||||
# Using inspect to find the call sites to emit proper warnings makes
|
||||
# parallel execution of our test suite slower than serial execution. Thus,
|
||||
# we don't want to add any extra overhead here by default.
|
||||
|
||||
return property(getter, setter)
|
||||
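A sketch of how these helpers are meant to be used, keyed by one of the message ids collected above:

from mopidy.internal import deprecation

# Emits a DeprecationWarning (or PendingDeprecationWarning with pending=True)
# with the message registered for this id.
deprecation.warn('core.library.find_exact')

# Silences matching warnings, e.g. in tests that still exercise the old API.
with deprecation.ignore('core.library.find_exact'):
    pass  # call the deprecated API here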
@ -11,7 +11,7 @@ import pygst
|
||||
pygst.require('0.10')
|
||||
import gst # noqa
|
||||
|
||||
from mopidy.utils import formatting
|
||||
from mopidy.internal import formatting
|
||||
|
||||
|
||||
def format_dependency_list(adapters=None):
|
||||
16 mopidy/internal/http.py (new file)
@ -0,0 +1,16 @@
from __future__ import absolute_import, unicode_literals

import requests

from mopidy import httpclient


def get_requests_session(proxy_config, user_agent):
    proxy = httpclient.format_proxy(proxy_config)
    full_user_agent = httpclient.format_user_agent(user_agent)

    session = requests.Session()
    session.proxies.update({'http': proxy, 'https': proxy})
    session.headers.update({'user-agent': full_user_agent})

    return session
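A sketch of how an extension backend might use this helper; the proxy config and user agent below are hypothetical:

from mopidy.internal import http

proxy_config = {'scheme': 'http', 'hostname': 'proxy.lan', 'port': 8080}
session = http.get_requests_session(
    proxy_config, user_agent='Mopidy-SomeExtension/0.1')
response = session.get('https://example.com/api')  # proxied, custom User-Agent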
@ -10,6 +10,7 @@ from mopidy import compat
|
||||
|
||||
|
||||
class JsonRpcWrapper(object):
|
||||
|
||||
"""
|
||||
Wrap objects and make them accessible through JSON-RPC 2.0 messaging.
|
||||
|
||||
@ -278,6 +279,7 @@ def get_combined_json_decoder(decoders):
|
||||
|
||||
def get_combined_json_encoder(encoders):
|
||||
class JsonRpcEncoder(json.JSONEncoder):
|
||||
|
||||
def default(self, obj):
|
||||
for encoder in encoders:
|
||||
try:
|
||||
@ -289,6 +291,7 @@ def get_combined_json_encoder(encoders):
|
||||
|
||||
|
||||
class JsonRpcInspector(object):
|
||||
|
||||
"""
|
||||
Inspects a group of classes and functions to create a description of what
|
||||
methods they can expose over JSON-RPC 2.0.
|
||||
@ -21,6 +21,7 @@ logging.addLevelName(TRACE_LOG_LEVEL, 'TRACE')
|
||||
|
||||
|
||||
class DelayedHandler(logging.Handler):
|
||||
|
||||
def __init__(self):
|
||||
logging.Handler.__init__(self)
|
||||
self._released = False
|
||||
@ -101,6 +102,7 @@ def setup_debug_logging_to_file(config):
|
||||
|
||||
|
||||
class VerbosityFilter(logging.Filter):
|
||||
|
||||
def __init__(self, verbosity_level, loglevels):
|
||||
self.verbosity_level = verbosity_level
|
||||
self.loglevels = loglevels
|
||||
@ -123,6 +125,7 @@ COLORS = [b'black', b'red', b'green', b'yellow', b'blue', b'magenta', b'cyan',
|
||||
|
||||
|
||||
class ColorizingStreamHandler(logging.StreamHandler):
|
||||
|
||||
"""
|
||||
Stream handler which colorizes the log using ANSI escape sequences.
|
||||
|
||||
@ -11,13 +11,14 @@ import gobject
|
||||
|
||||
import pykka
|
||||
|
||||
from mopidy.utils import encoding
|
||||
from mopidy.internal import encoding
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class ShouldRetrySocketCall(Exception):
|
||||
|
||||
"""Indicate that attempted socket call should be retried"""
|
||||
|
||||
|
||||
@ -65,6 +66,7 @@ def format_hostname(hostname):
|
||||
|
||||
|
||||
class Server(object):
|
||||
|
||||
"""Setup listener and register it with gobject's event loop."""
|
||||
|
||||
def __init__(self, host, port, protocol, protocol_kwargs=None,
|
||||
@ -305,6 +307,7 @@ class Connection(object):
|
||||
|
||||
|
||||
class LineProtocol(pykka.ThreadingActor):
|
||||
|
||||
"""
|
||||
Base class for handling line based protocols.
|
||||
|
||||
@ -8,25 +8,15 @@ import threading
|
||||
import urllib
|
||||
import urlparse
|
||||
|
||||
import glib
|
||||
|
||||
from mopidy import compat, exceptions
|
||||
from mopidy.compat import queue
|
||||
from mopidy.utils import encoding
|
||||
from mopidy.internal import encoding, xdg
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
XDG_DIRS = {
|
||||
'XDG_CACHE_DIR': glib.get_user_cache_dir(),
|
||||
'XDG_CONFIG_DIR': glib.get_user_config_dir(),
|
||||
'XDG_DATA_DIR': glib.get_user_data_dir(),
|
||||
'XDG_MUSIC_DIR': glib.get_user_special_dir(glib.USER_DIRECTORY_MUSIC),
|
||||
}
|
||||
|
||||
# XDG_MUSIC_DIR can be none, so filter out any bad data.
|
||||
XDG_DIRS = dict((k, v) for k, v in XDG_DIRS.items() if v is not None)
|
||||
XDG_DIRS = xdg.get_dirs()
|
||||
|
||||
|
||||
def get_or_create_dir(dir_path):
|
||||
@ -202,31 +192,33 @@ def _find(root, thread_count=10, relative=False, follow=False):
|
||||
|
||||
def find_mtimes(root, follow=False):
|
||||
results, errors = _find(root, relative=False, follow=follow)
|
||||
mtimes = dict((f, int(st.st_mtime * 1000)) for f, st in results.items())
|
||||
# return the mtimes as integer milliseconds
|
||||
mtimes = {f: int(st.st_mtime * 1000) for f, st in results.items()}
|
||||
return mtimes, errors
|
||||
|
||||
|
||||
def check_file_path_is_inside_base_dir(file_path, base_path):
|
||||
assert not file_path.endswith(os.sep), (
|
||||
'File path %s cannot end with a path separator' % file_path)
|
||||
|
||||
def is_path_inside_base_dir(path, base_path):
|
||||
if path.endswith(os.sep):
|
||||
raise ValueError('Path %s cannot end with a path separator'
|
||||
% path)
|
||||
# Expand symlinks
|
||||
real_base_path = os.path.realpath(base_path)
|
||||
real_file_path = os.path.realpath(file_path)
|
||||
real_path = os.path.realpath(path)
|
||||
|
||||
# Use dir of file for prefix comparision, so we don't accept
|
||||
# /tmp/foo.m3u as being inside /tmp/foo, simply because they have a
|
||||
# common prefix, /tmp/foo, which matches the base path, /tmp/foo.
|
||||
real_dir_path = os.path.dirname(real_file_path)
|
||||
if os.path.isfile(path):
|
||||
# Use dir of file for prefix comparision, so we don't accept
|
||||
# /tmp/foo.m3u as being inside /tmp/foo, simply because they have a
|
||||
# common prefix, /tmp/foo, which matches the base path, /tmp/foo.
|
||||
real_path = os.path.dirname(real_path)
|
||||
|
||||
# Check if dir of file is the base path or a subdir
|
||||
common_prefix = os.path.commonprefix([real_base_path, real_dir_path])
|
||||
assert common_prefix == real_base_path, (
|
||||
'File path %s must be in %s' % (real_file_path, real_base_path))
|
||||
common_prefix = os.path.commonprefix([real_base_path, real_path])
|
||||
return common_prefix == real_base_path
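A sketch of the new boolean check with hypothetical paths; note that the dirname comparison above only applies when the path refers to an existing file:

from mopidy.internal import path

# Assuming /tmp/foo/bar.mp3 and /tmp/foo.m3u both exist on disk:
path.is_path_inside_base_dir(b'/tmp/foo/bar.mp3', b'/tmp/foo')  # expected True
path.is_path_inside_base_dir(b'/tmp/foo.m3u', b'/tmp/foo')      # expected False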
|
||||
|
||||
|
||||
# FIXME replace with mock usage in tests.
|
||||
class Mtime(object):
|
||||
|
||||
def __init__(self):
|
||||
self.fake = None
|
||||
|
||||
132 mopidy/internal/playlists.py (new file)
@ -0,0 +1,132 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import io
|
||||
|
||||
import pygst
|
||||
pygst.require('0.10')
|
||||
import gst # noqa
|
||||
|
||||
from mopidy.compat import configparser
|
||||
from mopidy.internal import validation
|
||||
|
||||
try:
|
||||
import xml.etree.cElementTree as elementtree
|
||||
except ImportError:
|
||||
import xml.etree.ElementTree as elementtree
|
||||
|
||||
|
||||
def parse(data):
|
||||
handlers = {
|
||||
detect_extm3u_header: parse_extm3u,
|
||||
detect_pls_header: parse_pls,
|
||||
detect_asx_header: parse_asx,
|
||||
detect_xspf_header: parse_xspf,
|
||||
}
|
||||
for detector, parser in handlers.items():
|
||||
if detector(data):
|
||||
return list(parser(data))
|
||||
return parse_urilist(data) # Fallback
|
||||
|
||||
|
||||
def detect_extm3u_header(data):
|
||||
return data[0:7].upper() == b'#EXTM3U'
|
||||
|
||||
|
||||
def detect_pls_header(data):
|
||||
return data[0:10].lower() == b'[playlist]'
|
||||
|
||||
|
||||
def detect_xspf_header(data):
|
||||
data = data[0:150]
|
||||
if b'xspf' not in data.lower():
|
||||
return False
|
||||
|
||||
try:
|
||||
data = io.BytesIO(data)
|
||||
for event, element in elementtree.iterparse(data, events=(b'start',)):
|
||||
return element.tag.lower() == '{http://xspf.org/ns/0/}playlist'
|
||||
except elementtree.ParseError:
|
||||
pass
|
||||
return False
|
||||
|
||||
|
||||
def detect_asx_header(data):
|
||||
data = data[0:50]
|
||||
if b'asx' not in data.lower():
|
||||
return False
|
||||
|
||||
try:
|
||||
data = io.BytesIO(data)
|
||||
for event, element in elementtree.iterparse(data, events=(b'start',)):
|
||||
return element.tag.lower() == 'asx'
|
||||
except elementtree.ParseError:
|
||||
pass
|
||||
return False
|
||||
|
||||
|
||||
def parse_extm3u(data):
|
||||
# TODO: convert non URIs to file URIs.
|
||||
found_header = False
|
||||
for line in data.splitlines():
|
||||
if found_header or line.startswith(b'#EXTM3U'):
|
||||
found_header = True
|
||||
else:
|
||||
continue
|
||||
if not line.startswith(b'#') and line.strip():
|
||||
yield line.strip()
|
||||
|
||||
|
||||
def parse_pls(data):
|
||||
# TODO: convert non URIs to file URIs.
|
||||
try:
|
||||
cp = configparser.RawConfigParser()
|
||||
cp.readfp(io.BytesIO(data))
|
||||
except configparser.Error:
|
||||
return
|
||||
|
||||
for section in cp.sections():
|
||||
if section.lower() != 'playlist':
|
||||
continue
|
||||
for i in range(cp.getint(section, 'numberofentries')):
|
||||
yield cp.get(section, 'file%d' % (i + 1))
|
||||
|
||||
|
||||
def parse_xspf(data):
|
||||
try:
|
||||
# Last element will be root.
|
||||
for event, element in elementtree.iterparse(io.BytesIO(data)):
|
||||
element.tag = element.tag.lower() # normalize
|
||||
except elementtree.ParseError:
|
||||
return
|
||||
|
||||
ns = 'http://xspf.org/ns/0/'
|
||||
for track in element.iterfind('{%s}tracklist/{%s}track' % (ns, ns)):
|
||||
yield track.findtext('{%s}location' % ns)
|
||||
|
||||
|
||||
def parse_asx(data):
|
||||
try:
|
||||
# Last element will be root.
|
||||
for event, element in elementtree.iterparse(io.BytesIO(data)):
|
||||
element.tag = element.tag.lower() # normalize
|
||||
except elementtree.ParseError:
|
||||
return
|
||||
|
||||
for ref in element.findall('entry/ref[@href]'):
|
||||
yield ref.get('href', '').strip()
|
||||
|
||||
for entry in element.findall('entry[@href]'):
|
||||
yield entry.get('href', '').strip()
|
||||
|
||||
|
||||
def parse_urilist(data):
|
||||
result = []
|
||||
for line in data.splitlines():
|
||||
if not line.strip() or line.startswith('#'):
|
||||
continue
|
||||
try:
|
||||
validation.check_uri(line)
|
||||
except ValueError:
|
||||
return []
|
||||
result.append(line)
|
||||
return result
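A minimal sketch of the format dispatch above, using a tiny hypothetical PLS payload:

from mopidy.internal import playlists

data = b'[playlist]\nNumberOfEntries=1\nFile1=http://example.com/stream\n'
playlists.parse(data)
# ['http://example.com/stream']  (detect_pls_header() matched, parse_pls() ran)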
|
||||
@ -53,6 +53,7 @@ def stop_remaining_actors():
|
||||
|
||||
|
||||
class BaseThread(threading.Thread):
|
||||
|
||||
def __init__(self):
|
||||
super(BaseThread, self).__init__()
|
||||
# No thread should block process from exiting
|
||||
105 mopidy/internal/validation.py (new file)
@ -0,0 +1,105 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import collections
|
||||
import urlparse
|
||||
|
||||
from mopidy import compat, exceptions
|
||||
|
||||
PLAYBACK_STATES = {'paused', 'stopped', 'playing'}
|
||||
|
||||
SEARCH_FIELDS = {
|
||||
'uri', 'track_name', 'album', 'artist', 'albumartist', 'composer',
|
||||
'performer', 'track_no', 'genre', 'date', 'comment', 'any'}
|
||||
|
||||
PLAYLIST_FIELDS = {'uri', 'name'} # TODO: add length and last_modified?
|
||||
|
||||
TRACKLIST_FIELDS = { # TODO: add bitrate, length, disc_no, track_no, modified?
|
||||
'uri', 'name', 'genre', 'date', 'comment', 'musicbrainz_id'}
|
||||
|
||||
DISTINCT_FIELDS = {
|
||||
'track', 'artist', 'albumartist', 'album', 'composer', 'performer', 'date',
|
||||
'genre'}
|
||||
|
||||
|
||||
# TODO: _check_iterable(check, msg, **kwargs) + [check(a) for a in arg]?
|
||||
def _check_iterable(arg, msg, **kwargs):
|
||||
"""Ensure we have an iterable which is not a string or an iterator"""
|
||||
if isinstance(arg, compat.string_types):
|
||||
raise exceptions.ValidationError(msg.format(arg=arg, **kwargs))
|
||||
elif not isinstance(arg, collections.Iterable):
|
||||
raise exceptions.ValidationError(msg.format(arg=arg, **kwargs))
|
||||
elif iter(arg) is iter(arg):
|
||||
raise exceptions.ValidationError(msg.format(arg=arg, **kwargs))
|
||||
|
||||
|
||||
def check_choice(arg, choices, msg='Expected one of {choices}, not {arg!r}'):
|
||||
if arg not in choices:
|
||||
raise exceptions.ValidationError(msg.format(
|
||||
arg=arg, choices=tuple(choices)))
|
||||
|
||||
|
||||
def check_boolean(arg, msg='Expected a boolean, not {arg!r}'):
|
||||
check_instance(arg, bool, msg=msg)
|
||||
|
||||
|
||||
def check_instance(arg, cls, msg='Expected a {name} instance, not {arg!r}'):
|
||||
if not isinstance(arg, cls):
|
||||
raise exceptions.ValidationError(
|
||||
msg.format(arg=arg, name=cls.__name__))
|
||||
|
||||
|
||||
def check_instances(arg, cls, msg='Expected a list of {name}, not {arg!r}'):
|
||||
_check_iterable(arg, msg, name=cls.__name__)
|
||||
if not all(isinstance(instance, cls) for instance in arg):
|
||||
raise exceptions.ValidationError(
|
||||
msg.format(arg=arg, name=cls.__name__))
|
||||
|
||||
|
||||
def check_integer(arg, min=None, max=None):
|
||||
if not isinstance(arg, (int, long)):
|
||||
raise exceptions.ValidationError('Expected an integer, not %r' % arg)
|
||||
elif min is not None and arg < min:
|
||||
raise exceptions.ValidationError(
|
||||
'Expected number larger or equal to %d, not %r' % (min, arg))
|
||||
elif max is not None and arg > max:
|
||||
raise exceptions.ValidationError(
|
||||
'Expected number smaller or equal to %d, not %r' % (max, arg))
|
||||
|
||||
|
||||
def check_query(arg, fields=SEARCH_FIELDS, list_values=True):
|
||||
# TODO: normalize name -> track_name
|
||||
# TODO: normalize value -> [value]
|
||||
# TODO: normalize blank -> [] or just remove field?
|
||||
# TODO: remove list_values?
|
||||
|
||||
if not isinstance(arg, collections.Mapping):
|
||||
raise exceptions.ValidationError(
|
||||
'Expected a query dictionary, not {arg!r}'.format(arg=arg))
|
||||
|
||||
for key, value in arg.items():
|
||||
check_choice(key, fields, msg='Expected query field to be one of '
|
||||
'{choices}, not {arg!r}')
|
||||
if list_values:
|
||||
msg = 'Expected "{key}" to be list of strings, not {arg!r}'
|
||||
_check_iterable(value, msg, key=key)
|
||||
[_check_query_value(key, v, msg) for v in value]
|
||||
else:
|
||||
_check_query_value(
|
||||
key, value, 'Expected "{key}" to be a string, not {arg!r}')
|
||||
|
||||
|
||||
def _check_query_value(key, arg, msg):
|
||||
if not isinstance(arg, compat.string_types) or not arg.strip():
|
||||
raise exceptions.ValidationError(msg.format(arg=arg, key=key))
|
||||
|
||||
|
||||
def check_uri(arg, msg='Expected a valid URI, not {arg!r}'):
|
||||
if not isinstance(arg, compat.string_types):
|
||||
raise exceptions.ValidationError(msg.format(arg=arg))
|
||||
elif urlparse.urlparse(arg).scheme == '':
|
||||
raise exceptions.ValidationError(msg.format(arg=arg))
|
||||
|
||||
|
||||
def check_uris(arg, msg='Expected a list of URIs, not {arg!r}'):
|
||||
_check_iterable(arg, msg)
|
||||
[check_uri(a, msg) for a in arg]
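A short sketch of how the query validation behaves; the query values are made up:

from mopidy import exceptions
from mopidy.internal import validation

validation.check_query({'artist': ['Beatles']})  # passes silently

try:
    validation.check_query({'artist': 'Beatles'})  # value must be a list of strings
except exceptions.ValidationError as error:
    print(error)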
|
||||
66 mopidy/internal/xdg.py (new file)
@ -0,0 +1,66 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import ConfigParser as configparser
|
||||
import io
|
||||
import os
|
||||
|
||||
|
||||
def get_dirs():
|
||||
"""Returns a dict of all the known XDG Base Directories for the current user.
|
||||
|
||||
The keys ``XDG_CACHE_DIR``, ``XDG_CONFIG_DIR``, and ``XDG_DATA_DIR`` are
always available.
|
||||
|
||||
Additional keys, like ``XDG_MUSIC_DIR``, may be available if the
|
||||
``$XDG_CONFIG_DIR/user-dirs.dirs`` file exists and is parseable.
|
||||
|
||||
See http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html
|
||||
for the XDG Base Directory specification.
|
||||
"""
|
||||
|
||||
dirs = {
|
||||
'XDG_CACHE_DIR': (
|
||||
os.environ.get('XDG_CACHE_HOME') or
|
||||
os.path.expanduser(b'~/.cache')),
|
||||
'XDG_CONFIG_DIR': (
|
||||
os.environ.get('XDG_CONFIG_HOME') or
|
||||
os.path.expanduser(b'~/.config')),
|
||||
'XDG_DATA_DIR': (
|
||||
os.environ.get('XDG_DATA_HOME') or
|
||||
os.path.expanduser(b'~/.local/share')),
|
||||
}
|
||||
|
||||
dirs.update(_get_user_dirs(dirs['XDG_CONFIG_DIR']))
|
||||
|
||||
return dirs
|
||||
|
||||
|
||||
def _get_user_dirs(xdg_config_dir):
|
||||
"""Returns a dict of XDG dirs read from
|
||||
``$XDG_CONFIG_HOME/user-dirs.dirs``.
|
||||
|
||||
This is used at import time for most users of :mod:`mopidy`. By rolling our
|
||||
own implementation instead of using :meth:`glib.get_user_special_dir` we
|
||||
make it possible for many extensions to run their test suites, which are
|
||||
importing parts of :mod:`mopidy`, in a virtualenv with global site-packages
|
||||
disabled, and thus no :mod:`glib` available.
|
||||
"""
|
||||
|
||||
dirs_file = os.path.join(xdg_config_dir, 'user-dirs.dirs')
|
||||
|
||||
if not os.path.exists(dirs_file):
|
||||
return {}
|
||||
|
||||
with open(dirs_file, 'rb') as fh:
|
||||
data = fh.read()
|
||||
|
||||
data = b'[XDG_USER_DIRS]\n' + data
|
||||
data = data.replace(b'$HOME', os.path.expanduser(b'~'))
|
||||
data = data.replace(b'"', b'')
|
||||
|
||||
config = configparser.RawConfigParser()
|
||||
config.readfp(io.BytesIO(data))
|
||||
|
||||
return {
|
||||
k.decode('utf-8').upper(): os.path.abspath(v)
|
||||
for k, v in config.items('XDG_USER_DIRS') if v is not None}
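A usage sketch for the new helper; only the three base keys are guaranteed, so optional ones like XDG_MUSIC_DIR should be looked up defensively:

from mopidy.internal import xdg

dirs = xdg.get_dirs()
cache_dir = dirs['XDG_CACHE_DIR']       # always present
music_dir = dirs.get('XDG_MUSIC_DIR')   # None if user-dirs.dirs is missing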
|
||||
@ -2,14 +2,18 @@ from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import logging
|
||||
|
||||
import gobject
|
||||
|
||||
import pykka
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def send_async(cls, event, **kwargs):
|
||||
# This file is imported by mopidy.backends, which again is imported by all
|
||||
# backend extensions. By importing modules that are not easily installable
|
||||
# close to their use, we make some extensions able to run their tests in a
|
||||
# virtualenv with global site-packages disabled.
|
||||
import gobject
|
||||
|
||||
gobject.idle_add(lambda: send(cls, event, **kwargs))
|
||||
|
||||
|
||||
@ -35,6 +39,7 @@ def send(cls, event, **kwargs):
|
||||
|
||||
|
||||
class Listener(object):
|
||||
|
||||
def on_event(self, event, **kwargs):
|
||||
"""
|
||||
Called on all events.
|
||||
|
||||
@ -48,6 +48,7 @@ class Extension(ext.Extension):
|
||||
|
||||
|
||||
class Library(object):
|
||||
|
||||
"""
|
||||
Local library interface.
|
||||
|
||||
|
||||
@ -7,8 +7,8 @@ import time
|
||||
|
||||
from mopidy import commands, compat, exceptions
|
||||
from mopidy.audio import scan, utils
|
||||
from mopidy.internal import path
|
||||
from mopidy.local import translator
|
||||
from mopidy.utils import path
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
@ -29,6 +29,7 @@ def _get_library(args, config):
|
||||
|
||||
|
||||
class LocalCommand(commands.Command):
|
||||
|
||||
def __init__(self):
|
||||
super(LocalCommand, self).__init__()
|
||||
self.add_child('scan', ScanCommand())
|
||||
@ -142,7 +143,7 @@ class ScanCommand(commands.Command):
|
||||
uri, MIN_DURATION_MS)
|
||||
else:
|
||||
mtime = file_mtimes.get(os.path.join(media_dir, relpath))
|
||||
track = utils.convert_tags_to_track(tags).copy(
|
||||
track = utils.convert_tags_to_track(tags).replace(
|
||||
uri=uri, length=duration, last_modified=mtime)
|
||||
if library.add_supports_tags_and_duration:
|
||||
library.add(track, tags=tags, duration=duration)
|
||||
@ -164,6 +165,7 @@ class ScanCommand(commands.Command):
|
||||
|
||||
|
||||
class _Progress(object):
|
||||
|
||||
def __init__(self, batch_size, total):
|
||||
self.count = 0
|
||||
self.batch_size = batch_size
|
||||
|
||||
@ -11,8 +11,8 @@ import tempfile
|
||||
|
||||
import mopidy
|
||||
from mopidy import compat, local, models
|
||||
from mopidy.internal import encoding, timer
|
||||
from mopidy.local import search, storage, translator
|
||||
from mopidy.utils import encoding, timer
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@ -174,7 +174,7 @@ class JsonLibrary(local.Library):
|
||||
search_result = search.search(self._tracks.values(), query, limit=None)
|
||||
for track in search_result.tracks:
|
||||
distinct_result.update(distinct(track))
|
||||
return distinct_result
|
||||
return distinct_result - {None}
|
||||
|
||||
def search(self, query=None, limit=100, offset=0, uris=None, exact=False):
|
||||
tracks = self._tracks.values()
|
||||
|
||||
@ -8,6 +8,7 @@ logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class LocalLibraryProvider(backend.LibraryProvider):
|
||||
|
||||
"""Proxy library that delegates work to our active local library."""
|
||||
|
||||
root_directory = models.Ref.directory(
|
||||
|
||||
@ -5,6 +5,7 @@ from mopidy.local import translator
|
||||
|
||||
|
||||
class LocalPlaybackProvider(backend.PlaybackProvider):
|
||||
|
||||
def translate_uri(self, uri):
|
||||
return translator.local_track_uri_to_file_uri(
|
||||
return translator.local_uri_to_file_uri(
|
||||
uri, self.backend.config['local']['media_dir'])
|
||||
|
||||
@ -43,8 +43,8 @@ def find_exact(tracks, query=None, limit=100, offset=0, uris=None):
|
||||
return filter(lambda a: q == a.name, t.artists)
|
||||
|
||||
def albumartist_filter(t):
|
||||
return any([q == a.name for a in getattr(t.album,
|
||||
'artists', [])])
|
||||
return any([
|
||||
q == a.name for a in getattr(t.album, 'artists', [])])
|
||||
|
||||
def composer_filter(t):
|
||||
return any([q == a.name for a in getattr(t, 'composers', [])])
|
||||
@ -150,8 +150,8 @@ def search(tracks, query=None, limit=100, offset=0, uris=None):
|
||||
q in t.album.name.lower())
|
||||
|
||||
def artist_filter(t):
|
||||
return bool(filter(lambda a:
|
||||
bool(a.name and q in a.name.lower()), t.artists))
|
||||
return bool(filter(
|
||||
lambda a: bool(a.name and q in a.name.lower()), t.artists))
|
||||
|
||||
def albumartist_filter(t):
|
||||
return any([a.name and q in a.name.lower()
|
||||
|
||||
@ -3,7 +3,7 @@ from __future__ import absolute_import, unicode_literals
|
||||
import logging
|
||||
import os
|
||||
|
||||
from mopidy.utils import encoding, path
|
||||
from mopidy.internal import encoding, path
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@ -5,25 +5,41 @@ import os
|
||||
import urllib
|
||||
|
||||
from mopidy import compat
|
||||
from mopidy.utils.path import path_to_uri, uri_to_path
|
||||
from mopidy.internal import path
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def local_track_uri_to_file_uri(uri, media_dir):
|
||||
return path_to_uri(local_track_uri_to_path(uri, media_dir))
|
||||
def local_uri_to_file_uri(uri, media_dir):
|
||||
"""Convert local track or directory URI to file URI."""
|
||||
return path_to_file_uri(local_uri_to_path(uri, media_dir))
|
||||
|
||||
|
||||
def local_track_uri_to_path(uri, media_dir):
|
||||
if not uri.startswith('local:track:'):
|
||||
def local_uri_to_path(uri, media_dir):
|
||||
"""Convert local track or directory URI to absolute path."""
|
||||
if (
|
||||
not uri.startswith('local:directory:') and
|
||||
not uri.startswith('local:track:')):
|
||||
raise ValueError('Invalid URI.')
|
||||
file_path = uri_to_path(uri).split(b':', 1)[1]
|
||||
file_path = path.uri_to_path(uri).split(b':', 1)[1]
|
||||
return os.path.join(media_dir, file_path)
|
||||
|
||||
|
||||
def local_track_uri_to_path(uri, media_dir):
|
||||
# Deprecated version to keep old versions of Mopidy-Local-Sqlite working.
|
||||
return local_uri_to_path(uri, media_dir)
|
||||
|
||||
|
||||
def path_to_file_uri(abspath):
|
||||
"""Convert absolute path to file URI."""
|
||||
# Re-export internal method for use by Mopidy-Local-* extensions.
|
||||
return path.path_to_uri(abspath)
|
||||
|
||||
|
||||
def path_to_local_track_uri(relpath):
|
||||
"""Convert path relative to media_dir to local track URI."""
|
||||
"""Convert path relative to :confval:`local/media_dir` to local track
|
||||
URI."""
|
||||
if isinstance(relpath, compat.text_type):
|
||||
relpath = relpath.encode('utf-8')
|
||||
return b'local:track:%s' % urllib.quote(relpath)
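A small sketch of the URI helpers above; the media dir is hypothetical:

from mopidy.local import translator

uri = translator.path_to_local_track_uri('foo bar.mp3')
# 'local:track:foo%20bar.mp3'

translator.local_uri_to_path(uri, b'/home/user/Music')
# expected: '/home/user/Music/foo bar.mp3'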
|
||||
|
||||
@ -5,9 +5,9 @@ import logging
|
||||
import pykka
|
||||
|
||||
from mopidy import backend
|
||||
from mopidy.internal import encoding, path
|
||||
from mopidy.m3u.library import M3ULibraryProvider
|
||||
from mopidy.m3u.playlists import M3UPlaylistsProvider
|
||||
from mopidy.utils import encoding, path
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@ -8,6 +8,7 @@ logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class M3ULibraryProvider(backend.LibraryProvider):
|
||||
|
||||
"""Library for looking up M3U playlists."""
|
||||
|
||||
def __init__(self, backend):
|
||||
|
||||
@ -51,10 +51,12 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):
|
||||
if os.path.exists(path):
|
||||
os.remove(path)
|
||||
else:
|
||||
logger.warn('Trying to delete missing playlist file %s', path)
|
||||
logger.warning(
|
||||
'Trying to delete missing playlist file %s', path)
|
||||
del self._playlists[uri]
|
||||
logger.info('Deleted playlist %s', uri)
|
||||
else:
|
||||
logger.warn('Trying to delete unknown playlist %s', uri)
|
||||
logger.warning('Trying to delete unknown playlist %s', uri)
|
||||
|
||||
def lookup(self, uri):
|
||||
return self._playlists.get(uri)
|
||||
@ -63,7 +65,7 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):
|
||||
playlists = {}
|
||||
|
||||
encoding = sys.getfilesystemencoding()
|
||||
for path in glob.glob(os.path.join(self._playlists_dir, b'*.m3u')):
|
||||
for path in glob.glob(os.path.join(self._playlists_dir, b'*.m3u*')):
|
||||
relpath = os.path.basename(path)
|
||||
uri = translator.path_to_playlist_uri(relpath)
|
||||
name = os.path.splitext(relpath)[0].decode(encoding, 'replace')
|
||||
@ -76,6 +78,8 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):
|
||||
'Loaded %d M3U playlists from %s',
|
||||
len(playlists), self._playlists_dir)
|
||||
|
||||
# TODO Trigger playlists_loaded event?
|
||||
|
||||
def save(self, playlist):
|
||||
assert playlist.uri, 'Cannot save playlist without URI'
|
||||
assert playlist.uri in self._playlists, \
|
||||
@ -110,4 +114,4 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):
|
||||
raise ValueError('M3U playlist needs name or URI')
|
||||
translator.save_m3u(path, playlist.tracks, 'latin1')
|
||||
# assert playlist name matches file name/uri
|
||||
return playlist.copy(uri=uri, name=name)
|
||||
return playlist.replace(uri=uri, name=name)
|
||||
|
||||
@ -8,9 +8,8 @@ import urllib
|
||||
import urlparse
|
||||
|
||||
from mopidy import compat
|
||||
from mopidy.internal import encoding, path
|
||||
from mopidy.models import Track
|
||||
from mopidy.utils.encoding import locale_decode
|
||||
from mopidy.utils.path import path_to_uri, uri_to_path
|
||||
|
||||
|
||||
M3U_EXTINF_RE = re.compile(r'#EXTINF:(-1|\d+),(.*)')
|
||||
@ -21,7 +20,7 @@ logger = logging.getLogger(__name__)
|
||||
def playlist_uri_to_path(uri, playlists_dir):
|
||||
if not uri.startswith('m3u:'):
|
||||
raise ValueError('Invalid URI %s' % uri)
|
||||
file_path = uri_to_path(uri)
|
||||
file_path = path.uri_to_path(uri)
|
||||
return os.path.join(playlists_dir, file_path)
|
||||
|
||||
|
||||
@ -74,38 +73,42 @@ def parse_m3u(file_path, media_dir=None):
|
||||
- Lines starting with # are ignored, except for extended M3U directives.
|
||||
- Track.name and Track.length are set from extended M3U directives.
|
||||
- m3u files are latin-1.
|
||||
- m3u8 files are utf-8
|
||||
"""
|
||||
# TODO: uris as bytes
|
||||
file_encoding = 'utf-8' if file_path.endswith(b'.m3u8') else 'latin1'
|
||||
|
||||
tracks = []
|
||||
try:
|
||||
with open(file_path) as m3u:
|
||||
with codecs.open(file_path, 'rb', file_encoding, 'replace') as m3u:
|
||||
contents = m3u.readlines()
|
||||
except IOError as error:
|
||||
logger.warning('Couldn\'t open m3u: %s', locale_decode(error))
|
||||
logger.warning('Couldn\'t open m3u: %s', encoding.locale_decode(error))
|
||||
return tracks
|
||||
|
||||
if not contents:
|
||||
return tracks
|
||||
|
||||
extended = contents[0].decode('latin1').startswith('#EXTM3U')
|
||||
# Strip newlines left by codecs
|
||||
contents = [line.strip() for line in contents]
|
||||
|
||||
extended = contents[0].startswith('#EXTM3U')
|
||||
|
||||
track = Track()
|
||||
for line in contents:
|
||||
line = line.strip().decode('latin1')
|
||||
|
||||
if line.startswith('#'):
|
||||
if extended and line.startswith('#EXTINF'):
|
||||
track = m3u_extinf_to_track(line)
|
||||
continue
|
||||
|
||||
if urlparse.urlsplit(line).scheme:
|
||||
tracks.append(track.copy(uri=line))
|
||||
tracks.append(track.replace(uri=line))
|
||||
elif os.path.normpath(line) == os.path.abspath(line):
|
||||
path = path_to_uri(line)
|
||||
tracks.append(track.copy(uri=path))
|
||||
uri = path.path_to_uri(line)
|
||||
tracks.append(track.replace(uri=uri))
|
||||
elif media_dir is not None:
|
||||
path = path_to_uri(os.path.join(media_dir, line))
|
||||
tracks.append(track.copy(uri=path))
|
||||
uri = path.path_to_uri(os.path.join(media_dir, line))
|
||||
tracks.append(track.replace(uri=uri))
|
||||
|
||||
track = Track()
|
||||
return tracks
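A sketch of parsing a tiny extended M3U8 playlist with the updated reader; the file contents and paths are made up, and the resulting Track fields follow the EXTINF handling above:

import tempfile

from mopidy.m3u import translator

with tempfile.NamedTemporaryFile(suffix='.m3u8', delete=False) as fh:
    fh.write(b'#EXTM3U\n#EXTINF:123,Example\nhttp://example.com/stream\n')
    m3u_path = fh.name

translator.parse_m3u(m3u_path, media_dir=b'/tmp')
# expected: one Track with uri='http://example.com/stream' and name='Example'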
|
||||
|
||||
@ -9,6 +9,7 @@ logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class Mixer(object):
|
||||
|
||||
"""
|
||||
Audio mixer API
|
||||
|
||||
@ -109,8 +110,13 @@ class Mixer(object):
|
||||
logger.debug('Mixer event: mute_changed(mute=%s)', mute)
|
||||
MixerListener.send('mute_changed', mute=mute)
|
||||
|
||||
def ping(self):
|
||||
"""Called to check if the actor is still alive."""
|
||||
return True
|
||||
|
||||
|
||||
class MixerListener(listener.Listener):
|
||||
|
||||
"""
|
||||
Marker interface for recipients of events sent by the mixer actor.
|
||||
|
||||
|
||||
@ -1,148 +1,17 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
import json
|
||||
from mopidy.models import fields
|
||||
from mopidy.models.immutable import ImmutableObject, ValidatedImmutableObject
|
||||
from mopidy.models.serialize import ModelJSONEncoder, model_json_decoder
|
||||
|
||||
__all__ = [
|
||||
'ImmutableObject', 'Ref', 'Image', 'Artist', 'Album', 'Track', 'TlTrack',
|
||||
'Playlist', 'SearchResult', 'model_json_decoder', 'ModelJSONEncoder',
|
||||
'ValidatedImmutableObject']
|
||||
|
||||
|
||||
class ImmutableObject(object):
|
||||
"""
|
||||
Superclass for immutable objects whose fields can only be modified via the
|
||||
constructor.
|
||||
class Ref(ValidatedImmutableObject):
|
||||
|
||||
:param kwargs: kwargs to set as fields on the object
|
||||
:type kwargs: any
|
||||
"""
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
for key, value in kwargs.items():
|
||||
if not hasattr(self, key) or callable(getattr(self, key)):
|
||||
raise TypeError(
|
||||
'__init__() got an unexpected keyword argument "%s"' %
|
||||
key)
|
||||
if value == getattr(self, key):
|
||||
continue # Don't explicitly set default values
|
||||
self.__dict__[key] = value
|
||||
|
||||
def __setattr__(self, name, value):
|
||||
if name.startswith('_'):
|
||||
return super(ImmutableObject, self).__setattr__(name, value)
|
||||
raise AttributeError('Object is immutable.')
|
||||
|
||||
def __repr__(self):
|
||||
kwarg_pairs = []
|
||||
for (key, value) in sorted(self.__dict__.items()):
|
||||
if isinstance(value, (frozenset, tuple)):
|
||||
if not value:
|
||||
continue
|
||||
value = list(value)
|
||||
kwarg_pairs.append('%s=%s' % (key, repr(value)))
|
||||
return '%(classname)s(%(kwargs)s)' % {
|
||||
'classname': self.__class__.__name__,
|
||||
'kwargs': ', '.join(kwarg_pairs),
|
||||
}
|
||||
|
||||
def __hash__(self):
|
||||
hash_sum = 0
|
||||
for key, value in self.__dict__.items():
|
||||
hash_sum += hash(key) + hash(value)
|
||||
return hash_sum
|
||||
|
||||
def __eq__(self, other):
|
||||
if not isinstance(other, self.__class__):
|
||||
return False
|
||||
|
||||
return self.__dict__ == other.__dict__
|
||||
|
||||
def __ne__(self, other):
|
||||
return not self.__eq__(other)
|
||||
|
||||
def copy(self, **values):
|
||||
"""
|
||||
Copy the model with ``field`` updated to new value.
|
||||
|
||||
Examples::
|
||||
|
||||
# Returns a track with a new name
|
||||
Track(name='foo').copy(name='bar')
|
||||
# Return an album with a new number of tracks
|
||||
Album(num_tracks=2).copy(num_tracks=5)
|
||||
|
||||
:param values: the model fields to modify
|
||||
:type values: dict
|
||||
:rtype: new instance of the model being copied
|
||||
"""
|
||||
data = {}
|
||||
for key in self.__dict__.keys():
|
||||
public_key = key.lstrip('_')
|
||||
value = values.pop(public_key, self.__dict__[key])
|
||||
data[public_key] = value
|
||||
for key in values.keys():
|
||||
if hasattr(self, key):
|
||||
value = values.pop(key)
|
||||
data[key] = value
|
||||
if values:
|
||||
raise TypeError(
|
||||
'copy() got an unexpected keyword argument "%s"' % key)
|
||||
return self.__class__(**data)
|
||||
|
||||
def serialize(self):
|
||||
data = {}
|
||||
data['__model__'] = self.__class__.__name__
|
||||
for key in self.__dict__.keys():
|
||||
public_key = key.lstrip('_')
|
||||
value = self.__dict__[key]
|
||||
if isinstance(value, (set, frozenset, list, tuple)):
|
||||
value = [
|
||||
v.serialize() if isinstance(v, ImmutableObject) else v
|
||||
for v in value]
|
||||
elif isinstance(value, ImmutableObject):
|
||||
value = value.serialize()
|
||||
if not (isinstance(value, list) and len(value) == 0):
|
||||
data[public_key] = value
|
||||
return data
|
||||
|
||||
|
||||
class ModelJSONEncoder(json.JSONEncoder):
|
||||
"""
|
||||
Automatically serialize Mopidy models to JSON.
|
||||
|
||||
Usage::
|
||||
|
||||
>>> import json
|
||||
>>> json.dumps({'a_track': Track(name='name')}, cls=ModelJSONEncoder)
|
||||
'{"a_track": {"__model__": "Track", "name": "name"}}'
|
||||
|
||||
"""
|
||||
def default(self, obj):
|
||||
if isinstance(obj, ImmutableObject):
|
||||
return obj.serialize()
|
||||
return json.JSONEncoder.default(self, obj)
|
||||
|
||||
|
||||
def model_json_decoder(dct):
|
||||
"""
|
||||
Automatically deserialize Mopidy models from JSON.
|
||||
|
||||
Usage::
|
||||
|
||||
>>> import json
|
||||
>>> json.loads(
|
||||
... '{"a_track": {"__model__": "Track", "name": "name"}}',
|
||||
... object_hook=model_json_decoder)
|
||||
{u'a_track': Track(artists=[], name=u'name')}
|
||||
|
||||
"""
|
||||
if '__model__' in dct:
|
||||
model_name = dct.pop('__model__')
|
||||
cls = globals().get(model_name, None)
|
||||
if issubclass(cls, ImmutableObject):
|
||||
kwargs = {}
|
||||
for key, value in dct.items():
|
||||
kwargs[key] = value
|
||||
return cls(**kwargs)
|
||||
return dct
|
||||
|
||||
|
||||
class Ref(ImmutableObject):
|
||||
"""
|
||||
Model to represent URI references with a human friendly name and type
|
||||
attached. This is intended for use as a lightweight object "free" of metadata
|
||||
@ -157,14 +26,15 @@ class Ref(ImmutableObject):
|
||||
"""
|
||||
|
||||
#: The object URI. Read-only.
|
||||
uri = None
|
||||
uri = fields.URI()
|
||||
|
||||
#: The object name. Read-only.
|
||||
name = None
|
||||
name = fields.String()
|
||||
|
||||
#: The object type, e.g. "artist", "album", "track", "playlist",
|
||||
#: "directory". Read-only.
|
||||
type = None
|
||||
type = fields.Identifier() # TODO: consider locking this down.
|
||||
# type = fields.Field(choices=(ALBUM, ARTIST, DIRECTORY, PLAYLIST, TRACK))
|
||||
|
||||
#: Constant used for comparison with the :attr:`type` field.
|
||||
ALBUM = 'album'
|
||||
@ -212,7 +82,8 @@ class Ref(ImmutableObject):
|
||||
return cls(**kwargs)
|
||||
|
||||
|
||||
class Image(ImmutableObject):
|
||||
class Image(ValidatedImmutableObject):
|
||||
|
||||
"""
|
||||
:param string uri: URI of the image
|
||||
:param int width: Optional width of image or :class:`None`
|
||||
@ -220,36 +91,43 @@ class Image(ImmutableObject):
|
||||
"""
|
||||
|
||||
#: The image URI. Read-only.
|
||||
uri = None
|
||||
uri = fields.URI()
|
||||
|
||||
#: Optional width of the image or :class:`None`. Read-only.
|
||||
width = None
|
||||
width = fields.Integer(min=0)
|
||||
|
||||
#: Optional height of the image or :class:`None`. Read-only.
|
||||
height = None
|
||||
height = fields.Integer(min=0)
|
||||
|
||||
|
||||
class Artist(ImmutableObject):
|
||||
class Artist(ValidatedImmutableObject):
|
||||
|
||||
"""
|
||||
:param uri: artist URI
|
||||
:type uri: string
|
||||
:param name: artist name
|
||||
:type name: string
|
||||
:param sortname: artist name for sorting
|
||||
:type sortname: string
|
||||
:param musicbrainz_id: MusicBrainz ID
|
||||
:type musicbrainz_id: string
|
||||
"""
|
||||
|
||||
#: The artist URI. Read-only.
|
||||
uri = None
|
||||
uri = fields.URI()
|
||||
|
||||
#: The artist name. Read-only.
|
||||
name = None
|
||||
name = fields.String()
|
||||
|
||||
#: Artist name for better sorting, e.g. with articles stripped
|
||||
sortname = fields.String()
|
||||
|
||||
#: The MusicBrainz ID of the artist. Read-only.
|
||||
musicbrainz_id = None
|
||||
musicbrainz_id = fields.Identifier()
|
||||
|
||||
|
||||
class Album(ImmutableObject):
|
||||
class Album(ValidatedImmutableObject):
|
||||
|
||||
"""
|
||||
:param uri: album URI
|
||||
:type uri: string
|
||||
@ -270,39 +148,35 @@ class Album(ImmutableObject):
|
||||
"""
|
||||
|
||||
#: The album URI. Read-only.
|
||||
uri = None
|
||||
uri = fields.URI()
|
||||
|
||||
#: The album name. Read-only.
|
||||
name = None
|
||||
name = fields.String()
|
||||
|
||||
#: A set of album artists. Read-only.
|
||||
artists = frozenset()
|
||||
artists = fields.Collection(type=Artist, container=frozenset)
|
||||
|
||||
#: The number of tracks in the album. Read-only.
|
||||
num_tracks = None
|
||||
num_tracks = fields.Integer(min=0)
|
||||
|
||||
#: The number of discs in the album. Read-only.
|
||||
num_discs = None
|
||||
num_discs = fields.Integer(min=0)
|
||||
|
||||
#: The album release date. Read-only.
|
||||
date = None
|
||||
date = fields.Date()
|
||||
|
||||
#: The MusicBrainz ID of the album. Read-only.
|
||||
musicbrainz_id = None
|
||||
musicbrainz_id = fields.Identifier()
|
||||
|
||||
#: The album image URIs. Read-only.
|
||||
images = frozenset()
|
||||
images = fields.Collection(type=basestring, container=frozenset)
|
||||
# XXX If we want to keep the order of images we shouldn't use frozenset()
|
||||
# as it doesn't preserve order. I'm deferring this issue until we got
|
||||
# actual usage of this field with more than one image.
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.__dict__['artists'] = frozenset(kwargs.pop('artists', None) or [])
|
||||
self.__dict__['images'] = frozenset(kwargs.pop('images', None) or [])
|
||||
super(Album, self).__init__(*args, **kwargs)
|
||||
|
||||
class Track(ValidatedImmutableObject):
|
||||
|
||||
class Track(ImmutableObject):
|
||||
"""
|
||||
:param uri: track URI
|
||||
:type uri: string
|
||||
@ -337,64 +211,56 @@ class Track(ImmutableObject):
|
||||
"""
|
||||
|
||||
#: The track URI. Read-only.
|
||||
uri = None
|
||||
uri = fields.URI()
|
||||
|
||||
#: The track name. Read-only.
|
||||
name = None
|
||||
name = fields.String()
|
||||
|
||||
#: A set of track artists. Read-only.
|
||||
artists = frozenset()
|
||||
artists = fields.Collection(type=Artist, container=frozenset)
|
||||
|
||||
#: The track :class:`Album`. Read-only.
|
||||
album = None
|
||||
album = fields.Field(type=Album)
|
||||
|
||||
#: A set of track composers. Read-only.
|
||||
composers = frozenset()
|
||||
composers = fields.Collection(type=Artist, container=frozenset)
|
||||
|
||||
#: A set of track performers. Read-only.
|
||||
performers = frozenset()
|
||||
performers = fields.Collection(type=Artist, container=frozenset)
|
||||
|
||||
#: The track genre. Read-only.
|
||||
genre = None
|
||||
genre = fields.String()
|
||||
|
||||
#: The track number in the album. Read-only.
|
||||
track_no = None
|
||||
track_no = fields.Integer(min=0)
|
||||
|
||||
#: The disc number in the album. Read-only.
|
||||
disc_no = None
|
||||
disc_no = fields.Integer(min=0)
|
||||
|
||||
#: The track release date. Read-only.
|
||||
date = None
|
||||
date = fields.Date()
|
||||
|
||||
#: The track length in milliseconds. Read-only.
|
||||
length = None
|
||||
length = fields.Integer(min=0)
|
||||
|
||||
#: The track's bitrate in kbit/s. Read-only.
|
||||
bitrate = None
|
||||
bitrate = fields.Integer(min=0)
|
||||
|
||||
#: The track comment. Read-only.
|
||||
comment = None
|
||||
comment = fields.String()
|
||||
|
||||
#: The MusicBrainz ID of the track. Read-only.
|
||||
musicbrainz_id = None
|
||||
musicbrainz_id = fields.Identifier()
|
||||
|
||||
#: Integer representing when the track was last modified. Exact meaning
|
||||
#: depends on source of track. For local files this is the modification
|
||||
#: time in milliseconds since Unix epoch. For other backends it could be an
|
||||
#: equivalent timestamp or simply a version counter.
|
||||
last_modified = None
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
def get(key):
|
||||
return frozenset(kwargs.pop(key, None) or [])
|
||||
|
||||
self.__dict__['artists'] = get('artists')
|
||||
self.__dict__['composers'] = get('composers')
|
||||
self.__dict__['performers'] = get('performers')
|
||||
super(Track, self).__init__(*args, **kwargs)
|
||||
last_modified = fields.Integer(min=0)
|
||||
|
||||
|
||||
class TlTrack(ImmutableObject):
|
||||
class TlTrack(ValidatedImmutableObject):
|
||||
|
||||
"""
|
||||
A tracklist track. Wraps a regular track and it's tracklist ID.
|
||||
|
||||
@ -416,10 +282,10 @@ class TlTrack(ImmutableObject):
|
||||
"""
|
||||
|
||||
#: The tracklist ID. Read-only.
|
||||
tlid = None
|
||||
tlid = fields.Integer(min=0)
|
||||
|
||||
#: The track. Read-only.
|
||||
track = None
|
||||
track = fields.Field(type=Track)
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
if len(args) == 2 and len(kwargs) == 0:
|
||||
@ -432,7 +298,8 @@ class TlTrack(ImmutableObject):
|
||||
return iter([self.tlid, self.track])
|
||||
|
||||
|
||||
class Playlist(ImmutableObject):
|
||||
class Playlist(ValidatedImmutableObject):
|
||||
|
||||
"""
|
||||
:param uri: playlist URI
|
||||
:type uri: string
|
||||
@ -446,23 +313,19 @@ class Playlist(ImmutableObject):
|
||||
"""
|
||||
|
||||
#: The playlist URI. Read-only.
|
||||
uri = None
|
||||
uri = fields.URI()
|
||||
|
||||
#: The playlist name. Read-only.
|
||||
name = None
|
||||
name = fields.String()
|
||||
|
||||
#: The playlist's tracks. Read-only.
|
||||
tracks = tuple()
|
||||
tracks = fields.Collection(type=Track, container=tuple)
|
||||
|
||||
#: The playlist modification time in milliseconds since Unix epoch.
|
||||
#: Read-only.
|
||||
#:
|
||||
#: Integer, or :class:`None` if unknown.
|
||||
last_modified = None
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.__dict__['tracks'] = tuple(kwargs.pop('tracks', None) or [])
|
||||
super(Playlist, self).__init__(*args, **kwargs)
|
||||
last_modified = fields.Integer(min=0)
|
||||
|
||||
# TODO: def insert(self, pos, track): ... ?
|
||||
|
||||
@ -472,7 +335,8 @@ class Playlist(ImmutableObject):
|
||||
return len(self.tracks)
|
||||
|
||||
|
||||
class SearchResult(ImmutableObject):
|
||||
class SearchResult(ValidatedImmutableObject):
|
||||
|
||||
"""
|
||||
:param uri: search result URI
|
||||
:type uri: string
|
||||
@ -485,19 +349,13 @@ class SearchResult(ImmutableObject):
|
||||
"""
|
||||
|
||||
# The search result URI. Read-only.
|
||||
uri = None
|
||||
uri = fields.URI()
|
||||
|
||||
# The tracks matching the search query. Read-only.
|
||||
tracks = tuple()
|
||||
tracks = fields.Collection(type=Track, container=tuple)
|
||||
|
||||
# The artists matching the search query. Read-only.
|
||||
artists = tuple()
|
||||
artists = fields.Collection(type=Artist, container=tuple)
|
||||
|
||||
# The albums matching the search query. Read-only.
|
||||
albums = tuple()
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.__dict__['tracks'] = tuple(kwargs.pop('tracks', None) or [])
|
||||
self.__dict__['artists'] = tuple(kwargs.pop('artists', None) or [])
|
||||
self.__dict__['albums'] = tuple(kwargs.pop('albums', None) or [])
|
||||
super(SearchResult, self).__init__(*args, **kwargs)
|
||||
albums = fields.Collection(type=Album, container=tuple)
|
||||
154 mopidy/models/fields.py (new file)
@ -0,0 +1,154 @@
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
|
||||
|
||||
class Field(object):
|
||||
|
||||
"""
|
||||
Base field for use in
|
||||
:class:`~mopidy.models.immutable.ValidatedImmutableObject`. These fields
|
||||
are responsible for type checking and other data sanitation in our models.
|
||||
|
||||
For simplicity fields use the Python descriptor protocol to store the
|
||||
values in the instance dictionary. Also note that fields are mutable if
|
||||
the object they are attached to allow it.
|
||||
|
||||
Default values will be validated with the exception of :class:`None`.
|
||||
|
||||
:param default: default value for field
|
||||
:param type: if set the field value must be of this type
|
||||
:param choices: if set the field value must be one of these
|
||||
"""
|
||||
|
||||
def __init__(self, default=None, type=None, choices=None):
|
||||
self._name = None # Set by ValidatedImmutableObjectMeta
|
||||
self._choices = choices
|
||||
self._default = default
|
||||
self._type = type
|
||||
|
||||
if self._default is not None:
|
||||
self.validate(self._default)
|
||||
|
||||
def validate(self, value):
|
||||
"""Validate and possibly modify the field value before assignment"""
|
||||
if self._type and not isinstance(value, self._type):
|
||||
raise TypeError('Expected %s to be a %s, not %r' %
|
||||
(self._name, self._type, value))
|
||||
if self._choices and value not in self._choices:
|
||||
raise TypeError('Expected %s to be a one of %s, not %r' %
|
||||
(self._name, self._choices, value))
|
||||
return value
|
||||
|
||||
def __get__(self, instance, owner):
|
||||
if not instance:
|
||||
return self
|
||||
return getattr(instance, '_' + self._name, self._default)
|
||||
|
||||
def __set__(self, instance, value):
|
||||
if value is not None:
|
||||
value = self.validate(value)
|
||||
|
||||
if value is None or value == self._default:
|
||||
self.__delete__(instance)
|
||||
else:
|
||||
setattr(instance, '_' + self._name, value)
|
||||
|
||||
def __delete__(self, instance):
|
||||
if hasattr(instance, '_' + self._name):
|
||||
delattr(instance, '_' + self._name)
|
||||
|
||||
|
||||
class String(Field):
|
||||
|
||||
"""
|
||||
Specialized :class:`Field` which is wired up for bytes and unicode.
|
||||
|
||||
:param default: default value for field
|
||||
"""
|
||||
|
||||
def __init__(self, default=None):
|
||||
# TODO: normalize to unicode?
|
||||
# TODO: only allow unicode?
|
||||
# TODO: disallow empty strings?
|
||||
super(String, self).__init__(type=basestring, default=default)
|
||||
|
||||
|
||||
class Date(String):
|
||||
"""
|
||||
:class:`Field` for storing ISO 8601 dates as a string.
|
||||
|
||||
Supported formats are ``YYYY-MM-DD``, ``YYYY-MM`` and ``YYYY``, currently
|
||||
not validated.
|
||||
|
||||
:param default: default value for field
|
||||
"""
|
||||
pass # TODO: make this check for YYYY-MM-DD, YYYY-MM, YYYY using strptime.
|
||||
|
||||
|
||||
class Identifier(String):
|
||||
"""
|
||||
:class:`Field` for storing ASCII values such as GUIDs or other identifiers.
|
||||
|
||||
Values will be interned.
|
||||
|
||||
:param default: default value for field
|
||||
"""
|
||||
def validate(self, value):
|
||||
return intern(str(super(Identifier, self).validate(value)))
|
||||
|
||||
|
||||
class URI(Identifier):
|
||||
"""
|
||||
:class:`Field` for storing URIs
|
||||
|
||||
Values will be interned, currently not validated.
|
||||
|
||||
:param default: default value for field
|
||||
"""
|
||||
pass # TODO: validate URIs?
|
||||
|
||||
|
||||
class Integer(Field):
|
||||
"""
|
||||
:class:`Field` for storing integer numbers.
|
||||
|
||||
:param default: default value for field
|
||||
:param min: field value must be larger or equal to this value when set
|
||||
:param max: field value must be smaller or equal to this value when set
|
||||
"""
|
||||
|
||||
def __init__(self, default=None, min=None, max=None):
|
||||
self._min = min
|
||||
self._max = max
|
||||
super(Integer, self).__init__(type=(int, long), default=default)
|
||||
|
||||
def validate(self, value):
|
||||
value = super(Integer, self).validate(value)
|
||||
if self._min is not None and value < self._min:
|
||||
raise ValueError('Expected %s to be at least %d, not %d' %
|
||||
(self._name, self._min, value))
|
||||
if self._max is not None and value > self._max:
|
||||
raise ValueError('Expected %s to be at most %d, not %d' %
|
||||
(self._name, self._max, value))
|
||||
return value
|
||||
|
||||
|
||||
class Collection(Field):
|
||||
"""
|
||||
:class:`Field` for storing collections of a given type.
|
||||
|
||||
:param type: all items stored in the collection must be of this type
|
||||
:param container: the type to store the items in
|
||||
"""
|
||||
|
||||
def __init__(self, type, container=tuple):
|
||||
super(Collection, self).__init__(type=type, default=container())
|
||||
|
||||
def validate(self, value):
|
||||
if isinstance(value, basestring):
|
||||
raise TypeError('Expected %s to be a collection of %s, not %r'
|
||||
% (self._name, self._type.__name__, value))
|
||||
for v in value:
|
||||
if not isinstance(v, self._type):
|
||||
raise TypeError('Expected %s to be a collection of %s, not %r'
|
||||
% (self._name, self._type.__name__, value))
|
||||
return self._default.__class__(value) or None
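A short sketch of how the new field descriptors surface in the public models; the track values below are made up, and the failure mode follows the Integer(min=0) definition above:

from mopidy.models import Track

track = Track(name='Nothing Else Matters', length=386000)
longer = track.replace(length=387000)  # replace() supersedes the deprecated copy()

try:
    Track(length=-1)  # Integer(min=0) should reject negative lengths
except ValueError as error:
    print(error)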
|
||||
Some files were not shown because too many files have changed in this diff.