Release v0.15.0
Commit: 3b48feda35
.mailmap (3 lines changed)
@@ -5,3 +5,6 @@ Kristian Klette <klette@samfundet.no>
 Johannes Knutsen <johannes@knutseninfo.no> <johannes@iterate.no>
 Johannes Knutsen <johannes@knutseninfo.no> <johannes@barbarmaclin.(none)>
 John Bäckstrand <sopues@gmail.com> <sandos@XBMCLive.(none)>
+Alli Witheford <alzeih@gmail.com>
+Alexandre Petitjean <alpetitjean@gmail.com>
+Alexandre Petitjean <alpetitjean@gmail.com> <alexandre.petitjean@lne.fr>
.travis.yml

@@ -1,15 +1,18 @@
 language: python
 
 install:
-  - "wget -q -O - http://apt.mopidy.com/mopidy.gpg | sudo apt-key add -"
-  - "sudo wget -q -O /etc/apt/sources.list.d/mopidy.list http://apt.mopidy.com/mopidy.list"
+  - "wget -O - http://apt.mopidy.com/mopidy.gpg | sudo apt-key add -"
+  - "sudo wget -O /etc/apt/sources.list.d/mopidy.list http://apt.mopidy.com/mopidy.list"
+  - "sudo apt-get update || true"
   - "sudo apt-get install $(apt-cache depends mopidy | awk '$2 !~ /mopidy/ {print $2}')"
+  - "pip install flake8"
 
 before_script:
   - "rm $VIRTUAL_ENV/lib/python$TRAVIS_PYTHON_VERSION/no-global-site-packages.txt"
 
-script: nosetests
+script:
+  - "flake8 $(find . -iname '*.py')"
+  - "nosetests"
 
 notifications:
   irc:
AUTHORS (5 lines changed)

@@ -19,3 +19,8 @@
 - Nick Steel <kingosticks@gmail.com>
 - Zan Dobersek <zandobersek@gmail.com>
 - Thomas Refis <refis.thomas@gmail.com>
+- Janez Troha <janez.troha@gmail.com>
+- Tobias Sauerwein <cgtobi@gmail.com>
+- Alli Witheford <alzeih@gmail.com>
+- Alexandre Petitjean <alpetitjean@gmail.com>
+- Pavol Babincak <scroolik@gmail.com>
README.rst

@@ -25,4 +25,5 @@ To get started with Mopidy, check out `the docs <http://docs.mopidy.com/>`_.
 - Mailing list: `mopidy@googlegroups.com <https://groups.google.com/forum/?fromgroups=#!forum/mopidy>`_
 - Twitter: `@mopidy <https://twitter.com/mopidy/>`_
 
-.. image:: https://secure.travis-ci.org/mopidy/mopidy.png?branch=develop
+.. image:: https://travis-ci.org/mopidy/mopidy.png?branch=develop
+   :target: https://travis-ci.org/mopidy/mopidy
docs/changelog.rst

@@ -4,6 +4,83 @@ Changelog
 
 This changelog is used to track all major changes to Mopidy.
 
+v0.15.0 (2013-09-19)
+====================
+
+A release with a number of small and medium fixes, with no specific focus.
+
+**Dependencies**
+
+- Mopidy no longer supports Python 2.6. Currently, the only Python version
+  supported by Mopidy is Python 2.7. We're continuously working towards running
+  Mopidy on Python 3. (Fixes: :issue:`344`)
+
+**Command line options**
+
+- Converted from the optparse to the argparse library for handling command line
+  options.
+
+- :option:`mopidy --show-config` will now take into consideration any
+  :option:`mopidy --option` arguments appearing later on the command line. This
+  helps you see the effective configuration for runs with the same
+  :option:`mopidy --option` arguments.
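The move from optparse to argparse described above can be sketched as follows. This is a minimal illustration, not Mopidy's real parser: the flag names come from the bullets, but the `-o` short form, help texts, and value syntax are assumptions.

```python
import argparse

# Hypothetical parser resembling the options mentioned above; the '-o'
# short form and help texts are assumptions, not Mopidy's actual CLI.
parser = argparse.ArgumentParser(prog='mopidy')
parser.add_argument('--show-config', action='store_true',
                    help='print the effective config and exit')
parser.add_argument('-o', '--option', action='append', default=[],
                    metavar='SECTION/KEY=VALUE',
                    help='config override; may be repeated')

args = parser.parse_args(['--show-config', '-o', 'mpd/enabled=false'])
assert args.show_config is True
assert args.option == ['mpd/enabled=false']
```

With `action='append'`, every `--option` on the command line accumulates into one list, which is what lets a later override feed into `--show-config`.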
+
+**Audio**
+
+- Added support for audio visualization. :confval:`audio/visualizer` can now be
+  set to GStreamer visualizers.
+
+- Properly encode localized mixer names before logging.
+
+**Local backend**
+
+- An album's number of discs and a track's disc number are now extracted when
+  scanning your music collection.
+
+- The scanner now gives up scanning a file after a second, and continues with
+  the next file. This fixes some hangs on non-media files, like logs. (Fixes:
+  :issue:`476`, :issue:`483`)
+
+- Added support for pluggable library updaters. This allows extension writers
+  to provide their own custom libraries instead of being stuck with our tag
+  cache as the only option.
+
+- Converted the local backend to use the new ``local:playlist:path`` and
+  ``local:track:path`` URI schemes. Support for ``file://`` moved to the
+  streaming backend.
+
+**Spotify backend**
+
+- Prepend playlist folder names to the playlist name, so that the playlist
+  hierarchy from your Spotify account is available in Mopidy. (Fixes:
+  :issue:`62`)
+
+- Fix proxy config values that were broken with the config system change in
+  0.14. (Fixes: :issue:`472`)
+
+**MPD frontend**
+
+- Replace newline, carriage return and forward slash in playlist names. (Fixes:
+  :issue:`474`, :issue:`480`)
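A minimal sketch of that playlist-name sanitizing step. The helper name and the ``|`` replacement character are illustrative assumptions, not necessarily what Mopidy's fix uses:

```python
import re

def sanitize_playlist_name(name):
    # Hypothetical helper mirroring the fix described above: newline and
    # carriage return break the line-oriented MPD protocol, and a forward
    # slash breaks playlist URIs, so all three are replaced with '|'.
    return re.sub(r'[\n\r/]', '|', name)

assert sanitize_playlist_name('my/list\r\nname') == 'my|list||name'
```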
+
+- Accept ``listall`` and ``listallinfo`` commands without the URI parameter.
+  The methods are still not implemented, but the commands are now accepted as
+  valid.
+
+**HTTP frontend**
+
+- Fix a too broad truth test that caused :class:`mopidy.models.TlTrack`
+  objects with ``tlid`` set to ``0`` to be sent to the HTTP client without the
+  ``tlid`` field. (Fixes: :issue:`501`)
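The ``tlid`` bug above is the classic ``if x:`` versus ``if x is not None:`` distinction. A standalone sketch of the bug class, not Mopidy's actual serialization code:

```python
def serialize(tlid):
    # Sketch of the fix described above: 'if tlid:' is a too broad truth
    # test that drops a valid tlid of 0, while 'if tlid is not None:'
    # keeps it and still omits a genuinely absent value.
    data = {}
    if tlid is not None:
        data['tlid'] = tlid
    return data

assert serialize(0) == {'tlid': 0}  # 0 survives the narrower test
assert serialize(None) == {}        # absence is still omitted
```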
+
+- Upgrade Mopidy.js dependencies. This version has been released to NPM as
+  Mopidy.js v0.1.1.
+
+**Extension support**
+
+- :class:`mopidy.config.Secret` is now deserialized to unicode instead of
+  bytes. This may require modifications to extensions.
+
+
 v0.14.2 (2013-07-01)
 ====================
docs/conf.py

@@ -33,7 +33,10 @@ class Mock(object):
         if name in ('__file__', '__path__'):
             return '/dev/null'
         elif (name[0] == name[0].upper()
-                and not name.startswith('MIXER_TRACK_')):
+                # gst.interfaces.MIXER_TRACK_*
+                and not name.startswith('MIXER_TRACK_')
+                # dbus.String()
+                and not name == 'String'):
             return type(name, (), {})
         else:
             return Mock()
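The hunk above shows only the changed branch of ``__getattr__``. Assembled into a self-contained class (the common Sphinx trick for mocking away unimportable C modules such as ``gst`` and ``dbus`` during doc builds), it looks roughly like this; the docstring and method bodies outside the hunk are reconstructed from the standard recipe, not copied from Mopidy:

```python
class Mock(object):
    """Stand-in so Sphinx can import modules whose dependencies
    (gst, dbus, ...) are not installed on the docs build host."""

    def __init__(self, *args, **kwargs):
        pass

    def __call__(self, *args, **kwargs):
        return Mock()

    @classmethod
    def __getattr__(cls, name):
        if name in ('__file__', '__path__'):
            return '/dev/null'
        elif (name[0] == name[0].upper()
                # gst.interfaces.MIXER_TRACK_*
                and not name.startswith('MIXER_TRACK_')
                # dbus.String()
                and not name == 'String'):
            # CamelCase/UPPERCASE names look like classes: fabricate a type
            return type(name, (), {})
        else:
            return Mock()

m = Mock()
assert m.__file__ == '/dev/null'
assert isinstance(m.SomeClass, type)          # fabricated class
assert isinstance(m.MIXER_TRACK_VOLUME, Mock) # excluded constant -> mock
assert isinstance(m.String, Mock)             # excluded dbus.String -> mock
```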
@@ -98,7 +101,7 @@ master_doc = 'index'
 
 # General information about the project.
 project = 'Mopidy'
-copyright = '2010-2013, Stein Magnus Jodal and contributors'
+copyright = '2009-2013, Stein Magnus Jodal and contributors'
 
 # The version info for the project you're documenting, acts as replacement for
 # |version| and |release|, also used in various other places throughout the
@@ -90,6 +90,16 @@ Core configuration values
    ``gst-inspect-0.10`` to see what output properties can be set on the sink.
    For example: ``gst-inspect-0.10 shout2send``
 
+.. confval:: audio/visualizer
+
+   Visualizer to use.
+
+   Can be left blank if no visualizer is desired. Otherwise this expects a
+   GStreamer visualizer. Typical values are ``monoscope``, ``goom``,
+   ``goom2k1`` or one of the `libvisual`_ visualizers.
+
+   .. _libvisual: http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-plugin-libvisual.html
+
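In ``mopidy.conf`` terms, enabling a visualizer would look like the fragment below; ``monoscope`` is chosen arbitrarily from the typical values listed above:

```ini
[audio]
visualizer = monoscope
```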
 .. confval:: logging/console_format
 
    The log format used for informational logging.
@@ -134,9 +144,25 @@ Core configuration values
 
    Password for the proxy server, if needed.
 
-.. _the Python logging docs:
-   http://docs.python.org/2/library/logging.config.html
+.. _the Python logging docs: http://docs.python.org/2/library/logging.config.html
+
+Extension configuration
+=======================
+
+Mopidy's extensions have their own config values that you may want to tweak.
+For the available config values, please refer to the docs for each extension.
+Most, if not all, can be found at :ref:`ext`.
+
+Mopidy extensions are enabled by default when they are installed. If you want
+to disable an extension without uninstalling it, all extensions support the
+``enabled`` config value even if it isn't explicitly documented by all
+extensions. If the ``enabled`` config value is set to ``false`` the extension
+will not be started. For example, to disable the Spotify extension, add the
+following to your ``mopidy.conf``::
+
+    [spotify]
+    enabled = false
 
 
-Extension configuration
@@ -22,8 +22,8 @@ tested by Jenkins before it is merged into the ``develop`` branch, which is a
 bit late, but good enough to get broad testing before new code is released.
 
 In addition to running tests, the Jenkins CI server also gathers coverage
-statistics and uses pylint to check for errors and possible improvements in our
-code. So, if you're out of work, the code coverage and pylint data at the CI
+statistics and uses flake8 to check for errors and possible improvements in our
+code. So, if you're out of work, the code coverage and flake8 data at the CI
 server should give you a place to start.
@@ -46,6 +46,22 @@ Issues:
   https://github.com/dz0ny/mopidy-beets/issues
 
 
+Mopidy-GMusic
+-------------
+
+Provides a backend for playing music from `Google Play Music
+<https://play.google.com/music/>`_.
+
+Author:
+  Ronald Hecht
+PyPI:
+  `Mopidy-GMusic <https://pypi.python.org/pypi/Mopidy-GMusic>`_
+GitHub:
+  `hechtus/mopidy-gmusic <https://github.com/hechtus/mopidy-gmusic>`_
+Issues:
+  https://github.com/hechtus/mopidy-gmusic/issues
+
+
 Mopidy-NAD
 ----------

@@ -61,6 +77,22 @@ Issues:
   https://github.com/mopidy/mopidy/issues
 
 
+Mopidy-SomaFM
+-------------
+
+Provides a backend for playing music from the `SomaFM <http://somafm.com/>`_
+service.
+
+Author:
+  Alexandre Petitjean
+PyPI:
+  `Mopidy-SomaFM <https://pypi.python.org/pypi/Mopidy-SomaFM>`_
+GitHub:
+  `AlexandrePTJ/mopidy-somafm <https://github.com/AlexandrePTJ/mopidy-somafm/>`_
+Issues:
+  https://github.com/AlexandrePTJ/mopidy-somafm/issues
+
+
 Mopidy-SoundCloud
 -----------------

@@ -75,3 +107,19 @@ GitHub:
   `dz0ny/mopidy-soundcloud <https://github.com/dz0ny/mopidy-soundcloud>`_
 Issues:
   https://github.com/dz0ny/mopidy-soundcloud/issues
+
+
+Mopidy-Subsonic
+---------------
+
+Provides a backend for playing music from a `Subsonic Music Streamer
+<http://www.subsonic.org/>`_ library.
+
+Author:
+  Bradon Kanyid
+PyPI:
+  `Mopidy-Subsonic <https://pypi.python.org/pypi/Mopidy-Subsonic>`_
+GitHub:
+  `rattboi/mopidy-subsonic <https://github.com/rattboi/mopidy-subsonic>`_
+Issues:
+  https://github.com/rattboi/mopidy-subsonic/issues
@@ -47,6 +47,11 @@ Configuration values
 
    Path to tag cache for local media.
 
+.. confval:: local/scan_timeout
+
+   Number of milliseconds before giving up scanning a file and moving on to
+   the next file.
+
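As a ``mopidy.conf`` fragment, tuning the scan timeout might look like the following; the 1000 ms value is an illustrative choice matching the one-second behaviour described in the changelog, not a documented default:

```ini
[local]
scan_timeout = 1000
```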
 
 Usage
 =====
@@ -48,7 +48,7 @@ About
    :maxdepth: 1
 
    authors
-   licenses
+   license
    changelog
    versioning
@@ -131,12 +131,6 @@ Pip.
 
       PYTHONPATH=$(brew --prefix)/lib/python2.7/site-packages mopidy
 
-   Note that you need to replace ``python2.7`` with ``python2.6`` in the above
-   ``PYTHONPATH`` examples if you are using Python 2.6. To find your Python
-   version, run::
-
-      python --version
-
 #. Next up, you need to install some Python packages. To do so, we use Pip. If
    you don't have the ``pip`` command, you can install it now::
@@ -157,7 +151,7 @@ Otherwise: Install from source using Pip
 
 If you are on Linux, but can't install from the APT archive or from AUR, you
 can install Mopidy from PyPI using Pip.
 
-#. First of all, you need Python >= 2.6, < 3. Check if you have Python and what
+#. First of all, you need Python 2.7. Check if you have Python and what
    version by running::
 
       python --version
@@ -16,12 +16,20 @@ distribution.
 
 .. _raspi-wheezy:
 
-How to for Debian 7 (Wheezy)
-============================
+How to for Raspbian "wheezy" and Debian "wheezy"
+================================================
 
-#. Download the latest wheezy disk image from
-   http://downloads.raspberrypi.org/images/debian/7/. I used the one dated
-   2012-08-08.
+This guide applies for both:
+
+- Raspbian "wheezy" for armhf (hard-float), and
+- Debian "wheezy" for armel (soft-float)
+
+If you don't know which one to select, go for the armhf variant, as it'll give
+you a lot better performance.
+
+#. Download the latest "wheezy" disk image from
+   http://www.raspberrypi.org/downloads/. This was last tested with the images
+   from 2013-05-25 for armhf and 2013-05-29 for armel.
 
 #. Flash the OS image to your SD card. See
    http://elinux.org/RPi_Easy_SD_Card_Setup for help.
docs/license.rst (new file, 10 lines)

@@ -0,0 +1,10 @@
+*******
+License
+*******
+
+Mopidy is copyright 2009-2013 Stein Magnus Jodal and contributors. For a list
+of contributors, see :doc:`authors`. For details on who has contributed what,
+please refer to our git repository.
+
+Mopidy is licensed under the `Apache License, Version 2.0
+<http://www.apache.org/licenses/LICENSE-2.0>`_.
@ -1,34 +0,0 @@
|
||||
********
|
||||
Licenses
|
||||
********
|
||||
|
||||
For a list of contributors, see :doc:`authors`. For details on who have
|
||||
contributed what, please refer to our git repository.
|
||||
|
||||
Source code license
|
||||
===================
|
||||
|
||||
Copyright 2009-2013 Stein Magnus Jodal and contributors
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
||||
|
||||
|
||||
Documentation license
|
||||
=====================
|
||||
|
||||
Copyright 2010-2013 Stein Magnus Jodal and contributors
|
||||
|
||||
This work is licensed under the Creative Commons Attribution-ShareAlike 3.0
|
||||
Unported License. To view a copy of this license, visit
|
||||
http://creativecommons.org/licenses/by-sa/3.0/ or send a letter to Creative
|
||||
Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.
|
||||
fabfile.py (47 lines changed)

@@ -1,21 +1,62 @@
-from fabric.api import local, settings
+from fabric.api import execute, local, settings, task
 
 
+@task
 def docs():
     local('make -C docs/ html')
 
 
+@task
 def autodocs():
     auto(docs)
 
 
+@task
 def test(path=None):
     path = path or 'tests/'
     local('nosetests ' + path)
 
 
+@task
 def autotest(path=None):
     auto(test, path=path)
 
 
+@task
+def coverage(path=None):
+    path = path or 'tests/'
+    local(
+        'nosetests --with-coverage --cover-package=mopidy '
+        '--cover-branches --cover-html ' + path)
+
+
+@task
+def autocoverage(path=None):
+    auto(coverage, path=path)
+
+
+@task
+def lint(path=None):
+    path = path or '.'
+    local('flake8 $(find %s -iname "*.py")' % path)
+
+
+@task
+def autolint(path=None):
+    auto(lint, path=path)
+
+
 def auto(task, *args, **kwargs):
     while True:
         local('clear')
         with settings(warn_only=True):
-            test(path)
+            execute(task, *args, **kwargs)
         local(
             'inotifywait -q -e create -e modify -e delete '
-            '--exclude ".*\.(pyc|sw.)" -r mopidy/ tests/')
+            '--exclude ".*\.(pyc|sw.)" -r docs/ mopidy/ tests/')
 
 
+@task
 def update_authors():
     # Keep authors in the order of appearance and use awk to filter out dupes
     local(
@@ -15,6 +15,9 @@ module.exports = function (grunt) {
                 minified: "../mopidy/frontends/http/data/mopidy.min.js"
             }
         },
+        buster: {
+            all: {}
+        },
         concat: {
             options: {
                 banner: "<%= meta.banner %>",
js/README.md (14 lines changed)

@@ -51,15 +51,15 @@ Building from source
 1. Install [Node.js](http://nodejs.org/) and npm. There is a PPA if you're
    running Ubuntu:
 
-   sudo apt-get install python-software-properties
-   sudo add-apt-repository ppa:chris-lea/node.js
-   sudo apt-get update
-   sudo apt-get install nodejs npm
+       sudo apt-get install python-software-properties
+       sudo add-apt-repository ppa:chris-lea/node.js
+       sudo apt-get update
+       sudo apt-get install nodejs
 
 2. Enter the `js/` in Mopidy's Git repo dir and install all dependencies:
 
-   cd js/
-   npm install
+       cd js/
+       npm install
 
 That's it.

@@ -69,7 +69,7 @@ You can now run the tests:
 
 To run tests automatically when you save a file:
 
-    npm run-script watch
+    npm start
 
 To run tests, concatenate, minify the source, and update the JavaScript files
 in `mopidy/frontends/http/data/`:
@@ -3,10 +3,10 @@
  *
  * https://github.com/busterjs/bane
  *
- * @version 0.4.0
+ * @version 1.0.0
  */
 
-((typeof define === "function" && define.amd && function (m) { define(m); }) ||
+((typeof define === "function" && define.amd && function (m) { define("bane", m); }) ||
  (typeof module === "object" && function (m) { module.exports = m(); }) ||
  function (m) { this.bane = m(); }
 )(function () {

@@ -152,7 +152,7 @@
         notifyListener(event, toNotify[i], args);
     }
 
-    toNotify = listeners(this, event).slice()
+    toNotify = listeners(this, event).slice();
     args = slice.call(arguments, 1);
     for (i = 0, l = toNotify.length; i < l; ++i) {
         notifyListener(event, toNotify[i], args);
@@ -9,27 +9,30 @@
  *
  * @author Brian Cavalier
  * @author John Hann
- * @version 2.0.0
+ * @version 2.4.0
  */
-(function(define) { 'use strict';
-define(function () {
+(function(define, global) { 'use strict';
+define(function (require) {
 
     // Public API
 
-    when.defer      = defer;      // Create a deferred
     when.promise    = promise;    // Create a pending promise
     when.resolve    = resolve;    // Create a resolved promise
    when.reject     = reject;      // Create a rejected promise
+    when.defer      = defer;      // Create a {promise, resolver} pair
 
     when.join       = join;       // Join 2 or more promises
 
     when.all        = all;        // Resolve a list of promises
     when.map        = map;        // Array.map() for promises
     when.reduce     = reduce;     // Array.reduce() for promises
+    when.settle     = settle;     // Settle a list of promises
 
     when.any        = any;        // One-winner race
     when.some       = some;       // Multi-winner race
 
-    when.isPromise  = isPromise;  // Determine if a thing is a promise
+    when.isPromise  = isPromiseLike;  // DEPRECATED: use isPromiseLike
+    when.isPromiseLike = isPromiseLike; // Is something promise-like, aka thenable
 
    /**
     * Register an observer for a promise or immediate value.
@@ -57,13 +60,35 @@ define(function () {
      * a trusted when.js promise. Any other duck-typed promise is considered
      * untrusted.
      * @constructor
+     * @param {function} sendMessage function to deliver messages to the promise's handler
+     * @param {function?} inspect function that reports the promise's state
      * @name Promise
      */
-    function Promise(then) {
-        this.then = then;
+    function Promise(sendMessage, inspect) {
+        this._message = sendMessage;
+        this.inspect = inspect;
     }
 
     Promise.prototype = {
+        /**
+         * Register handlers for this promise.
+         * @param [onFulfilled] {Function} fulfillment handler
+         * @param [onRejected] {Function} rejection handler
+         * @param [onProgress] {Function} progress handler
+         * @return {Promise} new Promise
+         */
+        then: function(onFulfilled, onRejected, onProgress) {
+            /*jshint unused:false*/
+            var args, sendMessage;
+
+            args = arguments;
+            sendMessage = this._message;
+
+            return _promise(function(resolve, reject, notify) {
+                sendMessage('when', args, resolve, notify);
+            }, this._status && this._status.observed());
+        },
+
         /**
          * Register a rejection handler. Shortcut for .then(undefined, onRejected)
          * @param {function?} onRejected
@@ -84,9 +109,7 @@ define(function () {
          * @returns {Promise}
          */
         ensure: function(onFulfilledOrRejected) {
-            var self = this;
-
-            return this.then(injectHandler, injectHandler).yield(self);
+            return this.then(injectHandler, injectHandler)['yield'](this);
 
             function injectHandler() {
                 return resolve(onFulfilledOrRejected());
@@ -107,6 +130,16 @@ define(function () {
             });
         },
 
+        /**
+         * Runs a side effect when this promise fulfills, without changing the
+         * fulfillment value.
+         * @param {function} onFulfilledSideEffect
+         * @returns {Promise}
+         */
+        tap: function(onFulfilledSideEffect) {
+            return this.then(onFulfilledSideEffect)['yield'](this);
+        },
+
         /**
          * Assumes that this promise will fulfill with an array, and arranges
          * for the onFulfilled to be called with the array as its argument list
@@ -162,13 +195,16 @@ define(function () {
     }
 
     /**
-     * Creates a new Deferred with fully isolated resolver and promise parts,
-     * either or both of which may be given out safely to consumers.
+     * Creates a {promise, resolver} pair, either or both of which
+     * may be given out safely to consumers.
      * The resolver has resolve, reject, and progress.  The promise
-     * only has then.
+     * has then plus extended promise API.
      *
      * @return {{
      * promise: Promise,
-     * resolve: function:Promise,
-     * reject: function:Promise,
-     * notify: function:Promise
+     * resolver: {
+     *   resolve: function:Promise,
+     *   reject: function:Promise,
@@ -216,12 +252,26 @@ define(function () {
 
     /**
      * Creates a new promise whose fate is determined by resolver.
-     * @private (for now)
      * @param {function} resolver function(resolve, reject, notify)
      * @returns {Promise} promise whose fate is determine by resolver
      */
     function promise(resolver) {
-        var value, handlers = [];
+        return _promise(resolver, monitorApi.PromiseStatus && monitorApi.PromiseStatus());
+    }
+
+    /**
+     * Creates a new promise, linked to parent, whose fate is determined
+     * by resolver.
+     * @param {function} resolver function(resolve, reject, notify)
+     * @param {Promise?} status promise from which the new promise is begotten
+     * @returns {Promise} promise whose fate is determine by resolver
+     * @private
+     */
+    function _promise(resolver, status) {
+        var self, value, consumers = [];
+
+        self = new Promise(_message, inspect);
+        self._status = status;
 
         // Call the provider resolver to seal the promise's fate
         try {
@@ -231,29 +281,34 @@ define(function () {
         }
 
         // Return the promise
-        return new Promise(then);
+        return self;
 
         /**
-         * Register handlers for this promise.
-         * @param [onFulfilled] {Function} fulfillment handler
-         * @param [onRejected] {Function} rejection handler
-         * @param [onProgress] {Function} progress handler
-         * @return {Promise} new Promise
+         * Private message delivery. Queues and delivers messages to
+         * the promise's ultimate fulfillment value or rejection reason.
+         * @private
+         * @param {String} type
+         * @param {Array} args
+         * @param {Function} resolve
+         * @param {Function} notify
          */
-        function then(onFulfilled, onRejected, onProgress) {
-            return promise(function(resolve, reject, notify) {
-                handlers
-                // Call handlers later, after resolution
-                ? handlers.push(function(value) {
-                    value.then(onFulfilled, onRejected, onProgress)
-                        .then(resolve, reject, notify);
-                })
-                // Call handlers soon, but not in the current stack
-                : enqueue(function() {
-                    value.then(onFulfilled, onRejected, onProgress)
-                        .then(resolve, reject, notify);
-                });
-            });
+        function _message(type, args, resolve, notify) {
+            consumers ? consumers.push(deliver) : enqueue(function() { deliver(value); });
+
+            function deliver(p) {
+                p._message(type, args, resolve, notify);
+            }
         }
 
+        /**
+         * Returns a snapshot of the promise's state at the instant inspect()
+         * is called. The returned object is not live and will not update as
+         * the promise's state changes.
+         * @returns {{ state:String, value?:*, reason?:* }} status snapshot
+         * of the promise.
+         */
+        function inspect() {
+            return value ? value.inspect() : toPendingState();
+        }
+
         /**
@@ -262,14 +317,17 @@ define(function () {
          * @param {*|Promise} val resolution value
          */
         function promiseResolve(val) {
-            if(!handlers) {
+            if(!consumers) {
                 return;
             }
 
             value = coerce(val);
-            scheduleHandlers(handlers, value);
+            scheduleConsumers(consumers, value);
+            consumers = undef;
 
-            handlers = undef;
+            if(status) {
+                updateStatus(value, status);
+            }
         }
 
         /**
@@ -285,27 +343,90 @@ define(function () {
          * @param {*} update progress event payload to pass to all listeners
          */
         function promiseNotify(update) {
-            if(handlers) {
-                scheduleHandlers(handlers, progressing(update));
+            if(consumers) {
+                scheduleConsumers(consumers, progressed(update));
             }
         }
     }
 
+    /**
+     * Creates a fulfilled, local promise as a proxy for a value
+     * NOTE: must never be exposed
+     * @param {*} value fulfillment value
+     * @returns {Promise}
+     */
+    function fulfilled(value) {
+        return near(
+            new NearFulfilledProxy(value),
+            function() { return toFulfilledState(value); }
+        );
+    }
+
+    /**
+     * Creates a rejected, local promise with the supplied reason
+     * NOTE: must never be exposed
+     * @param {*} reason rejection reason
+     * @returns {Promise}
+     */
+    function rejected(reason) {
+        return near(
+            new NearRejectedProxy(reason),
+            function() { return toRejectedState(reason); }
+        );
+    }
+
+    /**
+     * Creates a near promise using the provided proxy
+     * NOTE: must never be exposed
+     * @param {object} proxy proxy for the promise's ultimate value or reason
+     * @param {function} inspect function that returns a snapshot of the
+     *  returned near promise's state
+     * @returns {Promise}
+     */
+    function near(proxy, inspect) {
+        return new Promise(function (type, args, resolve) {
+            try {
+                resolve(proxy[type].apply(proxy, args));
+            } catch(e) {
+                resolve(rejected(e));
+            }
+        }, inspect);
+    }
+
+    /**
+     * Create a progress promise with the supplied update.
+     * @private
+     * @param {*} update
+     * @return {Promise} progress promise
+     */
+    function progressed(update) {
+        return new Promise(function (type, args, _, notify) {
+            var onProgress = args[2];
+            try {
+                notify(typeof onProgress === 'function' ? onProgress(update) : update);
+            } catch(e) {
+                notify(e);
+            }
+        });
+    }
+
     /**
      * Coerces x to a trusted Promise
      *
      * @private
      * @param {*} x thing to coerce
-     * @returns {Promise} Guaranteed to return a trusted Promise. If x
+     * @returns {*} Guaranteed to return a trusted Promise. If x
      * is trusted, returns x, otherwise, returns a new, trusted, already-resolved
      * Promise whose resolution value is:
      *   * the resolution value of x if it's a foreign promise, or
      *   * x if it's a value
      */
     function coerce(x) {
-        if(x instanceof Promise) {
+        if (x instanceof Promise) {
             return x;
-        } else if (x !== Object(x)) {
+        }
+
+        if (!(x === Object(x) && 'then' in x)) {
             return fulfilled(x);
         }
 
@@ -332,61 +453,34 @@ define(function () {
     }
 
     /**
-     * Create an already-fulfilled promise for the supplied value
-     * @private
+     * Proxy for a near, fulfilled value
      * @param {*} value
-     * @return {Promise} fulfilled promise
+     * @constructor
      */
-    function fulfilled(value) {
-        var self = new Promise(function (onFulfilled) {
-            try {
-                return typeof onFulfilled == 'function'
-                    ? coerce(onFulfilled(value)) : self;
-            } catch (e) {
-                return rejected(e);
-            }
-        });
-
-        return self;
+    function NearFulfilledProxy(value) {
+        this.value = value;
     }
 
+    NearFulfilledProxy.prototype.when = function(onResult) {
+        return typeof onResult === 'function' ? onResult(this.value) : this.value;
+    };
+
     /**
-     * Create an already-rejected promise with the supplied rejection reason.
-     * @private
+     * Proxy for a near rejection
      * @param {*} reason
-     * @return {Promise} rejected promise
+     * @constructor
      */
-    function rejected(reason) {
-        var self = new Promise(function (_, onRejected) {
-            try {
-                return typeof onRejected == 'function'
-                    ? coerce(onRejected(reason)) : self;
-            } catch (e) {
-                return rejected(e);
-            }
-        });
-
-        return self;
+    function NearRejectedProxy(reason) {
+        this.reason = reason;
     }
 
-    /**
-     * Create a progress promise with the supplied update.
-     * @private
-     * @param {*} update
-     * @return {Promise} progress promise
-     */
-    function progressing(update) {
-        var self = new Promise(function (_, __, onProgress) {
-            try {
-                return typeof onProgress == 'function'
-                    ? progressing(onProgress(update)) : self;
-            } catch (e) {
-                return progressing(e);
-            }
-        });
-
-        return self;
-    }
+    NearRejectedProxy.prototype.when = function(_, onError) {
+        if(typeof onError === 'function') {
+            return onError(this.reason);
+        } else {
+            throw this.reason;
+        }
+    };
 
     /**
      * Schedule a task that will process a list of handlers
@ -395,7 +489,7 @@ define(function () {
|
||||
* @param {Array} handlers queue of handlers to execute
|
||||
* @param {*} value passed as the only arg to each handler
|
||||
*/
|
||||
function scheduleHandlers(handlers, value) {
|
||||
function scheduleConsumers(handlers, value) {
|
||||
enqueue(function() {
|
||||
var handler, i = 0;
|
||||
while (handler = handlers[i++]) {
|
||||
@ -404,14 +498,23 @@ define(function () {
|
||||
});
|
||||
}
|
||||
|
||||
function updateStatus(value, status) {
|
||||
value.then(statusFulfilled, statusRejected);
|
||||
|
||||
function statusFulfilled() { status.fulfilled(); }
|
||||
function statusRejected(r) { status.rejected(r); }
|
||||
}
|
||||
|
||||
/**
|
||||
* Determines if promiseOrValue is a promise or not
|
||||
*
|
||||
* @param {*} promiseOrValue anything
|
||||
* @returns {boolean} true if promiseOrValue is a {@link Promise}
|
||||
* Determines if x is promise-like, i.e. a thenable object
|
||||
* NOTE: Will return true for *any thenable object*, and isn't truly
|
||||
* safe, since it may attempt to access the `then` property of x (i.e.
|
||||
* clever/malicious getters may do weird things)
|
||||
* @param {*} x anything
|
||||
* @returns {boolean} true if x is promise-like
|
||||
*/
|
||||
function isPromise(promiseOrValue) {
|
||||
return promiseOrValue && typeof promiseOrValue.then === 'function';
|
||||
function isPromiseLike(x) {
|
||||
return x && typeof x.then === 'function';
|
||||
}
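The `isPromiseLike` check introduced in this hunk is plain duck typing: any value with a callable `then` is treated as a promise, which is exactly what the NOTE in the docblock warns about. A minimal standalone sketch of the same check:

```javascript
// Duck-typing check for promise-like ("thenable") values, mirroring the
// isPromiseLike helper in the hunk above: any value with a callable
// `then` property is treated as a promise.
function isPromiseLike(x) {
    return x && typeof x.then === 'function';
}

// A plain object with a `then` method passes the check...
console.log(Boolean(isPromiseLike({ then: function () {} }))); // true
// ...while primitives and thenless values do not.
console.log(Boolean(isPromiseLike(42)));   // false
console.log(Boolean(isPromiseLike(null))); // false
```

Note that this will also report `true` for any foreign thenable (e.g. a jQuery Deferred), which is deliberate: it is what lets the library assimilate promises from other implementations.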

/**
@@ -423,17 +526,15 @@ define(function () {
 * @param {Array} promisesOrValues array of anything, may contain a mix
 * of promises and values
 * @param howMany {number} number of promisesOrValues to resolve
 * @param {function?} [onFulfilled] resolution handler
 * @param {function?} [onRejected] rejection handler
 * @param {function?} [onProgress] progress handler
 * @param {function?} [onFulfilled] DEPRECATED, use returnedPromise.then()
 * @param {function?} [onRejected] DEPRECATED, use returnedPromise.then()
 * @param {function?} [onProgress] DEPRECATED, use returnedPromise.then()
 * @returns {Promise} promise that will resolve to an array of howMany values that
 * resolved first, or will reject with an array of
 * (promisesOrValues.length - howMany) + 1 rejection reasons.
 */
function some(promisesOrValues, howMany, onFulfilled, onRejected, onProgress) {

    checkCallbacks(2, arguments);

    return when(promisesOrValues, function(promisesOrValues) {

        return promise(resolveSome).then(onFulfilled, onRejected, onProgress);
@@ -457,7 +558,7 @@ define(function () {
        rejectOne = function(reason) {
            reasons.push(reason);
            if(!--toReject) {
                fulfillOne = rejectOne = noop;
                fulfillOne = rejectOne = identity;
                reject(reasons);
            }
        };
@@ -466,7 +567,7 @@ define(function () {
            // This orders the values based on promise resolution order
            values.push(val);
            if (!--toResolve) {
                fulfillOne = rejectOne = noop;
                fulfillOne = rejectOne = identity;
                resolve(values);
            }
        };
@@ -496,9 +597,9 @@ define(function () {
 *
 * @param {Array|Promise} promisesOrValues array of anything, may contain a mix
 * of {@link Promise}s and values
 * @param {function?} [onFulfilled] resolution handler
 * @param {function?} [onRejected] rejection handler
 * @param {function?} [onProgress] progress handler
 * @param {function?} [onFulfilled] DEPRECATED, use returnedPromise.then()
 * @param {function?} [onRejected] DEPRECATED, use returnedPromise.then()
 * @param {function?} [onProgress] DEPRECATED, use returnedPromise.then()
 * @returns {Promise} promise that will resolve to the value that resolved first, or
 * will reject with an array of all rejected inputs.
 */
@@ -519,14 +620,13 @@ define(function () {
 *
 * @param {Array|Promise} promisesOrValues array of anything, may contain a mix
 * of {@link Promise}s and values
 * @param {function?} [onFulfilled] resolution handler
 * @param {function?} [onRejected] rejection handler
 * @param {function?} [onProgress] progress handler
 * @param {function?} [onFulfilled] DEPRECATED, use returnedPromise.then()
 * @param {function?} [onRejected] DEPRECATED, use returnedPromise.then()
 * @param {function?} [onProgress] DEPRECATED, use returnedPromise.then()
 * @returns {Promise}
 */
function all(promisesOrValues, onFulfilled, onRejected, onProgress) {
    checkCallbacks(1, arguments);
    return map(promisesOrValues, identity).then(onFulfilled, onRejected, onProgress);
    return _map(promisesOrValues, identity).then(onFulfilled, onRejected, onProgress);
}

/**
@@ -535,28 +635,49 @@ define(function () {
 * have fulfilled, or will reject when *any one* of the input promises rejects.
 */
function join(/* ...promises */) {
    return map(arguments, identity);
    return _map(arguments, identity);
}

/**
 * Traditional map function, similar to `Array.prototype.map()`, but allows
 * input to contain {@link Promise}s and/or values, and mapFunc may return
 * either a value or a {@link Promise}
 *
 * @param {Array|Promise} array array of anything, may contain a mix
 * of {@link Promise}s and values
 * @param {function} mapFunc mapping function mapFunc(value) which may return
 * either a {@link Promise} or value
 * @returns {Promise} a {@link Promise} that will resolve to an array containing
 * the mapped output values.
 * Settles all input promises such that they are guaranteed not to
 * be pending once the returned promise fulfills. The returned promise
 * will always fulfill, except in the case where `array` is a promise
 * that rejects.
 * @param {Array|Promise} array or promise for array of promises to settle
 * @returns {Promise} promise that always fulfills with an array of
 * outcome snapshots for each input promise.
 */
function settle(array) {
    return _map(array, toFulfilledState, toRejectedState);
}

/**
 * Promise-aware array map function, similar to `Array.prototype.map()`,
 * but input array may contain promises or values.
 * @param {Array|Promise} array array of anything, may contain promises and values
 * @param {function} mapFunc map function which may return a promise or value
 * @returns {Promise} promise that will fulfill with an array of mapped values
 * or reject if any input promise rejects.
 */
function map(array, mapFunc) {
    return _map(array, mapFunc);
}

/**
 * Internal map that allows a fallback to handle rejections
 * @param {Array|Promise} array array of anything, may contain promises and values
 * @param {function} mapFunc map function which may return a promise or value
 * @param {function?} fallback function to handle rejected promises
 * @returns {Promise} promise that will fulfill with an array of mapped values
 * or reject if any input promise rejects.
 */
function _map(array, mapFunc, fallback) {
    return when(array, function(array) {

        return promise(resolveMap);
        return _promise(resolveMap);

        function resolveMap(resolve, reject, notify) {
            var results, len, toResolve, resolveOne, i;
            var results, len, toResolve, i;

            // Since we know the resulting length, we can preallocate the results
            // array to avoid array expansions.
@@ -565,27 +686,28 @@ define(function () {

            if(!toResolve) {
                resolve(results);
            } else {
                return;
            }

            resolveOne = function(item, i) {
                when(item, mapFunc).then(function(mapped) {
                    results[i] = mapped;

                    if(!--toResolve) {
                        resolve(results);
                    }
                }, reject, notify);
            };

            // Since mapFunc may be async, get all invocations of it into flight
            for(i = 0; i < len; i++) {
                if(i in array) {
                    resolveOne(array[i], i);
                } else {
                    --toResolve;
                }
            // Since mapFunc may be async, get all invocations of it into flight
            for(i = 0; i < len; i++) {
                if(i in array) {
                    resolveOne(array[i], i);
                } else {
                    --toResolve;
                }
            }

            function resolveOne(item, i) {
                when(item, mapFunc, fallback).then(function(mapped) {
                    results[i] = mapped;
                    notify(mapped);

                    if(!--toResolve) {
                        resolve(results);
                    }
                }, reject);
            }
        }
    });
}
@@ -625,12 +747,46 @@ define(function () {
    });
}

// Snapshot states

/**
 * Creates a fulfilled state snapshot
 * @private
 * @param {*} x any value
 * @returns {{state:'fulfilled',value:*}}
 */
function toFulfilledState(x) {
    return { state: 'fulfilled', value: x };
}

/**
 * Creates a rejected state snapshot
 * @private
 * @param {*} x any reason
 * @returns {{state:'rejected',reason:*}}
 */
function toRejectedState(x) {
    return { state: 'rejected', reason: x };
}

/**
 * Creates a pending state snapshot
 * @private
 * @returns {{state:'pending'}}
 */
function toPendingState() {
    return { state: 'pending' };
}
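The three snapshot constructors added here give `settle` a uniform outcome shape per input promise, so callers can branch on `state` instead of wrapping each promise in its own error handler. A small self-contained sketch of that consumption pattern (the `outcomes` array here is a hypothetical example built directly from the constructors, not real library output):

```javascript
// Outcome snapshot constructors, as defined in the hunk above.
function toFulfilledState(x) { return { state: 'fulfilled', value: x }; }
function toRejectedState(x)  { return { state: 'rejected', reason: x }; }

// Hypothetical settled result: one success, one failure.
var outcomes = [toFulfilledState(42), toRejectedState(new Error('boom'))];

// Consumers filter on `state` to pull out the successful values.
var values = outcomes
    .filter(function (o) { return o.state === 'fulfilled'; })
    .map(function (o) { return o.value; });

console.log(values); // [ 42 ]
```

This is why the promise returned by `settle` "always fulfills": failures are reified as data rather than propagated as rejections.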

//
// Utilities, etc.
// Internals, utilities, etc.
//

var reduceArray, slice, fcall, nextTick, handlerQueue,
    timeout, funcProto, call, arrayProto, undef;
    setTimeout, funcProto, call, arrayProto, monitorApi,
    cjsRequire, undef;

cjsRequire = require;

//
// Shared handler queue processing
@@ -648,20 +804,13 @@ define(function () {
 */
function enqueue(task) {
    if(handlerQueue.push(task) === 1) {
        scheduleDrainQueue();
        nextTick(drainQueue);
    }
}

/**
 * Schedule the queue to be drained in the next tick.
 */
function scheduleDrainQueue() {
    nextTick(drainQueue);
}

/**
 * Drain the handler queue entirely or partially, being careful to allow
 * the queue to be extended while it is being processed, and to continue
 * Drain the handler queue entirely, being careful to allow the
 * queue to be extended while it is being processed, and to continue
 * processing until it is truly empty.
 */
function drainQueue() {
@@ -674,20 +823,36 @@ define(function () {
    handlerQueue = [];
}
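The queue logic above schedules one drain per burst: only the push that takes the queue length to 1 triggers `nextTick(drainQueue)`; every later push in the same turn rides along on that single flush. A standalone sketch of the pattern (the `nextTick` parameter is added here purely for testability; the library captures it from the environment, and the drain body is reconstructed since the hunk truncates it):

```javascript
var handlerQueue = [];

// Enqueue a task; only the first task pushed onto an empty queue
// schedules a drain, so all tasks queued in the same turn are
// batched into one asynchronous flush.
function enqueue(task, nextTick) {
    if (handlerQueue.push(task) === 1) {
        nextTick(drainQueue);
    }
}

// Run every queued task, allowing the queue to grow while draining.
function drainQueue() {
    var i = 0, task;
    while ((task = handlerQueue[i++])) {
        task();
    }
    handlerQueue = [];
}

// Demonstration with a fake deferring nextTick:
var pending = [];
function fakeTick(fn) { pending.push(fn); }

var ran = [];
enqueue(function () { ran.push('a'); }, fakeTick);
enqueue(function () { ran.push('b'); }, fakeTick);

console.log(pending.length); // 1 -- one drain scheduled for both tasks
pending[0]();
console.log(ran); // [ 'a', 'b' ]
```

Batching like this keeps handler dispatch asynchronous (as Promises/A+ requires) without paying one timer per handler.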

//
// Capture function and array utils
//
/*global setImmediate:true*/
// capture setTimeout to avoid being caught by fake timers
// used in time based tests
setTimeout = global.setTimeout;

// capture setTimeout to avoid being caught by fake timers used in time based tests
timeout = setTimeout;
nextTick = typeof setImmediate === 'function'
    ? typeof window === 'undefined'
        ? setImmediate
        : setImmediate.bind(window)
    : typeof process === 'object'
        ? process.nextTick
        : function(task) { timeout(task, 0); };
// Allow attaching the monitor to when() if env has no console
monitorApi = typeof console != 'undefined' ? console : when;

// Prefer setImmediate or MessageChannel, cascade to node,
// vertx and finally setTimeout
/*global setImmediate,MessageChannel,process*/
if (typeof setImmediate === 'function') {
    nextTick = setImmediate.bind(global);
} else if(typeof MessageChannel !== 'undefined') {
    var channel = new MessageChannel();
    channel.port1.onmessage = drainQueue;
    nextTick = function() { channel.port2.postMessage(0); };
} else if (typeof process === 'object' && process.nextTick) {
    nextTick = process.nextTick;
} else {
    try {
        // vert.x 1.x || 2.x
        nextTick = cjsRequire('vertx').runOnLoop || cjsRequire('vertx').runOnContext;
    } catch(ignore) {
        nextTick = function(t) { setTimeout(t, 0); };
    }
}

//
// Capture/polyfill function and array utils
//

// Safe function calls
funcProto = Function.prototype;
@@ -748,40 +913,10 @@ define(function () {
    return reduced;
};

//
// Utility functions
//

/**
 * Helper that checks arrayOfCallbacks to ensure that each element is either
 * a function, or null or undefined.
 * @private
 * @param {number} start index at which to start checking items in arrayOfCallbacks
 * @param {Array} arrayOfCallbacks array to check
 * @throws {Error} if any element of arrayOfCallbacks is something other than
 * a function, null, or undefined.
 */
function checkCallbacks(start, arrayOfCallbacks) {
    // TODO: Promises/A+ update type checking and docs
    var arg, i = arrayOfCallbacks.length;

    while(i > start) {
        arg = arrayOfCallbacks[--i];

        if (arg != null && typeof arg != 'function') {
            throw new Error('arg '+i+' must be a function');
        }
    }
}

function noop() {}

function identity(x) {
    return x;
}

return when;
});
})(
    typeof define === 'function' && define.amd ? define : function (factory) { module.exports = factory(); }
);
})(typeof define === 'function' && define.amd ? define : function (factory) { module.exports = factory(require); }, this);
@@ -1,6 +1,6 @@
{
    "name": "mopidy",
    "version": "0.1.0",
    "version": "0.1.1",
    "description": "Client lib for controlling a Mopidy music server over a WebSocket",
    "homepage": "http://www.mopidy.com/",
    "author": {
@@ -14,23 +14,23 @@
    },
    "main": "src/mopidy.js",
    "dependencies": {
        "bane": "~0.4.0",
        "faye-websocket": "~0.4.4",
        "when": "~2.0.0"
        "bane": "~1.0.0",
        "faye-websocket": "~0.7.0",
        "when": "~2.4.0"
    },
    "devDependencies": {
        "buster": "~0.6.12",
        "grunt": "~0.4.0",
        "grunt-buster": "~0.1.2",
        "grunt-contrib-concat": "~0.1.3",
        "grunt-contrib-jshint": "~0.2.0",
        "grunt-contrib-uglify": "~0.1.2",
        "grunt-contrib-watch": "~0.3.1",
        "phantomjs": "~1.8.2"
        "buster": "~0.6.13",
        "grunt": "~0.4.1",
        "grunt-buster": "~0.2.1",
        "grunt-contrib-concat": "~0.3.0",
        "grunt-contrib-jshint": "~0.6.4",
        "grunt-contrib-uglify": "~0.2.4",
        "grunt-contrib-watch": "~0.5.3",
        "phantomjs": "~1.9.2-0"
    },
    "scripts": {
        "test": "grunt test",
        "build": "grunt build",
        "watch": "grunt watch"
        "start": "grunt watch"
    }
}

@@ -1,17 +1,15 @@
from __future__ import unicode_literals

# pylint: disable = E0611,F0401
from distutils.version import StrictVersion as SV
# pylint: enable = E0611,F0401
import sys
import warnings

import pykka


if not (2, 6) <= sys.version_info < (3,):
if not (2, 7) <= sys.version_info < (3,):
    sys.exit(
        'Mopidy requires Python >= 2.6, < 3, but found %s' %
        'Mopidy requires Python >= 2.7, < 3, but found %s' %
        '.'.join(map(str, sys.version_info[:3])))

if (isinstance(pykka.__version__, basestring)

@@ -1,7 +1,6 @@
from __future__ import unicode_literals

import logging
import optparse
import os
import signal
import sys
@@ -18,17 +17,11 @@ mopidy_args = sys.argv[1:]
sys.argv[1:] = []


# Add ../ to the path so we can run Mopidy from a Git checkout without
# installing it on the system.
sys.path.insert(
    0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../')))


from mopidy import ext
from mopidy import commands, ext
from mopidy.audio import Audio
from mopidy import config as config_lib
from mopidy.core import Core
from mopidy.utils import deps, log, path, process, versioning
from mopidy.utils import log, path, process

logger = logging.getLogger('mopidy.main')

@@ -37,33 +30,36 @@ def main():
    signal.signal(signal.SIGTERM, process.exit_handler)
    signal.signal(signal.SIGUSR1, pykka.debug.log_thread_tracebacks)

    loop = gobject.MainLoop()
    options = parse_options()
    config_files = options.config.split(b':')
    config_overrides = options.overrides

    enabled_extensions = []  # Make sure it is defined before the finally block
    logging_initialized = False
    args = commands.parser.parse_args(args=mopidy_args)
    if args.show_config:
        commands.show_config(args)
    if args.show_deps:
        commands.show_deps()

    # TODO: figure out a way to make the boilerplate in this file reusable in
    # scanner and other places we need it.

    try:
        # Initial config without extensions to bootstrap logging.
        logging_config, _ = config_lib.load(config_files, [], config_overrides)
        logging_initialized = False
        logging_config, _ = config_lib.load(
            args.config_files, [], args.config_overrides)

        # TODO: setup_logging needs defaults in case config values are None
        log.setup_logging(
            logging_config, options.verbosity_level, options.save_debug_log)
            logging_config, args.verbosity_level, args.save_debug_log)
        logging_initialized = True

        create_file_structures()
        check_old_locations()

        installed_extensions = ext.load_extensions()

        # TODO: wrap config in RO proxy.
        config, config_errors = config_lib.load(
            config_files, installed_extensions, config_overrides)
            args.config_files, installed_extensions, args.config_overrides)

        # Filter out disabled extensions and remove any config errors for them.
        enabled_extensions = []
        for extension in installed_extensions:
            enabled = config[extension.ext_name]['enabled']
            if ext.validate_extension(extension) and enabled:
@@ -78,31 +74,38 @@ def main():
        proxied_config = config_lib.Proxy(config)

        log.setup_log_levels(proxied_config)
        create_file_structures()
        check_old_locations()
        ext.register_gstreamer_elements(enabled_extensions)

        # Anything that wants to exit after this point must use
        # mopidy.utils.process.exit_process as actors have been started.
        audio = setup_audio(proxied_config)
        backends = setup_backends(proxied_config, enabled_extensions, audio)
        core = setup_core(audio, backends)
        setup_frontends(proxied_config, enabled_extensions, core)
        loop.run()
        start(proxied_config, enabled_extensions)
    except KeyboardInterrupt:
        if logging_initialized:
            logger.info('Interrupted. Exiting...')
        pass
    except Exception as ex:
        if logging_initialized:
            logger.exception(ex)
        raise
    finally:
        loop.quit()
        stop_frontends(enabled_extensions)
        stop_core()
        stop_backends(enabled_extensions)
        stop_audio()
        process.stop_remaining_actors()


def create_file_structures():
    path.get_or_create_dir(b'$XDG_DATA_DIR/mopidy')
    path.get_or_create_file(b'$XDG_CONFIG_DIR/mopidy/mopidy.conf')


def check_old_locations():
    dot_mopidy_dir = path.expand_path(b'~/.mopidy')
    if os.path.isdir(dot_mopidy_dir):
        logger.warning(
            'Old Mopidy dot dir found at %s. Please migrate your config to '
            'the ini-file based config format. See release notes for further '
            'instructions.', dot_mopidy_dir)

    old_settings_file = path.expand_path(b'$XDG_CONFIG_DIR/mopidy/settings.py')
    if os.path.isfile(old_settings_file):
        logger.warning(
            'Old Mopidy settings file found at %s. Please migrate your '
            'config to the ini-file based config format. See release notes '
            'for further instructions.', old_settings_file)


def log_extension_info(all_extensions, enabled_extensions):
@@ -124,102 +127,27 @@ def check_config_errors(errors):
    sys.exit(1)


def check_config_override(option, opt, override):
def start(config, extensions):
    loop = gobject.MainLoop()
    try:
        return config_lib.parse_override(override)
    except ValueError:
        raise optparse.OptionValueError(
            'option %s: must have the format section/key=value' % opt)
        audio = start_audio(config)
        backends = start_backends(config, extensions, audio)
        core = start_core(audio, backends)
        start_frontends(config, extensions, core)
        loop.run()
    except KeyboardInterrupt:
        logger.info('Interrupted. Exiting...')
        return
    finally:
        loop.quit()
        stop_frontends(extensions)
        stop_core()
        stop_backends(extensions)
        stop_audio()
        process.stop_remaining_actors()


def parse_options():
    parser = optparse.OptionParser(
        version='Mopidy %s' % versioning.get_version())

    # Ugly extension of optparse type checking magic :/
    optparse.Option.TYPES += ('config_override',)
    optparse.Option.TYPE_CHECKER['config_override'] = check_config_override

    # NOTE First argument to add_option must be bytestrings on Python < 2.6.2
    # See https://github.com/mopidy/mopidy/issues/302 for details
    parser.add_option(
        b'-q', '--quiet',
        action='store_const', const=0, dest='verbosity_level',
        help='less output (warning level)')
    parser.add_option(
        b'-v', '--verbose',
        action='count', default=1, dest='verbosity_level',
        help='more output (debug level)')
    parser.add_option(
        b'--save-debug-log',
        action='store_true', dest='save_debug_log',
        help='save debug log to "./mopidy.log"')
    parser.add_option(
        b'--show-config',
        action='callback', callback=show_config_callback,
        help='show current config')
    parser.add_option(
        b'--show-deps',
        action='callback', callback=deps.show_deps_optparse_callback,
        help='show dependencies and their versions')
    parser.add_option(
        b'--config',
        action='store', dest='config',
        default=b'$XDG_CONFIG_DIR/mopidy/mopidy.conf',
        help='config files to use, colon separated, later files override')
    parser.add_option(
        b'-o', b'--option',
        action='append', dest='overrides', type='config_override',
        help='`section/key=value` values to override config options')
    return parser.parse_args(args=mopidy_args)[0]


def show_config_callback(option, opt, value, parser):
    # TODO: don't use callback for this as --config or -o set after
    # --show-config will be ignored.
    files = getattr(parser.values, 'config', b'').split(b':')
    overrides = getattr(parser.values, 'overrides', [])

    extensions = ext.load_extensions()
    config, errors = config_lib.load(files, extensions, overrides)

    # Clear out any config for disabled extensions.
    for extension in extensions:
        if not ext.validate_extension(extension):
            config[extension.ext_name] = {b'enabled': False}
            errors[extension.ext_name] = {
                b'enabled': b'extension disabled itself.'}
        elif not config[extension.ext_name]['enabled']:
            config[extension.ext_name] = {b'enabled': False}
            errors[extension.ext_name] = {
                b'enabled': b'extension disabled by config.'}

    print config_lib.format(config, extensions, errors)
    sys.exit(0)


def check_old_locations():
    dot_mopidy_dir = path.expand_path(b'~/.mopidy')
    if os.path.isdir(dot_mopidy_dir):
        logger.warning(
            'Old Mopidy dot dir found at %s. Please migrate your config to '
            'the ini-file based config format. See release notes for further '
            'instructions.', dot_mopidy_dir)

    old_settings_file = path.expand_path(b'$XDG_CONFIG_DIR/mopidy/settings.py')
    if os.path.isfile(old_settings_file):
        logger.warning(
            'Old Mopidy settings file found at %s. Please migrate your '
            'config to the ini-file based config format. See release notes '
            'for further instructions.', old_settings_file)


def create_file_structures():
    path.get_or_create_dir(b'$XDG_DATA_DIR/mopidy')
    path.get_or_create_file(b'$XDG_CONFIG_DIR/mopidy/mopidy.conf')


def setup_audio(config):
def start_audio(config):
    logger.info('Starting Mopidy audio')
    return Audio.start(config=config).proxy()

@@ -229,7 +157,7 @@ def stop_audio():
    process.stop_actors_by_class(Audio)


def setup_backends(config, extensions, audio):
def start_backends(config, extensions, audio):
    backend_classes = []
    for extension in extensions:
        backend_classes.extend(extension.get_backend_classes())
@@ -253,7 +181,7 @@ def stop_backends(extensions):
        process.stop_actors_by_class(backend_class)


def setup_core(audio, backends):
def start_core(audio, backends):
    logger.info('Starting Mopidy core')
    return Core.start(audio=audio, backends=backends).proxy()

@@ -263,7 +191,7 @@ def stop_core():
    process.stop_actors_by_class(Core)


def setup_frontends(config, extensions, core):
def start_frontends(config, extensions, core):
    frontend_classes = []
    for extension in extensions:
        frontend_classes.extend(extension.get_frontend_classes())

@@ -22,6 +22,22 @@ mixers.register_mixers()

MB = 1 << 20

# GST_PLAY_FLAG_VIDEO (1<<0)
# GST_PLAY_FLAG_AUDIO (1<<1)
# GST_PLAY_FLAG_TEXT (1<<2)
# GST_PLAY_FLAG_VIS (1<<3)
# GST_PLAY_FLAG_SOFT_VOLUME (1<<4)
# GST_PLAY_FLAG_NATIVE_AUDIO (1<<5)
# GST_PLAY_FLAG_NATIVE_VIDEO (1<<6)
# GST_PLAY_FLAG_DOWNLOAD (1<<7)
# GST_PLAY_FLAG_BUFFERING (1<<8)
# GST_PLAY_FLAG_DEINTERLACE (1<<9)
# GST_PLAY_FLAG_SOFT_COLORBALANCE (1<<10)

# Default flags to use for playbin: AUDIO, SOFT_VOLUME, DOWNLOAD
PLAYBIN_FLAGS = (1 << 1) | (1 << 4) | (1 << 7)
PLAYBIN_VIS_FLAGS = PLAYBIN_FLAGS | (1 << 3)
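The two constants introduced above are plain bit masks over the GStreamer playbin flag table in the comment. The arithmetic (written here in JavaScript only to keep all examples in this commit in one language; the source file itself is Python, where the expressions are identical) works out as follows:

```javascript
// GStreamer playbin flag bits, per the GST_PLAY_FLAG_* table above.
var AUDIO       = 1 << 1;  // 2
var VIS         = 1 << 3;  // 8
var SOFT_VOLUME = 1 << 4;  // 16
var DOWNLOAD    = 1 << 7;  // 128

// Default flags: decode audio, use software volume, progressive download.
var PLAYBIN_FLAGS = AUDIO | SOFT_VOLUME | DOWNLOAD;
console.log(PLAYBIN_FLAGS); // 146

// Enabling the visualizer just ORs in one more bit.
var PLAYBIN_VIS_FLAGS = PLAYBIN_FLAGS | VIS;
console.log(PLAYBIN_VIS_FLAGS); // 154
```

Keeping the visualizer bit separate lets `_setup_visualizer` switch flag sets with a single property write instead of recomputing the mask.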
|
||||
|
||||
|
||||
class Audio(pykka.ThreadingActor):
|
||||
"""
|
||||
@ -55,6 +71,7 @@ class Audio(pykka.ThreadingActor):
|
||||
try:
|
||||
self._setup_playbin()
|
||||
self._setup_output()
|
||||
self._setup_visualizer()
|
||||
self._setup_mixer()
|
||||
self._setup_message_processor()
|
||||
except gobject.GError as ex:
|
||||
@ -78,9 +95,7 @@ class Audio(pykka.ThreadingActor):
|
||||
|
||||
def _setup_playbin(self):
|
||||
playbin = gst.element_factory_make('playbin2')
|
||||
|
||||
fakesink = gst.element_factory_make('fakesink')
|
||||
playbin.set_property('video-sink', fakesink)
|
||||
playbin.set_property('flags', PLAYBIN_FLAGS)
|
||||
|
||||
self._connect(playbin, 'about-to-finish', self._on_about_to_finish)
|
||||
self._connect(playbin, 'notify::source', self._on_new_source)
|
||||
@ -149,6 +164,20 @@ class Audio(pykka.ThreadingActor):
|
||||
'Failed to create audio output "%s": %s', output_desc, ex)
|
||||
process.exit_process()
|
||||
|
||||
def _setup_visualizer(self):
|
||||
visualizer_element = self._config['audio']['visualizer']
|
||||
if not visualizer_element:
|
||||
return
|
||||
try:
|
||||
visualizer = gst.element_factory_make(visualizer_element)
|
||||
self._playbin.set_property('vis-plugin', visualizer)
|
||||
self._playbin.set_property('flags', PLAYBIN_VIS_FLAGS)
|
||||
logger.info('Audio visualizer set to "%s"', visualizer_element)
|
||||
except gobject.GError as ex:
|
||||
logger.error(
|
||||
'Failed to create audio visualizer "%s": %s',
|
||||
visualizer_element, ex)
|
||||
|
||||
def _setup_mixer(self):
|
||||
mixer_desc = self._config['audio']['mixer']
|
||||
track_desc = self._config['audio']['mixer_track']
|
||||
@ -193,7 +222,8 @@ class Audio(pykka.ThreadingActor):
|
||||
self._mixer_track.min_volume, self._mixer_track.max_volume)
|
||||
logger.info(
|
||||
'Audio mixer set to "%s" using track "%s"',
|
||||
mixer.get_factory().get_name(), track.label)
|
||||
str(mixer.get_factory().get_name()).decode('utf-8'),
|
||||
str(track.label).decode('utf-8'))
|
||||
|
||||
def _select_mixer_track(self, mixer, track_label):
|
||||
# Ignore tracks without volumes, then look for track with
|
||||
|
||||
@ -29,9 +29,7 @@ class AutoAudioMixer(gst.Bin):
|
||||
gst.Bin.__init__(self)
|
||||
mixer = self._find_mixer()
|
||||
if mixer:
|
||||
# pylint: disable=E1101
|
||||
self.add(mixer)
|
||||
# pylint: enable=E1101
|
||||
logger.debug('AutoAudioMixer chose: %s', mixer.get_name())
|
||||
else:
|
||||
logger.debug('AutoAudioMixer did not find any usable mixers')
|
||||
|
||||
@ -53,6 +53,7 @@ class BaseLibraryProvider(object):
|
||||
def __init__(self, backend):
|
||||
self.backend = backend
|
||||
|
||||
# TODO: replace with search(query, exact=True, ...)
|
||||
def find_exact(self, query=None, uris=None):
|
||||
"""
|
||||
See :meth:`mopidy.core.LibraryController.find_exact`.
|
||||
@ -86,6 +87,40 @@ class BaseLibraryProvider(object):
|
||||
pass
|
||||
|
||||
|
||||
class BaseLibraryUpdateProvider(object):
|
||||
uri_schemes = []
|
||||
|
||||
def load(self):
|
||||
"""Loads the library and returns all tracks in it.
|
||||
|
||||
*MUST be implemented by subclass.*
|
||||
"""
|
||||
raise NotImplementedError
|
||||
|
||||
def add(self, track):
|
||||
"""Adds given track to library.
|
||||
|
||||
Overwrites any existing track with same URI.
|
||||
|
||||
*MUST be implemented by subclass.*
|
||||
"""
|
||||
raise NotImplementedError
|
||||
|
||||
def remove(self, uri):
|
||||
"""Removes given track from library.
|
||||
|
||||
*MUST be implemented by subclass.*
|
||||
"""
|
||||
raise NotImplementedError
|
||||
|
||||
def commit(self):
|
||||
"""Persist changes to library.
|
||||
|
||||
*MAY be implemented by subclass.*
|
||||
"""
|
||||
pass
|
||||
|
||||
|
||||
class BasePlaybackProvider(object):
|
||||
"""
|
||||
:param audio: the audio actor
|
||||
@ -121,9 +156,22 @@ class BasePlaybackProvider(object):
|
||||
:rtype: :class:`True` if successful, else :class:`False`
|
||||
"""
|
||||
self.audio.prepare_change()
|
||||
self.audio.set_uri(track.uri).get()
|
||||
self.change_track(track)
|
||||
return self.audio.start_playback().get()
|
||||
|
||||
def change_track(self, track):
|
||||
"""
|
||||
Swith to provided track.
|
||||
|
||||
*MAY be reimplemented by subclass.*
|
||||
|
||||
:param track: the track to play
|
||||
:type track: :class:`mopidy.models.Track`
|
||||
:rtype: :class:`True` if successful, else :class:`False`
|
||||
"""
|
||||
self.audio.set_uri(track.uri).get()
|
||||
return True
|
||||
|
||||
def resume(self):
|
||||
"""
|
||||
Resume playback at the same time position playback was paused.
|
||||
|
||||
@@ -21,6 +21,7 @@ class Extension(ext.Extension):
        schema['media_dir'] = config.Path()
        schema['playlists_dir'] = config.Path()
        schema['tag_cache_file'] = config.Path()
        schema['scan_timeout'] = config.Integer(minimum=0)
        return schema

    def validate_environment(self):
@@ -29,3 +30,7 @@ class Extension(ext.Extension):
    def get_backend_classes(self):
        from .actor import LocalBackend
        return [LocalBackend]

    def get_library_updaters(self):
        from .library import LocalLibraryUpdateProvider
        return [LocalLibraryUpdateProvider]

@@ -10,6 +10,7 @@ from mopidy.utils import encoding, path

from .library import LocalLibraryProvider
from .playlists import LocalPlaylistsProvider
from .playback import LocalPlaybackProvider

logger = logging.getLogger('mopidy.backends.local')

@@ -23,10 +24,10 @@ class LocalBackend(pykka.ThreadingActor, base.Backend):
        self.check_dirs_and_files()

        self.library = LocalLibraryProvider(backend=self)
        self.playback = base.BasePlaybackProvider(audio=audio, backend=self)
        self.playback = LocalPlaybackProvider(audio=audio, backend=self)
        self.playlists = LocalPlaylistsProvider(backend=self)

        self.uri_schemes = ['file']
        self.uri_schemes = ['local']

    def check_dirs_and_files(self):
        if not os.path.isdir(self.config['local']['media_dir']):

@@ -3,3 +3,4 @@ enabled = true
media_dir = $XDG_MUSIC_DIR
playlists_dir = $XDG_DATA_DIR/mopidy/local/playlists
tag_cache_file = $XDG_DATA_DIR/mopidy/local/tag_cache
scan_timeout = 1000
@@ -1,7 +1,11 @@
from __future__ import unicode_literals

import logging
import os
import tempfile

from mopidy.backends import base
from mopidy.frontends.mpd import translator as mpd_translator
from mopidy.models import Album, SearchResult

from .translator import parse_mpd_tag_cache
@@ -77,7 +81,8 @@ class LocalLibraryProvider(base.BaseLibraryProvider):
                result_tracks = filter(any_filter, result_tracks)
            else:
                raise LookupError('Invalid lookup field: %s' % field)
        return SearchResult(uri='file:search', tracks=result_tracks)
        # TODO: add local:search:<query>
        return SearchResult(uri='local:search', tracks=result_tracks)

    def search(self, query=None, uris=None):
        # TODO Only return results within URI roots given by ``uris``
@@ -118,7 +123,8 @@ class LocalLibraryProvider(base.BaseLibraryProvider):
                result_tracks = filter(any_filter, result_tracks)
            else:
                raise LookupError('Invalid lookup field: %s' % field)
        return SearchResult(uri='file:search', tracks=result_tracks)
        # TODO: add local:search:<query>
        return SearchResult(uri='local:search', tracks=result_tracks)

    def _validate_query(self, query):
        for (_, values) in query.iteritems():
@@ -127,3 +133,46 @@ class LocalLibraryProvider(base.BaseLibraryProvider):
            for value in values:
                if not value:
                    raise LookupError('Missing query')


# TODO: rename and move to tagcache extension.
class LocalLibraryUpdateProvider(base.BaseLibraryProvider):
    uri_schemes = ['local']

    def __init__(self, config):
        self._tracks = {}
        self._media_dir = config['local']['media_dir']
        self._tag_cache_file = config['local']['tag_cache_file']

    def load(self):
        tracks = parse_mpd_tag_cache(self._tag_cache_file, self._media_dir)
        for track in tracks:
            self._tracks[track.uri] = track
        return tracks

    def add(self, track):
        self._tracks[track.uri] = track

    def remove(self, uri):
        if uri in self._tracks:
            del self._tracks[uri]

    def commit(self):
        directory, basename = os.path.split(self._tag_cache_file)

        # TODO: cleanup directory/basename.* files.
        tmp = tempfile.NamedTemporaryFile(
            prefix=basename + '.', dir=directory, delete=False)

        try:
            for row in mpd_translator.tracks_to_tag_cache_format(
                    self._tracks.values(), self._media_dir):
                if len(row) == 1:
                    tmp.write(('%s\n' % row).encode('utf-8'))
                else:
                    tmp.write(('%s: %s\n' % row).encode('utf-8'))

            os.rename(tmp.name, self._tag_cache_file)
        finally:
            if os.path.exists(tmp.name):
                os.remove(tmp.name)
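The `commit()` hunk above uses the write-to-temp-file-then-rename pattern: the new tag cache is written to a sibling temporary file and renamed over the target, so a crash mid-write never leaves a half-written file (`os.rename` is atomic on POSIX within one filesystem). A standalone sketch of that pattern, with a hypothetical `atomic_write` helper that is not Mopidy API:

```python
import os
import tempfile


def atomic_write(path, lines):
    """Write lines to path atomically: write a sibling temp file in the
    same directory, then rename it over the target."""
    directory, basename = os.path.split(path)
    tmp = tempfile.NamedTemporaryFile(
        prefix=basename + '.', dir=directory or '.', delete=False)
    try:
        for line in lines:
            tmp.write((line + '\n').encode('utf-8'))
        tmp.close()
        os.rename(tmp.name, path)   # atomic replace on POSIX
    finally:
        # On success the temp name no longer exists; on failure, clean up.
        if os.path.exists(tmp.name):
            os.remove(tmp.name)
```

Keeping the temporary file in the same directory as the target (rather than the system temp dir) is what guarantees both live on the same filesystem, which the atomic rename requires.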
mopidy/backends/local/playback.py (new file, 19 lines)
@@ -0,0 +1,19 @@
from __future__ import unicode_literals

import logging
import os

from mopidy.backends import base
from mopidy.utils import path

logger = logging.getLogger('mopidy.backends.local')


class LocalPlaybackProvider(base.BasePlaybackProvider):
    def change_track(self, track):
        media_dir = self.backend.config['local']['media_dir']
        # TODO: check that type is correct.
        file_path = path.uri_to_path(track.uri).split(':', 1)[1]
        file_path = os.path.join(media_dir, file_path)
        track = track.copy(uri=path.path_to_uri(file_path))
        return super(LocalPlaybackProvider, self).change_track(track)
@@ -24,7 +24,7 @@ class LocalPlaylistsProvider(base.BasePlaylistsProvider):

    def create(self, name):
        name = formatting.slugify(name)
        uri = path.path_to_uri(self._get_m3u_path(name))
        uri = 'local:playlist:%s.m3u' % name
        playlist = Playlist(uri=uri, name=name)
        return self.save(playlist)

@@ -37,6 +37,7 @@ class LocalPlaylistsProvider(base.BasePlaylistsProvider):
        self._delete_m3u(playlist.uri)

    def lookup(self, uri):
        # TODO: store as {uri: playlist}?
        for playlist in self._playlists:
            if playlist.uri == uri:
                return playlist
@@ -45,8 +46,8 @@ class LocalPlaylistsProvider(base.BasePlaylistsProvider):
        playlists = []

        for m3u in glob.glob(os.path.join(self._playlists_dir, '*.m3u')):
            uri = path.path_to_uri(m3u)
            name = os.path.splitext(os.path.basename(m3u))[0]
            uri = 'local:playlist:%s' % name

            tracks = []
            for track_uri in parse_m3u(m3u, self._media_dir):
@@ -61,6 +62,7 @@ class LocalPlaylistsProvider(base.BasePlaylistsProvider):
            playlists.append(playlist)

        self.playlists = playlists
        # TODO: send what scheme we loaded them for?
        listener.BackendListener.send('playlists_loaded')

        logger.info(
@@ -86,38 +88,30 @@ class LocalPlaylistsProvider(base.BasePlaylistsProvider):

        return playlist

    def _get_m3u_path(self, name):
        name = formatting.slugify(name)
        file_path = os.path.join(self._playlists_dir, name + '.m3u')
    def _m3u_uri_to_path(self, uri):
        # TODO: create uri handling helpers for local uri types.
        file_path = path.uri_to_path(uri).split(':', 1)[1]
        file_path = os.path.join(self._playlists_dir, file_path)
        path.check_file_path_is_inside_base_dir(file_path, self._playlists_dir)
        return file_path

    def _save_m3u(self, playlist):
        file_path = path.uri_to_path(playlist.uri)
        path.check_file_path_is_inside_base_dir(file_path, self._playlists_dir)
        file_path = self._m3u_uri_to_path(playlist.uri)
        with open(file_path, 'w') as file_handle:
            for track in playlist.tracks:
                if track.uri.startswith('file://'):
                    uri = path.uri_to_path(track.uri)
                else:
                    uri = track.uri
                file_handle.write(uri + '\n')
                file_handle.write(track.uri + '\n')

    def _delete_m3u(self, uri):
        file_path = path.uri_to_path(uri)
        path.check_file_path_is_inside_base_dir(file_path, self._playlists_dir)
        file_path = self._m3u_uri_to_path(uri)
        if os.path.exists(file_path):
            os.remove(file_path)

    def _rename_m3u(self, playlist):
        src_file_path = path.uri_to_path(playlist.uri)
        path.check_file_path_is_inside_base_dir(
            src_file_path, self._playlists_dir)
        dst_name = formatting.slugify(playlist.name)
        dst_uri = 'local:playlist:%s.m3u' % dst_name

        dst_file_path = self._get_m3u_path(playlist.name)
        path.check_file_path_is_inside_base_dir(
            dst_file_path, self._playlists_dir)
        src_file_path = self._m3u_uri_to_path(playlist.uri)
        dst_file_path = self._m3u_uri_to_path(dst_uri)

        shutil.move(src_file_path, dst_file_path)

        return playlist.copy(uri=path.path_to_uri(dst_file_path))
        return playlist.copy(uri=dst_uri)
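Both the new `_m3u_uri_to_path` and the removed helpers lean on `path.check_file_path_is_inside_base_dir` to stop a crafted playlist URI (for example one containing `..` segments) from escaping the playlists directory. A minimal version of such a containment check might look like the following; this is an illustrative sketch, not Mopidy's actual implementation:

```python
import os


def is_inside_base_dir(file_path, base_dir):
    """True if file_path, after resolving symlinks and '..' segments,
    still lives under base_dir."""
    real_base = os.path.realpath(base_dir)
    real_path = os.path.realpath(file_path)
    return real_path == real_base or real_path.startswith(
        real_base + os.sep)
```

Resolving both paths with `realpath` before comparing is the important part: a plain string prefix test on the unresolved path would accept `base/../../etc/passwd`.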
@@ -1,7 +1,8 @@
from __future__ import unicode_literals

import logging
import urllib
import os
import urlparse

from mopidy.models import Track, Artist, Album
from mopidy.utils.encoding import locale_decode
@@ -30,7 +31,6 @@ def parse_m3u(file_path, media_dir):
    - m3u files are latin-1.
    - This function does not bother with Extended M3U directives.
    """

    # TODO: uris as bytes
    uris = []
    try:
@@ -46,16 +46,19 @@ def parse_m3u(file_path, media_dir):
        if line.startswith('#'):
            continue

        # FIXME what about other URI types?
        if line.startswith('file://'):
        if urlparse.urlsplit(line).scheme:
            uris.append(line)
        elif os.path.normpath(line) == os.path.abspath(line):
            path = path_to_uri(line)
            uris.append(path)
        else:
            path = path_to_uri(media_dir, line)
            path = path_to_uri(os.path.join(media_dir, line))
            uris.append(path)

    return uris


# TODO: remove music_dir from API
def parse_mpd_tag_cache(tag_cache, music_dir=''):
    """
    Converts an MPD tag_cache into lists of tracks, artists and albums.
@@ -86,23 +89,20 @@ def parse_mpd_tag_cache(tag_cache, music_dir=''):
        key, value = line.split(b': ', 1)

        if key == b'key':
            _convert_mpd_data(current, tracks, music_dir)
            _convert_mpd_data(current, tracks)
            current.clear()

        current[key.lower()] = value.decode('utf-8')

    _convert_mpd_data(current, tracks, music_dir)
    _convert_mpd_data(current, tracks)

    return tracks


def _convert_mpd_data(data, tracks, music_dir):
def _convert_mpd_data(data, tracks):
    if not data:
        return

    # NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
    # See https://github.com/mopidy/mopidy/issues/302 for details.

    track_kwargs = {}
    album_kwargs = {}
    artist_kwargs = {}
@@ -110,66 +110,62 @@ def _convert_mpd_data(data, tracks, music_dir):

    if 'track' in data:
        if '/' in data['track']:
            album_kwargs[b'num_tracks'] = int(data['track'].split('/')[1])
            track_kwargs[b'track_no'] = int(data['track'].split('/')[0])
            album_kwargs['num_tracks'] = int(data['track'].split('/')[1])
            track_kwargs['track_no'] = int(data['track'].split('/')[0])
        else:
            track_kwargs[b'track_no'] = int(data['track'])
            track_kwargs['track_no'] = int(data['track'])

    if 'mtime' in data:
        track_kwargs['last_modified'] = int(data['mtime'])

    if 'artist' in data:
        artist_kwargs[b'name'] = data['artist']
        albumartist_kwargs[b'name'] = data['artist']
        artist_kwargs['name'] = data['artist']
        albumartist_kwargs['name'] = data['artist']

    if 'albumartist' in data:
        albumartist_kwargs[b'name'] = data['albumartist']
        albumartist_kwargs['name'] = data['albumartist']

    if 'album' in data:
        album_kwargs[b'name'] = data['album']
        album_kwargs['name'] = data['album']

    if 'title' in data:
        track_kwargs[b'name'] = data['title']
        track_kwargs['name'] = data['title']

    if 'date' in data:
        track_kwargs[b'date'] = data['date']
        track_kwargs['date'] = data['date']

    if 'musicbrainz_trackid' in data:
        track_kwargs[b'musicbrainz_id'] = data['musicbrainz_trackid']
        track_kwargs['musicbrainz_id'] = data['musicbrainz_trackid']

    if 'musicbrainz_albumid' in data:
        album_kwargs[b'musicbrainz_id'] = data['musicbrainz_albumid']
        album_kwargs['musicbrainz_id'] = data['musicbrainz_albumid']

    if 'musicbrainz_artistid' in data:
        artist_kwargs[b'musicbrainz_id'] = data['musicbrainz_artistid']
        artist_kwargs['musicbrainz_id'] = data['musicbrainz_artistid']

    if 'musicbrainz_albumartistid' in data:
        albumartist_kwargs[b'musicbrainz_id'] = (
        albumartist_kwargs['musicbrainz_id'] = (
            data['musicbrainz_albumartistid'])

    if artist_kwargs:
        artist = Artist(**artist_kwargs)
        track_kwargs[b'artists'] = [artist]
        track_kwargs['artists'] = [artist]

    if albumartist_kwargs:
        albumartist = Artist(**albumartist_kwargs)
        album_kwargs[b'artists'] = [albumartist]
        album_kwargs['artists'] = [albumartist]

    if album_kwargs:
        album = Album(**album_kwargs)
        track_kwargs[b'album'] = album
        track_kwargs['album'] = album

    if data['file'][0] == '/':
        path = data['file'][1:]
    else:
        path = data['file']
    path = urllib.unquote(path.encode('utf-8'))

    if isinstance(music_dir, unicode):
        music_dir = music_dir.encode('utf-8')

    # Make sure we only pass bytestrings to path_to_uri to avoid implicit
    # decoding of bytestrings to unicode strings
    track_kwargs[b'uri'] = path_to_uri(music_dir, path)

    track_kwargs[b'length'] = int(data.get('time', 0)) * 1000
    track_kwargs['uri'] = 'local:track:%s' % path
    track_kwargs['length'] = int(data.get('time', 0)) * 1000

    track = Track(**track_kwargs)
    tracks.add(track)
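The reworked `parse_m3u` branch above handles three kinds of (non-comment) lines: anything with a URI scheme is kept verbatim, absolute paths are converted to `file:` URIs, and everything else is joined onto `media_dir`. The same dispatch as a standalone POSIX-only sketch, using stdlib helpers in place of Mopidy's `path_to_uri`:

```python
import os
import urllib.parse
import urllib.request


def m3u_line_to_uri(line, media_dir):
    """Classify one non-comment M3U line, as the hunk above does."""
    if urllib.parse.urlsplit(line).scheme:
        # Already a URI (http://, mms://, file://, ...): keep verbatim.
        return line
    if os.path.normpath(line) == os.path.abspath(line):
        # Absolute path: convert to a file: URI (POSIX paths assumed).
        return 'file://' + urllib.request.pathname2url(line)
    # Relative path: resolve against the media directory first.
    full = os.path.join(media_dir, line)
    return 'file://' + urllib.request.pathname2url(full)
```

Comparing `normpath(line)` with `abspath(line)` is a compact way to test "is this already absolute": the two only agree when the line does not need the current working directory prepended.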
@@ -169,8 +169,6 @@ class SpotifyLibraryProvider(base.BaseLibraryProvider):
                translator.to_mopidy_track(t) for t in results.tracks()])
            future.set(search_result)

        # Wait always returns None on Python 2.6 :/
        self.backend.spotify.connected.wait(self._timeout)
        if not self.backend.spotify.connected.is_set():
            logger.debug('Not connected: Spotify search cancelled')
            return SearchResult(uri='spotify:search')

@@ -18,9 +18,6 @@ logger = logging.getLogger('mopidy.backends.spotify')

BITRATES = {96: 2, 160: 0, 320: 1}

# pylint: disable = R0901
# SpotifySessionManager: Too many ancestors (9/7)


class SpotifySessionManager(process.BaseThread, PyspotifySessionManager):
    cache_location = None
@@ -33,9 +30,17 @@ class SpotifySessionManager(process.BaseThread, PyspotifySessionManager):
        self.cache_location = config['spotify']['cache_dir']
        self.settings_location = config['spotify']['cache_dir']

        full_proxy = ''
        if config['proxy']['hostname']:
            full_proxy = config['proxy']['hostname']
            if config['proxy']['port']:
                full_proxy += ':' + str(config['proxy']['port'])
            if config['proxy']['scheme']:
                full_proxy = config['proxy']['scheme'] + "://" + full_proxy

        PyspotifySessionManager.__init__(
            self, config['spotify']['username'], config['spotify']['password'],
            proxy=config['proxy']['hostname'],
            proxy=full_proxy,
            proxy_username=config['proxy']['username'],
            proxy_password=config['proxy']['password'])

@@ -108,9 +113,6 @@ class SpotifySessionManager(process.BaseThread, PyspotifySessionManager):
    def music_delivery(self, session, frames, frame_size, num_frames,
                       sample_type, sample_rate, channels):
        """Callback used by pyspotify"""
        # pylint: disable = R0913
        # Too many arguments (8/5)

        if not self.push_audio_data:
            return 0

@@ -173,9 +175,14 @@ class SpotifySessionManager(process.BaseThread, PyspotifySessionManager):
            logger.debug('Still getting data; skipped refresh of playlists')
            return
        playlists = []
        folders = []
        for spotify_playlist in self.session.playlist_container():
            if spotify_playlist.type() == 'folder_start':
                folders.append(spotify_playlist)
            if spotify_playlist.type() == 'folder_end':
                folders.pop()
            playlists.append(translator.to_mopidy_playlist(
                spotify_playlist,
                spotify_playlist, folders=folders,
                bitrate=self.bitrate, username=self.username))
        playlists.append(translator.to_mopidy_playlist(
            self.session.starred(),

@@ -67,7 +67,8 @@ def to_mopidy_track(spotify_track, bitrate=None):
        return track_cache[uri]


def to_mopidy_playlist(spotify_playlist, bitrate=None, username=None):
def to_mopidy_playlist(
        spotify_playlist, folders=None, bitrate=None, username=None):
    if spotify_playlist is None or spotify_playlist.type() != 'playlist':
        return
    try:
@@ -78,6 +79,9 @@ def to_mopidy_playlist(spotify_playlist, bitrate=None, username=None):
    if not spotify_playlist.is_loaded():
        return Playlist(uri=uri, name='[loading...]')
    name = spotify_playlist.name()
    if folders:
        folder_names = '/'.join(folder.name() for folder in folders)
        name = folder_names + '/' + name
    tracks = [
        to_mopidy_track(spotify_track, bitrate=bitrate)
        for spotify_track in spotify_playlist

@@ -1,6 +1,7 @@
[stream]
enabled = true
protocols =
    file
    http
    https
    mms
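The `full_proxy` assembly in the `SpotifySessionManager.__init__` hunk above builds a proxy URL piecewise: hostname first, then an optional `:port`, then an optional `scheme://` prefix, with an empty string when no hostname is configured. The same logic as a standalone function (hypothetical name, not Mopidy API):

```python
def build_proxy_url(scheme=None, hostname=None, port=None):
    """Mirror the full_proxy logic: hostname is required for any output;
    port and scheme are appended/prepended only when present."""
    if not hostname:
        return ''
    proxy = hostname
    if port:
        proxy += ':' + str(port)
    if scheme:
        proxy = scheme + '://' + proxy
    return proxy
```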
mopidy/commands.py (new file, 83 lines)
@@ -0,0 +1,83 @@
from __future__ import unicode_literals

import argparse
import sys

from mopidy import config as config_lib, ext
from mopidy.utils import deps, versioning


def config_files_type(value):
    return value.split(b':')


def config_override_type(value):
    try:
        section, remainder = value.split(b'/', 1)
        key, value = remainder.split(b'=', 1)
        return (section.strip(), key.strip(), value.strip())
    except ValueError:
        raise argparse.ArgumentTypeError(
            '%s must have the format section/key=value' % value)


parser = argparse.ArgumentParser()
parser.add_argument(
    '--version', action='version',
    version='Mopidy %s' % versioning.get_version())
parser.add_argument(
    '-q', '--quiet',
    action='store_const', const=-1, dest='verbosity_level',
    help='less output (warning level)')
parser.add_argument(
    '-v', '--verbose',
    action='count', dest='verbosity_level',
    help='more output (debug level)')
parser.add_argument(
    '--save-debug-log',
    action='store_true', dest='save_debug_log',
    help='save debug log to "./mopidy.log"')
parser.add_argument(
    '--show-config',
    action='store_true', dest='show_config',
    help='show current config')
parser.add_argument(
    '--show-deps',
    action='store_true', dest='show_deps',
    help='show dependencies and their versions')
parser.add_argument(
    '--config',
    action='store', dest='config_files', type=config_files_type,
    default=b'$XDG_CONFIG_DIR/mopidy/mopidy.conf',
    help='config files to use, colon separated, later files override')
parser.add_argument(
    '-o', '--option',
    action='append', dest='config_overrides', type=config_override_type,
    help='`section/key=value` values to override config options')


def show_config(args):
    """Prints the effective config and exits."""
    extensions = ext.load_extensions()
    config, errors = config_lib.load(
        args.config_files, extensions, args.config_overrides)

    # Clear out any config for disabled extensions.
    for extension in extensions:
        if not ext.validate_extension(extension):
            config[extension.ext_name] = {b'enabled': False}
            errors[extension.ext_name] = {
                b'enabled': b'extension disabled itself.'}
        elif not config[extension.ext_name]['enabled']:
            config[extension.ext_name] = {b'enabled': False}
            errors[extension.ext_name] = {
                b'enabled': b'extension disabled by config.'}

    print config_lib.format(config, extensions, errors)
    sys.exit(0)


def show_deps():
    """Prints a list of all dependencies and exits."""
    print deps.format_dependency_list()
    sys.exit(0)
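`config_override_type` above splits a `section/key=value` argument into a stripped 3-tuple and converts any parse failure into `argparse.ArgumentTypeError`, which argparse turns into a clean usage error. The same parse in a self-contained form (using `str` where the original works on bytes):

```python
import argparse


def parse_override(value):
    """Parse a 'section/key=value' override into a stripped 3-tuple."""
    try:
        section, remainder = value.split('/', 1)
        key, val = remainder.split('=', 1)
        return (section.strip(), key.strip(), val.strip())
    except ValueError:
        # Missing '/' or '=' makes split() raise ValueError.
        raise argparse.ArgumentTypeError(
            '%s must have the format section/key=value' % value)
```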
@@ -5,8 +5,9 @@ import io
import logging
import os.path

from mopidy.config.schemas import *
from mopidy.config.types import *
from mopidy.config import keyring
from mopidy.config.schemas import *  # noqa
from mopidy.config.types import *  # noqa
from mopidy.utils import path

logger = logging.getLogger('mopidy.config')
@@ -23,9 +24,13 @@ _audio_schema = ConfigSchema('audio')
_audio_schema['mixer'] = String()
_audio_schema['mixer_track'] = String(optional=True)
_audio_schema['output'] = String()
_audio_schema['visualizer'] = String(optional=True)

_proxy_schema = ConfigSchema('proxy')
_proxy_schema['scheme'] = String(optional=True,
                                 choices=['http', 'https', 'socks4', 'socks5'])
_proxy_schema['hostname'] = Hostname(optional=True)
_proxy_schema['port'] = Port(optional=True)
_proxy_schema['username'] = String(optional=True)
_proxy_schema['password'] = Secret(optional=True)

@@ -47,7 +52,7 @@ def load(files, extensions, overrides):
    config_dir = os.path.dirname(__file__)
    defaults = [read(os.path.join(config_dir, 'default.conf'))]
    defaults.extend(e.get_default_config() for e in extensions)
    raw_config = _load(files, defaults, overrides)
    raw_config = _load(files, defaults, keyring.fetch() + (overrides or []))

    schemas = _schemas[:]
    schemas.extend(e.get_config_schema() for e in extensions)
@@ -86,8 +91,8 @@ def _load(files, defaults, overrides):
                filename)
        except configparser.ParsingError as e:
            linenos = ', '.join(str(lineno) for lineno, line in e.errors)
            logger.warning('%s has errors, line %s has been ignored.',
                           filename, linenos)
            logger.warning(
                '%s has errors, line %s has been ignored.', filename, linenos)
        except IOError:
            # TODO: if this is the initial load of logging config we might not
            # have a logger at this point, we might want to handle this better.
@@ -101,7 +106,7 @@ def _load(files, defaults, overrides):
        for section in parser.sections():
            raw_config[section] = dict(parser.items(section))

    for section, key, value in overrides or []:
    for section, key, value in overrides:
        raw_config.setdefault(section, {})[key] = value

    return raw_config
@@ -124,7 +129,8 @@ def _validate(raw_config, schemas):
def _format(config, comments, schemas, display):
    output = []
    for schema in schemas:
        serialized = schema.serialize(config.get(schema.name, {}), display=display)
        serialized = schema.serialize(
            config.get(schema.name, {}), display=display)
        if not serialized:
            continue
        output.append(b'[%s]' % bytes(schema.name))
@@ -139,13 +145,6 @@ def _format(config, comments, schemas, display):
    return b'\n'.join(output)


def parse_override(override):
    """Parse ``section/key=value`` command line overrides"""
    section, remainder = override.split(b'/', 1)
    key, value = remainder.split(b'=', 1)
    return (section.strip(), key.strip(), value.strip())


class Proxy(collections.Mapping):
    def __init__(self, data):
        self._data = data

@@ -39,6 +39,7 @@ def convert(settings):
    helper('audio/output', 'OUTPUT')

    helper('proxy/hostname', 'SPOTIFY_PROXY_HOST')
    helper('proxy/port', 'SPOTIFY_PROXY_PORT')
    helper('proxy/username', 'SPOTIFY_PROXY_USERNAME')
    helper('proxy/password', 'SPOTIFY_PROXY_PASSWORD')


@@ -11,8 +11,11 @@ pykka = info
mixer = autoaudiomixer
mixer_track =
output = autoaudiosink
visualizer =

[proxy]
scheme =
hostname =
port =
username =
password =
mopidy/config/keyring.py (new file, 163 lines)
@@ -0,0 +1,163 @@
from __future__ import unicode_literals

import logging

logger = logging.getLogger('mopidy.config.keyring')

try:
    import dbus
except ImportError:
    dbus = None


# XXX: Hack to work around introspection bug caused by gnome-keyring, should
# be fixed by version 3.5 per:
# https://git.gnome.org/browse/gnome-keyring/commit/?id=5dccbe88eb94eea9934e2b7
if dbus:
    EMPTY_STRING = dbus.String('', variant_level=1)
else:
    EMPTY_STRING = ''


def fetch():
    if not dbus:
        logger.debug('Fetching from keyring failed: dbus not installed.')
        return []

    try:
        bus = dbus.SessionBus()
    except dbus.exceptions.DBusException as e:
        logger.debug('Fetching from keyring failed: %s', e)
        return []

    if not bus.name_has_owner('org.freedesktop.secrets'):
        logger.debug(
            'Fetching from keyring failed: secrets service not running.')
        return []

    service = _service(bus)
    session = service.OpenSession('plain', EMPTY_STRING)[1]
    items, locked = service.SearchItems({'service': 'mopidy'})

    if not locked and not items:
        return []

    if locked:
        # There is a chance we can unlock without prompting the user...
        items, prompt = service.Unlock(locked)
        if prompt != '/':
            _prompt(bus, prompt).Dismiss()
            logger.debug('Fetching from keyring failed: keyring is locked.')
            return []

    result = []
    secrets = service.GetSecrets(items, session, byte_arrays=True)
    for item_path, values in secrets.iteritems():
        session_path, parameters, value, content_type = values
        attrs = _item_attributes(bus, item_path)
        result.append((attrs['section'], attrs['key'], bytes(value)))
    return result


def set(section, key, value):
    """Store a secret config value for a given section/key.

    Indicates if storage failed or succeeded.
    """
    if not dbus:
        logger.debug('Saving %s/%s to keyring failed: dbus not installed.',
                     section, key)
        return False

    try:
        bus = dbus.SessionBus()
    except dbus.exceptions.DBusException as e:
        logger.debug('Saving %s/%s to keyring failed: %s', section, key, e)
        return False

    if not bus.name_has_owner('org.freedesktop.secrets'):
        logger.debug(
            'Saving %s/%s to keyring failed: secrets service not running.',
            section, key)
        return False

    service = _service(bus)
    collection = _collection(bus)
    if not collection:
        return False

    if isinstance(value, unicode):
        value = value.encode('utf-8')

    session = service.OpenSession('plain', EMPTY_STRING)[1]
    secret = dbus.Struct((session, '', dbus.ByteArray(value),
                          'plain/text; charset=utf8'))
    label = 'mopidy: %s/%s' % (section, key)
    attributes = {'service': 'mopidy', 'section': section, 'key': key}
    properties = {'org.freedesktop.Secret.Item.Label': label,
                  'org.freedesktop.Secret.Item.Attributes': attributes}

    try:
        item, prompt = collection.CreateItem(properties, secret, True)
    except dbus.exceptions.DBusException as e:
        # TODO: catch IsLocked errors etc.
        logger.debug('Saving %s/%s to keyring failed: %s', section, key, e)
        return False

    if prompt == '/':
        return True

    _prompt(bus, prompt).Dismiss()
    logger.debug('Saving secret %s/%s failed: Keyring is locked',
                 section, key)
    return False


def _service(bus):
    return _interface(bus, '/org/freedesktop/secrets',
                      'org.freedesktop.Secret.Service')


# NOTE: depending on versions and setup 'default' might not exist, so try and
# use it but fall back to the 'login' collection, and finally the 'session' one
# if all else fails. We should probably create a keyring/collection setting
# that allows users to set this so they have control over where their secrets
# get stored.
def _collection(bus):
    for name in 'aliases/default', 'collection/login', 'collection/session':
        path = '/org/freedesktop/secrets/' + name
        if _collection_exists(bus, path):
            break
    else:
        return None
    return _interface(bus, path, 'org.freedesktop.Secret.Collection')


# NOTE: Hack to probe if a given collection actually exists. Needed to work
# around an introspection bug in setting passwords for non-existent aliases.
def _collection_exists(bus, path):
    try:
        item = _interface(bus, path, 'org.freedesktop.DBus.Properties')
        item.Get('org.freedesktop.Secret.Collection', 'Label')
        return True
    except dbus.exceptions.DBusException:
        return False


# NOTE: We could call prompt.Prompt('') to unlock the keyring when it is not
# '/', but we would then also have to arrange to set up signals to wait until
# this has been completed. So for now we just dismiss the prompt and expect
# keyrings to be unlocked.
def _prompt(bus, path):
    return _interface(bus, path, 'Prompt')


def _item_attributes(bus, path):
    item = _interface(bus, path, 'org.freedesktop.DBus.Properties')
    result = item.Get('org.freedesktop.Secret.Item', 'Attributes')
    return dict((bytes(k), bytes(v)) for k, v in result.iteritems())


def _interface(bus, path, interface):
    obj = bus.get_object('org.freedesktop.secrets', path)
    return dbus.Interface(obj, interface)
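`_collection` above relies on Python's `for`/`else`: the `else` branch runs only when the loop finishes without hitting `break`, i.e. when no candidate collection exists. The pattern in isolation, with a plain predicate standing in for the D-Bus existence probe:

```python
def first_existing(candidates, exists):
    """Return the full path of the first candidate for which
    exists(path) is true; the for/else falls through to None when
    the loop completes without a break."""
    for name in candidates:
        path = '/org/freedesktop/secrets/' + name
        if exists(path):
            break
    else:
        return None
    return path
```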
@@ -4,9 +4,6 @@ import collections

from mopidy.config import types

# TODO: 2.6 cleanup (#344).
ordered_dict = getattr(collections, 'OrderedDict', dict)


def _did_you_mean(name, choices):
    """Suggest most likely setting based on levenshtein."""
@@ -40,7 +37,7 @@ def _levenshtein(a, b):
    return current[n]


class ConfigSchema(object):
class ConfigSchema(collections.OrderedDict):
    """Logical group of config values that correspond to a config section.

    Schemas are set up by assigning config keys with config values to
@@ -50,19 +47,9 @@ class ConfigSchema(object):
    :meth:`serialize` for converting the values to a form suitable for
    persistence.
    """
    # TODO: Use collections.OrderedDict once 2.6 support is gone (#344)
    def __init__(self, name):
        super(ConfigSchema, self).__init__()
        self.name = name
        self._schema = {}
        self._order = []

    def __setitem__(self, key, value):
        if key not in self._schema:
            self._order.append(key)
        self._schema[key] = value

    def __getitem__(self, key):
        return self._schema[key]

    def deserialize(self, values):
        """Validates the given ``values`` using the config schema.
@@ -73,17 +60,17 @@ class ConfigSchema(collections.OrderedDict):

        for key, value in values.items():
            try:
                result[key] = self._schema[key].deserialize(value)
                result[key] = self[key].deserialize(value)
            except KeyError:  # not in our schema
                errors[key] = 'unknown config key.'
                suggestion = _did_you_mean(key, self._schema.keys())
                suggestion = _did_you_mean(key, self.keys())
                if suggestion:
                    errors[key] += ' Did you mean %s?' % suggestion
            except ValueError as e:  # deserialization failed
                result[key] = None
                errors[key] = str(e)

        for key in self._schema:
        for key in self.keys():
            if key not in result and key not in errors:
                result[key] = None
                errors[key] = 'config key not found.'
@@ -97,10 +84,10 @@ class ConfigSchema(collections.OrderedDict):
        will be masked out.

        Returns a dict of config keys and values."""
        result = ordered_dict()  # TODO: 2.6 cleanup (#344).
        for key in self._order:
        result = collections.OrderedDict()
        for key in self.keys():
            if key in values:
                result[key] = self._schema[key].serialize(values[key], display)
                result[key] = self[key].serialize(values[key], display)
        return result

@ -109,7 +96,8 @@ class LogLevelConfigSchema(object):
|
||||
|
||||
Expects the config keys to be logger names and the values to be log levels
|
||||
as understood by the :class:`LogLevel` config value. Does not sub-class
|
||||
:class:`ConfigSchema`, but implements the same interface.
|
||||
:class:`ConfigSchema`, but implements the same serialize/deserialize
|
||||
interface.
|
||||
"""
|
||||
def __init__(self, name):
|
||||
self.name = name
|
||||
@ -128,7 +116,7 @@ class LogLevelConfigSchema(object):
|
||||
return result, errors
|
||||
|
||||
def serialize(self, values, display=False):
|
||||
result = ordered_dict() # TODO: 2.6 cleanup (#344)
|
||||
result = collections.OrderedDict()
|
||||
for key in sorted(values.keys()):
|
||||
result[key] = self._config_value.serialize(values[key], display)
|
||||
return result
|
||||
|
||||
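The schemas change above replaces ConfigSchema's hand-rolled `_schema` dict plus `_order` list with a `collections.OrderedDict` base class. A minimal standalone sketch of the pattern (the `OrderedSchema` name and the string values are illustrative, not Mopidy's actual types):

```python
import collections


class OrderedSchema(collections.OrderedDict):
    """Toy schema: key insertion order is preserved by the base class."""

    def __init__(self, name):
        super(OrderedSchema, self).__init__()
        self.name = name


schema = OrderedSchema('playback')
schema['enabled'] = 'bool'
schema['volume'] = 'int'
schema['device'] = 'str'

# Keys come back in assignment order, with no manual _order list to keep
# in sync and no need to override __setitem__/__getitem__.
print(list(schema.keys()))  # ['enabled', 'volume', 'device']
```

Subclassing `OrderedDict` is what lets the real class drop its `__setitem__`/`__getitem__` overrides entirely.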
@@ -3,7 +3,6 @@ from __future__ import unicode_literals

 import logging
-import re
 import socket
 import sys

 from mopidy.utils import path
 from mopidy.config import validators
@@ -72,9 +71,9 @@ class String(ConfigValue):
     def deserialize(self, value):
         value = decode(value).strip()
         validators.validate_required(value, self._required)
-        validators.validate_choice(value, self._choices)
         if not value:
             return None
+        validators.validate_choice(value, self._choices)
         return value

     def serialize(self, value, display=False):
@@ -83,41 +82,38 @@ class String(ConfigValue):
         return encode(value)


-class Secret(ConfigValue):
-    """Secret value.
+class Secret(String):
+    """Secret string value.

-    Should be used for passwords, auth tokens etc. Deserializing will not
-    convert to unicode. Will mask value when being displayed.
+    Is decoded as utf-8 and \\n \\t escapes should work and be preserved.
+
+    Should be used for passwords, auth tokens etc. Will mask value when being
+    displayed.
     """
     def __init__(self, optional=False, choices=None):
         self._required = not optional
-
-    def deserialize(self, value):
-        value = value.strip()
-        validators.validate_required(value, self._required)
-        if not value:
-            return None
-        return value
+        self._choices = None  # Choices doesn't make sense for secrets

     def serialize(self, value, display=False):
-        if isinstance(value, unicode):
-            value = value.encode('utf-8')
-        if value is None:
-            return b''
-        elif display:
+        if value is not None and display:
             return b'********'
-        return value
+        return super(Secret, self).serialize(value, display)


 class Integer(ConfigValue):
     """Integer value."""

-    def __init__(self, minimum=None, maximum=None, choices=None):
+    def __init__(
+            self, minimum=None, maximum=None, choices=None, optional=False):
+        self._required = not optional
         self._minimum = minimum
         self._maximum = maximum
         self._choices = choices

     def deserialize(self, value):
+        validators.validate_required(value, self._required)
+        if not value:
+            return None
         value = int(value)
         validators.validate_choice(value, self._choices)
         validators.validate_minimum(value, self._minimum)
@@ -223,8 +219,9 @@ class Port(Integer):
     allocate a port for us.
     """
     # TODO: consider probing if port is free or not?
-    def __init__(self, choices=None):
-        super(Port, self).__init__(minimum=0, maximum=2**16-1, choices=choices)
+    def __init__(self, choices=None, optional=False):
+        super(Port, self).__init__(
+            minimum=0, maximum=2 ** 16 - 1, choices=choices, optional=optional)


 class Path(ConfigValue):
@@ -256,7 +253,7 @@ class Path(ConfigValue):

     def serialize(self, value, display=False):
         if isinstance(value, unicode):
-            value = value.encode(sys.getfilesystemencoding())
+            raise ValueError('paths should always be bytes')
         if isinstance(value, ExpandedPath):
             return value.original
         return value
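The `Secret` rework delegates everything except display masking to `String`. A rough standalone sketch of that masking pattern (a toy `MaskedValue` class, not the real `ConfigValue` API):

```python
class MaskedValue(object):
    """Toy value type that hides its content when rendered for display."""

    def __init__(self, value):
        self._value = value

    def serialize(self, display=False):
        # Only mask when the value is set and we are rendering for humans;
        # machine-facing serialization still gets the real value.
        if self._value is not None and display:
            return '********'
        return self._value


token = MaskedValue('s3cret')
print(token.serialize(display=True))   # ********
print(token.serialize(display=False))  # s3cret
```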
@@ -69,8 +69,9 @@ class LibraryController(object):
         """
         query = query or kwargs
         futures = [
-            backend.library.find_exact(query=query, uris=uris)
-            for (backend, uris) in self._get_backends_to_uris(uris).items()]
+            backend.library.find_exact(query=query, uris=backend_uris)
+            for (backend, backend_uris)
+            in self._get_backends_to_uris(uris).items()]
         return [result for result in pykka.get_all(futures) if result]

     def lookup(self, uri):
@@ -145,6 +146,7 @@ class LibraryController(object):
         """
         query = query or kwargs
         futures = [
-            backend.library.search(query=query, uris=uris)
-            for (backend, uris) in self._get_backends_to_uris(uris).items()]
+            backend.library.search(query=query, uris=backend_uris)
+            for (backend, backend_uris)
+            in self._get_backends_to_uris(uris).items()]
         return [result for result in pykka.get_all(futures) if result]
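In both comprehensions the loop target is renamed from `uris` to `backend_uris` so it no longer shadows the `uris` argument that is passed to `_get_backends_to_uris()`. A small illustration of why reusing the name is easy to misread (hypothetical functions, not Mopidy code):

```python
def lookup_lengths(items):
    # The comprehension target deliberately reuses the parameter name:
    # inside the expression, `items` refers to each element, not to the
    # argument. It works only because the iterable is evaluated once,
    # up front, before the name is rebound.
    return [len(items) for items in items]


def lookup_lengths_clear(groups):
    # Same behavior, but the distinct loop-variable name makes the
    # intent obvious at a glance.
    return [len(group) for group in groups]


print(lookup_lengths([[1, 2], [3]]))        # [2, 1]
print(lookup_lengths_clear([[1, 2], [3]]))  # [2, 1]
```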
@@ -13,9 +13,6 @@ logger = logging.getLogger('mopidy.core')


 class PlaybackController(object):
-    # pylint: disable = R0902
-    # Too many instance attributes
-
     pykka_traversable = True

     def __init__(self, audio, backends, core):
@@ -175,9 +172,6 @@ class PlaybackController(object):
         """

     def get_tl_track_at_eot(self):
-        # pylint: disable = R0911
-        # Too many return statements
-
         tl_tracks = self.core.tracklist.tl_tracks

         if not tl_tracks:
@@ -401,6 +395,7 @@ class PlaybackController(object):
         if self.random and self._shuffled:
             self._shuffled.remove(tl_track)
         if on_error_step == 1:
+            # TODO: can cause an endless loop for single track repeat.
             self.next()
         elif on_error_step == -1:
             self.previous()
@@ -79,6 +79,15 @@ class Extension(object):
         """
         return []

+    def get_library_updaters(self):
+        """List of library updater classes
+
+        :returns: list of
+          :class:`~mopidy.backends.base.BaseLibraryUpdateProvider`
+          subclasses
+        """
+        return []
+
     def register_gstreamer_elements(self):
         """Hook for registering custom GStreamer elements
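The new `get_library_updaters()` hook follows the same default-to-empty-list pattern as the other extension hooks. A hypothetical sketch of how an extension might override it (`BaseExtension` and `LocalLibraryUpdater` are illustrative stand-ins, not real Mopidy classes):

```python
class BaseExtension(object):
    """Trimmed-down stand-in for the extension base class."""

    def get_library_updaters(self):
        # Default: the extension provides no library updaters.
        return []


class LocalLibraryUpdater(object):
    """Hypothetical updater class an extension could expose."""


class LocalExtension(BaseExtension):
    def get_library_updaters(self):
        # An extension opts in by returning its updater classes.
        return [LocalLibraryUpdater]


print(BaseExtension().get_library_updaters())  # []
```

The core can then collect updaters from every installed extension without knowing which ones implement the hook.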
@@ -1,8 +1,8 @@
-/*! Mopidy.js - built 2013-03-31
+/*! Mopidy.js - built 2013-09-17
  * http://www.mopidy.com/
  * Copyright (c) 2013 Stein Magnus Jodal and contributors
  * Licensed under the Apache License, Version 2.0 */
-((typeof define === "function" && define.amd && function (m) { define(m); }) ||
+((typeof define === "function" && define.amd && function (m) { define("bane", m); }) ||
  (typeof module === "object" && function (m) { module.exports = m(); }) ||
  function (m) { this.bane = m(); }
 )(function () {
@@ -148,7 +148,7 @@
             notifyListener(event, toNotify[i], args);
         }

-        toNotify = listeners(this, event).slice()
+        toNotify = listeners(this, event).slice();
         args = slice.call(arguments, 1);
         for (i = 0, l = toNotify.length; i < l; ++i) {
             notifyListener(event, toNotify[i], args);
@@ -187,27 +187,30 @@ if (typeof window !== "undefined") {
  *
  * @author Brian Cavalier
  * @author John Hann
- * @version 2.0.0
+ * @version 2.4.0
  */
-(function(define) { 'use strict';
-define(function () {
+(function(define, global) { 'use strict';
+define(function (require) {

     // Public API

-    when.defer = defer; // Create a deferred
     when.promise = promise; // Create a pending promise
     when.resolve = resolve; // Create a resolved promise
     when.reject = reject; // Create a rejected promise
+    when.defer = defer; // Create a {promise, resolver} pair

     when.join = join; // Join 2 or more promises

     when.all = all; // Resolve a list of promises
     when.map = map; // Array.map() for promises
     when.reduce = reduce; // Array.reduce() for promises
+    when.settle = settle; // Settle a list of promises

     when.any = any; // One-winner race
     when.some = some; // Multi-winner race

-    when.isPromise = isPromise; // Determine if a thing is a promise
+    when.isPromise = isPromiseLike; // DEPRECATED: use isPromiseLike
+    when.isPromiseLike = isPromiseLike; // Is something promise-like, aka thenable

     /**
      * Register an observer for a promise or immediate value.
@@ -235,13 +238,35 @@ define(function () {
      * a trusted when.js promise. Any other duck-typed promise is considered
      * untrusted.
+     * @constructor
+     * @param {function} sendMessage function to deliver messages to the promise's handler
+     * @param {function?} inspect function that reports the promise's state
+     * @name Promise
      */
-    function Promise(then) {
-        this.then = then;
+    function Promise(sendMessage, inspect) {
+        this._message = sendMessage;
+        this.inspect = inspect;
     }

+    Promise.prototype = {
+        /**
+         * Register handlers for this promise.
+         * @param [onFulfilled] {Function} fulfillment handler
+         * @param [onRejected] {Function} rejection handler
+         * @param [onProgress] {Function} progress handler
+         * @return {Promise} new Promise
+         */
+        then: function(onFulfilled, onRejected, onProgress) {
+            /*jshint unused:false*/
+            var args, sendMessage;
+
+            args = arguments;
+            sendMessage = this._message;
+
+            return _promise(function(resolve, reject, notify) {
+                sendMessage('when', args, resolve, notify);
+            }, this._status && this._status.observed());
+        },
+
         /**
          * Register a rejection handler. Shortcut for .then(undefined, onRejected)
          * @param {function?} onRejected
@@ -262,9 +287,7 @@ define(function () {
          * @returns {Promise}
          */
         ensure: function(onFulfilledOrRejected) {
-            var self = this;
-
-            return this.then(injectHandler, injectHandler).yield(self);
+            return this.then(injectHandler, injectHandler)['yield'](this);

             function injectHandler() {
                 return resolve(onFulfilledOrRejected());
@@ -285,6 +308,16 @@ define(function () {
             });
         },

+        /**
+         * Runs a side effect when this promise fulfills, without changing the
+         * fulfillment value.
+         * @param {function} onFulfilledSideEffect
+         * @returns {Promise}
+         */
+        tap: function(onFulfilledSideEffect) {
+            return this.then(onFulfilledSideEffect)['yield'](this);
+        },
+
         /**
          * Assumes that this promise will fulfill with an array, and arranges
         * for the onFulfilled to be called with the array as its argument list
@@ -340,13 +373,16 @@ define(function () {
     }

     /**
-     * Creates a new Deferred with fully isolated resolver and promise parts,
-     * either or both of which may be given out safely to consumers.
+     * Creates a {promise, resolver} pair, either or both of which
+     * may be given out safely to consumers.
      * The resolver has resolve, reject, and progress. The promise
-     * only has then.
+     * has then plus extended promise API.
      *
      * @return {{
      * promise: Promise,
-     * resolve: function:Promise,
-     * reject: function:Promise,
-     * notify: function:Promise
+     * resolver: {
+     *  resolve: function:Promise,
+     *  reject: function:Promise,
@@ -394,12 +430,26 @@ define(function () {

     /**
      * Creates a new promise whose fate is determined by resolver.
-     * @private (for now)
      * @param {function} resolver function(resolve, reject, notify)
      * @returns {Promise} promise whose fate is determine by resolver
      */
     function promise(resolver) {
-        var value, handlers = [];
+        return _promise(resolver, monitorApi.PromiseStatus && monitorApi.PromiseStatus());
+    }
+
+    /**
+     * Creates a new promise, linked to parent, whose fate is determined
+     * by resolver.
+     * @param {function} resolver function(resolve, reject, notify)
+     * @param {Promise?} status promise from which the new promise is begotten
+     * @returns {Promise} promise whose fate is determine by resolver
+     * @private
+     */
+    function _promise(resolver, status) {
+        var self, value, consumers = [];
+
+        self = new Promise(_message, inspect);
+        self._status = status;

         // Call the provider resolver to seal the promise's fate
         try {
@@ -409,29 +459,34 @@ define(function () {
         }

-        // Return the promise
-        return new Promise(then);
+        return self;

         /**
-         * Register handlers for this promise.
-         * @param [onFulfilled] {Function} fulfillment handler
-         * @param [onRejected] {Function} rejection handler
-         * @param [onProgress] {Function} progress handler
-         * @return {Promise} new Promise
+         * Private message delivery. Queues and delivers messages to
+         * the promise's ultimate fulfillment value or rejection reason.
+         * @private
+         * @param {String} type
+         * @param {Array} args
+         * @param {Function} resolve
+         * @param {Function} notify
          */
-        function then(onFulfilled, onRejected, onProgress) {
-            return promise(function(resolve, reject, notify) {
-                handlers
-                // Call handlers later, after resolution
-                ? handlers.push(function(value) {
-                    value.then(onFulfilled, onRejected, onProgress)
-                        .then(resolve, reject, notify);
-                })
-                // Call handlers soon, but not in the current stack
-                : enqueue(function() {
-                    value.then(onFulfilled, onRejected, onProgress)
-                        .then(resolve, reject, notify);
-                });
-            });
+        function _message(type, args, resolve, notify) {
+            consumers ? consumers.push(deliver) : enqueue(function() { deliver(value); });
+
+            function deliver(p) {
+                p._message(type, args, resolve, notify);
+            }
         }

+        /**
+         * Returns a snapshot of the promise's state at the instant inspect()
+         * is called. The returned object is not live and will not update as
+         * the promise's state changes.
+         * @returns {{ state:String, value?:*, reason?:* }} status snapshot
+         * of the promise.
+         */
+        function inspect() {
+            return value ? value.inspect() : toPendingState();
+        }

         /**
@@ -440,14 +495,17 @@ define(function () {
         * @param {*|Promise} val resolution value
         */
        function promiseResolve(val) {
-            if(!handlers) {
+            if(!consumers) {
                return;
            }

            value = coerce(val);
-            scheduleHandlers(handlers, value);
+            scheduleConsumers(consumers, value);
+            consumers = undef;

-            handlers = undef;
+            if(status) {
+                updateStatus(value, status);
+            }
        }
@@ -463,27 +521,90 @@ define(function () {
         * @param {*} update progress event payload to pass to all listeners
         */
        function promiseNotify(update) {
-            if(handlers) {
-                scheduleHandlers(handlers, progressing(update));
+            if(consumers) {
+                scheduleConsumers(consumers, progressed(update));
            }
        }
    }

+    /**
+     * Creates a fulfilled, local promise as a proxy for a value
+     * NOTE: must never be exposed
+     * @param {*} value fulfillment value
+     * @returns {Promise}
+     */
+    function fulfilled(value) {
+        return near(
+            new NearFulfilledProxy(value),
+            function() { return toFulfilledState(value); }
+        );
+    }
+
+    /**
+     * Creates a rejected, local promise with the supplied reason
+     * NOTE: must never be exposed
+     * @param {*} reason rejection reason
+     * @returns {Promise}
+     */
+    function rejected(reason) {
+        return near(
+            new NearRejectedProxy(reason),
+            function() { return toRejectedState(reason); }
+        );
+    }
+
+    /**
+     * Creates a near promise using the provided proxy
+     * NOTE: must never be exposed
+     * @param {object} proxy proxy for the promise's ultimate value or reason
+     * @param {function} inspect function that returns a snapshot of the
+     *  returned near promise's state
+     * @returns {Promise}
+     */
+    function near(proxy, inspect) {
+        return new Promise(function (type, args, resolve) {
+            try {
+                resolve(proxy[type].apply(proxy, args));
+            } catch(e) {
+                resolve(rejected(e));
+            }
+        }, inspect);
+    }
+
+    /**
+     * Create a progress promise with the supplied update.
+     * @private
+     * @param {*} update
+     * @return {Promise} progress promise
+     */
+    function progressed(update) {
+        return new Promise(function (type, args, _, notify) {
+            var onProgress = args[2];
+            try {
+                notify(typeof onProgress === 'function' ? onProgress(update) : update);
+            } catch(e) {
+                notify(e);
+            }
+        });
+    }

    /**
     * Coerces x to a trusted Promise
     *
     * @private
     * @param {*} x thing to coerce
-     * @returns {Promise} Guaranteed to return a trusted Promise. If x
+     * @returns {*} Guaranteed to return a trusted Promise. If x
     * is trusted, returns x, otherwise, returns a new, trusted, already-resolved
     * Promise whose resolution value is:
     * * the resolution value of x if it's a foreign promise, or
     * * x if it's a value
     */
    function coerce(x) {
-        if(x instanceof Promise) {
+        if (x instanceof Promise) {
            return x;
-        } else if (x !== Object(x)) {
+        }
+
+        if (!(x === Object(x) && 'then' in x)) {
            return fulfilled(x);
        }
@@ -510,61 +631,34 @@ define(function () {
    }

    /**
-     * Create an already-fulfilled promise for the supplied value
-     * @private
+     * Proxy for a near, fulfilled value
     * @param {*} value
-     * @return {Promise} fulfilled promise
+     * @constructor
     */
-    function fulfilled(value) {
-        var self = new Promise(function (onFulfilled) {
-            try {
-                return typeof onFulfilled == 'function'
-                    ? coerce(onFulfilled(value)) : self;
-            } catch (e) {
-                return rejected(e);
-            }
-        });
-
-        return self;
+    function NearFulfilledProxy(value) {
+        this.value = value;
    }

+    NearFulfilledProxy.prototype.when = function(onResult) {
+        return typeof onResult === 'function' ? onResult(this.value) : this.value;
+    };
+
    /**
-     * Create an already-rejected promise with the supplied rejection reason.
-     * @private
+     * Proxy for a near rejection
     * @param {*} reason
-     * @return {Promise} rejected promise
+     * @constructor
     */
-    function rejected(reason) {
-        var self = new Promise(function (_, onRejected) {
-            try {
-                return typeof onRejected == 'function'
-                    ? coerce(onRejected(reason)) : self;
-            } catch (e) {
-                return rejected(e);
-            }
-        });
-
-        return self;
+    function NearRejectedProxy(reason) {
+        this.reason = reason;
    }

-    /**
-     * Create a progress promise with the supplied update.
-     * @private
-     * @param {*} update
-     * @return {Promise} progress promise
-     */
-    function progressing(update) {
-        var self = new Promise(function (_, __, onProgress) {
-            try {
-                return typeof onProgress == 'function'
-                    ? progressing(onProgress(update)) : self;
-            } catch (e) {
-                return progressing(e);
-            }
-        });
-
-        return self;
-    }
+    NearRejectedProxy.prototype.when = function(_, onError) {
+        if(typeof onError === 'function') {
+            return onError(this.reason);
+        } else {
+            throw this.reason;
+        }
+    };

    /**
     * Schedule a task that will process a list of handlers
@@ -573,7 +667,7 @@ define(function () {
     * @param {Array} handlers queue of handlers to execute
     * @param {*} value passed as the only arg to each handler
     */
-    function scheduleHandlers(handlers, value) {
+    function scheduleConsumers(handlers, value) {
        enqueue(function() {
            var handler, i = 0;
            while (handler = handlers[i++]) {
@@ -582,14 +676,23 @@ define(function () {
        });
    }

+    function updateStatus(value, status) {
+        value.then(statusFulfilled, statusRejected);
+
+        function statusFulfilled() { status.fulfilled(); }
+        function statusRejected(r) { status.rejected(r); }
+    }
+
    /**
-     * Determines if promiseOrValue is a promise or not
-     *
-     * @param {*} promiseOrValue anything
-     * @returns {boolean} true if promiseOrValue is a {@link Promise}
+     * Determines if x is promise-like, i.e. a thenable object
+     * NOTE: Will return true for *any thenable object*, and isn't truly
+     * safe, since it may attempt to access the `then` property of x (i.e.
+     * clever/malicious getters may do weird things)
+     * @param {*} x anything
+     * @returns {boolean} true if x is promise-like
     */
-    function isPromise(promiseOrValue) {
-        return promiseOrValue && typeof promiseOrValue.then === 'function';
+    function isPromiseLike(x) {
+        return x && typeof x.then === 'function';
    }

    /**
@@ -601,17 +704,15 @@ define(function () {
     * @param {Array} promisesOrValues array of anything, may contain a mix
     * of promises and values
     * @param howMany {number} number of promisesOrValues to resolve
-     * @param {function?} [onFulfilled] resolution handler
-     * @param {function?} [onRejected] rejection handler
-     * @param {function?} [onProgress] progress handler
+     * @param {function?} [onFulfilled] DEPRECATED, use returnedPromise.then()
+     * @param {function?} [onRejected] DEPRECATED, use returnedPromise.then()
+     * @param {function?} [onProgress] DEPRECATED, use returnedPromise.then()
     * @returns {Promise} promise that will resolve to an array of howMany values that
     * resolved first, or will reject with an array of
     * (promisesOrValues.length - howMany) + 1 rejection reasons.
     */
    function some(promisesOrValues, howMany, onFulfilled, onRejected, onProgress) {
-
-        checkCallbacks(2, arguments);
-
        return when(promisesOrValues, function(promisesOrValues) {

            return promise(resolveSome).then(onFulfilled, onRejected, onProgress);
@@ -635,7 +736,7 @@ define(function () {
            rejectOne = function(reason) {
                reasons.push(reason);
                if(!--toReject) {
-                    fulfillOne = rejectOne = noop;
+                    fulfillOne = rejectOne = identity;
                    reject(reasons);
                }
            };
@@ -644,7 +745,7 @@ define(function () {
                // This orders the values based on promise resolution order
                values.push(val);
                if (!--toResolve) {
-                    fulfillOne = rejectOne = noop;
+                    fulfillOne = rejectOne = identity;
                    resolve(values);
                }
            };
@@ -674,9 +775,9 @@ define(function () {
     *
     * @param {Array|Promise} promisesOrValues array of anything, may contain a mix
     * of {@link Promise}s and values
-     * @param {function?} [onFulfilled] resolution handler
-     * @param {function?} [onRejected] rejection handler
-     * @param {function?} [onProgress] progress handler
+     * @param {function?} [onFulfilled] DEPRECATED, use returnedPromise.then()
+     * @param {function?} [onRejected] DEPRECATED, use returnedPromise.then()
+     * @param {function?} [onProgress] DEPRECATED, use returnedPromise.then()
     * @returns {Promise} promise that will resolve to the value that resolved first, or
     * will reject with an array of all rejected inputs.
     */
@@ -697,14 +798,13 @@ define(function () {
     *
     * @param {Array|Promise} promisesOrValues array of anything, may contain a mix
     * of {@link Promise}s and values
-     * @param {function?} [onFulfilled] resolution handler
-     * @param {function?} [onRejected] rejection handler
-     * @param {function?} [onProgress] progress handler
+     * @param {function?} [onFulfilled] DEPRECATED, use returnedPromise.then()
+     * @param {function?} [onRejected] DEPRECATED, use returnedPromise.then()
+     * @param {function?} [onProgress] DEPRECATED, use returnedPromise.then()
     * @returns {Promise}
     */
    function all(promisesOrValues, onFulfilled, onRejected, onProgress) {
-        checkCallbacks(1, arguments);
-        return map(promisesOrValues, identity).then(onFulfilled, onRejected, onProgress);
+        return _map(promisesOrValues, identity).then(onFulfilled, onRejected, onProgress);
    }

    /**
@@ -713,28 +813,49 @@ define(function () {
     * have fulfilled, or will reject when *any one* of the input promises rejects.
     */
    function join(/* ...promises */) {
-        return map(arguments, identity);
+        return _map(arguments, identity);
    }

    /**
-     * Traditional map function, similar to `Array.prototype.map()`, but allows
-     * input to contain {@link Promise}s and/or values, and mapFunc may return
-     * either a value or a {@link Promise}
-     *
-     * @param {Array|Promise} array array of anything, may contain a mix
-     * of {@link Promise}s and values
-     * @param {function} mapFunc mapping function mapFunc(value) which may return
-     * either a {@link Promise} or value
-     * @returns {Promise} a {@link Promise} that will resolve to an array containing
-     * the mapped output values.
+     * Settles all input promises such that they are guaranteed not to
+     * be pending once the returned promise fulfills. The returned promise
+     * will always fulfill, except in the case where `array` is a promise
+     * that rejects.
+     * @param {Array|Promise} array or promise for array of promises to settle
+     * @returns {Promise} promise that always fulfills with an array of
+     * outcome snapshots for each input promise.
     */
+    function settle(array) {
+        return _map(array, toFulfilledState, toRejectedState);
+    }
+
+    /**
+     * Promise-aware array map function, similar to `Array.prototype.map()`,
+     * but input array may contain promises or values.
+     * @param {Array|Promise} array array of anything, may contain promises and values
+     * @param {function} mapFunc map function which may return a promise or value
+     * @returns {Promise} promise that will fulfill with an array of mapped values
+     * or reject if any input promise rejects.
+     */
    function map(array, mapFunc) {
+        return _map(array, mapFunc);
+    }
+
+    /**
+     * Internal map that allows a fallback to handle rejections
+     * @param {Array|Promise} array array of anything, may contain promises and values
+     * @param {function} mapFunc map function which may return a promise or value
+     * @param {function?} fallback function to handle rejected promises
+     * @returns {Promise} promise that will fulfill with an array of mapped values
+     * or reject if any input promise rejects.
+     */
+    function _map(array, mapFunc, fallback) {
        return when(array, function(array) {

-            return promise(resolveMap);
+            return _promise(resolveMap);

            function resolveMap(resolve, reject, notify) {
-                var results, len, toResolve, resolveOne, i;
+                var results, len, toResolve, i;

                // Since we know the resulting length, we can preallocate the results
                // array to avoid array expansions.
@@ -743,27 +864,28 @@ define(function () {

                if(!toResolve) {
                    resolve(results);
-                } else {
+                    return;
+                }

-                    resolveOne = function(item, i) {
-                        when(item, mapFunc).then(function(mapped) {
-                            results[i] = mapped;
-
-                            if(!--toResolve) {
-                                resolve(results);
-                            }
-                        }, reject, notify);
-                    };
-
-                    // Since mapFunc may be async, get all invocations of it into flight
-                    for(i = 0; i < len; i++) {
-                        if(i in array) {
-                            resolveOne(array[i], i);
-                        } else {
-                            --toResolve;
-                        }
-                    }
+                // Since mapFunc may be async, get all invocations of it into flight
+                for(i = 0; i < len; i++) {
+                    if(i in array) {
+                        resolveOne(array[i], i);
+                    } else {
+                        --toResolve;
+                    }
+                }
-                }

+                function resolveOne(item, i) {
+                    when(item, mapFunc, fallback).then(function(mapped) {
+                        results[i] = mapped;
+                        notify(mapped);
+
+                        if(!--toResolve) {
+                            resolve(results);
+                        }
+                    }, reject);
+                }
            }
        });
    }
@@ -803,12 +925,46 @@ define(function () {
        });
    }

+    // Snapshot states
+
+    /**
+     * Creates a fulfilled state snapshot
+     * @private
+     * @param {*} x any value
+     * @returns {{state:'fulfilled',value:*}}
+     */
+    function toFulfilledState(x) {
+        return { state: 'fulfilled', value: x };
+    }
+
+    /**
+     * Creates a rejected state snapshot
+     * @private
+     * @param {*} x any reason
+     * @returns {{state:'rejected',reason:*}}
+     */
+    function toRejectedState(x) {
+        return { state: 'rejected', reason: x };
+    }
+
+    /**
+     * Creates a pending state snapshot
+     * @private
+     * @returns {{state:'pending'}}
+     */
+    function toPendingState() {
+        return { state: 'pending' };
+    }
+
    //
-    // Utilities, etc.
+    // Internals, utilities, etc.
    //

    var reduceArray, slice, fcall, nextTick, handlerQueue,
-        timeout, funcProto, call, arrayProto, undef;
+        setTimeout, funcProto, call, arrayProto, monitorApi,
+        cjsRequire, undef;
+
+    cjsRequire = require;

    //
    // Shared handler queue processing
@ -826,20 +982,13 @@ define(function () {
|
||||
*/
|
||||
function enqueue(task) {
|
||||
if(handlerQueue.push(task) === 1) {
|
||||
scheduleDrainQueue();
|
||||
nextTick(drainQueue);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Schedule the queue to be drained in the next tick.
|
||||
*/
|
||||
function scheduleDrainQueue() {
|
||||
nextTick(drainQueue);
|
||||
}
|
||||
|
||||
/**
|
||||
* Drain the handler queue entirely or partially, being careful to allow
|
||||
* the queue to be extended while it is being processed, and to continue
|
||||
* Drain the handler queue entirely, being careful to allow the
|
||||
* queue to be extended while it is being processed, and to continue
|
||||
* processing until it is truly empty.
|
||||
*/
|
||||
function drainQueue() {
|
||||
@ -852,20 +1001,36 @@ define(function () {
|
||||
handlerQueue = [];
|
||||
}
//
// Capture function and array utils
//
/*global setImmediate:true*/
// capture setTimeout to avoid being caught by fake timers
// used in time based tests
setTimeout = global.setTimeout;

// capture setTimeout to avoid being caught by fake timers used in time based tests
timeout = setTimeout;
nextTick = typeof setImmediate === 'function'
? typeof window === 'undefined'
? setImmediate
: setImmediate.bind(window)
: typeof process === 'object'
? process.nextTick
: function(task) { timeout(task, 0); };
// Allow attaching the monitor to when() if env has no console
monitorApi = typeof console != 'undefined' ? console : when;

// Prefer setImmediate or MessageChannel, cascade to node,
// vertx and finally setTimeout
/*global setImmediate,MessageChannel,process*/
if (typeof setImmediate === 'function') {
nextTick = setImmediate.bind(global);
} else if(typeof MessageChannel !== 'undefined') {
var channel = new MessageChannel();
channel.port1.onmessage = drainQueue;
nextTick = function() { channel.port2.postMessage(0); };
} else if (typeof process === 'object' && process.nextTick) {
nextTick = process.nextTick;
} else {
try {
// vert.x 1.x || 2.x
nextTick = cjsRequire('vertx').runOnLoop || cjsRequire('vertx').runOnContext;
} catch(ignore) {
nextTick = function(t) { setTimeout(t, 0); };
}
}

//
// Capture/polyfill function and array utils
//

// Safe function calls
funcProto = Function.prototype;
@@ -926,43 +1091,13 @@ define(function () {
return reduced;
};

//
// Utility functions
//

/**
* Helper that checks arrayOfCallbacks to ensure that each element is either
* a function, or null or undefined.
* @private
* @param {number} start index at which to start checking items in arrayOfCallbacks
* @param {Array} arrayOfCallbacks array to check
* @throws {Error} if any element of arrayOfCallbacks is something other than
* a functions, null, or undefined.
*/
function checkCallbacks(start, arrayOfCallbacks) {
// TODO: Promises/A+ update type checking and docs
var arg, i = arrayOfCallbacks.length;

while(i > start) {
arg = arrayOfCallbacks[--i];

if (arg != null && typeof arg != 'function') {
throw new Error('arg '+i+' must be a function');
}
}
}

function noop() {}

function identity(x) {
return x;
}

return when;
});
})(
typeof define === 'function' && define.amd ? define : function (factory) { module.exports = factory(); }
);
})(typeof define === 'function' && define.amd ? define : function (factory) { module.exports = factory(require); }, this);

if (typeof module === "object" && typeof require === "function") {
var bane = require("bane");

4 mopidy/frontends/http/data/mopidy.min.js (vendored)
File diff suppressed because one or more lines are too long

@@ -18,15 +18,13 @@ class MpdFrontend(pykka.ThreadingActor, CoreListener):
hostname = network.format_hostname(config['mpd']['hostname'])
port = config['mpd']['port']

# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details.
try:
network.Server(
hostname, port,
protocol=session.MpdSession,
protocol_kwargs={
b'config': config,
b'core': core,
'config': config,
'core': core,
},
max_connections=config['mpd']['max_connections'],
timeout=config['mpd']['connection_timeout'])

@@ -236,6 +236,8 @@ class MpdContext(object):
#: The subsytems that we want to be notified about in idle mode.
subscriptions = None

_invalid_playlist_chars = re.compile(r'[\n\r/]')

def __init__(self, dispatcher, session=None, config=None, core=None):
self.dispatcher = dispatcher
self.session = session
@@ -248,10 +250,11 @@ class MpdContext(object):
self.refresh_playlists_mapping()

def create_unique_name(self, playlist_name):
name = playlist_name
stripped_name = self._invalid_playlist_chars.sub(' ', playlist_name)
name = stripped_name
i = 2
while name in self._playlist_uri_from_name:
name = '%s [%d]' % (playlist_name, i)
name = '%s [%d]' % (stripped_name, i)
i += 1
return name
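The create_unique_name() change above strips newline, carriage-return and slash characters before deduplicating, so the ` [2]`, ` [3]`, ... suffix is appended to the sanitized name rather than the raw one. A standalone sketch (the `existing_names` set stands in for the context's `_playlist_uri_from_name` mapping):

```python
import re

# Sketch of the sanitize-then-deduplicate logic from the diff above.
_INVALID_PLAYLIST_CHARS = re.compile(r'[\n\r/]')

def create_unique_name(playlist_name, existing_names):
    # Replace characters MPD cannot have in playlist names with spaces.
    stripped_name = _INVALID_PLAYLIST_CHARS.sub(' ', playlist_name)
    name = stripped_name
    i = 2
    # Append ' [2]', ' [3]', ... until the name no longer collides.
    while name in existing_names:
        name = '%s [%d]' % (stripped_name, i)
        i += 1
    return name
```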
@@ -266,6 +269,7 @@ class MpdContext(object):
for playlist in self.core.playlists.playlists.get():
if not playlist.name:
continue
# TODO: add scheme to name perhaps 'foo (spotify)' etc.
name = self.create_unique_name(playlist.name)
self._playlist_uri_from_name[name] = playlist.uri
self._playlist_name_from_uri[playlist.uri] = name

@@ -56,12 +56,7 @@ def handle_request(pattern, auth_required=True):
if match is not None:
mpd_commands.add(
MpdCommand(name=match.group(), auth_required=auth_required))
# NOTE Make pattern a bytestring to get bytestring keys in the dict
# returned from matches.groupdict(), which is again used as a **kwargs
# dict. This is needed to work on Python < 2.6.5.
# See https://github.com/mopidy/mopidy/issues/302 for details.
bytestring_pattern = pattern.encode('utf-8')
compiled_pattern = re.compile(bytestring_pattern, flags=re.UNICODE)
compiled_pattern = re.compile(pattern, flags=re.UNICODE)
if compiled_pattern in request_handlers:
raise ValueError('Tried to redefine handler for %s with %s' % (
pattern, func))
@@ -77,9 +72,7 @@ def load_protocol_modules():
The protocol modules must be imported to get them registered in
:attr:`request_handlers` and :attr:`mpd_commands`.
"""
# pylint: disable = W0612
from . import (  # noqa
audio_output, channels, command_list, connection, current_playlist,
empty, music_db, playback, reflection, status, stickers,
stored_playlists)
# pylint: enable = W0612

@@ -101,9 +101,6 @@ def deleteid(context, tlid):
Deletes the song ``SONGID`` from the playlist
"""
tlid = int(tlid)
tl_track = context.core.playback.current_tl_track.get()
if tl_track and tl_track.tlid == tlid:
context.core.playback.next()
tl_tracks = context.core.tracklist.remove(tlid=tlid).get()
if not tl_tracks:
raise MpdNoExistError('No such song', command='deleteid')

@@ -39,8 +39,8 @@ def _artist_as_track(artist):
artists=[artist])


@handle_request(r'^count "(?P<tag>[^"]+)" "(?P<needle>[^"]*)"$')
def count(context, tag, needle):
@handle_request(r'^count ' + QUERY_RE)
def count(context, mpd_query):
"""
*musicpd.org, music database section:*

@@ -48,6 +48,11 @@ def count(context, tag, needle):

Counts the number of songs and their total playtime in the db
matching ``TAG`` exactly.

*GMPC:*

- does not add quotes around the tag argument.
- use multiple tag-needle pairs to make more specific searches.
"""
return [('songs', 0), ('playtime', 0)]  # TODO

@@ -240,8 +245,9 @@ def _list_date(context, query):
return dates


@handle_request(r'^listall "(?P<uri>[^"]+)"')
def listall(context, uri):
@handle_request(r'^listall$')
@handle_request(r'^listall "(?P<uri>[^"]+)"$')
def listall(context, uri=None):
"""
*musicpd.org, music database section:*

@@ -252,8 +258,9 @@ def listall(context, uri):
raise MpdNotImplemented  # TODO


@handle_request(r'^listallinfo "(?P<uri>[^"]+)"')
def listallinfo(context, uri):
@handle_request(r'^listallinfo$')
@handle_request(r'^listallinfo "(?P<uri>[^"]+)"$')
def listallinfo(context, uri=None):
"""
*musicpd.org, music database section:*


@@ -10,6 +10,8 @@ from mopidy.frontends.mpd.exceptions import MpdArgError
from mopidy.models import TlTrack
from mopidy.utils.path import mtime as get_mtime, uri_to_path, split_path

# TODO: special handling of local:// uri scheme


def track_to_mpd_format(track, position=None):
"""
@@ -139,8 +141,6 @@ def query_from_mpd_list_format(field, mpd_query):
"""
Converts an MPD ``list`` query to a Mopidy query.
"""
# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details
if mpd_query is None:
return {}
try:
@@ -156,14 +156,14 @@ def query_from_mpd_list_format(field, mpd_query):
if field == 'album':
if not tokens[0]:
raise ValueError
return {b'artist': [tokens[0]]}  # See above NOTE
return {'artist': [tokens[0]]}
else:
raise MpdArgError(
'should be "Album" for 3 arguments', command='list')
elif len(tokens) % 2 == 0:
query = {}
while tokens:
key = str(tokens[0].lower())  # See above NOTE
key = tokens[0].lower()
value = tokens[1]
tokens = tokens[2:]
if key not in ('artist', 'album', 'date', 'genre'):
@@ -215,6 +215,7 @@ def query_from_mpd_search_format(mpd_query):
return query
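The token handling in query_from_mpd_list_format() above consumes an even-length token list as (key, value) pairs, lower-casing keys and rejecting unknown fields. A simplified, self-contained sketch of just that pairing step (collecting repeated keys into lists is my addition for illustration):

```python
# Simplified sketch of the key-value pairing from the diff above.
def pairs_to_query(tokens, allowed=('artist', 'album', 'date', 'genre')):
    if len(tokens) % 2 != 0:
        raise ValueError('tokens must come in key-value pairs')
    query = {}
    while tokens:
        key = tokens[0].lower()
        value = tokens[1]
        tokens = tokens[2:]
        if key not in allowed:
            raise ValueError('unknown field: %s' % key)
        # Allow the same field to appear more than once in a query.
        query.setdefault(key, []).append(value)
    return query
```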
# TODO: move to tagcache backend.
def tracks_to_tag_cache_format(tracks, media_dir):
"""
Format list of tracks for output to MPD tag cache
@@ -236,6 +237,7 @@ def tracks_to_tag_cache_format(tracks, media_dir):
_add_to_tag_cache(result, dirs, files, media_dir)
return result


# TODO: bytes only
def _add_to_tag_cache(result, dirs, files, media_dir):
base_path = media_dir.encode('utf-8')

@@ -34,7 +34,7 @@ class MprisFrontend(pykka.ThreadingActor, CoreListener):
self.mpris_object = objects.MprisObject(self.config, self.core)
self._send_startup_notification()
except Exception as e:
logger.error('MPRIS frontend setup failed (%s)', e)
logger.warning('MPRIS frontend setup failed (%s)', e)
self.stop()

def on_stop(self):

@@ -66,15 +66,13 @@ class ImmutableObject(object):
:type values: dict
:rtype: new instance of the model being copied
"""
# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details
data = {}
for key in self.__dict__.keys():
public_key = key.lstrip('_')
data[str(public_key)] = values.pop(public_key, self.__dict__[key])
data[public_key] = values.pop(public_key, self.__dict__[key])
for key in values.keys():
if hasattr(self, key):
data[str(key)] = values.pop(key)
data[key] = values.pop(key)
if values:
raise TypeError(
'copy() got an unexpected keyword argument "%s"' % key)
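The copy() hunk above implements "copy with overrides" for immutable models: start from the current field values, let keyword arguments override them, and reject names that do not correspond to a field. A minimal runnable sketch with a hypothetical `Model` class (Mopidy's real ImmutableObject also blocks attribute assignment, which is omitted here):

```python
# Sketch of the copy-with-overrides logic from the diff above.
class Model(object):
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

    def copy(self, **values):
        data = {}
        # Seed with current values, letting keyword arguments override.
        for key in self.__dict__.keys():
            public_key = key.lstrip('_')
            data[public_key] = values.pop(public_key, self.__dict__[key])
        for key in list(values.keys()):
            if hasattr(self, key):
                data[key] = values.pop(key)
        if values:
            # Anything left over names no known field.
            raise TypeError(
                'copy() got an unexpected keyword argument "%s"' % key)
        return self.__class__(**data)
```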
@@ -92,7 +90,7 @@ class ImmutableObject(object):
for v in value]
elif isinstance(value, ImmutableObject):
value = value.serialize()
if value:
if not (isinstance(value, list) and len(value) == 0):
data[public_key] = value
return data

@@ -127,15 +125,13 @@ def model_json_decoder(dct):
{u'a_track': Track(artists=[], name=u'name')}

"""
# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details.
if '__model__' in dct:
model_name = dct.pop('__model__')
cls = globals().get(model_name, None)
if issubclass(cls, ImmutableObject):
kwargs = {}
for key, value in dct.items():
kwargs[str(key)] = value
kwargs[key] = value
return cls(**kwargs)
return dct
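model_json_decoder() above is a `json.loads` object_hook: any dict carrying a `__model__` marker is turned back into a model instance, with the remaining keys passed as keyword arguments. A self-contained sketch (the `MODELS` dict and the tiny `Artist` class are stand-ins for the `globals()` lookup and Mopidy's models; on modern Python the `str(key)` workaround the diff removes is unnecessary, which is the point of the change):

```python
import json

# Stand-in model class and registry for the sketch.
class Artist(object):
    def __init__(self, name=None):
        self.name = name

MODELS = {'Artist': Artist}

def model_decoder(dct):
    # Rebuild a model instance from a marked dict; pass other dicts through.
    if '__model__' in dct:
        cls = MODELS.get(dct.pop('__model__'))
        if cls is not None:
            return cls(**dct)
    return dct
```

Usage: `json.loads('{"__model__": "Artist", "name": "x"}', object_hook=model_decoder)` yields an `Artist` instance instead of a plain dict.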
@@ -208,10 +204,8 @@ class Album(ImmutableObject):
# actual usage of this field with more than one image.

def __init__(self, *args, **kwargs):
# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details
self.__dict__[b'artists'] = frozenset(kwargs.pop('artists', []))
self.__dict__[b'images'] = frozenset(kwargs.pop('images', []))
self.__dict__['artists'] = frozenset(kwargs.pop('artists', []))
self.__dict__['images'] = frozenset(kwargs.pop('images', []))
super(Album, self).__init__(*args, **kwargs)


@@ -237,6 +231,8 @@ class Track(ImmutableObject):
:type bitrate: integer
:param musicbrainz_id: MusicBrainz ID
:type musicbrainz_id: string
:param last_modified: Represents last modification time
:type last_modified: integer
"""

#: The track URI. Read-only.
@@ -269,10 +265,13 @@ class Track(ImmutableObject):
#: The MusicBrainz ID of the track. Read-only.
musicbrainz_id = None

#: Integer representing when the track was last modified, exact meaning
#: depends on source of track. For local files this is the mtime, for other
#: backends it could be a timestamp or simply a version counter.
last_modified = 0

def __init__(self, *args, **kwargs):
# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details
self.__dict__[b'artists'] = frozenset(kwargs.pop('artists', []))
self.__dict__['artists'] = frozenset(kwargs.pop('artists', []))
super(Track, self).__init__(*args, **kwargs)


@@ -304,11 +303,9 @@ class TlTrack(ImmutableObject):
track = None

def __init__(self, *args, **kwargs):
# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details
if len(args) == 2 and len(kwargs) == 0:
kwargs[b'tlid'] = args[0]
kwargs[b'track'] = args[1]
kwargs['tlid'] = args[0]
kwargs['track'] = args[1]
args = []
super(TlTrack, self).__init__(*args, **kwargs)

@@ -343,9 +340,7 @@ class Playlist(ImmutableObject):
last_modified = None

def __init__(self, *args, **kwargs):
# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details
self.__dict__[b'tracks'] = tuple(kwargs.pop('tracks', []))
self.__dict__['tracks'] = tuple(kwargs.pop('tracks', []))
super(Playlist, self).__init__(*args, **kwargs)

# TODO: def insert(self, pos, track): ... ?
@@ -381,9 +376,7 @@ class SearchResult(ImmutableObject):
albums = tuple()

def __init__(self, *args, **kwargs):
# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details
self.__dict__[b'tracks'] = tuple(kwargs.pop('tracks', []))
self.__dict__[b'artists'] = tuple(kwargs.pop('artists', []))
self.__dict__[b'albums'] = tuple(kwargs.pop('albums', []))
self.__dict__['tracks'] = tuple(kwargs.pop('tracks', []))
self.__dict__['artists'] = tuple(kwargs.pop('artists', []))
self.__dict__['albums'] = tuple(kwargs.pop('albums', []))
super(SearchResult, self).__init__(*args, **kwargs)

@@ -1,8 +1,8 @@
from __future__ import unicode_literals

import argparse
import datetime
import logging
import optparse
import os
import sys

@@ -27,13 +27,12 @@ pygst.require('0.10')
import gst

from mopidy import config as config_lib, ext
from mopidy.frontends.mpd import translator as mpd_translator
from mopidy.models import Track, Artist, Album
from mopidy.utils import log, path, versioning


def main():
options = parse_options()
args = parse_args()
# TODO: support config files and overrides (shared from main?)
config_files = [b'/etc/mopidy/mopidy.conf',
b'$XDG_CONFIG_DIR/mopidy/mopidy.conf']
@@ -43,7 +42,7 @@ def main():
# Initial config without extensions to bootstrap logging.
logging_config, _ = config_lib.load(config_files, [], config_overrides)
log.setup_root_logger()
log.setup_console_logging(logging_config, options.verbosity_level)
log.setup_console_logging(logging_config, args.verbosity_level)

extensions = ext.load_extensions()
config, errors = config_lib.load(
@@ -54,70 +53,109 @@ def main():
logging.warning('Config value local/media_dir is not set.')
return

# TODO: missing error checking and other default setup code.
if not config['local']['scan_timeout']:
logging.warning('Config value local/scan_timeout is not set.')
return

tracks = []
# TODO: missing config error checking and other default setup code.

updaters = {}
for e in extensions:
for updater_class in e.get_library_updaters():
if updater_class and 'local' in updater_class.uri_schemes:
updaters[e.ext_name] = updater_class

if not updaters:
logging.error('No usable library updaters found.')
return
elif len(updaters) > 1:
logging.error('More than one library updater found. '
'Provided by: %s', ', '.join(updaters.keys()))
return

local_updater = updaters.values()[0](config)  # TODO: switch to actor?

media_dir = config['local']['media_dir']

uris_library = set()
uris_update = set()
uris_remove = set()

logging.info('Checking tracks from library.')
for track in local_updater.load():
try:
stat = os.stat(path.uri_to_path(track.uri))
if int(stat.st_mtime) > track.last_modified:
uris_update.add(track.uri)
uris_library.add(track.uri)
except OSError:
uris_remove.add(track.uri)

logging.info('Removing %d moved or deleted tracks.', len(uris_remove))
for uri in uris_remove:
local_updater.remove(uri)

logging.info('Checking %s for new or modified tracks.', media_dir)
for uri in path.find_uris(config['local']['media_dir']):
if uri not in uris_library:
uris_update.add(uri)

logging.info('Found %d new or modified tracks.', len(uris_update))
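The scan bookkeeping above partitions the known library into tracks that are still valid, tracks whose file changed since the recorded `last_modified`, and tracks whose file disappeared, then adds any on-disk URI the library has never seen. The same logic as a pure function (the `on_disk_mtimes` dict stands in for the `os.stat()` calls on real files):

```python
# Sketch of the incremental-scan planning from the diff above.
def plan_update(library, on_disk_mtimes):
    """library: uri -> last_modified; on_disk_mtimes: uri -> current mtime."""
    uris_library = set()
    uris_update = set()
    uris_remove = set()
    for uri, last_modified in library.items():
        if uri not in on_disk_mtimes:
            uris_remove.add(uri)        # file moved or deleted
            continue
        if on_disk_mtimes[uri] > last_modified:
            uris_update.add(uri)        # file changed since last scan
        uris_library.add(uri)
    for uri in on_disk_mtimes:
        if uri not in uris_library:
            uris_update.add(uri)        # brand new file
    return uris_update, uris_remove
```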
def store(data):
track = translator(data)
tracks.append(track)
local_updater.add(track)
logging.debug('Added %s', track.uri)

def debug(uri, error, debug):
logging.warning('Failed %s: %s', uri, error)
logging.debug('Debug info for %s: %s', uri, debug)

logging.info('Scanning %s', config['local']['media_dir'])
scan_timeout = config['local']['scan_timeout']

scanner = Scanner(config['local']['media_dir'], store, debug)
logging.info('Scanning new and modified tracks.')
# TODO: just pass the library in instead?
scanner = Scanner(uris_update, store, debug, scan_timeout)
try:
scanner.start()
except KeyboardInterrupt:
scanner.stop()
raise

logging.info('Done scanning; writing tag cache...')

for row in mpd_translator.tracks_to_tag_cache_format(
tracks, config['local']['media_dir']):
if len(row) == 1:
print ('%s' % row).encode('utf-8')
else:
print ('%s: %s' % row).encode('utf-8')

logging.info('Done writing tag cache')
logging.info('Done scanning; commiting changes.')
local_updater.commit()


def parse_options():
parser = optparse.OptionParser(
def parse_args():
parser = argparse.ArgumentParser()
parser.add_argument(
'--version', action='version',
version='Mopidy %s' % versioning.get_version())
# NOTE First argument to add_option must be bytestrings on Python < 2.6.2
# See https://github.com/mopidy/mopidy/issues/302 for details
parser.add_option(
b'-q', '--quiet',
parser.add_argument(
'-q', '--quiet',
action='store_const', const=0, dest='verbosity_level',
help='less output (warning level)')
parser.add_option(
b'-v', '--verbose',
parser.add_argument(
'-v', '--verbose',
action='count', default=1, dest='verbosity_level',
help='more output (debug level)')
return parser.parse_args(args=mopidy_args)[0]
return parser.parse_args(args=mopidy_args)
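The optparse-to-argparse move above keeps both flags writing into one `verbosity_level` destination: `-q` stores a constant and `-v` counts occurrences. A runnable sketch of that setup (the `prog` and version strings are placeholders; I also give `-q` an explicit `default=1` so the shared default is deterministic regardless of which action argparse consults first, a detail the diff leaves implicit):

```python
import argparse

# Sketch of the -q/-v verbosity handling from the diff above.
def build_parser():
    parser = argparse.ArgumentParser(prog='mopidy-scan')
    parser.add_argument(
        '--version', action='version', version='Mopidy X.Y')
    parser.add_argument(
        '-q', '--quiet',
        action='store_const', const=0, default=1, dest='verbosity_level',
        help='less output (warning level)')
    parser.add_argument(
        '-v', '--verbose',
        action='count', default=1, dest='verbosity_level',
        help='more output (debug level)')
    return parser
```

With no flags the level is 1 (info), `-q` forces 0, and each `-v` adds one on top of the default.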
# TODO: move into scanner.
def translator(data):
albumartist_kwargs = {}
album_kwargs = {}
artist_kwargs = {}
track_kwargs = {}

# NOTE kwargs dict keys must be bytestrings to work on Python < 2.6.5
# See https://github.com/mopidy/mopidy/issues/302 for details.

def _retrieve(source_key, target_key, target):
if source_key in data:
target[str(target_key)] = data[source_key]
target[target_key] = data[source_key]

_retrieve(gst.TAG_ALBUM, 'name', album_kwargs)
_retrieve(gst.TAG_TRACK_COUNT, 'num_tracks', album_kwargs)
_retrieve(gst.TAG_ALBUM_VOLUME_COUNT, 'num_discs', album_kwargs)
_retrieve(gst.TAG_ARTIST, 'name', artist_kwargs)

if gst.TAG_DATE in data and data[gst.TAG_DATE]:
@@ -127,10 +165,11 @@ def translator(data):
except ValueError:
pass  # Ignore invalid dates
else:
track_kwargs[b'date'] = date.isoformat()
track_kwargs['date'] = date.isoformat()

_retrieve(gst.TAG_TITLE, 'name', track_kwargs)
_retrieve(gst.TAG_TRACK_NUMBER, 'track_no', track_kwargs)
_retrieve(gst.TAG_ALBUM_VOLUME_NUMBER, 'disc_no', track_kwargs)

# Following keys don't seem to have TAG_* constant.
_retrieve('album-artist', 'name', albumartist_kwargs)
@@ -141,23 +180,27 @@ def translator(data):
'musicbrainz-albumartistid', 'musicbrainz_id', albumartist_kwargs)

if albumartist_kwargs:
album_kwargs[b'artists'] = [Artist(**albumartist_kwargs)]
album_kwargs['artists'] = [Artist(**albumartist_kwargs)]

track_kwargs[b'uri'] = data['uri']
track_kwargs[b'length'] = data[gst.TAG_DURATION]
track_kwargs[b'album'] = Album(**album_kwargs)
track_kwargs[b'artists'] = [Artist(**artist_kwargs)]
track_kwargs['uri'] = data['uri']
track_kwargs['last_modified'] = int(data['mtime'])
track_kwargs['length'] = data[gst.TAG_DURATION]
track_kwargs['album'] = Album(**album_kwargs)
track_kwargs['artists'] = [Artist(**artist_kwargs)]

return Track(**track_kwargs)
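The translator() above follows a simple pattern: a nested `_retrieve` helper copies selected tag keys from the scanner's metadata dict into per-model kwargs dicts, then the track is built from them. A standalone sketch of the pattern (the string keys `'album'`, `'title'`, `'track-number'` are illustrative stand-ins for the `gst.TAG_*` constants, and plain dicts stand in for the Mopidy model classes):

```python
# Sketch of the tag-to-kwargs translation pattern from the diff above.
def translate(data):
    album_kwargs = {}
    track_kwargs = {}

    def _retrieve(source_key, target_key, target):
        # Copy a tag into the target kwargs dict only if it is present.
        if source_key in data:
            target[target_key] = data[source_key]

    _retrieve('album', 'name', album_kwargs)
    _retrieve('title', 'name', track_kwargs)
    _retrieve('track-number', 'track_no', track_kwargs)

    track_kwargs['uri'] = data['uri']
    track_kwargs['last_modified'] = int(data['mtime'])
    track_kwargs['album'] = album_kwargs
    return track_kwargs
```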
class Scanner(object):
def __init__(self, base_dir, data_callback, error_callback=None):
def __init__(
self, uris, data_callback, error_callback=None, scan_timeout=1000):
self.data = {}
self.files = path.find_files(base_dir)
self.uris = iter(uris)
self.data_callback = data_callback
self.error_callback = error_callback
self.scan_timeout = scan_timeout
self.loop = gobject.MainLoop()
self.timeout_id = None

self.fakesink = gst.element_factory_make('fakesink')
self.fakesink.set_property('signal-handoffs', True)
@@ -198,7 +241,9 @@ class Scanner(object):
if message.structure.get_name() != 'handoff':
return

self.data['uri'] = unicode(self.uribin.get_property('uri'))
uri = unicode(self.uribin.get_property('uri'))
self.data['uri'] = uri
self.data['mtime'] = os.path.getmtime(path.uri_to_path(uri))
self.data[gst.TAG_DURATION] = self.get_duration()

try:
@@ -226,6 +271,14 @@ class Scanner(object):
self.error_callback(uri, error, debug)
self.next_uri()

def process_timeout(self):
if self.error_callback:
uri = self.uribin.get_property('uri')
self.error_callback(
uri, 'Scan timed out after %d ms' % self.scan_timeout, None)
self.next_uri()
return False

def get_duration(self):
self.pipe.get_state()  # Block until state change is done.
try:
@@ -236,13 +289,18 @@ class Scanner(object):

def next_uri(self):
self.data = {}
if self.timeout_id:
gobject.source_remove(self.timeout_id)
self.timeout_id = None
try:
uri = path.path_to_uri(self.files.next())
uri = next(self.uris)
except StopIteration:
self.stop()
return False
self.pipe.set_state(gst.STATE_NULL)
self.uribin.set_property('uri', uri)
self.timeout_id = gobject.timeout_add(
self.scan_timeout, self.process_timeout)
self.pipe.set_state(gst.STATE_PLAYING)
return True
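The new scan timeout above pairs `gobject.timeout_add()` with `gobject.source_remove()`: a timer is armed before each URI is played and cancelled in next_uri() when the item finishes first. A sketch of the same arm/cancel pattern using `threading.Timer` (an assumption for portability; the real Scanner runs inside a GLib main loop):

```python
import threading

# Sketch of the per-item timeout pattern from the diff above.
class TimeoutGuard(object):
    def __init__(self, timeout_ms, on_timeout):
        self.timeout_s = timeout_ms / 1000.0
        self.on_timeout = on_timeout
        self.timer = None

    def arm(self):
        # Cancel any previous timer, then start the clock for the next item.
        self.cancel()
        self.timer = threading.Timer(self.timeout_s, self.on_timeout)
        self.timer.start()

    def cancel(self):
        # Called when the item finishes before the deadline.
        if self.timer is not None:
            self.timer.cancel()
            self.timer = None
```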
@@ -3,7 +3,6 @@ from __future__ import unicode_literals
import functools
import os
import platform
import sys

import pygst
pygst.require('0.10')
@@ -14,17 +13,6 @@ import pkg_resources
from . import formatting


def show_deps_optparse_callback(*args):
"""
Prints a list of all dependencies.

Called by optparse when Mopidy is run with the :option:`--show-deps`
option.
"""
print format_dependency_list()
sys.exit(0)


def format_dependency_list(adapters=None):
if adapters is None:
dist_names = set([

@@ -34,10 +34,10 @@ def setup_root_logger():


def setup_console_logging(config, verbosity_level):
if verbosity_level == 0:
if verbosity_level == -1:
log_level = logging.WARNING
log_format = config['logging']['console_format']
elif verbosity_level >= 2:
elif verbosity_level >= 1:
log_level = logging.DEBUG
log_format = config['logging']['debug_format']
else:

@@ -2,12 +2,9 @@ from __future__ import unicode_literals

import logging
import os
import re
# pylint: disable = W0402
import string
# pylint: enable = W0402
import sys
import urllib
import urlparse

import glib

@@ -51,7 +48,7 @@ def get_or_create_file(file_path):
return file_path


def path_to_uri(*paths):
def path_to_uri(path):
"""
Convert OS specific path to file:// URI.

@@ -61,17 +58,15 @@ def path_to_uri(*paths):

Returns a file:// URI as an unicode string.
"""
path = os.path.join(*paths)
if isinstance(path, unicode):
path = path.encode('utf-8')
if sys.platform == 'win32':
return 'file:' + urllib.quote(path)
return 'file://' + urllib.quote(path)
path = urllib.quote(path)
return urlparse.urlunsplit((b'file', b'', path, b'', b''))


def uri_to_path(uri):
"""
Convert the file:// to a OS specific path.
Convert an URI to a OS specific path.

Returns a bytestring, since the file path can contain chars with other
encoding than UTF-8.
@@ -82,10 +77,7 @@ def uri_to_path(uri):
"""
if isinstance(uri, unicode):
uri = uri.encode('utf-8')
if sys.platform == 'win32':
return urllib.unquote(re.sub(b'^file:', b'', uri))
else:
return urllib.unquote(re.sub(b'^file://', b'', uri))
return urllib.unquote(urlparse.urlsplit(uri).path)


def split_path(path):
@@ -141,6 +133,11 @@ def find_files(path):
yield os.path.join(dirpath, filename)


def find_uris(path):
for p in find_files(path):
yield path_to_uri(p)
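The path utility hunks above replace hand-rolled `'file://' + quote(path)` string building and regex stripping with proper URL functions: compose with urlunsplit, and recover the path with urlsplit plus unquote. A Python 3 sketch of the same round trip (the diff itself targets Python 2's `urllib`/`urlparse` modules):

```python
from urllib.parse import quote, unquote, urlsplit, urlunsplit

# Sketch of the urlunsplit/urlsplit round trip from the diff above.
def path_to_uri(path):
    # Percent-encode the path and let urlunsplit assemble the file URI.
    return urlunsplit(('file', '', quote(path), '', ''))

def uri_to_path(uri):
    # urlsplit isolates the path component regardless of 'file:' vs 'file://'.
    return unquote(urlsplit(uri).path)
```

Splitting instead of regex-stripping the scheme is what lets one code path serve both `file:/...` and `file://...` forms, which the old per-platform branches handled separately.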
|
||||
|
||||
|
||||
def check_file_path_is_inside_base_dir(file_path, base_path):
|
||||
assert not file_path.endswith(os.sep), (
|
||||
'File path %s cannot end with a path separator' % file_path)
|
||||
|
||||
@ -14,11 +14,9 @@ def get_version():
|
||||
|
||||
def get_git_version():
|
||||
process = Popen(['git', 'describe'], stdout=PIPE, stderr=PIPE)
|
||||
# pylint: disable = E1101
|
||||
if process.wait() != 0:
|
||||
raise EnvironmentError('Execution of "git describe" failed')
|
||||
version = process.stdout.read().strip()
|
||||
# pylint: enable = E1101
|
||||
if version.startswith('v'):
|
||||
version = version[1:]
|
||||
return version
|
||||
|
||||
21
pylintrc
21
pylintrc
@ -1,21 +0,0 @@
|
||||
[MESSAGES CONTROL]
|
||||
#
|
||||
# Disabled messages
|
||||
# -----------------
|
||||
#
|
||||
# C0103 - Invalid name "%s" (should match %s)
|
||||
# C0111 - Missing docstring
|
||||
# R0201 - Method could be a function
|
||||
# R0801 - Similar lines in %s files
|
||||
# R0902 - Too many instance attributes (%s/%s)
|
||||
# R0903 - Too few public methods (%s/%s)
|
||||
# R0904 - Too many public methods (%s/%s)
|
||||
# R0912 - Too many branches (%s/%s)
|
||||
# R0913 - Too many arguments (%s/%s)
|
||||
# R0921 - Abstract class not referenced
|
||||
# W0141 - Used builtin function '%s'
|
||||
# W0142 - Used * or ** magic
|
||||
# W0511 - TODO, FIXME and XXX in the code
|
||||
# W0613 - Unused argument %r
|
||||
#
|
||||
disable = C0103,C0111,R0201,R0801,R0902,R0903,R0904,R0912,R0913,R0921,W0141,W0142,W0511,W0613
|
||||
@ -1,2 +0,0 @@
|
||||
pyserial
|
||||
# Available as python-serial in Debian/Ubuntu
|
||||
@ -2,4 +2,5 @@ cherrypy >= 3.2.2
|
||||
# Available as python-cherrypy3 in Debian/Ubuntu
|
||||
|
||||
ws4py >= 0.2.3
|
||||
# Available as python-ws4py from apt.mopidy.com
|
||||
# Available as python-ws4py in newer Debian/Ubuntu and from apt.mopidy.com for
|
||||
# older releases of Debian/Ubuntu
|
||||
|
||||
@@ -2,6 +2,3 @@ coverage
flake8
mock >= 1.0
nose
pylint
tox
unittest2

@@ -1,6 +0,0 @@
[nosetests]
verbosity = 1
#with-coverage = 1
cover-package = mopidy
cover-inclusive = 1
cover-html = 1
2
setup.py
@@ -36,7 +36,6 @@ setup(
    tests_require=[
        'nose',
        'mock >= 1.0',
        'unittest2',
    ],
    entry_points={
        'console_scripts': [
@@ -61,7 +60,6 @@ setup(
        'License :: OSI Approved :: Apache Software License',
        'Operating System :: MacOS :: MacOS X',
        'Operating System :: POSIX :: Linux',
        'Programming Language :: Python :: 2.6',
        'Programming Language :: Python :: 2.7',
        'Topic :: Multimedia :: Sound/Audio :: Players',
    ],

@@ -1,17 +1,11 @@
from __future__ import unicode_literals

import os
import sys

if sys.version_info < (2, 7):
    import unittest2 as unittest
else:
    import unittest  # noqa


def path_to_data_dir(name):
    if not isinstance(name, bytes):
        name = name.encode(sys.getfilesystemencoding())
        name = name.encode('utf-8')
    path = os.path.dirname(__file__)
    path = os.path.join(path, b'data')
    path = os.path.abspath(path)

@@ -1,5 +1,7 @@
from __future__ import unicode_literals

import unittest

import pygst
pygst.require('0.10')
import gst
@@ -9,7 +11,7 @@ import pykka
from mopidy import audio
from mopidy.utils.path import path_to_uri

from tests import unittest, path_to_data_dir
from tests import path_to_data_dir


class AudioTest(unittest.TestCase):
@@ -19,6 +21,7 @@ class AudioTest(unittest.TestCase):
                'mixer': 'fakemixer track_max_volume=65536',
                'mixer_track': None,
                'output': 'fakesink',
                'visualizer': None,
            }
        }
        self.song_uri = path_to_uri(path_to_data_dir('song1.wav'))
@@ -68,6 +71,7 @@ class AudioTest(unittest.TestCase):
                'mixer': 'fakemixer track_max_volume=40',
                'mixer_track': None,
                'output': 'fakesink',
                'visualizer': None,
            }
        }
        self.audio = audio.Audio.start(config=config).proxy()

@@ -1,11 +1,10 @@
from __future__ import unicode_literals

import mock
import unittest

from mopidy import audio

from tests import unittest


class AudioListenerTest(unittest.TestCase):
    def setUp(self):

@@ -1,12 +1,12 @@
from __future__ import unicode_literals

import unittest

import pykka

from mopidy import core
from mopidy.models import Track, Album, Artist

from tests import unittest, path_to_data_dir


class LibraryControllerTest(object):
    artists = [Artist(name='artist1'), Artist(name='artist2'), Artist()]
@@ -15,13 +15,10 @@ class LibraryControllerTest(object):
        Album(name='album2', artists=artists[1:2]),
        Album()]
    tracks = [
        Track(
            uri='file://' + path_to_data_dir('uri1'), name='track1',
            artists=artists[:1], album=albums[0], date='2001-02-03',
            length=4000),
        Track(
            uri='file://' + path_to_data_dir('uri2'), name='track2',
            artists=artists[1:2], album=albums[1], date='2002', length=4000),
        Track(uri='local:track:path1', name='track1', artists=artists[:1],
              album=albums[0], date='2001-02-03', length=4000),
        Track(uri='local:track:path2', name='track2', artists=artists[1:2],
              album=albums[1], date='2002', length=4000),
        Track()]
    config = {}

@@ -64,11 +61,11 @@
        self.assertEqual(list(result[0].tracks), [])

    def test_find_exact_uri(self):
        track_1_uri = 'file://' + path_to_data_dir('uri1')
        track_1_uri = 'local:track:path1'
        result = self.library.find_exact(uri=track_1_uri)
        self.assertEqual(list(result[0].tracks), self.tracks[:1])

        track_2_uri = 'file://' + path_to_data_dir('uri2')
        track_2_uri = 'local:track:path2'
        result = self.library.find_exact(uri=track_2_uri)
        self.assertEqual(list(result[0].tracks), self.tracks[1:2])

@@ -134,10 +131,10 @@ class LibraryControllerTest(object):
        self.assertEqual(list(result[0].tracks), [])

    def test_search_uri(self):
        result = self.library.search(uri=['RI1'])
        result = self.library.search(uri=['TH1'])
        self.assertEqual(list(result[0].tracks), self.tracks[:1])

        result = self.library.search(uri=['RI2'])
        result = self.library.search(uri=['TH2'])
        self.assertEqual(list(result[0].tracks), self.tracks[1:2])

    def test_search_track(self):
@@ -181,7 +178,7 @@ class LibraryControllerTest(object):
        self.assertEqual(list(result[0].tracks), self.tracks[:1])
        result = self.library.search(any=['Bum1'])
        self.assertEqual(list(result[0].tracks), self.tracks[:1])
        result = self.library.search(any=['RI1'])
        result = self.library.search(any=['TH1'])
        self.assertEqual(list(result[0].tracks), self.tracks[:1])

    def test_search_wrong_type(self):

@@ -3,6 +3,7 @@ from __future__ import unicode_literals
import mock
import random
import time
import unittest

import pykka

@@ -10,7 +11,6 @@ from mopidy import audio, core
from mopidy.core import PlaybackState
from mopidy.models import Track

from tests import unittest
from tests.backends.base import populate_tracklist

# TODO Test 'playlist repeat', e.g. repeat=1,single=0

@@ -1,12 +1,12 @@
from __future__ import unicode_literals

import unittest

import pykka

from mopidy import audio, core
from mopidy.models import Playlist

from tests import unittest


class PlaylistsControllerTest(object):
    config = {}

@@ -1,11 +1,10 @@
from __future__ import unicode_literals

import mock
import unittest

from mopidy.backends.listener import BackendListener

from tests import unittest


class BackendListenerTest(unittest.TestCase):
    def setUp(self):

@@ -1,9 +1,4 @@
from __future__ import unicode_literals

from mopidy.utils.path import path_to_uri

from tests import path_to_data_dir


song = path_to_data_dir('song%s.wav')
generate_song = lambda i: path_to_uri(song % i)
generate_song = lambda i: 'local:track:song%s.wav' % i

@@ -1,8 +1,10 @@
from __future__ import unicode_literals

import unittest

from mopidy.backends.local import actor

from tests import unittest, path_to_data_dir
from tests import path_to_data_dir
from tests.backends.base import events


@@ -1,8 +1,10 @@
from __future__ import unicode_literals

import unittest

from mopidy.backends.local import actor

from tests import unittest, path_to_data_dir
from tests import path_to_data_dir
from tests.backends.base.library import LibraryControllerTest


@@ -1,11 +1,12 @@
from __future__ import unicode_literals

import unittest

from mopidy.backends.local import actor
from mopidy.core import PlaybackState
from mopidy.models import Track
from mopidy.utils.path import path_to_uri

from tests import unittest, path_to_data_dir
from tests import path_to_data_dir
from tests.backends.base.playback import PlaybackControllerTest
from tests.backends.local import generate_song

@@ -22,25 +23,25 @@ class LocalPlaybackControllerTest(PlaybackControllerTest, unittest.TestCase):
    tracks = [
        Track(uri=generate_song(i), length=4464) for i in range(1, 4)]

    def add_track(self, path):
        uri = path_to_uri(path_to_data_dir(path))
    def add_track(self, uri):
        track = Track(uri=uri, length=4464)
        self.tracklist.add([track])

    def test_uri_scheme(self):
        self.assertIn('file', self.core.uri_schemes)
        self.assertNotIn('file', self.core.uri_schemes)
        self.assertIn('local', self.core.uri_schemes)

    def test_play_mp3(self):
        self.add_track('blank.mp3')
        self.add_track('local:track:blank.mp3')
        self.playback.play()
        self.assertEqual(self.playback.state, PlaybackState.PLAYING)

    def test_play_ogg(self):
        self.add_track('blank.ogg')
        self.add_track('local:track:blank.ogg')
        self.playback.play()
        self.assertEqual(self.playback.state, PlaybackState.PLAYING)

    def test_play_flac(self):
        self.add_track('blank.flac')
        self.add_track('local:track:blank.flac')
        self.playback.play()
        self.assertEqual(self.playback.state, PlaybackState.PLAYING)

@@ -3,12 +3,12 @@ from __future__ import unicode_literals
import os
import shutil
import tempfile
import unittest

from mopidy.backends.local import actor
from mopidy.models import Track
from mopidy.utils.path import path_to_uri

from tests import unittest, path_to_data_dir
from tests import path_to_data_dir
from tests.backends.base.playlists import (
    PlaylistsControllerTest)
from tests.backends.local import generate_song
@@ -88,21 +88,18 @@ class LocalPlaylistsControllerTest(

    def test_playlist_contents_is_written_to_disk(self):
        track = Track(uri=generate_song(1))
        track_path = track.uri[len('file://'):]
        playlist = self.core.playlists.create('test')
        playlist_path = playlist.uri[len('file://'):]
        playlist_path = os.path.join(self.playlists_dir, 'test.m3u')
        playlist = playlist.copy(tracks=[track])
        playlist = self.core.playlists.save(playlist)

        with open(playlist_path) as playlist_file:
            contents = playlist_file.read()

        self.assertEqual(track_path, contents.strip())
        self.assertEqual(track.uri, contents.strip())

    def test_playlists_are_loaded_at_startup(self):
        playlist_path = os.path.join(self.playlists_dir, 'test.m3u')

        track = Track(uri=path_to_uri(path_to_data_dir('uri2')))
        track = Track(uri='local:track:path2')
        playlist = self.core.playlists.create('test')
        playlist = playlist.copy(tracks=[track])
        playlist = self.core.playlists.save(playlist)
@@ -111,8 +108,7 @@ class LocalPlaylistsControllerTest(

        self.assert_(backend.playlists.playlists)
        self.assertEqual(
            path_to_uri(playlist_path),
            backend.playlists.playlists[0].uri)
            'local:playlist:test', backend.playlists.playlists[0].uri)
        self.assertEqual(
            playlist.name, backend.playlists.playlists[0].name)
        self.assertEqual(

@@ -1,9 +1,11 @@
from __future__ import unicode_literals

import unittest

from mopidy.backends.local import actor
from mopidy.models import Track

from tests import unittest, path_to_data_dir
from tests import path_to_data_dir
from tests.backends.base.tracklist import TracklistControllerTest
from tests.backends.local import generate_song


@@ -4,12 +4,13 @@ from __future__ import unicode_literals

import os
import tempfile
import unittest

from mopidy.backends.local.translator import parse_m3u, parse_mpd_tag_cache
from mopidy.models import Track, Artist, Album
from mopidy.utils.path import path_to_uri

from tests import unittest, path_to_data_dir
from tests import path_to_data_dir

data_dir = path_to_data_dir('')
song1_path = path_to_data_dir('song1.mp3')
@@ -97,10 +98,11 @@ expected_tracks = []


def generate_track(path, ident):
    uri = path_to_uri(path_to_data_dir(path))
    uri = 'local:track:%s' % path
    track = Track(
        uri=uri, name='trackname', artists=expected_artists,
        album=expected_albums[0], track_no=1, date='2006', length=4000)
        album=expected_albums[0], track_no=1, date='2006', length=4000,
        last_modified=1272319626)
    expected_tracks.append(track)


@@ -124,10 +126,10 @@ class MPDTagCacheToTracksTest(unittest.TestCase):
    def test_simple_cache(self):
        tracks = parse_mpd_tag_cache(
            path_to_data_dir('simple_tag_cache'), path_to_data_dir(''))
        uri = path_to_uri(path_to_data_dir('song1.mp3'))
        track = Track(
            uri=uri, name='trackname', artists=expected_artists, track_no=1,
            album=expected_albums[0], date='2006', length=4000)
            uri='local:track:song1.mp3', name='trackname',
            artists=expected_artists, track_no=1, album=expected_albums[0],
            date='2006', length=4000, last_modified=1272319626)
        self.assertEqual(set([track]), tracks)

    def test_advanced_cache(self):
@@ -139,11 +141,11 @@ class MPDTagCacheToTracksTest(unittest.TestCase):
        tracks = parse_mpd_tag_cache(
            path_to_data_dir('utf8_tag_cache'), path_to_data_dir(''))

        uri = path_to_uri(path_to_data_dir('song1.mp3'))
        artists = [Artist(name='æøå')]
        album = Album(name='æøå', artists=artists)
        track = Track(
            uri=uri, name='æøå', artists=artists, album=album, length=4000)
            uri='local:track:song1.mp3', name='æøå', artists=artists,
            album=album, length=4000, last_modified=1272319626)

        self.assertEqual(track, list(tracks)[0])

@@ -155,8 +157,9 @@ class MPDTagCacheToTracksTest(unittest.TestCase):
    def test_cache_with_blank_track_info(self):
        tracks = parse_mpd_tag_cache(
            path_to_data_dir('blank_tag_cache'), path_to_data_dir(''))
        uri = path_to_uri(path_to_data_dir('song1.mp3'))
        self.assertEqual(set([Track(uri=uri, length=4000)]), tracks)
        expected = Track(
            uri='local:track:song1.mp3', length=4000, last_modified=1272319626)
        self.assertEqual(set([expected]), tracks)

    def test_musicbrainz_tagcache(self):
        tracks = parse_mpd_tag_cache(
@@ -178,10 +181,10 @@ class MPDTagCacheToTracksTest(unittest.TestCase):
    def test_albumartist_tag_cache(self):
        tracks = parse_mpd_tag_cache(
            path_to_data_dir('albumartist_tag_cache'), path_to_data_dir(''))
        uri = path_to_uri(path_to_data_dir('song1.mp3'))
        artist = Artist(name='albumartistname')
        album = expected_albums[0].copy(artists=[artist])
        track = Track(
            uri=uri, name='trackname', artists=expected_artists, track_no=1,
            album=album, date='2006', length=4000)
            uri='local:track:song1.mp3', name='trackname',
            artists=expected_artists, track_no=1, album=album, date='2006',
            length=4000, last_modified=1272319626)
        self.assertEqual(track, list(tracks)[0])

44
tests/commands_test.py
Normal file
@@ -0,0 +1,44 @@
from __future__ import unicode_literals

import argparse
import unittest

from mopidy import commands


class ConfigOverrideTypeTest(unittest.TestCase):
    def test_valid_override(self):
        expected = (b'section', b'key', b'value')
        self.assertEqual(
            expected, commands.config_override_type(b'section/key=value'))
        self.assertEqual(
            expected, commands.config_override_type(b'section/key=value '))
        self.assertEqual(
            expected, commands.config_override_type(b'section/key =value'))
        self.assertEqual(
            expected, commands.config_override_type(b'section /key=value'))

    def test_valid_override_is_bytes(self):
        section, key, value = commands.config_override_type(
            b'section/key=value')
        self.assertIsInstance(section, bytes)
        self.assertIsInstance(key, bytes)
        self.assertIsInstance(value, bytes)

    def test_empty_override(self):
        expected = ('section', 'key', '')
        self.assertEqual(
            expected, commands.config_override_type(b'section/key='))
        self.assertEqual(
            expected, commands.config_override_type(b'section/key= '))

    def test_invalid_override(self):
        self.assertRaises(
            argparse.ArgumentTypeError,
            commands.config_override_type, b'section/key')
        self.assertRaises(
            argparse.ArgumentTypeError,
            commands.config_override_type, b'section=')
        self.assertRaises(
            argparse.ArgumentTypeError,
            commands.config_override_type, b'section')
@@ -3,10 +3,11 @@
from __future__ import unicode_literals

import mock
import unittest

from mopidy import config

from tests import unittest, path_to_data_dir
from tests import path_to_data_dir


class LoadConfigTest(unittest.TestCase):
@@ -105,28 +106,3 @@ class ValidateTest(unittest.TestCase):
        self.assertEqual({'foo': {'bar': 'bad'}}, errors)

# TODO: add more tests


class ParseOverrideTest(unittest.TestCase):
    def test_valid_override(self):
        expected = (b'section', b'key', b'value')
        self.assertEqual(expected, config.parse_override(b'section/key=value'))
        self.assertEqual(expected, config.parse_override(b'section/key=value '))
        self.assertEqual(expected, config.parse_override(b'section/key =value'))
        self.assertEqual(expected, config.parse_override(b'section /key=value'))

    def test_valid_override_is_bytes(self):
        section, key, value = config.parse_override(b'section/key=value')
        self.assertIsInstance(section, bytes)
        self.assertIsInstance(key, bytes)
        self.assertIsInstance(value, bytes)

    def test_empty_override(self):
        expected = ('section', 'key', '')
        self.assertEqual(expected, config.parse_override(b'section/key='))
        self.assertEqual(expected, config.parse_override(b'section/key= '))

    def test_invalid_override(self):
        self.assertRaises(ValueError, config.parse_override, b'section/key')
        self.assertRaises(ValueError, config.parse_override, b'section=')
        self.assertRaises(ValueError, config.parse_override, b'section')

@@ -2,10 +2,11 @@ from __future__ import unicode_literals

import logging
import mock
import unittest

from mopidy.config import schemas, types
from mopidy.config import schemas

from tests import unittest, any_unicode
from tests import any_unicode


class ConfigSchemaTest(unittest.TestCase):
@@ -80,7 +81,8 @@ class ConfigSchemaTest(unittest.TestCase):
class LogLevelConfigSchemaTest(unittest.TestCase):
    def test_conversion(self):
        schema = schemas.LogLevelConfigSchema('test')
        result, errors = schema.deserialize({'foo.bar': 'DEBUG', 'baz': 'INFO'})
        result, errors = schema.deserialize(
            {'foo.bar': 'DEBUG', 'baz': 'INFO'})

        self.assertEqual(logging.DEBUG, result['foo.bar'])
        self.assertEqual(logging.INFO, result['baz'])

@@ -5,12 +5,10 @@ from __future__ import unicode_literals

import logging
import mock
import socket
import sys
import unittest

from mopidy.config import types

from tests import unittest

# TODO: DecodeTest and EncodeTest


@@ -100,13 +98,18 @@ class StringTest(unittest.TestCase):
        self.assertIsInstance(result, bytes)
        self.assertEqual(b'', result)

    def test_deserialize_enforces_choices_optional(self):
        value = types.String(optional=True, choices=['foo', 'bar', 'baz'])
        self.assertEqual(None, value.deserialize(b''))
        self.assertRaises(ValueError, value.deserialize, b'foobar')


class SecretTest(unittest.TestCase):
    def test_deserialize_passes_through(self):
    def test_deserialize_decodes_utf8(self):
        value = types.Secret()
        result = value.deserialize(b'foo')
        self.assertIsInstance(result, bytes)
        self.assertEqual(b'foo', result)
        result = value.deserialize('æøå'.encode('utf-8'))
        self.assertIsInstance(result, unicode)
        self.assertEqual('æøå', result)

    def test_deserialize_enforces_required(self):
        value = types.Secret()
@@ -165,6 +168,10 @@ class IntegerTest(unittest.TestCase):
        self.assertEqual(5, value.deserialize('5'))
        self.assertRaises(ValueError, value.deserialize, '15')

    def test_deserialize_respects_optional(self):
        value = types.Integer(optional=True)
        self.assertEqual(None, value.deserialize(''))


class BooleanTest(unittest.TestCase):
    def test_deserialize_conversion_success(self):
@@ -368,7 +375,4 @@ class PathTest(unittest.TestCase):

    def test_serialize_unicode_string(self):
        value = types.Path()
        expected = 'æøå'.encode(sys.getfilesystemencoding())
        result = value.serialize('æøå')
        self.assertEqual(expected, result)
        self.assertIsInstance(result, bytes)
        self.assertRaises(ValueError, value.serialize, 'æøå')

@@ -1,8 +1,8 @@
from __future__ import unicode_literals

from mopidy.config import validators
import unittest

from tests import unittest
from mopidy.config import validators


class ValidateChoiceTest(unittest.TestCase):

@@ -1,12 +1,12 @@
from __future__ import unicode_literals

import mock
import unittest

import pykka

from mopidy.core import Core

from tests import unittest


class CoreActorTest(unittest.TestCase):
    def setUp(self):

@@ -1,14 +1,14 @@
from __future__ import unicode_literals

import mock
import unittest

import pykka

from mopidy import core
from mopidy.backends import dummy
from mopidy.models import Track

from tests import unittest


@mock.patch.object(core.CoreListener, 'send')
class BackendEventsTest(unittest.TestCase):

@@ -1,13 +1,12 @@
from __future__ import unicode_literals

import mock
import unittest

from mopidy.backends import base
from mopidy.core import Core
from mopidy.models import SearchResult, Track

from tests import unittest


class CoreLibraryTest(unittest.TestCase):
    def setUp(self):

@@ -1,12 +1,11 @@
from __future__ import unicode_literals

import mock
import unittest

from mopidy.core import CoreListener, PlaybackState
from mopidy.models import Playlist, TlTrack

from tests import unittest


class CoreListenerTest(unittest.TestCase):
    def setUp(self):

@@ -1,13 +1,12 @@
from __future__ import unicode_literals

import mock
import unittest

from mopidy.backends import base
from mopidy.core import Core, PlaybackState
from mopidy.models import Track

from tests import unittest


class CorePlaybackTest(unittest.TestCase):
    def setUp(self):

@@ -1,13 +1,12 @@
from __future__ import unicode_literals

import mock
import unittest

from mopidy.backends import base
from mopidy.core import Core
from mopidy.models import Playlist, Track

from tests import unittest


class PlaylistsTest(unittest.TestCase):
    def setUp(self):

@@ -1,13 +1,12 @@
from __future__ import unicode_literals

import mock
import unittest

from mopidy.backends import base
from mopidy.core import Core
from mopidy.models import Track

from tests import unittest


class TracklistTest(unittest.TestCase):
    def setUp(self):

@@ -3,22 +3,22 @@ mpd_version: 0.14.2
fs_charset: UTF-8
info_end
songList begin
key: uri1
file: /uri1
key: key1
file: /path1
Artist: artist1
Title: track1
Album: album1
Date: 2001-02-03
Time: 4
key: uri2
file: /uri2
key: key1
file: /path2
Artist: artist2
Title: track2
Album: album2
Date: 2002
Time: 4
key: uri3
file: /uri3
key: key3
file: /path3
Artist: artist3
Title: track3
Album: album3

BIN
tests/data/scanner/empty.wav
Normal file
BIN
tests/data/scanner/example.log
Normal file
@@ -1,8 +1,8 @@
from __future__ import unicode_literals

from mopidy import exceptions
import unittest

from tests import unittest
from mopidy import exceptions


class ExceptionsTest(unittest.TestCase):

@@ -1,8 +1,8 @@
from __future__ import unicode_literals

from mopidy import config, ext
import unittest

from tests import unittest
from mopidy import config, ext


class ExtensionTest(unittest.TestCase):

@@ -1,6 +1,8 @@
from __future__ import unicode_literals

import json
import mock
import unittest

try:
    import cherrypy
@@ -15,9 +17,6 @@ except ImportError:
if cherrypy and ws4py:
    from mopidy.frontends.http import actor

import mock
from tests import unittest


@unittest.skipUnless(cherrypy, 'cherrypy not found')
@unittest.skipUnless(ws4py, 'ws4py not found')

Some files were not shown because too many files have changed in this diff