Release v2.2.0
commit e025f04160

.mailmap | 15
@@ -13,7 +13,9 @@ Alexandre Petitjean <alpetitjean@gmail.com>
Alexandre Petitjean <alpetitjean@gmail.com> <alexandre.petitjean@lne.fr>
Javier Domingo Cansino <javierdo1@gmail.com> <javier.domingo@fon.com>
Lasse Bigum <lasse@bigum.org> <l.bigum@samsung.com>
Nick Steel <kingosticks@gmail.com> <kingosticks@gmail.com>
Nick Steel <nsteel@fastmail.com> <kingosticks@gmail.com>
Nick Steel <nsteel@fastmail.com> <nickasteel@gmail.com>
Nick Steel <nsteel@fastmail.com> <nick.steel@nokia.com>
Janez Troha <janez.troha@gmail.com> <dz0ny@users.noreply.github.com>
Janez Troha <janez.troha@gmail.com> <dz0ny@ubuntu.si>
Luke Giuliani <luke@giuliani.com.au>

@@ -28,3 +30,14 @@ Kyle Heyne <kyleheyne@gmail.com>
Tom Roth <rawdlite@googlemail.com>
Eric Jahn <ejahn@newstore.com>
Loïck Bonniot <git@lesterpig.com>
Jens Lütjen <dublok@users.noreply.github.com>
Jens Lütjen <dublok@users.noreply.github.com> <dublok@users.noreplay.github.com>
Daniel T <thomas_d_j@yahoo.com.au>
Ismael Asensio <ismailof@github.com>
Brendan Jones <btjones711@gmail.com> <Btjones711@gmail.com>
Marvin Preuss <marvin@xsteadfastx.org>
Bernhard Gehl <bernhard.gehl@gmail.com>
Caysho <caysho@internode.on.net> <mopidy-dev@mopidy-dev.mopidy-dev>
Nick Aquina <nickaquina@gmail.com> <fantostisch@users.noreply.github.com>
Jarryd Tilbrook <jrad.tilbrook@gmail.com>
Matthieu Melquiond <matt.llvw@gmail.com>

@@ -8,12 +8,10 @@ python:

env:
- TOX_ENV=py27
- TOX_ENV=py27-tornado32
- TOX_ENV=docs
- TOX_ENV=flake8

before_install:
- "sudo sed -i '/127.0.1.1/d' /etc/hosts" # Workaround tornadoweb/tornado#1573
- "sudo apt-get update -qq"
- "sudo apt-get install -y gir1.2-gst-plugins-base-1.0 gir1.2-gstreamer-1.0 graphviz-dev gstreamer1.0-plugins-good gstreamer1.0-plugins-bad python-gst-1.0"

AUTHORS | 36

@@ -16,7 +16,7 @@
- Jeremy B. Merrill <jeremybmerrill@gmail.com>
- Adam Rigg <adam@adamrigg.id.au>
- Ernst Bammer <herr.ernst@gmail.com>
- Nick Steel <kingosticks@gmail.com>
- Nick Steel <nsteel@fastmail.com>
- Zan Dobersek <zandobersek@gmail.com>
- Thomas Refis <refis.thomas@gmail.com>
- Janez Troha <janez.troha@gmail.com>

@@ -77,3 +77,37 @@
- Alex Malone <jalexmalone@gmail.com>
- Daniel Hahler <git@thequod.de>
- Bryan Bennett <bbenne10@gmail.com>
- Jens Lütjen <dublok@users.noreply.github.com>
- Lina He <linahe93@gmail.com>
- Daniel T <thomas_d_j@yahoo.com.au>
- Lars Kruse <devel@sumpfralle.de>
- Benjamin Chrétien <chretien.b@gmail.com>
- SeppSTA <s.staats@gmx.de>
- Ismael Asensio <ismailof@github.com>
- Tom Parker <palfrey@tevp.net>
- Nantas Nardelli <nantas.nardelli@gmail.com>
- Naglis Jonaitis <naglis@mailbox.org>
- Alexander Jaworowski <alexander@jaworowski.se>
- Don Armstrong <don@donarmstrong.com>
- Nadav Tau <nadavt@sedonasys.com>
- Aleksandar Benic <aleksandar.benic@protonmail.com>
- Tom Swirly <tom@swirly.com>
- Piotr Dobrowolski <Informatic@users.noreply.github.com>
- Tomas Susanka <tsusanka@gmail.com>
- James Barnsley <james@barnsley.nz>
- Caysho <caysho@internode.on.net>
- Brendan Jones <btjones711@gmail.com>
- Marvin Preuss <marvin@xsteadfastx.org>
- Bernhard Gehl <bernhard.gehl@gmail.com>
- CL123123 <clairclair628@gmail.com>
- Piotr Dobrowolski <admin@tastycode.pl>
- Nick Aquina <nickaquina@gmail.com>
- Marcus Götling <marcus@gotling.se>
- Dominique Tardif <dommtardif@users.noreply.github.com>
- Alexey Murz Korepov <murznn@gmail.com>
- Jarryd Tilbrook <jrad.tilbrook@gmail.com>
- Dan Brough <dan@danbrough.org>
- Jonathan Jefferies <jjok@users.noreply.github.com>
- Matthieu Melquiond <matt.llvw@gmail.com>
- Damien Cassou <damien@cassou.me>
- Leonid Bogdanov <leonid_bogdanov@mail.ru>

@@ -39,7 +39,7 @@ block in the Pi Musicbox integrated audio jukebox system for Raspberry Pi.

**Mopidy is hackable**

Mopidy's extension support and Python, JSON-RPC, and JavaScript APIs makes
Mopidy's extension support and Python, JSON-RPC, and JavaScript APIs make
Mopidy perfect for building your own hacks. In one project, a Raspberry Pi was
embedded in an old cassette player. The buttons and volume control are wired up
with GPIO on the Raspberry Pi, and is used to control playback through a custom

@@ -13,8 +13,7 @@ mock
responses

# Test runners
pytest<3.3.0
pytest-capturelog
pytest>=3.3.0
pytest-cov
pytest-xdist
tox
@@ -5,6 +5,68 @@ Changelog
This changelog is used to track all major changes to Mopidy.


v2.2.0 (2018-09-30)
===================

Mopidy 2.2.0, a feature release, is out. It is a quite small release, featuring
mostly minor fixes and improvements.

Most notably, this release introduces CSRF protection for both the HTTP and
WebSocket RPC interfaces, and improves the file path checking in the M3U
backend. The CSRF protection should stop attacks against local Mopidy servers
from malicious websites, like what was demonstrated by Josef Gajdusek in
:issue:`1659`.

Since the release of 2.1.0, we've closed approximately 21 issues and pull
requests through 133 commits by 22 authors.

- Dependencies: Drop support for Tornado < 4.4. Though strictly a breaking
change, this shouldn't affect any supported systems as even Debian stable
includes Tornado >= 4.4.

- Core: Remove upper limit of 10000 tracks in tracklist. 10000 tracks is still
the default limit as some MPD clients crash if the tracklist is longer, but
it is now possible to set the :confval:`core/max_tracklist_length` config
value as high as you want to. (Fixes: :issue:`1600`, PR: :issue:`1666`)

- Core: Fix crash on ``library.lookup(uris=[])``. (Fixes: :issue:`1619`, PR:
:issue:`1620`)

- Core: Define return value of ``playlists.delete()`` to be a bool, :class:`True`
on success, :class:`False` otherwise. (PR: :issue:`1702`)

- M3U: Ignore all attempts at accessing files outside the
:confval:`m3u/playlist_dir`. (Partly fixes: :issue:`1659`, PR: :issue:`1702`)

- File: Change default ordering to show directories first, then files. (PR:
:issue:`1595`)

- File: Fix extraneous encoding of path. (PR: :issue:`1611`)

- HTTP: Protect RPC and WebSocket interfaces against CSRF by blocking requests
that originate from servers other than those specified in the new config
value :confval:`http/allowed_origins`. An artifact of this is that all
JSON-RPC requests must now always set the header
``Content-Type: application/json``.
(Partly fixes: :issue:`1659`, PR: :issue:`1668`)

- MPD: Added ``idle`` to the list of available commands.
(Fixes: :issue:`1593`, PR: :issue:`1597`)

- MPD: Added Unix domain sockets for binding MPD to.
(Fixes: :issue:`1531`, PR: :issue:`1629`)

- MPD: Lookup track metadata for MPD ``load`` and ``listplaylistinfo``.
(Fixes: :issue:`1511`, PR: :issue:`1621`)

- Ensure that decoding of OS errors with unknown encoding never crashes, but
instead replaces unknown bytes with a replacement marker. (Fixes:
:issue:`1599`)

- Set GLib program and application name, so that we show up as "Mopidy" in
PulseAudio instead of "python ...". (PR: :issue:`1626`)


v2.1.0 (2017-01-02)
===================


@@ -271,31 +271,31 @@ Running unit tests
Under the hood, ``tox -e py27`` will use `pytest <http://pytest.org/>`_ as the
test runner. We can also use it directly to run all tests::

py.test
pytest

py.test has lots of possibilities, so you'll have to dive into their docs and
pytest has lots of possibilities, so you'll have to dive into their docs and
plugins to get full benefit from it. To get you interested, here are some
examples.

We can limit to just tests in a single directory to save time::

py.test tests/http/
pytest tests/http/

With the help of the pytest-xdist plugin, we can run tests with four Python
processes in parallel, which usually cuts the test time in half or more::

py.test -n 4
pytest -n 4

Another useful feature from pytest-xdist, is the possiblity to stop on the
first test failure, watch the file system for changes, and then rerun the
tests. This makes for a very quick code-test cycle::

py.test -f # or --looponfail
pytest -f # or --looponfail

With the help of the pytest-cov plugin, we can get a report on what parts of
the given module, ``mopidy`` in this example, are covered by the test suite::

py.test --cov=mopidy --cov-report=term-missing
pytest --cov=mopidy --cov-report=term-missing

.. note::

@@ -305,9 +305,9 @@ the given module, ``mopidy`` in this example, are covered by the test suite::
If we want to speed up the test suite, we can even get a list of the ten
slowest tests::

py.test --durations=10
pytest --durations=10

By now, you should be convinced that running py.test directly during
By now, you should be convinced that running pytest directly during
development can be very useful.

@@ -402,7 +402,7 @@ Working on extensions
Much of the above also applies to Mopidy extensions, though they're often a bit
simpler. They don't have documentation sites and their test suites are either
small and fast, or sadly missing entirely. Most of them use tox and flake8, and
py.test can be used to run their test suites.
pytest can be used to run their test suites.

.. contents::
:local:

@@ -20,7 +20,7 @@ When it is enabled it starts a web server at the port specified by the
authorization. Anyone able to access the web server can use the full core
API of Mopidy. Thus, you probably only want to make the web server
available from your local network or place it behind a web proxy which
takes care or user authentication. You have been warned.
takes care of user authentication. You have been warned.

Hosting web clients

@@ -98,3 +98,15 @@ See :ref:`config` for general help on configuring Mopidy.
be published.

Set to an empty string to disable Zeroconf for HTTP.

.. confval:: http/allowed_origins

A list of domains allowed to perform Cross-Origin Resource Sharing (CORS)
requests. This applies to both JSON-RPC and Websocket requests. Values
should be in the format ``hostname:port`` and separated by either a comma or
newline.

Same-origin requests (i.e. requests from Mopidy's web server) are always
allowed and so you don't need an entry for those. However, if your requests
originate from a different web server, you will need to add an entry for
that server in this list.
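
For instance, if a web client is served from a separate host at
``myclient.example.com:8080`` (an example name, not a real default), a
configuration fragment along these lines should allow it::

    [http]
    allowed_origins = myclient.example.com:8080

Related to the same CSRF work, JSON-RPC requests must now carry
``Content-Type: application/json``. A rough sketch of a compliant call,
assuming the HTTP frontend runs on its default port 6680 and using
``requests`` (already a Mopidy dependency)::

    import requests

    payload = {'jsonrpc': '2.0', 'id': 1, 'method': 'core.get_version', 'params': []}
    response = requests.post(
        'http://localhost:6680/mopidy/rpc',
        json=payload,  # sends the now-required 'Content-Type: application/json'
    )
    print(response.json())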
@@ -63,7 +63,8 @@ See :ref:`config` for general help on configuring Mopidy.

.. confval:: mpd/hostname

Which address the MPD server should bind to.
Which address the MPD server should bind to. This can be a network address
or the path to a Unix socket.

``127.0.0.1``
Listens only on the IPv4 loopback interface
@@ -73,6 +74,9 @@ See :ref:`config` for general help on configuring Mopidy.
Listens on all IPv4 interfaces
``::``
Listens on all interfaces, both IPv4 and IPv6
``unix:/path/to/unix/socket.sock``
Listen on the Unix socket at the specified path. Must be prefixed with
``unix:``
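
As a minimal sketch of talking to the MPD frontend over such a socket from
Python (the socket path is only an example and must match your
:confval:`mpd/hostname` value without the ``unix:`` prefix)::

    import socket

    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    sock.connect('/path/to/unix/socket.sock')
    print(sock.recv(4096))  # the server greets with 'OK MPD <protocol version>'
    sock.close()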

.. confval:: mpd/port

@@ -14,4 +14,4 @@ if not (2, 7) <= sys.version_info < (3,):
warnings.filterwarnings('ignore', 'could not open display')

__version__ = '2.1.0'
__version__ = '2.2.0'

@@ -5,6 +5,10 @@ import os
import signal
import sys

import pykka.debug

from mopidy import commands, config as config_lib, ext
from mopidy.internal import encoding, log, path, process, versioning
from mopidy.internal.gi import Gst # noqa: F401

try:
@@ -15,10 +19,6 @@ try:
except ImportError:
pass

import pykka.debug # noqa: I100

from mopidy import commands, config as config_lib, ext
from mopidy.internal import encoding, log, path, process, versioning

logger = logging.getLogger(__name__)

@@ -176,7 +176,7 @@ def _query_duration(pipeline):
elif duration < 0:
duration = None # Stream without duration.
else:
duration = duration // Gst.MSECOND
duration = int(duration // Gst.MSECOND)
return success, duration

@@ -362,10 +362,16 @@ class PlaylistsProvider(object):
"""
Delete playlist identified by the URI.

Returns :class:`True` if deleted, :class:`False` otherwise.

*MUST be implemented by subclass.*

:param uri: URI of the playlist to delete
:type uri: string
:rtype: :class:`bool`

.. versionchanged:: 2.2
Return type defined.
"""
raise NotImplementedError
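
A backend is expected to follow this contract roughly as sketched below (a
hypothetical provider, shown only to illustrate the newly defined bool return
type; ``_remove_playlist_file`` is an invented helper)::

    class MyPlaylistsProvider(backend.PlaylistsProvider):

        def delete(self, uri):
            try:
                self._remove_playlist_file(uri)  # hypothetical helper
            except EnvironmentError:
                return False
            return True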

@@ -23,7 +23,7 @@ _core_schema['cache_dir'] = Path()
_core_schema['config_dir'] = Path()
_core_schema['data_dir'] = Path()
# MPD supports at most 10k tracks, some clients segfault when this is exceeded.
_core_schema['max_tracklist_length'] = Integer(minimum=1, maximum=10000)
_core_schema['max_tracklist_length'] = Integer(minimum=1)
_core_schema['restore_state'] = Boolean(optional=True)

_logging_schema = ConfigSchema('logging')

@@ -258,6 +258,9 @@ class Hostname(ConfigValue):
validators.validate_required(value, self._required)
if not value.strip():
return None
socket_path = path.get_unix_socket_path(value)
if socket_path is not None:
return 'unix:' + Path(not self._required).deserialize(socket_path)
try:
socket.getaddrinfo(value, None)
except socket.error:

@@ -228,8 +228,9 @@ class LibraryController(object):

# TODO: lookup(uris) to backend APIs
for backend, backend_uris in self._get_backends_to_uris(uris).items():
for u in backend_uris:
futures[(backend, u)] = backend.library.lookup(u)
if backend_uris:
for u in backend_uris:
futures[(backend, u)] = backend.library.lookup(u)

for (backend, u), future in futures.items():
with _backend_error_handling(backend):

@@ -180,22 +180,35 @@ class PlaylistsController(object):
If the URI doesn't match the URI schemes handled by the current
backends, nothing happens.

Returns :class:`True` if deleted, :class:`False` otherwise.

:param uri: URI of the playlist to delete
:type uri: string
:rtype: :class:`bool`

.. versionchanged:: 2.2
Return type defined.
"""
validation.check_uri(uri)

uri_scheme = urllib.parse.urlparse(uri).scheme
backend = self.backends.with_playlists.get(uri_scheme, None)
if not backend:
return None # TODO: error reporting to user
return False

success = False
with _backend_error_handling(backend):
backend.playlists.delete(uri).get()
# TODO: error detection and reporting to user
success = backend.playlists.delete(uri).get()

if success is None:
# Return type was defined in Mopidy 2.2. Assume everything went
# well if the backend doesn't report otherwise.
success = True

if success:
listener.CoreListener.send('playlist_deleted', uri=uri)

# TODO: return value?
return success

def filter(self, criteria=None, **kwargs):
"""

@@ -1,7 +1,6 @@
from __future__ import unicode_literals

import logging
import operator
import os
import sys
import urllib2
@@ -82,7 +81,10 @@ class FileLibraryProvider(backend.LibraryProvider):
elif os.path.isfile(child_path):
result.append(models.Ref.track(name=name, uri=uri))

result.sort(key=operator.attrgetter('name'))
def order(item):
return (item.type != models.Ref.DIRECTORY, item.name)
result.sort(key=order)

return result

def lookup(self, uri):
@@ -140,6 +142,5 @@ class FileLibraryProvider(backend.LibraryProvider):

def _is_in_basedir(self, local_path):
return any(
path.is_path_inside_base_dir(
local_path, media_dir['path'].encode('utf-8'))
path.is_path_inside_base_dir(local_path, media_dir['path'])
for media_dir in self._media_dirs)

@@ -25,6 +25,7 @@ class Extension(ext.Extension):
schema['port'] = config_lib.Port()
schema['static_dir'] = config_lib.Path(optional=True)
schema['zeroconf'] = config_lib.String(optional=True)
schema['allowed_origins'] = config_lib.List(optional=True)
return schema

def validate_environment(self):

@@ -4,3 +4,4 @@ hostname = 127.0.0.1
port = 6680
static_dir =
zeroconf = Mopidy HTTP server on $hostname
allowed_origins =

@@ -3,7 +3,6 @@ from __future__ import absolute_import, unicode_literals
import functools
import logging
import os
import socket

import tornado.escape
import tornado.ioloop
@@ -12,6 +11,7 @@ import tornado.websocket

import mopidy
from mopidy import core, models
from mopidy.compat import urllib
from mopidy.internal import encoding, jsonrpc

@@ -20,12 +20,17 @@ logger = logging.getLogger(__name__)

def make_mopidy_app_factory(apps, statics):
def mopidy_app_factory(config, core):
allowed_origins = {
x.lower() for x in config['http']['allowed_origins'] if x
}
return [
(r'/ws/?', WebSocketHandler, {
'core': core,
'allowed_origins': allowed_origins,
}),
(r'/rpc', JsonRpcHandler, {
'core': core,
'allowed_origins': allowed_origins,
}),
(r'/(.+)', StaticFileHandler, {
'path': os.path.join(os.path.dirname(__file__), 'data'),
@@ -97,16 +102,12 @@ class WebSocketHandler(tornado.websocket.WebSocketHandler):
# One callback per client to keep time we hold up the loop short
loop.add_callback(functools.partial(_send_broadcast, client, msg))

def initialize(self, core):
def initialize(self, core, allowed_origins):
self.jsonrpc = make_jsonrpc_wrapper(core)
self.allowed_origins = allowed_origins

def open(self):
if hasattr(self, 'set_nodelay'):
# New in Tornado 3.1
self.set_nodelay(True)
else:
self.stream.socket.setsockopt(
socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
self.set_nodelay(True)
self.clients.add(self)
logger.debug(
'New WebSocket connection from %s', self.request.remote_ip)
@@ -138,9 +139,7 @@ class WebSocketHandler(tornado.websocket.WebSocketHandler):
self.close()

def check_origin(self, origin):
# Allow cross-origin WebSocket connections, like Tornado before 4.0
# defaulted to.
return True
return check_origin(origin, self.request.headers, self.allowed_origins)

def set_mopidy_headers(request_handler):
@@ -149,16 +148,31 @@ def set_mopidy_headers(request_handler):
'X-Mopidy-Version', mopidy.__version__.encode('utf-8'))

def check_origin(origin, request_headers, allowed_origins):
if origin is None:
logger.debug('Origin was not set')
return False
allowed_origins.add(request_headers.get('Host'))
parsed_origin = urllib.parse.urlparse(origin).netloc.lower()
return parsed_origin and parsed_origin in allowed_origins
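
To illustrate the intended behaviour of this new helper (the ``me:6680`` and
``evil:666`` names are hypothetical and mirror the test cases further down in
this commit)::

    from mopidy.http.handlers import check_origin

    headers = {'Host': 'me:6680'}
    check_origin('http://me:6680', headers, {'me:6680'})   # True, same origin
    check_origin('http://evil:666', headers, {'me:6680'})  # False, not allowed
    check_origin(None, headers, {'me:6680'})               # False, no Origin set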

class JsonRpcHandler(tornado.web.RequestHandler):

def initialize(self, core):
def initialize(self, core, allowed_origins):
self.jsonrpc = make_jsonrpc_wrapper(core)
self.allowed_origins = allowed_origins

def head(self):
self.set_extra_headers()
self.finish()

def post(self):
content_type = self.request.headers.get('Content-Type', '')
if content_type != 'application/json':
self.set_status(415, 'Content-Type must be application/json')
return

data = self.request.body
if not data:
return
@@ -183,6 +197,18 @@ class JsonRpcHandler(tornado.web.RequestHandler):
self.set_header('Accept', 'application/json')
self.set_header('Content-Type', 'application/json; utf-8')

def options(self):
origin = self.request.headers.get('Origin')
if not check_origin(
origin, self.request.headers, self.allowed_origins):
self.set_status(403, 'Access denied for origin %s' % origin)
return

self.set_header('Access-Control-Allow-Origin', '%s' % origin)
self.set_header('Access-Control-Allow-Headers', 'Content-Type')
self.set_status(204)
self.finish()

class ClientListHandler(tornado.web.RequestHandler):

@@ -9,4 +9,4 @@ def locale_decode(bytestr):
try:
return compat.text_type(bytestr)
except UnicodeError:
return bytes(bytestr).decode(locale.getpreferredencoding())
return bytes(bytestr).decode(locale.getpreferredencoding(), 'replace')
@@ -25,6 +25,8 @@ else:
gi.require_version('GstPbutils', '1.0')
from gi.repository import GstPbutils

GLib.set_prgname('mopidy')
GLib.set_application_name('Mopidy')

REQUIRED_GST_VERSION = (1, 2, 3)

@@ -2,6 +2,7 @@ from __future__ import absolute_import, unicode_literals

import errno
import logging
import os
import re
import socket
import sys
@@ -9,13 +10,20 @@ import threading

import pykka

from mopidy.internal import encoding
from mopidy.internal import encoding, path, validation
from mopidy.internal.gi import GObject

logger = logging.getLogger(__name__)

def is_unix_socket(sock):
"""Check if the provided socket is a Unix domain socket"""
if hasattr(socket, 'AF_UNIX'):
return sock.family == socket.AF_UNIX
return False

class ShouldRetrySocketCall(Exception):

"""Indicate that attempted socket call should be retried"""
@@ -40,7 +48,7 @@ def try_ipv6_socket():
has_ipv6 = try_ipv6_socket()

def create_socket():
def create_tcp_socket():
"""Create a TCP socket with or without IPv6 depending on system support"""
if has_ipv6:
sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
@@ -57,6 +65,19 @@ def create_socket():
return sock

def create_unix_socket():
"""Create a Unix domain socket"""
return socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)

def format_socket_name(sock):
"""Format the connection string for the given socket"""
if is_unix_socket(sock):
return '%s' % sock.getsockname()
else:
return '[%s]:%s' % sock.getsockname()[:2]

def format_hostname(hostname):
"""Format hostname for display."""
if (has_ipv6 and re.match(r'\d+.\d+.\d+.\d+', hostname) is not None):
@@ -76,17 +97,42 @@ class Server(object):
self.timeout = timeout
self.server_socket = self.create_server_socket(host, port)

self.register_server_socket(self.server_socket.fileno())
self.watcher = self.register_server_socket(self.server_socket.fileno())

def create_server_socket(self, host, port):
sock = create_socket()
socket_path = path.get_unix_socket_path(host)
if socket_path is not None: # host is a path so use unix socket
sock = create_unix_socket()
sock.bind(socket_path)
else:
# ensure the port is supplied
validation.check_integer(port)
sock = create_tcp_socket()
sock.bind((host, port))

sock.setblocking(False)
sock.bind((host, port))
sock.listen(1)
return sock

def stop(self):
GObject.source_remove(self.watcher)
if is_unix_socket(self.server_socket):
unix_socket_path = self.server_socket.getsockname()
else:
unix_socket_path = None

self.server_socket.shutdown(socket.SHUT_RDWR)
self.server_socket.close()

# clean up the socket file
if unix_socket_path is not None:
os.unlink(unix_socket_path)

def register_server_socket(self, fileno):
GObject.io_add_watch(fileno, GObject.IO_IN, self.handle_connection)
return GObject.io_add_watch(
fileno,
GObject.IO_IN,
self.handle_connection)

def handle_connection(self, fd, flags):
try:
@@ -102,7 +148,10 @@ class Server(object):

def accept_connection(self):
try:
return self.server_socket.accept()
sock, addr = self.server_socket.accept()
if is_unix_socket(sock):
addr = (sock.getsockname(), None)
return sock, addr
except socket.error as e:
if e.errno in (errno.EAGAIN, errno.EINTR):
raise ShouldRetrySocketCall
@@ -117,7 +166,9 @@ class Server(object):

def reject_connection(self, sock, addr):
# FIXME provide more context in logging?
logger.warning('Rejected connection from [%s]:%s', addr[0], addr[1])
logger.warning(
'Rejected connection from %s',
format_socket_name(sock))
try:
sock.close()
except socket.error:
@@ -142,7 +193,7 @@ class Connection(object):

self.host, self.port = addr[:2] # IPv6 has larger addr

self.sock = sock
self._sock = sock
self.protocol = protocol
self.protocol_kwargs = protocol_kwargs
self.timeout = timeout
@@ -180,7 +231,7 @@ class Connection(object):
self.disable_send()

try:
self.sock.close()
self._sock.close()
except socket.error:
pass

@@ -195,7 +246,7 @@ class Connection(object):
def send(self, data):
"""Send data to client, return any unsent data."""
try:
sent = self.sock.send(data)
sent = self._sock.send(data)
return data[sent:]
except socket.error as e:
if e.errno in (errno.EWOULDBLOCK, errno.EINTR):
@@ -226,7 +277,7 @@ class Connection(object):

try:
self.recv_id = GObject.io_add_watch(
self.sock.fileno(),
self._sock.fileno(),
GObject.IO_IN | GObject.IO_ERR | GObject.IO_HUP,
self.recv_callback)
except socket.error as e:
@@ -244,7 +295,7 @@ class Connection(object):

try:
self.send_id = GObject.io_add_watch(
self.sock.fileno(),
self._sock.fileno(),
GObject.IO_OUT | GObject.IO_ERR | GObject.IO_HUP,
self.send_callback)
except socket.error as e:
@@ -263,7 +314,7 @@ class Connection(object):
return True

try:
data = self.sock.recv(4096)
data = self._sock.recv(4096)
except socket.error as e:
if e.errno not in (errno.EWOULDBLOCK, errno.EINTR):
self.stop('Unexpected client error: %s' % e)
@@ -304,6 +355,9 @@ class Connection(object):
self.stop('Client inactive for %ds; closing connection' % self.timeout)
return False

def __str__(self):
return format_socket_name(self._sock)

class LineProtocol(pykka.ThreadingActor):

@@ -2,6 +2,7 @@ from __future__ import absolute_import, unicode_literals

import logging
import os
import re
import stat
import string
import threading
@@ -47,6 +48,13 @@ def get_or_create_file(file_path, mkdir=True, content=None):
return file_path

def get_unix_socket_path(socket_path):
match = re.search('^unix:(.*)', socket_path)
if not match:
return None
return match.group(1)
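
For illustration, the new helper simply strips a leading ``unix:`` prefix and
returns ``None`` for anything else (the socket path below is arbitrary)::

    from mopidy.internal.path import get_unix_socket_path

    get_unix_socket_path('unix:/tmp/mopidy.sock')  # -> '/tmp/mopidy.sock'
    get_unix_socket_path('127.0.0.1')              # -> None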

def path_to_uri(path):
"""
Convert OS specific path to file:// URI.

@@ -9,6 +9,7 @@ import os
import tempfile

from mopidy import backend
from mopidy.internal import path

from . import Extension, translator

@@ -89,13 +90,22 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):

def delete(self, uri):
path = translator.uri_to_path(uri)
if not self._is_in_basedir(path):
logger.debug('Ignoring path outside playlist dir: %s', uri)
return False
try:
os.remove(self._abspath(path))
except EnvironmentError as e:
log_environment_error('Error deleting playlist %s' % uri, e)
return False
else:
return True

def get_items(self, uri):
path = translator.uri_to_path(uri)
if not self._is_in_basedir(path):
logger.debug('Ignoring path outside playlist dir: %s', uri)
return None
try:
with self._open(path, 'r') as fp:
items = translator.load_items(fp, self._base_dir)
@@ -106,6 +116,9 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):

def lookup(self, uri):
path = translator.uri_to_path(uri)
if not self._is_in_basedir(path):
logger.debug('Ignoring path outside playlist dir: %s', uri)
return None
try:
with self._open(path, 'r') as fp:
items = translator.load_items(fp, self._base_dir)
@@ -120,6 +133,10 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):

def save(self, playlist):
path = translator.uri_to_path(playlist.uri)
if not self._is_in_basedir(path):
logger.debug(
'Ignoring path outside playlist dir: %s', playlist.uri)
return None
name = translator.name_from_path(path)
try:
with self._open(path, 'w') as fp:
@@ -137,6 +154,11 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):
def _abspath(self, path):
return os.path.join(self._playlists_dir, path)

def _is_in_basedir(self, local_path):
if not os.path.isabs(local_path):
local_path = os.path.join(self._playlists_dir, local_path)
return path.is_path_inside_base_dir(local_path, self._playlists_dir)

def _open(self, path, mode='r'):
if path.endswith(b'.m3u8'):
encoding = 'utf-8'
@@ -144,6 +166,10 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):
encoding = self._default_encoding
if not os.path.isabs(path):
path = os.path.join(self._playlists_dir, path)
if not self._is_in_basedir(path):
raise Exception(
'Path (%s) is not inside playlist_dir (%s)'
% (path, self._playlists_dir))
if 'w' in mode:
return replace(path, mode, encoding=encoding, errors='replace')
else:
@@ -110,7 +110,9 @@ def dump_items(items, fp):
print(item.uri, file=fp)

def playlist(path, items=[], mtime=None):
def playlist(path, items=None, mtime=None):
if items is None:
items = []
return models.Playlist(
uri=path_to_uri(path),
name=name_from_path(path),

@@ -6,7 +6,7 @@ from mopidy.models.immutable import ImmutableObject, ValidatedImmutableObject
from mopidy.models.serialize import ModelJSONEncoder, model_json_decoder

__all__ = [
'ImmutableObject', 'Ref', 'Image', 'Artist', 'Album', 'track', 'TlTrack',
'ImmutableObject', 'Ref', 'Image', 'Artist', 'Album', 'Track', 'TlTrack',
'Playlist', 'SearchResult', 'model_json_decoder', 'ModelJSONEncoder',
'ValidatedImmutableObject']

@@ -19,7 +19,7 @@ class Extension(ext.Extension):
def get_config_schema(self):
schema = super(Extension, self).get_config_schema()
schema['hostname'] = config.Hostname()
schema['port'] = config.Port()
schema['port'] = config.Port(optional=True)
schema['password'] = config.Secret(optional=True)
schema['max_connections'] = config.Integer(minimum=1)
schema['connection_timeout'] = config.Integer(minimum=1)

@@ -41,11 +41,11 @@ class MpdFrontend(pykka.ThreadingActor, CoreListener):
self.zeroconf_name = config['mpd']['zeroconf']
self.zeroconf_service = None

self._setup_server(config, core)
self.server = self._setup_server(config, core)

def _setup_server(self, config, core):
try:
network.Server(
server = network.Server(
self.hostname, self.port,
protocol=session.MpdSession,
protocol_kwargs={
@@ -60,10 +60,15 @@ class MpdFrontend(pykka.ThreadingActor, CoreListener):
'MPD server startup failed: %s' %
encoding.locale_decode(error))

logger.info('MPD server running at [%s]:%s', self.hostname, self.port)
logger.info(
'MPD server running at %s',
network.format_socket_name(server.server_socket))

return server

def on_start(self):
if self.zeroconf_name:
if (self.zeroconf_name and not
network.is_unix_socket(self.server.server_socket)):
self.zeroconf_service = zeroconf.Zeroconf(
name=self.zeroconf_name,
stype='_mpd._tcp',
@@ -75,6 +80,7 @@ class MpdFrontend(pykka.ThreadingActor, CoreListener):
self.zeroconf_service.unpublish()

process.stop_actors_by_class(session.MpdSession)
self.server.stop()

def on_event(self, event, **kwargs):
if event not in _CORE_EVENTS_TO_IDLE_SUBSYSTEMS:

@@ -42,7 +42,7 @@ def currentsong(context):
tl_track, position=position, stream_title=stream_title)

@protocol.commands.add('idle', list_command=False)
@protocol.commands.add('idle')
def idle(context, *subsystems):
"""
*musicpd.org, status section:*

@@ -3,7 +3,6 @@ from __future__ import absolute_import, division, unicode_literals
import datetime
import logging
import re
import warnings

from mopidy.compat import urllib
from mopidy.mpd import exceptions, protocol, translator
@@ -16,6 +15,16 @@ def _check_playlist_name(name):
raise exceptions.MpdInvalidPlaylistName()

def _get_playlist(context, name, must_exist=True):
playlist = None
uri = context.lookup_playlist_uri_from_name(name)
if uri:
playlist = context.core.playlists.lookup(uri).get()
if must_exist and playlist is None:
raise exceptions.MpdNoExistError('No such playlist')
return playlist

@protocol.commands.add('listplaylist')
def listplaylist(context, name):
"""
@@ -31,10 +40,7 @@ def listplaylist(context, name):
file: relative/path/to/file2.ogg
file: relative/path/to/file3.mp3
"""
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')
playlist = _get_playlist(context, name)
return ['file: %s' % t.uri for t in playlist.tracks]

@@ -52,10 +58,13 @@ def listplaylistinfo(context, name):
Standard track listing, with fields: file, Time, Title, Date,
Album, Artist, Track
"""
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')
playlist = _get_playlist(context, name)
track_uris = [track.uri for track in playlist.tracks]
tracks_map = context.core.library.lookup(uris=track_uris).get()
tracks = []
for uri in track_uris:
tracks.extend(tracks_map[uri])
playlist = playlist.replace(tracks=tracks)
return translator.playlist_to_mpd_format(playlist)

@@ -134,14 +143,9 @@ def load(context, name, playlist_slice=slice(0, None)):
- MPD 0.17.1 does not fail if the specified range is outside the playlist,
in either or both ends.
"""
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')

with warnings.catch_warnings():
warnings.filterwarnings('ignore', 'tracklist.add.*"tracks".*')
context.core.tracklist.add(playlist.tracks[playlist_slice]).get()
playlist = _get_playlist(context, name)
track_uris = [track.uri for track in playlist.tracks[playlist_slice]]
context.core.tracklist.add(uris=track_uris).get()

@protocol.commands.add('playlistadd')
@@ -156,8 +160,7 @@ def playlistadd(context, name, track_uri):
``NAME.m3u`` will be created if it does not exist.
"""
_check_playlist_name(name)
uri = context.lookup_playlist_uri_from_name(name)
old_playlist = uri is not None and context.core.playlists.lookup(uri).get()
old_playlist = _get_playlist(context, name, must_exist=False)
if not old_playlist:
# Create new playlist with this single track
lookup_res = context.core.library.lookup(uris=[track_uri]).get()
@@ -227,8 +230,7 @@ def playlistclear(context, name):
The playlist will be created if it does not exist.
"""
_check_playlist_name(name)
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
playlist = _get_playlist(context, name, must_exist=False)
if not playlist:
playlist = context.core.playlists.create(name).get()

@@ -236,7 +238,7 @@ def playlistclear(context, name):
playlist = playlist.replace(tracks=[])
if context.core.playlists.save(playlist).get() is None:
raise exceptions.MpdFailedToSavePlaylist(
urllib.parse.urlparse(uri).scheme)
urllib.parse.urlparse(playlist.uri).scheme)

@protocol.commands.add('playlistdelete', songpos=protocol.UINT)
@@ -249,10 +251,7 @@ def playlistdelete(context, name, songpos):
Deletes ``SONGPOS`` from the playlist ``NAME.m3u``.
"""
_check_playlist_name(name)
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')
playlist = _get_playlist(context, name)

try:
# Convert tracks to list and remove requested
@@ -266,7 +265,7 @@ def playlistdelete(context, name, songpos):
saved_playlist = context.core.playlists.save(playlist).get()
if saved_playlist is None:
raise exceptions.MpdFailedToSavePlaylist(
urllib.parse.urlparse(uri).scheme)
urllib.parse.urlparse(playlist.uri).scheme)

@protocol.commands.add(
@@ -290,10 +289,7 @@ def playlistmove(context, name, from_pos, to_pos):
return

_check_playlist_name(name)
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
if not playlist:
raise exceptions.MpdNoExistError('No such playlist')
playlist = _get_playlist(context, name)
if from_pos == to_pos:
return # Nothing to do

@@ -310,7 +306,7 @@ def playlistmove(context, name, from_pos, to_pos):
saved_playlist = context.core.playlists.save(playlist).get()
if saved_playlist is None:
raise exceptions.MpdFailedToSavePlaylist(
urllib.parse.urlparse(uri).scheme)
urllib.parse.urlparse(playlist.uri).scheme)

@protocol.commands.add('rename')
@@ -325,21 +321,14 @@ def rename(context, old_name, new_name):
_check_playlist_name(old_name)
_check_playlist_name(new_name)

old_uri = context.lookup_playlist_uri_from_name(old_name)
if not old_uri:
raise exceptions.MpdNoExistError('No such playlist')
old_playlist = _get_playlist(context, old_name)

old_playlist = context.core.playlists.lookup(old_uri).get()
if not old_playlist:
raise exceptions.MpdNoExistError('No such playlist')

new_uri = context.lookup_playlist_uri_from_name(new_name)
if new_uri and context.core.playlists.lookup(new_uri).get():
if _get_playlist(context, new_name, must_exist=False):
raise exceptions.MpdExistError('Playlist already exists')
# TODO: should we purge the mapping in an else?

# Create copy of the playlist and remove original
uri_scheme = urllib.parse.urlparse(old_uri).scheme
uri_scheme = urllib.parse.urlparse(old_playlist.uri).scheme
new_playlist = context.core.playlists.create(new_name, uri_scheme).get()
new_playlist = new_playlist.replace(tracks=old_playlist.tracks)
saved_playlist = context.core.playlists.save(new_playlist).get()
@@ -377,8 +366,7 @@ def save(context, name):
"""
_check_playlist_name(name)
tracks = context.core.tracklist.get_tracks().get()
uri = context.lookup_playlist_uri_from_name(name)
playlist = uri is not None and context.core.playlists.lookup(uri).get()
playlist = _get_playlist(context, name, must_exist=False)
if not playlist:
# Create new playlist
_create_playlist(context, name, tracks)
@@ -388,4 +376,4 @@ def save(context, name):
saved_playlist = context.core.playlists.save(new_playlist).get()
if saved_playlist is None:
raise exceptions.MpdFailedToSavePlaylist(
urllib.parse.urlparse(uri).scheme)
urllib.parse.urlparse(playlist.uri).scheme)

@@ -25,18 +25,19 @@ class MpdSession(network.LineProtocol):
session=self, config=config, core=core, uri_map=uri_map)

def on_start(self):
logger.info('New MPD connection from [%s]:%s', self.host, self.port)
logger.info('New MPD connection from %s', self.connection)
self.send_lines(['OK MPD %s' % protocol.VERSION])

def on_line_received(self, line):
logger.debug('Request from [%s]:%s: %s', self.host, self.port, line)
logger.debug('Request from [%s]: %s', self.connection, line)

response = self.dispatcher.handle_request(line)
if not response:
return

logger.debug(
'Response to [%s]:%s: %s', self.host, self.port,
'Response to [%s]: %s',
self.connection,
formatting.indent(self.terminator.join(response)))

self.send_lines(response)
setup.py | 3

@@ -23,11 +23,12 @@ setup(
packages=find_packages(exclude=['tests', 'tests.*']),
zip_safe=False,
include_package_data=True,
python_requires='>= 2.7, < 3',
install_requires=[
'Pykka >= 1.1',
'requests >= 2.0',
'setuptools',
'tornado >= 3.2, < 5', # Tornado 5 requires Python >= 2.7.9
'tornado >= 4.4, < 5', # Tornado 5 requires Python >= 2.7.9
],
extras_require={'http': []},
entry_points={

tasks.py | 10

@@ -4,18 +4,18 @@ from invoke import run, task

@task
def docs(watch=False, warn=False):
def docs(ctx, watch=False, warn=False):
if watch:
return watcher(docs)
run('make -C docs/ html', warn=warn)

@task
def test(path=None, coverage=False, watch=False, warn=False):
def test(ctx, path=None, coverage=False, watch=False, warn=False):
if watch:
return watcher(test, path=path, coverage=coverage)
path = path or 'tests/'
cmd = 'py.test'
cmd = 'pytest'
if coverage:
cmd += ' --cov=mopidy --cov-report=term-missing'
cmd += ' %s' % path
@@ -23,14 +23,14 @@ def test(path=None, coverage=False, watch=False, warn=False):

@task
def lint(watch=False, warn=False):
def lint(ctx, watch=False, warn=False):
if watch:
return watcher(lint)
run('flake8', warn=warn)

@task
def update_authors():
def update_authors(ctx):
# Keep authors in the order of appearance and use awk to filter out dupes
run("git log --format='- %aN <%aE>' --reverse | awk '!x[$0]++' > AUTHORS")
@ -23,4 +23,3 @@ def test_core_schema_has_max_tracklist_length():
|
||||
max_tracklist_length_schema = config._core_schema['max_tracklist_length']
|
||||
assert isinstance(max_tracklist_length_schema, config.Integer)
|
||||
assert max_tracklist_length_schema._minimum == 1
|
||||
assert max_tracklist_length_schema._maximum == 10000
|
||||
|
||||
@ -344,6 +344,12 @@ class HostnameTest(unittest.TestCase):
|
||||
self.assertIsNone(value.deserialize(' '))
|
||||
self.assertEqual(0, getaddrinfo_mock.call_count)
|
||||
|
||||
@mock.patch('mopidy.internal.path.expand_path')
|
||||
def test_deserialize_with_unix_socket(self, expand_path_mock):
|
||||
value = types.Hostname()
|
||||
self.assertIsNotNone(value.deserialize('unix:/tmp/mopidy.socket'))
|
||||
expand_path_mock.assert_called_once_with('/tmp/mopidy.socket')
|
||||
|
||||
|
||||
class PortTest(unittest.TestCase):
|
||||
|
||||
|
||||
@ -148,6 +148,9 @@ class CoreLibraryTest(BaseCoreLibraryTest):
|
||||
Ref.track(uri='dummy1:track:/foo/baz.mp3', name='Baz'),
|
||||
])
|
||||
|
||||
def test_lookup_returns_empty_dict_for_no_uris(self):
|
||||
self.assertEqual({}, self.core.library.lookup(uris=[]))
|
||||
|
||||
def test_lookup_fails_with_uri_and_uris_set(self):
|
||||
with self.assertRaises(ValueError):
|
||||
self.core.library.lookup('dummy1:a', ['dummy2:a'])
|
||||
|
||||
@ -141,26 +141,30 @@ class PlaylistTest(BasePlaylistsTest):
|
||||
self.assertFalse(self.sp2.create.called)
|
||||
|
||||
def test_delete_selects_the_dummy1_backend(self):
|
||||
self.core.playlists.delete('dummy1:a')
|
||||
success = self.core.playlists.delete('dummy1:a')
|
||||
|
||||
self.assertTrue(success)
|
||||
self.sp1.delete.assert_called_once_with('dummy1:a')
|
||||
self.assertFalse(self.sp2.delete.called)
|
||||
|
||||
def test_delete_selects_the_dummy2_backend(self):
|
||||
self.core.playlists.delete('dummy2:a')
|
||||
success = self.core.playlists.delete('dummy2:a')
|
||||
|
||||
self.assertTrue(success)
|
||||
self.assertFalse(self.sp1.delete.called)
|
||||
self.sp2.delete.assert_called_once_with('dummy2:a')
|
||||
|
||||
def test_delete_with_unknown_uri_scheme_does_nothing(self):
|
||||
self.core.playlists.delete('unknown:a')
|
||||
success = self.core.playlists.delete('unknown:a')
|
||||
|
||||
self.assertFalse(success)
|
||||
self.assertFalse(self.sp1.delete.called)
|
||||
self.assertFalse(self.sp2.delete.called)
|
||||
|
||||
def test_delete_ignores_backend_without_playlist_support(self):
|
||||
self.core.playlists.delete('dummy3:a')
|
||||
success = self.core.playlists.delete('dummy3:a')
|
||||
|
||||
self.assertFalse(success)
|
||||
self.assertFalse(self.sp1.delete.called)
|
||||
self.assertFalse(self.sp2.delete.called)
|
||||
|
||||
@ -377,7 +381,7 @@ class DeleteBadBackendsTest(MockBackendCorePlaylistsBase):
|
||||
|
||||
def test_backend_raises_exception(self, logger):
|
||||
self.playlists.delete.return_value.get.side_effect = Exception
|
||||
self.assertIsNone(self.core.playlists.delete('dummy:/1'))
|
||||
self.assertFalse(self.core.playlists.delete('dummy:/1'))
|
||||
logger.exception.assert_called_with(mock.ANY, 'DummyBackend')
|
||||
|
||||
|
||||
|
||||
@ -31,9 +31,10 @@ class DummyAudio(pykka.ThreadingActor):
|
||||
|
||||
def set_uri(self, uri):
|
||||
assert self._uri is None, 'prepare change not called before set'
|
||||
self._tags = {}
|
||||
self._position = 0
|
||||
self._uri = uri
|
||||
self._stream_changed = True
|
||||
self._tags = {}
|
||||
|
||||
def set_appsrc(self, *args, **kwargs):
|
||||
pass
|
||||
|
||||
@ -102,11 +102,15 @@ class DummyPlaylistsProvider(backend.PlaylistsProvider):
|
||||
def __init__(self, backend):
|
||||
super(DummyPlaylistsProvider, self).__init__(backend)
|
||||
self._playlists = []
|
||||
self._allow_save = True
|
||||
|
||||
def set_dummy_playlists(self, playlists):
|
||||
"""For tests using the dummy provider through an actor proxy."""
|
||||
self._playlists = playlists
|
||||
|
||||
def set_allow_save(self, enabled):
|
||||
self._allow_save = enabled
|
||||
|
||||
def as_list(self):
|
||||
return [
|
||||
Ref.playlist(uri=pl.uri, name=pl.name) for pl in self._playlists]
|
||||
@ -137,6 +141,9 @@ class DummyPlaylistsProvider(backend.PlaylistsProvider):
|
||||
self._playlists.remove(playlist)
|
||||
|
||||
def save(self, playlist):
|
||||
if not self._allow_save:
|
||||
return None
|
||||
|
||||
old_playlist = self.lookup(playlist.uri)
|
||||
|
||||
if old_playlist is not None:
|
||||
|
||||
@ -41,48 +41,48 @@ class StaticFileHandlerTest(tornado.testing.AsyncHTTPTestCase):
|
||||
response.headers['Cache-Control'], 'no-cache')
|
||||
|
||||
|
||||
# We aren't bothering with skipIf as then we would need to "backport" gen_test
|
||||
if hasattr(tornado.websocket, 'websocket_connect'):
|
||||
class WebSocketHandlerTest(tornado.testing.AsyncHTTPTestCase):
|
||||
class WebSocketHandlerTest(tornado.testing.AsyncHTTPTestCase):
|
||||
|
||||
def get_app(self):
|
||||
self.core = mock.Mock()
|
||||
return tornado.web.Application([
|
||||
(r'/ws/?', handlers.WebSocketHandler, {'core': self.core})
|
||||
])
|
||||
def get_app(self):
|
||||
self.core = mock.Mock()
|
||||
return tornado.web.Application([
|
||||
(r'/ws/?', handlers.WebSocketHandler, {
|
||||
'core': self.core, 'allowed_origins': []
|
||||
})
|
||||
])
|
||||
|
||||
def connection(self):
|
||||
url = self.get_url('/ws').replace('http', 'ws')
|
||||
return tornado.websocket.websocket_connect(url, self.io_loop)
|
||||
def connection(self):
|
||||
url = self.get_url('/ws').replace('http', 'ws')
|
||||
return tornado.websocket.websocket_connect(url, self.io_loop)
|
||||
|
||||
@tornado.testing.gen_test
|
||||
def test_invalid_json_rpc_request_doesnt_crash_handler(self):
|
||||
# An uncaught error would result in no message, so this is just a
|
||||
# simplistic test to verify this.
|
||||
conn = yield self.connection()
|
||||
conn.write_message('invalid request')
|
||||
message = yield conn.read_message()
|
||||
self.assertTrue(message)
|
||||
@tornado.testing.gen_test
|
||||
def test_invalid_json_rpc_request_doesnt_crash_handler(self):
|
||||
# An uncaught error would result in no message, so this is just a
|
||||
# simplistic test to verify this.
|
||||
conn = yield self.connection()
|
||||
conn.write_message('invalid request')
|
||||
message = yield conn.read_message()
|
||||
self.assertTrue(message)
|
||||
|
||||
@tornado.testing.gen_test
|
||||
def test_broadcast_makes_it_to_client(self):
|
||||
conn = yield self.connection()
|
||||
handlers.WebSocketHandler.broadcast('message')
|
||||
message = yield conn.read_message()
|
||||
self.assertEqual(message, 'message')
|
||||
@tornado.testing.gen_test
|
||||
def test_broadcast_makes_it_to_client(self):
|
||||
conn = yield self.connection()
|
||||
handlers.WebSocketHandler.broadcast('message')
|
||||
message = yield conn.read_message()
|
||||
self.assertEqual(message, 'message')
|
||||
|
||||
@tornado.testing.gen_test
|
||||
def test_broadcast_to_client_that_just_closed_connection(self):
|
||||
conn = yield self.connection()
|
||||
conn.stream.close()
|
||||
handlers.WebSocketHandler.broadcast('message')
|
||||
@tornado.testing.gen_test
|
||||
def test_broadcast_to_client_that_just_closed_connection(self):
|
||||
conn = yield self.connection()
|
||||
conn.stream.close()
|
||||
handlers.WebSocketHandler.broadcast('message')
|
||||
|
||||
@tornado.testing.gen_test
|
||||
def test_broadcast_to_client_without_ws_connection_present(self):
|
||||
yield self.connection()
|
||||
# Tornado checks for ws_connection and raises WebSocketClosedError
|
||||
# if it is missing, this test case simulates winning a race were
|
||||
# this has happened but we have not yet been removed from clients.
|
||||
for client in handlers.WebSocketHandler.clients:
|
||||
client.ws_connection = None
|
||||
handlers.WebSocketHandler.broadcast('message')
|
||||
@tornado.testing.gen_test
|
||||
def test_broadcast_to_client_without_ws_connection_present(self):
|
||||
yield self.connection()
|
||||
# Tornado checks for ws_connection and raises WebSocketClosedError
|
||||
# if it is missing, this test case simulates winning a race were
|
||||
# this has happened but we have not yet been removed from clients.
|
||||
for client in handlers.WebSocketHandler.clients:
|
||||
client.ws_connection = None
|
||||
handlers.WebSocketHandler.broadcast('message')
|
||||
|
||||

@@ -20,6 +20,7 @@ class HttpServerTest(tornado.testing.AsyncHTTPTestCase):
'port': 6680,
'static_dir': None,
'zeroconf': '',
'allowed_origins': [],
}
}

@@ -128,7 +129,8 @@ class MopidyRPCHandlerTest(HttpServerTest):
def test_should_return_rpc_error(self):
cmd = tornado.escape.json_encode({'action': 'get_version'})

response = self.fetch('/mopidy/rpc', method='POST', body=cmd)
response = self.fetch('/mopidy/rpc', method='POST', body=cmd, headers={
'Content-Type': 'application/json'})

self.assertEqual(
{'jsonrpc': '2.0', 'id': None, 'error':
@@ -139,7 +141,8 @@ class MopidyRPCHandlerTest(HttpServerTest):
def test_should_return_parse_error(self):
cmd = '{[[[]}'

response = self.fetch('/mopidy/rpc', method='POST', body=cmd)
response = self.fetch('/mopidy/rpc', method='POST', body=cmd, headers={
'Content-Type': 'application/json'})

self.assertEqual(
{'jsonrpc': '2.0', 'id': None, 'error':
@@ -154,7 +157,8 @@ class MopidyRPCHandlerTest(HttpServerTest):
'id': 1,
})

response = self.fetch('/mopidy/rpc', method='POST', body=cmd)
response = self.fetch('/mopidy/rpc', method='POST', body=cmd, headers={
'Content-Type': 'application/json'})

self.assertEqual(
{'jsonrpc': '2.0', 'id': 1, 'result': mopidy.__version__},
@@ -168,6 +172,38 @@ class MopidyRPCHandlerTest(HttpServerTest):
self.assertIn('Cache-Control', response.headers)
self.assertIn('Content-Type', response.headers)

def test_should_require_correct_content_type(self):
cmd = tornado.escape.json_encode({
'method': 'core.get_version',
'params': [],
'jsonrpc': '2.0',
'id': 1,
})

response = self.fetch('/mopidy/rpc', method='POST', body=cmd, headers={
'Content-Type': 'text/plain'})

self.assertEqual(response.code, 415)
self.assertEqual(
response.reason, 'Content-Type must be application/json')

def test_different_origin_returns_access_denied(self):
response = self.fetch('/mopidy/rpc', method='OPTIONS', headers={
'Host': 'me:6680', 'Origin': 'http://evil:666'})

self.assertEqual(response.code, 403)
self.assertEqual(
response.reason, 'Access denied for origin http://evil:666')

def test_same_origin_returns_cors_headers(self):
response = self.fetch('/mopidy/rpc', method='OPTIONS', headers={
'Host': 'me:6680', 'Origin': 'http://me:6680'})

self.assertEqual(
response.headers['Access-Control-Allow-Origin'], 'http://me:6680')
self.assertEqual(
response.headers['Access-Control-Allow-Headers'], 'Content-Type')


class HttpServerWithStaticFilesTest(tornado.testing.AsyncHTTPTestCase):

@@ -51,7 +51,7 @@ class ConnectionTest(unittest.TestCase):

network.Connection.__init__(
self.mock, protocol, protocol_kwargs, sock, addr, sentinel.timeout)
self.assertEqual(sock, self.mock.sock)
self.assertEqual(sock, self.mock._sock)
self.assertEqual(protocol, self.mock.protocol)
self.assertEqual(protocol_kwargs, self.mock.protocol_kwargs)
self.assertEqual(sentinel.timeout, self.mock.timeout)
@@ -73,7 +73,7 @@ class ConnectionTest(unittest.TestCase):
def test_stop_disables_recv_send_and_timeout(self):
self.mock.stopping = False
self.mock.actor_ref = Mock()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.stop(self.mock, sentinel.reason)
self.mock.disable_timeout.assert_called_once_with()
@@ -83,24 +83,24 @@ class ConnectionTest(unittest.TestCase):
def test_stop_closes_socket(self):
self.mock.stopping = False
self.mock.actor_ref = Mock()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.stop(self.mock, sentinel.reason)
self.mock.sock.close.assert_called_once_with()
self.mock._sock.close.assert_called_once_with()

def test_stop_closes_socket_error(self):
self.mock.stopping = False
self.mock.actor_ref = Mock()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.close.side_effect = socket.error
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.close.side_effect = socket.error

network.Connection.stop(self.mock, sentinel.reason)
self.mock.sock.close.assert_called_once_with()
self.mock._sock.close.assert_called_once_with()

def test_stop_stops_actor(self):
self.mock.stopping = False
self.mock.actor_ref = Mock()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.stop(self.mock, sentinel.reason)
self.mock.actor_ref.stop.assert_called_once_with(block=False)
@@ -109,7 +109,7 @@ class ConnectionTest(unittest.TestCase):
self.mock.stopping = False
self.mock.actor_ref = Mock()
self.mock.actor_ref.stop.side_effect = pykka.ActorDeadError()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.stop(self.mock, sentinel.reason)
self.mock.actor_ref.stop.assert_called_once_with(block=False)
@@ -117,7 +117,7 @@ class ConnectionTest(unittest.TestCase):
def test_stop_sets_stopping_to_true(self):
self.mock.stopping = False
self.mock.actor_ref = Mock()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.stop(self.mock, sentinel.reason)
self.assertEqual(True, self.mock.stopping)
@@ -125,17 +125,17 @@ class ConnectionTest(unittest.TestCase):
def test_stop_does_not_proceed_when_already_stopping(self):
self.mock.stopping = True
self.mock.actor_ref = Mock()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.stop(self.mock, sentinel.reason)
self.assertEqual(0, self.mock.actor_ref.stop.call_count)
self.assertEqual(0, self.mock.sock.close.call_count)
self.assertEqual(0, self.mock._sock.close.call_count)

@patch.object(network.logger, 'log', new=Mock())
def test_stop_logs_reason(self):
self.mock.stopping = False
self.mock.actor_ref = Mock()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.stop(self.mock, sentinel.reason)
network.logger.log.assert_called_once_with(
@@ -145,7 +145,7 @@ class ConnectionTest(unittest.TestCase):
def test_stop_logs_reason_with_level(self):
self.mock.stopping = False
self.mock.actor_ref = Mock()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.stop(
self.mock, sentinel.reason, level=sentinel.level)
@@ -156,7 +156,7 @@ class ConnectionTest(unittest.TestCase):
def test_stop_logs_that_it_is_calling_itself(self):
self.mock.stopping = True
self.mock.actor_ref = Mock()
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.stop(self.mock, sentinel.reason)
network.logger.log(any_int, any_unicode)
@@ -164,8 +164,8 @@ class ConnectionTest(unittest.TestCase):
@patch.object(GObject, 'io_add_watch', new=Mock())
def test_enable_recv_registers_with_gobject(self):
self.mock.recv_id = None
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.fileno.return_value = sentinel.fileno
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.fileno.return_value = sentinel.fileno
GObject.io_add_watch.return_value = sentinel.tag

network.Connection.enable_recv(self.mock)
@@ -177,7 +177,7 @@ class ConnectionTest(unittest.TestCase):

@patch.object(GObject, 'io_add_watch', new=Mock())
def test_enable_recv_already_registered(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)
self.mock.recv_id = sentinel.tag

network.Connection.enable_recv(self.mock)
@@ -185,7 +185,7 @@ class ConnectionTest(unittest.TestCase):

def test_enable_recv_does_not_change_tag(self):
self.mock.recv_id = sentinel.tag
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.enable_recv(self.mock)
self.assertEqual(sentinel.tag, self.mock.recv_id)
@@ -208,8 +208,8 @@ class ConnectionTest(unittest.TestCase):

def test_enable_recv_on_closed_socket(self):
self.mock.recv_id = None
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.fileno.side_effect = socket.error(errno.EBADF, '')
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.fileno.side_effect = socket.error(errno.EBADF, '')

network.Connection.enable_recv(self.mock)
self.mock.stop.assert_called_once_with(any_unicode)
@@ -218,8 +218,8 @@ class ConnectionTest(unittest.TestCase):
@patch.object(GObject, 'io_add_watch', new=Mock())
def test_enable_send_registers_with_gobject(self):
self.mock.send_id = None
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.fileno.return_value = sentinel.fileno
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.fileno.return_value = sentinel.fileno
GObject.io_add_watch.return_value = sentinel.tag

network.Connection.enable_send(self.mock)
@@ -231,7 +231,7 @@ class ConnectionTest(unittest.TestCase):

@patch.object(GObject, 'io_add_watch', new=Mock())
def test_enable_send_already_registered(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)
self.mock.send_id = sentinel.tag

network.Connection.enable_send(self.mock)
@@ -239,7 +239,7 @@ class ConnectionTest(unittest.TestCase):

def test_enable_send_does_not_change_tag(self):
self.mock.send_id = sentinel.tag
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

network.Connection.enable_send(self.mock)
self.assertEqual(sentinel.tag, self.mock.send_id)
@@ -262,8 +262,8 @@ class ConnectionTest(unittest.TestCase):

def test_enable_send_on_closed_socket(self):
self.mock.send_id = None
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.fileno.side_effect = socket.error(errno.EBADF, '')
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.fileno.side_effect = socket.error(errno.EBADF, '')

network.Connection.enable_send(self.mock)
self.assertEqual(None, self.mock.send_id)
@@ -367,7 +367,7 @@ class ConnectionTest(unittest.TestCase):
self.assertEqual('', self.mock.send_buffer)

def test_recv_callback_respects_io_err(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)
self.mock.actor_ref = Mock()

self.assertTrue(network.Connection.recv_callback(
@@ -375,7 +375,7 @@ class ConnectionTest(unittest.TestCase):
self.mock.stop.assert_called_once_with(any_unicode)

def test_recv_callback_respects_io_hup(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)
self.mock.actor_ref = Mock()

self.assertTrue(network.Connection.recv_callback(
@@ -383,7 +383,7 @@ class ConnectionTest(unittest.TestCase):
self.mock.stop.assert_called_once_with(any_unicode)

def test_recv_callback_respects_io_hup_and_io_err(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)
self.mock.actor_ref = Mock()

self.assertTrue(network.Connection.recv_callback(
@@ -392,8 +392,8 @@ class ConnectionTest(unittest.TestCase):
self.mock.stop.assert_called_once_with(any_unicode)

def test_recv_callback_sends_data_to_actor(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.recv.return_value = 'data'
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.recv.return_value = 'data'
self.mock.actor_ref = Mock()

self.assertTrue(network.Connection.recv_callback(
@@ -402,8 +402,8 @@ class ConnectionTest(unittest.TestCase):
{'received': 'data'})

def test_recv_callback_handles_dead_actors(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.recv.return_value = 'data'
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.recv.return_value = 'data'
self.mock.actor_ref = Mock()
self.mock.actor_ref.tell.side_effect = pykka.ActorDeadError()

@@ -412,38 +412,38 @@ class ConnectionTest(unittest.TestCase):
self.mock.stop.assert_called_once_with(any_unicode)

def test_recv_callback_gets_no_data(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.recv.return_value = ''
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.recv.return_value = ''
self.mock.actor_ref = Mock()

self.assertTrue(network.Connection.recv_callback(
self.mock, sentinel.fd, GObject.IO_IN))
self.assertEqual(self.mock.mock_calls, [
call.sock.recv(any_int),
call._sock.recv(any_int),
call.disable_recv(),
call.actor_ref.tell({'close': True}),
])

def test_recv_callback_recoverable_error(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

for error in (errno.EWOULDBLOCK, errno.EINTR):
self.mock.sock.recv.side_effect = socket.error(error, '')
self.mock._sock.recv.side_effect = socket.error(error, '')
self.assertTrue(network.Connection.recv_callback(
self.mock, sentinel.fd, GObject.IO_IN))
self.assertEqual(0, self.mock.stop.call_count)

def test_recv_callback_unrecoverable_error(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.recv.side_effect = socket.error
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.recv.side_effect = socket.error

self.assertTrue(network.Connection.recv_callback(
self.mock, sentinel.fd, GObject.IO_IN))
self.mock.stop.assert_called_once_with(any_unicode)

def test_send_callback_respects_io_err(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.send.return_value = 1
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.send.return_value = 1
self.mock.send_lock = Mock()
self.mock.actor_ref = Mock()
self.mock.send_buffer = ''
@@ -453,8 +453,8 @@ class ConnectionTest(unittest.TestCase):
self.mock.stop.assert_called_once_with(any_unicode)

def test_send_callback_respects_io_hup(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.send.return_value = 1
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.send.return_value = 1
self.mock.send_lock = Mock()
self.mock.actor_ref = Mock()
self.mock.send_buffer = ''
@@ -464,8 +464,8 @@ class ConnectionTest(unittest.TestCase):
self.mock.stop.assert_called_once_with(any_unicode)

def test_send_callback_respects_io_hup_and_io_err(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.send.return_value = 1
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.send.return_value = 1
self.mock.send_lock = Mock()
self.mock.actor_ref = Mock()
self.mock.send_buffer = ''
@@ -479,8 +479,8 @@ class ConnectionTest(unittest.TestCase):
self.mock.send_lock = Mock()
self.mock.send_lock.acquire.return_value = True
self.mock.send_buffer = ''
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.send.return_value = 0
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.send.return_value = 0

self.assertTrue(network.Connection.send_callback(
self.mock, sentinel.fd, GObject.IO_IN))
@@ -491,13 +491,13 @@ class ConnectionTest(unittest.TestCase):
self.mock.send_lock = Mock()
self.mock.send_lock.acquire.return_value = False
self.mock.send_buffer = ''
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.send.return_value = 0
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.send.return_value = 0

self.assertTrue(network.Connection.send_callback(
self.mock, sentinel.fd, GObject.IO_IN))
self.mock.send_lock.acquire.assert_called_once_with(False)
self.assertEqual(0, self.mock.sock.send.call_count)
self.assertEqual(0, self.mock._sock.send.call_count)

def test_send_callback_sends_all_data(self):
self.mock.send_lock = Mock()
@@ -523,31 +523,31 @@ class ConnectionTest(unittest.TestCase):
self.assertEqual('ta', self.mock.send_buffer)

def test_send_recoverable_error(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock._sock = Mock(spec=socket.SocketType)

for error in (errno.EWOULDBLOCK, errno.EINTR):
self.mock.sock.send.side_effect = socket.error(error, '')
self.mock._sock.send.side_effect = socket.error(error, '')

network.Connection.send(self.mock, 'data')
self.assertEqual(0, self.mock.stop.call_count)

def test_send_calls_socket_send(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.send.return_value = 4
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.send.return_value = 4

self.assertEqual('', network.Connection.send(self.mock, 'data'))
self.mock.sock.send.assert_called_once_with('data')
self.mock._sock.send.assert_called_once_with('data')

def test_send_calls_socket_send_partial_send(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.send.return_value = 2
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.send.return_value = 2

self.assertEqual('ta', network.Connection.send(self.mock, 'data'))
self.mock.sock.send.assert_called_once_with('data')
self.mock._sock.send.assert_called_once_with('data')

def test_send_unrecoverable_error(self):
self.mock.sock = Mock(spec=socket.SocketType)
self.mock.sock.send.side_effect = socket.error
self.mock._sock = Mock(spec=socket.SocketType)
self.mock._sock.send.side_effect = socket.error

self.assertEqual('', network.Connection.send(self.mock, 'data'))
self.mock.stop.assert_called_once_with(any_unicode)

@@ -1,11 +1,13 @@
from __future__ import absolute_import, unicode_literals

import errno
import os
import socket
import unittest

from mock import Mock, patch, sentinel

from mopidy import exceptions
from mopidy.internal import network
from mopidy.internal.gi import GObject

@@ -22,6 +24,7 @@ class ServerTest(unittest.TestCase):
self.mock, sentinel.host, sentinel.port, sentinel.protocol)
self.mock.create_server_socket.assert_called_once_with(
sentinel.host, sentinel.port)
self.mock.stop()

def test_init_calls_register_server(self):
sock = Mock(spec=socket.SocketType)
@@ -55,40 +58,99 @@ class ServerTest(unittest.TestCase):
self.assertEqual(sentinel.timeout, self.mock.timeout)
self.assertEqual(sock, self.mock.server_socket)

@patch.object(network, 'create_socket', spec=socket.SocketType)
def test_create_server_socket_sets_up_listener(self, create_socket):
sock = create_socket.return_value
def test_create_server_socket_no_port(self):
with self.assertRaises(exceptions.ValidationError):
network.Server.create_server_socket(
self.mock, str(sentinel.host), None)

def test_create_server_socket_invalid_port(self):
with self.assertRaises(exceptions.ValidationError):
network.Server.create_server_socket(
self.mock, str(sentinel.host), str(sentinel.port))

@patch.object(network, 'create_tcp_socket', spec=socket.SocketType)
def test_create_server_socket_sets_up_listener(self, create_tcp_socket):
sock = create_tcp_socket.return_value

network.Server.create_server_socket(
self.mock, sentinel.host, sentinel.port)
self.mock, str(sentinel.host), 1234)
sock.setblocking.assert_called_once_with(False)
sock.bind.assert_called_once_with((sentinel.host, sentinel.port))
sock.bind.assert_called_once_with((str(sentinel.host), 1234))
sock.listen.assert_called_once_with(any_int)
create_tcp_socket.assert_called_once()

@patch.object(network, 'create_socket', new=Mock())
@patch.object(network, 'create_unix_socket', spec=socket.SocketType)
def test_create_server_socket_sets_up_listener_unix(
self,
create_unix_socket):
sock = create_unix_socket.return_value

network.Server.create_server_socket(
self.mock, 'unix:' + str(sentinel.host), sentinel.port)
sock.setblocking.assert_called_once_with(False)
sock.bind.assert_called_once_with(str(sentinel.host))
sock.listen.assert_called_once_with(any_int)
create_unix_socket.assert_called_once()

@patch.object(network, 'create_tcp_socket', new=Mock())
def test_create_server_socket_fails(self):
network.create_socket.side_effect = socket.error
network.create_tcp_socket.side_effect = socket.error
with self.assertRaises(socket.error):
network.Server.create_server_socket(
self.mock, sentinel.host, sentinel.port)
self.mock, str(sentinel.host), 1234)

@patch.object(network, 'create_socket', new=Mock())
@patch.object(network, 'create_unix_socket', new=Mock())
def test_create_server_socket_fails_unix(self):
network.create_unix_socket.side_effect = socket.error
with self.assertRaises(socket.error):
network.Server.create_server_socket(
self.mock, 'unix:' + str(sentinel.host), sentinel.port)

@patch.object(network, 'create_tcp_socket', new=Mock())
def test_create_server_bind_fails(self):
sock = network.create_socket.return_value
sock = network.create_tcp_socket.return_value
sock.bind.side_effect = socket.error

with self.assertRaises(socket.error):
network.Server.create_server_socket(
self.mock, sentinel.host, sentinel.port)
self.mock, str(sentinel.host), 1234)

@patch.object(network, 'create_socket', new=Mock())
@patch.object(network, 'create_unix_socket', new=Mock())
def test_create_server_bind_fails_unix(self):
sock = network.create_unix_socket.return_value
sock.bind.side_effect = socket.error

with self.assertRaises(socket.error):
network.Server.create_server_socket(
self.mock, 'unix:' + str(sentinel.host), sentinel.port)

@patch.object(network, 'create_tcp_socket', new=Mock())
def test_create_server_listen_fails(self):
sock = network.create_socket.return_value
sock = network.create_tcp_socket.return_value
sock.listen.side_effect = socket.error

with self.assertRaises(socket.error):
network.Server.create_server_socket(
self.mock, sentinel.host, sentinel.port)
self.mock, str(sentinel.host), 1234)

@patch.object(network, 'create_unix_socket', new=Mock())
def test_create_server_listen_fails_unix(self):
sock = network.create_unix_socket.return_value
sock.listen.side_effect = socket.error

with self.assertRaises(socket.error):
network.Server.create_server_socket(
self.mock, 'unix:' + str(sentinel.host), sentinel.port)

@patch.object(os, 'unlink', new=Mock())
@patch.object(GObject, 'source_remove', new=Mock())
def test_stop_server_cleans_unix_socket(self):
self.mock.watcher = Mock()
sock = Mock()
sock.family = socket.AF_UNIX
self.mock.server_socket = sock
network.Server.stop(self.mock)
os.unlink.assert_called_once_with(sock.getsockname())

@patch.object(GObject, 'io_add_watch', new=Mock())
def test_register_server_socket_sets_up_io_watch(self):
@@ -124,13 +186,26 @@ class ServerTest(unittest.TestCase):

def test_accept_connection(self):
sock = Mock(spec=socket.SocketType)
sock.accept.return_value = (sentinel.sock, sentinel.addr)
connected_sock = Mock(spec=socket.SocketType)
sock.accept.return_value = (connected_sock, sentinel.addr)
self.mock.server_socket = sock

sock, addr = network.Server.accept_connection(self.mock)
self.assertEqual(sentinel.sock, sock)
self.assertEqual(connected_sock, sock)
self.assertEqual(sentinel.addr, addr)

def test_accept_connection_unix(self):
sock = Mock(spec=socket.SocketType)
connected_sock = Mock(spec=socket.SocketType)
connected_sock.family = socket.AF_UNIX
connected_sock.getsockname.return_value = sentinel.sockname
sock.accept.return_value = (connected_sock, sentinel.addr)
self.mock.server_socket = sock

sock, addr = network.Server.accept_connection(self.mock)
self.assertEqual(connected_sock, sock)
self.assertEqual((sentinel.sockname, None), addr)

def test_accept_connection_recoverable_error(self):
sock = Mock(spec=socket.SocketType)
self.mock.server_socket = sock
@@ -182,6 +257,7 @@ class ServerTest(unittest.TestCase):
sentinel.protocol, {}, sentinel.sock, sentinel.addr,
sentinel.timeout)

@patch.object(network, 'format_socket_name', new=Mock())
def test_reject_connection(self):
sock = Mock(spec=socket.SocketType)

@@ -189,6 +265,7 @@ class ServerTest(unittest.TestCase):
self.mock, sock, (sentinel.host, sentinel.port))
sock.close.assert_called_once_with()

@patch.object(network, 'format_socket_name', new=Mock())
def test_reject_connection_error(self):
sock = Mock(spec=socket.SocketType)
sock.close.side_effect = socket.error

@@ -3,7 +3,7 @@ from __future__ import absolute_import, unicode_literals
import socket
import unittest

from mock import Mock, patch
from mock import Mock, patch, sentinel

from mopidy.internal import network

@@ -22,6 +22,25 @@ class FormatHostnameTest(unittest.TestCase):
self.assertEqual(network.format_hostname('0.0.0.0'), '0.0.0.0')


class FormatSocketConnectionTest(unittest.TestCase):

def test_format_socket_name(self):
sock = Mock(spec=socket.SocketType)
sock.family = socket.AF_INET
sock.getsockname.return_value = (sentinel.ip, sentinel.port)
self.assertEqual(
network.format_socket_name(sock),
'[%s]:%s' % (sentinel.ip, sentinel.port))

def test_format_socket_name_unix(self):
sock = Mock(spec=socket.SocketType)
sock.family = socket.AF_UNIX
sock.getsockname.return_value = sentinel.sockname
self.assertEqual(
network.format_socket_name(sock),
str(sentinel.sockname))


class TryIPv6SocketTest(unittest.TestCase):

@patch('socket.has_ipv6', False)
@@ -46,14 +65,14 @@ class CreateSocketTest(unittest.TestCase):
@patch('mopidy.internal.network.has_ipv6', False)
@patch('socket.socket')
def test_ipv4_socket(self, socket_mock):
network.create_socket()
network.create_tcp_socket()
self.assertEqual(
socket_mock.call_args[0], (socket.AF_INET, socket.SOCK_STREAM))

@patch('mopidy.internal.network.has_ipv6', True)
@patch('socket.socket')
def test_ipv6_socket(self, socket_mock):
network.create_socket()
network.create_tcp_socket()
self.assertEqual(
socket_mock.call_args[0], (socket.AF_INET6, socket.SOCK_STREAM))

@@ -40,6 +40,13 @@ class LocaleDecodeTest(unittest.TestCase):
def test_does_not_use_locale_to_decode_ascii_bytestrings(self, mock):
mock.return_value = 'UTF-8'

encoding.locale_decode('abc')
encoding.locale_decode(b'abc')

self.assertFalse(mock.called)

def test_replaces_unknown_bytes_instead_of_crashing(self, mock):
mock.return_value = 'US-ASCII'

result = encoding.locale_decode(b'abc\xc3def')

assert result == 'abc\ufffddef'

@@ -33,7 +33,7 @@ def test_download_on_server_side_error(session, caplog):
result = http.download(session, URI)

assert result is None
assert 'Problem downloading' in caplog.text()
assert 'Problem downloading' in caplog.text


def test_download_times_out_if_connection_times_out(session_mock, caplog):
@@ -45,7 +45,7 @@ def test_download_times_out_if_connection_times_out(session_mock, caplog):
assert result is None
assert (
'Download of %r failed due to connection timeout after 1.000s' % URI
in caplog.text())
in caplog.text)


@responses.activate
@@ -60,4 +60,4 @@ def test_download_times_out_if_download_is_slow(session, caplog):
assert result is None
assert (
'Download of %r failed due to download taking more than 1.000s' % URI
in caplog.text())
in caplog.text)

@@ -137,6 +137,18 @@ class GetOrCreateFileTest(unittest.TestCase):
self.assertEqual(fh.read(), b'foobar\xc3\xa6\xc3\xb8\xc3\xa5')


class GetUnixSocketPathTest(unittest.TestCase):

def test_correctly_matched_socket_path(self):
self.assertEqual(
path.get_unix_socket_path('unix:/tmp/mopidy.socket'),
'/tmp/mopidy.socket'
)

def test_correctly_no_match_socket_path(self):
self.assertIsNone(path.get_unix_socket_path('127.0.0.1'))


class PathToFileURITest(unittest.TestCase):

def test_simple_path(self):

@@ -94,9 +94,15 @@ class M3UPlaylistsProviderTest(unittest.TestCase):
self.assertEqual(uri, playlist.uri)
self.assertTrue(os.path.exists(path))

self.core.playlists.delete(playlist.uri)
success = self.core.playlists.delete(playlist.uri)
self.assertTrue(success)
self.assertFalse(os.path.exists(path))

def test_delete_on_path_outside_playlist_dir_returns_none(self):
success = self.core.playlists.delete('m3u:///etc/passwd')

self.assertFalse(success)

def test_playlist_contents_is_written_to_disk(self):
track = Track(uri=generate_song(1))
playlist = self.core.playlists.create('test')
@@ -122,7 +128,7 @@ class M3UPlaylistsProviderTest(unittest.TestCase):
def test_latin1_playlist_contents_is_written_to_disk(self):
track = Track(uri=generate_song(1), name='Test\x9f', length=60000)
playlist = self.core.playlists.create('test')
playlist = self.core.playlists.save(playlist.copy(tracks=[track]))
playlist = self.core.playlists.save(playlist.replace(tracks=[track]))
path = os.path.join(self.playlists_dir, b'test.m3u')

with open(path, 'rb') as f:
@@ -132,7 +138,7 @@ class M3UPlaylistsProviderTest(unittest.TestCase):
def test_utf8_playlist_contents_is_replaced_and_written_to_disk(self):
track = Track(uri=generate_song(1), name='Test\u07b4', length=60000)
playlist = self.core.playlists.create('test')
playlist = self.core.playlists.save(playlist.copy(tracks=[track]))
playlist = self.core.playlists.save(playlist.replace(tracks=[track]))
path = os.path.join(self.playlists_dir, b'test.m3u')

with open(path, 'rb') as f:
@@ -151,6 +157,9 @@ class M3UPlaylistsProviderTest(unittest.TestCase):
self.assertEqual(playlist.name, result.name)
self.assertEqual(track.uri, result.tracks[0].uri)

@unittest.skipIf(
platform.system() == 'Darwin',
'macOS 10.13 raises IOError "Illegal byte sequence" on open.')
def test_load_playlist_with_nonfilesystem_encoding_of_filename(self):
path = os.path.join(self.playlists_dir, 'øæå.m3u'.encode('latin-1'))
with open(path, 'wb+') as f:
@@ -160,10 +169,7 @@ class M3UPlaylistsProviderTest(unittest.TestCase):

self.assertEqual(len(self.core.playlists.as_list()), 1)
result = self.core.playlists.as_list()
if platform.system() == 'Darwin':
self.assertEqual('%F8%E6%E5', result[0].name)
else:
self.assertEqual('\ufffd\ufffd\ufffd', result[0].name)
self.assertEqual('\ufffd\ufffd\ufffd', result[0].name)

@unittest.SkipTest
def test_playlists_dir_is_created(self):
@@ -216,6 +222,11 @@ class M3UPlaylistsProviderTest(unittest.TestCase):

self.assertEqual(original_playlist, looked_up_playlist)

def test_lookup_on_path_outside_playlist_dir_returns_none(self):
result = self.core.playlists.lookup('m3u:///etc/passwd')

self.assertIsNone(result)

def test_refresh(self):
playlist = self.core.playlists.create('test')
self.assertEqual(playlist, self.core.playlists.lookup(playlist.uri))
@@ -251,6 +262,10 @@ class M3UPlaylistsProviderTest(unittest.TestCase):
path = os.path.join(self.playlists_dir, b'test.m3u')
self.assertTrue(os.path.exists(path))

def test_save_on_path_outside_playlist_dir_returns_none(self):
result = self.core.playlists.save(Playlist(uri='m3u:///tmp/test.m3u'))
self.assertIsNone(result)

def test_playlist_with_unknown_track(self):
track = Track(uri='file:///dev/null')
playlist = self.core.playlists.create('test')
@@ -330,6 +345,11 @@ class M3UPlaylistsProviderTest(unittest.TestCase):

self.assertIsNone(item_refs)

def test_get_items_from_file_outside_playlist_dir_returns_none(self):
item_refs = self.core.playlists.get_items('m3u:///etc/passwd')

self.assertIsNone(item_refs)


class M3UPlaylistsProviderBaseDirectoryTest(M3UPlaylistsProviderTest):

@@ -23,7 +23,7 @@ class ReflectionHandlerTest(protocol.BaseTestCase):
self.assertNotInResponse('command: command_list_begin')
self.assertNotInResponse('command: command_list_ok_begin')
self.assertNotInResponse('command: command_list_end')
self.assertNotInResponse('command: idle')
self.assertInResponse('command: idle')
self.assertNotInResponse('command: noidle')
self.assertNotInResponse('command: sticker')
self.assertInResponse('OK')

@@ -46,6 +46,10 @@ class PlaylistsHandlerTest(protocol.BaseTestCase):
self.assertInResponse('OK')

def test_listplaylistinfo(self):
tracks = [
Track(uri='dummy:a', name='Track A', length=5000),
]
self.backend.library.dummy_library = tracks
self.backend.playlists.set_dummy_playlists([
Playlist(
name='name', uri='dummy:name', tracks=[Track(uri='dummy:a')])])
@@ -53,14 +57,20 @@ class PlaylistsHandlerTest(protocol.BaseTestCase):
self.send_request('listplaylistinfo "name"')

self.assertInResponse('file: dummy:a')
self.assertInResponse('Title: Track A')
self.assertInResponse('Time: 5')
self.assertNotInResponse('Track: 0')
self.assertNotInResponse('Pos: 0')
self.assertInResponse('OK')

def test_listplaylistinfo_without_quotes(self):
tracks = [
Track(uri='dummy:a'),
]
self.backend.library.dummy_library = tracks
self.backend.playlists.set_dummy_playlists([
Playlist(
name='name', uri='dummy:name', tracks=[Track(uri='dummy:a')])])
name='name', uri='dummy:name', tracks=tracks)])

self.send_request('listplaylistinfo name')

@@ -76,13 +86,18 @@ class PlaylistsHandlerTest(protocol.BaseTestCase):
'ACK [50@0] {listplaylistinfo} No such playlist')

def test_listplaylistinfo_duplicate(self):
playlist1 = Playlist(name='a', uri='dummy:a1', tracks=[Track(uri='b')])
playlist2 = Playlist(name='a', uri='dummy:a2', tracks=[Track(uri='c')])
tracks = [
Track(uri='dummy:b'),
Track(uri='dummy:c'),
]
self.backend.library.dummy_library = tracks
playlist1 = Playlist(name='a', uri='dummy:a1', tracks=tracks[:1])
playlist2 = Playlist(name='a', uri='dummy:a2', tracks=tracks[1:])
self.backend.playlists.set_dummy_playlists([playlist1, playlist2])

self.send_request('listplaylistinfo "a [2]"')

self.assertInResponse('file: c')
self.assertInResponse('file: dummy:c')
self.assertNotInResponse('Track: 0')
self.assertNotInResponse('Pos: 0')
self.assertInResponse('OK')
@@ -236,6 +251,25 @@ class PlaylistsHandlerTest(protocol.BaseTestCase):
self.send_request('load "unknown/playlist"')
self.assertEqualResponse('ACK [50@0] {load} No such playlist')

def test_load_full_track_metadata(self):
tracks = [
Track(uri='dummy:a', name='Track A', length=5000),
]
self.backend.library.dummy_library = tracks
self.backend.playlists.set_dummy_playlists([
Playlist(
name='A-list', uri='dummy:a1', tracks=[Track(uri='dummy:a')])])

self.send_request('load "A-list"')

tracks = self.core.tracklist.tracks.get()
self.assertEqual(len(tracks), 1)
self.assertEqual(tracks[0].uri, 'dummy:a')
self.assertEqual(tracks[0].name, 'Track A')
self.assertEqual(tracks[0].length, 5000)

self.assertInResponse('OK')

def test_playlistadd(self):
tracks = [
Track(uri='dummy:a'),
@@ -286,6 +320,13 @@ class PlaylistsHandlerTest(protocol.BaseTestCase):
self.assertInResponse('OK')
self.assertIsNotNone(self.backend.playlists.lookup('dummy:name').get())

def test_playlistclear_creates_playlist_save_fails(self):
self.backend.playlists.set_allow_save(False)
self.send_request('playlistclear "name"')

self.assertInResponse('ACK [0@0] {playlistclear} Backend with '
'scheme "dummy" failed to save playlist')

def test_playlistclear_invalid_name_acks(self):
self.send_request('playlistclear "foo/bar"')
self.assertInResponse('ACK [2@0] {playlistclear} playlist name is '
@@ -308,6 +349,22 @@ class PlaylistsHandlerTest(protocol.BaseTestCase):
self.assertEqual(
2, len(self.backend.playlists.get_items('dummy:a1').get()))

def test_playlistdelete_save_fails(self):
tracks = [
Track(uri='dummy:a'),
Track(uri='dummy:b'),
Track(uri='dummy:c'),
] # len() == 3
self.backend.playlists.set_dummy_playlists([
Playlist(
name='name', uri='dummy:a1', tracks=tracks)])
self.backend.playlists.set_allow_save(False)

self.send_request('playlistdelete "name" "2"')

self.assertInResponse('ACK [0@0] {playlistdelete} Backend with '
'scheme "dummy" failed to save playlist')

def test_playlistdelete_invalid_name_acks(self):
self.send_request('playlistdelete "foo/bar" "0"')
self.assertInResponse('ACK [2@0] {playlistdelete} playlist name is '
@@ -340,6 +397,22 @@ class PlaylistsHandlerTest(protocol.BaseTestCase):
"dummy:c",
self.backend.playlists.get_items('dummy:a1').get()[0].uri)

def test_playlistmove_save_fails(self):
tracks = [
Track(uri='dummy:a'),
Track(uri='dummy:b'),
Track(uri='dummy:c') # this one is getting moved to top
]
self.backend.playlists.set_dummy_playlists([
Playlist(
name='name', uri='dummy:a1', tracks=tracks)])
self.backend.playlists.set_allow_save(False)

self.send_request('playlistmove "name" "2" "0"')

self.assertInResponse('ACK [0@0] {playlistmove} Backend with '
'scheme "dummy" failed to save playlist')

def test_playlistmove_invalid_name_acks(self):
self.send_request('playlistmove "foo/bar" "0" "1"')
self.assertInResponse('ACK [2@0] {playlistmove} playlist name is '
@@ -387,6 +460,17 @@ class PlaylistsHandlerTest(protocol.BaseTestCase):
self.assertIsNotNone(
self.backend.playlists.lookup('dummy:new_name').get())

def test_rename_save_fails(self):
self.backend.playlists.set_dummy_playlists([
Playlist(
name='old_name', uri='dummy:a1', tracks=[Track(uri='b')])])
self.backend.playlists.set_allow_save(False)

self.send_request('rename "old_name" "new_name"')

self.assertInResponse('ACK [0@0] {rename} Backend with '
'scheme "dummy" failed to save playlist')

def test_rename_unknown_playlist_acks(self):
self.send_request('rename "foo" "bar"')
self.assertInResponse('ACK [50@0] {rename} No such playlist')
@@ -438,6 +522,13 @@ class PlaylistsHandlerTest(protocol.BaseTestCase):
self.assertInResponse('OK')
self.assertIsNotNone(self.backend.playlists.lookup('dummy:name').get())

def test_save_fails(self):
self.backend.playlists.set_allow_save(False)
self.send_request('save "name"')

self.assertInResponse('ACK [0@0] {save} Backend with '
'scheme "dummy" failed to save playlist')

def test_save_invalid_name_acks(self):
self.send_request('save "foo/bar"')
self.assertInResponse('ACK [2@0] {save} playlist name is invalid: '

@@ -60,4 +60,8 @@ def test_lookup_converts_uri_metadata_to_track(audio, config, track_uri):
backend = actor.StreamBackend(audio=audio, config=config)

result = backend.library.lookup(track_uri)
assert result == [Track(length=4406, uri=track_uri)]

assert len(result) == 1
track = result[0]
assert track.uri == track_uri
assert track.length == 4406

@@ -1,5 +1,7 @@
from __future__ import absolute_import, unicode_literals

import logging

import mock

import pytest
@@ -92,6 +94,7 @@ class TestTranslateURI(object):
def test_text_playlist_with_mpeg_stream(
self, scanner, provider, caplog):

caplog.set_level(logging.DEBUG)
scanner.scan.side_effect = [
# Scanning playlist
mock.Mock(mime='text/foo', playable=False),
@@ -112,11 +115,11 @@ class TestTranslateURI(object):

# Check logging to ensure debuggability
assert 'Unwrapping stream from URI: %s' % PLAYLIST_URI
assert 'Parsed playlist (%s)' % PLAYLIST_URI in caplog.text()
assert 'Parsed playlist (%s)' % PLAYLIST_URI in caplog.text
assert 'Unwrapping stream from URI: %s' % STREAM_URI
assert (
'Unwrapped potential audio/mpeg stream: %s' % STREAM_URI
in caplog.text())
in caplog.text)

# Check proper Requests session setup
assert responses.calls[0].request.headers['User-Agent'].startswith(
@@ -146,6 +149,7 @@ class TestTranslateURI(object):
def test_scan_fails_but_playlist_parsing_succeeds(
self, scanner, provider, caplog):

caplog.set_level(logging.DEBUG)
scanner.scan.side_effect = [
# Scanning playlist
exceptions.ScannerError('some failure'),
@@ -160,18 +164,18 @@ class TestTranslateURI(object):

assert 'Unwrapping stream from URI: %s' % PLAYLIST_URI
assert (
'GStreamer failed scanning URI (%s)' % PLAYLIST_URI
in caplog.text())
assert 'Parsed playlist (%s)' % PLAYLIST_URI in caplog.text()
'GStreamer failed scanning URI (%s)' % PLAYLIST_URI in caplog.text)
assert 'Parsed playlist (%s)' % PLAYLIST_URI in caplog.text
assert (
'Unwrapped potential audio/mpeg stream: %s' % STREAM_URI
in caplog.text())
in caplog.text)
assert result == STREAM_URI

@responses.activate
def test_scan_fails_and_playlist_parsing_fails(
self, scanner, provider, caplog):

caplog.set_level(logging.DEBUG)
scanner.scan.side_effect = exceptions.ScannerError('some failure')
responses.add(
responses.GET, STREAM_URI,
@@ -181,15 +185,15 @@ class TestTranslateURI(object):

assert 'Unwrapping stream from URI: %s' % STREAM_URI
assert (
'GStreamer failed scanning URI (%s)' % STREAM_URI
in caplog.text())
'GStreamer failed scanning URI (%s)' % STREAM_URI in caplog.text)
assert (
'Failed parsing URI (%s) as playlist; found potential stream.'
% STREAM_URI in caplog.text())
% STREAM_URI in caplog.text)
assert result == STREAM_URI

@responses.activate
def test_failed_download_returns_none(self, scanner, provider, caplog):
caplog.set_level(logging.DEBUG)
scanner.scan.side_effect = [
mock.Mock(mime='text/foo', playable=False)
]
@@ -204,10 +208,11 @@ class TestTranslateURI(object):

assert (
'Unwrapping stream from URI (%s) failed: '
'error downloading URI' % PLAYLIST_URI) in caplog.text()
'error downloading URI' % PLAYLIST_URI) in caplog.text

@responses.activate
def test_playlist_references_itself(self, scanner, provider, caplog):
caplog.set_level(logging.DEBUG)
scanner.scan.side_effect = [
mock.Mock(mime='text/foo', playable=False)
]
@@ -218,11 +223,11 @@ class TestTranslateURI(object):

result = provider.translate_uri(PLAYLIST_URI)

assert 'Unwrapping stream from URI: %s' % PLAYLIST_URI in caplog.text()
assert 'Unwrapping stream from URI: %s' % PLAYLIST_URI in caplog.text
assert (
'Parsed playlist (%s) and found new URI: %s'
% (PLAYLIST_URI, PLAYLIST_URI)) in caplog.text()
% (PLAYLIST_URI, PLAYLIST_URI)) in caplog.text
assert (
'Unwrapping stream from URI (%s) failed: '
'playlist referenced itself' % PLAYLIST_URI) in caplog.text()
'playlist referenced itself' % PLAYLIST_URI) in caplog.text
assert result is None

15
tox.ini
@@ -1,28 +1,19 @@
[tox]
envlist = py27, py27-tornado32, docs, flake8
envlist = py27, docs, flake8

[testenv]
sitepackages = true
commands =
py.test \
pytest \
--basetemp={envtmpdir} \
--cov=mopidy --cov-report=term-missing \
-n 4 \
{posargs}
deps =
mock
pytest<3.3.0
pytest-capturelog
pytest
pytest-cov
pytest-xdist
responses

[testenv:py27-tornado32]
commands = py.test tests/http
deps =
{[testenv]deps}
tornado==3.2.2

[testenv:docs]
deps = -r{toxinidir}/docs/requirements.txt
changedir = docs