Release v1.1.1

commit bce00e985c

@@ -1,5 +0,0 @@
[report]
omit =
    */pyshared/*
    */python?.?/*
    */site-packages/nose/*
@@ -27,7 +27,7 @@ script:
  - "tox -e $TOX_ENV"

after_success:
  - "if [ $TOX_ENV == 'py27' ]; then pip install coveralls; coveralls; fi"
  - "if [ $TOX_ENV == 'py27' ]; then pip install coveralls==1.0b1; coveralls; fi"

branches:
  except:
@@ -1,7 +1,6 @@
include *.py
include *.rst
include *.txt
include .coveragerc
include .mailmap
include .travis.yml
include AUTHORS
@@ -4,6 +4,99 @@ Changelog

This changelog is used to track all major changes to Mopidy.


v1.1.1 (2015-09-14)
===================

Bug fix release.

- Dependencies: Specify that we need Requests >= 2.0, not just any version.
  This ensures that we fail earlier if Mopidy is used with a too-old Requests.

- Core: Make :meth:`mopidy.core.LibraryController.refresh` work for all
  backends with a library provider. Previously, it wrongly worked for all
  backends with a playlists provider. (Fixes: :issue:`1257`)

- Core: Respect the :confval:`core/cache_dir` and :confval:`core/data_dir`
  config values added in 1.1.0 when creating the dirs Mopidy needs to store
  data. This should not change the behavior for desktop users running Mopidy.
  When running Mopidy as a system service installed from a package which sets
  the core dir configs properly (e.g. Debian and Arch packages), this fix
  avoids the creation of a couple of directories that should not be used,
  typically :file:`/var/lib/mopidy/.local` and :file:`/var/lib/mopidy/.cache`.
  (Fixes: :issue:`1259`, PR: :issue:`1266`)

- Core: Fix error in :meth:`~mopidy.core.TracklistController.get_eot_tlid`
  docstring. (Fixes: :issue:`1269`)

- Audio: Add ``timeout`` parameter to :meth:`~mopidy.audio.scan.Scanner.scan`.
  (Part of: :issue:`1250`, PR: :issue:`1281`)
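The per-call ``timeout`` added here follows a common fall-back-to-constructor-default pattern. A minimal sketch of that pattern only; the real ``mopidy.audio.scan.Scanner`` drives a GStreamer pipeline, which is omitted here:

```python
class Scanner(object):
    """Sketch of the per-call timeout override added in v1.1.1."""

    def __init__(self, timeout=1000):
        # Constructor-level default, in milliseconds.
        self._timeout_ms = int(timeout)

    def scan(self, uri, timeout=None):
        # Fall back to the constructor's timeout when no override is given.
        timeout = int(timeout or self._timeout_ms)
        return (uri, timeout)


scanner = Scanner(timeout=1000)
print(scanner.scan('file:///song.ogg'))             # uses the default: 1000 ms
print(scanner.scan('http://host/stream', timeout=250))  # per-call override
```

This lets callers with their own deadline (e.g. the stream unwrapper below) shrink the scan timeout without constructing a new scanner.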

- Extension support: Make :meth:`~mopidy.ext.Extension.get_cache_dir`,
  :meth:`~mopidy.ext.Extension.get_config_dir`, and
  :meth:`~mopidy.ext.Extension.get_data_dir` class methods, so they can be used
  without creating an instance of the :class:`~mopidy.ext.Extension` class.
  (Fixes: :issue:`1275`)
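A minimal sketch of why the classmethod change matters, using a hypothetical ``example`` extension; the real methods also create the directory on disk via ``path.get_or_create_dir``:

```python
import os


class Extension(object):
    """Sketch mirroring mopidy.ext.Extension's directory helpers."""

    ext_name = 'example'  # hypothetical extension name

    @classmethod
    def get_data_dir(cls, config):
        # As a classmethod this needs no Extension() instance, so other
        # modules can resolve the dir from the class alone.
        assert cls.ext_name is not None
        return os.path.join(config['core']['data_dir'], cls.ext_name)


config = {'core': {'data_dir': '/var/lib/mopidy'}}
# Called directly on the class -- no instantiation required:
print(Extension.get_data_dir(config))  # /var/lib/mopidy/example
```

This is exactly what lets ``JsonLibrary`` and the M3U backend below call ``local.Extension.get_data_dir(config)`` and ``m3u.Extension.get_data_dir(config)`` without an instance.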

- Local: Deprecate :confval:`local/data_dir` and respect
  :confval:`core/data_dir` instead. This does not change the defaults for
  desktop users, only system services installed from packages that properly set
  :confval:`core/data_dir`, like the Debian and Arch packages. (Fixes:
  :issue:`1259`, PR: :issue:`1266`)

- Local: Change the default value of :confval:`local/scan_flush_threshold` from
  1000 to 100 to shorten the time Mopidy-Local-SQLite blocks incoming requests
  while scanning the local library.

- M3U: Changed the default for :confval:`m3u/playlists_dir` from
  ``$XDG_DATA_DIR/mopidy/m3u`` to unset, which now means the extension's data
  dir. This does not change the defaults for desktop users, only system
  services installed from packages that properly set :confval:`core/data_dir`,
  like the Debian and Arch packages. (Fixes: :issue:`1259`, PR: :issue:`1266`)

- Stream: Expand nested playlists to find the stream URI. This used to work,
  but regressed in 1.1.0 with the extraction of stream playlist parsing from
  GStreamer to being handled by the Mopidy-Stream backend. (Fixes:
  :issue:`1250`, PR: :issue:`1281`)

- Stream: If "file" was present in the :confval:`stream/protocols` config value
  and the :ref:`ext-file` extension was enabled, we exited with an error
  because two extensions claimed the same URI scheme. We now log a warning
  recommending removal of "file" from the :confval:`stream/protocols` config,
  and then proceed with startup. (Fixes: :issue:`1248`, PR: :issue:`1254`)

- Stream: Fix bug in the new playlist parser. A non-ASCII char in a urilist
  comment would cause a crash while parsing due to comparison of a non-ASCII
  bytestring with a Unicode string. (Fixes: :issue:`1265`)
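The crash boiled down to comparing a bytestring line against a text prefix. A small sketch of the fixed comparison, simplified from the real parser, which also validates each URI:

```python
def parse_urilist(data):
    # data is a bytestring, so the comment prefix must be bytes too;
    # mixing bytes and unicode is what crashed on non-ASCII input.
    result = []
    for line in data.splitlines():
        if not line.strip() or line.startswith(b'#'):
            continue
        result.append(line)
    return result


playlist = b'# caf\xc3\xa9 comment\nhttp://example.com/stream\n'
print(parse_urilist(playlist))  # [b'http://example.com/stream']
```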

- File: Adjust log levels when failing to expand ``$XDG_MUSIC_DIR`` into a real
  path. This usually happens when running Mopidy as a system service, and thus
  with a limited set of environment variables. (Fixes: :issue:`1249`, PR:
  :issue:`1255`)

- File: When browsing files, we no longer scan the files to check if they're
  playable. This makes browsing of the file hierarchy instant for HTTP clients,
  which do no scanning of the files' metadata, and a bit faster for MPD
  clients, which no longer scan the files twice. (Fixes: :issue:`1260`, PR:
  :issue:`1261`)

- File: Allow looking up metadata about any ``file://`` URI, just like we did
  in Mopidy 1.0.x, where Mopidy-Stream handled ``file://`` URIs. In Mopidy
  1.1.0, Mopidy-File did not allow one to look up files outside the directories
  listed in :confval:`file/media_dirs`. This broke Mopidy-Local-SQLite when the
  :confval:`local/media_dir` directory was not within one of the
  :confval:`file/media_dirs` directories. For browsing of files, we still limit
  access to files inside the :confval:`file/media_dirs` directories. For
  lookup, you can now read metadata for any file you know the path of. (Fixes:
  :issue:`1268`, PR: :issue:`1273`)

- Audio: Fix timeout handling in the scanner. This regression caused timeouts
  to expire before they should, causing scans to fail.

- Audio: Update the scanner to emit the MIME type instead of an error when
  missing a plugin.


v1.1.0 (2015-08-09)
===================

@@ -63,7 +156,7 @@ Core API

- Add ``tlid`` alternatives to methods that take ``tl_track`` and also add
  ``get_{eot,next,previous}_tlid`` methods as lightweight alternatives to the
  ``tl_track`` versions of the calls. (Fixes: :issue:`1131` PR: :issue:`1136`,
  ``tl_track`` versions of the calls. (Fixes: :issue:`1131`, PR: :issue:`1136`,
  :issue:`1140`)

- Add :meth:`mopidy.core.PlaybackController.get_current_tlid`.
@@ -78,7 +78,7 @@ def setup(app):

# -- General configuration ----------------------------------------------------

needs_sphinx = '1.0'
needs_sphinx = '1.3'

extensions = [
    'sphinx.ext.autodoc',
@@ -65,6 +65,13 @@ Core configuration

Path to base directory for storing cached data.

Mopidy and extensions will use this path to cache data that can safely be
thrown away.

If your system runs from an SD card, pointing this config to another
location can help avoid wear and corruption of the card. If you have enough
RAM, a tmpfs might be a good choice.

When running Mopidy as a regular user, this should usually be
``$XDG_CACHE_DIR/mopidy``, i.e. :file:`~/.cache/mopidy`.

@@ -85,6 +92,11 @@ Core configuration

Path to base directory for persistent data files.

Mopidy and extensions will use this path to store data that cannot be
thrown away and reproduced without some effort. Examples include
Mopidy-Local's index of your media library and Mopidy-M3U's stored
playlists.

When running Mopidy as a regular user, this should usually be
``$XDG_DATA_DIR/mopidy``, i.e. :file:`~/.local/share/mopidy`.
@@ -113,12 +113,17 @@ from a regular Mopidy setup you'll want to know about.

    sudo service mopidy status

- Mopidy installed from a Debian package can use both Mopidy extensions
  installed both from Debian packages and extensions installed with pip.
- Mopidy installed from a Debian package can use Mopidy extensions installed
  both from Debian packages and with pip. This has always been the case.

  The other way around does not work: Mopidy installed with pip can use
  extensions installed with pip, but not extensions installed from a Debian
  package. This is because the Debian packages install extensions into
  Mopidy installed with pip can use extensions installed with pip, but
  not extensions installed from a Debian package released before August 2015.
  This is because the Debian packages used to install extensions into
  :file:`/usr/share/mopidy` which is normally not on your ``PYTHONPATH``.
  Thus, your pip-installed Mopidy will not find the Debian package-installed
  Thus, your pip-installed Mopidy would not find the Debian package-installed
  extensions.

  In August 2015, all Mopidy extension Debian packages were modified to install
  into :file:`/usr/lib/python2.7/dist-packages`, like any other Python Debian
  package. Thus, Mopidy installed with pip can now use extensions installed
  from Debian.
@@ -35,6 +35,23 @@ To make a local library for your music available for Mopidy:

#. Start Mopidy, find the music library in a client, and play some local music!


Updating the local library
==========================

When you've added or removed music in your collection and want to update
Mopidy's index of your local library, you need to rescan::

    mopidy local scan

Note that if you are using the default local library storage, ``json``, you
need to restart Mopidy after the scan completes for the updated index to be
used.

If you want index updates to come into effect immediately, you can try out
`Mopidy-Local-SQLite <https://github.com/mopidy/mopidy-local-sqlite>`_, which
will probably become the default backend in the near future.


Pluggable library support
=========================

@@ -47,8 +64,8 @@ active at a time.

To create a new library provider you must create a class that implements the
:class:`mopidy.local.Library` interface and install it in the extension
registry under ``local:library``. Any data that the library needs to store on
disc should be stored in :confval:`local/data_dir` using the library name as
part of the filename or directory to avoid any conflicts.
disc should be stored in the extension's data dir, as returned by
:meth:`~mopidy.ext.Extension.get_data_dir`.


Configuration
@@ -52,4 +52,5 @@ See :ref:`config` for general help on configuring Mopidy.

.. confval:: m3u/playlists_dir

   Path to directory with M3U files.
   Path to directory with M3U files. Unset by default, in which case the
   extension's data dir is used to store playlists.
@@ -14,4 +14,4 @@ if not (2, 7) <= sys.version_info < (3,):
    warnings.filterwarnings('ignore', 'could not open display')


__version__ = '1.1.0'
__version__ = '1.1.1'
@@ -75,15 +75,16 @@ def main():

    args = root_cmd.parse(mopidy_args)

    create_file_structures_and_config(args, extensions_data)
    check_old_locations()

    config, config_errors = config_lib.load(
        args.config_files,
        [d.config_schema for d in extensions_data],
        [d.config_defaults for d in extensions_data],
        args.config_overrides)

    create_core_dirs(config)
    create_initial_config_file(args, extensions_data)
    check_old_locations()

    verbosity_level = args.base_verbosity_level
    if args.verbosity_level:
        verbosity_level += args.verbosity_level

@@ -166,12 +167,17 @@ def main():
            raise


def create_file_structures_and_config(args, extensions_data):
    path.get_or_create_dir(b'$XDG_DATA_DIR/mopidy')
    path.get_or_create_dir(b'$XDG_CONFIG_DIR/mopidy')
def create_core_dirs(config):
    path.get_or_create_dir(config['core']['cache_dir'])
    path.get_or_create_dir(config['core']['config_dir'])
    path.get_or_create_dir(config['core']['data_dir'])


def create_initial_config_file(args, extensions_data):
    """Initialize whatever the last config file is with defaults"""

    # Initialize whatever the last config file is with defaults
    config_file = args.config_files[-1]

    if os.path.exists(path.expand_path(config_file)):
        return
@@ -12,8 +12,6 @@ from mopidy import exceptions
from mopidy.audio import utils
from mopidy.internal import encoding

_missing_plugin_desc = gst.pbutils.missing_plugin_message_get_description

_Result = collections.namedtuple(
    'Result', ('uri', 'tags', 'duration', 'seekable', 'mime', 'playable'))
@@ -35,12 +33,15 @@ class Scanner(object):
        self._timeout_ms = int(timeout)
        self._proxy_config = proxy_config or {}

    def scan(self, uri):
    def scan(self, uri, timeout=None):
        """
        Scan the given uri collecting relevant metadata.

        :param uri: URI of the resource to scan.
        :type event: string
        :type uri: string
        :param timeout: timeout for scanning a URI in ms. Defaults to the
            ``timeout`` value used when creating the scanner.
        :type timeout: int
        :return: A named tuple containing
            ``(uri, tags, duration, seekable, mime)``.
            ``tags`` is a dictionary of lists for all the tags we found.
@@ -48,12 +49,13 @@ class Scanner(object):
            :class:`None` if the URI has no duration. ``seekable`` is a boolean
            indicating if a seek would succeed.
        """
        timeout = int(timeout or self._timeout_ms)
        tags, duration, seekable, mime = None, None, None, None
        pipeline = _setup_pipeline(uri, self._proxy_config)

        try:
            _start_pipeline(pipeline)
            tags, mime, have_audio = _process(pipeline, self._timeout_ms)
            tags, mime, have_audio = _process(pipeline, timeout)
            duration = _query_duration(pipeline)
            seekable = _query_seekable(pipeline)
        finally:
@@ -134,12 +136,15 @@ def _process(pipeline, timeout_ms):
    clock = pipeline.get_clock()
    bus = pipeline.get_bus()
    timeout = timeout_ms * gst.MSECOND
    tags, mime, have_audio, missing_description = {}, None, False, None
    tags = {}
    mime = None
    have_audio = False
    missing_message = None

    types = (gst.MESSAGE_ELEMENT | gst.MESSAGE_APPLICATION | gst.MESSAGE_ERROR
             | gst.MESSAGE_EOS | gst.MESSAGE_ASYNC_DONE | gst.MESSAGE_TAG)

    start = clock.get_time()
    previous = clock.get_time()
    while timeout > 0:
        message = bus.timed_pop_filtered(timeout, types)

@@ -147,8 +152,7 @@ def _process(pipeline, timeout_ms):
            break
        elif message.type == gst.MESSAGE_ELEMENT:
            if gst.pbutils.is_missing_plugin_message(message):
                missing_description = encoding.locale_decode(
                    _missing_plugin_desc(message))
                missing_message = message
        elif message.type == gst.MESSAGE_APPLICATION:
            if message.structure.get_name() == 'have-type':
                mime = message.structure['caps'].get_name()
@@ -158,8 +162,10 @@ def _process(pipeline, timeout_ms):
                have_audio = True
        elif message.type == gst.MESSAGE_ERROR:
            error = encoding.locale_decode(message.parse_error()[0])
            if missing_description:
                error = '%s (%s)' % (missing_description, error)
            if missing_message and not mime:
                caps = missing_message.structure['detail']
                mime = caps.get_structure(0).get_name()
                return tags, mime, have_audio
            raise exceptions.ScannerError(error)
        elif message.type == gst.MESSAGE_EOS:
            return tags, mime, have_audio
@@ -171,7 +177,9 @@ def _process(pipeline, timeout_ms):
            # Note that this will only keep the last tag.
            tags.update(utils.convert_taglist(taglist))

        timeout -= clock.get_time() - start
        now = clock.get_time()
        timeout -= now - previous
        previous = now

    raise exceptions.ScannerError('Timeout after %dms' % timeout_ms)
@@ -161,7 +161,7 @@ class Backends(list):

        for scheme in b.uri_schemes.get():
            assert scheme not in backends_by_scheme, (
                'Cannot add URI scheme %s for %s, '
                'Cannot add URI scheme "%s" for %s, '
                'it is already handled by %s'
            ) % (scheme, name(b), name(backends_by_scheme[scheme]))
            backends_by_scheme[scheme] = b
@@ -255,7 +255,7 @@ class LibraryController(object):
        backends = {}
        uri_scheme = urlparse.urlparse(uri).scheme if uri else None

        for backend_scheme, backend in self.backends.with_playlists.items():
        for backend_scheme, backend in self.backends.with_library.items():
            backends.setdefault(backend, set()).add(backend_scheme)

        for backend, backend_schemes in backends.items():
@@ -236,7 +236,7 @@ class TracklistController(object):

    def get_eot_tlid(self):
        """
        The TLID of the track that will be played after the given track.
        The TLID of the track that will be played after the current track.

        Not necessarily the same TLID as returned by :meth:`get_next_tlid`.

@@ -332,7 +332,7 @@ class TracklistController(object):

    def get_previous_tlid(self):
        """
        Returns the TLID of the track that will be played if calling
        :meth:`mopidy.core.PlaybackController.previous()`.

        For normal playback this is the previous track in the tracklist. If
@@ -60,39 +60,46 @@ class Extension(object):
        schema['enabled'] = config_lib.Boolean()
        return schema

    def get_cache_dir(self, config):
    @classmethod
    def get_cache_dir(cls, config):
        """Get or create cache directory for the extension.

        Use this directory to cache data that can safely be thrown away.

        :param config: the Mopidy config object
        :returns: string
        """
        assert self.ext_name is not None
        assert cls.ext_name is not None
        cache_dir_path = bytes(os.path.join(config['core']['cache_dir'],
                                            self.ext_name))
                                            cls.ext_name))
        path.get_or_create_dir(cache_dir_path)
        return cache_dir_path

    def get_config_dir(self, config):
    @classmethod
    def get_config_dir(cls, config):
        """Get or create configuration directory for the extension.

        :param config: the Mopidy config object
        :returns: string
        """
        assert self.ext_name is not None
        assert cls.ext_name is not None
        config_dir_path = bytes(os.path.join(config['core']['config_dir'],
                                             self.ext_name))
                                             cls.ext_name))
        path.get_or_create_dir(config_dir_path)
        return config_dir_path

    def get_data_dir(self, config):
    @classmethod
    def get_data_dir(cls, config):
        """Get or create data directory for the extension.

        Use this directory to store data that should be persistent.

        :param config: the Mopidy config object
        :returns: string
        """
        assert self.ext_name is not None
        assert cls.ext_name is not None
        data_dir_path = bytes(os.path.join(config['core']['data_dir'],
                                           self.ext_name))
                                           cls.ext_name))
        path.get_or_create_dir(data_dir_path)
        return data_dir_path
@@ -71,7 +71,7 @@ class FileLibraryProvider(backend.LibraryProvider):
            name = dir_entry.decode(FS_ENCODING, 'replace')
            if os.path.isdir(child_path):
                result.append(models.Ref.directory(name=name, uri=uri))
            elif os.path.isfile(child_path) and self._is_audio_file(uri):
            elif os.path.isfile(child_path):
                result.append(models.Ref.track(name=name, uri=uri))

        result.sort(key=operator.attrgetter('name'))
@@ -81,10 +81,6 @@ class FileLibraryProvider(backend.LibraryProvider):
        logger.debug('Looking up file URI: %s', uri)
        local_path = path.uri_to_path(uri)

        if not self._is_in_basedir(local_path):
            logger.warning('Ignoring URI outside base dir: %s', local_path)
            return []

        try:
            result = self._scanner.scan(uri)
            track = utils.convert_tags_to_track(result.tags).copy(
@@ -108,12 +104,15 @@ class FileLibraryProvider(backend.LibraryProvider):
                media_dir_split[0].encode(FS_ENCODING))

            if not local_path:
                logger.warning('Failed expanding path (%s) from'
                               'file/media_dirs config value.',
                               media_dir_split[0])
                logger.debug(
                    'Failed expanding path (%s) from file/media_dirs config '
                    'value.',
                    media_dir_split[0])
                continue
            elif not os.path.isdir(local_path):
                logger.warning('%s is not a directory', local_path)
                logger.warning(
                    '%s is not a directory. Please create the directory or '
                    'update the file/media_dirs config value.', local_path)
                continue

            media_dir['path'] = local_path
@@ -131,18 +130,6 @@ class FileLibraryProvider(backend.LibraryProvider):
                name=media_dir['name'],
                uri=path.path_to_uri(media_dir['path']))

    def _is_audio_file(self, uri):
        try:
            result = self._scanner.scan(uri)
            if result.playable:
                logger.debug('Playable file: %s', result.uri)
            else:
                logger.debug('Unplayable file: %s (not audio)', result.uri)
            return result.playable
        except exceptions.ScannerError as e:
            logger.debug('Unplayable file: %s (%s)', uri, e)
            return False

    def _is_in_basedir(self, local_path):
        return any(
            path.is_path_inside_base_dir(local_path, media_dir['path'])
@@ -1,9 +1,14 @@
from __future__ import absolute_import, unicode_literals

import logging
import time

import requests

from mopidy import httpclient

logger = logging.getLogger(__name__)


def get_requests_session(proxy_config, user_agent):
    proxy = httpclient.format_proxy(proxy_config)
@@ -14,3 +19,30 @@ def get_requests_session(proxy_config, user_agent):
    session.headers.update({'user-agent': full_user_agent})

    return session


def download(session, uri, timeout=1.0, chunk_size=4096):
    try:
        response = session.get(uri, stream=True, timeout=timeout)
    except requests.exceptions.Timeout:
        logger.warning('Download of %r failed due to connection timeout after '
                       '%.3fs', uri, timeout)
        return None
    except requests.exceptions.InvalidSchema:
        logger.warning('%s has an unsupported schema.', uri)
        return None

    content = []
    deadline = time.time() + timeout
    for chunk in response.iter_content(chunk_size):
        content.append(chunk)
        if time.time() > deadline:
            logger.warning('Download of %r failed due to download taking more '
                           'than %.3fs', uri, timeout)
            return None

    if not response.ok:
        logger.warning('Problem downloading %r: %s', uri, response.reason)
        return None

    return b''.join(content)
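The deadline logic above can be exercised in isolation. A sketch that swaps the HTTP response for any chunk iterable; the function name is illustrative, not part of `mopidy.internal.http`:

```python
import time


def read_with_deadline(chunks, timeout):
    # Mirror of download()'s loop: give up once consuming the chunk
    # iterator has taken longer than `timeout` seconds in total.
    content = []
    deadline = time.time() + timeout
    for chunk in chunks:
        content.append(chunk)
        if time.time() > deadline:
            return None  # download took too long
    return b''.join(content)


print(read_with_deadline([b'ICY ', b'data'], timeout=1.0))  # b'ICY data'
```

The key point is that the deadline bounds total wall-clock time, not per-chunk time, so a slow server cannot keep the download alive indefinitely by trickling bytes.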

@@ -122,7 +122,7 @@ def parse_asx(data):
def parse_urilist(data):
    result = []
    for line in data.splitlines():
        if not line.strip() or line.startswith('#'):
        if not line.strip() or line.startswith(b'#'):
            continue
        try:
            validation.check_uri(line)
@@ -23,7 +23,7 @@ class Extension(ext.Extension):
        schema = super(Extension, self).get_config_schema()
        schema['library'] = config.String()
        schema['media_dir'] = config.Path()
        schema['data_dir'] = config.Path()
        schema['data_dir'] = config.Path(optional=True)
        schema['playlists_dir'] = config.Deprecated()
        schema['tag_cache_file'] = config.Deprecated()
        schema['scan_timeout'] = config.Integer(
@@ -4,7 +4,7 @@ library = json
media_dir = $XDG_MUSIC_DIR
data_dir = $XDG_DATA_DIR/mopidy/local
scan_timeout = 1000
scan_flush_threshold = 1000
scan_flush_threshold = 100
scan_follow_symlinks = false
excluded_file_extensions =
    .directory
@@ -116,7 +116,7 @@ class JsonLibrary(local.Library):
        self._browse_cache = None
        self._media_dir = config['local']['media_dir']
        self._json_file = os.path.join(
            config['local']['data_dir'], b'library.json.gz')
            local.Extension.get_data_dir(config), b'library.json.gz')

        storage.check_dirs_and_files(config)
@@ -3,8 +3,6 @@ from __future__ import absolute_import, unicode_literals
import logging
import os

from mopidy.internal import encoding, path

logger = logging.getLogger(__name__)


@@ -13,10 +11,3 @@ def check_dirs_and_files(config):
        logger.warning(
            'Local media dir %s does not exist.' %
            config['local']['media_dir'])

    try:
        path.get_or_create_dir(config['local']['data_dir'])
    except EnvironmentError as error:
        logger.warning(
            'Could not create local data dir: %s',
            encoding.locale_decode(error))
@@ -21,7 +21,7 @@ class Extension(ext.Extension):

    def get_config_schema(self):
        schema = super(Extension, self).get_config_schema()
        schema['playlists_dir'] = config.Path()
        schema['playlists_dir'] = config.Path(optional=True)
        return schema

    def setup(self, registry):
@@ -4,7 +4,7 @@ import logging

import pykka

from mopidy import backend
from mopidy import backend, m3u
from mopidy.internal import encoding, path
from mopidy.m3u.library import M3ULibraryProvider
from mopidy.m3u.playlists import M3UPlaylistsProvider
@@ -21,12 +21,16 @@ class M3UBackend(pykka.ThreadingActor, backend.Backend):

        self._config = config

        try:
            path.get_or_create_dir(config['m3u']['playlists_dir'])
        except EnvironmentError as error:
            logger.warning(
                'Could not create M3U playlists dir: %s',
                encoding.locale_decode(error))
        if config['m3u']['playlists_dir'] is not None:
            self._playlists_dir = config['m3u']['playlists_dir']
            try:
                path.get_or_create_dir(self._playlists_dir)
            except EnvironmentError as error:
                logger.warning(
                    'Could not create M3U playlists dir: %s',
                    encoding.locale_decode(error))
        else:
            self._playlists_dir = m3u.Extension.get_data_dir(config)

        self.playlists = M3UPlaylistsProvider(backend=self)
        self.library = M3ULibraryProvider(backend=self)
@@ -1,3 +1,3 @@
[m3u]
enabled = true
playlists_dir = $XDG_DATA_DIR/mopidy/m3u
playlists_dir =
@@ -23,7 +23,7 @@ class M3UPlaylistsProvider(backend.PlaylistsProvider):
    def __init__(self, *args, **kwargs):
        super(M3UPlaylistsProvider, self).__init__(*args, **kwargs)

        self._playlists_dir = self.backend._config['m3u']['playlists_dir']
        self._playlists_dir = self.backend._playlists_dir
        self._playlists = {}
        self.refresh()
@@ -271,9 +271,9 @@ class MpdContext(object):

        If ``lookup`` is true and the ``path`` is to a track, the returned
        ``data`` is a future which will contain the results from looking up
        the URI with :meth:`mopidy.core.LibraryController.lookup` If ``lookup``
        is false and the ``path`` is to a track, the returned ``data`` will be
        a :class:`mopidy.models.Ref` for the track.
        the URI with :meth:`mopidy.core.LibraryController.lookup`. If
        ``lookup`` is false and the ``path`` is to a track, the returned
        ``data`` will be a :class:`mopidy.models.Ref` for the track.

        For all entries that are not tracks, the returned ``data`` will be
        :class:`None`.
@@ -8,8 +8,6 @@ import urlparse

import pykka

import requests

from mopidy import audio as audio_lib, backend, exceptions, stream
from mopidy.audio import scan, utils
from mopidy.internal import http, playlists
@@ -36,6 +34,13 @@ class StreamBackend(pykka.ThreadingActor, backend.Backend):
        self.uri_schemes = audio_lib.supported_uri_schemes(
            config['stream']['protocols'])

        if 'file' in self.uri_schemes and config['file']['enabled']:
            logger.warning(
                'The stream/protocols config value includes the "file" '
                'protocol. "file" playback is now handled by Mopidy-File. '
                'Please remove it from the stream/protocols config.')
            self.uri_schemes -= {'file'}


class StreamLibraryProvider(backend.LibraryProvider):

@@ -70,59 +75,84 @@ class StreamPlaybackProvider(backend.PlaybackProvider):
        super(StreamPlaybackProvider, self).__init__(audio, backend)
        self._config = config
        self._scanner = backend._scanner

    def translate_uri(self, uri):
        try:
            scan_result = self._scanner.scan(uri)
        except exceptions.ScannerError as e:
            logger.warning(
                'Problem scanning URI %s: %s', uri, e)
            return None

        if not (scan_result.mime.startswith('text/') or
                scan_result.mime.startswith('application/')):
            return uri

        content = self._download(uri)
        if content is None:
            return None

        tracks = list(playlists.parse(content))
        if tracks:
            # TODO Test streams and return first that seems to be playable
            return tracks[0]

    def _download(self, uri):
        timeout = self._config['stream']['timeout'] / 1000.0

        session = http.get_requests_session(
            proxy_config=self._config['proxy'],
        self._session = http.get_requests_session(
            proxy_config=config['proxy'],
            user_agent='%s/%s' % (
                stream.Extension.dist_name, stream.Extension.version))

    def translate_uri(self, uri):
        return _unwrap_stream(
            uri,
            timeout=self._config['stream']['timeout'],
            scanner=self._scanner,
            requests_session=self._session)


def _unwrap_stream(uri, timeout, scanner, requests_session):
    """
    Get a stream URI from a playlist URI, ``uri``.

    Unwraps nested playlists until something that's not a playlist is found or
    the ``timeout`` is reached.
    """
    original_uri = uri
    seen_uris = set()
    deadline = time.time() + timeout

    while time.time() < deadline:
        if uri in seen_uris:
            logger.info(
                'Unwrapping stream from URI (%s) failed: '
                'playlist referenced itself', uri)
            return None
        else:
            seen_uris.add(uri)

        logger.debug('Unwrapping stream from URI: %s', uri)

        try:
            response = session.get(
                uri, stream=True, timeout=timeout)
        except requests.exceptions.Timeout:
            logger.warning(
                'Download of stream playlist (%s) failed due to connection '
                'timeout after %.3fs', uri, timeout)
            return None

        deadline = time.time() + timeout
        content = []
        for chunk in response.iter_content(4096):
            content.append(chunk)
            if time.time() > deadline:
                logger.warning(
                    'Download of stream playlist (%s) failed due to download '
                    'taking more than %.3fs', uri, timeout)
            scan_timeout = deadline - time.time()
            if scan_timeout < 0:
                logger.info(
                    'Unwrapping stream from URI (%s) failed: '
                    'timed out in %sms', uri, timeout)
                return None
            scan_result = scanner.scan(uri, timeout=scan_timeout)
        except exceptions.ScannerError as exc:
            logger.debug('GStreamer failed scanning URI (%s): %s', uri, exc)
            scan_result = None

        if not response.ok:
            logger.warning(
                'Problem downloading stream playlist %s: %s',
                uri, response.reason)
        if scan_result is not None and not (
                scan_result.mime.startswith('text/') or
                scan_result.mime.startswith('application/')):
            logger.debug(
                'Unwrapped potential %s stream: %s', scan_result.mime, uri)
            return uri

        download_timeout = deadline - time.time()
        if download_timeout < 0:
            logger.info(
                'Unwrapping stream from URI (%s) failed: timed out in %sms',
                uri, timeout)
            return None
        content = http.download(
            requests_session, uri, timeout=download_timeout)

        if content is None:
            logger.info(
                'Unwrapping stream from URI (%s) failed: '
                'error downloading URI %s', original_uri, uri)
            return None

        return b''.join(content)
        uris = playlists.parse(content)
        if not uris:
            logger.debug(
                'Failed parsing URI (%s) as playlist; found potential stream.',
                uri)
            return uri

        # TODO Test streams and return first that seems to be playable
|
||||
logger.debug(
|
||||
'Parsed playlist (%s) and found new URI: %s', uri, uris[0])
|
||||
uri = uris[0]
|
||||
|
||||
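The new `_unwrap_stream` helper replaces the old scan-then-download flow with a single deadline-bounded loop. A minimal standalone sketch of the same idea, with hypothetical `fetch` and `parse_playlist` callables standing in for `http.download` and `playlists.parse`:

```python
import time


def unwrap_stream(uri, timeout, fetch, parse_playlist):
    # Follow nested playlists until a non-playlist URI is found or the
    # overall deadline is reached; refuse playlists that reference
    # themselves. `fetch` and `parse_playlist` are hypothetical helpers,
    # not Mopidy APIs.
    seen_uris = set()
    deadline = time.time() + timeout
    while time.time() < deadline:
        if uri in seen_uris:
            return None  # playlist referenced itself
        seen_uris.add(uri)
        content = fetch(uri)
        if content is None:
            return None  # download failed
        uris = parse_playlist(content)
        if not uris:
            return uri  # not a playlist: assume a playable stream
        uri = uris[0]
    return None  # ran out of time while unwrapping
```

Tracking `seen_uris` is what makes the self-referencing-playlist test below terminate instead of spinning until the deadline.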
setup.py (2 changed lines)
@@ -25,7 +25,7 @@ setup(
    include_package_data=True,
    install_requires=[
        'Pykka >= 1.1',
        'requests',
        'requests >= 2.0',
        'setuptools',
        'tornado >= 2.3',
    ],
@@ -40,8 +40,11 @@ class ScannerTest(unittest.TestCase):
        self.assertEqual(self.result[name].tags[key], value)

    def check_if_missing_plugin(self):
        if any(['missing a plug-in' in str(e) for e in self.errors.values()]):
            raise unittest.SkipTest('Missing MP3 support?')
        for path, result in self.result.items():
            if not path.endswith('.mp3'):
                continue
            if not result.playable and result.mime == 'audio/mpeg':
                raise unittest.SkipTest('Missing MP3 support?')

    def test_tags_is_set(self):
        self.scan(self.find('scanner/simple'))
@@ -109,6 +112,17 @@ class ScannerTest(unittest.TestCase):
        wav = path_to_data_dir('scanner/empty.wav')
        self.assertEqual(self.result[wav].duration, 0)

    def test_uri_list(self):
        path = path_to_data_dir('scanner/playlist.m3u')
        self.scan([path])
        self.assertEqual(self.result[path].mime, 'text/uri-list')

    def test_text_plain(self):
        # GStreamer decode bin hardcodes bad handling of text plain :/
        path = path_to_data_dir('scanner/plain.txt')
        self.scan([path])
        self.assertIn(path, self.errors)

    @unittest.SkipTest
    def test_song_without_time_is_handeled(self):
        pass
@@ -37,7 +37,8 @@ class CoreActorTest(unittest.TestCase):

        self.assertRaisesRegexp(
            AssertionError,
            'Cannot add URI scheme dummy1 for B2, it is already handled by B1',
            'Cannot add URI scheme "dummy1" for B2, '
            'it is already handled by B1',
            Core, mixer=None, backends=[self.backend1, self.backend2])

    def test_version(self):

@@ -20,6 +20,7 @@ class BaseCoreLibraryTest(unittest.TestCase):
        self.library1.get_images.return_value.get.return_value = {}
        self.library1.root_directory.get.return_value = dummy1_root
        self.backend1.library = self.library1
        self.backend1.has_playlists.return_value.get.return_value = False

        dummy2_root = Ref.directory(uri='dummy2:directory', name='dummy2')
        self.backend2 = mock.Mock()
@@ -29,13 +30,14 @@ class BaseCoreLibraryTest(unittest.TestCase):
        self.library2.get_images.return_value.get.return_value = {}
        self.library2.root_directory.get.return_value = dummy2_root
        self.backend2.library = self.library2
        self.backend2.has_playlists.return_value.get.return_value = False

        # A backend without the optional library provider
        self.backend3 = mock.Mock()
        self.backend3.uri_schemes.get.return_value = ['dummy3']
        self.backend3.actor_ref.actor_class.__name__ = 'DummyBackend3'
        self.backend3.has_library().get.return_value = False
        self.backend3.has_library_browse().get.return_value = False
        self.backend3.has_library.return_value.get.return_value = False
        self.backend3.has_library_browse.return_value.get.return_value = False

        self.core = core.Core(mixer=None, backends=[
            self.backend1, self.backend2, self.backend3])
tests/data/scanner/plain.txt (new file, 1 line)
@@ -0,0 +1 @@
Some plain text file with nothing special in it.

tests/data/scanner/playlist.m3u (new file, 1 line)
@@ -0,0 +1 @@
http://example.com/

tests/internal/test_http.py (new file, 63 lines)
@@ -0,0 +1,63 @@
from __future__ import absolute_import, unicode_literals

import mock

import pytest

import requests

import responses

from mopidy.internal import http


TIMEOUT = 1000
URI = 'http://example.com/foo.txt'
BODY = "This is the contents of foo.txt."


@pytest.fixture
def session():
    return requests.Session()


@pytest.fixture
def session_mock():
    return mock.Mock(spec=requests.Session)


@responses.activate
def test_download_on_server_side_error(session, caplog):
    responses.add(responses.GET, URI, body=BODY, status=500)

    result = http.download(session, URI)

    assert result is None
    assert 'Problem downloading' in caplog.text()


def test_download_times_out_if_connection_times_out(session_mock, caplog):
    session_mock.get.side_effect = requests.exceptions.Timeout

    result = http.download(session_mock, URI, timeout=1.0)

    session_mock.get.assert_called_once_with(URI, timeout=1.0, stream=True)
    assert result is None
    assert (
        'Download of %r failed due to connection timeout after 1.000s' % URI
        in caplog.text())


@responses.activate
def test_download_times_out_if_download_is_slow(session, caplog):
    responses.add(responses.GET, URI, body=BODY, content_type='text/plain')

    with mock.patch.object(http, 'time') as time_mock:
        time_mock.time.side_effect = [0, TIMEOUT + 1]

        result = http.download(session, URI)

    assert result is None
    assert (
        'Download of %r failed due to download taking more than 1.000s' % URI
        in caplog.text())
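The `mopidy.internal.http.download` helper these new tests cover is not itself shown in this diff. A rough sketch of the behaviour the tests pin down — a deadline-bounded, chunked download that returns `None` on connection timeout, server error, or a too-slow body — assuming a duck-typed `session` object (this is hypothetical code, not the real implementation):

```python
import time


def download(session, uri, timeout=1.0, chunk_size=4096):
    # Fetch `uri` via a requests-like session, giving up and returning
    # None if the connection times out, the response is an error, or
    # reading the body takes longer than `timeout` seconds.
    try:
        response = session.get(uri, stream=True, timeout=timeout)
    except Exception:  # requests.exceptions.Timeout in the real code
        return None
    if not getattr(response, 'ok', False):
        return None
    deadline = time.time() + timeout
    content = []
    for chunk in response.iter_content(chunk_size):
        content.append(chunk)
        if time.time() > deadline:
            return None  # body download taking too long
    return b''.join(content)
```

Returning `None` rather than raising lets `_unwrap_stream` treat all three failure modes uniformly.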
@@ -23,7 +23,7 @@ file:///tmp/baz

URILIST = b"""
file:///tmp/foo
# a comment
# a comment \xc5\xa7\xc5\x95
file:///tmp/bar

file:///tmp/baz
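The updated URILIST fixture adds a comment line containing non-ASCII bytes. In the `text/uri-list` format (RFC 2483), blank lines are skipped and lines starting with `#` are comments. A minimal sketch of such a parser — hypothetical; Mopidy's real one lives in `mopidy.internal.playlists`:

```python
def parse_urilist(data):
    # Yield URIs from a text/uri-list body: skip blank lines and '#'
    # comment lines, per RFC 2483. Accepts bytes, as the fixture is a
    # bytes literal.
    uris = []
    for line in data.decode('utf-8').splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        uris.append(line)
    return uris
```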
@@ -45,10 +45,11 @@ class BrowseCacheTest(unittest.TestCase):
class JsonLibraryTest(unittest.TestCase):

    config = {
        'core': {
            'data_dir': path_to_data_dir(''),
        },
        'local': {
            'media_dir': path_to_data_dir(''),
            'data_dir': path_to_data_dir(''),
            'playlists_dir': b'',
            'library': 'json',
        },
    }

@@ -65,10 +65,11 @@ class LocalLibraryProviderTest(unittest.TestCase):
    ]

    config = {
        'core': {
            'data_dir': path_to_data_dir(''),
        },
        'local': {
            'media_dir': path_to_data_dir(''),
            'data_dir': path_to_data_dir(''),
            'playlists_dir': b'',
            'library': 'json',
        },
    }
@@ -105,11 +106,15 @@ class LocalLibraryProviderTest(unittest.TestCase):

        tmpdir = tempfile.mkdtemp()
        try:
            tmplib = os.path.join(tmpdir, 'library.json.gz')
            shutil.copy(path_to_data_dir('library.json.gz'), tmplib)
            tmpdir_local = os.path.join(tmpdir, 'local')
            shutil.copytree(path_to_data_dir('local'), tmpdir_local)

            config = {'local': self.config['local'].copy()}
            config['local']['data_dir'] = tmpdir
            config = {
                'core': {
                    'data_dir': tmpdir,
                },
                'local': self.config['local'],
            }
            backend = actor.LocalBackend(config=config, audio=None)

            # Sanity check that value is in the library
@@ -117,6 +122,7 @@ class LocalLibraryProviderTest(unittest.TestCase):
            self.assertEqual(result, self.tracks[0:1])

            # Clear and refresh.
            tmplib = os.path.join(tmpdir_local, 'library.json.gz')
            open(tmplib, 'w').close()
            backend.library.refresh()


@@ -23,12 +23,11 @@ from tests.local import generate_song, populate_tracklist
class LocalPlaybackProviderTest(unittest.TestCase):
    config = {
        'core': {
            'data_dir': path_to_data_dir(''),
            'max_tracklist_length': 10000,
        },
        'local': {
            'media_dir': path_to_data_dir(''),
            'data_dir': path_to_data_dir(''),
            'playlists_dir': b'',
            'library': 'json',
        }
    }

@@ -18,11 +18,11 @@ from tests.local import generate_song, populate_tracklist
class LocalTracklistProviderTest(unittest.TestCase):
    config = {
        'core': {
            'data_dir': path_to_data_dir(''),
            'max_tracklist_length': 10000
        },
        'local': {
            'media_dir': path_to_data_dir(''),
            'data_dir': path_to_data_dir(''),
            'playlists_dir': b'',
            'library': 'json',
        }
@@ -4,8 +4,6 @@ import mock

import pytest

import requests

import responses

from mopidy import exceptions
@@ -14,7 +12,8 @@ from mopidy.stream import actor


TIMEOUT = 1000
URI = 'http://example.com/listen.m3u'
PLAYLIST_URI = 'http://example.com/listen.m3u'
STREAM_URI = 'http://example.com/stream.mp3'
BODY = """
#EXTM3U
http://example.com/stream.mp3
@@ -39,9 +38,7 @@ def audio():

@pytest.fixture
def scanner():
    scanner = mock.Mock(spec=scan.Scanner)
    scanner.scan.return_value.mime = 'text/foo'
    return scanner
    return mock.Mock(spec=scan.Scanner)


@pytest.fixture
@@ -57,89 +54,134 @@ def provider(audio, backend, config):
    return actor.StreamPlaybackProvider(audio, backend, config)
@responses.activate
def test_translate_uri_of_audio_stream_returns_same_uri(
        scanner, provider):
class TestTranslateURI(object):

    scanner.scan.return_value.mime = 'audio/ogg'
    @responses.activate
    def test_audio_stream_returns_same_uri(self, scanner, provider):
        scanner.scan.return_value.mime = 'audio/mpeg'

    result = provider.translate_uri(URI)
        result = provider.translate_uri(STREAM_URI)

    scanner.scan.assert_called_once_with(URI)
    assert result == URI
        scanner.scan.assert_called_once_with(STREAM_URI, timeout=mock.ANY)
        assert result == STREAM_URI

    @responses.activate
    def test_text_playlist_with_mpeg_stream(
            self, scanner, provider, caplog):

@responses.activate
def test_translate_uri_of_playlist_returns_first_uri_in_list(
        scanner, provider):
        scanner.scan.side_effect = [
            mock.Mock(mime='text/foo'),  # scanning playlist
            mock.Mock(mime='audio/mpeg'),  # scanning stream
        ]
        responses.add(
            responses.GET, PLAYLIST_URI,
            body=BODY, content_type='audio/x-mpegurl')

    responses.add(
        responses.GET, URI, body=BODY, content_type='audio/x-mpegurl')
        result = provider.translate_uri(PLAYLIST_URI)

    result = provider.translate_uri(URI)
        assert scanner.scan.mock_calls == [
            mock.call(PLAYLIST_URI, timeout=mock.ANY),
            mock.call(STREAM_URI, timeout=mock.ANY),
        ]
        assert result == STREAM_URI

    scanner.scan.assert_called_once_with(URI)
    assert result == 'http://example.com/stream.mp3'
    assert responses.calls[0].request.headers['User-Agent'].startswith(
        'Mopidy-Stream/')
        # Check logging to ensure debuggability
        assert 'Unwrapping stream from URI: %s' % PLAYLIST_URI
        assert 'Parsed playlist (%s)' % PLAYLIST_URI in caplog.text()
        assert 'Unwrapping stream from URI: %s' % STREAM_URI
        assert (
            'Unwrapped potential audio/mpeg stream: %s' % STREAM_URI
            in caplog.text())

        # Check proper Requests session setup
        assert responses.calls[0].request.headers['User-Agent'].startswith(
            'Mopidy-Stream/')

@responses.activate
def test_translate_uri_of_playlist_with_xml_mimetype(scanner, provider):
    scanner.scan.return_value.mime = 'application/xspf+xml'
    responses.add(
        responses.GET, URI, body=BODY, content_type='application/xspf+xml')
    @responses.activate
    def test_xml_playlist_with_mpeg_stream(self, scanner, provider):
        scanner.scan.side_effect = [
            mock.Mock(mime='application/xspf+xml'),  # scanning playlist
            mock.Mock(mime='audio/mpeg'),  # scanning stream
        ]
        responses.add(
            responses.GET, PLAYLIST_URI,
            body=BODY, content_type='application/xspf+xml')

    result = provider.translate_uri(URI)
        result = provider.translate_uri(PLAYLIST_URI)

    scanner.scan.assert_called_once_with(URI)
    assert result == 'http://example.com/stream.mp3'
        assert scanner.scan.mock_calls == [
            mock.call(PLAYLIST_URI, timeout=mock.ANY),
            mock.call(STREAM_URI, timeout=mock.ANY),
        ]
        assert result == STREAM_URI

    @responses.activate
    def test_scan_fails_but_playlist_parsing_succeeds(
            self, scanner, provider, caplog):

def test_translate_uri_when_scanner_fails(scanner, provider, caplog):
    scanner.scan.side_effect = exceptions.ScannerError('foo failed')
        scanner.scan.side_effect = [
            exceptions.ScannerError('some failure'),  # scanning playlist
            mock.Mock(mime='audio/mpeg'),  # scanning stream
        ]
        responses.add(
            responses.GET, PLAYLIST_URI,
            body=BODY, content_type='audio/x-mpegurl')

    result = provider.translate_uri('bar')
        result = provider.translate_uri(PLAYLIST_URI)

    assert result is None
    assert 'Problem scanning URI bar: foo failed' in caplog.text()
        assert 'Unwrapping stream from URI: %s' % PLAYLIST_URI
        assert (
            'GStreamer failed scanning URI (%s)' % PLAYLIST_URI
            in caplog.text())
        assert 'Parsed playlist (%s)' % PLAYLIST_URI in caplog.text()
        assert (
            'Unwrapped potential audio/mpeg stream: %s' % STREAM_URI
            in caplog.text())
        assert result == STREAM_URI

    @responses.activate
    def test_scan_fails_and_playlist_parsing_fails(
            self, scanner, provider, caplog):

@responses.activate
def test_translate_uri_when_playlist_download_fails(provider, caplog):
    responses.add(responses.GET, URI, body=BODY, status=500)
        scanner.scan.side_effect = exceptions.ScannerError('some failure')
        responses.add(
            responses.GET, STREAM_URI,
            body=b'some audio data', content_type='audio/mpeg')

    result = provider.translate_uri(URI)
        result = provider.translate_uri(STREAM_URI)

    assert result is None
    assert 'Problem downloading stream playlist' in caplog.text()
        assert 'Unwrapping stream from URI: %s' % STREAM_URI
        assert (
            'GStreamer failed scanning URI (%s)' % STREAM_URI
            in caplog.text())
        assert (
            'Failed parsing URI (%s) as playlist; found potential stream.'
            % STREAM_URI in caplog.text())
        assert result == STREAM_URI

    def test_failed_download_returns_none(self, provider, caplog):
        with mock.patch.object(actor, 'http') as http_mock:
            http_mock.download.return_value = None

def test_translate_uri_times_out_if_connection_times_out(provider, caplog):
    with mock.patch.object(actor.requests, 'Session') as session_mock:
        get_mock = session_mock.return_value.get
        get_mock.side_effect = requests.exceptions.Timeout
            result = provider.translate_uri(PLAYLIST_URI)

        result = provider.translate_uri(URI)
        assert result is None

    get_mock.assert_called_once_with(URI, timeout=1.0, stream=True)
    assert result is None
    assert (
        'Download of stream playlist (%s) failed due to connection '
        'timeout after 1.000s' % URI in caplog.text())
    @responses.activate
    def test_playlist_references_itself(self, scanner, provider, caplog):
        scanner.scan.return_value.mime = 'text/foo'
        responses.add(
            responses.GET, PLAYLIST_URI,
            body=BODY.replace(STREAM_URI, PLAYLIST_URI),
            content_type='audio/x-mpegurl')

        result = provider.translate_uri(PLAYLIST_URI)


@responses.activate
def test_translate_uri_times_out_if_download_is_slow(provider, caplog):
    responses.add(
        responses.GET, URI, body=BODY, content_type='audio/x-mpegurl')

    with mock.patch.object(actor, 'time') as time_mock:
        time_mock.time.side_effect = [0, TIMEOUT + 1]

        result = provider.translate_uri(URI)

    assert result is None
    assert (
        'Download of stream playlist (%s) failed due to download taking '
        'more than 1.000s' % URI in caplog.text())
        assert 'Unwrapping stream from URI: %s' % PLAYLIST_URI in caplog.text()
        assert (
            'Parsed playlist (%s) and found new URI: %s'
            % (PLAYLIST_URI, PLAYLIST_URI)) in caplog.text()
        assert (
            'Unwrapping stream from URI (%s) failed: '
            'playlist referenced itself' % PLAYLIST_URI) in caplog.text()
        assert result is None
@@ -58,17 +58,17 @@ class TestExtension(object):
    def test_get_cache_dir_raises_assertion_error(self, extension):
        config = {'core': {'cache_dir': '/tmp'}}
        with pytest.raises(AssertionError):  # ext_name not set
            extension.get_cache_dir(config)
            ext.Extension.get_cache_dir(config)

    def test_get_config_dir_raises_assertion_error(self, extension):
        config = {'core': {'config_dir': '/tmp'}}
        with pytest.raises(AssertionError):  # ext_name not set
            extension.get_config_dir(config)
            ext.Extension.get_config_dir(config)

    def test_get_data_dir_raises_assertion_error(self, extension):
        config = {'core': {'data_dir': '/tmp'}}
        with pytest.raises(AssertionError):  # ext_name not set
            extension.get_data_dir(config)
            ext.Extension.get_data_dir(config)

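These tests now call the dir-lookup classmethods on `ext.Extension` directly, which must fail because the base class leaves `ext_name` unset. A simplified sketch of that guard pattern, with hypothetical class names (the real `mopidy.ext.Extension` also creates the directory and respects the :confval:`core/cache_dir` style config values):

```python
import os


class ExtensionSketch:
    # Stand-in for a base extension class: subclasses must set ext_name
    # before the per-extension directory helpers can be used.
    ext_name = None

    @classmethod
    def get_cache_dir(cls, config):
        assert cls.ext_name is not None, 'ext_name must be set by subclass'
        return os.path.join(config['core']['cache_dir'], cls.ext_name)


class StreamSketch(ExtensionSketch):
    ext_name = 'stream'
```

Calling `ExtensionSketch.get_cache_dir(...)` raises `AssertionError`, which is exactly what the tests above pin down; `StreamSketch.get_cache_dir(...)` resolves to a per-extension subdirectory.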
class TestLoadExtensions(object):