Merge branch 'devel' of https://github.com/foosel/OctoPrint into mrbeam

commit d80cbcdee5
21 changed files with 259 additions and 167 deletions
CHANGELOG.md (14 changes)

@@ -41,16 +41,27 @@

+* The "Slicing done" notification is now colored green ([#558](https://github.com/foosel/OctoPrint/issues/558)).
+* File management now supports STL files as first class citizens (including UI adjustments to allow management of
+  uploaded STL files including removal and reslicing) and also allows folders (not yet supported by UI)
+* Also interpret lines starting with "!!" as errors
+* Added deletion of pyc files to the `python setup.py clean` command

 ### Bug Fixes

+* [#435](https://github.com/foosel/OctoPrint/issues/435) - Always interpret negative duration (e.g. for print time left)
+  as 0
+* [#633](https://github.com/foosel/OctoPrint/issues/633) - Correctly interpret temperature lines from multi extruder
+  setups under Smoothieware
+* Various fixes of bugs in newly introduced features and improvements:
+  * [#625](https://github.com/foosel/OctoPrint/pull/625) - Newly added GCODE files were not being added to the analysis
+    queue

-## 1.1.1 (Unreleased)
+## 1.1.2 (Unreleased)

 ### Bug Fixes

+* [#634](https://github.com/foosel/OctoPrint/pull/634) - Fixed missing `branch` fields in version dicts generated
+  by versioneer

 ## 1.1.1 (2014-10-27)

 ### Improvements

@@ -58,6 +69,7 @@

   server start and written back into ``config.yaml``
 * Event subscriptions are now enabled by default (it was an accident that they weren't)
 * Generate the key used for session hashing individually for each server instance
+* Generate the salt used for hashing user passwords individually for each server instance

 ### Bug Fixes
README.md (27 changes)

@@ -29,43 +29,45 @@ OctoPrint via `setup.py`:

     python setup.py install

-You should also do this after pulling from the repository, since the dependencies might have changed.
+You should also do this every time after pulling from the repository, since the dependencies might have changed.

 OctoPrint currently only supports Python 2.7.

 Usage
 -----

-From the source directory you can start the server via
+Running the `setup.py` script installs the `octoprint` script in your Python installation's scripts folder
+(which depending on whether you installed OctoPrint globally or into a virtual env will be on your `PATH` or not). The
+following usage examples assume that said `octoprint` script is on your `PATH`.

-    ./run
+You can start the server via
+
+    octoprint

 By default it binds to all interfaces on port 5000 (so pointing your browser to `http://127.0.0.1:5000`
 will do the trick). If you want to change that, use the additional command line parameters `host` and `port`,
 which accept the host ip to bind to and the numeric port number respectively. If for example you want the server
 to only listen on the local interface on port 8080, the command line would be

-    ./run --host=127.0.0.1 --port=8080
+    octoprint --host=127.0.0.1 --port=8080

 Alternatively, the host and port on which to bind can be defined via the configuration.

 If you want to run OctoPrint as a daemon (only supported on Linux), use

-    ./run --daemon {start|stop|restart} [--pid PIDFILE]
+    octoprint --daemon {start|stop|restart} [--pid PIDFILE]

 If you do not supply a custom pidfile location via `--pid PIDFILE`, it will be created at `/tmp/octoprint.pid`.

 You can also specify the configfile or the base directory (for basing off the `uploads`, `timelapse` and `logs` folders),
 e.g.:

-    ./run --config /path/to/another/config.yaml --basedir /path/to/my/basedir
+    octoprint --config /path/to/another/config.yaml --basedir /path/to/my/basedir

-See `run --help` for further information.
-
-Running the `setup.py` script also installs the `octoprint` startup script in your Python installation's scripts folder
-(which depending on whether you installed OctoPrint globally or into a virtual env will be on your `PATH` or not). The
-examples above also work with that startup script as it excepts the same parameters as `run`.
+See `octoprint --help` for further information.
+
+OctoPrint also ships with a `run` script in its source directory. You can also invoke that to start up the server, it
+takes the same command line arguments as the `octoprint` script.

 Configuration
 -------------

@@ -75,4 +77,5 @@ which is located at `~/.octoprint` on Linux, at `%APPDATA%/OctoPrint` on Windows
 at `~/Library/Application Support/OctoPrint` on MacOS.

 A comprehensive overview of all available configuration settings can be found
-[on the wiki](https://github.com/foosel/OctoPrint/wiki/Configuration).
+[on the wiki](https://github.com/foosel/OctoPrint/wiki/Configuration). Please note that the most commonly used
+configuration settings can also easily be edited from OctoPrint's settings dialog.
@@ -328,6 +328,7 @@ SlicingStarted

 * ``stl``: the STL's filename
 * ``gcode``: the sliced GCODE's filename
+* ``progressAvailable``: true if progress information via the ``slicingProgress`` push update will be available, false if not

 SlicingDone
   The slicing of a file has completed.
setup.py (43 changes)

@@ -33,6 +33,26 @@ def package_data_dirs(source, sub_folders):
     return dirs


+def _recursively_handle_files(directory, file_matcher, folder_handler=None, file_handler=None):
+    applied_handler = False
+
+    for filename in os.listdir(directory):
+        path = os.path.join(directory, filename)
+
+        if file_handler is not None and file_matcher(filename):
+            file_handler(path)
+            applied_handler = True
+
+        elif os.path.isdir(path):
+            sub_applied_handler = _recursively_handle_files(path, file_matcher, folder_handler=folder_handler, file_handler=file_handler)
+            if sub_applied_handler:
+                applied_handler = True
+
+            if folder_handler is not None:
+                folder_handler(path, sub_applied_handler)
+
+    return applied_handler
+
+
 class CleanCommand(Command):
     description = "clean build artifacts"
     user_options = []

@@ -45,14 +65,37 @@ class CleanCommand(Command):
         pass

     def run(self):
         # build folder
         if os.path.exists('build'):
             print "Deleting build directory"
             shutil.rmtree('build')

         # eggs
         eggs = glob.glob('OctoPrint*.egg-info')
         for egg in eggs:
             print "Deleting %s directory" % egg
             shutil.rmtree(egg)

+        # pyc files
+        def delete_folder_if_empty(path, applied_handler):
+            if not applied_handler:
+                return
+            if len(os.listdir(path)) == 0:
+                shutil.rmtree(path)
+                print "Deleted %s since it was empty" % path
+
+        def delete_file(path):
+            os.remove(path)
+            print "Deleted %s" % path
+
+        import fnmatch
+        _recursively_handle_files(
+            os.path.abspath("src"),
+            lambda name: fnmatch.fnmatch(name.lower(), "*.pyc"),
+            folder_handler=delete_folder_if_empty,
+            file_handler=delete_file
+        )


 class NewTranslation(Command):
     description = "create a new translation"
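The `_recursively_handle_files` helper added to `setup.py` above implements a depth-first walk with a file handler plus a folder handler that runs after the folder's contents were processed, so emptied folders can be pruned. A minimal Python 3 sketch of the same pattern, independent of OctoPrint (the temp-directory setup and names here are illustrative):

```python
import fnmatch
import os
import shutil
import tempfile

def recursively_handle_files(directory, file_matcher, folder_handler=None, file_handler=None):
    """Walk directory depth-first; apply file_handler to matching files,
    then let folder_handler act on each subfolder afterwards."""
    applied_handler = False
    for filename in os.listdir(directory):
        path = os.path.join(directory, filename)
        if file_handler is not None and file_matcher(filename):
            file_handler(path)
            applied_handler = True
        elif os.path.isdir(path):
            sub_applied = recursively_handle_files(path, file_matcher,
                                                   folder_handler=folder_handler,
                                                   file_handler=file_handler)
            applied_handler = applied_handler or sub_applied
            if folder_handler is not None:
                folder_handler(path, sub_applied)
    return applied_handler

# usage: delete *.pyc files and prune the folders that become empty
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "pkg"))
open(os.path.join(root, "pkg", "mod.pyc"), "w").close()

def delete_folder_if_empty(path, applied_handler):
    if applied_handler and len(os.listdir(path)) == 0:
        shutil.rmtree(path)

recursively_handle_files(root,
                         lambda name: fnmatch.fnmatch(name.lower(), "*.pyc"),
                         folder_handler=delete_folder_if_empty,
                         file_handler=os.remove)
print(os.listdir(root))  # the emptied "pkg" folder is gone
```

The `applied_handler` flag is what lets the folder handler distinguish "nothing matched in here" from "this folder was just emptied by us".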
@@ -214,7 +214,7 @@ def versions_from_parentdir(parentdir_prefix, root, verbose=False):
         print("guessing rootdir is '%s', but '%s' doesn't start with prefix '%s'" %
               (root, dirname, parentdir_prefix))
         return None
-    return {"version": dirname[len(parentdir_prefix):], "full": ""}
+    return {"version": dirname[len(parentdir_prefix):], "full": "", "branch": ""}

 tag_prefix = ""
 parentdir_prefix = ""

@@ -249,7 +249,7 @@ def parse_lookup_file(root, lookup_path=None):
             break
     return lookup

-def get_versions(default={"version": "unknown", "full": ""}, lookup_path=None, verbose=False):
+def get_versions(default={"version": "unknown", "full": "", "branch": "unknown"}, lookup_path=None, verbose=False):
     # I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have
     # __file__, we can work backwards from there to the root. Some
     # py2exe/bbfreeze/non-CPython implementations don't do __file__, in which
@@ -200,9 +200,11 @@ class FileManager(object):
         if dest_job_key in self._slicing_jobs:
             del self._slicing_jobs[dest_job_key]

+        slicer = self._slicing_manager.get_slicer(slicer_name)
+
         import time
         start_time = time.time()
-        eventManager().fire(Events.SLICING_STARTED, {"stl": source_path, "gcode": dest_path})
+        eventManager().fire(Events.SLICING_STARTED, {"stl": source_path, "gcode": dest_path, "progressAvailable": slicer.get_slicer_properties()["progress_report"] if slicer else False})

         import tempfile
         f = tempfile.NamedTemporaryFile(suffix=".gco", delete=False)

@@ -266,10 +268,11 @@ class FileManager(object):
     def add_file(self, destination, path, file_object, links=None, allow_overwrite=False):
         file_path = self._storage(destination).add_file(path, file_object, links=links, allow_overwrite=allow_overwrite)
         absolute_path = self._storage(destination).get_absolute_path(file_path)
-        file_type = get_file_type(file_path)[-1]
-
-        queue_entry = QueueEntry(file_path, file_type, destination, absolute_path)
-        self._analysis_queue.enqueue(queue_entry, high_priority=True)
+        file_type = get_file_type(absolute_path)
+        if file_type:
+            queue_entry = QueueEntry(file_path, file_type[-1], destination, absolute_path)
+            self._analysis_queue.enqueue(queue_entry, high_priority=True)

         eventManager().fire(Events.UPDATED_FILES, dict(type="printables"))
         return file_path
@@ -199,6 +199,7 @@ class SlicerPlugin(Plugin):
             type=None,
             name=None,
             same_device=True,
+            progress_report=False
         )

     def get_slicer_profile_options(self):
@@ -194,7 +194,8 @@ class CuraPlugin(octoprint.plugin.SlicerPlugin,
         return dict(
             type="cura",
             name="CuraEngine",
-            same_device=True
+            same_device=True,
+            progress_report=True
         )

     def get_slicer_default_profile(self):
@@ -457,55 +457,70 @@ class Profile(object):

     @classmethod
     def merge_profile(cls, profile, overrides=None):
-        import copy
-
-        result = copy.deepcopy(defaults)
-        for k in result.keys():
-            profile_value = None
-            override_value = None
-
-            if k in profile:
-                profile_value = profile[k]
-            if overrides and k in overrides:
-                override_value = overrides[k]
-
-            if profile_value is None and override_value is None:
-                # neither override nor profile, no need to handle this key further
-                continue
-
-            if k in ("filament_diameter", "print_temperature", "start_gcode", "end_gcode"):
-                # the array fields need some special treatment. Basically something like this:
-                #
-                # override_value: [None, "b"]
-                # profile_value : ["a" , None, "c"]
-                # default_value : ["d" , "e" , "f", "g"]
-                #
-                # should merge to something like this:
-                #
-                # ["a" , "b" , "c", "g"]
-                #
-                # So override > profile > default, if neither override nor profile value are available
-                # the default value should just be left as is
-
-                for x in xrange(len(result[k])):
-                    if override_value is not None and x < len(override_value) and override_value[x] is not None:
-                        # we have an override value for this location, so we use it
-                        result[k][x] = override_value[x]
-                    elif profile_value is not None and x < len(profile_value) and profile_value[x] is not None:
-                        # we have a profile value for this location, so we use it
-                        result[k][x] = profile_value[x]
-
-            else:
-                # just change the result value to the override_value if available, otherwise to the profile_value if
-                # that is given, else just leave as is
-                if override_value is not None:
-                    result[k] = override_value
-                elif profile_value is not None:
-                    result[k] = profile_value
+        result = dict()
+        for key in defaults.keys():
+            r = cls.merge_profile_key(key, profile, overrides=overrides)
+            if r is not None:
+                result[key] = r
         return result

-    def __init__(self, profile):
-        self.profile = profile
+    @classmethod
+    def merge_profile_key(cls, key, profile, overrides=None):
+        profile_value = None
+        override_value = None
+
+        if not key in defaults:
+            return None
+        import copy
+        result = copy.deepcopy(defaults[key])
+
+        if key in profile:
+            profile_value = profile[key]
+        if overrides and key in overrides:
+            override_value = overrides[key]
+
+        if profile_value is None and override_value is None:
+            # neither override nor profile, no need to handle this key further
+            return None
+
+        if key in ("filament_diameter", "print_temperature", "start_gcode", "end_gcode"):
+            # the array fields need some special treatment. Basically something like this:
+            #
+            # override_value: [None, "b"]
+            # profile_value : ["a" , None, "c"]
+            # default_value : ["d" , "e" , "f", "g"]
+            #
+            # should merge to something like this:
+            #
+            # ["a" , "b" , "c", "g"]
+            #
+            # So override > profile > default, if neither override nor profile value are available
+            # the default value should just be left as is
+
+            for x in xrange(len(result)):
+                if override_value is not None and x < len(override_value) and override_value[x] is not None:
+                    # we have an override value for this location, so we use it
+                    result[x] = override_value[x]
+                elif profile_value is not None and x < len(profile_value) and profile_value[x] is not None:
+                    # we have a profile value for this location, so we use it
+                    result[x] = profile_value[x]
+
+        else:
+            # just change the result value to the override_value if available, otherwise to the profile_value if
+            # that is given, else just leave as is
+            if override_value is not None:
+                result = override_value
+            elif profile_value is not None:
+                result = profile_value
+
+        return result
+
+    def __init__(self, profile, overrides=None):
+        self._profile = self.__class__.merge_profile(profile, overrides=overrides)
+
+    def profile(self):
+        import copy
+        return copy.deepcopy(self._profile)

     def get(self, key):
         if key in ("machine_width", "machine_depth", "machine_center_is_zero"):

@@ -547,7 +562,7 @@ class Profile(object):
         if not match:
             return 0.0

-        diameters = defaults["filament_diameter"]
+        diameters = self._get("filament_diameter")
         if not match.group(1):
             return diameters[0]
         index = int(match.group(1))

@@ -560,7 +575,7 @@ class Profile(object):
         if not match:
             return 0.0

-        temperatures = defaults["print_temperature"]
+        temperatures = self._get("print_temperature")
         if not match.group(1):
             return temperatures[0]
         index = int(match.group(1))

@@ -569,12 +584,15 @@ class Profile(object):
             return temperatures[index]

         else:
-            if key in self.profile:
-                return self.profile[key]
-            elif key in defaults:
-                return defaults[key]
-            else:
-                return None
+            return self._get(key)
+
+    def _get(self, key):
+        if key in self._profile:
+            return self._profile[key]
+        elif key in defaults:
+            return defaults[key]
+        else:
+            return None

     def get_int(self, key, default=None):
         value = self.get(key)

@@ -622,8 +640,8 @@ class Profile(object):
     def get_gcode_template(self, key):
         extruder_count = s.globalGetInt(["printerParameters", "numExtruders"])

-        if key in self.profile:
-            gcode = self.profile[key]
+        if key in self._profile:
+            gcode = self._profile[key]
         else:
             gcode = defaults[key]

@@ -646,7 +664,7 @@ class Profile(object):

         import copy
         profile = copy.deepcopy(defaults)
-        profile.update(self.profile)
+        profile.update(self._profile)
         for key in ("print_temperature", "print_temperature2", "print_temperature3", "print_temperature4",
                     "filament_diameter", "filament_diameter2", "filament_diameter3", "filament_diameter4"):
             profile[key] = self.get(key)
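The comment block in `merge_profile_key` above describes the per-index merge rule for the array fields. That rule can be exercised in isolation with a small Python 3 sketch (the values are the illustrative ones from the comment, not a real Cura profile):

```python
import copy

def merge_array_field(default_value, profile_value=None, override_value=None):
    """Per index: take the override if set, else the profile value if set,
    else keep the default. Extra default entries are left untouched."""
    result = copy.deepcopy(default_value)
    for x in range(len(result)):
        if override_value is not None and x < len(override_value) and override_value[x] is not None:
            result[x] = override_value[x]
        elif profile_value is not None and x < len(profile_value) and profile_value[x] is not None:
            result[x] = profile_value[x]
    return result

# the example from the comment: override > profile > default, index by index
merged = merge_array_field(["d", "e", "f", "g"],
                           profile_value=["a", None, "c"],
                           override_value=[None, "b"])
print(merged)  # ['a', 'b', 'c', 'g']
```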
@@ -413,6 +413,9 @@ class Server():
                 "level": "CRITICAL",
                 "handlers": ["serialFile"],
                 "propagate": False
             },
+            "tornado.application": {
+                "level": "ERROR"
+            }
         },
         "root": {
@@ -249,7 +249,7 @@ def login():

     user = octoprint.server.userManager.findUser(username)
     if user is not None:
-        if user.check_password(octoprint.users.UserManager.createPasswordHash(password)):
+        if octoprint.server.userManager.checkPassword(username, password):
             if octoprint.server.userManager is not None:
                 user = octoprint.server.userManager.login_user(user)
                 session["usersession.id"] = user.get_session()
@@ -24,6 +24,7 @@ import tornado.httpclient
 import tornado.http1connection
 import tornado.iostream
+import tornado.tcpserver
 import tornado.util

 import octoprint.util

@@ -707,8 +708,6 @@ class CustomHTTP1ConnectionParameters(tornado.http1connection.HTTP1ConnectionParameters

 class LargeResponseHandler(tornado.web.StaticFileHandler):

-    CHUNK_SIZE = 16 * 1024
-
     def initialize(self, path, default_filename=None, as_attachment=False, access_validation=None):
         tornado.web.StaticFileHandler.initialize(self, os.path.abspath(path), default_filename)
         self._as_attachment = as_attachment

@@ -717,70 +716,18 @@ class LargeResponseHandler(tornado.web.StaticFileHandler):
     def get(self, path, include_body=True):
         if self._access_validation is not None:
             self._access_validation(self.request)

-        path = self.parse_url_path(path)
-        abspath = os.path.abspath(os.path.join(self.root, path))
-        # os.path.abspath strips a trailing /
-        # it needs to be temporarily added back for requests to root/
-        if not (abspath + os.path.sep).startswith(self.root):
-            raise tornado.web.HTTPError(403, "%s is not in root static directory", path)
-        if os.path.isdir(abspath) and self.default_filename is not None:
-            # need to look at the request.path here for when path is empty
-            # but there is some prefix to the path that was already
-            # trimmed by the routing
-            if not self.request.path.endswith("/"):
-                self.redirect(self.request.path + "/")
-                return
-            abspath = os.path.join(abspath, self.default_filename)
-        if not os.path.exists(abspath):
-            raise tornado.web.HTTPError(404)
-        if not os.path.isfile(abspath):
-            raise tornado.web.HTTPError(403, "%s is not a file", path)
-
-        stat_result = os.stat(abspath)
-        modified = datetime.datetime.fromtimestamp(stat_result[stat.ST_MTIME])
-
-        self.set_header("Last-Modified", modified)
-
-        mime_type, encoding = mimetypes.guess_type(abspath)
-        if mime_type:
-            self.set_header("Content-Type", mime_type)
-
-        cache_time = self.get_cache_time(path, modified, mime_type)
-
-        if cache_time > 0:
-            self.set_header("Expires", datetime.datetime.utcnow() +
-                            datetime.timedelta(seconds=cache_time))
-            self.set_header("Cache-Control", "max-age=" + str(cache_time))
-
-        self.set_extra_headers(path)
-
-        # Check the If-Modified-Since, and don't send the result if the
-        # content has not been modified
-        ims_value = self.request.headers.get("If-Modified-Since")
-        if ims_value is not None:
-            date_tuple = email.utils.parsedate(ims_value)
-            if_since = datetime.datetime.fromtimestamp(time.mktime(date_tuple))
-            if if_since >= modified:
-                self.set_status(304)
-                return
-
-        if not include_body:
-            assert self.request.method == "HEAD"
-            self.set_header("Content-Length", stat_result[stat.ST_SIZE])
-        else:
-            with open(abspath, "rb") as file:
-                while True:
-                    data = file.read(LargeResponseHandler.CHUNK_SIZE)
-                    if not data:
-                        break
-                    self.write(data)
-                    self.flush()
+        result = tornado.web.StaticFileHandler.get(self, path, include_body=include_body)
+        return result

     def set_extra_headers(self, path):
         if self._as_attachment:
             self.set_header("Content-Disposition", "attachment")

+    @classmethod
+    def get_content_version(cls, abspath):
+        import os
+        import stat
+        return os.stat(abspath)[stat.ST_MTIME]

 ##~~ URL Forward Handler for forwarding requests to a preconfigured static URL
@@ -131,6 +131,7 @@ default_settings = {
     },
     "accessControl": {
         "enabled": True,
+        "salt": None,
         "userManager": "octoprint.users.FilebasedUserManager",
         "userfile": None,
         "autologinLocal": False,

@@ -172,6 +173,7 @@ default_settings = {
         "includeCurrentToolInTemps": True,
         "hasBed": True,
         "repetierStyleTargetTemperature": False,
+        "smoothieTemperatureReporting": False,
         "extendedSdFileList": False
     }
 }
@@ -73,22 +73,26 @@ class SlicingManager(object):

     @property
     def slicing_enabled(self):
-        return len(self.registered_slicers) > 0
+        return len(self.configured_slicers) > 0

     @property
     def registered_slicers(self):
         return self._slicers.keys()

+    @property
+    def configured_slicers(self):
+        return map(lambda slicer: slicer.get_slicer_properties()["type"], filter(lambda slicer: slicer.is_slicer_configured(), self._slicers.values()))
+
     @property
     def default_slicer(self):
         slicer_name = settings().get(["slicing", "defaultSlicer"])
-        if slicer_name in self.registered_slicers:
+        if slicer_name in self.configured_slicers:
             return slicer_name
         else:
             return None

-    def get_slicer(self, slicer):
-        return self._slicers[slicer] if slicer in self._slicers else None
+    def get_slicer(self, slicer, require_configured=True):
+        return self._slicers[slicer] if slicer in self._slicers and (not require_configured or self._slicers[slicer].is_slicer_configured()) else None

     def slice(self, slicer_name, source_path, dest_path, profile_name, callback, callback_args=None, callback_kwargs=None, overrides=None, on_progress=None, on_progress_args=None, on_progress_kwargs=None):
         if callback_args is None:

@@ -96,8 +100,11 @@ class SlicingManager(object):
         if callback_kwargs is None:
             callback_kwargs = dict()

-        if not slicer_name in self.registered_slicers:
-            error = "No such slicer: {slicer_name}".format(**locals())
+        if not slicer_name in self.configured_slicers:
+            if not slicer_name in self.registered_slicers:
+                error = "No such slicer: {slicer_name}".format(**locals())
+            else:
+                error = "Slicer not configured: {slicer_name}".format(**locals())
             callback_kwargs.update(dict(_error=error))
             callback(*callback_args, **callback_kwargs)
             return False, error
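The change above distinguishes slicers that are merely registered from slicers that are actually configured. A toy Python 3 sketch of that filtering, with a made-up stand-in for the slicer plugin interface (only the two calls used here are modeled):

```python
class FakeSlicer:
    """Illustrative stand-in for a slicer plugin."""
    def __init__(self, slicer_type, configured):
        self._type = slicer_type
        self._configured = configured
    def get_slicer_properties(self):
        return {"type": self._type}
    def is_slicer_configured(self):
        return self._configured

slicers = {
    "cura": FakeSlicer("cura", configured=True),
    "slic3r": FakeSlicer("slic3r", configured=False),
}

# registered: everything known; configured: only slicers ready to be used
registered = list(slicers.keys())
configured = [s.get_slicer_properties()["type"]
              for s in slicers.values() if s.is_slicer_configured()]

print(sorted(registered))  # ['cura', 'slic3r']
print(configured)          # ['cura']
```

Choosing a default slicer or accepting a slice job then checks against `configured`, which is why the error messages can tell "no such slicer" apart from "slicer not configured".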
@@ -181,7 +181,11 @@ function DataUpdater(allViewModels) {
         } else if (type == "SlicingStarted") {
             gcodeUploadProgress.addClass("progress-striped").addClass("active");
             gcodeUploadProgressBar.css("width", "100%");
-            gcodeUploadProgressBar.text(_.sprintf(gettext("Slicing ... (%(percentage)d%%)"), {percentage: 0}));
+            if (payload.progressAvailable) {
+                gcodeUploadProgressBar.text(_.sprintf(gettext("Slicing ... (%(percentage)d%%)"), {percentage: 0}));
+            } else {
+                gcodeUploadProgressBar.text(gettext("Slicing ..."));
+            }
         } else if (type == "SlicingDone") {
             gcodeUploadProgress.removeClass("progress-striped").removeClass("active");
             gcodeUploadProgressBar.css("width", "0%");
@@ -100,7 +100,7 @@
                 </a>
                 <div id="login_dropdown_loggedout" style="padding: 15px" class="dropdown-menu" data-bind="css: {hide: loginState.loggedIn(), 'dropdown-menu': !loginState.loggedIn()}">
                     <label for="login_user">{{ _('Username') }}</label>
-                    <input type="text" id="login_user" placeholder="{{ _('Username') }}">
+                    <input type="text" id="login_user" placeholder="{{ _('Username') }}" autocapitalize="none">
                     <label for="login_pass">{{ _('Password') }}</label>
                     <input type="password" id="login_pass" placeholder="{{ _('Password') }}">
                     <label class="checkbox">
@@ -7,6 +7,7 @@ __copyright__ = "Copyright (C) 2014 The OctoPrint Project - Released under terms

 from flask.ext.login import UserMixin
 from flask.ext.principal import Identity
+from werkzeug.local import LocalProxy
 import hashlib
 import os
 import yaml

@@ -27,7 +28,9 @@ class UserManager(object):
     def login_user(self, user):
         self._cleanup_sessions()

-        if user is None:
+        if user is None \
+                or (isinstance(user, LocalProxy) and not isinstance(user._get_current_object(), User)) \
+                or (not isinstance(user, LocalProxy) and not isinstance(user, User)):
             return None

         if not isinstance(user, SessionUser):

@@ -70,8 +73,38 @@ class UserManager(object):
             self.logout_user(user)

     @staticmethod
-    def createPasswordHash(password):
-        return hashlib.sha512(password + "mvBUTvwzBzD3yPwvnJ4E4tXNf3CGJvvW").hexdigest()
+    def createPasswordHash(password, salt=None):
+        if not salt:
+            salt = settings().get(["accessControl", "salt"])
+            if salt is None:
+                import string
+                from random import choice
+                chars = string.ascii_lowercase + string.ascii_uppercase + string.digits
+                salt = "".join(choice(chars) for _ in xrange(32))
+                settings().set(["accessControl", "salt"], salt)
+                settings().save()
+
+        return hashlib.sha512(password + salt).hexdigest()
+
+    def checkPassword(self, username, password):
+        user = self.findUser(username)
+        if not user:
+            return False
+
+        hash = UserManager.createPasswordHash(password)
+        if user.check_password(hash):
+            # new hash matches, correct password
+            return True
+        else:
+            # new hash doesn't match, but maybe the old one does, so check that!
+            oldHash = UserManager.createPasswordHash(password, salt="mvBUTvwzBzD3yPwvnJ4E4tXNf3CGJvvW")
+            if user.check_password(oldHash):
+                # old hash matches, we migrate the stored password hash to the new one and return True since it's the correct password
+                self.changeUserPassword(username, password)
+                return True
+            else:
+                # old hash doesn't match either, wrong password
+                return False

     def addUser(self, username, password, active, roles):
         pass

@@ -165,7 +198,10 @@ class FilebasedUserManager(UserManager):
         self._dirty = False
         self._load()

-    def addUser(self, username, password, active=False, roles=["user"], apikey=None):
+    def addUser(self, username, password, active=False, roles=None, apikey=None):
+        if not roles:
+            roles = ["user"]
+
         if username in self._users.keys():
             raise UserAlreadyExists(username)
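The `createPasswordHash`/`checkPassword` pair above verifies against the per-instance salt first and falls back to the old hardcoded salt, migrating the stored hash on a match. A Python 3 sketch of that check-then-migrate pattern (the salt values and the in-memory user store here are made up for illustration, not OctoPrint's actual values):

```python
import hashlib

LEGACY_SALT = "old-fixed-salt"       # stand-in for the old hardcoded salt
INSTANCE_SALT = "per-instance-salt"  # stand-in for the generated per-instance salt

def password_hash(password, salt):
    return hashlib.sha512((password + salt).encode("utf-8")).hexdigest()

# user store whose hash was created under the legacy salt
users = {"alice": password_hash("secret", LEGACY_SALT)}

def check_password(username, password):
    stored = users.get(username)
    if stored is None:
        return False
    if stored == password_hash(password, INSTANCE_SALT):
        return True  # already migrated to the new salt
    if stored == password_hash(password, LEGACY_SALT):
        # legacy hash matches: migrate the stored hash, then accept
        users[username] = password_hash(password, INSTANCE_SALT)
        return True
    return False

print(check_password("alice", "secret"))  # True, and the stored hash is migrated
print(users["alice"] == password_hash("secret", INSTANCE_SALT))  # True
print(check_password("alice", "wrong"))   # False
```

The migration only happens on a successful login, because that is the only moment the plaintext password is available to re-hash.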
@@ -580,8 +580,8 @@ class MachineCom(object):
         maxToolNum, parsedTemps = self._parseTemperatures(line)

         # extruder temperatures
-        if not "T0" in parsedTemps.keys() and "T" in parsedTemps.keys():
-            # only single reporting, "T" is our one and only extruder temperature
+        if not "T0" in parsedTemps.keys() and not "T1" in parsedTemps.keys() and "T" in parsedTemps.keys():
+            # no T1 so only single reporting, "T" is our one and only extruder temperature
             toolNum, actual, target = parsedTemps["T"]

             if target is not None:

@@ -591,7 +591,13 @@ class MachineCom(object):
                 self._temp[0] = (actual, oldTarget)
             else:
                 self._temp[0] = (actual, None)
-        elif "T0" in parsedTemps.keys():
+        elif not "T0" in parsedTemps.keys() and "T" in parsedTemps.keys():
+            # Smoothieware sends multi extruder temperature data this way: "T:<first extruder> T1:<second extruder> ..." and therefore needs some special treatment...
+            _, actual, target = parsedTemps["T"]
+            del parsedTemps["T"]
+            parsedTemps["T0"] = (0, actual, target)
+
+        if "T0" in parsedTemps.keys():
             for n in range(maxToolNum + 1):
                 tool = "T%d" % n
                 if not tool in parsedTemps.keys():

@@ -1001,7 +1007,7 @@ class MachineCom(object):

     def _handleErrors(self, line):
         # No matter the state, if we see an error, goto the error state and store the error for reference.
-        if line.startswith('Error:'):
+        if line.startswith('Error:') or line.startswith('!!'):
             #Oh YEAH, consistency.
             # Marlin reports an MIN/MAX temp error as "Error:x\n: Extruder switched off. MAXTEMP triggered !\n"
             # But a bed temp error is reported as "Error: Temperature heated bed switched off. MAXTEMP triggered !!"
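The Smoothieware special case above reduces to: if a temperature report contains a plain `T` entry but no `T0`, treat `T` as tool 0. A simplified Python 3 sketch of that normalization (the parser here is a toy, not OctoPrint's `_parseTemperatures`, and it only keeps `(actual, target)` pairs):

```python
import re

def parse_temperatures(line):
    """Parse e.g. 'T:210.00 /220.00 T1:180.00 /190.00' into
    {label: (actual, target)} for the tool entries."""
    temps = {}
    for label, actual, target in re.findall(r"(T\d*):([\d.]+)\s*/([\d.]+)", line):
        temps[label] = (float(actual), float(target))
    return temps

def normalize_smoothie(parsed):
    # Smoothieware reports the first extruder as plain "T", the rest as "T1", "T2", ...
    if "T" in parsed and "T0" not in parsed:
        parsed["T0"] = parsed.pop("T")
    return parsed

parsed = normalize_smoothie(parse_temperatures("T:210.00 /220.00 T1:180.00 /190.00"))
print(sorted(parsed.keys()))  # ['T0', 'T1']
print(parsed["T0"])           # (210.0, 220.0)
```

After the rename, the rest of the handling can assume a uniform `T0`, `T1`, ... numbering regardless of firmware.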
@@ -213,6 +213,9 @@ class VirtualPrinter():
             allTemps.append((i, self.temp[i], self.targetTemp[i]))
         allTempsString = " ".join(map(lambda x: "T%d:%.2f /%.2f" % x if includeTarget else "T%d:%.2f" % (x[0], x[1]), allTemps))

+        if settings().getBoolean(["devel", "virtualPrinter", "smoothieTemperatureReporting"]):
+            allTempsString = allTempsString.replace("T0:", "T:")
+
         if settings().getBoolean(["devel", "virtualPrinter", "hasBed"]):
             if includeTarget:
                 allTempsString = "B:%.2f /%.2f %s" % (self.bedTemp, self.bedTargetTemp, allTempsString)
|
|||
self.file_manager.slice("some_slicer", octoprint.filemanager.FileDestinations.LOCAL, "source.file", octoprint.filemanager.FileDestinations.LOCAL, "dest.file", callback=callback, callback_args=callback_args)
|
||||
|
||||
# assert that events where fired
|
||||
expected_events = [mock.call(octoprint.filemanager.Events.SLICING_STARTED, {"stl": "source.file", "gcode": "dest.file"}),
|
||||
expected_events = [mock.call(octoprint.filemanager.Events.SLICING_STARTED, {"stl": "source.file", "gcode": "dest.file", "progressAvailable": False}),
|
||||
mock.call(octoprint.filemanager.Events.SLICING_DONE, {"stl": "source.file", "gcode": "dest.file", "time": 15.694000005722046})]
|
||||
self.fire_event.call_args_list = expected_events
|
||||
|
||||
|
|
|
|||
|
|
@@ -473,7 +473,7 @@ def versions_from_parentdir(parentdir_prefix, root, verbose=False):
         print("guessing rootdir is '%%s', but '%%s' doesn't start with prefix '%%s'" %%
               (root, dirname, parentdir_prefix))
         return None
-    return {"version": dirname[len(parentdir_prefix):], "full": ""}
+    return {"version": dirname[len(parentdir_prefix):], "full": "", "branch": ""}

 tag_prefix = "%(TAG_PREFIX)s"
 parentdir_prefix = "%(PARENTDIR_PREFIX)s"

@@ -508,7 +508,7 @@ def parse_lookup_file(root, lookup_path=None):
             break
     return lookup

-def get_versions(default={"version": "unknown", "full": ""}, lookup_path=None, verbose=False):
+def get_versions(default={"version": "unknown", "full": "", "branch": "unknown"}, lookup_path=None, verbose=False):
     # I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have
     # __file__, we can work backwards from there to the root. Some
     # py2exe/bbfreeze/non-CPython implementations don't do __file__, in which

@@ -649,7 +649,8 @@ def versions_from_expanded_variables(variables, tag_prefix, verbose=False):
         if verbose:
             print("no suitable tags, using full revision id")
         return { "version": variables["full"].strip(),
-                 "full": variables["full"].strip() }
+                 "full": variables["full"].strip(),
+                 "branch": ""}


 def versions_from_lookup(lookup, root, verbose=False):

@@ -741,7 +742,7 @@ def versions_from_parentdir(parentdir_prefix, root, verbose=False):
         print("guessing rootdir is '%s', but '%s' doesn't start with prefix '%s'" %
               (root, dirname, parentdir_prefix))
         return None
-    return {"version": dirname[len(parentdir_prefix):], "full": ""}
+    return {"version": dirname[len(parentdir_prefix):], "full": "", "branch": ""}
 import os.path
 import sys

@@ -801,15 +802,13 @@ SHORT_VERSION_PY = """

 version_version = '%(version)s'
 version_full = '%(full)s'
-version_branch = %(branch)r
+version_branch = '%(branch)s'
 def get_versions(default={}, verbose=False):
-    if version_branch:
-        return {'version': version_version, 'full': version_full, 'branch': version_branch}
-    return {'version': version_version, 'full': version_full}
+    return {'version': version_version, 'full': version_full, 'branch': version_branch}

 """

-DEFAULT = {"version": "unknown", "full": "unknown"}
+DEFAULT = {"version": "unknown", "full": "unknown", "branch": "unknown"}

 def versions_from_file(filename):
     versions = {}

@@ -824,6 +823,9 @@ def versions_from_file(filename):
         mo = re.match("version_full = '([^']+)'", line)
         if mo:
             versions["full"] = mo.group(1)
+        mo = re.match("version_branch = '([^']+)'", line)
+        if mo:
+            versions["branch"] = mo.group(1)
     f.close()
     return versions

@@ -866,7 +868,7 @@ def parse_lookup_file(root, lookup_path=None):
     return lookup

 def get_versions(default=DEFAULT, verbose=False):
-    # returns dict with two keys: 'version' and 'full'
+    # returns dict with three keys: 'version', 'full' and 'branch'
     assert versionfile_source is not None, "please set versioneer.versionfile_source"
     assert tag_prefix is not None, "please set versioneer.tag_prefix"
     assert parentdir_prefix is not None, "please set versioneer.parentdir_prefix"
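The `version_branch` extraction added to `versions_from_file` above is plain regex matching and can be exercised standalone. A Python 3 sketch with a fabricated `_version.py` body (the version string, hash, and branch below are made up for illustration):

```python
import re

SAMPLE = """\
version_version = '1.2.0'
version_full = 'd80cbcdee5abc'
version_branch = 'devel'
"""

def versions_from_text(text):
    """Pull the version, full and branch fields out of a generated _version file."""
    versions = {}
    patterns = {
        "version": r"version_version = '([^']+)'",
        "full": r"version_full = '([^']+)'",
        "branch": r"version_branch = '([^']+)'",
    }
    for line in text.splitlines():
        for key, pattern in patterns.items():
            mo = re.match(pattern, line)
            if mo:
                versions[key] = mo.group(1)
    return versions

print(versions_from_text(SAMPLE))
# {'version': '1.2.0', 'full': 'd80cbcdee5abc', 'branch': 'devel'}
```

Keeping the branch as a third field (instead of a second format) is what lets older consumers that only look at `version` and `full` keep working, which is the point of #634.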