Getting error while running Covalent Machine Learning Tutorial Code on Google Colab #111
iotaisolutions started this conversation in General
Replies: 2 comments
-
@iotaisolutions The first thing to check is whether the server processes were actually started. On a Mac, you can list the running Covalent processes to confirm (see the sketch below). If the servers weren't actually started, we can home in on that. If, on the other hand, the servers are indeed active, would you mind posting what your workflow looks like?
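A minimal sketch of that check (generic commands, not taken from this thread; `covalent status` and the `ps` filter are assumptions about how you might inspect the processes):

```
# Inside a Colab cell the ! prefix runs a shell command; on a Mac
# terminal, drop the !.
!covalent status            # reports whether the dispatcher/UI servers are running
!ps aux | grep -i covalent  # shows any covalent server processes
```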
-
@iotaisolutions Can you try checking whether Python 3.8 is being used? Currently, Python 3.7 is not supported.
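A quick way to confirm the interpreter version from inside the notebook (a generic check, not Covalent-specific):

```
import sys

# Older Colab runtimes ship Python 3.7, which is too old here.
print(sys.version)
assert sys.version_info >= (3, 8), "Covalent requires Python 3.8 or newer"
```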
-
!covalent restart
tornado.application - ERROR - Exception in callback functools.partial(<function wrap.<locals>.null_wrapper at 0x7f3abd57aa70>, <Future finished exception=StreamClosedError('Stream is closed')>)
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/tornado/gen.py", line 1141, in run
yielded = self.gen.throw(*exc_info)
File "/usr/local/lib/python3.7/dist-packages/tornado/tcpclient.py", line 232, in connect
af, addr, stream = yield connector.start(connect_timeout=timeout)
File "/usr/local/lib/python3.7/dist-packages/tornado/gen.py", line 1133, in run
value = future.result()
File "/usr/local/lib/python3.7/dist-packages/tornado/tcpclient.py", line 112, in on_connect_done
stream = future.result()
tornado.iostream.StreamClosedError: Stream is closed
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/tornado/ioloop.py", line 758, in _run_callback
ret = callback()
File "/usr/local/lib/python3.7/dist-packages/tornado/stack_context.py", line 300, in null_wrapper
return fn(*args, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/tornado/gen.py", line 1233, in inner
self.run()
File "/usr/local/lib/python3.7/dist-packages/tornado/gen.py", line 1173, in run
future_set_exc_info(self.result_future, sys.exc_info())
File "/usr/local/lib/python3.7/dist-packages/tornado/concurrent.py", line 643, in future_set_exc_info
future.set_exception(exc_info[1])
asyncio.base_futures.InvalidStateError: invalid state
Covalent dispatcher server has restarted on port http://0.0.0.0:48008/.
Covalent UI server was not running.
Covalent UI server has started at http://0.0.0.0:47007/
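One way to sanity-check from the notebook that the dispatcher is actually listening before dispatching (a hedged probe against the address printed above, not a step from the tutorial):

```
import requests

# Any HTTP response (even a 404) means the port is open; a
# ConnectionError means the dispatcher is not really running.
resp = requests.get("http://0.0.0.0:48008/", timeout=5)
print(resp.status_code)
```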
dispatch_id = ct.dispatch(workflow)()
HTTPError Traceback (most recent call last)
in <module>()
----> 1 dispatch_id = ct.dispatch(workflow)()
1 frames
/usr/local/lib/python3.7/dist-packages/requests/models.py in raise_for_status(self)
939
940 if http_error_msg:
--> 941 raise HTTPError(http_error_msg, response=self)
942
943 def close(self):
HTTPError: 500 Server Error: INTERNAL SERVER ERROR for url: http://0.0.0.0:48008/api/submit
Could you please suggest what step I am missing, or how I can resolve the issue?
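For reference, a minimal sketch of the dispatch pattern the tutorial follows (a generic `@ct.electron`/`@ct.lattice` example, not the actual tutorial workflow):

```
import covalent as ct

@ct.electron
def add(x, y):
    # A single task executed by the Covalent server.
    return x + y

@ct.lattice
def workflow(a, b):
    # Compose electrons into a dispatchable workflow.
    return add(a, b)

# Dispatch to the local server and block until the result is ready.
dispatch_id = ct.dispatch(workflow)(1, 2)
result = ct.get_result(dispatch_id, wait=True)
print(result.result)
```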