MindSpore / mindspore
[CT][MS][OPS][asinh] asinh does not support float64, complex128, or complex64 on Ascend in PyNative mode

ACCEPTED
RFC
Created on 2023-12-29 10:21

name: Bug Report
about: Use this template for reporting a bug
labels: kind/bug

Describe the current behavior / 问题描述 (Mandatory / 必填)

asinh core dumps on Ascend in PyNative mode

Environment / 环境信息 (Mandatory / 必填)

  • Hardware Environment(Ascend/GPU/CPU) / 硬件环境:

Please delete the backend not involved / 请删除不涉及的后端:
/device ascend

  • Software Environment / 软件环境 (Mandatory / 必填):
    -- MindSpore version (e.g., 1.7.0.Bxxx) :
    -- Python version (e.g., Python 3.7.5) :
    -- OS platform and distribution (e.g., Linux Ubuntu 16.04):
    -- GCC/Compiler version (if compiled from source):

  • Execute Mode / 执行模式 (Mandatory / 必填)(PyNative/Graph):

Please delete the mode not involved / 请删除不涉及的模式:
/mode pynative

Related testcase / 关联用例 (Mandatory / 必填)

test_f_asinh_x_0d_float64
test_p_asinh_input_1_fp64
test_p_asinh_input_20x7x88_fp64
test_p_asinh_input_20x7x8x2x3_complex128
test_p_asinh_input_20x7x8x2x3_complex64

Steps to reproduce the issue / 重现步骤 (Mandatory / 必填)

  1. pytest -s -v test_f_asinh.py::test_f_asinh_x_0d_float64
  2. pytest -s -v test_asinh.py::test_p_asinh_input_1_fp64
    pytest -s -v test_asinh.py::test_p_asinh_input_20x7x88_fp64
    pytest -s -v test_asinh.py::test_p_asinh_input_20x7x8x2x3_complex128
    pytest -s -v test_asinh.py::test_p_asinh_input_20x7x8x2x3_complex64
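
For orientation, the failing cases reduce to calling asinh (forward and gradient) on float64 or complex inputs in PyNative mode on Ascend. The sketch below is only an assumption of what the CT suite exercises; the actual test files (test_f_asinh.py, test_asinh.py) are internal and not attached to this issue.

```python
# Hypothetical minimal reproduction; input values and shapes are illustrative only.
import numpy as np
import mindspore as ms
from mindspore import Tensor, ops

ms.set_context(mode=ms.PYNATIVE_MODE, device_target="Ascend")

x = Tensor(np.array(0.5), ms.float64)   # 0-d float64 input, as in test_f_asinh_x_0d_float64

out = ops.asinh(x)                      # forward pass
grad = ms.grad(ops.asinh)(x)            # backward pass; the stack trace in the log section points at the grad path
print(out, grad)
```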

Describe the expected behavior / 预期结果 (Mandatory / 必填)

The test cases pass.

Related log / screenshot / 日志 / 截图 (Mandatory / 必填)

../test_f_asinh.py::test_f_asinh_x_0d_float64 Fatal Python error: Segmentation fault

Thread 0x0000ffff4e9360e0 (most recent call first):
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 379 in _recv
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 407 in _recv_bytes
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 250 in recv
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/pool.py", line 470 in _handle_results
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 870 in run
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 926 in _bootstrap_inner
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 890 in _bootstrap

Thread 0x0000ffff4e1350e0 (most recent call first):
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/pool.py", line 422 in _handle_tasks
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 870 in run
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 926 in _bootstrap_inner
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 890 in _bootstrap

Thread 0x0000ffff4d9340e0 (most recent call first):
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/pool.py", line 413 in _handle_workers
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 870 in run
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 926 in _bootstrap_inner
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 890 in _bootstrap

Thread 0x0000ffff4d1330e0 (most recent call first):
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 379 in _recv
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 407 in _recv_bytes
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 250 in recv
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/managers.py", line 819 in _callmethod
  File "<string>", line 2 in get
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/tbe/common/repository_manager/utils/multiprocess_util.py", line 91 in run
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 926 in _bootstrap_inner
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/threading.py", line 890 in _bootstrap

Current thread 0x0000ffffaa38e010 (most recent call first):
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/mindspore/common/api.py", line 1151 in __call__
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/mindspore/ops/composite/base.py", line 379 in after_grad
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/mindspore/common/api.py", line 121 in wrapper
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/mindspore/ops/composite/base.py", line 392 in after_grad
  File "/data3/CT/jenkins-slave/workspace/1980b_mindspore_ascend_opensource/MindSporeTest/share/grad.py", line 54 in construct
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/mindspore/nn/cell.py", line 475 in _run_construct
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/mindspore/nn/cell.py", line 699 in __call__
  File "/data3/CT/jenkins-slave/workspace/1980b_mindspore_ascend_opensource/MindSporeTest/share/grad.py", line 34 in __call__
  File "/data3/CT/jenkins-slave/workspace/1980b_mindspore_ascend_opensource/MindSporeTest/share/ops/functional/asinh_ops.py", line 59 in grad_mindspore_impl
  File "/data3/CT/jenkins-slave/workspace/1980b_mindspore_ascend_opensource/MindSporeTest/share/ops/functional/asinh_ops.py", line 104 in grad_cmp
  File "/data3/CT/jenkins-slave/workspace/1980b_mindspore_ascend_opensource/MindSporeTest/operations/test_f_asinh.py", line 210 in test_f_asinh_x_0d_float64
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/python.py", line 167 in pytest_pyfunc_call
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 87 in <lambda>
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 93 in _hookexec
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/python.py", line 1445 in runtest
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/runner.py", line 134 in pytest_runtest_call
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 87 in <lambda>
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 93 in _hookexec
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/runner.py", line 210 in <lambda>
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/runner.py", line 237 in from_call
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/runner.py", line 210 in call_runtest_hook
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/runner.py", line 185 in call_and_report
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/runner.py", line 99 in runtestprotocol
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/runner.py", line 84 in pytest_runtest_protocol
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 87 in <lambda>
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 93 in _hookexec
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/main.py", line 271 in pytest_runtestloop
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 87 in <lambda>
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 93 in _hookexec
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/main.py", line 247 in _main
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/main.py", line 197 in wrap_session
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/main.py", line 240 in pytest_cmdline_main
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 87 in <lambda>
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/manager.py", line 93 in _hookexec
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/site-packages/_pytest/config/__init__.py", line 93 in main
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/bin/pytest", line 8 in <module>
/bin/sh: line 1: 2977437 Segmentation fault      (core dumped) pytest -sv --timeout 900 ../test_f_asinh.py::test_f_asinh_x_0d_float64
Process ForkServerPoolWorker-4:
Traceback (most recent call last):
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/pool.py", line 127, in worker
    put((job, i, result))
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/queues.py", line 364, in put
    self._writer.send_bytes(obj)
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 404, in _send_bytes
    self._send(header + buf)
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/pool.py", line 132, in worker
    put((job, i, (False, wrapped)))
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/queues.py", line 364, in put
    self._writer.send_bytes(obj)
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 404, in _send_bytes
    self._send(header + buf)
  File "/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe
[Identical BrokenPipeError tracebacks follow for ForkServerPoolWorker-9, -3, -5, -2, -8, -6, and -7.]
/home/ci/miniconda3/envs/torch_1.12_op3.7/lib/python3.7/multiprocessing/semaphore_tracker.py:144: UserWarning: semaphore_tracker: There appear to be 95 leaked semaphores to clean up at shutdown
  len(cache))

Special notes for this issue/备注 (Optional / 选填)

Comments (5)

tanxinglian created this Bug-Report
tanxinglian added the kind/bug label
tanxinglian added the sig/ops label
tanxinglian added the attr/function label
tanxinglian added the v2.3.0 label

Preliminary analysis:
1. After the framework switched the execution flow, float64 is no longer supported (flow-switch PR: 62720); the error is shown below:
[screenshot]
2. After the framework added the fallback flow, falling back to CPU execution core dumps (fallback PR: 62944).
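
Until the dtype support lands, one caller-side mitigation (a sketch, not an official workaround) is to cast down to a dtype the Ascend PyNative path handles and cast back; this trades away float64 precision and does not help the complex cases.

```python
# Hypothetical workaround sketch: compute asinh in float32 on Ascend, then restore
# the caller's dtype. Precision is reduced to float32; complex inputs are not covered.
import numpy as np
import mindspore as ms
from mindspore import Tensor, ops

ms.set_context(mode=ms.PYNATIVE_MODE, device_target="Ascend")

x = Tensor(np.array([1.0, 2.0]), ms.float64)
y = ops.cast(ops.asinh(ops.cast(x, ms.float32)), ms.float64)
print(y)
```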

i-robot added the gitee label

On the 2.3 branch, the Ascend backend in PyNative mode does not support float64; this needs to be converted into a feature requirement.

gaoshuanglong changed the milestone from B-SIG-Kit to B-SIG-OPS
gaoshuanglong set the associated branch to r2.3
liangchenghui changed the title
panzhihui changed the task status from TODO to ACCEPTED

2024-01-15 CCB conclusion: this is an incomplete data-type support issue; handle it as a feature requirement.

liangchenghui changed the milestone from B-SIG-OPS to unset
liangchenghui changed the task type from Bug-Report to RFC
liangchenghui added the ccb/rfc label
tanxinglian changed the title

Graph mode via the kbk (kernel-by-kernel) path does not support these types either.
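
For completeness, a hedged sketch of checking the graph-mode path: whether this actually selects the kbk executor may depend on jit_level configuration not shown here; the snippet only switches the compile/execute mode.

```python
# Hypothetical graph-mode check; values are illustrative only.
import numpy as np
import mindspore as ms
from mindspore import Tensor, ops

ms.set_context(mode=ms.GRAPH_MODE, device_target="Ascend")

@ms.jit
def asinh_net(x):
    return ops.asinh(x)

print(asinh_net(Tensor(np.array([1.0]), ms.float64)))
```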
