Hi, I have a problem importing the Python Inference Engine API in the latest versions (R2, R2.01).
Python 3.6.5 |Anaconda, Inc.| (default, Apr 29 2018, 16:14:56)
[GCC 7.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import openvino
>>> from openvino import inference_engine
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/intel/openvino_2019.2.242/python/python3.6/openvino/inference_engine/__init__.py", line 1, in <module>
from .ie_api import *
ImportError: /opt/intel/openvino_2019.2.242/python/python3.6/openvino/inference_engine/ie_api.so: undefined symbol: PyFPE_jbuf
I followed all the install steps correctly, including running the demos and configuring the environment with <INSTALL_DIR>/bin/setupvars.sh.
It works for older versions below R2.01, but I need an R2.x version to optimize a BERT NLP model.
I am testing on an AWS t2.xlarge instance (Ubuntu 16.04) with the following CPU (no GPU):
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
Byte Order: Little Endian
CPU(s): 4
On-line CPU(s) list: 0-3
Thread(s) per core: 1
Core(s) per socket: 4
Socket(s): 1
NUMA node(s): 1
Vendor ID: GenuineIntel
CPU family: 6
Model: 79
Model name: Intel(R) Xeon(R) CPU E5-2686 v4 @ 2.30GHz
Stepping: 1
CPU MHz: 2300.036
BogoMIPS: 4600.07
Hypervisor vendor: Xen
Virtualization type: full
L1d cache: 32K
L1i cache: 32K
L2 cache: 256K
L3 cache: 46080K
NUMA node0 CPU(s): 0-3
Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx pdpe1gb rdtscp lm constant_tsc rep_good nopl xtopology pni pclmulqdq ssse3 fma cx16 pcid sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm abm invpcid_single kaiser fsgsbase bmi1 avx2 smep bmi2 erms invpcid xsaveopt
Shubha R. (Intel) wrote:
Dear kim, donggyu,
Your PYTHONPATH is incorrect. Whenever this fails,
>>> import openvino
>>> from openvino import inference_engine
That is the reason. Make sure PYTHONPATH is set at the correct directory level to be able to import openvino stuff.
Hope it helps,
Shubha
Dear Shubha, thank you for your comment.
Could you give me some more detailed instructions?
The official OpenVINO Python API documentation says I can set up the environment as follows:
-------------------------------------------------------------------------------------------------------------------
(https://docs.openvinotoolkit.org/latest/_inference_engine_ie_bridges_python_docs_api_overview.html)
Setting Up the Environment
To configure the environment for the Inference Engine Python* API, run:
On Ubuntu* 16.04 or 18.04, CentOS* 7.4 or macOS* 10.x: source <INSTALL_DIR>/bin/setupvars.sh .
On Windows* 10: call <INSTALL_DIR>\deployment_tools\inference_engine\python_api\setenv.bat
The script automatically detects latest installed Python* version and configures required environment if the version is supported. If you want to use certain version of Python*, set the environment variable PYTHONPATH=<INSTALL_DIR>/deployment_tools/inference_engine/python_api/<desired_python_version> after running the environment configuration script.
-------------------------------------------------------------------------------------------------------------------
But I found out that "PYTHONPATH=<INSTALL_DIR>/deployment_tools/inference_engine/python_api/<desired_python_version>" does not apply to the latest versions (there is no such directory in my installation).
So I set the path to "PYTHONPATH=<INSTALL_DIR>/python/<desired_python_version>/openvino/inference_engine" as instructed in
https://software.intel.com/es-es/node/814765
Is the above process incorrect?
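To illustrate what I mean by the directory level, here is a minimal check (the paths are just the ones from my installation; the commented-out line is the level I actually tried):
# Minimal check of the directory level needed for the import.
import importlib.util
import sys
# Too deep: this directory is *inside* the openvino package, so the package
# itself cannot be found from here.
# sys.path.insert(0, "/opt/intel/openvino_2019.2.242/python/python3.6/openvino/inference_engine")
# Correct level: this directory *contains* the openvino package.
sys.path.insert(0, "/opt/intel/openvino_2019.2.242/python/python3.6")
spec = importlib.util.find_spec("openvino")
print("openvino resolves to:", spec.origin if spec else "not found")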
In addition, I currently get the import error below:
-------------------------------------------------------------------------------------------------------------------
ImportError: /opt/intel/openvino_2019.2.242/python/python3.6/openvino/inference_engine/ie_api.so: undefined symbol: PyFPE_jbuf
-------------------------------------------------------------------------------------------------------------------
As far as I know, this error indicates a problem in the compiled "ie_api.so" itself, not in its path.
Thank you again for your comment.
kim, donggyu
Dear kim, donggyu,
PYTHONPATH is set when you run setupvars.sh. Here is the Windows version (setupvars.bat):
set PYTHONPATH=%INTEL_OPENVINO_DIR%\python\python%Major%.%Minor%;%INTEL_OPENVINO_DIR%\python\python3;%PYTHONPATH%
echo PYTHONPATH=%PYTHONPATH%
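On Linux, setupvars.sh does the equivalent thing. A quick sanity check to confirm it took effect, using the install path from your log (adjust it to your setup), could be:
# Sanity check after "source /opt/intel/openvino_2019.2.242/bin/setupvars.sh"
import os
import sys
print("PYTHONPATH:", os.environ.get("PYTHONPATH", "<not set>"))
print("on sys.path:", "/opt/intel/openvino_2019.2.242/python/python3.6" in sys.path)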
However, you're right: the "undefined symbol: PyFPE_jbuf" error, which I missed earlier, has nothing to do with PYTHONPATH. Googling around, though, it has to do with numpy. In fact, this error has nothing to do with OpenVino (or the Model Optimizer).
If you google "undefined symbol: PyFPE_jbuf" you will find several hits. Here is a three-year-old Stack Overflow post. Three years seems like an eternity in tech, but this issue could very well be related to numpy (which OpenVino does depend on, e.g. the Calibration Tool and Model Optimizer).
I believe that there is something buggy with your environment. Fix that, and you won't have this issue.
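A sketch of the kind of environment check I mean, nothing OpenVino-specific, just confirming which interpreter and which numpy are actually being used:
# Environment check: which interpreter and which numpy are actually in use.
import sys
print("interpreter:", sys.executable)
print("version:", sys.version)
try:
    import numpy
    print("numpy:", numpy.__version__, "from", numpy.__file__)
except ImportError as exc:
    print("numpy not importable:", exc)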
Thanks,
Shubha
Python 3.6.8 (default, Oct 7 2019, 12:59:55)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from openvino.inference_engine import IENetwork, IEPlugin
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/intel/openvino_2019.3.334/python/python3.6/openvino/inference_engine/__init__.py", line 1, in <module>
from .ie_api import *
ImportError: /opt/intel/openvino_2019.3.334/python/python3.6/openvino/inference_engine/ie_api.so: undefined symbol: PyFPE_jbuf
>>>
It may be a problem with numpy, but I am mapping my native OpenVINO installation into the Docker container, so maybe something is not being found. Can you help me?
Zeng, Xiaohui (Intel) wrote:
Python 3.6.8 (default, Oct 7 2019, 12:59:55)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from openvino.inference_engine import IENetwork, IEPlugin
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/intel/openvino_2019.3.334/python/python3.6/openvino/inference_engine/__init__.py", line 1, in <module>
from .ie_api import *
ImportError: /opt/intel/openvino_2019.3.334/python/python3.6/openvino/inference_engine/ie_api.so: undefined symbol: PyFPE_jbuf
>>>
It may be a problem with numpy, but I am mapping my native OpenVINO installation into the Docker container, so maybe something is not being found. Can you help me?
I solved it and the solution is as follows:
1. Copy the OpenVINO installation package into the Docker container.
2. Run the ./install.sh script in the installation package.
3. Select the repair option, or press Enter.
Hi,
It seems I am still facing the same issue:
Traceback (most recent call last):
File "/opt/intel/openvino/deployment_tools/inference_engine/samples/python_samples/classification_sample_async/classification_sample_async.py", line 25, in <module>
from openvino.inference_engine import IENetwork, IECore
File "/opt/intel/openvino_2019.3.376/python/python3.6/openvino/inference_engine/__init__.py", line 1, in <module>
from .ie_api import *
ImportError: /opt/intel/openvino_2019.3.376/python/python3.6/openvino/inference_engine/ie_api.so: undefined symbol: PyFPE_jbuf
This issue seems related to Cython and numpy. I have used virtualenv and conda and changed versions many times, but I am still blocked.
With OpenVINO 2018.*, I didn't have this issue.
Thanks for your help.
I was getting this error:
from openvino.inference_engine import IENetwork, IEPlugin
File "/opt/intel/openvino_2020.2.120/python/python3.6/openvino/inference_engine/__init__.py", line 1, in <module>
from .ie_api import *
ImportError: /opt/intel/openvino_2020.2.120/python/python3.6/openvino/inference_engine/ie_api.so: undefined symbol: PyFPE_jbuf
while I was trying out the Smart Video Workshop and executing the tutorial1.py file:
cd $SV/object-detection/Python
python3 tutorial1.py -i $SV/object-detection/Cars\ -\ 1900.mp4 -m $SV/object-detection/mobilenet-ssd/FP32/mobilenet-ssd.xml
I solved it by simply running the setupvars.sh script with Python 3.5 instead of 3.6:
source /opt/intel/openvino/bin/setupvars.sh -pyver 3.5
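If it helps, once the import succeeds you can confirm that the bindings directory matches the interpreter you are running, for example (assuming the default install path):
# After sourcing setupvars.sh with -pyver 3.5, confirm the loaded bindings
# match the interpreter version (the import itself is the real test).
import sys
from openvino import inference_engine
print("interpreter:", sys.version.split()[0])
print("bindings:", inference_engine.__file__)
# The bindings path should contain a python3.x directory matching the
# interpreter version above (e.g. .../python/python3.5/...).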
Hi,
I was getting this error with the latest version (openvino_2020.3.194):
Python 3.6.8 |Anaconda, Inc.| (default, Dec 30 2018, 01:22:34)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from openvino.inference_engine import IECore
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/hsj/intel/openvino_2020.3.194/python/python3.6/openvino/inference_engine/__init__.py", line 1, in <module>
from .ie_api import *
ImportError: /home/hsj/intel/openvino_2020.3.194/python/python3.6/openvino/inference_engine/ie_api.so: undefined symbol: PyFPE_jbuf
I tried to solve it as follows:
1. update numpy (version==1.18.5)
2. update Cython: "python setup.py cython"
but both failed.