Hi there,
I have successfully gotten my Python application to a working stage, but I am struggling to enable NPU delegation for my tflite-runtime. I have read over all the documents I know of on the Torizon pages and in torizon-samples/tflite-rtsp, and have not come up with anything.
How do I modify my Dockerfile, torizonPackages.json, requirements.txt and docker-compose.yml files to make this work? The hardware I have supports CPU, GPU and NPU acceleration.
Yes, I can run the tflite example by pushing the image to my device and then running it. For my application code I used a Torizon IDE Python template, so how do I alter my setup using the Dockerfile, requirements.txt, docker-compose, etc.?
Hi Leon,
How do I run my default Docker & Docker.debug containers so that the dependencies from the recipes folder install correctly? I have attached my two Dockerfiles to this message.
Help on this issue would be much appreciated, as I have been at this for a long time.
Please correct me if I have understood this differently: I understand you are asking how to build the tflite-rtsp demo (which uses the NPU) from torizon-samples using the VS Code extension V2.
I am not sure about the exact changes, as we have not tested this at our end either, but ideally, if you copy the required folder into the existing extension project structure and modify the Dockerfile and Dockerfile.debug accordingly, it should build fine.
Note that the Dockerfile for tflite-rtsp uses a multi-stage build, which we need to replicate. Dockerfile.debug (5.9 KB)
You may need to make some more changes at the end to deploy the right python3 program. I have used the python3 console template to show the changes required.
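To illustrate the multi-stage idea, here is a rough sketch. The image tags, stage names, and paths below are assumptions for illustration only and must be checked against the actual torizon-samples/tflite-rtsp Dockerfile:

```dockerfile
# Hypothetical sketch only: the image tags, stage names, and paths below are
# assumptions, not copied from the real tflite-rtsp Dockerfile.
FROM torizon/debian:3-bookworm AS build
# Build or fetch the tflite_runtime wheel (and any delegate libraries) here.

FROM torizon/wayland-base-vivante:3 AS runtime
# Copy only the artifacts the application needs out of the build stage.
COPY --from=build /build/wheels /tmp/wheels
RUN pip3 install /tmp/wheels/*.whl && rm -rf /tmp/wheels
COPY src/ /home/torizon/src/
CMD ["python3", "/home/torizon/src/main.py"]
```

The point of the two stages is that build tooling stays out of the final image, while the compiled artifacts (the wheel and shared libraries such as libtensorflow-lite) are copied forward with `COPY --from`.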
Hi Ritesh,
I have tried your recommendation and the application built, but I am now getting this error:
"Exception has occurred: ImportError
libtensorflow-lite.so.2.9.1: cannot open shared object file: No such file or directory
  File "/usr/local/lib/python3.11/dist-packages/tflite_runtime/interpreter.py", line 33, in
    from tflite_runtime import _pywrap_tensorflow_interpreter_wrapper as _interpreter_wrapper
  File "/home/agriai/Torizon/mvagriai/src/services.py", line 23, in
    import tflite_runtime.interpreter as tflite
  File "/home/agriai/Torizon/mvagriai/src/main.py", line 26, in
    import services
ImportError: libtensorflow-lite.so.2.9.1: cannot open shared object file: No such file or directory"
I have attached the python application folder below for your review.
Please note that you may need to adapt things in various places according to your use case. At the moment we have only copied the required folder and built the image successfully.
When you deploy the image, you also need to pass the right permissions for accessing the device; please check the torizon-samples repo README.
For the error you are getting, can you share the exact steps you followed? Please also share main.py.
Thanks for sharing the error and the zip; allow me some time to check in detail and get back to you.
I suggest you first use the version I shared, where we modified main.py with the content of object-detection.py, to verify things work in the first place, and then modify it as per your use case.
Coming to the error you shared, it seems like it failed to connect to the device.
Can you check on the device whether you see the deployed Docker image:
docker images
then test starting the Docker image using the command below
I just tried connecting over ssh and I received this error:
The authenticity of host '192.168.1.26 (192.168.1.26)' can't be established.
ED25519 key fingerprint is SHA256:2WwyyPT70cBT/JdBb027FuE/+laQo+/wk2s8R2WxPCU.
This key is not known by any other names
Are you sure you want to continue connecting (yes/no/[fingerprint])? no
Host key verification failed.
Yes, I can confirm similar behaviour. For me, if I build outside VS Code I can see the vx-delegate, but not with VS Code. I have yet to find the reason for this. I am not sure if we hit a similar issue to this post: Tflite RTSP demo for torizoncore not working - #6 by leon.tx
I will check internally with the concerned person and get back to you.
Hi Ritesh,
That's great, because I have been struggling for a long time. I have gotten GPU delegation (tflite.load_delegate('/usr/lib/libvx_delegate.so')) to work, but not the NPU, which is critical for my application, and I am under time pressure. So anything would help, ASAP.
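One note that may help while debugging this: on the i.MX8M Plus the same /usr/lib/libvx_delegate.so serves both the GPU and the NPU, and the tflite-rtsp sample steers between them with the USE_HW_ACCELERATED_INFERENCE and USE_GPU_INFERENCE environment variables rather than with a different delegate path. A hedged sketch of that selection logic (the function names are mine, and the real GPU-vs-NPU decision happens inside the driver stack, not in Python):

```python
import os

VX_DELEGATE_PATH = "/usr/lib/libvx_delegate.so"  # same path used for GPU delegation above

def want_vx_delegate(env=None):
    """Mirror the sample's convention: hardware-accelerated inference unless disabled.

    USE_HW_ACCELERATED_INFERENCE=0 falls back to plain CPU (XNNPACK) inference;
    otherwise the VX delegate is loaded, and USE_GPU_INFERENCE steers GPU vs NPU
    inside the Vivante driver stack.
    """
    env = os.environ if env is None else env
    return env.get("USE_HW_ACCELERATED_INFERENCE", "1") == "1"

def build_delegates():
    # Assumes the tflite_runtime wheel is installed in the container.
    import tflite_runtime.interpreter as tflite
    return [tflite.load_delegate(VX_DELEGATE_PATH)] if want_vx_delegate() else []
```

Keeping the env-var decision separate from the load_delegate call makes the selection logic testable on a host without the runtime installed.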
Quick update: can you try copying mobilenet_v1_1.0_224_quant.tflite from /usr/bin/tensorflow-lite-2.9.1/examples to the same directory as main.py, and updating main.py as below to load the new tflite model.
# Create the tensorflow-lite interpreter
self.interpreter = tf.Interpreter(model_path="mobilenet_v1_1.0_224_quant.tflite",
                                  experimental_delegates=delegates)
Please test and let us know if this works. Also, please share the error logs you get so we can check further.
Did you get time to check? Can you share whether you are able to run with the NPU?
Please note: if you are using a USB camera for testing, please update Torizon OS to 6.4.0. We tested by building our own Docker image using the same Dockerfile we shared with you, and the NPU is working well.
docker run -it --entrypoint=bash --rm -p 8554:8554 -v /dev:/dev -v /tmp:/tmp -v /run/udev/:/run/udev/ --device-cgroup-rule='c 4:* rmw' --device-cgroup-rule='c 13:* rmw' --device-cgroup-rule='c 199:* rmw' --device-cgroup-rule='c 226:* rmw' --device-cgroup-rule='c 81:* rmw' -e ACCEPT_FSL_EULA=1 -e CAPTURE_DEVICE=/dev/video2 -e USE_HW_ACCELERATED_INFERENCE=1 -e USE_GPU_INFERENCE=0 --name tflite-rtsp rkt0589/tflite-rtsp:tc6
root@ddcb0819ffac:/home/torizon# ls
src
root@ddcb0819ffac:/home/torizon# cd src/
root@ddcb0819ffac:/home/torizon/src# python3 main.py
[ WARN:0@4.096] global ./modules/videoio/src/cap_gstreamer.cpp (1405) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
Vx delegate: allowed_cache_mode set to 0.
Vx delegate: allowed_builtin_code set to 0.
Vx delegate: error_during_init set to 0.
Vx delegate: error_during_prepare set to 0.
Vx delegate: error_during_invoke set to 0.
WARNING: Fallback unsupported op 32 to TfLite
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
W [HandleLayoutInfer:272]Op 162: default layout inference pass.
[the line above is repeated a further 12 times]
Sorry for the late reply; I have been away on a work trip. Yes, I finally got it working after I used a quantised tflite model (uint8) and made a tonne of changes to the Dockerfile scripts. Thank you!
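For readers who hit the same wall: TFLite's uint8 quantisation is affine, real = scale * (q - zero_point), which is why switching from a float model also changes the pre- and post-processing. A minimal stdlib illustration (the scale and zero_point values here are made up, not taken from the model above):

```python
def dequantize(q_values, scale, zero_point):
    """Map raw uint8 tensor values back to real numbers (TFLite affine scheme)."""
    return [scale * (q - zero_point) for q in q_values]

def quantize(real_values, scale, zero_point):
    """Inverse mapping: clamp each real value to the nearest uint8 level."""
    return [max(0, min(255, round(r / scale) + zero_point)) for r in real_values]

# With scale=0.5 and zero_point=128, the stored value 130 represents 1.0:
print(dequantize([130], 0.5, 128))  # [1.0]
```

The actual scale and zero_point for a given model come from the interpreter's input/output tensor details ("quantization" parameters), not from hard-coded constants.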
I cannot get Torizon OS to build and I am getting multiple errors! Is there any advice you can give me to get the camera working? Also, I want to eventually get the MIPI CSI-2 driver to work for the VC MIPI IMX327C sensor. Here is the repo for it, but it appears to be for only the Verdin i.MX8M Mini and the Dahlia carrier board. I am using the Verdin i.MX8M Plus and the Mallow carrier board, and I am planning to handle the mismatch in the wiring of the MIPI lanes in a daughter board.
It would be great to get this working, and I would be happy to use Toradex contracting services to do it, if at all possible.
Can you please create a new post for this question and we will check accordingly. Also, I don't see any link, so please do check and share it.
For now, I am marking this post as solved.
Hi Ritesh,
I made a new topic on the Toradex Community page which follows:
Hi there,
I have been trying for over three days now to add the Yocto Project layer for the camera drivers to work on my Verdin iMX8M+ using the AR0521 camera. I keep running into issues with the build after adding the necessary layer as per the instructions for BSP 6. May I please have a compatible image, installable with the Easy Installer, sent to me so I can at least get the camera running, as my application is time critical?
I am having serious trouble building the image with the recommended layer from both the Torizon and Linux pages for the AR0521 camera. Is there any way you have an existing image you can share with me?