Background
- Onnxruntime is an open-source, high-performance inference engine that accelerates machine learning models in the Open Neural Network Exchange (ONNX) format.
- Building it for Android was a huge challenge. I followed some patterns from termux-packages and, after many days of trial and error, I was able to build it.

Custom recipe link
- I have added a custom recipe for onnxruntime; here is the recipe link. It builds the cmake part first and then builds the wheel.

Constraints
- numpy 1.26.5, because of the "opencv 4.5.1 build fails with numpy 2.3.0 (all from current develop branch)" issue (#3203), as I also need opencv in my project.
- ndk 25b; that problem is also described in the above-mentioned issue.

My thoughts
I'm not a build expert on cmake, wheel, etc. I tried my best to make it work, and I am sure the developers can make it more robust.
Please feel free to let me know your thoughts. Thank you
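For context, the "cmake part first, then the wheel" flow the recipe follows corresponds roughly to onnxruntime's stock Android build invocation. The sketch below is illustrative only: the SDK/NDK environment variables, ABI, and API level are assumptions, not the recipe's exact values.

```
# Run from an onnxruntime source checkout. build.sh drives the CMake
# build and, with --build_wheel, packages the Python wheel afterwards.
./build.sh --android \
    --android_sdk_path "$ANDROID_HOME" \
    --android_ndk_path "$ANDROID_NDK_HOME" \
    --android_abi arm64-v8a \
    --android_api 24 \
    --config Release \
    --build_wheel \
    --parallel
```

The recipe wraps an equivalent two-step sequence so the cross-compiled native libraries are in place before the wheel is assembled.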