Is CoreML fast? Comparing inference time with PyTorch Mobile
We tried comparing them.
I want to run a model that cannot be converted to CoreML on iOS
Some models written in PyTorch contain operations that coremltools does not support, which makes converting them to CoreML difficult.
In such cases, you can still run the model on iOS by executing it as TorchScript with PyTorch Mobile.
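As a rough sketch of that workflow, a PyTorch model can be traced into TorchScript, optimized for mobile, and saved for the lite interpreter that PyTorch Mobile loads on iOS. The tiny model and file name below are placeholders, not the actual YOLOv5 setup:

```python
# Sketch: exporting a PyTorch model to TorchScript for PyTorch Mobile.
# The tiny model here is a placeholder; in practice you would trace
# your own model (e.g. a YOLOv5 checkpoint).
import torch
import torch.nn as nn
from torch.utils.mobile_optimizer import optimize_for_mobile

model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU()).eval()
example = torch.rand(1, 3, 224, 224)

# trace the model into a TorchScript module
scripted = torch.jit.trace(model, example)

# apply mobile-specific optimizations and save for the lite interpreter
mobile_module = optimize_for_mobile(scripted)
mobile_module._save_for_lite_interpreter("model.ptl")
```

The resulting `.ptl` file is what the iOS app bundles and runs through LibTorch.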
CoreML can run on the Apple Neural Engine and is optimized for the device, but how different is the execution speed in practice?
Comparing with YOLOv5
Use this for PyTorch Mobile
GitHub - pytorch/ios-demo-app: PyTorch iOS examples
Use this for CoreML
GitHub - john-rocky/CoreML-YOLOv5: A sample project how to use YOLOv5 in iOS
The input image is the first cake image in the article.
I measured the time from resizing the input image through NMS post-processing.
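A minimal sketch of that measurement, with placeholder stages standing in for the real preprocessing, model call, and NMS (the function names are illustrative, not from either project):

```python
# Sketch: timing an end-to-end detection pipeline (resize -> inference -> NMS).
# The three stage functions are stand-ins for the real preprocessing,
# model call, and non-maximum suppression used in the comparison.
import time

def resize(image):      # placeholder preprocessing
    return image

def infer(tensor):      # placeholder model call
    return tensor

def nms(raw_outputs):   # placeholder post-processing
    return raw_outputs

def timed_pipeline(image, runs=10):
    """Return the mean end-to-end latency in milliseconds."""
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        nms(infer(resize(image)))
        latencies.append((time.perf_counter() - start) * 1000.0)
    return sum(latencies) / len(latencies)

mean_ms = timed_pipeline(object())
```

Averaging over several runs smooths out scheduler noise, which matters when the per-frame times are small.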
As expected, CoreML is fast.
However, CoreML takes a few seconds to initialize, while TorchScript initializes almost instantly.
Even so, TorchScript inference completes in under one second.
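The initialization cost above is easy to separate from per-inference cost by timing the first call on its own. A minimal sketch, using a stub class that simulates a runtime paying a one-time setup cost (all names here are illustrative):

```python
# Sketch: separating one-time initialization cost from per-inference cost.
# LazyModel stands in for a runtime (CoreML or TorchScript) that pays a
# setup cost on first use; the 0.05 s setup delay is simulated.
import time

class LazyModel:
    def __init__(self, init_seconds=0.05):
        self._init_seconds = init_seconds
        self._ready = False

    def __call__(self, x):
        if not self._ready:              # simulate one-time setup
            time.sleep(self._init_seconds)
            self._ready = True
        return x

def time_call(fn, x):
    start = time.perf_counter()
    fn(x)
    return time.perf_counter() - start

model = LazyModel()
first = time_call(model, 0)    # includes initialization
second = time_call(model, 0)   # steady-state latency
```

Reporting the first call and steady-state calls separately makes the CoreML-vs-TorchScript trade-off visible: CoreML pays more up front, then runs faster per frame.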
In cases where processing speed is not critical, running a model that is difficult to convert to CoreML on PyTorch Mobile may be a good option.
I’m a freelance engineer.
Please feel free to contact me with a brief description of your project.
I am making an app that uses Core ML and ARKit.
I post machine learning / AR related information.