Is Core ML fast? Comparing inference time with PyTorch Mobile

We compared the two.

I want to run a model on iOS that cannot be converted to Core ML

Some models written in PyTorch use operations that coremltools does not support, which makes them difficult to convert to Core ML.
In such cases, you can still run the model on iOS by executing it as TorchScript with PyTorch Mobile.
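As a minimal sketch of that workflow, the snippet below traces a model into TorchScript and saves it for the mobile runtime. The tiny `nn.Sequential` model is a hypothetical stand-in; in practice you would trace your actual model (e.g. YOLOv5), and you may also apply `optimize_for_mobile` before saving.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in model; substitute your real PyTorch model here.
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
model.eval()

# Trace the model with a dummy input to produce a TorchScript module.
# Tracing records the ops actually executed, so no coremltools support
# for those ops is needed -- the TorchScript runtime executes them directly.
example = torch.rand(1, 3, 224, 224)
scripted = torch.jit.trace(model, example)

# Save the module; on iOS it is loaded by the PyTorch Mobile (LibTorch) runtime.
scripted.save("model.pt")
```

The saved `model.pt` is then bundled into the iOS app and loaded through the PyTorch Mobile runtime.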

Core ML is optimized for the device and can use the Neural Engine, but how much does the execution speed actually differ?

Comparison with YOLOv5

Use this for PyTorch Mobile

Use this for Core ML

The input image is the first cake image in the article.
We measured the time from resizing the input image through NMS post-processing.
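The measurement itself was done on-device, but the idea can be sketched as follows: time the full pipeline from preprocessing to post-processing, not just the forward pass. The model and the resize/NMS stand-ins below are hypothetical placeholders for the real YOLOv5 steps.

```python
import time
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-in for the detection model.
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU()).eval()
img = torch.rand(1, 3, 640, 640)

def run_pipeline(x):
    # Resize the input (stand-in for the real preprocessing step).
    x = F.interpolate(x, size=(320, 320))
    # Inference; NMS post-processing would follow here in a real detector.
    with torch.no_grad():
        return model(x)

# Measure the whole resize -> inference -> post-processing span.
start = time.perf_counter()
out = run_pipeline(img)
elapsed = time.perf_counter() - start
print(f"{elapsed:.3f} seconds")
```

Timing the whole span matters because Core ML and PyTorch Mobile can differ in preprocessing and post-processing overhead, not only in raw inference speed.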

Result

PyTorch Mobile: 0.91 seconds
Core ML: 0.23 seconds

As expected, Core ML is faster.
However, Core ML takes a few seconds to initialize, while TorchScript initializes almost instantly.
PyTorch Mobile still completes inference in under one second.
So when processing speed is not critical, running a model that is hard to convert to Core ML on PyTorch Mobile can be a reasonable option.

🐣

I’m a freelance engineer.
For work inquiries, please feel free to contact me with a brief description of the project.
rockyshikoku@gmail.com

I am making apps that use Core ML and ARKit.
I post machine learning / AR related information.

GitHub

Twitter
Medium
