Is CoreML fast? Comparing inference time with PyTorch Mobile

MLBoy
Feb 16, 2022

We tried comparing the two.

I want to run a model on iOS that cannot be converted to CoreML

Some models written in PyTorch use operations that coremltools does not support, so they are difficult to convert to CoreML.
In such cases, you can still run the model on iOS by exporting it to TorchScript and executing it with PyTorch Mobile.
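As a rough sketch, the iOS side can look like the Swift snippet below. LibTorch has no pure Swift API, so the usual pattern (as in the PyTorch iOS demo apps) is a small Objective-C++ wrapper; TorchModule, predict(image:), and the yolov5s.ptl file name are assumed here for illustration, not a library API.

import Foundation

// Load a TorchScript model exported for mobile (e.g. with torch.jit.trace
// and optimize_for_mobile on the Python side) and run one forward pass.
// `TorchModule` is an assumed Objective-C++ bridge class exposed to Swift.
guard let modelPath = Bundle.main.path(forResource: "yolov5s", ofType: "ptl"),
      let module = TorchModule(fileAtPath: modelPath) else {
    fatalError("Could not load the TorchScript model")
}

// `pixels` would hold the resized, normalized image data
// (1 x 3 x 640 x 640 Float32 for YOLOv5).
var pixels = [Float32](repeating: 0, count: 3 * 640 * 640)
let rawOutput = module.predict(image: &pixels)
// `rawOutput` still has to be decoded into boxes and passed through NMS.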

CoreML can run models on the Neural Engine and is optimized for Apple devices, but how different is the execution speed in practice?
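For reference, here is a minimal sketch of the CoreML side in Swift, assuming the converted model has been added to the Xcode project as yolov5s.mlmodel (so Xcode generates a yolov5s class). Setting computeUnits to .all lets CoreML schedule work on the CPU, GPU, or Neural Engine.

import CoreML
import Vision
import UIKit

// Run the CoreML model through Vision. `yolov5s` is the class Xcode
// generates from the .mlmodel file (an assumed name here).
func runCoreMLDetection(on image: UIImage) throws {
    let config = MLModelConfiguration()
    config.computeUnits = .all   // allow CPU, GPU, and Neural Engine

    let visionModel = try VNCoreMLModel(for: yolov5s(configuration: config).model)
    let request = VNCoreMLRequest(model: visionModel) { req, _ in
        // If the exported model includes the NMS pipeline, results come back
        // as recognized-object observations.
        let results = req.results as? [VNRecognizedObjectObservation] ?? []
        for observation in results {
            print(observation.labels.first?.identifier ?? "?", observation.confidence)
        }
    }
    request.imageCropAndScaleOption = .scaleFill   // Vision handles the resize

    guard let cgImage = image.cgImage else { return }
    try VNImageRequestHandler(cgImage: cgImage).perform([request])
}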

Compare with YOLOv5

Use this for PyTorch Mobile

Use this for CoreML

The input image is the first cake image in the article.
We measured the time from resizing the input image through inference to the end of NMS post-processing.
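A minimal sketch of how such an interval can be measured in Swift; runDetection is a placeholder for the whole pipeline (resize, inference, and NMS / result parsing) on either backend.

import Foundation

// Wall-clock time of the full pipeline: resize -> inference -> NMS.
let start = CFAbsoluteTimeGetCurrent()
runDetection(on: inputImage)   // placeholder for either backend's pipeline
let elapsed = CFAbsoluteTimeGetCurrent() - start
print(String(format: "pipeline time: %.2f s", elapsed))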

Result

torch: 0.91 seconds

coreml: 0.23 seconds

As expected, CoreML is fast.
However, CoreML takes a few seconds to initialize, while TorchScript initializes almost instantly.
And TorchScript inference still finishes in under a second.
When processing speed is not critical, running a model that is hard to convert to CoreML with PyTorch Mobile is a reasonable option.

🐣

I’m a freelance engineer.
For work inquiries, please feel free to contact me with a brief description of your project.
rockyshikoku@gmail.com

I make apps that use Core ML and ARKit, and I post machine learning and AR related information.

GitHub

Twitter
Medium
