
Loading Python 2D ndarray into Android for inference on TFLite

I’d like to test inference on a TensorFlow Lite model I’ve loaded into an Android project.

I have some inputs generated in a Python environment I’d like to save to a file, load into my Android app and use for TFLite inference. My inputs are somewhat large, one example is:

<class 'numpy.ndarray'>, dtype: float32, shape: (1, 596, 80)

I need some way of serialising this ndarray in Python and loading it into my Android app.

More information on TFLite inference can be found here. In essence, this should be a multi-dimensional array of primitive floats, or a ByteBuffer.

What is the simplest way to:

  • Serialise this ndarray on the Python side
  • Deserialise this blob on the Java side from a file

Thanks!


Answer

I figured this out in the end. There’s a handy Java library called JavaNpy that allows you to open .npy files in Java, and therefore on Android.

On the Python side I saved a flattened .npy in the normal way:

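A minimal sketch of the save step (the `(1, 596, 80)` float32 shape is the example from the question; the zero-filled array stands in for the real inference input):

```python
import numpy as np

# Hypothetical input: a float32 ndarray of shape (1, 596, 80)
data = np.zeros((1, 596, 80), dtype=np.float32)

# Flatten to 1-D so the Java side can read it back as a plain float[]
flat = data.flatten()

# Save in NumPy's .npy format; this file is then copied into the
# Android project's assets folder
np.save("input.npy", flat)
```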

In the Android project I placed this file into the assets folder.

I then loaded it into JavaNpy:

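A sketch of the load step, assuming JavaNpy exposes an `Npy` class that can be constructed from an `InputStream` and a `floatElements()` accessor (check these names against the version of the library you depend on); this only runs inside an Android app, since it needs a `Context` to reach the assets folder:

```java
import java.io.IOException;
import java.io.InputStream;

import android.content.Context;

// Add the JavaNpy import for its Npy class here; the package name
// depends on the artifact coordinates you use.

public class NpyLoader {
    /** Reads a flattened float32 .npy file from the app's assets. */
    public static float[] loadFloats(Context context, String assetName) throws IOException {
        try (InputStream stream = context.getAssets().open(assetName)) {
            Npy npy = new Npy(stream);   // parse the .npy header and payload
            return npy.floatElements();  // flat float[] matching the saved array
        }
    }
}
```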

And finally converted it into a TensorBuffer:

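A sketch using `TensorBuffer` from the TFLite Support Library, where `npyData` is the flat `float[]` read in the previous step and the `(1, 596, 80)` shape restores the original layout:

```java
import org.tensorflow.lite.DataType;
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer;

// npyData is the flat float[] produced by the .npy loading step
int[] shape = new int[] {1, 596, 80};  // the original (unflattened) input shape

TensorBuffer tensorBuffer = TensorBuffer.createFixedSize(shape, DataType.FLOAT32);
tensorBuffer.loadArray(npyData, shape);  // reinterpret the flat data as (1, 596, 80)
```

`TensorBuffer` (or its backing `ByteBuffer`) can then be passed directly to the TFLite `Interpreter`.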

I then used this tensorBuffer for inference on my TFLite model.

User contributions licensed under: CC BY-SA