I’d like to test inference on a TensorFlow Lite model I’ve loaded into an Android project.
I have some inputs generated in a Python environment that I’d like to save to a file, load into my Android app, and use for TFLite inference. My inputs are somewhat large; one example is:
<class 'numpy.ndarray'>, dtype: float32, shape: (1, 596, 80)
I need some way of serialising this ndarray and loading it into my Android app.
More information on TFLite inference can be found in the TensorFlow Lite documentation. In essence, the input should be a multi-dimensional array of primitive floats, or a ByteBuffer.
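For reference, the two accepted input forms would look roughly like this (a sketch using the shape from my example above):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Option 1: a multi-dimensional array of primitive floats.
float[][][] input = new float[1][596][80];

// Option 2: a direct ByteBuffer sized for 1 * 596 * 80 float32 values (4 bytes each).
ByteBuffer buffer = ByteBuffer.allocateDirect(1 * 596 * 80 * 4)
        .order(ByteOrder.nativeOrder());
```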
What is the simplest way to:
- Serialise this ndarray on the Python side
- Deserialise this blob from a file on the Java side
Thanks!
Answer
I figured this out in the end: there’s a handy Java library called JavaNpy that allows you to open .npy files in Java, and therefore in Android.
On the Python side I saved a flattened .npy in the normal way:

```python
import numpy as np

# Flatten the (1, 596, 80) array to one dimension before saving.
data_flat = data.flatten()
print(data_flat.shape)  # (47680,)
np.save(file="data.npy", arr=data_flat)
```
In Android I placed this into the assets folder.
I then loaded it into JavaNpy:
```java
// Open the .npy file from assets (getAssets().open() throws IOException).
InputStream stream = context.getAssets().open("data.npy");
Npy npy = new Npy(stream);
float[] npyData = npy.floatElements();
```
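As a sanity check (an addition to the original steps), you can confirm the element count matches the pre-flatten shape before building the tensor:

```java
// The flattened array should hold exactly 1 * 596 * 80 = 47,680 floats.
int expected = 1 * 596 * 80;
if (npyData.length != expected) {
    throw new IllegalStateException("Expected " + expected + " floats, got " + npyData.length);
}
```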
And finally converted it into a TensorBuffer:
```java
// The shape of the data before I flattened it.
int[] inputShape = new int[]{1, 596, 80};
TensorBuffer tensorBuffer = TensorBuffer.createFixedSize(inputShape, DataType.FLOAT32);
tensorBuffer.loadArray(npyData);
```
I then used this tensorBuffer for inference on my TFLite model.
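For completeness, a minimal inference sketch with that buffer; the model file name ("model.tflite") and the output shape ({1, 10}) are placeholders, not from my actual setup:

```java
import org.tensorflow.lite.DataType;
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.support.common.FileUtil;
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer;

// Load the model from assets (IOException handling omitted) and run inference.
Interpreter interpreter = new Interpreter(FileUtil.loadMappedFile(context, "model.tflite"));

// Placeholder output shape; use your model's actual output shape here.
TensorBuffer output = TensorBuffer.createFixedSize(new int[]{1, 10}, DataType.FLOAT32);

interpreter.run(tensorBuffer.getBuffer(), output.getBuffer());
float[] results = output.getFloatArray();
```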