cv::Mat image1 = cv::imread("frame0.png"); // load a sample frame
cv::Mat input1 = cv::dnn::blobFromImage(image1, 1.0, cv::Size(input_width, input_height), cv::Scalar(), true); // pre-processing for a model (NCHW float blob)
I need to export input1 to an .npy or .pb file. I know it can be done in Python, but the resulting file is huge, about 2 megabytes.
Please explain why, and give the context, so we can help better.
It's uncompressed float image data, so the size makes a lot of sense to me. Why would you expect it to be smaller when done from C++?
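The arithmetic bears this out. As a sketch, assuming (hypothetically) a 3-channel 416×416 input — the real input_width/input_height may differ — the uncompressed float32 blob size is:

```cpp
#include <cstdio>

// Back-of-the-envelope size of an NCHW float32 blob.
// The 416x416 dimensions below are an assumption for illustration;
// plug in whatever the model actually expects.
long blobBytes(long n, long c, long h, long w)
{
    return n * c * h * w * (long)sizeof(float); // float32, uncompressed
}
```

For a 1×3×416×416 blob this gives 2,076,672 bytes, i.e. roughly 2 MB — consistent with the file size observed above.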
Dig into the C++ unit test code; there's a C++ helper to serialize to npy. But again, the result won't get smaller, I bet…
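If you'd rather not dig through the test helpers, the NPY v1.0 format is simple enough to write by hand. Here is a minimal sketch of a standalone writer for a contiguous little-endian float32 buffer (this is not OpenCV's helper; the name saveNpy is made up for illustration — it assumes the pointer is contiguous, as the output of cv::dnn::blobFromImage is):

```cpp
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

// Minimal NPY v1.0 writer for a contiguous float32 buffer (sketch).
static void saveNpy(const std::string& path, const float* data,
                    const std::vector<size_t>& shape)
{
    // Build the Python-style shape tuple, e.g. "(1, 3, 416, 416)" or "(5,)".
    std::string dims = "(";
    for (size_t i = 0; i < shape.size(); ++i)
        dims += std::to_string(shape[i]) + (i + 1 < shape.size() ? ", " : "");
    if (shape.size() == 1) dims += ",";
    dims += ")";

    std::string header = "{'descr': '<f4', 'fortran_order': False, 'shape': "
                         + dims + ", }";
    // Pad with spaces so magic + version + length + header is a multiple
    // of 64 and ends with '\n', as the NPY spec requires.
    size_t unpadded = 10 + header.size() + 1;
    header += std::string((64 - unpadded % 64) % 64, ' ');
    header += '\n';

    std::ofstream f(path, std::ios::binary);
    f.write("\x93NUMPY\x01\x00", 8);           // magic + format version 1.0
    uint16_t hlen = (uint16_t)header.size();
    f.put((char)(hlen & 0xFF));                 // header length, little-endian
    f.put((char)(hlen >> 8));
    f.write(header.data(), header.size());

    size_t count = 1;
    for (size_t d : shape) count *= d;
    f.write(reinterpret_cast<const char*>(data), count * sizeof(float));
}
```

For the blob above you would call something like `saveNpy("input1.npy", (const float*)input1.data, {1, 3, (size_t)input_height, (size_t)input_width});` — and indeed the file will be exactly the raw float data plus a small header, so no smaller than the Python version.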
Store it as PNG or TIFF instead. I see no reason for the data to be numpy-native or protobuf-native.
Sorry for getting back to this late. The context is a pull request to OpenCV to test an optical flow model, and the test module expects the input data to be either in npy or pb format. This is the pull request for reference: GSoC Add ONNX Support for GatherElements by Aser-Abdelfatah · Pull Request #1082 · opencv/opencv_extra · GitHub
Sorry for getting to this late. I need it in numpy-native or protobuf-native format because it's test data for a model, and the testing module expects the test data in one of those two formats.