Model input dimension

I'm trying to use OpenCV to do face recognition with facenet512. I converted the model to ONNX format using tf2onnx. I know that the input of the model should be an image like (160, 160, 3), so I tried doing this with this script:

void convertDimention(cv::Mat input, cv::Mat &output)
{
    vector<cv::Mat> channels(3);
    cv::split(input, channels);

    int size[3] = { 112, 112, 3 };
    cv::Mat M(3, size, CV_32F, cv::Scalar(0));

    for (int i = 0; i < size[0]; i++) {
      for (int j = 0; j < size[1]; j++) {
        for (int k = 0; k < size[2]; k++) {
          M.at<float>(i,j,k) = channels[k].at<float>(i,j)/255;
        }
      }
    }
    M.copyTo(output);

}

After converting the image from (160, 160) to (160, 160, 3), I still get this error:

error: (-215:Assertion failed) (int)_numAxes == inputs[0].size() in function ‘getMemoryShapes’

Full code:

#include <iostream>

#include <opencv2/dnn.hpp>
#include "opencv2/core/core.hpp"
#include "opencv2/highgui/highgui.hpp"
#include "opencv2/imgproc/imgproc.hpp"

#include <vector>

using namespace std;


void convertDimention(cv::Mat input, cv::Mat &output)
{
    vector<cv::Mat> channels(3);
    cv::split(input, channels);

    int size[3] = { 160, 160, 3 };
    cv::Mat M(3, size, CV_32F, cv::Scalar(0));

    for (int i = 0; i < size[0]; i++) {
      for (int j = 0; j < size[1]; j++) {
        for (int k = 0; k < size[2]; k++) {
          M.at<float>(i,j,k) = channels[k].at<float>(i,j)/255;
        }
      }
    }
    M.copyTo(output);

}



int main()
{ 
    cv::Mat input,input2, output;

    input = cv::imread("image.png");
    cv::resize(input,input, cv::Size(160,160));
    convertDimention(input,input2);

    cv::dnn::Net net = cv::dnn::readNetFromONNX("facenet512.onnx");
    net.setPreferableBackend(cv::dnn::DNN_BACKEND_CUDA);
    net.setPreferableTarget(cv::dnn::DNN_TARGET_CUDA);
    cout << input.size << endl;
    cout << input2.size << endl;


    net.setInput(input2);
    output = net.forward();



}

I know that I'm doing this the wrong way (since I'm new to this). Is there another way to change the dimensions so that they fit the model input?

Thanks in advance. :blush:

Please don't write loops like this. You also only copy a [112,112] slice.

OpenCV's dnn module usually expects 4D blobs in NCHW order;
use cv::dnn::blobFromImage() to convert your image.

Can you give us a link to the TF model / code, so we can estimate what it really wants here?

[edit]
In the end, you probably need something like this:

    cv::dnn::Net net = cv::dnn::readNetFromONNX("facenet512.onnx");

    cv::Mat input = cv::imread("image.png");
    cv::Mat blob = cv::dnn::blobFromImage(input, 1.0, cv::Size(160,160));

    net.setInput(blob);
    cv::Mat output = net.forward().clone();
    // if you want to save the output for comparisons later,
    // you must `clone()` it, else it gets overwritten by the next forward() pass !!!
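
blobFromImage() can also do the preprocessing for you. A minimal sketch, assuming the keras model expects RGB pixels scaled to [0,1] (the usual facenet preprocessing; the scale / swapRB values below are assumptions, check your training code):

    cv::Mat blob = cv::dnn::blobFromImage(
        input,               // BGR image from imread()
        1.0 / 255.0,         // scale factor (assumed [0,1] normalization)
        cv::Size(160, 160),  // resize to the network's spatial input size
        cv::Scalar(),        // no mean subtraction
        true,                // swapRB: BGR -> RGB
        false);              // no center crop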

I found the weights of the model in this link.
I tried using dnn::blobFromImage() but it gave me the same error.
facenet512 is a Keras model based on InceptionResNetV2, so the input layer is something like this: inputs = tensorflow.keras.layers.Input(shape=(160, 160, 3))

Hi, I'm also trying to work with OpenCV and ONNX. For the input layer, I used the --inputs-as-nchw flag of tf2onnx; then the data matches the format produced by blobFromImage (NCHW format). I wonder if the input layer is really 3D or should be 4D. (You could check the input and output shapes of your model with Netron: GitHub - lutzroeder/netron)
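
For reference, a quick way to see the NCHW layout on the OpenCV side (just a sketch, assuming a 160x160 target size):

    // print the shape of the blob produced by blobFromImage()
    cv::Mat img  = cv::imread("image.png");
    cv::Mat blob = cv::dnn::blobFromImage(img, 1.0, cv::Size(160, 160));
    std::cout << blob.size << std::endl;   // 1 x 3 x 160 x 160  (NCHW)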


Hi again, it looks like there is an additional dimension. When I use Netron, the output is:

[screenshot: screen]

As adife said, using “inputs-as-nchw” solved the issue, and Netron now shows this:

[screenshot: screen2]

I just had to replace this:

model_proto, _ = tf2onnx.convert.from_keras(model, output_path='facenet512.onnx')

with this:

nchw_inputs_list = [model.inputs[0].name]
model_proto, _ = tf2onnx.convert.from_keras(model, output_path='facenet512.onnx',inputs_as_nchw=nchw_inputs_list)
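
For completeness, the OpenCV side then reduces to something like this. A sketch only: the 1/255 scale and swapRB=true are assumptions based on the usual facenet preprocessing, adjust them to your model:

#include <iostream>
#include <opencv2/dnn.hpp>
#include <opencv2/imgcodecs.hpp>

int main()
{
    cv::dnn::Net net = cv::dnn::readNetFromONNX("facenet512.onnx");

    cv::Mat img = cv::imread("image.png");
    cv::Mat blob = cv::dnn::blobFromImage(img, 1.0 / 255.0, cv::Size(160, 160),
                                          cv::Scalar(), true, false);

    net.setInput(blob);
    cv::Mat embedding = net.forward().clone();   // clone() so it survives the next forward()

    std::cout << embedding.size << std::endl;    // should be 1 x 512 for facenet512
    return 0;
}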

Thank you guys for the help :star_struck:
