Trouble with DFT Complex output

I am trying to do a 2D FFT with complex input and output, and I am not getting the expected results in the imaginary part. I am using OpenCV 4.5.1 on Debian 11.

Here is my code:

    const cv::Size imgSize = { 256, 256 };
    cv::Mat I = cv::Mat::zeros(imgSize, CV_64FC1);
    cv::circle(I, {imgSize.width/2, imgSize.height/2}, imgSize.height/2-1,
               cv::Scalar::all(1.0), cv::FILLED, cv::LINE_4);
    cv::imshow("Intensity", I);
    cv::Mat phase = cv::Mat::zeros(imgSize, CV_64FC1);
    cv::imshow("Phase", phase);

    // Make complex matrix
    cv::Mat planesIn[] = { I, phase };
    cv::Mat complexFieldIn = cv::Mat::zeros(imgSize, CV_64FC2);
    cv::merge(planesIn, 2, complexFieldIn);

    cv::Mat complexFieldOut = cv::Mat::zeros(imgSize, CV_64FC2);
    cv::dft(complexFieldIn, complexFieldOut, cv::DFT_COMPLEX_INPUT|cv::DFT_COMPLEX_OUTPUT);
    std::cout << complexFieldOut.rows << "x" << complexFieldOut.cols
              << " " << complexFieldOut.depth() << " " << complexFieldOut.type()
              << " " << complexFieldOut.elemSize1()
              << " " << complexFieldOut.step1() << std::endl;

    fftshift(complexFieldOut); // (impl. copied from example code)

    cv::Mat planesOut[] = { cv::Mat::zeros(imgSize, CV_64FC1), cv::Mat::zeros(imgSize, CV_64FC1) };
    cv::split(complexFieldOut, planesOut);

    // Normalize for display
    cv::Mat real, imag;
    cv::normalize(planesOut[0], real, 0, 1, cv::NORM_MINMAX);
    cv::imshow("REAL", real);
    cv::normalize(planesOut[1], imag, 0, 1, cv::NORM_MINMAX);
    cv::imshow("IMAG", imag);

    // Compute magnitude
    cv::Mat magI;
    cv::magnitude(planesOut[0], planesOut[1], magI);

    // Normalize for display
    cv::normalize(magI, magI, 0, 10, cv::NORM_MINMAX);
    cv::imshow("Magnitude", magI);

Here is the output (normalized for display):

And here is the output from the equivalent in MATLAB using fft2:
(forum not letting me upload another image here)

What is happening with the imaginary part in the OpenCV implementation?

Here is what I’m expecting, from MATLAB:

I see now that the trouble is with the test input: drawing the circle with cv::circle puts it off-center by one pixel. By the shift theorem, that one-pixel shift multiplies the spectrum by a linear phase, so the real and imaginary parts come out different from MATLAB's even though the magnitude is the same.

You might also want to check whether MATLAB "shifts" the spectrum to put DC in the middle, or not.

MATLAB requires an fftshift to move the origin to the center, just like NumPy and OpenCV (though OpenCV doesn't provide the function, so you have to write it yourself).
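For reference, here is a minimal fftshift sketch along the lines of the quadrant swap in the OpenCV DFT tutorial (OpenCV itself has no built-in fftshift; this is just one way to write the helper that the code above assumes):

    #include <opencv2/core.hpp>

    // Swap quadrants diagonally so the DC term ends up in the center.
    // Works in place; crops odd-sized inputs by one row/column first.
    void fftshift(cv::Mat& m)
    {
        m = m(cv::Rect(0, 0, m.cols & -2, m.rows & -2));
        const int cx = m.cols / 2;
        const int cy = m.rows / 2;

        cv::Mat q0(m, cv::Rect(0,  0,  cx, cy)); // top-left
        cv::Mat q1(m, cv::Rect(cx, 0,  cx, cy)); // top-right
        cv::Mat q2(m, cv::Rect(0,  cy, cx, cy)); // bottom-left
        cv::Mat q3(m, cv::Rect(cx, cy, cx, cy)); // bottom-right

        cv::Mat tmp;
        q0.copyTo(tmp); q3.copyTo(q0); tmp.copyTo(q3); // swap TL <-> BR
        q1.copyTo(tmp); q2.copyTo(q1); tmp.copyTo(q2); // swap TR <-> BL
    }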

I am still getting some bizarre results when adding an imaginary component. Say I put a gradient in the "phase" array above.
What I am expecting is that the spot in the center moves over by some number of pixels, depending on how big the gradient is.
But what I am getting is this:
(image)

Is it possible OpenCV wraps the imaginary part (to -1 to 1 or something)? It's clearly not ignoring it, but it's not doing what I would expect.

Again, here is the desired result from MATLAB:
(image)
(the spot is shifted over by 30 pixels)
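For reference, the gradient I have in mind is a linear ramp. By the DFT shift theorem, multiplying the input by exp(j·2π·s·x/N) translates the spectrum by s samples along x, so a ramp like the rough sketch below (my own values, reusing imgSize from the code above and s = 30 to match the MATLAB figure) should move the spot 30 pixels:

    // Linear phase ramp: phase(x, y) = 2*pi*s*x/N, so the input becomes
    // f(x,y)*exp(j*2*pi*s*x/N) and its DFT is shifted by s samples along x.
    const double s = 30.0;                          // expected shift in pixels
    cv::Mat phase(imgSize, CV_64FC1);
    for (int y = 0; y < phase.rows; ++y)
        for (int x = 0; x < phase.cols; ++x)
            phase.at<double>(y, x) = 2.0 * CV_PI * s * x / phase.cols;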

Ah, it was the input again. I was missing this line from the MATLAB version that converts to complex:

    P = I.*exp(1j*Phase);

Is there any way to do this with OpenCV matrices without a loop? It would be straightforward enough to write my own matrix exp() function using std::complex.
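One loop-free option seems to be cv::polarToCart, which computes mag*cos(angle) and mag*sin(angle) as two separate planes; those are exactly the real and imaginary parts of I*exp(j*phase). A sketch, reusing the I and phase mats from the code above:

    // I .* exp(1j*Phase) without an explicit loop:
    // polarToCart gives re = I*cos(phase) and im = I*sin(phase).
    cv::Mat re, im;
    cv::polarToCart(I, phase, re, im);        // angle is in radians by default

    cv::Mat planes[] = { re, im };
    cv::Mat complexFieldIn;
    cv::merge(planes, 2, complexFieldIn);     // CV_64FC2, ready for cv::dft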