So my confusion here is with rgba().size(): why is it returning the previewSize of the frame instead of the resolution I set using setMaxFrameSize() (which, as I understand it, should set the frame size for OpenCV)? Shouldn't inputFrame.rgba().size() in my logs return the resolution I set?
Thanks a lot for getting back, but the link you shared is the question I myself posted on Stack Overflow asking for help. I wasn't able to get an appropriate answer there, which is why I'm looking for a solution here on the OpenCV forum. Can you please help me resolve this issue?
Yes, I think that's what I was seeing. The device I'm using has a 1280x800 screen, and I think that's why it was using 1280x720 as the inputFrame resolution?
But when I tried changing the previewSize of the camera screen to something smaller, like 720x480, inputFrame.rgba().size() returned that previewSize instead of the frame size I set with setMaxFrameSize(3840, 2160). As I understand it, setMaxFrameSize() should select a resolution less than or equal to the value passed, so I was hoping it would give me something higher than 1280x720, considering the capabilities of my device's camera. More context:
My aim here is to get a higher-resolution inputFrame. Currently my code gives me 1280x720, but my device should support higher resolutions, since it has a 13MP camera.
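In case it helps, here is a minimal sketch of the setup I'm describing. This is not my exact code: the class name, the layout id R.id.camera_view, and the variable names are placeholders, but the calls (setMaxFrameSize(3840, 2160), logging inputFrame.rgba().size() in onCameraFrame) are the ones I mentioned above.

```java
// Minimal sketch of the setup described above (assumed names, not my actual code):
// an Activity driving an OpenCV JavaCameraView via CvCameraViewListener2.
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewListener2;
import org.opencv.core.Mat;
import android.util.Log;

public class CameraActivity extends android.app.Activity implements CvCameraViewListener2 {
    private CameraBridgeViewBase mOpenCvCameraView;

    @Override
    protected void onCreate(android.os.Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera);             // placeholder layout
        mOpenCvCameraView = findViewById(R.id.camera_view);   // placeholder view id
        // Request up to 4K; OpenCV is supposed to pick the largest
        // supported camera size that is <= this value.
        mOpenCvCameraView.setMaxFrameSize(3840, 2160);
        mOpenCvCameraView.setCvCameraViewListener(this);
    }

    @Override
    public void onCameraViewStarted(int width, int height) {
        // width/height here are the frame size OpenCV actually selected.
        Log.d("CameraActivity", "started with " + width + "x" + height);
    }

    @Override
    public void onCameraViewStopped() {}

    @Override
    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
        Mat rgba = inputFrame.rgba();
        // This is the log that keeps printing 1280x720 (the preview size)
        // instead of something closer to the 3840x2160 I requested.
        Log.d("CameraActivity", "frame size: " + rgba.size());
        return rgba;
    }
}
```

The question is whether rgba().size() is expected to follow the preview size like this, or whether setMaxFrameSize() should decouple the processing frame size from the on-screen preview.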