How to use the output of cv::dnn::Net::forward()

I am trying to use the HED (Holistically-Nested Edge Detection) model via cv::dnn to detect hard-to-find edges in an image. The code snippet I am using is given below. I am on OpenCV 4.10, Windows 10, VS2022.

// gray is the input Mat
cv::dnn::Net net = cv::dnn::readNetFromCaffe("F:\\AILineFitment\\deploy.prototxt", "F:\\AILineFitment\\hed\\hed_pretrained_bsds.caffemodel");
// HED expects 3-channel input; duplicate grayscale to RGB
cvtColor(gray, rgb, COLOR_GRAY2BGR);
Mat blob = cv::dnn::blobFromImage(rgb, 1.0, Size(gray.cols, gray.rows), Scalar(104.00698793, 116.66876762, 122.67891434), true, false);
cv::Mat temp = cv::Mat::zeros(gray.rows, gray.cols, CV_8UC3);
if (!blob.empty())
{
net.setInput(blob);
hed = net.forward();
// Extract channel 1
Mat green(24, 24, CV_32F, blob.ptr(0, 1)); // image 0, channel 1
cv::resize(green, temp, Size(temp.cols, temp.rows));
}
I expect temp to contain the edge-detected image, but instead I get the original image back. I need the output Mat temp for further line construction, yet I cannot obtain a meaningful image. What am I missing?