VideoCapture() does not work with UDP protocol

Good morning to everybody.

First of all, thank you in advance. Let me describe the situation I have been stuck in for weeks without finding a solution.

I need OpenCV to capture a camera that streams over the UDP protocol so that I can analyze the video. The problem is that there is no way to get OpenCV to use UDP: I have tried several ways to tell OpenCV to use the UDP protocol instead of TCP, but nothing works.

The error I get is the following: [rtsp @ 000001fb154e2880] Nonmatching transport in server reply.

I should mention that the OpenCV version I am using is 4.6.0.

Thank you very much; if you have any other questions, do not hesitate to ask.

All the best

Can you receive the stream using ffmpeg/ffplay from a terminal?

Of course. First of all, thank you very much for answering so quickly.

The truth is that the call I make in the terminal is simple; I don’t use what you mentioned. My call is: -i rtsp://(user)(password)(ip)(port)/(and whatever type: live, onvif…)

From what you tell me, I see that I am missing a fundamental part when making the call. I am sorry for the inconvenience, but this is my first project. Thank you very much.

Is VideoCapture failing to connect over TCP, or do you prefer to use UDP? I’m asking because VideoCapture should fall back to UDP if TCP is not available, as of Default FFMPEG VideoCapture backend to rtsp_flags=prefer_tcp by cudawarped · Pull Request #21561 · opencv/opencv · GitHub.

If you have a TCP/UDP source and you want to request UDP, you should be able to achieve this by setting the environment variable

OPENCV_FFMPEG_CAPTURE_OPTIONS=rtsp_transport;udp

see the FFmpeg Protocols Documentation.
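The same option can also be set from inside the Python script, as long as it happens before the capture is opened. A minimal sketch (the RTSP URL is a placeholder, so the actual open is left commented out):

```python
import os

# OpenCV's FFmpeg backend reads this variable when a capture is opened.
# A key and its value are separated by ";"; multiple key/value pairs
# are separated by "|".
os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "rtsp_transport;udp"

# Then open the camera as usual (placeholder URL):
# import cv2
# cap = cv2.VideoCapture("rtsp://user:password@camera-ip:554/live", cv2.CAP_FFMPEG)
```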

Thank you very much for the reply.

Well, let me explain: the program is to be used with several cameras, and the problem is that some are TCP and others UDP. Therefore, I can use the TCP ones without problems, but not the UDP ones.

I have been reading, and I have seen that perhaps one of the third-party libraries I use lacks UDP support. I am attaching the imports that I have made.

from pyimagesearch.centroidtracker import CentroidTracker
from pyimagesearch.trackableobject import TrackableObject
from imutils.video import VideoStream
from imutils.video import FPS
import numpy as np
import argparse
import imutils
import time
import dlib
import threading
import queue
import cv2
import os
import mysql.connector
from mysql.connector import Error
from datetime import datetime
from basededatos import BaseDeDatos
import interfazdefinitiva
import tkinter

Could it be that libraries like pyimagesearch are to blame for UDP not playing for me?

On the other hand, it does not let me declare what you mentioned, “OPENCV_FFMPEG_CAPTURE_OPTIONS=rtsp_transport;udp”, since the option cannot be found. Thank you very much for your help; I am running out of time and have seen no other option than to ask for help.


If you are using OpenCV 4.6.0, set the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable to rtsp_flags;prefer_tcp. On Windows:


set OPENCV_FFMPEG_CAPTURE_OPTIONS=rtsp_flags;prefer_tcp


On Linux/macOS (the quotes are needed because ; is a shell command separator):


export OPENCV_FFMPEG_CAPTURE_OPTIONS="rtsp_flags;prefer_tcp"


Do this before running your Python script; VideoCapture will then try a TCP connection but fall back to UDP.
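One common pitfall is setting the variable in one terminal but launching the script from another, in which case it never reaches the Python process. A quick way to check is to print it at the top of the script:

```python
import os

# Prints None if the variable was not set in the environment
# that launched this script.
print(os.environ.get("OPENCV_FFMPEG_CAPTURE_OPTIONS"))
```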

If you are using OpenCV 4.7.0 this is the default and you shouldn’t need to do anything.

If this doesn’t work, confirm that you can connect to the UDP cameras with VLC and/or FFmpeg/FFplay using the “exact” same string you are using in OpenCV, with no other configuration difference (password etc.). Also confirm with Wireshark or similar that you are streaming via UDP, in case you are not and the issue isn’t actually the transport protocol.

Just out of interest, which ones are UDP? I haven’t come across any cameras which don’t do both.


Thank you very much for the reply. The first thing I did was update OpenCV to version 4.7.0 and import it. Then I opened the stream in VLC and it plays perfectly. The problem comes when I open it in OpenCV; it keeps giving me the error: “[rtsp @ 000001f57fc94040] Nonmatching transport in server reply”.

Then, and I’m sorry for my ignorance, I don’t quite understand where I have to put this “set OPENCV_FFMPEG_CAPTURE_OPTIONS=rtsp_flags;prefer_tcp”. The option is not available in the code; when I put it there directly, it tells me that nothing has been declared. Should I put it in the terminal? Sorry to be annoying, but I don’t understand why in all the forums they manage to solve it in such a simple way while it is costing me so much. I should also clarify that I use ONVIF; I don’t know if that has something to do with it.

And regarding which cameras are UDP: they are of the YooSee brand.

Thank you very much again.

If OpenCV 4.7.0 doesn’t work, then either the fallback to UDP is not working or it’s not a UDP issue.

Set it on the command line. But because you are using 4.7.0 and prefer_tcp isn’t working, you can try

set OPENCV_FFMPEG_CAPTURE_OPTIONS=rtsp_transport;udp

although I wouldn’t expect this to be any different.

Thanks for the quick reply.

Let me tell you: with the line you gave me set in the CMD, executing with UDP no longer throws the error, but it doesn’t open the stream either and says it has not been able to grab any FPS. I am going to check what is happening now and will let you know, because in VLC it works perfectly.

Thank you.

You need to check FFmpeg/FFplay. If that works then VideoCapture should work with the CAP_FFMPEG switch.
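For reference, requesting the FFmpeg backend explicitly looks like this (a sketch; the URL is a placeholder, and the import is guarded in case OpenCV is not installed):

```python
url = "rtsp://user:password@camera-ip:554/live"  # placeholder address

try:
    import cv2
    # The second argument forces the FFmpeg backend rather than
    # letting OpenCV pick one automatically.
    cap = cv2.VideoCapture(url, cv2.CAP_FFMPEG)
    opened = cap.isOpened()
except ImportError:
    opened = False  # OpenCV is not installed in this environment
print("stream opened:", opened)
```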

Perfect. Let me tell you a little about the doubts that have arisen.

  1. Does OpenCV support the ONVIF2 protocol?

  2. Does FFMPEG support the ONVIF2 protocol?

I am using the link that VLC plays perfectly, but I can’t play it through OpenCV or FFplay; I don’t know if the command I’m using is correct.


Again, thank you very much for the kindness, you are being very kind.

I am not sure there is an onvif2 protocol, only ONVIF. I would guess onvif2 is just part of the URL. If you can’t play it through FFplay then it won’t work with OpenCV with CAP_FFMPEG. If you can connect with GStreamer you could try CAP_GSTREAMER, but you would have to build OpenCV against GStreamer.
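If you do end up with a GStreamer-enabled build, the request looks roughly like this (a sketch only: the pipeline assumes an H.264 camera, the URL is a placeholder, and the open simply fails when OpenCV was built without GStreamer support):

```python
# rtspsrc's protocols=udp property forces UDP transport; appsink
# hands the decoded frames over to OpenCV.
pipeline = (
    "rtspsrc location=rtsp://user:password@camera-ip:554/live "
    "protocols=udp latency=100 ! rtph264depay ! h264parse ! "
    "avdec_h264 ! videoconvert ! appsink"
)

try:
    import cv2
    cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
    opened = cap.isOpened()  # False if OpenCV lacks GStreamer support
except ImportError:
    opened = False  # OpenCV is not installed in this environment
print("opened:", opened)
```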

From a quick search, it may be that you have to enable something in your camera for it to work.

Either way if you can’t access your camera through FFplay or GStreamer then this is not an OpenCV issue.

Ok, for me to clarify I understand the following:

OpenCV and FFmpeg are two different things, but are they connected or are they totally independent?

With VLC I can open the link perfectly. To open it with OpenCV I go through the terminal, and with FFmpeg I also go through the terminal: the procedure is to execute the .exe from the terminal and type ffplay.exe -i followed by the UDP link. I have also tried ffmpeg.exe -i with the link, but nothing.

Am I very lost?

Thank you very much and sorry for the inconvenience.

If you open VideoCapture by passing CAP_FFMPEG, or if you’re on Windows and haven’t specified a particular capture backend, then OpenCV will be using the FFmpeg libraries to decode the video source. Therefore, if you cannot stream via FFmpeg, you will not be able to via OpenCV with the default backend. I would investigate why FFmpeg, which can open practically anything, can’t access your camera. If you get it to work with FFmpeg, it should then work with OpenCV.
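When testing from the terminal, it can also help to tell FFplay explicitly which RTSP transport to try; -rtsp_transport is a standard FFmpeg RTSP option (the URL below is a placeholder):

```shell
# Ask FFplay to request UDP as the RTSP transport.
ffplay -rtsp_transport udp -i "rtsp://user:password@camera-ip:554/live"

# Or force TCP, to compare the behaviour.
ffplay -rtsp_transport tcp -i "rtsp://user:password@camera-ip:554/live"
```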

Did you check the link I posted? It may be that you need to enable an extra option in your camera for FFmpeg to work.

There are other backends (GStreamer etc.), but you need to build OpenCV against their libraries and request them when opening a stream with VideoCapture.

Good afternoon.

I have not answered before because I have kept trying, with no success. In my code, do I have to import FFmpeg? I don’t understand the process; could you please explain it in detail so I can see what I am doing wrong? My VideoCapture call is vs = cv2.VideoCapture(args["input"]).

Do I have to put something else? Then the only thing I do is call the program and put the link as mentioned above. Something tells me that I am having a very big and basic error and I am not seeing it.

Now I’m going to try to activate what you mentioned, but I don’t think that is the issue, given that VLC plays it perfectly.

Thank you very much and sorry.

You don’t need to do anything with VideoCapture.

Your camera has to work with FFmpeg/FFplay.

If you cannot access your camera with FFmpeg then it won’t work with OpenCV.

So, and sorry for being so annoying.

Once I get the stream to open with FFmpeg, do I just have to run my code? Or how do I get my code connected to FFmpeg?

Excuse my ignorance, there’s something I’m missing. I already had the option you mentioned activated, thank you very much.

Should it work like this without doing anything else?

No, FFmpeg/FFplay is just a test to see if the FFmpeg libraries can play your stream. The FFmpeg/FFplay applications are built using the FFmpeg libraries.

If FFmpeg/FFplay cannot play the rtsp stream OpenCV will not be able to play the stream because it also uses the FFmpeg libraries.

Does FFplay display the rtsp stream?

If so does VideoCapture now open the rtsp stream?

I just figured out how ffmpeg works, sorry it took me so long.

The problem is that FFmpeg doesn’t open the RTSP stream either, even though I can open it with the same link in VLC, as I show in the screenshot. I run FFmpeg and then FFplay (I don’t know if I have to do it like this) and it stays as shown in the screenshot, without showing anything; I understand that with FFplay a window should open playing the video.

I will look for information on how to use FFmpeg, because something tells me that I am using it wrong. In the screenshot I have sent you, can you see something that is wrong?

Thank you very much and greetings

I suspect that either the camera cannot work with FFmpeg, or there is a setting in the camera, or an alternative URL to access it, which you need to use in order to get it to work with FFmpeg.