Background estimation with OpenCV in Swift/ObjC

I was following this article to estimate the background across multiple video frames. It works great when I do it in Python with a single NumPy call:

medianFrame = np.median(frames, axis=0).astype(dtype=np.uint8)   

But in Swift or Objective-C I have to implement the algorithm manually, because there is no OpenCV method that estimates the background directly. My implementation follows the article, and it is very slow.
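For reference, here is a sketch of the article's median-based approach in Python. The synthetic frames below just stand in for sampled video frames (in the real pipeline they would come from `cv2.VideoCapture`); the names `estimate_background` and the frame sizes are illustrative, not from the article:

```python
import numpy as np

def estimate_background(frames):
    # Stack the sampled frames and take the per-pixel temporal median.
    # Moving objects are outliers at any given pixel, so the median
    # recovers the static scene behind them.
    return np.median(np.stack(frames), axis=0).astype(np.uint8)

# Tiny synthetic demo: a static gray scene with a white "object"
# that occupies a different pixel in each frame.
h, w = 4, 4
frames = []
for i in range(5):
    f = np.full((h, w, 3), 128, dtype=np.uint8)
    f[0, i % w] = 255  # transient bright pixel
    frames.append(f)

bg = estimate_background(frames)
print(bg[0, 0])  # the median suppresses the transient object
```

Doing this per pixel by hand (nested loops over rows, columns, and channels) is what makes the manual Swift/ObjC port slow; the NumPy version is fast only because the median is vectorized.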

Is there a way to estimate background using OpenCV functions?

You can use the getBackgroundImage() method of BackgroundSubtractor.

See the OpenCV tutorial "How to Use Background Subtraction Methods".

Yes, this will work for me. Moreover, I'm already using it for foreground mask detection, so it will be easy to add that call instead of computing the background manually. Thanks!