Camera positional tracking

I am working on a project where I need to detect the position of a camera. So, for example, when the camera moves, I should be able to detect the movement and plot the trajectory somewhere, in OpenGL for instance.

ZED cameras already have this feature; it is called positional tracking. I am trying to implement the same concept, but using my notebook's built-in camera.

Can someone direct me to the right approach to tackle this problem?
Here is a link to a YouTube video demonstrating positional tracking with a ZED camera.

The problem is called Simultaneous Localization and Mapping (SLAM): Simultaneous localization and mapping - Wikipedia

You should use a stereo camera for this, because it already produces point clouds reliably.

If you absolutely have to use a single (= monocular) camera, you face a second problem: Structure from motion - Wikipedia

Thanks for the information
New smartphones like the Samsung Galaxy S20+ have a depth vision camera.

So if I pick one of those smartphones, can I use OpenCV for simultaneous localization?
What do you think about BreezySLAM, for instance?

Please describe the setting and constraints in more detail. If the environment is static and has some sort of landmarks, you can simplify the problem.

Yes, the environment is static, but there are no landmarks.
My plan is to rely on natural features; I know that a lot of libraries use SURF, SIFT, and ORB for feature detection.

But my goal is to find out whether localization can be done with a monocular camera, and also whether you happen to know of libraries or modules that support that.

The ROS community is a good place to start.