Subprocess read issue in Docker

I’m running nginx in a Docker container that takes the RTMP stream from my fisheye cameras; when a publish is detected, exec_publish runs a bash file that launches my Python script, which does all the undistortion/stitching and transcoding. Everything is nicely packaged into a Dockerfile, but while the script works when I run it directly via docker exec (and the second RTMP stream gets created fine), the very same script fails reading from the subprocess when nginx/bash executes it…
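
For context, input_process in the snippet below is an ffmpeg process that decodes the incoming stream to raw frames on its stdout. Here is a minimal sketch of how that side might be set up; the RTMP_IN URL, the frame size, and the exact ffmpeg flags are placeholders, not my real command:

    import subprocess

    # Hypothetical values -- stand-ins for the real URL and resolution
    RTMP_IN = "rtmp://127.0.0.1/live/cam1"
    width, height = 1920, 1080

    # Decode the incoming RTMP stream to raw bgr24 frames on stdout so a
    # reader thread can pull fixed-size chunks of width * height * 3 bytes
    input_process = subprocess.Popen(
        ["ffmpeg", "-i", RTMP_IN,
         "-f", "rawvideo", "-pix_fmt", "bgr24", "pipe:1"],
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )

The reader thread below then pulls fixed-size chunks off that pipe: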

    # NEW -- reader thread; DEBUG, width, height, input_process and
    # frame_queue (a bounded queue.Queue) are defined earlier in the script
    import queue
    import threading

    def read_frames():
        while True:
            # One raw bgr24 frame is exactly width * height * 3 bytes
            in_bytes = input_process.stdout.read(width * height * 3)
            if not in_bytes:
                if DEBUG:
                    # FAILS here when automatically executed via nginx
                    with open("./log.txt", "w") as f:
                        f.write("read err")
                break
            try:
                if DEBUG:
                    with open("./log.txt", "w") as f:
                        f.write("success")
                frame_queue.put_nowait(in_bytes)
            except queue.Full:
                frame_queue.get()  # Drop the oldest frame to make room
                frame_queue.put(in_bytes)

    threading.Thread(target=read_frames, daemon=True).start()
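
For completeness, the consumer side (not shown above) pops frames off frame_queue and pipes them into a second ffmpeg that publishes the stitched stream; again a rough sketch, with a placeholder output URL and encoder flags:

    # Placeholder output URL -- stands in for my real publish endpoint
    RTMP_OUT = "rtmp://127.0.0.1/live/stitched"

    # Encode raw frames from stdin and publish them as the second RTMP stream
    output_process = subprocess.Popen(
        ["ffmpeg", "-f", "rawvideo", "-pix_fmt", "bgr24",
         "-s", f"{width}x{height}", "-r", "30", "-i", "pipe:0",
         "-c:v", "libx264", "-f", "flv", RTMP_OUT],
        stdin=subprocess.PIPE,
    )

    while True:
        frame = frame_queue.get()          # blocks until a frame is available
        output_process.stdin.write(frame)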

I thought this could be due to the bash file’s permissions, but granting full access in Docker doesn’t make any difference. Has anybody faced a similar issue? Where should I be looking to fix this? Thanks for any help.

I found out that input_process.stderr returns this error when the ffmpeg command runs: “ffmpeg: error while loading shared libraries: libavdevice.so.61: cannot open shared object file: No such file or directory”. Again, I don’t understand why this works from the main docker exec but not when executed from exec_publish.
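
For anyone wondering how I got at that message: I drain stderr on a side thread so the pipe can’t fill up and stall ffmpeg — a sketch, assuming input_process was started with stderr=subprocess.PIPE (the log path is arbitrary):

    import threading

    def log_stderr(proc, path="./ffmpeg_err.txt"):
        # Read ffmpeg's stderr line by line and keep it for post-mortem debugging
        with open(path, "wb") as log:
            for line in proc.stderr:
                log.write(line)

    threading.Thread(target=log_stderr, args=(input_process,), daemon=True).start()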

EDIT:
This solved the issue:

    export LD_LIBRARY_PATH=/usr/lib64:$LD_LIBRARY_PATH
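
If you’d rather not rely on a shell export, the same fix can be applied from Python by extending the environment passed to Popen — a minimal sketch, assuming the shared libraries live in /usr/lib64:

    import os
    import subprocess

    # Make the ffmpeg shared libraries visible to the child process only
    env = dict(os.environ)
    env["LD_LIBRARY_PATH"] = "/usr/lib64:" + env.get("LD_LIBRARY_PATH", "")

    input_process = subprocess.Popen(
        ["ffmpeg", "-version"],   # any ffmpeg invocation works the same way
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        env=env,
    )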

Turns out all of it was totally off-topic here ;(