
Implemented a very simple server to pipe the DJI USB to WiFi. #10

Open · MythicLionMan wants to merge 1 commit into main

Conversation

MythicLionMan

This is more of a proof of concept at this point, the latency is pretty high, but it can be used to get DJI video onto an iOS device live. I haven't created a service for it.
Since the HDMI output would likely conflict with the WiFi access (as they both need access to the USB port) I added a check to fpvout-start.sh that only enables output when an HDMI monitor is connected. This allows both modes of operation to co-exist, since the server is only triggered by an incoming connection, whereas the HDMI is triggered when the USB connection is detected. I suspect that this may not work properly if the user is using the composite out to render the video.

Some of the paths were different in the Git repository than they are on the image that I got. I've tried to use the newer path format in the repo instead of the format on the Pi image, but this isn't fully tested.
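
For illustration only, here is a minimal sketch of such an "HDMI monitor connected" check, written in Python to match the server script in this PR. It assumes a Linux DRM/KMS stack that exposes connector status under /sys/class/drm; the actual commit may well detect the monitor differently (for example via tvservice on older Pi images).

# Hypothetical illustration, not the check from this commit: ask the
# kernel's DRM layer whether any HDMI connector reports "connected".
import glob

def hdmi_connected() -> bool:
    for path in glob.glob("/sys/class/drm/*HDMI*/status"):
        with open(path) as f:
            if f.read().strip() == "connected":
                return True
    return False

if __name__ == "__main__":
    # Exit 0 when a monitor is attached so a shell script can gate on it.
    raise SystemExit(0 if hdmi_connected() else 1)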

@@ -0,0 +1,43 @@
#!/usr/bin/python3

I'm not familiar with Python, but does this script need to be called from somewhere to start?

MythicLionMan (Author)

Yes, this is a very basic server. It waits for a connection, and when it gets one it pipes the data from the goggles to the remote client on line 38. It can only handle one client at a time because the client takes over the connection to the goggles.

Originally I had wanted to implement this as an inetd service (in which case the script wouldn't have been required), but inetd isn't installed anymore. I think the mechanism used to launch fpv-video-out when the USB connection to the goggles is made could also launch the server on an incoming connection, but I wasn't familiar with it, and this was a quick proof of concept.

I thought I did set up something to launch this automatically, but I don't remember what it was, and it doesn't look like I committed it. I'll have to check my image to see exactly what I did.
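
For reference, here is a minimal sketch of the single-client approach described above, assuming fpv-video-out writes the raw video stream to stdout (as later comments in this thread suggest). The port number and chunk size are illustrative assumptions, not values from the committed script.

#!/usr/bin/python3
# Hedged sketch, not the committed script: wait for one TCP client, then
# pipe fpv-video-out's stdout to it. One client at a time, because that
# client effectively owns the connection to the goggles.
import socket
import subprocess

PORT = 8080  # assumption; use whatever port the iOS client expects

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", PORT))
    srv.listen(1)
    while True:
        conn, _ = srv.accept()
        # Start fpv-video-out only once a client connects, so the data
        # isn't stale by the time someone is watching.
        proc = subprocess.Popen(["./fpv-video-out"], stdout=subprocess.PIPE)
        try:
            while True:
                chunk = proc.stdout.read(4096)
                if not chunk:
                    break
                conn.sendall(chunk)
        except (BrokenPipeError, ConnectionResetError):
            pass  # client went away; stop streaming until the next one
        finally:
            proc.kill()
            conn.close()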

Don't use inetd; it's been all about systemd for many years now.

Also, piping the video through a webserver might be faster if threaded, maybe written in Go or Node, or by somehow wrapping lighttpd or another super small and fast server into the mix.

Sadly I lack the time to really write a good doc on this, but if you dig hard, at least find ways to multithread it in Python, etc.

I want to see this done, but I'd likely bodge out a crap C++ implementation and then do it in Rust (that's my daily-driver language), but I'm so backlogged now. :(

@dellch

dellch commented Aug 17, 2021

@MythicLionMan I'm also planning on implementing this - I'll take a look at what you have so far and see if I can find any improvements when I get time

My thought was the best place to start would be in /Scripts/fpvoutstart.sh, modifying line 7 and changing what the stdout from ./fpv-video-out pipes into. Something like

#!/bin/bash 

export DISPLAY=:0;

cd /home/fpvout/Scripts/fpv-c/ 

./fpv-video-out | network_stream.sh #or .py or whatever the implementation of the network video stream is

@dellch

dellch commented Aug 17, 2021

or something like this:

./fpv-video-out | ffmpeg -f h264 -i pipe:0 -vcodec copy -f mpegts pipe:1 | vlc - --sout '#standard{access=http,mux=ts,dst=localhost:8080}'

@MythicLionMan (Author)

@MythicLionMan I'm also planning on implementing this - I'll take a look at what you have so far and see if I can find any improvements when I get time

My thought was the best place to start would be in /Scripts/fpvoutstart.sh, modifying line 7 and changing what the stdout from ./fpv-video-out pipes into. Something like

#!/bin/bash 

export DISPLAY=:0;

cd /home/fpvout/Scripts/fpv-c/ 

./fpv-video-out | network_stream.sh #or .py or whatever the implementation of the network video stream is

One complication with that is that the web service is triggered by a remote connection, whereas the display service is triggered by the USB connection. So the network clients don't start streaming immediately when the goggles are connected. This works well for letting one device support both connection types, but it means that the network server can't start fpv-video-out until it gets a client connection. (If it did, the data would be stale by the time a client connected.)

I think that just calling fpv-video-out when a client connection is received (rather than when a USB connection is made) is sufficient for single-destination use. Ideally there would be a buffer that stores the stream and directs it to multiple destinations (a display and multiple network clients). Unfortunately I don't think this is as simple as just calling tee to fork the output of fpv-video-out to multiple destinations, because each destination has a different lifetime. (When USB is first connected it starts streaming to hello_video immediately, from the start of the stream, whereas each network client starts forwarding data at the time its connection is made.)

I don't know if the video stream can be sampled from any point, or if a header is required. If it can, then the output of fpv-video-out could be stored in a rolling buffer, and each destination could start sampling the buffer whenever it starts. I don't know of an existing tool to do this, or what the best approach would be to make it happen.
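
To make the different-lifetimes problem concrete, here is a hedged sketch of one possible fan-out, again assuming fpv-video-out writes the stream to stdout: a single reader thread drains the process and copies each chunk into a per-client queue, so consumers can attach and detach at any time. It doesn't solve the mid-stream join problem (a late client joins mid-stream and has to wait for the next key frame, as a later comment notes); the port and queue depth are assumptions.

#!/usr/bin/python3
# Hedged sketch: one reader thread fans the goggle stream out to any
# number of TCP clients, each with its own bounded queue.
import queue
import socket
import subprocess
import threading

PORT = 8080                      # assumption
clients = set()                  # one Queue per connected client
clients_lock = threading.Lock()

def reader():
    proc = subprocess.Popen(["./fpv-video-out"], stdout=subprocess.PIPE)
    while True:
        chunk = proc.stdout.read(4096)
        if not chunk:
            break
        with clients_lock:
            for q in clients:
                try:
                    q.put_nowait(chunk)
                except queue.Full:
                    pass  # slow client: drop data rather than stall everyone

def serve(conn):
    q = queue.Queue(maxsize=256)
    with clients_lock:
        clients.add(q)
    try:
        while True:
            conn.sendall(q.get())
    except (BrokenPipeError, ConnectionResetError):
        pass
    finally:
        with clients_lock:
            clients.discard(q)
        conn.close()

threading.Thread(target=reader, daemon=True).start()
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", PORT))
    srv.listen(5)
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=serve, args=(conn,), daemon=True).start()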

@MythicLionMan (Author)

or something like this:

./fpv-video-out | ffmpeg -f h264 -i pipe:0 -vcodec copy -f mpegts pipe:1 | vlc - --sout '#standard{access=http,mux=ts,dst=localhost:8080}'

I'm not sure if "vcodec copy" does anything in this context. It should just copy the input unchanged to the output. I believe that "vcodec copy" is more useful when paired with other options (to process audio differently than video for example). But it may be that it restructures the stream in a useful way.

I'm not sure about the vlc command. There is a reference in the VLC docs to UDP multicasting a stream, which may be a way for multiple clients to access the same stream, but I don't know how that would work.

@j005u

j005u commented Oct 1, 2021

Cool stuff going on here. A few thoughts:

  • A circular buffer sounds about right. Technically you'd want to start transmission on an I-frame, and I believe DJI's internal code does some very basic parsing of the stream to find them, but most players won't care and will just drop the initial frames. (There's a sketch of that parsing after this list.)
  • I think it should be possible to use named pipes as a shared buffer with multiple consumers?
  • Likewise, when splitting outputs via tee (if that ends up being required at all, see above), consider directing them to a named pipe; that way you can start/stop any consumers independently.
  • Consider using GStreamer; it'll handle ffmpeg's "job" of cleaning up the stream, and I believe it can also act as a multi-client streaming server.
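
As a concrete version of the first bullet above, here is a hedged sketch of that "very basic parsing": finding the first IDR (key frame) NAL unit in an Annex-B H.264 stream, which appears to be what fpv-video-out emits given the -f h264 flags used later in this thread. The function name is illustrative, not from any existing code.

# Illustrative sketch, not DJI's actual parser: locate the first IDR
# (key frame) NAL unit in an Annex-B H.264 buffer so that forwarding to a
# new client can begin there. A fuller version would also cache SPS/PPS
# NAL units (types 7 and 8), which decoders need before the first IDR.
def idr_offset(buf: bytes) -> int:
    """Return the offset of the first IDR start code in buf, or -1."""
    i = 0
    while True:
        i = buf.find(b"\x00\x00\x01", i)
        if i == -1 or i + 3 >= len(buf):
            return -1
        nal_type = buf[i + 3] & 0x1F  # low 5 bits of the NAL header byte
        if nal_type == 5:  # 5 = coded slice of an IDR picture
            # Include the leading zero of a 4-byte start code if present.
            return i - 1 if i > 0 and buf[i - 1] == 0 else i
        i += 3

In a rolling-buffer scheme, a new client's output would simply be primed from this offset rather than from the oldest byte in the buffer.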

@ShadowZero3000

This has been quiet for a while, but I just got here, and was quite interested in how this might work out for me.

I ended up installing ffmpeg and rtsp-simple-server (https://github.com/aler9/rtsp-simple-server) and adjusting my fpv-out script to have:

./fpv-video-out \
  | ffmpeg -re -f h264 \
      -i pipe:0 -f h264 -vcodec copy pipe:1 \
      -f rtsp -vcodec copy -rtsp_transport tcp rtsp://localhost:8554/live.stream \
  | /opt/vc/src/hello_pi/hello_video/hello_video.bin

And it....works!?!?
It seems like my video is about 2-3s delayed, and the RTSP stream (viewed in VLC on my desktop) was about 6s behind.
So, not exactly "live", but functional. (This is on a Pi 3b.)
top showed about 10-15% CPU on ffmpeg, 10% or so on rtsp-simple-server, and another 10-20% on fpv-video-out, so I'm not sure CPU load is the issue. I'm definitely no ffmpeg expert, so I may have missed something key, but maybe there's something here we could build on?
