Implemented a very simple server to pipe the DJI USB to WiFi. #10
Conversation
This is more of a proof of concept at this point; the latency is pretty high, but it can be used to get DJI video onto an iOS device live. I haven't created a service for it.

Since the HDMI output would likely conflict with the WiFi access (as they both need access to the USB port), I added a check to fpvout-start.sh that only enables output when an HDMI monitor is connected. This allows both modes of operation to co-exist, since the server is only triggered by an incoming connection, whereas the HDMI output is triggered when the USB connection is detected. I suspect that this may not work properly if the user is using the composite out to render the video.

Some of the paths in the Git repository were different from the ones on the image that I got. I've tried to use the newer path format from the repo instead of the format on the Pi image, but this isn't fully tested.
@@ -0,0 +1,43 @@
#!/usr/bin/python3
I'm not familiar with Python, but does this script need to be called from somewhere to start?
Yes, this is a very basic server. It waits for a connection, and when it gets one it pipes the data from the goggles to the remote client on line 38. It can only handle one client at a time because the client takes over the connection to the goggles.
Originally I had wanted to implement this as an inetd service (and the script wouldn't have been required), but inetd isn't installed anymore. I think the mechanism used to launch fpv-video-out when the USB connection to the goggles is made could also be used to launch this on an incoming connection, but I wasn't familiar with it, and this was a quick proof of concept.
I thought that I did set up something to launch this automatically, but I don't remember what it is, and it doesn't look like I committed it. I'll have to check my image to see exactly what I did.
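For anyone who can't see the whole script in the diff above, a minimal sketch of that single-client pipe might look like the following. The device path, port, and chunk size here are placeholders, not the actual values used in this PR:

    #!/usr/bin/python3
    # Minimal sketch of a single-client pipe server (placeholder paths/ports).
    import socket

    DEVICE = "/dev/fpv-video"   # hypothetical node exposing the goggle stream
    PORT = 8000                 # hypothetical listen port

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("", PORT))
    server.listen(1)            # one client at a time; it owns the goggle stream

    while True:
        client, _ = server.accept()
        with client, open(DEVICE, "rb") as stream:
            while True:
                chunk = stream.read(4096)
                if not chunk:
                    break
                try:
                    client.sendall(chunk)
                except OSError:
                    break       # client disconnected; wait for the next one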
Don't use inetd; it's been all about systemd for many years now.
Also, piping the video through a web server might be faster if it's threaded, maybe written in Go or Node, or by somehow wrapping lighttpd or another super small, fast server into the mix.
Sadly I lack time to really write a good doc on this, but if you dig hard you'll at least find ways to multithread it in Python, etc.
I want to see this done, but I'd likely bodge out a crap C++ implementation, and then do it in Rust (that's my daily driver language), but I'm so backlogged now. :(
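For what it's worth, systemd socket activation can do the same job inetd would have done here: systemd listens on the port and spawns the script with the connection on stdin/stdout. A rough sketch, with the port, unit names, and script path all made up for illustration:

    # fpv-wifi.socket (hypothetical unit name and port)
    [Unit]
    Description=DJI FPV WiFi stream socket

    [Socket]
    ListenStream=8000
    Accept=yes

    [Install]
    WantedBy=sockets.target

    # fpv-wifi@.service (template unit; one instance per connection)
    [Unit]
    Description=DJI FPV WiFi stream handler

    [Service]
    ExecStart=/usr/local/bin/fpv-wifi-server.py
    StandardInput=socket
    StandardOutput=socket

With Accept=yes, systemd behaves like classic inetd, so the script itself could simply read from the goggles and write to stdout instead of managing its own listening socket.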
@MythicLionMan I'm also planning on implementing this - I'll take a look at what you have so far and see if I can find any improvements when I get time. My thought was that the best place to start would be in
|
or something like this:
|
One complication with that is that the web service is triggered by a remote connection, whereas the display service is triggered by the USB connection. So the network clients don't start streaming immediately when the goggles are connected. This works well to allow one device to support both connection types, but it means that the network server can't start fpv-out until it gets a client connection. (If it did, the data would be stale by the time a client connected.) I think that just calling fpv-video-out when a client connection is received (rather than when a USB connection is made) is sufficient for single-destination use.
Ideally there would be a buffer to store the stream and direct it to multiple destinations (a display and multiple network clients). Unfortunately, I don't think this is as simple as just calling tee to fork the output of fpv-out to multiple destinations, because each destination has a different lifetime. (When USB is first connected it starts streaming to hello_video immediately, from the start of the stream, and each network client starts forwarding data at the time that its connection is made.)
I don't know if an mp4 stream can be sampled from any point, or if a header is required. If it can, then the output of fpv-out could be stored in a rolling buffer and each destination could start sampling the buffer when it was started. I don't know of an existing tool to do this, or what the best approach would be to make it happen.
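A rough sketch of that fan-out idea: one reader thread pulls from the goggle stream and pushes each chunk to every connected client, so each client only sees data from the moment it connects. The device path and port are placeholders, and this deliberately ignores the header/keyframe question raised above:

    #!/usr/bin/python3
    # Sketch: broadcast one source stream to any number of TCP clients.
    import socket
    import threading

    DEVICE = "/dev/fpv-video"   # hypothetical source of the raw stream
    PORT = 8000                 # hypothetical listen port
    clients = set()
    lock = threading.Lock()

    def reader():
        # Pull chunks from the source and push them to every live client.
        with open(DEVICE, "rb") as stream:
            while True:
                chunk = stream.read(4096)
                if not chunk:
                    break
                with lock:
                    for c in list(clients):
                        try:
                            c.sendall(chunk)
                        except OSError:
                            clients.discard(c)
                            c.close()

    threading.Thread(target=reader, daemon=True).start()

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("", PORT))
    server.listen(5)
    while True:
        client, _ = server.accept()
        with lock:
            clients.add(client)

A real version would want per-client queues so one slow client can't stall the rest, and some way to start each client at a decodable point in the stream.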
I'm not sure if "vcodec copy" does anything in this context. It should just copy the input unchanged to the output. I believe that "vcodec copy" is more useful when paired with other options (to process audio differently than video for example). But it may be that it restructures the stream in a useful way. I'm not sure about the vlc command. There is reference in the vlc docs to udb multicasting a stream, which may be a way for multiple clients to access the same stream, but I don't know how that would work. |
Cool stuff going on here. A few thoughts:
|
This has been quiet for a while, but I just got here, and was quite interested in how this might work out for me. I ended up installing
And it....works!?!?