
Unix & Linux Stack Exchange

Q&A for users of Linux, FreeBSD and other Unix-like operating systems

Latest Questions

1 vote
1 answer
3060 views
Mopidy with pipewire-pulse
I run Mopidy as a service for my MPD setup because of the nice Spotify integration. I've followed the documentation's instructions to use it with PulseAudio by setting
load-module module-native-protocol-tcp auth-ip-acl=127.0.0.1
in my /etc/pulse/default.pa and
[audio]
output = pulsesink server=127.0.0.1
in my /etc/mopidy/mopidy.conf.
All of this was working perfectly. However, I've recently started using PipeWire as a replacement for PulseAudio on Arch Linux by following this and installing pipewire, pipewire-alsa, pipewire-pulse and pipewire-media-session. But now I've noticed that whenever I try to play any music, nothing happens. I checked journalctl for Mopidy and found this:
ERROR ... [536:MainThread] mopidy.audio.gst
... mopidy:   GStreamer error: Failed to connect: Connection refused
... mopidy: WARNING ... [536:Audio-2] mopidy.audio.actor
... mopidy:   Setting GStreamer state to GST_STATE_PLAYING failed
... mopidy: WARNING ... [536:Core-11] mopidy.core.tracklist
... mopidy:   Track is not playable: local:track:...
I searched around and found the gst-plugin-pipewire package, but installing it didn't help at all. Apart from Mopidy, everything else audio-wise is working perfectly. I'd be thankful for any help with this.
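A note on where the TCP setting lives now (an assumption based on how pipewire-pulse is configured, worth verifying against the installed version): pipewire-pulse does not load modules from /etc/pulse/default.pa the way PulseAudio did, so the load-module line above may simply have no effect after the switch. The equivalent listener would be declared in pipewire-pulse's own configuration, roughly:

```
# /etc/pipewire/pipewire-pulse.conf (or a drop-in under pipewire-pulse.conf.d/)
# Sketch: expose the PulseAudio-compatible server on TCP as well as the
# usual Unix socket; 4713 is the conventional PulseAudio TCP port.
pulse.properties = {
    server.address = [ "unix:native" "tcp:127.0.0.1:4713" ]
}
```

Restarting the pipewire-pulse service after such a change would then let `pulsesink server=127.0.0.1` connect as before.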
SandWood Jones (63 rep)
Apr 7, 2021, 11:51 PM • Last activity: Jul 23, 2025, 03:09 AM
0 votes
1 answer
30 views
de-fisheye webcam stream with gstreamer
I'm trying to de-fisheye my webcam image using GStreamer, but couldn't get any plugin that supposedly fulfills the task to work. I tried:
1. frei0r-filter-defish0r, which only yields a black stream. Pipelines with other frei0r filters from videotestsrc and v4l2src work flawlessly. Minimal (not) working example:
gst-launch-1.0 videotestsrc ! frei0r-filter-defish0r ! videoconvert ! autovideosink
2. cameraundistort, which does return an image, but I don't really know how to calibrate it.
I assume the first option would do what I want, given the name and the "defish" in it. I use CachyOS and my GStreamer version is 1.26.3.
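As background on what a defishing filter computes (illustrative only, not the frei0r-filter-defish0r algorithm): an equidistant fisheye lens images a ray at angle θ from the optical axis at radius r = f·θ, while a rectilinear image puts it at r = f·tan(θ), so undistortion is a per-pixel radial remap between the two models:

```python
import math

def defish_radius(r_fish: float, f: float) -> float:
    """Map a radius in an equidistant fisheye image (r = f*theta)
    to the radius the same ray gets in a rectilinear image (r = f*tan(theta))."""
    theta = r_fish / f              # angle of the incoming ray
    if theta >= math.pi / 2:        # rays at >= 90 degrees have no rectilinear image
        raise ValueError("ray outside the rectilinear field of view")
    return f * math.tan(theta)

print(defish_radius(0.0, 100.0))   # the centre stays put
print(defish_radius(50.0, 100.0))  # points further out move outward more
```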
MP Felder (168 rep)
Jul 7, 2025, 02:21 PM • Last activity: Jul 7, 2025, 08:43 PM
1 vote
0 answers
25 views
UVC camera only works properly when usbmon is running
I have the following setup:
i.MX8MM + uDP72020 (PCIe to USB 3.0) + UVC camera
Linux: 6.6.69
GStreamer: 1.24.0 (imx)
GStreamer pipeline I am using:
gst-launch-1.0 -e -vvv v4l2src device=/dev/video0 \
               ! video/x-raw,width=1920,height=1080,format=YUY2,framerate=30/1 \
               ! imxvideoconvert_g2d ! kmssink connector-id=35 can-scale=false sync=false
Unfortunately this doesn't work, as I get around 1 fps. Lower resolutions like 1280x720 work fine. So I thought maybe it is a performance issue on the host side, but the i.MX8MM should be able to handle this resolution. While debugging the issue and looking into stuff, I started usbmon to maybe see errors or something. But as soon as I started usbmon, everything worked fine, even at 1920x1080@30fps. So the above GStreamer pipeline only works when usbmon is running at the same time! So my question now is, why? Why is the USB communication suddenly so much better when usbmon is running and logging all the traffic? Is there anything usbmon does, which I could potentially also do to improve my setup? Thank you and best regards, Stefan
user1243392 (11 rep)
Jun 18, 2025, 09:56 AM
0 votes
1 answer
5483 views
Ubuntu gstreamer "Could not open resource for reading and writing"
I'm using GStreamer to get an RTSP stream with this command:

gst-launch-1.0 rtspsrc location='rtsp://user:password@address:554/live/main' latency=100 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! video/x-raw,width=640,height=480 ! autovideosink

I used it on the first computer and it worked. Then I used it on the second computer and it worked too. After that I did some uninstall/install operations with OpenCV on the second one, and now I can't get the stream on the second PC. On the first one it still works with the same command. Output:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayX11\)\ gldisplayx11-0";
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://user:password@address:554/live/main
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not open resource for reading and writing.
Additional debug info:
gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

What did I do and how can I repair it? Thanks in advance!
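"Failed to connect. (Generic error)" is reported before any RTSP negotiation happens, so the TCP connection to port 554 itself is failing. A stdlib check that separates plain network reachability from GStreamer problems (host and port here are placeholders for the camera's address):

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# False here means the problem is on the network/firewall/camera side,
# not anything the OpenCV reinstall did to GStreamer.
print(tcp_reachable("address", 554))
```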
Dmitry (143 rep)
Dec 14, 2022, 10:07 AM • Last activity: May 13, 2025, 12:02 AM
0 votes
1 answer
466 views
gstreamer - Missing plugin "Quicktime demuxer"
I am trying to play a simple mp4 file with GStreamer on openSUSE Tumbleweed. Unfortunately there appears to be a bit of trouble, since it can't find the Quicktime demuxer, which is installed according to gst-inspect-1.0.
**The command:**
gst-launch-1.0 playbin uri=file:///$(pwd)/video2.mp4
**The error message:**
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Missing element: Quicktime demuxer
WARNING: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: No decoder available for type 'video/quicktime, variant=(string)iso'.
Additional debug info:
../gst/playback/gsturidecodebin.c(1003): unknown_type_cb (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: Your GStreamer installation is missing a plug-in.
Additional debug info:
../gst/playback/gsturidecodebin.c(1070): no_more_pads_full (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0:
no suitable plugins found:
../gst/playback/gstdecodebin2.c(4736): gst_decode_bin_expose (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0:
no suitable plugins found:
Missing decoder: Quicktime (video/quicktime, variant=(string)iso)

ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind: Internal data stream error.
Additional debug info:
../plugins/elements/gsttypefindelement.c(1257): gst_type_find_element_loop (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Freeing pipeline ...
**Output of gst-inspect-1.0:**
...
typefindfunctions: video/quicktime: mov, mp4
...
**Output of gst-inspect-1.0 | grep -i missing:** Nothing
**What I did:**
- Installed the following packages: gstreamer-plugins-libav gstreamer-plugins-base gstreamer-plugins-good gstreamer-plugins-bad gstreamer-plugins-ugly gstreamer-plugins-ugly-codecs gstreamer-plugins-bad-codecs
- Re-encoded the mp4 (to mp4 and to mov) with ffmpeg, resulting in similar issues
- Deleted ~/.cache/gstreamer-1.0/registry.x86_64.bin
- Asked ChatGPT, of course with no benefit
- Researched for a solution, with almost all results suggesting the same things
Thanks in advance!
Cane (33 rep)
Nov 14, 2024, 07:53 PM • Last activity: Nov 14, 2024, 08:21 PM
2 votes
1 answer
173 views
How can I mock up a screen share?
I can use v4l2loopback to create a dummy video device, Xephyr to create a nested X server in its own window, and a gst pipeline to link the two so that the contents of the Xephyr window appear as my webcam. This gives a nice sandbox where only the applications I want to share are made visible, and it means that I can switch back to the video conference without the other participants losing the view of the nested desktop. However, this appears as my webcam, not as a screen share, and so I lose some desirable aspects of true screen sharing. For instance, it means that each other user has to "pin" my feed in order that it doesn't lose focus when someone else speaks. So I'm searching for a way to fool my browser into taking its screen share input from an X server other than the one in which it's running, or from a video device. In case it matters, this is Firefox running in a Cinnamon environment.
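The setup described above can be sketched as follows; the device number, display number and caps are assumptions, and only the gst-launch line does the actual bridging from the nested display to the loopback device:

```shell
# Bridge a nested Xephyr display (:2) into a v4l2loopback device (/dev/video10).
# Prerequisites (not run here): sudo modprobe v4l2loopback video_nr=10
#                               Xephyr :2 -screen 1280x720 &
PIPELINE="ximagesrc display-name=:2 use-damage=false \
  ! videoconvert ! video/x-raw,format=YUY2 ! v4l2sink device=/dev/video10"

if command -v gst-launch-1.0 >/dev/null 2>&1; then
    timeout 10 gst-launch-1.0 $PIPELINE || true
else
    echo "gst-launch-1.0 not installed; would run: gst-launch-1.0 $PIPELINE"
fi
```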
Aoeuid (123 rep)
Nov 1, 2024, 03:08 PM • Last activity: Nov 1, 2024, 03:30 PM
0 votes
0 answers
32 views
Green screen when doing a network streaming with gstreamer on iMX8
I'm trying to implement a network stream from a Raspberry Pi to an i.MX8 platform, using the NXP VPU plugins to decode the stream, but the NXP side shows a green screen with the following output:
gst-launch-1.0 tcpclientsrc host=192.168.178.64 port=5004  ! queue ! h264parse ! vpudec ! videoconvert ! fpsdisplaysink sync=false
Setting pipeline to PAUSED ...
====== VPUDEC: 4.7.2 build on Sep  1 2022 09:49:28. ======
        wrapper: 3.0.0 (VPUWRAPPER_ARM64_LINUX Build on Aug 31 2022 01:28:14)
        vpulib: 1.1.1
        firmware: 1.1.1.65535
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
Redistribute latency...
New clock: GstSystemClock
ERROR: cabac_init_idc
ERROR: SLICE_HEADER
ERROR: disable_deblocking_filter_idc
I use the following pipelines:

**Raspberry Pi (Sender)**

gst-launch-1.0 videotestsrc ! video/x-raw,width=1280,height=720 ! videoconvert ! x264enc tune=zerolatency byte-stream=true bitrate=500 threads=2 ! mpegtsmux ! tcpserversink host=0.0.0.0 port=5004

**iMX8 (Receiver)**

gst-launch-1.0 tcpclientsrc host= port=5004 ! queue ! h264parse ! vpudec ! imxvideoconvert_g2d ! fpsdisplaysink sync=false

What does this error mean and how can I fix it?
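One asymmetry between the two pipelines worth noting (an observation, not a confirmed fix): the sender wraps the H.264 stream in an MPEG-TS container with mpegtsmux, while the receiver feeds the raw TCP bytes straight into h264parse. A container-symmetric receiver would demux first; a sketch, with the sender address as a placeholder:

```shell
# Hypothetical receiver that undoes the sender's mpegtsmux with tsdemux
# before parsing and decoding; SENDER_IP stands in for the real address.
RECEIVER="tcpclientsrc host=SENDER_IP port=5004 ! queue \
  ! tsdemux ! h264parse ! vpudec ! imxvideoconvert_g2d ! fpsdisplaysink sync=false"

if command -v gst-launch-1.0 >/dev/null 2>&1; then
    timeout 10 gst-launch-1.0 $RECEIVER || true
else
    echo "gst-launch-1.0 not installed; would run: gst-launch-1.0 $RECEIVER"
fi
```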
Kampi (175 rep)
Oct 4, 2024, 07:06 PM
1 vote
0 answers
122 views
Frame Buffer as RTSP stream
I have an RTSP viewer on a small local network that I would like to use as a monitor. It's really just to access the CLI in the event that an automated service on a Raspberry Pi doesn't start or run correctly. So far I have this script using GStreamer, but I can't seem to connect to it. It appears that the server is running, but I can't connect. I have shut down the firewall and have verified that the port is open with netstat. I'm not sure what to try next.

import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GstRtspServer, GLib

class CLI_RTSPServer(GstRtspServer.RTSPMediaFactory):
    def __init__(self):
        super(CLI_RTSPServer, self).__init__()
        #self.set_launch("( fbdevsrc device=/dev/fb0 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )")
        self.set_launch("( fbdevsrc device=/dev/fb0 ! videoscale ! video/x-raw,width=640,height=480 ! videoconvert ! x264enc tune=zerolatency bitrate=500 ! rtph264pay name=pay0 pt=96 )")

if __name__ == "__main__":
    Gst.init(None)
    server = GstRtspServer.RTSPServer()
    factory = CLI_RTSPServer()
    factory.set_shared(True)
    server.set_service("554")  # Bind to the default RTSP port 554
    server.get_mount_points().add_factory("/cli", factory)
    server.attach(None)
    print("RTSP server is running at rtsp://localhost:8554/cli")
    loop = GLib.MainLoop()
    loop.run()
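Two GStreamer-independent things stand out in the script (observations, not a diagnosis): set_service("554") asks for a privileged port, which only root may bind on Linux, and the print statement advertises port 8554 even though 554 was requested, so the URL being tested may not match the bound port. A stdlib check of whether a given port can be bound at all:

```python
import socket

def can_bind(port: int) -> bool:
    """Return True if this process is allowed to bind the given TCP port."""
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            s.bind(("0.0.0.0", port))
            return True
    except OSError:
        return False

print("can bind 554:", can_bind(554))    # usually False without root/CAP_NET_BIND_SERVICE
print("can bind 8554:", can_bind(8554))  # unprivileged ports normally succeed
```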
Steven Lutz (173 rep)
Sep 18, 2024, 06:23 PM
0 votes
0 answers
142 views
Pulseaudio combined sinks resulting in latency & poor audio quality
I want to be able to hear sound from 3 different devices. I have the following cards (pactl list cards):
Card #0
    Name: alsa_card.pci-0000_00_0e.0
    Driver: module-alsa-card.c
    Owner Module: 1
    Profiles:
        input:analog-stereo: Analog Stereo Input (sinks: 0, sources: 1, priority: 60, available: yes)
        output:analog-stereo: Analog Stereo Output (sinks: 1, sources: 0, priority: 6000, available: yes)
        output:analog-stereo+input:analog-stereo: Analog Stereo Duplex (sinks: 1, sources: 1, priority: 6060, available: yes)
        output:hdmi-stereo: Digital Stereo (HDMI) Output (sinks: 1, sources: 0, priority: 5400, available: yes)
        output:hdmi-stereo+input:analog-stereo: Digital Stereo (HDMI) Output + Analog Stereo Input (sinks: 1, sources: 1, priority: 5460, available: yes)
        output:hdmi-stereo-extra1: Digital Stereo (HDMI 2) Output (sinks: 1, sources: 0, priority: 5200, available: no)
        output:hdmi-stereo-extra1+input:analog-stereo: Digital Stereo (HDMI 2) Output + Analog Stereo Input (sinks: 1, sources: 1, priority: 5260, available: yes)
        output:hdmi-surround-extra1: Digital Surround 5.1 (HDMI 2) Output (sinks: 1, sources: 0, priority: 100, available: no)
        output:hdmi-surround-extra1+input:analog-stereo: Digital Surround 5.1 (HDMI 2) Output + Analog Stereo Input (sinks: 1, sources: 1, priority: 160, available: yes)
        output:hdmi-surround71-extra1: Digital Surround 7.1 (HDMI 2) Output (sinks: 1, sources: 0, priority: 100, available: no)
        output:hdmi-surround71-extra1+input:analog-stereo: Digital Surround 7.1 (HDMI 2) Output + Analog Stereo Input (sinks: 1, sources: 1, priority: 160, available: yes)
        output:hdmi-stereo-extra2: Digital Stereo (HDMI 3) Output (sinks: 1, sources: 0, priority: 5200, available: no)
        output:hdmi-stereo-extra2+input:analog-stereo: Digital Stereo (HDMI 3) Output + Analog Stereo Input (sinks: 1, sources: 1, priority: 5260, available: yes)
        output:hdmi-surround-extra2: Digital Surround 5.1 (HDMI 3) Output (sinks: 1, sources: 0, priority: 100, available: no)
        output:hdmi-surround-extra2+input:analog-stereo: Digital Surround 5.1 (HDMI 3) Output + Analog Stereo Input (sinks: 1, sources: 1, priority: 160, available: yes)
        output:hdmi-surround71-extra2: Digital Surround 7.1 (HDMI 3) Output (sinks: 1, sources: 0, priority: 100, available: no)
        output:hdmi-surround71-extra2+input:analog-stereo: Digital Surround 7.1 (HDMI 3) Output + Analog Stereo Input (sinks: 1, sources: 1, priority: 160, available: yes)
        off: Off (sinks: 0, sources: 0, priority: 0, available: yes)
    Active Profile: output:analog-stereo+input:analog-stereo
Card #1
    Name: alsa_card.usb-BurrBrown_from_Texas_Instruments_USB_AUDIO_DAC-00
    Driver: module-alsa-card.c
    Owner Module: 2
    Profiles:
        output:analog-mono: Analog Mono Output (sinks: 1, sources: 0, priority: 200, available: yes)
        output:analog-stereo: Analog Stereo Output (sinks: 1, sources: 0, priority: 6000, available: yes)
        output:iec958-stereo: Digital Stereo (IEC958) Output (sinks: 1, sources: 0, priority: 5500, available: yes)
        off: Off (sinks: 0, sources: 0, priority: 0, available: yes)
    Active Profile: output:analog-stereo
The profiles I want to be able to use are
output:analog-stereo+input:analog-stereo (Current Active Profile) in Card 0

output:hdmi-stereo+input:analog-stereo in Card 0

output:analog-stereo in Card 1
Which from aplay -l
**** List of PLAYBACK Hardware Devices ****
card 0: PCH [HDA Intel PCH], device 0: ALC662 rev3 Analog [ALC662 rev3 Analog]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 0: PCH [HDA Intel PCH], device 3: HDMI 0 [HDMI 0]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 0: PCH [HDA Intel PCH], device 7: HDMI 1 [HDMI 1]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 0: PCH [HDA Intel PCH], device 8: HDMI 2 [HDMI 2]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: DAC [USB AUDIO    DAC], device 0: USB Audio [USB Audio]
  Subdevices: 1/1
  Subdevice #0: subdevice #0
would mean hw:0,0 (analog), hw:0,3 (HDMI) & hw:1,0 (USB). I'm currently testing by combining two sinks from Card 0: hw:0,0 & hw:0,3. Before making any changes, this was the state of the system (pactl info):
Server String: /var/run/pulse/native
Library Protocol Version: 32
Server Protocol Version: 32
Is Local: yes
Client Index: 0
Tile Size: 65472
User Name: pulse
Server Name: pulseaudio
Server Version: 10.0
Default Sample Specification: s16le 2ch 44100Hz
Default Channel Map: front-left,front-right
Default Sink: alsa_output.pci-0000_00_0e.0.analog-stereo
Default Source: alsa_input.pci-0000_00_0e.0.analog-stereo
Cookie: a0ec:1eec
This produced sound:
aplay -D plughw:0,0 /usr/share/sounds/alsa/Front_Right.wav
And this did not produce any sound:
aplay -D plughw:0,3 /usr/share/sounds/alsa/Front_Right.wav
I changed the defaults like this:
pactl set-card-profile 0 output:hdmi-stereo+input:analog-stereo
pactl set-default-sink alsa_output.pci-0000_00_0e.0.hdmi-stereo
pactl load-module module-alsa-sink device=hw:0,0 sink_name=analog
pactl load-module module-alsa-sink device=hw:0,3 sink_name=hdmi
pactl load-module module-combine-sink sink_name=combined slaves=analog,hdmi
pactl set-default-sink combined
This is the new state of my system (pactl info):
Server String: /var/run/pulse/native
Library Protocol Version: 32
Server Protocol Version: 32
Is Local: yes
Client Index: 39
Tile Size: 65472
User Name: pulse
Server Name: pulseaudio
Server Version: 10.0
Default Sample Specification: s16le 2ch 44100Hz
Default Channel Map: front-left,front-right
Default Sink: combined
Default Source: alsa_input.pci-0000_00_0e.0.analog-stereo
Cookie: a0ec:1eec
Now both of these produce sounds from the correct devices
aplay -D plughw:0,0 /usr/share/sounds/alsa/Front_Right.wav
aplay -D plughw:0,3 /usr/share/sounds/alsa/Front_Right.wav
I'm also using GStreamer to set up the pipeline and have updated my "out_sink" to "combined":
out_sink = gst_element_factory_make ("pulsesink", "out_sink");
g_assert_nonnull (out_sink);
g_object_set (out_sink, "device", "combined", NULL);
Now I try testing the audio from the actual app/device. Here I only hear sound from hw:0,0 when it is connected. The audio seems fine; there is no latency for the short period I've tested. When I disconnect hw:0,0, I hear delayed audio from hw:0,3. Not only is it delayed, but I don't hear all the audio: it sounds broken, like packets have been dropped. So my questions, I suppose, are:
1. Why is hw:0,0 prioritized over hw:0,3? Is it based on the priorities as listed in the Card 0 profiles? Is there a way to change that?
2. How do I ensure that the latency and the quality of the audio are the same through all the devices? When I have only one device as the sink there is no problem with the quality or the latency, so why has that changed and how do I fix it?
3. I'm fairly new to these concepts, so is there specific documentation I can check out for more info?
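One knob that may be relevant to the drift between slaves (argument names taken from the PulseAudio module documentation; the values here are guesses to be tuned, and behaviour should be verified against PulseAudio 10.0): module-combine-sink periodically re-measures slave latencies and resamples to keep them aligned, controlled by adjust_time and resample_method. A sketch of reloading the combined sink with those made explicit:

```shell
# Hypothetical variant of the combine step with explicit alignment options;
# adjust_time is in seconds, resample_method names a PulseAudio resampler.
CMD="pactl load-module module-combine-sink sink_name=combined \
  slaves=analog,hdmi adjust_time=5 resample_method=speex-float-5"

if command -v pactl >/dev/null 2>&1; then
    eval "$CMD" || echo "pactl load-module failed (is PulseAudio running?)"
else
    echo "pactl not available; would run: $CMD"
fi
```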
Zephyr (103 rep)
May 8, 2024, 09:18 PM • Last activity: May 10, 2024, 11:28 AM
0 votes
1 answer
2242 views
Why does rtpbin example from Gstreamer not work?
I am trying to run the rtpbin example on an Ubuntu 21.10 VirtualBox VM with GStreamer 1.18.5.
- I set up GStreamer and have been able to run many of the basic and playback tutorials.
- I've also read through a good portion of the Application Developer Manual.

Doing everything from C code seems straightforward, but the rtpbin example uses gst-launch-1.0 (covered in one of the basic tutorials). I couldn't get the rtpbin example to run without errors initially:
- ffenc_h263 and ffdec_h263 (WARNING: erroneous pipeline: no element "ffenc_h263"), so I replaced them with avenc_h263 and avdec_h263, respectively
- v4l2src was looking for a device that wasn't available in /dev, so I switched to videotestsrc.

To make sure these substitutions weren't a problem, I got rid of the RTSP and UDP stuff and checked by running:

gst-launch-1.0 videotestsrc ! videoconvert ! avenc_h263 ! rtph263pay \
    ! rtph263depay ! avdec_h263 ! xvimagesink

and saw the "bars" video. I also ran

gst-launch-1.0 audiotestsrc ! amrnbenc ! rtpamrpay ! rtpamrdepay ! amrnbdec ! alsasink

and heard that annoying test tone. Based on this, I think the issue is with UDP and RTSP. Running

gst-launch-1.0 rtpbin name=rtpbin \
    videotestsrc ! videoconvert ! avenc_h263 ! rtph263pay ! rtpbin.send_rtp_sink_0 \
    rtpbin.send_rtp_src_0 ! udpsink port=5000 \
    rtpbin.send_rtcp_src_0 ! udpsink port=5001 sync=false async=false \
    udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0 \
    audiotestsrc ! amrnbenc ! rtpamrpay ! rtpbin.send_rtp_sink_1 \
    rtpbin.send_rtp_src_1 ! udpsink port=5002 \
    rtpbin.send_rtcp_src_1 ! udpsink port=5003 sync=false async=false \
    udpsrc port=5007 ! rtpbin.recv_rtcp_sink_1

shows

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:59.0 / 99:99:99   # This is counting up

Then in a different window, I run

gst-launch-1.0 rtpbin name=rtpbin \
    udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-1996" \
        port=5000 ! rtpbin.recv_rtp_sink_0 \
    rtpbin. ! rtph263depay ! avdec_h263 ! xvimagesink \
    udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
    rtpbin.send_rtcp_src_0 ! udpsink port=5005 sync=false async=false \
    udpsrc caps="application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)AMR,encoding-params=(string)1,octet-align=(string)1" \
        port=5002 ! rtpbin.recv_rtp_sink_1 \
    rtpbin. ! rtpamrdepay ! amrnbdec ! alsasink \
    udpsrc port=5003 ! rtpbin.recv_rtcp_sink_1 \
    rtpbin.send_rtcp_src_1 ! udpsink port=5007 sync=false async=false

and see

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

I expect to see some UDP ports open (5000, 5001, 5002, 5003, 5005, and 5007). Sure enough, running netstat shows:

rtsp@rtsp-VirtualBox:~$ sudo netstat -apn | grep -w 500[0-9]
udp        0      0 0.0.0.0:5000    0.0.0.0:*    7076/gst-launch-1.0
udp        0      0 0.0.0.0:5001    0.0.0.0:*    7076/gst-launch-1.0
udp        0      0 0.0.0.0:5002    0.0.0.0:*    7076/gst-launch-1.0
udp        0      0 0.0.0.0:5003    0.0.0.0:*    7076/gst-launch-1.0
udp        0      0 0.0.0.0:5005    0.0.0.0:*    6862/gst-launch-1.0
udp        0      0 0.0.0.0:5007    0.0.0.0:*    6862/gst-launch-1.0

To make sure all the ports were working:
- I installed rtsp-test-server and VLC from snap
- I was able to stream video over the ports provided by rtsp-test-server.

I suppose this isn't a perfect test since TCP ports are being used instead of UDP, but it was easy to test so I gave it a shot. But I'm not seeing any video or hearing any sound. Can someone point out my error(s)?
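netstat shows that the sockets exist, but not that datagrams actually pass through them; a quick stdlib loopback check, independent of GStreamer (the port number is an arbitrary example from the pipelines above):

```python
import socket

def udp_loopback_ok(port: int = 0, payload: bytes = b"ping") -> bool:
    """Bind a UDP socket on 127.0.0.1 and check that one datagram gets through.

    port=0 picks any free port; pass e.g. 5000 to probe a specific one
    (which will fail if another process, such as gst-launch, holds it)."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        rx.bind(("127.0.0.1", port))
        rx.settimeout(2.0)
        tx.sendto(payload, ("127.0.0.1", rx.getsockname()[1]))
        data, _ = rx.recvfrom(1024)
        return data == payload
    except OSError:
        return False
    finally:
        rx.close()
        tx.close()

print(udp_loopback_ok(5000))
```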
MrMas (295 rep)
Dec 10, 2021, 08:52 PM • Last activity: Mar 14, 2024, 07:00 PM
0 votes
1 answer
454 views
Internal Data Stream Error Reading v4l2 Virtual Cameras with Gstreamer Pipeline
I'm using v4l2loopback to create 2 virtual cameras that I can stream an mp4 into, and simultaneously read from those cameras with a GStreamer pipeline, as if the mp4 were the video being recorded by those virtual cameras.

My GStreamer pipeline:

videomixer name=mix sink_0::xpos=0 sink_0::ypos=0 ! tee name=t ! videorate ! videoscale ! videoconvert ! video/x-raw,framerate=3/1,format=RGB,height=240,width=320 ! filesink location=/tmp/realtime_ai_video t. ! x264enc name=h264enc tune=zerolatency ! video/x-h264,stream-format=avc,alignment=au ! queue ! appsink name=mp4_appsink enable-last-sample=true emit-signals=true sync=false drop=false v4l2src device=/dev/video0 do-timestamp=true ! image/jpeg,height=240,width=320,framerate=3/1 ! jpegparse ! jpegdec ! video/x-raw,format=I420 ! mix.sink_0

The command I'm using to create the virtual cameras:

sudo modprobe v4l2loopback devices=2 card_label="Loopback-1,Loopback-2"

The commands I'm using to stream the mp4 into the virtual cameras:

ffmpeg -re -i r4.mp4 -map 0:v -f v4l2 /dev/video0
ffmpeg -re -i r4.mp4 -map 0:v -f v4l2 /dev/video1

The errors I get from GStreamer when running the pipeline to read from the virtual cameras:

WARN basesrc gstbasesrc.c:3127:gst_base_src_loop: error: Internal data stream error.
WARN basesrc gstbasesrc.c:3127:gst_base_src_loop: error: streaming stopped, reason not-negotiated (-4)
ERROR:store.cameras.ringbuffer_camera_writer:gst-stream-error-quark: Internal data stream error. (1)
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0

Versions:
- v4l2: v0.12.7
- GStreamer: 1.20.3
- ffmpeg: 4.4.2-0ubuntu0.22.04.1
- OS: Ubuntu 22.04 jammy

**The stipulation is that I'd like not to change the pipeline command unless it's a small addition or difference.** I want to exhaust every other possibility, for instance with ffmpeg or v4l2loopback, before editing the pipeline.
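"reason not-negotiated (-4)" means caps negotiation failed: the pipeline pins v4l2src to image/jpeg, while ffmpeg's -f v4l2 output writes raw frames unless told otherwise. Two small checks that leave the pipeline untouched (they assume the v4l-utils package is installed, and the mjpeg flag is a hypothetical ffmpeg-side tweak to try, not a confirmed fix):

```shell
# 1) Inspect what the loopback device is currently offering:
if command -v v4l2-ctl >/dev/null 2>&1; then
    v4l2-ctl --device=/dev/video0 --list-formats-ext || true
else
    echo "v4l2-ctl not installed (package v4l-utils)"
fi

# 2) Hypothetical feed-side change only: write MJPEG into the loopback
#    device so the pipeline's image/jpeg caps can match.
FFMPEG_CMD="ffmpeg -re -i r4.mp4 -map 0:v -vcodec mjpeg -f v4l2 /dev/video0"
echo "would run: $FFMPEG_CMD"
```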
dan178 (113 rep)
Sep 11, 2023, 07:02 PM • Last activity: Sep 12, 2023, 05:10 AM
0 votes
1 answer
135 views
Why are GStreamer packages always named "gstreamer1-*"?
Why are GStreamer packages always named gstreamer1-*? What is the 1? Why not just gstreamer-*?
linuxer (27 rep)
Aug 29, 2023, 04:22 PM • Last activity: Aug 30, 2023, 01:14 PM
1 vote
1 answer
1003 views
How do I resolve this GStreamer codec update fault?
I recently installed Kubuntu on a Lenovo IdeaPad laptop. It's functioning fine aside from a problem that came up with a recent update: after updating all packages, three remained:

    GStreamer Multimedia Codecs
    libgstreamer-plugins-bad1.0-0
    libgstreamer-plugins-base1.0-0

with the message:

    Package failed to install: Error while installing package: trying to overwrite '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstcamerabin.so', which is also in package gstreamer1.0-plugins-good

This is very strange. The only things I've added to the system have been Krita, Blender (as a direct download, nothing fidgety), and OpenToonz through Snap. Attempting sudo apt upgrade from the terminal gives me this output:

    Reading package lists... Done
    Building dependency tree... Done
    Reading state information... Done
    You might want to run 'apt --fix-broken install' to correct these.
    The following packages have unmet dependencies:
     gstreamer1.0-libav : Depends: libgstreamer-plugins-base1.0-0 (>= 1.22.3) but 1.22.1-1ubuntu1 is installed
     gstreamer1.0-plugins-base : Depends: libgstreamer-plugins-base1.0-0 (>= 1.22.3) but 1.22.1-1ubuntu1 is installed
                                 Breaks: gstreamer1.0-plugins-bad (= 1.22.3) but 1.22.1-1ubuntu1 is installed
     gstreamer1.0-x : Depends: libgstreamer-plugins-base1.0-0 (>= 1.22.3) but 1.22.1-1ubuntu1 is installed
     libgstreamer-gl1.0-0 : Depends: libgstreamer-plugins-base1.0-0 (>= 1.22.3) but 1.22.1-1ubuntu1 is installed
                            Breaks: libgstreamer-plugins-bad1.0-0 (< 1:1.16.0) but 1.22.1-1ubuntu1 is installed
    E: Unmet dependencies. Try 'apt --fix-broken install' with no packages (or specify a solution).

If I go ahead and use apt --fix-broken install, the problem persists, and no changes are made:

    Reading package lists... Done
    Building dependency tree... Done
    Reading state information... Done
    Correcting dependencies... Done
    The following packages were automatically installed and are no longer required:
      libbdplus0 libdca0 libdirectfb-1.7-7 libfaad2 liblrdf0 libmjpegutils-2.1-0 libmpeg2-4
      libmpeg2encpp-2.1-0 libmplex2-2.1-0 libneon27 libopenni2-0 libpocketsphinx3 libsidplay1v5
      libsphinxbase3 libssh-gcrypt-4 libvidstab1.1 linux-headers-6.2.0-20 linux-headers-6.2.0-20-generic
      linux-image-6.2.0-20-generic linux-modules-6.2.0-20-generic linux-modules-extra-6.2.0-20-generic
      pocketsphinx-en-us
    Use 'sudo apt autoremove' to remove them.
    The following additional packages will be installed:
      gstreamer1.0-plugins-bad libgstreamer-plugins-bad1.0-0 libgstreamer-plugins-base1.0-0
    Suggested packages:
      frei0r-plugins libvisual-0.4-plugins
    The following packages will be upgraded:
      gstreamer1.0-plugins-bad libgstreamer-plugins-bad1.0-0 libgstreamer-plugins-base1.0-0
    3 upgraded, 0 newly installed, 0 to remove and 2 not upgraded.
    148 not fully installed or removed.
    Need to get 0 B/4,619 kB of archives.
    After this operation, 700 kB of additional disk space will be used.
    Do you want to continue? [Y/n] y
    (Reading database ... 261983 files and directories currently installed.)
    Preparing to unpack .../gstreamer1.0-plugins-bad_1%3a1.22.3-dmo1+deb12u1_amd64.deb ...
    Unpacking gstreamer1.0-plugins-bad:amd64 (1:1.22.3-dmo1+deb12u1) over (1.22.1-1ubuntu1) ...
    dpkg: error processing archive /var/cache/apt/archives/gstreamer1.0-plugins-bad_1%3a1.22.3-dmo1+deb12u1_amd64.deb (--unpack):
     trying to overwrite '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstcamerabin.so', which is also in package gstreamer1.0-plugins-good:amd64 1.22.1-1ubuntu1
    Preparing to unpack .../libgstreamer-plugins-bad1.0-0_1%3a1.22.3-dmo1+deb12u1_amd64.deb ...
    Unpacking libgstreamer-plugins-bad1.0-0:amd64 (1:1.22.3-dmo1+deb12u1) over (1.22.1-1ubuntu1) ...
    dpkg: error processing archive /var/cache/apt/archives/libgstreamer-plugins-bad1.0-0_1%3a1.22.3-dmo1+deb12u1_amd64.deb (--unpack):
     trying to overwrite '/usr/lib/x86_64-linux-gnu/libgstbasecamerabinsrc-1.0.so.0', which is also in package libgstreamer-plugins-good1.0-0:amd64 1.22.1-1ubuntu1
    dpkg: regarding .../libgstreamer-plugins-base1.0-0_1.22.3-dmo1+deb12u2_amd64.deb containing libgstreamer-plugins-base1.0-0:amd64:
     libgstreamer-plugins-base1.0-0:amd64 conflicts with libgstreamer-plugins-bad1.0-0 (<< 1:1.16.0)
      libgstreamer-plugins-bad1.0-0:amd64 (version 1.22.1-1ubuntu1) is present and installed.
    dpkg: error processing archive /var/cache/apt/archives/libgstreamer-plugins-base1.0-0_1.22.3-dmo1+deb12u2_amd64.deb (--unpack):
     conflicting packages - not installing libgstreamer-plugins-base1.0-0:amd64
    Errors were encountered while processing:
     /var/cache/apt/archives/gstreamer1.0-plugins-bad_1%3a1.22.3-dmo1+deb12u1_amd64.deb
     /var/cache/apt/archives/libgstreamer-plugins-bad1.0-0_1%3a1.22.3-dmo1+deb12u1_amd64.deb
     /var/cache/apt/archives/libgstreamer-plugins-base1.0-0_1.22.3-dmo1+deb12u2_amd64.deb
    E: Sub-process /usr/bin/dpkg returned an error code (1)

I'm very hesitant to run an autoremove with a broken package, and have held off on that. The bigger problem is that I can't seem to install or remove anything with this broken-package issue. I would be all for simply removing and, strictly if necessary, reinstalling whatever uses GStreamer from a stable system, but apparently I can't even do that. Can someone kindly help me correct this strange dependency issue?
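For what it's worth, the version strings in the failing packages (1:1.22.3-dmo1+deb12u1) carry the deb-multimedia.org suffix and a Debian 12 tag, while the installed versions (1.22.1-1ubuntu1) are Ubuntu's own, so a foreign Debian repository appears to be mixed into this Kubuntu system. A first diagnostic sketch (standard apt commands, not a guaranteed fix):

```shell
# Show every candidate version of the conflicting packages and which
# repository each comes from; a "dmo" version served from a non-Ubuntu
# mirror would confirm the mixed-repo theory.
apt-cache policy libgstreamer-plugins-base1.0-0 gstreamer1.0-plugins-bad

# Look for a deb-multimedia (or other non-Ubuntu) entry in the apt sources.
grep -rn "deb-multimedia" /etc/apt/sources.list /etc/apt/sources.list.d/
```

If such an entry turns up, removing it and downgrading the affected packages back to their 1.22.1-1ubuntu1 versions would be the usual next step.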
Michael Macha (323 rep)
Jul 15, 2023, 09:38 PM • Last activity: Jul 26, 2023, 04:37 PM
7 votes
1 answers
3932 views
Measuring latency in a GStreamer pipeline
I'm using an RTP pipeline to stream video from a camera over a local network. The pipeline is: camera > h264enc > RTP > UDP > receiver_and_display. How can I find out how the latency is composed?
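One option worth knowing about: GStreamer ships a built-in latency tracer that logs the latency each element reports, which helps break the total down per stage. A sketch using a test source in place of the actual camera (the real pipeline and element names would differ):

```shell
# Enable the latency tracer and capture its log lines; videotestsrc and
# x264enc stand in for the camera and encoder here.
GST_TRACERS="latency" GST_DEBUG="GST_TRACER:7" \
  gst-launch-1.0 videotestsrc num-buffers=100 ! x264enc tune=zerolatency ! \
  rtph264pay ! fakesink 2> latency.log

# Each "latency" trace line reports the time a buffer spent in the pipeline.
grep -o 'latency.*' latency.log | head
```

Running the same tracer on the receiving pipeline gives the other half of the path; network transit has to be measured separately (for example with synchronized clocks or a timestamp overlay burned into the video).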
sonium (173 rep)
Apr 11, 2016, 07:08 AM • Last activity: Dec 14, 2022, 11:32 AM
2 votes
1 answers
742 views
Got EOS from element "pipeline0" on gst fbdevsink
I'm trying to forward video file to the framebuffer on my device that has no X. I'm using `gstreamer` with `fbdevsink` plugin. * When I test it with ``` gst-launch-1.0 videotestsrc ! fbdevsink ``` it works perfectly. * However when I try to open any video file on my device with command ``` gst-launc...
I'm trying to forward a video file to the framebuffer on my device, which has no X. I'm using gstreamer with the fbdevsink plugin. * When I test it with
gst-launch-1.0 videotestsrc ! fbdevsink
it works perfectly. * However, when I try to open any video file on my device with the command
gst-launch-1.0 filesrc location=right_top1.mp4 ! fbdevsink
it stops working immediately with output
Setting pipeline to PAUSED ...
    Pipeline is PREROLLING ...
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    Got EOS from element "pipeline0".
    Execution ended after 0:00:00.006988697
    Setting pipeline to NULL ...
    Freeing pipeline ...
I cannot figure out what is going on, because even when I add debugging (-v --gst-debug-level=2) the output is the same. If it matters, I'm working on an Nvidia Jetson Nano with a Yocto-built OS. Do you have any idea how to resolve, or at least debug, this?
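A possible explanation, for what it's worth: fbdevsink consumes raw video frames, and in the failing pipeline nothing demuxes or decodes the mp4, which would account for the stream ending almost immediately. A sketch that decodes first (assuming the needed decoder plugins are present on the device; decodebin picks the demuxer and decoder automatically):

```shell
# videoconvert adapts the decoded frames to a format fbdevsink accepts.
gst-launch-1.0 filesrc location=right_top1.mp4 ! decodebin ! videoconvert ! fbdevsink
```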
przemoch (161 rep)
Oct 26, 2021, 08:40 AM • Last activity: Oct 28, 2021, 07:32 AM
0 votes
1 answers
1147 views
Soundconverter error encoding to mp3
I am running Ubuntu 20.04 on a Lenovo Thinkpad X1 Tablet. A few years ago I wrote a bash script using soundconverter to transcode some of my music to mp3 files when I need to use a player that only has mp3 capability (like my car and my swimming player). I tried to use my script recently and got the...
I am running Ubuntu 20.04 on a Lenovo Thinkpad X1 Tablet. A few years ago I wrote a bash script using soundconverter to transcode some of my music to mp3 files when I need to use a player that only has mp3 capability (like my car and my swimming player). I tried to use my script recently and got the following error
faac gstreamer element not found
I did some research and found that the faac plugin is not included in the Ubuntu 20.04 packages gstreamer1.0-plugins-bad or gstreamer1.0-plugins-ugly. But I also found that there is a lame (mp3 library) gstreamer plugin which is installed. Additionally, I found that running soundconverter in GUI mode could transcode to mp3 without problems. So I have three possible solutions, but I don't know how to pursue any of them:
1) If soundconverter can transcode to mp3 in the GUI, I am guessing there is some option that will enable this in batch mode. Does anyone know how?
2) Is there a way to ask soundconverter to use gstreamer's lame plugin rather than the faac plugin to transcode to mp3?
3) Does anyone know how to install the gstreamer faac plugin on Ubuntu 20.04?
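On option 2: mp3 encoding in GStreamer 1.0 is normally done by the lamemp3enc element (shipped in gstreamer1.0-plugins-good); faac is an AAC encoder, so the error message is a little misleading. As a sanity check that the installed lame plugin can transcode on its own, outside soundconverter, something like this sketch should work (file names are placeholders):

```shell
# decodebin handles whatever the input format is; id3v2mux wraps the mp3
# stream with an ID3 tag so players recognize the file.
gst-launch-1.0 filesrc location=input.flac ! decodebin ! audioconvert ! \
  lamemp3enc target=quality quality=2 ! id3v2mux ! filesink location=output.mp3
```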
brett stevens (101 rep)
Mar 7, 2021, 03:25 PM • Last activity: Jul 3, 2021, 06:11 PM
0 votes
0 answers
567 views
ffplay fails to play certain files
There are certain files that fail to play when I try to play them with `ffplay`. Here is the error they throw when trying to play, ``` ❯ ffplay /usr/share/lmms/samples/instruments/harpsichord01.ogg ffplay version n4.4 Copyright (c) 2003-2021 the FFmpeg developers built with gcc 10.2.0 (GCC) configu...
There are certain files that fail to play when I try to play them with ffplay. Here is the error they throw when trying to play one of them:
❯ ffplay /usr/share/lmms/samples/instruments/harpsichord01.ogg
ffplay version n4.4 Copyright (c) 2003-2021 the FFmpeg developers
  built with gcc 10.2.0 (GCC)
  configuration: --prefix=/usr --disable-debug --disable-static --disable-stripping --enable-amf --enable-avisynth --enable-cuda-llvm --enable-lto --enable-fontconfig
--enable-gmp --enable-gnutls --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libdrm --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libiec61883 --enable-libjack --enable-libmfx --enable-libmodplug --enable-libmp3lame --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librav1e --enable-librsvg --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-libzimg --enable-nvdec --enable-nvenc --enable-shared --enable-version3
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
[wav @ 0x7f2660000c80] Could not find codec parameters for stream 0 (Audio: none (Og / 0x674F), 44100 Hz, 1 channels, 144 kb/s): unknown codec
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
Input #0, wav, from '/usr/share/lmms/samples/instruments/harpsichord01.ogg':
  Duration: 00:00:02.64, bitrate: 115 kb/s
  Stream #0:0: Audio: none (Og / 0x674F), 44100 Hz, 1 channels, 144 kb/s
No decoder could be found for codec none
Failed to open file '/usr/share/lmms/samples/instruments/harpsichord01.ogg' or configure filtergraph
    nan    :  0.000 fd=   0 aq=    0KB vq=    0KB sq=    0B f=0/0
This file is from the DAW LMMS, in case anyone wants to try this particular file: just install lmms and it should be in the same path. Apart from ffplay I have also tried playing this file with gst-play-1.0, aplay, and mpv. They all fail to play it; however, Audacity and some players like Audacious play this file fine. I'm wondering what is wrong here.
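One reading of that output: libavformat identified a wav container whose codec tag is 0x674F ("Og"), i.e. the .ogg file looks like a WAV wrapper around an Ogg stream, which ffplay's wav demuxer has no decoder for, while editors such as Audacity scan the content instead. A sketch of how one might test that theory by peeling the wrapper off (shown here on a synthetic stand-in file; point f at the real sample instead):

```shell
# Locate the first "OggS" capture pattern and copy everything from there on.
f=sample.bin
printf 'RIFF....WAVEfmt fake-wav-header-bytesOggS-the-real-ogg-payload' > "$f"

off=$(grep -abo OggS "$f" | head -n1 | cut -d: -f1)  # byte offset of "OggS"
tail -c +$((off + 1)) "$f" > extracted.ogg           # tail -c is 1-based
head -c 4 extracted.ogg                              # first bytes of the result
```

If extracted.ogg then plays in ffplay, the WAV wrapper was the whole problem.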
apoorv569 (51 rep)
May 3, 2021, 08:02 AM
1 votes
1 answers
682 views
how do I set properties of a v4l2loopback device and make them visible to my web browser?
I've created a couple of v4l2loopback devices for use as virtual webcams, and have been able to get Chrome to recognize them via `navigator.mediaDevices.enumerateDevices()`. I've also been able to construct gstreamer pipelines to send video and image data to these virtual webcams. what I haven't bee...
I've created a couple of v4l2loopback devices for use as virtual webcams, and have been able to get Chrome to recognize them via navigator.mediaDevices.enumerateDevices(). I've also been able to construct gstreamer pipelines to send video and image data to these virtual webcams. What I haven't been able to do is designate any of these devices as front-facing, as reported by InputDeviceInfo.getCapabilities(). Is this possible to do with v4l2loopback parameters? Is it possible by configuring my gstreamer pipeline somehow?
Dan O (133 rep)
Feb 22, 2021, 08:56 PM • Last activity: Feb 24, 2021, 03:06 PM
1 votes
1 answers
1584 views
How to convert stream data which come over udp to video device?
I'm trying to share my computer's camera with the remote computer. In order to do this, I wanted to share my own computer's camera with the udp port(stream), take the stream on the remote computer and put it on the virtual camera. So I thought I could use my own camera on platforms like google meet...
I'm trying to share my computer's camera with a remote computer. To do this, I wanted to stream my own computer's camera to a UDP port, receive the stream on the remote computer, and feed it into a virtual camera. That way I could use my own camera on platforms like Google Meet from the remote computer. My application steps are as follows. I started the camera stream from the local pc (/dev/video0) using gstreamer's udpsink, and I could receive the stream data on the remote pc like below:
    gst-launch-1.0 -e -v udpsrc port=5001 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
I created a virtual camera on the remote pc like below:
    sudo modprobe v4l2loopback exclusive_caps=1 video_nr=3 card_label="VirtualCAM"
At this point, I want to send this stream data to the video device. I tried to use v4l2sink device=/dev/video5 instead of autovideosink, but I got some errors. Do you have any suggestions?
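Two details that may matter here, offered as a sketch rather than a verified fix: the loopback device was created with video_nr=3, so it should appear as /dev/video3 rather than /dev/video5, and v4l2sink usually wants a videoconvert in front of it so the decoded JPEG frames can be negotiated into a format the loopback device accepts:

```shell
# Receive the RTP/JPEG stream and write raw frames into the v4l2loopback device.
gst-launch-1.0 udpsrc port=5001 ! application/x-rtp,encoding-name=JPEG,payload=26 ! \
  rtpjpegdepay ! jpegdec ! videoconvert ! v4l2sink device=/dev/video3
```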
Purgoufr (131 rep)
Jan 23, 2021, 11:45 AM • Last activity: Feb 20, 2021, 11:35 AM
0 votes
1 answers
865 views
ROS Camera Calibration GStreamer Error
I have a VM with ROS Noetic installed and I want to run a camera calibration on my USB Webcam. I followed the following guide and for the launcher, I put the following code: ``` ``` I try to launch it and get the following response: ``` ... logging to /home/dragonros/.ros/log/e62adeac-5d96-11eb-b879...
I have a VM with ROS Noetic installed and I want to run a camera calibration on my USB Webcam. I followed the following guide and for the launcher, I put the following code:
I try to launch it and get the following response:
... logging to /home/dragonros/.ros/log/e62adeac-5d96-11eb-b879-09066182755f/roslaunch-ubuntu-42560.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

invalid ROS_HOSTNAME (an empty string)
invalid ROS_HOSTNAME (an empty string)
started roslaunch server http://ubuntu:35479/ 

SUMMARY
========

PARAMETERS
 * /left/gscam_driver_v4l/camera_info_url: package://gscam/e...
 * /left/gscam_driver_v4l/camera_name: default
 * /left/gscam_driver_v4l/frame_id: /v4l_frame_l
 * /left/gscam_driver_v4l/gscam_config: v4l2src device=/d...
 * /left/gscam_driver_v4l/sync_sink: True
 * /rosdistro: noetic
 * /rosversion: 1.15.9

NODES
  /left/
    gscam_driver_v4l (gscam/gscam)

ROS_MASTER_URI=http://localhost:11311

process[left/gscam_driver_v4l-1]: started with pid 
[ WARN] [1611445425.340701580]: invalid ROS_HOSTNAME (an empty string)
[ INFO] [1611445425.361034554]: Using gstreamer config from rosparam: "v4l2src device=/dev/video0 ! video/x-raw-rgb,framerate=30/1 ! ffmpegcolorspace"
[ INFO] [1611445425.366515919]: camera calibration URL: package://gscam/examples/uncalibrated_parameters.ini
[ INFO] [1611445425.366725052]: Loaded camera calibration from package://gscam/examples/uncalibrated_parameters.ini

(gscam:42574): GStreamer-WARNING **: 15:43:45.455: 0.10-style raw video caps are being created. Should be video/x-raw,format=(string).. now.
[ INFO] [1611445425.455679084]: Time offset: 1611418553.717
[ INFO] [1611445425.555216304]: Publishing stream...
[ INFO] [1611445425.555412447]: Started stream.
[ERROR] [1611445425.555446013]: Could not get gstreamer sample.
[ INFO] [1611445425.555455586]: Stopping gstreamer pipeline...
[ INFO] [1611445425.557306732]: GStreamer stream stopped!
[ INFO] [1611445425.557374812]: Cleaning up stream and exiting...
[left/gscam_driver_v4l-1] process has finished cleanly
log file: /home/dragonros/.ros/log/e62adeac-5d96-11eb-b879-09066182755f/left-gscam_driver_v4l-1*.log
all processes on machine have died, roslaunch will exit
shutting down processing monitor...
... shutting down processing monitor complete
done
It says that it could not get a gstreamer sample. Also, for some reason, when I plug my camera into the computer, it creates both /dev/video0 and /dev/video1. This is also what I get if I run gscam gscam. What should I do?
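One clue in the log: the GStreamer warning about "0.10-style raw video caps" suggests the gscam_config string is written for the old GStreamer 0.10 API; video/x-raw-rgb and ffmpegcolorspace no longer exist under GStreamer 1.0 (they became video/x-raw and videoconvert). A sketch of the updated pipeline and a way to smoke-test it outside ROS (ximagesink is my stand-in for gscam's internal sink, not part of the real config):

```shell
# 1.0-style equivalent of "v4l2src ... video/x-raw-rgb ! ffmpegcolorspace";
# try it standalone before putting the part up to videoconvert back into
# the gscam_config parameter.
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1 ! \
  videoconvert ! ximagesink
```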
DragonflyRobotics (103 rep)
Jan 24, 2021, 07:16 PM • Last activity: Feb 8, 2021, 04:01 PM
Showing page 1 of 20 total questions