A friend of mine and I want to play point and click adventure games together... but we are about 500km from each other.
I thought, "no problem": one of the two controls the game using glc and sends the game output (screen and audio) to the
other via netcat.
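A minimal sketch of what that netcat leg might look like, assuming a raw video fifo like the one below and a reachable host (`friend.example.org` and port 9000 are hypothetical placeholders; netcat flags vary between the OpenBSD and traditional implementations):

```shell
# Receiver side: listen on TCP port 9000 and hand whatever arrives to mplayer.
# (Traditional netcat wants `nc -l -p 9000` instead.)
nc -l 9000 | mplayer -

# Sender side: push the raw stream from the fifo across the network.
cat video | nc friend.example.org 9000
```

This only works if the player on the receiving end can make sense of the raw stream, which is exactly the demuxing problem discussed below.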
For this reason I wanted to feed a gstreamer pipeline with glc output. But I am missing something...
First of all I tried with mplayer to see if it was feasible.
First shell: mplayer reads the stream.
$ mkfifo mux
$ mkfifo video
$ mplayer -identify video
Second shell: glc converts from its internal format.
$ glc-play mux -t -y 1 -o video
Third shell: glc launches the application and begins the screencast.
$ glc-capture --disable-audio -s -o mux glxgears
mplayer shows the glxgears output just fine, so using the mplayer -identify output I made a minimal GStreamer pipeline...
$ gst-launch-1.0 -tev filesrc location=video \
! videoparse height=300 width=300 framerate=30 \
! autovideosink
But it does not work: the gears appear distorted and after a while disappear altogether...
I also tried the y4mdec element instead of videoparse, but no luck.
Does anyone have experience with this? Am I doing something wrong, or is it a bug in one of the two programs?
Thanks
Last edited by ezzetabi (2012-11-28 09:47:10)
I stream stuff quite often.
I'd use FFmpeg for that, which can output to nearly anything.
I think you can stream over RTP for a very basic link: http://ffmpeg.org/trac/ffmpeg/wiki/Stre … tstreaming
I personally use this command:
ffmpeg \
-f x11grab -s 1920x1080 -r 30 -show_region 1 -i :0 \
-f pulse -ac 2 -ar 44100 -i default \
-isync \
-vcodec libx264 -b:v 700k -tune zerolatency -preset fast -pix_fmt yuv420p -threads 4 -fflags nobuffer -s 960x560 \
-acodec libmp3lame -ab 192k \
-f flv rtmp://myserver/app/live # << Change this; use the protocol you want. You can use "tcp://" or "udp://" here.
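On the viewer's end, something along these lines should play that RTMP stream with minimal buffering (a sketch: `myserver/app/live` just mirrors the publish URL above, and `ffplay` ships with FFmpeg):

```shell
# Play the published RTMP stream; -fflags nobuffer reduces input buffering
# to keep the latency as low as the encoder settings allow.
ffplay -fflags nobuffer rtmp://myserver/app/live
```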
I tried using GLC for the capture with pipes like that, but it always crashed after a while, and it ate a lot of CPU... It would be really nice if GLC could output raw video so another program could take care of it, but I never found a way. I got the same mess as you. Also, you can't resize the frames, so you will probably end up using a lot of bandwidth too...
Extra: I use nginx with the rtmp module as the server, and JW Player as a Flash player so my friends can play it easily. The only drawback so far is the latency: around 1.5 seconds, which should still work fine for a point-and-click anyway.
Last edited by Max-P (2012-11-30 04:12:01)
It seems like a nice solution. Do you know a good reference site on how to configure nginx with its rtmp module to share the broadcast?
Sorry, I don't have any docs for it. The only documentation I found is the official wiki on GitHub. Here's my configuration:
rtmp_auto_push on; # VERY IMPORTANT if nginx runs more than one worker process
rtmp {
server {
listen 0.0.0.0:1935;
ping 0;
ping_timeout 0;
timeout 1d;
application mystream {
allow publish 10.0.5.103; # Only that IP can publish to the server
allow play all; # Anyone can play the stream
live on; # It's live
hls off; # Never understood that; since it's disabled you can remove the 3 hls_* lines.
hls_path /tmp/nginx_max-p/;
hls_fragment 1s;
}
}
}
It's pretty straightforward.
But since you only want to feed a single person, using udp or tcp directly (see the FFmpeg wiki link in my previous post) seems more than enough.
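For the one-viewer case, a direct UDP link could look like this sketch (hypothetical host `friend.example.org` and port 1234; no server in between):

```shell
# Sender: grab the screen and push an MPEG-TS stream straight to the friend.
ffmpeg -f x11grab -s 1920x1080 -r 30 -i :0 \
       -vcodec libx264 -tune zerolatency -preset fast -pix_fmt yuv420p \
       -f mpegts udp://friend.example.org:1234

# Receiver: bind the port and play, with buffering kept to a minimum.
ffplay -fflags nobuffer udp://0.0.0.0:1234
```

MPEG-TS is the usual container choice here because it tolerates starting playback mid-stream and losing the odd UDP packet.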