
#1 2012-02-24 21:51:48

KejPi
Member
Registered: 2011-06-17
Posts: 40

[SOLVED] How to stream audio output over wifi?

Hello,
I have following problem. I would like to listen to the music from my laptop on my home Onkyo AV receiver (amplifier). Both laptop and receiver are connected on the same network, laptop via wifi, receiver using LAN. Receiver is capable to play internet radio streams in various format - e.g. ogg or mp3.

My idea is to produce ogg stream on laptop and listen to this stream through amplifier. I was searching for this solution a found something that has always some disadvantage, like MPD + MPC (can play only from database of local files, not from jamendo or other network services), DLNA can play only local files, Icecast + Ices can play from predefined playlist.

My idea is to have something independent - something what takes audio data normally played on soundcard and feeds it to network stream, something what can work with any player - audio or video, or even with desktop notifications.

Does anybody here know how to do it?

Thanks!

Last edited by KejPi (2012-03-14 20:18:28)

Offline

#2 2012-02-24 21:53:27

lucke
Member
From: Poland
Registered: 2004-11-30
Posts: 4,018

Re: [SOLVED] How to stream audio output over wifi?

I believe you could accomplish that with pulseaudio.
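In case it helps, a rough sketch of one pulseaudio approach (the module names are real PulseAudio modules, but the sink name `rtp_out` and the parameters are placeholders, not a tested setup) would be to add something like this to /etc/pulse/default.pa:

```
# Create a virtual sink that applications can play into
load-module module-null-sink sink_name=rtp_out sink_properties=device.description=RTP-Out
# Stream whatever reaches that sink over RTP
load-module module-rtp-send source=rtp_out.monitor
```

Applications would then be directed at the `rtp_out` sink (e.g. with pavucontrol) and the receiver pointed at the RTP stream.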

Offline

#3 2012-02-24 22:11:54

6ng4n
Member
Registered: 2012-02-07
Posts: 84

Re: [SOLVED] How to stream audio output over wifi?

Use darkice with jack.

Install icecast, darkice and jack2,

and create a cfg file:

[general]
duration        = 0         # duration of encoding, in seconds. 0 means forever
bufferSecs      = 5         # size of internal slip buffer, in seconds
reconnect       = yes       # reconnect to the server(s) if disconnected

# this section describes the audio input that will be streamed
[input]
device          = jack      # use JACK as the input
sampleRate      = 44100     # sample rate in Hz; must match JACK's rate
bitsPerSample   = 16        # bits per sample
channel         = 2

[icecast2-0]
format          = ogg
bitrateMode     = cbr 
bitrate         = 128
quality         = 0.8
server          = localhost
mountPoint      = myradio   # URL path of the stream, e.g. localhost:8000/myradio
port            = 8000
password        = your-icecast-password
name            = My Stream
description     = Example Stream
url             = http://example.org
genre           = Stream
public          = yes

Start icecast, start jack (from the CLI or with QjackCtl), and then start darkice:

darkice -c mydarkicesettings.cfg

You're ready: connect any JACK-capable application via QJackCtl, e.g. VLC (Tools->Preferences->Audio->set the output module to JACK).

The jack2 and icecast packages can be installed with pacman; darkice is in the AUR.
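Put together, the whole startup might look something like this in a shell (the icecast config path and the JACK options are assumptions; adjust them to your installation):

```shell
# Start the icecast server in the background (config path is an assumption)
icecast -b -c /etc/icecast.xml
# Start JACK on the default ALSA device; the rate must match the darkice config
jackd -d alsa -r 44100 &
# Start the encoder, which connects to JACK and feeds icecast
darkice -c mydarkicesettings.cfg
```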

For more info:
http://wiki.radioreference.com/index.ph … tu_Darkice

Last edited by 6ng4n (2012-02-24 22:15:51)

Offline

#4 2012-02-25 22:24:43

KejPi
Member
Registered: 2011-06-17
Posts: 40

Re: [SOLVED] How to stream audio output over wifi?

Thanks for your answer and the hint about darkice. Honestly, I do not like JACK much, as support for it is quite low; as far as I know, chromium does not support it. I did a trial with darkice + pulseaudio that was more or less successful, but it still seems to me that pulseaudio is far from stable and widely supported.

The problem is that there are 3 cases in general:
1) KDE4 apps going through phonon (somehow supported by PA but not by JACK)
2) Some apps directly supporting JACK or PA or both, like mplayer
3) Some other apps like java, flash, chromium, etc. that are somehow a mystery to me

And making them all work together using one approach seems to be mission impossible at the moment :-( It works well without JACK or PA, but then without any possibility of streaming.

Offline

#5 2012-02-27 05:06:26

ZekeSulastin
Member
Registered: 2010-09-20
Posts: 266

Re: [SOLVED] How to stream audio output over wifi?

You never indicated HOW it was "more or less successful but unsupported".  FWIW, just about everything these days supports Pulseaudio, either directly (if only you didn't need to stream it as mp3 or whatever; I use direct pulse-over-network quite a bit and it's quite nice in that regard) or through the ALSA/OSS wrapper.  We can't help you fix it if we don't know what's wrong :S
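For reference, the direct pulse-over-network setup mentioned here is roughly the following (a sketch; the IP addresses are placeholders and the port must be allowed through your firewall):

```shell
# On the machine attached to the speakers: accept pulse clients from the LAN
pactl load-module module-native-protocol-tcp auth-ip-acl=192.168.1.0/24
# On the laptop: point any pulse-aware client at that machine
PULSE_SERVER=192.168.1.50 mplayer track.ogg
```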

Last edited by ZekeSulastin (2012-02-27 05:09:58)

Offline

#6 2012-02-27 06:17:56

ngoonee
Forum Fellow
From: Between Thailand and Singapore
Registered: 2009-03-17
Posts: 7,354

Re: [SOLVED] How to stream audio output over wifi?

KejPi wrote:

3) Some other apps like java, flash, chromium, etc. that are somehow a mystery to me

You mean ALSA apps. The pulseaudio wiki has instructions on how to make those (and everything else) work with pulse. AFAIK there's no equivalent one-stop resource for JACK, but it's the same principle of needing a compatibility layer for ALSA.

KejPi wrote:

And making them all work together using one approach seems to be mission impossible at the moment :-( It works well without JACK or PA, but then without any possibility of streaming.

It's entirely possible; my pulseaudio setup has working java, flash, AND chromium. I can also switch to a jack setup fairly quickly (and have, before), but I prefer pulseaudio for my use.
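For what it's worth, the compatibility layer for plain ALSA apps is usually just the `pulse` plugin from alsa-plugins; a minimal ~/.asoundrc routing everything through pulseaudio looks roughly like this (a sketch of the approach described on the pulseaudio wiki):

```
# ~/.asoundrc: route ALSA applications through PulseAudio
pcm.!default {
    type pulse
}
ctl.!default {
    type pulse
}
```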


Allan-Volunteer on the (topic being discussed) mailing lists. You never get the attention of the people who matter on the forums.
jasonwryan-Installing Arch is a measure of your literacy. Maintaining Arch is a measure of your diligence. Contributing to Arch is a measure of your competence.
Griemak-Bleeding edge, not bleeding flat. Edge denotes falls will occur from time to time. Bring your own parachute.

Offline

#7 2012-03-01 16:11:05

6ng4n
Member
Registered: 2012-02-07
Posts: 84

Re: [SOLVED] How to stream audio output over wifi?

You can use the Pulseaudio JACK plugin.

To enable it, add these lines to /etc/pulse/default.pa:

load-module module-jack-source
load-module module-jack-sink

and connect the darkice process to the Pulseaudio JACK input via QjackCtl,

or just set the device setting in your darkice config to "pulse":

...bla bla...
[input]
device          = pulse        # Set pulseaudio as input
...foo bar...

Last edited by 6ng4n (2012-03-01 16:12:40)

Offline

#8 2012-03-05 14:00:08

KejPi
Member
Registered: 2011-06-17
Posts: 40

Re: [SOLVED] How to stream audio output over wifi?

Sorry for the late response, I was quite busy with other things. At the moment I have removed pulse from my system, but I will try again. To be more specific: one of the problems I noticed was that flash was playing music (youtube in chromium) and amarok was playing music, but not simultaneously; only one or the other. Then kmix lost almost all its controls and looked very strange.
JACK is not the right direction for me. I think PA is better at the moment, if I manage to make it run smoothly (that is, without noticing this monster in the background) like my current solution without any sound server. My impression from one day of trials was that without the streaming feature the benefit is more or less nothing, just more problems.

Offline

#9 2012-03-05 14:14:28

kokoko3k
Member
Registered: 2008-11-14
Posts: 2,390

Re: [SOLVED] How to stream audio output over wifi?

See also this post:
https://bbs.archlinux.org/viewtopic.php?id=134207

The trick is to output to an ALSA loopback device and have ffmpeg encode it and stream it over RTP.

With a custom .asoundrc you can even send any stream to both the real device AND the loopback one:

#modprobe snd-aloop first

pcm.!default {
  type asym
  playback.pcm "LoopAndReal"
  #capture.pcm "looprec"
  capture.pcm "hw:0,0"
}

#"type plug" is mandatory to convert the sample type
pcm.looprec {
  type plug
  slave {
    rate 44100
    pcm "hw:Loopback,1,0"
    channels 2
    format S16_LE
  }
  hint {
    show on
    description "default rec koko"
  }
}



pcm.LoopAndReal {
  type plug
  slave.pcm mdev
  route_policy "duplicate"
}


pcm.mdev {
  type multi
  slaves.a.pcm pcm.MixReale
  slaves.a.channels 2
  slaves.b.pcm pcm.MixLoopback
  slaves.b.channels 2
  bindings.0.slave a
  bindings.0.channel 0
  bindings.1.slave a
  bindings.1.channel 1
  bindings.2.slave b
  bindings.2.channel 0
  bindings.3.slave b
  bindings.3.channel 1
}

pcm.MixReale {
  type dmix
  ipc_key 1024
  slave {
    pcm "hw:0,0"
    rate 48000
    #rate 44100
    periods 128
    period_time 0
    period_size 1024 # must be power of 2
    buffer_size 8192
  }
}

pcm.MixLoopback {
  type dmix
  ipc_key 1025
  slave {
    pcm "hw:Loopback,0,0"
    rate 48000
    #rate 44100
    periods 128
    period_time 0
    period_size 1024 # must be power of 2
    buffer_size 8192
  }
}

The server is the device that plays the 'real' media files; the client is the one that plays them over the network.

On the server, launch the encoding/streaming process (replace clientip):

ffmpeg -f alsa -ac 2 -i hw:Loopback,1,0 -acodec libmp3lame -b 128k -f rtp rtp://clientip:6000

On the client, create a text file named stream.sdp.
You can fill in this file using information from the ffmpeg output above.

o=- 0 0 IN IP4 clientip
c=IN IP4 clientip
m=audio 6000 RTP/AVP 14

Then play it on the client:

mplayer stream.sdp

Last edited by kokoko3k (2012-03-05 14:32:17)


Help me to improve ssh-rdp !
Retroarch User? Try my koko-aio shader !

Offline

#10 2012-03-14 20:16:10

KejPi
Member
Registered: 2011-06-17
Posts: 40

Re: [SOLVED] How to stream audio output over wifi?

Thanks to all of you for your hints. Finally I have decided on an ALSA-based solution that simply works smoothly for all applications in the background, and it is there when I need it; in other words, it is something that doesn't bother me when I do not need it. I use the ALSA loopback with ffmpeg and ffserver for streaming in MP3 format. This is my final solution:

You need the following:

  • alsa

  • ffmpeg

  • if you use KDE4, you also need the phonon-gstreamer backend, as in my case phonon-vlc didn't work (the sound was heavily distorted)

Procedure:

  1. Load the snd-aloop module (2 loopback substreams are enough)

    modprobe snd-aloop pcm_substreams=2

    For a permanent solution, follow the arch wiki: https://wiki.archlinux.org/index.php/Modprobe

  2. Create the ALSA configuration file and store it either in ~/.asoundrc (for a single user) or in /etc/asound.conf (system-wide):

    pcm.!default {
      type asym
      playback.pcm "LoopAndReal"
      capture.pcm "hw:0,0"
      hint {   
        show on
        description "Default with loopback"
      }
    }
    
    #"type plug" is mandatory to convert the sample type
    pcm.LoopAndReal {
      type plug
      slave.pcm mdev
      route_policy "duplicate"
      hint {   
        show on
        description "LoopAndReal"
      }
    }
                                                                                                                                                                                                                                                                                                                                              
    pcm.mdev {
      type multi
      slaves.a.pcm pcm.MixReal
      slaves.a.channels 2
      slaves.b.pcm pcm.MixLoopback
      slaves.b.channels 2
      bindings.0.slave a
      bindings.0.channel 0
      bindings.1.slave a
      bindings.1.channel 1
      bindings.2.slave b
      bindings.2.channel 0
      bindings.3.slave b
      bindings.3.channel 1
    }
    
    pcm.MixReal {
      type dmix
      ipc_key 1024
      slave {
        pcm "hw:0,0"
        #rate 48000
        #rate 44100
        #periods 128
        #period_time 0
        #period_size 1024 # must be power of 2
        #buffer_size 8192
      }
    }
    
    pcm.MixLoopback {
      type dmix
      ipc_key 1025
      slave {
        pcm "hw:Loopback,0,0"
        #rate 48000
        #rate 44100
        #periods 128
        #period_time 0
        #period_size 1024 # must be power of 2
        #buffer_size 8192
      }
    }

    You can play with the sample rates and buffer sizes if you have any problems. This configuration works on my system.

  3. Prepare the ffserver configuration and store it either in the default location /etc/ffserver.conf (system-wide) or anywhere in your home directory:

    # Port on which the server is listening. You must select a different
    # port from your standard HTTP web server if it is running on the same
    # computer.
    Port 8090
    
    # Address on which the server is bound. Only useful if you have
    # several network interfaces.
    BindAddress 0.0.0.0
    
    # Number of simultaneous HTTP connections that can be handled. It has
    # to be defined *before* the MaxClients parameter, since it defines the
    # MaxClients maximum limit.
    MaxHTTPConnections 2000
    
    # Number of simultaneous requests that can be handled. Since FFServer
    # is very fast, it is more likely that you will want to leave this high
    # and use MaxBandwidth, below.
    MaxClients 1000
    
    # This is the maximum amount of kbit/sec that you are prepared to
    # consume when streaming to clients.
    MaxBandwidth 1000
    
    # Access log file (uses standard Apache log file format)
    # '-' is the standard output.
    CustomLog -
    
    # Suppress that if you want to launch ffserver as a daemon.
    NoDaemon
    
    
    ##################################################################
    # Definition of the live feeds. Each live feed contains one video
    # and/or audio sequence coming from an ffmpeg encoder or another
    # ffserver. This sequence may be encoded simultaneously with several
    # codecs at several resolutions.
    
    <Feed feed1.ffm>
    
    # You must use 'ffmpeg' to send a live feed to ffserver. In this
    # example, you can type:
    #
    # ffmpeg http://localhost:8090/feed1.ffm
    
    # ffserver can also do time shifting. It means that it can stream any
    # previously recorded live stream. The request should contain:
    # "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]".You must specify
    # a path where the feed is stored on disk. You also specify the
    # maximum size of the feed, where zero means unlimited. Default:
    # File=/tmp/feed_name.ffm FileMaxSize=5M
    File /tmp/feed1.ffm
    FileMaxSize 200K
    
    # You could specify
    # ReadOnlyFile /saved/specialvideo.ffm
    # This marks the file as readonly and it will not be deleted or updated.
    
    # Specify launch in order to start ffmpeg automatically.
    # First ffmpeg must be defined with an appropriate path if needed,
    # after that options can follow, but avoid adding the http:// field
    #Launch ffmpeg
    
    # Only allow connections from localhost to the feed.
    #ACL allow 127.0.0.1
    
    </Feed>
    
    
    ##################################################################
    # Now you can define each stream which will be generated from the
    # original audio and video stream. Each format has a filename (here
    # 'test1.mpg'). FFServer will send this stream when answering a
    # request containing this filename.
    
    # MP3 audio
    <Stream stream.mp3>
    Feed feed1.ffm
    Format mp2
    AudioCodec libmp3lame
    AudioBitRate 320
    AudioChannels 2
    AudioSampleRate 44100
    NoVideo
    </Stream>
    
    
    # Ogg Vorbis audio
    #<Stream test.ogg>
    #Feed feed1.ffm
    #Format ogg
    #AudioCodec libvorbis
    #Title "Stream title"
    #AudioBitRate 64
    #AudioChannels 2
    #AudioSampleRate 44100
    #NoVideo
    #</Stream>
    
    ##################################################################
    # Special streams
    
    # Server status
    
    <Stream stat.html>
    Format status
    
    # Only allow local people to get the status
    ACL allow localhost
    ACL allow 192.168.1.0 192.168.1.255
    
    #FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
    </Stream>
    
    # Redirect index.html to the appropriate site
    <Redirect index.html>
    URL http://www.ffmpeg.org/
    </Redirect>

    This sets up ffserver for streaming in MP3 format, stereo, 320 kbps. Unfortunately I haven't succeeded with OGG Vorbis streaming.

  4. Now you have all the configuration you need; whenever you want to stream, the following two commands do it:

    ffserver -f ffserver.conf
    ffmpeg -f alsa -ac 2 -i hw:Loopback,1,0 http://localhost:8090/feed1.ffm
  5. You can test it, for example, with mplayer:

    mplayer http://YourLinuxBox:8090/stream.mp3

And that's it. Sound is played by the normal sound card and sent to the stream simultaneously. If you do not want to listen to the sound from the computer, you can mute your soundcard. The advantage is that one can listen to music on the computer normally, with or without streaming, in both cases without any reconfiguration. To start streaming, just run ffserver and ffmpeg.
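The two commands can be wrapped in a small helper script so that starting the stream really is a single step (a sketch; the config path is an assumption):

```shell
#!/bin/sh
# stream-start.sh: launch ffserver, then feed it from the ALSA loopback.
# ffserver is stopped again when ffmpeg exits (e.g. on Ctrl+C).
ffserver -f "$HOME/ffserver.conf" &
server_pid=$!
trap 'kill "$server_pid"' EXIT INT TERM
ffmpeg -f alsa -ac 2 -i hw:Loopback,1,0 http://localhost:8090/feed1.ffm
```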

Advantages:
+ very simple solution without any special sound server
+ no special SW required (in my case everything needed was already installed)
+ streaming on request by two simple commands
+ normal soundcard function
+ streaming in MP3 format that is supported by many home AV receivers

Disadvantages:
- phonon-vlc backend not compatible (VLC itself does not work either)
- OGG streaming does not work
- some latency (~ 5 sec)
- all sounds are sent to stream, including various desktop notifications (in KDE could be managed by phonon)

Offline

#11 2012-03-19 12:54:08

kokoko3k
Member
Registered: 2008-11-14
Posts: 2,390

Re: [SOLVED] How to stream audio output over wifi?

The big latency was the main reason why I chose udp/rtp streaming; that way I can reach ~300 ms, which is not bad.



Offline

#12 2012-03-20 15:51:02

KejPi
Member
Registered: 2011-06-17
Posts: 40

Re: [SOLVED] How to stream audio output over wifi?

Yes, the latency would be much lower in the rtp/udp case, but using ffmpeg/ffserver for MP3 streaming instead of darkice/icecast still keeps it in a reasonable range for my use case. The constraint is that I need a format supported by the Onkyo AV receiver.

Offline

#13 2012-03-22 19:44:41

iv597
Member
From: United States
Registered: 2011-09-18
Posts: 96
Website

Re: [SOLVED] How to stream audio output over wifi?

I believe Rsound would work here...


Currently running Arch on a Samsung Chromebook Pro (dual booted with ChromeOS), and various VPSes and Docker containers.

Dotfiles on Github

Offline

#14 2012-03-22 19:47:41

iv597
Member
From: United States
Registered: 2011-09-18
Posts: 96
Website

Re: [SOLVED] How to stream audio output over wifi?

Well, for directing ALSA over the network. If you're trying to stream audio FILES over the network, ignore me big_smile



Offline

#15 2012-03-23 07:27:11

kokoko3k
Member
Registered: 2008-11-14
Posts: 2,390

Re: [SOLVED] How to stream audio output over wifi?

I tried rsound in the past; nice idea, but quite unstable.



Offline

#16 2012-09-22 20:42:12

gbc921
Member
From: Brazil
Registered: 2011-08-05
Posts: 64
Website

Re: [SOLVED] How to stream audio output over wifi?

I don't know if it is too late to come back to this thread, but for me the loopback sound did not work.

When I do:

ffmpeg -f alsa -ac 2 -i hw:Loopback,1,0 http://localhost:8090/feed1.ffm

I get no sound, but the stream is fine.

However, if I use "hw:0,0" as the input, I can get the sound from my internal mic.
I tried other possibilities but could not get it to work.

I've checked the .asoundrc file and it is the same as "KejPi" posted.

I'm using the phonon-gstreamer backend, and the loopback interface shows up in the Gnome sound configuration, although if I select it no sound is heard.

Sincerely,


- "Nothing is impossible, impossible just takes longer"
- My Arch Linux Mirror located in São Paulo, Brazil: http://archmirror.duckdns.org:9000/

Offline

#17 2012-09-22 21:22:29

kokoko3k
Member
Registered: 2008-11-14
Posts: 2,390

Re: [SOLVED] How to stream audio output over wifi?

...more or less the same, but check it out:
https://bbs.archlinux.org/viewtopic.php?id=147852



Offline
