audio-streaming

Live audio streaming container formats

Submitted by 纵然是瞬间 on 2020-02-20 06:11:07

Question: When I start receiving a live audio (radio) stream (e.g. MP3 or AAC), I think the received data is not a raw bitstream (i.e. raw encoder output) but is always wrapped in some container format. If this assumption is correct, then I guess I cannot start decoding from an arbitrary place in the stream but have to wait for some sync byte. Is that right? Is it usual to have sync bytes? Is there any header following the sync byte from which I can determine the codec used, the number …
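In fact, raw MP3 and ADTS-wrapped AAC streams are self-synchronizing rather than containerized: every frame begins with a header whose first 11 (MP3) or 12 (ADTS) bits are all ones, and the rest of the header encodes the MPEG version, layer, bitrate, sample rate, and channel mode. A minimal sketch in Java of locating an MP3 frame sync in a received buffer (the class and method names are illustrative, not from any particular library):

    // Scan a byte buffer for an MP3 frame sync: a 0xFF byte followed by
    // a byte whose top three bits are set (11 sync bits in total).
    public final class Mp3Sync {
        /** Returns the offset of the first plausible frame header, or -1. */
        public static int findFrameSync(byte[] buf) {
            for (int i = 0; i + 3 < buf.length; i++) {
                int b0 = buf[i] & 0xFF;
                int b1 = buf[i + 1] & 0xFF;
                if (b0 == 0xFF && (b1 & 0xE0) == 0xE0) {
                    int version = (b1 >> 3) & 0x03;   // MPEG version id
                    int layer = (b1 >> 1) & 0x03;     // layer index
                    if (version != 1 && layer != 0) { // skip reserved values
                        return i;
                    }
                }
            }
            return -1;
        }
    }

A robust reader should also compute the frame length from the header and verify that another valid header starts immediately after the frame, since the sync pattern can occur by chance inside audio data.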

Playing audio with FFMPEG

Submitted by 孤街浪徒 on 2020-02-03 10:42:31

Question: I have been trying to port FFmpeg (for playing audio) to Android using the NDK. I have had some success: I could build FFmpeg and link it via the NDK, and I could call avcodec_decode_audio3() and decode a given audio file. So here I have an audio buffer output from the function. How do I play this now? Can any FFmpeg folks tell me the exact steps to decode and play audio? I am really clueless about what to do with the audio buffers I got from avcodec_decode_audio3(). Thanks a lot. Answer 1: I have …
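The buffers that avcodec_decode_audio3() fills are raw PCM samples, so on Android they can be handed to android.media.AudioTrack for playback. A minimal sketch, assuming the decoder produces 44.1 kHz stereo 16-bit PCM and that the native code delivers each decoded buffer through a hypothetical JNI callback named onPcmDecoded:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class PcmPlayer {
        private final AudioTrack track;

        public PcmPlayer() {
            int minBuf = AudioTrack.getMinBufferSize(44100,
                    AudioFormat.CHANNEL_OUT_STEREO,
                    AudioFormat.ENCODING_PCM_16BIT);
            track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
                    AudioFormat.CHANNEL_OUT_STEREO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    minBuf * 4,               // headroom over the minimum
                    AudioTrack.MODE_STREAM);
            track.play();
        }

        /** Called from JNI with each PCM buffer the decoder produces. */
        public void onPcmDecoded(byte[] pcm, int length) {
            track.write(pcm, 0, length);      // blocks until the data is queued
        }

        public void release() {
            track.stop();
            track.release();
        }
    }

Alternatively, the decoded buffers can stay on the native side and be fed to OpenSL ES, which avoids a JNI round trip for every buffer.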

how to trim header and side information of mp3 frame using Naudio and c#

Submitted by 冷暖自知 on 2020-01-25 11:51:15

Question: My problem is to get the actual audio data of an MP3 frame. For this I have used NAudio and read the RawData property, but I think RawData returns all the bytes of the frame, including the header and side information. The code is given below:

    private void button1_Click(object sender, EventArgs e)
    {
        Mp3FileReader reader = new Mp3FileReader("file.mp3");
        Mp3Frame mp3Frame = reader.ReadNextFrame();
        // RawData is the whole frame: header + side info + audio data
        byte[] FrameByteArray = mp3Frame.RawData;
        BitArray bits = new BitArray(FrameByteArray);
        Console.Write(mp3Frame…
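For reference, the audio payload starts at a fixed offset from the beginning of the frame: a 4-byte header, an optional 2-byte CRC (present when the header's protection bit is clear), and then the Layer III side information, which is 17 bytes for MPEG-1 mono, 32 bytes for other MPEG-1 modes, and 9 or 17 bytes respectively for MPEG-2/2.5. A sketch of that arithmetic in Java, independent of NAudio:

    /**
     * Offset of the main audio data within a Layer III frame.
     * mpeg1 - true for MPEG-1, false for MPEG-2/2.5
     * mono  - true for single-channel frames
     * crc   - true if the header indicates a 2-byte CRC
     */
    static int audioDataOffset(boolean mpeg1, boolean mono, boolean crc) {
        int offset = 4;                          // frame header
        if (crc) offset += 2;                    // optional CRC-16
        if (mpeg1) offset += mono ? 17 : 32;     // MPEG-1 side info
        else offset += mono ? 9 : 17;            // MPEG-2/2.5 side info
        return offset;
    }

One caveat: because of the Layer III bit reservoir, the bytes following the side information may still belong to earlier frames; the side information's main_data_begin field tells you where this frame's own data really starts.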

Audio streaming by websockets

Submitted by 心不动则不痛 on 2020-01-22 13:16:05

Question: I'm going to create a voice chat. My backend server runs on Node.js, and almost every connection between client and server uses socket.io. Are websockets appropriate for my use case? I prefer client -> server -> clients communication over P2P because I expect even 1000 clients connected to one room. If websockets are OK, then which method is best for sending an AudioBuffer to the server and playing it back on the other clients? I do it like that: navigator.getUserMedia({audio: true}, initializeRecorder, …

How to write FLAC files in java

Submitted by 折月煮酒 on 2020-01-22 12:45:05

Question: I have a requirement to write FLAC files in Java. Earlier I was writing the audio input into a WAV file and then converting it to FLAC using an external converter. I was looking into jFLAC to find an API through which I can write FLAC files. I found that AudioFileFormat.Type in Java supports only the following file formats: AIFC, AIFF, SND, AU, and WAVE. I would like to have a method where I can capture the audio from the microphone and, using an API such as AudioSystem.write, write it to a …
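The standard javax.sound.sampled API covers the capture-and-write half of this; FLAC itself needs a third-party encoder (jFLAC is mostly a decoder, while projects such as javaFlacEncoder provide encoding), or you can keep writing WAV and transcode afterwards. A sketch of microphone capture with only the standard API, where the audio format and file name are placeholders:

    import java.io.File;
    import javax.sound.sampled.AudioFileFormat;
    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.DataLine;
    import javax.sound.sampled.TargetDataLine;

    public class MicCapture {
        public static void main(String[] args) throws Exception {
            // 44.1 kHz, 16-bit, mono, signed, little-endian; adjust as needed.
            AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);
            TargetDataLine line = (TargetDataLine) AudioSystem.getLine(
                    new DataLine.Info(TargetDataLine.class, format));
            line.open(format);
            line.start();

            // AudioSystem.write blocks, pulling data from the microphone
            // until the line is stopped and closed from another thread.
            AudioSystem.write(new AudioInputStream(line),
                    AudioFileFormat.Type.WAVE, new File("capture.wav"));
        }
    }

Swapping AudioFileFormat.Type.WAVE for a FLAC type only works if a service provider on the classpath registers one, which is what an encoder library's AudioFileWriter SPI would supply.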

Background audio playback in UIWebView

Submitted by ▼魔方 西西 on 2020-01-14 04:34:08

Question: My app allows users to access a service that plays audio books via the web. I am using a UIWebView to handle this. When the app is exited or the device is put to sleep, the audio stops playing. Since I am just displaying the web view, not playing the audio file directly, I cannot use those methods to play the audio in the background. However, if you access the same link via the Safari app, the audio keeps playing after the phone is put to sleep. How can I achieve the same effect in my app? Answer 1: …

Pipe ffmpeg stream to sox rec

Submitted by 六眼飞鱼酱① on 2020-01-13 19:29:00

Question: I am reading an audio stream via ffmpeg like this: ffmpeg -i http://icecast.radiovox.org:8000/live.ogg -f mp3 filename, and I want to pipe it to a sox command: rec filename rate 32k silence 1 0.1 3% 1 3.0 3%. Ultimately, what I am trying to achieve is to record the audio from a live Icecast stream of a talk show. I only want recordings of the individuals speaking, though. Every time there is silence, I want to stop the recording and start a new one once they start speaking again. Answer 1: What …
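One way to wire the two commands together (an untested sketch: sox's rec front end records from the sound card, so the pipe has to feed sox itself, and the silence parameters are taken verbatim from the question) is to have ffmpeg write MP3 to stdout and sox read it from stdin:

    # ffmpeg re-encodes the Ogg stream to MP3 on stdout; sox trims silence
    # and starts a new numbered output file at each pause.
    ffmpeg -i http://icecast.radiovox.org:8000/live.ogg -f mp3 - | \
      sox -t mp3 - recording.mp3 rate 32k silence 1 0.1 3% 1 3.0 3% : newfile : restart

With the : newfile : restart clause, sox writes recording001.mp3, recording002.mp3, and so on, beginning a new file each time the silence condition trips. This assumes a sox build with MP3 read/write support.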

android code for streaming shoutcast stream breaks in 2.2

Submitted by 六眼飞鱼酱① on 2020-01-10 07:36:40

Question: The following code works fine on Android 2.1-update1:

    package com.troubadorian.android.teststreaming;

    import android.app.Activity;
    import android.content.Context;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.View;
    import android.view.ViewGroup;
    import android.view.Window;
    import android.view.animation.AnimationUtils;
    import android.widget.AdapterView;
    import android.widget.BaseAdapter;
    import android.widget.Button;
    import android.widget.Gallery;
    import android.widget.…
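For context, this is a well-known Froyo regression: Android 2.2 swapped the OpenCore media engine for Stagefright, which rejects SHOUTcast servers that answer with the non-standard status line ICY 200 OK instead of HTTP/1.0 200 OK. The usual workaround is a small local HTTP proxy that rewrites the status line before handing the stream to MediaPlayer. A stripped-down sketch of that rewrite (class and method names are illustrative; error handling, header forwarding, and threading are omitted):

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.net.URL;

    /** Serves one client: fetches the SHOUTcast stream and fixes the
     *  status line so Stagefright accepts it. */
    public class IcyProxy {
        public static void serveOnce(int localPort, String streamUrl) throws Exception {
            URL url = new URL(streamUrl);
            int port = url.getPort() != -1 ? url.getPort() : 80;
            String path = url.getPath().length() == 0 ? "/" : url.getPath();

            ServerSocket server = new ServerSocket(localPort);
            Socket client = server.accept();
            Socket upstream = new Socket(url.getHost(), port);

            OutputStream up = upstream.getOutputStream();
            up.write(("GET " + path + " HTTP/1.0\r\n\r\n").getBytes());
            up.flush();

            InputStream in = upstream.getInputStream();
            OutputStream out = client.getOutputStream();

            // Rewrite the non-standard status line, then copy the rest through.
            String status = readLine(in);
            if (status.startsWith("ICY")) {
                status = "HTTP/1.0" + status.substring(3);
            }
            out.write((status + "\r\n").getBytes());
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            upstream.close();
            client.close();
            server.close();
        }

        private static String readLine(InputStream in) throws Exception {
            StringBuilder sb = new StringBuilder();
            int c;
            while ((c = in.read()) != -1 && c != '\n') {
                if (c != '\r') sb.append((char) c);
            }
            return sb.toString();
        }
    }

MediaPlayer is then pointed at http://127.0.0.1:localPort/ rather than at the SHOUTcast URL; NPR's open-source StreamProxy class is a fuller implementation of the same idea.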