openal

OpenAL device, buffer and context relationship

最后都变了 · submitted on 2019-12-10 12:49:04
Question: I'm trying to create an object-oriented model to wrap OpenAL and have a little problem understanding the devices, buffers and contexts. From what I can see in the Programmer's Guide, there are multiple devices, each of which can have multiple contexts as well as multiple buffers. Each context has a listener, and the alListener*() functions all operate on the listener of the active context. (Meaning that I have to make another context active first if I wanted to change its listener, if I got …

Reverb with OpenAL on iOS

眉间皱痕 · submitted on 2019-12-08 17:07:23
Question: Is there any possible way to do reverb using OpenAL on iOS? Does anyone have any code snippets to achieve this effect? I know it's not included in the OpenAL library for iOS, but I would think there's still a way to program it in. Thanks. Answer 1: Reverb is now natively supported in OpenAL (as of iOS 5.0). You can view a sample implementation in the ObjectAL project: https://github.com/kstenerud/ObjectAL-for-iPhone Just grab the most recent source from this repository, load "ObjectAL.xcodeproj", and …

Why are my audio sounds not playing on time?

你。 · submitted on 2019-12-08 13:57:48
One of my apps has a simple metronome-style feature that plays a click sound a specified number of times per minute (bpm). I'm doing this by starting an NSTimer, with an interval calculated from the specified bpm, that calls a method that plays the sound. If I put an NSLog line into the play method, I can see that NSTimer is firing accurately to about 1 millisecond. However, if I record the sound output into an audio editor and then measure the interval between clicks, I can see that they are not evenly spaced. For example, with 150 bpm, the timer fires every 400 milliseconds. But most of the

Play stream in OpenAL library

馋奶兔 · submitted on 2019-12-08 13:22:02
Question: I need to play a stream in OpenAL, but I don't understand what I need to do with the buffers and source. My pseudocode: FirstTime = true; while (true) { if (!FirstTime) { alSourceUnqueueBuffers(alSource, 1, &unbuf); } // get buffer to play into boost::array buf (882 elements) (MONO16). if (NumberOfSampleSet >= 3) { alBufferData(alSampleSet[NumberOfSampleSet], AL_FORMAT_MONO16, buf.data(), buf.size(), 44100); alSourceQueueBuffers(alSource, 1, &alSampleSet[NumberOfSampleSet++]); if (NumberOfSampleSet == 4 …

multi track mp3 playback for iOS application

偶尔善良 · submitted on 2019-12-07 18:17:02
Question: I am building an application that involves playing back a song in a multi-track format (drums, vocals, guitar, piano, etc.). I don't need to do any fancy audio processing on each track; all I need is to play, pause, and mute/unmute each track. I had been using multiple instances of AVAudioPlayer, but when performing device testing I noticed that the tracks play very slightly out of sync when they are first played. Furthermore, when I pause and play the tracks, they continue …

Which framework should I use to play an audio file (WAV, MP3, AIFF) in iOS with low latency?

爷，独闯天下 · submitted on 2019-12-07 12:06:42
Question: iOS has various audio frameworks, from the higher level, which lets you simply play a specified file, to the lower level, which lets you get at the raw PCM data, and everything in between. For our app, we just need to play external files (WAV, AIFF, MP3), but we need to do so in response to pressing a button, and we need that latency to be as small as possible. (It's for cueing in live productions.) Now, AVAudioPlayer and the like work to play simple file assets (via their URL), but their latency in …

maximum number of OpenAL sound buffers on iPhone

冷暖自知 · submitted on 2019-12-07 11:02:43
Question: I'm writing a sound library for the iPhone that uses OpenAL. The app generates a unique buffer id for each sound during startup. The problem I'm having is that OpenAL is unable to generate more than 1024 buffer ids. I would have thought that the total number of buffer ids would be limited by memory, not by some predetermined number. I haven't been able to find any documentation that specifies the maximum number of buffers available to OpenAL on an iOS device. Can anyone confirm …

Getting notified when a sound is done playing in OpenAL

蹲街弑〆低调 · submitted on 2019-12-07 03:33:47
Question: I'm using OpenAL on iPhone to play multiple audio samples simultaneously. Can I get OpenAL to notify me when a single sample is done playing? I'd like to avoid hardcoding the sample length and setting a timer. Answer 1: If you have the OpenAL source abstracted into a class, I guess you can simply call performSelector:afterDelay: when you start the sound: - (void) play { [delegate performSelector:@selector(soundHasFinishedPlaying) afterDelay:self.length]; … } (If you stop the sound manually in the …

Streaming Data to Sound Card Using C on Windows [closed]

痴心易碎 · submitted on 2019-12-06 08:10:30
As part of a university project I have to do some signal processing and would like to output the results using the PC sound card. The software has to be written in C and needs to work with Windows (preferably 7 and XP). I have found code examples for outputting .wav and similar files, but I am interested in continuously outputting data rather than outputting from files. It is likely that the data for output will …

How do I stream audio into OpenAL Sources?

醉酒当歌 · submitted on 2019-12-06 05:22:24
I've just begun working with OpenAL. I've successfully loaded WAV files into it and played them; it was easy enough. Now, I need to be able to stream music into OpenAL rather than loading entire files into it. While loading whole files is fine for sound effects and the like, it can be very dangerous to do with music, as you probably know. The problem is, I cannot seem to find anything on Google related to this. While I have found some examples related to streaming OGG files, I'd much rather make a system that supports all music files. From what I understand, OpenAL should have built in …