AVAudioEngine

Audio won't play after app interrupted by phone call iOS

邮差的信 · submitted on 2020-01-22 17:10:48
Question: I have a problem in my SpriteKit game where audio played with playSoundFileNamed(_ soundFile:, waitForCompletion:) will not play after the app is interrupted by a phone call. (I also use SKAudioNodes in my app, which aren't affected, but I really want to be able to use the SKAction playSoundFileNamed as well.) Here's the GameScene.swift file from a stripped-down SpriteKit game template which reproduces the problem. You just need to add an audio file to the project and call it "note". I…
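A common workaround for this symptom is that SpriteKit's internal AVAudioEngine stops when a call interrupts the session and is never restarted. The sketch below is an assumption-laden illustration (it assumes a GameScene subclass of SKScene, as in the template): it observes the session interruption notification and restarts the scene's engine when the interruption ends.

```swift
import AVFoundation
import SpriteKit

// Sketch: restart the scene's audio engine after a phone-call interruption
// so that playSoundFileNamed(_:waitForCompletion:) works again.
extension GameScene {
    func observeInterruptions() {
        NotificationCenter.default.addObserver(
            forName: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main
        ) { [weak self] note in
            guard let info = note.userInfo,
                  let rawType = info[AVAudioSessionInterruptionTypeKey] as? UInt,
                  let type = AVAudioSession.InterruptionType(rawValue: rawType)
            else { return }
            if type == .ended {
                // SKScene exposes its engine as `audioEngine`; restarting it
                // is what playSoundFileNamed needs to produce sound again.
                try? self?.audioEngine.start()
            }
        }
    }
}
```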

How to play multiple sounds from buffer simultaneously using nodes connected to AVAudioEngine's mixer

若如初见. · submitted on 2020-01-14 04:28:07
Question: I am making a basic music app for iOS, where pressing notes causes the corresponding sound to play. I am trying to get multiple sounds stored in buffers to play simultaneously with minimal latency. However, I can only get one sound to play at any time. I initially set up my sounds using multiple AVAudioPlayer objects, assigning a sound to each player. While it did play multiple sounds simultaneously, it didn't seem capable of starting two sounds at the same time (it seemed like it…
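One low-latency approach matching the title is to attach one AVAudioPlayerNode per note to a single AVAudioEngine and connect them all to its main mixer, so buffers can be scheduled independently and overlap. The class and buffer names below are illustrative assumptions, not from the question:

```swift
import AVFoundation

// Sketch: one player node per note, all mixed by the engine's main mixer,
// so multiple buffers can sound at once with low start latency.
final class NotePlayer {
    private let engine = AVAudioEngine()
    private var players: [AVAudioPlayerNode] = []
    private let buffers: [AVAudioPCMBuffer]

    init(buffers: [AVAudioPCMBuffer]) {
        self.buffers = buffers
        for buffer in buffers {
            let player = AVAudioPlayerNode()
            engine.attach(player)
            engine.connect(player, to: engine.mainMixerNode, format: buffer.format)
            players.append(player)
        }
        try? engine.start()
    }

    /// Start the note at `index` immediately; other notes keep playing.
    func play(index: Int) {
        let player = players[index]
        player.scheduleBuffer(buffers[index], at: nil, options: .interrupts,
                              completionHandler: nil)
        player.play()
    }
}
```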

Play audio from AVAudioPCMBuffer with AVAudioEngine

与世无争的帅哥 · submitted on 2020-01-12 05:24:13
Question: I have two classes, MicrophoneHandler and AudioPlayer. I have managed to use AVCaptureSession to tap microphone data using the approved answer here, and converted the CMSampleBuffer to NSData using this function: func sendDataToDelegate(buffer: CMSampleBuffer!) { let block = CMSampleBufferGetDataBuffer(buffer) var length = 0 var data: UnsafeMutablePointer<Int8> = nil var status = CMBlockBufferGetDataPointer(block!, 0, nil, &length, &data) // TODO: check for errors let result = NSData…
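The playback half of this pipeline can be sketched as follows: copy the raw PCM bytes back into an AVAudioPCMBuffer and schedule it on a player node. The format below (float32, 44.1 kHz, mono, non-interleaved) is an assumption and must match whatever MicrophoneHandler actually captured:

```swift
import AVFoundation

// Sketch of an AudioPlayer that turns raw PCM Data back into an
// AVAudioPCMBuffer and plays it through AVAudioEngine.
final class AudioPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    // Assumed capture format; adjust to match the microphone side.
    private let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                       sampleRate: 44_100, channels: 1,
                                       interleaved: false)!

    init() {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try? engine.start()
        player.play()
    }

    func play(_ data: Data) {
        let bytesPerFrame = format.streamDescription.pointee.mBytesPerFrame
        let frameCount = AVAudioFrameCount(UInt32(data.count) / bytesPerFrame)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: frameCount) else { return }
        buffer.frameLength = frameCount
        data.withUnsafeBytes { raw in
            guard let src = raw.baseAddress else { return }
            memcpy(buffer.floatChannelData![0], src, data.count)
        }
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}
```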

AVAudioEngine downsample issue

风流意气都作罢 · submitted on 2020-01-09 11:45:14
Question: I'm having an issue with downsampling audio taken from the microphone. I'm using AVAudioEngine to take samples from the microphone with the following code: assert(self.engine.inputNode != nil) let input = self.engine.inputNode! let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false) let mixer = AVAudioMixerNode() engine.attach(mixer) engine.connect(input, to: mixer, format: input.inputFormat(forBus: 0)) do { try engine.start() mixer…
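One workaround for this class of problem, sketched below under the assumption that the 8 kHz target from the question is the goal, is to tap the input in its hardware format and do the rate conversion explicitly with AVAudioConverter, rather than asking the engine to bridge mismatched formats:

```swift
import AVFoundation

// Sketch: tap the input node at its native format, then downsample each
// buffer to 8 kHz mono with AVAudioConverter.
let engine = AVAudioEngine()
let input = engine.inputNode
let hwFormat = input.inputFormat(forBus: 0)
let targetFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                 sampleRate: 8_000, channels: 1,
                                 interleaved: false)!
let converter = AVAudioConverter(from: hwFormat, to: targetFormat)!

input.installTap(onBus: 0, bufferSize: 1024, format: hwFormat) { buffer, _ in
    let ratio = targetFormat.sampleRate / hwFormat.sampleRate
    let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio)
    guard let out = AVAudioPCMBuffer(pcmFormat: targetFormat,
                                     frameCapacity: capacity) else { return }
    var consumed = false
    converter.convert(to: out, error: nil) { _, status in
        // Hand the tap buffer to the converter exactly once per callback.
        if consumed { status.pointee = .noDataNow; return nil }
        consumed = true
        status.pointee = .haveData
        return buffer
    }
    // `out` now holds 8 kHz mono samples for further processing.
}
try? engine.start()
```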

iOS 9 detect silent mode

南楼画角 · submitted on 2020-01-04 06:28:12
Question: I've been looking for a long time; posting here as a final try before giving up. I want to detect whether the device is currently in silent mode. I found a workaround (playing a fake sound and checking the completion) that works fine, but only when I'm not in AVAudioSessionCategoryPlayAndRecord mode. It is precisely on a screen where I can record audio and video that I want to achieve this, in order to know whether I should play UI sounds or not. To sum up, I'm trying to find a way to detect silent…
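For reference, the "fake sound" workaround the question mentions usually looks like the sketch below: play a short silent file with AudioToolbox and time the completion; when the mute switch is on, playback is skipped and the callback fires almost immediately. The bundled file name "mute.caf" (an assumed half-second silent clip) and the 0.1 s threshold are assumptions, and as the question notes, this baseline approach does not behave reliably under AVAudioSessionCategoryPlayAndRecord.

```swift
import AudioToolbox
import Foundation

// Sketch of the timing-based silent-switch check.
final class MuteDetector {
    private var soundID: SystemSoundID = 0
    private var start = Date()
    var onResult: ((Bool) -> Void)?  // true => silent mode suspected

    init?() {
        // "mute.caf" is an assumed ~0.5 s silent file added to the bundle.
        guard let url = Bundle.main.url(forResource: "mute",
                                        withExtension: "caf") else { return nil }
        AudioServicesCreateSystemSoundID(url as CFURL, &soundID)
    }

    func check() {
        start = Date()
        AudioServicesPlaySystemSoundWithCompletion(soundID) { [weak self] in
            guard let self = self else { return }
            let elapsed = Date().timeIntervalSince(self.start)
            // Much shorter than the file's duration => the sound was skipped.
            self.onResult?(elapsed < 0.1)
        }
    }
}
```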

Changing the volume of an SCNAudioPlayer in real time - Swift

别说谁变了你拦得住时间么 · submitted on 2020-01-01 14:49:13
Question: I am trying to work out how I can change the volume of an SCNAudioPlayer in real time. Currently, I have an SCNAudioSource connected to an SCNAudioPlayer. This audio player is then assigned to an SCNNode so that my sound makes use of SceneKit's spatial audio processing. As it stands, I am able to change the volume of each SCNNode using SCNAudioSource.volume, triggered by the boolean variable vol. An extract of my code for this is shown below: (audioSource, audioCount) = soundFileSelect…
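A likely reason SCNAudioSource.volume feels static is that it only applies when the source is loaded. One sketch of a live alternative: SCNAudioPlayer exposes its underlying AVAudioNode, which for player-backed sources conforms to AVAudioMixing, so its volume can be adjusted while playing:

```swift
import SceneKit
import AVFoundation

// Sketch: adjust volume on the live AVAudioNode behind each SCNAudioPlayer,
// instead of the load-time SCNAudioSource.volume.
func setVolume(_ volume: Float, on node: SCNNode) {
    for player in node.audioPlayers {
        if let mixing = player.audioNode as? AVAudioMixing {
            mixing.volume = volume  // takes effect immediately, per node
        }
    }
}
```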

SKAudioNode() crashes when plugging in/out headphones

时光毁灭记忆、已成空白 · submitted on 2020-01-01 08:34:06
Question: I am using an SKAudioNode() to play background music in my game. I have a play/pause function, and everything is working fine until I plug in my headphones. There is no sound at all, and when I call the pause/play function I get this error: AVAudioPlayerNode.mm:333: Start: required condition is false: _engine->IsRunning() com.apple.coreaudio.avfaudio, reason: 'required condition is false: _engine->IsRunning()' Does anyone know what this means? Code: import SpriteKit class GameScene: SKScene {…
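The error means an AVAudioPlayerNode was started while its engine was stopped; plugging headphones in or out is an audio route change that can stop SpriteKit's internal engine. A common workaround, sketched here on the assumption that GameScene is the active SKScene, is to restart the scene's engine from a route-change observer:

```swift
import SpriteKit
import AVFoundation

// Sketch: restart SpriteKit's internal engine after a route change
// (headphones plugged/unplugged) to avoid the _engine->IsRunning() crash.
extension GameScene {
    func observeRouteChanges() {
        NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: nil, queue: .main
        ) { [weak self] _ in
            guard let self = self, !self.audioEngine.isRunning else { return }
            try? self.audioEngine.start()
        }
    }
}
```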