AVAudioEngine

SKAudioNode() crashes when plugging in/out headphones

Question: I am using an SKAudioNode() to play background music in my game. I have a play/pause function, and everything works fine until I plug in my headphones: there is no sound at all, and when I call the pause/play function I get this error:

```
AVAudioPlayerNode.mm:333: Start: required condition is false: _engine->IsRunning()
com.apple.coreaudio.avfaudio', reason: 'required condition is false: _engine->IsRunning()
```

Does anyone know what this means? Code:

```swift
import SpriteKit

class GameScene: SKScene {
    // …
```
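The error means play() was hit while the SKAudioNode's underlying AVAudioEngine was not running: plugging or unplugging headphones changes the audio route, which stops the engine. A minimal sketch of one workaround, assuming the scene's built-in audioEngine is the engine that stopped, is to listen for the configuration-change notification and restart it:

```swift
import SpriteKit
import AVFoundation

class GameScene: SKScene {
    override func didMove(to view: SKView) {
        // Plugging/unplugging headphones posts a configuration change that
        // stops the engine; restart it so SKAudioNode playback can resume.
        NotificationCenter.default.addObserver(
            forName: .AVAudioEngineConfigurationChange,
            object: audioEngine,
            queue: .main
        ) { [weak self] _ in
            guard let engine = self?.audioEngine, !engine.isRunning else { return }
            do {
                try engine.start()
            } catch {
                print("Could not restart audio engine: \(error)")
            }
        }
    }
}
```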

Set AVAudioEngine Input and Output Devices

Question: I've been playing around with Apple's shiny new AVFoundation library, but so far I've been unable to set the input or output devices (e.g. a USB sound card) used by an AVAudioEngine, and I can't seem to find anything in the documentation to say it's even possible. Does anyone have any experience with this?

Answer 1: OK, after re-reading the docs for the tenth time, I noticed that AVAudioEngine has the members inputNode and outputNode (not sure how I missed that!). The following code seems to do the job:
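The answer's code does not survive in this excerpt. A sketch of the usual macOS approach, going through the I/O node's underlying audio unit and setting kAudioOutputUnitProperty_CurrentDevice (the AudioDeviceID value below is a placeholder, and whether this matches the answer's exact code is an assumption):

```swift
import AVFoundation
import AudioToolbox

let engine = AVAudioEngine()

// Placeholder device ID; real IDs come from Core Audio's device list
// (kAudioHardwarePropertyDevices).
var outputDeviceID: AudioDeviceID = 42

// inputNode/outputNode wrap AUHAL units whose current device can be changed.
if let unit = engine.outputNode.audioUnit {
    let status = AudioUnitSetProperty(
        unit,
        kAudioOutputUnitProperty_CurrentDevice,
        kAudioUnitScope_Global,
        0,
        &outputDeviceID,
        UInt32(MemoryLayout<AudioDeviceID>.size)
    )
    if status != noErr {
        print("Failed to set output device: \(status)")
    }
}
```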

Is AVAudioPlayerNode isPlaying flag set as soon as scheduleBuffer is called?

Question: I've got two AVAudioPlayerNodes. I schedule one using scheduleBuffer. I want to cancel it before it starts, but not if it has already started. How do I know whether it has actually started? The isPlaying flag appears to be set as soon as the node is scheduled rather than when it actually starts at the scheduled time.

Answer 1: AVAudioPlayerNode takes a render cycle after calling play() before actual audio starts playing. If you are trying to identify the time between play() and audio being played, you can get …
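The answer is cut off above. One way to check whether rendering has actually begun (an assumption about the approach, not a quote from the answer) is to convert the node's last render time into player time, which only succeeds once the player is producing output:

```swift
import AVFoundation

/// Returns true once the player node has actually begun rendering audio,
/// as opposed to merely having a buffer scheduled.
func hasStartedRendering(_ player: AVAudioPlayerNode) -> Bool {
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else {
        // No valid render/player time yet: nothing has been rendered.
        return false
    }
    return playerTime.sampleTime >= 0
}
```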

Swift AVAudioEngine: Changing the Audio Input Device for MacOS

Question: I'm trying to change the input device used to listen to incoming audio. I've tried a number of solutions, but most end up with the following error when preparing and starting the audio engine:

```
AVAEInternal.h:82:_AVAE_CheckAndReturnErr: required condition is false:
[AVAudioEngineGraph.mm:1295:Initialize: (IsFormatSampleRateAndChannelCountValid(outputHWFormat))]
```

Current (simplified) code:

```swift
var engine = AVAudioEngine()
var inputDeviceID: AudioDeviceID = 41 // another audio input device
// …
```
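The IsFormatSampleRateAndChannelCountValid failure usually means the graph was wired with a format that no longer matches the selected input device's hardware format. A sketch of one ordering that avoids this, under the assumption that the device ID is valid: switch the device on the input node's underlying audio unit first, then make connections using the format the node now reports:

```swift
import AVFoundation
import AudioToolbox

func useInputDevice(_ deviceID: AudioDeviceID) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    var inputDeviceID = deviceID

    // 1. Point the input node's AUHAL unit at the new device before wiring the graph.
    if let unit = engine.inputNode.audioUnit {
        AudioUnitSetProperty(unit,
                             kAudioOutputUnitProperty_CurrentDevice,
                             kAudioUnitScope_Global,
                             0,
                             &inputDeviceID,
                             UInt32(MemoryLayout<AudioDeviceID>.size))
    }

    // 2. Connect using the hardware format the node actually delivers, so the
    //    graph's sample rate and channel count stay valid.
    let hwFormat = engine.inputNode.outputFormat(forBus: 0)
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: hwFormat)

    engine.prepare()
    try engine.start()
    return engine
}
```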

Connect AVAudioInputNode to AVAudioUnitEffect using AVAudioEngine

Question: I want to process the audio from my device's built-in microphone (AVAudioInputNode) with an audio unit effect (AVAudioUnitEffect). For my example, I'm using AVAudioUnitReverb. Connecting AVAudioUnitReverb is causing the application to crash.

```swift
import UIKit
import AVFoundation

class ViewController: UIViewController {
    let audioEngine = AVAudioEngine()
    let unitReverb = AVAudioUnitReverb()
    var inputNode: AVAudioInputNode!

    override func viewDidLoad() {
        super.viewDidLoad()
        inputNode = // …
```
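A sketch of a connection order that typically works, assuming (since the question's code is cut off) that the crash comes from connecting with a nil or mismatched format: enable recording on the audio session, then connect input to reverb to main mixer using the input node's own hardware format:

```swift
import AVFoundation

let audioEngine = AVAudioEngine()
let unitReverb = AVAudioUnitReverb()

func startReverbMonitoring() throws {
    // Recording from the microphone requires a category that enables input.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
    try session.setActive(true)

    let input = audioEngine.inputNode
    let format = input.outputFormat(forBus: 0) // the microphone's hardware format

    unitReverb.loadFactoryPreset(.mediumHall)
    unitReverb.wetDryMix = 50

    audioEngine.attach(unitReverb)
    audioEngine.connect(input, to: unitReverb, format: format)
    audioEngine.connect(unitReverb, to: audioEngine.mainMixerNode, format: format)

    try audioEngine.start()
}
```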

AVAudioPlayerNode does not play sound

Question: The AudioPlayerNode is an instance variable; the code is as follows:

```swift
class HXAudioEngine {
    private var audioEngine: AVAudioEngine = AVAudioEngine()

    var digitFileUrl: URL? {
        didSet {
            if let digitUrl = digitFileUrl {
                do {
                    digitAudioFile = try AVAudioFile(forReading: digitUrl)
                } catch let error {
                    print("Error loading Digit file: \(error.localizedDescription)")
                }
            }
        }
    }

    var digitAudioFile: AVAudioFile? {
        didSet {
            if let digitFile = digitAudioFile {
                digitAudioFormat = digitFile.processingFormat
                // …
```
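The question is cut off above, so the exact bug isn't visible here. A sketch of the sequence AVAudioPlayerNode needs in order to produce sound (attach, connect with the file's processing format, start the engine, then schedule and play), on the assumption that one of these steps is what's missing:

```swift
import AVFoundation

final class PlayerExample {
    private let audioEngine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    func play(fileAt url: URL) throws {
        let file = try AVAudioFile(forReading: url)

        // The node must be attached and connected before the engine starts.
        audioEngine.attach(player)
        audioEngine.connect(player, to: audioEngine.mainMixerNode,
                            format: file.processingFormat)

        audioEngine.prepare()
        try audioEngine.start()      // the engine must be running before play()

        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    }
}
```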

Connecting AVAudioMixerNode to AVAudioEngine

Question: I use an AVAudioMixerNode to change the audio format; this entry helped me a lot. The code below gives me the data I want, but I hear my own voice on the phone's speaker. How can I prevent that?

```swift
func startAudioEngine() {
    engine = AVAudioEngine()
    guard let engine = engine, let input = engine.inputNode else {
        // @TODO: error out
        return
    }
    let downMixer = AVAudioMixerNode()
    // I think the engine's I/O nodes are already attached to it by default,
    // so we attach only the downMixer here:
    engine.attach(downMixer)
    // …
```
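A common way to stop the live monitoring (whether this is what the linked entry recommends is an assumption) is to keep the down-mixer connected, so its tap keeps receiving converted buffers, but mute its output so nothing reaches the speaker:

```swift
import AVFoundation

func startAudioEngine(engine: AVAudioEngine) throws {
    let input = engine.inputNode
    let downMixer = AVAudioMixerNode()
    engine.attach(downMixer)

    // Example target format for the tap: 16 kHz mono float.
    guard let tapFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                        sampleRate: 16_000,
                                        channels: 1,
                                        interleaved: false) else { return }

    engine.connect(input, to: downMixer, format: input.outputFormat(forBus: 0))
    engine.connect(downMixer, to: engine.mainMixerNode, format: tapFormat)

    // Muting the mixer's output silences the speaker while the tap below
    // still receives the down-mixed buffers.
    downMixer.outputVolume = 0

    downMixer.installTap(onBus: 0, bufferSize: 4096, format: tapFormat) { buffer, _ in
        // Use the converted buffers here.
    }

    try engine.start()
}
```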

AVAudioSequencer Causes Crash on Deinit/Segue: 'required condition is false: outputNode'

Question: The code below causes a crash with the following errors whenever the object is deinitialized (e.g. when performing an unwind segue back to another ViewController):

```
required condition is false: [AVAudioEngineGraph.mm:4474:GetDefaultMusicDevice: (outputNode)]
Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio',
reason: 'required condition is false: outputNode'
```

The AVAudioSequencer is the root of the issue, because the error ceases if it is removed. How can this crash be avoided?
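One pattern that tends to avoid this class of crash (an assumption based on the error message, not a confirmed fix from the original thread) is to stop and release the sequencer before the engine is torn down, so the sequencer never outlives the graph's output node:

```swift
import AVFoundation

final class SequencerPlayer {
    private let engine = AVAudioEngine()
    private let sampler = AVAudioUnitSampler()
    private var sequencer: AVAudioSequencer?

    init() {
        engine.attach(sampler)
        engine.connect(sampler, to: engine.mainMixerNode, format: nil)
        sequencer = AVAudioSequencer(audioEngine: engine)
    }

    deinit {
        // Tear down in reverse order of setup: stop and drop the sequencer
        // while the engine (and its outputNode) still exists.
        sequencer?.stop()
        sequencer = nil
        engine.stop()
    }
}
```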

How to save the audio with changed pitch and speed on iOS?

Question: I'm able to change the pitch and speed of my audio, but I'm having trouble saving the audio with the changed pitch and speed.

```objc
// this is the method which sets the pitch
[self.audioEngine connect:audioPlayerNode to:timePitchEffect format:nil];
[self.audioEngine connect:timePitchEffect to:self.audioEngine.outputNode format:nil];
[audioPlayerNode scheduleFile:self.audioFile atTime:nil completionHandler:nil];
[self.audioEngine startAndReturnError:&audioEngineError];
NSLog(@"%@", self.audioFile.url);
// …
```
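One way to capture the processed audio (a sketch in Swift rather than the question's Objective-C, and only one of several possible approaches; offline manual rendering is another) is to install a tap after the time-pitch unit and append each buffer to an AVAudioFile:

```swift
import AVFoundation

func renderWithPitch(from sourceURL: URL, to destinationURL: URL,
                     pitchCents: Float, rate: Float) throws {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let timePitch = AVAudioUnitTimePitch()
    timePitch.pitch = pitchCents   // pitch shift in cents
    timePitch.rate = rate          // playback speed multiplier

    let sourceFile = try AVAudioFile(forReading: sourceURL)

    engine.attach(player)
    engine.attach(timePitch)
    engine.connect(player, to: timePitch, format: sourceFile.processingFormat)
    engine.connect(timePitch, to: engine.mainMixerNode, format: sourceFile.processingFormat)

    // File that will receive the processed audio.
    let outputFile = try AVAudioFile(forWriting: destinationURL,
                                     settings: sourceFile.fileFormat.settings)

    // Capture the effect's output and append it to the file.
    timePitch.installTap(onBus: 0, bufferSize: 4096,
                         format: timePitch.outputFormat(forBus: 0)) { buffer, _ in
        try? outputFile.write(from: buffer)
    }

    player.scheduleFile(sourceFile, at: nil) {
        // Source finished playing; in real code, remove the tap and stop the
        // engine here (on the main thread) to finalize the file.
    }

    try engine.start()
    player.play()
}
```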

Can we incorporate the Speech Recognition framework with Today Extension

Question: I am trying to build a widget with a speech button. When I press it, it should convert the uttered speech to text. However, when I try to record the voice, AVAudioEngine fails to start. Is it because AVAudioEngine is not allowed in a Today Extension?

Answer 1: Searching the internet, I seem to have found my answer. The short answer is no, you can't record audio in an extension.

Source: https://stackoverflow.com/questions/44365523/can-we-incorporate-the-speech-recognition-framework-with-today-extension