Question
How can the iPhone be set to vibrate once?
For example, when a player loses a life or the game is over, the iPhone should vibrate.
Answer 1:
From "iPhone Tutorial: Better way to check capabilities of iOS devices":
There are two seemingly similar functions that take the parameter kSystemSoundID_Vibrate:
1) AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
2) AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
Both functions vibrate the iPhone, but when you use the first one on devices that don't support vibration, it plays a beep sound. The second function, on the other hand, does nothing on unsupported devices. So if you are going to vibrate the device continuously, as an alert, common sense says to use function 2.
First, add AudioToolbox.framework to your target in Build Phases.
Then, import this header file:
#import <AudioToolbox/AudioServices.h>
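Putting the pieces together for the question's game-over case, here is a minimal sketch (shown in Swift; the gameOverVibrate() wrapper name is just for illustration):
import AudioToolbox

// Hypothetical helper called when the player loses a life or the game ends.
func gameOverVibrate() {
    // Vibrates on iPhone; does nothing on devices without vibration support.
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
}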
Answer 2:
Swift 2.0+
AudioToolbox now presents kSystemSoundID_Vibrate as a SystemSoundID, so the code is:
import AudioToolbox.AudioServices
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)
AudioServicesPlayAlertSound(kSystemSoundID_Vibrate)
Instead of having to go through the extra cast step (props to @Dov).
Original Answer (Swift 1.x)
And here's how you do it in Swift (in case you run into the same trouble I did):
Link against AudioToolbox.framework
(go to your project, select your target, Build Phases, Link Binary with Libraries, and add the framework there)
Once that is completed:
import AudioToolbox.AudioServices
// Use either of these
AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
AudioServicesPlayAlertSound(SystemSoundID(kSystemSoundID_Vibrate))
The cheesy thing is that SystemSoundID is basically a typealias (a fancy Swift typedef) for UInt32, while kSystemSoundID_Vibrate is a regular Int. The compiler gives you an error for trying to convert from Int to UInt32, but the error reads "Cannot convert to SystemSoundID", which is confusing. Why Apple didn't just make it a Swift enum is beyond me.
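A small illustration of the error described above, under Swift 1.x:
// Swift 1.x: passing the Int constant directly fails to compile:
// AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)   // "Cannot convert to SystemSoundID"

// Wrapping it in the UInt32-backed SystemSoundID initializer fixes it:
AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))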
@aponomarenko's answer goes into the details; mine is just for the Swifters out there.
Answer 3:
A simple way to do so is with Audio Services:
#import <AudioToolbox/AudioToolbox.h>
...
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
Answer 4:
I had great trouble with this on devices that had vibration turned off in some manner, but we needed it to work regardless, because it is critical to our application's functioning, and since it is just an integer passed to a documented method call, it will pass validation. So I tried some sounds outside of the well-documented ones listed here: TUNER88/iOSSystemSoundsLibrary
I then stumbled upon 1352, which works regardless of the silent switch or the settings on the device (Settings -> vibrate on ring, vibrate on silent).
- (void)vibratePhone
{
    if ([[UIDevice currentDevice].model isEqualToString:@"iPhone"])
    {
        AudioServicesPlaySystemSound(1352); // works ALWAYS as of this post
    }
    else
    {
        // Not an iPhone, so it doesn't have vibrate;
        // play the less annoying tick noise or one of your own
        AudioServicesPlayAlertSound(1105);
    }
}
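For Swift users, a rough translation of the sketch above, assuming the same undocumented sound IDs (1352 and 1105) still behave this way:
import AudioToolbox
import UIKit

func vibratePhone() {
    if UIDevice.current.model == "iPhone" {
        // 1352 vibrated regardless of the silent switch when this was written.
        AudioServicesPlaySystemSound(1352)
    } else {
        // No vibration hardware: play the less intrusive tick sound instead.
        AudioServicesPlayAlertSound(1105)
    }
}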
Answer 5:
Important note: future deprecation warning.
As of iOS 9.0, the API documentation for:
AudioServicesPlaySystemSound(inSystemSoundID: SystemSoundID)
AudioServicesPlayAlertSound(inSystemSoundID: SystemSoundID)
includes the following note:
This function will be deprecated in a future release.
Use AudioServicesPlayAlertSoundWithCompletion or
AudioServicesPlaySystemSoundWithCompletion instead.
The right way going forward is to use either of these two:
AudioServicesPlayAlertSoundWithCompletion(kSystemSoundID_Vibrate, nil)
or
AudioServicesPlayAlertSoundWithCompletion(kSystemSoundID_Vibrate) {
    // your callback code when the vibration is done
    // (it may not vibrate on an iPod, but this callback will always be called)
}
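The system-sound variant from the note above works the same way; a minimal sketch:
import AudioToolbox

AudioServicesPlaySystemSoundWithCompletion(kSystemSoundID_Vibrate) {
    // Called once the vibration (or the silent no-op on unsupported devices) finishes.
    print("vibration finished")
}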
Remember to import AudioToolbox; the AudioServices functions are declared there.
Answer 6:
And if you're using the Xamarin (MonoTouch) framework, simply call
SystemSound.Vibrate.PlayAlertSound()
Answer 7:
In my travels I have found that if you try either of the following while you are recording audio, the device will not vibrate even if it is enabled.
1) AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
2) AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
My method was called at a specific time in the measurement of the device's movements. I had to stop the recording and then restart it after the vibration had occurred.
It looked like this.
- (void)vibrate {
    [recorder stop];
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
    [recorder start];
}
recorder is an AVRecorder instance.
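A hedged Swift equivalent, assuming the recorder is an AVAudioRecorder:
import AudioToolbox
import AVFoundation

// A sketch only: `recorder` is assumed to be an AVAudioRecorder owned elsewhere.
func vibrate(interrupting recorder: AVAudioRecorder) {
    recorder.stop()                                        // vibration is suppressed while recording
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)   // now the device can vibrate
    _ = recorder.record()                                  // resume recording afterwards
}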
Hope this helps others that have had the same problem before.
Answer 8:
In iOS 10, and on newer iPhones, you can also use the haptics API. This haptic feedback is softer than the AudioToolbox vibration.
For your GAME OVER scenario, a heavy UI impact feedback should be suitable.
UIImpactFeedbackGenerator(style: .heavy).impactOccurred()
You could use the other haptic feedback styles.
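For example (note that .soft and .rigid require iOS 13 or later):
import UIKit

// Lighter alternatives to the heavy impact above
UIImpactFeedbackGenerator(style: .light).impactOccurred()
UIImpactFeedbackGenerator(style: .medium).impactOccurred()

// iOS 13 added two further styles
UIImpactFeedbackGenerator(style: .soft).impactOccurred()
UIImpactFeedbackGenerator(style: .rigid).impactOccurred()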
Answer 9:
For an iPhone 7/7 Plus or newer, use these three Haptic feedback APIs.
Available APIs
For notifications:
let generator = UINotificationFeedbackGenerator()
generator.notificationOccurred(.error)
Available feedback types are .error, .success, and .warning. Each has its own distinctive feel.
From the docs:
A concrete UIFeedbackGenerator subclass that creates haptics to communicate successes, failures, and warnings.
For simple vibrations:
let generator = UIImpactFeedbackGenerator(style: .medium)
generator.impactOccurred()
Available styles are .heavy, .medium, and .light. These are simple vibrations with varying degrees of "hardness".
From the docs:
A concrete UIFeedbackGenerator subclass that creates haptics to simulate physical impacts.
For when the user selects an item:
let generator = UISelectionFeedbackGenerator()
generator.selectionChanged()
This is the least noticeable of all the haptics, and so is the most suitable for when haptics should not be taking over the app experience.
From the docs:
A concrete UIFeedbackGenerator subclass that creates haptics to indicate a change in selection.
Notes
There are a couple of things worth remembering when using these APIs.
Note A
You do not actually create the haptic yourself; you request that the system generate one. The system decides based on the following:
- Whether haptics are possible on the device (i.e. whether it has a Taptic Engine)
- Whether the app may record audio (haptics are not generated during recording, to prevent unwanted interference)
- Whether haptics are enabled in system Settings
Therefore, the system will silently ignore your request for a haptic if it is not possible. If this is due to an unsupported device, you could try this:
func haptic() {
    // Get whether the device can generate haptics or not.
    // If _feedbackSupportLevel is nil, fall back to 0.
    let feedbackSupportLevel = UIDevice.current.value(forKey: "_feedbackSupportLevel") as? Int ?? 0

    switch feedbackSupportLevel {
    case 2:
        // 2 means the device has a Taptic Engine.
        // Put Taptic Engine code here, using the APIs explained above.
        break
    case 1:
        // 1 means no Taptic Engine, but AudioToolbox is supported.
        // AudioToolbox code from the myriad of other answers!
        break
    default: // 0
        // No haptic support.
        // Do something else, like a beeping noise or an LED flash, instead of haptics.
        break
    }
}
Substitute the comments in the switch cases, and this haptic-generation code will be portable to other iOS devices. It will generate the highest level of haptic possible.
Note B
- Because generating haptics is a hardware-level task, there may be latency between when you call the haptic-generation code and when it actually happens. For this reason, the Taptic Engine APIs all have a prepare() method to put the engine in a state of readiness. Using your Game Over example: you may know that the game is about to end because the user has very low HP or a dangerous monster is near them.
- If you don't generate a haptic within a few seconds, the Taptic Engine goes back into an idle state (to save battery life).
In this case, preparing the Taptic Engine would create a higher-quality, more responsive experience.
For example, let's say your app uses a pan gesture recognizer to change the visible portion of the world. You want a haptic to be generated when the user 'looks' round a full 360 degrees. Here is how you could use prepare():
// `haptic` is assumed to be a stored property, so the generator that is
// prepared in .began is the same one that fires in .changed.
let haptic = UIImpactFeedbackGenerator(style: .heavy)

@IBAction func userChangedViewablePortionOfWorld(_ gesture: UIPanGestureRecognizer) {
    switch gesture.state {
    case .began:
        // The user started dragging the screen.
        haptic.prepare()
    case .changed:
        // The user is trying to 'look' in another direction.
        // Code to change the viewable portion of the virtual world goes here;
        // virtualWorldViewpointDegreeMiddle comes from your own world model.
        if virtualWorldViewpointDegreeMiddle == 360.0 {
            haptic.impactOccurred()
        }
    default:
        break
    }
}
Answer 10:
In Swift:
import AudioToolbox
...
AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
Answer 11:
In my case I was using an AVCaptureSession. AudioToolbox was in the project's build phases and it was imported, but it still didn't work. To make it work, I stopped the session before the vibration and resumed it afterwards.
#import <AudioToolbox/AudioToolbox.h>
...
@property (nonatomic) AVCaptureSession *session;
...
- (void)vibratePhone
{
    [self.session stopRunning];
    NSLog(@"vibratePhone %@", @"here");
    if ([[UIDevice currentDevice].model isEqualToString:@"iPhone"])
    {
        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
    }
    else
    {
        AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
    }
    [self.session startRunning];
}
Answer 12:
You can use
1) AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
for the iPhone and a few newer iPods.
2) AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
for iPads.
Source: https://stackoverflow.com/questions/4724980/making-the-iphone-vibrate