The proximity sensor on the iPhone detects when the device is close to your face (or otherwise covered). There aren’t many times when using the sensor is of value; however, the Google Voice Search application has put it to good use as a means to trigger voice recording for a search request. If you have an interest in doing something similar, read on.
Proximity Sensor Monitoring
It all begins by enabling proximity monitoring, followed by setting up a notification observer that calls a method when the proximity state changes:
// Enable monitoring of the proximity sensor
[[UIDevice currentDevice] setProximityMonitoringEnabled:YES];

// Set up an observer for proximity state changes
[[NSNotificationCenter defaultCenter] addObserver:self
    selector:@selector(sensorStateChange:)
    name:UIDeviceProximityStateDidChangeNotification
    object:nil];
The method below is called when the sensor state changes; it prints a message to the debug console based on the reported proximity.
- (void)sensorStateChange:(NSNotification *)notification
{
    if ([[UIDevice currentDevice] proximityState] == YES)
        NSLog(@"Device is close to user.");
    else
        NSLog(@"Device is not close to user.");
}
Detecting the Proximity Sensor
Not all iOS devices have a proximity sensor. The Apple API documentation states that you should enable proximity monitoring and check proximityState; if the return value is NO, the device does not have a sensor.
I was unable to successfully use this approach to determine if a device has a sensor. Any additional ideas or suggestions are welcome.
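One variation that may be worth trying is to check whether proximityMonitoringEnabled itself remains NO after you attempt to enable it, rather than checking proximityState (which is also NO whenever nothing is near the sensor). A sketch of that check follows; the helper name deviceHasProximitySensor is my own:

- (BOOL)deviceHasProximitySensor
{
    UIDevice *device = [UIDevice currentDevice];

    // Attempt to enable monitoring; on devices without a sensor the
    // property reportedly stays NO.
    [device setProximityMonitoringEnabled:YES];
    BOOL hasSensor = [device isProximityMonitoringEnabled];

    // Turn monitoring back off if this was only a capability check
    [device setProximityMonitoringEnabled:NO];

    return hasSensor;
}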