Tutorial: Detecting When A User Blows Into The Mic

By Dan Grigsby
On August 19, 2009

If, a couple of years back, you’d told me that people would expect to be able to shake their phone or blow into the mic to make something happen I would have laughed. And here we are.

Detecting a shake gesture is straightforward, all the more so in 3.0 with the introduction of motion events.

Detecting when a user blows into the microphone is a bit more difficult. In this tutorial we’ll create a simple single-view app that writes a log message to the console when a user blows into the mic.

Source/GitHub

The code for this tutorial is available on GitHub. You can either clone the repository or download this zip.

Overview

The job of detecting when a user blows into the microphone is separable into two parts: (1) taking input from the microphone and (2) listening for a blowing sound.

We’ll use the new-in-3.0 AVAudioRecorder class to grab the mic input. Choosing AVAudioRecorder lets us use Objective-C without — as other options require — dropping down to C.

The sound of someone blowing into the mic is dominated by low frequencies. We’ll use a low pass filter to reduce the high-frequency sounds coming in on the mic; when the level of the filtered signal spikes, we’ll know someone’s blowing into it.
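
To preview where we’re headed, the filter is just an exponential moving average of the mic’s peak level. Here’s a minimal sketch of the idea; ALPHA is a smoothing constant and currentPeak stands in for the recorder’s peak level converted to a linear 0-1 scale, both of which we’ll get to later in the tutorial:

// Sketch only: an exponential moving average acting as a low pass filter.
// Smaller ALPHA values weight past samples more heavily, smoothing out
// brief spikes. currentPeak is assumed to be on a linear 0-1 scale.
static double lowPass(double currentPeak, double previousResult) {
	const double ALPHA = 0.05;
	return ALPHA * currentPeak + (1.0 - ALPHA) * previousResult;
}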

Creating The Project

Launch Xcode and create a new View-Based iPhone application called MicBlow:

  1. Create a new project using File > New Project… from Xcode’s menu
  2. Select View-based Application from the iPhone OS > Application section, click Choose…
  3. Name the project MicBlow and click Save

Adding The AVFoundation Framework

In order to use the SDK’s AVAudioRecorder class, we’ll need to add the AVFoundation framework to the project:

  1. Expand the Targets branch in the Groups & Files panel of the project
  2. Control-click or right-click the MicBlow item
  3. Choose Add > Existing Frameworks…
  4. Click the + button at the bottom left beneath Linked Libraries
  5. Choose AVFoundation.framework and click Add
  6. AVFoundation.framework will now be listed under Linked Libraries. Close the window

Next, we’ll import the AVFoundation headers in our view controller’s interface file and set up an AVAudioRecorder instance variable:

  1. Expand the MicBlow project branch in the Groups & Files panel of the project
  2. Expand the Classes folder
  3. Edit MicBlowViewController.h by selecting it
  4. Update the file as follows:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>

@interface MicBlowViewController : UIViewController {
	AVAudioRecorder *recorder;
}

@end

To save a step later, we also imported the CoreAudioTypes header; we’ll need some of its constants (kAudioFormatAppleLossless, for instance) when we set up the AVAudioRecorder.

Taking Input From The Mic

We’ll set everything up and start listening to the mic in viewDidLoad:

  1. Uncomment the boilerplate viewDidLoad method
  2. Update it as follows:
- (void)viewDidLoad {
	[super viewDidLoad];

	NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];

	NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
		[NSNumber numberWithFloat: 44100.0],                 AVSampleRateKey,
		[NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
		[NSNumber numberWithInt: 1],                         AVNumberOfChannelsKey,
		[NSNumber numberWithInt: AVAudioQualityMax],         AVEncoderAudioQualityKey,
		nil];

	NSError *error;

	recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];

	if (recorder) {
		[recorder prepareToRecord];
		recorder.meteringEnabled = YES;
		[recorder record];
	} else {
		NSLog(@"%@", [error description]);
	}
}

The primary function of AVAudioRecorder is, as the name implies, to record audio. As a secondary function, it provides audio-level information. Here we discard the audio input by dumping it to the /dev/null bit bucket (I can’t find documentation to confirm it, but the consensus is that, on the iPhone as on any other Unix, /dev/null simply discards whatever is written to it) and explicitly turn on audio metering.

Note: if you’re adapting the code for your own use, be sure to send the prepareToRecord (or record) message before setting the meteringEnabled property, or the audio level metering won’t work.
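
Also, if you’re adapting the code and actually want to keep the recording, point the recorder at a real file instead of /dev/null. A sketch, with micblow.caf as a made-up file name:

// Sketch: record to a file in the app's Documents directory instead of /dev/null.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *path = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"micblow.caf"];
NSURL *url = [NSURL fileURLWithPath:path];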

Remember to release the recorder in dealloc:

- (void)dealloc {
	[recorder release];
	[super dealloc];
}

Sampling The Audio Level

We’ll use a timer to check the audio levels approximately 30 times a second. Add an NSTimer instance variable and declare its callback method in MicBlowViewController.h:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>

@interface MicBlowViewController : UIViewController {
	AVAudioRecorder *recorder;
	NSTimer *levelTimer;
}

- (void)levelTimerCallback:(NSTimer *)timer;

@end

Update the .m file’s viewDidLoad to schedule the timer:

- (void)viewDidLoad {
	[super viewDidLoad];

	NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];

	NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
		[NSNumber numberWithFloat: 44100.0],                 AVSampleRateKey,
		[NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
		[NSNumber numberWithInt: 1],                         AVNumberOfChannelsKey,
		[NSNumber numberWithInt: AVAudioQualityMax],         AVEncoderAudioQualityKey,
		nil];

	NSError *error;

	recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];

	if (recorder) {
		[recorder prepareToRecord];
		recorder.meteringEnabled = YES;
		[recorder record];
		levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.03 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
	} else {
		NSLog(@"%@", [error description]);
	}
}

For now, we’ll just sample the audio input level directly, with no filtering. Add the implementation of levelTimerCallback: to the .m file:

- (void)levelTimerCallback:(NSTimer *)timer {
	[recorder updateMeters];
	NSLog(@"Average input: %f Peak input: %f", [recorder averagePowerForChannel:0], [recorder peakPowerForChannel:0]);
}

Sending the updateMeters message refreshes the average and peak power meters. The meters use a logarithmic (decibel) scale, where -160 means near silence and zero means maximum input.
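
That logarithmic scale is easier to reason about once it’s mapped onto a linear one, which is exactly what the low pass filter code later in the tutorial does with pow(10, 0.05 * dB): a peak of -20 dB becomes 0.1, and 0 dB becomes 1.0. As a standalone sketch of the conversion:

#include <math.h>

// Sketch: convert a metering value in decibels (-160 .. 0) to a linear 0-1 scale.
static double linearLevelFromDecibels(float decibels) {
	return pow(10, 0.05 * decibels);
}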

Don’t forget to clean up the timer in dealloc. Because scheduledTimerWithTimeInterval: doesn’t give us ownership of the timer (the run loop retains it), we invalidate it rather than release it:

- (void)dealloc {
	[levelTimer invalidate];
	[recorder release];
	[super dealloc];
}

Listening For A Blowing Sound

As mentioned in the overview, we’ll use a low pass filter to diminish the contribution of high-frequency sounds to the overall level. The filter maintains a running result that incorporates past sample input, so we’ll need an instance variable to hold it. Update the .h file as follows:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>

@interface MicBlowViewController : UIViewController {
	AVAudioRecorder *recorder;
	NSTimer *levelTimer;
	double lowPassResults;
}

- (void)levelTimerCallback:(NSTimer *)timer;

@end

Implement the algorithm by replacing the levelTimerCallback: method with:

- (void)levelTimerCallback:(NSTimer *)timer {
	[recorder updateMeters];

	const double ALPHA = 0.05;
	double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
	lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;

	NSLog(@"Average input: %f Peak input: %f Low pass results: %f", [recorder averagePowerForChannel:0], [recorder peakPowerForChannel:0], lowPassResults);
}

Each time the timer fires, its callback recalculates the lowPassResults level variable. As a convenience, the peak power is converted from decibels to a 0-1 scale, where zero is complete quiet and one is full volume.

We’ll recognize someone as having blown into the mic when the low pass filtered level crosses a threshold. Choosing the threshold is somewhat of an art: set it too low and it’s easily triggered; set it too high and the person has to breathe into the mic at gale force and at length. For my app’s needs, 0.95 works. We’ll replace the log line with a simple conditional:

- (void)levelTimerCallback:(NSTimer *)timer {
	[recorder updateMeters];

	const double ALPHA = 0.05;
	double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
	lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;

	if (lowPassResults > 0.95)
		NSLog(@"Mic blow detected");
}

Voila! You can detect when someone blows into the mic.

Caveats and Acknowledgements

This approach works well in most situations, but not universally: I’m writing this article in-flight. The roar of the engines constantly triggers the algorithm. Similarly, a noisy room will often have enough low-frequency sound to trigger the algorithm.
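
If false triggers from ambient noise are a problem for your app, one option (not something this tutorial’s sample app does) is to require the filtered level to stay above the threshold for several consecutive timer ticks before calling it a blow. A sketch, assuming an extra int instance variable named blowingSampleCount added to the class:

- (void)levelTimerCallback:(NSTimer *)timer {
	[recorder updateMeters];

	const double ALPHA = 0.05;
	double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
	lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;

	// Only report a blow after 10 consecutive ticks above the threshold,
	// roughly a third of a second at 30 samples per second.
	if (lowPassResults > 0.95) {
		blowingSampleCount++;
		if (blowingSampleCount == 10)
			NSLog(@"Mic blow detected");
	} else {
		blowingSampleCount = 0;
	}
}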

The algorithm was extracted/adapted from this Stack Overflow post. The post used the SCListener library for its audio level detection. SCListener pre-dates AVAudioRecorder; it was created to hide the details of dropping down to C to get audio input. With AVAudioRecorder this is no longer so tough.

Finally, this does work in the simulator; you just need to locate the built-in mic on your Mac. To my surprise, the mic on my first-generation MacBook is in the tiny hole to the left of the camera.

33 responses to “Tutorial: Detecting When A User Blows Into The Mic”

  1. Luca says:

    thank you for the tutorial, it’s pretty hard to find decent information on iphone sdk audio related stuff.
    i have question regarding the measurement … right now you detect a certain peak level to check if the user blows in the mic. would it be possible to detect a certain frequency range??
    thank you again!

  2. Dan Grigsby says:

    Luca: that would require using a “Fast Fourier Transform” (FFT). An FFT is the same algorithm that, e.g., guitar tuner apps use. For more on FFT, you might like this interview with Pete Schwamb. Pete uses FFT to pick out the frequency range of cricket chirps to detect the outside air temperature!

  3. Luca says:

    Hi Dan! Thank you for the fast response, i will definitely check out the podcast.

  4. Trevor says:

    “Remember to release the recorder in dealloc.”

    This is wrong. Calling scheduledTimerWithTimeInterval does not give you ownership of the timer. It is retained by the run loop, not you. Therefore you should not release it. You should instead call invalidate, which will cause the run loop to release the timer.

  5. Camilo says:

    Hi, I think I found my problem but not the solution, when you create the url you are creating /dev/null, what is URL refering to. If I change it in your code i get my same results than mine, so the problem is that I have to set this up, but I havent found a good explanation on the AVaudio documentation, so any help would be great

    Thanks.

  6. Camilo says:

    Hello I actually found the problem, and its that I was using a Singleton Sound manager to play my sounds, as soon as I disabled it i was able to get the sound input.

    Now my problem is htat I cannot play my sounds, how are u guys playing yours, and record the input at the same time?

    Thanks for the help

  7. balagurubaran says:

    hai friend
    thanks for posting those kind of information

    but i need one more help from, please explain how to specify the path?

    thanks for advance

  8. David Cann says:

    Hi Dan, thanks for the code!

    Correction to the github code: it has a less than sign where it should have a greater than sign on line 38 of MicBlowViewController.m! The code on this web page is correct, though.

    I also agree with the commenter above that it should be [levelTimer invalidate] instead of [levelTimer release] on line 44.

  9. thyphuong says:

    Good day Dan,

    I met a problem that whenever I stop blowing to microphone, the nslog still detect the action for a period of about 2-3s. Anyway to stop it when we stop blowing? And is the method levelTimerCallback just called 1 times in 1 breath we close?

  10. Dev says:

    Thanks for posting this. I tried changing this code a bit. Added this line to the AVAudioRecorder settings

    [recordSetting setValue :[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];

    This works ok. But if I change the 16 to 8 (for reducing the size of the output wav file), the app does not record. According to the docs, 8 should also work.

    Any ideas? Thanks.

  11. Tamer says:

    Very good tutorial! Thank you for sharing! 🙂

  12. Felix says:

    hey Dan! thanks for posting this.
    I have a problem that the blowing runs well on the iphone simulator but when i install my app on the device it cannot recognize the blowing.
    any1 have solution to this?
    thanks.

  13. BigPapoo says:

    Hi

    Very nice post.
    Just a minor typo : It’s “Voila” and not “Viola”. For french speaking people “Viola” means “Raped”… Not what you were thinking of, I’m sure 🙂

  14. Dan Grigsby says:

    BigPapoo: whoops. fixed!

  15. anon says:

    Thank you so much for this!!! It works amazingly!

    @felix – try changing the if(lowpassresults > 95) to a smaller number. I’ve not actually tried this on anything but the simulator yet (I’ve not yet paid for the dev license), but I assume it’ll work… It may be worth putting a sensitivity slider/variable in your game for the users to change in an options screen (that’s what I’ll do!).

  16. Zeyad says:

    thanx for the tutorial. but it didn’t work for me… i was blowing on the mic and the iphone like crazy and nothing happen.
    so i changed the code to this .. and now it works fine.. but still i have no idea why …
    if (lowPassResults > 0.5) {
        NSLog(@"Mic blow detected");
        label.text = @"Mic blow detected";
    }

    thanks again.
    Zeyad
    Kuwait

  17. BigMac says:

    Same problem as felix. Work on simulator but not on the device.
    Zeyad: doesn’t work for me.

    My log is always this one (with the iPhone)
    Average input: -120.000000 Peak input -120.000000 Low pass results: 0.000001

  18. BigMac says:

    I’ve found the problem but not the solution.

    Here some explications about the problem :

  19. BigMac says:

    I’ve found the solution ! Click on the link above !

  20. alfa says:

    hi, i want to ask , what is the reason of using the Low Pass Filter in this application??

  21. Zeyad says:

    BigMac .. please provide the sample code . i can not play sound on device.

  22. Antoine says:

    Thank you very much, this tutorial is extremely helpful.

    I have a question, do you know if there is a way set up a filter that can calculate the direction of the sound. What I would like to do is for the app to only log the data if the sound is coming straight at the device.

    Thanks again
    Antoine

  23. Ren says:

    I’m implementing the audio recording in my app, i’ve tried the code you’ve wrote in the simulator and all work good, but when i try in the real device the audio is staying at the following level all the time. any idea ?

    2014-10-27 17:29:09.835 MicBlow[1095:143453] Average input: -120.000000 Peak input: -120.000000

    2014-10-27 17:29:09.836 MicBlow[1095:143453] Average input: -120.000000 Peak input: -120.000000 Low pass results: 0.000000

    • iProgramer says:

      add this code in ViewDidLoad

      [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];

  24. sandeep says:

    Is it possible to use this for RTSP streams from a wireless mic

  25. Henry Ross says:

    Nice post, but at now i am using Total recall app fro recording call. This app is also great work.!! recording app

  26. jaffery says:

    i have to made the musical flute and i have to blow in mic as an input.
    then intensity of input is measured and same as its intensity flute produce a sound..
    kindly guide me brother
    thankx in advance