iOS AVAudioEngine tutorial: Pitch, Tempo, and Playing Multiple Audio Files Simultaneously in iOS 16.0

The iOS platform provides a comprehensive set of audio frameworks for building a rich app experience. Use them to immerse your users in multichannel audio alongside visual rendering. Below is a list of the frameworks used for audio.

  1. Media Player Framework – Provides interfaces for finding and playing songs, audio podcasts, audio books, and other media from the user's library.
  2. AVPlayer – A player is a controller object that manages the playback and timing of a media asset. Use an instance of AVPlayer to play local and remote file-based media, such as QuickTime movies and MP3 audio files, as well as audiovisual media served using HTTP Live Streaming (see the short sketch after this list).
  3. AVFoundation Framework – AVFoundation combines several major technology areas that together encompass a wide range of tasks for inspecting, playing, capturing, and processing audiovisual media on Apple platforms. It is used for recording and processing audio.
  4. Audio Unit – Adds sophisticated audio manipulation and processing capabilities to your app.
  5. Audio Toolbox – The AudioToolbox framework provides interfaces for recording, playback, and stream parsing. In iOS, the framework provides additional interfaces for managing audio sessions.
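
As a quick illustration of AVPlayer (item 2 above), here is a minimal sketch; the remote URL is a placeholder, not a real asset:

import AVFoundation

// Minimal AVPlayer usage: play a remote MP3 (placeholder URL).
if let url = URL(string: "https://example.com/audio.mp3") {
    let player = AVPlayer(url: url)
    player.play()
}

In a real app, keep a strong reference to the player (for example, in a property), otherwise it is deallocated and playback stops.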

In this tutorial we will use the AVAudioEngine class, which is part of the AVFoundation framework.

How does AVAudioEngine work? Objects of the following classes are required to play an audio file:

  1. AVAudioEngine
  2. AVAudioPlayerNode
  3. AVAudioFile

You can see how these class objects work together in the following diagram.

In the above image, the audio file is an object of AVAudioFile:

// AVAudioFile(forReading:) expects a URL, not a string; load "audio.wav" from the app bundle.
let audioUrl = Bundle.main.url(forResource: "audio", withExtension: "wav")!
do {
    audioFile = try AVAudioFile(forReading: audioUrl)
} catch {
    print("Error loading audio file: \(error.localizedDescription)")
}

In the above image, the player is an object of AVAudioPlayerNode:

 let audioPlayer = AVAudioPlayerNode()

AVAudioEngine connects the player and the effect node so that the effect can be applied in real time.

In the above image, the effect is the pitchControl node:

let pitchControl = AVAudioUnitTimePitch()

The connections are established with the following lines:

audioEngine.attach(audioPlayer)
audioEngine.attach(pitchControl)
audioEngine.connect(audioPlayer, to: pitchControl, format: audioFile?.processingFormat)
audioEngine.connect(pitchControl, to: audioEngine.mainMixerNode, format: audioFile?.processingFormat)

Once the connections are established, schedule the file for playback:

audioPlayer.scheduleFile(audioFile, at: nil) // the player schedules the audio file for playback
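
If you need a callback when the file has been played, scheduleFile(_:at:completionHandler:) accepts a completion handler. Note that the handler fires once the player has consumed the file's data, which can be slightly before the last samples are actually heard:

audioPlayer.scheduleFile(audioFile, at: nil) {
    // Called on a background thread; dispatch to the main queue before touching UI.
    print("Finished playing the scheduled file")
}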

Start the audio engine:

do {
    try audioEngine.start()
} catch {
    fatalError("Unable to start audio engine: \(error.localizedDescription)")
}
audioPlayer.play(at: startTime) // play the audio at the computed start time

Note: in the code below I compute the time at which the player will start. If you want to delay playback of a particular audio file, assign a value in seconds to kStartDelayTime:

let kStartDelayTime: Float = 0.0

let outputFormat = audioPlayer.outputFormat(forBus: 0)
let startSampleTime = audioPlayer.lastRenderTime?.sampleTime ?? 0
let delayFrames = AVAudioFramePosition(kStartDelayTime * Float(outputFormat.sampleRate))
let startTime = AVAudioTime(sampleTime: startSampleTime + delayFrames, atRate: outputFormat.sampleRate)
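
For example, at a sample rate of 44,100 Hz with kStartDelayTime = 0.5, delayFrames is 22,050, so playback begins half a second after the node's last render time.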

Below are the methods that change the pitch and tempo effects in real time:

func adjustPitch(pitchValue: Float) {
    // pitch is measured in cents; 100 cents = one semitone (valid range -2400...2400)
    pitchControl.pitch = pitchValue * 15
}

func adjustSpeed(tempoValue: Float) {
    // rate is a playback-speed multiplier; 1.0 is normal speed (valid range 1/32...32)
    pitchControl.rate = tempoValue
}
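
As a usage sketch, these methods can be driven from UISlider callbacks. The slider outlets and value ranges below are assumptions for illustration, not part of the original project:

@IBAction func pitchSliderChanged(_ sender: UISlider) {
    // Assumed slider range -160...160, mapped to -2400...2400 cents by the *15 factor
    adjustPitch(pitchValue: sender.value)
}

@IBAction func tempoSliderChanged(_ sender: UISlider) {
    // Assumed slider range 0.5...2.0 (half speed to double speed)
    adjustSpeed(tempoValue: sender.value)
}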

Below is the complete AudioPlayerManager.swift file.

//
//  AudioPlayerManager.swift
//  AudioPlayer
//
//  Created by Chaman Sharma on 19/11/23.
//

import UIKit
import AVFoundation

class AudioPlayerManager {
    var audioEngine: AVAudioEngine!
    var audioPlayer: AVAudioPlayerNode!
    var audioFile: AVAudioFile!
    let pitchControl = AVAudioUnitTimePitch()
    var kStartDelayTime: Float = 0.0

    init() {
        audioEngine = AVAudioEngine()
        audioPlayer = AVAudioPlayerNode()
        audioEngine.attach(audioPlayer)
        audioEngine.attach(pitchControl)
    }
    
    func loadAudioFile(audioUrl: URL) {
        do {
            audioFile = try AVAudioFile(forReading: audioUrl)
        } catch {
            print("Error loading audio file: \(error.localizedDescription)")
            return // do not rewire the graph if the file failed to load
        }
        audioEngine.reset()
        audioEngine.connect(audioPlayer, to: pitchControl, format: audioFile?.processingFormat)
        audioEngine.connect(pitchControl, to: audioEngine.mainMixerNode, format: audioFile?.processingFormat)
    }
    
    func play() {
        if audioPlayer.isPlaying {
            stop()
        }
        // Schedule after any stop(): stopping the player clears scheduled segments.
        audioPlayer.scheduleFile(audioFile, at: nil)
        do {
            try audioEngine.start()
        } catch {
            fatalError("Unable to start audio engine: \(error.localizedDescription)")
        }
        
        let outputFormat = audioPlayer.outputFormat(forBus: 0)
        let startSampleTime = audioPlayer.lastRenderTime?.sampleTime ?? 0
        let delayFrames = AVAudioFramePosition(kStartDelayTime * Float(outputFormat.sampleRate))
        let startTime = AVAudioTime(sampleTime: startSampleTime + delayFrames, atRate: outputFormat.sampleRate)
        audioPlayer.play(at: startTime)
    }
    
    func stop() {
        audioPlayer.stop()
        audioEngine.stop()
    }
    
    func adjustPitch(pitchValue: Float) {
        // pitch is measured in cents; 100 cents = one semitone (valid range -2400...2400)
        pitchControl.pitch = pitchValue * 15
    }
    
    func adjustSpeed(tempoValue: Float) {
        // rate is a playback-speed multiplier; 1.0 is normal speed (valid range 1/32...32)
        pitchControl.rate = tempoValue
    }
    
    func adjustVolume(volume: Float) {
        audioPlayer.volume = volume
    }
}
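
A minimal usage sketch for the class above. The bundled file name and the AVAudioSession configuration are assumptions added for completeness; activating a playback session ensures audio is audible even with the silent switch on:

import AVFoundation

let manager = AudioPlayerManager()

// Assumed: configure and activate the shared audio session for playback.
try? AVAudioSession.sharedInstance().setCategory(.playback)
try? AVAudioSession.sharedInstance().setActive(true)

// Assumed: an "audio.wav" resource bundled with the app.
if let url = Bundle.main.url(forResource: "audio", withExtension: "wav") {
    manager.loadAudioFile(audioUrl: url)
    manager.play()
    manager.adjustPitch(pitchValue: 50)   // +750 cents (50 * 15)
    manager.adjustSpeed(tempoValue: 1.5)  // 1.5x speed
}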

Below is the method used to play multiple audio files at a specific tempo (BPM) in a long-running loop (the 0...1000 iterations approximate an infinite loop). You can find this project on GitHub.

  func initiateLoop() {
        // 60,000 ms per minute divided by beats per minute = gap between beats in ms
        let bpm = Float(bpmText?.text ?? "") ?? 1
        let gap = 60000 / max(bpm, 1) // guard against a zero or invalid BPM
        var audioGap: Float = 0.0
        for _ in 0...1000 {
            self.filteredAudioArray.forEach { audio in
                let seconds = audioGap / 1000 // convert the accumulated gap from ms to seconds
                let timer = Timer.scheduledTimer(withTimeInterval: TimeInterval(seconds), repeats: false) { timer in
                    if !self.permanentStop {
                        audio.audioPlayerManager.loadAudioFile(audioUrl: audio.url!)
                        audio.audioPlayerManager.play()
                    }
                }
                self.sharedTimers.append(timer)
                audioGap = audioGap + (gap * (audio.gap ?? 1.0))
            }
        }
    }
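
To stop the loop, set the stop flag and invalidate the stored timers. A minimal sketch using the same properties as initiateLoop():

func stopLoop() {
    permanentStop = true                          // pending timers still fire but skip playback
    sharedTimers.forEach { $0.invalidate() }      // cancel timers that have not fired yet
    sharedTimers.removeAll()
    filteredAudioArray.forEach { $0.audioPlayerManager.stop() } // silence anything already playing
}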

You can find the full project on GitHub – https://github.com/ChamanSharma1234/AVAudioEngine

Conclusion – In this tutorial we discussed how the audio engine works and how it applies effects to audio in real time. I hope the GitHub project link is helpful to you. If you have any questions, please leave a comment below.
