Audio Recording in SwiftUI MVVM with AVFoundation | An iOS App

Yasir
5 min read · Feb 15, 2021

If you are an iOS developer, take a look here. You will find this article very helpful.

Why This Article?

When I was exploring AVFoundation and looking for good examples and documentation, there was hardly anything out there. The few resources that do exist are poorly written.

Overview

Objective of This Article

We are going to understand AVFoundation and implement it in a SwiftUI app project, following the MVVM pattern, with a line-by-line understanding of the code.

Source and Code

Below is the repository for this project. Give it a star, fork it, then download and open it. Keep comparing the article's code with the project's code for better understanding.

Introduction to AVFoundation

AVFoundation is an extremely large framework, and we can do a lot with it. In this project, however, we will use only the recording and playback features.

The best way to understand AVFoundation, or anything else, is to use it in practice :)

Key Points for AVFoundation

  • Recordings are stored in the system as URLs, of type URL.
  • Each recording file name must be unique.
  • There are lots of built-in features; try to explore them further.

Naming Our Project

Let's give our project a good name. I decided on “CO-Voice”. What do you think about the name? Let me know on my social links at the end of this article.

Setting Up Our Project

Now open the project, take a look, and then come back here. Let's understand the folders we have created.

Views: contains the SwiftUI views for our app
ViewModels: contains all of our view models
Models: contains the models for a recording
Extensions: contains some helper extensions

First Step in Our Project

Note: Make sure to open the final project and compare the following code with the project's code.

The first thing we are going to look at is our view model. The view model is reactive in nature, so the UI can observe it easily and update when it changes.

We import AVFoundation and create a class named VoiceViewModel that conforms to ObservableObject. Then we create audioRecorder and audioPlayer, two variables of type AVAudioRecorder! and AVAudioPlayer!.
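For reference, here is a minimal sketch of that skeleton, based only on the description above; the class in the repository has more members than this:

```swift
import Foundation
import AVFoundation

// Minimal skeleton of the view model described above.
class VoiceViewModel: ObservableObject {
    var audioRecorder: AVAudioRecorder!
    var audioPlayer: AVAudioPlayer!
}
```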

Implementing Start & Stop Recording

Now the first feature we are going to implement is starting and stopping a recording:

[Code snippet: VoiceViewModel — see the project repository]

Above Code Explanation

Line 9: We create a variable that tracks whether recording has started; we will need it when building the UI.

Line 11: We create an array to store the URL of each recording along with some details; the element type of that array is Recording. The Recording struct can be found in the Model folder of the project repository.

Line 14: We initialise the class; we will call a function here later.

Line 19: We create the start-recording function and handle some setup formalities; the lines worth understanding are as follows.

Line 29: The path holds the directory where the recording will be stored.

Line 30: We have to give every recording file a unique name, so we use the current date and time. Note that the trailing “.m4a” extension is essential. We use a helper function to convert the current date into a string; you can find it in the Extensions folder of the project repository.

Line 46: When the recording has started successfully, we set that variable to true.

Line 54: Here we create a function to stop the recording and set that variable back to false.
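The snippet itself lives in the repository, so the line numbers above refer to the project's file. Below is a condensed sketch of the same start/stop logic; the names isRecording, recordingsList, and fetchAllRecording, the Recording fields, and the recorder settings are assumptions drawn from the explanation, so match them against the real code:

```swift
// Assumed shape of the Recording model (the real one is in the Model folder).
struct Recording: Equatable {
    let fileURL: URL
    let createdAt: Date
    var isPlaying: Bool
}

// Inside VoiceViewModel:
@Published var isRecording: Bool = false          // Line 9: has recording started?
@Published var recordingsList = [Recording]()     // Line 11: all saved recordings

init() {
    fetchAllRecording()                           // Line 14: load existing files (sketched in the next section)
}

func startRecording() {
    // Configure the shared audio session for recording.
    let recordingSession = AVAudioSession.sharedInstance()
    do {
        try recordingSession.setCategory(.playAndRecord, mode: .default)
        try recordingSession.setActive(true)
    } catch {
        print("Cannot set up the recording session")
    }

    // Line 29: the documents directory where recordings are stored.
    let path = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]

    // Line 30: unique file name built from the current date and time; ".m4a" is essential.
    // (The project uses a Date extension for this; a plain DateFormatter is used here.)
    let formatter = DateFormatter()
    formatter.dateFormat = "dd-MM-yyyy 'at' HH:mm:ss"
    let fileName = path.appendingPathComponent("CO-Voice : \(formatter.string(from: Date())).m4a")

    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 12000,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]

    do {
        audioRecorder = try AVAudioRecorder(url: fileName, settings: settings)
        audioRecorder.prepareToRecord()
        audioRecorder.record()
        isRecording = true                        // Line 46: recording started successfully
    } catch {
        print("Failed to set up the recording")
    }
}

// Line 54: stop the recorder and flip the flag back.
func stopRecording() {
    audioRecorder.stop()
    isRecording = false
}
```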

Fetching All the Recordings

Now we will fetch all the recordings from local storage into the array we created. See the following code and the line-by-line explanation:

[Code snippet: VoiceViewModel — see the project repository]

Above Code Explanation

Line 6: We walk through the recordings directory and append each recording to our array.

Line 10: We sort the array in descending order.
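Again, the line numbers refer to the project's snippet; a rough sketch of the same fetch logic, under the same assumed names as above, could look like this:

```swift
// Inside VoiceViewModel: load every saved recording from the documents directory.
func fetchAllRecording() {
    recordingsList.removeAll()

    let path = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    guard let directoryContents = try? FileManager.default.contentsOfDirectory(
        at: path,
        includingPropertiesForKeys: [.creationDateKey]
    ) else { return }

    // Line 6: walk the directory and append each recording to the array.
    for url in directoryContents {
        let created = (try? url.resourceValues(forKeys: [.creationDateKey]).creationDate) ?? Date()
        recordingsList.append(Recording(fileURL: url, createdAt: created, isPlaying: false))
    }

    // Line 10: newest recordings first (descending order).
    recordingsList.sort { $0.createdAt.compare($1.createdAt) == .orderedDescending }
}
```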

Playing & Stopping the Audio

Now it's time to implement the functions for playing our recorded audio.

[Code snippet: VoiceViewModel — see the project repository]

Above Code Explanation

Line 1: We pass in the URL of the recorded file so that we play only that audio.

Line 16: Here we iterate over the list and set the isPlaying variable to true, since that recording is now playing.

Line 28: Stopping playback stops all currently playing audio, but we take the URL so that we can toggle the variable in our list of recordings.

Line 32: We iterate over our list and set that recording's flag back to false.
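Here is a condensed sketch of the playback pair under the same assumptions; again, the line numbers above refer to the repository's snippet:

```swift
// Inside VoiceViewModel:
func startPlaying(url: URL) {                     // Line 1: play only this file's URL
    // Route the output to the speaker before playing.
    let playSession = AVAudioSession.sharedInstance()
    do {
        try playSession.overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
    } catch {
        print("Routing audio to the speaker failed")
    }

    do {
        audioPlayer = try AVAudioPlayer(contentsOf: url)
        audioPlayer.prepareToPlay()
        audioPlayer.play()

        // Line 16: mark the matching recording as playing so the UI can react.
        for index in recordingsList.indices where recordingsList[index].fileURL == url {
            recordingsList[index].isPlaying = true
        }
    } catch {
        print("Playing failed")
    }
}

func stopPlaying(url: URL) {                      // Line 28: stop whatever is playing
    audioPlayer.stop()

    // Line 32: toggle the flag back off for this recording.
    for index in recordingsList.indices where recordingsList[index].fileURL == url {
        recordingsList[index].isPlaying = false
    }
}
```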

Deleting the Recording

Now it's time to delete a recording. See the following code to learn the implementation.

[Code snippet: VoiceViewModel — see the project repository]

Above Code Explanation

Line 1: To delete a recording from the system, we need its URL.

Line 4: On this line, we delete that recording from disk.

Line 9: We iterate over our recordings list and check whether the audio is playing; if it is the recording we want to delete and it is playing, we stop it first.

Line 16: Finally, we remove the recording from our recordings array.
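A sketch of the deletion logic, still under the same assumed names:

```swift
// Inside VoiceViewModel:
func deleteRecording(url: URL) {                  // Line 1: we need the file's URL
    // Line 4: remove the file itself from disk.
    do {
        try FileManager.default.removeItem(at: url)
    } catch {
        print("Can't delete the recording file")
    }

    // Line 9: if this recording is currently playing, stop it before removing it.
    for index in recordingsList.indices where recordingsList[index].fileURL == url {
        if recordingsList[index].isPlaying {
            stopPlaying(url: url)
        }
    }

    // Line 16: finally drop the entry from the array so the UI updates.
    recordingsList.removeAll { $0.fileURL == url }
}
```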

Looking at Our ViewModel

Our view model is almost ready, but we need a few more features; for example, we want to show a live text timer in the UI for the recording duration. We have omitted some code in the snippets above; you will find all of it in the final project.

Take a look at the final app and start tracing from the view model. You will see the reason for every line of code, one by one.

CO-Voice UI

The UI for this app is really simple; just follow the code line by line and trace it in the final project's UI file.

[Screenshot: CO-Voice app UI]
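To give a rough idea of how a view can drive this view model, here is a minimal, hypothetical ContentView; the actual CO-Voice UI in the repository is richer than this:

```swift
import SwiftUI

struct ContentView: View {
    @StateObject var vm = VoiceViewModel()

    var body: some View {
        VStack {
            // List every saved recording by file name.
            List(vm.recordingsList, id: \.fileURL) { recording in
                Text(recording.fileURL.lastPathComponent)
            }

            // One button toggles between starting and stopping a recording.
            Button(vm.isRecording ? "Stop Recording" : "Start Recording") {
                if vm.isRecording {
                    vm.stopRecording()
                } else {
                    vm.startRecording()
                }
            }
            .padding()
        }
    }
}
```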

Wrapping Up

In this app, there is a lot left to implement and a lot to refactor. You can also contribute to this project; it's open to all. I keep pushing continuous updates to this app with better features.

Where to Go From Here?

My Socials: LinkedIn · GitHub

Clap for the article if you liked it :)
