My app Balletrax is a music player that people use while they teach ballet. It used to be possible to silence notifications while the app was in use, but now customers apparently have to know how to use Focus mode, remember to turn it on and off, and choose which notifications they do and don't want to allow. Is there no way to silence all notifications while the app is in use?
Hello everyone,
I’m new to Swift development and have been working on an audio module that plays a specific sound at regular intervals - similar to a workout timer that signals switching exercises every few minutes.
Following AVFoundation documentation, I’m configuring my audio session like this:
let session = AVAudioSession.sharedInstance()
try session.setCategory(
    .playback,
    mode: .default,
    options: [.interruptSpokenAudioAndMixWithOthers, .duckOthers]
)
self.engine.attach(self.player)
self.engine.connect(self.player, to: self.engine.outputNode, format: self.audioFormat)
try? session.setActive(true)
When it’s time to play cues, I schedule playback on a DispatchQueue:
// scheduleAudio uses DispatchQueue
self.scheduleAudio(at: interval.start) {
    do {
        try audio.engine.start()
        audio.node.play()
        for sample in interval.samples {
            audio.node.scheduleBuffer(sample.buffer, at: AVAudioTime(hostTime: sample.hostTime))
        }
    } catch {
        print("Audio activation failed: \(error)")
    }
}
This works perfectly in the foreground. But once the app goes into the background, the scheduled callback still runs, yet the audio engine fails to start, returning error code 561015905 (the FourCC '!pla', meaning the audio session could not start playback).
Interestingly, if the app is already playing audio before going to the background, the scheduled sounds continue to play as expected.
I have added the required background audio mode to my Info.plist by including the UIBackgroundModes key with the value audio:
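<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>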
Is there anything else I should configure? What is the best practice to play periodic audio when the app runs in the background? How do apps like turn-by-turn navigation handle continuous audio playback in the background?
Any advice or pointers would be greatly appreciated!
I'm reaching out regarding a recurring issue I'm experiencing with MusicKit developer tokens.
I'm using a valid .p8 private key to sign JWTs for Apple MusicKit integration. Each token I generate includes the appropriate claims (iss, iat, exp) and is signed with the ES256 algorithm, with an expiration date set approximately 6 months ahead.
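For reference, my generation code follows the standard CryptoKit pattern; here is a minimal sketch (the function name and the teamID/keyID/p8PEM parameters are placeholders for my real credentials):
import Foundation
import CryptoKit

// Minimal sketch of developer-token generation (placeholder credentials).
func makeDeveloperToken(teamID: String, keyID: String, p8PEM: String) throws -> String {
    func base64URL(_ data: Data) -> String {
        data.base64EncodedString()
            .replacingOccurrences(of: "+", with: "-")
            .replacingOccurrences(of: "/", with: "_")
            .replacingOccurrences(of: "=", with: "")
    }
    let headerDict: [String: String] = ["alg": "ES256", "kid": keyID]
    let now = Int(Date().timeIntervalSince1970)
    let claimsDict: [String: Any] = [
        "iss": teamID,                    // Apple Developer Team ID
        "iat": now,
        "exp": now + 60 * 60 * 24 * 180   // roughly 6 months ahead
    ]
    let header = try JSONSerialization.data(withJSONObject: headerDict)
    let claims = try JSONSerialization.data(withJSONObject: claimsDict)
    let signingInput = base64URL(header) + "." + base64URL(claims)
    // ES256 signature over "header.payload" using the .p8 key's PEM contents.
    let key = try P256.Signing.PrivateKey(pemRepresentation: p8PEM)
    let signature = try key.signature(for: Data(signingInput.utf8))
    return signingInput + "." + base64URL(signature.rawRepresentation)
}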
Everything works as expected immediately after generating the token. However, after a few days, the same JWT (still well within its expiration period) suddenly begins returning invalid/unauthorized responses when used in Postman and other API clients.
Importantly:
I did not delete or revoke the .p8 key during this time.
I verified the JWT contains valid claims and a proper structure.
The issue consistently resolves only when I create a new .p8 file and regenerate a fresh JWT with it—after which the cycle repeats.
This issue occurs even when the environment and app identifiers remain unchanged.
I would greatly appreciate it if you could help me understand:
Why these tokens become invalid after a few days, despite having a long exp value and an unchanged key.
Whether there's any automatic revocation or timeout policy on .p8 keys that could explain this behavior.
If there's a better way to maintain long-lived developer tokens without requiring new .p8 key generation every few days.
Thank you for your help and clarification on this issue.
Best regards,
Liad Altif
I am using MusicKit ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists containing hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get an error message saying:
"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))
It doesn't matter if songs are downloaded to the device or not.
I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can choose to sort playlist tracks in a different order than the default, which makes that initializer unusable for me.
I tried everything I could think of, including falling back to MPMusicPlayerController and passing an array of MPMusicPlayerPlayParameters to it, but the result was the same.
typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared
let entries: [QueueEntry] = tracks
    .compactMap {
        guard let song = $0 as? Song else { return nil }
        return QueueEntry(song)
    }

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
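One workaround I'm considering, sketched below under the untested assumption that the timeout is related to the queue size at prepare time: seed the queue with a smaller batch and append the remainder once playback has started (the batch size here is a guess):
// Untested sketch: seed the queue with a small batch, append the rest later.
let songs = tracks.compactMap { $0 as? Song }
let batchSize = 100  // hypothetical threshold, below the observed ~300 limit
let firstBatch = Array(songs.prefix(batchSize))
let remainder = Array(songs.dropFirst(batchSize))

Task {
    do {
        player.queue = ApplicationMusicPlayer.Queue(for: firstBatch)
        try await player.play()
        try await player.queue.insert(remainder, position: .tail)
    } catch {
        print(error)
    }
}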
When I play an m3u8 video using AVPlayer, it plays smoothly at 2x speed. However, when I set the rate to 3x, playback is not smooth and there is no sound.
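One thing I am wondering about (an assumption on my part, not confirmed for this stream): AVPlayerItem's audioTimePitchAlgorithm. The default algorithm only supports a limited set of rates, which is said to mute audio above 2x, while .spectral and .timeDomain support a much wider range:
import AVFoundation

// Sketch, assuming the audio pitch algorithm is what limits 3x playback.
let streamURL = URL(string: "https://example.com/stream.m3u8")!  // placeholder URL
let item = AVPlayerItem(url: streamURL)
item.audioTimePitchAlgorithm = .spectral  // .spectral / .timeDomain support rates well above 2x
let player = AVPlayer(playerItem: item)
player.play()
player.rate = 3.0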
Since the last update to iOS 26.0 (23A5276f), my AirPods connect to my iPhone but the audio still plays through the phone. The Bluetooth icon shows them as paired.
Hello! I have been following the UsingAVFoundationToPlayAndPersistHTTPLiveStreams sample code in order to test persisting streams to disk. In addition to support for m3u8, I have noticed in testing that this also seems to work for MP3 Audio, simply by changing the plist entries to point to remote URLs with audio/mpeg content. Is this expected, or are there caveats that I should be aware of?
Thank you!
We are facing a strange issue where a small portion of our large userbase cannot start the capture session in our app, as it gets interrupted with the following reason:
AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps
Our users are all on iPhones; no one is using an iPad. Just to be sure, we have set
session.isMultitaskingCameraAccessEnabled = true
but it does not seem to make any difference.
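For anyone else debugging this, here is a minimal sketch of how the interruption reason can be logged (session is the app's AVCaptureSession):
// Log the reason whenever the capture session is interrupted.
let interruptionObserver = NotificationCenter.default.addObserver(
    forName: .AVCaptureSessionWasInterrupted,
    object: session,
    queue: .main
) { notification in
    if let value = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
       let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
        print("Capture session interrupted, reason: \(reason)")
    }
}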
Another weird interruption we are seeing:
The app registers a periodic time observer to the AVPlayer when the playback starts and it works fine. When switching to AirPlay during playback, the periodic time observation continues working as expected.
However, when switching back to local playback, the periodic time observer does not fire anymore until a seek is performed. The app removes the periodic time observer only when the playback stops.
I can see that when switching back to local playback, the timeControlStatus successively changes
to .waitingToPlayAtSpecifiedRate (reason: .evaluatingBufferingRate)
then to .waitingToPlayAtSpecifiedRate (reason: .toMinimizeStalls)
and finally to .playing
But the time observation does not work anymore.
Also, the issue is systematic with Live and VOD streams providing a program date (with HLS property #EXT-X-PROGRAM-DATE-TIME), with or without any DRM, and is never reproduced with other VOD streams.
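A workaround one could try, sketched below (not a confirmed fix; timeObserver and statusObservation are illustrative stored properties): tear down and re-register the observer whenever timeControlStatus settles back to .playing.
import AVFoundation

// Sketch: re-register the periodic observer when playback resumes.
statusObservation = player.observe(\.timeControlStatus, options: [.new]) { [weak self] player, _ in
    guard let self, player.timeControlStatus == .playing else { return }
    if let token = self.timeObserver {
        player.removeTimeObserver(token)
    }
    self.timeObserver = player.addPeriodicTimeObserver(
        forInterval: CMTime(seconds: 0.5, preferredTimescale: 600),
        queue: .main
    ) { time in
        print("playback time: \(time.seconds)")
    }
}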
Hi, I’ve developed a photo app that includes a photo deletion feature.
Some users have reported encountering PHPhotosError.operationInterrupted (3301) when attempting to delete photos.
Initially, I suspected that some of the assets might have a sourceType of typeiTunesSynced, since the documentation notes that iTunes-synced assets cannot be edited or deleted.
However, after checking the logs, all of the assets involved are of typeUserLibrary.
Additionally, the user mentioned that some photos in the iPhone Photos app do not show a delete button.
I’m unsure whether the absence of the delete button is related to the 3301 error.
I’d like to confirm the following:
Under what conditions does PHPhotosError.operationInterrupted (3301) occur, and how should it be handled?
Why do some photos in the iPhone Photos app not show a delete button?
The code for deleting photos is as follows:
PHPhotoLibrary *library = [PHPhotoLibrary sharedPhotoLibrary];
[library performChanges:^{
    PHFetchResult *assetsToBeDeleted = [PHAsset fetchAssetsWithLocalIdentifiers:delUrls options:nil];
    if (assetsToBeDeleted) {
        [PHAssetChangeRequest deleteAssets:assetsToBeDeleted];
    }
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Failed to delete assets: %@", error);
    }
}];
We noticed that the behaviour at FairPlay license expiration changed between iOS 16.x, some version of iOS 17, the latest iOS 18.3.2, and Safari.
On iOS 16.x, video playback stops when the license expires, but on iOS 17.x and later the video continues with no audio and no error fired.
On the latest Safari, both video and audio continue.
Were there any changes to FairPlay recently, and how should we adapt to this on the license server side?
Thanks
I’m building a music app using Apple Music streaming via ApplicationMusicPlayer.
My goal is to decrease the volume of the current song during the last 10 seconds, and when the next track begins, restore the volume to its normal level.
I know that ApplicationMusicPlayer doesn’t expose a volume API, and I want to avoid triggering the system volume HUD.
✅ Using Apple Music streaming (not local files)
❓ Is it possible to implement per-track fade-out/fade-in logic with ApplicationMusicPlayer?
Appreciate any clarification or official guidance!
I set the device format and color space to Apple Log and turned off HDR, so why is the movie output still in HDR format rather than ProRes Log?
Full runnable demo here:
https://github.com/SpaceGrey/ColorSpaceDemo
session.sessionPreset = .inputPriority

// Get the back camera.
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: .video, position: .back)
backCamera = deviceDiscoverySession.devices.first!

try! backCamera.lockForConfiguration()
backCamera.automaticallyAdjustsVideoHDREnabled = false
backCamera.isVideoHDREnabled = false
let formats = backCamera.formats
let appleLogFormat = formats.first { format in
    format.supportedColorSpaces.contains(.appleLog)
}
print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))
backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
backCamera.unlockForConfiguration()

do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    session.addInput(input)
} catch {
    print(error.localizedDescription)
}

// Add output.
output = AVCaptureMovieFileOutput()
session.addOutput(output)
let connection = output.connection(with: .video)!
print(output.outputSettings(for: connection))
/*
["AVVideoWidthKey": 1920, "AVVideoHeightKey": 1080, "AVVideoCodecKey": apch, <-- ProRes is enabled
 "AVVideoCompressionPropertiesKey": {
     AverageBitRate = 220029696;
     ExpectedFrameRate = 30;
     PrepareEncodedSampleBuffersForPaddedWrites = 1;
     PrioritizeEncodingSpeedOverQuality = 0;
     RealTime = 1;
 }]
*/

previewSource = DefaultPreviewSource(session: session)
queue.async {
    self.session.startRunning()
}
}
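One thing I have been wondering about (an assumption, not something I have verified): by default the session automatically configures the capture device for wide color when inputs and outputs are added, which can silently override activeColorSpace. Disabling that before configuration might preserve the Apple Log setting:
// Sketch: keep the session from overriding the device's color space.
session.automaticallyConfiguresCaptureDeviceForWideColorContent = false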
Short summary
When setting exposureMode to .locked or .custom, the brightness of a video stream still changes depending on the composition and contrast of the visible scene. These changes seem to come from contrast enhancement or dynamic range optimization and totally break any analysis that requires assessing absolute luminance. While exposure lock does seem to lock the physical exposure parameters of the camera (shutter speed and ISO), I cannot find any way to control these "soft" modifiers.
Details
Background
I am the developer of the app "phyphox", an educational app that makes the phone's sensors accessible to students as measurement tools in science experiments. Currently I am working on implementing photometric measurements through the camera and one very important aspect of it is luminance measurements.
This is particularly relevant since the phone's light sensor has no publicly accessible API, so the camera could to some extent make experiments available to Apple users that are otherwise only possible on Android devices.
Implementation
The app uses AVFoundation and explicitly picks individual cameras since camera groups do not support custom exposure settings. This means that it handles camera switching during zoom by itself and even implements its own auto exposure routines to optimize for the use in experiments. Therefore it always stays in custom exposure mode. The app uses YUV420 color space and the individual frames are analyzed in Metal using compute shaders.
However, the effects discussed here still occur if I remove all code to control the camera and replace it with a simple sequence: set the exposure mode to custom, set custom exposure values, set a fixed white balance, and then set the exposure mode to locked, as suggested on Stack Overflow. This helps neither on an iPhone 14 Pro nor on an iPhone 8, despite a report on the developer forums that it would resolve the issue for older devices.
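For reference, that minimal locking sequence looks roughly like this (a sketch; device is the AVCaptureDevice, and the duration and ISO values are placeholders that must lie within the active format's supported ranges):
// Sketch of the custom-then-locked sequence (placeholder values).
try device.lockForConfiguration()
device.setExposureModeCustom(
    duration: CMTime(value: 1, timescale: 100),  // 10 ms shutter, placeholder
    iso: 100,                                    // placeholder
    completionHandler: nil
)
device.setWhiteBalanceModeLocked(
    with: device.deviceWhiteBalanceGains,
    completionHandler: nil
)
device.exposureMode = .locked
device.unlockForConfiguration()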
The app is open source, so the code can be seen in our current development branch (without the changes for the tests here, though) on GitHub.
The videos below use the implementation with the suggestion from stackoverflow, but they can be reproduced in the same way with "professional" camera apps that promise manual control over the camera (like the Blackmagic cam to quote a reputable company) as well as the stock camera app after pressing and holding on the preview to enable AE/AF lock.
Demonstration
These examples were captured on an iPhone 14 Pro. The central part of the image (highlighted by the app using Metal shaders after capture) should not change with fixed exposure settings, but significant changes are noticeable when something changes at the edge of the frame as I move a black piece of cardboard in from above:
https://share.icloud.com/photos/0b1f_3IB6yAQG-qSH27pm6oDQ
The graph above the camera preview is the average luminance (gamma corrected and weighted based on sRGB) across the highlighted central area and as mentioned before it should not change because of something happening at the side of the frame (worst case it should get a bit darker because of the cardboard's shadow).
In my opinion, the iPhone changes its mind on the ideal contrast as soon as it has a different exposure histogram because of the dark image part from the cardboard, but that's just me guessing.
For completeness here is the same effect in the stock camera app with AE/AF lock enabled:
https://share.icloud.com/photos/0cd7QM8ucBZKwPwE9mybnEowg
Here you can also see that the iPhone "ramps" the changes. The brightness of the gray area does not change immediately but transitions smoothly, so this is clearly deliberate postprocessing.
So...
Any suggestion on how to prevent this behavior would be highly appreciated.
We’ve encountered a reproducible issue where the iPhone fails to reconnect to a Wi-Fi access point under the following conditions:
The device is connected to a 2.4GHz Wi-Fi network.
A Bluetooth audio accessory is connected (e.g. headset).
AVAudioSession is active (such as during a voice call or when using the Voice Memos app).
The user moves away from the access point, causing a disconnect.
Upon returning within range, the access point is no longer recognized or reconnected while AVAudioSession remains active.
However, if the Bluetooth device is disconnected or the AVAudioSession is deactivated, the Wi-Fi access point is immediately recognized again.
We confirmed this behavior not only in our app but also using Apple's built-in Voice Memos app, suggesting this is not specific to our implementation.
It appears that the Wi-Fi system deprioritizes reconnection while AVAudioSession is engaged. Could this be by design? Or is this a known issue or limitation with Wi-Fi and AVAudioSession interaction?
Test Environment:
Device: iPhone 13 mini
iOS: 17.5.1
Wi-Fi: 2.4GHz band
Accessories: Bluetooth headset
We’d appreciate clarification on whether this is expected behavior or a bug. Thank you!
I'm developing a TTS Audio Unit Extension that needs to write trace/log files to a shared App Group container. While the main app can successfully create and write files to the container, the extension gets sandbox denied errors despite having proper App Group entitlements configured.
Setup:
Main App (Flutter) and TTS Audio Unit Extension share the same App Group
App Group is properly configured in developer portal and entitlements
Main app successfully creates and uses files in the container
Container structure shows existing directories (config/, dictionary/) with populated files
Both targets have App Group capability enabled and entitlements set
Current behavior:
Extension can access/read the App Group container
Extension can see existing directories and files
All write attempts are blocked with "sandbox deny(1) file-write-create" errors
Code example:
const char* createSharedGroupPathWithComponent(const char* groupId, const char* component) {
    NSString* groupIdStr = [NSString stringWithUTF8String:groupId];
    NSString* componentStr = [NSString stringWithUTF8String:component];
    NSURL* url = [[NSFileManager defaultManager]
                  containerURLForSecurityApplicationGroupIdentifier:groupIdStr];
    NSURL* fullPath = [url URLByAppendingPathComponent:componentStr];
    NSError* error = nil;
    if (![[NSFileManager defaultManager] createDirectoryAtPath:fullPath.path
                                   withIntermediateDirectories:YES
                                                    attributes:nil
                                                         error:&error]) {
        NSLog(@"Unable to create directory %@", error.localizedDescription);
    }
    // Note: the returned C string is backed by an autoreleased NSString;
    // callers must copy it before the autorelease pool drains.
    return [[fullPath path] UTF8String];
}
Error output:
Sandbox: simaromur-extension(996) deny(1) file-write-create /private/var/mobile/Containers/Shared/AppGroup/36CAFE9C-BD82-43DD-A962-2B4424E60043/trace
Key questions:
Are there additional entitlements required for TTS Audio Unit Extensions to write to App Group containers?
Is this a known limitation of TTS Audio Unit Extensions?
What is the recommended way to handle logging/tracing in TTS Audio Unit Extensions?
If writing to App Group containers is not supported, what alternatives are available?
Current entitlements:
<dict>
    <key>com.apple.security.application-groups</key>
    <array>
        <string>group.com.<company>.<appname></string>
    </array>
</dict>
I have an app with a WKWebView for watching YouTube videos. While the videos are windowed, the audio is perfectly fine, positionally as well.
When I fullscreen the video and it goes into the native visionOS video player the audio messes up.
It will suddenly sound like it is in your ears, or maybe even just one ear channel, or the position will be wrong. It might be fine for a moment but the second I touch the controls or move the window the sound jumps across the room, away from the window, or switches to stereo.
Sometimes, after exiting the window entirely, you can still hear the video playing. Even if you open the window back up, go to another screen, and open another video, you now hear two videos playing at the same time with no way to stop the first one in the background, which requires force-restarting the app.
It is all sorts of glitchy. I haven't the slightest clue what is happening here. I am strongly feeling this is a visionOS bug.
I tried using AVAudioSession to change some of the sound settings, and that makes zero difference in behavior.
Multiple testers have also reported this behavior and it has been seen on both visionOS 2.3 and 2.4 betas.
Thanks for the help! This is driving me mad! It is extremely consistent behavior!
Hello everyone,
I'm working on implementing a screen sharing feature using RPSystemBroadcastPickerView and a Broadcast Upload Extension to share the entire app screen in an iOS application.
The Broadcast Upload Extension is set up following Apple's ReplayKit guidelines. However, I’m encountering an issue during the broadcast startup sequence:
❗ Problem Description
The Screen Broadcast UI appears as expected
I tap “Start Broadcast”
The countdown (3 → 2 → 1) completes
Then it immediately reverts to the "Start Broadcast" screen, and screen sharing does not begin
No error messages are displayed
None of the extension lifecycle methods (broadcastStarted(withSetupInfo:), processSampleBuffer, etc.) are called
There are no logs or crash reports, neither in the main app nor in the extension
✅ What Has Been Verified
Info.plist of the Broadcast Upload Extension includes (see the sketch after this list):
NSExtensionPointIdentifier = com.apple.broadcast-services-upload
NSExtensionPrincipalClass set correctly
RPBroadcastProcessMode = RPBroadcastProcessModeSampleBuffer
preferredExtension is set properly to the extension’s bundle identifier
Extension is listed in the main app's build settings under "Frameworks, Libraries, and Embedded Content"
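For clarity, here is a sketch of how we expect that NSExtension block to look (the principal class name is a placeholder for our own handler):
<key>NSExtension</key>
<dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.broadcast-services-upload</string>
    <key>NSExtensionPrincipalClass</key>
    <string>SampleHandler</string>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>RPBroadcastProcessMode</key>
        <string>RPBroadcastProcessModeSampleBuffer</string>
    </dict>
</dict>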
⚠️ Additional Concern
We noticed that in Xcode (latest version), the Broadcast Upload Extension is listed under "Embedded Frameworks" with the setting "Embed Without Signing", and there is no option to change it to "Embed & Sign". We're wondering if this could be the reason the extension fails to launch correctly at runtime, despite being detected by the broadcast picker.
❓ Questions
Has anyone faced similar issues where the broadcast never starts despite correct setup?
Could the "Embed Without Signing" be causing the system to silently cancel or ignore the extension at runtime?
Are there any provisioning profile or entitlement requirements specific to Broadcast Upload Extensions that might trigger this behavior silently?
Any insights, suggestions, or workarounds would be greatly appreciated.
Thank you in advance!
I am trying to stream audio from the local filesystem.
For that, I am trying to use an AVAssetResourceLoaderDelegate for an AVURLAsset. However, Content-Length is not known at the start. To overcome this, I tried several methods:
Set content length as nil, in the AVAssetResourceLoadingContentInformationRequest
Set content length to -1, in the ContentInformationRequest
Both of these cause the AVPlayerItem to fail with an error.
I also tried setting Content-Length as INT_MAX, and setting a renewalDate = Date(timeIntervalSinceNow: 5). However, that seems to be buggy. Even after updating the Content-Length to the correct value (e.g. X bytes) and finishing that loading request, the resource loader keeps getting requests with requestedOffset = X with dataRequest.requestsAllDataToEndOfResource = true. These requests keep coming indefinitely, and as a result it seems that the next item in the queue does not get played. Also, .AVPlayerItemDidPlayToEndTime notification does not get called.
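For context, the content-information handling in my delegate looks roughly like this (a simplified sketch; resolvedLength is a placeholder for the length once it becomes known):
// Simplified sketch of the delegate callback in question.
func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                    shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
    if let infoRequest = loadingRequest.contentInformationRequest {
        infoRequest.contentType = AVFileType.mp3.rawValue
        infoRequest.isByteRangeAccessSupported = true
        infoRequest.contentLength = resolvedLength  // the problematic value, unknown at start
    }
    // ...respond to loadingRequest.dataRequest with file data, then finishLoading()...
    return true
}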
I wanted to check if this is an expected behavior or is there a bug in this implementation. Also, what is the recommended way to stream audio of unknown initial length from local file system?
Thanks!