I have found that the following code runs without issue from Xcode, in either Debug or Release mode, yet crashes when run from the binary produced by archiving, i.e. what would be submitted to the App Store.
import SwiftUI
import AVKit

@main
struct tcApp: App {
    var body: some Scene {
        WindowGroup {
            VideoPlayer(player: nil)
        }
    }
}
This is the most stripped-down code that exhibits the issue. Pointing the VideoPlayer at an actual file produces the same crash.
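For reference, a minimal sketch of the file-backed variant mentioned above (the "sample" / "mp4" resource names are placeholders for illustration, not files from the original project):

```swift
import SwiftUI
import AVKit

// Same minimal app, but with the player pointed at a bundled file.
// "sample.mp4" is a placeholder name, not part of the original report.
@main
struct tcApp: App {
    var body: some Scene {
        WindowGroup {
            if let url = Bundle.main.url(forResource: "sample", withExtension: "mp4") {
                VideoPlayer(player: AVPlayer(url: url))
            }
        }
    }
}
```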
I've attached the crash log:
Crash log
Please note that this was seen with Xcode 26.2 and macOS 26.2.
We are facing an issue with audio playback using AVPlayerViewController in an iOS application. We use the native player to play recorded audio files.
When the AVPlayerViewController appears, the native user interface is displayed correctly, including the playback controls and the volume slider.
However, when the user interacts with the volume slider:
- The slider UI moves and responds to touch events.
- The actual audio output volume does not change; the audio continues playing at the initial volume level regardless of the slider position.
We initialize the player and present it modally using the following code:
AVPlayerViewController *avController = [[AVPlayerViewController alloc] init];
avController.player = [AVPlayer playerWithURL:videoURL];
// Setting initial volume
avController.player.volume = 1.0f;
avController.modalPresentationStyle = UIModalPresentationOverFullScreen;
avController.allowsPictureInPicturePlayback = NO;
// Present the controller
[self presentViewController:avController animated:YES completion:nil];
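One hedged possibility worth ruling out (an assumption on my part, not a confirmed diagnosis): the app's AVAudioSession configuration can affect how playback and volume behave for audio-only content. A minimal sketch, in Swift for brevity:

```swift
import AVFoundation

// Sketch: configure the shared audio session for media playback before
// presenting AVPlayerViewController. Whether this affects the slider
// behaviour described above is an assumption to verify, not a known fix.
func configurePlaybackSession() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
}
```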
Has anyone tried to make an ILPD-based AIME file?
When I try, the resulting AIME switches to a USDZ mesh instead of saving the ILPD data.
I'm relatively new to Swift development (and native iOS development, for that matter).
I've got an iOS app that uses the iPhone / iPad built-in cameras, and I'm looking to make it more compatible with macOS.
Using the normal AVCaptureDevice.DiscoverySession, I get the iPhone Continuity Camera and the built-in MacBook Pro camera, but I don't see other input devices that appear in QuickTime Player (for example), such as connected external cameras or virtual inputs provided by NDI Virtual Input and OBS.
Is there a way to access these without a Mac-specific build? (The rest of the functionality works great, and I'd rather not diverge the codebase too much, as it's easier to update one app than two!)
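A sketch of one thing to try (an assumption, not a verified answer): on macOS 14 and later, the .external device type surfaces external cameras in a DiscoverySession; whether virtual cameras such as OBS or NDI appear can still differ from what QuickTime Player shows.

```swift
import AVFoundation

// Sketch: widen the discovery session beyond the built-in camera.
// .continuityCamera (macOS 13+) and .external (macOS 14+) are real
// device types; whether they surface a given virtual camera is untested.
let session = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .continuityCamera, .external],
    mediaType: .video,
    position: .unspecified)

for device in session.devices {
    print(device.localizedName, device.deviceType.rawValue)
}
```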
I am conducting a forensic examination of a video, and it would be helpful if anyone knew how to decode the LiveTrackInfo metadata of a QuickTake .MOV recorded on iOS 17.5.1. There are 27 different fields, and I am not sure what each one represents. Any help would be appreciated! Thank you!
log-file
I have exhausted standard debugging approaches and need guidance on Final Cut Pro's AU plugin loading behavior.
PLUGIN OVERVIEW
My plugin is an AUv2 audio plugin built with JUCE 8.
The plugin loads and functions correctly in:
Pro Tools (AAX)
Media Composer (AAX)
Reaper (AU + VST3)
Logic Pro (AU)
GarageBand (AU)
Audacity (AU)
DaVinci Resolve / Fairlight (AU)
Harrison Mixbus (AU)
Ableton Live (AU)
Cubase (VST3)
Nuendo (VST3)
It does NOT load in Final Cut Pro (tested on 10.8.x, macOS 14.6 and 15.2).
DIAGNOSTICS COMPLETED
auval passes cleanly:
auval -v aufx Hwhy Manu → AU VALIDATION SUCCEEDED
Plugin is notarized and stapled:
xcrun stapler validate → The validate action worked
spctl --assess --type exec → Note: returns 'not an app' which we understand is expected for .component bundles.
Hardened Runtime is enabled on the bundle.
We identified that our Info.plist contained a 'resourceUsage' dictionary in the AudioComponents entry. We found via system logs that this was setting au_componentFlags = 2 (kAudioComponentFlag_SandboxSafe), causing FCP to attempt loading the plugin in-process inside its sandbox, where our UDP networking is denied. We removed the resourceUsage dict, confirmed au_componentFlags = 0 in the CAReportingClient log, and FCP now loads the plugin out-of-process via AUHostingServiceXPC_arrow.
Despite au_componentFlags = 0 and out-of-process loading confirmed, the plugin still does not appear in FCP's effects browser.
We also identified and fixed a channel layout issue — our isBusesLayoutSupported was not returning true for 5-channel layouts (which FCP uses internally). This is now fixed.
AU cache has been fully cleared:
~/Library/Caches/com.apple.audio.AudioComponentRegistrar
/Library/Caches/com.apple.audio.AudioComponentRegistrar
coreaudiod and AudioComponentRegistrar killed to force rescan
auval -a run to force re-registration
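For completeness, the cache-clearing steps above expressed as shell commands (paths are taken from the list above; the exact daemon/process names are as we used them and may vary by macOS version):

```shell
# Remove the per-user and system Audio Unit caches
rm -rf ~/Library/Caches/com.apple.audio.AudioComponentRegistrar
sudo rm -rf /Library/Caches/com.apple.audio.AudioComponentRegistrar

# Kill the audio daemons so the component list is rescanned
sudo killall coreaudiod
killall -9 AudioComponentRegistrar

# Force re-validation/registration of all installed Audio Units
auval -a
```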
Topic: Media Technologies
SubTopic: Video
Tags: Audio, Professional Video Applications, Media, AudioUnit