Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

Post

Replies

Boosts

Views

Activity

Diagnose data access latency
The code is a pretty simple kernel:

```metal
kernel void naive(constant RunParams *param [[ buffer(0) ]],
                  const device float *A     [[ buffer(1) ]],  // [N, K]
                  device float *output      [[ buffer(2) ]],
                  uint2 gid [[ thread_position_in_grid ]])
{
    float val = 0.0f;
    uint a_ptr = gid.x * param->K;
    for (uint i = 0; i < param->K; i++, a_ptr++) {
        val += A[a_ptr];
    }
    output[gid.x] = val;
}
```

When uint a_ptr = gid.x * param->K, the code gets 150 GFLOPS; when uint a_ptr = gid.y * param->K, it gets 860 GFLOPS. Here param->K = 256 and the threads per group are [16, 16]. I'd like to understand why the performance is so different, and how I can profile/diagnose this to help with further optimization.
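A note on diagnosis: the gap between the two variants usually comes down to how addresses spread across a SIMD-group. Threads are laid out x-major, so with gid.x * param->K neighboring threads read addresses K floats apart (poorly coalesced), while with gid.y * param->K a whole row of threads shares its addresses. A programmatic GPU capture makes this visible in Xcode's Metal debugger, whose per-encoder counters show memory bandwidth and occupancy. A minimal sketch, assuming an existing command queue; the output path and the encode closure are placeholders:

```swift
import Metal
import Foundation

// Sketch: wrap the dispatch in a programmatic GPU capture so the kernel
// shows up as a .gputrace you can open in Xcode's Metal debugger.
func captureDispatch(queue: MTLCommandQueue,
                     encode: (MTLCommandBuffer) -> Void) throws {
    let manager = MTLCaptureManager.shared()
    let descriptor = MTLCaptureDescriptor()
    descriptor.captureObject = queue                // capture all work on this queue
    descriptor.destination = .gpuTraceDocument      // write a trace file to disk
    descriptor.outputURL = URL(fileURLWithPath: "/tmp/naive.gputrace")

    try manager.startCapture(with: descriptor)
    let commandBuffer = queue.makeCommandBuffer()!
    encode(commandBuffer)                           // encode the kernel under test here
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
    manager.stopCapture()
}
```

Programmatic capture only works when the app is launched with the Metal debugger attached or with MTL_CAPTURE_ENABLED=1 set in the environment.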
0
0
92
Apr ’25
screencapturekit
I have code that captures a window and displays a cropped image. The problem is twofold. ScreenCaptureKit doesn't seem to allow modifying, stopping, and recapturing in window mode to capture just a portion of the screen, so I have to crop myself and display the cropped image via a published variable. This all works fine, but it seems to stop after some time. I'm on an M1 with 16 GB RAM; the program uses less than 100 MB of memory at 40-70% CPU. It prints capture success in debug mode, and sometimes the frame isn't valid, so I'm guarding against that. Any ideas on how to improve my strategy?
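If the capture can be display-based rather than window-based, one option may be to crop at the source instead of after receipt: SCStreamConfiguration has a sourceRect, and a running SCStream can be reconfigured without stopping it. A minimal sketch under those assumptions; stream and cropRect are placeholders:

```swift
import ScreenCaptureKit

// Sketch: re-crop a running display capture without stopping the stream.
// `stream` is an existing SCStream; `cropRect` is the desired region in points.
func updateCrop(on stream: SCStream, to cropRect: CGRect) async throws {
    let config = SCStreamConfiguration()
    config.sourceRect = cropRect                  // capture only this region of the display
    config.width = Int(cropRect.width) * 2        // keep the output at Retina scale
    config.height = Int(cropRect.height) * 2
    try await stream.updateConfiguration(config)  // applied on the fly
}
```

This also removes the CPU-side crop step, which may matter if the published-variable pipeline is what eventually stalls.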
0
0
143
Jun ’25
ARView [.showStatistics] doesn't work on Xcode Canvas
Hi, I can't see RealityKit statistics on Xcode Canvas using: arView.debugOptions = [.showStatistics] The statistics only show on a physical device, not Xcode live canvas with #Preview. Testing in Xcode 26.0.1 (17A400) on Tahoe 26.0.1 (25A362). Use case: I'm using RealityKit as a non-AR 3D engine. Xcode Canvas is useful for live iterations. Is this expected behavior? How can I see FPS on Xcode canvas? SKView for example shows all debug options on both Xcode Canvas and physical devices.
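In the meantime, a rough FPS readout can be computed from RealityKit's own update tick, which also fires in previews. A sketch, assuming an existing arView; the smoothing factor and reporting closure are arbitrary choices:

```swift
import RealityKit
import Combine

// Sketch: estimate FPS from SceneEvents.Update deltas. Keep the returned
// subscription alive for as long as the counter should run.
func installFPSCounter(on arView: ARView,
                       report: @escaping (Double) -> Void) -> Cancellable {
    var smoothed = 0.0
    return arView.scene.subscribe(to: SceneEvents.Update.self) { event in
        guard event.deltaTime > 0 else { return }
        let fps = 1.0 / event.deltaTime
        smoothed = smoothed == 0 ? fps : smoothed * 0.9 + fps * 0.1  // simple low-pass
        report(smoothed)
    }
}
```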
0
0
465
Oct ’25
MetalFX for Unity 2022.3.62f3?
Hi, I’m testing Unity’s Spaceship HDRP demo on iPhone 17 Pro Max and iPad Pro M4 (iOS 26.1). Everything renders correctly, and my custom MetalFX Spatial plugin initializes successfully — it briefly reports active scaling (e.g. 1434×660 → 2868×1320 at 50% scaling), then reverts to native rendering a few frames later.

Setup:
- Xcode 16.1 (targeting iOS 18)
- Unity 2022.3.62f3 (HDRP)
- Metal backend
- Dynamic Resolution enabled in HDRP assets and cameras

Relevant Xcode console excerpt:

```
[MetalFXPlugin] MetalFX_Enable(True) called.
[SpaceshipOptions] MetalFX enabled with HDRP dynamic resolution integration.
[SpaceshipOptions] Disabled TAA for MetalFX Spatial.
[SpaceshipOptions] Created runtime RenderTexture: 1434x660
[MetalFX] Spatial scaler created (1434x660 → 2868x1320).
[MetalFX] Processed frame with scaler.
[MetalFXPlugin] Sent RenderTexture (1434x660) to MetalFX. Output target 2868x1320.
[SpaceshipOptions] MetalFX target set: 1434x660
[SpaceshipOptions] Camera targetTexture cleared after MetalFX handoff.
```

It looks like HDRP clears the camera’s target texture right after MetalFX submits the frame, which causes the revert to native rendering. Is there a recommended way to persist or rebind the MetalFX output texture when using HDRP on iOS? Unity doesn’t appear to support MetalFX in the Editor either. Thanks!
0
0
251
Nov ’25
ParticleEmitterComponent Position Offset Issue After iOS 26.1 Update – Seeking Solutions & Workarounds
Problem Summary
After upgrading to iOS 26.1 and 26.2, I'm experiencing a particle positioning bug in RealityKit where ParticleEmitterComponent particles render at an incorrect offset relative to their parent entity. This behavior does not occur on iOS 18.6.2 or earlier versions, suggesting a regression introduced in the newer OS builds.

Environment Details
- Operating System: iOS 26.1 & iOS 26.2
- Framework: RealityKit
- Xcode Version: 16.2 (16C5032a)

Expected vs. Actual Behavior
Expected: Particles should render at the position of the entity to which the ParticleEmitterComponent is attached, matching the behavior on iOS 18.6.2 and earlier.
Actual: Particles appear away from their parent entity, creating a visual misalignment that breaks the intended AR experience.

Steps to Reproduce
1. Create or open an AR application with RealityKit that uses particle components
2. Attach a ParticleEmitterComponent to an entity via a custom system
3. Run the application on iOS 26.1 or iOS 26.2
4. Observe that particles render at an offset position away from the entity

Minimal Code Example
Here's the setup from my test case:

Custom Component & System:

```swift
struct SparkleComponent4: Component {}

class SparkleSystem4: System {
    static let query = EntityQuery(where: .has(SparkleComponent4.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            // Only add once
            if entity.components.has(ParticleEmitterComponent.self) { continue }
            var newEmitter = ParticleEmitterComponent()
            newEmitter.mainEmitter.color = .constant(.single(.red))
            entity.components.set(newEmitter)
        }
    }
}
```

AR Setup:

```swift
let material = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)
let model = Entity()
model.components.set(ModelComponent(mesh: boxMesh, materials: [material]))
model.components.set(SparkleComponent4())
model.position = [0, 0.05, 0]
model.name = "MyCube"

let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: [0.2, 0.2]))
anchor.addChild(model)
arView.scene.addAnchor(anchor)
```

Questions for the Community
1. Has anyone else encountered this particle positioning issue after updating to iOS 26.1/26.2?
2. Are there known workarounds or configuration changes to ParticleEmitterComponent that restore correct positioning?
3. Is this a confirmed bug, or could there be a change in coordinate system handling or transform inheritance that I'm missing?

Additional Information
I've already submitted this issue via Feedback Assistant (FB21346746)
0
0
600
Dec ’25
Game Center Challenges and Activities are not appearing
Hi, I'm trying to add Game Center challenges and activities to an already-live game, but they aren't appearing in-game for testing, in Game Center, or in the Games app. I know the game is set up with GameKit entitlements, since it's live and has working leaderboards and achievements. I've updated to Tahoe beta 8, added a challenge and an activity in App Store Connect, added them to a new distribution, and added that distribution to 'Add for Review'. I'm using Unity and the Apple Unity plugin. Not sure what other steps I'm missing. Thanks
0
0
1.1k
Sep ’25
Can a compute pipeline be as efficient as a render pipeline for rasterization?
I'm new to graphics and game design, and I wanted to know whether a compute pipeline can be as efficient as a render pipeline for rasterization, with an explanation of how and why. Also, is it possible to perform rasterization manually, as in manipulating individual pixel data in a Metal texture yourself, but using a render pipeline?
0
0
117
2d
How does Game Center persist data across character bindings?
We want to integrate Game Center into a game app where users can create multiple characters. Suppose a user creates two characters in the game, Character 1 and Character 2. After the user binds Character 1 to Game Center, its data is reported to Game Center. If the player then unbinds Character 1 from Game Center and binds Character 2 instead, is Character 1's data retained in Game Center, or is it removed?
0
0
332
Oct ’25
Using Metal compute for scientific simulation (lattice QCD gauge theory)
I've been using Metal compute shaders for lattice quantum chromodynamics simulations and wanted to share the experience in case others are doing scientific computing on Metal. The workload involves SU(2) matrix operations on 4D lattice grids — lots of 2x2 and 3x3 complex matrix multiplies, reductions over lattice sites, and nearest-neighbor stencil operations. The implementation bridges a C++ scientific framework (Grid) to Metal via Objective-C++ .mm files, with MSL kernels compiled into .metallib archives during the build.

Things that work well:
- Shared memory on M-series eliminates the CPU↔GPU copy overhead that dominates in CUDA workflows
- The .metallib compilation integrates cleanly with autotools builds using xcrun
- Float4 packing for SU(2) matrices maps naturally to MSL vector types

Things I'm still figuring out:
- Optimal threadgroup sizes for stencil operations on 4D grids
- Whether to use MTLHeap for gauge field storage or stick with individual buffers
- Best practices for double precision — some measurements need float64 but Metal's double support varies by hardware

The application is measuring chromofield flux distributions between static quarks, ultimately targeting multi-quark systems. Production runs are on MacBook Pro M-series and Mac Studio. Code: https://github.com/ThinkOffApp/multiquark-lattice-qcd
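On the threadgroup-size question, a common starting point is to derive the shape from the pipeline's own limits rather than hard-coding it, then sweep from there. A sketch, assuming an existing pipeline and encoder; folding the 4D lattice into Metal's 3D dispatch grid is one arbitrary choice among several:

```swift
import Metal

// Sketch: size threadgroups from pipeline limits for a 4D-lattice stencil kernel.
func threadgroupSize(for pipeline: MTLComputePipelineState) -> MTLSize {
    let width = pipeline.threadExecutionWidth              // one SIMD-group per row
    let height = pipeline.maxTotalThreadsPerThreadgroup / width
    return MTLSize(width: width, height: height, depth: 1)
}

func dispatchLattice(encoder: MTLComputeCommandEncoder,
                     pipeline: MTLComputePipelineState,
                     volume: (x: Int, y: Int, z: Int, t: Int)) {
    encoder.setComputePipelineState(pipeline)
    // Fold t into depth so the 4D site index fits Metal's 3D grid.
    let grid = MTLSize(width: volume.x, height: volume.y, depth: volume.z * volume.t)
    encoder.dispatchThreads(grid, threadsPerThreadgroup: threadgroupSize(for: pipeline))
}
```

dispatchThreads(_:threadsPerThreadgroup:) assumes non-uniform threadgroup support, which holds on M-series hardware.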
0
0
75
1w
SpriteKit framerate drop on iOS 26.4 (ongoing for months)
I have noticed that the performance drop on SpriteKit-based projects running on iOS 26 is still ongoing. iOS 26, back in Sep 2025, introduced a framerate problem: my app, which had always run smoothly at 60 fps even on very old devices, suddenly started to stutter at 40 fps and lower on a rather normal iPhone 13. The problem continued with beta 26.1 and was fixed in 26.2. But 26.3 brought it back, and it's still present in yesterday's 26.4. This is easily reproducible with a very simple example:

```swift
//
//  BareboneSpriteKitApp.swift
//  BareboneSpriteKit
//
//  Created by Bernd Beyreuther on 24.02.26.
//

import SwiftUI
import SpriteKit

@main
struct BareboneSpriteKitApp: App {
    var body: some Scene {
        WindowGroup {
            BareboneSceneView()
        }
    }
}

final class BareboneScene: SKScene {
    override func didMove(to view: SKView) {
        size = view.bounds.size
        scaleMode = .resizeFill
        anchorPoint = CGPoint(x: 0.5, y: 0.5)
        backgroundColor = .darkGray

        let s = SKSpriteNode(color: .cyan, size: CGSize(width: 64, height: 64))
        addChild(s)
        let action = SKAction.rotate(byAngle: .pi, duration: 2)
        s.run(.repeatForever(action))

        let t = SKLabelNode(text: deviceInfoString())
        t.fontSize = 15
        t.position.y = -100
        addChild(t)
    }
}

struct BareboneSceneView: View {
    var body: some View {
        SpriteView(
            scene: BareboneScene(),
            debugOptions: [.showsFPS]
        )
        .ignoresSafeArea()
    }
}

func deviceInfoString() -> String {
    let os = ProcessInfo.processInfo.operatingSystemVersion
    let osString = "iOS \(os.majorVersion).\(os.minorVersion).\(os.patchVersion)"
    let model = UIDevice.current.model  // "iPhone", "iPad"
    let machine = {
        var sysinfo = utsname()
        uname(&sysinfo)
        return withUnsafePointer(to: &sysinfo.machine) { ptr -> String in
            ptr.withMemoryRebound(to: CChar.self, capacity: 1) { cptr in
                String(cString: cptr)
            }
        }
    }()  // e.g. "iPhone15,2"
    return "Model Identifier: \(model) (\(machine)), \(osString)"
}
```

I filed a bug report via Feedback Assistant: FB22038921. The problem has been around for such a long time! This is deeply concerning, because it calls into question whether it is really feasible to continue developing with SpriteKit.
0
0
111
1w
Unity iOS Game Name Display Issue
When building a Unity iOS game, the app name displays incorrectly as "BigBall" on the iPhone home screen, despite the project name and bundle identifier being set to "Big Ball" in Unity and in my Apple Developer account. The correct name, "Big Ball", appears in TestFlight. I tried solutions from ChatGPT and DeepSeek, but none were satisfactory. Please help me.
0
0
81
2w
Xcode Metal Capture crash when using MTLSamplerState
The sample code just draws a triangle and samples a texture. Both samples draw the triangle correctly and sample the texture as expected, and there are no error messages in the terminal. The sample using a constexpr sampler can be captured and replayed fine. The sample using an argument table to bind an MTLSamplerState crashes when using Metal capture and replay in Xcode. Here are the sample codes: Sample Code

Test Environment:
- M1 Pro
- macOS 26.3 (25D125)
- Xcode Version 26.2 (17C52)

Feedback ID: FB22031701
0
0
72
1w
Open Shading Language (OSL) in Metal
Hi. I'm a 3D designer, using Blender for most of my work. The most recent Blender conference discussed supporting the Open Shading Language (OSL) in their latest versions, which allows designers to write custom shaders for their workflows. At the moment, only NVIDIA GPUs (via OptiX) can use this language for rendering (from what I understand), but Blender's developers stated they are waiting on other GPU manufacturers to implement this feature as well. I'm not sure if there are any licensing issues here, but would this be something Apple could implement in Metal to make their hardware more attractive to the 3D design community? Any help or knowledge on this topic would be greatly appreciated.
0
0
228
Feb ’26
Custom Cameras in RealityKit
Hi all, I've developed some code that enables an arcball camera interaction with my scene, implemented using components and systems. The implementation feels a bit messy: I've got gesture code on my RealityView, and then a bunch of other code that consumes those gesture inputs in my component and system. Is there a demo app, or some example code, that shows a nice way to encapsulate these things into one item for custom cameras, something like Apple's .realityViewCameraControls(.orbit)? If not, can anyone recommend an approach to take?
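For reference, one shape that keeps this tidy (a sketch, not an official pattern) is to make the component the single source of truth for orbit state, so gestures only write numbers into it and the system owns all the transform math. All names here are hypothetical:

```swift
import RealityKit
import Foundation

// Hypothetical arcball state: drag gestures only mutate azimuth/elevation/radius.
struct ArcballCamera: Component {
    var target: SIMD3<Float> = .zero  // point the camera orbits
    var radius: Float = 2
    var azimuth: Float = 0            // radians
    var elevation: Float = 0.3
}

// The system turns that state into a camera transform every frame.
class ArcballCameraSystem: System {
    static let query = EntityQuery(where: .has(ArcballCamera.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for camera in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let arc = camera.components[ArcballCamera.self] else { continue }
            // Spherical coordinates around the target point.
            let offset = arc.radius * SIMD3<Float>(cosf(arc.elevation) * sinf(arc.azimuth),
                                                   sinf(arc.elevation),
                                                   cosf(arc.elevation) * cosf(arc.azimuth))
            camera.look(at: arc.target, from: arc.target + offset, relativeTo: nil)
        }
    }
}
```

With ArcballCamera.registerComponent() and ArcballCameraSystem.registerSystem() called once at startup, the RealityView gesture handlers shrink to a few lines that adjust the component's values.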
0
0
294
Oct ’25
RealityKit crashes when rendering SpriteKit scene with SKShapeNode in postProcess callback
I'm converting my game from SceneKit to RealityKit. It has a SpriteKit overlay that, according to Explore advanced rendering with RealityKit 2, I can add with the code below. The code runs fine if the SKScene only contains an SKSpriteNode (see the commented-out line), but when I add an SKShapeNode with a fillColor instead, the app crashes with this error:

```
-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5970: failed assertion `Draw Errors Validation
MTLDepthStencilDescriptor uses frontFaceStencil but MTLRenderPassDescriptor has a nil stencilAttachment texture
MTLDepthStencilDescriptor uses backFaceStencil but MTLRenderPassDescriptor has a nil stencilAttachment texture
'
```

I don't know enough about low-level graphics and stencils yet to figure out a quick solution, so I would appreciate it if anyone could share an easy fix or an explanation of what's wrong. Thanks!

```swift
class ViewController: NSViewController {
    var device: MTLDevice!
    var renderer: SKRenderer!

    override func loadView() {
        let arView = ARView(frame: NSScreen.main!.frame)
        view = arView

        arView.renderCallbacks.prepareWithDevice = { [weak self] device in
            guard let self = self else { return }
            self.device = device
            renderer = SKRenderer(device: MTLCreateSystemDefaultDevice()!)
            let scene = SKScene()
            let shape = SKShapeNode(rectOf: CGSize(width: 10, height: 10))
            shape.fillColor = .red
            scene.addChild(shape)
//            scene.addChild(SKSpriteNode(color: .red, size: CGSize(width: 10, height: 10)))
            renderer.scene = scene
        }

        arView.renderCallbacks.postProcess = { [weak self] context in
            guard let self = self else { return }
            let encoder = context.commandBuffer.makeBlitCommandEncoder()
            encoder?.copy(from: context.sourceColorTexture, to: context.targetColorTexture)
            encoder?.endEncoding()

            renderer.update(atTime: context.time)
            let descriptor = MTLRenderPassDescriptor()
            descriptor.colorAttachments[0].loadAction = .load
            descriptor.colorAttachments[0].storeAction = .store
            descriptor.colorAttachments[0].texture = context.targetColorTexture
            renderer.render(withViewport: CGRect(x: 0, y: 0,
                                                 width: context.targetColorTexture.width,
                                                 height: context.targetColorTexture.height),
                            commandBuffer: context.commandBuffer,
                            renderPassDescriptor: descriptor)
        }
    }
}
```
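For what it's worth, the assertion suggests SKShapeNode's fill path relies on stencil testing, so the pass SKRenderer draws into may simply need a depth/stencil attachment. A possible direction (a sketch under that assumption, not a confirmed fix); the helper is hypothetical, and the texture should be cached rather than recreated every frame:

```swift
import Metal

// Sketch: create a combined depth/stencil texture matching the target,
// to attach to the MTLRenderPassDescriptor before calling renderer.render.
func makeDepthStencilTexture(device: MTLDevice, matching target: MTLTexture) -> MTLTexture {
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .depth32Float_stencil8,
                                                        width: target.width,
                                                        height: target.height,
                                                        mipmapped: false)
    desc.usage = .renderTarget
    desc.storageMode = .private
    return device.makeTexture(descriptor: desc)!
}

// In the postProcess callback, before renderer.render(...):
// let ds = makeDepthStencilTexture(device: device, matching: context.targetColorTexture)
// descriptor.depthAttachment.texture = ds
// descriptor.depthAttachment.loadAction = .clear
// descriptor.stencilAttachment.texture = ds
// descriptor.stencilAttachment.loadAction = .clear
// descriptor.stencilAttachment.storeAction = .dontCare
```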
0
0
191
2d
Slow compilation
Hi, I am working with a large project. We compile each material to its own .metallib. They all include many common files full of inline functions. Finally, we link it all together at the end with a single big pathtrace kernel. Everything works as expected; however, the compile times have gotten completely out of hand, and it takes multiple minutes to compile at runtime (to native code). I have gathered that I can do this offline by using metal-tt, but I am wondering whether there is a way to reduce the compile times in such a scenario, and how to investigate what the root cause of the problem is. I suspect it could have to do with the fact that every material's metallib contains duplicates of all the inline functions. Any ideas on how to profile and debug this? Thanks, Rasmus
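Separately from the link-time cost, one mitigation for minutes-long runtime compiles is MTLBinaryArchive: harvest the GPU binaries on first launch and reuse them afterwards, so the AIR-to-native step is paid once per device. A sketch assuming a compute pipeline; the cache URL is a placeholder:

```swift
import Metal
import Foundation

// Sketch: cache compiled GPU binaries across launches with MTLBinaryArchive.
func makePipeline(device: MTLDevice,
                  descriptor: MTLComputePipelineDescriptor,
                  archiveURL: URL) throws -> MTLComputePipelineState {
    let archiveDescriptor = MTLBinaryArchiveDescriptor()
    if FileManager.default.fileExists(atPath: archiveURL.path) {
        archiveDescriptor.url = archiveURL           // reuse a previous run's binaries
    }
    let archive = try device.makeBinaryArchive(descriptor: archiveDescriptor)
    descriptor.binaryArchives = [archive]            // consult the archive before compiling

    let pipeline = try device.makeComputePipelineState(descriptor: descriptor,
                                                       options: [], reflection: nil)
    try archive.addComputePipelineFunctions(descriptor: descriptor)  // record for next launch
    try archive.serialize(to: archiveURL)
    return pipeline
}
```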
0
1
184
Mar ’25
How to attach SwiftUI Views to entities on non-visionOS platforms?
What is the recommended way to attach SwiftUI views to RealityKit entities on macOS, iOS, etc.? All the APIs seem to be visionOS-only:
https://developer.apple.com/documentation/realitykit/realityviewattachments
https://developer.apple.com/documentation/realitykit/viewattachmentcomponent
https://developer.apple.com/documentation/realitykit/presentationcomponent
https://developer.apple.com/documentation/realitykit/imagepresentationcomponent
My only idea is to do it "manually" with a ZStack and RealityView somehow? I submitted this as a feedback since it seemed like an oversight: FB18034856.
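One version of the "manual" route: project the entity's world position into view coordinates each frame and pin a SwiftUI view there in an overlay. A sketch using ARView's project(_:) (available on iOS and macOS); the observable wrapper and label are hypothetical, and updateAttachment would be driven from a SceneEvents.Update subscription:

```swift
import SwiftUI
import RealityKit

// Hypothetical bridge: holds the entity's projected screen position.
final class AttachmentPosition: ObservableObject {
    @Published var point: CGPoint?
}

// Call once per frame (e.g. from a SceneEvents.Update subscription).
func updateAttachment(_ position: AttachmentPosition, arView: ARView, entity: Entity) {
    // project(_:) maps world space to view coordinates; nil if behind the camera.
    position.point = arView.project(entity.position(relativeTo: nil))
}

// Overlay layered above the RealityKit view in a ZStack.
struct AttachmentOverlay: View {
    @ObservedObject var position: AttachmentPosition
    var body: some View {
        GeometryReader { _ in
            if let p = position.point {
                Text("Attached").position(p)  // pin the SwiftUI view at the projection
            }
        }
        .allowsHitTesting(false)
    }
}
```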
0
2
143
Jun ’25
Turn-Based Game and Invitations
I have two devices (an iPod and an iPhone), each using a different Apple ID, and an existing game to which I'm adding turn-based matchmaking (TBM). When the iPod invites the iPhone, it sends an iMessage invite to the iPhone; when I tap on that message, I get "Retrieving", and then Game Center in Settings is opened, not my app (the same version is installed on both devices). When I start my app on the iPhone, that match is not shown in the matchmaker view controller. When I send an invite from the iPhone to the iPod and tap the iMessage invite, the app starts, but the match isn't listed in the matchmaker view controller on the iPod (though it is on the iPhone). In addition, when I tap the info circle on the iPhone, it shows the two players and "App Store" under the Game Center name. However, when I do the same on the iPod, it shows "Play your turn" there. Any ideas?
0
0
665
Nov ’25