Getting Started with Swift 6 and On-Device AI
Swift 6 introduces strict concurrency checking and safer isolation by default. In this chapter, we build a production-ready Core ML pipeline with actor boundaries, model lifecycle controls, and deployable project scaffolding for iOS 18 and macOS 15.
Difficulty: Intermediate to Advanced
Read time: 18 minutes
Outcome: Working baseline pipeline
Prerequisites
- Xcode 16.0 or newer
- iOS 18.0+ and macOS 15.0+ deployment targets
- Basic SwiftUI, async/await, and actor isolation knowledge
- Familiarity with Core ML model conversion basics
If your app still targets earlier runtime versions, keep this architecture and gate newer features behind runtime availability checks. Do not split your entire codebase by OS version; isolate capability checks in the inference boundary.
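As a minimal sketch of what "isolate capability checks in the inference boundary" means in practice (the two pipeline methods here are hypothetical placeholders, not part of this chapter's service), the availability branch lives inside the service so the UI layer never sees it:

```swift
// Capability gating stays inside the inference boundary.
// `predictWithNewPipeline` and `predictWithLegacyPipeline`
// are hypothetical version-specific paths.
func predict(_ input: FeatureInput) async throws -> FeatureOutput {
    if #available(iOS 18.0, macOS 15.0, *) {
        return try await predictWithNewPipeline(input)
    } else {
        return try await predictWithLegacyPipeline(input)
    }
}
```

Callers keep a single stable API; only this one method grows a branch per OS generation.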
1) Project Setup and Compiler Baseline
Start from a SwiftUI App template, then enforce concurrency diagnostics immediately. The sooner strict mode surfaces unsafe patterns, the less migration debt you accumulate as the team grows.
// Build settings recommendations
SWIFT_VERSION = 6.0
SWIFT_STRICT_CONCURRENCY = complete
// The flags below matter for any remaining Swift 5 targets;
// Swift 6 language mode enables strict checking by default.
OTHER_SWIFT_FLAGS = -warn-concurrency -enable-upcoming-feature StrictConcurrency
// Target matrix
- iOS App target: iOS 18.0
- macOS App target: macOS 15.0
- Shared package: Swift 6 language mode

Keep your AI service in a separate module (or package target) so strict-concurrency fixes remain localized. This also simplifies unit tests and allows isolated benchmarking binaries.
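A minimal manifest for such a shared package might look like the following (the package and target names are placeholders); `swift-tools-version: 6.0` opts the targets into Swift 6 language mode by default:

```swift
// swift-tools-version: 6.0
import PackageDescription

let package = Package(
    name: "AIService",
    platforms: [.iOS(.v18), .macOS(.v15)],
    targets: [
        // Isolating inference code here keeps strict-concurrency
        // fixes out of the app targets.
        .target(name: "AIService"),
        .testTarget(name: "AIServiceTests", dependencies: ["AIService"])
    ]
)
```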
2) Actor-Isolated Inference Service
In Swift 6, actor boundaries are not optional architecture sugar. They are your safety envelope for model lifecycle, cache state, and cancellation behavior. Treat inference as a stateful service, not a utility function.
import CoreML

enum InferenceError: Error {
    case modelUnavailable
}

actor InferenceService {
    private var model: MLModel?
    private var isLoaded = false

    func loadModel() async throws {
        guard !isLoaded else { return }
        let config = MLModelConfiguration()
        config.computeUnits = .all
        // Generated model classes expose an async `load` factory.
        model = try await MyClassifier.load(configuration: config).model
        isLoaded = true
    }

    func predict(_ input: FeatureInput) async throws -> FeatureOutput {
        if !isLoaded { try await loadModel() }
        guard let model else { throw InferenceError.modelUnavailable }
        let result = try model.prediction(from: input)
        return FeatureOutput(result)
    }

    func unloadModel() {
        model = nil
        isLoaded = false
    }
}

This shape gives you deterministic ownership of model memory and an explicit unload path when the app moves to the background or enters thermal pressure states.
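One way to drive that unload path, sketched with `ProcessInfo` thermal-state notifications (under strict concurrency you may need to adapt how the notification element crosses isolation boundaries):

```swift
import Foundation

// Sketch: release the model under serious or critical thermal pressure.
// `service` is the InferenceService actor defined above.
func observeThermalState(service: InferenceService) -> Task<Void, Never> {
    Task {
        let notifications = NotificationCenter.default.notifications(
            named: ProcessInfo.thermalStateDidChangeNotification
        )
        for await _ in notifications {
            let state = ProcessInfo.processInfo.thermalState
            if state == .serious || state == .critical {
                await service.unloadModel()
            }
        }
    }
}
```

Because `unloadModel()` is actor-isolated, the observer can fire from any context without racing an in-flight prediction.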
3) SwiftUI Integration Pattern
Keep UI updates on the main actor and isolate async prediction calls in task scopes. The easiest way to introduce race bugs is allowing overlapping requests to mutate shared view state.
@MainActor
final class InferenceViewModel: ObservableObject {
    @Published var outputText: String = ""
    @Published var isRunning = false

    private let service = InferenceService()
    private var currentTask: Task<Void, Never>?

    func run(input: FeatureInput) {
        currentTask?.cancel()
        currentTask = Task {
            isRunning = true
            defer { isRunning = false }
            do {
                let output = try await service.predict(input)
                outputText = output.summary
            } catch is CancellationError {
                outputText = "Cancelled"
            } catch {
                outputText = "Prediction failed: \(error.localizedDescription)"
            }
        }
    }
}

4) Core ML Pipeline Checklist
1. Validate model input/output schema and shape assumptions.
2. Benchmark cold-start and warm-path latency separately.
3. Add runtime capability checks for unsupported hardware paths.
4. Store model metadata (version, checksum, intended task) in app config.
5. Add fallback behavior when inference cannot complete in budget.
Most production issues come from skipped schema validation and missing fallback logic, not from the model itself.
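As a hedged sketch of the first checklist item: Core ML exposes the compiled schema through `modelDescription`, so expectations can be asserted once at load time. The input name `"image"` and the 224×224 constraint below are illustrative assumptions, not values from this chapter's model:

```swift
import CoreML

// Sketch: verify the loaded model matches the schema the app was built for.
// "image" and the 224x224 shape are illustrative expectations.
func validateSchema(of model: MLModel) throws {
    let inputs = model.modelDescription.inputDescriptionsByName
    guard let imageInput = inputs["image"],
          let constraint = imageInput.imageConstraint,
          constraint.pixelsWide == 224, constraint.pixelsHigh == 224 else {
        throw InferenceError.modelUnavailable
    }
}
```

Running this in `loadModel()` turns a silent shape mismatch into an explicit, recoverable error.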
5) Common Swift 6 Migration Traps
- Capturing mutable shared state in detached tasks.
- Mixing sync model loading with async UI-bound calls.
- Accessing ML model references outside actor boundary.
- Ignoring task cancellation in long-running prediction loops.
- Leaving deprecated concurrency diagnostics suppressed in CI.
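The first trap is worth a concrete sketch. Swift 6 rejects mutating captured non-Sendable state from a detached task; routing the mutation through an owning actor resolves it:

```swift
// Trap: a detached task mutating captured shared state.
// var results: [String] = []
// Task.detached {
//     results.append("prediction")  // Swift 6: error, unsafe capture
// }

// Fix: let an actor own the state and hop to it explicitly.
actor ResultStore {
    private var results: [String] = []
    func append(_ value: String) { results.append(value) }
}

let store = ResultStore()
Task.detached {
    await store.append("prediction")  // safe: actor-isolated mutation
}
```

The same pattern applies to the model reference itself, which is why `InferenceService` above is an actor rather than a class.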