
Swift 6 Complete Concurrency Checking #85

Draft
wants to merge 8 commits into base: main

Conversation

jdisho
Collaborator

@jdisho jdisho commented Dec 20, 2024

Addressing: #54

Swift6 - Complete Concurrency Checking

♻️ Current situation & Problem

Xcode is throwing concurrency warnings for the SpeziLLM, SpeziLLMFog, SpeziLLMLocal, SpeziLLMLocalDownload, and SpeziLLMLocalOpenAI targets because these targets do not yet satisfy Swift 6's complete concurrency checking.
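For illustration, a minimal pattern that triggers this kind of warning under complete concurrency checking is a non-Sendable class mutated from a `Task` closure (a hypothetical sketch, not taken from the SpeziLLM sources):

```swift
// Hypothetical example, not from the SpeziLLM diff.
final class DownloadManager {           // reference type, not Sendable
    var state = "idle"

    func start() {
        Task {
            // Under complete checking this is flagged along the lines of:
            // "capture of 'self' with non-sendable type 'DownloadManager'
            //  in a '@Sendable' closure"
            self.state = "downloading"
        }
    }
}
```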

⚙️ Release Notes

📚 Documentation

No additional changes.

✅ Testing

Warnings will be resolved after #64 is merged.

📝 Code of Conduct & Contributing Guidelines

By creating this pull request, you agree to follow our Code of Conduct and Contributing Guidelines:


codecov bot commented Dec 20, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 38.57%. Comparing base (26b1e07) to head (ec16833).


@@           Coverage Diff           @@
##             main      #85   +/-   ##
=======================================
  Coverage   38.57%   38.57%           
=======================================
  Files          64       64           
  Lines        2331     2331           
=======================================
  Hits          899      899           
  Misses       1432     1432           
| Files with missing lines | Coverage Δ |
|---|---|
| Sources/SpeziLLM/Helpers/LLMContext+Chat.swift | 73.69% <ø> (ø) |
| Sources/SpeziLLM/LLMSessionProvider.swift | 87.50% <100.00%> (ø) |
| Sources/SpeziLLM/Models/LLMContextEntity.swift | 35.00% <ø> (ø) |

Continue to review full report in Codecov by Sentry.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 26b1e07...ec16833. Read the comment docs.

@philippzagar philippzagar added the enhancement New feature or request label Dec 20, 2024
@philippzagar philippzagar self-requested a review December 20, 2024 20:36
@philippzagar philippzagar marked this pull request as draft December 20, 2024 20:37
Member

@philippzagar philippzagar left a comment


Thanks @jdisho for the PR, great work! 🚀
As you mentioned, it makes sense to wait for the SpeziLLM OpenAI layer to be revamped, but you can already start tackling the local layer in the meantime.

Also, as mentioned, SpeziSpeech and SpeziChat need to be lifted to Swift 6 language mode as well at some point, let me know if you're interested in that!
Edit: Tackled in StanfordSpezi/SpeziSpeech#10 and StanfordSpezi/SpeziChat#20

Package.swift Outdated
Comment on lines 13 to 18
#if swift(<6)
let swiftConcurrency: SwiftSetting = .enableExperimentalFeature("StrictConcurrency")
#else
let swiftConcurrency: SwiftSetting = .enableUpcomingFeature("StrictConcurrency")
#endif

Member
No need to do that if you just specify // swift-tools-version:6.0 at the top of the file.
See https://github.com/StanfordSpezi/Spezi/blob/main/Package.swift
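The suggestion above can be sketched roughly as follows (a minimal, assumed Package.swift skeleton; the package and target names are illustrative):

```swift
// swift-tools-version:6.0
// With tools version 6.0, targets build in the Swift 6 language mode by
// default, which includes complete concurrency checking, so no explicit
// "StrictConcurrency" SwiftSetting is needed.
import PackageDescription

let package = Package(
    name: "SpeziLLM",               // assumed name for illustration
    targets: [
        .target(name: "SpeziLLM")
    ]
)
```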

Package.swift (outdated, resolved)
Sources/SpeziLLM/Helpers/LLMContext+Chat.swift (outdated, resolved)
Sources/SpeziLLM/LLMSessionProvider.swift (resolved)
@jdisho
Collaborator Author

jdisho commented Dec 21, 2024

@philippzagar - Thanks for the input! This is great 🚀
I'll take care of these two modules.

@jdisho
Collaborator Author

jdisho commented Dec 21, 2024

Ah, I just saw that you opened the PRs already.

@philippzagar philippzagar changed the title [WIP] Swift6 - Complete Concurrency Checking Swift 6 Complete Concurrency Checking Dec 21, 2024
}

@MainActor
private func downloadWithHub() async throws {
    let repo = Hub.Repo(id: model.hubID)
    let modelFiles = ["*.safetensors", "config.json"]

    try await HubApi.shared.snapshot(from: repo, matching: modelFiles) { progress in
Collaborator Author
This is forcing me to make the whole class @unchecked Sendable, because we can't pass progress safely without potentially causing data races. Do you have another idea?
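For context, the problematic shape is roughly the following (a hedged sketch reconstructed from the surrounding diff, not the exact code): the `snapshot()` progress callback is `@Sendable`, so capturing the non-Sendable manager (`self`) inside it gets flagged.

```swift
// Hedged sketch of the pattern that forces '@unchecked Sendable'.
try await HubApi.shared.snapshot(from: repo, matching: modelFiles) { progress in
    Task { @MainActor in
        // 'self' (a non-Sendable class) is captured across an
        // isolation boundary here, which strict checking rejects.
        self.state = .downloading(progress: progress)
    }
}
```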

Member

Yeah, we don't want to mark the manager Sendable because it isn't. The issue is the double capture of self: once within the snapshot() callback and once within the Task.

Did you try something like (rough sketch)

try await HubApi.shared.snapshot(from: repo, matching: modelFiles) { progress in
    Task { @MainActor [mutate = self.mutate] in
        mutate(progress)
    }
}

@MainActor private func mutate(progress: Progress) {
    self.state = .downloading(progress: progress)
}

@jdisho jdisho requested a review from philippzagar December 22, 2024 00:56
Labels
enhancement New feature or request
Projects
Status: Backlog

2 participants