
spark api now lives at https://api.filspark.com/ #37

Merged: 1 commit into main on Oct 19, 2023

Conversation

@juliangruber (Member)

No description provided.

@juliangruber requested a review from @bajtos on October 18, 2023 at 20:23
@bajtos (Member) left a comment

:shipit:

We will need to go through the full release ceremony to get this out to our users (release Spark, Station Core, Station Desktop). I am getting a bit tired of that, TBH.

@juliangruber (Member, Author)

I can do that dance 👍

@juliangruber merged commit e3e6054 into main on Oct 19, 2023
1 check passed
@juliangruber deleted the update/spark-address branch on October 19, 2023 at 09:05
@juliangruber (Member, Author) commented Oct 19, 2023

We can think about simplifying the procedure step by step. What do you think about adding logic to Core so that it always installs the latest version of Spark? Then we wouldn't need to cut a new Core release.
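
A minimal sketch of what that could look like, assuming Spark is published as GitHub releases and that Core exposes some `installModule()` hook to activate a downloaded release. The repo path, polling interval, and hook are all assumptions for illustration, not the actual Core code:

```js
// Hypothetical sketch: Core polls GitHub for the latest Spark release
// and installs it whenever the tag changes.
const REPO = 'filecoin-station/spark' // assumed release source
const CHECK_INTERVAL_MS = 10 * 60 * 1000 // poll every 10 minutes (arbitrary)

let installedVersion = null

// Placeholder for Core's real module installer.
async function installModule (repo, tag) {
  console.log(`would download and activate ${repo}@${tag}`)
}

async function checkForUpdate () {
  try {
    const res = await fetch(`https://api.github.com/repos/${REPO}/releases/latest`)
    if (!res.ok) return // skip this cycle on API errors
    const { tag_name: latest } = await res.json()
    if (latest !== installedVersion) {
      await installModule(REPO, latest)
      installedVersion = latest
    }
  } catch (err) {
    console.warn('update check failed:', err)
  }
}

setInterval(checkForUpdate, CHECK_INTERVAL_MS)
```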

@juliangruber (Member, Author)

We didn't have to create a new release, by the way; spark-api would have redirected old clients to the new URL. The only real advantage of this release is slightly lower latency.

@bajtos (Member) commented Oct 19, 2023

> We can think about simplifying the procedure step by step. What do you think about adding logic to Core so that it always installs the latest version of Spark? Then we wouldn't need to cut a new Core release.

+1 to simplify this incrementally.

We can modify Core to always install the latest Spark version, but IMO that's not enough: we also need to periodically check for a new Spark version to trigger the upgrade. We will also need some way for SPARK to signal the minimum Zinnia version it requires. This will become important when we add more features to Zinnia and then use those new features in SPARK. Think of the "engines" field in Node.js's package.json file.
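
For reference, the Node.js convention looks like this in package.json; a Spark manifest could declare its minimum runtime the same way. The `zinnia` key is hypothetical and the version numbers are purely illustrative:

```json
{
  "name": "spark",
  "version": "1.2.3",
  "engines": {
    "node": ">=18",
    "zinnia": ">=0.14.0"
  }
}
```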

@juliangruber (Member, Author)

Right, it's not as easy as I initially thought 🤔

Labels: none
Project status: ✅ done
Participants: 2