solid-contrib/solidservers.org
Solid Test Suite

Join the chat at https://gitter.im/solid/test-suite

Solid's test suite verifies the interoperability of pod servers and is maintained by the Solid test suite panel. Some of us volunteer and some are sponsored by our employers, but ultimately the continuity of the Solid test suite is guaranteed through an Open Collective. You can show your support by donating there, even just 1 USD, and we'll add you or your logo to our list of sponsors.

We operate as a Solid Panel, and as such we have a charter with deliverables.

NB: Donating does not in any way give you a vote in the contents or the reporting of the test suite, in the Solid spec, or in any aspect of the Solid ecosystem. For that, you should join our W3C-CG.

Initial Sponsor

NLNet Foundation donated 15,000 euros in 2020 as part of the Solid-Nextcloud integration project.

NLNet

Current Sponsors

These awesome Solid-related startups collectively sponsor the maintenance of the Solid test suite through our Open Collective; click on their logos to check them out!

Digita O Team GraphMetrix Interition Ontola
Understory Startin'blox Muze Ponder Source Meisdata

And a very big "Thank You" to the following individuals from the Solid community, who are donating through our Open Collective to make the Solid test suite possible. You are all awesome!

(anonymous backer) Sjoerd van Groning Jan Schill Travis Vachon Sharon Stratsianis Matthias Evering

Context

The reports on solidservers.org are generated using the Jest-based tests, that is, the CRUD tests, the WAC tests, and a few others. Although the Jest-based tests are published by the Solid Test Suite Panel, it is important to note that they have not been reviewed by the Solid spec editors. We are working on migrating to the Gherkin-based specification-tests which can be run using the conformance test harness. Once that migration is complete we will retire the Jest-based tests.

We are also in the process of getting all specification-tests reviewed by spec editors, and until that process is complete we will not publish the output of these test runs directly. Instead, when we see a test failure on specification-tests, we treat it as a research observation: we investigate further and, if necessary, add a missing test to one of the Jest-based suites so that we can report it on solidservers.org as the output of a Jest-based test.

The more tests you run, the more information you collect. When all tests are green, that confirms what we already thought we knew and improves our confidence. Even more valuable is when test results from different sources contradict each other: that information helps us move forward. This test suite tries to cover all Solid-related protocols and to test only for behaviours that are undisputed in the spec, but it is evolving and never perfect.

All tests are written from assumptions, and sometimes the same assumption that slipped into your code also slipped into your tests. In that case, the tests will be green for the wrong reasons. This can be as simple as a typo in a predicate that was copied twice from the same source. Easy to fix, but very important for interoperability!

Sometimes we find a test is incorrect or too strict. Sometimes we don't know what the correct behaviour is. In this case we mark the test as 'skip' and open a spec issue for debate. That way, at least, we turn an "unknown unknown" into a "known unknown". When servers disagree, we need to document the difference. If we can describe the differences with reproducible tests, this will help us all have more detailed spec discussions! :)

Is this test suite a single complete and correct source of truth? The answer is no. Solid is still evolving and although there is a lot of consensus around how a Solid pod server should behave, there is no complete single truth. This test suite is an additional layer of defence that will help you compare your implementation of Solid with those of others! That way, we all collectively become more interoperable, and that will ultimately increase the value of Solid for everyone.

See also our chartered deliverables within the Solid community.

Overview

Running parts of the test suite against servers can be fiddly, and we're here to help you. If you have any questions about how to run any of the test suites against a given server (whether on your localhost, on the public internet, or in a continuous integration hook like GitHub Actions), please join our Gitter chat for guidance.

The following Solid pod server implementations have been tested in one of three ways; you can tell from the 'Version' column:

  • The ones that say '(each PR)' run these tests as part of their development process, that's what we always recommend!
  • The ones that mention a code revision are run against an instance of the server that we built from source, on one of our laptops.
  • The ones that mention a URL are run against a public instance of the server.

When a test that is run in one of these three ways fails, we write steps-to-reproduce with curl, and if we are confident the server is violating the spec, we mark it as a fail. If we don't find any failing tests, we mark the server as '✓'. The following table reports our conclusions from that process:

Required parts: IDP, CRUD, WAC.

Optional parts (shown in parentheses in the table): WPS, CON, MON.

For the 'version' column, servers have "(each PR)" if their continuous integration is set up to automatically test against each PR. For closed-source servers we list the public instance against which we run the test suite.

Table

# name last tested prog.lang IDP CRUD WAC (WPS) (CON) (MON)
1. Node Solid Server (each PR) JavaScript
2. PHP Solid Server (each PR) PHP 7)
3. Solid-Nextcloud (each PR) PHP
4. Pivot 30 June 2022 TypeScript 1)
5. Community Solid Server 10 May 2022 TypeScript 1) 6)
6. TrinPod October 2021 Lisp 1) 2)
7. Inrupt ESS 26 September 2022 Java 1) 8) 3) 4) 5)
8. Manas (testing starts) Rust
9. Naamio (coming soon!) Rust
10. Reactive-SoLiD (coming soon!) Scala
11. DexPod (coming soon!) Ruby
12. Disfluid (coming soon!) C

Footnotes

  1. For some servers we have manually tested that they include a working webid-oidc identity provider, but we don't have headless-browser tests that confirm this automatically for these servers. The solid-oidc IDP tester page, in contrast, requires human interaction, but with that it can test any publicly hosted IDP.

  2. TrinPod will support this in the future

  3. Solid 0.9 requires WAC as a "MUST", yet by default ESS uses ACP instead. ESS can be configured to be compliant with WAC (and thus with Solid 0.9), but this configuration is not enabled on pod.inrupt.com and it is also not supported in production.

  4. See #136

  5. Due to architectural trade-offs, global locks are not supported in Inrupt ESS

  6. See #145 and #146

  7. PSS supports PATCH with application/sparql-update but not with the newly required text/n3, see https://github.com/solid/solid-crud-tests/pull/53/files

  8. From our tests it looks like Inrupt Pod Spaces supports PATCH with application/sparql-update but not with the newly required text/n3, see #60
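Footnotes 7 and 8 refer to the newly required text/n3 PATCH format. As a rough sketch (the resource and triple below are made up for illustration; the vocabulary is from the Solid Protocol), such a patch body looks like:

```n3
@prefix solid: <http://www.w3.org/ns/solid/terms#>.

_:patch a solid:InsertDeletePatch;
  solid:inserts { <#hello> <#linked> <#world>. }.
```

It would be sent with the PATCH method and a `Content-Type: text/n3` header, whereas an application/sparql-update body uses SPARQL UPDATE syntax instead.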

Test-suite report

When the tests are run locally, a test-suite-report app can be run:

  • the app uses the output JSON files produced by the tests run locally.
  • the report links each failed test to the reported error.

See the latest test-suite-report.md. The report currently covers the CRUD and WAC tests for CSS, ESS and NSS.
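Linking failed tests to their errors can be sketched roughly like this, assuming the JSON files follow Jest's standard `--json` output shape (the sample object and function name here are made up for illustration; the exact files the report app reads may differ):

```javascript
// Sketch: collect failed tests from a Jest --json results object.
function failedTests(jestResults) {
  const failures = [];
  for (const file of jestResults.testResults) {
    for (const test of file.assertionResults) {
      if (test.status === 'failed') {
        // Pair each failed test with its reported error messages.
        failures.push({ title: test.fullName, errors: test.failureMessages });
      }
    }
  }
  return failures;
}

// Made-up sample in Jest's --json output shape:
const sample = {
  testResults: [{
    assertionResults: [
      { status: 'passed', fullName: 'PUT creates a resource', failureMessages: [] },
      { status: 'failed', fullName: 'DELETE removes a resource',
        failureMessages: ['Expected status 204, received 404'] }
    ]
  }]
};

console.log(failedTests(sample));
```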

Access Control Policy Tests (coming soon)

Version 0.9 of the Solid protocol requires support for WAC, but future versions of the spec will (probably) require servers to support "either WAC or ACP". We are working on adding test reports for ACP support.

Monetization Tests (version 1.0.0, experimental, work in progress)

As of 2021, Web Monetization in Solid is an experiment; no real specifications have been written for it yet. These versioned tests are meant to help the discussion as it progresses, and the tests themselves are a work in progress, too. If you're not working on Web Monetization yourself, don't spend too much time trying to implement this feature. If you're a Solid app developer wondering which servers to use when experimenting with Web Monetization in your Solid app, these tests might help you find your way. See https://github.com/solid/monetization-tests.

Running the Test Suite

There is an outdated runTests.sh script, which is still the best starting point for running, for instance, Kjetil's RDF-based tests; see old-instructions.md.

To run the test suite against the various servers, it's best to follow their own CI scripts, see the list above.

The scripts are very similar but differ slightly in how they start up the system-under-test. One key step is obtaining a login cookie. Roughly, it works as follows:

  • A Docker 'testnet' network is created.
  • The system-under-test is started inside it, at https://server.
  • For the WAC tests, a third-party server is also started, at https://thirdparty.
  • Then, for each server, a process is started that logs in and harvests a login cookie:
    • For node-solid-server, this is a www-form-urlencoded POST to /login/password with username and password.
    • For php-solid-server, this is a www-form-urlencoded POST to /login with username and password.
    • For Solid-Nextcloud, this is a headless browser (Puppeteer) which fills in the username and password on the login page and clicks 'login'.
    • Note that for community-solid-server this step is not implemented yet, since it only supports unauthenticated CRUD so far.
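For the form-based logins above, harvesting the cookie amounts to one POST and reading the Set-Cookie response header. A minimal sketch of building that request (the function name and credentials are placeholders; the /login/password path is node-solid-server's, per the list above):

```javascript
// Sketch: build the www-form-urlencoded login request for node-solid-server.
// 'alice'/'secret' are placeholder credentials for illustration.
function buildLoginRequest(serverRoot, username, password) {
  const body = new URLSearchParams({ username, password });
  return {
    url: `${serverRoot}/login/password`,
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: body.toString(),
  };
}

const req = buildLoginRequest('https://server', 'alice', 'secret');
// When this request is actually sent (e.g. with fetch), the login
// cookie would be read from the Set-Cookie header of the response.
```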

Using the cookie, the testers go through the WebID-OIDC dance (adding the Cookie header to each HTTP request). This allows the testers to get their DPoP tokens signed, which they can then use to construct Authorization and DPoP headers. We use solid-auth-fetcher for this, specifically its obtainAuthHeaders functionality.
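Once the tokens are obtained, each authenticated request carries two headers. A minimal sketch of their shape (the function name and token values are placeholders; in the real suite, solid-auth-fetcher's obtainAuthHeaders produces these):

```javascript
// Sketch: the header shape used for authenticated requests.
// accessToken and dpopProof stand in for the values obtained
// via the WebID-OIDC dance described above.
function authHeaders(accessToken, dpopProof) {
  return {
    Authorization: `DPoP ${accessToken}`, // DPoP-bound access token
    DPoP: dpopProof,                      // per-request signed proof JWT
  };
}

const headers = authHeaders('<access-token>', '<dpop-proof-jwt>');
```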

The webid-provider-tests stop when the tester successfully obtains auth headers.

The solid-crud-tests can run unauthenticated (which is what Community-Solid-Server currently does), or with Authorization and DPop headers.

The web-access-control-tests have to run authenticated. To pass these tests, the server currently needs to be an identity provider as well as a WAC+CRUD storage. The 'Alice' identity on the server should have full R/W/A/C access (accessTo+default) to the entire pod. The tests then instantiate two Solid Logic instances, one for 'Alice' on https://server and one for 'Bob' on https://thirdparty. Through those, Alice edits her ACL documents to give Bob various kinds of access, and then Bob tests various operations.
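As a sketch, an ACL document through which Alice could grant Bob read access to a container might look like this (the WebIDs and container path are made up for illustration; the vocabulary is standard WAC):

```turtle
@prefix acl: <http://www.w3.org/ns/auth/acl#>.

# Alice keeps full control over the container and its children.
<#owner> a acl:Authorization;
  acl:agent <https://server/profile/card#me>;      # Alice's WebID
  acl:accessTo <./>;
  acl:default <./>;
  acl:mode acl:Read, acl:Write, acl:Control.

# Bob is granted read access only.
<#bobRead> a acl:Authorization;
  acl:agent <https://thirdparty/profile/card#me>;  # Bob's WebID
  acl:accessTo <./>;
  acl:mode acl:Read.
```

The tests would then check, for instance, that Bob can GET resources in the container but cannot PUT to them.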

See also:

Solid Pod Hosters for Enterprise

Solid can separate apps from data. So to use Solid, you need Solid apps and a Solid pod server. As an organisation that wants to offer data sovereignty to its users, you can of course host a pod server in-house or even on-prem, but you can also contact one of the following commercial pod hosting providers to handle this for you. Note that this list is compiled to the best of our knowledge, and inclusion in this list does not imply any kind of endorsement from the Solid CG or vice versa. Please contribute or open an issue if you have any additions or corrections.

Company Product "Enterprise Grade" "Open Source" IDP CRUD Access Control Notifications From
GraphMetrix TrinPod US / EU
Digita Use ID BE
Inrupt ESS US
Muze Nextcloud NL
The Good Cloud Nextcloud NL
Lansol Nextcloud DE
Penta Nextcloud CH
Netways Nextcloud DE
Jaba Hosting Nextcloud DE
LibreBit Nextcloud ES

As a developer or end-user, you can self-host an open source pod server or get a pod from a (demo) pod provider.
