
Shall we have a Resource Module Incubation process (from good to HQRM)? #490

Open
gaelcolas opened this issue Mar 20, 2019 · 6 comments
Labels: discussion (The issue is a discussion.)

Comments

@gaelcolas (Contributor) commented Mar 20, 2019

Opening this discussion to avoid polluting #346.

The current guideline states that only HQRM-compliant resources can be considered for integration into the Resource Kit.

That made sense when the repository and releases were to be owned by Microsoft. But since we've moved to a more scalable model, where new resources' repositories and PSGallery releases will be owned by their authors, I wonder whether we need to keep the bar that high.

I obviously value quality and trust in the Resource Kit, but I think we should improve the contributor experience by giving the effort people put into their modules more visibility: listing the modules in the DscResources repo and having them tracked by all Resource Kit contributors.

It would enable:

  • Better visibility for decent-quality modules, even those not yet HQRM compliant
  • Focusing community effort on a set of modules, instead of people reinventing the wheel
  • Improved visibility and use-case coverage for DSC
  • Ramping up new contributors to become regulars, and eventually maintainers, by providing guidance and visibility
  • A lower bar of entry and a culture of continuous improvement (less of a 'selective club' feeling, more of a 'grow together' process)

We need to avoid:

  • Taking on new modules that increase the load on the existing contributors for little value (niche modules that are irregularly maintained)
  • Taking on modules that carry a big quality risk (too many unknowns: not enough testing or production usage)

To do so, we could:

  • Make sure all common tests pass (style guidelines, repo configuration, ...) before a module is considered (see the sketch after this list)
  • Ensure the module already has active, committed maintainers who are willing to improve it (not offload the work onto the rest of the community) and who seek guidance
  • Offer an incubation process, maybe with a probationary period (get to HQRM within x months, or be delisted until you get there)
  • Have different 'labels' or 'stages' of quality: tech debt (in the Resource Kit for historic reasons but not yet up to standard), incubator (good quality), deprecated, orphaned (seeking a maintainer/owner), HQRM
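
For reference, a minimal sketch of what "all common tests pass" means in practice today, assuming the module follows the usual DscResource.Tests pattern (paths are illustrative):

```powershell
# From the root of the resource module repository, pull in the shared meta tests
# and run them with Pester; this is roughly what the CI common tests do.
git clone https://github.com/PowerShell/DscResource.Tests.git

Invoke-Pester -Script .\DscResource.Tests\Meta.Tests.ps1
```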

As an example, a module like UpdateServicesDsc might be of good quality and has been used in production, but has no integration tests. It passes all common tests; it just has low coverage. Having it within the Resource Kit (as an incubation) would give it a bit more visibility. We can probably trust the maintainers (Michael and me).

I think the question we should ask ourselves every time is along these lines: is it making the community better or worse (by, for instance, spreading us too thin, or not being clear enough about quality or maturity)?

So, what do you think we should do? What considerations shall we have in mind? What do we have to avoid? What should be our baseline?

/cc @kwirkykat @PlagueHO @johlju @mgreenegit

@gaelcolas (Contributor, Author)

On a related note, I do think enabling repeatable integration testing (potentially with test-kitchen or similar principles) is a prerequisite for enforcing integration tests, but that's further ahead. So what do we do in the meantime?

This discussion is not about testing, but I want to point out the existing challenges we have with testing, which derive from our limited ability to run dev/test 'locally' (whether on AppVeyor or on a local system when developing/reviewing).

What is the most conclusive test? The ultimate test... It's running it in prod...

Is it really an integration test if you don't use DSC (LCM) to effect those changes?

Sure, testing the resources by simulating the code path taken by Get/Set/Test is good, and making sure that Get/Test/Set do the right things when called is great, but we're still missing a test that the resource behaves correctly within the LCM, and that DSC can actually change those settings with the given DSC resource (it's more of an operational validation than just a code test).
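
A minimal sketch of what that could look like, assuming a hypothetical MyResource in a hypothetical MyDscModule: compile a configuration, push it through the LCM, and assert convergence, instead of calling Get/Set/Test-TargetResource directly.

```powershell
Configuration MyResource_Integration
{
    Import-DscResource -ModuleName MyDscModule

    Node 'localhost'
    {
        MyResource Example
        {
            Ensure = 'Present'
            Name   = 'ExampleSetting'
        }
    }
}

Describe 'MyResource integration (through the LCM)' {
    BeforeAll {
        # Compile the MOF and let the LCM apply it, as a real deployment would.
        MyResource_Integration -OutputPath $TestDrive
        Start-DscConfiguration -Path $TestDrive -Wait -Force -Verbose -ErrorAction Stop
    }

    It 'should report the node as compliant' {
        Test-DscConfiguration -Verbose | Should -Be $true
    }

    It 'should return the expected state from Get-DscConfiguration' {
        $current = Get-DscConfiguration |
            Where-Object -FilterScript { $_.ResourceId -like '*MyResource*' }
        $current.Ensure | Should -Be 'Present'
    }
}
```

A test of that shape validates the end-to-end behaviour through the LCM rather than just the module's code paths, which is closer to the operational validation described above.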

@PlagueHO (Contributor)

I'm on board with this, @gaelcolas. I think we could use terms such as "incubating", "graduated", etc. to indicate different levels of progress towards becoming HQRM. All modules should be aiming to meet the HQRM guidelines regardless, but for many modules it will take a long time to get there.

But I'd also like to include more specific metrics to measure a resource module's quality, e.g. code quality, test coverage, which common tests are opted into, etc.
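
For example (just a sketch; the folder layout and test paths are assumptions), a per-module coverage number could be captured with Pester's built-in code coverage:

```powershell
# Run the unit tests with code coverage over the resource module files and
# compute a simple command-coverage percentage.
$resourceFiles = (Get-ChildItem -Path .\DSCResources -Recurse -Filter '*.psm1').FullName
$result = Invoke-Pester -Script .\Tests\Unit -CodeCoverage $resourceFiles -PassThru -Show None

$analyzed = [math]::Max($result.CodeCoverage.NumberOfCommandsAnalyzed, 1)
'{0:P1} command coverage' -f ($result.CodeCoverage.NumberOfCommandsExecuted / $analyzed)
```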

@X-Guardian (Contributor)

I've written a script that produces a report on the current meta-test opt-in status of all the DSC resource modules listed within this repo: DscResources-MetaTest-OptIn. I thought it might be useful.
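
For anyone curious, the gist of such a report can be sketched like this (not the actual script; the module list and raw URL pattern are assumptions, and it relies on each module keeping its opt-in list in .MetaTestOptIn.json at the repo root):

```powershell
# For each module, fetch .MetaTestOptIn.json from its default branch and
# count how many common tests it opts in to.
$modules = 'UpdateServicesDsc', 'SqlServerDsc', 'NetworkingDsc'   # illustrative subset

$report = foreach ($module in $modules)
{
    $uri = "https://raw.githubusercontent.com/PowerShell/$module/dev/.MetaTestOptIn.json"

    try
    {
        $optIns = Invoke-RestMethod -Uri $uri -ErrorAction Stop
    }
    catch
    {
        $optIns = @()
    }

    [pscustomobject]@{
        Module     = $module
        OptInCount = @($optIns).Count
        OptIns     = @($optIns) -join ', '
    }
}

$report | Sort-Object -Property OptInCount -Descending | Format-Table -AutoSize
```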

@PlagueHO (Contributor) commented Jul 4, 2019

Awesome, @X-Guardian!!! That is cool! We should look at how we could have that run continuously and its output indexed.

@gaelcolas (Contributor, Author)

Sounds like a good candidate for an Azure Function... :)
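
To illustrate the shape of that (a rough sketch; the binding name and the Get-MetaTestOptInReport helper are hypothetical), a timer-triggered PowerShell Azure Function could regenerate the report on a schedule and publish it to blob storage:

```powershell
# run.ps1 of a timer-triggered PowerShell Azure Function.
# function.json would declare a timerTrigger (e.g. schedule '0 0 6 * * *')
# and an output blob binding named 'reportBlob'.
param($Timer)

# Hypothetical wrapper around the opt-in report script sketched above.
$report = Get-MetaTestOptInReport

# Publish the report as JSON so a dashboard or badge service can consume it.
Push-OutputBinding -Name 'reportBlob' -Value ($report | ConvertTo-Json -Depth 5)
```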

@PlagueHO (Contributor) commented Jul 4, 2019

Very cool idea!
