Slides:
https://softwaresaved.github.io/softwaresuccess-rse2017
Neil Chue Hong, Software Sustainability Institute
Your name
Where you're from
One thing you want to learn from this workshop
... add it to the shared notes ...
http://bit.ly/softwaresuccess-rse2017
What was important?
What was unexpected?
Does the opposite hold (absence == failure)?
What's easier to measure?
What's more accurate?
... be ready to share ...
Downloads
Popularity (Forks/Stars)
Citations
Downstream Use
... is someone else using it ...
Functionality
User-friendliness
Performance
Documentation Quality
Multi-platform
... often subjective ...
Number of Contributors
Community Activity
Contribution Acceptance
Response Rate
Release Velocity
... visible activity ...
Contribution diversity
Contributor diversity
Contributor breadth
Decision distribution
Language bias
... hidden indicators ...
Test coverage
Maintainability Indices
Cyclomatic complexity
... can mask other failures ...
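As a rough illustration of one of these code-level metrics, cyclomatic complexity can be approximated by counting branch points in the source. The Python sketch below uses the standard-library ast module; it is a simplified approximation of McCabe's metric (the choice of which node types count as branches is an assumption here), not a production analyser:

```python
import ast

# Branch-introducing node types (simplified assumption; a full
# McCabe implementation also treats boolean sub-expressions and
# other constructs more carefully).
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

def cyclomatic_complexity(source: str) -> int:
    """Approximate cyclomatic complexity: 1 + number of branch points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "negative"
    for i in range(x):
        if i % 2:
            return "odd seen"
    return "other"
"""
print(cyclomatic_complexity(code))  # 1 + (if, for, if) = 4
```

Even a crude count like this shows why such metrics "can mask other failures": a function can be structurally simple yet still untested, undocumented, or wrong.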
Bus factor
Known Vulnerabilities
Bug Age
Downstream Use
... reliability leads to use ...
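Some of these indicators can be estimated from version-control history. As a sketch, one common simplified definition of bus factor is the smallest number of contributors who together account for more than half of all commits; the commit log below is invented for illustration:

```python
from collections import Counter

def bus_factor(commit_authors, threshold=0.5):
    """Smallest number of top contributors whose commits together
    exceed `threshold` of all commits (a simplified definition)."""
    counts = Counter(commit_authors)
    total = sum(counts.values())
    covered = 0
    for n, (_, c) in enumerate(counts.most_common(), start=1):
        covered += c
        if covered > threshold * total:
            return n
    return len(counts)

# Invented commit log: one dominant contributor.
log = ["alice"] * 60 + ["bob"] * 25 + ["carol"] * 15
print(bus_factor(log))  # alice alone exceeds 50% -> 1
```

A bus factor of 1 means the project depends on a single person, which is exactly the kind of hidden fragility that download counts and stars never reveal.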
High search ranking in Google
See our reference list: https://github.com/softwaresaved/softwaresuccess-rse2017/blob/master/references.md
Contribute via pull requests
Many different ways to measure software
Different people will place different importance on different measures
Some things are easy to measure, some things are useful to measure - they are not always the same
There are already examples out there - some may fit your requirements
Contribute your own ideas!