ROS Community Metrics
We measure aspects of the ROS community to understand and track the impact of our work and identify areas for improvement. We take inspiration from the MeeGo Project's metrics.
Related: a crowd-sourced map of ROS users around the world.
Ohloh Metrics: Ohloh provides metrics on open-source code repositories. See the metrics for the ros stack and follow the related links for other stacks.
Website with Visualization
https://metrics.ros.org hosts fine-grained visualizations of various metrics from the ROS hosting services.
Reports
We periodically publish a metrics report that provides a quantitative view of the ROS community. We expect to publish quarterly, though more automation in the data-gathering steps could make more frequent reports feasible.
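Much of that gathering lends itself to scripting. As one minimal sketch, assuming the hosting services expose ordinary Apache-style access logs (an assumption; this page does not document the log format or location), the following script counts unique client IPs per month, a rough proxy for unique users:

    #!/usr/bin/env python
    """Count unique client IPs per month from an Apache-style access log.

    Sketch only: assumes the common/combined log format, i.e. lines like
        1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /ROS/Installation ..." 200 ...
    """
    import re
    import sys
    from collections import defaultdict

    MONTHS = {'Jan': 1, 'Feb': 2, 'Mar': 3, 'Apr': 4, 'May': 5, 'Jun': 6,
              'Jul': 7, 'Aug': 8, 'Sep': 9, 'Oct': 10, 'Nov': 11, 'Dec': 12}

    # client IP, then the [dd/Mon/yyyy:...] timestamp of the request
    LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[(\d{2})/(\w{3})/(\d{4}):')

    def unique_ips_per_month(log_lines):
        """Map (year, month) -> set of client IPs seen that month."""
        buckets = defaultdict(set)
        for line in log_lines:
            m = LINE_RE.match(line)
            if m:
                ip, _day, mon, year = m.groups()
                buckets[(int(year), MONTHS[mon])].add(ip)
        return buckets

    if __name__ == '__main__':
        with open(sys.argv[1]) as f:
            for (year, month), ips in sorted(unique_ips_per_month(f).items()):
                print('%d-%02d: %d unique client IPs' % (year, month, len(ips)))

Unique IPs both over-count users (dynamic addresses) and under-count them (NAT), so the resulting numbers are a trend indicator rather than a head count.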
We're collectively learning what to measure and how. Please provide feedback! Add your suggestions on how to improve these reports below, or post them to http://discourse.ros.org/c/site-feedback.
Wishlist
What should we measure differently? Put your suggestions here.
- Choice of IDE, overall and broken down by programming language.
- How are wiki edits spread across users? (one way to summarize this is sketched after the list)
- How big are wiki edits?
- How are answers spread across users? (the same sketch applies)
- How many commits were made? (requires a multi-VCS crawler; sketched after the list)
- How many tickets were opened/closed? (requires a multi-tracker crawler)
- Which OS is used? (view count comparison for the installation subpages)
- What kinds of packages are popular? (requires tags on packages)
- How many participants take part in large epic projects (e.g., APC, DRC)?
- How often are packages updated?
- Number of unique users over time (monthly or quarterly)
- Number of wiki tutorial pages under each package
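For the contribution-spread questions above (wiki edits and answers alike), here is a minimal sketch of one possible summary. It assumes a CSV export with one row per contribution and a 'user' column; the file layout and column name are assumptions, not an actual ROS data format.

    #!/usr/bin/env python
    """Summarize how contributions are spread across users.

    Sketch only: assumes a CSV with one row per contribution (wiki edit
    or answer) and a 'user' column naming the contributor.
    """
    import csv
    import sys
    from collections import Counter

    def contribution_spread(rows, top_n=10):
        """Return (number of users, total contributions, top-N users' share)."""
        counts = Counter(row['user'] for row in rows)
        total = sum(counts.values())
        top = counts.most_common(top_n)
        top_share = sum(n for _, n in top) / float(total) if total else 0.0
        return len(counts), total, top_share

    if __name__ == '__main__':
        with open(sys.argv[1]) as f:
            users, total, share = contribution_spread(csv.DictReader(f))
        print('%d users made %d contributions; the top 10 users made %.0f%% of them'
              % (users, total, 100 * share))

The top-10 share is a crude concentration measure; a Gini coefficient or a per-user histogram can be computed from the same counts.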
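And for the commit-counting item, a minimal sketch of a multi-VCS crawler, assuming the repositories are already checked out under a single root directory. It handles only git and Mercurial; other systems (svn, bzr) would be added the same way.

    #!/usr/bin/env python
    """Count commits across a directory of mixed git/hg checkouts.

    Sketch only: assumes one checkout per subdirectory of the given root.
    """
    import os
    import subprocess
    import sys

    def count_commits(repo):
        """Return the commit count for one checkout, or None if unrecognized."""
        if os.path.isdir(os.path.join(repo, '.git')):
            out = subprocess.check_output(
                ['git', 'rev-list', '--count', 'HEAD'], cwd=repo)
            return int(out)
        if os.path.isdir(os.path.join(repo, '.hg')):
            # tip's local revision number is zero-based, hence the +1
            out = subprocess.check_output(
                ['hg', 'log', '-r', 'tip', '--template', '{rev}'], cwd=repo)
            return int(out) + 1
        return None

    if __name__ == '__main__':
        root = sys.argv[1]
        total = 0
        for name in sorted(os.listdir(root)):
            n = count_commits(os.path.join(root, name))
            if n is not None:
                print('%s: %d commits' % (name, n))
                total += n
        print('total: %d commits' % total)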