We are in the process of evaluating alternatives and selecting a common Collections Management platform.
Analysis of existing applications and needs has been performed over an extended period, and we are evaluating Arctos, Specify, and CollectionSpace as potential alternatives.
Summary Collections Management Strategy Document (April 2010)
The attached document was distributed to the BNHM-IST Steering Committee on April 26. BNHM Directors met on April 29, 2010, and signed the agreement.
Latest Evaluation Timeline
The BNHM-IST Partnership is currently working on decision-making criteria for the collection management system evaluation. The current schedule for this effort is:
- Done. Spring 2009: Presentations by CollectionSpace, Arctos and Specify teams.
- Done. Summer 2009: Steering Committee and Advisory/Technology committee work on functional, business/economic, and technology/architecture criteria.
- Done. 9/4/2009: Steering Committee discusses principles, must-haves, and weighting of criteria.
- Done. Week of 9/7: IST to revise, include functional criteria, and resend to Steering Committee.
- Done. Week of 9/14: Steering Committee sends feedback on criteria to Patrick McGrath.
- Done. Last half September 2009 and into October: Initial Informatics evaluation and scoring of CollectionSpace, Specify, and Arctos using the established criteria. Dedicated effort from IST. See BNHM-IST Collections Evaluation Process and Initial Scoring Notes.
- Done. 9/28: Steering Committee meets to review initial analysis.
- Done. October 2009: Criteria sent to Advisory/Technology committee and BNHM Directors. Discussions being scheduled with individual museums, BNHM Directors, and with combined Advisory/Technology group. See BNHM-IST Collections Evaluation Feedback for notes from some of these conversations.
- Done. October 29: Steering Committee meets and discusses status to date.
- Done. Beginning November 2009: Analysis and recommendation edited and sent to Steering Committee, Advisory/Technology committee and to BNHM Directors. Advisory/Technology group meets to discuss. Revisions made as needed and sent to the Steering Committee.
- Done. November 9: BNHM-IST collections group meeting held. Recommendation/statement to be drafted based on the meeting notes.
- Done. December 8, 2009: Steering Committee meets to discuss evaluation. See the summary notes and slides.
- Done. January through April 2010: Steering Committee develops the Summary Collections Strategy Document, Transition Plan, Collaboration Plan, and Fund Raising Plan. The strategy document will mark the completion of the platform selection effort. New content will be developed via the BNHM-IST Collections Data Working Group to document the work taking place post-selection.
- Done. April 29, 2010: BNHM Directors meet and sign BNHM-IST Collections Strategy Document.
Principles for the Evaluation
- To make optimal use of scarce resources, we will move toward a single Collections Management platform that can be extended and integrated across all collections on campus, including non-natural-history collections.
- IST will guarantee operation of existing applications that are supported for the museums until they can be migrated to a new solution.
- Museums may elect not to participate in the use of a shared platform. If so, they will cover all deployment and operations costs themselves.
- Existing legacy systems will be put on "life support" with a focus on keeping them available, and freezing enhancements. Exceptions to this will be submitted for approval to the BNHM-IST Steering Committee.
- There is a recognized tradeoff between rapid deployment for a single museum and accommodation of broader functional requirements. In general, functional criteria are easier to address over time than core business and architecture considerations. Architectural criteria should be seen as directly enabling functional and business goals.
- Regardless of the final decision, we should seek and evaluate partnerships with the teams behind the other solutions we have evaluated.
Solution Must-Haves
- Solutions must either a) meet the core functional needs described in our functional criteria list and the Spectrum/CHIN guidelines, or b) guarantee that they will be able to support those needs within a reasonable timeframe.
- UCB needs to be able to significantly influence the direction and participate in the development of products to meet local needs.
- The solution must be an established community development project, with active contributions from multiple institutions and five dedicated developer/support staff.
- The solution should be, as much as possible, an open source application with an open source underlying technology stack.
- In the interests of supportability across a wider campus user-base, the application should leverage a web-based client. Non-web clients will require a significant additional level of management and support.
- Multi-tenancy. We do not want to continually create separate instances for each collection; we need to minimize duplicated source code, which would quickly become a support issue.
- Scalable and extensible: the ability to easily and quickly add collections and, within each collection, to extend data schemas, add functions, and change branding. Each collection should be able to leverage common data schemas and to extend them with its own.
- Architecture and technology stack is consistent with shared services approach and architecture roadmap.
- Solutions (app and stack) should be able to be applied to other museum domains and problem spaces to drive down total cost of ownership and to diversify funding opportunities/applications.
- Solutions must be architected so that research applications can be built on top of the collection management system using common technologies.
- The solution must have clear fundraising and sustainability potential, and a plan for realizing it.
Detailed Evaluation Criteria
The selection criteria for a solution will be weighted in the following categories:
Scoring Process and Notes
- Detail on process: More information about the process for developing the criteria, weights, and scores
- Scoring notes: Sources of information, assumptions made, and remaining questions
Scorecard Template and Initial Scoring
- Initial Scoring (updated to version 3.3): Based on evaluation and analyses of the alternatives performed by Chris Hoffman and members of the Informatics Services team.