We hope you can join us for the next Research IT Reading Group as we learn about and discuss the Connected Communities Initiative at CITRIS, the Center for Information Technology Research in the Interest of Society.
Presenting: Brandie Nonnecke, Research & Development Manager for CITRIS at UC Berkeley and Program Director for CITRIS at UC Davis
Facilitating: Patrick Schmitz, Research IT
Attending: Aaron Culich, Research IT
Aron Roberts, Research IT
Barbara Gilson, SAIT
Camille Crittenden, CITRIS
Cody Hennessey, Library
Indu Tandon, Human Resources
Maurice Manning, Research IT
Patrick Schmitz, Research IT
Quinn Dombrowski, Research IT
Rick Jaffe, Research IT
Steve Masover, Research IT
[See slides – PDF]
The ACA (Affordable Care Act) is driving further interest in delivering health care outside hospitals; CITRIS's health-oriented work drives toward equity in in-home care.
Captricity (scans surveys, puts data into a spreadsheet): https://captricity.com/
CAFE: Collaborative Filtering is a feature largely missing from other social innovation tools.
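The collaborative filtering mentioned above can be illustrated with a minimal item-based sketch. This is not CAFE's actual implementation (per the discussion below, its code is on GitHub); the ratings data, idea IDs, and function names here are all hypothetical, shown only to make the general technique concrete: ideas a participant has not yet rated are scored by similarity to ideas they have rated.

```python
from math import sqrt

# Hypothetical ratings: participant -> {idea_id: rating on a 1-5 scale}.
# In a CAFE-style deployment these would come from participants rating
# each other's submitted ideas.
ratings = {
    "p1": {"idea_a": 5, "idea_b": 3, "idea_c": 4},
    "p2": {"idea_a": 4, "idea_b": 2, "idea_c": 5},
    "p3": {"idea_a": 1, "idea_b": 5},
}

def cosine_sim(idea_x, idea_y):
    """Cosine similarity between two ideas over co-rating participants."""
    common = [p for p in ratings
              if idea_x in ratings[p] and idea_y in ratings[p]]
    if not common:
        return 0.0
    dot = sum(ratings[p][idea_x] * ratings[p][idea_y] for p in common)
    nx = sqrt(sum(ratings[p][idea_x] ** 2 for p in common))
    ny = sqrt(sum(ratings[p][idea_y] ** 2 for p in common))
    return dot / (nx * ny)

def predict(participant, idea):
    """Predict a rating as a similarity-weighted average of the
    participant's ratings of other ideas."""
    rated = ratings[participant]
    sims = [(cosine_sim(idea, other), r) for other, r in rated.items()]
    num = sum(s * r for s, r in sims if s > 0)
    den = sum(s for s, _ in sims if s > 0)
    return num / den if den else None
```

A platform could use such predictions to decide which ideas to surface next to each participant, which is exactly where the novelty-vs.-popularity question raised later in the discussion comes in.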
Indu: How did you develop the user interfaces?
[Discussion of how a non-reading community unfamiliar with touchscreen technology catches on to the interface]
"Social influence bias" -- in California Report implementation of CAFE, found that when average response is presented to respondents, most go back to their own response and move their own answer closer toward the average. In other implementations, this answer-changing is not possible.
Feature phone mode of survey administration: audio presentation, keypad or audio responses
Patrick: Traditional QDA (qualitative data analysis) vs. participant-voted evaluation of ideas
BN: Have not, but could study. E.g., tweak the algorithm to present novel ideas and see whether people's responses to those match what we find using traditional QDA. Must also look at whether people upvote common ideas because they like them or because they are presented frequently.
Aaron: Where's the CAFE code?
BN: It's on GitHub. Can send URL.
Aaron: MOOCs: any work with campus MOOCs?
BN: Have used CAFE in on-campus courses, but few MOOCs -- one that Armando Fox created and taught
Aaron: Would be interested in the possibility of using/adapting CAFE for the Savio trainings Research IT offers
Camille: What happens to the data gathered by CAFE implementations?
BN: Self-selection bias prejudices the representativeness of the data we're currently collecting
[discussion of data sharing, de-identification]
BN: Used IBM Bluemix (https://en.wikipedia.org/wiki/Bluemix); it tagged responses more accurately than we (the survey admins) did
Order of tasks -- asking respondents to rate ideas before asking them to contribute one -- helps counter the attrition that occurs when people are asked to contribute at a more complex level, e.g., articulating their own idea.
[Discussion of how presentation of ideas for respondents to rate does or doesn't encourage novel ideas, and what kinds of presentation can encourage novelty]