Learning computer science with music in creative, collaborative and fun ways.
Inventive Design and Research for Interest Formation in Computer Science.
Literature Review, Requirements Analysis, Paper Prototyping, Agile Prototyping, Decision-matrix,
Traceability Test Matrix, Design Review, Expert Analysis, Qualitative Usability Testing (remote), Surveys, Data logging
Markers, paper, tangible prototyping materials, Google Docs, Google Sheets, GitHub, Git, BlueJeans, Adobe XD, Axure, Unity (WebGL), Kanban board (Trello), Ideum touch table, GestureWorks, Tangible Simulator
Sara Milkes Espinosa (lead PhD researcher), Chloe Choi, Lauren McCall, Michael Fewkes, Tom McKlin, Taneisha Lee, Erin Truesdell. Advised by Brian Magerko, PhD, and Jason Freeman, PhD.
Research, project management, and design; collected research insights; designed system architecture iteratively through expert reviews and user testing; created prototypes; communicated research and design needs to programmers; designed research study; conducted remote usability studies
1 year (Ongoing)
Responding to psychology and cognitive science research that points to the importance of emotion and creativity in learning, we designed Xylocode, a system that allows children in K-12 to explore computer science concepts collaboratively through music and spatial reasoning. Xylocode is an iteration of the TuneTable, an NSF-funded research project of the Expressive Machinery Lab that aims to enable novel and fun ways to increase interest in computer science in informal education contexts. Initially designed for museums, Xylocode has a Tangible version and a Web version, the latter created in response to the constraints imposed by COVID. While final development of the Tangible version is pending, the Web version is being deployed in elementary schools in partnership with the Amazon Future Engineer program. The project spans initial research, design, development, evaluation, and theoretical implications.
An ongoing design and user experience challenge of the project is to map the music-making system to computer science concepts in ways that are collective, fun, and easily comprehensible for our target users. However, without a synthesis of the results of previous iterations, it was difficult to discern which system features had worked and which had not. Additionally, from the research perspective, the new design had to be adapted to the technical and physical constraints of a multitouch capacitive table. Finally, with grant funding nearing its end, the focus of this version was to facilitate the evaluation process.
COVID created a radical challenge to our process that redefined the scope of the project.
Pre-COVID: Tangible Version
I led the research, design, and evaluation of the tangible version of Xylocode, a system that enables a physically engaging creative experience with music and code at museums and public installations for families and groups of children. Users collectively manipulate tangible objects to modify the system's rules, learning basic computer science concepts like conditionals, booleans, variables, and arrays, and music concepts like rhythm and composition.
Post-COVID: Web Version
I led the redesign and user testing of Xylocode as a web version, keeping the constructivist design of the tangible version while making it appropriate for a web experience. The system was redefined to allow single-user interaction through a browser on a laptop or tablet. Tutorials emphasize the educational content, and a logging backend collects system data for further analysis. This version is currently part of the Amazon Future Engineer program.
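As a rough illustration of what such interaction logging might involve (field names, schema, and function names here are hypothetical assumptions, not the actual backend):

```python
# Hypothetical sketch of the Web version's data logging: each user
# interaction is recorded as a timestamped event for later analysis.
# Field names and schema are illustrative, not the actual implementation.
import time

def log_event(log, session_id, action, detail):
    """Append one interaction event to an in-memory event log."""
    log.append({
        "session": session_id,
        "timestamp": time.time(),
        "action": action,       # e.g. "place_line", "edit_conditional"
        "detail": detail,       # action-specific parameters
    })

log = []
log_event(log, "s-001", "edit_conditional", {"box": 1, "operator": "AND"})
log_event(log, "s-001", "place_line", {"length": 120})
```

In a real deployment these events would be posted to a server rather than kept in memory, enabling the kind of post-hoc analysis the evaluation team needed.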
Xylocode Web has a stronger narrative design: users help voiceless birds sing with a deconstructed xylophone. Emitters shoot balls that make sounds as they impact the geometric constructions built by users. The code boxes to the left hold the conditional logic controlling the system.
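As a conceptual sketch of how such code-box rules could tie collisions to sound (the actual system is built in Unity; all names, the scale, and the rule structure below are hypothetical illustrations):

```python
# Hypothetical sketch of a Xylocode-style code-box rule: a conditional
# maps a ball's collision with a user-built shape to a musical note,
# touching the same CS concepts the system teaches (conditionals,
# variables, arrays). Not the actual Unity implementation.

SCALE = ["C4", "D4", "E4", "G4", "A4"]  # pentatonic: any hit stays musical

def note_for_collision(ball_color, shape_length, bounce_count):
    """Pick a note when a ball hits a shape, using a simple conditional."""
    if ball_color == "red":                   # conditional on ball type
        index = bounce_count % len(SCALE)     # variable updated per bounce
    else:
        index = shape_length % len(SCALE)     # longer shapes shift pitch
    return SCALE[index]                       # array lookup selects pitch

print(note_for_collision("red", shape_length=7, bounce_count=3))  # -> G4
```

The point of a rule shaped like this is that users can see a one-to-one link between changing a condition and hearing a different outcome, which is the core of the system's immediate apprehensibility goal.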
NOTE ON COVID: The Tangible and Web versions of Xylocode followed a similar research/development/evaluation pipeline, sharing many insights from the literature review and iterative prototyping phases. The transition to the Web version required complementing these phases with additional research and steps, as noted below. While an in-depth evaluation and deployment of the Tangible version is not possible at this time due to social distancing constraints, the Web version has been deployed and is undergoing beta testing.
Material exploration & Paper Prototyping
Hi-fi Prototyping: Micro-interactions
Research Questions: Pilot
Research Questions: elaboration
Large scale deployment
I conducted a focused literature review with two other members of the team about the previous publications of the TuneTable project to analyze the designs and results of their deployment at museums. We synthesized the results of testing from unpublished documentation and evaluation videos. I complemented this analysis with a review of literature about related projects such as Osmo, as well as literature in computer science education and HCI in museums. I synthesized a list of best practices and design guidelines from this step.
The transition to the Web version required reviewing additional literature about Scratch, and Ed Tech guidelines about computational thinking with online programming platforms.
I carried out informal, semi-structured interviews with three experts in the field of co-creative installations who had been involved in previous iterations of the TuneTable. One expert was a former PhD student at the lab, another was a Music Tech researcher, and the third was a professor at Northwestern University working on tangible computer science education. I asked questions about the system's design, narrative design, and target users. This step helped me cross-validate findings from the literature review regarding the functionality of previous designs and their relationship to published literature. In addition, the expert interviews helped me gain insight into the logic behind previous designers' decisions and to take away useful heuristics for future design.
Sketch from an expert interview with a former researcher involved in the TuneTable project.
I met extensively with the directors of the TuneTable project and the evaluation team to discuss my synthesis of the literature review and expert interviews, and to gather the requirements this project needed to fulfill. The project directors wanted to simplify the design so rules would be clearly understandable within a 5-minute engagement, a quality other researchers have called 'immediate apprehensibility'. Together we identified five main pain points from previous designs. The evaluation team required that the design follow the criteria of the APEX coding system, an evaluation instrument used by the lab with the following categories: Physical Interaction, Intellectual Engagement, and Emotional Engagement for interest formation.
For the Web version, the interaction and technical requirements changed significantly: instead of a 5-minute group interaction requiring immediate apprehensibility, the system had to follow curriculum guidelines for an interaction suited to formal education. An additional expert in CS education with music joined the team to develop tutorials.
Summary of the research Phase
I synthesized these initial research steps into the design specifications that started the prototyping process. Here I present a brief summary of the results; you can contact me to learn more about the detailed report.
Previous publications' results
Ed Tech Guidelines
Museum HCI insights
Previous systems' design
Heuristics for design
Measurable CS concepts
5-minute interaction (Tangible version) / 15-minute interaction (Web version)
Extensive Alpha testing
Abundant visual feedback
Adaptable learning goals
Material Exploration & Paper Prototyping
We conducted an iterative process with several cycles of paper prototyping > design critique > agile prototyping > expert review. The process was kept flexible to accommodate emerging needs and to give the creative process space for exploration. Informed by the research phase, I guided the design and development with 1) 3D material exploration to understand the tangible interaction and 2) extensive whiteboarding and paper prototyping to communicate with the project PIs, the evaluation team, and the programmers.
We started by paper prototyping various system designs based on the results of the research phase. The main interaction ideas were inspired by 'Ball Droppings', a digital artifact by JT Nimoy. To iterate through the system design and figure out the interactivity, the co-creative aspects, the music mapping, and the tangibles, we followed each round of paper prototyping with design critiques with members of the lab. Once a feature had been refined on paper, we followed an agile development cycle and then showed the prototype to the whole team in a weekly meeting.
Early Sketches of the User Interface for discarded design and physical prototyping of the conductive tangibles.
Initial sketches with PI Brian Magerko, PhD, of what would become the final interaction: lines and ball emitters affecting conditionals that control the logic of sounds and visual elements.
Design critiques were carried out regularly with two or three lab members familiar with the project. These meetings took place in person before COVID and online after transitioning to the Web version. Following the requirement to constantly test interaction features for robustness, we iterated through paper prototypes for main features, wireframes for micro-interactions, and visual design options. This was critical for the Tangible version, considering the only developer at the time (Chloe) was hired part-time. Later, the hired programmer (Michael) reformulated the system, adding complexity to the sound system with MIDI.
Spread of paper prototypes with details of the user interaction features to be implemented in the agile prototyping pipeline.
We carried out a continuous prototyping cycle in which I was in close communication with the developer (meeting twice a week) and even jumped into development as needed. We used a Kanban board, Slack (later Microsoft Teams, per Georgia Tech requirements), and in-person or online meetings for our updates. Our code was hosted on the Expressive Machinery Lab's GitHub. We developed in Unity (C# scripting) and later produced Unity WebGL builds hosted on a Georgia Tech server.
Reviews took place weekly or bi-weekly during the joint lab meeting. I presented our work in progress to the Principal Investigators and the evaluation team and collected their feedback on the software prototypes. The purpose of these meetings was to ensure that the features passing to the development phase were agreed upon by our multi-stakeholder lab, including the Principal Investigators, who came from different disciplines (Cognitive Science and Music Technology), the evaluation team, and other members advising on education guidelines and musical design.
Views of the lab during the weekly review, white boarding and prototype review process.
System Consolidation & Micro-interactions
As the different interaction features were chosen through iteration, I started managing the project with a Traceability Test Matrix to coordinate requirements, requested features, quick-test results, bugs, UX needs, and development needs. Since the Web version shared features with the Tangible version, its system consolidation happened faster and allowed us to move into higher-fidelity prototyping earlier. To compare different interaction paradigms for the code boxes, where the conditional logic was presented to users, we used a decision matrix.
Screenshots of the four UI prototype versions and decision-matrix to decide on system and user interface. Live prototype here.
4-Box Version
3-Box Version
Single-Box Version
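A decision matrix like the one used here weights each criterion and scores each option against it. As a minimal sketch (the criteria, weights, and scores below are hypothetical placeholders, not the team's actual values):

```python
# Hypothetical weighted decision matrix comparing the code-box UI options.
# Criteria, weights, and all scores are illustrative placeholders.
criteria_weights = {"apprehensibility": 3, "concept coverage": 2, "dev effort": 1}

scores = {  # 1 (poor) .. 5 (good) per criterion, per option
    "4-box":      {"apprehensibility": 2, "concept coverage": 5, "dev effort": 2},
    "3-box":      {"apprehensibility": 3, "concept coverage": 4, "dev effort": 3},
    "single-box": {"apprehensibility": 5, "concept coverage": 3, "dev effort": 4},
}

def weighted_total(option):
    """Sum each criterion score multiplied by that criterion's weight."""
    return sum(criteria_weights[c] * scores[option][c] for c in criteria_weights)

best = max(scores, key=weighted_total)
print(best, weighted_total(best))  # -> single-box 25
```

Weighting makes the team's priorities explicit: here, apprehensibility dominates, so the option scoring highest on it tends to win even if it covers fewer concepts.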
We conducted six sessions of Alpha testing with members of the lab, Georgia Tech colleagues, and research collaborators: three black-box and three white-box sessions. We identified bugs, scenarios, and usability issues not previously considered.
Development of the Tangible version was put on hold at this point due to COVID. The lab decided to translate the main elements of the system to an online version.