What We Learned

Beats Empire is designed as a formative assessment tool to support learning about data and data analysis. While we attended most directly to the K-12 Computer Science Framework in the design of Beats Empire, working with data is also required in math and science standards, social studies, business, and many other subjects. Research findings about the value of Beats Empire fall into two main categories: research on the design of learning/assessment games, and research on the efficacy of Beats Empire as a formative assessment tool.

Research on Learning and Assessment Game Design

We made an explicit effort to create a formative assessment game that would be meaningful to both students and teachers. To this end, we held multiple focus groups with students throughout NYC to determine a compelling aesthetic and narrative thread. We found that students were quite interested in music, were knowledgeable (to a degree) about how music is made and shared, and had strong personal and social connections to various music genres. In the resulting game design, players take on the role of a music studio manager and are given a high degree of freedom to choose which artists to sign, what kinds of songs to release, and where to release them. A powerful affordance of this design is that players draw on their prior experiences and interest in music to make sense of in-game representations and key game mechanics.

In Beats Empire, available artists resemble, but do not mirror, real-world artists. For example, players can sign Beyonde (similar to the artist Beyoncé) or Half Dollar (similar to the artist 50 Cent). In think-alouds during gameplay and in focus groups, some players indicated that they chose to sign artists because of the resemblance to a real-life artist they enjoy. Furthermore, when recording songs, these same students occasionally made choices of song title, mood, or topic where they believed the real-life artist would excel: for example, choosing a song name that “speaks to me” or “matches the mood,” or recording a song about a particular topic because that topic seemed to match what the real-world artist might record. Evidence of this behavior indicates that students are engaging meaningfully, and at a personal level, with the game and its mechanics.

In a game with a high degree of freedom and personalization, it can be difficult to assess whether a player is making data-based decisions or decisions based on their real-world interest in music. To address this challenge, we developed the “insight mechanic,” which lets players tag the choices they make in the game with a particular data assumption. For example, when recording a song, players can view either a bar or line graph of current interest in a particular music genre, topic, or mood in a given borough. Players can then select a data series, such as the “hip-hop” line, and indicate whether they are choosing this genre because they think it is the “most popular” or “trending up.” By logging the state of the world at the moment the player chooses to record a song, together with their particular prediction (if they choose to make one), we can determine, with some degree of certainty, whether the player understands the current state of the data model and is using data to make thoughtful decisions in the game.
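The kind of check described above can be sketched in a few lines. This is an illustrative sketch only: the function name, the tag strings, and the shape of the interest data are assumptions for the example, not Beats Empire's actual log format or scoring code.

```python
def insight_is_supported(interest_by_genre, chosen_genre, tag):
    """Check a player's tagged data claim against the game state.

    interest_by_genre: dict mapping genre -> list of interest values
                       over recent turns (oldest first).
    chosen_genre: the genre the player selected, e.g. "hip-hop".
    tag: the player's stated reason, "most popular" or "trending up".
    """
    if tag == "most popular":
        # The chosen genre's latest value is the maximum across genres.
        latest = {g: vals[-1] for g, vals in interest_by_genre.items()}
        return latest[chosen_genre] == max(latest.values())
    if tag == "trending up":
        # Interest in the chosen genre rose over the last two turns.
        vals = interest_by_genre[chosen_genre]
        return len(vals) >= 2 and vals[-1] > vals[-2]
    return False  # unrecognized tag: no supported claim

# Hypothetical snapshot of interest data at the moment of recording.
interest = {
    "hip-hop": [40, 55, 62],
    "pop":     [70, 68, 60],
    "rock":    [30, 31, 33],
}

print(insight_is_supported(interest, "hip-hop", "trending up"))   # True
print(insight_is_supported(interest, "pop", "trending up"))       # False
```

Applied to each logged recording event, a check like this separates choices that the data actually supports from choices driven purely by the player's musical tastes.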

Research on Beats Empire as a Formative Assessment Tool

In a series of pilots in which middle school students played Beats Empire over two days in a math or CS class, students overwhelmingly enjoyed the game and engaged enthusiastically with its mechanics. Some students used their understanding of popular culture (making connections between artists’ in-game names and real-life artists) as well as their preferred music genres to choose artists to hire. Several other students used artists’ talent and reliability scores, as well as the popularity of the song types the artists played, to decide whom to hire. On average, students recorded between 20 and 50 songs each over approximately an hour of play time. Students also paid considerable attention to choosing song titles that matched the selected mood, genre, and/or topic.
An important goal of this analysis was to determine the degree to which the game provides information about how students are, or are not, using available data to make decisions. As an assessment game, the goal isn’t to ensure all students are equally successful, but rather to surface the diversity of student understanding and to provide meaningful indicators of that understanding. In an analysis of two pilot studies with middle school players, we found that about half of the students were consistently able to draw meaningful inferences from data visualizations to decide what type of songs to record in which location, and a few students showed improvement in this area over time. Most students were not proficient with line graphs and had difficulty distinguishing the utility of line versus bar graphs. Few students demonstrated an awareness of the relationship between the data collected and the storage required. The research team believes the game provides information about what students do and do not understand about data and analysis at a fidelity sufficient for teachers to use in providing formative feedback for instruction.


Publications and Presentations

Basu, S., Disalvo, B., Rutstein, D., Xu, Y., Roschelle, J., & Holbert, N. (2020, February). The Role of Evidence Centered Design and Participatory Design in a Playful Assessment for Computational Thinking About Data. In Proceedings of the 51st ACM Technical Symposium on Computer Science Education (pp. 985-991).

Basu, S., Rutstein, D., Xu, Y., & Fujii, R. (Accepted, 2020). Using a game-based formative assessment to measure middle school students’ data and analysis skills. Paper to be presented at the annual conference of the American Educational Research Association (AERA), San Francisco, CA.

Basu, S., Rutstein, D., Snow, E., & Shear, L. (2018). Evidence-centered design: A principled approach to creating assessments of computational thinking practices. In The State of the Field in Computational Thinking Assessment. Symposium conducted at the 13th annual International Conference of the Learning Sciences (ICLS), London.

Holbert, N., Berland, M., DiSalvo, B., Rutstein, D., Roschelle, J., Kumar, V., Basu, S., & Villeroy, M. (2019). Designing constructionist formative assessment games. In Game-based assessment: How has the field matured over the past 10 years? Poster presented at the annual conference of the American Educational Research Association (AERA), Toronto, Canada.

Rutstein, D., DiSalvo, B., Basu, S., & Roschelle, J. (2019). Game-based assessment of data and analysis for middle school students. Paper presented at the annual conference of the American Educational Research Association (AERA), Toronto, Canada.