CTArcade
Tak Yeon Lee, Awalin Nabila Sopan, & Matthew Louis Mauriello


Final Report


Progress Report


Proposal

Design a system that helps develop non-programmers' computational thinking (CT) skills through game play while providing motivation for social and competitive interaction.

Purpose:
The motivation behind this project is to develop a tool for teaching users about CT through a simple interface that utilizes existing CT skills internalized in the user’s mind, as opposed to a complex graphical programming tool. Rather than having users program logical rules directly in a bottom up approach, users will “discover” and rank rules to train an “artificial intelligence” (AI) that will play for them on a tournament server composed of other similarly trained entities. Playing on the server will result in a score and a record of the history of matches. These two artifacts are intended to provide motivation for the user to reflect and improve on their AI by learning from their experiences.

The design of the tournament server aims to facilitate social learning of computational thinking through competitive and collaborative interaction between users. The project targets groups of children in 3rd to 5th grade, so it is essential to design a motivating and engaging user interface that appeals to this audience.

Background:
The CTArcade project was initiated by Dr. Ben Bederson. Initial research was performed by the Human Computer Interaction Lab (HCIL) at the University of Maryland. The HCIL worked within the confines of a KidsTeam session by providing challenges that would gauge the CT skill of a group of children. The initial findings suggest that a CT tool would benefit teachers and students. A complete description of the project is provided in [Ahn, 2012], which summarizes the challenges and motivation of this project.

Thus far, a rough web-based trainer prototype for non-programmers has been developed. This trainer implements a version of TicTacToe with a few uncommon features: a dialog-based training component, a logical component for inferring which rules a user applied based on their move, and a history mode where the user and AI can trace back through a game while trying alternative strategies. The current limitations of the trainer are that users can only use preset rules and that it supports only a single game session. Future versions may add a custom rule creator, where users can manually generate their own rules, and will log all information for use across multiple gaming sessions.
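To make the rule-inference component concrete, the following is a minimal sketch of how a preset-rule trainer might decide which rule explains a user's move. The rule names, board encoding, and rule ordering here are assumptions for illustration, not the prototype's actual implementation:

```python
# Hypothetical sketch of the preset-rule inference in the TicTacToe trainer.
# Board: a list of 9 cells, each 'X', 'O', or None (indices 0-8, row-major).

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]

def winning_moves(board, player):
    """Return the set of cells that complete three-in-a-row for `player`."""
    moves = set()
    for line in WIN_LINES:
        cells = [board[i] for i in line]
        if cells.count(player) == 2 and cells.count(None) == 1:
            moves.add(line[cells.index(None)])
    return moves

# Preset rules, checked in priority order (the order a user might rank them).
PRESET_RULES = [
    ("win",         lambda b, me, opp: winning_moves(b, me)),
    ("block",       lambda b, me, opp: winning_moves(b, opp)),
    ("take_center", lambda b, me, opp: {4} if b[4] is None else set()),
    ("take_corner", lambda b, me, opp: {i for i in (0, 2, 6, 8) if b[i] is None}),
]

def infer_applied_rule(board, move, me, opp):
    """Return the name of the first preset rule that explains `move`, or None."""
    for name, rule in PRESET_RULES:
        if move in rule(board, me, opp):
            return name
    return None
```

For example, if the board is `['O', 'O', None, 'X', None, None, None, None, None]` and the user (playing X) moves to cell 2, the trainer would infer the "block" rule, since that move stops the opponent's three-in-a-row.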

CTArcade will eventually encompass multiple games and a generalizable framework for players to train AI.

Scope:
The project will run through the course of the semester. The main goals are to solve the initial engineering challenges, complete a round of testing, and summarize these activities in a paper suitable for academic publication. Early work will involve finalizing the user interface and the game mechanics associated with "training" the AI. Simultaneous work will be conducted on the mechanisms of the tournament server: account creation, social interaction, data acquisition (through tournament games and assessments of each user's computational skills at various points of system use), and project administration. Engineering work will continue through multiple iterations until a prototype is ready for testing. Initial testing of the prototype will be performed in a KidsTeam session with ~7 students, while primary testing will be performed in a school classroom setting with ~30 students. An analysis of the system will be performed through system log data and pre- and post-assessments of students' CT skills.

The biggest challenge this project will attempt to overcome is making the CTArcade a social experience. There are several ideas being considered for implementation: a reputation system powered by helping other users improve their AI, an online chat interface or private messaging system, or a paired programming interface.

Method:
  1. Users from a specified class will create, or have created for them, individual accounts to access the prototype system.
  2. Users will be prompted to complete an assessment on their CT skills.
  3. Users will then begin to “play” with the AI application in an effort to teach it the rules and strategy of “Tic-Tac-Toe” by the following procedure:
    1. Users will make a move and describe to the AI their reasoning for it, then rank the possible moves that are displayed to them.
    2. The computer will then make a move and generate a response. The user will correct and/or rank the moves that the AI could have made.
    3. Users will, over successive games, record a strategy that “their” artificial intelligence can employ on the competition server.
    4. Users, when ready, will commit their AI to the tournament server.
  4. AIs, upon commitment, will "play" other user-committed AIs to generate a score. The methods of "play" will be as follows:
    1. User directly initiates play between their AI and another committed AI.
    2. User directly initiates play between their AI and all committed AIs.
    3. A full system “play” of all committed AIs against each other is initiated on a predefined schedule.
  5. Users will be prompted to complete an assessment on their CT skills.
  6. Users will logout of the system.
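The full-system "play" in step 4.3 amounts to a round-robin tournament. The following is a minimal sketch of how the server might score committed AIs; the scoring convention (win = 2, draw = 1, loss = 0) and the shape of `play_match` are assumptions, not the system's actual design:

```python
# Hypothetical round-robin scoring for committed AIs on the tournament server.
from itertools import combinations

def round_robin(ais, play_match):
    """Play every pair of AIs once in each order and tally scores.

    `ais` maps a user name to that user's trained AI; `play_match(a, b)`
    returns 'a', 'b', or 'draw'. Win = 2 points, draw = 1, loss = 0.
    """
    scores = {name: 0 for name in ais}
    for a, b in combinations(ais, 2):
        # Each AI moves first once per pairing, so first-move advantage evens out.
        for first, second in ((a, b), (b, a)):
            result = play_match(ais[first], ais[second])
            if result == 'a':
                scores[first] += 2
            elif result == 'b':
                scores[second] += 2
            else:
                scores[first] += 1
                scores[second] += 1
    return scores
```

The same routine could serve methods 4.1 and 4.2 by restricting the pairings to a single user's AI, and the resulting score table is the kind of data the win-rate charts in the sketches below would visualize.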

Sketches:



CTArcade.jpg
Figure 1. The lobby interface lists every user with a button to play a match against them. Signup and login UI are on the right side of the screen, and a recent-activity log shows how other users are doing on the CTArcade server in real time.


Screen_Shot_2011-10-17_at_2.08.02_PM.png
Figure 2. History view of the evolution of one's AI. When a user's AI is revised, it automatically plays matches against all other AIs to check how strong it is. A horizontal bar chart indicates the strength of each version in terms of win rate.

Screen_Shot_2011-10-17_at_2.08.15_PM.png

Figure 3. Detail view of one version of the AI, showing its pattern of winning and losing games.


Screen_Shot_2011-10-17_at_2.21.08_PM.png
Figure 4. The trainer UI.



Timetable:
Complete literature review by November 7th, 2011
Complete prototype system November 14th, 2011
Complete initial testing by November 30th, 2011
Complete analysis by December 5th, 2011
Give final presentation on December 12th, 2011
Complete final report by December 12th, 2011

Limitations:
The actual testing of the prototype will be performed by an affiliated classroom and an on-campus group, KidsTeam. The testing schedule will have to be prearranged with both groups in advance. It is unknown whether we will be able to schedule both rounds of testing within the current semester; the project will first be tested by KidsTeam, followed by the affiliated class.

References:

Ahn, J. Designing games to help children learn computational thinking. In Proc. CHI 2012, ACM Press (2012), 1-4.

Google, Inc. (2011). Exploring computational thinking. Retrieved from: www.google.com/edu/computational-thinking/

Wing, J. Computational Thinking [PDF document]. Retrieved from Lecture Slides Web site: www.cosbi.eu/mergingknowledge/index.php/presentationsandvideos/



Comment (Ben): Looks good. My only significant comment is to think about what specific research question you would like to address in the scope of this class? I know this project will entail substantial software engineering - but what do you hope to gain with this? You say you want to make this a "social experience", but I would say that is the solution, not the problem. The problem (I think) is that you want to make an *engaging* experience so that students will want to work hard. Your hypothesis is that adding social experiences with good game design will help. Make sure you are clear in your thinking about what you really are trying to achieve.