Algorithms developed in the Cornell Lab for Intelligent Systems and Controls can predict the actions of volleyball players in-game with more than 80% accuracy, and now the lab is collaborating with the Big Red hockey team to extend the applications of the research project.
The algorithms are unique in that they take a holistic approach to anticipating action, combining visual data – for example, where an athlete is on the court – with more implicit information, such as the specific role of an athlete on the team.
“Computer vision can interpret visual information such as jersey color and a player’s body position or posture,” said Silvia Ferrari, John Brancaccio Professor of Mechanical and Aerospace Engineering, who led the research. “We use this information in real time as well, but we also incorporate hidden variables such as team strategy and player roles, things that we as humans are able to infer because we are experts in this particular context.”
Ferrari and PhD students Junyi Dong and Qingze Huo trained the algorithms to infer hidden variables the same way humans acquire their sports knowledge – by watching matches. The algorithms used machine learning to extract data from videos of volleyball matches, and then used that data to help make predictions when presented with a new set of games.
The results were published on September 22 in the journal ACM Transactions on Intelligent Systems and Technology, and show that the algorithms can infer player roles – for example, distinguishing a defensive setter from a blocker – with an average accuracy of nearly 85%, and can predict multiple actions over a sequence of up to 44 frames with an average accuracy of more than 80%. Actions included spiking, setting, blocking, digging, running, crouching, falling, standing, and jumping.
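As a loose illustration of the underlying idea – not the authors’ published model, which combines visual features with inferred hidden variables such as player roles – even a simple first-order Markov chain learned from labeled action sequences can predict a likely next action. The action labels and training sequences below are made up for the sketch:

```python
from collections import Counter, defaultdict

def fit_transitions(sequences):
    """Count action-to-action transitions observed in training sequences."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, action):
    """Return the most frequently observed action following `action`."""
    if action not in counts:
        return None  # action never seen in training data
    return counts[action].most_common(1)[0][0]

# Toy "match footage" reduced to per-frame action labels (invented data).
training = [
    ["stand", "jump", "block", "stand"],
    ["dig", "set", "jump", "block"],
    ["stand", "jump", "block", "dig"],
]

model = fit_transitions(training)
print(predict_next(model, "jump"))  # prints "block" for this toy data
```

The published approach goes well beyond this by conditioning on visual context and inferred roles, but the sketch shows the basic pattern: learn statistics of action sequences from past games, then apply them to new ones.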
Ferrari envisions teams using the algorithms to better prepare for competition, by training them with an opponent’s existing game footage and using their predictive abilities to practice specific plays and game scenarios.
Ferrari has filed for a patent and is now working with the Big Red men’s hockey team to further develop the software. Using game film provided by the team, Ferrari and her graduate students, led by Frank Kim, are designing algorithms that autonomously identify players, actions, and game scenarios. One goal of the project is to help annotate game film, a tedious task when done manually by team staff members.
“Our program emphasizes video analytics and data technology,” said Ben Russell, director of hockey operations for Cornell’s men’s team. “We are constantly looking for ways to grow as coaches in order to better serve our players. I have been very impressed with the research that Professor Ferrari and her students have conducted so far. I believe this project has the potential to significantly influence the way teams study and prepare for competition.”
Beyond sports, the ability to anticipate human actions holds great potential for the future of human-machine interaction, according to Ferrari, who said improved software could help self-driving vehicles make better decisions, bring robots and humans closer together in warehouses, and even make video games more enjoyable by improving the computer’s artificial intelligence.
“Humans aren’t as unpredictable as machine learning algorithms make them out to be right now,” said Ferrari, who is also associate dean for cross-campus engineering research, “because if you really take into account all of the contextual clues, and you observe a group of people, you can do a lot better at predicting what they’re going to do.”
The research was supported by the Office of Naval Research Code 311 and Code 351, and commercialization efforts are supported by the Cornell Office of Technology Licensing.
This article appears courtesy of a content sharing agreement with the Cornell Chronicle.