Judy Hoffman and James Hays

Interactive Computing Faculty Earn Test of Time Awards for Impactful Research

More than a decade after publication, research by School of Interactive Computing faculty members Judy Hoffman and James Hays continues to resonate.

Hoffman, an assistant professor in computer vision, received a test of time award Thursday at the International Conference on Machine Learning (ICML) in Vienna, Austria, for a paper she co-authored in 2014. 

Hays, an associate professor in computer vision and robotics, will receive a test of time award next week at the 2024 SIGGRAPH conference in Denver for a paper he co-authored in 2012. SIGGRAPH is the annual conference of the Association for Computing Machinery's (ACM) Special Interest Group on Computer Graphics and Interactive Techniques.

Test of time awards at ICML, SIGGRAPH, and other computer science conferences recognize papers that are at least 10 years old and have had a lasting impact since publication.

Hoffman co-authored the paper "DeCAF: A Deep Convolutional Activation Feature for Generic Visual Recognition" while earning her Ph.D. from the University of California, Berkeley.

Hoffman said the paper laid the groundwork for Caffe, a deep learning vision framework that became highly popular among academic and industry researchers.

“This work was one of the early efforts that moved the computer vision community toward using large-scale pre-trained deep models for visual recognition tasks,” Hoffman said. “Recognition through this award is particularly salient as the larger AI community embraces pre-trained foundation models to advance emerging applications.”

Hays co-authored "How Do Humans Sketch Objects?" The paper was the first research effort to collect and characterize a large dataset of sketched objects, exploring a universally intelligible form of communication.

“We asked questions like ‘How well can humans recognize other people’s sketches?’” Hays said. “Generally, quite well — 73% when looking at sketches from 250 categories.

“Before our work, sketching as an input to a computational method had been widely explored, but there hadn’t been a serious effort to look at the distribution of actual human sketches.”

Tablets with interactive drawing applications were not yet widespread at the time, so the researchers' only option was to crowdsource sample sketches from a broad population, most of which were drawn with a computer mouse.

“To learn about sketching, we needed thousands of sketches,” Hays said. “The iPad had been released only a year before our project. At first, sketching with a mouse seemed a bit off-the-wall, but pilot studies showed that crowd workers were good at drawing with mice or touchpads. We collected 20,000 human object sketches at a relatively low cost.”

Hays said the paper has accumulated 1,100 citations and its dataset is still widely used.