FEATURE — Summer 2004

Depicting a lighter side to robotics, members of the Media and Machines Lab—(from left) Assistant Professors William Smart and Cindy Grimm (seated), two of the lab's founders; Shannon Lieberg, Engineering Class of '04; research assistant Michael Dixon, B.S./M.A. '03; and Nik Melchior, a fifth-year B.S./M.S. student in computer science and engineering—can provide "Lewis, the photographer" with playful props, in addition to the necessary integrative programming.

'Get Picture'
[It's a snap!]

Professors and undergraduates in the Department of Computer Science and Engineering have created an award-winning new breed: the world's first robotic photographer, "Lewis."

by Judy H. Watts

As if marking the new age of computer science and engineering, members of the University's Media and Machines Lab have welcomed a robot into their midst. Swaddled in Styrofoam when it arrived in 2001, the hefty bundle of hardware was soon up on its wheels—bright red and built like a trashcan, standing 4 feet 6 inches and weighing 300 pounds (batteries included). It needed a name to join the networked world, and was called "Lewis," after Meriwether Lewis, whose own navigational skills served well on his expedition with William Clark. Because the sturdy fellow came equipped only with a standard PC and basic software—just enough to let people read his sensors and make him wheel to and fro—his remarkable development, and the public acclaim that has followed, can be ascribed only to nurture, not nature. And a village of undergraduates in the robotics/computer vision/computer graphics laboratory in Lopata Hall made it all happen.

"That was the idea—to involve undergraduates in research and give them a project to showcase at a national conference," says Cindy Grimm, assistant professor of computer science and engineering (CSE) and a computer graphics researcher who double-majored in computer science and art. (She plans to resume painting in 40 years or so.) Co-founder of the Media and Machines Lab with fellow CSE Assistant Professors William Smart, whose field is artificial intelligence and robotics and who is also her husband, and Robert Pless, a computer vision expert, Grimm proposed a unique project over dinner one night: Lewis-as-Wedding-Photographer.

"Bill had been working on applying machine-learning techniques to robot control," explains Grimm. "His emphasis was on navigation—getting a robot from point A to point B while avoiding obstacles such as chairs—which is very difficult to do. But I was trying to think what else we could have a robot do besides delivering bagels!" (Enabling research robots to serve party snacks without rolling over anyone is a common, though extremely challenging, project in the field.) "It was a nice way of blending our interests in a way that would benefit undergraduates," adds Smart. Just saying undergraduates sets Smart on a verbal detour. In his Scottish inflection he exults: "The undergrads here are phenomenal—outstanding! They take high-level direction and are self-propelled enough to find out things they don't know, try new approaches, and independently figure out what to do next! Many are better than graduate students at other places."

Lewis and his undergraduate programmers made their first public appearance at the SIGGRAPH (ACM Special Interest Group on Computer Graphics) Emerging Technologies exhibition in San Antonio, Texas, in July 2002. Students had groomed the robot for months—imagining what it could be; determining what commands would control features such as the camera's pan-tilt unit; and writing and testing the face detector and other software.

Their prodigy performed flawlessly. Techies were charmed to find the world's first robotic photographer rolling around, topped with a camera that found their faces and composed photos they could take away as prints. To focus on a person, Lewis uses his laser range finder—mounted about a foot off the ground and sweeping 180 beams across the room—to register what appear to be legs, and then searches higher in the camera image for colors that might signify faces.
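That two-step search—spot leg-like dips in the laser scan, then look above them for face-colored pixels—can be sketched in a few lines of Python. Everything here (the thresholds, the helper names, the crude skin-tone test) is illustrative, not the lab's actual code:

```python
# Hypothetical sketch of Lewis' two-stage person finder: find narrow,
# close-range dips in a 180-reading laser scan, then test pixels above
# them for face-like colors. All numbers and names are invented.

def find_leg_candidates(ranges, background=4.0, min_width=3, max_width=12):
    """Return (start, end) index spans where the scan dips well inside
    the background distance, about the width of a pair of legs."""
    spans, start = [], None
    for i, r in enumerate(ranges):
        near = r < background - 0.5          # noticeably closer than the room
        if near and start is None:
            start = i
        elif not near and start is not None:
            if min_width <= i - start <= max_width:
                spans.append((start, i))
            start = None
    if start is not None and min_width <= len(ranges) - start <= max_width:
        spans.append((start, len(ranges)))
    return spans

def looks_like_skin(pixel):
    """Crude skin-tone test on an (r, g, b) tuple -- purely illustrative."""
    r, g, b = pixel
    return r > 95 and g > 40 and b > 20 and r > g and r > b

# Usage: a flat 4 m room with one narrow dip at about 1.2 m.
scan = [4.0] * 180
for i in range(88, 94):
    scan[i] = 1.2
print(find_leg_candidates(scan))   # -> [(88, 94)]
```

Real detectors are far more careful—legs come in pairs, and lighting shifts skin tones—but the divide-and-conquer shape of the search is the same.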

Lewis—who has worked a real wedding reception—has been programmed to observe basic rules of portraiture but, like any artist, to break the rules now and then. Student feedback has helped him learn to save photos worth keeping. He maps where people are located, planning his path by using pre-programmed landmarks and changing course if he sees a better opportunity. For safety's sake, he is fitted with bump-sensitive panels so that the gentlest collision shuts him down.
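That decision logic—follow the pre-programmed landmarks, divert when a better opportunity appears, halt on any bump—amounts to a small arbitration rule. A minimal sketch, with hypothetical names and scoring (nothing below is the team's actual code):

```python
# Hypothetical sketch of Lewis' goal arbitration: landmarks guide the
# route, a high-scoring photo opportunity can override it, and any
# bump triggers an emergency stop. All names and numbers are invented.

def next_goal(landmarks, opportunities, bumped, threshold=0.8):
    """Return the robot's next destination, or None for an emergency stop."""
    if bumped:                        # the gentlest collision shuts him down
        return None
    best = max(opportunities, key=lambda o: o["score"], default=None)
    if best is not None and best["score"] >= threshold:
        return best["position"]      # change course for a better shot
    return landmarks[0] if landmarks else None

route = [(1.0, 0.0), (2.0, 3.0)]
spotted = [{"score": 0.9, "position": (0.5, 0.5)}]
print(next_goal(route, spotted, bumped=False))   # -> (0.5, 0.5)
print(next_goal(route, [], bumped=False))        # -> (1.0, 0.0)
print(next_goal(route, spotted, bumped=True))    # -> None
```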

The researchers also program the robot for varying light and different kinds of floors; the weather matters, too. When Lewis appeared at a reception at The Ritz-Carlton as part of the Council for the Advancement of Science Writing's annual New Horizons in Science Briefing at Washington University in October 2002, and again at the University's Sesquicentennial celebration in September 2003, the humidity remained outside. But at the 18th International Joint Conference on Artificial Intelligence (IJCAI) in Acapulco in August 2003, Lewis had to be adjusted for the saturated air, which changes the speed of sound enough to affect his sonar.
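The adjustment is straightforward physics: sonar converts an echo's round-trip time into distance using an assumed speed of sound, and warm, humid air carries sound faster than a cool, dry lab. The sketch below uses a common linear engineering approximation for the speed of sound—not the team's actual calibration—to show why the correction matters:

```python
# Why humid Acapulco air needed a sonar adjustment: the same echo time
# maps to a different distance. The linear temperature-plus-humidity
# formula is a rough engineering approximation, not an exact model.

def speed_of_sound(temp_c, rel_humidity_pct=0.0):
    """Approximate speed of sound in air, in m/s."""
    return 331.4 + 0.6 * temp_c + 0.0124 * rel_humidity_pct

def echo_to_distance(echo_time_s, temp_c, rel_humidity_pct=0.0):
    """Distance to an obstacle from a sonar round-trip echo time."""
    return speed_of_sound(temp_c, rel_humidity_pct) * echo_time_s / 2.0

# The same 6 ms round trip: a centimeter or two of error matters
# when a robot is composing a shot or dodging a chair.
lab = echo_to_distance(0.006, 20.0, 40.0)        # 20 C, 40% humidity
acapulco = echo_to_distance(0.006, 30.0, 90.0)   # 30 C, 90% humidity
print(round(lab, 3), round(acapulco, 3))         # -> 1.032 1.052
```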

The early runs and the prepping paid off at the IJCAI: Lewis won the robot competition there, beating the machine that a Carnegie Mellon/Naval Research Laboratory/Metrica/Swarthmore/Northwestern team assembled, and made international news.

"Each group on the other side perfected a part of the robot, and then the groups combined all of the pieces. But when they tried to get each bit to work together, they could not get them to work—it all fell apart," explains Smart, who is co-organizing the 2004 robot competition at the American Association for Artificial Intelligence conference in San Jose, California, a contest Lewis will enter. (Rumor has it he will pretend to be a conference attendee.) The 2003 Washington University team succeeded because the researchers were always careful to have a functioning system, beginning with a few robust functions and then gradually improving them, constantly designing and testing the whole.

Nik Melchior (left), a fifth-year B.S./M.S. student in computer science and engineering, helped create the programming framework that allows others to command Lewis. Shannon Lieberg (center), Engineering Class of '04, works Lewis' controls with Assistant Professor Bill Smart.

Michael Dixon, B.S./M.A. '03, worked on Lewis from the beginning; now a research assistant in the Media and Machines Lab, he is investigating an automated visual inspection task under contract from the Boeing Corporation. "The project was so engaging that everyone in the lab was interested and wanted to share ideas! It was very exciting, especially since there is never a single answer."

Educating Lewis wasn't easy, but the effort laid a solid groundwork. The next step will be to socialize him. "People generally have absolutely no idea what to do when a robot is in the room, so we need to train Lewis to do the interacting," says Smart. "When human beings interact, we interpret body language and facial expression, and have a sense of what may happen next. But a robot just sits there and then suddenly moves. People are very uncomfortable with that."

"We know that people ascribe feelings and states of mind to robots, so they can think of them as creatures," adds Grimm. To bridge the comfort gap, some researchers install monitors with expressive faces; others design robots resembling reassuring animals. Based on their user studies at the Saint Louis Science Center, Smart, Grimm, and their students are working on a "vocabulary of movement" for Lewis—signaling through movements or sound that an action will occur, for example—and on determining whether people are happiest with a recorded voice or the beeps and whistles that proved so expressive in Star Wars. They plan to engage colleagues in the psychology department in future research on robot-human interaction.

Lewis frames Zachary Byers, B.S./A.B. '03, and Sara Shipley, a St. Louis Post-Dispatch reporter, at a reception for the Council for the Advancement of Science Writing's fall 2002 annual briefing, hosted at the University.

Shannon Lieberg, a senior who has a coveted scholarship from the Department of Homeland Security, joined the project in fall 2003. As part of her research at the Science Center, she enjoyed watching visitors humanize Lewis. "If he began beeping, people would say, 'Oh, he's talking to us!'" Now Lieberg is experimenting with using Lewis' existing code and capabilities to interface with a computerized talking face.

Nik Melchior, a fifth-year B.S./M.S. student in CSE, was instrumental in creating the programming framework that lets others make the magic happen. "I write the code that allows Lewis' applications and devices to talk to each other—so that another programmer can just tell Lewis, 'Get picture,' for instance, and he does." Melchior's demanding integrative work is so good that he has just been admitted to the Ph.D. program in robotics at Carnegie Mellon University. "It's the best robotics program in the world right now," says Smart, "but we really hate to lose him." He adds with a wink: "And at the IJCAI competition this year, Nik will be working for the enemy!"

Robots, however, can work for us all. Lewis, says Smart, is a student magnet—"and the undergraduates' enthusiasm reminds me how cool it is to be doing what we're doing!"

Next! Robots Take a 48-Million-Mile Field Trip to Mars

Washington University robotics researchers think locally, globally, and cosmologically. In the fall 2004 issue, meet Ray Arvidson, the University's man on-line with Mars, and the two roving research assistants he and his Earth and Planetary Remote Sensing Laboratory team helped put on NASA's Martian map.

"A wonderful synergy is developing between the students and the entire laboratory," says Catalin Roman, professor and chair of the computer science and engineering department. Lewis is also an appealing messenger of scientific knowledge—as thousands have already discovered. And, says Roman, as manufactured devices become increasingly miniaturized and specialized, tiny robots will likely be used to go where none has gone before—often working in groups. "Jobs that human beings should not be doing could be handled by robots—entering burning buildings, collecting samples in volcanoes, checking for hazardous materials, clearing fields of land mines," says Smart.

A science-fiction fan, Grimm says that while she believes the humanoid machines described since the 1920s aren't likely to materialize, specialized robots might also be used to glide along aisles in stores, "cleaning the floors and leading the way when a customer asks where something is located." (To which millions of shoppers might say to science: Just hurry.)

Judy H. Watts is a freelance writer based in Santa Barbara, California, and a former editor of this magazine.