In the popular imagination, robotic surgery is a piece of cake. Take the 2012 science fiction film Prometheus, where a character simply climbs into a pod for an emergency cesarean section. But today's surgical robots aren't even close to that yet. Although a robot can make precise cuts with a blade, thread needles, and even tie knots, modern machines are still hampered by poor vision. In the messy, crowded environment of a soft-tissue surgery, robots struggle to keep track of where the features of the organs are relative to each other, let alone change those features with a deft nip and tuck. So at best, robots have been used by surgeons as a nonautonomous "third hand."

But a new system, the Smart Tissue Autonomous Robot (STAR), brings self-driving robo-surgery a step closer. To make sense of what it's seeing, it lights up the scene with near-infrared fluorescence and uses a technique called plenoptic imaging, creating a 3D model of the world by comparing the points of view of several cameras. For fine control of instruments, STAR uses a robotic arm with eight degrees of freedom, one more than the human arm.

It performed well doing sutures on fake tissue (see video, above), but its performance on living tissue from pigs was more impressive, as reported today in Science Translational Medicine. Working under human supervision, STAR was able to perform linear suturing and anastomosis (connecting sections of small intestine end-to-end) as well as human surgeons could, though it took more than five times longer.

As for the robot's bedside manner? That's a work in progress.
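The principle behind multi-camera depth recovery can be illustrated with the simplest two-camera case: a point that appears shifted between two views reveals its distance. This is only a toy sketch of that geometric idea, not STAR's actual software; the function name and all numbers are made up for demonstration, and plenoptic imaging generalizes the same relation across many viewpoints at once.

```python
# Toy illustration of depth from two camera viewpoints.
# Plenoptic imaging extends this idea to many views simultaneously.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic stereo relation: depth Z = f * B / d.

    focal_px     -- focal length of the cameras, in pixels
    baseline_m   -- distance between the two cameras, in meters
    disparity_px -- horizontal shift of the same point between views, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("the point must appear shifted between the two views")
    return focal_px * baseline_m / disparity_px

# A point that shifts 40 px between cameras 5 cm apart,
# seen through lenses with an 800 px focal length:
print(depth_from_disparity(800.0, 0.05, 40.0))  # 1.0 (meter away)
```

The farther a point is, the smaller its shift between views, which is why depth estimates get noisier with distance; packing many closely spaced viewpoints into one plenoptic sensor is one way to fight that noise.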