We talk about self-driving cars as if the problem is solved. We assume that because a Waymo can navigate a chaotic intersection in Phoenix or a foggy street in San Francisco, the hard part is over. But the truth is stranger and more unsettling: The most experienced driver at Google has never been in a car.
Humans learn driving through vulnerability. We know the physics of a crash because we are made of meat and bone. We stop at red lights because we fear the thud.
Google (via its sibling company, Waymo) realized this early. The road is a sparse dataset. Most driving is boring. The truly dangerous moments—the tire rolling out of a driveway, the deer jumping the median, the drunk driver running a red light—happen maybe once every 100,000 miles.
If we only taught a self-driving car using real-world road data, it would take centuries. Worse, it would be lethal. To teach a neural network that a child running into the street is bad, you would have to wait for a child to actually run into the street—and hope the car stops in time. That is not engineering; that is gambling.
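To make the scale concrete, here is a back-of-envelope sketch in Python. The 100,000-miles-per-event figure comes from the paragraph above; the number of training examples, the average speed, and the daily driving hours are hypothetical round numbers chosen for illustration, not Waymo's actual figures.

```python
# Back-of-envelope: how long would one car need to drive on real roads
# to witness enough rare hazardous events to train on?

MILES_PER_RARE_EVENT = 100_000   # the article's "once every 100,000 miles"
EVENTS_NEEDED = 10_000           # hypothetical: examples a model might need
AVG_SPEED_MPH = 30               # hypothetical: mixed city/highway average
HOURS_PER_DAY = 8                # hypothetical: a heavily used test vehicle

miles_required = MILES_PER_RARE_EVENT * EVENTS_NEEDED
hours_required = miles_required / AVG_SPEED_MPH
years_required = hours_required / (HOURS_PER_DAY * 365)

print(f"Miles of real-world driving needed: {miles_required:,}")
print(f"Years for one car at {HOURS_PER_DAY} h/day: {years_required:,.0f}")
# -> 1,000,000,000 miles; roughly 11,000 years for a single vehicle
```

Even spreading that billion miles across a fleet of hundreds of cars still leaves you waiting for decades, which is why collecting rare events passively was never a serious option.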