16th edition of the 9th year of SmartDrivingCars eLetter
T. Krisher, Feb 19, “The fiery crash of a Tesla near Houston with no one behind the wheel is drawing scrutiny from two federal agencies that could bring new regulation of electronic systems that take on some driving tasks.
The National Highway Traffic Safety Administration and the National Transportation Safety Board said Monday they would send teams to investigate the Saturday night crash on a residential road that killed two men in a Tesla Model S.
Local authorities said one man was found in the passenger seat, while another was in the back. They’re issuing search warrants in the probe, which will determine whether the Tesla’s Autopilot partially automated driving system was in use. Autopilot can keep a car centered in its lane, keep a distance from cars in front of it, and can even change lanes automatically in some circumstances.
On Twitter Monday, Tesla CEO Elon Musk wrote that data logs “recovered so far” show Autopilot wasn’t turned on, and “Full Self-Driving” was not purchased for the vehicle. He didn’t answer reporters’ questions posed on Twitter….” Read more Hmmmm… I’ll stand by my quote… “…“Elon’s been totally irresponsible,” said Alain Kornhauser, faculty chair of autonomous vehicle engineering at Princeton University. Musk, he said, has sold the dream that the cars can drive themselves even though in the fine print Tesla says they’re not ready. “It’s not a game. This is serious stuff.”…” … even though it isn’t the most critical comment. What is more concerning… “Why didn’t Tesla’s Automated Emergency Braking System prevent the Tesla from hitting the tree?” The common theme in the Joshua Brown, Elaine Herzberg, Walter Huang, Firetruck/Derrick Monet, and 2nd_Firetruck_Tesla crashes is that Teslas seem to disregard stationary objects directly ahead. One must assume the cars assume they can pass underneath them. Can such an egregious oversight in Tesla’s AEB computer code really exist? Is the Society of Automotive Engineers (SAE) involved in this oversight because it has made Tesla, and maybe others, so averse to false positives that they simply assume that Teslas can pass under any and all stationary objects in the road ahead? Not a pretty situation. Alain
K. Pyle, April 18, ““It’s time to hit the start button,” is Fred Fishkin’s succinct way of summarizing the next steps in the Smart Driving Car journey. Fishkin, along with the LA Times’ Russ Mitchell, co-produced the final session of the 2021 Smart Driving Car Summit, Making It Happen – Part 2. This 16th and final session in this multi-month online conference not only provided a summary of the thought-provoking speakers, but also provided food for thought on a way forward to bring mobility to “the Trentons of the World.”
Setting the stage for this final session, Michael Sena provided highlights of the Smart Driving Car journey that started in late December 2020. Safety, high-quality, and affordable mobility, particularly for those who do not have many options, was a common theme to the 2021 Smart Driving Car Summit. As Princeton Professor Kornhauser, the conference organizer put it,
“We want the value [of safe driving and driverless] to be captured by society.”…..” Read more Hmmmm…. We had another excellent Session. Thank you for the summary, Ken! Alain
Ken Pyle‘s Session Summaries of 4th Princeton SmartDrivingCar Summit:
15th Session Making it Happen – Part One: Elected Officials’ Role in Creating a Welcoming Environment in the Trentons of this World
Orf467F20_FinalReport “Analyzing Ride-Share Potential and Empty Repositioning Requirements of a Nationwide aTaxi System“
Kornhauser & He, March 2021 “AV 101 + Trenton Affordable HQ Mobility Initiative“
SmartDrivingCars Pod-Cast Episode 209, Zoom-Cast Episode 209 w/Clifford Winston, Brookings Inst.
F. Fishkin, April, “The Texas #Tesla crash that killed two continues to make headlines. The impact on the electric and automated vehicle industries? From the Brookings Institution, senior fellow Clifford Winston joins Princeton’s Alain Kornhauser and co-host Fred Fishkin for a look at what the real focus should be on…” “Alexa, play the Smart Driving Cars podcast!” Ditto with Siri and GooglePlay … Alain
The SmartDrivingCars eLetter, Pod-Casts, Zoom-Casts and Zoom-inars are made possible in part by support from the Smart Transportation and Technology ETF, symbol MOTO. For more information: www.motoetf.com. Most funding is supplied by Princeton University’s Department of Operations Research & Financial Engineering and Princeton Autonomous Vehicle Engineering (PAVE) research laboratory as part of its research dissemination initiatives.
K. Barry, April 22, “Consumer Reports engineers easily tricked our Tesla Model Y this week so that it could drive on Autopilot, the automaker’s driver assistance feature, without anyone in the driver’s seat—a scenario that would present extreme danger if it were repeated on public roads. Over several trips across our half-mile closed test track, our Model Y automatically steered along painted lane lines, but the system did not send out a warning or indicate in any way that the driver’s seat was empty.
“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” says Jake Fisher, CR’s senior director of auto testing, who conducted the experiment. “Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road….” Read more Hmmmm… Essentially everything is hackable or can be circumvented by those who wish to misbehave. I’m sure Consumer Reports could break into Fort Knox.
The issue here is not the misbehavior of the driver. We have the justice system to deal with that. Hopefully it can deal with this kind of misbehavior better than it can deal with broken tail lights and hanging air fresheners.
It is about the visions and dreams that motivate us to buy these cars and entice us to misbehave in search of experiencing those dreams. Those visions are inspired and created by just a few individuals, the leaders of the corporations that make, and want us to buy, these cars. They can lead by simply insisting that their messages, and the messages produced by their minions, don’t engender such behavior. As I’ve written, autoPilot, and maybe even FSD, are fantastic Comfort & Control features and may well substantially improve safety, but only if they are used responsibly. The proper use should not be drowned out by hype that clearly can result in tragedy.
Moreover, these leaders can instruct their coders to write code that is smart enough to know when we misbehave in using these systems and do something about it. If the system catches you misbehaving, it should turn on the emergency flashers, slow down, find the next opportunity to pull over, and stop. It should then disable its operation until the driver gets a note from his mother.
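The escalating response described above (flashers on, slow down, pull over, then lock out the system) can be sketched in a few lines. This is purely illustrative, assuming a hypothetical vehicle interface; none of these names correspond to Tesla's actual software.

```python
# A minimal sketch (NOT Tesla's actual logic) of the misuse response
# described above: once driver monitoring flags misuse, turn on the
# hazard flashers, bleed off speed, pull over at the next safe
# opportunity, then keep the assist system disabled. All names here
# are hypothetical.
from dataclasses import dataclass, field

@dataclass
class VehicleState:
    speed_kph: float
    hazards_on: bool = False
    pulled_over: bool = False
    assist_disabled: bool = False
    log: list = field(default_factory=list)

def handle_misuse(state: VehicleState, safe_shoulder_ahead: bool) -> VehicleState:
    """Escalating response once misuse (e.g. an empty driver's seat) is detected."""
    state.hazards_on = True
    state.log.append("hazards on")
    # Slow gradually rather than braking hard in traffic.
    state.speed_kph = max(state.speed_kph - 20.0, 30.0)
    state.log.append(f"slowed to {state.speed_kph:.0f} kph")
    if safe_shoulder_ahead:
        state.speed_kph = 0.0
        state.pulled_over = True
        state.assist_disabled = True   # stays off until re-authorized
        state.log.append("pulled over; assist disabled")
    return state
```

Called once per control cycle, this keeps the car creeping along with hazards on until a safe shoulder appears, then stops and refuses to re-engage.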
This isn’t tough. This doesn’t need Deep Learning, Machine Learning or the latest AI. It needs responsible leadership that stops promoting these products as ways to show off one’s bravado. Why are all the car commercials focused on speed and acceleration? Do we really like to plow through deep snow, drive down river banks and up Great Walls? Really??? So depressing! Alain
R. Mitchell, April 19, “It’s a 21st century riddle: A car crashes, killing both occupants — but not the driver.
That’s what happened over the weekend in Houston, where a Tesla Model S slammed into a tree and killed the two men inside. According to police, one had been sitting in the front passenger seat, the other in the back of the car.
Although investigators have not said whether they believe Tesla’s Autopilot technology was steering, the men’s wives told local reporters the pair went out for a late-night drive Saturday after talking about the system.
Tesla Chief Executive Elon Musk pushed back on speculation but also asserted no conclusion, tweeting Monday that “Data logs recovered so far show Autopilot was not enabled.” The company has resisted sharing data logs for independent review without a legal order….” Read more Hmmmm… I’ll stand by my quote…’ “I suspect there will be big fallout from this,” said Alain Kornhauser, head of the driverless car program at Princeton University.” Alain
B. Templeton, April 22, “… Regardless of whether the owner was able to activate Autopilot or ACC or just made a mistake trying to push the accelerator, the real question about this accident is what made somebody do something so fatally foolish? This has been the central question around Tesla’s deployment of these systems. Used correctly, they are useful tools. Driving with Autopilot is at a similar safety level to driving without it. (Not much safer, as Tesla misleadingly claims.) The problem arises because people decide, against all warnings, that it is better than it is, and they misuse it….” Read more Hmmmm… All really good points. The shame is that the Automated Emergency Braking system which, by its very name, is supposed to intervene and “save-the-day”, may well have been “out-to-lunch” again. (Why does the AEB need lunch???) Alain
T. Lee, April 16, “For years, the NTSB has been calling on Tesla to beef up its driver-monitoring system. But as an investigative agency, the NTSB doesn’t have the power to require automakers to change how their vehicles are designed. The agency that does have that power—NHTSA—has yet to require the use of driver-monitoring systems or set rules for how they work.” Read more Hmmmm… I suspect that the eye tracking system in GM’s SuperCruise can be compromised, but what may well be useful is to not implicitly encourage the misuse of these systems. Alain
D. Shepardson, April 22, “Two U.S. senators are working to attach legislation to allow automakers to deploy tens of thousands of self-driving vehicles on U.S. roads to a bipartisan China bill, a significant reform that could help speed the commercial use of automated vehicles.
Senators Gary Peters, a Democrat, and John Thune, a Republican, have circulated a draft amendment seen by Reuters that would grant the U.S. National Highway Traffic Safety Administration (NHTSA) the power to initially exempt 15,000 self-driving vehicles per manufacturer from safety standards written with human drivers in mind. The figure would rise to 80,000 within three years…. ” Read more Hmmmm… OK. Alain
B. Pietsch, April 18, “Two men were killed in Texas after a Tesla they were in crashed on Saturday and caught fire with neither of the men behind the wheel, the authorities said.
Mark Herman, the Harris County Precinct 4 constable, said that physical evidence from the scene and interviews with witnesses led officials “to believe no one was driving the vehicle at the time of the crash.”
The vehicle, a 2019 Model S, was going at a “high rate of speed” around a curve at 11:25 p.m. local time when it went off the road about 100 feet and hit a tree, Constable Herman said. The crash occurred in a residential area in the Woodlands, an area about 30 miles north of Houston.
The men were 59 and 69 years old. One was in the front passenger seat and one in the rear seat, Constable Herman said.
He said that minutes before the crash, the men’s wives watched them leave in the Tesla after they said they wanted to go for a drive and were talking about the vehicle’s Autopilot feature….. ” Hmmmm… This was my 1st read on this. Alain
F. Lambert, April 22, “In what could be a first, Tesla has reportedly publicly released the data logs from a customer’s vehicle involved in a crash that led the owner to protest at Tesla’s booth at the Shanghai Motor Show….
She jumped on a display car to claim that Tesla’s “brakes are not working.”
The owner was eventually dragged out of the booth and reportedly put in “police detention,” but not before the event was filmed and posted to social media.”…” Read more Hmmmm… Maybe that’s what I need to do to get Tesla to release their data so that I, or some other independent entity, can ascertain the safety of autoPilot.
More importantly, “...The front collision warning and automatic emergency braking function were activated (the maximum brake master cylinder pressure reached 140.7bar) and played a role, reducing the amplitude of the collision. 1.8 seconds after the ABS was applied, the system recorded the occurrence of the collision. After the driver stepped on the brake pedal, the vehicle speed continued to decrease, and before the collision, the vehicle speed was reduced to 48.5 kilometers per hour.” ...”
Why did Elon’s coders design the AEB:
1. to wait until 1.8 seconds before the collision to activate? The coders knew/know that is too late given the speed at 1.8 seconds before collision.
2. to not have the AEB, or autoPilot, begin to slow down earlier than the “1.8 seconds to collision”? It is straightforward to compute a feasible “master cylinder pressure profile” that avoids crashing into a stationary object ahead while the Tesla is going sufficiently slowly. These computations can be done quickly and repeated continuously. At the start of a trip this process is trivial because the car is at rest and the solution is simply the brake pressure that keeps the car stationary. As the car starts to move it is going slowly and, hopefully, its cameras can detect a stationary object ahead at a sufficient distance that many feasible “master cylinder pressure profiles” exist that can keep the Tesla from crashing. I’m sure the coders can pick a good one to implement.
As the car goes faster and the object gets closer, fewer feasible profiles exist, but still, one can be found and implemented so that the car does not crash.
At some point, the combination of speed and distance are such that physics can no longer help and there exists no feasible profile that will keep the car from crashing.
NHTSA or some regulator, or Elon himself, should require that the AEB be designed to not let the car pass that point: not let it enter the Operational Design Domain (the combination of road condition, speed and distance to the stationary object ahead) in which there is no feasible “master cylinder pressure profile” that will avoid a crash!
These “no-man’s lands” don’t just appear out of thin air. Every trip starts with speed = 0. Conditions rarely change instantaneously. Even the “instantaneous” appearance of stationary objects in the lane ahead is rare. In those instances, yes, it is “everybody do the best they can”, but in the preponderance of others, no crash would occur!
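The feasibility check argued for above is back-of-the-envelope physics: a braking profile that avoids the crash exists whenever the distance needed to stop from the current speed is less than the distance to the stationary object. A minimal sketch, assuming constant maximum deceleration (the ~8 m/s² used here approximates a hard stop on dry pavement and is my assumption, not a Tesla parameter):

```python
# Sketch of the "does a feasible braking profile still exist?" check.
# Assumes constant maximum deceleration a_max (m/s^2); 8.0 is an
# illustrative dry-pavement value, not a Tesla parameter.

def stopping_distance_m(speed_kph: float, a_max: float = 8.0) -> float:
    """Distance (m) needed to brake from speed_kph to rest: v^2 / (2*a_max)."""
    v = speed_kph / 3.6          # kph -> m/s
    return v * v / (2.0 * a_max)

def braking_is_feasible(speed_kph: float, distance_m: float,
                        a_max: float = 8.0) -> bool:
    """True while at least one braking profile can still avoid the crash."""
    return stopping_distance_m(speed_kph, a_max) < distance_m
```

For example, a hard stop from 74 kph needs roughly 26 m, so a stationary object detected 40 m ahead is still avoidable; the “no-man’s land” begins only once the remaining distance drops below the stopping distance, which is exactly the boundary the AEB should never let the car cross.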
Now maybe Tesla needs Lidar in order to first “see” stationary objects sufficiently far ahead when conditions are poor and the speed is high. In that case Elon should either back off his “no Lidar” stance or the AEB (or whatever other automated system) should not let you go that fast under those conditions, period!
This is serious business here and these systems should be there to give us a “get out of jail free” card, pass GO and collect $200.
Again, the Tesla knew it had waited too long to stop; it was designed to not slow down earlier, letting itself get into a situation in which it couldn’t stop, and so it was designed to crash at 48.5 kph in this situation. While some may be happy that it slowed from above 74 kph to 48.5 kph, there was no reason that additional brake pressure could not have been applied by Tesla’s great autoPilot/AEB/… system earlier than 6:14:26.37PM, slowing it down below 74 kph at that time such that no collision would have occurred. It is also likely that a brake profile could have been computed and implemented that would not have concerned the driver and almost assuredly would have had the driver praising Tesla at the Shanghai Auto Show rather than protesting. Alain
A. Hawkins, April 22 Two Senate Democrats are urging federal regulators to take “corrective actions” against Tesla to prevent further misuse of the company’s advanced driver assist feature. The request comes in the aftermath of a fatal crash in which two men from Texas were killed after their Tesla Model S crashed with no one in the driver’s seat.
In a letter sent to National Highway Traffic Safety Administration Acting Administrator Steven Cliff, Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT) implored the agency to determine the exact cause of this recent crash to “better inform” future legislation around advanced driver assist systems like Tesla’s Autopilot…..” Read more Hmmmm… Alain
F. Lambert, April 22, “Elon Musk, who has generally welcomed valid criticism, has now dismissed some serious concerns about Tesla as “weird” attacks from the media.
We previously reported on how “Tesla superfandom is becoming toxic and negative for the electric revolution.”
Part of that involves Elon Musk’s feedback loop, which has been extremely valuable for Tesla, getting corrupted by those superfans.
Musk has often highlighted the feedback loop, which often consists of him responding directly to people and criticism on Twitter, as one of Tesla’s biggest advantages…” Read more Hmmmm… Again, it is a shame because Elon, Tesla and probably autoPilot are really good all by themselves. They don’t need to be oversold and hyped. And their challenges and limitations should be made clear; then we can all really appreciate all of the excellent aspects of these cars. Alain
A. Kornhauser, Feb 6, “The focus of the Summit this year will be moving beyond the AI and the Sensors to addressing the challenges of Commercialization and the delivery of tangible value to communities. We’ve made enormous progress with the technology. We’re making the investment; however, this investment delivers value only if it is commercialized: made available and used by consumers in large numbers. Demos and one-offs are “great”, but to deliver value that is anywhere near commensurate with the magnitude of the investment made to date, initial deployments need to scale. We can’t just have “Morgantown PRT Systems” whose initial deployment has been nothing but enormously successful for 45 years (an essentially perfect safety record, an excellent availability record and customer-valued mobility). Unfortunately, the system was never expanded or duplicated anywhere. It didn’t scale. It is a one-off.
Tests, demos and one-offs are nice niche deployments; however, what one really needs are initial deployments that have the opportunity to grow, be replicated and scale. In 1888, Frank Sprague successfully deployed a small electric street railway system in Richmond, Va., which became the reference for many other cities. “… By 1889 110 electric railways incorporating Sprague’s equipment had been begun or planned on several continents…” Substantial scaled societal benefits emerged virally from this technology. It was eventually supplanted by the conventional automobile, but for more than 30 years it delivered substantial improvements to the quality-of-life for many.
In part, the 4th Summit will focus on defining the “Richmond” of Affordable Shared-ride On-demand Mobility-as-a-Service: the initial Operational Design Domain (ODD) that safely accommodates Driverless Mobility Machines that people actually choose to use and becomes the envy of communities throughout the country.” Read more Hmmmm… Draft Program was in flux. All named individuals actively participated. Alain
C’mon Man! (These folks didn’t get/read the memo)
Calendar of Upcoming Events
5th Annual Princeton SmartDrivingCar Summit
Live in Person
To be Announced
R. Shields, 22 – 25 March, “Recordings from the conference:
Session 1 plus opening: (Regulatory): https://youtu.be/UcDC8gXiUFk
Session 2: (Cybersecurity): https://youtu.be/ppp2hxlvebY
Session 3: (Automated Driving Systems): https://youtu.be/uL2dRHuX2Cc
Session 4: (Communications for ADS) : https://www.youtube.com/watch?v=IFQcL6yfBso
Read more Hmmmm… Russ, thank you for sharing! Alain