32nd edition of the 9th year of SmartDrivingCars eLetter
L4 Autonomous Truck Driving will not be so simple for TuSimple Holdings Inc. (NASDAQ: TSP) Why we believe the Company is All Smoke and Mirrors
Grizzly Research, Aug 10, "TSP is one of the latest hot China-based IPOs of an ambitious autonomous driving technology company, but we believe the company has systematically lied and misrepresented key information. …" Read more Hmmmm… Devastating. Grizzly is focused on the short side, so read carefully; bias may exist here.
My “back of the envelope”: Looks like TuSimple is expecting $0.35/mile revenue for their AV stack on Class 8 trucks. This is 50% of professional driver costs. Fine if you can eliminate the driver. Not so fine if an attendant is still there.
No way anyone can really begin to eliminate a driver on any stretch of the US interstate highway system that carries any substantial volume of trucks for at least 2 years. The "2 years" assumes that "attended" operation has encountered "few" disengagements, all of which have been appropriately resolved, and that there have been no "spectacular" crashes by anyone involving driverless trucks in North America. ("Attended" means that there is at least one professional driver overseeing the "driverless" operation. "Few"… you pick a number greater than zero. Disengagements… the professional driver intervened so as to avoid a "spectacular" crash. A "spectacular" crash is one that goes viral (Herzberg/Fukushima/Chernobyl/Diana). "2 years"… pick a number….)
Point is…, everyone is still in a substantial testing phase of similar duration to the Waymo/GM-Cruise/Ford-Argo testing phases (5+ years), which necessarily precedes market introduction, which then involves its own ramp-up phase (which hasn't been going well for Waymo, and the others haven't even started). So pick a number. During that time, TuSimple's investors will need to pick up the tab for all of the people and all of the lidars, etc. used in testing and marketing initiatives before the company can recognize any substantial $0.35/truck-mile revenue (minus the cost of the "AI-driver" software/sensor/processor/communications stack… pick a number. In the initial ramp-up of sales, this number can easily be greater than $0.35/mile).
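The back-of-the-envelope above can be put into a tiny sketch. The $0.35/mile fee is from the text; the $0.70/mile driver cost is simply implied by $0.35 being "50% of professional driver costs." Everything here is illustrative, not a TuSimple figure:

```python
def carrier_savings_per_mile(av_fee=0.35, driver_cost=0.70, attendant_needed=True):
    """Carrier's per-mile saving from replacing a ~$0.70/mile driver with a
    $0.35/mile AV stack -- negative if an attendant must stay onboard."""
    new_cost = av_fee + (driver_cost if attendant_needed else 0.0)
    return driver_cost - new_cost

# Fine if you can eliminate the driver...
print(round(carrier_savings_per_mile(attendant_needed=False), 2))  # 0.35
# ...not so fine if an attendant is still there.
print(round(carrier_savings_per_mile(attendant_needed=True), 2))   # -0.35
```

With an attendant onboard, the carrier pays the AV fee on top of the driver and is worse off than before; the whole value proposition hinges on actually removing the human.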
TSP better be really good! Plus, they can't afford any slip-ups, nor can they afford anyone else crashing; else, Grizzly is going to do very well, thank you. Alain
SmartDrivingCars Pod-Cast Episode 230, Zoom-Cast Episode 230 w/Tim Higgins, author: POWER PLAY: Tesla, Elon Musk and the Bet of the Century
F. Fishkin, Aug 21, "Teslas, Humanoids and Elevators! What Elon Musk and Tesla delivered at AI Day 2021 was insight into the company's remarkable technology, and that may boost recruiting efforts. So says Princeton's Alain Kornhauser, who is joined by co-host Fred Fishkin and guest Tim Higgins of the Wall Street Journal, author of POWER PLAY… Tesla, Elon Musk and the Bet of the Century. AI Day, the NHTSA investigation and Elon Musk hops on the elevator on Episode 230 of Smart Driving Cars!"
Or you can listen to Episode 230 of Smart Driving Cars on Tesla’s AI Day and more with guest Tim Higgins of the Wall Street Journal ..author of POWER PLAY… Tesla, Elon Musk and the Bet of the Century.
SmartDrivingCars Pod-Cast Episode 229, Zoom-Cast Episode 229 w/Russ Mitchell, Los Angeles Times
F. Fishkin, Aug 18, "With the National Highway Traffic Safety Administration having opened an investigation into Tesla Autopilot crashes involving emergency vehicles… Los Angeles Times reporter Russ Mitchell joins Princeton's Alain Kornhauser and co-host Fred Fishkin for a look at the issues facing Tesla and other vehicle makers." "Alexa, play the Smart Driving Cars podcast!" Ditto with Siri and GooglePlay… Alain
The SmartDrivingCars eLetter, Pod-Casts, Zoom-Casts and Zoom-inars are made possible in part by support from the Smart Transportation and Technology ETF, symbol MOTO. For more information: www.motoetf.com. Most funding is supplied by Princeton University’s Department of Operations Research & Financial Engineering and Princeton Autonomous Vehicle Engineering (PAVE) research laboratory as part of its research dissemination initiative
Yamadaxio, Aug. 17, "Trading in TuSimple Holdings shares approached three times the average volume on Tuesday as the lockup that prevented certain investors from selling expired. After a six-week slide, the stock rose 3% to end the day.
The autonomous-truck software developer went public in a traditional initial public offering that raised over $1 billion in April at a valuation of over $8 billion.
The other three autonomous truck companies (Plus, Aurora Innovation and Embark Trucks) are in various stages of mergers with special purpose acquisition companies (SPACs). This is a faster route to public trading, but with increased oversight by the Securities and Exchange Commission.
TuSimple (NASDAQ: TSP) debuted at about $40 per share, reached $79.84 by June 30, and then began to fall sharply. Shares closed on Tuesday at $31.34, up 3.26%, after a temporary drop toward $20. They added another 35 cents in after-hours trading. There were 6,842,882 shares traded on Tuesday, versus a daily average of 2,396,129 shares…." Read more Hmmmm… And this is after TSP was featured on a rerun of 60 Minutes last Sunday beginning @ 28:30. What was 60 Minutes thinking to air them again (and not mention the Grizzly story above)? We miss you, Mike Wallace!
Finally, Elon set a new standard for automated driving capabilities with his Tesla AI Day (next item) that is going to require everyone else trying to do automated driving to raise their stake or “fold ’em”. Alain
R. Bellan, Aug 20, “Elon Musk wants Tesla to be seen as “much more than an electric car company.” On Thursday’s Tesla AI Day, the CEO described Tesla as a company with “deep AI activity in hardware on the inference level and on the training level” that can be used down the line for applications beyond self-driving cars, including a humanoid robot that Tesla is apparently building….
There was a lot of super technical jargon, but here are the top four highlights of the day.
- Tesla Bot: A definitely real humanoid robot
- Unveiling of the chip to train Dojo
- To Full Self-Driving and beyond
- Solving computer vision problems
…. Read more Hmmmm… That is a fine list (although I'm not a fan of humanoids). I understand that "we'd prefer a machine to do menial tasks," and Elon did appropriately preface his enthusiasm for humanoids with a "promise" of universal basic income, so that the poor who currently do the menial and demeaning tasks would still be able to put food on the table after they were unburdened. He also adroitly sidestepped an entrapment question by implying that the economics of such humanoids isn't very attractive because the jobs they would do don't cost the employer much; in other words, no RoI for the employer (without any recognition of the societal implications of creating better jobs for people and/or allowing them to feed their families without underpaying them to do menial and demeaning tasks).
I thought the session was fantastic. An enormous amount of information and clear demonstrations of the capabilities/tools they have built and are continuing to build to address and get to "full self-driving" (and admitting that they aren't there yet).
Some of the concepts that came out in the Q&A that I fully agree with and have said myself repeatedly…
1. Machine learning: “I discourage the use of Machine Learning… because it is really difficult. Unless you have to use machine learning, don’t do it!” and he goes on from there. I loved it!!!
2. His elevator analogy at the end … I’ve been using it for years. I loved it!
3. Using simulations to generate perfect training data is right on (one of the things my students (Chenyi Chen, Mark Martinez, Artur Filipowitz, and others) and I pioneered 5 or so years ago), and he's taken it to a whole other level. His looks really good.
4. He repeated several times that the most basic thing is not to crash… yet there was no mention of the Tesla crashes that NHTSA is investigating. Oh well, and
5. In the many examples that were shown, I never saw the system highlight/tag a stationary object on or above the lane ahead. Never an overpass. Never a stopped car that had not been moving when first encountered. Never a tree. Never a large overhead sign. Are those things so rare??????
Again… a MUST watch including the Q&A (but skip the prelim music). Alain
Why do Tesla cars keep crashing into emergency response vehicles? Federal safety agency is investigating
R. Mitchell, Aug. 16, "In theory, identifying and avoiding stationary objects set off by hazard cones or flashing lights ought to be one of the easiest challenges for any autonomous-driving or driver-assist system.
Yet at least 11 times over the last seven years, cars made by Tesla Inc. and running its software have failed this test, slamming into emergency vehicles that were parked on roads and highways. Now the National Highway Traffic Safety Administration wants to know why.
A federal investigation announced Monday involves Tesla cars built between 2014 and 2021, including models S, X, 3 and Y. If the probe results in a recall, as many as 765,000 vehicles could be affected….
It’s about time, said Alain Kornhauser, director of the self-driving car program at Princeton University. “Teslas are running into stationary objects,” he said. “They shouldn’t be.”…
Read more Hmmmm… They shouldn't be!!! It seems as if Tesla, going way too fast trying to get to FSD, jumped over Automated Emergency Braking. It's trying to run a "four-minute mile" before demonstrating that it can crawl.
A necessary (but not sufficient) condition for a car to drive itself, is that it not run into things ahead; especially those that are just sitting there in front of it. Duh!!!!! That’s the fundamental purpose of an Automated Emergency Braking system…Stop before you hit it!!!
If you can't see emergency vehicles with flashing lights, or parked cars, or big tree trunks, or stub-ends of New Jersey barriers, or a trailer draped broadside across your lane in front of you, then what are you doing trying to get me to think that the car can safely drive itself, even a short distance, let alone a "full" trip from my watering hole to my bedroom? C'Mon Elon!
The objects that these Teslas hit were stationary, in full view, right in front. They didn’t just “jump from the bushes, …” In full view… Right in front!!! This is supposed to be the crawl step on the road to the “sub-4-minute mile”.
Is this a "manhood" thing??? Figuring that if you did the hard stuff, the fundamental stuff would come along as excess baggage?
I suspect that this is a fundamental design flaw. It is/was not recognized by all of the great AI gurus that the fundamental building block of FSD is a rock-solid Automated Emergency Braking system: a system that is on all the time and prevents the car from hitting anything ahead. This can be done either by stopping in time, or by determining that the car can safely pass under, over, to the left, or to the right of any object ahead. Unless, of course, that object appears "instantly out of nowhere" and even perfectly applied physics is unable to avert the crash, as might happen if a boulder drops right in front of you or a deer jumps out of the bushes, as happened to me a few weeks ago.
The problem is that Tesla's (and likely every automaker's) Automated Emergency Braking system isn't designed to even try to stop for stationary objects detected ahead. Instead, it simply ignores them. It relies on the high likelihood that the object can readily be passed under or over and, if not, that the alert driver will see it and apply the brakes. A somewhat fine design as long as there is an alert driver behind the wheel, a situation which tends to exist for all automakers other than Tesla, where the driver has turned on AutoPilot and stopped paying attention.
Thus, with other automakers, their drivers react and don’t crash. Nothing is ever reported about the incident.
With Tesla, the driver is not paying attention and is not able to save the day. Thus the headlines. Alain
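The "stop in time or safely pass" logic argued for above can be sketched with simple kinematics. This is an illustrative sketch, not any automaker's actual logic; the deceleration (6 m/s²) and reaction latency (0.2 s) are assumed numbers:

```python
def aeb_can_avoid(speed_mps, gap_m, can_pass=False,
                  max_decel=6.0, latency_s=0.2):
    """True if a stationary object ahead can be avoided: either the car can
    safely pass it (under/over/left/right), or it can brake to a stop before
    reaching it. Stopping distance = reaction travel + v^2 / (2a)."""
    if can_pass:
        return True
    stopping_m = speed_mps * latency_s + speed_mps ** 2 / (2 * max_decel)
    return stopping_m <= gap_m

# ~65 mph (29 m/s) toward a parked fire truck 120 m ahead: braking suffices.
print(aeb_can_avoid(29, 120))  # True
# Same speed, 60 m of gap: braking alone can no longer avoid the crash.
print(aeb_can_avoid(29, 60))   # False
```

The point of the sketch is that the decision is cheap and fully determined by speed, gap, and a pass/no-pass judgment; there is no physics reason an always-on AEB cannot make it for stationary objects.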
N. Boudette, Aug 16, “The U.S. auto safety regulator said Monday that it had opened a broad investigation of the Autopilot system used in hundreds of thousands of Tesla’s electric cars.
The investigation was prompted by at least 11 accidents in which Teslas using Autopilot, an assisted-driving system that can steer, accelerate and brake on its own, drove into parked fire trucks, police cars and other emergency vehicles, the safety agency, the National Highway Traffic Safety Administration, disclosed. Those crashes killed one woman and injured 17 people. …" Read more Hmmmm… Good background on the issue, but it fails to suggest that the common thread in all of these is crashing into an object ahead that never moved while the Tesla was approaching it, something that AutoPilot seemingly never expects to happen. Tesla's AEB follows moving objects (other cars) and averts moving pedestrians and stationary objects on the left and on the right. Stationary objects ahead that can't be passed under "shouldn't be there," so if one is detected, it is tagged as a "false positive," a "mirage," and is ignored. The source of the crash!!! Alain
D. Crichton, Aug 20, “In a late Friday night blow to Uber, Lyft and other gig worker-centered companies, a superior court judge ruled that California’s Proposition 22, which was passed in 2020 and designed to overrule the state’s controversial AB-5 law on the employment status of gig workers, violates the state’s constitution.
Frank Roesch, a superior court judge in Alameda County, which encompasses Oakland, Berkeley and much of the East Bay, ruled that the law would limit “the power of a future legislature” to define the employment status of gig workers….
Such a distinction is big business:…
Such fights are not limited to merely Silicon Valley’s home state,… ” Read more Hmmmm… The Uber/Lyft model for delivering high-quality mobility for all is a really good one.
However, to be environmentally responsible it needs ride-sharing, and to be affordable it can't require a chauffeur, who deserves to be paid a living wage with decent working conditions.
The conventional car got us part way because DiY did the chauffeuring while failing miserably with the ride-sharing piece. Alain
D. Furchtgott-Roth, Aug. 18, ” …Now the National Highway Traffic Safety Administration (NHTSA) is imposing complex reporting requirements to discourage new technology.
The NHTSA Administrator is subject to Senate confirmation, but the Administration has not yet proposed a nominee. On June 29, Steve Cliff, the Acting Administrator, announced a Standing General Order demanding regular crash data from 108 domestic and foreign companies, beginning in two weeks, for a period of three years.
These companies include most of the manufacturers of AVs and of vehicles equipped with advanced driver assistance systems, such as lane warnings, blind spot warnings, and adaptive cruise control.
According to the Order, a crash is "any physical impact between a vehicle and another road user… or property that results or allegedly results in any property damage, injury, or fatality. For clarity, a subject vehicle is involved in a crash if it physically impacts another road user or if it contributes or is alleged to contribute (by steering, braking, acceleration, or other operational performance) to another vehicle's physical impact with another road user or property involved in that crash."
But this is far from clear. If a company's vehicle has an incident that "allegedly" results in damage, and the driver disagrees, does it get reported? If a vehicle "is alleged to contribute" to another's accident, who has the final say? If a driver glances at a driverless Nuro delivering groceries and rear-ends another vehicle, Nuro has to report the accident to NHTSA as a crash. Allegations of which companies are unaware could be made by onlookers or in social media."… Read more Hmmmm… Hopefully, the reporting requirements will not be onerous and will be nothing more than a cut&paste of any internal documentation of "what happened." Safety data should be shared among all parties, but that sharing should be protected from "self-incrimination" or any downside. Alain
K. Hu, Aug. 11. “Autonomous driving startup Pony.ai has put on hold plans to go public in New York through a merger with a blank-check firm at a $12 billion valuation, after it failed to gain assurances from Beijing that it would not become a target of a crackdown against Chinese technology companies, people familiar with the matter said….
Pony.ai had been in exclusive talks to go public through a merger with VectoIQ Acquisition II (VTIQ.O). …
VectoIQ II is the second SPAC to be led by former General Motors (GM.N) Vice Chairman Steve Girsky, whose first SPAC struck a deal with electric truck maker Nikola Corp (NKLA.O). It raised $345 million in an oversubscribed IPO in January…." Read more Hmmmm… It all smells fishy! Again, if anyone is going to be real in this field, note that yesterday Elon very explicitly set the bar for entry. Alain
N. Boudette, Aug 17, “… Mr. McGee’s statements to investigators, the accident report and court filings paint a tragic picture of overreliance on technology. They also strongly suggest that Autopilot failed at a basic function — automatic emergency braking — that engineers developed years ago. Many newer cars, including models much more affordable and less sophisticated than Teslas, can slow or stop themselves when an accident seems likely. …” Read more Hmmmm… Finally, Neil mentions Automated Emergency Braking (AEB) as the possible culprit. Yea!!
I hope that AEB is NOT part of AutoPilot, but instead a base functionality on top of which AutoPilot is built. AEB is supposed to be on all the time, to save the driver should the car be entering a likely crash situation.
As with anti-lock brakes and electronic stability control, these systems are designed to override the driver and "do the right thing." AutoPilot is turned on at the discretion of the driver; some drivers may never turn it on. It would be a shame if the crash-prevention capabilities of Tesla's AEB were never afforded to those drivers. Similarly for the times when a user of AutoPilot has it turned off, or for situations where AutoPilot turns itself off.
AEB should and must be on all the time! The "issue" is with Tesla's AEB. Elon should put more of his AI Day muscle into Tesla's AEB. As he said several times, the primary objective is to "not crash". So Elon, please use some of that fabulous AI power to make your AEB "never" let a Tesla crash. As you said during AI Day, "The prime directive is don't crash"!!! Alain
S. Chi, Aug. 16, "Lin Wenqin, founder of the brand management firm Meiyihao, died Thursday in an accident after activating the autopilot navigation system while driving a Nio ES8, according to an obituary on Saturday, The Paper reported.
The accident, where Lin’s car rear-ended the vehicle in front, is still under investigation by the police in Fujian province, reported the China National Radio website. Zhang Bo, CEO of the electric car startup Nio’s Xiamen branch, said via the Nio app that they are cooperating with the investigation and assisting Lin’s family in the aftermath, according to The Paper.
Notably, car accidents triggered by autonomous driving have happened before and involved multiple carmakers. Among them, a Great Wall Motor’s Haval H9 rear-ended a truck when the driver used the adaptive cruise control system on Aug 4…” Read more Hmmmm… Scary! Alain
E. Wu, Aug 16, “Electric vehicle maker Nio tumbled as much as 7% on Monday on news that its autopilot system was involved in a fatal crash in China. The stock fell as low as $38.07 from Friday’s closing price of $41.03.
A Chinese businessman named Lin Wenqin reportedly died last week on Thursday while driving a Nio SUV called the ES8. Lin, who had activated the car’s autonomous autopilot system, was killed when the car rear-ended another vehicle, according to a report by China Daily.
The ES8 is equipped with several autonomous driving features, some of which are powered by tech from Intel and Nvidia. Intel was flat on the news, though Nvidia dipped almost 3% amid a broader market decline…” Read more Hmmmm… I doubt the “problem” is what “powers” the software. It may well be that the software is largely vaporware and not good enough to avert crashes, irrespective of the “power”. Alain
Chris Isidore, Aug 18, “Tesla’s Autopilot feature could soon face its second federal probe of the month.
The automaker's driver-assist and traffic-aware cruise control features are already under investigation by federal safety regulators for their role in a series of crashes. On Wednesday, Senators Richard Blumenthal of Connecticut and Edward Markey of Massachusetts asked the Federal Trade Commission to look into the claims made by Tesla and its CEO Elon Musk about Autopilot and the company's full self-driving, or FSD, feature. They want the FTC to determine whether those claims amount to deceptive marketing practices…." Read more Hmmmm… Again, they should focus on AEB. Alain
P. Lienert, Aug. 18, ” Aurora, the Silicon Valley self-driving startup founded by former Tesla, Uber and Google executives, has released what it says is the industry’s first tool for evaluating whether and when autonomous trucks and cars are safe to deploy on public roads without a human behind the wheel.
“We think this is the only way you can get to a safe, commercializable product,” said co-founder and CEO Chris Urmson of Aurora’s new Safety Case Framework.
Aurora, working with partners PACCAR (PCAR.O) and Volvo Group (VOLVb.ST), aims to put its self-driving system in commercial service in heavy-duty trucks in late 2023…." Read more Hmmmm… Certainly Chris needs this tool so he can decide when the risk is low enough to begin to capture the reward. He'll benefit from the reward, but he'll also need to be willing to step up and say, "If anything happens, it's on me!" (the risk). None of this should be thought of as a way for Chris, or Elon, or any other "benefiter" to pass the risk onto someone else. Alain
A. Hawkins, Aug. 18, “Waymo announced plans to build a hub for its autonomous semi-trailer trucks on a nine-acre site near Dallas-Fort Worth, Texas. The Alphabet-owned company also said it is partnering with rental truck company Ryder on fleet management as it looks to grow the delivery and logistics portion of its business.
The hub in South Dallas will be Waymo's "primary operations center" in the state for its fleet of autonomous trucks. The hub will be built to accommodate "hundreds of trucks and personnel" as the company gets closer to launching a full-scale freight-hauling operation using its fully autonomous vehicles, though Waymo has yet to say exactly when that will be. …" Read more Hmmmm… Can't be good news for TSP. Given a choice… Waymo or TSP… which would you choose? Alain
Shhhh, The Auto Industry Doesn’t Want You To Know About Tesla Model S Autopilot Vs. Mercedes S-Class Distronic
F. Clark, Aug 18, “I drive a Tesla Model S 90D that I bought new in 2016, and have since enjoyed 60,000 trouble-free miles. I love the vehicle both as my daily driver and for long road trips from New England to the Deep South. Electric drive notwithstanding, it is the best driving car I have ever owned. It outperforms and out-handles my Porsche 911, and out-comforts my Mercedes….
Finally, the big day arrived and I excitedly drove the Mercedes to my vaccine clinic destination about 50 miles away….
As I entered the highway and engaged my cruise and Distronic autosteer, I literally could not believe the difference between Tesla autosteer and Mercedes Distronic steering.
I drove the big Mercedes an hour to the west, and the Mercedes Distronic autosteer was worse than poor. It works, barely,…" Read more Hmmmm… Thanks to Mike Sheldrick for sending the above…
I was first on my block in 2014 to have an S-Class with Distronic Autosteer (the "997 package"). Couldn't wait to get it. It barely worked then. Daimler has never offered an over-the-air or in-the-shop upgrade for it. Daimler has tried to get me to buy a new one, but only because they claim they've improved gas mileage and I'll save a couple of hundred in gas if I buy a new one for more than $100k (no lie!!). They haven't suggested they've improved the "997" package (no lie!!). Now I know why Daimler hasn't… the auto-steer hasn't improved. I must admit that the intelligent cruise control part of Daimler's "997" package has worked extremely well for me. I hardly ever drive without using it. Alain
C’mon Man! (These folks didn’t get/read the memo)
Re-see: Pop Up Metro USA Intro 09 2020
K. Pyle, April 18, "It's time to hit the start button," is Fred Fishkin's succinct way of summarizing the next steps in the Smart Driving Car journey. Fishkin, along with the LA Times' Russ Mitchell, co-produced the final session of the 2021 Smart Driving Car Summit, Making It Happen: Part 2. This 16th and final session in this multi-month online conference not only provided a summary of the thought-provoking speakers, but also provided food for thought on a way forward to bring mobility to "the Trentons of the World."
Setting the stage for this final session, Michael Sena provided highlights of the Smart Driving Car journey that started in late December 2020. Safety, high-quality, and affordable mobility, particularly for those who do not have many options, was a common theme to the 2021 Smart Driving Car Summit. As Princeton Professor Kornhauser, the conference organizer put it,…..” Read more Hmmmm…. We had another excellent Session. Thank you for the summary, Ken! Alain
Ken Pyle‘s Session Summaries of 4th Princeton SmartDrivingCar Summit:
15th Session Making it Happen – Part One: Elected Officials’ Role in Creating a Welcoming Environment in the Trentons of this World
Kornhauser & He, April 2021 “Making it Happen: A Proposal for Providing Affordable, High-quality, On-demand Mobility for All in the “Trentons” of this World”
Orf467F20_FinalReport “Analyzing Ride-Share Potential and Empty Repositioning Requirements of a Nationwide aTaxi System“
Kornhauser & He, March 2021 “AV 101 + Trenton Affordable HQ Mobility Initiative“
Calendar of Upcoming Events
5th Annual Princeton SmartDrivingCar Summit
Live in Person
Tentatively: November 2 (evening) -> 4, 2021
R. Shields, 22 – 25 March, “Recordings from the conference:
Session 1 plus opening: (Regulatory): https://youtu.be/UcDC8gXiUFk
Session 2: (Cybersecurity): https://youtu.be/ppp2hxlvebY
Session 3: (Automated Driving Systems): https://youtu.be/uL2dRHuX2Cc
Session 4: (Communications for ADS) : https://www.youtube.com/watch?v=IFQcL6yfBso
Read more Hmmmm… Russ, thank you for sharing! Alain