R. Mitchell, Feb. 25, "The nation’s top safety investigator slammed Tesla on Tuesday for failing to take adequate measures to prevent “foreseeable abuse” of its Autopilot driver-assistance technology, in a hearing into the fatal 2018 crash of a Tesla Model X SUV in Mountain View, Calif.
The National Transportation Safety Board said 38-year-old Walter Huang, an Apple software engineer, had Autopilot engaged in his 2018 Tesla Model X and was playing a video game on his iPhone when the car crashed into a defective safety barrier on U.S. Highway 101. The board also blamed the highway safety arm of the U.S. Department of Transportation for failing to properly regulate rapidly evolving robot-car technology…. The board adopted a long list of measures meant to reduce such accidents as “partially automated driving” technologies become more popular in new vehicles….
NTSB Chairman Robert Sumwalt made clear the Mountain View crash was not an isolated incident, but illustrative of the safety issues involved as humans and robot systems increasingly share the driving, not just in Teslas but in vehicles from all manufacturers. “It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars,” he said.
"…the Model X drove straight down the middle of a “gore lane,” a white-striped zone where cars aren’t supposed to go…" It is clear from the images that the gore area was NOT white-striped as it is supposed to be, and the lane markings are badly worn. Why didn’t NTSB fault the CA DoT for its poor maintenance and marking practices? CA DoT needs to be severely reprimanded. "…a Toyota Prius crashed into it 11 days earlier…" To what extent did NTSB investigate the Prius crash? It didn’t have Autopilot, so that’s not the common factor. I suspect that the confusing lane markings and the lack of striping are the root cause…
…"The car’s collision avoidance system did not detect the crash barrier." I suspect that this is NOT true. The system detected the stationary object, but the coded logic disregards stationary objects (classifies them as false alarms) because false positives are too likely. NTSB made a similar error in the Joshua Brown crash, where the system didn’t mis-identify the stationary trailer ahead as background sky, but instead classified the stationary object in the lane ahead as a false positive. NTSB investigators have failed to ask the right questions in these investigations…
…"The car’s forward collision warning system did not provide an alert, and the automatic braking system did not activate." Again, the system classified stationary objects in the lane ahead as phantom objects and disregarded them. Once disregarded, there is no reason to initiate a warning or apply emergency braking. Yipes! Read more Hmmmm… Hopefully this will curtail the misbehavior in the use of these systems. Self-driving systems require constant adult supervision. I suspect that NHTSA will place extraordinarily onerous regulations on personally owned self-driving cars that will effectively ban the ability to sleep, play video games, text or otherwise be non-vigilant in all non-driverless vehicles. Driverless vehicles will be required to be operated and maintained by a responsible fleet manager and not have any straightforward way for a human to drive them. Certainly no steering wheel or pedals. I expect that they’ll also ban the use of Stupid-Summon-like systems outside of one’s own personal property. They should. Alain
F. Fishkin, Feb 27, "How a new generation of affordable LiDAR can make autonomous vehicles smarter and safer. RoboSense VP Leilei Shinohara joins Princeton’s Alain Kornhauser and co-host Fred Fishkin for that plus… the Tesla investigations, California’s latest autonomous reporting, Waymo, Michigan’s initiative and more." "Alexa, play the Smart Driving Cars podcast!" Ditto with Siri and Google Play… Alain
K. Wiggers, Feb 26, "This morning the California Department of Motor Vehicles released a batch of 2019 reports from the companies piloting self-driving vehicles in the state. By law, all companies actively testing autonomous cars on public roads in California are required to disclose the number of miles driven and how often human drivers were forced to take control of their vehicles, otherwise known as a “disengagement.”
Formally, the DMV defines disengagements as “deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle.” Critics say it leaves wiggle room for companies to withhold information about certain failures, like vehicles running red lights in order to avoid hitting pedestrians about to cross the street. But in lieu of federal rules, the reports offer one of the few metrics for comparing the industry’s pack leaders.
According to the DMV, AV permit holders — of which there are 60 — traveled approximately 2.88 million miles in autonomous mode on California’s public roads during the reporting period, an increase of more than 800,000 miles from the previous reporting cycle. Currently, 64 companies have valid permits to test autonomous vehicles with a safety driver on California public roadways, up from 48 companies in 2018. It’s worth noting that only five companies — Aurora, AutoX, Pony.ai, Waymo, and Zoox — have permits under the California Public Utilities Commission (CPUC) to transport passengers in autonomous vehicles, with Zoox receiving the first one in December 2018….
Waymo’s 153 cars and 268 drivers covered 1.45 million miles in California in 2019, eclipsing the company’s 1.2 million miles in 2018, 352,000 miles in 2017, and 635,868 miles in 2016. Indeed, it was a year of mileage milestones for the Alphabet subsidiary, which passed 1,500 monthly active riders in Phoenix, Arizona — the only state of the nine in which Waymo has driven where its commercial taxi service, Waymo One, is available.
"…Completely driverless rides remain available only to a “few hundred” riders in Waymo’s Early Rider program, the company says…" Read more Hmmmm… Very interesting. Spreadsheet of the 8,979 disengagement reports for 2019. CA_DMV site containing source data. CA_DMV Source page. Alain
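A rough back-of-the-envelope calculation using the two figures above (2.88M autonomous miles in the reporting period and 8,979 rows in the 2019 disengagement spreadsheet) gives a fleet-wide average miles-per-disengagement. This is an illustration only, not an official DMV statistic: as noted above, the disengagement definition leaves companies wiggle room, and per-company rates vary enormously, so the fleet-wide average should not be read as a safety metric.

```python
# Fleet-wide miles per disengagement across all CA AV permit holders, 2019.
# Inputs are the figures quoted in the DMV reporting above; the average is
# illustrative and not comparable across individual companies.
total_autonomous_miles = 2_880_000  # ~2.88M miles in autonomous mode, 2019
total_disengagements = 8_979        # rows in the 2019 disengagement spreadsheet

miles_per_disengagement = total_autonomous_miles / total_disengagements
print(f"~{miles_per_disengagement:.0f} miles per disengagement (fleet-wide)")
```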
Collision Between a Sport Utility Vehicle Operating With Partial Driving Automation and a Crash Attenuator Mountain View, California March 23, 2018 HWY18FH011
Report of Public Meeting, Feb 25, "This is a synopsis from the NTSB’s report and does not include the Board’s rationale for the conclusions, probable cause, and safety recommendations. NTSB staff is currently making final revisions to the report from which the attached conclusions and safety recommendations have been extracted. The final report and pertinent safety recommendation letters will be distributed to recommendation recipients as soon as possible. The attached information is subject to further review and editing to reflect changes adopted during the Board meeting. Executive Summary…"
R. Mitchell, Feb. 25, "On Jan. 21, 2019, Michael Casuga drove his new Tesla Model 3 southbound on Santiago Canyon Road, a two-lane highway that twists through hilly woodlands east of Santa Ana.
He wasn’t alone, in one sense: Tesla’s semiautonomous driver-assist system, known as Autopilot — which can steer, brake and change lanes — was activated. Suddenly and without warning, Casuga claims in a Superior Court of California lawsuit, Autopilot yanked the car left. The Tesla crossed a double yellow line, and without braking, drove through the oncoming lane and crashed into a ditch, all before Casuga was able to retake control.
Tesla confirmed Autopilot was engaged, according to the suit, but said the driver was to blame, not the technology. Casuga’s attorney, Mike Nelson in New York City, asked Tesla to release the data to show exactly what happened. Tesla refused, the suit claims, and referred Casuga and his lawyer to the car’s event data recorder, known as the black box. But the black box — a common feature in cars since the early 2000s — doesn’t record Autopilot data. Autopilot information is captured and stored separately, often sent over the airwaves to Tesla’s remote cloud computer repository.
Finding out who or what caused a car crash should be easier today. Cars have become computers on wheels, bristling with sensors, data processors and memory chips. The information “is significantly better and more potentially useful than ever,” said Jason Levine, executive director of the Center for Auto Safety, an advocacy group….
Only federal safety regulators have on-demand rights to car crash data collected onboard by the manufacturer but not on the black box. …
But safety experts say an over-aggressive attitude about intellectual property is getting in the way of basic safety assessment.
“Data associated with a crash, or even near-crash, of a vehicle operating under automated control should not be deemed proprietary,” said Alain Kornhauser, professor of operations research at Princeton University, a specialist in driverless technology and policy, and founder of the annual Smart Driving Car summit. “Safety must be a cooperative effort, not a ‘I know something you don’t know’ competitive play.”
…Musk has long insisted that his company’s Autopilot feature already is safer than the average vehicle on the highway and safer than Teslas driven without Autopilot engaged…. But Tesla has not released the data for safety researchers to evaluate. Statisticians at Rand Corp. and elsewhere have questioned his methodology, citing problems with sample size, sample representation and other issues. Princeton’s Kornhauser said he offered to do an independent safety evaluation on Tesla’s claims, using anonymized data. Tesla never responded to his invitation." Read more Hmmmm… What a shame. Tesla has safety data about the safety performance of its cars whose quality is way beyond what any of us would have ever dreamed of having. It is an enormous shame that they don’t anonymize the data and release it for all to see. We would all learn so much from these data. Alain
A. Kornhauser, Feb 6, "The focus of the Summit this year will be moving beyond the AI and the Sensors to addressing the challenges of Commercialization and the delivery of tangible value to communities. We’ve made enormous progress with the technology. We’re making the investment; however, this investment delivers value only if it is commercialized: made available and used by consumers in large numbers. Demos and one-offs are "great", but to deliver value that is anywhere near commensurate with the magnitude of the investment made to date, initial deployments need to scale. We can’t just have "Morgantown PRT Systems" whose initial deployment has been nothing but enormously successful for 45 years (an essentially perfect safety record, an excellent availability record and customer-valued mobility). Unfortunately, the system was never expanded or duplicated anywhere. It didn’t scale. It is a one-off.
Tests, demos and one-offs are nice niche deployments; however, what one really needs are initial deployments that have the opportunity to grow, be replicated and scale. In 1888, Frank Sprague successfully deployed a small electric street railway system in Richmond, Va., which became the reference for many other cities. "…By 1889 110 electric railways incorporating Sprague’s equipment had been begun or planned on several continents…" Substantial scaled societal benefits emerged virally from this technology. It was eventually supplanted by the conventional automobile, but for more than 30 years it delivered substantial improvements to the quality-of-life for many.
In part, the 4th Summit will focus on defining the "Richmond" of Affordable Shared-ride On-demand Mobility-as-a-Service. The initial Operational Design Domain (ODD) that safely accommodates Driverless Mobility Machines that people actually choose to use and becomes the envy of communities throughout the country. " Read more Hmmmm… Draft Program is in flux. Consider all named individuals as "Invited yet to be confirmed". Alain
RoboSense LiDAR Announced as a Finalist in Transportation & Logistics Category in the 2020 Edison Awards
Press release, Feb 13, "RoboSense’s automotive MEMS LiDAR “RS-LiDAR-M1” has been named a finalist in the Transportation & Logistics category for the 2020 Edison Awards. The Edison Awards, named after Thomas Alva Edison, recognize and honor the world’s best innovations and innovators…. The RoboSense RS-LiDAR-M1 is the world’s first and smallest MEMS Smart LiDAR Sensor to incorporate sensor hardware, AI perception algorithms, and IC chipsets, transforming conventional LiDAR sensors from an information collector to a complete data analysis and comprehension system, providing essential information for autonomous vehicle decision-making faster than ever before. The RS-LiDAR-M1 meets every automotive-grade requirement, including intelligence, low cost, stability, simplified structure and small size, vehicle body design friendliness, and algorithm-processed semantic-level perception output results…." Read more Hmmmm… Interesting and nice advancement. Alain
Markings Committee, Jan. 9, "The Markings Technical Committee (MTC) Automated Driving Systems (ADS) RFI Task Force has identified three areas where pavement markings can support automated driving systems: uniformity, quality, and maintenance. This proposal addresses the highest priority uniformity issues….
Pavement markings are the most often cited traffic control device that the automated driving industry references in terms of a highway infrastructure element to support the deployment of partial to full automated driving. However, the references were often vague with inadequate details for highway agencies to assess or even implement….
The proposed recommendations represent the highest needs from the automated driving community. They are automotive “Original Equipment Manufacturer” (OEM) neutral and will provide safer, more robust pavement marking detection rates resulting in fewer vehicles unintentionally leaving their lane (roadway departure crashes make up over half of all fatalities and serious injury crashes in the US). …" Read more Hmmmm… This is great. Good paint is what everyone needs, both human drivers and automated driving systems! Thank you! Alain
S. Lekach, Feb 20, "This week, McAfee security researchers released 18 months’ worth of research that demonstrates the ease with which a "smart" autonomous vehicle can be tricked into misreading and accelerating past speed limits. The finding that some strategically placed black tape on a speed limit sign could trip up a smart car equipped with Mobileye cameras (used for advanced driving systems) to go 85 mph instead of the 35 mph limit certainly seemed alarming.
But there were some major caveats to the research. Mainly, that for self-driving vehicles (i.e., cars reliant on computer control for driving versus a hybrid system like Tesla’s that relies on humans and software for piloting) this weakness discovered in older Teslas isn’t an actual issue.
You can check out McAfee’s successful hack, in which the car’s cruise control zooms past 35 mph, in the video above.
But before freaking out about all the ways self-driving and automated vehicles are doomed, first consider that the McAfee Advanced Threat Research team tested this model hack on two 2016 Teslas; newer Tesla models have since stopped using Mobileye cameras in favor of the company’s own proprietary cameras.
Also, the version of the Mobileye camera used in those models has been updated and that version is no longer susceptible to the hack…." Read more Hmmmm… So much for McAfee’s self-serving "hack". Alain
S. O’Kane. Feb 26, "The National Highway Traffic Safety Administration (NHTSA) has partially suspended the US operations of France’s EasyMile after a passenger in Ohio was injured while riding in one of the company’s self-driving shuttles. EasyMile can continue operating its shuttles while NHTSA investigates, but the company can’t carry any passengers.
EasyMile currently operates its self-driving shuttles in a handful of US cities, including in Columbus, Ohio, where two of them have been running along a nearly 3-mile loop in a residential area at speeds of up to 25 miles per hour, according to Reuters. But last week, one of those shuttles made an “emergency stop” from a speed of just 7 miles per hour, and a passenger fell out of their seat as a result…." Read more Hmmmm… Whew, this is harsh. I guess all these systems are going to require the use of seat belts. Alain
D. Eggert, Feb 25, "Gov. Gretchen Whitmer announced Tuesday that Michigan will have a mobility officer to coordinate all initiatives related to self-driving and connected cars, an effort she said will ensure the state is the go-to place for testing and producing vehicles of the future….
She also signed an executive order to establish the Michigan Council on Future Mobility and Electrification, an advisory group that will replace but function similarly to one created by a 2016 law. The council will be housed within the Department of Labor and Economic Opportunity instead of the Department of Transportation…." Read more Hmmmm… Not a bad move. DoTs tend to be too focused on providing pavement and bridges. They are also obsessed with Vehicle Miles Traveled rather than mobility Miles Traveled. A focus on people rather than vehicles changes many things. Alain
A. Hawkins, Feb 25, "Pony.ai, a self-driving startup based in Silicon Valley and Guangzhou, China, is deepening its ties to Toyota. The two companies announced a pilot program to test self-driving cars on public roads in two Chinese cities, Beijing and Shanghai. The Japanese auto giant plans to invest $400 million in Pony.ai, valuing the startup at $3 billion.
Pony.ai has been working with Toyota since 2019 on public autonomous vehicle testing. With this new investment, their relationship will become even closer, with the automaker and the startup “co-developing” mobility products like “mobility services.”
…Toyota, the world’s largest automaker, has largely kept quiet on its self-driving car program…." Read more Hmmmm… Toyota seemed to have shut down right after the Uber Herzberg crash. Maybe they are beginning to resurface. Alain
Tesla Model 3 Crushes Original Tesla Roadster — Like A $70,000–100,000 Car With Cost Of Toyota Camry Or Honda Accord
Z. Shahan, Feb 25, "As you may have seen by now, 7-year Tesla insider David Havasi and I have been getting together in recent months to talk about the deep history of Tesla from an insider (David) and outsider (me) perspective in a podcast and video chat series called Tesla Inside Out. Together we’ve decided that we really want to do two things with this series: 1) delve into funny, cool, interesting Tesla stories from years ago (that’s basically all David), and 2) discuss Tesla news of the day together (like friends at a coffee shop — but you’re invited), drawing on David’s history in the company and my historical perspective from covering the company for 7+ years…." Read more Hmmmm… What I was most impressed about was the chart showing the 85% drop in prices of Lithium-ion batteries in the last 10 years. While battery prices need to continue to drop, what’s been achieved is impressive! Alain
T. Kenney, Feb 19, "ARK estimated that consumers will be able to travel on autonomous ridehailing platforms for just $0.25 per mile when they reach scale in 2024, or less than half the cost of driving a personal car and roughly one tenth the cost of a taxi. With these compelling economics driving customer adoption, ARK’s research previously concluded that companies owning and operating the autonomous technology stacks – like Waymo and Tesla – could command a take-rate of 20-30% of revenues, similar to that for Uber and Lyft, and that investors should be willing to pay $2 trillion today for the winning platforms…" Read more Hmmmm… This is a combination of Click-bait and Half-baked. Yes, a cost of $0.25 per person mile is a reasonable value at scale, with 2.0 person miles served by each vehicle mile (average vehicle occupancy (AVO) ≥ 2.0) and vehicle mile costs of ~$0.50.
Scale happening by 2024 is VERY optimistic! The big question is what fare will achieve scale? An average fare of $0.50 per person mile may be achievable if the systems are welcoming, safe and anxiety-free. The chance of that occurring at scale (serving 10% of the nation’s daily person trips: 100M person trips/day, 500M person miles/day, delivered by 2M vehicles in operation) by 2024 is essentially zero. Even 1% of person trips is optimistic. By 2030 maybe, but only if there are no bumps on the road ahead. Each bump in the road (a crash generating safety concerns, for example) probably delays everything by 5 years per event. It is unlikely that Waymo will have the 82,000 cars that they have the option to buy operating driverlessly (without attendant) by 2024. (The cost of these systems with a safety driver on-board is at least twice that of a conventional taxi: >$7.00/vehicle mile.) However, once the 10% person trip share is achieved, $0.50/passenger mile could generate nice "profits" of ~$125M per day or ~$50B per year. Not bad. Alain
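The ~$125M/day "profit" figure above can be checked with a few lines of arithmetic. The inputs are the numbers quoted in the commentary (10% trip share ≈ 500M person miles/day, AVO of 2.0, a $0.50/person-mile fare, ~$0.50/vehicle-mile operating cost); the calculation itself is a sketch, ignoring take-rates, taxes and capital costs:

```python
# Back-of-the-envelope economics of scaled autonomous ride-hailing,
# using the figures quoted in the commentary above.
person_miles_per_day = 500e6   # 10% of daily person trips ~= 500M person miles/day
avo = 2.0                      # average vehicle occupancy (person miles per vehicle mile)
fare_per_person_mile = 0.50    # assumed average fare, $ per person mile
cost_per_vehicle_mile = 0.50   # assumed operating cost, $ per vehicle mile

vehicle_miles_per_day = person_miles_per_day / avo      # 250M vehicle miles/day
revenue_per_day = person_miles_per_day * fare_per_person_mile
cost_per_day = vehicle_miles_per_day * cost_per_vehicle_mile
profit_per_day = revenue_per_day - cost_per_day

print(f"daily profit: ${profit_per_day / 1e6:.0f}M")        # $125M/day
print(f"annual profit: ${profit_per_day * 365 / 1e9:.0f}B") # ~$46B/year (rounded to ~$50B above)
```

Note the fare/cost spread does all the work here: at AVO 2.0 the cost per person mile is $0.25, so a $0.50 fare yields a 50% margin before any platform take-rate.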
J. Heinis, Feb 25, "Jersey City’s on-demand public bus service is up and running for as cheap as $1 per ride and will remain at that price through March 21st for anyone willing to try an alternative to rideshares like Uber and Lyft…." Read more Hmmmm… How much will it be after March 21? Where will the subsidy come from??? Alain
C’mon Man! (These folks didn’t get/read the memo)
There are so many bad articles. I’m overwhelmed. C’mon Man! Alain
Calendar of Upcoming Events:
evening May 19 through May 21, 2020