N. Boudette, Aug 16, "In the race to develop driverless cars, several automakers and technology companies are already testing vehicles that pilot themselves on public roads. And others have outlined plans to expand their development fleets over the next few years.
But few have gone so far as to give a definitive date for the commercial debut of these cars of the future. Now Ford Motor has done just that. At a news conference on Tuesday at the company’s research center in Palo Alto, Calif., Mark Fields, Ford’s chief executive, said the company planned to mass produce driverless cars and have them in commercial operation in a ride-hailing service by 2021….
“That means there’s going to be no steering wheel. There’s going to be no gas pedal. There’s going to be no brake pedal,’’ he said. “If someone had told you 10 years ago, or even five years ago, that the C.E.O. of a major American car company is going to be announcing the mass production of fully autonomous vehicles, they would have been called crazy or nuts or both.”…
Ford also said it had acquired an Israeli start-up, Saips, that specializes in computer vision, a crucial technology for self-driving cars. And the automaker announced investments in three other companies involved in major technologies for driverless vehicles…." Read more Hmmm…This is significant because it implies that Ford (or an entity under its control) will operate and deliver, on a day-to-day basis, MaaS (Mobility as a Service). In other words, it will both build/assemble and operate mobility’s "Cloud". The scale economies of such a mobility "cloud" are arguably much more substantial than those of the data storage & computing "cloud". Think about it! Alain
D. Etherington, Aug 16, "… It’s also partnering exclusively with Nirenberg Neuroscience to bring more “humanlike intelligence” to the machine learning components of driverless car systems.
SAIPS’ technology brings image and video processing algorithms, as well as deep learning tech focused on processing and classifying input signals, all key ingredients in the special sauce that makes up autonomous vehicle tech. This company’s expertise should help with on-board interpretation of data captured by sensors on Ford’s self-driving cars, and turning that data into usable info for the car’s virtual driver system. SAIPS’ offerings include detection of anomalies, persistent tracking of objects detected by sensors, and much more. The company’s past clients include HP and Trax, but its partner group doesn’t appear to have included much in the way of driving-specific applications.
Ford … identified SAIPS as a potential target through a tech scouting operation it began in Israel in 2013, and quickly determined that the company’s machine learning expertise would help bolster its own efforts.
The Nirenberg partnership similarly takes research applied in a different area to the problems of full autonomous driving. Dr. Sheila Nirenberg’s research focuses on restoring sight to patients with degenerative retinal disease, but Ford thinks the tech can be used to help its virtual drivers greatly improve their own vision systems, and process information in ways similar to how human drivers would…." Read more Hmmm…Very promising. The race is on. Who is "the Usain Bolt"? Alain
J. Seewer, Aug 19, "Ohio’s toll road, a heavily traveled connector between the East Coast and Chicago, is moving closer to allowing the testing of self-driving vehicles.
Testing is likely to begin within 12 months, and possibly before the end of the year, the Ohio Turnpike’s executive director told The Associated Press.
Officials overseeing the roadway have spent more than a year looking at the possibilities, said Randy Cole, the turnpike’s director…." Read more Hmmm…Excellent!! The NJ Turnpike and the NYS Thruway should also be doing this for a host of excellent reasons. These are excellent roads that are largely amenable to self-driving and should offer to serve that capability in return for the tolls paid by the traveling public that has chosen to acquire the compatible technology. Just smart business sense. Alain
B. Vlasic, Aug 18, "…Uber also said it had acquired Otto, a 90-person start-up including former Google and Carnegie Mellon engineers that is focused on developing self-driving truck technology to upend the shipping industry…Uber plans to open a 180,000-square-foot facility in Palo Alto, Calif., to house Otto, which will operate as a stand-alone company focused specifically on upending the long-distance trucking industry. Otto engineers will also work out of offices in San Francisco and Pittsburgh.
But that talent and technology will apply more broadly to the technology behind Uber’s grander self-driving car efforts, Mr. Kalanick said. He said he believed that his company’s approach — a combination of teaming up with hardware manufacturers, Otto’s software expertise and a large network of more than 50 million monthly riders as recently as July — places Uber in the best position to be competitive with companies like Google." Read more Hmmm…Wow…I didn’t see that coming, especially after GM acquired Cruise. Congratulations Anthony and Lior. Alain
J. Bhuiyan, Aug 18, "If the fact that Uber acquired a self-driving trucking company for $680 million in stock along with an agreement that included giving the company 20 percent of its trucking profits shocked you, you’re not alone.
Uber and Otto aren’t exactly a natural fit. Uber has never once mentioned going into the long-haul trucking business or creating a logistics platform for truck drivers as one of its ambitions. On paper, Otto doesn’t yet need a company like Uber. The startup just launched out of stealth mode in May and had 91 employees. It built proprietary, autonomous technology and was already testing its self-driving technology in trucks on highways in San Francisco…" Read more Hmmm…Uber paid less for Otto than GM paid for Cruise. That’s a real Hmmmm! Alain
Press Release, Aug 18, "The two companies have signed an agreement to establish a joint project that will develop new base vehicles that will be able to incorporate the latest developments in AD technologies, up to and including fully autonomous driverless cars. The base vehicles will be manufactured by Volvo Cars and then purchased from Volvo by Uber. Volvo Cars and Uber are contributing a combined USD 300M to the project.
Both Uber and Volvo will use the same base vehicle for the next stage of their own autonomous car strategies. This will involve Uber adding its own self-developed autonomous driving systems to the Volvo base vehicle. Volvo will use the same base vehicle for the next stage of its own autonomous car strategy, which will involve fully autonomous driving."…Read more Hmmm…Each is using its own "autonomous car strategy" on the same base vehicle?? Seems to imply that the real intellectual property is in the "autonomous car strategies" and that the "base vehicle" is textbook. Alain
The autonomous cars, launching this summer, are custom Volvo XC90s, supervised by humans in the driver’s seat.
M. Chafkin, Aug 18, "Starting later this month, Uber will allow customers in downtown Pittsburgh to summon self-driving cars from their phones, crossing an important milestone that no automotive or technology company has yet achieved. …
In Pittsburgh, customers will request cars the normal way, via Uber’s app, and will be paired with a driverless car at random. Trips will be free for the time being, rather than the standard local rate of $1.05 per mile."…Read more Hmmm…Some amount of "sleight-of-hand" here. This is about Self-driving and NOT Driverless, so it doesn’t solve Uber’s Labor "challenge". HOWEVER, it is a very elegant way for both Uber and Volvo to give demonstrations of self-driving rides to the general public all the while assessing and learning from the customer response. This would have to be done in the initial stages even if the cars were actually capable of driverless operation as was done in the initial stages of the driverless Heathrow Terminal 5 podcars. The same must have been done when Otis first put in automated elevators. And I suspect that the elevators in NYC’s Tiffany & Co. can also operate operatorless (last time I was there they had operators 🙂 ). Alain
Aug 10, "The French government has announced that it will allow car companies to test self-driving cars on public French roads, reports Designyourworld.
The change in policy is an element of the New Industrial France initiative, which aims to energize the country’s industrial and manufacturing sectors…" Read more Hmmm…Another small step forward. Alain
M. Chafkin, Aug 17, "History is being made in Helsinki’s Hernesaari district, as automatic buses take to the streets. Commuters and motorists will have to get used to seeing a pair of driverless mini-buses negotiating traffic in the area as the city tests the robot vehicles through mid-September.
The pilots are among the first in the world, since Finnish laws don’t require vehicles on the road to have a driver. This has made it easier for officials to get the required green light from the transport safety authority Trafi….
While Helsinki may be one of the first cities in the world to let loose the robot buses on the streets, it is not the first Finnish city to do so. Last year neighbouring Vantaa rolled out similar vehicles during its housing fair, although they only operated on routes shut off from other traffic at the time.
By contrast, Santamala considers the test track in Hernesaari to be a challenging traffic environment, because it is constantly changing. Motorists who may be prone to road rage will also have to keep their cool navigating traffic alongside – or behind – the robot buses, whose average speed is about ten kilometres an hour…." Read more Hmmm…Another small step forward. Alain
Some other thoughts that deserve your attention
M. Isaac, Aug 18, " A federal judge on Thursday struck down a proposed class-action settlement between Uber and a group of its current and former drivers, potentially continuing a protracted lawsuit that questioned a key tenet of the ride-hailing company’s business.
Under a settlement forged in April, Uber had been set to pay up to $100 million in reimbursement damages to nearly 400,000 drivers…Judge Edward M. Chen ruled that the April settlement was “not fair, adequate, and reasonable” as grounds for denial. He also said a small portion of the $100 million amount reflects only 0.1 percent of the potential full verdict value of the case…As part of the settlement agreement, Uber also made other concessions, like recognizing and speaking with quasi-unions of its drivers in California and Massachusetts. It also allowed drivers to accept tips at the end of each ride…."Read more Hmmm…Really bad when the judge rules settlement was “not fair, adequate, and reasonable”. Poor Uber drivers. Not only are they not wanted, they’re not represented…. but they were going to be allowed to accept tips! Alain
July 29, " Red light camera programs in 79 large US cities saved nearly 1,300 lives through 2014, researchers from the Insurance Institute for Highway Safety (IIHS) have found. Shutting down such programs has cost lives, with the rate of fatal red-light-running crashes shooting up 30 per cent in cities that have turned off cameras.
…."Read more Hmmm…Read the original study: W. Hu & J. Cicchino, Effects of Turning On and Off Red Light Cameras on Fatal Crashes in Large U.S. Cities. It is a reasonably good study (especially the discussion starting on p16), but it did not account for what seems to be an increase in distracted driving in the last 10 years (see especially Figure 1, p12, and the rise of the open circles since 2010). Also, assume that there are two main reasons why drivers run red lights: 1. they want to, or 2. they were clueless (they didn’t see it, for whatever reason). Cameras/fines address #1 (from some perspectives the means (fines) have Draconian overtones, especially in poor communities where a traffic fine can spiral into total ugliness, way beyond its regressiveness). No one should run a red light (unless it is the middle of the night, there is zero traffic, and you’ve stopped, looked both ways and deemed it safe to proceed). So once your behavior has been changed by the red light camera, why does that behavior revert once the cameras are turned off? Is it because the sign designating the intersection as camera-enforced has been removed? (How prominent were those signs in the first place?) Then don’t remove the sign; but would ITS still be a fan? Alain
Half-baked stuff that probably doesn’t deserve your time:
Older stuff that I had missed:
Tesla responds to ‘cover-up’ claims in ‘Montana Autopilot Accident’, offers more details on investigation
F. Lambert, July 23 "…Here’s Tesla official response in full:…" Read more Hmmm…Speaks for itself. Alain
C’mon Man! (These folks didn’t get/read the memo)
Calendar of Upcoming Events:
Recent Highlights of:
J. Markoff, Aug 5, " A roboticist and crucial member of the team that created Google’s self-driving car is leaving the company, the latest in a string of departures by important technologists working on the autonomous car project.
Chris Urmson, a Carnegie Mellon University research scientist, joined Google in 2009 to help create the then-secret effort. …Mr. Urmson has been unhappy with the direction of the car project under Mr. Krafcik’s leadership and quarreled privately several months ago with Larry Page over where it was headed, according to two former Google employees….
Mr. Urmson said he had not decided what he will do next. “If I can find another project that turns into an obsession and becomes something more, I will consider myself twice lucky,” he wrote. Read more Hmmm…Very unfortunate. What a great job he has done. All the best. Alain
M. Ramsey, July 26, " A key supplier of semiautonomous car technology ended a supply agreement with Tesla Motors Inc. following a high-profile traffic fatality in May involving one of the Silicon Valley company’s electric vehicles.
Mobileye NV said it would no longer provide its computer chips and algorithms to Tesla after a current contract ends due to disagreements about how the technology was deployed. Mobileye provides core technology for Tesla’s Autopilot system, which allows cars to drive themselves in limited conditions….Read more Hmmm….Very interesting!! Alain
And in Mobileye’s Short Trip with Tesla: D. Gallagher, July 26, "In the emerging business of autonomous driving, even the safer road isn’t free of potholes….In explaining its move, Mobileye suggested that protecting its reputation was at least part of the rationale. Below is what the company said on the call:… Read more Hmmm….And why in all of this isn’t there a discussion of Automated Emergency Braking (AEB) technology/suppliers?? There must be no consumer/regulatory appeal to AEB? Alain
E. Musk, July 20 "…Integrate Energy Generation and Storage
Create a smoothly integrated and beautiful solar-roof-with-battery product that just works, empowering the individual as their own utility, and then scale that throughout the world. One ordering experience, one installation, one service contact, one phone app….
Expand to Cover the Major Forms of Terrestrial Transport…
With the Model 3, a future compact SUV and a new kind of pickup truck, we plan to address most of the consumer market. A lower cost vehicle than the Model 3 is unlikely to be necessary, because of the third part of the plan described below.
What really matters to accelerate a sustainable future is being able to scale up production volume as quickly as possible. That is why Tesla engineering has transitioned to focus heavily on designing the machine that makes the machine — turning the factory itself into a product….In addition to consumer vehicles, there are two other types of electric vehicle needed: heavy-duty trucks and high passenger-density urban transport. Both are in the early stages of development at Tesla…With the advent of autonomy, it will probably make sense to shrink the size of buses and transition the role of bus driver to that of fleet manager. Traffic congestion would improve due to increased passenger areal density by eliminating the center aisle and putting seats where there are currently entryways, and matching acceleration and braking to other vehicles, thus avoiding the inertial impedance to smooth traffic flow of traditional heavy buses. It would also take people all the way to their destination. Fixed summon buttons at existing bus stops would serve those who don’t have a phone. Design accommodates wheelchairs, strollers and bikes.
As the technology matures, all Tesla vehicles will have the hardware necessary to be fully self-driving with fail-operational capability, meaning that any given system in the car could break and your car will still drive itself safely. It is important to emphasize that refinement and validation of the software will take much longer than putting in place the cameras, radar, sonar and computing hardware.
Even once the software is highly refined and far better than the average human driver, there will still be a significant time gap, varying widely by jurisdiction, before true self-driving is approved by regulators….I should add a note here to explain why Tesla is deploying partial autonomy now, rather than waiting until some point in the future. The most important reason is that, when used correctly, it is already significantly safer than a person driving by themselves and it would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability….It is also important to explain why we refer to Autopilot as "beta"….
When true self-driving is approved by regulators, it will mean that you will be able to summon your Tesla from pretty much anywhere. Once it picks you up, you will be able to sleep, read or do anything else enroute to your destination. You will also be able to add your car to the Tesla shared fleet just by tapping a button on… Read more Hmmm….This is a chock-full vision that sounds pretty good to me (and doesn’t have a mention of DSRC, V2V or V2x 🙂 ); except, do I really want to invest to become a "Tesla (AirBnB) Host" or simply use the "Mobility-on-Demand Transit System" (MoDTS) that Tesla or ALK or ???? would operate? (Unfortunately NJ Transit, the obvious MoDTS operator, will pass.) Alain
S. Musil, July 12, "The most recent crash involved a Model X near the small town of Whitehall, Montana, on Sunday morning, according to the Detroit Free Press. Neither the driver nor the passenger was injured in the single-vehicle crash, the Montana Highway Patrol told the newspaper….The car failed to detect an obstacle in the road, according to a thread posted on the Tesla Motors Club forum by someone who said they’re a friend of the driver. The thread included photos showing the damage to the vehicle.
Tesla said Tuesday that it appears the driver in the crash was using the system improperly.
"As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel. He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway," the spokesman said. He added that the Autopilot feature was being used on an undivided mountain road despite being designed for use on a divided highway in slow-moving traffic….Read more Hmmm….Interesting that Tesla didn’t say the car began to slow down, as it is supposed to if the driver does not put his hands back on the wheel!!!! Lane-centering should NOT simply turn off if the driver does not respond. (I believe the Mercedes "997 package" turns off lane-centering if you don’t respond to the buzzer 🙁 . However, since the lane-centering on my 2014 S-550 only works if the lane is essentially perfectly straight, and Mercedes has never made an effort to fix/update my software, I rarely take my hands off the wheel. The system is so poor that I can’t tell whether lane-centering is just not working or the buzzer turned it off. 🙁 ) What should happen instead is that the car should turn on its emergency flashers, slow down at a rate proportional to the quality of the road conditions, and, once it reaches a slow enough speed, determine whether a lane change to the right (in the US and elsewhere that drives on the right) is safe or a clear shoulder to the right is available. If so, it should make the lane change and come to a complete stop, all the while announcing to the driver what it is doing because hands have not been put back on the wheel. After stopping, "AutoPilot" should be turned off, as should "AutoPilot" privileges, until a "Tesla" representative resets the system. If that doesn’t convince the driver to keep hands on wheel, then the car has at least averted a possible catastrophe associated with a comatose driver. Alain
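The driver-unresponsive fallback sketched above (flashers on, proportional slowdown, safe pull-over, then lock-out until reset) can be expressed as a small state machine. The sketch below is purely illustrative: the state names, thresholds, and function are my own invention, not Tesla's actual logic.

```python
from enum import Enum, auto

class FallbackState(Enum):
    NORMAL = auto()
    WARNING = auto()              # buzzer sounded, waiting for hands on wheel
    CONTROLLED_SLOWDOWN = auto()  # flashers on, decelerating
    PULLING_OVER = auto()         # safe lane change / move to clear shoulder
    STOPPED = auto()              # AutoPilot privileges revoked until reset

def next_state(state, hands_on_wheel, speed_kph, shoulder_clear):
    """One step of the hypothetical fallback. Rather than simply switching
    lane-centering off, the system keeps control, slows the car, and only
    stops once it is somewhere safe. The 30 km/h threshold is illustrative."""
    # Hands back on the wheel cancel the fallback, but only before the
    # controlled slowdown has begun; after a stop, a reset is required.
    if hands_on_wheel and state in (FallbackState.NORMAL, FallbackState.WARNING):
        return FallbackState.NORMAL
    if state == FallbackState.NORMAL:
        return FallbackState.WARNING
    if state == FallbackState.WARNING:
        return FallbackState.CONTROLLED_SLOWDOWN   # flashers on, decelerate
    if state == FallbackState.CONTROLLED_SLOWDOWN:
        # Slow enough to judge a lane change or shoulder move safely?
        if speed_kph <= 30 and shoulder_clear:
            return FallbackState.PULLING_OVER
        return FallbackState.CONTROLLED_SLOWDOWN
    if state == FallbackState.PULLING_OVER:
        return FallbackState.STOPPED               # locked out until reset
    return state
```

The key design point is the last transition: once stopped, putting hands back on the wheel no longer restores AutoPilot, which is the "privileges revoked" idea above.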
Editorial Board, July 11, "A recent fatal crash in Florida involving a Tesla Model S is an example of how a new technology designed to make cars safer could, in some cases, make them more dangerous. These risks, however, could be minimized with better testing (Hmmm….Yes!) and regulations (Still too early, we don’t know enough, yet)…Tesla’s electric cars are not self-driving, but when the Autopilot system is engaged it can keep the car in a lane, adjust its speed to keep up with traffic and brake to avoid collisions. Tesla says audio and visual alerts warn drivers to keep their hands on the steering wheel and watch the road. If a driver is unresponsive to the alerts, the car is programmed to slow itself to a stop.
Such warnings aren’t sufficient, though; some Tesla drivers, as shown in videos on YouTube, have even gotten into the back seat while the car was moving. Such reckless behavior threatens not just the drivers but everyone else on the road, too. (Absolutely!)… If that system (V2V) had been in place, Mr. Brown might have survived. (Sure, but Mr Brown would have had to wait more than his normal expected life span before that system would have been adopted by more than 70% of all vehicles for it to have better than a "coin flip" chance of helping him. What would have helped Mr. Brown is if the Automated Emergency Braking system worked on his Tesla, or if the truck driver had seen him coming (not become distracted) and had not "failed to yield". ) Federal officials could take lessons from the history of airbags and the lack of strong regulations. (This is a VERY appropriate and relevant lesson!)… The agency does not yet have regulations for driverless cars or cars that have driver assistance systems. But when officials do put rules in place, they will have to update them regularly as they learn about how the technology works in practice. Automation should save lives. But nobody should expect these vehicles to be risk-free. (This is very wise. They should also immediately focus on Automated Emergency Braking systems which are the foundation of any Self-driving or Driverless systems. ) Read more Hmmm….Comments in-line above. Alain
May 7 Crash
Hmmm…What we know now (and don’t know):
1. On May 7, 2016 at about 4:40pm EDT, there was a crash between a Tesla and a Class 8 Tractor-Trailer. The accident is depicted in the Diagram from the Police Report: HSMV Crash Report # 85234095. (1) Google Earth images from the site.
2. The driver of the Tesla was Joshua Brown. "No citations have been issued, but the initial accident report from the FHP indicates the truck driver "failed to yield right-of-way."" (2) . Hmmm….No citations??? Did the truck have a data recorder? Was the truck impounded, and if so, how is the truck driver making a living since the crash? Why was his truck not equipped with sensors that can warn him of collision risks at intersections? As I’ve written, driving is one of the most dangerous occupations. Why isn’t OSHA concerned about improving the environment of these workers? Why doesn’t ATRI (the American Trucking Association’s research arm) recognize the lack of availability/adoption of "SmartDrivingTruck technology" as one of its Critical Issues? Why didn’t his insurance agent encourage/convince him to equip his truck with collision risk sensors? If they aren’t commercially available, why hasn’t his insurance company invested in/promoted/lobbied for their development? These low-volume rural highway intersections are very dangerous. Technology could help.
"…(the truck driver)…said he saw the Tesla approaching in the left, eastbound lane. Then it crossed to the right lane and struck his trailer. "I don’t know why he went over to the slow lane when he had to have seen me,” he said…." (2) . Hmmm….If the driver saw the Tesla change lanes, why did he "fail to yield right-of-way"???
"…Meanwhile, the accident is stoking the debate on whether drivers are being lulled into a false sense of security by such technology. A man who lives on the property where Brown’s car came to rest some 900 feet from the intersection where the crash occurred said when he approached the wreckage 15 minutes after the crash, he could hear the DVD player. An FHP trooper on the scene told the property owner, Robert VanKavelaar, that a "Harry Potter" movie was showing on the DVD player, VanKavelaar told Reuters on Friday.
Another witness, Terence Mulligan, said he arrived at the scene before the first Florida state trooper and found "there was no movie playing." "There was no music. I was at the car. Right at the car," Mulligan told Reuters on Friday.
Sergeant Kim Montes of the Florida Highway Patrol said on Friday that "there was a portable DVD player in the vehicle," but wouldn’t elaborate further on it. She also said there was no camera found, mounted on the dash or of any kind, in the wreckage….
…Mulligan said he was driving in the same westbound direction as the truck before it attempted to make a left turn across the eastbound lanes of U.S. Highway 27 Alternate when he spotted the Tesla traveling east. Mulligan said the Tesla did not appear to be speeding on the road, which has a speed limit of 65 miles per hour, according to the FHP…." (2) .
3. "…the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents…" (3). Not sure how Tesla knows what Joshua Brown saw or did not see. Events prior to the crash unfolded over many seconds. Tesla must have precise data on the car’s speed and steering angle, video for the many seconds prior to the crash, as well as what it was "seeing" from MobilEye’s cameras and radar data. At no time prior to the crash did it see anything crossing its intended travel lane? More important, why didn’t the truck driver see the Tesla? WHAT WAS HE DOING? What was the truck doing? How slow was it going? Hopefully there was a speed/data recorder on the truck. Was the truck impounded, and if so, how is the truck driver making a living since the crash?
One can also ask: Why was the truck not equipped with sensors that can warn the driver of collision risks at intersections? As I’ve written, driving is one of the most dangerous occupations. Why isn’t OSHA concerned about improving this workplace environment? Why doesn’t ATRI (the American Trucking Association’s research arm) recognize the lack of availability/adoption of "SmartDrivingTruck technology" as one of its Critical Issues? Why didn’t the driver’s insurance agent encourage/convince him to equip his truck with collision risk sensors? If they aren’t commercially available, why hasn’t his insurance company invested in/promoted/lobbied for their development? These low-volume rural highway intersections are very dangerous. Technology could help.
While the discussion is about AutoPilot, the Tesla also has Automated Emergency Braking (AEB), which is supposed to always be on. This seems more like an AEB failure than an AutoPilot failure; the Tesla didn’t just drive off the road. The discussion about "hands-on-wheel" is irrelevant. What was missing was "foot-on-brake" by the Tesla driver and, most importantly, "eyes-on-road" by the truck driver, since he initiated an action in violation of the "rules of the road" that may have made a crash unavoidable.
3. "Problem Description: A fatal highway crash involving a 2015 Tesla Model S which, according to Tesla, was operating with automated driving systems (“Autopilot”) engaged, calls for an examination of the design and performance of any driving aids in use at the time of the crash." (4). Not to be picky, but the initiator of the crash was the failure to yield by the truck driver. Why isn’t this human failure the most fundamental "Problem Description"? If "driving aids" were supposed to "bail out" the truck driver’s failure to yield, why isn’t the AEB system’s "design and performance" being examined? AutoPilot’s responsibility is to keep the Tesla from steering off the road (and, as a last resort, to yield to the AEB). The focus should be on AEBs. How many other Tesla drivers have perished who didn’t have AutoPilot on but had AEB? How many drivers of other cars that have AEB have perished? This crash seems to be more about an automated emergency system failing to apply the brakes than about a driver not having his hands on the wheel. Unfortunately, it is likely that we will eventually have a fatality in which an "AutoPilot" fails to keep a "Tesla" on the road (or in a "correct" lane), but from what is known so far, this does not seem to be that crash.
4. "What we learn here is that Mobileye’s system in Tesla’s Autopilot does gather the information from the vehicle’s sensors, primarily the front-facing camera and radar, but while it gathers the data, Mobileye’s tech can’t (or not well enough until 2018) recognize the side of vehicles and therefore, it can’t work in a situation where braking is required to stop a Tesla from hitting the side of another vehicle.
Since Tesla pushed its 7.1 update earlier this year, the automaker’s own system used the same data to recognize anything, under adequate conditions, that could obstruct the path of the Tesla and if the radar’s reading is consistent with the data from the camera, it will apply the brakes.
Now that’s something that was put to the test by Model S owners earlier in the week:" (4). See video, "In the last two tests, the Autopilot appears to detect an obstacle as evidenced by the forward collision warning alerts, but the automatic emergency braking didn’t activate, which raised questions – not unlike in the fatal crash.
Though as Tesla explained, the trailer was not detected in the fatal crash (the radar mistook it for an overhead sign), in the tests above the forward collision warning system did send out an alert; as evidenced by the fact that the test subject wasn’t hit, the AEB didn’t need to activate and therefore it didn’t. Tesla explains:
“AEB does not engage when an alternative collision avoidance strategy (e.g., driver steering) remains viable. Instead, when a collision threat is detected, forward collision warning alerts the driver to encourage them to take appropriate evasive action. AEB is a fallback safety feature that operates by design only at high levels of severity and should not be tested with live subjects.”…" Read more (5) With all of the expertise that MobilEye has in image processing, it is surprising that it can’t recognize the side of a tractor trailer or gets confused by overhead signs and tunnel openings. If overhead signs (and overpasses and tree canopies) are really the issue, then these can be readily geocoded and included in the digital map database.
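The two behaviors described above, braking only when radar and camera agree (to guard against "overhead sign" false positives) and AEB firing only as a last resort when no alternative avoidance remains, can be sketched together in a few lines. Everything below (the function name, return values, and the 1.5-second threshold) is a made-up illustration, not Mobileye's or Tesla's actual logic.

```python
def should_brake(camera_sees_obstacle, radar_sees_obstacle,
                 time_to_collision_s, driver_can_steer_around):
    """Hypothetical decision gate combining the two ideas above:
    (1) treat an obstacle as real only when camera and radar agree,
        so a radar-only return (e.g. an overhead sign) is ignored; and
    (2) fire AEB only when no alternative collision avoidance
        (e.g. driver steering) remains viable and impact is imminent.
    Returns 'none', 'forward_collision_warning', or 'emergency_brake'."""
    if not (camera_sees_obstacle and radar_sees_obstacle):
        # Sensors disagree: assume a false positive such as an overhead sign.
        return "none"
    if driver_can_steer_around or time_to_collision_s > 1.5:
        # A way out still exists: warn the driver instead of braking.
        return "forward_collision_warning"
    return "emergency_brake"
```

Note how this sketch also reproduces the failure mode discussed above: if the camera fails to classify the side of a trailer as an obstacle, the agreement gate suppresses braking even though the radar saw something.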
5. It seems that all of the other stuff about the DVD player, watching movies, and previous postings on YouTube is noise. Automated Collision Avoidance Systems and their Automated Emergency Braking sub-systems MUST be more robust at mitigating "failed to yield right-of-way" situations, irrespective of whether the "failure to yield" derived from a human action (as seems to have occurred in this crash) or from an "AutoPilot" (which doesn’t seem to be the case in this crash). Alain
(1) Self-Driving Tesla Was Involved in Fatal Crash, U.S. Says, June 30 NYT,
(2) DVD player found in Tesla car in fatal May crash, July 1, Reuters
(3) A Tragic Loss, June 30, Tesla Blog
(5) Tesla elaborates on Autopilot’s automatic emergency braking capacity over Mobileye’s system, July 2, 2016, Electrek. See also: Understanding the fatal Tesla accident on Autopilot and the NHTSA probe, July 2, 2016, and Tesla Autopilot partner Mobileye comments on fatal crash, says tech isn’t meant to avoid this type of accident [Updated], July 1.
Sunday, May 15, 2016
Chenyi Chen PhD Dissertation, "…the key part of the thesis, a direct perception approach is proposed to drive a car in a highway environment. In this approach, an input image is mapped to a small number of key perception indicators that directly relate to the affordance of a road/traffic state for driving….." Read more Hmmm…FPO 10:00am, May 16, 120 Sherrerd Hall, Establishing a foundation for image-based autonomous driving using Deep Learning Neural Networks trained in virtual environments. Very promising. Alain
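As a toy illustration of the direct perception idea, suppose the perception network has already mapped a camera image to two affordance indicators, heading error and lateral offset; a trivial controller can then turn those indicators into a steering command. The indicators chosen, the gains, and the sign conventions below are my own illustrative assumptions, not taken from the dissertation.

```python
def steer_from_affordances(angle_to_road_deg, dist_to_center_m,
                           k_angle=0.8, k_dist=0.3):
    """Toy controller for the 'direct perception' pipeline: instead of
    mapping pixels directly to controls, a network outputs a handful of
    affordance indicators and a simple rule drives on them.
    Convention (illustrative): positive inputs mean the car is pointed or
    positioned right of the lane center; positive output steers left.
    Output is clipped to [-1, 1]."""
    cmd = k_angle * (angle_to_road_deg / 45.0) + k_dist * dist_to_center_m
    return max(-1.0, min(1.0, cmd))
```

The appeal of the approach, as the abstract suggests, is that the learned mapping is compact and interpretable: a few physically meaningful numbers sit between the image and the control law.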
Hearing focus of SF 2569 Autonomous vehicles task force establishment and demonstration project for people with disabilities
U.S. DOT and IIHS announce historic commitment of 20 automakers to make automatic emergency braking standard on new vehicles
Video similar to part of Adam’s Luncheon talk @ 2015 Florida Automated Vehicle Symposium on Dec 1. Hmmm … Watch Video especially at the 13:12 mark. Compelling; especially after the 60 Minutes segment above! Also see his TipRanks. Alain
This list is maintained by Alain Kornhauser and hosted by the Princeton University LISTSERV.