Site Administrator, Forum General Manager, BSOD Kernel Dump Expert
- Feb 19, 2012
- New Jersey Shore
Waymo is owned by Alphabet, Inc., i.e., Google.
Waymo Rolls Out First Phase of Commercial Autonomous Ride-Hailing - Rental Operations - Auto Rental News
Waymo Rolls Out First Phase of Commercial Autonomous Ride-Hailing Service
Waymo launched the next phase of its self-driving ride-hailing service in Phoenix today.
For now, only a few hundred people will have access to the Waymo One app and ride-hailing service, which will feature a safety driver during all rides. Vehicles will be available 24/7 and can be taken to several cities in the Phoenix metropolitan area, including Chandler, Tempe, Mesa, and Gilbert.
Riders will see a price estimate in the Waymo app before they request their ride.
Waymo officials said in a company blog post that they hope to make Waymo One available to more people as they add vehicles and service areas.
"There's a long journey ahead, but we believe that Waymo One will make the roads safer and easier for everyone to navigate," Waymo's CEO John Krafcik wrote in a Medium post.
Waymo will also continue its early rider program, which has been in operation in the Phoenix area since 2017. Unlike those who are part of the Waymo One program, early riders will be required to continually provide feedback to Waymo on their ride experiences.
Waymo, owned by Google parent Alphabet Inc., is also currently working on expanding its autonomous vehicle testing in California's Bay Area.
Related: California Approves Waymo's Autonomous Vehicle Testing
What are your views on driverless/self-driving cars?
Please post your responses, as I would like to know your views as well as any personal interaction you've had with these cars. I'm also sure that many other people would be interested in whatever else you care to contribute.
I know of at least two fatalities here in the USA that I believe could have been avoided if the person behind the wheel of the driverless/self-driving car had been paying attention.
- Self-Driving Uber Car Kills Pedestrian in Arizona, Where Robots Roam (19 March 2018) - Self-Driving Uber Car Kills Pedestrian in Arizona, Where Robots Roam - The New York Times
- Uber Suspends Tests of Self-Driving Vehicles After Arizona Crash (unrelated to above story) - (25 March 2017) - Uber Suspends Tests of Self-Driving Vehicles After Arizona Crash - The New York Times
- Tesla's Self-Driving System Cleared in Deadly Crash - (19 January 2017) - https://www.nytimes.com/2017/01/19/business/tesla-model-s-autopilot-fatal-crash.html?module=inline
From what I've read elsewhere, a self-driving car will always take the path of least resistance to minimize damage to itself (the car) and the so-called driver. So, if the car is about to smash into the rear end of a cement mixer but determines that the right lane is clear (it did not see the cyclist there, for *whatever* reason), the system will immediately move the car into the right lane to avoid the cement mixer in its forward path, never noticing that there was a bicyclist in the right lane and that the car would definitely hit the bike and the person on it.
However, the driver of the driverless/self-driving car would have very minimal physical injuries, if any, and his car would also sustain minimal damage. So, the risk of hitting the extremely heavy cement mixer was avoided.
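To make the cement-mixer scenario concrete, here is a toy sketch of the kind of cost-minimizing choice I'm describing. Everything in it is hypothetical -- the function names and cost numbers are invented, and a real autonomous-driving stack is vastly more sophisticated -- but it shows why an undetected cyclist never even enters the calculation:

```python
# Toy sketch (hypothetical) of a cost-minimizing evasive-maneuver chooser.
# If the sensors never registered the cyclist, the cyclist simply never
# appears in the cost estimates -- which is exactly the danger.

def choose_maneuver(options):
    """Return the maneuver with the lowest estimated cost to the car/occupant."""
    return min(options, key=options.get)

# What the car "sees": the right lane looks clear (cyclist not detected),
# so swerving right appears far cheaper than rear-ending a cement mixer.
perceived = {
    "brake_hard_behind_mixer": 90,  # severe damage to car and occupant
    "swerve_right": 5,              # lane believed clear
}
print(choose_maneuver(perceived))  # -> swerve_right
```

The point of the sketch: the choice is only as good as the perception feeding it. Fix the cost table (i.e., detect the cyclist) and the same logic picks a different maneuver.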
I have no idea whether driverless cars utilize AI (Artificial Intelligence) software or not. I first encountered AI back in the late 1980s and honestly could not see any great benefit of it over a conventionally programmed application that would learn patterns of behavior, select "favorites" from a favorites-type file, etc. I was in the dark as to why companies like my employer, E. I. DuPont de Nemours & Co., Inc., were spending hundreds of thousands of dollars a year to purchase AI software. What I know now is that within five years, the AI software was scrapped and the programmers and Systems Analysts were all back to work writing new programs to take its place.
I certainly hope that things have changed for the better today.
Why someone would be in this position (the cement mixer/bicycle story) to begin with is not really relevant here. I can recall times when I've had to swerve into the lane next to me to avoid an accident, and I would have to say there was zero time to check whether a cyclist (or anyone else) was coming up in the lane I moved into. I just thank God that all of my close calls have worked out with no damage or injuries to anyone.
I think that self-driving cars are a long way from being 100% approved by government, if they are ever sanctioned for everyday use.
One thing that no one really seems to talk about is what happens when the self driving car breaks down "during flight".
It could be something simple -- like a ruptured coolant hose. That scenario would not immediately affect the performance of the car, but it would within roughly 15 minutes or less as the engine temperature increases and engine seizure nears.
Or it could be a catastrophic failure - like a tire blow-out that causes the car to become uncontrollable at 70-80 miles per hour (or more - like 90 mph or even 100 mph - my favorite speed ranges on the Interstate Highways and Turnpikes).
Rarely is either of these scenarios deadly or super-costly in a conventional car. But with a self-driving car, both deadly and costly could be true. I don't know, for example, whether a self-driving car would have extra maintenance features built in that would do something like shut off the engine once it hits x degrees to avoid a meltdown/seizure. I'm sure there would be plenty of warnings to bring an extreme overheat condition to the driver's attention before a complete engine shutdown (where you would lose power steering, power brakes, and very likely the airbags as well).
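The kind of staged overheat protection I'm imagining could look something like the sketch below. To be clear, this is pure speculation on my part -- the thresholds and action names are invented for illustration, not taken from any real vehicle:

```python
# Hypothetical sketch of a coolant-temperature watchdog: warn the occupant
# early, then request a controlled pull-over/shutdown before the engine
# seizes. All thresholds and names are invented for illustration.

WARN_C = 110      # begin warning the occupant
SHUTDOWN_C = 125  # the "x degrees" point: start a controlled stop

def thermal_action(temp_c):
    """Map a coolant temperature (Celsius) to a staged safety response."""
    if temp_c >= SHUTDOWN_C:
        return "pull_over_and_shutdown"
    if temp_c >= WARN_C:
        return "warn_occupant"
    return "normal"

print(thermal_action(95))   # -> normal
print(thermal_action(115))  # -> warn_occupant
print(thermal_action(130))  # -> pull_over_and_shutdown
```

The key design point would be ordering: the car should pull itself over *before* cutting the engine, since (as noted above) a dead engine means losing power steering and power brakes mid-drive.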
Thank you for your participation.
J. C. Griffith