Self driving "off-road" Subaru
10,000 Nm, wow, that's torque!
Halo Project spawns an all-electric, self-driving Subaru 4x4 fit for the apocalypse
The Expo 88 monorail was driverless. I don't know if many people knew that, but the concept is slightly different, as the monorail had guide wheels to keep it on the track.
Sea World bought them and converted them back to driver control, and over time actually increased the control the driver had over the vehicle. They could have been left with very simple safety controls in an automated sequence, but that is a totally different story of power play.
The Gold Coast's light rail doesn't really need a driver unless an emergency stop is needed outside of its operating envelope.
So it shouldn't be too difficult to control and monitor driverless vehicles.
That's spread over four wheels, of course. How would this translate to flywheel torque?
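A rough back-of-envelope sketch of that translation. The only figure taken from the article is the 10,000 Nm total at the wheels; the 10:1 single-speed reduction is a made-up assumption for illustration, since the actual gearing isn't stated anywhere in the report.

```python
# Back-of-envelope: wheel torque back to motor-shaft ("flywheel") torque.
# Torque at the wheel = motor torque x gear reduction ratio, so working
# backwards we divide by the ratio.

total_wheel_torque_nm = 10_000   # combined torque at the wheels (from the article)
num_wheels = 4
reduction_ratio = 10.0           # assumed single-speed reduction (hypothetical)

per_wheel_torque = total_wheel_torque_nm / num_wheels     # 2500 Nm per wheel
motor_shaft_torque = per_wheel_torque / reduction_ratio   # 250 Nm per motor shaft

print(f"Per-wheel torque: {per_wheel_torque:.0f} Nm")
print(f"Motor-shaft torque at {reduction_ratio:.0f}:1 reduction: {motor_shaft_torque:.0f} Nm")
```

So with gearing like that, the headline number comes from quite an ordinary motor torque multiplied up through the reduction, which is why EV makers quoting "wheel torque" can sound so dramatic.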
Of course the bit that interests me more is the little gem hidden in another report of this vehicle...
MSU debuts 'Halo Project' supercar in Las Vegas | Mississippi State University

Quote:
—Next-generation lithium ion battery produced by Michigan-based A123, an international leader in battery technology, enabling the vehicle to travel an estimated 230 miles on a single charge. The battery has more than 50 percent more energy capacity than the previous generation.
Your link wouldn't load any pictures for me, so I googled it. [smilebigeye]
The Lion Air disaster looks as if it may be shaping up as a caution for autonomous vehicles.
One of the major causes of the crash now appears to be the malfunction of a system that automatically prevents the aircraft from stalling, quite possibly because of a software problem. (The apparently faulty angle-of-attack sensor had already been replaced.)
If this sort of issue can happen with a commercial airliner, why would anyone think that it won't happen to the software in autonomous cars - where, because they are operating in a far less structured environment, the software will be vastly more complex?
I have watched numerous air crash investigations on TV, and there have been heaps of incidents caused either directly or, more commonly, indirectly by autonomous functions.
One pilot even pulled the controls back when he should have been pushing forward, which fully stalled the plane; it crashed, killing everyone on board.
That would be the Air France crash, where a major contributing factor was that the fly-by-wire system did not provide feedback to one control stick of what the other was doing, so the captain was (correctly) pushing the stick forward to unstall the aircraft, while the copilot was (incorrectly) pulling his stick back.
No doubt the autonomous vehicle proponents will point out that these issues affect a system that is partly under computer control and partly under pilot control, and blame the accidents on this interface.
But ultimately, they are errors, probably in design of the software rather than in implementation, where all the possible consequences of a particular design decision have not been apparent - until they actually lead to an accident investigation.
The one I mentioned was where they lost a lot of instruments at night. The captain told the co-pilot to gain airspeed because they realised they were going too slow; then the captain realised the co-pilot was pulling up rather than pushing forward. Too late.
The co-pilot presumably did that due to a lack of experience: too little training and too much time spent not actually flying the plane.
I can come up with a few other examples, I reckon.
There is a fundamental conceptual problem with a mixed hierarchy of control (automation vs. human), and with automation degrading human competence.
Maybe the Navy has a better use for AVs, where stalling isn't as catastrophic.
Navy explores use of autonomous vehicles for dangerous, dirty and dull work - ABC News (Australian Broadcasting Corporation)
In situations where there is little or no automation, the pilot's basic job may be more complex, but it shouldn't get too much more complex than that.
Where there is lots of automation, situations have occurred where it is just too complex to try to fault-find the issue(s) in the time available.
There seem to be switches that mean "this will happen when such-and-such happens". If one is forgotten during an earlier phase, then because of the automation the pilot may not think of it at the critical moment when it would normally have come to mind.
There was one incident where they were flying through volcanic ash, but I can't quite remember what happened there.
It's all tricky stuff.