Developmental robotics using artificial neural networks takes the approach of a learning AI that must be trained like a baby: it knows nothing at first, but acquires vast amounts of knowledge in a cumulative process similar to going to school for 18 years, except compressed into a few weeks, because the robots can share their learning with each other online over WiFi and LTE, just like a fleet of Tesla Model S's!
The key is computer hardware and software that is adaptive, with cumulative ongoing learning and continuous optimization, so that self-driving cars become better as they are used. Better still, the learning from one AI engine in one vehicle can be shared over a 4G LTE modem with all the other cars running the same autopilot learning technology. I do not know why Tesla has not partnered with IBM on its TrueNorth neurosynaptic chip technology.
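To make the "gets better as it is used" idea concrete, here is a toy sketch of one car learning from its own driver: every time the human corrects the steering, the controller nudges its gain toward what the driver actually did. The parameter names and numbers are mine, purely for illustration, not anything Tesla actually ships.

```python
# A toy illustration of cumulative, ongoing learning in a single car:
# fit a proportional steering gain to the corrections the human driver applies.
# This is my own sketch, not any automaker's actual update rule.

def update_steering_gain(gain: float, lane_error_m: float,
                         driver_steer: float, lr: float = 0.5) -> float:
    """One online least-squares step: fit driver_steer ~= gain * lane_error_m."""
    predicted_steer = gain * lane_error_m
    gradient = 2.0 * (predicted_steer - driver_steer) * lane_error_m
    return gain - lr * gradient

# A short, made-up drive log where the driver's implicit gain is 0.8.
drive_log = [(0.30, 0.24), (0.10, 0.08), (0.45, 0.36)]

gain = 0.5
for _ in range(20):                       # replaying the log stands in for more miles
    for lane_error_m, driver_steer in drive_log:
        gain = update_steering_gain(gain, lane_error_m, driver_steer)

print(round(gain, 3))                     # converges toward the driver's implicit 0.8
```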
With training, the AI becomes remarkably efficient at performing complex tasks: driving a vehicle, folding clothing, operating a vacuum cleaner (including plugging the cord into an available electrical outlet), even understanding how to reset a breaker or GFCI if a circuit overload trips one. We need learning robots with cognitive developmental skills, like a human child growing up and learning how to become a fully functioning person in the world!
The automakers seem asleep at the switch: I am not sure why companies like Tesla and Toyota are not partnering with Intel and IBM to pipeline neurosynaptic learning AI into commercialized autopilot technology for upcoming vehicle updates, new platforms, and new types of vehicles.
The market is rushing autopilot into commercial reality, and the fatality involving a Model S operating on Autopilot set a new precedent for how US DOT and NHTSA rules cover self-driving cars. The technology is advancing rapidly: the redundant computer hardware, sensors, and software needed to give self-driving cars an intelligent autopilot are already being pipelined into commercial production. Uber is about to launch hundreds of self-driving cars in a real-world fleet test program!
The Tesla Model S was the first mass-produced vehicle that normal people can buy equipped with a sensor package and control system enabling pseudo-autopilot operation under certain specific driving conditions. The Model S receives regular software updates that enhance the AI control system, so these autopilot functions improve over time, and iterative updates will bring more features to the Autopilot system in every Model S. Tesla announced that revised Model S vehicles would rely more heavily on radar rather than machine vision in order to make Autopilot safer, faster.
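Here is a tiny sketch of why leaning on radar can help: if you fuse the camera's range estimate with the radar's, weighting each by how noisy it is, the fused answer automatically leans on radar whenever vision gets unreliable. The noise figures below are assumptions I picked for illustration, not Tesla's real numbers.

```python
# A minimal sketch of sensor fusion: combine two noisy range estimates
# (camera vision and radar) with inverse-variance weighting.
# The variances are made-up illustration values, not real sensor specs.

def fuse_ranges(camera_range_m: float, camera_var: float,
                radar_range_m: float, radar_var: float) -> float:
    """Inverse-variance weighted average of two distance estimates."""
    w_cam = 1.0 / camera_var
    w_rad = 1.0 / radar_var
    return (w_cam * camera_range_m + w_rad * radar_range_m) / (w_cam + w_rad)

# In fog or glare the camera's variance blows up, so the fused estimate
# ends up dominated by radar.
print(fuse_ranges(camera_range_m=48.0, camera_var=25.0,
                  radar_range_m=45.0, radar_var=1.0))   # ~45.1 m, mostly radar
```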
The Atlas robot from Boston Dynamics underscores how primitive the AI of today (2014-2016) is compared to where it will need to be to produce a functional human-replacement robot mechanic that can do hard, complex work inside a nuclear reactor, or in other extremely dangerous jobs that are inappropriate for human beings for safety reasons. Ionizing radiation exposure not only destroys unhardened digital electronics, it also causes the DNA damage that gives rise to cancer; that is an empirical fact, not my opinion. This is why humanoid robots with mechanical intelligence and control performance equal to or better than a human's are needed. Such a robot would be able to see in other frequencies using FLIR and gamma cameras, giving it an edge over humans inside a nuclear reactor. Similarly, radiation shielding that is too heavy for a human to wear can be integrated into the robot's frame to protect its controllers, AI hardware, and associated digital electronics from radiation damage.
We have a long way to go before AI achieves superhuman intelligence in systems like the autopilot controller of a Tesla Model S. We have an even longer way to go before a feasible $20,000 humanoid robot assistant can be useful around homes and businesses, helping people with drudgery, heavy tasks that cause back injuries, and repetitive boring tasks that cause wrist, shoulder, hip, elbow, and other joint injuries: hard jobs that people cannot do safely when they are old, and perfect for robots once robots are advanced enough and affordable enough to be feasible as appliances that can help people in a variety of situations.
Learning Robots Required
Each home is different and every person is different, so the robot will need to perform ongoing, on-the-fly learning by watching you teach it something, and it will need the cognitive capacity to figure things out on its own. Connected to WiFi, robots like ASIMO will be able to share what they learn with each other, essentially re-training via sharing, so that all the robots connected to the internet become more capable faster! Developmental robotics using neurosynaptic, learning-based artificial neural network technology will be the key. I learned about all of this in just a few hours of casual online reading, and I am not sure why these huge, powerful corporations with lots of intelligent employees are struggling to embrace the cutting edge when all the information needed to do it is available online to anyone bold enough to look it up and curious enough to want to understand it.
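As a back-of-the-napkin illustration of both ideas, imagine a robot that averages a couple of human demonstrations of a chore into a "skill" and then posts that skill for the other robots to download. The skill format and function names below are my own assumptions, not any real robot's API.

```python
# A toy sketch of (1) learning a chore from human demonstrations and
# (2) sharing the learned skill with other robots over the network.
# The waypoint format and JSON payload are illustrative assumptions.
import json
from statistics import mean
from typing import List, Tuple

Waypoint = Tuple[float, float, float]          # x, y, gripper opening

def learn_skill(demonstrations: List[List[Waypoint]]) -> List[Waypoint]:
    """Average corresponding waypoints across several human demonstrations."""
    skill = []
    for waypoints_at_step in zip(*demonstrations):
        skill.append(tuple(mean(values) for values in zip(*waypoints_at_step)))
    return skill

def share_skill(name: str, skill: List[Waypoint]) -> str:
    """Serialize the learned skill so another robot can download and reuse it."""
    return json.dumps({"skill": name, "waypoints": skill})

# Two demonstrations of "put the cup on the shelf", averaged and then shared.
demo_a = [(0.0, 0.0, 1.0), (0.3, 0.5, 1.0), (0.3, 0.9, 0.0)]
demo_b = [(0.0, 0.1, 1.0), (0.3, 0.6, 1.0), (0.4, 0.9, 0.0)]
payload = share_skill("cup_to_shelf", learn_skill([demo_a, demo_b]))
print(payload)   # this JSON is what the other robots would fetch over WiFi
```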
Playing real physical chess with a robot
I am on round 3 of a chess game on a real wood board that my late father Ken made. The pieces are plastic; once upon a time they were hollow and lightweight, but he added lead shotgun pellets and wood glue, then put felt on the bottom to close them off, so now the pieces feel substantial. Meg won the first round, I won the second, and we are well into the third; time will tell. For a home-assistance robot to be really successful, it will need to be able to learn to play chess on the fly. I want a robot that can learn chess over a few rounds, after which it becomes vastly better than me. If the robot is going to sort my socks and do laundry in my place, it will have to be more intelligent than Meg or I. The only successful humanoid robots will possess superhuman intelligence and skills, like the robot depicted in the fictional narrative of the film Robot & Frank, Data in Star Trek, or David in Prometheus. We have seen many examples of scarily intelligent robots in media, like the one in Ex Machina. We have people like Bill Gates III and Elon Musk warning about the risks of superhuman artificial intelligence, as if the robots are going to somehow magically develop the emotions, dreams, or desires that are all very uniquely human. We will have superhuman, super-intelligent robots long before we untangle all the mysteries of human consciousness!
Tesla Leading
The Model S already collects a remarkable amount of data from every minute of driving that someone does behind the wheel. It feeds this data to a central data-management system at Tesla, and once the machine learning from many vehicles has been integrated, Tesla sends out a new software update that shares the learning with every Model S. In this way the fleet of Tesla vehicles today is learning as one big group. Shared developmental learning AI like this is the key to ever-improving AI-based autopilot.
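Here is a minimal sketch of what that fleet-wide integration step could look like: every car sends up a small set of locally learned parameters, the central system averages them (weighted by how much each car has driven), and the result goes back out in the next update. This is my own illustration of the idea, not Tesla's actual pipeline, and the class and function names are hypothetical.

```python
# Minimal sketch of fleet-wide shared learning (a federated-averaging style step).
# Illustrative only; VehicleUpdate and aggregate_fleet_updates are made-up names.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class VehicleUpdate:
    vehicle_id: str
    weights: Dict[str, float]   # parameter name -> locally learned value
    miles_driven: float         # used to weight each car's contribution

def aggregate_fleet_updates(updates: List[VehicleUpdate]) -> Dict[str, float]:
    """Average parameters across the fleet, weighted by miles driven."""
    total_miles = sum(u.miles_driven for u in updates)
    fleet_model: Dict[str, float] = {}
    for update in updates:
        share = update.miles_driven / total_miles
        for name, value in update.weights.items():
            fleet_model[name] = fleet_model.get(name, 0.0) + share * value
    return fleet_model

# Two cars learned slightly different lane-keeping gains; the fleet gets one blend.
fleet = [
    VehicleUpdate("car_a", {"lane_keep_gain": 0.82}, miles_driven=1200.0),
    VehicleUpdate("car_b", {"lane_keep_gain": 0.90}, miles_driven=300.0),
]
print(aggregate_fleet_updates(fleet))   # {'lane_keep_gain': 0.836}, pushed back out OTA
```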
Google is using very powerful LIDAR because it can capture far more data than Tesla's sensor suite. Google has its own self-driving car project, and they are at the forefront of autopilot development right alongside Tesla. Elon Musk thinks that LIDAR produces way too much data, but Google insists that the most detailed picture will train the AI to perform better, safer, faster. I guess we will see who ends up prevailing; I suspect that Google will do better if they harness the power of more data! The future is all about information. Information gives us the ability to do things with matter using energy; information is the power of knowledge!
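To get a feel for why the amount of data matters, here is a quick back-of-envelope comparison of raw sensor data rates against what an LTE uplink can realistically carry. The figures are ballpark numbers I assumed for illustration; real sensors vary widely by model and settings.

```python
# Back-of-envelope sensor data rates (assumed ballpark figures, not real specs).
lidar_points_per_sec = 1_300_000        # a dense spinning LIDAR
bytes_per_point = 16                    # x, y, z, intensity as floats
camera_pixels = 1280 * 960              # one forward-facing camera
bytes_per_pixel = 2                     # raw-ish, before compression
frames_per_sec = 30
lte_uplink_bytes_per_sec = 2_000_000    # a generous 4G LTE upload budget

lidar_rate = lidar_points_per_sec * bytes_per_point
camera_rate = camera_pixels * bytes_per_pixel * frames_per_sec

print(f"LIDAR : {lidar_rate / 1e6:.1f} MB/s raw")
print(f"Camera: {camera_rate / 1e6:.1f} MB/s raw")
print(f"LTE   : {lte_uplink_bytes_per_sec / 1e6:.1f} MB/s uplink")
# Whichever sensor you prefer, the raw stream dwarfs the LTE uplink,
# which is why fleet learning has to ship distilled model updates
# rather than raw sensor logs.
```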