An unconscious machine cannot be intelligent. That includes AI behemoths like IBM's Watson and Tesla's Autopilot, although the latter has all of the sensors it would need to become conscious. The reason there are no conscious machines yet is that none has an agent to be conscious—there is no person in there. In humans, that agent is a region of the midbrain that is intimately connected to all of the body's sensors. In fact, there is a column of neurons for every picture element in your eyes, for every sound frequency detected by your ears, for every smell from your nose, and for every sensation from your skin and body. This region of the brain acts as a single person, governing the behavior of all parts of that human.
HAL—aka Hyper Aware Logic—is the first system to replicate the known circuitry of this brain region, making machines conscious decision makers like us: they can react in real time to unforeseen circumstances, exercise judgment when interacting with other machines and humans, and learn from instruction and practice the way we do. And they can navigate a four-way stop in a car.
HAL will be available on any GPU-based server in early 2019. One of the critical benefits of adding HAL will be a very large power savings, typically a factor of at least 10. This comes from replicating the human ability to focus computing resources on one thing at a time, rapidly shifting attention as needed.
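A toy sketch of the idea behind that saving, under the stated assumption that attention means processing one sensory stream per cycle instead of all of them. This is illustrative only, not HAL's actual architecture; the function names and stream counts are invented for the example.

```python
def full_scan(streams):
    """Process every sensory stream each cycle (no attention)."""
    work = 0
    for s in streams:
        _ = sum(s)  # stand-in for expensive per-stream processing
        work += 1
    return work

def attention_gated(streams, focus):
    """Process only the currently attended stream; skip the rest."""
    _ = sum(streams[focus])  # only this stream is computed this cycle
    return 1

# Ten hypothetical sensory streams of 100 samples each.
streams = [[0.0] * 100 for _ in range(10)]
print(full_scan(streams))           # 10 units of work per cycle
print(attention_gated(streams, 3))  # 1 unit of work per cycle
```

With ten streams, gating attention to one stream at a time does a tenth of the work per cycle, which is the shape of the claimed factor-of-10 saving; the real figure would depend on how quickly attention must shift.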
John Carson, CEO/VP, HAL Systems