For years, the computer industry has sought memory technologies with higher endurance, lower cost, and better energy efficiency than commercial flash memories. Now, an international collaboration of scientists may have solved many of those challenges with the discovery of thin, molecular films that can store information.
CS Digest Section: Neuromorphic Computing
Intel has announced it is doubling down on artificial intelligence, launching a test platform dubbed Loihi, which it describes as a self-learning neuromorphic chip aimed at allowing machines to think and learn more like people.
The TrueNorth computer chip is a "neuromorphic" chip that mimics human neurons and performs unusually advanced computations using far less energy than conventional chips, said Qing Wu, principal electronics engineer at the Air Force Research Laboratory at Wright-Patterson Air Force Base, Ohio. The technology could be a huge boost for artificial intelligence.
DARPA has awarded contracts to five research organizations and one company that will support the Neural Engineering System Design (NESD) program: Brown University; Columbia University; Fondation Voir et Entendre (The Seeing and Hearing Foundation); John B. Pierce Laboratory; Paradromics, Inc.; and the University of California, Berkeley.
The scalable platform IBM is building for AFRL will feature an end-to-end software ecosystem designed to enable deep neural-network learning and information discovery. The 64-chip array's advanced pattern recognition and sensory processing power will be the equivalent of 64 million neurons and 16 billion synapses.
Memristors have attracted interest for mimicking synapses in more energy-efficient, scalable approaches to "brain-like" neuromorphic computing. However, their intrinsic variability has inhibited the performance of memristor-based neural networks, stymying progress. Now researchers in Beijing have shown that introducing "fuzziness" into their neural network's training can mitigate the effects of that variability.
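The general idea behind such "fuzzy" training can be illustrated with a toy sketch. The following is not the Beijing group's actual method or data; it simply assumes memristor conductance variability can be modeled as multiplicative weight noise, injects that noise during the training of a small logistic-regression classifier, and then measures accuracy when the deployed weights are perturbed. All parameters and noise levels are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification task: two Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-1.5, 1.0, (200, 2)), rng.normal(1.5, 1.0, (200, 2))])
y = np.concatenate([np.zeros(200), np.ones(200)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(noise_std, epochs=300, lr=0.1):
    """Logistic regression trained with multiplicative weight noise,
    a stand-in for memristor conductance variability ("fuzziness")."""
    w, b = np.zeros(2), 0.0
    for _ in range(epochs):
        # Each forward pass sees a noisy copy of the weights.
        w_noisy = w * (1.0 + rng.normal(0.0, noise_std, w.shape))
        p = sigmoid(X @ w_noisy + b)
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def accuracy_under_variability(w, b, noise_std=0.5, trials=200):
    """Average accuracy when each deployed 'device' perturbs the weights."""
    accs = []
    for _ in range(trials):
        w_dev = w * (1.0 + rng.normal(0.0, noise_std, w.shape))
        accs.append(np.mean((sigmoid(X @ w_dev + b) > 0.5) == y))
    return float(np.mean(accs))

acc_plain = accuracy_under_variability(*train(noise_std=0.0))
acc_fuzzy = accuracy_under_variability(*train(noise_std=0.5))
print(f"noise-free training, deployed with variability:  {acc_plain:.3f}")
print(f"noise-aware training, deployed with variability: {acc_fuzzy:.3f}")
```

The key design point is that the gradient is computed through a *sampled* noisy weight, so the learned solution is one that remains accurate across a distribution of device realizations rather than at a single exact weight setting.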
Sophisticated cybersecurity systems excel at finding "bad apples" in computer networks, but they lack the computing power to identify the threats directly.
While the steady tick-tock of the tried and true is still audible, the last two years have ushered in a fresh wave of new architectures targeting deep learning and other specialized workloads, as well as a bevy of forthcoming hybrids with FPGAs, zippier GPUs, and swiftly emerging open architectures.
The Air Force Research Lab (AFRL) reports good results from using a “neuromorphic” chip made by IBM to identify military and civilian vehicles in radar-generated aerial imagery.
As the last decade ended, ARM’s CTO Mike Muller warned the era of dark silicon was approaching.