Native architecture for artificial intelligence
Balisson, Dushan; Melis, Wim J.C.; Keates, Simeon
Authors
Dushan Balisson
Wim J.C. Melis
Simeon Keates
Abstract
Introduction
The brain is a complex organ and, even today, very little is known about how it works. Over the years, replicating the intelligence of the brain has puzzled many scientists, and most of this work can be broadly classified into two categories: neurophysiological or cognitive. The latter approach tends to overlook the actual structure of the brain in order to focus on the behaviour itself. This is exemplified, for example, by the Turing test [1], which implies that if a human interacts with an artificial machine and identifies it as a human, then this artificial machine is considered sufficiently humanlike. Proponents of the neurophysiological approach argue that the intelligence of the brain lies in its structure; hence, if this structure can be replicated, one should be able to replicate human intelligence. In this context, one of the most complete neurophysiological models of the neuron is the Hodgkin-Huxley model [2], which has served as a reference for the biological plausibility of subsequent neural models. However, a major issue with the Hodgkin-Huxley model lies in the complexity of using it to implement a complete and useful network. At the other extreme, one can find simple models such as the Leaky Integrate and Fire (LIF) model [3], which is the most widely used neural model but is commonly regarded as oversimplified. While this model is biologically implausible, it is computationally viable and can therefore be implemented in relatively large networks to study their behaviour and dynamics.
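To make this trade-off concrete, the LIF dynamics reduce to a single first-order differential equation that is cheap to integrate numerically. Below is a minimal sketch of one LIF neuron driven by a constant input current; the parameter values and the simple Euler integration step are illustrative assumptions, not taken from the paper.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, Euler integration.
# All parameter values are illustrative assumptions.
tau_m   = 20.0   # membrane time constant (ms)
v_rest  = -65.0  # resting potential (mV)
v_reset = -70.0  # reset potential after a spike (mV)
v_th    = -50.0  # firing threshold (mV)
r_m     = 10.0   # membrane resistance (MOhm)
dt      = 0.1    # integration step (ms)

def simulate_lif(i_ext, steps=5000):
    """Integrate dV/dt = (-(V - v_rest) + r_m * i_ext) / tau_m and record spike times."""
    v = v_rest
    spike_times = []
    for step in range(steps):
        dv = (-(v - v_rest) + r_m * i_ext) / tau_m
        v += dv * dt
        if v >= v_th:                    # threshold crossed: emit a spike, reset the membrane
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# Constant 2 nA input over 500 ms of simulated time.
print(len(simulate_lif(i_ext=2.0)), "spikes")
```

The entire neuron state is one scalar voltage, which is what makes the LIF model tractable for large networks, in contrast to the four coupled equations of the Hodgkin-Huxley model.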
Even further simplified networks are already used to meet the demand for intelligent machines that are able to learn. Today, one of the most successful machine learning paradigms is the Artificial Neural Network (ANN). ANNs implement a self-improving function that produces a certain output based on certain inputs, where each neuron is modelled as a transfer function. These systems have shown commendable performance and form the basis of popular services, such as the Google Brain project, which provides YouTube users with recommended videos based on their viewing history [4]. However, the computational complexity of these tasks is not to be underestimated, as the Google Brain combines 16,000 computers to deliver the capabilities of a rat's brain [5]. This raises the question of why our current systems are so inefficient when it comes to delivering brain-like functionality. Considering that the theoretical limit of silicon feature sizes is drawing closer and that the collapse of Moore's law is upon us, it seems more than necessary to go back to the drawing board and start considering alternative platforms that are more efficient and do not suffer from this huge semantic gap.
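As a point of comparison with the LIF sketch above, an ANN neuron abstracts away all membrane dynamics: it is simply a weighted sum of its inputs passed through a transfer function. The sketch below uses a sigmoid transfer function with hand-picked weights purely for illustration; in a real ANN these weights would be learned from data.

```python
import numpy as np

def sigmoid(z):
    """Nonlinear transfer function squashing the weighted sum into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b):
    """A single ANN neuron: output = transfer_function(w . x + b)."""
    return sigmoid(np.dot(w, x) + b)

# Example inputs, weights and bias (illustrative values only).
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.8, 0.1, -0.4])
b = 0.2
print(neuron(x, w, b))
```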
Citation
Balisson, D., Melis, W. J. C., & Keates, S. (2017, February). Native architecture for artificial intelligence. Paper presented at the 1st HBP Student Conference: Transdisciplinary Research Linking Neuroscience, Brain Medicine and Computer Science, Vienna, Austria.
Presentation Conference Type | Conference Paper (unpublished)
---|---
Conference Name | 1st HBP Student Conference: Transdisciplinary Research Linking Neuroscience, Brain Medicine and Computer Science
Start Date | Feb 8, 2017
End Date | Feb 10, 2017
Publication Date | 2017
Deposit Date | Feb 19, 2019
Book Title | Human Brain Project - First Student Conference
Keywords | Artificial intelligence; Hardware architecture
Public URL | http://researchrepository.napier.ac.uk/Output/1497808
Related Public URLs | https://education.humanbrainproject.eu/web/studentconference