
Weekly QuEST Discussion Topics and News, 12 June

QuEST 12 June 2015

We will have a shortened meeting – Capt Amerika has to leave by 12:30 for another commitment –

I want to briefly hit a news story about NaSent – Neural Analysis of Sentiment – which uses recursive deep learning to attack a problem we have previously discussed: sentiment analysis –

http://engineering.stanford.edu/news/stanford-algorithm-analyzes-sentence-sentiment-advances-machine-learning

Stanford algorithm analyzes sentence sentiment, advances machine learning

NaSent is a powerful new ‘recursive deep learning’ algorithm that gives machines the ability to understand how words form meaning in context.

Tom Abate | Stanford Engineering

People express opinions every day on issues large and small. Whether the topic is politics, fashion or films, we often rate situations and experiences on a sliding scale of sentiment ranging from thumbs up to thumbs down.

As we increasingly share these opinions via social networks, one result is the creation of vast reservoirs of sentiment that could, if systematically analyzed, provide clues about our collective likes and dislikes with regard to products, personalities and issues.

Against this backdrop, Stanford computer scientists have created a software system that analyzes sentences from movie reviews and gauges the sentiments they express on a five-point scale from strong like to strong dislike.

The program, dubbed NaSent – short for Neural Analysis of Sentiment – is a new development in a field of computer science known as “Deep Learning” that aims to give computers the ability to acquire new understandings in a more human-like way.
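To make the later discussion concrete, here is a minimal sketch (Python/NumPy) of the kind of recursive composition over a binary parse tree that ‘recursive deep learning’ refers to – each node’s vector is built from its children and classified on the five-point sentiment scale. The actual NaSent model is Socher et al.’s Recursive Neural Tensor Network trained on a labeled sentiment treebank; the plain tanh composition, the vector sizes, and the toy vocabulary below are assumptions made only for illustration.

```python
import numpy as np

# A minimal sketch of recursive composition over a binary parse tree, in the
# spirit of NaSent's "recursive deep learning". The real NaSent model is
# Socher et al.'s Recursive Neural Tensor Network trained on a labeled
# sentiment treebank; the plain tanh composition, the vector sizes, and the
# toy vocabulary below are illustrative assumptions, not the published model.

DIM = 8            # word/phrase vector size (assumed)
NUM_CLASSES = 5    # strong dislike ... strong like, NaSent's five-point scale

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(DIM, 2 * DIM))        # composition weights
b = np.zeros(DIM)
W_s = rng.normal(scale=0.1, size=(NUM_CLASSES, DIM))  # softmax (classifier) weights

# Toy embeddings; a trained system would learn these from labeled parse trees.
vocab = {w: rng.normal(scale=0.1, size=DIM)
         for w in ["the", "movie", "was", "not", "terrible"]}

def compose(left, right):
    """Merge two child phrase vectors into a parent phrase vector."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

def sentiment(vec):
    """Five-way softmax over sentiment classes at any tree node."""
    scores = W_s @ vec
    e = np.exp(scores - scores.max())
    return e / e.sum()

def encode(tree):
    """tree is either a word (str) or a (left, right) pair of subtrees."""
    if isinstance(tree, str):
        return vocab[tree]
    left, right = tree
    return compose(encode(left), encode(right))

# "(the movie) (was (not terrible))" as a binary parse tree
tree = (("the", "movie"), ("was", ("not", "terrible")))
print("class probabilities (untrained, illustrative):",
      np.round(sentiment(encode(tree)), 3))
```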

Then the next topic is where we ended last week – our colleague Morley S passed us a link that led us to an article in Nature Reviews Neuroscience – we want to have a discussion on how this aligns with QuEST – in our 2008 ‘Life and Death of ATR’ paper we wrote the following when discussing the QuEST architecture (a minimal sketch of this loop follows the excerpt):

  • Architecture: The QUEST architecture can be illustrated by the representation of the concept of a Grandmother. The representation will consist of the dynamic formation of a Grandmother linkset (as opposed to a Grandmother cell). Any sensory stimulus will proceed down multiple parallel paths undergoing a hierarchical feed forward ‘infraconscious’ decomposition. There is also a simultaneously occurring feedback (prediction), qualia generating loop. The architecture allows for a competing attention mechanism between the parallel paths. Continued processing of an unresolved concept can take place and when success occurs the quale of ‘Aha’ is generated. The sensory data has been qualiarized.

  • Physics Based Models: QUEST will learn about entities by taking into account not only measured data, but also experience and knowledge (physics based models). These sources will be drawn on as necessary to assist in modulating the representation of the relevant concept.

  • No Cartesian Theater: There is no need to present the world as it really exists for exploitation. The Qualia Cartesian theater projects the qualia evoked by the unconfirmed predictions of the world being sensed for exploitation.

  • Blind Sight: There is no reason for humans to be aware (conscious) of everything that is being measured by their sensors. The majority of sensory input confirms the predicted state of the world. However, if forced to, subjects have been shown to have the ability to access this raw, unqualiarized, data to a certain degree.

  • Prediction: We generate a continuous set of predictions (in space, time, spectra and other sensory channels) that can allow for optimization of the world model via quality of response to the stimuli. These prediction functions of qualia are required for the facile understanding of the significance of the input stimuli. By generating continuous predictions they enable the system to operate efficiently in real time.

  • …
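To ground the discussion, the sketch below is a purely illustrative rendering of the loop the excerpt describes – parallel feed-forward paths, a top-down prediction compared against each path, attention competing between the paths, and an ‘Aha’ signal once the prediction settles. Every function, threshold, and update rule in it is an assumption for illustration; none of it comes from the 2008 paper.

```python
import numpy as np

# A purely illustrative sketch of the loop in the excerpt above: parallel
# feed-forward ("infraconscious") paths, a top-down prediction compared
# against each path, attention that competes between the paths, and an
# "Aha" signal once the prediction settles. Every function, threshold, and
# update rule here is an assumption made for illustration; none of it is
# taken from the 2008 paper.

rng = np.random.default_rng(1)

N_PATHS, DIM = 3, 4
paths = [0.5 * rng.normal(size=(DIM, DIM)) for _ in range(N_PATHS)]  # feed-forward transforms

def feed_forward(stimulus):
    """Hierarchical decomposition of the stimulus along parallel paths."""
    return [np.tanh(P @ stimulus) for P in paths]

def quest_loop(stimulus, steps=200, aha_threshold=1e-3, lr=0.3):
    evidence = feed_forward(stimulus)        # bottom-up, 'infraconscious' pass
    prediction = np.zeros(DIM)               # top-down expectation, initially empty
    for t in range(steps):
        # Attention competes between paths, favoring the most surprising one.
        errors = np.array([np.linalg.norm(e - prediction) for e in evidence])
        attention = np.exp(errors) / np.exp(errors).sum()
        # Move the prediction toward the attention-weighted evidence.
        target = sum(a * e for a, e in zip(attention, evidence))
        delta = lr * (target - prediction)
        prediction = prediction + delta
        if np.linalg.norm(delta) < aha_threshold:   # prediction has settled: 'Aha'
            return prediction, t + 1, True
    return prediction, steps, False          # concept left unresolved

stimulus = rng.normal(size=DIM)
prediction, steps, aha = quest_loop(stimulus)
print(f"settled after {steps} steps, aha={aha}")
```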

We would like to use the material from the following article to discuss the QuEST architecture and maybe use the Sandy V malware categorization research as a tapestry for the discussion.

http://www.sciencedaily.com/releases/2015/06/150602130553.htm

Epicenter of brain’s predictive ability pinpointed by scientists

Date: June 2, 2015

Source: Northeastern University

Summary: In recent years, scientists have discovered that the human brain works on predictions, contrary to the previously accepted theory that it reacts to outside sensations. Now, researchers have reported finding the epicenter of those predictions.

Neuron cell illustration (stock image). “The unique contribution of our paper is to show that limbic tissue, because of its structure and the way the neurons are organized, is predicting,” Barrett said. “It is directing the predictions to everywhere else in the cortex, and that makes it very powerful.”

Credit: © whitehoune / Fotolia

In recent years, scientists have discovered the human brain works on predictions, contrary to the previously accepted theory that it reacts to the sensations it picks up from the outside world. ** if QuEST is right we would change this statement and say consciousness works on prediction whereas the reflexive responses of sys1 work on sensory data ** Experts say humans’ reactions are in fact the body adjusting to predictions the brain is making based on the state of our body the last time it was in a similar situation.

Now, University Distinguished Professor Lisa Feldman Barrett at Northeastern has reported finding the epicenter of those predictions.

In an article published in Nature Reviews Neuroscience last week, Barrett contends that limbic tissue, which also helps to create emotions, is at the top of the brain’s prediction hierarchy. She co-authored the paper with W. Kyle Simmons, of the Laureate Institute for Brain Research in Tulsa, Oklahoma.

“The unique contribution of our paper is to show that limbic tissue, because of its structure and the way the neurons are organized, is predicting,” Barrett said. “It is directing the predictions to everywhere else in the cortex, and that makes it very powerful.”

For example, when a person is instructed to imagine a red apple in his or her mind’s eye, Barrett explained that limbic parts of the brain send predictions to visual neurons and cause them to fire in different patterns so the person can “see” a red apple.

Barrett is a faculty member in the Department of Psychology and is director of the Interdisciplinary Affective Science Laboratory. A pioneer in the psychology of emotion and affective neuroscience, she has challenged the foundation of affective science by showing that people are the architects of their own emotional experiences.

In the Nature Reviews Neuroscience paper, Barrett summarized research on the cellular composition of limbic tissue, which shows that limbic regions of the brain send but do not receive predictions. This means that limbic regions direct processing in the brain. They don’t react to stimulation from the outside world. This is ironic, Barrett argues, because when scientists used to believe that limbic regions of the brain were the home of emotion, they were seen as mainly reactive to the world.

Common sense tells you that seeing is believing, but really the brain is built for things to work the other way around: you see (and hear and smell and taste) what you believe. ** QuEST / R. L. Gregory quote ** And believing is largely based on feeling. In her paper, Barrett shows that your brain is not wired to be a reactive organ. It’s wired to ask the question: “The last time I was in a situation like this, what sensations did I encounter, and how did I act?” And the sensations that seem to matter most are the ones that are inside your own body, which are called “interoceptions.”

“What your brain is trying to do is guess what the sensation means and what’s causing the sensations so it can figure out what to do about them,” Barrett said. “Your brain is trying to put together thoughts, feelings, and perceptions so they arrive as needed, not a second afterwards.”
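The arrangement Barrett describes – a higher level that sends predictions down and receives only prediction errors back – is, in computational terms, a predictive-coding loop, and a minimal sketch of one looks like the following. The single interoceptive variable, the noise model, and the update gain are assumptions for illustration; they are not taken from Barrett & Simmons (2015).

```python
import numpy as np

# A minimal predictive-coding sketch of the arrangement Barrett describes: a
# higher ("limbic-like") level sends a prediction down, the sensory level sends
# back only the prediction error, and the prediction is corrected before the
# next sensation arrives. The single interoceptive variable, the Gaussian
# noise, and the update gain are assumptions for illustration, not the model
# in Barrett & Simmons (2015).

rng = np.random.default_rng(2)

true_body_state = 0.8   # e.g. an interoceptive signal, in arbitrary units
prediction = 0.0        # top-down expectation held by the higher level
gain = 0.2              # how strongly a prediction error corrects the prediction

for step in range(30):
    sensation = true_body_state + rng.normal(scale=0.05)  # noisy incoming signal
    prediction_error = sensation - prediction             # the only signal passed upward
    prediction += gain * prediction_error                 # the top-down model is corrected
    if step % 10 == 0:
        print(f"step {step:2d}: prediction={prediction:.3f}, error={prediction_error:+.3f}")

print(f"final prediction: {prediction:.3f} (true state {true_body_state})")
```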

Story Source:

The above story is based on materials provided by Northeastern University. The original article was written by Joe O’Connell. Note: Materials may be edited for content and length.

Journal Reference:

  1. Lisa Feldman Barrett, W. Kyle Simmons. Interoceptive predictions in the brain. Nature Reviews Neuroscience, 2015; DOI: 10.1038/nrn3950

Northeastern University. “Epicenter of brain’s predictive ability pinpointed by scientists.” ScienceDaily. ScienceDaily, 2 June 2015. <www.sciencedaily.com/releases/2015/06/150602130553.htm>.

news summary (13)
