Weekly QuEST Discussion Topics, 18 May

QuEST 18 May 2018

We want to start this week by discussing some of the leftover topics from last week – specifically, additional background to address the questions people asked, as we wrapped up, about the connection between physiological neurons and artificial neural networks. We will provide that background and then use the Limulus vision system to explain some of the choices made in common models.

We might also provide a quick review of the human visual system.

We also want to discuss some recent exchanges about RNNs / LSTMs:

https://towardsdatascience.com/the-fall-of-rnn-lstm-2d1594c74ce0

The fall of RNN / LSTM

It only took 2 more years, but today we can definitely say:

“Drop your RNN and LSTM, they are no good!”

But do not take our word for it: also see the evidence that attention-based networks are used more and more by Google, Facebook, and Salesforce, to name a few. All of these companies have replaced RNNs and their variants with attention-based models, and it is just the beginning. RNNs’ days are numbered in all applications, because they require more resources to train and run than attention-based models. See this post for more info.
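As a concrete reference point for that discussion, below is a minimal sketch (ours, not from the linked post) of scaled dot-product self-attention in NumPy. The key contrast with an RNN/LSTM is that every position attends to every other position in a single parallel step rather than through a sequential hidden state; the sequence length and projection sizes here are arbitrary toy values.

```python
# Minimal illustrative sketch of scaled dot-product self-attention (toy sizes).
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k) learned projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # every position scores every other position
    return softmax(scores) @ V                # weighted sum, computed in parallel

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                   # toy sequence: 5 tokens, d_model = 8
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)    # (5, 4)
```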


Weekly QuEST Discussion Topics and News, 11 May

QuEST 11 May 2018

This week we will have our colleague Dr. Mike M. present some background material on unsupervised learning. 

Unsupervised learning can be useful for finding intrinsic classes or the underlying structure of the data:

•       Find relationships between input patterns:

–       Premise: similar inputs naturally cluster.

•       The similarity measure chosen will determine the effectiveness of the algorithm.

Ideally the data consist of clusters with small variance within each cluster, all points within a cluster are ‘similar,’ and each cluster is often represented by its center.

The set of cluster centers (codewords) forms a codebook; new data are categorized by membership – which cluster does the new piece of data belong to?
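As a concrete illustration (ours, not from the slides), the sketch below builds a codebook with k-means, one common choice of clustering algorithm, and assigns a new point to its nearest codeword; the Euclidean distance plays the role of the similarity measure noted above.

```python
# Illustrative sketch: k-means codebook construction and nearest-codeword assignment.
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]        # initial codewords
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers                                            # the codebook

def assign(codebook, x):
    """Categorize a new point by its nearest codeword (Euclidean similarity)."""
    return int(np.argmin(((codebook - x) ** 2).sum(-1)))

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in ((0, 0), (3, 3), (0, 3))])
codebook = kmeans(X, k=3)
print(assign(codebook, np.array([2.8, 3.1])))                 # index of the nearest cluster
```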


Weekly QuEST Discussion Topics, 20 Apr

April 20, 2018

QuEST 20 April 2018

We will have our colleague Prof Bert P lead us into a discussion of approaches to AI other than neural networks, which will provide some of the requirements for our ‘representation’ concerns.  The material from last week that we didn’t get to (see below) will be posted.

We want to focus our technical discussions this week on context.  We have defined some of the characteristics we seek in 3rd-wave AI, and the ability to use/exploit context was among them.  We will start by reviewing previously discussed information about vision systems as an example of how perception is all about context, including Mach bands and negative color afterimages, and we will provide the details of the experiments we referred to last week demonstrating that the mammalian visual system uses Gabor function models.  We will also demonstrate how the Limulus visual system can generate similar artifacts, again driving home the point that it isn’t about the stimulus in isolation; it requires context.
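To make the Limulus point concrete ahead of the meeting, here is a toy sketch (ours; the coupling strength and neighborhood size are made up, not physiological) of lateral inhibition: each unit’s response is its own input minus a fraction of its neighbors’ inputs, so the network exaggerates contrast at the corners of a luminance ramp – the same kind of edge artifact perceived as Mach bands.

```python
# Toy lateral-inhibition sketch (illustrative parameters, not physiological).
import numpy as np

stimulus = np.concatenate([np.full(20, 0.2),            # dark field
                           np.linspace(0.2, 0.8, 20),   # luminance ramp
                           np.full(20, 0.8)])           # bright field

inhibition = 0.15                                        # assumed coupling strength
padded = np.pad(stimulus, 2, mode="edge")                # avoid edge-of-array artifacts
response = np.array([padded[i + 2] - inhibition * (padded[i:i + 5].sum() - padded[i + 2])
                     for i in range(len(stimulus))])

# The peak and trough of the response land at the corners of the ramp, not in the
# flat fields: the response at a point depends on its neighbors, i.e., on context.
print(response.argmax(), response.argmin())
```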

We then want to review material we’ve discussed on ‘context’, specifically material provided by our colleague Mitch Kokar.  This will give us a path to discuss representations, specifically tools like semantic networks, OWL, and statistical relational learning.
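As a reminder of what a semantic network boils down to, here is a minimal illustrative sketch (the classes and relations are made up for the example, not from Mitch’s material): facts are stored as subject–relation–object triples, and simple inference, such as a transitive ‘is-a’ query, runs over them; an OWL ontology expresses the same kind of structure with formal, machine-checkable semantics.

```python
# Minimal semantic-network sketch: subject-relation-object triples plus a
# transitive "is-a" query. The entities here are made-up examples.
triples = {
    ("F-16", "is-a", "fighter"),
    ("fighter", "is-a", "aircraft"),
    ("aircraft", "is-a", "vehicle"),
    ("F-16", "has-part", "radar"),
}

def is_a(subject, target, facts):
    """Follow 'is-a' links transitively from subject; report whether target is reached."""
    frontier, seen = {subject}, set()
    while frontier:
        node = frontier.pop()
        if node == target:
            return True
        seen.add(node)
        frontier |= {o for s, r, o in facts if s == node and r == "is-a"} - seen
    return False

print(is_a("F-16", "vehicle", triples))   # True  (via fighter -> aircraft -> vehicle)
print(is_a("radar", "vehicle", triples))  # False (has-part is not an is-a link)
```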

Finally, this leads us to a discussion of the difference between the Semantic Web and semantic interpretation.

  • The Semantic Web is a convention for formal representation languages that lets software services interact with each other “without needing artificial intelligence.”
  • The problem of understanding human speech and writing – the semantic interpretation problem – is quite different from the problem of software service interoperability.

–     Semantic interpretation deals with imprecise, ambiguous natural languages, whereas service interoperability deals with making data precise enough that the programs operating on the data will function effectively.

  • Unfortunately, the fact that the word “semantic” appears in both “Semantic Web” and “semantic interpretation” means that the two problems have often been conflated, causing needless and endless consternation and confusion.

Eventually, in later meetings, we will discuss the Planning Domain Definition Language (PDDL), which is used to standardize AI planning languages, and its relationship to OWL ontological solutions.



Weekly QuEST Discussion Topics, 13 Apr

April 12, 2018

QuEST 13 April 2018

We want to start this week by talking about intellectual property and patenting, both inside and outside the government.  Cap will discuss his experiences taking ideas through the process with his previous breast cancer detection business, including raising money, intellectual property strategy, provisional and full patents, and details of how to read a patent.  We will hopefully also have a representative from the government patenting process to answer specific questions for those who are government employees.

We want to focus our technical discussions this week on context.  Last week, when we defined some of the characteristics we seek in 3rd-wave AI, the ability to use/exploit context was among them.  We will start by reviewing previously discussed information about vision systems as an example of how perception is all about context, including Mach bands and negative color afterimages, and we will provide the details of the experiments we referred to last week demonstrating that the mammalian visual system uses Gabor function models.  We will also demonstrate how the Limulus visual system can generate similar artifacts, again driving home the point that it isn’t about the stimulus in isolation; it requires context.
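Since the Gabor-model experiments will come up, here is a minimal illustrative sketch (ours; the parameters are arbitrary, not fitted to any neuron) of a Gabor receptive field: a sinusoid windowed by a Gaussian envelope, which responds strongly to a grating at its preferred orientation and only weakly to the orthogonal one.

```python
# Illustrative Gabor receptive-field sketch (arbitrary toy parameters).
import numpy as np

def gabor(size=21, wavelength=6.0, theta=0.0, sigma=4.0):
    """2-D Gabor: a sinusoidal grating windowed by a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)            # rotate to the preferred orientation
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

def grating(size=21, wavelength=6.0, theta=0.0):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    return np.cos(2 * np.pi * (x * np.cos(theta) + y * np.sin(theta)) / wavelength)

rf = gabor(theta=0.0)
print((rf * grating(theta=0.0)).sum())          # large response: matched orientation
print((rf * grating(theta=np.pi / 2)).sum())    # near zero: orthogonal orientation
```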

We then want to review material we’ve discussed on ‘context’, specifically material provided by our colleague Mitch Kokar.  This will give us a path to discuss representations, specifically tools like semantic networks, OWL, and statistical relational learning.

Finally, this leads us to a discussion of the difference between the Semantic Web and semantic interpretation.

  • The Semantic Web is a convention for formal representation languages that lets software services interact with each other “without needing artificial intelligence.”
  • The problem of understanding human speech and writing – the semantic interpretation problem – is quite different from the problem of software service interoperability.

–     Semantic interpretation deals with imprecise, ambiguous natural languages, whereas service interoperability deals with making data precise enough that the programs operating on the data will function effectively.

  • Unfortunately, the fact that the word “semantic” appears in both “Semantic Web” and “semantic interpretation” means that the two problems have often been conflated, causing needless and endless consternation and confusion.

Eventually, in later meetings, we will discuss the Planning Domain Definition Language (PDDL), which is used to standardize AI planning languages, and its relationship to OWL ontological solutions.


Weekly QuEST Discussion Topics, 6 Apr

QuEST 6 Apr 2018

We want to review an article that has been circulating among some of us and that we’ve yet to be able to discuss – “Deep Learning: A Critical Appraisal” by Gary Marcus of NYU.



Weekly QuEST Discussion Topics, 23 Mar

March 22, 2018

QuEST 23 March 2018

We will start this week with a discussion of the questions the Bletchley Park Team (BPT) left with us last week:

1.)  How are you currently documenting (representing) an agent’s interface/API?  How would you like to or plan to do that?

2.)  How are you currently documenting (representing) agent functionality?  How would you like to or plan to do that?

3.)  Will you certify agents’ “quality”?  How would you like to or plan to do that?

4.)  How are “missions” (i.e., the goal of some agent composition) described?  How would you like to or plan to do that?

We won’t let the BPT off that easily, since what we need from them is a path toward a solid foundation that we must follow to bring some rigor to the multi-agent ICA (interoperable, composable, adaptable) solution.  But these conversations are important for the BPT, the algorithmic warfare team (AWT – call sign still in competition), and the platform team (Colony of Neurons = CoN – call sign still in competition) to have.  After that discussion we want to review an article that has been circulating among some of us and that we’ve yet to be able to discuss – “Deep Learning: A Critical Appraisal” by Gary Marcus of NYU.


Weekly QuEST Discussion Topics and News, 16 Mar

March 15, 2018

QuEST 16 March 2018

Last week our ‘Bletchley Park team’ led a discussion of ideas related to multi-agent systems – specifically, how one agent can develop a model of another agent’s representation.  We want to continue that discussion this week and provide them feedback on the types of ‘models’ and agents we are interested in.

As a reminder of the context – we’ve emphasized that ‘flexible AI’ will require that multi-agent systems not originally designed to work together be able to interoperate / compose / adapt.  We’ve defined the axes of flexibility in terms of peer (relationships between agents – supervisor, peer, subordinate), task (being able to do multiple tasks), and cognition (flexible representational options).

We’ve often discussed steps along the way to full flexibility, including peer ~ interoperable, task ~ composable, and cognition ~ adaptable.  Our team of mathematicians, the ‘Bletchley Park team’, has been working on these ideas and will continue the discussion of the first two concepts.

Title:  Agent representations for interoperability and composition

Speakers:  Cybenko, Erdmann, Oxley

Abstract:  A concrete multimodal representation for agents is proposed. The representation is suitable for interoperability and composition, and is based on ontology and machine learning concepts. Ideas from recent results on data topology and the manifold hypothesis will be presented and related to this representation.
