Talk
Paul Smolensky
Johns Hopkins University
Unifying discrete linguistic computation with continuous neural computation

The novel neural architecture of Gradient Symbolic Computation (GSC) — capable of encoding and processing symbol structures — will be presented, and the new grammatical theories that emerge from this architecture will be described and illustrated: theories in which grammars are evaluators of well-formedness, and grammatical structures are those that are maximally well-formed, or optimal. Gradient Symbol Structures will be defined: structures (such as phonological strings or syntactic trees) in which each location hosts a blend of symbols, each present (or active) to a continuously variable degree.
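The idea of a blend of symbols at a single location can be sketched numerically. The following is a hypothetical illustration (the symbol names, vectors, and activity values are assumptions, not taken from the talk): each symbol has a distributed vector encoding, and a position's content is the activity-weighted sum of the symbol vectors present there.

```python
import numpy as np

# Hypothetical distributed encodings ("filler" vectors) for two symbols.
symbols = {
    "A": np.array([1.0, 0.0, 0.0]),
    "B": np.array([0.0, 1.0, 0.0]),
}

# Gradient activities: symbol A is present to degree 0.7, symbol B to 0.3.
activities = {"A": 0.7, "B": 0.3}

# The position hosts a blend: the activity-weighted sum of symbol vectors.
blend = sum(activities[s] * vec for s, vec in symbols.items())
print(blend)  # [0.7 0.3 0. ]
```

A discrete (purely symbolic) structure is the special case in which one symbol has activity 1.0 and all others 0.0; the continuous activities are what make the structure "gradient".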

See http://ling.umd.edu/events/archive/999/ for more information.

Wednesday, November 16, 2016
3:30pm

Maryland Room, Marie Mount Hall