One of the challenges of applying machine learning to natural language is the environmental cost of building models. Mainstream approaches focus on very large models that manipulate language statistically, requiring huge amounts of training data and computing power to achieve only a superficial understanding of language. In his talk at AI Global Forum 2020, Francisco Webber, co-founder and CEO of Cortical.io, argues that the future of natural language understanding lies not in building ever bigger, data-hungry, energy-inefficient models that merely appear to understand language, but in taking an efficient approach to AI that leverages what biology teaches us. By replicating cognitive processes observed in the brain, Semantic Folding proposes a highly efficient NLU model that captures semantics at a fine-grained level.
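Semantic Folding encodes text as sparse binary "semantic fingerprints", and similarity can then be measured by how many active bits two fingerprints share rather than by dense floating-point arithmetic. The sketch below is a minimal illustration of that idea, assuming fingerprints are represented as Python sets of active bit positions; the function name, the bit positions, and the example words are all hypothetical, not taken from Cortical.io's implementation.

```python
# Hypothetical sketch of comparing sparse binary semantic fingerprints.
# A fingerprint is modeled as a set of active bit indices on a semantic map;
# all values below are illustrative, not real Cortical.io data.

def overlap_similarity(fp_a: set, fp_b: set) -> float:
    """Similarity as the fraction of shared active bits between two sparse fingerprints."""
    if not fp_a or not fp_b:
        return 0.0
    return len(fp_a & fp_b) / min(len(fp_a), len(fp_b))

# Toy fingerprints: semantically related words share many active bits.
fp_dog = {101, 2048, 3333, 7777, 9001}
fp_wolf = {101, 2048, 3333, 8888, 15000}
fp_car = {5, 600, 12000, 13000, 16000}

print(overlap_similarity(fp_dog, fp_wolf))  # high overlap: related concepts
print(overlap_similarity(fp_dog, fp_car))   # no overlap: unrelated concepts
```

Because the representations are sparse and binary, such comparisons reduce to cheap set intersections, which is one way a fingerprint-based approach can be far more compute- and energy-efficient than large statistical models.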
December 18, 2020