
Monday's historic debate[1] between machine learning luminary Yoshua Bengio and machine learning critic Gary Marcus spilled over into a tit-for-tat between the two in the days following, mostly about the status of the term "deep learning."

The history of the term "deep learning" shows that its use has at times been opportunistic, while doing little to advance the science of artificial intelligence. Hence, the current debate will likely go nowhere, ultimately.

Monday night's debate found Bengio and Marcus talking about similar-seeming end goals, such as the need for "hybrid" models of intelligence, perhaps combining neural networks with something like a class of "symbol" objects. Where the two argued was in the details of definitions and terminology.

In the days that followed, Marcus, in a post on Medium[2], observed that Bengio seemed to have whitewashed his own recent critique of shortcomings in deep learning. Bengio replied, in a letter on Google Docs[3] linked from his Facebook account[4], that Marcus was presuming to tell the deep learning community how it can define its terms. Marcus responded in a follow-up post[5] by suggesting the shifting descriptions of deep learning are "sloppy."

Bengio replied again late Friday on his Facebook page with a definition of deep learning as a goal, stating, "Deep learning is inspired by neural networks of the brain to build learning machines which discover rich and useful internal representations, computed as a composition of learned features and functions." Bengio noted the definition did not cover the "how" of the matter, leaving it open.

Also: Devil's in the details in Historic AI debate[6]

The term "deep learning"
