Machine Learning for Real: Why Principles, Efficiency, and Ubiquity Matter
Marc Andreessen famously opined that “software is eating the world.” Recently, various people have suggested that AI is eating software. Deep learning seems to touch every discipline these days, but behind its startling magic tricks, it is surprisingly primitive. Most deep learning today requires vast data centers whose power consumption burdens an already overstressed planet. Add in the latency and loss of privacy, and it is clear that society would benefit from gadgets that do not require connection to the cloud to be “smart”. Even more concerning, however, is the strong dependence of today’s deep learning on folklore: on recipes and anecdotes, rather than scientific principles and explanatory mathematics. Instead, we can develop rigorous, scalable machine learning guided by information theory to create models that are predictive, power-efficient, and cost-effective.
Strong ties link the evolution of computing, semiconductor technology, and design automation. The unprecedented growth of system solutions, services, and markets is due to the cross-fertilization of various areas of science and technology, where new problems often motivate new solutions in a circular way. Design automation has guided engineers through uncharted territories, helped shape our digital society on robust foundations, and provided us with a launchpad for the future.