The artificial intelligence (AI) market is growing fast. One powerful subset of AI is machine learning (ML), which relies not on predefined instructions and fixed algorithms but on patterns learned over artificial neural networks. Developers have used ML to solve mission-critical problems with high speed and accuracy in a wide range of domains, including agriculture, e-commerce, education, finance, manufacturing, medicine, networking, transportation and more.
For many IT practitioners and consultants, ML is a matter of not if but when and how. They are asking themselves questions such as, "What's my use case, design and scale? Which ML techniques will I deploy (language processing, classification, anomaly detection, etc.)? How will I deploy—DIY or with external help—and train the model?"
Recently, however, another set of questions has arisen concerning the external costs and environmental impact of ML. That applies especially to deep learning—neural networks with several layers of interconnected nodes.
Recent Studies
One of the early alerts on this topic was a 2019 paper by three scientists from the University of Massachusetts, Amherst, who estimated that training a single natural language processing (NLP) deep learning model emitted roughly five times as much CO2 as an average car emits over its entire lifetime. (And in commercial deployments to date, the energy consumed in training is typically small compared to the energy consumed serving inference at scale, so the total carbon footprint of a deployed model can be larger still.) More studies followed.
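To make the scale of such estimates tangible, here is a back-of-the-envelope sketch of how training emissions are commonly calculated: average hardware power draw times training hours times data center overhead (PUE), multiplied by the carbon intensity of the local electricity grid. Every number in the example is an illustrative assumption, not a figure from the study above.

```python
# Back-of-the-envelope estimate of training emissions.
# All numbers below are illustrative placeholders, not measurements
# from any specific model or paper.

def training_co2_kg(avg_power_kw: float,
                    training_hours: float,
                    pue: float,
                    grid_kg_co2_per_kwh: float) -> float:
    """Estimate CO2 (kg) for one training run.

    energy (kWh) = average hardware power draw x hours x data center PUE
    CO2 (kg)     = energy x carbon intensity of the local grid
    """
    energy_kwh = avg_power_kw * training_hours * pue
    return energy_kwh * grid_kg_co2_per_kwh

# Example: 8 accelerators at ~0.3 kW each, two weeks of training,
# PUE of 1.1, grid at 0.4 kg CO2 per kWh -- all assumed values.
print(training_co2_kg(avg_power_kw=8 * 0.3,
                      training_hours=14 * 24,
                      pue=1.1,
                      grid_kg_co2_per_kwh=0.4))
```

Under these toy assumptions, a modest two-week run on eight accelerators works out to roughly 355 kg of CO2; the headline figures in the studies above are far larger.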
Reducing Energy Consumption
A subsequent paper from Google and UC Berkeley showed how complex it is to calculate the energy consumption of deep neural networks (DNNs), especially retroactively, but advocated for transparency going forward where practical. To improve energy efficiency, the authors recommended leveraging sparsity in DNNs, scheduling workloads in geographic locations with cleaner energy and paying attention to the efficiency of the specific data center infrastructure used.
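As a concrete illustration of the sparsity recommendation, the sketch below uses PyTorch's built-in pruning utilities to zero out most of a small network's weights. The model, layer sizes and pruning amount are arbitrary examples, and real energy savings only materialize on hardware and software that can actually exploit the zeros.

```python
# Minimal sketch of inducing sparsity with magnitude pruning in PyTorch.
# The architecture and the 80% pruning amount are arbitrary examples,
# not a recipe from the paper discussed above.
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))

# Zero out the 80% of weights with the smallest magnitude in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.8)
        prune.remove(module, "weight")  # make the sparsity permanent

# Report how sparse the network is now.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"sparsity: {zeros / total:.1%}")
```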
Engineers and scientists are also looking at the physical layer in new ways. An article published last year in Semiconductor Engineering listed 11 ways to reduce AI energy consumption, including some more conventional tips (smaller models, moving less data, sparsity) and others more striking. Note numbers nine and 10: "Using analog circuitry" and "Using photons instead of electrons." In other words, a nonstandard kind of computation.
Business Takeaway
Large data center operators have a record of exploring new computing designs. First came the move from generic CPUs to purpose-built GPUs. About five years ago, Google introduced tensor processing units (TPUs), application-specific integrated circuits (ASICs) designed to accelerate ML workloads. This search for optimal computing infrastructure is far from over.
What's the practical takeaway from this discussion? Like cryptocurrencies, deep learning applications such as NLP have large and rising energy costs that are coming under increasing public scrutiny. As business organizations pay more and more attention to ESG (environmental, social and governance) factors, their leaders will need to take these costs into account. Looking ahead, the criteria for AI, and for ML more specifically, will include not only speed and accuracy but also energy efficiency.
(This is a slightly modified version of an article originally published in Forbes. The original article can be found at https://www.forbes.com/sites/forbestechcouncil/2022/04/13/ai-and-ml-think-green/?sh=8f88da741a9d)