Researchers from Oregon State University and Baylor University have achieved a significant breakthrough in reducing the energy consumption of the photonic chips used in data centers and supercomputers. Their findings, published in the journal Scientific Reports, are particularly significant given the substantial energy usage of data centers, which can consume up to 50 times more energy per square foot than a typical office building, according to the U.S. Department of Energy.
Data centers serve as the central hub for an organization’s IT operations, storing, processing, and disseminating data and applications. In the United States, where companies like Facebook, Amazon, Microsoft, and Google generate and consume vast amounts of data, there are over 2,600 data centers. These facilities account for approximately 2% of the country’s total electricity consumption, according to the DOE.
The research, led by John Conley of Oregon State University's College of Engineering together with his former colleague Alan Wang (now at Baylor) and OSU graduate students Wei-Che Hsu, Ben Kupp, and Nabila Nujhat, introduces an ultra-energy-efficient method of compensating for the temperature variations that degrade the performance of photonic chips. These chips are expected to form the high-speed communication backbone of future data centers and supercomputers.
Unlike conventional computer chips, which rely on electrons, the circuitry in photonic chips uses photons (particles of light), allowing data to move at the speed of light with high energy efficiency. Keeping photonic chips at a stable temperature so they perform reliably, however, has traditionally required significant energy. The research team has now demonstrated that the energy needed for temperature control can be reduced by a factor of more than a million.
Conley explains that the breakthrough was made possible by combining Wang's expertise in photonic materials and devices with his own specialization in atomic layer deposition and electronic devices. By using gate voltage, rather than a heating current, to compensate for temperature variations, the researchers created working prototypes that require virtually no electric current.
Typically, the photonics industry relies on thermal heaters to fine-tune the working wavelengths of high-speed electro-optic devices and optimize their performance. However, these thermal heaters consume several milliwatts of electricity per device. While this may seem insignificant on an individual basis, the cumulative energy consumption becomes substantial when considering the millions of devices used in data centers and supercomputers. The new method developed by the researchers offers a more environmentally friendly alternative, allowing for faster and more powerful data centers that consume less energy.
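To give a sense of the scale of that cumulative consumption, here is a rough back-of-envelope sketch in Python; the per-heater figure (2 mW) and the device count (one million) are illustrative assumptions, not numbers reported in the study.

# Rough estimate of aggregate thermal-heater power in a large photonic deployment.
# Both inputs below are illustrative assumptions, not figures from the study.
MILLIWATTS_PER_HEATER = 2.0      # "several milliwatts" per device (assumed value)
DEVICE_COUNT = 1_000_000         # hypothetical number of photonic devices

total_watts = MILLIWATTS_PER_HEATER * DEVICE_COUNT / 1000.0   # mW -> W
kwh_per_year = total_watts * 24 * 365 / 1000.0                # W -> kWh per year

print(f"Continuous heater load: {total_watts:,.0f} W")
print(f"Energy over a year: {kwh_per_year:,.0f} kWh")

Under these assumed figures, heater-based tuning alone amounts to a continuous 2 kW load, or roughly 17,500 kWh per year; the team's gate-voltage approach is reported to cut the energy needed for this temperature compensation by a factor of more than a million.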
The breakthrough could also benefit applications driven by machine learning, such as ChatGPT, enabling more powerful systems without a corresponding rise in energy consumption and its associated environmental impact.
Source: Oregon State University