Sydney Martinez, Campus Carrier news editor
The cost of technology has risen approximately 15% since late 2025, according to TechRadar, driven by the widespread use of artificial intelligence (AI) and the expansion of data centers. More specifically, shortages of the random-access memory (RAM) and graphics processing units (GPUs) typically used in phones, computers and laptops have caused this inflation.
“One thing that stands out to me about how [AI has] impacted the cost of technology is that there is a thing called a GPU, and it’s really good at doing lots of [parallel operations] at once,” Christopher Whitmire, lecturer of creative technology, said. “Often, GPUs are useful for that.”
AI systems rely on GPUs and CPUs for “machine learning,” the process of training algorithms that imitate human-like learning. GPUs speed this process up through advanced parallel processing, performing multiple operations at a time.
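The parallelism Whitmire describes can be sketched in plain Python. This is a minimal illustration, not how a GPU actually works: a real GPU runs thousands of hardware threads at once, but the core idea of applying one small operation across many data items simultaneously is the same. The `brighten` function and thread pool here are hypothetical stand-ins.

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(pixel):
    # One small, independent operation; a GPU applies thousands of these at once.
    return min(pixel + 50, 255)

pixels = [0, 60, 120, 180, 240]

# Serial, CPU-style: one pixel at a time.
serial = [brighten(p) for p in pixels]

# Parallel, GPU-style: the same operation mapped across all pixels concurrently.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(brighten, pixels))

print(serial == parallel)  # prints True: same result, different execution strategy
```

Either way the answer is identical; what parallel hardware buys is speed when the data set is millions of items instead of five.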
“There is now a higher need for parallel processing, so like rapid computational power,” Whitmire said. “Depending on the type of work you’re trying to do, you might be using GPUs to speed things up. Because there’s so much more demand for that, the cost is huge because GPUs are in a lot of different devices, especially PCs or game consoles, it’s driving up the cost of a lot of those [items].”
Whitmire said that AI companies are buying GPUs because they use the chips to train AI models, such as ChatGPT.
“They’re also very useful for training AI models,” Whitmire said. “I know a big thing that’s going on right now is that GPU cost has skyrocketed here the past couple of years because there’s all these data centers being built, so there’s a huge demand for these AI companies to need GPUs.”
Whitmire said that NVIDIA would stop producing GPUs for its video game consumers and focus instead on making them for AI companies. In fact, 2026 will be the first year in three years that NVIDIA will not release a new GPU generation for retail consumers. According to CNBC, this shift in focus began back in 2020.
“I remember seeing some gaming community, they were outraged,” Whitmire said. “NVIDIA is for video games. They’re the major player in that sphere. So there were people that were very upset by that.”
Brook Bowers (13C), visiting assistant professor of computer science, has seen headlines that blame AI companies for the increase in the cost of RAM.
“We have headlines that say ‘Because of the rise in popularity of certain AI technologies and techniques, then it seems to be causing shortages for the general consumer population’,” Bowers said.
Bowers said that the rise in popularity of AI demands a significant amount of RAM, causing the cost of parts to increase.
“Every computer uses memory, and so then suddenly even consumer devices are being impacted by [AI companies’] consumption,” Bowers said.

Bowers said that the price of GPUs has increased because AI companies have begun to operate on faster GPU systems. Many manufacturers, like NVIDIA, have begun to cater to AI companies after realizing the profit they could make off selling their chips.
“A good chunk of the marketplace was for consumer consumption, for games or media,” Bowers said. “Then some of these AI models have started leveraging GPUs and then that sector recognized, ‘Well, there’s a lot of GPUs out there that we could be buying and getting some value out of’.”
Bowers said that there are other factors that could be increasing the price of technology that aren’t related to AI.
“A number of years ago, maybe half a decade ago, there was a component shortage,” Bowers said. “It wasn’t through consumption, but [other causes] can be natural disasters in a region where these [computer parts] are manufactured or how many suppliers there are for things like memory.”
Sophomore Orlando Santiago noticed that his computer would cost much more to build today than in 2020. The cost of his RAM has risen roughly $220 per stick.
“I built my computer in 2020,” Santiago said. “I bought my DDR5 eight gigabyte sticks of RAM. I have four of them. They were about $80 for one, now they’re about $300 each.”
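Taking Santiago’s figures at face value, the arithmetic works out as follows. This is a back-of-the-envelope sketch; the prices are his recollections, not market data.

```python
old_price = 80    # dollars per 8 GB stick in 2020, per Santiago
new_price = 300   # dollars per stick today, per Santiago
sticks = 4        # his build uses four sticks

per_stick_increase = new_price - old_price   # 220 dollars per stick
build_then = old_price * sticks              # 320 dollars for all four sticks
build_now = new_price * sticks               # 1,200 dollars for the same four sticks
total_increase = build_now - build_then      # 880 dollars more overall

print(per_stick_increase, total_increase)    # prints: 220 880
```

So the same memory configuration that cost about $320 in 2020 would run about $1,200 today.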
Gartner, a global research and advisory firm, estimated a 17% increase in computer prices by the end of 2026, compared to prices from 2025.
Santiago read a February article in the Los Angeles Times about large tech companies, such as Apple and Tesla, expecting a shortage of dynamic random-access memory (DRAM), a type of RAM.
“What AI [companies are] doing specifically is that they are buying bulk from these companies that can only make them in certain environments,” Santiago said. “Taiwan’s really big because you need a specific factory in specific settings and a specific climate to make these chips, and [Alphabet Inc. and OpenAI] bought out all of them.”
Whitmire described the AI bubble, the potential over-valuation of AI companies. The bubble may burst if AI companies are not able to grow. Whitmire said that if the market for AI were to collapse, technology would become cheaper within a couple of years.
Whitmire said that while other markets use GPUs, AI is the main driver of the recent price increase.
“It’s hard to tease out how much of the increasing cost is just general inflation,” Whitmire said. “At some level, everything’s getting more expensive, just for a variety of reasons. But yeah, for technology, I could also see, because new apps or like services or things like that, they’re all trying to integrate AI tools, so there’s all these new features. You have to pay for that.”
Whitmire proposed two solutions to mitigate the increase in cost of technology. He said that either the industry could produce more GPUs and CPUs or a new policy could be created to change the way AI is used.
“Maybe it could be that if [processing unit manufacturers] can just somehow [produce] more GPUs, that would drive the price down,” Whitmire said. “Maybe kind of interestingly is that if there is also policy or just a change in culture, in terms of wanting to use these tools, that could also in the long term make prices go back down.”
