In the Netherlands, there's an ambitious company that builds one of the most advanced and expensive tools in the world: a single unit costs hundreds of millions of dollars.
And when companies buy one, they also need 250 engineers to install the 165-ton device in a process that typically takes half a year.
But despite this steep cost in time and money, many microchip makers desperately want one of these machines.
The hundred-million-dollar question is: why?
The answer has to do with something called Moore's Law.
First coined by Intel co-founder Gordon Moore, this law states that every 1 to 2 years, the number of transistors that can fit on a computer chip of a given size will double.
And by extension, the rough number of calculations that chip can do per second will also double.
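That doubling is just exponential growth, and the sketch below makes the arithmetic concrete. The starting point (Intel's 4004 from 1971, with roughly 2,300 transistors) and the 2-year doubling period are illustrative assumptions, not figures from this narration:

```python
# A minimal sketch of Moore's Law as exponential growth.
# Assumptions (illustrative, not from the transcript): Intel's 4004,
# released in 1971 with ~2,300 transistors, and a 2-year doubling period.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count per chip under a Moore's Law trend."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Each decade means five doublings, i.e. a 32x increase.
for year in (1971, 1981, 1991, 2001):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Under these assumptions the count grows 32-fold per decade, which is why even a modest-sounding "doubling every two years" compounds into billions of transistors within a few decades.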
Now, this law isn't a physical law like gravity.
It's just a trend Moore observed in the mid-1960s.
But chipmakers turned that trend into a goal, and in turn, consumers learned to expect computing progress to continue at this exponentially fast pace.