Physician John Logan Black has been in the field of genetic testing for more than a decade. As co-director of the Mayo Clinic's personalized genomics laboratory, he remembers when it was "a dream" to one day be able to provide physicians with individual genetic tests so they could prescribe the right drugs.
That dream has been realized — years early. Mayo and the Minneapolis venture firm Invenshure late last week announced the formation of Oneome, which provides just that service.
Lots of things made Oneome possible, perhaps none as fundamental as what's under the hood of the computers now stacked in a server rack in a Mayo building in Rochester — what Invenshure partner Danny Cunagin called "a big data platform."
In 2014, off-the-shelf computers running yet another big data application rarely make news. That's the amazing part: how few people appreciate just how mind-blowingly remarkable having that kind of capability really is.
That was the powerful message delivered in Minneapolis recently by the Silicon Valley writer Michael S. Malone, author of a new history of Intel Corp.
He made a convincing case that the Intel trio of Robert Noyce, Gordon Moore and Andy Grove put the technology industry on its 50-years-and-counting trajectory of ever faster and cheaper digital devices. It's because of these three that tasks once conceivable only in an advanced research lab have become routine.
Malone focused his talk on the remarkable history of Moore's Law, named for scientist and Intel co-founder Gordon Moore.
It doesn't describe a scientific principle. It was mostly an observation that the total number of transistors on an integrated circuit seemed to double every 18 months or so, along with a prediction that it would keep doing so.
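To make the compounding concrete, here is a minimal sketch of the arithmetic behind a fixed doubling period. The starting count and the two-year period in the example are illustrative assumptions, not figures from Malone's talk or the article.

    # Illustrative only: the compounding implied by a fixed doubling period.
    # The starting count and doubling period below are assumptions for this
    # sketch, not figures taken from the article.

    def transistor_estimate(start_count: float, years: float, doubling_years: float) -> float:
        """Project a transistor count forward, assuming one doubling per period."""
        doublings = years / doubling_years
        return start_count * (2 ** doublings)

    if __name__ == "__main__":
        # Hypothetical chip with 2,300 transistors, projected 40 years ahead
        # at one doubling every two years: roughly 2.4 billion.
        print(f"{transistor_estimate(2_300, 40, 2):,.0f}")

Run over a few decades, even a tiny starting count compounds into the billions, which is the trajectory Malone described.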