SEATTLE — For PGS, an oil-imaging company in Oslo, Norway, finding pockets of oil and natural gas in the ground essentially starts by taking a large ultrasound picture of Earth.

"It involves huge amounts of data," said Guillaume Cambois, PGS' executive vice president of imaging and engineering. "And, of course, time is of the essence."

PGS, short for Petroleum Geo-Services, this year tried to speed up that work by buying a supercomputer built by Seattle-based Cray. The computer, housed in several pantrylike cabinets, takes PGS' massive library of images and data, and applies algorithms to get crystal-clear pictures that speed the complex task of finding oil.

PGS is one of several businesses that have been buying Cray's supercomputers, a shift for Cray, which has traditionally sold to government agencies and academic institutions. Cray said 15 percent of its revenue last year, expected to fall between $720 million and $725 million, came from sales to businesses. That's double the percentage from 2014.

Cray's shift comes at a time when demand for cloud computing — which gives businesses access to vastly expanded computing power — is rising as corporate big-data needs increase. But demand for supercomputers is also strong; the massive machines do some things that the cloud just can't.

Cray isn't alone in this. At IBM, sales grew last year for just one of its more than a dozen lines of business, according to estimates by investment bank UBS. That would be mainframes, the giant computers with tons of processing power that Big Blue has been selling for decades. Sales by IBM's System Z unit soared 30 percent, to $2.8 billion, UBS estimates.

Market-research firm IDC says sales of high-performance computers reached $10.22 billion in 2014 and estimates the market will grow 8.6 percent a year over the following five years, topping $15 billion in 2019.

The uptick in sales of giant computers by Cray, IBM and others comes after decades of struggling to compete with smaller computers and, more recently, the cloud. It's also a reminder that established technologies sometimes show surprising staying power in the face of rapid change.

The history of technology is largely a story of the new elbowing out the old. Personal computers were the death knell for the typewriter. The iPhone started a wave of change that would dethrone cellphone giants Nokia and BlackBerry.

The Seattle area, home to Amazon Web Services and Microsoft's Azure platform, is the epicenter of what technology analysts say is a once-in-a-generation shift in how people and businesses deal with their digital goods.

Still in demand

But that move toward cloud computing, or using giant data centers to store data and run software programs, hasn't spelled the end of the line for the business of selling refrigerator-sized computers.

As Jefferies analysts noted after IBM reported its fourth-quarter financial results recently, "the entire world is not moving to the cloud all at once."

For some companies with heavy-duty computing needs, "the economics don't make sense" to move to the cloud, said Donna Dillenberger, a technical fellow with IBM who specializes in business-focused computer systems. "It would be cheaper to have their own on-premise data center."

Many buyers of mainframes or other high-performance computers belong to industries like insurance or finance. Because of regulatory or other restrictions on how they use data, they tend to stay plugged in to powerful computers they own and operate themselves.

In other cases, complicated software developed over decades would be tough to rework for the cloud. That includes things like airline-reservation systems or complex logistics and scheduling software for railroads or utilities.

Microsoft and rival Amazon.com are introducing increasingly powerful computers that customers can rent, but analysts say high-performance computers can clear technical hurdles that most "public clouds" of pooled servers can't.

Steve Conway, an analyst with IDC, said the cloud is great at simpler technical problems. But supercomputers are often needed for complex problems where one small design change may have a ripple effect that changes 50 other inputs and everything needs to be calibrated as one, he said.

New horizons

Commercial companies have used supercomputers for decades. But as costs for the technology come down and the amount of data companies collect rises, Cray sees an opportunity to make a splash.

The company sold supercomputers to businesses across five industries in 2015, including financial services, manufacturing and life sciences. A corporation may buy the same type of computer as the $100 million model that goes to a government agency, but its version may cost a few hundred thousand dollars. The trade-off is less computing power.

Cray, said Chief Strategy Officer Barry Bolding, is moving "away from being exclusively a provider to big government." For the company, which employs 150 people in Washington and 1,270 worldwide, with engineering and development centers in St. Paul, Minn., and Chippewa Falls, Wis., targeting commercial buyers is something of a blast from the past.

Cray sold the auto industry its first supercomputer in 1979. But in more recent years, the majority of its sales have gone to government clients.

Getting back on track

Cray itself is emerging from a bumpy stretch of corporate deal-making in the 1990s that saw the company change hands twice in four years and a corporate cousin go bankrupt. That period ended in 2000, when Seattle-based Tera Computer acquired the Cray Research division and renamed itself Cray.

"Their history since then has been kind of a low-hanging-fruit thing," said IDC's Conway, who worked at Cray in the early 2000s. "The first order of business was to get revenue coming in so they went after the government and large universities and have done really well in building that company back."