Spotlight: Brian Courtney, GE, on Industrial Internet’s Big Data Opportunities and Big Challenges

GE's Brian Courtney is a bit like an explorer in the newly discovered, uncharted territory of industrial Big Data. After all, what engineer wouldn't want to know what was happening with every piece of equipment on the assembly line or every product unit in service out in the field every day?

As general manager of GE Intelligent Platforms’ Industrial Data Intelligence Software group, his job is to figure out how to turn a tsunami of data into a useful and manageable stream in the brave new world of Big Data. So he's also like the guy who is saying, "Be careful what you ask for, 'cause you just might get it."

Consider these points Courtney shared with Tech Trends Journal about the implications of the Industrial Internet:

  • Half of all the data ever created in human history was created in the past two years.
  • A GE jet engine takes off every two seconds. Each engine generates a terabyte (TB) of data per flight, which is compressed to 100 gigabytes (GB).
  • Producing steel at 60 feet per second or making 60 diapers a second requires high-speed sampling and monitoring to minimize waste in the event of a problem.
  • Modern automated industrial lines can generate as much as 1 TB of data per day.
  • At 1 million transactions per second, 1 TB will take 7.5 days to process.
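
To put the volumes above in perspective, here is a rough back-of-envelope sketch in Python. The per-flight and takeoff figures come from the points above; the assumptions that every takeoff corresponds to one data-generating flight and that only the compressed 100 GB is moved off the aircraft are mine, not GE's.

```python
SECONDS_PER_DAY = 24 * 60 * 60

# Figures quoted above
takeoff_interval_s = 2            # a GE jet engine takes off every two seconds
raw_per_flight_tb = 1.0           # ~1 TB of raw data generated per flight
compressed_per_flight_gb = 100    # compressed to ~100 GB

# Assumption (mine): one data-generating flight per takeoff
flights_per_day = SECONDS_PER_DAY / takeoff_interval_s            # 43,200 flights
raw_per_day_pb = flights_per_day * raw_per_flight_tb / 1_000      # ~43 PB raw
compressed_per_day_pb = flights_per_day * compressed_per_flight_gb / 1_000_000  # ~4.3 PB

print(f"Flights per day:      {flights_per_day:,.0f}")
print(f"Raw data per day:     ~{raw_per_day_pb:,.0f} PB")
print(f"Compressed per day:   ~{compressed_per_day_pb:.1f} PB")
```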

Big Data is a tremendous opportunity, but the challenges are also huge. The questions Courtney is asking are: How do you move that much data? Where do you store it? And most important, how do you process it all in a way that extracts the most valuable information as quickly as possible? A jet engine that is about to fail in the air cannot wait a day or two to be discovered.

GE sees massive potential in the Industrial Internet. Citing a Wikibon report, the company forecasts $514 billion of investment and $1.3 trillion of value created by 2020, based on the assumption of a 1 percent increase in efficiency for all impacted processes.

Much of that value will come out of the knowledge of equipment health, process health, process optimization, and operational optimization, which is the charter of the Industrial Data Intelligence group.

This intelligence will lead to avoidance of defects and of both material and energy waste, as well as a reduction in the number and severity of failures of operational equipment in the field.

That implies that GE’s customers will be willing to share the information continually generated by their machines' internal digital sensors with GE engineers, or perhaps with third-party providers. This will provide tremendous insight not only for the maintenance of the current generation of products and industrial equipment but for the design of the next generation, as well.

"We need to get smarter about how we process the data," said Courtney. Algorithms need to return results, not just data sets. Those results lead to answers to: “Is this particular failure systemic? Is it the design, the supplier, the batch?"

These questions are not new. Quality teams have been asking them for years. Only now the tools available for seeking the answers are far more powerful. Those tools also allow for better questions to be explored.

Today's questioners seek trends hidden in massive amounts of data. That is where GE is making its $1.5 billion investment in the Industrial Internet. Courtney noted, "We need tool sets, we need capabilities, and we need new kinds of analytics to help us drive to that. There's this whole generation of understanding that we'd like to get from the information. That's really hard to do today, because of the limitations of systems that just can't deal with the volume of data."

When asked about the role of hierarchical control and distributed processing, he said, "There is real-time closed-loop control and process optimization. Then there is prediction and process health monitoring to determine if the asset is near failure. All of that is being done today, and all of that is getting better with more data. It's real time, it’s getting data at the point of generation (on the equipment), at the point of collection (at the site level), and at the point of aggregation (remote analytics)."

Courtney continued, “Realistically, as much as people love the cloud, there will always be in-machine controls required for safety and latency issues. As you move up the ladder, you discard bandwidth through data decimation. You don't need millisecond data for long-term trend analysis. As we learn more, we'd sometimes like to be able to go back, though it's still cost prohibitive to store everything."
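
A minimal sketch of the data decimation he describes, assuming millisecond samples are rolled up to one-minute aggregates before being forwarded upstream for long-term trend analysis (the signal, sample rate, and interval are illustrative assumptions, not GE's):

```python
import numpy as np
import pandas as pd

# Simulate one hour of millisecond-resolution sensor readings at the machine.
idx = pd.date_range("2024-01-01", periods=3_600_000, freq="ms")
signal = pd.Series(np.random.normal(loc=75.0, scale=0.5, size=len(idx)), index=idx)

# Decimate before sending up the ladder: long-term trend analysis does not need
# millisecond data, so keep a one-minute mean plus min/max to preserve excursions.
trend = signal.resample("1min").agg(["mean", "min", "max"])

print(f"Points at the machine:  {len(signal):,}")   # 3,600,000
print(f"Points sent upstream:   {len(trend):,}")    # 60
```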

Right now, GE's monitoring and diagnostics centers take in about 5 TB of data per day and put it into their Historian databases. It turns out that half of the data access has been by people designing the next generation of products, which, it seems, will not only be more functional but also more communicative. The ability to compare the performance of a machine with others of its type will yield improvements in efficiency, for example.
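
As an illustration of that fleet comparison (the data and threshold below are hypothetical, not the Historian product or its API), one might flag a machine whose efficiency drifts well below others of its type:

```python
import pandas as pd

# Hypothetical efficiency readings for five machines of the same type.
fleet = pd.DataFrame({
    "machine": ["M1", "M2", "M3", "M4", "M5"],
    "efficiency_pct": [92.1, 91.7, 88.2, 92.4, 91.9],
})

fleet_mean = fleet["efficiency_pct"].mean()
fleet_std = fleet["efficiency_pct"].std()

# Flag machines running well below their peers; the 1.5-sigma cutoff is an assumption.
fleet["below_peers"] = fleet["efficiency_pct"] < fleet_mean - 1.5 * fleet_std

print(fleet)
print(f"\nFleet mean efficiency: {fleet_mean:.1f}%")
```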

"Right now, we're looking at a 1 percent improvement,” Courtney said. “If you look ahead over the next 10 years, to when there are 75 billion interconnected devices, when every machine has its own IP address, who knows? Maybe we'll see 5 to 6 or even 10 percent efficiency improvements. And if 1 percent is worth tens of billions, imagine with that could be worth?"
