The energy industry has always produced data, generally in large amounts. That is about to change: the industry is headed for a proverbial blowout of data. The sources of the bonanza are sensors, regulations and economic conditions, and the impact reaches across the entire value chain of energy production and distribution.
On the upstream side, energy companies have continued to invest in seismic software, visualization tools and other digital technologies to drive greater efficiency.
Now, with the rise of pervasive computing devices—affordable sensors that collect and transmit data—the midstream players (pipelines, logistics and wholesale marketers, i.e. traders) are seeking informational advantage wherever possible.
Finally, the downstream market has also seen massive changes, prompted in part by smart meters and sensors but also by new sources of customer data.
Data will become a source of competitive advantage at each stage, across those stages and in every transaction.
The challenge facing the energy industry, however, is that current data exploration and analytical platforms were designed to handle millions of rows in a reasonable timeframe. Now energy companies are looking at billions of rows, and “reasonable” means sub-second latency.
Traditional CPU-powered solutions simply collapse under that strain. Those that do survive are clusters of 100+ blade servers that cost millions of dollars per year to maintain.
The future of fast, immersive data exploration lies in GPU-tuned applications. Using the parallel processing power of these chips, energy companies and utilities can visually explore multi-billion-row datasets in real time, overlaying maps and using the rendering capabilities of the GPU to identify opportunities, classify risk and monitor their networks.
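To make the data-parallel pattern concrete, here is a minimal sketch of the kind of aggregation such a platform performs constantly: rolling billions of individual readings up into per-region totals in one vectorized pass. The dataset here is simulated and the column names are hypothetical; NumPy serves as a CPU stand-in, and GPU array libraries such as CuPy expose a largely compatible API, so the same reduction pattern maps directly onto thousands of GPU cores.

```python
import numpy as np

# Hypothetical example: aggregate simulated smart-meter readings by region.
# NumPy stands in for a GPU array library here; CuPy's numpy-compatible API
# lets the same vectorized pattern run on the GPU.

rng = np.random.default_rng(seed=42)
n_rows = 5_000_000          # small stand-in for a multi-billion-row table
n_regions = 1_000

region = rng.integers(0, n_regions, size=n_rows)   # region id per reading
kwh = rng.exponential(scale=1.5, size=n_rows)      # simulated consumption

# One vectorized reduction replaces a row-by-row loop: this is the kind of
# data-parallel aggregation a GPU executes across all rows simultaneously.
total_kwh = np.bincount(region, weights=kwh, minlength=n_regions)
counts = np.bincount(region, minlength=n_regions)
mean_kwh = total_kwh / counts

print(total_kwh.shape)      # one aggregate per region: (1000,)
```

The key design point is that there is no per-row loop: the whole table is reduced in a single pass, which is exactly the shape of work that parallel hardware accelerates.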