Generating Intelligence from Data-Intensive Assets
by Jenny Mangelsdorf
Talking about data can be as exciting as talking about mud, unless you’re living at the bottom of a fire-torched hill on the third day of a torrential rainstorm. Today more than ever, companies and their IT managers are finding themselves at the bottom of that hill, watching as their organizations fill with growing pools of data.
However, some savvy ones are looking at building enterprise collection and interpretation capabilities where experts and decision makers can pluck nuggets from the waters and use them to improve both their bottom and top lines. Without this ability, organizations run the risk of being swamped in data, unable to nimbly respond to critical issues and opportunities.
“The challenge is getting people to think of the business opportunities around their data,” says Bob Welch, president of CSC’s Chemical, Energy, and Natural Resources Group. “Ten years ago, technology didn’t allow us to even consider this. Now we have the IT maturity to think about how data can benefit an entire enterprise.”
While an organization’s quest to generate knowledge from its data isn’t new, new stresses and drivers, along with the sheer amount of data now generated, call for better solutions. Take the oil and gas industry. According to Bill Kuzmich, CSC’s Global Energy Sector leader, a major oil company estimates that this year it will be managing more than 100 petabytes1 of data, while a large refinery estimates it has 30,000 input/output subnets, each generating four terabytes of raw data every day.
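Taking the quoted figures at face value (they are estimates, and the conversion assumes the decimal definition of a petabyte given in the footnote), a quick back-of-envelope calculation illustrates the scale of the refinery's raw output — only a fraction of which would typically be retained:

```python
# Back-of-envelope data volumes, using the estimates quoted in the text.
TB_PER_PB = 1_000  # decimal units, per the footnote's definition of a petabyte

subnets = 30_000            # I/O subnets at the large refinery (estimate)
tb_per_subnet_day = 4       # raw terabytes each subnet generates per day

daily_tb = subnets * tb_per_subnet_day
daily_pb = daily_tb / TB_PER_PB

print(f"Refinery raw output: {daily_tb:,} TB/day ({daily_pb:,.0f} PB/day)")
```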
For oil and gas companies and regulators alike, a key challenge is quickly getting to the data they need. In fact, when CSC and Hart Energy Publishing asked petroleum industry survey respondents last year where the greatest promise of improved productivity might appear, they pointed to “improved access to technical data and information” as most important.
“People are still fumbling around trying to find data,” says Kuzmich. “The only difference today is that they actually know where the data is — they just can’t get to it. Regulatory agencies and oil companies alike are wrestling with the amount of data that must be submitted. But gathering the data is not enough. We must now be able to use this information to quickly identify and respond to emergencies.”
Besides emergencies, the ability to quickly access data also helps oil companies bring their oil out of the ground sooner, and therefore realize a faster financial return. For gas companies, the ability to quickly get to the right data provides an edge in today’s fiercely competitive arena for new resources.
Data drives value
“This is an industry that’s measured sometimes in milliseconds and sometimes in eons,” says Kuzmich. “The whole industry is getting turned on its head because the amount of data they’re dealing with is only going to get larger, and decisions need to be made faster.”
For oil companies, until their assets – their proven reserves – come out of the ground, those assets are, in essence, data. In fact, says Kuzmich, 10 years ago a major oil company increased its net worth by about 10 percent without drilling an extra oil well.
“New software and interpretations allowed them to prove they had more oil,” he says. “By taking a look at that same old data and massaging it in a different way, they changed the value of their company.”
To help exploration and production companies turn their data into business intelligence, CSC has developed a Petroleum Enterprise Intelligence (PEI) solution that integrates financial, operational, and technical data into one system, enabling geoscientists and engineers to make more rapid and knowledgeable decisions, such as determining the implications and asset requirements of bringing a damaged resource back online.
Innovations like PEI will only grow more important as the “big crew change” draws closer: experts predict that within the next 10 years more than half of the industry’s workforce, and thus its knowledge, will retire. Even though companies and universities are working to mitigate this event, readily accessible enterprise data will be key.
Location, location, location
The ability to tap an enterprise’s data, regardless of its physical location, also can play a key role in attracting a quality workforce – whether for the oil and gas industries or the mining industry. If a company can offer a metropolitan workplace rather than an offshore oil platform or a generator-powered mining trailer perched high atop a mountain, its ability to attract qualified employees increases. With the ability to access and manipulate enterprise data, a company can assemble expert teams, regardless of their actual location, that can focus on numerous projects without the need to be constantly on site.
For the mining industry, this ability also supports the increasing need for automation and remote operation as mines move deeper below the Earth’s surface. For example, AngloGold Ashanti has formed a Technology and Innovation Consortium2, in which CSC is the only IT company involved, to help determine how it can mine gold at 5,000 meters below the Earth’s surface. Not surprisingly, data figures in three of the four key IT challenges CSC’s mining experts have identified in analyzing how to break this 5,000-meter barrier.
“When working this deep, the environment and the logistics and energy requirements to operate become very difficult,” says Welch. “It’s all about information — and the data problem there is enormous — because what they actually need to do is be able to operate at those deeper levels without people and that will fundamentally depend on robust, reliable, and safe IT and communications systems.”
Already today, data creates issues for mining operations. One reason is the number of sensors used to track and monitor mining equipment and what’s happening in a mine. For example, today’s mining trucks each carry 300 to 400 sensors that provide such information as temperature, location, and visual data. A typical coal mine in Australia, says Welch, has about 40 to 50 of these trucks, plus other assorted pieces of equipment, which together form a huge fleet of data-spewing mobile machinery.
“Currently, most mining companies don’t have sophisticated enough systems to make use of that sensor data,” he says. “However, even though this is a more complicated problem than in, for example, discrete manufacturing (where the equipment doesn’t drive around and the variable ranges aren’t as dramatic), you don’t need to learn this from scratch.3 There are plenty of people who know how to do this. NASA has been doing it for years.”
Data improves top and bottom lines
In any industry, there has to be a compelling reason to allocate funds for new projects. For AngloGold Ashanti, a key driver for breaking the 5,000-meter barrier is quite simply to continue mining its gold. Sustainability is another driver for the mining industry, which needs a “license to operate.”
“A mining company can do what it likes, but if it doesn’t have the support of society, if you like, and in particular the local communities, then it gets very difficult for them to operate,” says Welch. “In the near future, they’re all understanding that they’ll have to not just comply with legislation, but comply with societal expectations, too, and to do that, they need to collect all of this data so they can talk about their impact.”
Chemical companies are also challenged by this issue as the number of regulatory mandates continues to escalate and evolve worldwide, and as business partners, investors, and customers increasingly demand sustainability data. Last year, the second annual Sustainability and Compliance Survey conducted by CSC and Chemical Week magazine showed that only 19 percent of respondents felt their business processes and technology systems were adequate to keep pace with future requirements.
Today, many chemical companies manage their information through spreadsheets or manual calculations. Since many of their systems are siloed, they also have to tap multiple places repeatedly, and in many cases manually, to get the data they need.
“We’re finding that companies not only agree they have to be regulatory compliant, they also want to differentiate their business and drive brand recognition by positioning themselves as being an environmentally friendly or green company,” says Chuck Deise, CSC’s Chemical Sector leader. “You can’t do that, however, unless you have the data.”
An increasing number of corporations are adopting sustainability scorecard programs that mandate suppliers provide such information as energy use, carbon emissions, waste, and water consumption. As with regulatory requirements, without the data, suppliers cannot show they meet these scorecard requirements. Companies also need enterprise data to improve efficiency and, in turn, reduce their environmental footprints.
“By having all this data, you can improve your top line and your bottom line,” says Deise. “Where last year the issue was a right-to-sell and right-to-operate, now CEOs are looking at how they can leverage data to become more environmentally friendly, reduce their energy costs, be more attractive to investors and consumers, as well as make sure they’re meeting government regulations. That’s the big data problem we’re trying to solve.”
To help chemical companies with their data issues, CSC has developed an Enterprise Compliance & Sustainability solution that enables a company to collect regulatory information from across the enterprise, verify it, and then distribute reports to the many regulatory agencies that monitor their business activities.
Making data smart
The utilities industry is seeing similar floods of data coming its way as it builds “smart” infrastructures to monitor, manage, and regulate resource generation and distribution, and to help customers manage consumption. Experts forecast that smart initiatives will create the largest increase in data the utility industry has ever seen. They will also give utilities the ability to make split-second decisions: if an earthquake damages one part of a utility’s grid, for example, the utility can move customers to a different sector.
“What used to take several hours to make a decision, you can make in less than a second — that’s the promise of smart grids,” says Mike Bassigiani, CSC’s Global Utilities leader.
However, that’s only true if the utility can manage its data and transform it into business intelligence. A preliminary estimate for one utility is that the smart grid will generate 22 gigabytes of data each day from each one of its two million customers. For utilities using interval-based meters, such as in California and Texas, each meter, which used to send five data elements such as a date and time stamp, now transmits up to 5,000 discrete data elements a month, just to render an invoice.
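Taking the estimates above at face value (the customer count, per-customer volume, and element counts are all the article's quoted figures, not measurements), a quick calculation makes the growth concrete:

```python
# Scale of smart-grid data, using the estimates quoted in the text.
customers = 2_000_000        # customers at the utility in the estimate
gb_per_customer_day = 22     # preliminary per-customer estimate, GB/day

total_gb_day = customers * gb_per_customer_day
total_pb_day = total_gb_day / 1_000_000   # 1 PB = 1,000,000 GB (decimal)

elements_before = 5          # data elements per meter read, previously
elements_after = 5_000       # discrete elements per meter per month now

print(f"Fleet total: {total_gb_day:,} GB/day (~{total_pb_day:.0f} PB/day)")
print(f"Per-meter growth: {elements_after // elements_before}x")
```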
“The explosion of the amount of information one has to manage, and the teasing out of relevant information from that data set, is becoming a herculean effort,” Bassigiani says.
“Most utilities are not ready to handle this volume of data,” adds Meir Shargal, CSC’s Smart Utility Practice leader. “Just collecting the data is useless. Knowing tomorrow what happened yesterday on the grid does not help operations, and storing more than 11 gigabytes a day per customer, when you have a million customers, is not typically useful. Data management has to start at the initial reception of the data: reviewing it for events that should trigger alarms in outage management systems and other real-time systems. Then, and only then, should normal data processing start.”
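Shargal's prescription — inspect readings at reception and route alarm-worthy events to real-time systems before any normal processing — can be sketched as a minimal pipeline. The reading format, the voltage threshold, and the `OutageManager` stand-in are all hypothetical, invented here for illustration:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    meter_id: str
    voltage: float   # hypothetical field; real meters report many elements

class OutageManager:
    """Stand-in for a real-time outage management system."""
    def __init__(self):
        self.alarms = []

    def raise_alarm(self, reading):
        self.alarms.append(reading)

def ingest(readings, outage_mgr, low_voltage=100.0):
    """Route alarm-worthy readings to the real-time path at reception;
    only then hand the remainder to normal (deferred) processing."""
    normal = []
    for r in readings:
        if r.voltage < low_voltage:      # event detected at reception
            outage_mgr.raise_alarm(r)    # real-time path, handled first
        else:
            normal.append(r)             # deferred batch path
    return normal

mgr = OutageManager()
batch = ingest([Reading("m1", 120.0), Reading("m2", 0.0)], mgr)
print(len(mgr.alarms), len(batch))   # prints: 1 1
```

The point of the design is ordering: the alarm check happens as each reading arrives, so an outage signal never waits behind a batch job.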
In one project, a utility client asked for help determining whether its new smart meters and systems were accurate. Evaluating a single one-month billing cycle required analyzing four billion pieces of data, along with an additional 10 billion pieces from ancillary systems.
“That was a multi-month effort,” says Bassigiani. “You wouldn’t think it would take that long, but it did. People didn’t realize the relationship between all the interrelated systems, so we had to build that intelligence before we could analyze the data.”
To help both the utility and mining industries take advantage of the opportunities these new floods of data present, CSC is architecting industry-tailored solutions, similar to its Petroleum Enterprise Intelligence solution, that focus on enterprise intelligence and provide the ability to integrate an organization’s financial, operational, and technical data and rapidly deliver it to executives and specialists.
“For these industries, it’s not just a business intelligence problem, it’s also an information delivery problem,” says Welch, whose Oil & Gas Sector spearheaded CSC’s development of PEI when the data explosion began in the petroleum industry.
“We decided then that we would attack data as an uncharted territory and now see very similar opportunities in these other industries,” says Welch. “Each of these industries has taken care of their infrastructure. Now, with IT’s commoditization and virtualization, along with the introduction of delivery models like the cloud, we have the ability to move past basic information capture and processing, past thinking about how to acquire data and use it within functions and processes, and think about information delivery.”
1 A petabyte equals one quadrillion (1,000,000,000,000,000) bytes.
3 Remote Operations Centers: Lessons from Other Industries (PDF, 52.5KB); Mining and the Future of Space Exploration (PDF, 92.15KB)
JENNY MANGELSDORF is a writer for CSC's corporate office.