Using Big Data to Predict a Contractor's Next Move

While retailers, airlines and banks are leveraging their big data to make more accurate predictions that boost sales and improve operations, the construction industry is still in its infancy when it comes to analyzing its databases for insights that improve performance.

A growing number of contractors are finding ways to make their existing software "sing" – incorporating additional analytics they’ve created themselves into vendors’ business intelligence software to help them better estimate projects, manage finances and seek out future market opportunities. Contractors and vendors alike say the efforts are rudimentary but encouraging, as the industry slowly develops the tools that will enable more sophisticated big data analytics.

All point to future cloud-based capabilities (where software is accessed through a web browser and hosted by a software provider) that would make it feasible – and more affordable – for even the smallest contractors to leverage not only their own data, but aggregate data from across the industry.

WHAT IS BIG DATA?
Most contractors are not even sure exactly what big data means. According to a survey of 838 construction professionals conducted by Sage Construction and Real Estate, an Oregon-Columbia Chapter member, three-fourths said they were not familiar with the term. However, a third said that managing big data is one of the most important functions an information technology solution should offer their business.

"The opportunity for big data in construction is the ability to get business insight from information within the business, as well as from outside sources to compare against similar-sized contractors," says Jon Witty, vice president and general manager at Sage Construction and Real Estate in Beaverton, Oregon.

Contractors might want comparisons of project profitability, days outstanding of accounts receivable, cash flow and other financial metrics to similar-sized contractors, Witty says.

"They will also be looking for the value of larger projects in comparison to smaller projects, particularly the risk exposure if they don’t win the next big project," he says. "Do they have enough smaller projects to make up for that?"

Sage is currently developing a cloud-based solution that would enable contractors to analyze their own data against aggregate data from similar firms, without anyone divulging confidential proprietary information. The service, which should be available within the next several years, will be a business intelligence service that analyzes historical data, which contractors can use to develop their own action plans for where they want to focus their business and where they don’t.
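
To make the benchmarking idea concrete, here is a minimal Python sketch of the kind of comparison Witty describes: computing a firm's own days sales outstanding and project margin and setting them beside an anonymized peer-group aggregate. The field names, figures and peer averages are all hypothetical and are not part of Sage's product.

def days_sales_outstanding(accounts_receivable, annual_revenue, days=365):
    """Average number of days it takes to collect receivables."""
    return accounts_receivable / annual_revenue * days

def project_margin(contract_value, total_cost):
    """Profit as a fraction of contract value."""
    return (contract_value - total_cost) / contract_value

# A contractor's own numbers (illustrative only).
firm = {"accounts_receivable": 1_200_000, "annual_revenue": 9_000_000,
        "contract_value": 4_500_000, "total_cost": 4_140_000}

# Anonymized aggregate for similar-sized contractors (illustrative only).
peer_group = {"dso": 52.0, "margin": 0.06}

firm_dso = days_sales_outstanding(firm["accounts_receivable"], firm["annual_revenue"])
firm_margin = project_margin(firm["contract_value"], firm["total_cost"])

print(f"Days sales outstanding: {firm_dso:.0f} days vs. peer average {peer_group['dso']:.0f} days")
print(f"Project margin: {firm_margin:.1%} vs. peer average {peer_group['margin']:.1%}")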

Wayne Newitts, marketing director at Dexter + Chaney in Seattle, Washington, an AGC of Washington member, says building information modeling (BIM) is one of the construction industry’s largest potential sources of big data, but the possible advantages "have yet to trickle down to most contractors."

INDUSTRY’S LACK OF ADOPTION
Given the nature of construction projects, one possible reason for the relative lack of adoption of big data management and analysis tools is that many contractors are consumers of the data – the plans and specs associated with a project – much more than they are data generators or analysts, Newitts says.

"It comes down to what is practical – would contractors have to load every piece of job into a predictive analytical engine to figure out what can they do to not lose money?" he says. "The problem is, in construction any given contractor is working in jobs so different, it might not be worth to spend the time and money analyzing whether they could save 0.001 percent. That type of analytics is good for a manufacturer that makes 10 million widgets, but it doesn’t translate that well for the construction industry."

However, contractor data does lend itself to business intelligence – taking bits of data already identified as relevant and pulling them together so that predefined triggers determine whether it’s necessary to take action.
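
As a rough illustration of that trigger-based pattern (not a feature of any particular vendor's software), the short Python sketch below checks job cost data against a predefined threshold and flags anything that needs attention. The job records and tolerance are invented.

jobs = [
    {"job": "Elm Street office fit-out", "percent_complete": 0.40,
     "cost_to_date": 510_000, "budget": 1_000_000},
    {"job": "Riverside warehouse", "percent_complete": 0.75,
     "cost_to_date": 690_000, "budget": 1_000_000},
]

def over_budget_trigger(job, tolerance=0.05):
    """Flag a job whose spend is running ahead of the budget it has earned."""
    earned_budget = job["budget"] * job["percent_complete"]
    return job["cost_to_date"] > earned_budget * (1 + tolerance)

for job in jobs:
    if over_budget_trigger(job):
        print(f"ACTION NEEDED: {job['job']} is spending faster than it is earning")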

"With predictive analytics you need to have so much data to develop algorithms and conduct regression analysis to determine how things are related," Newitts says. "Construction frankly isn’t there yet."

His firm’s Spectrum enterprise business management software enables companies to share business intelligence information in the cloud, both for the office and for project management in the field. But as cloud computing evolves, contractors will one day be able to perform true predictive analytics.

"The real power of cloud computing is the ability to tap into huge data storage and processing capabilities," Newitts says. This will give the industry the volume, velocity and veracity to make big data management viable."

Erick Brethenoix, director of business analytics and decision management strategy at IBM Corp. in Chicago, says there are "pockets" within construction operations that are beginning to use predictive analytics, such as predicting when certain equipment might fail in order to proactively order replacements. IBM has a product, Predictive Maintenance and Quality, that utilizes predictive analytics and optimization capabilities "to smooth out operations and save contractors money."
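
The general idea behind that kind of equipment analysis can be sketched in a few lines of Python. The example below is only an illustration of the approach, not IBM's product: it fits a simple failure model to invented historical readings and flags machines whose estimated risk suggests ordering parts now.

from sklearn.linear_model import LogisticRegression

# Historical records: [operating_hours, avg_vibration_mm_s] and whether the
# machine failed within the following 30 days (all figures invented).
X = [[1200, 2.1], [3400, 4.8], [800, 1.5], [5200, 6.3], [2900, 3.9], [4700, 5.5]]
y = [0, 1, 0, 1, 0, 1]

model = LogisticRegression().fit(X, y)

# Score the current fleet and flag machines worth ordering parts for now.
fleet = {"Excavator 12": [4100, 5.1], "Loader 7": [900, 1.8]}
for name, readings in fleet.items():
    risk = model.predict_proba([readings])[0][1]
    if risk > 0.5:
        print(f"{name}: {risk:.0%} estimated failure risk -- order replacement parts")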

In the future, contractors may also utilize predictive analytics and optimization tools to better schedule their crews on worksites, similar to the tools that are being developed for hospital personnel, Brethenoix says.

ALARMED BY ALGORITHMS
Most contractor data is not organized or easily retrievable in a way that allows firms to easily perform predictive analytics, says Tim Blood, preconstruction project manager at Sundt Construction Inc., a member of multiple AGC chapters.

"Not a whole lot of attention was given to how we store that data in the last decade, so while we have gobs of data, we have to work manually to make use out of it to create predictable results," Blood, who works out of Sundt’s Sacramento office, says.

Blood and his team look at past data in terms of both costs and quantities of materials used to build certain types of buildings, to better estimate how much material the firm will need for a current project. The pre-construction team analyzed data for those types of projects and discovered relationships within that raw data, which were then transformed into algorithms that drive the firm’s parametric database. Blood’s team did this in-house, using DProfiler, macro BIM software by Beck Technology in Dallas, as a backbone.

For example, a lab building project will have significantly different materials, means and methods associated with its design than a student housing project. When the team has a new lab project, they define a number of parameters such as the area of the building, the type of lab and the number of floors, and the database will generate all of the quantities of material the team would expect to need on that type of project.

"Algorithms might sound like a big scary word, but you can easily replace it with finding relationships between certain parameters of a building, such as the square footage of a wall for a certain product type, compared to the square footage of a floor plate," Blood says. "By discovering these relationships, and converting them to algorithms you end up with the ability to create a more predictable estimate of quantities when very little detail is available."

Sundt’s Eric Cylwik, a senior virtual construction engineer in Tempe, Arizona, says it’s more challenging to develop such algorithms for highway and bridge construction, "as no two feet of a road are ever going to be exactly the same."

There are a few pieces of software that are "getting close," such as Autodesk’s InfraWorks, which pulls in USGS topographical data. Using this tool, one can specify a road to build from Point A to Point B while avoiding a certain river or park, and the tool will analyze the possible pathways and include the best cost for each conceptual path. However, the software doesn’t currently analyze information such as historical material costs or product performance data, such as whether asphalt or concrete on that particular road would last 30 years.
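
To show what folding historical unit costs into that kind of comparison might look like, here is a toy Python sketch (not InfraWorks) that scores two invented conceptual alignments by length and a hypothetical historical cost per mile for each pavement type.

candidate_paths = [
    {"name": "North alignment", "length_mi": 6.2, "material": "asphalt"},
    {"name": "South alignment", "length_mi": 5.4, "material": "concrete"},
]

# Hypothetical historical unit costs per mile for each pavement type.
unit_cost_per_mile = {"asphalt": 1_100_000, "concrete": 1_600_000}

for path in candidate_paths:
    path["estimated_cost"] = path["length_mi"] * unit_cost_per_mile[path["material"]]
    print(f"{path['name']}: ${path['estimated_cost']:,.0f}")

best = min(candidate_paths, key=lambda p: p["estimated_cost"])
print(f"Lowest-cost concept: {best['name']}")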

"It’s not very accurate on costs, because it’s use of historical cost data is very limited," Cylwik says. "Moreover, the user has to put in all of the specifics about where not to build the road – the tool doesn’t have the intelligence of which areas to avoid."

Contractors will likely be more receptive to sorting through large sets of data as vendors launch more cloud-based products that have much faster data processing, he says.

"Construction seems to be lagging compared to other major industries on leveraging big data, but perhaps a contractor might be willing to spend another $1 per use to be able to use a cloud-based service, rather than spending $1,000 to upgrade their laptops," Cylwik says.

 
