(TCMA-3348) Real Time Cost Analytics using Power BI
Author(s)/Presenters(s): Sunny Goklani
In the practice of managing costs and earned value (be it for a labor resource, physical resource, or a subcontract), cost and project managers essentially compare three broad categories of costs: Plan, Actual, and Forecast. The critical problems faced in integrating these to present a performance picture are: i) data for actuals and plans/forecasts usually reside in different systems, ii) data points are not real time, and iii) there is no analytic engine to turn the cost data into instant information and performance reports.
With the experience of having developed and led the analytics practice across a portfolio of construction and remediation projects, the author, through this paper, proposes the application of Power BI as a platform for cost analytics. With a focus on labor cost management, this paper demonstrates how real-time data from different systems comes together seamlessly to present state-of-the-art Earned Value Analytics. The dashboards also demonstrate various methods of drilling down into variances and establishing ad hoc KPIs.
In the age of big data analytics and persistently confusing project storylines, the author offers this business intelligence application as a way to apply data analytics and storytelling to solve some of the industry's age-old problems.
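The Earned Value Analytics described in this abstract rest on the standard EVM indicators. As a minimal sketch (all figures and the function name are hypothetical, not taken from the paper), the core metrics such a dashboard would surface can be computed as:

```python
# Standard earned value metrics that a cost-analytics dashboard would surface.
# Inputs follow common EVM usage: planned value (PV), earned value (EV),
# actual cost (AC), and budget at completion (BAC).

def evm_metrics(pv: float, ev: float, ac: float, bac: float) -> dict:
    """Compute the core earned value indicators from PV, EV, AC, and BAC."""
    cpi = ev / ac            # Cost Performance Index
    spi = ev / pv            # Schedule Performance Index
    return {
        "CV": ev - ac,       # Cost Variance
        "SV": ev - pv,       # Schedule Variance
        "CPI": cpi,
        "SPI": spi,
        "EAC": bac / cpi,    # Estimate at Completion (CPI-based)
    }

m = evm_metrics(pv=120_000, ev=100_000, ac=125_000, bac=500_000)
print(m["CPI"])  # 0.8 - each dollar spent earned $0.80 of budgeted work
```

A dashboard would evaluate these formulas per resource or control account and roll them up, which is where the drill-down into variances comes from.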
(TCMA-3375) Enhancing Data Reliability in a World of Increasing Information
Author(s)/Presenters(s): Michael A. French, PE; Christopher W. Ronak; Nick Papadopoulos
Advanced tools and technologies are being developed in the construction industry to achieve enhanced results, which is leading to significant data growth. Although great strides are taken to improve the timeliness, quality, and quantity of information collected, a critical issue remains: ensuring the data is trustworthy. Often, massive amounts of data are collected but never analyzed and cleansed as project teams move on to their next task or other projects. Accurate data is necessary for properly documenting past occurrences and providing reliable analytics, both predictive and prescriptive. Creating a historical database that end users can trust requires consistent, methodical data collection techniques and analysis. New industry tools have been deployed to streamline and improve data collection processes. In addition, Artificial Intelligence (AI) and Machine Learning tools are currently in development and have the potential to further enhance data reliability. However, even with these improved systems, data validation is still essential to ensure data integrity. With the development of proper strategies to monitor and control the ever-increasing stream of data, a reliable knowledge base can be created for improved progress tracking and prediction of future outcomes.
(TCMA-3378) Getting Contracts Right for Advanced Project Analytics
Author(s)/Presenters(s): Jeffrey Paranich; Dr. Manjula Dissanayake, CCP
Successful projects begin with well-defined contracts that benefit all stakeholders. However, contracts have not kept pace with the modern digital world, especially in the area of project analytics applied to project data. Boilerplate templates used for decades typically ask only for descriptive analytics (what happened). Project owners attempting to leverage advanced analytics to improve the predictability of project outcomes struggle to find data that are stored in disparate systems, in unstructured formats.
Contracts must evolve from mere responsibility-matrix accountability into incentive-based stakeholder alignments around data engineering, allowing project managers to focus on actual decision making rather than information-chasing. This paper guides owners in establishing well-defined contract parameters that will ensure contractors adhere to pre-defined data frameworks, support data pipelines, open themselves to data governance, prepare for robotic process automation, and directly maintain data warehouses. The proposed contract structure will drive thought leadership across the industry by ensuring that the construction industry collects the much-needed data for advanced construction analytics. It will also facilitate project professionals' transition to the advanced project analytics arena, helping them remain relevant in the post-digital era.
(TCMA-3379) Visual Analytics of Look-ahead Schedules
Author(s)/Presenters(s): Dr. Manjula Dissanayake, CCP; Annie Yu; Jeffrey Paranich
Look-ahead planning and schedules are vital to the success of any construction project. A look-ahead schedule (LAS) illustrates the previous, present, and future time periods in one window. The purpose of the LAS is to plan labor activities and establish targets for the next two weeks (or months). This detailed planning takes into account actual job-site conditions, resource availability, and the capabilities of the crew, based on recent performance. Supervisors and superintendents use the LAS to plan and manage the work each week.
In a complex project, analyzing current performance and the impact of schedule changes that can happen in a given window (e.g., a 3-week or 90-day LAS) can become a challenging task for front-line supervision. Typically, crews spend an hour each week reviewing the schedule. This paper presents a visual analytics framework and a tool that provides advanced visual analytics, enabling planning and scheduling professionals to deliver timely insights to the construction team. The paper will also provide a data structure, a data integration method, and an interactive dashboard illustrating metrics that can be used for LAS visual analytics.
(TCMA-3383) Application of Data Analytics in Industrial Projects: concepts and processes related to a Data Culture approach
Author(s)/Presenters(s): Glauber Francisco Alves
This paper discusses a conceptual framework for Data Analytics in the Industrial Projects environment. This approach considers developing and contextualising the use of the tools and methods available to improve the readiness of organisations in the face of the advancement of Data Analytics. This review aims to develop the fundamental concepts and processes required by organisations and project teams in the transition towards a data-centric mindset.
In the paper, the author provides concepts, examples, quick wins, and templates associated with the use and opportunities embedded in design development, cost estimation, procurement, construction management, and planning and scheduling routines. The focus will be on demystifying the application of Data Analytics to industrial capital projects, including mining, chemicals, and infrastructure projects.
There is still significant hesitation and uncertainty about how project teams should begin the transformation towards a more dynamic environment of data-driven projects, encompassing Knowledge Management, Data Standardization, and Data Culture. This paper will address the limitations of the work presented, as well as a set of recommendations for future improvements.
(TCMA-3410) Role of artificial intelligence in transforming the way we manage the cost of construction projects
Author(s)/Presenters(s): Dr. Anil Sawhney
Our industry faces an ongoing challenge of controlling project costs and avoiding cost overruns. A plethora of studies has documented that as an industry we deliver projects that are often delayed and cost more than estimated at the time of project inception. While the industry embarks on the much-touted journey to embrace leading-edge technologies, such as digital twins, blockchain, additive printing, laser scanning, and drones, it cannot lose sight of the fundamental responsibilities of meeting the key project metrics of time and cost. With increasing public scrutiny, the importance of managing project cost has again come to the forefront. Given this emphasis, how can the industry address the issue of cost overruns? Can we use technology to improve the processes of estimating, budgeting, managing, and controlling project costs to reduce cost overruns? More specifically, can artificial intelligence (AI) improve the estimation of project costs? This article articulates an answer to these important questions.
(TCMA-3434) Develop an organizational portfolio management practice in leveraging historical data to develop robust schedules
Author(s)/Presenters(s): Subhash Tuladhar
Having access to historical project data amenable to analysis and insight extraction enables the development of robust schedules. Owners have access to an abundance of project data through project management information systems, corporate financial systems, and external data systems. The repository of interconnected data continues to grow in leaps and bounds as data storage becomes more affordable. At the same time, many Owners have better access to affordable data analytics capabilities, which have undergone significant advancements in recent years. This paradigm shift makes it feasible for organizations, big and small, to operationalize advanced data analytics. As a result, project controls teams have an unprecedented opportunity to transform historical data into valuable systemic insights. Using a recent example and case study, this paper presents an organizational portfolio management practice of leveraging historical data to develop robust project schedules.
(TCMA-3437) Design Principles for Creating a Visually Appealing Dashboard
Author(s)/Presenters(s): Ashwini Jain, CCT CST
Organizations have many data professionals working on various data analytics initiatives. While the work of data professionals may vary from time to time, designing a dashboard is a common task for most of them. A dashboard is a good way to provide insights, from a top-level summary analysis to detailed-level tracking. Data professionals often struggle to fit a plethora of information into a dashboard because of differing requirements within an organization. This paper defines the 5Ws and 1H for designing an appealing dashboard, which data professionals can use to accommodate the needs of multiple people at various levels within the organization.
(TCMA-3443) A single source of truth: Visual Analytics from live data sources
Author(s)/Presenters(s): Jeancarlo Duran Maica, CCP EVP; David A. Chigne Sr.
Data visualization provides decision-makers with a visual representation that makes data easier to understand and act upon. Finding the right ways to share and communicate information effectively is crucial to achieving the goals of any project.
Therefore, business analytics becomes essential for management teams to conduct an accurate assessment of organizational performance. The use of Business Intelligence tools and their connection to databases has proved to be one of the most effective ways to bring a single source of truth to the project team and/or the wider organization.
In the context of making a compelling proposition through the creation of live dashboards, this paper will explain different ways to establish a connection between SQL databases and visualization tools. The aim is to create a data flow that circumvents interim spreadsheets and automatically updates its content by querying the data from the source database.
This paper will also share the lessons learned from the implementation of this reporting architecture on one of the largest railway projects in the UK. The benefits found are:
- One single source of truth
- Less time working on spreadsheet reports
- Live dashboards directly from the data source
- Clear and powerful data visualization that streamlines the decision-making process.
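The spreadsheet-free data flow described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `sqlite3` stands in for the project's SQL database, and the table and column names are hypothetical.

```python
import sqlite3

# Sketch of a spreadsheet-free data flow: the visualization layer re-runs a
# query against the source database on refresh, instead of re-importing an
# interim spreadsheet. Table/column names here are illustrative only.

def load_cost_summary(conn: sqlite3.Connection) -> list:
    """Pull an aggregated cost view straight from the source tables."""
    return conn.execute(
        """
        SELECT work_package, SUM(actual_cost) AS total_actual
        FROM cost_records
        GROUP BY work_package
        ORDER BY work_package
        """
    ).fetchall()

# Demo with an in-memory database standing in for the live source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cost_records (work_package TEXT, actual_cost REAL)")
conn.executemany(
    "INSERT INTO cost_records VALUES (?, ?)",
    [("WP-01", 1200.0), ("WP-01", 800.0), ("WP-02", 500.0)],
)
print(load_cost_summary(conn))  # [('WP-01', 2000.0), ('WP-02', 500.0)]
```

In a production setup the same pattern would use the BI tool's native database connector; the key design choice is that aggregation happens in the query, so every dashboard refresh reflects the current state of the single source of truth.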
(TCMA-3450) Analytics of what? Implications of industrial megaprojects complexity in data-driven methods
Author(s)/Presenters(s): Pouya Zangeneh; Brenda Y. McCabe; Murray Pearson; Leslie E. McMullan, FAACE
Devising a knowledge base of large industrial projects requires dealing with their inherent complexity. Complex systems have nonlinear, interdependent, and interacting constituents at various scales (Bar-Yam, 2016). Intuitively, the largest-scale information and characteristics of a system are deemed most important. For industrial megaprojects, overall capital cost and schedule performance have been the primary focus of most previous research. However, certain small-scale information can point to large-scale effects on planning, scheduling, management, and implementation. Moreover, it affects the collection of data and the application of data-driven methods. Understanding project complexities from the latter viewpoint precedes any meaningful application of artificial intelligence and project analytics in the management and forecasting of industrial megaprojects. This paper relies upon prior published literature and the authors' professional experience to build a framework of causes and effects of project complexities as related to such challenges.
(TCMA-3459) The Unit Price Process for Data Collection and Benchmarking
Author(s)/Presenters(s): Peter R. Bredehoeft, Jr. CEP FAACE; H. Lance Stephenson, CCP FAACE
Benchmarking and historical data collection should be the ultimate goal of professional cost estimators. To achieve solid and reliable data, one must apply standard methodologies to facilitate proper data collection. This paper outlines a process that enables data collection at the lowest level of data for which proper benchmarks and metrics can be established. This paper will also outline attributes related to data collection using the International Construction Measurement Standard (ICMS), Revision 2.
Unit cost data is prevalent in every cost estimate. This process showcases an effective methodology for data collection at the unit price level, with key quantities and metrics for historical analysis. The process can be used for cost estimates or actual field construction data, and it can be applied across any industry. This paper will assist organizations with establishing practices centered on historical data collection and the key quantities, elements, attributes, and metrics needed for benchmarking.
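As a minimal illustration of the unit price level such a process targets (the work item, figures, and function name are hypothetical, not taken from the paper):

```python
# A benchmarkable unit price is the collected total cost for a work item
# divided by its key quantity. Guarding the denominator keeps bad field
# data from silently corrupting the historical database.

def unit_price(total_cost: float, quantity: float) -> float:
    """Unit price = total cost for the work item / key quantity installed."""
    if quantity <= 0:
        raise ValueError("key quantity must be positive")
    return total_cost / quantity

# Example: 450 m3 of structural concrete placed for $157,500 total.
print(unit_price(157_500, 450))  # 350.0 ($/m3)
```

Recording the unit price together with its key quantity and classification attributes (e.g., an ICMS code) is what later allows like-for-like comparison across projects.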
(TCMA-3462) Adapting Data Management Structures to Improve Performance in Post-Disaster Scenarios
Author(s)/Presenters(s): Susan Bomba; Aleshia Ayers; Lamis El Didi
When organizations face disaster-response situations that trigger large-scale maintenance and repair work, their data management systems need to be able to adapt to a significant increase in work and often to changes in capital portfolios. Many data systems are designed for steady-state work, from large capital portfolios to routine maintenance work on smaller scales. After a major disaster occurs, the priorities of the organization change, resulting in shifting resources, adjusted priorities for capital and maintenance work, and modifications to existing processes and procedures. In emergency situations, an added complexity is oversight from government regulators, who require increased progress reporting and data requests with standardized definitions among regions and programs. A key success factor in this effort is the adaptability and usability of the existing data management systems.
By improving the data management structure for post-disaster efforts, the general data (e.g., production rates, cost per repair, resources required per repair) can be used in future maintenance efforts for budgeting and forecasting. Therefore, the data created during these post-disaster situations becomes useful as a benchmarking and/or historical data tool. In this paper, we will discuss strategies that help companies scale their existing tools and reporting during post-disaster periods in a way that can be leveraged for future benefit to the organization. A case study will present how a large public utility was able to adapt to a significant increase in repair work following a major disaster event and leverage the influx of information for long-term improvements to its asset management and capital planning systems.
(TCMA-3502) Benchmarking: Data Collection to Analytics
Author(s)/Presenters(s): H. Lance Stephenson, CCP FAACE
Due to the volatility and unpredictability experienced in today's markets, it is imperative that companies ensure their operations and project delivery systems are utilized to drive improved cost competitiveness. To further improve competitive outcomes, companies need to improve their understanding of cost and schedule drivers and behaviors through historical data collection, benchmarking, and analysis.
This paper provides the audience with an understanding of some simple yet composite approaches to identifying and applying project attributes and cost relationships. These approaches can further be used to validate project estimates and schedules, as well as provide a baseline for variance analysis during the execution phase of the project. In addition, these approaches support the completion of a forensic investigation to understand root causes and driving factors for both positive and negative outcomes.
Benchmarking takes knowledge from the past and allows us to predict the future based on our present needs. Through proper categorization of project attributes, robust cost collection as per the defined coding requirements, as well as an understanding of objectives and key metrics, the project team can utilize the empirical analysis (i.e. cost drivers and behaviors) to improve organizational and project performance. Benchmarking will also provide an effective approach in calibrating and enhancing organizational procedures, processes, tools and behaviors, while strengthening the overall project delivery and ensuring improved cost competitiveness.
(TCMA-3518) The impact of budget setting practices on cost efficiency and predictability: Case study: Utility renewal projects in Australia
Author(s)/Presenters(s): Michael Lesnie
This large-scale study (more than 2,000 projects) looks at project performance outcomes from two Australian utilities, each with a large portfolio of small renewal projects. For each utility, the paper considers the relationship between budget cost competitiveness, actual cost competitiveness, and the deviation between budget and actual cost.