[00:00:00] Gary Fischer, PE: Now we've got some of your students coming up following you to share their research and how it applies to real-world projects here. So I think first off, we have Bochen and Brian. Can you turn your videos on and get ready to share? Okay.
[00:00:25] Gary Fischer, PE: There we are. Bochen Zhang is a student at Stanford majoring in sustainable design and construction. His research interests are applying operations science principles and process simulation methods to supply chain management for prefabrication and modular construction. He has industry experience working at major general contracting firms as a construction analyst. And then we've also got Brian,
[00:00:50] Gary Fischer, PE: who leads Mortenson's enterprise-wide AI strategy, driving innovation at the intersection of construction, data, and emerging technologies. He focuses on empowering field teams through practical, human-centered AI applications. Guys, we can see your presentation. Who's gonna lead off?
[00:01:10] Bochen Zhang: Thank you Gary. I can lead off.
[00:01:12] Bochen Zhang: Okay. Yeah, thanks everyone for the opportunity to share our latest research with the PPI community. Our work applies operations science and the project production management framework to one of the most persistent and often overlooked bottlenecks in project delivery: the submittal process. You might be surprised to see this topic coming from CIFE, given that CIFE has long promoted VDC strategies that minimize the need for submittals. Yet across many projects today, we still see the submittal workflow continue to be one of the risks that delay field production.
[00:01:55] Bochen Zhang: At CIFE, we believe we must explore both topics: how emerging technologies like AI can shape the future of work, and also how we can help solve the real operational problems that project teams are facing today. So today, technology is advancing at an unprecedented pace, and leading AEC companies are actively investing in AI to transform how work gets done.
[00:02:28] Bochen Zhang: But when we speak with our industry partners who want to embed AI into their daily workflows, we consistently hear the same question: should we use AI simply to automate tasks, or should we use AI strategically to shape the future workflow? Through our research collaboration with Mortenson, today we explore this question in the context of a real project.
[00:02:53] Bochen Zhang: What we have learned is that AI becomes truly powerful when it supports project production management control, facilitating flow in the production system. And here I'll let Brian Nahas, the director of AI at Mortenson, introduce our project.
[00:03:12] Brian Nahas: Thank you. So I wanted to set the stage for how the partnership came about and how we chose this project to be the core data source for the research effort.
[00:03:24] Brian Nahas: So, the Gaylord Pacific Hotel and Convention Center. This was a $1.3 billion development project built over 36 acres in the Chula Vista Bayfront area, just south of San Diego. The project included a 22-story hotel with 1,600 rooms, and a convention center adjacent to it with just shy of 500,000 square feet of meeting space.
[00:03:54] Brian Nahas: At its time, this was the largest hotel project under construction in the United States, and we opened it earlier this year. A couple of other aspects of the job created some unique use cases and systems on the project to be part of this analysis: there's about a four-acre waterpark associated with the project, in addition to a nine-story parking structure with 1,600 stalls.
[00:04:21] Brian Nahas: So this job was executed over 34 months. It was a joint venture between Mortenson and McCarthy Construction, and it was delivered ahead of schedule. So when we reflect on this project, not only was it quite massive in scale, it involved two partnering companies, a very large contractor and trade partner network, and not your typical systems, just given the diversity of the facility type.
[00:04:52] Brian Nahas: So at the end of the day, the project had a total of 4,383 submittals associated with the full campus of work, with just over 900 of them focused on product data. And that was really the genesis of the research and the partnership: understanding the opportunity in the product data elements, which are perceived to be something that artificial intelligence could help with from a process standpoint.
[00:05:19] Brian Nahas: One last fact, just to ground the group: the project was managed through Procore. So all of the workflows were done traditionally through that platform among all of the partners on the project, and we took the raw data set out of that system to support the findings.
[00:05:37] Brian Nahas: So, a very successful project for those involved. But we still wanted to seek areas to evaluate for future improvement, and the hope of this effort was really to pull out and understand where those bottlenecks happened. I look forward to having the outcomes shared in a moment.
[00:05:58] Bochen Zhang: Yeah, thanks Brian. So even on a highly successful project like the Gaylord, we still saw many opportunities to improve the processes, especially the submittals. What we see, as we analyzed the data extracted from Procore, is that some submittals were still delayed by months, some even by close to a year.
[00:06:22] Bochen Zhang: These delays created real project risk and forced both the general contractor and design teams into costly overtime. So the natural question is: can AI really help us improve the situation? Or, to be more specific, if we simply deploy an AI agent to automatically approve submittals for the general contractor team, will that actually solve the problem?
[00:06:51] Bochen Zhang: What we find is that the answer is not very straightforward, and that's where applying operations science and PPM becomes really essential to analyzing the situation. So as researchers, our first task was to understand the real problem. We know that this is a very successful project, but we still wanted to see where there are chances to improve, and why some of the submittals were still delayed.
[00:07:17] Bochen Zhang: So if AI alone doesn't solve the delays, then what actually drives the system behavior? This is also something we wanted to focus on. So we broke the project into research questions: What are the submittal process bottlenecks and capacity challenges? What are the opportunities for a digital, AI-enabled submittal process?
[00:07:39] Bochen Zhang: And really, can AI agents be introduced to improve the process flow and decision-making, and if so, how? These questions guided our analysis and helped us uncover the production physics behind the delays. To answer these research questions, we first spoke with each team involved in the submittal workflow.
[00:07:59] Bochen Zhang: We wanted to understand how they plan, review, and approve submittals. From these conversations, we mapped the current process: the GC first estimates when each submittal will be needed by preparing a submittal log, and issues due dates to the subcontractors, and the subcontractors prepare and submit their packages.
[00:08:17] Bochen Zhang: Then the GC team reviews those submissions, compiles comments, and forwards them to the designer for detailed review. Once the designer approves, the decision flows back to the general contractor team, who then notifies the subcontractor to proceed with their work. Alongside this process mapping, we also collected key capacity information from each team.
[00:08:39] Bochen Zhang: That gave us, for the GC team, a review capacity of around a hundred submittals per week, and they have to review them within five work days according to the contract. The designer team is a smaller team working part-time, so they can review around 10 submittals per week, and they have to review them within 10 work days.
[00:09:00] Bochen Zhang: And this gave us the baseline needed to understand how much work the system can actually handle, and then to diagnose where the bottleneck occurs. So we began by analyzing the current situation using the Procore data. Because every submittal action is timestamped in Procore, we can objectively evaluate whether the submissions and approvals occurred on time.
[00:09:26] Bochen Zhang: Starting with the subcontractor submissions, 69% were submitted early or on time, with around 54% submitted early, mostly between Q4 2022 and Q3 2023. At first glance, this looks like strong performance to us: the subcontractors are not the major source of delay for the submittals. Next, we looked at the approval performance of both the general contractor and design teams. The general contractor team approved 86% of submittals early or on time, with 77% actually early. The designer team approved 82% early or on time, with 76% approved early, and most of these early approvals also occurred in Q2 and Q3 of 2023, which tells us that both teams were working very hard to keep the process moving and avoid downstream delays.
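[Editor's note] The on-time statistics quoted above can be reproduced directly from timestamped records. Here is a minimal sketch; the record structure is hypothetical, not the actual Procore export schema:

```python
from datetime import date

# Hypothetical submittal records as (due_date, actual_submission_date) pairs;
# a real analysis would read these timestamps from the Procore export.
submittals = [
    (date(2023, 1, 10), date(2023, 1, 5)),   # submitted early
    (date(2023, 1, 15), date(2023, 1, 15)),  # submitted on time
    (date(2023, 2, 1),  date(2023, 2, 20)),  # submitted late
]

early = sum(1 for due, actual in submittals if actual < due)
early_or_on_time = sum(1 for due, actual in submittals if actual <= due)

print(f"early: {early / len(submittals):.0%}")
print(f"early or on time: {early_or_on_time / len(submittals):.0%}")
```

The same comparison against each step's contractual window (five work days for the GC, ten for the designer) yields the approval-performance figures.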
[00:10:23] Bochen Zhang: So again, just like the subcontractor submissions, the approval data suggests that the delay is not an individual performance problem. Instead, something systemic is causing the delays, and that's what we investigated next. So after looking at the overall performance, we wanted to understand how these submittals actually flow through the process.
[00:10:43] Bochen Zhang: So we plotted this chart, where the X axis is the project timeline and the Y axis is the number of open submittals, essentially the work in process in the system at any point in time. In this project, the project engineers used a new digital tool to bulk-create submittals from an Excel log into Procore.
[00:11:08] Bochen Zhang: So the process began with 155 open submittals created on day one. Then we observed a very steep rise in open submittals immediately after the pile and foundation milestone, and the peak occurred around the start of the superstructure phase in January 2023, where the system was carrying 247 open submittals waiting for approval.
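[Editor's note] The open-submittal (work-in-process) curve described here can be derived from creation and closure timestamps. A minimal sketch, using made-up dates rather than the project's actual records:

```python
from datetime import date

# Hypothetical (open_date, close_date) pairs for a handful of submittals;
# the real analysis would use the full timestamped Procore data set.
records = [
    (date(2022, 10, 1), date(2022, 11, 15)),
    (date(2022, 10, 1), date(2023, 2, 1)),
    (date(2023, 1, 5),  date(2023, 1, 20)),
]

def open_wip(day):
    """Number of submittals open (created but not yet closed) on a given day."""
    return sum(1 for opened, closed in records if opened <= day < closed)

# Sampling this function along the project timeline reproduces the WIP chart.
print(open_wip(date(2022, 10, 2)))   # open early in the project
print(open_wip(date(2023, 1, 10)))   # open during the superstructure phase
```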
[00:11:35] Bochen Zhang: This chart further breaks down the open submittals by subcontractor, general contractor, and designer. And what we see is a clear batch-release pattern: large batches of submittals were created and released to the subcontractors early in the project. As the project progressed, subcontractors began submitting these packages to the GC, who then forwarded them to the designer for review.
[00:12:00] Bochen Zhang: Comparing this to the capacity we captured in the interviews, both the GC team and the designer team experienced periods where the volume of incoming work far exceeded their review capacity. So here's what we learned from this data analysis: submittals were front-loaded in large batches before the project even started, creating high work in process
[00:12:25] Bochen Zhang: on day one. Most submittals were submitted, reviewed, and approved early, which indicates really strong individual performance across all the teams. Yet despite this, review capacity at both the general contractor and the designer teams became the system bottleneck, producing congestion, long queues, and waiting time.
[00:12:45] Bochen Zhang: As a result, the average review and approval times were significantly longer than expected, and both teams were forced into overtime during this peak period. This behavior is entirely consistent with what the literature has reported. Now we can see the physics unfolding clearly in a real project context with real data.
[00:13:09] Bochen Zhang: This sets the stage for applying operations science and PPM to diagnose the system more rigorously. So at this point, it became clear that simply increasing the general contractor's review capacity would not solve the problem. If we speed up the general contractor's review step, the bottleneck simply shifts downstream to the designer.
[00:13:31] Bochen Zhang: So we are not eliminating the constraint, we are just moving it. This is why we need to view the submittal process as an integrated production system, not as isolated tasks. To do that, we turned to the project production management framework, which provides a structured way to diagnose flow and five levers for improving any production system.
[00:13:52] Bochen Zhang: Using these levers, we can identify where the real opportunities for improvement are, which go far beyond just adding capacity. So to better understand the system behavior, we modeled the general contractor's and designer's review process as a simplified production system using the standard PPI notation. The elements in this model are grounded in actual data extracted from Procore, and this allows us to represent the submittal workflow as a sequence of operations, queues, and inventories,
[00:14:24] Bochen Zhang: just like any other production system, even though this one is informational. With this model, we can quantify the cycle time, work in process, variability, and capacity, and begin to see how these factors interact to create the delays we observed. So to understand the dynamics of this process, we applied Little's Law, which is a core operations science principle.
[00:14:47] Bochen Zhang: Little's Law links three quantities: the cycle time, which is the average time to approve a submittal; the work in process, the open submittals waiting in the system; and the throughput, how many submittals are approved per unit time. And there is an optimal range of work in process where the system performs most efficiently.
[00:15:09] Bochen Zhang: If the work in process is too low, then from the curve on the left we can see that throughput suffers. And if the work in process is too high, throughput does not meaningfully improve, but the cycle time increases dramatically, causing long delays and extended approval durations. In other words, flooding the system with too much open work, which is probably what happened on this project, guarantees long cycle times regardless of how hard the teams work.
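[Editor's note] One standard way to sketch the cycle-time-versus-WIP curve described here is the "practical worst case" model from operations science. The parameters below are assumptions, reverse-engineered so the outputs roughly match the figures quoted later in the talk (about 20 days at 30 open submittals, about 50 days at 90); they are not the team's actual model or the project's measured values:

```python
# "Practical worst case" relationships linking work in process (WIP),
# throughput, and cycle time, consistent with Little's Law.
r_b = 2.0        # bottleneck rate: approvals per day (assumed value)
T_0 = 5.5        # raw process time in days, with no queueing (assumed value)
W_0 = r_b * T_0  # critical WIP: the level at which the line is fully utilized

def throughput(wip):
    return wip / (W_0 + wip - 1) * r_b  # approvals per day at a given WIP

def cycle_time(wip):
    return T_0 + (wip - 1) / r_b        # days from submission to approval

# Little's Law holds at every WIP level: WIP = throughput * cycle time.
for wip in (30, 90):
    print(f"WIP={wip}: cycle time ~{cycle_time(wip):.0f} days, "
          f"throughput ~{throughput(wip):.2f}/day")
```

Note how tripling WIP from 30 to 90 barely improves throughput while more than doubling cycle time, which is exactly the trade-off the curve illustrates.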
[00:15:39] Bochen Zhang: So when we map the project data to this relationship, the pattern becomes pretty clear. During most of the project, the system was carrying around 90 open submittals at any given time. At that work-in-process level, the model predicts an average cycle time of around 50 days, assuming normal working hours, that is, teams working eight hours per day.
[00:16:07] Bochen Zhang: However, the actual average cycle time from the Procore data was around 35 days, which strongly suggests that both the general contractor and designer teams were compensating by working overtime to keep the project moving. In contrast, the model shows that the optimal work in process for the system is actually around 30 open submittals.
[00:16:34] Bochen Zhang: And at that level, the average cycle time drops to around 20 days, a reduction of more than half compared to the model's estimated average cycle time of 50 days. So the key takeaway here is that the delays we observed were not caused by slow reviewers.
[00:16:55] Bochen Zhang: They were caused by overloading the system far beyond its optimal work in process. We also did an analysis against the absolute benchmark, and the result shows there is a significant opportunity to improve by reducing the variability in processing times. Even when the average cycle time is acceptable,
[00:17:16] Bochen Zhang: high variability creates long tails and unpredictable, inconsistent approval durations. So to help the teams visualize and really control the work in process, we proposed using available project data from Procore to create this chart. Each star here represents a submittal, plotted by the work-in-process level on the Y axis
[00:17:43] Bochen Zhang: at the moment it entered the stakeholder's process, and its resulting cycle time on the X axis. We can then divide this chart into four quadrants, determined by the allowed review and approval cycle time and the available capacity. Starting from the bottom left: if there is low work in process and low cycle time, the system is below capacity and performing well.
[00:18:19] Bochen Zhang: Then low work in process and high cycle time means it's below capacity but still delayed, typically due to high variability. When it goes up to high work in process and high cycle time, the system is overloaded, leading to long delays. And when it's high work in process and low cycle time, the system is overloaded but still on time, which often means teams are compensating by working overtime.
[00:18:46] Bochen Zhang: By plotting this chart, the teams can immediately see when they are operating in a stable region versus when they are overloaded or relying on overtime. This makes the work in process visible and actionable. And then we're able to map the target area, which is close to the top right of the below-capacity, on-time quadrant.
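[Editor's note] The four quadrants just described can be expressed as a simple classification rule. The thresholds below are illustrative assumptions loosely based on the figures quoted earlier (optimal WIP around 30, allowed cycle time around 20 days), not the project's contractual limits:

```python
# Classify each submittal into one of the four quadrants described above,
# based on the WIP it found on entering the review step and its cycle time.
WIP_LIMIT = 30      # assumed optimal work-in-process level
CT_LIMIT_DAYS = 20  # assumed allowed review/approval cycle time

def quadrant(wip_at_entry, cycle_time_days):
    overloaded = wip_at_entry > WIP_LIMIT
    delayed = cycle_time_days > CT_LIMIT_DAYS
    if not overloaded and not delayed:
        return "below capacity, on time"
    if not overloaded:
        return "below capacity but delayed (variability)"
    if delayed:
        return "overloaded and delayed"
    return "overloaded but on time (likely overtime)"

print(quadrant(15, 12))   # stable operating region
print(quadrant(100, 45))  # the overload pattern seen at the project peak
```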
[00:19:12] Bochen Zhang: So here's an example of using that quadrant chart populated with real project data, taking the general contractor's submittal approval data as an example. What we found is that 440 of the 913 submittals, which is nearly half of them, entered the GC review process at a moment when there were already a hundred submittals open and waiting for approval.
[00:19:43] Bochen Zhang: That is more than triple the optimal work-in-process level we calculated before. And not surprisingly, 73% of the delayed submittals in the general contractor's court fall into this high work-in-process quadrant. This tells us very clearly that delays were driven by system overload, not by the performance of the individual review teams.
[00:20:11] Bochen Zhang: When we performed the same analysis for the designer team, we observed a very similar overload pattern. Those teams were consistently asked to operate far above their capacity, which guarantees long cycle times and unpredictable outcomes. So as the key takeaway: we don't need to push harder on each individual team.
[00:20:33] Bochen Zhang: They already work hard. What we need to do is release the work into the system much more smartly. If we simply add more general contractor review capacity, the bottleneck doesn't disappear; it just moves downstream to the designer. And if we increase the designer's capacity, the bottleneck shifts upstream, often triggering even earlier and larger batch releases.
[00:20:58] Bochen Zhang: So the only sustainable way we can think of to stabilize this workflow is to control the work in process through a pull-based release system, where submittals are released based on available capacity rather than schedule-driven batching. In other words, the leverage point isn't working faster. It is controlling how much work enters the system.
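[Editor's note] A pull-based release policy of the kind described here, often called CONWIP in the operations science literature, can be sketched in a few lines. The WIP cap of 30 follows the optimal level quoted earlier; the backlog size and names are illustrative:

```python
from collections import deque

# Minimal sketch of pull-based release control: submittals enter the review
# system only when open work drops below a WIP cap, instead of arriving in
# schedule-driven batches.
WIP_CAP = 30

backlog = deque(f"submittal-{i}" for i in range(1, 101))  # planned submittals
open_work = []                                            # currently in review

def release():
    """Pull submittals from the backlog until the WIP cap is reached."""
    while backlog and len(open_work) < WIP_CAP:
        open_work.append(backlog.popleft())

release()
print(len(open_work))  # held at the cap, not a 100-item batch

# When a review completes, capacity frees up and the next item is pulled.
open_work.pop(0)
release()
print(len(open_work))
```

The design point is that the trigger for releasing work is a completion downstream, which is what keeps the system near its optimal WIP regardless of how the schedule front-loads the log.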
[00:21:21] Bochen Zhang: So finally, how can AI really help enable project production management in the submittal process? The key idea is that AI is not here to replace the reviewers. Its real value is in enabling production control, the foundation of predictable flow. So AI can support this in several ways. First, real-time visibility: by capturing the PPM metrics such as work in process, cycle time, and variability, AI can provide the
[00:21:53] Bochen Zhang: data backbone needed to monitor and manage the whole submittal approval system. Second, AI-enabled submittal planning: automated classification and prioritization help structure release plans based on actual capacity, not fixed schedules. And third, automated release control:
[00:22:14] Bochen Zhang: AI can detect upcoming overload, recommend batching strategies, and smooth the release of work. These capabilities allow teams to operationalize the PPM principles in day-to-day execution, something that has historically been difficult to do manually. And looking further ahead, a cloud-based AI agent integrated directly with a contractor and a vendor-validated product database could fundamentally transform how submittals are generated, validated, and approved.
[00:22:46] Bochen Zhang: To close, I'd like to echo a sentiment from one of the Mortenson team leaders: we consider AI not as a faster replacement for human reviewers; it needs to empower the project team and facilitate better process flow. So from here, let me hand over to Brian to close this presentation.
[00:23:09] Brian Nahas: Thank you. So when I think of the outcomes that we witnessed, just to give context: this is insight that was generated around one project. It reflected, maybe, the behavior patterns of different participants on that job. But I really want to look past that and think about the opportunity of scaling that view across,
[00:23:36] Brian Nahas: ideally, a domain, a part of our business, to the enterprise of our company. But idealistically, what would this look like across the industry? When you think of throughput and the transfer of bottlenecks, that's likely a common behavior in many of the workflows that we have in our industry. And what would that look like in the future?
[00:23:57] Brian Nahas: Secondly, how would that inform resource balancing? Where you would have overloading scenarios, what would it look like to predict or foresee them based on historical patterns of use, and how could that be used to proactively inform resource balancing to support the execution of the work?
[00:24:16] Brian Nahas: And then lastly, consistently when AI is thought about in this context, it's at the task level, to help the execution of a submittal review itself. How could we reframe that and think of how AI and other data collection and visualization could actually shape how we plan, assign, and optimize the workflow, as opposed to
[00:24:43] Brian Nahas: bolting on or embedding AI into the way we show up today and expecting better results? So I think the real question I'd like to leave this group with, open-ended to think about, is: do we want AI to simply do the work? Or would we like to leverage AI to redefine the future of our work and the future of our processes?
[00:25:08] Gary Fischer, PE: All right. Very good. Thanks. I think that's probably gonna be a common question for a lot of AI agents and AI applications. Let me ask one question: did you look at the requirements themselves? Were the submittals really necessary?
[00:25:33] Brian Nahas: I think that's a fundamental question for the industry to ask ourselves: which submittals that are part of the process are of higher value and necessity than others.
[00:25:44] Brian Nahas: In some of our earlier conversations, when we were reflecting on this opportunity, we did isolate a few sets of product data. Think of commodity items, right? They still enter the process,
[00:26:03] Brian Nahas: but it's a commodity item, most likely from a reputable vendor that has been in the industry for a long time. It's not a unique or novel product. It still goes through the same standard of care, of course, and the same workflow, alongside slightly more complicated, nuanced, specialized things that are very particular to that job and require a higher touch and engagement. So I think, right now, as far as the requirements,
[00:26:24] Brian Nahas: they're all defined equally. And I think it would be an interesting conversation for the industry to have around which types of submittals are important and which are less essential, mainly helping create bottlenecks and take up time.
[00:26:44] Gary Fischer, PE: Do you think AI could be educated to sort "this is really a no-big-deal submittal" versus "this is a big-deal submittal"?
[00:26:53] Brian Nahas: ideally.
[00:26:54] Brian Nahas: Yeah. I think there are general patterns, but ideally, yes.
[00:26:58] Gary Fischer, PE: I had a couple of other questions for you. Was the quality of submittals looked at as influencing reviewer overload through rework?
[00:27:09] Gary Fischer, PE: So were things recycling because of quality issues with the submittal?
[00:27:16] Bochen Zhang: From the data, I think in general the quality of the submittals is, from my experience, pretty high. But reviews and reworks still happen; there are different versions of the submittals in the system.
[00:27:33] Bochen Zhang: We captured them as individual data points to be included in the analysis. Yes, it's not a hundred percent on time; there are some submittals that take a longer time. But from the system perspective, what we are analyzing is the average cycle time, which actually counts all this time for reworks and re-reviews.
[00:28:06] Gary Fischer, PE: There's a companion question here: what about rejected submittals reentering the system, creating backlog? I guess that just complicates the issue.
[00:28:17] Brian Nahas: Yes, we did see those. And what was also interesting is that when we went into the data, we saw variability in how a team member might reintroduce one.
[00:28:31] Brian Nahas: And that was a flag that was raised just by this view of the data. In some instances, it was kept in the same flow and considered a revision. In others, it was closed, and effectively a new line item was opened and the process started again. That was a good learning for our organization, just around standard procedures.
[00:28:54] Brian Nahas: And again, at times of potential stress or overload, you may get unintended or unintentional deviation. Those were interesting flags that were called out during the data review.
[00:29:07] Gary Fischer, PE: Okay. We've got another one that says: can we make assumptions about the lean process after collecting data from the traditional process?
[00:29:20] Bochen Zhang: We could. Using operations science and the project production management framework, we can actually estimate the lean zone of system performance. That's the whole point of this project. And what we demonstrated is that the result could be compared to the actual times and durations.
[00:29:44] Bochen Zhang: And then we were able to reflect on that and say: it's not that this team is just perfect; it's because they're working really hard. And yes, the model we were simulating used normal work time, which is eight hours per day, but apparently people were spending more time outside those work hours.
[00:30:06] Bochen Zhang: So in that case, they were able to reduce the average cycle time, which made sure the project and the submittals were delivered on time. So the answer is yes: from today's data, we are able to model the process, find where the lean zone is, and start to reflect on that to improve the process toward the lean zone.
[00:30:32] Gary Fischer, PE: Alright, very good. I'm not seeing any more questions come in. That was an excellent application of operations science. Thank you.
[00:30:42] Brian Nahas: Yes, thank you. This was wonderful.
[00:30:43] Gary Fischer, PE: So thank you all.