Reducing Duplicate Work in Construction: How a CDX Can Transform the Submittal Process

How might we...In ways that...So that...

Introduction

In February, I had the opportunity to attend the 2023 AEC Integration Summit, an event hosted by the Construction Progress Collation (CPC).

About CPC

CPC is an organization dedicated to fostering a united front of construction supply chain stakeholders. Their mission is to address the shared challenges that arise from miscommunication and errors in the construction industry. They aim to fast-track the digital transformation of the sector, with a special emphasis on revolutionizing project performance measurement in the digital age.

CPC's Approach to Construction Challenges

CPC acknowledges that the main challenges in construction are more social and contractual than technical. They believe that performance measurement significantly impacts individual behavior. To enhance project delivery, they are developing a framework they call the Common Data Exchange (CDX), which promotes increased transparency, trust, and efficiency.

Addressing Data Standard Issues

In their approach to real-world data standard issues, CPC dissects each obstacle into four stages: Digest, Debate, Decide, and Deliver. This process establishes a transparent and accountable system, fostering the adoption of effective technology solutions, and ensuring data reliability. Their aspiration is to shift the burden of adoption from the user to the software provider, with the ultimate aim of improving construction performance.

The AEC Integration Summit Experience

The AEC Integration Summit is designed as a multi-day workshop with interactive presentations. This format makes it easy to meet fellow attendees, engage with speakers, and discuss ideas. Attendees are assigned to various teams to facilitate discussions and collaboration. I was assigned to the responsibility tag team.

Recap of the Discussion

For a detailed recap of the discussions and insights from the responsibility team, please refer to the attached video. The video provides a comprehensive overview of the topics discussed and the conclusions reached during the event.

Following up on my last article, I took a similar approach for this workshop, but with a different twist: I used AI to develop a technical document covering the recap and next steps. Once I get access to Claude with its 100K context window or GPT-4 32K, I will rerun some of these tests.

Here are the AI tools and how I used them:

  • LeMUR to transcribe the YouTube video
  • LeMUR to summarize the YouTube video
  • GPT-4 for implementation strategy
  • GPT-4 again for Azure platform suggestions
  • GPT-4 to combine the summary, implementation strategy, and Azure tech stack suggestions


Document for Team Review and Analysis

Objective

Our primary objective is to enhance the submittal process in order to minimize duplicate work and bolster transparency. We understand that while technology is a crucial component, addressing people and communication issues is equally important.

Initial Meetings: Setting the Stage

To tackle these issues, we propose conducting two pivotal meetings at the commencement of a project. The first meeting would be between the owner and the General Contractor (GC) to establish the submittal log. The second meeting would involve the GC and subcontractors to determine the mutually agreed upon review process.

The Role of Common Data Exchange (CDX)

Once the rules and processes are established, submittals would be entered into a Common Data Exchange (CDX). The CDX would then automatically distribute the necessary documents to each stakeholder based on the established rules. This system ensures that stakeholders receive only the documents they need to review, eliminating unnecessary clutter. The CDX also serves as a shared source of truth for submittal statuses, reducing the confusion caused by differing statuses in individual systems of record.
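To make the routing idea concrete, here is a minimal sketch in TypeScript. The types and the `routeSubmittal` function are illustrative assumptions for discussion, not part of any real CDX product; the rule shape stands in for whatever the kickoff meetings agree on.

```typescript
// Hypothetical shapes for a submittal and a routing rule; field names
// are assumptions, not a published schema.
interface Submittal {
  id: string;
  specSection: string;   // e.g. "23 05 00" (common work results for HVAC)
  discipline: string;    // e.g. "architectural", "mechanical"
}

interface RoutingRule {
  discipline: string;    // which submittals the rule matches
  reviewers: string[];   // stakeholders who must receive the documents
}

// Return only the reviewers whose rules match, so each stakeholder sees
// just the documents they are responsible for reviewing.
function routeSubmittal(submittal: Submittal, rules: RoutingRule[]): string[] {
  return rules
    .filter(rule => rule.discipline === submittal.discipline)
    .flatMap(rule => rule.reviewers);
}

const rules: RoutingRule[] = [
  { discipline: "mechanical", reviewers: ["mech-engineer@owner.example"] },
  { discipline: "architectural", reviewers: ["architect@owner.example"] },
];

const recipients = routeSubmittal(
  { id: "SUB-001", specSection: "23 05 00", discipline: "mechanical" },
  rules,
);
// recipients contains only the mechanical reviewer
```

The point of the sketch is that routing becomes pure data: change the agreed rules, and distribution changes with them, with no per-submittal email triage.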

Streamlining the Review Process

Upon review of the documents, the information is sent back to the CDX, which then notifies the GC of the approval. The GC aggregates the information and sends the approval back to the trade contractor, streamlining what is currently a lengthy back-and-forth process.
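The aggregation step the GC performs could look like the following sketch. The status values and the "most restrictive response wins" policy are assumptions chosen for illustration, not an industry standard.

```typescript
// Common submittal dispositions, simplified for the example.
type ReviewStatus = "approved" | "approved-as-noted" | "revise-and-resubmit";

// The overall result is the most restrictive individual response:
// a single "revise and resubmit" outweighs any number of approvals.
function aggregate(statuses: ReviewStatus[]): ReviewStatus {
  if (statuses.includes("revise-and-resubmit")) return "revise-and-resubmit";
  if (statuses.includes("approved-as-noted")) return "approved-as-noted";
  return "approved";
}

const overall = aggregate(["approved", "approved-as-noted"]);
// overall === "approved-as-noted"
```

Because the CDX holds every reviewer's response in one place, this aggregation can happen automatically the moment the last review lands, rather than waiting for someone to reconcile inboxes.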

Benefits of the System

The benefits of this system are manifold. It reduces duplicate information, provides clearly defined and shared statuses, identifies risk indicators like cost and schedule impacts, and offers a shared source of truth.

Balancing Idealism with Reality

While we initially took an idealistic “greenfield” approach, we recognize that in reality, integrating existing systems of record would likely be necessary. However, we emphasize that addressing the people and process issues is a prerequisite for technology to be effective.

The Importance of Pre-Planning

Extensive pre-planning to determine submittal requirements and rules upfront is crucial. With this approach, the package definition becomes less important as the system can track individual items through the review process. The key is to initiate conversations and planning among the right parties early on.

The Path to Improved Productivity

By defining submittal requirements, establishing rules to guide routing and review, and implementing a shared CDX, we believe that productivity and the submittal process could be significantly improved. The CDX allows for automatic routing of information to the correct stakeholders, reducing confusion and excess documents in individual inboxes. With a streamlined process and a single source of truth in place, we can increase transparency and efficiency at each step.


Schema Implementation Strategy

  1. Stakeholder Identification: Identify all stakeholders involved in the process, including owners, general contractors (GCs), subcontractors, and any others. Involve representatives from each group in the planning and implementation process to ensure all needs are being met.
  2. Define Process and Requirements: Hold initial meetings between the owner and the GC, and between the GC and the subcontractors, to clarify and agree upon the requirements and the process. This should include the submittal log, the review process, and the types of documents each stakeholder needs to receive.
  3. Establish Data Exchange Rules: Define the routing rules for the Common Data Exchange (CDX). This includes determining who receives what documents, how updates are communicated, and the process for approval.
  4. Select or Build a CDX: Choose a platform that can serve as your CDX. This could be an existing software tool, or we may need to build a custom solution. The CDX needs to be able to manage and distribute documents based on the rules established in step 3.
  5. Integrate Existing Systems: If necessary, work with IT to integrate any existing systems into the CDX. This may involve API integration or potentially more complicated solutions depending on the systems in use.
  6. Train Users: Once the CDX is set up, provide training to all users. This should cover how to submit and receive documents, how to check the status of a submittal, and how to communicate within the system.
  7. Implement and Monitor: Launch the new process, with the first submittals going through the CDX. Monitor the process closely at the beginning to catch any issues early.
  8. Review and Adjust: After a set period of time, review the process. Collect feedback from users and analyze any metrics we've been able to gather. Use this information to make adjustments and improve the system.
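A submittal log entry, as it might be agreed in the step 2 kickoff meetings and then tracked item by item through review, could be modeled like this. Every field name here is a hypothetical placeholder to anchor the discussion, not a published schema.

```typescript
// One row of the submittal log established by the owner and GC.
// The risk flags correspond to the cost and schedule indicators
// mentioned as benefits of the shared system.
interface SubmittalLogEntry {
  id: string;
  specSection: string;
  description: string;
  requiredBy: string;   // date the approved submittal is needed on site
  status: "draft" | "in-review" | "approved" | "rejected";
  riskFlags: { costImpact: boolean; scheduleImpact: boolean };
}

const entry: SubmittalLogEntry = {
  id: "SUB-014",
  specSection: "08 11 13",
  description: "Hollow metal doors and frames",
  requiredBy: "2023-09-01",
  status: "draft",
  riskFlags: { costImpact: false, scheduleImpact: true },
};
```

Agreeing on a shape like this up front is what lets the CDX track individual items through review, which is why the package definition becomes less important.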


Technology Stack

Our proposed technology stack is based on the Microsoft Azure platform:

  1. Azure Active Directory (Azure AD): To manage user identities and access controls.
  2. Azure SQL Database or Cosmos DB: To store structured data like user information, document metadata, and submittal logs.
  3. Azure Storage (Blob Storage): For storing and retrieving large amounts of unstructured data, such as the actual documents involved in the submittal process.
  4. Azure Logic Apps: To create workflows that can automate the process and rules for routing documents and notifications.
  5. Azure Functions: For serverless compute operations that respond to events (like a new document being uploaded or a review being completed), perform transformations or calculations, and carry out other custom logic.
  6. Azure API Management: To create, publish, maintain, monitor, and secure APIs that access our backend services and data.
  7. Azure Notification Hubs: For sending push notifications to users when a document is approved, a new task is assigned, etc.
  8. Azure DevOps: For planning, developing, testing, and delivering the software. Azure Pipelines can be used for CI/CD.
  9. Azure Monitor and Application Insights: For tracking the performance and usage of our apps and backend services, detecting anomalies, and diagnosing issues.
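The event-driven pattern that Azure Functions would implement (item 5 above) can be sketched in plain TypeScript: each CDX event type maps to a handler. This is a conceptual illustration only, not the Azure Functions SDK, and the event names are invented for the example.

```typescript
// Two hypothetical CDX events, modeled as a discriminated union.
type CdxEvent =
  | { kind: "document-uploaded"; submittalId: string }
  | { kind: "review-completed"; submittalId: string; approved: boolean };

// In Azure, each branch would be its own function with its own trigger.
function handle(event: CdxEvent): string {
  switch (event.kind) {
    case "document-uploaded":
      // e.g. a Blob Storage trigger that kicks off routing via Logic Apps.
      return `route ${event.submittalId} to reviewers`;
    case "review-completed":
      // e.g. a queue trigger that notifies the GC via Notification Hubs.
      return `notify GC: ${event.submittalId} ${event.approved ? "approved" : "rejected"}`;
  }
}

const action = handle({ kind: "review-completed", submittalId: "SUB-001", approved: true });
// action === "notify GC: SUB-001 approved"
```

Keeping each reaction small and event-scoped is what makes the serverless pieces of the stack compose: storage, workflow, and notification services only ever exchange events like these.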

On the frontend:

  1. Angular, React, or Vue.js: To build the user interface of the web application. The choice will depend on our team's expertise.
  2. TypeScript/JavaScript: The main programming language for our frontend.
  3. Azure Static Web Apps: To serve our frontend application.


Update: Got Access to GPT-4 with Web Browsing

As I was about to publish this, I got access to GPT-4 with web browsing. The Microsoft Build conference is ongoing, so the servers are being slammed right now; I will update this section as things progress. Since GPT-4's knowledge cutoff is 2021 at the time of writing, I used browsing to get more current information on the Microsoft Azure tech stack, in case someone is interested in building this solution.

From the Azure Data Share documentation, here are some key points:

  • Data Sharing: Azure Data Share allows sharing of data, regardless of format or size, from multiple sources with other organizations. You can easily control what you share, who receives your data, and the terms of use. It provides full visibility into your data sharing relationships through a user-friendly interface.
  • Data Governance: The service enables you to track and manage your data sharing relationships easily and efficiently. You can view who you have shared data with and when the data is accepted. It also allows you to stop future updates from flowing through at any time.
  • Analytical Datasets Expansion: Azure Data Share helps combine internal data with partner data for new insights. You can share and receive data in any format to or from Azure Synapse Analytics, Azure SQL Database, Azure Blob Storage, Azure Data Lake Storage, and Azure Data Explorer. The service is expected to support more Azure data stores in the future.
  • Security and Compliance: Azure invests heavily in cybersecurity, employs a large number of dedicated security experts, and has more certifications than any other cloud provider. This investment extends to Azure Data Share, ensuring that data shared through the service is secure and compliant.
  • Pricing: Azure Data Share operates on a pay-as-you-go model with no upfront costs, no infrastructure to set up, and no server to provision. You only pay for what you use.
  • Supported Datasets: The service currently supports unstructured and structured datasets.


Conclusion

Organizations that send members to events like these will be able to pass insights back to the rest of their teams more effectively and comprehensively. Events like this are valuable, but after traveling I'm personally exhausted, and the fatigue and stress of travel don't help human memory. I hope this inspires you to keep exploring ways you can use AI to assist with your overall productivity.
