From Requirements to Customer Product, or the Benefits of Integrating Systems Engineering and Product Engineering

Many product development challenges start with a disconnect: requirements are defined in one tool, systems are designed somewhere else, and the engineering product structure lives in yet another system. The result is lost traceability, unclear responsibilities, and product structures that do not reflect the intended architecture. A more effective approach is to bring Systems Engineering and Product Engineering together in a continuous, integrated environment: Requirements → System Breakdown Structure (SBS) → 150% EBOM → Configured 100% products.

The journey starts with requirements. These capture what the product must do: performance targets, regulatory constraints, operational needs, and customer expectations. Requirements describe capabilities, not components.

From these requirements, systems engineers develop the System Breakdown Structure (SBS). The SBS decomposes the product into systems and subsystems based on functional responsibility: propulsion, control, energy, structure, electronics, and so on. Each system becomes responsible for fulfilling a specific set of requirements and for defining its interfaces to other systems. Here the product architecture begins to take shape.

Product engineering then translates this architecture into the physical product structure. Each system defined in the SBS is implemented as a module or assembly in the Engineering Bill of Materials (EBOM). To support product families and variants, this is typically represented as a 150% EBOM containing all modules and variant options across the platform. From the 150% EBOM, configuration logic then selects the appropriate modules to create a specific 100% product EBOM for a customer order, region, or production variant.

When this process is executed in an integrated environment, powerful benefits emerge. Requirements remain traceable to the systems that fulfill them. Systems remain linked to the modules and assemblies that implement them. Changes in requirements or architecture can be traced directly to the affected product structures and configurations, and determining technical and financial impacts becomes quick and easy.

This integration also supports better modularization as requirements change. Systems engineering defines clear functional boundaries and interfaces, which translate into well-defined product modules in the EBOM.

In short, integrating systems engineering with product engineering creates a continuous digital thread: Requirements → Systems → Modules → Product Family → Customer-Specific Product Configuration. That integration is what ultimately enables companies to build complex, configurable products faster, with better control over architecture, variants, and lifecycle changes, and to quickly configure a product that meets specific customer requirements.
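The selection step from a 150% EBOM to an order-specific 100% EBOM can be sketched in a few lines of Python. The module names, option codes, and rule format below are invented for illustration, not taken from any particular PLM tool.

```python
# Hypothetical sketch: deriving a 100% product EBOM from a 150% EBOM.
# The 150% EBOM lists every module on the platform, together with the
# variant options under which it applies ("*" = always included).
EBOM_150 = {
    "chassis":          {"*"},
    "combustion_drive": {"ICE"},
    "electric_drive":   {"EV"},
    "battery_pack":     {"EV"},
    "fuel_tank":        {"ICE"},
    "infotainment_eu":  {"EU"},
    "infotainment_us":  {"US"},
}

def configure(options: set[str]) -> list[str]:
    """Select the modules whose conditions are met by the chosen options."""
    return sorted(
        module for module, conditions in EBOM_150.items()
        if "*" in conditions or conditions & options
    )

# A customer order for a European electric vehicle yields a 100% EBOM:
print(configure({"EV", "EU"}))
# → ['battery_pack', 'chassis', 'electric_drive', 'infotainment_eu']
```

Real configurators use richer constraint logic (exclusions, cardinalities, defaults), but the principle is the same: one superset structure, many derived product structures.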
System Integration in Product Development
Summary
System integration in product development refers to seamlessly combining various engineering processes, tools, and data flows to create products that meet customer requirements, support customization, and streamline production. This approach connects requirements, design, and manufacturing into a unified workflow, making it easier to track changes and configure products for different needs.
- Connect teams early: Encourage collaboration between engineering, production, and planning teams from the start to ensure everyone shares the same understanding of product requirements and design choices.
- Build digital workflows: Use integrated platforms and data models to link requirements, design structures, and production steps, allowing quick adjustments and traceability throughout the product lifecycle.
- Test and refine: Regularly check integration points and monitor system performance to spot issues early and keep product customization accurate and efficient.
For decades, the V-Model has been a cornerstone development methodology for complex mechatronic systems. However, increasing complexity, shorter development cycles, and growing uncertainty in supply chains have sparked an intense debate about its continued validity. Critics argue that the V-Model is too rigid, reinforcing siloed domain development, an approach that appears increasingly outdated in a world dominated by E/E and embedded software. As alternatives, CI/CD-inspired approaches from software engineering or newer I-Model-based processes are proposed, emphasizing continuous system integration and unified data models. Spoiler: in this post, I'm not advocating for any one of these approaches. Instead, I want to highlight which elements should be considered.

The V-Model

One of the greatest strengths of the V-Model is its clarity. It breaks down highly complex development processes into manageable sub-processes, assigns responsibilities across domains, and creates a shared understanding of product development. The traditional temporal separation into system design, development, and integration is increasingly challenged by simulation-driven system integration, which is why the classic "left and right flank" is often considered outdated. That criticism is valid, but as we all know: "All models are wrong, but some are useful." Simulation and AI will replace large portions of physical system integration, but not all of it. The right flank of the V-Model still has a reason to exist.

CI/CD

Some argue that CI/CD practices from software development are the right answer to manage complexity and ensure agility. And indeed, especially at the component level, tight coupling of CAD, simulation, and automated test pipelines enables rapid exploration and optimization of design variants. Designs whose quality can be quantified within seconds or minutes via fast feedback loops are prime examples of how CI/CD can dramatically accelerate product development.
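As a toy illustration of such a fast feedback loop, the sketch below gates design variants on an automatically computed quality score, the way a software pipeline gates on tests. The variants and the stiffness-per-mass "quality model" are invented stand-ins for a real simulation step.

```python
# Illustrative CI-style quality gate for design variants. The evaluate()
# function is a made-up stand-in for a fast simulation whose result is
# available in seconds; a real pipeline would call a solver here.

def evaluate(variant: dict) -> float:
    """Stand-in quality metric: stiffness per unit mass."""
    return variant["stiffness"] / variant["mass"]

def ci_gate(variants: list[dict], threshold: float) -> list[str]:
    """Keep only the variants that pass the automated quality gate."""
    return [v["name"] for v in variants if evaluate(v) >= threshold]

variants = [
    {"name": "bracket_v1", "mass": 2.0, "stiffness": 50.0},
    {"name": "bracket_v2", "mass": 1.5, "stiffness": 45.0},
    {"name": "bracket_v3", "mass": 1.2, "stiffness": 30.0},
]
print(ci_gate(variants, threshold=26.0))  # → ['bracket_v2']
```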
Integrated I-Model

Early system integration becomes possible when system-wide data models (an engineering data backbone) guide the entire development process. This allows partial validation (and even verification) of the system very early on. Increasingly realized through MBSE, RFLP, and coupled simulations (co-simulation), these approaches help identify incompatibilities and design flaws while they can still be eliminated efficiently through simulation. As a result, the left flank of the V-Model is massively strengthened, design spaces can be explored much more deeply, and parts of the traditional right flank effectively move to the left.

🔍 Conclusion

From my perspective, the V-Model will evolve, not disappear. It will adapt and absorb elements from CI/CD and integrated I-Model approaches rather than becoming obsolete. What's your take on this evolution?

Sebastian Angerer | Vlad Larichev | Nitin Ugale | Dr. Pascalis Trentsios | Andreas Kiep

#SystemsEngineering #ProductDevelopment #MBSE #DigitalEngineering
-
If you really want to develop good AI product sense, study what Anthropic is doing. While OpenAI is busy copying browsers and agent platforms, Anthropic is quietly playing a masterful product game with releases like MCP, Skills, Claude Code, and focused models. Product sense is about finding what your users actually need and solving that elegantly. To understand this better, let's tear down MCP (Model Context Protocol) for product sense:

1. Purpose: For Users + Business
For developers: make LLMs context-aware and easily connect models to external data, tools, and systems without reinventing integrations each time.
For Anthropic: become the "USB-C of AI." If every tool and model connects through MCP, Anthropic controls the plumbing of the AI ecosystem, the connective tissue for the agent era.

2. Problems MCP Solves
Devs: every integration (CRM, Slack, Notion) used to need custom glue code. Context for models was fragmented and hard to maintain.
Enterprises: want models to act on internal data securely, but integration overhead and governance risk make it hard.
Anthropic: couldn't scale the ecosystem if every Claude integration were different. Needed a standard protocol devs can build on.

3. Product Questions to Ask (great products are an outcome of hard, deep questions)
What does the developer journey look like today when integrating AI with live data?
How can we abstract the integration layer without losing flexibility?
What primitives (Tools, Resources, Prompts) do we need to make this standard reusable?
How do we ensure security and trust in every connection?
What would make it 10× better - faster, safer, more discoverable?
Who should adopt it first - IDEs, data tools, or enterprise apps?

4. Metrics to Measure Success
Developer adoption: number of MCP servers built, SDK installs and connector reuse rate, average time to build a new connector.
Enterprise impact: time saved integrating internal data, number of AI features shipped using MCP, security incidents avoided or mitigated.
Ecosystem growth: number of partners/tools supporting MCP, requests routed via MCP per day, developer satisfaction (DX NPS).

🚀 5. Roll-Out Strategy
Phase 1: Build reference connectors (GitHub, Slack, Drive) + SDKs.
Phase 2: Open-source spec → community adoption (YouTube buzz) → marketplace.
Phase 3: Enterprise integrations + certification layer (trust & audit).
Phase 4: Ecosystem scale - 100s of connectors, governance, automation.

6. Product Reflections
Anthropic's bet: AI will be won not only by the best model, but by the best connectivity layer.
Risk: standards succeed only if the ecosystem aligns - security and adoption will decide the winner.
Lesson for PMs: building infra products isn't about features; it's about creating compounding leverage for others to build faster.

Reverse engineering Anthropic's strategy will give you amazing lessons for this AI world. They're not just building models; they're building infra tools for the future of intelligence.
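The integration pattern MCP standardizes, register a capability once and let any client discover and call it uniformly, can be sketched with a toy registry. To be clear, this is not the real MCP SDK; every name and shape below is invented purely to illustrate the pattern.

```python
# Toy sketch of the "standard connector" pattern. A tool is registered
# once and any client can discover and invoke it through one interface,
# instead of each integration needing custom glue code.

class ToolRegistry:
    def __init__(self):
        self._tools = {}

    def register(self, name: str, fn) -> None:
        """Expose a capability once; every client reuses it."""
        self._tools[name] = fn

    def list_tools(self) -> list[str]:
        """Discovery: clients see a uniform catalog of capabilities."""
        return sorted(self._tools)

    def call(self, name: str, **kwargs):
        """Uniform invocation, regardless of what backs the tool."""
        return self._tools[name](**kwargs)

registry = ToolRegistry()
registry.register("search_crm", lambda query: f"CRM results for {query!r}")
registry.register("read_notion", lambda page: f"contents of {page!r}")

print(registry.list_tools())                      # → ['read_notion', 'search_crm']
print(registry.call("search_crm", query="acme"))  # → CRM results for 'acme'
```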
-
IT/OT integration is how you de-risk growth. If the top floor can't see the shop floor in real time, quality slips, downtime grows, and batch release slows. In our world of compliance and complex supplier networks, blind spots turn into audit findings and missed delivery windows.

Here's the core move I see working. Combine the real and digital worlds across product and production so horizontal data flows become routine. Think engineering models, test results, materials, building processes, automation code, and performance data moving between teams. Then connect the vertical path: executives, planners, and operators sharing the same context so decisions line up with actual conditions. That's where you get predictive maintenance instead of unplanned stops, data-centric supply chain adjustments instead of last-minute expedites, energy transparency that feeds credible sustainability metrics, and stronger cybersecurity plans that account for both IT and OT exposure.

Pharma adds constraints, but the pattern still holds. IoT devices can read modern and legacy equipment, extending the digital thread into your supplier ecosystem so logistics, production timing, and potential disruptions show up early. A closed loop between development, production, and optimization tightens traceability and speeds corrective action. Digital twins let engineering teams iterate quickly on both process and line design without risking validated operations.

Pick one high-stakes decision and wire it end to end. For many, that's batch release. Map the horizontal data you need across quality tests, materials, and line performance. Then build the vertical connection so insights reach the teams that plan, schedule, and approve. Keep the scope small, include cybersecurity from day one, and define the single source of truth for that decision. When it works, scale to the next decision.
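The "wire one decision end to end" advice can be pictured as a single function that combines the horizontal inputs into one batch-release verdict. The field names and rules below are illustrative assumptions, not a validated pharma workflow.

```python
# Hypothetical sketch: one high-stakes decision (batch release) wired
# end to end. The horizontal inputs (quality, materials, deviations)
# feed a single, explicit rule so planners and approvers share one
# source of truth. Fields and thresholds are invented for illustration.

def batch_release(batch: dict) -> bool:
    """Release only when every horizontal input is green."""
    return (batch["quality_tests_passed"]
            and batch["materials_traceable"]
            and batch["deviations_open"] == 0)

good_batch = {"quality_tests_passed": True,
              "materials_traceable": True,
              "deviations_open": 0}
held_batch = {"quality_tests_passed": True,
              "materials_traceable": True,
              "deviations_open": 2}

print(batch_release(good_batch))  # → True
print(batch_release(held_batch))  # → False
```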
-
SAP VC Integration with Production Planning (PP)

Key Integration Steps:

1. Configuration Profiles:
• Define Profiles: Set up configuration profiles in SAP VC to manage configurable products.
• Create Dependencies: Establish dependencies and rules linking product configurations to BOMs and routings.

2. Master Data Setup:
• Material Master: Configure materials with the necessary views (SD, MRP, Production).
• BOM: Create a super BOM with all potential components. Use dependencies to select components based on the configuration.
• Routings: Develop super routings covering all operations. Dependencies determine the necessary operations for each configuration.

3. Variant Configuration in Sales:
• Sales Orders: Configure products in sales orders to specify characteristics. The system selects the appropriate BOM components and routing operations.

4. Transfer to Production Planning:
• Planned Orders: Sales order configurations transfer to planned orders in PP, generating production orders with specific components and operations.
• Production Orders: Convert planned orders to production orders tailored to customer configurations.

5. Integration Points:
• MRP Run: MRP considers configurations to generate planned orders.
• Capacity Planning: Ensure capacity planning accounts for different configurations.
• Shop Floor Control: Use production orders to manage and control shop floor operations.

Benefits of Integration:
• Customization and Flexibility: Extensive customization without separate BOMs and routings for each variant.
• Efficiency: A streamlined manufacturing process with production orders linked to customer configurations.
• Accuracy: Reduced errors through automated selection of BOM components and routing operations.
• Cost Reduction: Fewer master data records needed for different product variants.

Implementation Tips:
• Testing: Test integration scenarios to ensure configurations influence BOMs and routings correctly.
• Training: Train sales and production planning teams on handling configurable products.
• Monitoring: Continuously monitor the integration to ensure smooth operations and address issues promptly.

By following these steps, SAP VC can be effectively integrated with PP, enhancing manufacturing processes and meeting customer-specific requirements efficiently.
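The super-BOM idea behind steps 2 and 3 can be sketched as follows. The characteristics, components, and selection conditions are invented examples, not real SAP object dependencies or master data.

```python
# Hedged sketch of the super-BOM pattern: one BOM carries all potential
# components, and per-component selection conditions (the role object
# dependencies play in SAP VC) pick the right ones for a configuration.
# Characteristic names and rules here are purely illustrative.

SUPER_BOM = [
    # (component, selection condition on the configuration)
    ("frame",       lambda cfg: True),                    # always included
    ("motor_small", lambda cfg: cfg["power"] == "low"),
    ("motor_large", lambda cfg: cfg["power"] == "high"),
    ("paint_red",   lambda cfg: cfg["color"] == "red"),
    ("paint_blue",  lambda cfg: cfg["color"] == "blue"),
]

def explode(cfg: dict) -> list[str]:
    """Resolve the super BOM into the order-specific component list."""
    return [comp for comp, cond in SUPER_BOM if cond(cfg)]

order = {"power": "high", "color": "red"}
print(explode(order))  # → ['frame', 'motor_large', 'paint_red']
```

The same conditional-selection idea applies to super routings, with operations selected instead of components.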
-
Teams Usually Don't Realize How Complex a Full BOM Flow Really Is

A Bill of Materials is not "just a parts list." It's the backbone that connects design, engineering, manufacturing, supply chain, quality, and service into one controlled, traceable product lifecycle. If your BOM breaks, your entire production line breaks. Here's the simplified breakdown:

🔹 Data Sources: Your BOM starts with CAD models, specs, standards, and legacy data: everything that defines the product before it even exists physically.

🔹 Engineering BOM (EBOM): Engineering structures the product logically, manages revisions, assigns part numbers, and controls design changes.

🔹 BOM Governance & Change Control: Every change goes through ECR/ECO workflows to ensure quality, cost, and manufacturability are assessed before approval.

🔹 BOM Transformation (EBOM → MBOM): Engineering intent is transformed into a manufacturable structure, aligning assemblies, alternates, substitutes, routing, and plant-specific needs.

🔹 Manufacturing BOM (MBOM): Manufacturing defines processes, scrap factors, tooling, and consumables, and ensures everything is production-ready.

🔹 System Integration: PLM, ERP, MES, and supplier systems work together so the BOM flows across planning, procurement, production, and partners.

🔹 Manufacturing Execution Feedback: What gets built is captured, deviations are logged, quality data is tracked, and real-world insights move back into engineering.

🔹 Traceability & Continuous Improvement: As-built, as-designed, and as-maintained BOMs are kept in sync, providing compliance, service BOMs, audits, and a continuous feedback loop.

The strongest manufacturers don't just manage BOMs; they manage BOM intelligence across every system, every change, and every stage of the lifecycle. Great products are built when design, engineering, and manufacturing speak the same BOM language.
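The EBOM → MBOM transformation step can be sketched minimally as regrouping engineering lines into manufacturing quantities and applying plant scrap factors. The part numbers and factors below are illustrative assumptions, not real master data.

```python
# Minimal sketch of an EBOM → MBOM transformation: duplicate engineering
# lines collapse into quantities, and plant-specific scrap factors
# inflate them for production. All identifiers are invented examples.

EBOM = {
    "BIKE-001": ["FRAME-100", "WHEEL-200", "WHEEL-200", "DRIVE-300"],
}

def to_mbom(ebom: dict, scrap: dict) -> dict:
    """Collapse duplicate lines into quantities and apply scrap factors."""
    mbom = {}
    for assembly, parts in ebom.items():
        lines: dict = {}
        for p in parts:
            lines[p] = lines.get(p, 0) + 1
        # inflate each quantity by the plant scrap factor (default 0%)
        mbom[assembly] = {
            p: round(qty * (1 + scrap.get(p, 0.0)), 2)
            for p, qty in lines.items()
        }
    return mbom

print(to_mbom(EBOM, scrap={"WHEEL-200": 0.05}))
# → {'BIKE-001': {'FRAME-100': 1.0, 'WHEEL-200': 2.1, 'DRIVE-300': 1.0}}
```

A real transformation also handles alternates, substitutes, phantom assemblies, and routing assignment, which is exactly why this step needs governed workflows rather than ad hoc exports.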
For a deep dive into PLM, MES, or CAD, and to elevate your understanding of PLM, connect with us at PLMCOACH and follow Anup Karumanchi for more such information. #plmcoach #plm #teamcenter #siemens #3dexperience #3ds #dassaultsystemes #windchill #ptc #training #plmtraining #architecture #mis #delmia #apriso #mes
-
Real Product Lifecycle Management (PLM) with process-focused integration of the business applications PDM, ERP, SCM, MES, CRM, etc.

To carry out the entire process of manufacturing a product digitally, without any media disruption, data must flow through almost all areas of a manufacturing company. Since this data is created and managed in several systems according to their functionality, a controlled data flow between these systems must be possible. This can be achieved by implementing classic direct interfaces between the relevant systems (PDM–ERP, ERP–MES, ERP–CRM, etc.). Depending on the number of systems, the effort involved can be relatively large. In addition, the overarching process reference is missing.

Another way to enable data flow between the installed business applications is to use a BPM system. On the one hand, Business Process Management allows you to define cross-system processes. This is important because processes do not stop at system boundaries. On the other hand, it offers the possibility of creating user portals in process tasks. The web services of the installed business applications can be used from these portals, allowing data to be read, created, and modified in parallel in different systems.

Together, the business applications PDM, ERP, SCM, MES, CRM, etc. essentially form a virtual IT enterprise tool. There is no leading system in this approach; each of the business applications fulfills a specific task in the overall process. Nevertheless, the PDM system plays a key role, since the virtual product is created in development, design, work planning/CNC programming, and so on. Its data is the source for all product-related subsequent processes in ERP, SCM, MES, CRM, etc. Here the acronym PDM does not stand for CAD data management software, but for a fully featured system for digitizing all engineering and engineering-related processes.

In this approach, PLM is not a single system but the result of the process-focused use of specialized business applications. Anyone who uses the term PLM for a single piece of software for managing a virtual product will repeatedly be confronted with communication problems and will have some explaining to do in a real PLM project.

Note: In the next post I will show the principle of process-oriented use of business system data using a BPM system.

#ProductLifecycleManagement #BusinessProcessManagement #PLM #PDM #ERP #SCM #MES #CRM #BPM
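The process-focused idea, a BPM layer driving one cross-system process in which no single application leads, can be sketched like this. The systems, steps, and part number are illustrative; a real implementation would invoke each application's web services.

```python
# Illustrative sketch: a BPM engine executing one cross-system process.
# No application "leads"; each step stands in for a web-service call to
# one business application. System names and tasks are invented.

PROCESS = [
    ("PDM", "release design data"),
    ("ERP", "create material master"),
    ("MES", "provision work instructions"),
]

def run_process(part: str) -> list[str]:
    """Execute every step of the cross-system process for one part."""
    return [f"{system}: {task} for {part}" for system, task in PROCESS]

for step in run_process("PN-4711"):
    print(step)
```

The point of the sketch is the shape: the process definition lives above the systems, so adding SCM or CRM steps extends the process rather than adding another point-to-point interface.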
-
How do you connect engineering and manufacturing?

Just before we move into 2026, I want to share some thoughts about the progress with ERP integrations. This is one of the most popular questions you can hear. In conversations with prospects and customers, I keep hearing the same frustration, even when it's not always stated directly. CAD to ERP, PLM to ERP... you name it. Engineering, manufacturing, and procurement are all working with product data, but they're not working from the same understanding of it. The data can flow (somehow), but exports are manual and the data isn't aligned. Teams make mistakes. Extra checks are required. Side spreadsheets show up "just in case." Not because something is broken, but because intent, readiness, and decisions don't reliably survive the handoff from engineering to manufacturing.

For as long as I can remember, the industry has tried to solve this by moving data faster and more reliably between systems: CAD-to-ERP sync, PLM-ERP integrations, and so on. That helped, but it didn't solve the real problem. The real issue is that integration has been treated as data transfer, not as a guided workflow.

In the latest OpenBOM article, I share how we're rethinking engineering-to-manufacturing integrations for 2026: moving from manual, spreadsheet-like handoffs to agentic, state-driven workflows built on product memory, and exploring where AI can help guide decisions and build an integration instead of just pushing data. From "systems are connected" to "decisions stay aligned."

👉 ERP Integrations in 2026: From Connectors to Agentic Workflows [https://lnkd.in/e8bA4R35]

Curious whether this matches what you're seeing between engineering and manufacturing today.

#3DCAD #PLM #ERP
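A minimal sketch of what "state-driven" could mean in practice, assuming a hypothetical release workflow: the integration permits only defined transitions, so a handoff that skips a state fails loudly instead of silently drifting out of alignment.

```python
# Hypothetical sketch of a state-driven handoff. Instead of tracking
# "the export ran", the integration tracks explicit states; only the
# defined transitions are legal. States and events are invented here.

TRANSITIONS = {
    ("draft", "release"):     "released",
    ("released", "transfer"): "in_erp",
    ("in_erp", "confirm"):    "aligned",
}

def advance(state: str, event: str) -> str:
    """Apply one event; reject anything outside the defined workflow."""
    key = (state, event)
    if key not in TRANSITIONS:
        raise ValueError(f"illegal transition: {event!r} from {state!r}")
    return TRANSITIONS[key]

state = "draft"
for event in ("release", "transfer", "confirm"):
    state = advance(state, event)
print(state)  # → aligned
```

Trying to transfer an unreleased item (`advance("draft", "transfer")`) raises immediately, which is the whole point: misalignment surfaces at the handoff, not weeks later on the shop floor.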
-
Target Architecture for a Manufacturing Company (Integrating ERP, MOM, PLM, and IIoT into a Unified Platform)

Key Principles
· Business-Outcome Driven: Focus on measurable KPIs like OEE improvement, downtime reduction, and cost optimization.
· Hybrid and Scalable: Leverage edge and cloud for optimal performance and compliance.
· Secure by Design: Implement Zero Trust and end-to-end security.
· Open Standards and Interoperability: Use protocols and standards like OPC UA, MQTT, and ISA-95.
· Data Governance First: Ensure data harmonization, lineage, and quality control.

Key Functions

A. Capabilities and apps layer
Apps covering specific use cases, e.g., predictive maintenance or automated error detection, that build upon standardized platform functionality. Apps provided by a third party or the platform provider and available via an app store, e.g., overall equipment effectiveness for machines.

B. Analytics and data platform
Standardized (self-service) reporting, analytics, visualization, or location services available via API to all apps, utilizing best-in-class algorithm libraries. Integration and harmonization of data, taking the semantics of different protocols and machines into account.

C. Operations services
Highly scalable services handling basic platform functionality such as device management (e.g., rights and roles, access management), service hosting, deployment and administration (e.g., activity monitoring, resource use), connectivity, and security (e.g., encrypted data exchange, public key infrastructure, certificates), available to all sites and based on microservices and APIs.

D. Integration into enterprise IT systems
Interface to enterprise-level software, e.g., ERP, SCM, PLM, or CAD, by aggregating data and information generated in the app or analytics-and-data-platform layers into formats processable by enterprise-level software. Enterprise-level software with access to the analytics and data platform, and potentially also to apps, via API to perform processing that is not natively available.

E. Integration of the IIoT platform with MOM
Integration of the IIoT platform with the MOM layer to enable detailed scheduling of production, shifts, orders, and overall lines, plus configuration and status information as input for operations analytics (quality, asset maintenance, overall equipment effectiveness) and other custom apps.

F. SCADA, edge gateways, and machine-level connectivity
Data routing and exchange with edge devices and machines, including data-flow prioritization engines for forwarding raw or preprocessed data to the cloud. Data routing, prioritization, and storage enabled by on-site processing and storage within edge gateways. Easy integration of devices into the platform via plug and play.

"The Target Architecture Readiness Checklist is available with Team Transform Partner, if anyone wants access."

Source: Some inputs from McKinsey
Transform Partner – Your Strategic Champion for Digital Transformation
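Layer F's data-flow prioritization can be pictured as a small routing rule inside an edge gateway. The signal types and destination names below are invented for illustration, not taken from any particular IIoT product.

```python
# Illustrative sketch of edge-gateway data prioritization (layer F):
# alarms go to the cloud immediately, aggregates go in batches, and raw
# high-frequency data stays in the local buffer. Topics are invented.

def route(signal: dict) -> str:
    """Decide where one machine signal is forwarded."""
    if signal["type"] == "alarm":
        return "cloud/immediate"      # forward without delay
    if signal["type"] == "aggregate":
        return "cloud/batch"          # forward in periodic batches
    return "edge/local-buffer"        # raw data stays on site

print(route({"type": "alarm", "machine": "press-01"}))  # → cloud/immediate
print(route({"type": "raw", "machine": "press-01"}))    # → edge/local-buffer
```

In a real deployment this rule would sit behind protocols such as OPC UA or MQTT and be configurable per site, but the prioritization decision itself stays this simple in shape.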
-
I've seen too many integration projects go sideways because teams skipped the most important step: opening properly. When a stakeholder says "we need to integrate ERP with MES," your first move shouldn't be scheduling a requirements workshop. It should be applying ISA-95 as an analysis framework.

Four questions:
1. System levels? (L4→L3, L3→L3)
2. Domain? (production, maintenance, quality, inventory)
3. Information exchange type? (definition, capability, schedule/request, performance/response)
4. Repeatable patterns? (Work Masters vs Operations Definitions)

The Amárach StackWorks article uses a production order integration to show how this works. Takes 30 minutes. Prevents months of scope creep. You know what systems are in scope, what domain you're working in, what ISA-95 models you need, and how the work decomposes, before you've written requirements or facilitated a single Event Storming session. That's opening properly. It changes everything.
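The four questions above can be captured as a small checklist object. The field names mirror the post's framework; the values in the ERP-to-MES production-order example are illustrative assumptions.

```python
# Sketch of the ISA-95 "opening" checklist as a data structure: answer
# the four questions before writing requirements. The example values
# for a production-order integration are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Isa95Opening:
    levels: str    # e.g. "L4→L3" or "L3→L3"
    domain: str    # production, maintenance, quality, or inventory
    exchange: str  # definition, capability, schedule/request, performance/response
    pattern: str   # the repeatable ISA-95 model being exchanged

    def scope_statement(self) -> str:
        """One sentence that frames the integration before any workshop."""
        return (f"{self.levels} integration in the {self.domain} domain, "
                f"exchanging {self.exchange} information via {self.pattern}.")

production_order = Isa95Opening(
    levels="L4→L3",
    domain="production",
    exchange="schedule/request",
    pattern="Operations Schedule",
)
print(production_order.scope_statement())
```

Filling in those four fields is the 30-minute exercise; the scope statement is what keeps later requirements sessions from drifting.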