Can we improve how we assess interdisciplinary research proposals? Having spent 10 years working in interdisciplinary research, I saw firsthand how funding panels struggle to evaluate these proposals fairly. Yet interdisciplinary research is essential for tackling complex societal challenges, from climate change to health inequalities. If we don’t get the assessment process right, we risk undervaluing the very research that can drive real-world change.

A recent paper, "How qualitative criteria can improve the assessment process of interdisciplinary research proposals" by Schölvinck et al. (2024), highlights how qualitative criteria, rather than rigid metrics, can enhance fairness and clarity in the review process.

Key takeaways:

✅ Qualitative criteria improve assessment 💡 Panels benefited from structured discussions on integration, feasibility, and impact rather than relying on narrow scoring systems.

✅ Interdisciplinary research needs different evaluation tools 📖 Disciplinary review panels often struggle with proposals that cross boundaries, reinforcing barriers to truly integrative research.

✅ Team composition & institutional support matter 🤝 Successful interdisciplinary projects rely on collaborative skills, open-mindedness, and strong institutional backing, factors that traditional funding criteria can overlook.

✅ Bibliometric indicators have limitations 📊 While tools like the Rao-Stirling index can measure interdisciplinarity, they don’t capture the conceptual or methodological depth of collaboration.

💬 What’s next? This study reinforces something I’ve long believed: peer review processes must adapt to recognise the unique nature of interdisciplinary research (IDR). Instead of rigid scoring, qualitative criteria should guide panel discussions rather than dictate funding decisions. That shift matters if we are serious about tackling the most pressing societal challenges.

#InterdisciplinaryResearch #ResearchFunding #PeerReview #KnowledgeExchange #ResearchAssessment
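For readers unfamiliar with the Rao-Stirling index mentioned above: it scores a proposal's interdisciplinarity as the sum, over pairs of disciplines, of p_i · p_j · d_ij, where p_i is the share of the proposal's references in discipline i and d_ij is a pairwise distance between disciplines. A minimal sketch, with made-up proportions and distances purely for illustration:

```python
def rao_stirling(proportions, distance):
    """Rao-Stirling diversity: sum over distinct discipline pairs of p_i * p_j * d_ij."""
    n = len(proportions)
    return sum(
        proportions[i] * proportions[j] * distance[i][j]
        for i in range(n)
        for j in range(n)
        if i != j
    )

# Hypothetical example: a proposal citing three disciplines.
p = [0.5, 0.3, 0.2]            # reference shares per discipline
d = [[0.0, 0.8, 0.9],          # pairwise "cognitive distance" between disciplines
     [0.8, 0.0, 0.4],
     [0.9, 0.4, 0.0]]
print(rao_stirling(p, d))      # higher value = more diverse, more distant mix
```

The sketch makes the paper's point concrete: the index only sees the mix of cited fields, not whether the collaboration integrates concepts or methods in any deep way.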
Cross-Disciplinary Program Review
Summary
A cross-disciplinary program review is a process that examines academic or research programs spanning multiple fields to ensure they meet quality standards and drive innovation. These reviews highlight how integrating knowledge and methods from different disciplines can address complex challenges that single-subject approaches might overlook.
- Adopt qualitative criteria: Shift away from rigid scoring systems by encouraging open discussions about integration, feasibility, and broader impacts when evaluating cross-disciplinary work.
- Value diverse contributions: Recognize teamwork, leadership, and the translation of research into policy or practice, not just traditional metrics like publications or individual grants.
- Support data integration: Improve user experience and standardize data formats to help researchers easily navigate and combine resources from different disciplines.
University Research Strategy Playbook #6/11: Incentives That Matter

Breaking silos and aligning infrastructure to strategy only go so far if people are still rewarded for yesterday’s behaviours. Building an outcome-oriented institution means rewriting the incentive playbook.

Most universities still prize a narrow set of outputs: individual grants (albeit now with different flavours), first-author papers, citation counts. These made sense when prestige and discipline-based excellence were the primary goals. But in an outcome-oriented university, they generate unintended consequences, including: competition over collaboration; short-termism and project chasing; invisible work (e.g. team leadership, translation, community engagement, platform building); professional-services burnout (as research managers are measured on throughput). The result? Fragmented effort, lost opportunities, and perverse incentives running counter to strategy.

A modern research track must link individual success to collective outcomes. Six design moves shift the dial:

- Portfolio-Based Performance: Evaluate contributions across research, collaboration, impact, and leadership. A paper still counts, but so does driving a flagship program, mentoring cross-disciplinary teams, or delivering policy change with government partners.
- Shared Success Pools: Allocate a portion of overhead and incentive funding to teams, not individuals. When programs achieve agreed-upon outcomes (e.g. regulatory adoption, prototype deployment, a new industry standard), every contributor, academic, technical, and professional, shares the reward.
- Parallel Recognition Tracks: Create parallel, equally valued pathways: scholarly excellence (breakthrough discovery), program leadership (coordinating large, outcome-linked efforts), translation & engagement (commercial, policy, community impact), and platform stewardship (running critical shared infrastructure). Promotion criteria flex to each track; no one path is “second class.”
- Transparent Workload & Credit Systems: Time spent on partnership building, interdisciplinary teaching, or data stewardship is explicitly counted, not hidden in “service.” Credit for multi-author outputs is shared so collaboration is rewarded, not penalised.
- Professional Services Incentives: Research development, finance, and contracts teams earn bonuses or advancement through speed to contract, partner satisfaction, or program impact, aligning their goals with academic colleagues rather than operating in parallel silos.
- Reward Forward Execution and Learning: Recognise and reward individuals for making progress against forward-looking plans that contribute to the university’s strategic intent, even when results fall short. This shifts culture away from backwards-facing accounting and toward strategic experimentation: taking risks, trying new approaches, learning fast, and improving. In a complex world, universities need more of this, not less.

Up next: Post 7, Building External Trust.
Tearing down (disciplinary) walls through #literature #reviews (aka #review #research)?!

Many agree that #cross- and #interdisciplinary literature reviews hold great potential to advance #knowledge. In this newsletter guest article, Richard L. Gruner and Roberto Minunno discuss the #WHY and #HOW 💡 🧠 🧱

The insights build on their article entitled "Theorizing across Boundaries: How to Conduct a ‘Breakout’ Literature Review", published in the International Journal of Management Reviews (link: https://lnkd.in/efmwZu48).

Abstract: "Best practice advice for literature reviews abounds, yet little advice is available for how to infuse a literature review with #theory-generative #insights that break out of #knowledge #silos. To address this issue, we provide guidance on reviewing a range of literature for theory-generative insights through a #process of #knowledge #transfers from a source domain onto a target domain. To do so, mainly building on work concerned with #analogical #reasoning, we put forward a ‘breakout’ review model, which consists of three iterative stages. While we illustrate the process model in a supply chain management context, we aim to assist any organizational scholar interested in exploring cross-disciplinary literature for new ways of thinking."

Richard is "Associate Professor at The University of Western Australia, and Adjunct Associate Professor at Curtin University. Richard's research has been featured in leading international journals, including Journal of the Academy of Marketing Science, European Journal of Information Systems, International Journal of Operations & Production Management, Journal of Product Innovation Management, Renewable and Sustainable Energy Reviews, International Journal of Management Reviews, and Journal of Supply Chain Management" (source: https://lnkd.in/eu9VacTU).

Roberto is a senior lecturer and "a subject matter expert in #circular #economy and sustainable, modular building design, and life cycle assessment. With a background in civil engineering, Roberto is driven to create solutions to #sustainability and circular economy challenges. [...] Roberto’s process expertise includes conducting Systematic Literature Reviews (SLRs) – providing efficient and complete analysis of relevant academic literature and synthesising key findings." (source: https://lnkd.in/ehPspFhB).

Thanks so much for this insightful contribution 🙏 🙏 🙏 With references to works by Hari Bapuji, Dermot Breslin, Joep Cornelissen, David Denyer, Caroline Gatrell, and Andreas Wieland, among others. Annual Reviews The Campbell Foundation Cochrane

#PhDs #research #rigor #knowledge #impact #SLR #reviewresearch #researchsynthesis #systematicreviews
Chemistry meets biology: cross-disciplinary evaluation of drug discovery databases

Identifying data that seamlessly integrate chemical and biological information is essential yet notoriously difficult. Kyrsanov et al. present a crowd-reviewed assessment of major biochemical databases, from open-access platforms like ChEMBL to commercial resources like Reaxys and vendor-run sites, through the lens of both chemistry and biology. Their structured survey approach has student and professional users searching for small molecules (including late-stage leads and withdrawn drugs) and therapeutic targets (enzymes, GPCRs, etc.) in 13 databases, then rating usability and annotation quality.

The findings highlight strong differences in coverage, search capabilities, and data organization. While academic databases often offer breadth, they can lack consistent curation or clarity. Commercial databases are typically more comprehensive but require subscriptions and can be cumbersome to search. Supplier portals do well with certain queries but often provide only minimal or text-only biological information.

The authors emphasize that improving UI/UX design, standardizing data formats, and better integrating chemical and biological insights are critical next steps, particularly given modern machine learning workflows, which are highly sensitive to data fidelity and accessibility. Their results and open-source methodology serve as a blueprint for future database refinement and deeper cross-disciplinary data integration in drug discovery.

Paper: https://lnkd.in/dpfqaznC

#DrugDiscovery #DataIntegration #BiochemicalDatabases #Chemistry #Biology #MachineLearning #UIUX #Research #DataScience #Bioinformatics #MedicinalChemistry #Pharma #Science #Innovation #AIforScience
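The aggregation step behind such a crowd-reviewed survey, collecting per-database ratings from many users and comparing mean scores, can be sketched in a few lines. The database names and scores below are hypothetical, not figures from the paper:

```python
from collections import defaultdict

def mean_ratings(responses):
    """Average rating per database from (database, rating) survey rows."""
    by_db = defaultdict(list)
    for db, rating in responses:
        by_db[db].append(rating)
    return {db: sum(r) / len(r) for db, r in by_db.items()}

# Hypothetical survey rows: (database name, 1-5 usability score).
rows = [("ChEMBL", 4), ("ChEMBL", 5), ("Reaxys", 3), ("Reaxys", 4)]
print(mean_ratings(rows))  # → {'ChEMBL': 4.5, 'Reaxys': 3.5}
```

In practice a study like this would also stratify by respondent type (student vs. professional) and by query category, but the core comparison is this kind of per-database averaging.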