3 Workflows I've Automated for In-House Teams
① Ask Legal ② Procurement ③ Contract Review (not just the review!)

1. Ask Legal [or any department, for that matter 🤷🏼‍♀️]

You've heard me talk about legal teams and knowledge management. Long story short, your legal team is answering the same 20 questions over and over 😵‍💫

A simple way to save a CHUNK of time answering questions from the business (enabling them to go faster), ALL while keeping complete control & a human in the loop?

↪️ Set up an 'Ask Legal' bot in your comms platform.
↪️ Sync it with your knowledge base (e.g. GDrive/Notion/SharePoint).
↪️ Set up your custom instructions (want it to tag Bob on privacy questions only, specifically on a Tuesday? No problem).
↪️ Don't want the answer to go straight out to the business without reviewing it first? Cool, turn on co-pilot mode.

The result? 60-80% fewer repetitive queries. Your team focuses on the high-value work that needs a human lawyer.

2. Procurement

Businesses have hundreds of tools, but when departments don't speak to each other you end up with duplicate tools & subscriptions 😭 💵 🚽

What if the business could find out in <1 minute whether there was already a tool that covered their needs, before spending hard-won department budget? Moreover, what if I told you they could kick off the internal procurement process from the comfort of your comms platform?

Team member in Slack/Teams: “Do we already have a tool for X?”
✅ Bot checks the knowledge base (policies, procurement tool).
✅ If a match is found, it shares the approved tool & the owner to contact.
✅ If not, the bot asks the user for more info and directs them through the next steps to kick off the procurement process from inside Slack/Teams.

Ensuring your users ACTUALLY follow the process, without adding friction. Did I just see your CFO cry tears of joy?

3. Third-Party Vendor Contract Review & Project Management

Getting AI to redline a contract (as a first pass) is a huge win, but there are still other pieces of the process missing, like:

🤷🏼‍♀️ The business figuring out IF legal review is even needed (according to company policy).
📨 The business actually submitting the contract to legal.
😩 Managing review capacity within the legal team.
🖥️ Getting the legal team to log & update the PM tool.

The list never ends. Automate those pieces too, and legal reviews only what actually needs their eyes, turnaround times improve, and the business stops pinging the team for “update pls?” in Slack : )

TL;DR: Most legal teams are drowning in admin work that could be automated. I've built all of these using simple processes and tools (that I've found most businesses have).

You also know I love a good Figma flow. So I’ve built them for all three of the above (see a sneak peek below). Want the entire thing? Comment "FLOWS" and I'll send them over.

Also, tell me what you want to see - more of the above or step-by-step how-to build videos?
Leveraging Legal Technology
-
Dear Richard,

In your recent Times article, you wrote: “If the leaders of most AI companies are right about the speed of technical advance, there will be little work left for traditional lawyers by 2035.”

I think sweeping predictions like this are unhelpful. They risk causing unnecessary anxiety for lawyers & law students. I feel like we have been here before. 17 years ago, in your book “The End of Lawyers?”, you forecast the same thing - that robots would take all the lawyer jobs. Yet there are far more lawyers now.

I disagree with your view & set out my reasons in a recent article for the Gazette of the Law Society of England & Wales (link in comments). In summary, why lawyers will thrive alongside AI:

1. AI makes the world more complex & lawyers thrive on complexity: Every tech advance brings new legal issues. Email didn’t reduce legal work - it created more (especially in discovery). People are now recording meetings with AI - those transcripts will be litigation gold. AI will also spawn entirely new practice areas, just as privacy & cybersecurity law emerged from digital transformation.

2. It’s tech, not magic: Capability doesn’t guarantee adoption. Tech uptake is often slow due to cost, complexity & resistance.

3. Massive unmet demand: Traditional legal services are out of reach for most people & small businesses. AI will lower costs & unlock new markets. At Lawpath, the AI-enabled legaltech business I co-founded 12 years ago, we’ve served 500,000+ clients - proving there’s a vast unmet need beyond the reach of conventional legal models.

4. People level up, they don’t give up: Legendary AI researcher Geoffrey Hinton forecast the AI-induced end of radiologists by 2021. Instead, radiology roles grew. Doctors adapted & now use AI to deliver better care.

5. Law is human: Logic alone doesn’t resolve disputes or get documents agreed. If parties can’t agree, it’s lawyers, not machines, who help find resolution.

6. Humans still matter: In a world of vast (& confusing) information, people trust human experts. We’ve been able to book travel online for decades, yet travel agents still thrive. Lawpath’s success came not from a pure AI solution but from blending tech with human lawyers.

7. AI generates disputes: Companies are reporting an increase in the number & sophistication of complaints since GenAI. Harvard Biz Review ranked “making a complaint” #23 in GenAI use cases. More disputes = more work for lawyers.

I think the future isn’t human vs machine. It’s human with machine. Lawyers who embrace AI as a tool to deliver better, faster, more affordable service will be in great demand.

I’d welcome the opportunity for a good-natured debate on this topic at any time.

Kind regards,
Nick

PS: For any lawyers looking to upskill in AI, in addition to my day at NRF, I’m an adjunct prof at Bond University where I research & teach AI for lawyers. Details of the GenAI for Lawyers online short course are below.

Richard Susskind
-
NEWS 21/10/25: Department of Homeland Security obtains first-known warrant targeting OpenAI for user prompts in ChatGPT

According to a recent article by Forbes, the U.S. Department of Homeland Security (DHS) has secured a federal search warrant ordering OpenAI to identify a user of ChatGPT and to produce the user’s prompts, as part of a child-exploitation investigation. https://lnkd.in/eatmK3zv

Key details:
- The warrant was filed by child-exploitation investigators within DHS.
- It specifically targets “two prompts” submitted to ChatGPT by an anonymous user. The warrant asks OpenAI for the user’s identifying information and associated prompt history.
- This is described as the first known federal search warrant compelling ChatGPT prompt-level data from OpenAI.

What this means for privacy:
- Prompts are treated as evidence. What users have assumed to be ephemeral or private entries in a chat session with an AI service may now be subject to law-enforcement production.
- Scope of data retention and access must be reconsidered. If prompt history can be identified and requested, both users and providers should evaluate how long prompts are stored, under what identifiers, and how anonymised they truly are.
- Implications for user trust and provider responsibility. AI companies may face growing legal obligations to disclose user-generated content and metadata, which may affect how the services present themselves (privacy guarantees, terms of service) and how users engage with them.
- International context and legal cross-overs. For users in jurisdictions with strong data-protection regimes (for example, the General Data Protection Regulation in the UK/EU), the fact that prompt data can be subject to a U.S. warrant may raise questions about extraterritorial access and data-flow compliance.

In short: this isn’t just another law-enforcement request. It marks the first time a generative-AI provider has been legally compelled to unmask a user and disclose their prompt history.
↳ I track how stories like this shape the ethics and governance of AI. You can find deeper analysis at discarded.ai.

#AISafety #AIRegulation #Privacy #Governance #Ethics
-
A few days ago, a law firm partner told me something that perfectly captures what's broken in legal tech.

Her firm recently rolled out a new AI contract review system. Big investment, months of training, the works. The promise? Associates would breeze through due diligence, catching key issues in minutes instead of hours.

Three months later, she's noticing something weird.

"The associates are using the AI," she said. "They generate the summary, highlight the key terms, get all the data points. Then they sit there and read the entire contract word-by-word anyway."

"Why?"

"Because they don't trust it. And honestly? Neither do I. So now instead of spending two hours reviewing a contract, they spend two hours reviewing a contract 𝘱𝘭𝘶𝘴 thirty minutes playing with AI tools that didn't actually save them any time."

Here's the thing nobody wants to admit: legal tech companies are creating technology that increases workloads while claiming to reduce them.

The AI produces a beautiful summary. The associate still reads everything because their name goes on the work. The partner still reviews everything because their license (and livelihood) is on the line. And in the end, the client still gets billed for all of it.

Associates don't want to review contracts faster - they want to review them with confidence that they didn't miss anything catastrophic. But apparently "confidence" doesn't demo well in sales presentations.

What's one tool your team uses that officially saves time but secretly creates more work?

#LegalTech #Leadership
-
Most legal teams do not struggle with AI because the models fall short. They struggle because they introduce AI without rethinking how legal work actually happens.

Generative AI can draft, summarize, classify, and extract at speed. It can surface patterns across contracts, triage intake, and reduce low-value effort. What it cannot do is set priorities, weigh risk, or own outcomes. That still belongs to people.

The real shift is not layering AI onto existing workflows. It is redesigning them. Decide which decisions stay human. Define where automation fits. Build verification into daily practice.

Too many teams start with tools. They should start with process. Where does judgment enter? Who reviews AI output? What standards define “good enough”? How do you measure impact beyond time saved?

AI literacy in legal is not about prompt tricks or feature chasing. It is about understanding how probabilistic systems behave, where they fail, and how to design guardrails that protect quality, confidentiality, and trust.

That is where value gets created. And where implementations either succeed or quietly fade.

I’m Colin, General Counsel at Malbek, and author of The Legal Tech Ecosystem.

#legaltech #innovation #law #business #learning
-
Don’t overcomplicate AI for your legal team. Here are 12 initiatives to get started (based on conversations about AI with over 300 in-house lawyers):

PEOPLE
1. Organise CPD sessions on key legal-specific topics. Examples: 'Gen AI for Legal Practice', 'Under the Hood of an LLM' and 'Prompt Engineering 101'.
2. Create dedicated AI experimentation time each month. Let your team know it's okay to experiment (safely). Set up guardrails and opportunities to share knowledge.
3. Identify Innovation & Technology champions. Peer-to-peer sharing is key. Your champions will drive digital literacy and engagement.

GOVERNANCE*
1. Understand privacy and confidentiality requirements across different legal workstreams. Consider segmenting by data type (e.g., client, company sensitive, company non-sensitive).
2. Consider the privacy and confidentiality implications of different AI approaches. For example, state-of-the-art proprietary services vs. smaller, hosted models.
3. Set up a set of rules for using AI that aligns with privacy and confidentiality requirements.

TECHNOLOGY
1. Identify 3 legal workstreams that present high potential for automation.
2. Assess the benefits and risks associated with each.
3. Survey the market for legal technology solutions that align with identified opportunities. Consider collaborations with law firms and industry experts to build customised solutions.

OPERATIONS
1. Review legal team processes and identify 3 priority areas for optimisation and automation. These might include team meetings, client management, knowledge management, etc.
2. Develop an AI knowledge hub for the legal team. Include a prompt library, use cases, user guides, and lessons learned.
3. Collaborate with other areas of the business. Ensure the legal team is part of organisation-wide AI projects - from both a risk and legal ops perspective.

*This assumes a foundational layer of governance and risk management, e.g. AI Guiding Principles, Risk Management Frameworks, etc.
Here’s the thing: legal teams won't be first up for new AI initiatives. They could be left behind or lost in the shuffle.

That's a real shame - because the opportunity for AI in law is huge. AI will help in-house lawyers move up the value chain. Do less boring work. Do more stuff that matters.

I really want to see that happen. And these initiatives can help your team get there.

Let me know your thoughts below - is your team exploring any of these initiatives? What do you think of this approach?

#lawyers #ai #inhousecounsel
-
Anthropic has released Claude for Word in beta, and the first listed example use case on its dedicated page is legal contract review.

The featured screenshots show an NDA review. The suggested prompts include flagging provisions that deviate from standard market position, ranked by severity; making indemnification mutual; and working through tracked changes from a counterparty. This is not accidental product positioning, but a deliberate move into legal.

Claude for Word sits inside Word, reads multi-section documents, works through comment threads, edits clauses while preserving formatting and numbering, and surfaces every edit as a tracked change for human review before acceptance. It is currently available on Team and Enterprise plans.

The governance caveats are there, and they are the right ones: always verify that outputs match your firm’s standard positions, and follow your organisation’s data handling policies for sensitive or regulated data.

I’m thinking about what this means for the legal tech ecosystem. For companies whose core offering is document review and drafting in Word, this is direct competitive pressure from the same model provider powering many of their products. An arms race with your own supplier is an uncomfortable place to be. The response has to be adding value a general-purpose Word plugin cannot: deeper integration, maintained playbooks, workflow orchestration, audit trails, and the professional accountability layer clients actually need.

For in-house legal teams, what intrigues me more is the governance dimension rather than the capability question. Claude for Word can do a lot of what specialist legal AI tools do, at lower cost, inside existing workflows. But at the same time, I’m also wondering how many teams will deploy it without first working through the questions that genuinely matter. Who owns the quality of the outputs? How is sensitive commercial information handled? What review standard applies before a tracked change is accepted?
These are not reasons to avoid the tool. They are the questions that need answers before it touches anything that matters. Are you thinking of giving this a try?
-
A Manhattan federal judge has delivered a really important ruling on artificial intelligence and legal practice - can you claim legal privilege over AI-generated documents? It’s a potentially major blind spot for organisations - and a huge responsibility for in-house lawyers to explain the issue to their non-legal colleagues.

On 10 February, U.S. District Judge Jed Rakoff ruled in USA v. Heppner that a criminal defendant could not claim attorney-client privilege over documents he had himself prepared using an AI service and then subsequently sent to his lawyers. Bradley Heppner, former chairman of GWG Holdings, faces fraud charges over an alleged $150 million scheme, with trial set for April. But the privilege ruling carries significance beyond any single case.

The reasoning rests on a principle that long predates artificial intelligence. Privilege protects confidential communications between lawyer and client made for the purpose of legal advice. It does not automatically attach to materials a client creates independently simply because those materials are later forwarded to counsel. What matters is how the document came to exist, not its destination.

What AI changes is the scale of the problem. Generative AI tools now allow any executive to produce polished case narratives, issue summaries, and chronologies that resemble legal work product, all without a lawyer’s involvement. The natural instinct is to assume that once these materials are emailed to counsel, they enter the protected sphere. Judge Rakoff’s ruling suggests otherwise - the court’s focus is on what the document is and how it came to exist, not on the fact that it was subsequently routed to a lawyer.

This matters because AI is rapidly becoming the default tool through which businesspeople process complex situations.
An executive facing a regulatory investigation who uses a chatbot to organise the facts and draft a summary for their lawyer may be creating discoverable material that sits entirely outside the privileged relationship.

Judge Rakoff also noted that the AI-generated materials could prove “problematic” if used at trial. Even where privilege is not the issue, AI-authored documents create genuine evidential difficulties - questions of authorship, accuracy, hearsay characterisation, and the optics of presenting AI-mediated narratives as though they were direct recollection.

If you want AI-assisted materials to have any chance of privilege protection, “client-produced and then forwarded to counsel” is the weak fact pattern - and after this ruling, in the US at least, it may be no fact pattern at all.
-
Stop overcomplicating Legal Operations.

Had a conversation yesterday with a Head of Legal at a 200-person company. She was convinced she needed enterprise-grade contract management software, AI-powered analytics, and a dedicated Legal Ops hire.

Her annual legal spend? £150k. Her team? Two lawyers and a paralegal.

This is what I call the sophistication fallacy. We've been sold this myth that effective Legal Operations requires complex technology and dedicated specialists. Nonsense.

The most impactful Legal Ops transformations I've seen in smaller teams started with a notepad and some brutal honesty. One sole counsel increased her strategic impact by simply mapping where her time actually went. Turned out 25% was spent on work that didn't require her to be involved. Another small team revolutionised their stakeholder relationships with a one-page guide explaining when to involve legal and when not to.

No software. No consultants. Just clear thinking and the courage to say no to low-value work.

Legal Operations isn't about having the fanciest tools. It's about having the clearest priorities. Save the enterprise solutions for when you've mastered the fundamentals.

What's one simple change your legal team could make tomorrow that would free up capacity for strategic work?

#legaloperations #inhouselegal #legalleadership #generalcounsel #smallteams
-
A Recovering Lawyer's Guide to LegalTech

As April arrives, my inbox fills with messages from attorneys exploring career pivots. "How do I break into LegalTech?" "Do I need coding skills?" These questions echo my own journey from practicing at one of India's largest firms to now leading a Digital & Innovation team and building the Indian LegalTech Network (ILTN).

Here are 5 steps to successfully navigate your transition:

1. Build your LegalTech network
Attend LegalGeek, ILTA events, or local Legal Hackers chapters. While running The Blue Pencil in law school, I discovered the LegalTech community is refreshingly approachable - people genuinely enjoy what they do, making connections more authentic than traditional legal networking.

2. Find technology opportunities in your current role
Don't wait for a formal transition. Speak to your IT or innovation teams about joining projects. Volunteer for internal committees focused on process improvement. These experiences develop relevant skills while testing your interest without commitment.

3. Develop adjacent skills beyond legal knowledge
Abandon self-limiting beliefs like "I cannot do tech." Master advanced features in Microsoft Word, Excel, or Google Workspace. Learn design thinking, process mapping, and product management fundamentals - far more valuable in most LegalTech roles than coding.

4. Build something concrete
Today's no-code tools enable anyone to create functional applications. Identify a problem in your practice, map the process, and build a prototype using Bubble, Bryter, or Microsoft Power Automate. Demonstrating this initiative speaks volumes to potential employers.

5. Choose hands-on experience over theoretical training
While LegalTech programs proliferate, practical experience typically provides better value. If pursuing further education, prioritize programs offering real-world projects over purely academic approaches.

Where Legal Expertise Creates Value
Most LegalTech roles don't require coding - they need people who identify the right problems and bring together solutions. Key positions include:
- Legal Solutions Architect
- Legal Project Manager
- Practice Development
- Legal Operations Manager

Resources That Made the Difference
1. Richard Susskind's "Tomorrow's Lawyers"
2. Communities like Legal Hackers, the International Legal Technology Association (ILTA), and the Indian LegalTech Network (ILTN)
3. Practical skills in design thinking and process mapping

Start Today - Start Where You Are
Become your team's tech power user. Volunteer with LegalTech startups. Approach this transition with genuine curiosity rather than career desperation - successful legal innovators see problems as opportunities, not obstacles.

(And as always: projects/solutions you built >> certificate courses.)

The pictures are from the amazing International Legal Technology Association (ILTA)'s ILTACON 2024!