New research prototype: the Personal Health Agent (PHA), a comprehensive research framework for delivering personalized, evidence-based health and wellness guidance. The system is built on a multi-agent framework modeled after a human expert team, with each role handled by a specialized LLM sub-agent:
▶️ Data Science Agent: analyzes multi-modal data from wearables and health records, such as blood biomarkers, to provide contextualized numerical insights.
▶️ Domain Expert Agent: acts as a reliable source of grounded health knowledge, tailoring information to the user's specific health profile.
▶️ Health Coach Agent: supports users in goal-setting and behavior change through multi-turn, psychologically inspired conversations.
An Orchestrator dynamically coordinates these specialists to synthesize a single, coherent response to complex queries. Evaluations by human experts and end-users found that this collaborative multi-agent approach significantly outperformed single-agent baselines in overall response quality, clinical significance, effectiveness, and usefulness. This work, including extensive evaluation of all agentic components on data from the Wearables for Metabolic Health (WEAR-ME) study, establishes a validated blueprint for the next generation of trustworthy and coherent personal health AI.
Read more about this research and the multi-agent framework: https://goo.gle/42kzjvZ
Preprint: https://lnkd.in/dfZ96X5c
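To make the orchestration pattern concrete, here is a minimal sketch of how an orchestrator might route a query to the three sub-agents and merge their answers. This is not the paper's implementation; the function names and stubbed responses are illustrative placeholders for real LLM calls grounded in wearable data, medical knowledge, and coaching strategy.

```python
# Minimal sketch of a PHA-style orchestrator. All names are illustrative,
# not taken from the paper; each stub stands in for a specialized LLM call.

def data_science_agent(query: str, wearable_data: dict) -> str:
    # Would analyze wearable and biomarker data; stubbed for illustration.
    return f"Resting heart rate averaged {wearable_data['resting_hr']} bpm last week."

def domain_expert_agent(query: str, health_profile: dict) -> str:
    # Would retrieve grounded medical knowledge tailored to the user.
    return "An elevated resting heart rate can reflect poor sleep or stress."

def health_coach_agent(query: str) -> str:
    # Would run a multi-turn, goal-setting coaching dialogue.
    return "Try a consistent wind-down routine this week and re-check."

def orchestrator(query: str, wearable_data: dict, health_profile: dict) -> str:
    # Route the query to the relevant specialists, then synthesize one answer.
    insights = [
        data_science_agent(query, wearable_data),
        domain_expert_agent(query, health_profile),
        health_coach_agent(query),
    ]
    return " ".join(insights)

if __name__ == "__main__":
    print(orchestrator(
        "Why is my heart rate up, and what should I do?",
        wearable_data={"resting_hr": 68},
        health_profile={"age": 42},
    ))
```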
Innovative Wearable Technology
Explore top LinkedIn content from expert professionals.
-
A wearable robotic limb system! 🦿 Scientists at the University of Tokyo have created JIZAI ARMS, a robotic limb system that lets people wear and swap extra arms. The system includes a wearable base unit with six attachment points for robot arms that can be removed and shared with others. It was designed to explore how people might interact in a world where humans and machines work closely together. Developed by a team of human augmentation experts, designers, and engineers, the system considers both function and style to suit a "digital cyborg" future. Beyond social experiments, these extra arms could be useful for assisting people with disabilities, for artistic performances, or for new ways of working with machines. Could this be the first step toward a future where people and robots become one?
-
7 wearable and sensor innovations pushing health beyond "wellness" tracking this month:
🔘 Sibel Health is developing an AI-enabled wearable that tracks scratching behaviour in people with atopic dermatitis, turning something usually seen as a subjective symptom into a measurable clinical signal that could also support drug development.
🔘 CranioSense is working on a non-invasive approach to measuring intracranial pressure, which today often requires invasive procedures; if validated, it could make brain pressure monitoring safer and more continuous in routine clinical care.
🔘 University of Technology Sydney researchers are developing AI-powered sweat sensors that can decode body chemistry in real time, tracking hormones, medication levels and potential early warning signs of disease, potentially offering a non-invasive alternative to some forms of blood testing.
🔘 ŌURA rings are being used within Medicare Advantage Plans, with around one-third of eligible members opting in and sharing biometric data; this is already leading to improvements in sleep and light activity and is paving the way for deeper clinical use cases such as hypertension monitoring.
🔘 Samsung Electronics is preparing to launch an AI Brain Health tool that uses data from smartphones and wearables, including speech, movement and sleep behaviour, to help detect early signs of dementia while aiming to keep the experience privacy-aware and clinically relevant.
🔘 Researchers at the University of Arizona have created a wearable mesh sleeve that monitors gait and subtle movement patterns to identify early signs of frailty in older adults, with the goal of shifting care from reacting after a fall to proactively supporting prevention through continuous remote monitoring.
🔘 And China is testing "smart urinals" that analyse urine in real time for markers like glucose and protein, which opens up interesting conversations about passive health screening, consent, and how health data might be gathered in everyday environments.
💬 We are steadily moving from episodic health snapshots to passive, continuous and contextual signals across movement, sleep, behaviour and even body chemistry. The technology is getting closer. Now the real work is around validation, governance, reimbursement and making sure the data actually makes a difference in people's lives 👇 Links to articles in comments
#DigitalHealth #Wearables #AI
-
A 65-year-old just became the first person to control an iPad using brain signals alone. Mark Jackson was diagnosed with ALS (amyotrophic lateral sclerosis) in 2021. Over time, he developed complete paralysis in both arms and weakness in his neck. No way to swipe a phone. No way to send a text. No way to do things for himself without asking someone else. Until a brain-computer interface by Synchron changed that. Here's how it works:
▶ 1. The device sits inside a brain vein
↳ A small sensor is implanted into one of the veins within Mark's brain through a minimally invasive procedure, not open brain surgery.
↳ It reads brain signals from the motor cortex and translates them into digital actions on screen.
↳ Mark now watches Netflix, listens to audiobooks, browses Instagram and Facebook, and texts his kids, all by thinking about the action he wants to take.
▶ 2. Two-way communication creates real-time feedback
↳ Synchron just launched a new version using a BCI HID profile (Human Interface Device).
↳ The computer detects the strength and fidelity of Mark's brain signal in real time and presents feedback about where he's looking, what he's thinking about clicking, and where he wants to move.
For someone who can't move their arms, losing the ability to do things independently is one of the hardest parts of the disease. This technology gives that back. However, the tech is still early. Synchron has completed early feasibility trials and is preparing for pivotal trials before seeking FDA approval, a process that will take several years. But would you trust a brain implant if it gave you back your independence?
#entrepreneurship #healthtech #innovation
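As a rough mental model of the feedback loop described above, the sketch below maps a decoded motor-intent confidence to an on-screen "click" the way a BCI exposed as a standard input device might. The threshold and the decode step are hypothetical placeholders, not Synchron's actual signal pipeline.

```python
# Toy sketch: decoded intent confidence -> on-screen action, with real-time
# feedback on signal strength. Values and thresholds are illustrative only.
import random

CLICK_THRESHOLD = 0.8  # assumed confidence needed to register a selection

def decode_intent() -> float:
    # Placeholder for the implant's motor-cortex decoder; returns a
    # confidence in [0, 1] that the user intends to "click".
    return random.random()

def run_step() -> str:
    confidence = decode_intent()
    feedback = f"signal strength: {confidence:.2f}"  # shown back to the user
    if confidence >= CLICK_THRESHOLD:
        return f"{feedback} -> CLICK"
    return f"{feedback} -> waiting"

if __name__ == "__main__":
    for _ in range(5):
        print(run_step())
```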
-
I saw something that made me do a double take. A company called friend just launched an AI companion you wear around your neck. $129. Always listening. Perfect memory of everything you say. The founder, Avi Schiffman, calls it a new kind of companion. Not a replacement for real relationships.
🔽 Here's how it started. May 2023. Schiffman was building TAB, a language model designed to understand your day without needing constant explanation. Then he took a work trip to Tokyo. Alone. Surrounded by skyscrapers. And he felt lonely. Inspiration struck from Tamagotchis he had seen in toy museums. The pivot happened fast. Hardware became friendship. The device was launched on World Friendship Day.
So how does it work? Friend listens 24/7. Remembers everything. Runs on Google's infrastructure. No settings. No buttons. No customisation. You can't turn it off. The founder thinks that would be weird. You can't view its memories. That would be invasive to the entity. Early data shows retention comparable to the Apple Watch. Average users send over 200 messages a day.
The privacy pitch is bold. Each device has a private key encrypting all data. If you lose the device? The data is gone forever. No human can access it. Not even you. Need privacy? Swipe away the app. It stops listening until you reconnect.
Schiffman believes this is the most positive use case of technology for the next decade. He envisions a future where one of your five closest friends could be an AI. He has spent nearly $1 million on a New York City subway campaign. Over 10,000 white posters plastered across stations.
However, the backlash hit fast. Posters got ripped down. Defaced. Scrawled with warnings. "We don't have to accept this future." "AI is not your friend." When perfection becomes the selling point, humanity becomes the flaw. And we're being asked to accept that trade without question. The resistance isn't about hating technology. It's about refusing to let loneliness be monetised. Refusing to accept that our inability to deal with disappointment should be solved by eliminating the possibility altogether.
But here's what I keep thinking about:
- A companion that never forgets. Never judges. Never demands reciprocity.
- It sounds additive (the founder insists it's additive). But what happens when the easy relationship crowds out the hard ones? When we choose perfect memory over messy humanity?
The company believes digital relationships will define 2025 to 2035. Maybe they're right. But I can't shake these questions:
- Are we adding a friend, or are we practicing for a world where we prefer AI to people?
- Would you wear an AI companion that remembers everything you say?
- Do you need more friends?
If, a year from today, I tell you, 'I just bought a friend,' I expect you to react normally, as if this is a common thing to do.
A defaced advertisement for AI Friend on New York's subway. Photograph: Friend.com
-
SMART STICKER READS REAL EMOTIONS BENEATH THE SURFACE
A new stretchable, rechargeable sticker developed by researchers can detect authentic emotional states by measuring physiological signals like heart rate, skin temperature, and humidity, even when facial expressions are misleading. The wearable patch transmits real-time data to mobile devices, helping health providers assess mental health remotely. Unlike traditional emotion recognition systems, this device integrates multiple sensors and facial analysis while preserving user privacy. With AI-powered accuracy and wireless functionality, it offers promise for applications in telehealth, early intervention, and monitoring emotional well-being.
3 Key Facts:
1. Multi-Signal Detection: Measures skin temperature, humidity, heart rate, and blood oxygen independently, without interference between the sensors.
2. AI Emotion Recognition: Achieved 96.28% accuracy for acted emotions and 88.83% for real ones.
3. Remote Monitoring: Wirelessly transmits data for use in telemedicine and early mental health intervention.
Source: https://lnkd.in/gdn6jFeF
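The multi-signal idea can be sketched generically: combine the four physiological readings into one feature vector and classify the emotional state. This is not the researchers' actual model; the synthetic data, labels, and choice of a random-forest classifier below are purely illustrative.

```python
# Illustrative multi-signal emotion classification sketch (not the paper's model).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Feature columns: skin_temp_C, humidity_pct, heart_rate_bpm, spo2_pct
X_train = rng.normal([33.0, 45.0, 75.0, 97.0], [0.8, 8.0, 12.0, 1.0], size=(200, 4))
y_train = rng.choice(["calm", "stressed", "excited"], size=200)  # toy labels

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

new_reading = [[34.1, 60.0, 96.0, 96.5]]  # one fresh sensor sample
print(model.predict(new_reading)[0])
```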
-
Flexible batteries highlight how technological progress can serve human well-being, since their adaptability opens new paths for implants and wearable devices that blend naturally with the body while supporting longer, safer, and more connected healthcare. This perspective becomes tangible when we look at how these energy systems can reduce bulk, conform to tissue, and follow the body’s natural motion without disrupting sensitive medical sensors. Engineers gain more freedom to design discreet solutions, and patients benefit from devices that extend operating life while minimizing the need for interventions. Their biocompatible structure lowers the risk of rejection in long-term implant scenarios, while their lightness and efficiency make them suitable for wearable technologies that monitor health in real time. The integration with AI-driven platforms adds another layer of value, enabling continuous tracking and smarter care pathways. I see this evolution as a step that reflects the convergence of advanced materials science and human-centric medical innovation. The question that remains open is how quickly healthcare systems and regulators will embrace these possibilities for the benefit of patients worldwide. #MedicalTechnology #DigitalHealth #Innovation #Healthcare
-
Last week, we explored how robots might move, feel, and understand like humans. Now, we flip the lens and tap into one of the most exciting frontiers in human augmentation: Brain-Computer Interfaces (BCIs). BCIs connect the brain directly to machines, translating neural activity into signals that control computers, devices, or even AI agents. With the rise of Agentic AI, a new possibility is emerging: what if your intentions could become instructions, from brainwaves to prompts, directing AI with intent alone? The most intuitive interface isn't voice; it's thought. A Thought-to-Agent Interface (T2A) links your brain activity to an AI Agent in real time, translating mental focus, intention, or emotional state into prompts, actions, or decisions.
These are some use-case examples...
🧠 In Work: You're in deep focus. You imagine a slide, and your AI Agent starts drafting it. You think of a person; it pulls up your last conversation.
🧠 In Accessibility: For someone unable to speak or type, the interface interprets intent from brain signals and helps control devices, compose messages, or navigate systems.
🧠 In Creativity: A designer imagines a shape, a scene, or a melody, and the AI Agent renders variations in real time, refining the output through guided intent.
These are some current research projects...
📚 Meta AI's Brain-to-Text Decoding: Decodes full sentences from non-invasive brain activity with up to 80% character accuracy, bridging neural intent to digital language. https://lnkd.in/gTEJpa4e
📚 UC Berkeley's Brain-to-Voice Neuroprosthesis: Translates brain signals into audible speech, restoring naturalistic communication for people with speech loss. https://lnkd.in/g_D3Xeup
📚 Caltech's Mind-to-Text Interface: Achieves 79% accuracy in translating imagined internal speech into real-time text, enabling seamless brain-to-device communication. https://lnkd.in/gEuVKreq
These are some startups to watch...
🚀 Neurable: EEG-based wearables decoding cognitive load and focus in real time. https://www.neurable.com/
🚀 OpenBCI: Makers of Galea, a headset combining EEG, EMG, eye tracking, and skin conductance for immersive neural interfacing. https://lnkd.in/girt4PAW
🚀 Cognixion: Brain-powered communication integrated with AR and speech synthesis for non-verbal users. https://www.cognixion.com/
🚀 Paradromics: High-bandwidth BCI for translating neural activity into speech or system commands for those with severe impairments. https://lnkd.in/giepGKH4
What is a likely time horizon...
1–2 years: Wearable EEG interfaces paired with AI for narrow tasks: adaptive UI, hands-free control, attention-based interaction.
3–5 years: Thought-to-agent pipelines for work, accessibility, and creative tools, personalized to individual brain patterns and cognitive signatures.
The future isn't just AI that understands your prompts. It's AI that understands you as soon as you think.
Next up: Multimodal AI Sensory Fusion ("Glass Whisperer")
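The T2A pipeline described above can be sketched end to end: a decoded intent label plus its confidence is turned into a prompt an AI agent could act on. The decoder output, the intent labels, and the prompt templates below are hypothetical placeholders, not from any of the projects listed.

```python
# Toy "Thought-to-Agent" pipeline sketch: decoded intent -> agent prompt.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DecodedIntent:
    label: str         # e.g. "draft_slide" or "open_last_conversation"
    confidence: float  # decoder confidence between 0 and 1

PROMPT_TEMPLATES = {
    "draft_slide": "Draft a presentation slide about: {context}",
    "open_last_conversation": "Retrieve my last conversation with {context}.",
}

def intent_to_prompt(intent: DecodedIntent, context: str,
                     min_confidence: float = 0.7) -> Optional[str]:
    # Only forward the intent to the agent when the decoder is confident enough.
    if intent.confidence < min_confidence:
        return None
    template = PROMPT_TEMPLATES.get(intent.label)
    return template.format(context=context) if template else None

if __name__ == "__main__":
    decoded = DecodedIntent(label="draft_slide", confidence=0.85)
    print(intent_to_prompt(decoded, context="Q3 wearables roadmap"))
```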
-
John couldn't lift a spoon to his mouth. Harvard engineers changed that with a vest and a foot pedal. No surgery. No implants. Just technology that learns how he moves.
The breakthrough that changes everything:
↳ 94% accuracy reading intentions
↳ Effort reduced by one-third
↳ 9 users tested: 5 stroke survivors, 4 with ALS
↳ Zero invasive procedures
Think about that. As ALS advanced, John lost the ability to feed himself. Every meal required help. Every bite reminded him of what he'd lost. Then he reached out to Harvard's engineering team with one request: help me eat alone again.
While medicine chases cures, these engineers built companionship. A sensor-loaded vest with an inflatable balloon under the arm. Machine learning watches movement patterns, predicts intentions, and provides support exactly when needed. John's solution was beautifully simple: press a foot button. The vest inflates. His arm lifts. He feeds himself.
Traditional Assistive Reality:
↳ Rigid exoskeletons
↳ Generic movements
↳ $50,000+ costs
↳ Fighting your body
Harvard's Approach:
↳ Soft adaptive support
↳ Personalised patterns
↳ Future home use
↳ Working with you
But here's what stopped me cold: this technology doesn't try to cure. It companions. For stroke survivors: rehabilitation accelerated. For ALS patients: independence extended. Not about fixing what's broken. About supporting what remains.
The vest learns each user's unique patterns. It distinguishes shoulder movements with 94% accuracy and expands range of motion in the shoulders, elbows, and wrists. No more compensatory leaning or twisting. Just natural movement, gently assisted.
John proved what matters most: a spoon lifted to his own mouth. Dignity restored with each meal. Autonomy measured in small, profound victories.
The Multiplication Effect:
1 user = concept proven
100 devices = care transformed
10,000 deployed = independence preserved
At scale = disability redefined
Consider the numbers: 450,000 people with ALS globally. 15 million strokes annually. 80% facing upper limb challenges. Millions who could keep feeding themselves. Embracing loved ones. Creating. Contributing.
Massachusetts General Hospital validated the results. The National Science Foundation funds home deployment. From lab to life.
John didn't just test a device. He defined its purpose. His simple request, "help me eat alone", became breakthrough engineering. His persistence could now help millions.
The future of assistive technology isn't replacing human function. It's preserving human dignity through intelligent support.
Follow me, Dr. Martha Boeckenfeld, for innovations that honour what makes us human.
♻️ Share to bring hope to millions facing motor challenges.
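The assist loop described in the post can be pictured as a simple control step: a learned model watches the arm sensors for the start of a feeding motion, and when the wearer presses the foot button the soft actuator inflates to carry part of the load. The sensor values, the trivial threshold standing in for the personalized classifier, and the inflation step below are all illustrative, not the Harvard team's code.

```python
# Rough sketch of a sensor -> intent -> assist control step (illustrative only).

def classify_motion(sensor_window: list[float]) -> str:
    # Placeholder for the personalized ML model (the real system is reported to
    # distinguish shoulder movements with ~94% accuracy). Here: a toy threshold.
    return "lift_to_mouth" if sum(sensor_window) / len(sensor_window) > 0.5 else "rest"

def assist_step(sensor_window: list[float], foot_button_pressed: bool) -> str:
    intent = classify_motion(sensor_window)
    if foot_button_pressed and intent == "lift_to_mouth":
        return "inflate balloon actuator: arm supported"
    return "deflate: no assistance"

if __name__ == "__main__":
    print(assist_step([0.7, 0.8, 0.6], foot_button_pressed=True))
    print(assist_step([0.1, 0.2, 0.1], foot_button_pressed=False))
```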
-
The review article 'Advances in biomonitoring technologies for women's health', published in Nature, addresses the long-standing bias in biomedical research and healthcare toward male populations, which has resulted in women (and transgender individuals) being underrepresented in studies, diagnostic norms, and device design. The review explores applications of wearables and biosensors across multiple domains of women's health, including fertility, pregnancy and maternal health, hormonal monitoring, vaginal infections, gynecologic and breast cancers, and osteoporosis.
📌 For example, devices that track basal body temperature, sweat biomarkers, or hormonal shifts can support ovulation tracking and fertility monitoring.
📌 In pregnancy, smart textiles, abdominal sensors, and wearable ECG/uterine contraction monitors are being developed to continuously monitor maternal and fetal biomarkers.
📌 On the diagnostic side, innovations in point-of-care assays and microfluidic devices are being adapted to detect vaginal pathogens (e.g. via pH, enzymatic markers, or nucleic acid amplification) and early signals of gynecologic cancers (liquid biopsy, micro-exosome capture, multifunctional immunosensors).
The authors argue that this gap contributes to delays in diagnosis, suboptimal treatments, and systemic inequities in women's health. They survey emerging technologies, especially wearable sensors, point-of-care diagnostics, and AI/ML tools, that can help close the gap by enabling continuous, non-invasive biomonitoring tailored to female physiology.
However, the authors underscore significant barriers to adoption. Many of the devices are still at the prototype or small-scale testing stage and lack validation in diverse, large populations, especially in low-resource settings. Usability, user compliance, comfort, data interpretation, cost, and integration with clinical workflows are major hurdles. In addition, socioeconomic and digital divides, such as unequal access to the internet, smartphones, and health literacy, can limit uptake among marginalized groups.
The review also discusses how AI and machine learning could amplify the impact of biomonitoring by improving predictive accuracy and pattern recognition, though models must be trained on more balanced, representative datasets to avoid reinforcing bias.
Find out more via the link 🔗 https://lnkd.in/d-xh9R6m
#femtech #womenshealth #innovation #biomonitoring #biomarkers