{"id":219,"date":"2025-07-11T05:38:41","date_gmt":"2025-07-11T12:38:41","guid":{"rendered":"https:\/\/scienceblog.com\/neuroedge\/?p=219"},"modified":"2025-07-11T05:38:41","modified_gmt":"2025-07-11T12:38:41","slug":"robot-masters-animal-like-movement-in-nine-hours","status":"publish","type":"post","link":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/","title":{"rendered":"Robot Masters Animal-Like Movement in Nine Hours"},"content":{"rendered":"<p>Watch a dog navigate from sidewalk to forest floor\u2014notice how seamlessly it shifts from trotting to bounding, adjusting its gait without conscious thought. Now scientists at the University of Leeds have taught a four-legged robot named &#8220;Clarence&#8221; to do the same thing, learning in just nine hours what takes young animals days or weeks to master.<\/p>\n<p>The robot achieved something no machine has done before: autonomously switching between eight different gaits\u2014trotting, running, bounding, hopping, and more\u2014based purely on terrain conditions it encounters. Unlike current robots that must be programmed for specific movements, <a href=\"https:\/\/www.nature.com\/articles\/s42256-025-01065-z\">Clarence adapts its stride in real-time, even on surfaces it has never experienced<\/a>.<\/p>\n<h2>The Animal Blueprint<\/h2>\n<p>Animals switch gaits for survival\u2014to save energy, maintain balance, or escape predators. A horse shifts from walk to trot to gallop not by following a rulebook, but through strategies embedded in its nervous system. The Leeds research team reverse-engineered these biological principles into artificial intelligence.<\/p>\n<p>&#8220;Our findings could have a significant impact on the future of legged robot motion control by reducing many of the previous limitations around adaptability,&#8221; notes first author Joseph Humphreys, a postgraduate researcher. 
His framework embeds three core animal abilities: gait transition strategies, procedural memory for different movements, and real-time motion adjustments.<\/p>\n<p>The breakthrough lies in teaching robots not just how to move, but how to decide which movement to use. Traditional robots follow pre-programmed patterns\u2014imagine trying to navigate varied terrain while locked into a single walking style. Clarence learns to evaluate conditions and choose optimal gaits on the fly.<\/p>\n<h2>Beyond Programming: Learning to Choose<\/h2>\n<p>The research team developed what they call bio-inspired metrics\u2014mathematical representations of principles animals use for gait selection:<\/p>\n<ul>\n<li><strong>Energy efficiency:<\/strong> Minimizing power consumption like animals conserving energy<\/li>\n<li><strong>Stability:<\/strong> Maintaining balance across unpredictable surfaces<\/li>\n<li><strong>Force management:<\/strong> Protecting joints and actuators from excessive strain<\/li>\n<li><strong>Work optimization:<\/strong> Reducing mechanical effort required for movement<\/li>\n<\/ul>\n<p>These metrics work together\u2014no single factor drives gait choice, just as in nature. When Clarence encounters loose timber, sensors detect instability and the AI rapidly shifts from trotting to bounding, recovering balance before catastrophic failure.<\/p>\n<h2>Real-World Validation<\/h2>\n<p>The true test came outside the laboratory. Researchers unleashed Clarence on terrain it had never seen: muddy grass, rock piles, overgrown roots, even loose timber that shifted underfoot. The robot navigated them all, switching gaits autonomously as conditions demanded.<\/p>\n<p>One striking parallel emerged with wild animal behavior. When Clarence encountered particularly challenging terrain, it deployed auxiliary gaits\u2014specialized movements like pronking and limping that animals use for stability recovery. 
The researchers hadn&#8217;t programmed this strategy; it emerged naturally from the AI&#8217;s decision-making process.<\/p>\n<p>Professor Zhou from UCL Computer Science explains the significance: &#8220;Instead of training robots for specific tasks, we wanted to give them the strategic intelligence animals use to adapt their gaits\u2014using principles like balance, coordination, and energy efficiency.&#8221;<\/p>\n<h2>From Simulation to Reality<\/h2>\n<p>The training occurred entirely in virtual environments using deep reinforcement learning\u2014essentially high-powered trial and error across hundreds of simulated terrains simultaneously. This approach mirrors how Neo learns martial arts in The Matrix, as Humphreys notes: &#8220;All of the training happens in simulation. You train the policy on a computer, then take it and put it on the robot and it is just as proficient as in the training.&#8221;<\/p>\n<p>What makes this remarkable is the seamless transfer from simulation to reality. Despite never experiencing rough terrain during training, Clarence successfully navigated complex real-world surfaces on its first attempt\u2014a notorious challenge in robotics known as the &#8220;sim-to-real gap.&#8221;<\/p>\n<p>The robot achieved 90.6% accuracy in gait selection while consuming minimal energy\u2014just 4 joules to complete tasks that would typically require significantly more energy. This efficiency becomes crucial for applications where battery life determines mission success.<\/p>\n<h2>Implications Beyond Robotics<\/h2>\n<p>The framework opens pathways for robots in hazardous environments where human presence risks lives: nuclear decommissioning, disaster response, planetary exploration. Current robots often fail when encountering unexpected conditions\u2014a limitation that could prove fatal in rescue scenarios.<\/p>\n<p>Perhaps more intriguingly, this research offers a new tool for studying animal biomechanics itself. 
Rather than burdening living animals with invasive sensors or dangerous experiments, researchers can test hypotheses using robotic surrogates that replicate natural movement patterns.<\/p>\n<p>The work represents a fundamental shift from programming specific behaviors to instilling adaptive intelligence. As artificial systems become more autonomous, the biological principles that enabled millions of years of successful navigation may prove invaluable guides for the next generation of adaptive machines.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Watch a dog navigate from sidewalk to forest floor\u2014notice how seamlessly it shifts from trotting to bounding, adjusting its gait without conscious thought. Now scientists at the University of Leeds have taught a four-legged robot named &#8220;Clarence&#8221; to do the same thing, learning in just nine hours what takes young animals days or weeks to &#8230; <a title=\"Robot Masters Animal-Like Movement in Nine Hours\" class=\"read-more\" href=\"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/\" aria-label=\"Read more about Robot Masters Animal-Like Movement in Nine Hours\">Read more<\/a><\/p>\n","protected":false},"author":1297,"featured_media":220,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[4,6],"tags":[],"class_list":["post-219","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-computational-innovation","category-technology","generate-columns","tablet-grid-50","mobile-grid-100","grid-parent","grid-50"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin 
v27.4 (Yoast SEO v27.4) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>Robot Masters Animal-Like Movement in Nine Hours - NeuroEdge<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Robot Masters Animal-Like Movement in Nine Hours\" \/>\n<meta property=\"og:description\" content=\"Watch a dog navigate from sidewalk to forest floor\u2014notice how seamlessly it shifts from trotting to bounding, adjusting its gait without conscious thought. Now scientists at the University of Leeds have taught a four-legged robot named &#8220;Clarence&#8221; to do the same thing, learning in just nine hours what takes young animals days or weeks to ... Read more\" \/>\n<meta property=\"og:url\" content=\"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/\" \/>\n<meta property=\"og:site_name\" content=\"NeuroEdge\" \/>\n<meta property=\"article:published_time\" content=\"2025-07-11T12:38:41+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/robot-4-legs.jpeg\" \/>\n\t<meta property=\"og:image:width\" content=\"700\" \/>\n\t<meta property=\"og:image:height\" content=\"394\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"NeuroEdge\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"NeuroEdge\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/\"},\"author\":{\"name\":\"NeuroEdge\",\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/#\\\/schema\\\/person\\\/a13c664778e7eb97cb71e3e1ad356d2e\"},\"headline\":\"Robot Masters Animal-Like Movement in Nine Hours\",\"datePublished\":\"2025-07-11T12:38:41+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/\"},\"wordCount\":746,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/wp-content\\\/uploads\\\/sites\\\/14\\\/2025\\\/07\\\/robot-4-legs.jpeg\",\"articleSection\":[\"Computational 
Innovation\",\"Technology\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/#respond\"]}],\"copyrightYear\":\"2025\",\"copyrightHolder\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/#organization\"}},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/\",\"url\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/\",\"name\":\"Robot Masters Animal-Like Movement in Nine Hours - NeuroEdge\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/wp-content\\\/uploads\\\/sites\\\/14\\\/2025\\\/07\\\/robot-4-legs.jpeg\",\"datePublished\":\"2025-07-11T12:38:41+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/#primaryimage\",\"url\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/wp-content\\\/uploads\\\/sites\\\/14\\\/2025\
\\/07\\\/robot-4-legs.jpeg\",\"contentUrl\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/wp-content\\\/uploads\\\/sites\\\/14\\\/2025\\\/07\\\/robot-4-legs.jpeg\",\"width\":700,\"height\":394,\"caption\":\"\\\"I'm a street-walking cheetah with a heart full of Napalm\\\"\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/2025\\\/07\\\/11\\\/robot-masters-animal-like-movement-in-nine-hours\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Robot Masters Animal-Like Movement in Nine Hours\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/#website\",\"url\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/\",\"name\":\"NeuroEdge\",\"description\":\"A data-driven look at neuroscience and AI, for investors, policymakers, and innovators.\",\"publisher\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/#organization\",\"name\":\"NeuroEdge\",\"url\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/wp-content\\\/uploads\\\/sites\\\/14\\\/2025\\\/04\\\/cropped-neuroedge_logo.jpg\",\"contentUrl\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/wp-content\\\/uploads\\\/sites\\\/14\\\/2025\\\/04\\\/cropped-neuroedge_logo.jpg
\",\"width\":955,\"height\":191,\"caption\":\"NeuroEdge\"},\"image\":{\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/#\\\/schema\\\/logo\\\/image\\\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/#\\\/schema\\\/person\\\/a13c664778e7eb97cb71e3e1ad356d2e\",\"name\":\"NeuroEdge\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/28782ec992e8763e1f8d41ddc10864e7d8cd4cb99bacea6224c4abe634bbabec?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/28782ec992e8763e1f8d41ddc10864e7d8cd4cb99bacea6224c4abe634bbabec?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/28782ec992e8763e1f8d41ddc10864e7d8cd4cb99bacea6224c4abe634bbabec?s=96&d=mm&r=g\",\"caption\":\"NeuroEdge\"},\"url\":\"https:\\\/\\\/scienceblog.com\\\/neuroedge\\\/author\\\/neuroedge\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Robot Masters Animal-Like Movement in Nine Hours - NeuroEdge","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/","og_locale":"en_US","og_type":"article","og_title":"Robot Masters Animal-Like Movement in Nine Hours","og_description":"Watch a dog navigate from sidewalk to forest floor\u2014notice how seamlessly it shifts from trotting to bounding, adjusting its gait without conscious thought. Now scientists at the University of Leeds have taught a four-legged robot named &#8220;Clarence&#8221; to do the same thing, learning in just nine hours what takes young animals days or weeks to ... 
Read more","og_url":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/","og_site_name":"NeuroEdge","article_published_time":"2025-07-11T12:38:41+00:00","og_image":[{"width":700,"height":394,"url":"https:\/\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/robot-4-legs.jpeg","type":"image\/jpeg"}],"author":"NeuroEdge","twitter_card":"summary_large_image","twitter_misc":{"Written by":"NeuroEdge","Est. reading time":"4 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/#article","isPartOf":{"@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/"},"author":{"name":"NeuroEdge","@id":"https:\/\/scienceblog.com\/neuroedge\/#\/schema\/person\/a13c664778e7eb97cb71e3e1ad356d2e"},"headline":"Robot Masters Animal-Like Movement in Nine Hours","datePublished":"2025-07-11T12:38:41+00:00","mainEntityOfPage":{"@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/"},"wordCount":746,"commentCount":0,"publisher":{"@id":"https:\/\/scienceblog.com\/neuroedge\/#organization"},"image":{"@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/#primaryimage"},"thumbnailUrl":"https:\/\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/robot-4-legs.jpeg","articleSection":["Computational 
Innovation","Technology"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/#respond"]}],"copyrightYear":"2025","copyrightHolder":{"@id":"https:\/\/scienceblog.com\/#organization"}},{"@type":"WebPage","@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/","url":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/","name":"Robot Masters Animal-Like Movement in Nine Hours - NeuroEdge","isPartOf":{"@id":"https:\/\/scienceblog.com\/neuroedge\/#website"},"primaryImageOfPage":{"@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/#primaryimage"},"image":{"@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/#primaryimage"},"thumbnailUrl":"https:\/\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/robot-4-legs.jpeg","datePublished":"2025-07-11T12:38:41+00:00","breadcrumb":{"@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/#primaryimage","url":"https:\/\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/robot-4-legs.jpeg","contentUrl":"https:\/\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/robot-4-legs.jpeg","width":700,"height":394,"caption":"\"I'm a street-walking cheetah with a heart full of 
Napalm\""},{"@type":"BreadcrumbList","@id":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/11\/robot-masters-animal-like-movement-in-nine-hours\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/scienceblog.com\/neuroedge\/"},{"@type":"ListItem","position":2,"name":"Robot Masters Animal-Like Movement in Nine Hours"}]},{"@type":"WebSite","@id":"https:\/\/scienceblog.com\/neuroedge\/#website","url":"https:\/\/scienceblog.com\/neuroedge\/","name":"NeuroEdge","description":"A data-driven look at neuroscience and AI, for investors, policymakers, and innovators.","publisher":{"@id":"https:\/\/scienceblog.com\/neuroedge\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/scienceblog.com\/neuroedge\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/scienceblog.com\/neuroedge\/#organization","name":"NeuroEdge","url":"https:\/\/scienceblog.com\/neuroedge\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/scienceblog.com\/neuroedge\/#\/schema\/logo\/image\/","url":"https:\/\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/04\/cropped-neuroedge_logo.jpg","contentUrl":"https:\/\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/04\/cropped-neuroedge_logo.jpg","width":955,"height":191,"caption":"NeuroEdge"},"image":{"@id":"https:\/\/scienceblog.com\/neuroedge\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/scienceblog.com\/neuroedge\/#\/schema\/person\/a13c664778e7eb97cb71e3e1ad356d2e","name":"NeuroEdge","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/28782ec992e8763e1f8d41ddc10864e7d8cd4cb99bacea6224c4abe634bbabec?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/28782ec992e8763e1f8d41ddc10864e7
d8cd4cb99bacea6224c4abe634bbabec?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/28782ec992e8763e1f8d41ddc10864e7d8cd4cb99bacea6224c4abe634bbabec?s=96&d=mm&r=g","caption":"NeuroEdge"},"url":"https:\/\/scienceblog.com\/neuroedge\/author\/neuroedge\/"}]}},"jetpack_featured_media_url":"https:\/\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/robot-4-legs.jpeg","jetpack_likes_enabled":true,"jetpack_sharing_enabled":true,"jetpack-related-posts":[{"id":216,"url":"https:\/\/scienceblog.com\/neuroedge\/2025\/07\/09\/robot-surgeons-learn-like-residents-and-just-performed-first-autonomous-surgery\/","url_meta":{"origin":219,"position":0},"title":"Robot Surgeons Learn Like Residents\u2014And Just Performed First Autonomous Surgery","author":"NeuroEdge","date":"July 9, 2025","format":false,"excerpt":"Three surgical robots trained on video demonstrations have achieved what many thought was years away: performing complex procedures with the precision of expert surgeons and the adaptability to handle the unexpected twists that define real medical emergencies. 
The systems mark a watershed moment where artificial intelligence moves from simple surgical\u2026","rel":"","context":"In &quot;Automation &amp; Efficiency&quot;","block_context":{"text":"Automation &amp; Efficiency","link":"https:\/\/scienceblog.com\/neuroedge\/category\/automation-efficiency\/"},"img":{"alt_text":"Surgie, a humanoid medical robot, is about to give an ultrasound to a patient.","src":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/surgie.jpg?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/surgie.jpg?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/surgie.jpg?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/07\/surgie.jpg?resize=700%2C400&ssl=1 2x"},"classes":[]},{"id":284,"url":"https:\/\/scienceblog.com\/neuroedge\/2025\/12\/29\/brain-model-discovers-neurons-that-reliably-predict-mistakes\/","url_meta":{"origin":219,"position":1},"title":"Brain Model Discovers Neurons That Reliably Predict Mistakes","author":"NeuroEdge","date":"December 29, 2025","format":false,"excerpt":"About 20 percent of neurons in a learning brain seem to be doing something counterintuitive. When these cells become more active, mistakes follow. 
A new computational model of the brain, built to mirror real neural circuits rather than optimize performance, stumbled onto this pattern while learning a simple visual task.\u2026","rel":"","context":"In &quot;Brain Health&quot;","block_context":{"text":"Brain Health","link":"https:\/\/scienceblog.com\/neuroedge\/category\/brain-health\/"},"img":{"alt_text":"neuron networks","src":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/12\/Screenshot-2025-12-29-at-8.52.01-AM.jpg?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/12\/Screenshot-2025-12-29-at-8.52.01-AM.jpg?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/12\/Screenshot-2025-12-29-at-8.52.01-AM.jpg?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/12\/Screenshot-2025-12-29-at-8.52.01-AM.jpg?resize=700%2C400&ssl=1 2x"},"classes":[]},{"id":207,"url":"https:\/\/scienceblog.com\/neuroedge\/2025\/06\/20\/human-ai-teams-make-better-medical-diagnoses\/","url_meta":{"origin":219,"position":2},"title":"Human-AI Teams Make Better Medical Diagnoses","author":"NeuroEdge","date":"June 20, 2025","format":false,"excerpt":"Hybrid collectives consisting of humans and artificial intelligence make significantly more accurate medical diagnoses than either medical professionals or AI systems alone. New research analyzing over 40,000 diagnoses reveals that combining human expertise with AI models creates a powerful diagnostic partnership that outperforms traditional approaches. 
The study, published in Proceedings\u2026","rel":"","context":"In &quot;Automation &amp; Efficiency&quot;","block_context":{"text":"Automation &amp; Efficiency","link":"https:\/\/scienceblog.com\/neuroedge\/category\/automation-efficiency\/"},"img":{"alt_text":"Robot and doctor shaking hands","src":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/06\/Low-Res_MPIB-DoctorsKI.webp?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/06\/Low-Res_MPIB-DoctorsKI.webp?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/06\/Low-Res_MPIB-DoctorsKI.webp?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/06\/Low-Res_MPIB-DoctorsKI.webp?resize=700%2C400&ssl=1 2x"},"classes":[]},{"id":323,"url":"https:\/\/scienceblog.com\/neuroedge\/2026\/04\/24\/why-a-slower-ai-might-actually-feel-smarter-to-you\/","url_meta":{"origin":219,"position":3},"title":"Why a Slower AI Might Actually Feel Smarter to You","author":"NeuroEdge","date":"April 24, 2026","format":false,"excerpt":"Type a question into an AI chatbot and hit send. Now watch the cursor blink. Two seconds feels fine, barely noticeable. Nine seconds and something shifts: you start to wonder if the system is really working through the problem. 
By the time you hit twenty seconds you have either concluded\u2026","rel":"","context":"In &quot;Computational Innovation&quot;","block_context":{"text":"Computational Innovation","link":"https:\/\/scienceblog.com\/neuroedge\/category\/computational-innovation\/"},"img":{"alt_text":"Deepseek screenshot","src":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2026\/04\/pexels-bertellifotografia-30530410.jpg?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2026\/04\/pexels-bertellifotografia-30530410.jpg?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2026\/04\/pexels-bertellifotografia-30530410.jpg?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2026\/04\/pexels-bertellifotografia-30530410.jpg?resize=700%2C400&ssl=1 2x"},"classes":[]},{"id":197,"url":"https:\/\/scienceblog.com\/neuroedge\/2025\/06\/05\/nanowire-eye-implants-give-blind-mice-infrared-vision\/","url_meta":{"origin":219,"position":4},"title":"Nanowire Eye Implants Give Blind Mice Infrared Vision","author":"NeuroEdge","date":"June 5, 2025","format":false,"excerpt":"Scientists have developed a new type of retinal implant that not only restored vision in blind mice but also gave them the ability to see infrared light\u2014something even healthy eyes cannot detect. The device, made from interwoven tellurium nanowires, represents a significant step forward in artificial vision technology and could\u2026","rel":"","context":"In &quot;Technology&quot;","block_context":{"text":"Technology","link":"https:\/\/scienceblog.com\/neuroedge\/category\/technology\/"},"img":{"alt_text":"Tellurium exhibits broad-spectrum optical absorption, spanning visible to infrared light (top left). 
When implanted subretinally, a tellurium nanowire prosthesis can replace damaged photoreceptors and generate photocurrents that stimulate remaining retinal circuits (bottom left) and activate the visual cortex (top right). Thanks to engineered asymmetry and nanowire network structure, these devices produce large, spontaneous photocurrents without external bias and allow for minimally invasive implantation (bottom right). These features position tellurium nanowire networks (TeNWNs) as a promising next-generation technology for visual prosthetics.","src":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/06\/science.adu2987-fa.jpg?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/06\/science.adu2987-fa.jpg?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/06\/science.adu2987-fa.jpg?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2025\/06\/science.adu2987-fa.jpg?resize=700%2C400&ssl=1 2x"},"classes":[]},{"id":300,"url":"https:\/\/scienceblog.com\/neuroedge\/2026\/02\/23\/automated-arms-are-rewriting-how-we-find-the-catalysts-that-run-the-world\/","url_meta":{"origin":219,"position":5},"title":"Automated Arms Are Rewriting How We Find the Catalysts That Run the World","author":"NeuroEdge","date":"February 23, 2026","format":false,"excerpt":"At around midnight in a laboratory in Daejeon, South Korea, two robotic arms are working through a queue of 96 experiments nobody asked them to start. One grips a pipette, delivers a precise shot of sodium borohydride into a reaction vessel, and moves on. 
The other lifts a small cuvette,\u2026","rel":"","context":"In &quot;Computational Innovation&quot;","block_context":{"text":"Computational Innovation","link":"https:\/\/scienceblog.com\/neuroedge\/category\/computational-innovation\/"},"img":{"alt_text":"Robot-Based Automated Platform","src":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2026\/02\/Robot-Based-Automated-Platform.jpeg?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2026\/02\/Robot-Based-Automated-Platform.jpeg?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2026\/02\/Robot-Based-Automated-Platform.jpeg?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/scienceblog.com\/neuroedge\/wp-content\/uploads\/sites\/14\/2026\/02\/Robot-Based-Automated-Platform.jpeg?resize=700%2C400&ssl=1 2x"},"classes":[]}],"_links":{"self":[{"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/posts\/219","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/users\/1297"}],"replies":[{"embeddable":true,"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/comments?post=219"}],"version-history":[{"count":1,"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/posts\/219\/revisions"}],"predecessor-version":[{"id":221,"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/posts\/219\/revisions\/221"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/media\/220"}],"wp:attachment":[{"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/media?parent=219"}],"wp:term":[{"taxonomy":"category","embeddable":true,"h
ref":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/categories?post=219"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/scienceblog.com\/neuroedge\/wp-json\/wp\/v2\/tags?post=219"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}