<h1>New AI Learns to Read Scans With Just a Few Examples</h1>
<p><em>NeuroEdge | August 1, 2025</em></p>
<p>A team of researchers at the University of California San Diego has developed a new artificial intelligence tool that can learn to read medical images using only a fraction of the data typically required. The system, called GenSeg, dramatically cuts the number of expert-labeled scans needed to train diagnostic models, reducing data demands by up to 20 times. The breakthrough could help bring powerful imaging tools to hospitals and clinics with fewer resources, where annotated datasets are often scarce.</p>
<h2>Learning From Just a Handful of Examples</h2>
<p>Medical image segmentation, in which each pixel in a scan is labeled as healthy or diseased tissue, is a cornerstone of many diagnostic tasks. Traditionally, training an AI model to perform segmentation has required thousands of pixel-by-pixel annotated images. But creating these datasets is expensive and time-consuming, often requiring highly trained specialists.</p>
<p>“Creating such datasets demands expert labor, time and cost,” said Li Zhang, lead author and PhD student in electrical and computer engineering at UC San Diego. “For many medical conditions, that level of data simply does not exist.”</p>
<p>GenSeg changes the game. It can learn from as few as 40 labeled images and still match or outperform standard methods trained on hundreds. The key is how it learns.</p>
<p>GenSeg doesn’t just consume data; it generates it, and it does so strategically.</p>
<h2>How It Works</h2>
<p>GenSeg operates in stages. First, it learns to generate realistic images from expert-labeled segmentation masks. Then it creates synthetic image-mask pairs to supplement the small real-world dataset. These combined examples are used to train a segmentation model. Through a feedback loop, the system adjusts its image generation based on how well that model performs.</p>
<p>“The segmentation performance itself guides the data generation process,” Zhang explained. “This ensures that the synthetic data are not just realistic, but also specifically tailored to improve the model’s segmentation capabilities.”</p>
<h2>What It Can Do</h2>
<p>GenSeg was tested on 19 datasets spanning a wide range of imaging types and medical conditions. It learned to identify:</p>
<ul>
<li>Skin lesions from dermoscopy images</li>
<li>Breast cancer from ultrasound</li>
<li>Foot ulcers from standard photos</li>
<li>Polyps from colonoscopy images</li>
<li>Lungs from chest X-rays</li>
<li>Placental vessels from fetoscopic images</li>
</ul>
<p>In these tests, GenSeg often required just 40 to 100 labeled examples. In lung segmentation, for instance, it matched performance typically achieved with 175 examples using only 9, a roughly 19-fold gain in data efficiency.</p>
<h2>Outperforming the State of the Art</h2>
<p>Beyond efficiency, GenSeg outperformed several leading data augmentation tools and semi-supervised methods, including nnUNet, WGAN, and mutual correction frameworks. In both in-domain (same dataset) and out-of-domain (different dataset) tests, it consistently delivered better results.</p>
<p>One reason is that traditional data augmentation and semi-supervised methods treat data generation and model training as separate steps. GenSeg, by contrast, integrates both in a multi-level optimization loop.</p>
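<p>The staged pipeline and feedback loop described above can be sketched in miniature. The toy below is an illustration of the idea only, not the authors' implementation: mask flips stand in for a learned mask generator, a single intensity threshold plays the role of the segmentation network, and validation Dice score is the feedback signal used to pick the generator's parameters.</p>

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mask():
    # A toy "organ": a bright square on a dark background.
    m = np.zeros((16, 16))
    m[4:12, 4:12] = 1.0
    return m

def augment_mask(mask, rng):
    # Label-space augmentation standing in for a learned mask generator.
    if rng.random() < 0.5:
        mask = np.flip(mask, axis=0)
    if rng.random() < 0.5:
        mask = np.flip(mask, axis=1)
    return mask

def generate_image(mask, noise_scale, rng):
    # Toy mask-to-image generator: foreground brighter than background,
    # with a tunable noise level playing the role of generator weights.
    return mask + rng.normal(0.0, noise_scale, mask.shape)

def train_segmenter(pairs):
    # Toy "segmentation model": one global intensity threshold.
    pixels = np.concatenate([im.ravel() for im, _ in pairs])
    return float(pixels.mean())

def dice(pred, target):
    inter = np.logical_and(pred, target).sum()
    return 2.0 * inter / (pred.sum() + target.sum() + 1e-8)

def evaluate(threshold, pairs):
    return float(np.mean([dice(im > threshold, m > 0) for im, m in pairs]))

# Ultra low-data regime: a handful of real labeled pairs.
real = [(generate_image(make_mask(), 0.2, rng), make_mask()) for _ in range(5)]
val = [(generate_image(make_mask(), 0.2, rng), make_mask()) for _ in range(5)]

# Outer loop tunes the generator; inner loop trains the segmenter;
# validation performance feeds back into the generation step.
best_score, best_noise = -1.0, None
for noise_scale in (0.05, 0.2, 0.8):
    synthetic = []
    for _, m in real:
        m_aug = augment_mask(m, rng)
        synthetic.append((generate_image(m_aug, noise_scale, rng), m_aug))
    thr = train_segmenter(real + synthetic)
    score = evaluate(thr, val)
    if score > best_score:
        best_score, best_noise = score, noise_scale

print(best_noise, round(best_score, 3))
```

<p>In the actual system both the generator and the segmentation model are deep networks trained jointly by multi-level optimization rather than a grid search, but the data flow is the same: synthesize images conditioned on masks, train on real plus synthetic pairs, and feed segmentation performance back into generation.</p>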
<p>This makes its synthetic data more useful, not just more realistic.</p>
<h2>Flexible and Scalable</h2>
<p>The team showed that GenSeg works with a variety of segmentation backbones, from UNet and DeepLab to the Transformer-based SwinUnet. It also performs well on both 2D and 3D medical image segmentation tasks, such as hippocampus and liver scans.</p>
<p>While designed for low-data scenarios, GenSeg also improves results when large datasets are available. It can be dropped into existing workflows, and because it doesn’t alter the model architecture, it adds no computational burden at diagnosis time.</p>
<h2>What Comes Next</h2>
<p>The researchers aim to further refine GenSeg’s synthetic data generation, particularly for anatomically complex or variable cases. They also plan to incorporate direct feedback from clinicians to tailor training data more closely to real-world diagnostic needs.</p>
<p>“This project was born from the need to break this bottleneck and make powerful segmentation tools more practical and accessible,” said Zhang.</p>
<p>If successful, GenSeg could help democratize access to AI-assisted diagnostics, especially in resource-limited settings where labeled imaging data is hard to come by.</p>
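<p>The drop-in, backbone-agnostic behavior described under “Flexible and Scalable” can be pictured as a thin training wrapper. The sketch below uses hypothetical names (<code>genseg_style_training</code>, <code>ThresholdSegmenter</code>, and the <code>fit</code>/<code>predict</code> interface are stand-ins, not the authors' API): the backbone is treated as a black box, so swapping UNet for DeepLab or SwinUnet would change only the constructor, and prediction-time cost is untouched.</p>

```python
from typing import Callable, List, Tuple
import numpy as np

Pair = Tuple[np.ndarray, np.ndarray]  # (image, mask)

class ThresholdSegmenter:
    """Stand-in backbone: labels pixels above a learned intensity threshold."""

    def fit(self, pairs: List[Pair]) -> "ThresholdSegmenter":
        self.t = float(np.mean([im.mean() for im, _ in pairs]))
        return self

    def predict(self, image: np.ndarray) -> np.ndarray:
        return image > self.t

def flip_synthesize(pairs: List[Pair]) -> List[Pair]:
    # Stand-in data generator: mirrored copies of the real pairs.
    return [(np.flip(im, 1), np.flip(m, 1)) for im, m in pairs]

def genseg_style_training(
    build_backbone: Callable[[], ThresholdSegmenter],
    real_pairs: List[Pair],
    synthesize: Callable[[List[Pair]], List[Pair]],
) -> ThresholdSegmenter:
    # The backbone is constructed and trained as a black box on real plus
    # synthetic pairs; its architecture is never modified, so inference
    # cost is identical with or without the extra synthetic data.
    model = build_backbone()
    return model.fit(real_pairs + synthesize(real_pairs))

mask = np.zeros((8, 8))
mask[2:6, 2:6] = 1.0
real = [(mask + 0.1, mask)]
model = genseg_style_training(ThresholdSegmenter, real, flip_synthesize)
pred = model.predict(mask + 0.1)
```

<p>Because only the training data changes, an existing pipeline keeps its model code as-is; that is the sense in which the method slots into current workflows.</p>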
<p>In an age of data scarcity and rising healthcare costs, that is a powerful proposition.</p>
<h2>Journal and Funding</h2>
<p><strong>Journal:</strong> Nature Communications<br />
<strong>DOI:</strong> <a href="https://doi.org/10.1038/s41467-025-61754-6">10.1038/s41467-025-61754-6</a><br />
<strong>Title:</strong> Generative AI enables medical image segmentation in ultra low-data regimes<br />
<strong>Authors:</strong> Li Zhang, Basu Jindal, Ahmed Alaa, Robert Weinreb, David Wilson, Eran Segal, James Zou, Pengtao Xie<br />
<strong>Published:</strong> July 14, 2025</p>