ScienceBlog.com
edge computing

(A) Schematic illustrating the implementation of the memristive network–based reservoir computing (RC) system for rover control through processing of time-sequential sensory signals. Voltage-based analog sensory signals carrying spatiotemporal information are input to the memristive reservoir. These inputs are differentiated and nonlinearly mapped to a high-dimensional data space according to their temporal context, and are quantitatively represented by the reservoir state vector X(t), constructed from voltage readings at multiple neuron terminals. The state vector is then multiplied by a pretrained weight matrix W(t) to produce the output signals Y(t) that control the test rover. (B to D) Training-data acquisition for emulating PID control of a robot rover performing target-tracking navigation: (B) snapshot from the training video, showing the PID-controlled rover chasing a moving red target (inset: a snapshot from the ESP32-based internet-of-things (IoT) camera on the rover); (C) example target-coordinate data plotted as a function of time; (D) example motor-signal data generated by a digital PID controller, plotted as a function of time.
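The pipeline in the caption — nonlinear reservoir states X(t), a pretrained readout W, outputs Y(t) trained to imitate PID motor commands — can be sketched in software. The snippet below is a minimal illustration only: it substitutes a simulated echo-state network for the physical memristive reservoir, and all dimensions, signals, and the ridge-regression training step are assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative stand-in for the memristive reservoir: a small simulated
# echo-state network. Names and sizes are hypothetical.
rng = np.random.default_rng(0)

N_NEURONS = 50      # reservoir state dimension (voltage readings X(t))
N_INPUTS = 2        # e.g., target (x, y) coordinates from the camera
N_OUTPUTS = 2       # e.g., motor drive signals Y(t)
T = 500             # time steps in the training sequence

W_in = rng.normal(scale=0.5, size=(N_NEURONS, N_INPUTS))
W_res = rng.normal(scale=1.0, size=(N_NEURONS, N_NEURONS))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))  # keep spectral radius < 1

def run_reservoir(u_seq):
    """Nonlinearly map the input sequence to reservoir states X(t)."""
    x = np.zeros(N_NEURONS)
    states = np.empty((len(u_seq), N_NEURONS))
    for t, u in enumerate(u_seq):
        x = np.tanh(W_in @ u + W_res @ x)  # tanh update retains temporal context
        states[t] = x
    return states

# Synthetic stand-ins for the training data in panels (C) and (D):
# target coordinates, and the motor commands a digital PID controller
# produced for them.
targets = np.column_stack([np.sin(np.linspace(0, 8, T)),
                           np.cos(np.linspace(0, 8, T))])
pid_motors = 0.7 * targets

X = run_reservoir(targets)

# Train the readout by ridge regression so that Y(t) = W @ X(t) matches
# the PID motor signals (the "pretrained weight matrix" of the caption).
lam = 1e-3
W = pid_motors.T @ X @ np.linalg.inv(X.T @ X + lam * np.eye(N_NEURONS))

Y = X @ W.T
print("training MSE:", np.mean((Y - pid_motors) ** 2))
```

Only the readout W is trained; the reservoir itself stays fixed, which is what makes the scheme attractive for low-power physical substrates such as memristive networks.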

Brain-like computer steers rolling robot with 0.25% of the power needed by conventional controllers

Researchers developed a chip-based quantum-dot laser that emulates a biological graded neuron while achieving a signal processing speed of 10 GBaud.

Lightning-Fast Artificial Neuron Matches Nature’s Design at Billion-Times Speed

A photograph of the artificial compound eye prototype developed at the University of Virginia School of Engineering and Applied Science by associate professor Kyusang Lee.

New Artificial Eyes Mimic Praying Mantis Vision for Improved Machine Perception

A machine-learning technique developed by researchers from MIT and elsewhere enables deep learning models, like those that underlie AI chatbots or smart keyboards, to efficiently and continuously learn from new user data directly on an edge device like a smartphone. Credit: MIT News
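One common pattern for on-device continual learning is to freeze a pretrained backbone and update only a small head with streaming user data. The sketch below illustrates that generic pattern with a linear softmax head and single-example SGD; it is a hypothetical illustration, not the specific MIT technique the article describes, and all names and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
D, C = 16, 3          # feature dimension and class count (illustrative)

W = np.zeros((C, D))  # small trainable head; a frozen backbone would
                      # supply the D-dimensional features

def softmax(z):
    z = z - z.max()   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def on_device_step(feat, label, lr=0.1):
    """One streaming SGD update on a single (feature, label) example."""
    global W
    p = softmax(W @ feat)
    p[label] -= 1.0                 # gradient of cross-entropy w.r.t. logits
    W -= lr * np.outer(p, feat)     # cheap update: O(C*D) memory and compute

# Simulate a stream of new user data arriving on the device: each class
# is a noisy cluster around a fixed center (stand-in for real features).
centers = rng.normal(size=(C, D))
correct = 0
N_STREAM = 2000
for _ in range(N_STREAM):
    y = int(rng.integers(C))
    feat = centers[y] + 0.3 * rng.normal(size=D)
    if (W @ feat).argmax() == y:    # predict first, then learn (online eval)
        correct += 1
    on_device_step(feat, y)

print("online accuracy:", correct / N_STREAM)
```

Updating only the head keeps the memory and compute footprint small enough for a phone-class device, which is the constraint that motivates techniques like the one in the article.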

Technique enables AI on edge devices to keep learning over time

The new brain-inspired platform technology is composed of a tangled network of silver-containing wires laid on a bed of electrodes.

Experimental brain-like computing system more accurate with custom algorithm

The Grainger College of Engineering at the University of Illinois Urbana-Champaign

OpenAI’s ChatGPT costs $100k per day to run; accelerators could help


© 2026 ScienceBlog.com