ScienceBlog.com

neural networks

The framework developed by the researchers accelerates training of a new, larger neural network model by using the weights in the neurons of an older, smaller model as building blocks. Their machine-learning approach learns to expand the width and depth of the larger model in a data-driven way. Credit: Image courtesy of the researchers, edited by MIT News

Learning to grow machine-learning models
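The reuse-the-small-model idea above can be illustrated with the classic fixed rule for function-preserving width growth (Net2Net-style neuron duplication). This is a hedged sketch of the general technique, not the MIT framework itself, which learns the expansion from data; all names and shapes here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # small model: 3 inputs -> 4 hidden units
W2 = rng.normal(size=(4, 2))   # 4 hidden units -> 2 outputs

def widen(W1, W2, unit):
    """Duplicate one hidden unit of the small model.

    Incoming weights are copied; the duplicated unit's outgoing weights
    are split in half, so the widened network computes the same function.
    """
    W1_new = np.hstack([W1, W1[:, unit:unit + 1]])        # copy incoming weights
    W2_new = np.vstack([W2, W2[unit:unit + 1, :] / 2.0])  # new unit gets half
    W2_new[unit, :] /= 2.0                                # original unit gets half
    return W1_new, W2_new

x = rng.normal(size=(1, 3))
y_small = np.maximum(x @ W1, 0) @ W2      # ReLU hidden layer

W1b, W2b = widen(W1, W2, unit=1)          # grow from 4 to 5 hidden units
y_big = np.maximum(x @ W1b, 0) @ W2b
assert np.allclose(y_small, y_big)        # growth preserved the function
```

Because the widened network starts out computing exactly what the small one did, training can continue from there instead of from random initialization, which is the source of the speedup.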

MIT researchers found that massive neural network models similar to large language models can contain smaller linear models inside their hidden layers, which the large models can train to complete a new task using simple learning algorithms. Credit: Image: Jose-Luis Olivares, MIT

How language models like ChatGPT learn new tasks from just a few examples
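The "simple learning algorithm" hypothesized above can be run explicitly to see what such an implicit inner model would do. This is a hedged illustration, entirely outside any transformer: plain gradient descent fitting a small linear model to a handful of in-context examples, with all data and parameters invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 2))      # 8 in-context (input, label) example pairs
true_w = np.array([2.0, -1.0])   # the task the examples demonstrate
y = X @ true_w

w = np.zeros(2)                  # the implicit linear model's weights
for _ in range(2000):            # simple inner algorithm: gradient descent
    grad = X.T @ (X @ w - y) / len(y)
    w -= 0.1 * grad

x_query = rng.normal(size=2)     # a new query, as in few-shot prompting
prediction = x_query @ w         # w has converged toward true_w
```

The paper's claim is that a large model's forward pass can carry out an update like this inside its activations, which is one way a frozen model could "learn" a new task from only the examples in its prompt.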

MIT researchers have developed a technique that greatly reduces the error in an optical neural network, which uses light to process data instead of electrical signals. With their technique, the larger an optical neural network becomes, the lower the error in its computations. This could enable researchers to scale these devices up to sizes large enough for commercial use.

Breaking the scaling limits of analog computing

Artificial neural networks are computing systems inspired by the biological neural networks that constitute animal brains. Like their biological counterparts, they can learn (be trained) by processing examples and forming probability associations, then apply that information to other tasks.

Artificial neural networks need sleep too
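The learn-from-examples idea in the teaser above can be shown with the smallest possible case: a single artificial neuron trained with the classic perceptron rule to reproduce logical OR. This is a generic toy illustration of training by example, not the sleep-replay method the article describes.

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
targets = np.array([0, 1, 1, 1])          # OR truth table

w = np.zeros(2)
b = 0.0
for _ in range(10):                       # repeatedly process the examples
    for x, t in zip(X, targets):
        pred = int(x @ w + b > 0)
        w += (t - pred) * x               # perceptron update rule
        b += (t - pred)

preds = [int(x @ w + b > 0) for x in X]   # the trained neuron reproduces OR
```

Each pass nudges the weights only when the neuron's answer disagrees with the example's label; after a few passes the errors stop, which is the "forming associations from examples" described above in miniature.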


© 2026 ScienceBlog.com