Flying Squid@lemmy.world to Technology@lemmy.world · English · 1 year ago
Microsoft Needs So Much Power to Train AI That It's Considering Small Nuclear Reactors (futurism.com)
418 comments · cross-posted to: tech@kbin.social
FooBarrington@lemmy.world · English · 1 year ago
Sure, but that's not done with the kind of model this thread is about (separate training and inference). You're talking about classical ML models with continuous updates, which you wouldn't run on this kind of GPU infrastructure.
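The distinction drawn above — a model trained once on heavy GPU infrastructure and then frozen for inference, versus a classical model updated continuously as new data arrives — can be illustrated with a toy, stdlib-only Python sketch. The class names and the running-mean "model" are purely hypothetical, chosen for brevity; they stand in for real batch-trained networks and online learners respectively.

```python
class OnlineMeanPredictor:
    """A trivially simple 'classical ML' stand-in: predicts the running mean
    of targets seen so far, updating on every new observation. Updates are
    cheap enough to run at serve time; no separate training phase needed."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def update(self, y):
        # Incremental mean update: mean += (y - mean) / n
        self.n += 1
        self.mean += (y - self.mean) / self.n

    def predict(self):
        return self.mean


class FrozenPredictor:
    """Train-once-then-serve stand-in: parameters are fixed after 'training',
    the pattern used for the large models this thread is about."""

    def __init__(self, training_targets):
        # All learning happens here, up front.
        self.mean = sum(training_targets) / len(training_targets)

    def predict(self):
        return self.mean


online = OnlineMeanPredictor()
for y in [1.0, 2.0, 3.0]:
    online.update(y)

frozen = FrozenPredictor([1.0, 2.0, 3.0])

# Both predict 2.0 at this point, but only `online` keeps learning:
online.update(10.0)
print(online.predict())  # drifts toward the new data
print(frozen.predict())  # stays at 2.0 until retrained from scratch
```

The point of the toy: the frozen model must be retrained wholesale to incorporate new data (the expensive, power-hungry step), while the online learner folds each observation in as it arrives.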