The "one more thing" introduced by Apple at its Worldwide Developers Conference (WWDC) this year was the industry's worst-kept secret. The Apple Vision Pro, the tech giant's gamble on making mixed-reality headsets a thing, has received a mixed reception. Much of the concern has centered on the eye-watering $3,499 price.
But there's a bigger problem: whether there will be enough apps available to make the device worth the cost. It's a real challenge to redesign apps for an entirely new interface, and developers are worried. Read the full story.
To avoid AI doom, learn from nuclear safety
For the past few weeks, the AI discourse has been dominated by those who think we could develop an artificial-intelligence system that may one day become so powerful it will wipe out humanity.
So how do companies themselves propose we avoid AI harm? One proposed solution comes from a new paper by DeepMind et al. suggesting that AI developers should evaluate a model's potential to cause "extreme" risks before even starting any training.
The approach could help developers decide whether it's too risky to proceed. But it might be even more useful for the AI sector to draw lessons from a field that knows a thing or two about very real existential threats: safety research and risk mitigation around nuclear weapons.
Melissa's story is from The Algorithm, her weekly newsletter giving you the inside track on all things AI. Sign up to receive it in your inbox every Monday.