Astronomers have used 15 years of X-ray data to make this video of the remnants of a white dwarf that famously exploded in 1572. Nearly 450 years later, the debris from the explosion is still expanding.
It sounds like Microsoft’s Tay chatbot is getting a time-out while Microsoft teaches her how to talk with strangers on the Internet. As the company quickly learned, the citizens of the Internet can’t be trusted with that task.
In a statement released Thursday, Microsoft said that a “coordinated effort” by Internet users had turned the Tay chatbot into a tool of “abuse,” a clear reference to the series of racist and otherwise abusive tweets Tay issued within a day of debuting on Twitter. On Wednesday morning, Tay was a novel experiment in AI that would learn natural language through social engagement; by Wednesday evening, she was reflecting the more unsavory aspects of life online.