Startup Skyward Wildfire says it can prevent catastrophic fires by stopping the lightning strikes that ignite them. So far, it hasn’t publicly revealed how it does so, but online documents suggest the company is relying on an approach the US government began evaluating in the early 1960s: seeding clouds with metallic chaff, or thin fiberglass strands coated with aluminum.
It just raised millions of dollars to accelerate its product development and expand its operations. But researchers and environmental observers say uncertainties remain, including how well the seeding might work under varying conditions, how much material would need to be released, how frequently it would need to be done, and what kinds of secondary environmental impacts might result. Read the full story.
—James Temple
OpenAI’s “compromise” with the Pentagon is what Anthropic feared
OpenAI has reached a deal that will allow the US military to use its technologies in classified settings. CEO Sam Altman said the negotiations, which the company began pursuing only after the Pentagon’s public reprimand of Anthropic, were “definitely rushed.”
OpenAI has taken great pains to say that it has not caved by allowing the Pentagon to do whatever it wants with its technology. The company published a blog post explaining that its agreement protects against use for autonomous weapons and mass domestic surveillance, and Altman said the company did not simply accept the same terms that Anthropic refused.
But it’s not yet clear whether OpenAI can build in the safety precautions it promises as the military rushes out a politicized AI strategy during strikes on Iran, or whether the deal will be seen as sufficient by employees who wanted the company to take a harder line. Walking that tightrope will be tricky. Read the full story.
—James O’Donnell
