Big AI Funding and OpenAI’s GPT Next


Nextbigfuture at substack has all of the big AI news for the week.

There was big news this week in AI, AI startups, AI research and AI funding.

1. OpenAI’s Japan CEO reveals GPT-Next will be released this year, and its effective computational load is 100x greater than GPT-4

2. Two $125 Billion AI Training Centers, Starting at 0.5-1 GW and Scaling to 5-10GW

3. xAI 100k H100 AI Data Center

4. AI Research

5. Major recent AI funding announcements including Safe Superintelligence

1. OpenAI GPT Next and Orion in 2025

GPT-Next will be released this year, and its effective computational load will be 100 times that of GPT-4.

GPT-Next, which will be released this year, is expected to be trained with a miniature version of Strawberry (for better reasoning), using roughly the same raw computational resources as GPT-4 but carrying an effective computational load 100 times greater.

The AI model called 'GPT Next' that will be released in the future is expected to improve nearly 100-fold over past models. Unlike traditional software, AI technology grows exponentially.

This 100-fold increase does not refer to scaled-up computing resources, but to effective compute: a gain of +2 orders of magnitude (OOMs) that includes improvements to the architecture and training efficiency.
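As a quick sanity check on the 100x / +2 OOMs framing, here is a minimal Python sketch that converts a compute multiplier into orders of magnitude. The split between raw compute and efficiency gains below is an illustrative assumption, not an OpenAI-provided breakdown.

```python
import math

def orders_of_magnitude(multiplier: float) -> float:
    """Convert a compute multiplier into orders of magnitude (OOMs)."""
    return math.log10(multiplier)

# GPT-Next's claimed effective compute: ~100x GPT-4 on roughly the same raw hardware.
# The split below is an illustrative assumption, not an OpenAI-provided breakdown.
raw_compute_gain = 1      # roughly the same computational resources as GPT-4
efficiency_gain = 100     # assumed combined gain from architecture + training efficiency
effective_gain = raw_compute_gain * efficiency_gain

print(orders_of_magnitude(effective_gain))  # 2.0 -> the "+2 OOMs" figure
```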

Orion, which has been in the spotlight recently, was trained for several months on the equivalent of 10,000 H100s, roughly 10 times GPT-4's computational resource scale. That adds another order of magnitude on top of the efficiency gains, making it about +3 OOMs of effective compute, and it is expected to be released sometime next year.
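The same arithmetic applied to Orion, under the assumption that the +3 OOMs figure stacks the roughly 10x raw scale-up on top of the ~100x efficiency gain described above:

```python
import math

# Per the report, Orion trained on the equivalent of 10k H100s for several months,
# about 10x GPT-4's raw computational scale. The assumption here is that the
# "+3 OOMs" figure stacks that 10x raw scale-up on the ~100x efficiency gain.
orion_raw_gain = 10        # ~10x GPT-4's computational resources
efficiency_gain = 100      # same assumed efficiency factor as above
orion_effective_gain = orion_raw_gain * efficiency_gain   # 1000x

print(math.log10(orion_effective_gain))  # 3.0 -> the "+3 OOMs" figure
```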

Two $125 billion AI Training Data Centers

Two companies are looking to develop artificial intelligence (AI) data centers in North Dakota. Commissioner of Commerce Josh Teigen revealed during a Public Service Commission meeting in August that two companies had approached him and state Governor Doug Burgum about developing AI data centers.

The data center projects would start at between 500 MW and 1 GW, but could eventually scale up to 5-10 GW facilities. Each project would cost up to $125 billion. That scale should enable roughly 10,000 times the compute used for GPT-4, which would support GPT-6-class LLMs (large language models).
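A rough back-of-the-envelope sketch of those scaling claims, assuming the 10,000x figure is measured against GPT-4's training compute and that the power scale-up tracks the stated 0.5-1 GW to 5-10 GW range:

```python
import math

# Back-of-the-envelope numbers for the proposed North Dakota facilities.
# Power and cost ranges are from the report; the mapping of 10,000x compute
# to "GPT-6 class" is the article's claim, restated here for illustration.
start_power_gw = (0.5, 1.0)     # initial capacity range
final_power_gw = (5.0, 10.0)    # eventual capacity range
power_scaleup = final_power_gw[1] / start_power_gw[1]   # ~10x growth in power

compute_vs_gpt4 = 10_000        # claimed compute multiple over GPT-4's training run
print(power_scaleup)                  # 10.0
print(math.log10(compute_vs_gpt4))    # 4.0 -> +4 OOMs of compute
```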

See the substack article for more.

Source Link: https://www.nextbigfuture.com/2024/09/big-ai-funding-and-openais-gpt-next.html
