"On June 20th, George Hotz, founder of the self-driving startup Comma.ai, leaked that GPT-4 isn't a single monolithic dense model (like GPT-3 and GPT-3.5) but a mixture of 8 x 220-billion-parameter models. Later that day, Soumith Chintala, co-founder of PyTorch at Meta, corroborated the leak. Just the day before, Mikhail Parakhin, Microsoft Bing AI lead, had also hinted at this."

"GPT-4 is not one big >1T model but eight smaller ones cleverly put together. The mixture of experts paradigm OpenAI supposedly used for this 'hydra' model is neither new nor invented by them."
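The mixture-of-experts idea mentioned above can be sketched in a few lines: a gating network scores each expert for a given input, only the top-k experts are actually run, and their outputs are combined by the gate weights. This is a toy illustration of the general paradigm, not OpenAI's actual design; all sizes and names here are assumptions.

```python
# Minimal mixture-of-experts sketch. Everything here is illustrative:
# experts are tiny linear layers, not 220B-parameter transformers.
import numpy as np

rng = np.random.default_rng(0)

N_EXPERTS = 8   # the leak claims GPT-4 combines 8 expert models
D_MODEL = 16    # toy hidden size (assumption)
TOP_K = 2       # experts activated per token (assumption)

# Each "expert" is just a small linear layer in this sketch.
expert_weights = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(N_EXPERTS)]
gate_weights = rng.standard_normal((D_MODEL, N_EXPERTS))

def moe_forward(x):
    """Route one token vector x through the top-k experts."""
    logits = x @ gate_weights            # gate scores each expert
    top = np.argsort(logits)[-TOP_K:]    # keep only the k best experts
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                 # softmax over the chosen experts
    # Combine only the selected experts' outputs, weighted by the gate.
    return sum(p * (x @ expert_weights[i]) for p, i in zip(probs, top))

token = rng.standard_normal(D_MODEL)
out = moe_forward(token)
print(out.shape)  # (16,)
```

The key point is sparsity: although the total parameter count spans all eight experts, each token only pays the compute cost of the few experts the gate selects.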

GPT-4's secret has been revealed

#solidstatelife #ai #openai #chatgpt #gpt4