Data Scientists Cite Lack of GPT-4 Details
Several data scientists have pointed out the lack of technical details that came with OpenAI’s recent release of GPT-4.
The company’s GPT-3 large language model (LLM) system, along with its GPT-3.5 variant, forms the basis of AI assistants such as ChatGPT and the Copilot tool on Microsoft’s GitHub, billed as an “AI pair programmer.” However, while the latest GPT-4 system was released on March 14, its technical details, unlike those of previous systems, have not been made public.
One data scientist contacted Virtualization & Cloud Review to share thoughts arising from discussions with colleagues, but chose to remain anonymous.
They speculated that OpenAI is moving from a primarily research-driven focus to a primarily business- and revenue-driven one, and that the lack of transparency around technical details is meant to prevent competitors from leapfrogging OpenAI.
When GPT-3.5 was released just a few months ago, it was essentially the only game in town, but with billions of dollars at stake, there is now no shortage of LLM competitors, from Google DeepMind’s Flamingo to the open-source BLOOM project.
One such technical detail, provided for previous versions but not for GPT-4, is the number of parameters used in the LLM.
For example, the GPT-3.5 model is known to have about 175 billion parameters. These parameters, technically called neural weights, are single numbers such as -2.345, and collectively they determine the function and behavior of the model. Generally, the more parameters a model has, the more powerful it is. A loose analogy is engine horsepower in the early days of aviation: more horsepower translated into a significantly better aircraft.
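To make that concrete, here is a minimal sketch (using PyTorch purely for illustration; the layer sizes are invented and have nothing to do with OpenAI’s actual architecture) showing that a parameter count is just a tally of a model’s individual weights:

```python
# Minimal sketch: a model's "parameters" are its individual weights,
# and the headline parameter count is simply how many of them exist.
# The toy layer sizes below are arbitrary, chosen only for illustration.
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(768, 3072),  # 768*3072 weights + 3072 biases
    nn.ReLU(),
    nn.Linear(3072, 768),  # 3072*768 weights + 768 biases
)

total = sum(p.numel() for p in model.parameters())
print(f"{total:,} parameters")  # 4,722,432 here
```

Each of those roughly 4.7 million numbers is a weight like the -2.345 mentioned above; GPT-3.5, at about 175 billion parameters, simply has tens of thousands of times as many.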
OpenAI published quite a bit of technical information about GPT-4’s predecessors, including the number of parameters, the data used to train the models (such as text from Wikipedia), and architectural changes.
The number of parameters used in GPT-4 has not been disclosed, but a Dec. 26, 2022, article on the UX Planet site noted the trend: “Since 2018, when GPT-1 was released, OpenAI has steadily scaled up its models. GPT-1 had 117 million parameters, GPT-2 had 1.2 billion parameters, and GPT-3 raised that even higher to 175 billion parameters. It means that the GPT-3 model has 100 times more parameters than GPT-2. GPT-3 is a very large model with 175 billion parameters.”
The article also cited a Wired piece quoting Andrew Feldman, founder and CEO of Cerebras, a company that partners with OpenAI to train GPT models, as saying, based on his conversations with OpenAI, that GPT-4 would have about 100 trillion parameters (that article was published in August 2021).
We asked Microsoft’s “new Bing” search engine, which is powered by GPT-4, why OpenAI hasn’t published the technical details of its latest product as it did for its predecessors. Referencing the UX Planet article and another article on a site that requires registration, it replied: “OpenAI has not yet published much information about GPT-4, including the number of parameters used. According to sources, OpenAI has not released much information about the model. While some information has been released about GPT-4, including what it does and how it can be used, the details of its specifications are still unknown.”
A December 2022 article in The Atlantic claimed that “Money Will Kill ChatGPT’s Magic.”
“How will the use of these tools change once they become profit-producing tools rather than loss-leading ones?” the article said. “Will they become paid subscription products? Will they run ads? Will they power new companies that cost less than incumbent industries?”
The first question has already been answered: GPT-4 is available only to users of OpenAI’s ChatGPT Plus service ($20 per month) and to developers who have applied to a waitlist and been granted access to use the technology in their products. Developers are charged for the tokens used in requests to the model; pricing details are available here.
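For developers who do get access, that token-based billing is visible in every API response. Here is a minimal sketch (using the openai Python package as it existed when GPT-4 launched; the API key and prompt are placeholders) of where the billable token counts show up:

```python
# Minimal sketch of a GPT-4 API call with the openai Python package
# (pre-1.0 interface). The key and prompt below are placeholders,
# not working values.
import openai

openai.api_key = "sk-..."  # placeholder; requires granted API access

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize what an LLM is."}],
)

print(response["choices"][0]["message"]["content"])
# Billing is based on the token counts reported with each response:
print(response["usage"])  # prompt_tokens, completion_tokens, total_tokens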
With OpenAI clearly looking to turn a substantial profit on GPT-4 and its other technologies, the details behind GPT-4 may never be disclosed.
ChatGPT owner OpenAI has projected $1 billion in revenue by 2024, according to a Reuters report last December.
Additionally, Microsoft has invested $10 billion in OpenAI, according to Forbes and other sources.
That and other Microsoft investments seem to have given the new Bing experience first dibs on GPT-4, while the general public gets left behind when it comes to using it directly for free, even on a very limited basis, presumably because free access would cut into that predicted $1 billion in revenue.
One data scientist told Virtualization & Cloud Review: “It is understandable that OpenAI chose not to publish a lot of technical information about GPT-4. The stakes are hard to imagine in terms of money and impact on people’s lives. The race to develop language models could become one of the biggest events in the history of technology. It’s like the free internet.”
About the author
David Ramel is an editor and writer at Converge360.