
The Future of AI

OpenAI’s GPT-3 – the future of writing, journalism, and coding is called AI. Via medialist.info

Last month, Liam Porr, a University of California, Berkeley student, published a blog post offering advice for more productive work: “Feeling unproductive? Maybe you should stop overthinking.” The post was upvoted nearly 200 times and drew 71 responses from Hacker News’ tech- and VC-focused commenters. What most readers didn’t realize was that it had been written by artificial intelligence. Porr created multiple posts over the past month using GPT-3, a language-generating AI tool developed by OpenAI, a non-profit artificial intelligence research company that has received more than $1 billion in investments from Peter Thiel, Elon Musk, Reid Hoffman, Marc Benioff, Sam Altman, and others.

GPT-3 is the third generation of OpenAI’s language prediction model. It is still in beta testing and available only to selected users; OpenAI wants outside developers to help it explore what GPT-3 can do, but it plans to turn the tool into a commercial product later this year. GPT-3 is the most powerful language model built to date: it has 175 billion parameters (the AI analogue of synapses) and cost an estimated 4.6 million dollars to train. It has been used to write short stories, songs, press releases, technical manuals, and much more. One especially intriguing capability is mimicking specific writers and their styles. Mario Klingemann, an artist who works with machine learning, shared a short story called “The Importance of Being on Twitter” written in the style of Jerome K. Jerome; all he provided the AI was a title, the author’s name, and the initial word “It.” Liam Porr followed the same pattern, publishing a blog post after feeding the AI nothing but a headline. The model takes a user’s prompt and uses ‘next word prediction’ to write blogs, stories, and more related to it: at each step, it predicts the most likely next word given everything written so far. GPT-3 was trained on a huge corpus of internet text, including millions of articles, to refine this next-word prediction.
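To make the ‘next word prediction’ idea concrete, here is a minimal sketch of autoregressive text generation. GPT-3 itself is only reachable through OpenAI’s gated beta API, so the sketch substitutes the openly available GPT-2 model via Hugging Face’s transformers library (an assumption for illustration, not how Porr or Klingemann actually ran their experiments). The prompt mirrors Klingemann’s seeding trick: a title, an author’s name, and the word “It.”

```python
# Minimal sketch of autoregressive "next word prediction" text generation.
# GPT-3 is gated behind OpenAI's API, so the freely available GPT-2 model
# from Hugging Face's transformers library stands in for it here.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Seed the model the way Klingemann seeded GPT-3:
# a title, an author's name, and the initial word "It".
prompt = "The Importance of Being on Twitter\nby Jerome K. Jerome\n\nIt"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# The model generates one token at a time: at each step it predicts a
# probability distribution over the next token and samples from it.
output = model.generate(
    input_ids,
    max_length=100,        # total length, including the prompt
    do_sample=True,        # sample instead of always picking the top word
    top_k=50,              # restrict sampling to the 50 most likely tokens
    temperature=0.9,       # soften the distribution for more varied prose
    pad_token_id=tokenizer.eos_token_id,
)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Run repeatedly, the same prompt yields different continuations, because each next word is sampled from the model’s predicted distribution rather than chosen deterministically.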

We’ve always believed that no matter how advanced AI becomes, it will never match the creativity of a human brain. GPT-3 has 175 billion parameters, whereas the human brain has roughly 100 trillion synapses, the structures that transfer electrical activity (information) from one cell to another. It would take OpenAI several years, and an enormous budget, to build a model with 100 trillion parameters. Although AI equalling the human brain still seems far-fetched, GPT-3 is a prime example of how much the technology has developed over time. Writing code and designing websites from just a couple of human prompts shows how far AI can still be improved.
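As a back-of-the-envelope comparison (assuming the parameter/synapse analogy holds at all), the gap works out to roughly 570×:

```python
# Rough scale comparison between GPT-3's parameter count and the
# human brain's commonly cited synapse estimate.
gpt3_parameters = 175e9   # 175 billion parameters
brain_synapses = 100e12   # ~100 trillion synapses

print(f"Brain-to-GPT-3 ratio: {brain_synapses / gpt3_parameters:.0f}x")
# -> Brain-to-GPT-3 ratio: 571x
```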

