How far has AI come?
In the movie The Lawnmower Man, a mad scientist performs experiments on the mind of a gardener, Jobe Smith. After a series of drugs and deep dives into virtual reality, Jobe, by then a much smarter version of himself, says: “I realized that nothing we have been doing is new. We have not been tapping into new brain areas - we have just been awakening the most ancient.”
There is no ancient reptilian brain for artificial intelligence to awaken. Instead, AI springs from what we can already achieve with the wet mass encased in our skulls, standing on those shoulders of giants to take its next step. Every AI-based improvement is thus the combined product of human neurons working to produce a better silicon-embedded “thinking” chip. How far will that adventure take us, and how far has it already come?
AI is a looming competition for quality writers
An artificial intelligence research laboratory called OpenAI has been building an entity with the enigmatic name GPT-3. This language model is a neural network with some 175 billion parameters, trained on a vast swath of the internet converted into one base text database. Its creators raised it with the idea that a machine writer might finally rival established marvels of literature like the immortal War and Peace. Although some experts say that GPT-3 is far from being the new Stephen King, one daring author who tested it says it is not too far from being just as great.
If artificial intelligence dominates human intelligence through sheer processing power, naysayers will point out how closely this progression resembles the good old Moore’s Law: computer engineers packing ever more transistors into the same confined chip space and calling it ‘progress.’ To finally break away from this pattern of sheer numbers, we need new technology - not just anything new, but breakthrough stuff. We need the power of the warp engines from Star Trek, which push a ship faster than light itself.
Although GPT-3’s achievements are far from what the Starship Enterprise, for example, can do, it has just peeked through the delicate boundary between science fiction and reality. The natural language model can continue a science fiction novel started by a human with an alarming success rate. The novelist Erik Hoel honestly admits that part of what GPT-3 produced after being fed a paragraph of his own novel is, in his own words, “definitely something I’d write.”
GPT-3’s services are still unavailable to the general public, and even if they were, the price of experimenting with the AI’s intricate capabilities remains beyond reasonable. Soon, however, we might find it hard to distinguish renowned blog authors from robot writers, and that moment may come sooner than even sceptics expect. Imagine the upheaval in the SEO business alone. A fully capable AI writer producing quality content will summon the modern equivalent of the machine-breaking textile workers of the 19th century. Those workers, the Luddites, were the first to show how replaceable humans are - and that was pure automation.
Certain jobs are reserved for humans, or so we like to think. The boundaries of that ‘safe zone’ shrink by the hour. The notion that there will always be a job waiting for us that robots cannot do is far from bulletproof.
AI is a standalone artist
How much of AI’s work can be attributed to it specifically? It turns out that even GPT-3 is not very capable without the necessary human input, and it collapses into nonsense when its own output is looped back into it.
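That feedback-loop failure can be illustrated with a toy stand-in: a tiny bigram text generator retrained on its own samples. This is a minimal sketch with an invented corpus, not GPT-3 itself - real models degrade for subtler statistical reasons - but the shape of the loop is the same.

```python
import random

def train_bigrams(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    table = {}
    for a, b in zip(words, words[1:]):
        table.setdefault(a, []).append(b)
    return table

def generate(table, start, length, rng):
    """Sample a word chain by repeatedly picking a random successor."""
    out = [start]
    for _ in range(length - 1):
        successors = table.get(out[-1])
        if not successors:
            break  # dead end: the chain cannot continue
        out.append(rng.choice(successors))
    return " ".join(out)

rng = random.Random(0)
corpus = ("the model writes text and the text feeds the model "
          "and the loop repeats until the text turns to noise")

# Feed each generation back in as the next round's training data.
sample = corpus
for round_no in range(1, 6):
    table = train_bigrams(sample)
    sample = generate(table, "the", 20, rng)
    print(round_no, "->", sample)
```

Each round can only redistribute words it has already seen, so the vocabulary never grows and the output tends to collapse into repetitive loops - a crude analogue of a language model crumbling into nonsense on its own output.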
The elaborate AI that analyzed Nirvana’s discography and produced a new song that sounds ‘just like the band’ is the result of the collective effort of dozens of experts – not to mention the human singer who contributed the entirely non-AI vocals for the song.
More examples suggest that AI is still a human’s right hand, but without the “body,” a hand is as useless as an unplugged TV. AI still needs the nurturing nudge of humans and is far from a human-independent entity, even at its best. On the contrary – with the symbiotic power humans provide, it can outperform any human or machine taken individually.
With all this emphasis on the power cord and the precious input that every AI craves, we need to understand something else. Even the best humans cannot do what they are acclaimed for without support, whether from other humans or from trusty man-made machines. So, when we suggest that the algorithm Botto could not have sold NFTs for a million were it not for the developers who created it, we are splitting hairs. We know that the automated textile equipment smashed by the Luddites did not just bolt off the ground and rise up on its own, Maximum Overdrive (1986) style.
The Luddites’ workshops closed their doors en masse because of a web of socio-political factors driven by technological advancement. Blaming new tech alone when machines take our jobs is like a modern curse uttered by the dumbfounded.
We must acknowledge that the negative side effects of new technology mostly hit younger generations. Statistics show that boomers are unlikely to fall for crypto-related frauds, though the reasons go beyond aging parents’ simple mistrust of innovation.
AI advancements in the military and private sectors
During the two world wars, most battles were fought in the trenches. Modern warfare, in contrast, has elevated the role of machines to a whole new level. War is hell, yet artificial intelligence in the military can have a surprisingly benign effect. Precision AI-guided strikes help reduce collateral damage to civilians, whose suffering is an omnipresent danger in most military operations. With AI-powered technology, soldiers are less exposed, and machines step in to take the blow and reduce the risk of harming humans.
People matter, and if AI alleviates the burden on humans in ordinary industries, it saves lives as a military tool. While past military action relied on grizzled soldiers willing to risk life and limb, modern tech allows operators to tackle missions remotely and thus minimize the risk of people being unnecessarily exposed to danger.
Again, just as in the previous examples in this article, the human factor comes forth as a necessary security layer. AI-powered drones are quick and precise, but only in the hands of skilled individuals. Therefore, military officers like the US Department of Defense chief of staff, Heather Durgin, are extremely careful when recruiting new drone operators. Additionally, all these AI-driven appliances and applications in the army open hundreds of thousands of job opportunities in the cybersecurity sector, and the private sector does not lag far behind in demand in that same area.
AI is projected to reach a bit over $10 billion in value by 2030 in the US alone. That astounding number reflects the considerable investment needed to develop and manufacture the machines, but a huge chunk also goes toward training and perfecting the skills of the personnel dedicated to their usage and maintenance. Eventually, cyberwarfare and regular warfare may merge into one indistinguishable whole.
With artificial intelligence as a private tutor, or as an incredible organizer of what humans would label an enormous pile of gibberish, we have only scratched the surface of its possibilities. More AI-powered wonders are due soon, and although training AI to do a human’s job is costly and time-consuming, most examples show that the goal justifies the means. With the value and volume of Big Data growing to unimaginable heights, it is only a matter of time before AI alone can make sense of it all. Humans will step back and maintain, while machines rush in to sustain the digital fortresses we build for ourselves.
This article alone would have been hard to produce without the helping hand of language-processing software that cuts editing time. One day, artificial intelligence might ‘grow’ the emotions that still mark the contrast between them and us. Although we all know it does not even need a kind word to do an excellent job, I say this: thank you, AI, for everything you do for us.