On a rainy Tuesday in San Francisco, Apple executives took the stage before a packed auditorium to unveil the fifth-generation iPhone. The phone, which looked identical to the previous model, had a new feature that the public was quick to comment on: Siri, a virtual assistant.
Scott Forstall, then Apple’s head of software, pressed a button on the iPhone to summon Siri and asked it questions. At his request, Siri checked the time in Paris (“20:16,” Siri replied), defined the word “mitosis” (“Cell division in which the nucleus divides into nuclei containing the same number of chromosomes,” it said) and produced a list of 14 highly rated Greek restaurants, five of them in Palo Alto, California.
“I’ve been in the field of artificial intelligence for a long time and it continues to amaze me,” Forstall said.
That was 12 years ago. Since then, Siri and competing AI-based assistants such as Amazon’s Alexa and Google Assistant have largely stagnated, and talking assistants have become the butt of jokes, including in a 2018 “Saturday Night Live” sketch featuring a smart speaker for seniors.
The tech world is now excited about a different kind of virtual assistant: chatbots.
These AI-powered bots, like ChatGPT and the new ChatGPT Plus from San Francisco-based OpenAI, can quickly improvise answers to questions typed into a chat window. People have been using ChatGPT to handle complex tasks like writing software, drafting business proposals and writing fiction.
And ChatGPT, which uses artificial intelligence to guess which word comes next, is rapidly improving. A few months ago, it could not write a haiku; now it does so with gusto. On Tuesday, OpenAI unveiled its next-generation AI engine, GPT-4, which underpins the latest version of ChatGPT.
The triumph of ChatGPT, the decline of the assistants
The enthusiasm for chatbots illustrates how Siri, Alexa and other voice assistants, which once generated similar excitement, have squandered their lead in the AI race.
Over the past decade, the products have encountered roadblocks. Siri ran into technological difficulties, including clunky code that took weeks to update with basic functionality, according to John Burkey, a former Apple engineer who worked on the assistant.
Amazon and Google miscalculated how voice assistants would be used, leading them to invest in areas where the technology rarely paid off, former employees said. When those experiments failed, enthusiasm for the technology waned within the companies, they said.
Voice assistants are “dumb as a rock,” Microsoft chief executive Satya Nadella said in an interview with the Financial Times this month, declaring that newer artificial intelligence would lead the way. Microsoft has worked closely with OpenAI, investing $13 billion in the start-up and incorporating its technology into the Bing search engine, as well as other products.
Google and Apple: What they will do with their assistants
Apple declined to comment on Siri. Google said it is committed to offering a great virtual assistant to help people on their phones and inside their homes and cars; the company is separately testing a chatbot called Bard. Amazon said it had seen a 30 percent increase in worldwide customer engagement with Alexa over the past year and was optimistic about its mission to build world-class AI.
Assistants and chatbots are based on different kinds of artificial intelligence. Chatbots are powered by what are known as large language models, which are systems trained to recognize and generate text based on enormous datasets pulled from the web. They can then predict, word by word, how to complete a sentence.
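To make the idea concrete, here is a toy sketch of next-word prediction. It is not how any real large language model works — production systems use neural networks trained on web-scale text — but it shows the same principle on a tiny scale: count which word tends to follow which, then complete a sentence one most-likely word at a time.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus: list[str]) -> dict[str, Counter]:
    """Count, for each word, which words follow it in the corpus."""
    counts: dict[str, Counter] = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def complete(model: dict[str, Counter], start: str, length: int = 3) -> str:
    """Extend a prompt by repeatedly appending the most likely next word."""
    words = start.lower().split()
    for _ in range(length):
        followers = model.get(words[-1])
        if not followers:
            break  # never saw this word; nothing to predict
        words.append(followers.most_common(1)[0][0])
    return " ".join(words)
```

For example, a model trained on the single sentence “the cat sat on the mat” will complete the prompt “the cat” with “sat on the”, because those are the words that followed in its training data.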
By contrast, Siri, Alexa and Google Assistant are essentially what are known as command-and-control systems. They can understand a finite list of questions and requests, such as “What is the weather like in New York?” or “Turn on the bedroom lights.” If a user asks the virtual assistant to do something that isn’t in its code, the bot simply says it can’t help.
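The command-and-control design can be sketched in a few lines. This is an illustrative toy, not Apple’s, Amazon’s or Google’s actual code, and the canned responses are made up — but it captures the limitation the article describes: a fixed table of recognized requests, and a shrug for everything else.

```python
# Toy command-and-control assistant: a finite table of recognized
# requests mapped to canned handlers (all responses here are invented).

def command_assistant(request: str) -> str:
    """Answer a known request exactly; fail on anything unrecognized."""
    handlers = {
        "what is the weather like in new york?":
            lambda: "It is 55 degrees and cloudy in New York.",
        "turn on the bedroom lights":
            lambda: "Okay, bedroom lights are on.",
    }
    handler = handlers.get(request.strip().lower())
    # No generative fallback: unknown requests hit a dead end.
    return handler() if handler else "Sorry, I can't help with that."
```

Asking it to turn on the lights works; asking it to write a haiku, as one can with ChatGPT, dead-ends at “Sorry, I can’t help with that.”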
Siri also had a cumbersome design that made it time-consuming to add new features, said Burkey, who was tasked with improving Siri in 2014. Siri’s database contains a giant list of words, including the names of musical artists and places like restaurants, in nearly two dozen languages.
That made Siri “one big snowball,” he said. If someone wanted to add a word to Siri’s database, he added, “it’s going to end up in one big pile.”
So seemingly simple updates, like adding a few new phrases to the dataset, would require rebuilding the entire database, which could take up to six weeks, Burkey said. Adding more complex features, like new search tools, could take nearly a year. That meant there was no path for Siri to become a creative assistant like ChatGPT, he said.
Alexa and Google Assistant are based on technology similar to Siri’s, but the companies have struggled to generate significant revenue from the assistants, former Amazon and Google executives said. (Apple, by contrast, has successfully used Siri to lure shoppers to its iPhones.)
Amazon Alexa
After Amazon launched Echo, an Alexa-powered smart speaker, in 2014, the company hoped the product would help it increase sales at its online store by allowing consumers to talk to Alexa to place orders, said a former Amazon leader involved with Alexa.
But while people enjoyed playing with Alexa’s ability to respond to weather requests and set alarms, few asked Alexa to order items, the person added.
Amazon may have invested too much in creating new types of hardware, such as since-discontinued alarm clocks and microwaves that worked with Alexa, which sold at cost or less, the former executive said.
The company also invested too little in creating an ecosystem where people could easily extend Alexa’s capabilities, in the way Apple had done with its App Store, which helped fuel interest in the iPhone, the person said.
Although Amazon offered a store of “skills” that let Alexa control third-party accessories like light switches, people had a hard time finding and setting up those skills, unlike the frictionless experience of downloading mobile apps from app stores.
“We’ve never had that App Store moment for assistants,” said Carolina Milanesi, a consumer technology analyst at the research firm Creative Strategies, which has consulted for Amazon.
At the end of last year, the Amazon division that worked on Alexa was a major target of the company’s 18,000 layoffs, and several senior Alexa executives left the company.
Kinley Pearsall, an Amazon spokesperson, said that Alexa was much more than a voice assistant and that “we are as optimistic about that mission as ever.”
Amazon’s missteps with Alexa may have led Google astray, said a former manager who worked on Google Assistant. Google engineers spent years experimenting with their assistant to mimic what Alexa could do, including designing voice-activated smart speakers and tablet-like displays to control home accessories like thermostats and light switches.
Later, the company integrated ads into those household products, which did not become a significant source of revenue.
Over time, Google realized that most people used the voice assistant for only a limited number of simple tasks, such as setting timers and playing music, the former manager explained.
In 2020, when Prabhakar Raghavan, a Google executive, took over Google Assistant, his group refocused the virtual companion as a core feature of Android smartphones.
In January, when Google’s parent company laid off 12,000 employees, the team working on operating systems for home devices lost 16 percent of its engineers.
Many of the big tech companies are now rushing to respond to ChatGPT. At Apple headquarters, the company last month held its annual AI summit, an internal event for employees to learn about its large language model and other AI tools, two people briefed on the program said.
Many engineers, including members of the Siri team, have been testing language-generation concepts every week, the people said.
Google also said on Tuesday that it will soon launch generative AI tools to help businesses, governments and software developers build apps with built-in chatbots and incorporate the underlying technology into their systems.
According to AI experts, chatbot and voice-assistant technologies will converge in the future. That means people will be able to control chatbots with their voice, and those who use Apple, Amazon and Google products will be able to ask virtual assistants to help them with their work, not just with simple tasks like checking the weather.
“These products have never worked in the past because we’ve never had human-level dialogue capabilities,” said Aravind Srinivas, founder of Perplexity, an AI start-up that offers a chatbot-based search engine. “Now we do.”
The New York Times
Source: Clarin
Linda Price is a tech expert at News Rebeat. With a deep understanding of the latest developments in the world of technology and a passion for innovation, Linda provides insightful and informative coverage of the cutting-edge advancements shaping our world.