Google hunts for a new way to search with ChatGPT rival Bard

Google says Bard, an experimental conversational AI service powered by its Language Model for Dialogue Applications (LaMDA), is an essential next step for the search engine company

Google has opened up its own answer to ChatGPT, with trusted testers putting Bard through its paces ahead of the new platform becoming more widely available to the public in the coming weeks.

“AI is the most profound technology we are working on today,” Sundar Pichai, CEO of Google and Alphabet explained in a blog post published yesterday. “Whether it’s helping doctors detect diseases earlier or enabling people to access information in their own language, AI helps people, businesses and communities unlock their potential. And it opens up new opportunities that could significantly improve billions of lives.”

Google has a rich history of utilising AI to enhance its search product, says Pichai. The company's first Transformer model, BERT, was able to grasp the complexities of human language. Two years ago, it introduced MUM, which is 1,000 times more powerful than BERT and has a multilingual, advanced understanding of information, including the ability to identify key moments in videos and to offer vital information, such as crisis support, in numerous languages.

LaMDA, PaLM, Imagen, and MusicLM, the latest AI technologies from Google, build on this legacy and offer new ways to interact with information, from language and images to video and audio. 

Google putting Bard to work on search

The company says it is making these cutting-edge AI advancements available in its products, starting with Search. Pichai says this involves work on one of the most exciting aspects of AI: how it can deepen understanding of information and transform it into helpful knowledge more efficiently, making it easier for people to find what they're looking for and get things done.

“When people think of Google, they often think of turning to us for quick factual answers, like ‘how many keys does a piano have?’,” says Pichai. “But increasingly, people are turning to Google for deeper insights and understanding — like, ‘is the piano or guitar easier to learn, and how much practice does each need?’ Learning about a topic like this can take a lot of effort to figure out what you really need to know, and people often want to explore a diverse range of opinions or perspectives.”

AI can be helpful in these moments, says Pichai, synthesising insights for questions where there’s no one correct answer. Upcoming AI-powered features will distil complex information and multiple perspectives into easy-to-digest formats, so you can quickly understand the big picture and learn more from the web: whether that’s seeking out additional perspectives, like blogs from people who play both piano and guitar, or going deeper on a related topic, like steps to get started as a beginner. These new AI features will begin rolling out on Google Search soon, the company says.
