Concept

Bard (chatbot)

Bard is a conversational generative artificial intelligence chatbot developed by Google, based initially on the LaMDA family of large language models (LLMs) and later on the PaLM LLM. It was developed as a direct response to the rise of OpenAI's ChatGPT and was released in a limited capacity in March 2023 to lukewarm responses, before expanding to other countries in May.

In November 2022, OpenAI launched ChatGPT, a chatbot based on the GPT-3 family of LLMs. ChatGPT gained worldwide attention following its release, becoming a viral Internet sensation. Alarmed by ChatGPT's potential threat to Google Search, Google executives issued a "code red" alert, reassigning several teams to assist in the company's artificial intelligence (AI) efforts. Sundar Pichai, CEO of Google and its parent company Alphabet, was widely reported to have issued the alert, though Pichai later denied this to The New York Times. In an unprecedented move, Google co-founders Larry Page and Sergey Brin, who had stepped down as co-CEOs of Alphabet in 2019, were summoned to emergency meetings with company executives to discuss Google's response to ChatGPT.

Earlier that year, the company had unveiled LaMDA, a prototype LLM, but did not release it to the public. When employees at an all-hands meeting asked whether LaMDA was a missed opportunity for Google to compete with ChatGPT, Pichai and Google AI chief Jeff Dean replied that while the company had capabilities similar to ChatGPT's, moving too quickly in that arena would pose a major "reputational risk" because Google is substantially larger than OpenAI. In January 2023, Demis Hassabis, CEO of Google sister company DeepMind, hinted at plans for a ChatGPT rival, and Google employees were instructed to accelerate progress on a competitor, intensively testing "Apprentice Bard" and other chatbots. Pichai assured investors during Google's quarterly earnings call in February that the company had plans to expand LaMDA's availability and applications.

About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.
Related courses (1)
MATH-212: Analyse numérique et optimisation
Students will learn to solve various mathematical problems numerically. The theoretical properties of these methods will be discussed.
Related lectures (8)
Deep Learning for Autonomous Vehicles: Learning
Explores learning in deep learning for autonomous vehicles, covering predictive models, RNN, ImageNet, and transfer learning.
Logistic Regression: Cost Functions & Optimization
Explores logistic regression, cost functions, gradient descent, and probability modeling using the logistic sigmoid function.
Normal Stress in Beams: Bending Analysis
Explores normal stresses in beams during bending and the calculation of the second moment of area for various shapes.
Related publications (12)

Harnessing Rule-Based Chatbots to Support Teaching Python Programming Best Practices

Denis Gillet, Juan Carlos Farah, Sandy Ingram, Adrian Christian Holzer

In recent years, the use of chatbots in education has been driven by advances in natural language processing and the increasing availability of digital education platforms. Although the added value of educational chatbots appears promising, researchers hav ...
2023

TRACE: A Conceptual Model to Guide the Design of Educational Chatbots

Denis Gillet, Maria Jesus Rodriguez Triana, Juan Carlos Farah, Sandy Ingram, Fanny Kim-Lan Lasne, Adrian Christian Holzer

Driven by the rising popularity of chatbots such as ChatGPT, there is a budding line of research proposing guidelines for chatbot design, both in general and specifically for digital education. Nevertheless, few researchers have focused on providing concep ...
2023

Towards Novel Evaluation Methods for Social Dialog Systems

Ekaterina Svikhnushina

Language has shaped human evolution and led to the desire to endow machines with language abilities. Recent advancements in natural language processing enable us to achieve this breakthrough in human-machine interaction. However, introducing conversational ...
EPFL, 2023
Related concepts (5)
Large language model
A large language model (LLM) is a language model characterized by its large size, made feasible by AI accelerators that can process vast amounts of text data, mostly scraped from the Internet. The underlying artificial neural networks can contain from tens of millions up to billions of weights and are (pre-)trained using self-supervised and semi-supervised learning. The transformer architecture contributed to faster training.
Generative artificial intelligence
Generative artificial intelligence (AI) is artificial intelligence capable of generating text, images, or other media, using generative models. Generative AI models learn the patterns and structure of their input training data and then generate new data that has similar characteristics. In the early 2020s, advances in transformer-based deep neural networks enabled a number of generative AI systems notable for accepting natural language prompts as input.
Generative pre-trained transformer
Generative pre-trained transformers (GPT) are a type of large language model (LLM) and a prominent framework for generative artificial intelligence. The first GPT was introduced in 2018 by OpenAI. GPT models are artificial neural networks that are based on the transformer architecture, pre-trained on large data sets of unlabelled text, and able to generate novel human-like content. As of 2023, most LLMs have these characteristics and are sometimes referred to broadly as GPTs.

Graph Chatbot

Chat with Graph Search

Ask any question about EPFL courses, lectures, exercises, research, news, etc. or try the example questions below.

DISCLAIMER: The Graph Chatbot is not programmed to provide explicit or categorical answers to your questions. Rather, it transforms your questions into API requests that are distributed across the various IT services officially administered by EPFL. Its purpose is solely to collect and recommend relevant references to content that you can explore to help you answer your questions.