When you look back at the biggest breakthroughs in AI over the last decade, Google has been at the forefront of so many of them. Our groundbreaking work in foundation models has become the bedrock for the industry and the AI-powered products that billions of people use daily. As we continue to responsibly advance these technologies, there’s great potential for transformational uses in areas as far-reaching as healthcare and human creativity.

Over the past decade of developing AI, we’ve learned that so much becomes possible as you scale up neural networks; in fact, we’ve already seen surprising and delightful capabilities emerge from larger models. But we’ve also learned through our research that it’s not as simple as “bigger is better,” and that research creativity is key to building great models. More recent advances in how we architect and train models have taught us how to unlock multimodality, why human feedback in the loop matters, and how to build models more efficiently than ever. These are powerful building blocks as we continue to advance the state of the art in AI while building models that can bring real benefit to people in their daily lives.

Introducing PaLM 2

Building on this work, today we’re introducing PaLM 2, our next generation language model. PaLM 2 is a state-of-the-art language model with improved multilingual, reasoning and coding capabilities.

  • Multilinguality: PaLM 2 is more heavily trained on multilingual text, spanning more than 100 languages. This has significantly improved its ability to understand, generate and translate nuanced text — including idioms, poems and riddles — across a wide variety of languages, a hard problem to solve. PaLM 2 also passes advanced language proficiency exams at the “mastery” level.
  • Reasoning: PaLM 2’s wide-ranging dataset includes scientific papers and web pages that contain mathematical expressions. As a result, it demonstrates improved capabilities in logic, common sense reasoning, and mathematics.
  • Coding: PaLM 2 was pre-trained on a large corpus of publicly available source code. This means that it excels at popular programming languages like Python and JavaScript, but can also generate specialized code in languages like Prolog, Fortran and Verilog (a sketch of prompting it for code follows this list).
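
To make the coding capability concrete, here is a minimal sketch of prompting a PaLM 2 text model to generate a small piece of Python. It assumes the google.generativeai Python SDK and the "models/text-bison-001" model name from the PaLM API preview; the API key, prompt text and generation settings are placeholders, so check the current API documentation before relying on these exact names.

```python
# A minimal sketch, not an official example: prompting a PaLM 2 text model
# to generate code via the PaLM API's Python SDK (pip install google-generativeai).
# The model name, API key, and prompt below are illustrative assumptions.
import google.generativeai as palm

palm.configure(api_key="YOUR_API_KEY")  # key obtained via the PaLM API sign-up

prompt = (
    "Write a Python function that parses an ISO 8601 date string "
    "and returns the day of the week, with a short docstring."
)

completion = palm.generate_text(
    model="models/text-bison-001",  # PaLM 2 text model exposed in the API preview
    prompt=prompt,
    temperature=0.2,                # low temperature: more deterministic output for code
    max_output_tokens=256,
)

print(completion.result)  # generated code, returned as plain text
```

A low temperature keeps the output close to deterministic, which tends to suit code generation better than creative writing.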

A versatile family of models

Even as PaLM 2 is more capable, it’s also faster and more efficient than previous models — and it comes in a variety of sizes, which makes it easy to deploy for a wide range of use cases. We’ll be making PaLM 2 available in four sizes from smallest to largest: Gecko, Otter, Bison and Unicorn. Gecko is so lightweight that it can work on mobile devices and is fast enough for great interactive applications on-device, even when offline. This versatility means PaLM 2 can be fine-tuned to support entire classes of products in more ways, to help more people.

Powering over 25 Google products and features

At I/O today, we announced over 25 new products and features powered by PaLM 2. That means that PaLM 2 is bringing the latest in advanced AI capabilities directly into our products and to people — including consumers, developers, and enterprises of all sizes around the world. Here are some examples:

  • PaLM 2’s improved multilingual capabilities are allowing us to expand Bard to new languages, starting today. Plus, it’s powering our recently announced coding update.
  • Workspace features that help you write in Gmail and Google Docs, and organize in Google Sheets, all tap into the capabilities of PaLM 2 at a speed that helps people get work done better and faster.
  • Med-PaLM 2, trained by our health research teams with medical knowledge, can answer questions and summarize insights from a variety of dense medical texts. It achieves state-of-the-art results in medical competency, and was the first large language model to perform at “expert” level on U.S. Medical Licensing Exam-style questions. We’re now adding multimodal capabilities to synthesize information like x-rays and mammograms to one day improve patient outcomes. Med-PaLM 2 will open up to a small group of Cloud customers for feedback later this summer to identify safe, helpful use cases.

  • Sec-PaLM is a specialized version of PaLM 2 trained on security use cases, and a potential leap for cybersecurity analysis. Available through Google Cloud, it uses AI to help analyze and explain the behavior of potentially malicious scripts, and better detect which scripts are actually threats to people and organizations in unprecedented time.
  • Since March, we’ve been previewing the PaLM API with a small group of developers. Starting today, developers can sign up to use the PaLM 2 model, or customers can use the model in Vertex AI with enterprise-grade privacy, security and governance (a minimal Vertex AI sketch follows this list). PaLM 2 is also powering Duet AI for Google Cloud, a generative AI collaborator designed to help users learn, build and operate faster than ever before.
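
As a rough illustration of the Vertex AI path mentioned above, the sketch below calls a PaLM 2 text model through the Vertex AI Python SDK. It assumes the google-cloud-aiplatform package, the "text-bison@001" model ID, and a placeholder project and region; at launch the TextGenerationModel class was exposed under the vertexai.preview.language_models namespace, so consult the Vertex AI documentation for the exact import path.

```python
# A minimal sketch of calling a PaLM 2 text model through Vertex AI,
# assuming the google-cloud-aiplatform SDK and the "text-bison@001" model ID.
# Project ID and region are placeholders; substitute your own.
import vertexai
from vertexai.language_models import TextGenerationModel  # vertexai.preview.language_models at launch

vertexai.init(project="my-gcp-project", location="us-central1")

model = TextGenerationModel.from_pretrained("text-bison@001")
response = model.predict(
    "Summarize the key benefits of deploying a smaller model variant "
    "on-device versus calling a larger hosted model.",
    temperature=0.3,
    max_output_tokens=256,
)

print(response.text)
```

Running the model through Vertex AI rather than the standalone PaLM API is mainly a deployment choice: the enterprise-grade privacy, security and governance mentioned above come from the Cloud platform rather than from the model itself.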

Advancing the future of AI

PaLM 2 shows us the impact of highly capable models of various sizes and speeds, and it shows that versatile AI models can deliver real benefits for everyone. Yet just as we’re committed to releasing the most helpful and responsible AI tools today, we’re also working to create the best foundation models yet for Google.

Our Brain and DeepMind research teams have achieved many defining moments in AI over the last decade, and we’re bringing together these two world-class teams into a single unit, to continue to accelerate our progress. Google DeepMind, backed by the computational resources of Google, will not only bring incredible new capabilities to the products you use every day, but responsibly pave the way for the next generation of AI models.

We’re already at work on Gemini, our next model, created from the ground up to be multimodal, highly efficient at tool and API integrations, and built to enable future innovations like memory and planning. Gemini is still in training, but it’s already exhibiting multimodal capabilities not seen in prior models. Once fine-tuned and rigorously tested for safety, Gemini will be available at various sizes and capabilities, just like PaLM 2, to ensure it can be deployed across different products, applications and devices for everyone’s benefit.
