ChatGPT Weekly News: E5
After a few weeks of calm, the two giants of generative AI, Google and Microsoft (with OpenAI), stepped up their efforts this week, kicking off a new round of competition.
At Google I/O 2023, which concluded this week, AI was undoubtedly the sole protagonist. At the foundation of Google's AI offerings lies PaLM 2, a new state-of-the-art language model that Google describes as more powerful, efficient, and versatile than its predecessors.
PaLM 2:
- Model size: 540 billion parameters
- Training data: 1.56T words of text and code
- Training objectives: UL2, masked language modeling, code generation, and text-to-code translation
- Downstream tasks: text generation, translation, summarization, question answering, code generation, and text-to-code translation
- Results: state-of-the-art results on all downstream tasks
- Efficiency: faster and more efficient inference than PaLM

PaLM (for comparison):
- Model size: 175 billion parameters
- Training data: 1.56T words of text
- Training objectives: masked language modeling
- Downstream tasks: text generation, translation, summarization, question answering, and code generation
- Results: state-of-the-art results on some downstream tasks
- Efficiency: not as efficient as PaLM 2
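Among the training objectives listed above is masked language modeling: corrupt a token sequence by hiding some tokens, then train the model to recover them. As a rough illustration only (a minimal Python sketch, not Google's implementation; the whitespace tokenizer, mask rate, and seed are simplifying assumptions), the masking step looks like this:

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=1):
    """Randomly replace a fraction of tokens with a mask token.

    Returns the corrupted sequence plus a position -> original-token map;
    during training, the model is asked to predict the hidden originals.
    """
    rng = random.Random(seed)  # fixed seed so the example is reproducible
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            targets[i] = tok          # token the model must recover
            masked.append(mask_token)
        else:
            masked.append(tok)
    return masked, targets

# Toy example with a naive whitespace "tokenizer"
tokens = "the quick brown fox jumps over the lazy dog".split()
masked, targets = mask_tokens(tokens)
print(masked)
print(targets)
```

Real pretraining works on subword tokens and mixes masking with other corruption strategies (PaLM 2's UL2 objective blends several denoising tasks), but the core idea is the same: the loss rewards the model for filling in the blanks.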
It is fair to say that PaLM 2 now serves as the AI engine behind Google's entire product line.