AI Weekly: Deepfakes, Lawsuits, Mobile Minds & More!
Have you seen the “Croissantosaurus” that has been going viral on social media?
Cool, isn’t it? Well, as cool as it looks, it is sadly fake and was made using Generative AI. Oh well, I just thought I’d share the fantastic picture anyway. Now, on to the AI news from last week.
OpenAI Just Dropped New Sora Videos
OpenAI keeps teasing the capabilities of its Sora generative video model. The latest clips look closer to a Hollywood production than anything we’ve seen from AI to date, and all from a single prompt.
Sora isn’t publicly available yet, but these results are promising. Read more.
Hold my Tesla keys! Elon Musk sues OpenAI over friendly AI
Speaking of OpenAI, things got complicated with Elon Musk, one of its founding members. Musk is now suing OpenAI, alleging it has ditched its open-source approach and become too cozy with Microsoft, prioritizing profits over safety. He wants the company to return to its original plan and keep scary superintelligence in check, all for the good of humanity. Read more about the case here.
Microsoft Bets Big on Mistral AI: Bonjour to a Multilingual Chatbot Future!
Microsoft has invested a boatload of cash ($16M) in Mistral AI, a French startup building a super-smart multilingual language model. That means a model that can chat, translate, and write like a pro in tons of languages, kinda like GPT-4 but with a fancy French accent. Microsoft is betting big on Mistral AI, providing them with Azure AI supercomputing power and promising to help them grow and research even cooler AI stuff. Looks like the battle of the language models is heating up! More here.
Meta AI Research Introduces MobileLLM
Ever wished your phone was a bit more, well, brainy? Researchers at Meta AI are working on ways to squeeze powerful large language models (LLMs) onto our little mobile devices. LLMs are whizzes at understanding and processing tons of text data, but they also gobble up a ton of computing power, making them a poor fit for smartphones.
The new research tackles this challenge by introducing MobileLLM, a family of LLMs with a slim and efficient design. MobileLLM keeps its smarts sharp by focusing on a streamlined, deep-and-thin architecture with weight-sharing tricks, allowing it to tackle various tasks without needing billions of parameters (a measure of model size and complexity). This makes it a lightweight champion, able to run on devices with even the most modest resources. By shrinking down LLMs, researchers are paving the way for a future where our phones become even more helpful and intelligent companions.
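To make the deep-and-thin idea a bit more concrete, here’s a rough back-of-the-envelope sketch in Python. The formula and every number in it are illustrative assumptions on my part, not MobileLLM’s actual configuration; the point is only that a deeper-but-thinner transformer layout can stay well under a billion parameters:

```python
# Rough parameter count for a decoder-only transformer (ignores layer norms,
# biases, and positional embeddings -- fine for a ballpark comparison).
# All configurations below are made-up examples, not MobileLLM's real settings.

def transformer_params(vocab_size, d_model, n_layers, d_ff=None, tie_embeddings=True):
    """Approximate total parameters: token embeddings + attention + MLP per layer."""
    d_ff = d_ff or 4 * d_model                # typical MLP expansion factor
    embed = vocab_size * d_model              # input token embedding table
    if not tie_embeddings:
        embed *= 2                            # separate output projection matrix
    attn = 4 * d_model * d_model              # Q, K, V, and output projections
    mlp = 2 * d_model * d_ff                  # up- and down-projection
    return embed + n_layers * (attn + mlp)

# A wide-and-shallow layout vs. a deep-and-thin one.
wide = transformer_params(vocab_size=32_000, d_model=2048, n_layers=20)
thin = transformer_params(vocab_size=32_000, d_model=1024, n_layers=30)

print(f"wide-and-shallow: {wide / 1e6:.0f}M parameters")  # ~1072M
print(f"deep-and-thin:    {thin / 1e6:.0f}M parameters")  # ~410M
```

With these made-up numbers, the wide layout lands above a billion parameters while the deep-and-thin one stays around 400M, the kind of budget where running on a phone starts to look realistic.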
AI Re-imagines Venom Based On Different Countries
The results are intoxicating. Watch: