A Different Language
Meta recently unveiled a new AI speech model with the aim of preserving the world’s languages. Called Massively Multilingual Speech (MMS), the model can identify more than 4,000 spoken languages. Meta argues that many of the world’s languages are in danger of disappearing, and that the limitations of current speech recognition and generation technology will only accelerate this trend.
AI Sandbox
Another Meta AI tool is the AI Sandbox. It lets brands generate variations of ad copy, and it helps create ads with funky, trendy backdrops for that copy. The idea is to save time on making visual alterations to an ad. In the broader scheme of things, this tool allows Meta to look beyond the sphere of social media.
Creating Its Own Chipset
Meta has created its own chip — MTIA (Meta Training and Inference Accelerator) — to deliver greater compute power and efficiency than CPUs for its AI workloads. AI relies heavily on GPUs (graphics processing units), and Meta says it has deployed both MTIA chips and GPUs to run AI efficiently. That’s not all: Meta also claims to have built one of the fastest AI supercomputers in the world. Called the Research SuperCluster (RSC), it has been built to train the next generation of large AI models that will power new augmented reality tools, content understanding systems, real-time translation technology and more.
The supercomputer features 16,000 GPUs. What Meta is also doing — something Google and OpenAI are cautious about — is making its AI tools open source. In other words, it is giving developers and companies access to copy, modify or reuse the tools for their own benefit. In February 2023, Meta announced that LLaMA (Large Language Model Meta AI) would be available to those “affiliated with organisations in government, civil society and academia; and industry research laboratories around the world”. Meta isn’t keeping its cards close to its chest, and that is a rather unusual approach as the race for AI heats up by the day.