NEW YORK -- Imagine a language tutor who is always on, available anytime to teach new vocabulary or check up on a student's progress.
On Thursday, Nvidia, in partnership with the American Society for Deaf Children and creative agency Hello Monday, launched an AI-powered language learning platform that promises to do just that for American Sign Language learners.
The platform, called Signs, features a 3D avatar that demonstrates signs. Users keep their video cameras on while interacting with the platform, and an AI tool provides feedback as they practice. At launch, the platform offers 100 distinct signs, but Nvidia hopes to grow that number to 1,000.
Signs represents just one of the many ways AI is helping to advance work on assistive technologies, or tools designed to help people who are disabled or elderly, or their caretakers. Meta, Google and OpenAI have, for example, used AI to improve features for blind or low-vision users, and Apple introduced AI-enabled eye tracking to help physically disabled users navigate their iPhones. Blind users say those advancements are already making it easier for them to navigate life and work.
American Sign Language is the third most prevalent language in the United States, behind English and Spanish, according to the groups behind Signs.
The ASL-learning platform is also a reminder that Nvidia has been trying to branch out beyond the hardware behind AI. Nvidia has become a major supplier to the AI industry by building the chips that most companies use to run the technology, alongside its own AI models and software platforms. The company's stock has soared more than 100% in the past year as AI companies betting on the technology's future promise buy up vast quantities of its chips, pushing Nvidia's market value above $3.4 trillion.
Michael Boone, who manages trustworthy AI products at Nvidia, said the company is committed to building AI not just for corporate customers but also to foster practical applications of the technology. "It's important for us to produce efforts like Signs, because we want to enable not just one company or a set of companies, but we want to enable the ecosystem," Boone said in an interview with CNN.
And, ultimately, more people using AI in any form is good for Nvidia's core chip-making business. Some investors have in recent months raised concerns about whether tech giants have been overspending on AI infrastructure, including chips, and questioned how long it might take to earn a return on that investment.
Signs is free to use, and it will allow ASL speakers to contribute videos of signs that are not already available on the platform to grow its vocabulary. That data could also enable Nvidia to develop new ASL-related products down the road, such as improved sign recognition in video-conferencing software or gesture control in cars. The company says it will also make the data repository publicly available for other developers.
And for future iterations of Signs, Nvidia said the team behind the platform is exploring how to include non-manual signals that are crucial to ASL, such as facial expressions and head movements, as well as slang and regional variations in the language.
"Most deaf children are born to hearing parents. Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old," Cheri Dowling, executive director of the American Society for Deaf Children, said in a statement about the new project.
The-CNN-Wire™ & © 2025 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.