Imagine:
Robots that help teach social skills to children with
autism. Translation software that provides deaf students with a more fluid and
interactive experience. Data analysis to determine effective methods to
identify those with dyslexia.
These tools, which all incorporate artificial intelligence, aim
to find better ways to detect, teach and assist those with learning
disabilities. Some are already in classrooms; others are still in the research
phase.
Social robots, which are made to interact with humans, can help
teach social and educational skills to students of all abilities, including
those with attention deficit hyperactivity disorder, hearing impairments, Down
syndrome and autism.
Addressing the needs of children on the autism spectrum is
particularly urgent because of their sheer numbers: 1 in 54 children is diagnosed with autism, according to the Centers for Disease Control and
Prevention.
And those students tend to respond to robots “in a way that they
don’t to puppets or pet therapies or to many of the other kinds of things that
we’ve tried,” said Brian Scassellati, a professor of computer science,
cognitive science and mechanical engineering at Yale University.
That may be because robots seem humanlike but are nonjudgmental,
he said. The robots come in a variety of designs, including a small boy, a
classic sci-fi machine and a furry snowman, and they go by peppy names such as
Kaspar, Nao and Zeno.
In a recent study by Scassellati and his colleagues, an early
prototype of a robot named Jibo — which looks like a small table lamp with a
round head that swivels in all directions and a glowing white circle on a touch
screen as its face — worked every day for 30 days with 12 children and their
caregivers. Jibo modeled social-gaze behavior, such as making eye contact and
sharing attention, and provided feedback and guidance during six interactive
games played on screens.
“The robot’s job was to adjust the difficulty of the game based
on the child’s performance,” Scassellati said. But the idea isn’t that the
robot replaces a teacher or caregiver. “We never want to encourage kids to just
respond to the technology; that doesn’t do them any good,” he said. “We want to
enable them to interact with people in a more substantial way.”
Research has found that the robots help improve educational and
social skills, but far more studies are needed to discover how to make these
changes stick and translate to the real world.
How does AI play into this? Technology has advanced, but so has
research into how perceptions are formed, how people can infer one another’s
feelings and thoughts and what constitutes emotional intelligence. These
insights can be translated into algorithms that allow robots to interpret
speech, gestures and complex verbal and nonverbal cues as well as learn from
feedback.
Danielle Kovach, who teaches third-grade special education in
Hopatcong, New Jersey, said she would be curious to see what further research
shows. “So much of teaching social skills to students with autism is reading
facial expressions, reading body language and picking up on social cues of
others. Is a robot able to mimic those things we learn from humans?” she said.
Kovach is also the president of the Council for Exceptional Children, an
organization of special education professionals.
While the social robots are primarily used in research studies,
there is a nascent marketplace aimed at classrooms and individuals. For
example, LuxAI, a Luxembourg-based company, has been selling the
friendly-looking QTRobot, designed for children with autism, to parents since
early 2021; right now it operates only in English and French.
Children with autism interact with the robot daily for 10
minutes to an hour, depending on their age and level of support needed, said
Aida Nazari, a co-founder of LuxAI. The company has sold a few hundred
QTRobots, primarily to families in the United States, she added. But many
families may find that a social robot is far too expensive at this point.
QTRobot costs $2,000 plus a $129 monthly software subscription, which includes
support services.
AI is also used in a simpler way to help those living with
autism: through gaming. Maithilee Kunda, an assistant professor of computer
science at Vanderbilt University, and her colleagues created a video game called
“Film Detective,” which will be piloted this spring.
The concept: The player wakes up in the future — the year 3021 —
and has to help a scientist and her robot sidekick catch a villain who is
stealing items from the Museum of Human History. Their detective work involves
using a series of film clips to decode how people in today’s world behave.
“Many with autism have superior visual thinking but have a lot
of difficulty with social action,” Kunda said. “So we thought, what if we can
give them visual ways to imagine theory of mind?” Theory of mind is the ability
to imagine what other people are thinking or feeling — something those with
autism can find particularly difficult, which may make social interactions
challenging.
The game taps into theory of mind by using movie clips, asking
players to interpret why characters acted the way they did and what they might
have been thinking.
The use of AI to improve visual and auditory accessibility is
also evolving quickly.
For example, the National Technical Institute for the Deaf, one
of the nine colleges of the Rochester Institute of Technology, worked with
Microsoft to customize technology and platforms that already existed in order
to caption classes for deaf and hard-of-hearing students. The classes have sign
language translators and stenographers, but more assistance was needed.
For the institute’s purposes, Microsoft Translator was “taught”
specialized terminology used in classes as well as vocabulary specific to the
university, such as the names of certain buildings and people, said Wendy
Dannels, a member of the research faculty who is deaf.
With AI, the speech-to-written-word translation is far more
fluent than automatic speech recognition used to be, she said. And spurred by
the pandemic, during which face coverings made communication particularly
difficult for many deaf and hard-of-hearing people, the institute developed an
app called TigerChat. The app turns speech into text messages, making it easier
to chat with friends.
A key use of AI in special education is its ability to detect
patterns in large amounts of data to better identify and define certain
disabilities.
Take dyslexia, for example. Those with the condition typically
have reading difficulties because they have trouble connecting the letters and words on the page to the sounds they represent. As of 2020, 47
states required that students be screened for dyslexia in early elementary
education. Yet there is no tool designed specifically for this, and dyslexia is
often misdiagnosed — or missed completely.
The most widely used assessment for dyslexia is a test called
DIBELS (Dynamic Indicators of Basic Early Literacy Skills), typically given to
all students in kindergarten through third grade to assess their overall reading
and literacy, said Patrick Kennedy, a senior research associate at the
University of Oregon’s Center on Teaching & Learning. The test was not
designed to detect dyslexia but is used “in the dearth of other tools,” Kennedy
said.
Kennedy and his colleagues plan to recruit 48 elementary schools
in the United States and have 4,800 students in kindergarten through third
grade take the DIBELS assessment.
Over the next three years, they will examine the outcomes —
using machine learning — to determine patterns in the development of reading
and spelling over time. Ultimately, the researchers hope to evaluate if DIBELS
successfully identifies dyslexia and how it can be used most effectively.
“The purpose of this project is to provide schools with better
information to allow them to make better decisions,” Kennedy said.