
March 10, 2025
Imagine navigating a social setting without being able to see the people around you. You’d have a hard time determining whether they were engaged or distracted, whether they seemed glad or upset.
So much of how we interact with each other happens through body language. By some counts, up to 93% of communication is nonverbal. The facial expressions and hand gestures we make and perceive help us connect with each other, pick up on context, and navigate group dynamics.
But for people who are blind, low vision, or autistic, these communication channels are largely inaccessible. HapWare is building technology to solve this problem.
This solution has a real impact on people's daily lives. Only about 40% of people with vision difficulty are employed, while as few as 15% of college-educated adults with autism are employed.
People in both of these groups struggle to interpret nonverbal cues. That’s why HapWare’s ALeye device turns those cues into a language they can read.
This assistive tech takes images using a glasses-mounted camera, feeds them into an AI engine to classify nonverbal cues, and then creates a tactile sensation through a pattern of buzzes on a wristband.
Think of it like force-feedback Braille. Just as someone can feel Braille letters to read words, someone using HapWare’s technology can feel a sensation on their wrist that tells them whether the person in front of them is smiling, frowning, waving, or going in for a handshake.
“Access to nonverbal information is a human right,” says Jack Walters, the company’s CEO and cofounder. “We want to be a bridge that helps people get actionable information they can use to navigate social and professional situations.”
ALeye, HapWare’s first product, combines electronics, AI, and wearables to translate nonverbal communication into physical sensations.
“After just 90 seconds using the device, people can classify 14 cues at 95% accuracy,” explains Walters. From the beginning, the team has strived to make the device as intuitive as possible. Rather than deciding themselves which dynamic haptic pattern corresponds to which facial expression, they take the opposite approach: they play a pattern for a user, ask what they think it means, and aggregate feedback across testers to create patterns that are easy to understand.
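That aggregation step can be sketched in a few lines. This is an illustrative example only, not HapWare's actual code: assume each tester is played a pattern and types what they think it means, and the team keeps the majority interpretation along with how strongly testers agreed. All pattern names and responses here are hypothetical.

```python
# Hypothetical sketch of the "play a pattern, ask what it means" process.
# Pattern IDs and tester responses are made up for illustration.
from collections import Counter

# pattern_id -> what each tester said the pattern felt like
responses = {
    "pulse-left": ["smile", "smile", "wave", "smile"],
    "double-tap": ["handshake", "handshake", "frown"],
}

def majority_meaning(votes):
    """Return the interpretation most testers agreed on, plus the agreement rate."""
    meaning, count = Counter(votes).most_common(1)[0]
    return meaning, count / len(votes)

# Build the pattern -> meaning mapping from aggregated tester feedback.
mapping = {pid: majority_meaning(votes) for pid, votes in responses.items()}
```

A mapping built this way also gives the team a natural quality gate: patterns with low agreement rates can be redesigned before they enter the haptic vocabulary.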
Another way they’re doing this is by creating “a customizable user experience that lets users decide how much information they can get and what kind,” explains Marley Debrito, founding engineer.
For example, they found that an FBI interrogator and a professor have very different needs. While one needs to pick up on subtle facial cues that may signal deception, the other needs to be able to understand how engaged their students are or if they seem confused.
Ultimately, this comes down to a “nothing about us without us” design philosophy, says Walters. By including blind, low-vision, and autistic people in their design process, they’re able to create a solution that works for the people they’re here to serve.
So how does this all work on a technical level?
ALeye’s current design uses plenty of custom circuitry. A wireless camera connects to a single-board computer on the wristband, which processes each image and feeds it into an AI model that performs facial recognition and pose estimation to classify nonverbal expressions.
That output then travels to the eight circuit boards on the sleeve, which drive the motors that create the haptic language. These boards determine the “timing, intensity, pattern, and number of haptics activated,” says Debrito. The end user then feels that sensation and receives the information.
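The pipeline described above can be sketched end to end. This is a minimal illustrative model, not HapWare's implementation: the cue names, the eight-motor pattern encoding, and the stand-in classifier are all assumptions made for the sake of the example.

```python
# Illustrative sketch of an ALeye-style pipeline: camera frame -> cue
# classification -> haptic pattern on an 8-motor sleeve. All names and
# encodings are hypothetical.
from dataclasses import dataclass

@dataclass
class HapticPattern:
    motors: list        # which of the 8 motors (0-7) to activate
    intensity: float    # drive strength, 0.0-1.0
    duration_ms: int    # how long the buzz lasts

# Assumed mapping from a classified cue to a tactile "word".
CUE_PATTERNS = {
    "smile":     HapticPattern(motors=[0, 1], intensity=0.5, duration_ms=150),
    "frown":     HapticPattern(motors=[6, 7], intensity=0.5, duration_ms=150),
    "wave":      HapticPattern(motors=[0, 2, 4, 6], intensity=0.8, duration_ms=300),
    "handshake": HapticPattern(motors=list(range(8)), intensity=1.0, duration_ms=200),
}

def classify_cue(frame):
    """Stand-in for the AI model's facial-recognition / pose-estimation step."""
    # A real system would run a vision model on the camera frame here.
    return "smile"

def render_pattern(cue):
    """Translate a classified cue into motor commands for the sleeve."""
    return CUE_PATTERNS[cue]

pattern = render_pattern(classify_cue(frame=None))
# pattern.motors == [0, 1] for a "smile" in this toy encoding
```

In a real device, the latency of each stage would matter as much as the mapping itself, which is why the team's work on reducing end-to-end lag is central to making the feedback feel immediate.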
As with any complex integrated system, there have been challenges: making the device wireless, reducing latency to provide real-time haptic feedback, shrinking the size and weight of the electronics to make it less obtrusive and more fashionable, and documenting repeatable processes to improve consistency during manufacturing.
As the team continues to add features, they have big plans on the agenda.
This includes a mobile app that lets users customize their experience on the go and share haptic pattern loadouts with the community. They also plan to incorporate different cultural contexts, both to serve more people and to help with international travel, and to expand their recognition capabilities to read microexpressions and signs of deception. And of course, they will continue to miniaturize the design so the focus remains on interpersonal communication.
HapWare’s story began in the halls of Colorado School of Mines in the spring of 2023. Walters was a senior, meaning it was time for him to complete Capstone Design, which connects Mines students to clients in industry to solve real engineering challenges.
Dr. Brian Duarte, HapWare’s CTO, was Walters’ capstone client. As both a blind man and a scholar, he focused his academic research on haptic technology. He knew he wanted his group to build something with the dynamic haptics he invented, though he wasn’t sure exactly what would come out of the project.
Walters and his team spent the first six months researching haptics, learning how people use them, and doing problem discovery. This led to their first prototype, which classified facial expressions as happy, angry, or neutral and delivered this information through haptic feedback.
After winning the innovation award during that year’s capstone showcase, the team knew they were onto something. When Walters declared he was going to start a company and asked who wanted to join, Dr. Duarte replied: “I hoped you were going to say that!”
Dr. Duarte signed on as CTO, and HapWare was born. “It was the perfect match for Jack and I to cross paths the way we did because we are both passionate, dedicated, and have a desire to make a positive impact for underserved demographics,” he says.
Walters continued his entrepreneurial education through a combination of hands-on work and coursework. The InnovateX class at Mines was a crash course in all things business, and he continued to focus on problem discovery. This led to the realization that those on the autism spectrum could also benefit from his team’s device.
They were off to the races. The prototype started shrinking, and they added more features to their product. “We went to the Labriola makerspace on campus and started working on anything we could,” recalls Walters. “From the software to the hardware to integrating the two together, we did a lot of testing and created a product roadmap.”
As the tech progressed, so did the business. When the Beck Venture Center opened at Mines in April 2024, the startup incubator and coworking space was a natural fit for the HapWare team. They got plugged into the community, started making connections, and developed the business side of the company.
They expanded the team, began working with local nonprofits, and started running more user tests. Now that they have a growing waitlist of interested end users, they’re focusing on meeting demand by scaling manufacturing and fundraising for growth.
For this Colorado tech startup, being in the right place has made all the difference.
HapWare’s story is a case study in the Mines entrepreneurship and innovation ecosystem at work. What started as a Senior Design Capstone evolved into a company.
The founders began their journey in the classrooms of McNeil Hall, built prototypes in the Labriola makerspace, and went to the Beck Venture Center once they started getting serious about the business. At Beck, they joined the Evolve Club, a student-led entrepreneurship club. At each stage along the way, they’ve been able to access the resources and talent they need to grow.
Mines has invested heavily in creating a startup support system, and this story shows how those programs are beginning to bear fruit. “Mines has been a great support, and now there’s no stopping us,” says Walters.
While most of this story has played out in Golden, the larger Colorado community has also played a pivotal role. By working with nonprofits like the Colorado Center for the Blind, local chapters of the National Federation of the Blind, and The Arc of Colorado, which supports autistic people, they’ve aligned with organizations that serve their target demographic and made sure the product they’re building is one that people want to use.
“Colorado is a great hub for technology,” concludes Walters. “Between Golden, Denver, and Boulder, there’s tons of entrepreneurial people. For us, the reason Colorado is a great place to build a tech startup comes down to being at Mines. We have access to the best network, support system, and resources we could ask for.”
The Colorado Tech Spotlight highlights local innovations and the stories behind them. The series explores how the Colorado tech ecosystem creates an environment that promotes technological progress.
It is produced by Dynamic Tech Media and written by John Himes. Photography by Kort Duce.