Testing and Improving a Roku App’s Accessibility

04-24-24 Dustin Whisman

Accessibility efforts apply to all sorts of applications, even those we build for TV. Dustin shares the ways we implemented web accessibility within a client’s Roku app.

While we mostly focus on websites here at Sparkbox, we occasionally step outside the box and work on other types of applications, such as native apps for iOS, tvOS, or Android. In the case of a recent project, we helped a client by auditing and improving the accessibility of their Roku application.

Roku apps are built using SceneGraph, an XML-based user interface framework, and BrightScript, a proprietary scripting language built specifically for Roku. This presents a challenge in testing and evaluating an app’s accessibility.

In this case, a web development background is of limited use, since few of the familiar accessibility techniques apply to Roku development. There aren’t semantic HTML elements to give you behaviors out of the box, and you can’t use CSS for layout, visually hiding elements, or managing color contrast. The usual tools to check for accessibility issues also aren’t available, so how can you make sure a Roku app is accessible?

Going Back to First Principles

WCAG 2.2 is organized around four guiding principles: content should be perceivable, operable, understandable, and robust. By using these principles instead of focusing on technology-specific rules and techniques, we can still evaluate accessibility.

Perceivable

To evaluate whether a Roku app is perceivable, we need to consider users with different degrees of vision and hearing. For vision, we need to account for users who are blind or have low vision and/or color blindness. For hearing, we need to account for users who are deaf or hard of hearing.

Captions and Audio Descriptions

For time-based media (audio and/or video), captions or transcripts are necessary for users who are deaf or hard of hearing, and audio descriptions are necessary for users who are blind or have low vision. Roku doesn’t have a mechanism for transcripts, so the best thing to do would probably be to include instructions on where to find them if they’re available elsewhere.

On this particular project, the app’s target market is users with vision or hearing disabilities, so most videos already had captions and/or audio descriptions, and we didn’t need to worry much about video playback. As long as the captions were displayed or the audio descriptions played during videos, all was well. The main challenge was testing and fixing the app’s navigation so users could effectively find the videos they were looking for.

Color Contrast

For users with low vision or colorblindness, color contrast is essential for making text and interactive controls distinguishable from the background or other elements. This Roku app had a dark background image, and it used white text throughout the app, except for buttons with lighter backgrounds.

The color contrast was already acceptable when we started working on the app, so all we needed to do was make sure not to introduce contrast issues as we went.

Screen Reader Testing

Roku has a screen reader, but it isn’t very well documented, and we found that a lot of content in the app was not read by default. In web development, screen readers give users options for reading pages top to bottom, navigating by landmark regions, headings, or forms, and there are tried and true methods for exposing content to screen readers. The Roku screen reader proved to be more of a guessing game, and it took a fair amount of trial and error to figure out how to make content available to people using the screen reader.

For example, Roku apps require a lot of manual focus management, and when elements receive focus, the screen reader only announces the focused element. This meant that video descriptions and other non-interactive text content would be skipped by the screen reader. We found a couple of different workarounds for this.

One workaround was to add buttons that are only available when the screen reader is turned on and set them up to read whatever static content we needed when selected. Another was to set a description XML attribute on the element, which in some cases would be read by the screen reader (but not for all types of elements, as we discovered through trial and error).
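As a rough sketch of the first workaround, a button can be shown only when the screen reader (which Roku calls Audio Guide) is enabled and wired up to speak static text when selected. The node and field names here are illustrative, not taken from the actual app, and the speech mechanism assumes the roAudioGuide component is acceptable for the app:

```brightscript
' Illustrative sketch: expose static text to screen reader users via a
' button that is only visible when Audio Guide is enabled.
sub init()
    m.readButton = m.top.findNode("readDescriptionButton")
    di = CreateObject("roDeviceInfo")
    ' IsAudioGuideEnabled() reports whether the screen reader is on
    m.readButton.visible = di.IsAudioGuideEnabled()
    m.readButton.observeField("buttonSelected", "onReadDescription")
end sub

sub onReadDescription()
    ' roAudioGuide speaks arbitrary text through the screen reader voice
    guide = CreateObject("roAudioGuide")
    guide.Say(m.top.findNode("videoDescription").text)
end sub
```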

It’s worth noting that Roku does not appear to have any zoom settings, so it’s likely that low vision users would need to rely on the screen reader.

Operable

In web development, when you’re evaluating whether content is operable, you need to consider many different input modes: mouse and keyboard, keyboard-only, touch, and voice control. For Roku, you can use the remote or an app that acts as a remote. This ends up simplifying some things (no drag and drop interactions to worry about), but complicating others.

Directional Navigation

The remote has directional buttons for up, down, left, and right, an “OK” button for selecting controls, home and back buttons for navigation, and video playback buttons like play/pause, fast-forward, and rewind.

The directional buttons introduce an interesting navigation challenge. On the web, navigation more or less goes from start to finish: you can move forward or backward. With Roku, users need to be aware of where controls are in relation to each other on the screen, which is fine for users who can see the interface, but less so for people who can’t.

A screengrab of a sample TV layout, with a list of categories on the left side and a three-by-three grid of animal video thumbnail images on the right.
Without seeing the screen, how would you know to press the right button to navigate from the sidebar to the grid of videos?

Screen Reader Testing

For certain layouts, the screen reader will announce controls like “Art History, button, 2 of 30, in a grid” as a way to let users know that there are potentially buttons in multiple directions. For others, it will only announce the number of buttons, with no hint for whether those buttons are arranged horizontally or vertically.

Testing other apps for comparison showed that this problem is widespread. Some apps include hint text to be announced by screen readers, like “navigation is on the left,” but this appears to be the exception, not the rule. In our experience, changing the announced text tends to require overriding the default behavior entirely, a brittle approach that we opted to avoid.

There’s a very high ceiling to the amount of effort you could put in to make the screen reader announce everything in an ideal way. Most likely, users will try different buttons to find out what happens, or they’ll use the service in another context, like the web.

Focus Management

Another operability issue to watch out for is focus traps and dead ends. Early on, we found a few bugs where the app would crash in some way, and the only way out was to press the home button and then reopen the app. We fixed those as we went and made sure we didn’t introduce any situations where users couldn’t leave the screen they were on without exiting the app.

Focus management is much more of a thing for Roku apps than it is on the web. For web development, you’re usually better off not doing anything to manually manage focus (with some exceptions), but for Roku apps, you will frequently need to listen for button events to move focus between different sections. There’s a high risk of losing focus, trapping focus (by not handling events properly), or messing up the focus order. We had to do a decent amount of refactoring to simplify focus management logic in places to avoid bugs and make things more maintainable.
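In SceneGraph, this kind of focus handoff typically happens in a component’s onKeyEvent handler. A minimal sketch of moving focus between two sections might look like the following, where the node names (m.sidebar, m.grid) are illustrative rather than from the actual app:

```brightscript
' Illustrative sketch of manual focus management between a sidebar
' and a content grid, using SceneGraph's onKeyEvent callback.
function onKeyEvent(key as String, press as Boolean) as Boolean
    if not press then return false
    if key = "right" and m.sidebar.hasFocus()
        m.grid.setFocus(true)
        return true ' handled: stops the event from bubbling further
    else if key = "left" and m.grid.hasFocus()
        m.sidebar.setFocus(true)
        return true
    end if
    ' Unhandled keys must bubble up; returning true for everything
    ' here is one easy way to accidentally trap focus.
    return false
end function
```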

Understandable

To evaluate how understandable the Roku app is, we mostly just needed to make sure that the app was consistent and predictable. On the web, you usually need to make sure you aren’t doing anything surprising, like rearranging the main navigation on each page, or performing possibly unwanted actions on hover or focus. In a way, Roku’s limitations make it pretty easy to avoid these kinds of issues.

Consistent Navigation

Major streaming apps commonly change which videos or genres are recommended depending on the user, time of day, what’s popular on the platform, etc. If this were the only way to find content, it would be a huge accessibility issue, but apps usually have a search feature or other, more consistent ways to find content you’re interested in.

The app we worked on was simpler than that, though. On the home screen, you would see “New Releases” and your “Favorites” if you were logged in. Other than that, you could browse by topic or by series, or you could search for specific titles. The only thing that changes is which titles are available.

On Focus and Change on Request

For the most part, no actions are taken when a button receives focus. Only when the user presses the “OK” button does the app navigate to another screen or start playing a video, for example. There are some exceptions, though.

When browsing by topic or by series, when the user navigates up or down a sidebar menu, relevant results load in based on what is focused. For users who can see the new results load in, this is a pretty normal interaction, but for screen reader users, the experience could be improved. Roku doesn’t have an equivalent for ARIA live regions, so to make it clear that results have loaded, we would need to add some custom logic to announce when results come in.
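One way to approximate a live region, sketched here under the assumption that the roAudioGuide component is acceptable for the app (the field and node names are illustrative), is to speak a short summary whenever new results finish loading:

```brightscript
' Illustrative sketch: announce newly loaded results to screen reader
' users, since SceneGraph has no equivalent of ARIA live regions.
sub onResultsLoaded()
    results = m.top.findNode("resultsGrid").content
    if CreateObject("roDeviceInfo").IsAudioGuideEnabled()
        guide = CreateObject("roAudioGuide")
        guide.Flush() ' cut off any in-progress announcement
        guide.Say(results.getChildCount().toStr() + " results loaded")
    end if
end sub
```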

We determined that this may not be a major issue, though, since the “OK” button isn’t wired up to do anything. It’s likely that users would navigate through the sidebar, press “OK”, then navigate to the results, which would already be loaded. This is a situation where user testing could provide insight into whether this is actually a problem worth spending the effort to address.

Robust

From WCAG 2.2, describing the Robust principle: “Content must be robust enough that it can be interpreted by a wide variety of user agents, including assistive technologies.” For the web, that means different operating systems, browsers, screen sizes, etc. For Roku, the scope is much more limited, but the user agent is very specific and finicky, and it’s hard to tell what “best practices” even are, given the limited documentation and small developer community.

We did take some steps to ensure our code followed best practices that we could identify. When we started, the app had some linting set up with an ESLint plugin, but the linting rules weren’t enforced and the plugin hadn’t been maintained for about 5 years. We ended up replacing this setup with bslint, a code linter built specifically for Roku apps. After fixing up all the existing issues that the new linter found, we also set up a GitHub Action to run the linter on every pull request. With the linter in place, we can be reasonably confident that any code changes follow the latest best practices and remain compatible even as Roku releases OS updates.
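A workflow along these lines can run the linter on every pull request. The details below are illustrative, not our exact setup; in particular, they assume bslint is installed as a dev dependency and configured as a BrighterScript (bsc) plugin in bsconfig.json:

```yaml
# Illustrative GitHub Actions workflow: lint the Roku app on every PR.
name: Lint
on: pull_request
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      # Compiling with bsc runs bslint via its plugin hook; fail the
      # build on lint errors without producing a package.
      - run: npx bsc --createPackage false
```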

Other than that, we performed extensive manual testing, with and without the screen reader turned on, to make sure that everything worked as expected and that we didn’t break things along the way.


While we stepped out of our comfort zone to work on this Roku app, we were able to use our accessibility knowledge to evaluate and improve the accessibility of the app. Is it perfect? No, but it is better than it was before. Do I wish I could run an automated tool to tell me what to fix? Absolutely. But by going back to the principles of digital accessibility, we were able to track down issues and fix them up as best we could.
