GCChick

I am posting this for my fiancé Brandon Cole (@Superblindman on Twitter). He is a totally blind accessibility advocate, and he has loved SWTOR since I started playing it as a founding character. He is super excited about this thread, and I am too.

----

I am about to tell you how to make The Old Republic totally accessible to a blind person.

1: Full UI narration. Every element needs to be narrated, and activating the narration option should be easy to do. Perhaps a key command that activates it would work best for this game, as blind players would need to be able to use it for every single thing, from server selection, to settings configuration, to character creation, and so on. Even optional things like the Codex should support this, as the blind deserve access to all parts of the game. The recommended method of narration for a game of this size and scope is text-to-speech narration, which can be achieved by creating your own solution, by utilizing the Windows Narrator APIs, or by utilizing existing screen reader APIs to support the screen readers blind users already run. All of these methods would be acceptable, though the first would of course be the most difficult.

2: Full keyboard navigation. While it is true that movement already works this way, keyboard navigation would absolutely need to extend to, say, picking up items, selecting things in every menu, and so on. The blind typically do not use the mouse, as cursor navigation is quite difficult when you can't see a cursor, so we would need keyboard/controller navigation applied to absolutely everything. This includes things like scanning with the macrobinoculars, although I will say more on those shortly.

3: Audio cues. We will need positional audio cues for certain things, such as collectable loot, interactable objects and people, nearby enemies, and so on. However, audio cues would also need to be applied to some puzzle elements, such as, once again, macrobinocular scan areas. We will need to know when it is viable to press the appropriate buttons in order to activate them. This is where I will also bring up the idea of concessions. If it is too difficult to translate moving the mouse to the proper area to look at the correct thing into keyboard movement, or if a solution works for keyboard movement but contains no audio cue to give us any hint of which way to move, there may need to be an optional key to simply automatically scan the thing we're supposed to scan. I try to make as few concessions as possible in my consulting work, but this is an area where one may be required. Back to audio cues, though: there may also need to be cues to indicate quest area boundaries. If we are to hunt a particular enemy in a quest circle to collect particular drops, we must know when we're both entering and leaving that area, so we can ensure we're staying where we need to be long enough to finish the task.

4: Additional commands. While we do need everything narrated, we don't need EVERYTHING narrated all the time. For example, I don't need to know what my health is every second. I do, however, need a command that allows me to, by my own choice, check my health status, as well as any status effects that might be hindering or helping me. We would also need commands to quickly check on the statuses of our allies if we were, say, playing a support role in a group. Additional commands to repeat quest objectives, check distance to a quest objective, and so on, may also be helpful.

5: Additional systems. By far the most complicated bit: we would need certain additional systems. First, an interactive map which, instead of being formatted as a map we'd have to move around (a situation in which we might still miss things), is formatted as a sortable list. This list contains essentially text versions of all the icons a sighted user would see on their map. Ideally we could then sort these by active quests, merchants, and so on. Selecting one of these from our keyboard-browsable list would allow us to read a description if one existed, and/or set a waypoint to that destination. Once a waypoint was set, we'd move into the next system we would need, which is something to help with movement. These systems need to work together in order for us to reach our objective. We need to be able to move toward a waypoint we've set, and there are a couple of ways to handle this. First, a key that basically turns us to face our objective, something we just press now and again to reorient on it. This is workable, but will also require additional coding for things like jumping. We'll need to know when we need to jump, or potentially be automatically made to jump when it is required. As tremendous a job as I understand the artists did with the environments, we still cannot see them, and so they must be worked around. Furthermore, I know there are complex platforming puzzles in the game, and unless the devs were willing to do very specific and precise coding of audio cues for those puzzles, auto-platforming may be the best way to approach them. Of course, another option is that we press a key and are automatically moved toward our objective in every sense. Believe it or not, this option is fine with the blind, as long as we can also stop it in the middle in cases where we might hear something interesting off to one side and wish to go investigate. So yes, in a giant MMO, you absolutely can take our movement controls away and autowalk us completely, as long as we can stop. The point to consider here, and the reason we would accept this option, is that it's either that, or we don't get there at all. We'd rather play with a liiiittle less control than not play. I haven't listed absolutely everything we would need here, but everything I haven't listed is derivative of what I've said.
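To make the map-as-a-list and "turn to face the objective" ideas concrete, here is a rough Python sketch. Everything in it is illustrative only: `MapEntry`, `filter_entries`, and `heading_to` are invented names and have nothing to do with SWTOR's actual code; it just shows that the two systems reduce to filtering a list of labeled coordinates and computing a bearing.

```python
import math
from dataclasses import dataclass

@dataclass
class MapEntry:
    """One text version of a map icon, as it would appear in the sortable list."""
    name: str
    category: str  # e.g. "quest", "merchant"
    x: float
    y: float

def filter_entries(entries, category):
    """Return only entries of one category, as a keyboard-browsable list."""
    return [e for e in entries if e.category == category]

def heading_to(px, py, wx, wy):
    """Angle in degrees the player must face to walk toward the waypoint.
    0 = east, 90 = north, counter-clockwise positive."""
    return math.degrees(math.atan2(wy - py, wx - px)) % 360

# Hypothetical data: a merchant to the east, a quest target to the north.
entries = [
    MapEntry("Medical droid", "merchant", 10.0, 0.0),
    MapEntry("Bounty target", "quest", 0.0, 5.0),
]
quests = filter_entries(entries, "quest")
angle = heading_to(0.0, 0.0, quests[0].x, quests[0].y)  # due north -> 90.0
```

The "reorient" key would simply snap the camera/character to `angle`, and the narration layer could speak the entry name and distance whenever a selection changes.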
Two more quick examples would be keyboard targeting of enemies and allies for attacking and healing (additional controls), and audio cues and targeting assist for the space combat segments (audio cues and additional systems). I also completely understand the extreme unlikelihood of any of this actually being implemented, as it would probably require re-coding several in-game systems. However, this is the best opportunity I've seen to get people on the TOR team, and hopefully people within BioWare as a whole, to think about blind accessibility. And so, even if none of this comes to pass, I hope the ideas presented here are, at the very least, considered for future projects. Thank you.
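The keyboard-targeting example above could be sketched like this. Again, this is purely an assumed illustration (a hypothetical `TargetCycler`, not game code): a "next target" key steps through nearby entities nearest-first and wraps around, and each selection would then be handed to the narration layer to announce.

```python
import math

def sort_targets(player, targets):
    """Order targets nearest-first so cycling starts with the closest one."""
    def dist(t):
        return math.hypot(t["x"] - player["x"], t["y"] - player["y"])
    return sorted(targets, key=dist)

class TargetCycler:
    """Hypothetical keyboard target selector for a blind player."""
    def __init__(self, player, targets):
        self.ordered = sort_targets(player, targets)
        self.index = -1

    def next_target(self):
        """Bound to a key: advance to the next target, wrapping around."""
        if not self.ordered:
            return None
        self.index = (self.index + 1) % len(self.ordered)
        return self.ordered[self.index]

# Hypothetical scene: two enemies at different distances from the player.
player = {"x": 0.0, "y": 0.0}
targets = [
    {"name": "Sith trooper", "x": 8.0, "y": 0.0},  # distance 8
    {"name": "War droid", "x": 3.0, "y": 4.0},     # distance 5, closest
]
cycler = TargetCycler(player, targets)
first = cycler.next_target()  # War droid, since 5 < 8
```

The same cycling pattern would cover allies for healing; only the filter on which entities enter the list changes.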