Transcript of PEAT Talks: Sharing Success from the WWW-W4A Accessibility Hackathon
Hello and welcome to PEAT Talks, the virtual speaker series from the Partnership on Employment and Accessible Technology. On the third Thursday of every month, PEAT Talks showcases organizations and individuals whose work and innovations are advancing accessible technology in the workplace. My name is Christa Biel, I'm a member of the PEAT team, and I will be hosting today's talk.
Before we get started, I'm going to quickly review a few logistics. We will have time for Q&A at the end, so please enter any questions you have in the chat window or email them to info@PEATworks.org. You can also email info@PEATworks.org if you are having technical difficulties. You can download today's presentation on PEATworks.org, and an archived recording will be posted online following today's event. We will also be live tweeting today's event from @PEATworks, so please feel free to join us and follow along using the hashtag #peattalks, with two t's.
Today PEAT is excited to share the outcomes from the World Wide Web and Web for All accessibility Hackathon cohosted by PEAT and Google last month in Montreal. The Hackathon challenged participants to take a popular web-based product used widely in workplaces and schools, H5P interactive content objects, and make it accessible to all users.
We are joined today by three speakers from around the globe who participated in the Hackathon. First, we have Svein-Tore Griff With, managing director of Joubel. He is the primary inventor and architect of H5P, and he served as a Hackathon judge. Next, we would like to welcome Greg Gay, the 2016 Web for All conference chair and another of our Hackathon judges. Greg also leads the accessibility research and development underway at the ATutor LMS open source community and teaches web accessibility practices to web developers. And last but not least, we have Volker Sorge, managing director of Progressive Accessibility Solutions and a member of the winning Hackathon team. Now, without further delay, I will turn things over to PEAT's project director, Josh Christianson, to provide some background.
Thank you, Christa, and thank you all for joining us. I want to wish a happy Global Accessibility Awareness Day to everyone. I think it is particularly fitting that on a global awareness day we have speakers participating from four different countries, which I think is cool. I'm going to give a little intro before we turn it over to our guests. First, I want to plug a recent blog post on PEATworks.org. It gives a little more detail about the event, so we're going to cover things at a very high level today, but I would encourage you to check out that blog if you have more questions or want to learn a bit more about what transpired. We met at the Google offices. I would also like to plug Ben Caldwell from our team, who wrote that post, was a judge, and is on the phone today. He may be able to jump in on some questions as needed. So check that out.
Also, I want to give some big thank yous to everyone that is on this call, as well as our partners from Google who were there. This Hackathon event was a true collaboration between our team and Web for All, and it wouldn't have happened without people chiming in and working together. We were on a maiden voyage of sorts, and it was fun to work with the team, so thank you all for your collaboration.
This Hackathon event was really a first for PEAT. I think we learned a lot, and we'll talk a little bit about that later. But we hope to have more innovation sprints, which I'm told is the broader name for events like this Hackathon. We're interested in design challenges, innovation challenges, any way to really get in front of developers, designers, programmers, people that are creating technology, to help spread the gospel that is accessibility. So we've got a few things in the works at PEAT. And any of you on the line that are involved and interested, please reach out with other ideas, as we'd love to collaborate on other events.
For me personally, it was really exciting to see the impact that people can have on a product. We had a great open-source product, which you'll learn a little more about in a minute, and then we got to see individuals and teams get in there, get their hands dirty, and make something more accessible. For me, as someone who is not a developer or on the technical side, seeing first-hand that people can make a difference and make things more accessible was empowering, and I look forward to seeing more of that.
I mentioned spreading the word about accessibility. I think that was part of the beauty of this event. By design, we had folks there who were not aware of accessibility. Web for All is an accessibility-focused conference, but the World Wide Web conference is not, and so by partnering with the World Wide Web conference, thanks to Greg, we recruited some folks who were not previously aware of accessibility. Same with Google, who tapped into their Meetup groups. That was exciting, because we had people there who were accessibility experts and programmers with accessibility backgrounds, but also developers and designers who had not been exposed to it.
We did a survey after the Hackathon, and it asked about participants' level of accessibility experience. If they said they had no accessibility experience previously, or maybe just a little, a follow-up question asked whether, by participating in the Hackathon, they were now more likely to incorporate accessibility best practices. And across the board, everyone said yes. I know it's anecdotal in a way, and it remains to be seen, but to me that's an exciting part of such events: raising awareness and teaching new skill sets around universal design and usability so that folks can go forth and use them when they're creating things.
But before I get too much into the details, I want to turn it over to my colleagues. I'm going to just go over the agenda very quickly. You heard about our guests. We're going to first hear a little bit more about the actual product, the learning management and content management plug-in that we found and used. Then we're going to get a recap of the variety of team efforts from Greg, who was instrumental in this event, just touching at a high level on what some of the teams tackled. And finally, Volker, from the winning team, is going to talk about what he and his team looked at and tackled during the day's events. And we will save time at the end for questions and answers. So, without further ado, I'm going to turn it over to Svein-Tore. Thanks so much.
Yes, thank you.
Are you there?
Yeah, I'm here. Okay. So we're looking at the H5P logo now, and H5P is all about interactive content. It is a plug-in for existing publishing systems. So you now see the Drupal and WordPress logos. We have plug-ins for Drupal and WordPress, and we are also launching a plug-in for Moodle soon; hopefully this month or in June we'll have a plug-in for Moodle as well.
By installing the H5P plug-in into any of these base systems, the system becomes able to produce interactive content. I'll name a couple of examples of things you can create yourself in your Moodle or Drupal or WordPress website if you have the H5P plug-in. One example is interactive video. You see a screenshot from the interactive video [indiscernible] that anyone can use to create interactive content.
Another example is the memory game, a quite classical game where you have to find pairs. It can be created in WordPress, Drupal, and Moodle if you have H5P. And there are lots of other content types; I think it's 28 now, but it's probably more. We have examples on this slide of some of the different types that you can create with H5P. So you can create quizzes, games, simulations, anything that feels alive and interactive.
And H5P is a community effort. We tried to illustrate this on the slide you see there; the picture is not of the best quality, but, yeah, it's a community effort. Anyone can join H5P, participate, create content types and share them on our website, H5P.org, or improve the existing ones. And we were very happy that Beth and Doug Ferrell chose H5P and participated in the H5P community and improved some of our content types in terms of accessibility. And we really [indiscernible] I really look forward to releasing the results of the work in the coming months. It was great to see how easily and effectively people could just dive into H5P and make a huge improvement in a short time. I guess we will hear more about it from the next speaker. Thank you.
I am Greg Gay. I'm going to talk a little bit about what four of the five groups did. We had five groups participate in the Hackathon, each with five or six participants. Group six focused on the Chart content type. We're looking at a screenshot of it now, which shows a pie chart; there's also a bar chart that you can set up. What this group did was focus on making the elements within the pie chart itself accessible, so the segments of the pie could be tabbed through, and labels were attached to them so you could hear them.
They added tab indexing to each of the elements within the pie so that you can navigate through them and the focus moves around within the pie. And the same for the bar chart: they made the bars accessible by adding labels and adding keyboard access via tab indexes. They also added a description field, sort of like a long description, so the author can describe the gist of what's being presented in the chart.
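For readers following along, the pattern described here, making each chart segment keyboard-focusable and giving it a spoken label, can be sketched roughly as follows. The function and data names are illustrative, not H5P's actual code, and plain objects stand in for DOM elements:

```javascript
// Sketch: make chart segments keyboard-focusable and labeled.
// In a browser you would call el.setAttribute(...) on real elements;
// here `attrs` is a stand-in so the idea is visible.
function labelChartSegments(segments) {
  segments.forEach((seg) => {
    seg.attrs = seg.attrs || {};
    seg.attrs.tabindex = '0';                              // reachable via Tab
    seg.attrs['aria-label'] = `${seg.name}: ${seg.value}`; // spoken label
  });
  return segments;
}

const pie = labelChartSegments([
  { name: 'Cats', value: 40 },
  { name: 'Dogs', value: 60 },
]);
console.log(pie[0].attrs['aria-label']); // "Cats: 40"
```

With tabindex="0" each segment joins the tab order, and the aria-label gives a screen reader something meaningful to announce when the segment receives focus.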
Group one focused on a content type called "Mark the Words." Here we're looking at an example screen from Mark the Words, and you can see that the question is to find the berries within the paragraph below, and the berries are highlighted here. What group one did was add a button to the interface that allows a user with a screen reader to jump back and forth between the beginning and the end. They also set it up so that when navigating through the words with a screen reader, you hear the selected words announced as selected, and marked words are now announced as marked. They didn't get everything they wanted completely done, so they made a few suggestions. One was to remove the button role from the words, because right now each word is announced as "button" plus the word; removing the role means just the word is announced.
They also suggested adding hidden text at the start and the end of the list to let the user know that they're at the start or the end, as well as some hidden instructions so a screen reader user will hear a description of what needs to be done to complete the activity, and setting up the words so that users can navigate through them one at a time and listen to them.
Group four worked on the Summary builder. This is sort of like a multiple-choice quiz, and you're looking at a screenshot of one right now. They did a few things. First of all, they fixed the progress count. If we go back to the screenshot, up in the top right corner it says "progress 0/2." They updated that and replaced the slash with the words "out of" so that it makes more sense when read aloud.
They also added an extra key for submitting: currently the spacebar was used to submit the answers, and they added the enter key as a possibility. They added focus highlighting, so as you navigate through the elements they light up and you can see where you are. They added labels and tab indexes to make the elements focusable. They hooked the labels up properly; the labels were missing the "for" attributes before, so they weren't actually connected to their inputs. They were there but not connected properly. Now they are.
They also added an "incorrect" label, so when a selection is made that's incorrect, it announces itself as incorrect. And they added a FIELDSET and a LEGEND to make the whole interface more meaningful to somebody who is listening to it with a screen reader.
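The two fixes just mentioned, connecting labels to inputs with matching for/id attributes and grouping the options inside a FIELDSET with a LEGEND, can be sketched with some illustrative markup-building code. The names are hypothetical, not the Summary widget's real implementation:

```javascript
// Sketch: render a question as a fieldset so a screen reader announces
// the legend (the question) when entering the group, and pair each
// input with its label via matching for/id attributes.
function renderChoices(question, options) {
  const items = options
    .map(
      (opt, i) =>
        `<input type="radio" id="opt${i}" name="q">` +
        `<label for="opt${i}">${opt}</label>`
    )
    .join('');
  return `<fieldset><legend>${question}</legend>${items}</fieldset>`;
}

const html = renderChoices('Pick one', ['A', 'B']);
console.log(html.includes('<label for="opt0">A</label>')); // true
```

Without the for attribute, a label sits next to its input visually but clicking it does nothing and screen readers cannot tie the text to the control; the for/id pairing is what makes the association real.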
Group three, who were the runners-up, worked on the interactive video content type. Their focus was on making the interactive items that pop up accessible. They added an access key to the play and pause buttons so the video can be controlled by keyboard alone, without having to use a mouse. They added ARIA alerts to the various pop-ups. If you look at the screen, you can see there is a little pop-up in the top left corner. Previously, if you were listening with a screen reader, you would not have been able to hear or notice that it was there, so they added a role of alert so that when it pops up, it automatically announces itself to the screen reader. They did that for the bookmarks as well, and for the interaction labels that pop up. And they added a default title to an interaction if one wasn't already provided.
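The role="alert" technique mentioned here is simple to sketch: marking an element with that role turns it into an assertive live region, so assistive technology announces its content as soon as it appears. This is a minimal illustration with a mock element, not the interactive video widget's actual code:

```javascript
// Sketch: give a pop-up role="alert" so its text is announced
// immediately when it is shown.
function announcePopup(el, message) {
  el.setAttribute('role', 'alert'); // implicit assertive live region
  el.textContent = message;
  return el;
}

// Minimal stand-in for a DOM element, for demonstration only.
function mockElement() {
  const attrs = {};
  return {
    textContent: '',
    setAttribute: (k, v) => { attrs[k] = v; },
    getAttribute: (k) => attrs[k],
  };
}

const popup = announcePopup(mockElement(), 'Bookmark: Chapter 2');
console.log(popup.getAttribute('role')); // "alert"
```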
So next up is Volker, and he will talk about what Group 7 did.
Yeah, hello. Hi. I'm Volker Sorge, and I was working with, I think, five people in Group 7, and we were tackling two different types of widgets. Both were quiz-type widgets: one was called "Hot Spot," and the other one was an arithmetic quiz. I will quickly summarize what we did on the Hot Spot quiz, and then I will dive deeper into the arithmetic quiz, because a couple of issues occurred there which I think are, in general, important to keep in mind when developing [indiscernible] web content.
The Hot Spot quiz was effectively a multiple-choice quiz where you have images on which you can click, and a question which asks, you know, which is the right image to answer the question. The general problem there was that the images were effectively not accessible, in the sense that you could not necessarily navigate to them with a keyboard, and you obviously couldn't hear what was in those images.
The way we solved the problem was by adding a tab index to the images so they can be reached by keyboard, adding some ARIA labels with explanations, and then making them keyboard clickable, so that it would be announced whether the answer was right or wrong. All right, that was the image quiz.
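The three steps just described, adding a tab index, an ARIA label, and keyboard activation, look roughly like this. The names are illustrative and a mock element stands in for the image; a browser version would use addEventListener on a real element:

```javascript
// Sketch: make a clickable image reachable and operable by keyboard.
// Enter and Space trigger the same handler as a mouse click would.
function makeHotspotAccessible(img, label, onActivate) {
  img.setAttribute('tabindex', '0');     // add to the tab order
  img.setAttribute('aria-label', label); // spoken description
  img.onkeydown = (e) => {
    if (e.key === 'Enter' || e.key === ' ') onActivate();
  };
}

// Minimal stand-in for a DOM element.
function mockElement() {
  const attrs = {};
  return {
    setAttribute: (k, v) => { attrs[k] = v; },
    getAttribute: (k) => attrs[k],
  };
}

let activated = false;
const img = mockElement();
makeHotspotAccessible(img, 'Red berry', () => { activated = true; });
img.onkeydown({ key: 'Enter' }); // simulate pressing Enter on the image
console.log(activated); // true
```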
The other quiz we were working on was the arithmetic quiz, and I will briefly explain how it works visually on a webpage, and then I'll point out a couple of particular issues which we discovered and solved. The arithmetic quiz is a multiple-choice quiz about simple arithmetic questions, say, as an example, 5 x 7 = ?, and you get a choice of six answers. All of these are buttons underneath, and you click the right one or the wrong one. It's a timed quiz, so the way it works is you start on the start slide, you click on the start button, a countdown comes up and counts down three, two, one, and then the first question shows up. Whenever you answer a question, whether your answer was correct or not is displayed visually by highlighting a button, or lowlighting a button, effectively. A score is being kept, and the timer is running in the corner. And when you answer a question, that question kind of flies out of the window [indiscernible] on the left, and from the right comes a new slide with the next question.
It is all very nicely animated, but there were a bunch of issues which we discovered. It was only partially accessible: things like the question itself, as well as the score and the timer, were absolutely inaccessible, so if you couldn't see them, you couldn't hear them, and you were not aware of what the time was.
Another major problem was that the buttons were keyboard accessible and clickable, which is, in a way, a good thing, but unfortunately all the buttons were. Now, as I said, the slides are being swapped out while you do the quiz: they come in from the right and go out to the left. And while it looks like they're actually leaving the browser at that point, they don't; they're just being visually hidden. The effect is that they are still in the page. So, in other words, you have all the buttons for all the possible questions, and all the answers, effectively still on the page.
So if your screen reader looks at the slide, it actually reads what's on the page. If you start navigating with your keyboard, you get to all the answers for this particular slide, but if you navigate a bit too far, you get to the answers of the next slide, or you navigate back to the answers on the previous slide, which is obviously not what you want. And you can actually click those as well. So then we had the problem of how we were going to make this accessible.
Now, there were two ways to go. One way was to just work on this particular widget and try to make it accessible. The other option was, since those buttons are fundamental to many other widgets as well, a very fundamental data structure, as were the slides, to start at the basic structure. And I decided that we should go in this direction: rather than just working on our own single widget and effectively fighting windmills, we should go to the root of the problem.
The way we made it accessible, then, was to integrate the accessibility from the start into the main data structure. Buttons are one of those HTML5 elements which are automatically keyboard accessible, that is, automatically put into the tab order, so what you need is a way to give them a sensible default. Since you don't want all of them to always be reachable, we give them a tab index of minus 1, which means they are focusable programmatically but not normally reachable by tabbing.
When a slide comes in, you give it a roving tab index, which means all the buttons on the incoming slide are set to tab index zero, so they are effectively made reachable for the time being, and when the buttons on a slide are swapped out, they are reset to minus 1, so they're no longer reachable. And then you do the same thing with things like the question, score, and timer: you put all of these into the tab order as soon as the slide becomes visible, and then they become reachable.
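The roving-tabindex idea described above can be sketched in a few lines. Plain objects stand in for the slide and button elements; in a browser you would set the tabIndex property on real DOM nodes:

```javascript
// Sketch of the roving-tabindex pattern: buttons on the visible slide
// get tabindex 0 (reachable by Tab), all others get -1 (focusable
// programmatically, but skipped when tabbing through the page).
function updateTabOrder(slides, activeIndex) {
  slides.forEach((slide, i) => {
    slide.buttons.forEach((btn) => {
      btn.tabIndex = i === activeIndex ? 0 : -1;
    });
  });
}

const slides = [
  { buttons: [{ tabIndex: -1 }, { tabIndex: -1 }] }, // current question
  { buttons: [{ tabIndex: -1 }] },                   // hidden next question
];
updateTabOrder(slides, 0);
console.log(slides[0].buttons[0].tabIndex); // 0
console.log(slides[1].buttons[0].tabIndex); // -1
```

Calling updateTabOrder again with the new active index when a slide transition finishes is what keeps keyboard users from tabbing into answers that belong to hidden slides.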
So the take-home message, in many ways, from this part was: if you integrate accessibility as the default from the get-go, you have a lot less work to do when you put the widgets together. The second problem was that there were a lot of animations in these slides. One of the animations, for instance: as I said, it was a timed quiz, so when you hit the start button at the beginning you get this countdown, which counts down three, two, one, which is nicely visible [indiscernible] and then you should start.
Unfortunately, that was inaccessible in the sense that you could not hear it with a screen reader. If a blind user hit the start button, there would be three seconds of silence, and they would have no idea what was actually going on. But the way you make this accessible is quite easy, actually. All you have to do is take the numbers that are being displayed and make them a live region, which means they automatically alert the screen reader that they should be spoken now.
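A minimal sketch of that fix: write each countdown number into an element marked aria-live="assertive", and the screen reader speaks each update on its own. A mock element and a speak callback stand in for the DOM and the screen reader here; real code would update a visible element on a timer:

```javascript
// Sketch: announce a visual countdown by writing each number into an
// assertive live region; screen readers speak live-region updates
// automatically, so no extra announcement code is needed.
function runCountdown(region, from, speak) {
  region.setAttribute('aria-live', 'assertive');
  for (let n = from; n >= 1; n--) {
    region.textContent = String(n);
    speak(region.textContent); // stands in for the screen reader reacting
  }
}

const spoken = [];
const attrs = {};
const region = {
  textContent: '',
  setAttribute: (k, v) => { attrs[k] = v; },
};
runCountdown(region, 3, (t) => spoken.push(t));
console.log(spoken.join(',')); // "3,2,1"
```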
That's one general way to make animations accessible. The second problem with animations, which was much harder to solve, was the problem of displaying whether or not an answer was correct. Think about clicking on a button which you think is the right answer. If it's the right answer, that is displayed visually by the button expanding slightly and being highlighted, so you can see, oh, this is the right answer. If it's the wrong answer, what happens in parallel is that your button, the one you clicked on, recedes a bit into the page and gets smaller, whereas the correct answer comes out towards you. Visually very nice. But if you can't see it, what are you going to do?
Well, the way we solved this problem was by adding a temporary live region. We could not just make the button itself a live region, because that would just repeat the number in it without any additional information; [indiscernible] was already on the page, it would already be voiced by the screen reader. So what you do is add a temporary live region, which is effectively invisible, to the human eye at least, and then you push the information, like which number was correct, into that live region, and you also push in the information about what the incorrect reply was, and then you have to remove this part again when you move to the next slide. The difficult part there is that you have to get the timing right, and the timing here means it takes a while for a screen reader to realize that a live region is in the page and has been updated. Unfortunately, it takes a different time for every screen reader, which means that, as the developer, you have to do a lot of testing on different platforms and different screen readers, which is often very fragile, because they change on a daily basis.
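The life cycle of that temporary live region, create it, push feedback text into it, remove it on the next slide, can be sketched as follows. This is a simplified model with plain objects (and it leaves out the screen-reader timing delays the speaker mentions, which in practice means waiting briefly after inserting the region before writing to it):

```javascript
// Sketch: a temporary, visually hidden live region that receives the
// answer feedback and is removed when the next slide loads.
function createFeedbackRegion(page) {
  const region = { 'aria-live': 'polite', text: '' };
  page.children.push(region); // in a browser: document.body.appendChild
  return region;
}

function announceResult(region, correct, answer) {
  // Pushing text into an existing live region triggers an announcement.
  region.text = (correct ? 'Correct: ' : 'Incorrect, the answer was: ') + answer;
}

function removeRegion(page, region) {
  page.children = page.children.filter((c) => c !== region);
}

const page = { children: [] };
const region = createFeedbackRegion(page);
announceResult(region, false, '35');
console.log(region.text);          // "Incorrect, the answer was: 35"
removeRegion(page, region);
console.log(page.children.length); // 0
```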
So the take-home message from there is that animations can be made accessible, in a way, but it is rather difficult for the developer. So there's a shout-out to the community, if you like: we need a proper testing harness to make this a little easier for people.
Right. And I think that's it; we've covered the Hackathon, and I'll hand it over for the questions and answers. Thank you very much.
Thank you, Volker. Thank you all. Watching the teams [indiscernible], they all have such a diverse skill set, and seeing them tackle the problems in a short time and make headway was exciting. I want to encourage everyone on the line, we've got a couple more minutes: if you have any questions you would like to ask of the participants, please type them in the chat window there. We might be able to get to a few, and I've got one or two that I will throw out in the meantime. [Indiscernible], with H5P, we kind of bet on it; our team did some research and thought it was a great tool, and open source, which we love, and it could be used both for employment and in education, which you were focusing on. But I was curious, from your standpoint, what did you learn about accessibility from the participation? And I know you said you're still working on the updates, but what are the next steps around accessibility for H5P?
Yeah, we started in January, I think, a kind of parallel initiative on accessibility within H5P, [indiscernible] testing all the content types. We have quite a big backlog of issues that my company and others in the community are working on, and we will be releasing two or three content types with accessibility updates monthly. So, hopefully, before long the current content types will have much better accessibility. And, obviously, we are now prioritizing the ones that were worked on during the Hackathon. For some of them, the most important issues have been solved; for others, we have to do some more work before we can call those content types really accessible. But very important things were fixed in the Hackathon, and we will continue fixing them, yeah.
Thank you. Thank you for being willing to open up your product and participate. We really do hope that those solutions get folded into your next updates. I appreciate it. I see there is a question for Anvil; it looks like he's answering. The winners celebrated with cold beer and wine provided by Google. Actually, everyone could --
I could maybe -- I had to rush to the airport, unfortunately.
Oh, that's true, you ran away. But Volker was represented in a picture: his team held up a picture with a big "Thanks, Volker" on it when we presented them with the prize. Google also provided a Chromecast for all the participants on the first and second place teams, and, really, recognition in the blog here. With events such as these, we're trying to spread the word. So it was mostly a labor of love, with a little recognition for their efforts.
Before we go, and we're right at time, I'd just love to get a quick hot take from Greg and Volker. I know, Volker, when you were going through your solutions, you said one takeaway is to start with accessibility at the beginning, and it's easier to fix. The other was some difficulties with animation, so there was a call to the developer community to get in there and solve that problem.
But from both of you, and Greg, I'll let you go first, then Volker: as an expert in this field seeing the event take place, what was one thing you took away from the event itself?
Well, for me, I think it was the combination of different backgrounds among the people who were participating. We weren't really sure how this was going to work in the first place, so we surveyed people to find out what their backgrounds were, and we mixed up the programmers with the designers and with the people with disabilities who were also involved. And I think the combination of the different backgrounds actually worked quite well in terms of identifying the issues, fixing the issues, and then talking about them and explaining how things were done.
Yes, and a big thanks to both you and them for coming up with ideas and making it happen.
Volker, any last thoughts?
Yes. I would also like to add that it was a very well organized Hackathon, in the way that everybody got off the ground very quickly. It took less than half an hour before we were really stuck into the code, so we could really get a lot done in the time.
And if I may add another take-home message, in particular if any developers are listening, or people who are in charge of developers: I think one good idea, in general, would be to make it mandatory for every developer to give up their mouse for one day a week. Because one of the major problems is always that everybody tries to click on everything with the mouse, and they don't think about people who can't access things with a mouse and who need everything to be keyboard accessible.
That's a great idea. There is no substitute for experience and kind of going through what the users might go through, so thank you. And a good plug for Global Accessibility Awareness Day. Well, I want to thank you all and wrap it up. I will end by saying, definitely, if you have ideas for other types of events, get in touch and send them our way, because we'd love to do other innovation types of events.
We will have another PEAT Talk every month, so I hope you join us next month, Thursday, June 16th, at 2 p.m. Eastern Time. We'll be welcoming digital marketer Elisa Greenwood, who was in the chat today. She'll be discussing steps to ensure that your social media recruiting efforts for active and passive talent can successfully reach candidates with disabilities. We saw her presentation at CSUN and look forward to her sharing some of those insights with us at PEAT. You can find the registration link on PEATworks.org, or look for an email soon.
But I just want to give a huge thanks to all of our speakers for their participation in the lead-up and in the event itself, and for their time today. I can't thank you enough, and thank you to all of you on the line who took time to join us today. Have a great rest of your Global Accessibility Awareness Day. Thank you, everybody.