Thursday, August 31, 2017

The internship wraps up today, and I decided to finish by livening up the AppDB home page with a screenshot carousel.

Normally I'm not terribly interested in purely cosmetic changes; all the previous changes I've made to the layout have been to improve usability. But the old home page was afflicted with a sad little jpeg surrounded by too much whitespace in the upper right corner. It was just crying out for something better, and the AppDB has an abundance of screenshots.

Bootstrap made setting up the carousel fairly easy, though there were some minor glitches before I got the surrounding text to behave itself. The main problem then was image sizing.

The screenshots in the AppDB come in many different sizes and aspect ratios. The carousel displayed each screenshot at its full width, sizing itself up or down as needed to keep the proportions. The result was a very unpleasant jumping up and down of the text below the carousel as it resized itself.

Limiting the height of the carousel and setting overflow to hidden stopped the jumping, but it meant that taller screenshots were all arbitrarily cropped at the bottom, and the results weren't usually pleasing.
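
For anyone curious what that stopgap looked like, here's a rough sketch. It assumes Bootstrap 3-style carousel markup; the element id, the 400px cap, and the image paths are placeholders I've made up for illustration, not the actual home page template.

```php
<?php /* Rough sketch only, not the AppDB's actual template. Bootstrap 3-style
         markup is assumed; the id, the 400px cap, and the image paths are
         placeholders. */ ?>
<style>
  /* Cap the carousel's height so slides of different heights can't push the
     page around; overflow:hidden crops anything taller than the cap. */
  #home-carousel { max-height: 400px; overflow: hidden; }
</style>

<div id="home-carousel" class="carousel slide" data-ride="carousel">
  <div class="carousel-inner">
    <div class="item active">
      <img src="screenshots/app1.png" alt="Application screenshot">
    </div>
    <div class="item">
      <img src="screenshots/app2.png" alt="Another application screenshot">
    </div>
  </div>
</div>
```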

Ultimately I decided to select an assortment of screenshots from the AppDB and manually edit them to uniform proportions for the carousel. The results went live a few minutes ago, as I write this.

This will be my last blog post for a while; there will be one more to report on WineConf 2017, which I'll be attending at the end of October. This isn't, however, the end of my work on the AppDB. I have a list of things I still want to do.


Saturday, August 19, 2017

This past week I finally got around to fixing the very first bug I ever filed against the AppDB.

It wasn't really a bug per se, but an enhancement request: add a field to the test report form for changes to default settings in winecfg (Wine's configuration tool). The bug has been gathering dust in Wine's bugzilla since 2008, but the reasons for wanting the enhancement are still valid.

The AppDB rating system differentiates between applications that work as well as they do on Windows "out of the box" (platinum rating) and ones that can be made to work as well as on Windows with some effort (gold rating). False platinums submitted by users who did in fact use various tweaks to get the application working have been a chronic problem, as has the issue of users who give gold ratings but fail to specify the workarounds needed to achieve that.

I was already familiar with the test report form code from having grappled with the distributions issue, so adding another field to the form and database would not be a difficult task. The challenges here were in the design and wording of the interface: my goal was to reduce user errors, not create more.

The first issue was what to name the field. My bug report just mentioned changes to winecfg, but in reality the kinds of things that might need to be done to make an application work encompass more than that, such as changes to registry settings or the installation of runtimes or codecs. In addition, many users make such changes through winetricks, a third-party script, and don't think of them as changes to winecfg.

I needed a broader word that would encompass all possible fiddling, and in the end it came down to a choice between "tweaks" and "workarounds." I chose the latter because that's the word most commonly used in bugzilla.

The next issue was deciding what kind of field to add. I knew we needed a wysiwyg textarea that would allow users to say whatever they wanted/needed to say, but I also knew from similar fields already in the form that such a field is not useful for restricting ratings precisely because users can say whatever they want. If users could simply type in "None" in a Workarounds field, then the presence of content in that field couldn't reliably be used to restrict platinum ratings.

So I decided to add two fields to the Workarounds section. One is a simple yes/no question with radio buttons: "Were any workarounds used for problems that do not exist in Windows?" The other is a wysiwyg textarea for users who answer "yes" to describe the workarounds they used.
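
Stripped of the AppDB's form-building code, the two fields amount to something like this sketch; the field names and markup here are placeholders of my own, not the actual identifiers.

```php
<?php /* Stripped-down sketch only; the field names (workaroundsUsed,
         workaroundsDescription) are placeholders, not the AppDB's actual
         identifiers, and the real form is built by the AppDB's own form code. */ ?>
<p>Were any workarounds used for problems that do not exist in Windows?</p>
<label><input type="radio" name="workaroundsUsed" value="yes"> Yes</label>
<label><input type="radio" name="workaroundsUsed" value="no"> No</label>

<p>If yes, please describe the workarounds used:</p>
<textarea name="workaroundsDescription" rows="8" cols="70"></textarea>
```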

With that in place, I added checks to the form processing to make sure the submitted rating is consistent with the answer to the Workarounds question. Users are now prevented from submitting platinum ratings if they report having used workarounds, and users who submit gold ratings without having described the workarounds used are now required to supply that information.
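
In rough PHP, the checks boil down to something like the following; the function name, field names, and messages are my own placeholders rather than the actual form-processing code.

```php
<?php
// Sketch of the consistency checks; the function name, field names, and
// messages are placeholders, not the AppDB's actual code.
function checkWorkaroundsConsistency(string $rating, string $workaroundsUsed,
                                     string $description): ?string
{
    // Platinum claims the app worked out of the box, so reported workarounds
    // contradict it.
    if ($rating === 'Platinum' && $workaroundsUsed === 'yes') {
        return 'Platinum means the application works without workarounds; '
             . 'please pick a lower rating or answer "No".';
    }

    // Gold means workarounds were needed, so they must be described.
    if ($rating === 'Gold' && trim($description) === '') {
        return 'Gold ratings require a description of the workarounds used.';
    }

    return null; // consistent: let the submission through
}
```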

Could users willfully bypass those checks by, for example, putting workarounds information in another field and giving a platinum rating? Sure. However, I don't believe many (if any) of the incorrect ratings we've seen have been willful attempts to mislead, and if there are any, they will still be caught by the maintainer or admin processing the submitted form.





Sunday, August 6, 2017

For the past couple of weeks I've been focusing on changes that mainly benefit AppDB admins, specifically the management of maintainers.

One of the first problems I fixed when I began this internship was with the interface for managing maintainers. In addition to fixing the sort order to something that made sense to humans (the original order was by userId, and appeared random), I also added a column to display the last time the maintainer had logged into the AppDB. I suspected that we had many maintainers who hadn't logged in for a long time, and the actual number proved even worse than I had feared: 57% of the maintainerships belonged to users who hadn't logged in for over two years. (Note the number is for maintainerships, not users, as one user may have multiple maintainerships.)
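
Conceptually, the revised admin view is driven by a query along these lines; the table and column names below are invented for illustration and don't match the AppDB's real schema or query helpers.

```php
<?php
// Illustration only: the table and column names (appMaintainers, user_list,
// stamp_lastlogin, realname) are assumptions, not the AppDB's real schema.
function listMaintainerships(PDO $db): array
{
    // Join each maintainership to its user so the name and last login can be
    // shown, and sort by name instead of by userId.
    $sql = "SELECT m.appId, m.versionId, u.realname, u.stamp_lastlogin
            FROM appMaintainers m
            JOIN user_list u ON u.userId = m.userId
            ORDER BY u.realname, u.stamp_lastlogin";

    return $db->query($sql)->fetchAll(PDO::FETCH_ASSOC);
}
```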

How did this happen? The function that notifies maintainers of new test reports in their queue also sends two reminders, and removes the maintainership if they fail to process the queued reports after 7 days. That worked well for widely used programs, but the AppDB has many entries for less-popular programs. Those were the maintainers falling through the cracks: since they weren't getting submitted test reports, and nothing was checking to see if they were submitting their own (which they are supposed to do), they were never removed.

About a month ago I removed the inactive user check, and in the process of studying that code discovered that it was also supposed to remove maintainerships from users who had not logged in for six months. It never worked, because the check was only run against users who did not have data associated with their accounts, and being a maintainer was defined as having data. So the check was looking for maintainers in an array that had explicitly excluded them.
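
Here's a stripped-down illustration of the flaw (not the actual code, and with "has data" reduced to "is a maintainer" for brevity):

```php
<?php
// Self-contained illustration of the flaw, not the actual AppDB code. For
// brevity, "has data" is reduced to "is a maintainer"; in the real check it
// covered any data tied to the account.
$allUsers = [
    ['name' => 'alice', 'isMaintainer' => true,  'monthsSinceLogin' => 30],
    ['name' => 'bob',   'isMaintainer' => false, 'monthsSinceLogin' => 30],
];

// Step 1: keep only users with no data -- which excludes every maintainer.
$usersWithoutData = array_filter($allUsers, function ($u) {
    return !$u['isMaintainer'];
});

// Step 2: look for inactive maintainers among them. Dead code: no maintainer
// survived step 1, so no maintainership is ever removed.
foreach ($usersWithoutData as $u) {
    if ($u['isMaintainer'] && $u['monthsSinceLogin'] > 6) {
        echo "Would remove maintainerships for {$u['name']}\n";
    }
}
```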

With all that in mind, I added a control to the admin control center to identify and delete maintainers who have not logged in for over 24 months. I eventually plan to add it to the daily cleanup cron script. The time period, IMO, is still overly generous; I selected it in part because of the large backlog of maintainerships that would be removed the first time it was run (3439 out of 6242), and in part because no time limit had ever been enforced before and I wanted to make it exceedingly difficult for anyone to argue that the definition of "inactive" was unreasonable. I may eventually shorten the time period to 18 or even 12 months.
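
The cleanup itself boils down to something like the following sketch, again with invented table and column names (appMaintainers, user_list, stamp_lastlogin) rather than the AppDB's real schema; the real control goes through the AppDB's own query layer.

```php
<?php
// Sketch only: appMaintainers, user_list, and stamp_lastlogin are invented
// names, not the AppDB's real schema; the real control uses the AppDB's own
// query layer rather than raw PDO.
function deleteInactiveMaintainerships(PDO $db, int $months = 24): int
{
    $cutoff = (new DateTime("-{$months} months"))->format('Y-m-d H:i:s');

    // Remove every maintainership whose owner hasn't logged in since the cutoff.
    $stmt = $db->prepare(
        "DELETE m FROM appMaintainers m
         JOIN user_list u ON u.userId = m.userId
         WHERE u.stamp_lastlogin < :cutoff"
    );
    $stmt->execute([':cutoff' => $cutoff]);

    return $stmt->rowCount(); // maintainerships removed
}
```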