Track the web's top pain points over time #15
Some discussions about this happened in another place; let me try to summarize the important points here. @robnyman, @dontcallmedom, and I talked about potentially doing several types of surveys:
In short, I think all of these are interesting and fill a purpose. Also – particularly in the current economy – it would come down to resources, people, and money to figure out what we can run and how we prioritize them.
Definitely a fair point, and MDN short surveys are currently the fastest and cheapest way we have to survey developers. However, these types of surveys can't be compared with the DNA-style surveys. In a short survey, we wouldn't be able to ask users to go through a long and tedious MaxDiff sorting process. We could do a simpler ranking question instead, but that has drawbacks: much lower fidelity of the results, and no way to know how far apart each pain point is from the others. Also, ranking 28 needs (the full list from the DNA survey) takes time, and we'd likely see low participant numbers. If we were to go with the MDN short survey format, it'd be good to lock down a short list of pain points and make it quick for people to signal their top point(s). It would also be good to allow for free-text answers.
The most obvious approach would be to take the 28 needs statements from 2020 and ask developers to rank them using the same question type (MaxDiff) as originally. Unfortunately this takes a lot of time for survey takers, and any change to the list of 28 statements could influence the results. Another approach is asking "how satisfied are you with X" at many points over time, similar to the Satisfaction by Subcategory from MDN DNA 2020. Unfortunately we expect the responses to change slowly over time, so we'd need a lot of data to detect any real change. For both of the above, one also has to consider whether the audience is equivalent at every point in time, to avoid mix-shift effects. Finally, we could ask whether developers think things have improved in the past 12 months. To make sense of the results, we'd need to ask about improvements across a number of things, including things we're confident haven't improved. These are just my non-researcher ideas.
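To put a rough number on "a lot of data", here is a back-of-the-envelope sketch using the standard two-proportion z-test approximation. The satisfaction percentages below are purely illustrative assumptions, not measured values:

```python
# Illustrative sample-size estimate for detecting a small change in a
# "satisfied" percentage between two survey waves (two-proportion z-test).
from statistics import NormalDist

def respondents_per_wave(p1, p2, alpha=0.05, power=0.8):
    """Approximate respondents needed in each wave to detect a shift in the
    share of satisfied developers from p1 to p2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided test
    z_power = z.inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return round((z_alpha + z_power) ** 2 * variance / (p1 - p2) ** 2)

# A 2-point change (40% -> 42% satisfied) needs roughly 9,500 respondents per
# wave at alpha = 0.05 and 80% power; a 5-point change needs about 1,500.
print(respondents_per_wave(0.40, 0.42))
print(respondents_per_wave(0.40, 0.45))
```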
Thanks @foolip, those are great points. Ruth is proposing to categorize the pain points into larger buckets (testing, documentation, browser compat, etc.) and then test the buckets one by one using MDN short surveys with a "don't care"/"care some"/"care a lot" approach. Schalk also proposed running another short survey with the top 3 of each category so we can also compare across buckets. That's a good point about the mix-shift effect too; I wonder if there's a way we can design the survey to avoid it being too much of a problem. I'll report back here with a complete list of categories and pain points to move this forward.
I came up with the following categories, with an average of 5 items each. I'd love to also add something to ask whether any new pain points have emerged, plus Philip's question about whether things have improved over the past 12 months.
Interoperability
Testing and debugging
Documentation and learning
Frameworks, libraries, build tools, and services
Web capabilities
Compliance with laws, regulations, and best practices
Some notes from the WebDX CG meeting today, in no particular order:
As an action item for next steps, I volunteered to take a look at the questions from the DNA survey and see how we could trim them down to a shorter version. @Rumyra, let me know if you have access to it.
I think that's worth discussing. In the case of IE11 specifically, even if it's reached end of support, I think it's still interesting to either get confirmation that developers have now dropped support for it, or learn that they still need to support it for the foreseeable future due to slow update rates among users.
@captainbrosset - I've had a quick look into it. We still have the original website with the results running at https://insights.developer.mozilla.org/ and, albeit archived, the repo powering it at https://github.com/mdn/insights. I haven't been able to find the results in Alchemer - I'm enquiring, as I expect it wasn't called 'DNA survey' when created (@atopal, do you have any insight into that?). However, the PDFs are available in the repo at https://github.com/mdn/insights/tree/main/public/reports/pdf, which could still prove helpful.
Thank you @Rumyra. I looked into the reports in more detail, and even though we don't have the survey questions, I think it's quite easy to imagine what a trimmed-down version of the DNA survey would be: the MaxDiff ranking of the 28 needs. The DNA survey had more questions around technologies and what's missing from the web, which we could remove to create a shorter survey. The DNA survey displayed sixteen sets of five need statements, ensuring that each of the 28 needs was seen ~3 times by each respondent. For each set, respondents were told to pick the one need that caused them the least frustration and the one that caused them the most frustration. I don't know if these sets were created randomly for each person taking the survey, or designed beforehand. Also of interest: 79.6% of respondents agreed (or strongly agreed) that the list of needs was a fair representation of the needs they experience as a web developer. But 13.4% neither agreed nor disagreed, which means there is room for improvement in the needs list. Finally, it's worth noting that a sophisticated analysis of the MaxDiff data was done for the DNA survey, and to extract proper data from a new survey, we'd need to do the same thing.
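For reference, the simplest possible analysis of MaxDiff answers is a best-minus-worst count per need. The sketch below (with made-up need labels) assumes the raw data is just the "most frustrating" and "least frustrating" pick from each set; the DNA survey's actual analysis was more sophisticated than this:

```python
# Minimal best-minus-worst counting for MaxDiff answers. A simpler stand-in for
# the more sophisticated model the DNA survey used; need labels are made up.
from collections import defaultdict

def score_maxdiff(picks):
    """picks: iterable of (most_frustrating, least_frustrating) pairs,
    one pair per answered set."""
    scores = defaultdict(int)
    for most, least in picks:
        scores[most] += 1   # chosen as causing the most frustration
        scores[least] -= 1  # chosen as causing the least frustration
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

example_picks = [
    ("Browser compatibility", "Documentation"),
    ("Testing across browsers", "Documentation"),
    ("Browser compatibility", "Build tooling"),
]
print(score_maxdiff(example_picks))
# [('Browser compatibility', 2), ('Testing across browsers', 1),
#  ('Build tooling', -1), ('Documentation', -2)]
```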
In the event that we want to try to run a simplified/trimmed-down version as an MDN short survey, I took a stab at coming up with a new list of needs:
This is largely inspired by the DNA survey needs, but with a few changes:
This list has 20 needs. If we wanted to run it as a MaxDiff, we could use 10 sets of 6 needs to ensure each need is seen 3 times.
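For what it's worth, here is a minimal sketch of one way the sets could be generated per respondent so that each of the 20 needs appears exactly 3 times and never twice in the same set. This is an assumption about how it could be done, not how the original survey did it:

```python
# Greedy balanced set generation: always pick the least-shown needs for the next
# set, so exposure stays even and a set never repeats a need.
import random
from collections import Counter

def build_maxdiff_sets(needs, n_sets=10, set_size=6, seed=None):
    rng = random.Random(seed)
    counts = Counter({need: 0 for need in needs})
    sets = []
    for _ in range(n_sets):
        # Least-shown needs first, ties broken at random.
        candidates = sorted(needs, key=lambda n: (counts[n], rng.random()))
        chosen = candidates[:set_size]
        for need in chosen:
            counts[need] += 1
        rng.shuffle(chosen)  # randomize on-screen order within the set
        sets.append(chosen)
    return sets

needs = [f"need {i}" for i in range(1, 21)]  # 20 placeholder labels
sets = build_maxdiff_sets(needs, seed=1)
# With 10 sets of 6 drawn from 20 needs, every need is shown exactly 3 times.
assert all(count == 3 for count in Counter(n for s in sets for n in s).values())
```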
Perhaps I should have googled this topic earlier... I only just stumbled upon @PaulKinlan's quarterly web pain point survey, which seems to fill this gap nicely.
Feels to me like we shouldn't unnecessarily duplicate efforts here.
Just to record some of my thoughts around this somewhere:
So I'm more in favour of working from the ground up, based on specific questions we want to be able to answer. For example, one thing that we are interested in (and others also seem to want to know about) is "which versions of browsers are developers actually supporting, and what process do they go through to support them?". Previous surveys have had a "which browsers do you support" question and a "how far back do you support browsers" question, but they haven't allowed the answers to be coupled. It seems like a better design here might be along the lines of "Think of a site you worked on in the last six months. For this site, which versions of the following browsers did you have to support?", with specific options for major browsers (e.g. unsupported, current release only, back to ESR, specific versions). Then, for that specific project / set of browsers, you could ask how they go about testing for support (to get information on manual vs automated tests and choice of tooling), and ask about problems that they ran into with that browser on that project. Obviously, for other topics like general concerns about the web stack and surrounding tooling, we might need different questions. But I also think we already get more data about that from surveys like State of CSS.
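To make the coupling concrete, the question could be represented as a per-browser matrix plus follow-ups scoped to the same project. A hypothetical sketch: the browser names and answer options below are my own illustrations, not taken from any existing survey.

```python
# Hypothetical structure for a coupled "which versions did you support" question.
# Browser names and answer options are illustrative assumptions.
SUPPORT_OPTIONS = [
    "Did not support",
    "Current release only",
    "Current and previous release",
    "Back to ESR / long-term support release",
    "Specific older versions (please list)",
]

BROWSERS = ["Chrome", "Edge", "Firefox", "Safari", "Samsung Internet"]

QUESTION = {
    "intro": "Think of a site you worked on in the last six months.",
    "matrix": {browser: SUPPORT_OPTIONS for browser in BROWSERS},
    # Follow-ups stay scoped to the same project/browser set, so testing
    # practices and problems can be tied to the versions actually supported.
    "follow_ups": [
        "How did you test support for these browsers (manual vs automated, tooling)?",
        "What problems did you run into with these browsers on this project?",
    ],
}
```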
See #20. Google has agreed to share their "devSat" survey questions. The very first satisfaction questions are very similar to the Web DNA questions, but have obviously been updated too.
Context
This was originally proposed by @una here: #8 (comment)
The MDN DNA survey ran in 2019 and 2020 and gave us great insights into how web developers experience the web platform.
It got 76K results in 2019 and 31K results in 2020.
Both times, people had to rank 28 "needs" using a MaxDiff study (so, only a few needs were displayed at each step of the survey).
The list of 28 needs was mostly the same across the 2 years, with some differences: the 3 lowest-ranking needs from 2019 were dropped in 2020 and replaced with 3 new ones.
The top 10 needs stayed the same over the 2 years, but their order changed a bit. Here is the 2020 list:
Running a new survey
Two years have passed, and it seems that knowing the answer to the following questions would be useful:
Furthermore, tracking this over time (for example once a year) seems useful too.