GWS 2023: On the Reliability of Data

By Jon Freitag


The 2023 edition of Wargames, Soldiers, & Strategy magazine's Great Wargaming Survey is a wrap, and the results are in. This year, there were 9,282 completed surveys. That count is down from 2022, but 9,282 respondents still represent a solid sample size. Several new questions were added to the 2023 survey. There are still questions that have yet to see any analysis, new twists to apply to earlier analyses, and old questions to revisit with fresh data. Many topics to explore in the coming twelve months!

Let's start the 2023 analysis cycle with a look at a topic first examined in May of 2023: On the Reliability of Data. Why examine this topic again so soon after the last investigation? First, the survey counts are down from 2022. Second, the age question was asked differently. Third, I wondered if discussing the conundrum of first-time respondents would move the needle at all. Finally, we can assess the stability of the data with one more year added into the mix.

Prior Survey Response by Year

As identified in last year's analysis, the percentage share of respondents having taken a prior survey had remained in the low 50% range. After mentioning this result, I wondered if surfacing this tendency would affect results. Well, it is impossible to say whether mentioning this attribute had any impact upon the results, but the percentage of those who said they had taken the survey before jumped to 66%.

Years Spent Wargaming (Duration)

What about the number of years spent in the hobby? Did that remain consistent with previous years? No! The group claiming 31 or more years in the hobby spiked to nearly 45%. Perhaps the hobby is graying?

Respondent Location by Year

Year after year, the location of respondents remains stable, with roughly the same mix of locations in each survey.

Age Group by Year

As in last year's analysis, the tendency of the older age cohorts to grow over time continues. In tandem with the Duration chart above, 2023 saw a jump in the number of respondents in the older age cohorts. Is this phenomenon due solely to an aging population, or could something else be driving these results?

One of my frustrations in recent years is that compliance with answering the Age question has been slipping. In 2019, Age was changed from an age range to a discrete age to accommodate the psychometric study (one day, I will dig into those data!). Age was a required field. In 2020, Age became an optional field but remained discrete. 2021 and 2022 kept that same format, but non-compliance increased to the point that in 2022, nearly 1 in 5 respondents failed to answer the age question. Wanting to reverse this trend, I suggested returning to age cohorts rather than asking for a discrete age. The result? Non-compliance almost entirely vanished. Could these non-compliant respondents have resided primarily in the older age cohorts?

Primary Interest by Year

Finally, we revisit primary interest in wargaming. The trend seen earlier of a shift from Fantasy/Sci-Fi to both Mixed and Historicals continues. If responses are drawn from a representative sample, then the notion holds that non-historical wargamers shift away from preferring purely non-historical wargaming as they age. Is the survey catering to and drawing from fewer non-historical wargamers over time, or is the trend illustrated here accurate? Other analyses may be able to confirm one way or the other.

Even with some shifts in the data, I still maintain that the hypothesis that these data are drawn from similar populations, across time, cannot be rejected outright.
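One way to make that claim concrete, as a rough sketch only, is a chi-squared test of homogeneity on the year-by-category counts. The counts and category labels below are illustrative placeholders, not actual GWS tallies:

```python
# Minimal sketch: chi-squared test of homogeneity across survey years.
# The counts below are hypothetical placeholders, NOT actual GWS tallies.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: survey years (e.g. 2021, 2022, 2023); columns: primary-interest
# categories (e.g. Historical, Fantasy/Sci-Fi, Mixed).
counts = np.array([
    [4200, 2900, 3400],   # hypothetical 2021 responses
    [4100, 2700, 3300],   # hypothetical 2022 responses
    [3900, 2300, 3000],   # hypothetical 2023 responses
])

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.3f}")

# A large p-value means we cannot reject the hypothesis that the yearly
# samples share the same category proportions; a small p-value suggests
# the mix of respondents has shifted between years.
```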


Many questions. Some answers. Your comments are encouraged!

4 comments

Jason, thank you for your comments.

I have discussed survey methods in past posts, but in summary, participants in the survey are self-selecting, so random sampling of the population is not possible. That is the main reason (well, besides curiosity) to conduct repeated cross-sectional studies to examine the stability of these data through time.
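As for confidence intervals, with samples this large they are quite narrow. A minimal sketch, using the 9,282 completed surveys and the 66% repeat-respondent share reported in the post, with a simple normal approximation (purely illustrative):

```python
# Minimal sketch: a 95% confidence interval for a reported proportion,
# e.g. the share of 2023 respondents who say they took a prior survey.
# The 0.66 share and n = 9282 come from the post; the normal-approximation
# formula is standard, but the example is only an illustration.
import math

n = 9282        # completed 2023 surveys
p_hat = 0.66    # reported share of repeat respondents
z = 1.96        # ~95% two-sided normal critical value

margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
print(f"95% CI: {p_hat - margin:.3f} to {p_hat + margin:.3f}")
# With n this large, the interval is only about +/- 1 percentage point,
# so sampling noise is small relative to the self-selection concern.
```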

Jon Freitag

Hi,

I wonder, why don’t you report the confidence intervals in your analysis?
Also, where are the details about the sampling methodology, and are there response errors?

Thanks, Jason

Jason Thomas

Derek, thank you for coming back to take a look at this post and give the topic some thought.

If you have been following these analyses over the years, you will find that I address your point on the assumptions and caveats of these analyses routinely. This concern is one of the reasons I return to the Reliability of Data topic from time to time, not only to convince the readership but to reassure myself as well. In addition to this post, please visit the May 2023 post that touches on these points (https://www.karwansaraypublishers.com/en-us/blogs/great-wargaming-survey-blog-wss-magazine/gws-2022-on-the-reliability-of-data). What we are effectively doing is making a repeated cross-sectional study. By drawing different samples across time and seeing similar results, we cannot, as you state, reject the hypothesis that these samples are drawn from similar populations. Given the turnover in respondents, this, in itself, is a surprising result.

If the hypothesis cannot be rejected that the populations have remained similar over time, then examining trends in these data does not seem unreasonable.

Jonathan Freitag

You certainly can’t reject the hypothesis that these data are drawn from similar populations across time, but it really looks like you should.

Given the way you recruit participants (a self-selecting sample) and the way that social media publicity for the Great Wargaming Survey has varied over the years, it looks to me that the population you're reaching has varied year on year, and you really shouldn't be discussing trends in the data.

Derek Hodge

