Let’s Have a Look at Education and Religious Attendance
(ANALYSIS) Here’s why I started this Substack — to allow me to fully expand on some graphs that I post on social media that get all kinds of backlash.
That’s exactly what happened a couple of days ago when I posted the graph that’s below.
My tweet was simple: The data indicates that the relationship between regular religious attendance and education is a positive one. The more educated, the more likely to attend church/synagogue/mosque. That’s true in every wave of the Cooperative Election Study. And, the effect is not a small one. In many years, someone with a graduate degree is 50% more likely to be a weekly attender than someone without a high school diploma.
Oh, the replies came in fast and furious. (I had a friend with a decent Twitter following tell me that once you get 100 retweets on something, just turn off notifications. That’s good advice.) People said that it just couldn’t be true because the results don’t comport with their reality. To them, it’s the data that’s wrong, not their perception of the world around them.
Here’s what I’m going to do in this post: analyze every single piece of survey data I know of that includes both an education variable and a question about religious attendance. This post will be the one I link to whenever I get pushback on this point. I lay all my cards on the table here and just show you the raw empirical data as best I can. Let’s get to it.
Multiple people chimed in that this effect would go away if I limited the sample to folks who are actually old enough to have earned a college degree, meaning I should filter out the youngest adults in the sample. So, I did that. I kicked out anyone between the ages of 18 and 25. Here are the results:
Almost nothing changes. In some years, the percentages are exactly the same. There’s a very good reason for that. In the average year of the Cooperative Election Study, 90% of the sample is at least 25 years old. Remove that youngest slice, and the overall composition changes so little that it doesn’t matter in the grand scheme of things. So, that’s not a critique that finds any support in the data.
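For those who want to see the mechanics, here’s a rough sketch of that age-filter check in Python. The column names and the file name are placeholders for however a CES extract happens to be coded, and it skips survey weights for simplicity, so treat it as an illustration rather than my exact pipeline.

```python
import pandas as pd

# A rough sketch of the age-filter check. Column names ('year', 'age', 'educ',
# 'attend_weekly') and the file name are placeholders, and survey weights are
# skipped for simplicity.
ces = pd.read_csv("ces_cumulative.csv")  # hypothetical file name

# Drop the 18-25 year olds, keeping only respondents 26 and older
ces_older = ces[ces["age"] >= 26]

# Percent attending weekly or more, by survey year and education level
weekly_by_educ = (
    ces_older
    .groupby(["year", "educ"])["attend_weekly"]
    .mean()
    .mul(100)
    .unstack("educ")
    .round(1)
)
print(weekly_by_educ)
```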
But I wanted to take another approach to this question. I divided the sample into five-year birth cohorts. This is the way that most social science approaches the issue of generations. Generations are really a bad metric for measuring age because they are so large. For instance, millennials could be between the ages of 27 and 42 right now. It doesn't take much thought to realize that those are completely different life stages. Birth cohorts tend to mitigate that issue. I used the data from 2020, 2021 and 2022 combined. The total sample size is nearly 147,000 respondents.
OK, not all the lines are pointing upward here. That’s certainly the case for respondents born in the 1940s. There’s no relationship between education and weekly religious attendance for this group. But once you get to the second row of bar graphs, the line comes back to an upward tilt. That positive relationship is evident in every single birth cohort from 1955 onward. Nine in total. So, this isn’t a phenomenon of just younger people — it’s true for individuals who are very much into their retirement years, too.
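If you want to replicate the cohort split yourself, the basic move is binning birth years into five-year buckets before computing the attendance shares. Here’s a minimal sketch, again with placeholder column names and no survey weights, so treat it as an illustration rather than my exact code:

```python
import pandas as pd

# A minimal sketch of the five-year cohort split, assuming the same placeholder
# columns as before ('year', 'birthyr', 'educ', 'attend_weekly'), unweighted.
ces = pd.read_csv("ces_cumulative.csv")  # hypothetical file name
pooled = ces[ces["year"].isin([2020, 2021, 2022])].copy()

# Bin birth years into five-year cohorts: 1940-1944, 1945-1949, ..., 1995-1999
edges = list(range(1940, 2005, 5))
labels = [f"{b}-{b + 4}" for b in edges[:-1]]
pooled["cohort"] = pd.cut(pooled["birthyr"], bins=edges, right=False, labels=labels)

# Percent attending weekly or more, by birth cohort and education level
cohort_table = (
    pooled
    .groupby(["cohort", "educ"], observed=True)["attend_weekly"]
    .mean()
    .mul(100)
    .unstack("educ")
    .round(1)
)
print(cohort_table)
```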
But the Cooperative Election Study is but one piece of survey data. There are others. The other one that I use often is the General Social Survey, which has been conducted every year or two since 1972. The issue here is that the GSS is much smaller than the CES, more than an order of magnitude smaller. The sample in the 2018 CES was 60,000. It was 2,348 in the GSS. So that means our margin of error is going to be way bigger in the GSS sample.
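To put a rough number on that, here’s the standard back-of-the-envelope margin of error for a proportion at those two sample sizes. It assumes simple random sampling and a 50% split (the worst case), so a real survey design would be noisier still, but the contrast is the point:

```python
import math

# Back-of-the-envelope 95% margin of error for a proportion, assuming simple
# random sampling and p = 0.5 (the worst case). Real survey designs add more.
def moe(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

print(f"CES 2018 (n=60,000): +/- {moe(60_000):.1%}")  # about +/- 0.4 points
print(f"GSS 2018 (n=2,348):  +/- {moe(2_348):.1%}")   # about +/- 2 points
```

That works out to roughly plus or minus 0.4 points for the CES versus about 2 points for the GSS, which is why any single GSS wave can bounce around.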
Same basic analysis here — education and those who say that they attend religious services nearly every week or more. I did this for every single survey year in the GSS. That’s 34 survey years in all. The results are a bit of a mixed bag, really.
You have to look at these results in their totality, not just one single survey wave, because of the smaller sample sizes and larger margins of error. I mean, just scan the row that begins with 1993 and runs through 2002. In some graphs, there is no relationship. That’s true in 1994 and 1996. But it’s not the case in 1998, 2000, or 2002. And the next row basically indicates no relationship between these two variables.
But here’s something I will say — it’s very hard to look at these results and argue that the relationship between education and religious attendance is in the negative direction. The only years in which that may be true are 1972 and 1985. But 1985 is really weird among those with a graduate degree (and so is 1988, if we are being honest).
Let me throw one more dataset into the mix. It’s the Baylor Religion Survey, which has been conducted in six separate waves beginning in 2005. Wave 6 was fielded in 2021, but that data is not publicly available yet. Here’s that same analysis of education and religious attendance in the five waves that I do have.
Yeah, those lines are basically flat. I know that some of you are going to look at 2017 and tell me that I’m wrong, but the slope of that line is not statistically significant, either. This is a big pile of no relationship between two variables. You can’t draw any conclusions from this data either way.
Why does the CES show a clearly positive relationship in every single survey wave, while the data from Baylor and the GSS are not so clear? There could be a whole bunch of reasons. One is that the CES is administered online, through a web browser. The GSS was conducted face to face for decades, while the Baylor survey was initiated through a letter in the mail and could then be filled out on paper or through a web portal. I wrote a bunch about survey mode and how it can impact results in this post.
But I wanted to put all this to another, even more rigorous test. Obviously, the world is way more complicated than a simple bivariate relationship between education and church attendance. There are a whole bunch of factors that could impact how often someone attends religious services. Regression is a way to hold constant some of those main factors and really try to understand how education, by itself, does the work. So, I specified one using the GSS data. I divided it into decades of administration and controlled for all the usual demographic suspects: gender, race, partisanship, age and income.
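For the curious, here’s a rough sketch of what that kind of model can look like in code. The variable names are placeholders for however a GSS extract is coded, it’s an unweighted logit rather than the exact specification behind the graphs, and the predicted shares come from setting every respondent to each education level in turn:

```python
import pandas as pd
import statsmodels.formula.api as smf

# A sketch of the kind of model described above, not the exact specification.
# Column names ('attend_weekly', 'educ', 'female', 'race', 'partyid', 'age',
# 'income', 'year') are placeholders for however the GSS extract is coded.
gss = pd.read_csv("gss_extract.csv")  # hypothetical file name
gss["decade"] = (gss["year"] // 10) * 10

cols = ["attend_weekly", "educ", "female", "race", "partyid", "age", "income"]

for decade, chunk in gss.groupby("decade"):
    chunk = chunk.dropna(subset=cols)

    # Unweighted logistic regression: weekly attendance on education plus controls
    model = smf.logit(
        "attend_weekly ~ C(educ) + female + C(race) + C(partyid) + age + income",
        data=chunk,
    ).fit(disp=False)

    # Predicted probability of weekly attendance at each education level,
    # setting every respondent to that level and holding the controls as observed
    preds = {}
    for level in sorted(chunk["educ"].unique()):
        counterfactual = chunk.copy()
        counterfactual["educ"] = level
        preds[level] = model.predict(counterfactual).mean()
    print(decade, {k: round(v, 3) for k, v in preds.items()})
```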
Guess what? The lines are pointing upwards in every single decade. In the 1970s, about 28% of those without a high school diploma attended weekly. It was 46% of those with a graduate degree. In the sample collected between 2000 and 2008, the overall level of attendance is clearly lower — but the angle of the line is upward. It’s 14% of those at the bottom end of education, and it’s 34% of those at the top.
Now, the line does flatten out considerably in the data that was collected between 2010 and 2018. Among the less educated, about 23% were weekly attenders. Among the most highly educated, it was 27%. That gap is smaller, but it’s still statistically significant. So, the relationship between education and attendance may be weakening some. But the evidence all points in the same direction: education and attendance are positively correlated.
Let’s try the same thing with the Cooperative Election Study. Same controls, but I used five different individual survey years here given that the CES has such a huge sample size.
And that same positive relationship is on display in these plots. In 2008, 27% of those without a high school diploma were weekly attenders. It was 36% of those with a graduate degree, a gap of nine points. In 2012, the gap was 10 points. In 2016, it was 13 points. In both 2020 and 2022, it was 11 points. That’s pretty darn consistent.
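If you want to sanity-check gaps like those, the arithmetic is just the difference between the top and bottom education categories in each wave. Here’s a quick, unadjusted sketch (no controls, no weights), with the same placeholder column names as before, so it won’t reproduce the regression-based numbers exactly:

```python
import pandas as pd

# A quick, unadjusted check on those gaps (no controls, no weights). Assumes the
# lowest education code is "no high school diploma" and the highest is
# "graduate degree"; column and file names are placeholders as before.
ces = pd.read_csv("ces_cumulative.csv")  # hypothetical file name

for year in [2008, 2012, 2016, 2020, 2022]:
    wave = ces[ces["year"] == year]
    rates = wave.groupby("educ")["attend_weekly"].mean() * 100
    lowest, highest = rates.index.min(), rates.index.max()
    print(f"{year}: {rates[highest] - rates[lowest]:.0f}-point gap")
```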
OK, so let’s look at this in totality now. One thing I really try to stress with my graduate students when teaching research methods is how messy empirical results can be sometimes. It’s very rare to see 10 studies that test the relationship between A and B and find that in all 10 cases, more of A leads to more of B. That’s just not how social science works.
But if you find that in six of those tests the relationship is positive, in three it’s not statistically significant, and in one it’s negative, I think it’s fair to assume (with less than perfect certainty) that the relationship is generally a positive one. The next 10 studies may point things in a different direction, and you should always update your understanding in light of new evidence. But you have to roll with what you’ve got.
I just don’t know how you look at all this data that I’ve brought to bear and conclude that there’s not a positive relationship between education and religious attendance. You most certainly cannot conclude that it’s a negative relationship. That finds basically no support in this data at all. There’s some evidence that the relationship may not be statistically significant, but for me, the regression clears that up.
People who are more educated are more likely to be attending a religious service in the local house of worship this weekend than those with a high school diploma or less. That’s what the preponderance of evidence tells me.
This piece was originally published in Ryan Burge’s “Graphs About Religion” Substack.
Ryan Burge is an assistant professor of political science at Eastern Illinois University, a pastor in the American Baptist Church and the co-founder of and a frequent contributor to Religion in Public, a forum for scholars of religion and politics to make their work accessible to a more general audience. His research focuses on the intersection of religiosity and political behavior, especially in the U.S. Follow him on Twitter at @ryanburge.