Today’s guest blog is by our own Melibee, Kyle Rausch. Kyle and I had a great conversation about the new Study Abroad 101 rankings and how they told a completely different story than our “go-to” data source in our field, IIE’s Open Doors Report. Read on to hear Kyle’s thoughts about what story data can (or can’t) tell.
If there is one practical thing I will take away from my graduate program and the scores of academic studies I have trudged through, it is that a careful evaluation of the data presented can often tell a more interesting story than the study itself! Frequently, I find myself with more questions than when I started reading, thanks to my curiosity about how the data was compiled and what the statistics are really telling me. Inevitably, that is about the point where I decide to power on the Keurig and indulge myself by clicking on the time-sucking Facebook tab that is, of course, always open in my browser.
I suppose it’s not really surprising, then, that a long, back-and-forth virtual conversation ensued when Missy, Melibee’s founder, asked for one of our Melibee hive members to come forward and write a post about the new data compiled by the Study Abroad 101 (SA 101) Study Abroad Rankings. Always interested in lists, rankings, and data about our field, I thought I’d take a stab at it. A cross-country move, multiple loquacious drafts, and several headaches later, I am still not entirely sure what the available data on study abroad and student trends is telling us.
The Study Abroad 101 Study Abroad Rankings list is based on a survey sent directly to students participating in different programs, and as such it provides a very Gen Y picture of American study abroad. It includes categories such as the most livable cities, top short-term programs, top food cities, and, interestingly enough, the top program providers. With the recent release of IIE’s Open Doors report, it is interesting to compare SA 101’s more student-driven stats with IIE’s more academic findings.
Since IIE’s report is compiled from data collected from academic institutions across the country, it is regarded as the definitive source for documenting study abroad trends. However, the results from the Study Abroad 101 survey seem to present something different. Consider the category ‘Top Friendliest Cities,’ where we find a cultural story.
What does it mean that American students voted the top three friendliest cities as Asian cities? The Study Abroad 101 report indicates that this category was based on the number of friends American students had made in the host country and with whom they would be likely to remain in contact. The fact that Seoul (South Korea), Hirakata (Japan), and Tokyo (Japan) placed at the top could be taken to mean that Asia is a more open region when it comes to meeting others. Is this because the region is regarded as more homogeneous as a whole and therefore more readily seeks out the other? Still, looking at the data, there is a gap that needs to be addressed: Hirakata is #2 with only six programs evaluated. It would be helpful to know how this city placed so high in the category with so few programs in the study.
Similarly, another category, ‘Top Livable Cities,’ says much about American culture. Whereas the top three friendliest cities were in Asia, this appears to have no correlation with how livable American students consider a city. Instead, the top three cities in this category are all English-speaking locales: Melbourne (Australia), Dublin (Ireland), and Gold Coast (Australia). As a lover and student of languages, I think our country’s under-appreciation of foreign language learning strongly influences how livable we find a city to be. This trend correlates with the findings of the IIE reports, too, which consistently list the United Kingdom as the top destination for U.S. students, largely because of the perceived similarities in culture and language. In my opinion, many American students want to study abroad…just not in locations where they have to radically challenge their Western ideology. Thankfully, the IIE does show a recent change in this trend, with interest in China on the rise and, interestingly, a large increase in Costa Rica.
These are just a couple of the stories I took away from these reports. Every year I am eager to see the new results published in the Open Doors report so I can monitor trends in our field. I was intrigued by Study Abroad 101’s survey since it seems to present a different perspective on study abroad trends and what matters to students. However, I still have many questions raised by both studies. For instance, how does the IIE handle the possibility of skewed data from well-known schools that send a large portion of students from other institutions? And for the SA 101 survey, why were those categories picked, and why does the data favor third-party providers so much?
I’m not sure this blog post presented any definitive conclusions from these studies (I’m about ready to power on that Keurig again!), but I’m interested…what stories is the data telling you, and what questions does it raise?
About the Author: Kyle Rausch works for Arizona State University’s Study Abroad Office in Tempe, Arizona. He previously served as Immigration Specialist and Passport Acceptance Facility Manager at Florida State University, where he is finishing his MS in Higher Education Administration.