Final Project - HIST 680

Doing Digital Humanities | Final Reflection

What’s the aim of your project?

I wanted to go down memory lane, truth be told. I loved the writing and research that I did for the American Historical Association (AHA) over a decade ago and saw this final project as a good avenue for comparing my work then on AHA Today with the work now on their new blog, Perspectives Daily.

As I explain in the introduction to my final project on my blog, I didn’t really realize at the time that I was researching and participating in digital history – I was young, and quite honestly, technology wasn’t the behemoth that it is today (although some folks may emphatically disagree with me!). I lost touch with the AHA’s blog as my government contracting career took root, so I wasn’t really sure what to expect when I delved back into their blog posts from the last calendar year, December 2018 through November 2019. Using Voyant Tools, I was hoping to see some thematic overlap between the blog posts that I wrote over a decade ago and the newest posts from the last year. I was also expecting to see some significant changes, since the field has grown tremendously as technology changes at a rapid pace. Truth be told, this project bore out a little of both expectations.

What are the sources?

I compiled blog posts in basic Excel spreadsheets and uploaded them to Voyant Tools.

Did you discover anything about your sources? How did you decide to present these sources?

My sources were fairly straightforward and easy to load into Voyant Tools since they’re just websites, so I created four Excel spreadsheets with links to all of the articles and linked them in my blog under the Methodology section.

Standard webpage verbiage can throw off the corpus view pretty significantly. I added the terms that seemed to skew my initial corpus to the stopword list and was left with the top five terms that seemed logical and relevant. It wasn’t until I started to explore these top terms using the Correlations, Collocation, and Reader tools that I realized just how much template website text can skew the results, as much as any other filler text like prepositions and punctuation.
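Voyant applies its stopword list internally, but the skew described above can be illustrated in a few lines of Python. This is only a sketch; the sample text and stopword list below are made up for demonstration, not drawn from the actual corpus:

```python
from collections import Counter
import re

# Hypothetical scraped page text: article words mixed with
# template "verbiage" (menus, headings, boilerplate).
text = """Perspectives Daily menu search subscribe
historians discuss digital archives and teaching
menu search subscribe historians explore new methods"""

words = re.findall(r"[a-z]+", text.lower())

# Without a stopword list, template words dominate the top terms.
print(Counter(words).most_common(3))

# Adding the site-template terms to a stopword list (as the project
# did in Voyant) surfaces the substantive vocabulary instead.
stopwords = {"menu", "search", "subscribe", "and", "new"}
filtered = [w for w in words if w not in stopwords]
print(Counter(filtered).most_common(3))
```

The same principle applies whether the filler is a preposition or a navigation menu: frequency alone says nothing about relevance until the boilerplate is excluded.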

What software did you use?

I used Voyant Tools for my analysis and WordPress for my presentation.

What was your process?

  1. Split project into two categories for analysis – general interest and careers – mainly because those are my personal interests, both then and now
  2. Added publication names from the corpus – like AHA and perspectives – to my stopword list to ensure that my analysis wasn’t superficial
    Note: This list evolved as I got into my analysis. For example, I excluded professional because it was in the top five words, but the context was just headings, standard webpage verbiage, and menus.
  3. Compiled list of top five words using the Trends tool:
    • General Interest: AHA Today | Perspectives Daily
    • Careers: AHA Today | Perspectives Daily
  4. Used thumbnails of the cirrus and trend graphics to make my blog post more readable
    Note: I played around a good bit with the best way to present my findings on my blog and decided that thumbnails with links to the main corpus (in a separate tab for increased user friendliness and readability) was the best way to go so as not to take up too much real estate in the blog post.
  5. Explored the data using several Voyant Tools and decided that the Correlations, Collocation, and Reader tools would be the most useful for tracing the evolution of old digital history terms and also the rise of new terms and their corresponding contexts. I wanted to know if the terms I used the most in my AHA Today posts are still relevant today, and if so, their context.

Did you have any problems completing the project?

I wouldn’t necessarily call it a problem, but I realized that a lot of my top terms were used heavily because they appeared in menus or tags, so they didn’t really offer any insight into the article content. I excluded some of these terms early on in the project, but once I got into the heart of the analysis, I decided it was more useful to leave these terms in my blog, since I didn’t know they were insignificant until I delved into the analysis using additional grid tools in Voyant.

Note: I did update my word clouds at least half a dozen times based on my exploration with the tools and continued to add to the list of excluded terms until I felt I’d reached a natural stopping point.

There were top words that seemed relevant and looked like they would yield fruitful analyses when placed in context. Ultimately, I didn’t filter out these terms because, in the context of this project, it was beneficial to dig into the Voyant tools and then discover that their high frequency was due to standard webpage verbiage and the like.
