Papers I Liked 2023

I felt like I barely did any serious reading this year, and maybe that’s even true, but my read folder contains 168 papers for 2023, so even subtracting the ones that are in there by mistake, that’s enough to pick a few highlights. As usual, I hesitate to call these favorites, but I learned something from each of them. They are in no particular order, except chronological by when I read them. Themes are all over the place because this year was one of topical whiplash for me: broadly, early in the year I was reading a lot more economics, and later in the year more machine learning. Computational econ was a focus because I taught that class again after a two-year hiatus and added Python. Learning Python was a bigger focus: I can now say that I am quite middling at it, which was an uphill battle. I spent the middle of the year trying to catch up with the whole language modeling thing that is apparently hot right now. A lot of the learning on each of these topics came from books and classes, so I will add a section on those too.

Classes and Books

  • Python, introductory
    • I quite liked the QuantEcon materials for the basics, though that preference is idiosyncratic: they target numerical methods in economics, and I had already used the Julia versions.
  • Python, advanced
    • Please help me, I’m dying. Send recs. Part of it is that I still need a deeper foundation in the basics of computation (like, command-line utils, not CS theory). Part of it is that the one good thing about Python, its huge community and rich library ecosystem, is also the terrible thing about it: the whole ecosystem is a huge and ever-shifting set of incompatible hacks and patches fixing basic flaws in older patches fixing basic flaws in older patches, ad infinitum.
  • General Deep learning
    • Melissa Dell’s Harvard class is the only one I’m aware of that is aimed at economists and explains modern practical deep learning, including contemporary vision, text, and generative architectures, with a focus on transformers. Use this if you want to do research with text, images, or documents. It is taught by an economic historian, but it is orders of magnitude more up to date than anything by an econometrician or computational economist, including what gets published in top econ journals (which are great, but not for ML).
  • Natural Language Processing
    • Jurafsky and Martin, Speech and Language Processing, 3rd ed.: Learn the history of NLP up to the modern era. A lot of the old jargon remains; the methods mostly don’t. But this will explain the tasks and how we got to modern methods.
    • HuggingFace Transformers is the library people actually use for text processing. This is mostly a software how-to, but then again modern NLP is pretty much nothing but software, so you may as well get it directly (see the short sketch after this list).
    • Grimmer, Roberts, and Stewart, Text as Data: Fantastic on research methods and how to learn systematically from document corpora. The technical methods are from the Latent Dirichlet Allocation era and now charmingly dated, though their stm software will get you quite far, quite quickly, in the exploratory phase of a project.
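
To make the HuggingFace point concrete, here is a minimal sketch of the library’s pipeline API applied to sentiment classification. The specific model checkpoint is just one illustrative choice from the Hub, not a recommendation, and the example assumes transformers plus a backend like PyTorch are installed.

    # A minimal sketch of the HuggingFace Transformers pipeline API.
    # Assumes `pip install transformers torch`; the model checkpoint is
    # one illustrative choice among many on the Hub.
    from transformers import pipeline

    # Wrap a pretrained sentiment model in a ready-to-use pipeline.
    classifier = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )

    docs = [
        "This paper was a delight to read.",
        "The replication files were an ever-shifting pile of hacks.",
    ]
    # Each result is a dict with a predicted label and a confidence score.
    for result in classifier(docs):
        print(result["label"], round(result["score"], 3))

The same pattern swaps in other tasks, such as summarization or named entity recognition, by changing the task string and the checkpoint.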

Papers I Liked
