How can users be literate in systems that evade the comprehension of even the experts who created them? For generative A.I., such literacy seems to depend upon terms of art: misleading homonyms like ‘explainable’, ‘memorization’, ‘natural’, etc. In the field of computer science, terms of art serve as metrics to quantitatively evaluate A.I. performance, with the aim of achieving greater efficiency in the tasks those systems were programmed to perform in the first place. Apparently, machine intelligence has blasted off; it approaches infinity and beyond while we stand still. With this post, I attempt to carve off a clean…