In an earlier post, I mentioned that I have a “mind of paper” in that much of what I believe I know is actually based on works I’ve read over the decades. One important influence was the late Raymond McInnis, an academic librarian at Western Washington University, who probably should have been cited as a source for this sentence in my chapter 6: “The development of a scientific vocabulary and associated classificatory terms are generally considered hallmarks of an emerging scientific field as they facilitate the exchange of knowledge, even as both vocabularies and classifications are subject to change as the particular science advances in knowledge.” I was in email contact with him some 20 years ago while working on my dissertation on the diffusion of theories, and I cited him extensively there.
As an academic librarian, McInnis was particularly interested in “discourse synthesis,” which he viewed as a communal activity in which scholars develop a recognized body of knowledge through written communication, but which can also be viewed on an individual level as the process by which an author transforms multiple existing texts into new ones.
“Discourse synthesis” today is often associated with the process students go through in researching and writing academic papers, intended as educational preparation for the many professional careers in which writing publishable papers is essential to success. Prior to the rise of the internet, academic librarians called this process “bibliographic instruction.” It generally involved helping students find appropriate print resources for a particular topic within the library’s collection, after which English composition instructors (usually in the infamous “Comp One” freshman composition course) would help students synthesize those resources into a written product appropriate for a particular discipline (or “discursive community”).
This process has evolved dramatically in recent decades. “Information literacy” is now the preferred phrase among academic librarians working in this area, and their mandate for helping students reaches far beyond their own collections, as print reference tools are nearly extinct, overtaken by electronic resources that are internet-based and often open access. The composition process, too, has evolved dramatically. In their 2021 “Discourse synthesis: Textual transformations in writing from sources,” Nancy Nelson and James King describe work on this synthesizing process, which follows source selection, as research into “the ways in which writers make use of, and transform, multiple other texts in writing their own. It is intertextual research that has blurred boundaries of various kinds, not only the boundary between the processes of reading and writing but also boundaries across disciplines as well as regions of the world.”
However, McInnis was most fascinated by the early origins of “discourse synthesis,” a largely unexplored part of pre-modern intellectual history that gave rise to “discursive communities.” In his 1995 “The lexicography of scholarly concepts,” he wrote:
“With much intellectual struggle, the concept, as a symbol for knowledge emerged in the eighteenth century with a more dynamic function than it had in preceding ages. Often metaphorical (e.g., “scientific revolution”), concepts served as rhetorical conventions in what came to be recognized as part of “empiricism.” Recognized today as a separate school of thought, the period in fact became known as the “Age of empiricism.” Creating new knowledge became no longer a matter of retrieving arguments by recourse to topoi, in the manner of the ancient rhetoricians. In the preceding ages, topoi, a Greek word for topics, are, in effect, “concepts.” In rhetoric, a topic [is] a place or store or thesaurus to which one resorted in order to find something to say on a given subject. More specifically, a topic was a general head or line of argument which suggested material from which proofs could be made. With the age of empiricism, creating new knowledge, as a form of invention, became a process of discovery through empirical search and making assertions that express relationships the mind perceives in or imposes on that information in order to render it coherent. A creative act, invention, then, involves the making of new meaning, not a recall of preconceived lines of argument or ways of arguing” (1995, p. 32).
The following year, he guest-edited a special issue of Social Epistemology on the topic of “Discourse Synthesis,” and his introduction to the issue stated that “Today, ‘understand’, ‘explanation’, ‘agreed-upon’, ‘consensus’ or ‘concept’ are terms in the vocabulary of discourse. To a greater or lesser degree, these same terms comprise the lexicon of discourse synthesis. Since the seventeenth century, many of these same words have helped to shape the discourse of the new logic. (As I note later, the new logic is the label generally given the discourse that emerged out of the decline of scholasticism.) Books by John Locke and David Hume, for example, include ‘Understanding’ in the title. These words, then, make up the standard lexicon of discourse… At the risk of simplification, scholarship works this way: discipline by discipline, scholars seek to understand and explain the subject matter in their area of specialization. The object of their activity is to produce a body of knowledge in specific fields of inquiry. As they achieve an understanding of their subject, scholars publish the results of their interpretations (that is, their research findings) in the form of explanations. Explanations then, can be said to communicate understanding.” (1996, pp. 1-2)
His other contribution to the issue (“Discursive communities/interpretive communities: The new logic, John Locke, and dictionary-making, 1660–1760”) provides a much richer historical account of how this process evolved over time, including the divergence of dictionaries and encyclopedias as separate intellectual enterprises.
He wrote, “After 1690, for several generations, the majority of theorizing upon language by English and Continental European writers was, scholars widely agree, a reworking or reinterpretation of Locke’s Essay on Human Understanding. Essentially, Locke’s theory argues that language, like government, is artificial; it rests upon a contract where meaning is arbitrary, the result of social convention. Words, then, are separable from things leading to the conclusion that words correspond to reality only according to their conscious designation by society. … As Locke’s theory of abstract ideas articulates the process by which the meaning of a term is formulated, this operation, basically psychological, shows both the human origins and instrumental significance of language, and the nature of concept-formation in science” (1996, p. 111).
McInnis focused on how lexicographers of the period (most notably the great Samuel Johnson) adopted this empirical approach and applied it to dictionary-making by seeking out examples of how words were used by various writers in order to ground their definitions and to revise them as usage changed. This was, of course, a radical departure from pre-modern dictionaries, but it is taken for granted today by students and others using the Oxford English Dictionary, the preeminent dictionary of the English language. Not just scientific terms but the majority of English words are identified and investigated in this way by today’s leading lexicographers.
McInnis also edited a 2001 book that expanded on his Social Epistemology special issue (Discourse Synthesis: Studies in Historical and Contemporary Epistemology), featuring several papers from the issue along with new contributions from such well-known figures as Steve Fuller and Henry Small. Contributors discussed various discursive communities, explaining how these can vary from tightly structured scientific ones (Small) to very unstructured philosophical ones (Fuller). Nancy Nelson also contributed a chapter focusing on her idea of “intertextual transformation” as “product and process,” which McInnis described in his introduction as being very different from the rest of the volume and which, as I mention above, has now become integral to academic librarianship.
What McInnis didn’t live to see, and probably didn’t foresee, is the way in which, thanks to large language models, “discourse synthesis” has now become a goal of many artificial intelligence applications in education and beyond, while these models’ involvement in various “discursive communities” is already underway (see, for instance, this interview with ChatGPT on the future of journalism in Editor & Publisher). Right this way to the Crater of Doom in Human Understanding, folks! (Or, in Hicks and Lloyd’s terminology, to “the basket of doom” in information literacy, as “explained” below in Google’s AI overview just now):
