Book Reviews
The Future of Classification. Rita Marcella and Arthur Maltby.
Hampshire, England: Gower Publishing; 2000; 144 pp. Price
$99.95 (ISBN: 0-566-07992-5.)
This book contains a collection of essays on classification, most
of which are written by rather well-known contributors. It contains
the following chapters:
(1) Eric Hunter: Do we still need classification?
(2) Arthur Maltby & Rita Marcella: Organizing knowledge: The
need for system and unity.
(3) Julian Warner: Can classification yield an evaluative principle
for information retrieval?
(4) Robert Newton: Information technology and new directions.
(5) Alan MacLennan: Classification and the Internet.
(6) A.C. Foskett: The future of facetted classification.
(7) Joan S. Mitchell: The Dewey Decimal Classification in the
twenty-first century.
(8) I.C. McIlwaine: UDC in the twenty-first century.
(9) Lois Mai Chan & Theodora L. Hodges: The Library of Congress Classification.
(10) M.P. Satija: Sources of investigating the development of
bibliographic classification.
Overall, the book is a disappointment, and points to problems
in library and information science (LIS) as a research field. Classification is often regarded as one of the core subdisciplines of the
field and as one of the core qualifications of library and information professionals. Nevertheless, no classification researchers (not
even S.R. Ranganathan or Jack Mills—and none of the authors in
the book under review) are visible in bibliometric maps of LIS
(e.g., White & McCain, 1998)!
One of the problems in this book is that it fails to define classification and to distinguish between different kinds of classification. By considering only systems like Dewey, LC, and faceted classifications, it fails to consider, for example, bibliometric approaches in LIS as kinds of classification, and thus to weigh the basic strengths and weaknesses of different methods of classification. In computer science the term “ontologies” is very popular, and ontologies can be considered a modern development in classification research. Vickery (1997) provided a useful introduction to this research, but it is not considered in the present book.
I have a feeling that most of the authors in this book (and other “classification researchers” as well) are more or less implicitly working from the presumption that classification is about printed documents, and certainly not about full-text electronic retrieval. I am, of course, aware that some of the chapters in the book do explicitly consider the Internet and electronic retrieval. However, if the electronic environment is to be considered, one needs to compare the relative strengths and weaknesses of all kinds of subject access points (cf. Hjørland & Kyllesbech Nielsen, 2001). One has to consider what utility, if any, classification codes can have in relation to all other kinds of access points. If what is considered “classification” is not considered in relation to the electronic challenge, it is in my opinion reduced to something of minor importance.
In Chapter 3, Julian Warner actually does take a step toward
considering inherent weaknesses in current approaches to Information Retrieval (IR), and this chapter is, in my view, the best one.
I think he is right in making the point that the IR tradition has built
on the assumption that the system should provide a set of records
that satisfy a query. What an IR system, in his view, should do is enlarge the users’ capacity for informed choice between the representations of objects in the given universe of discourse. “Such an
enhanced capacity for informed choice broadly corresponds to
exploratory capability. It should also be regarded as analogous to
a sense of cognitive control over, or ability to discriminate between, representations of objects” (p. 36). His basic idea is not
much developed in the present chapter, but I think his line of research looks promising. Again, however, the capacity of different forms of classification to contribute to such discriminatory powers should be considered relative to other kinds of subject access (cf. Hjørland & Kyllesbech Nielsen, 2001).
In recent years the methods of classification, and more generally of knowledge organization, have been reconsidered. Hjørland & Albrechtsen (1999) claimed that the four basic methods are, respectively, empiricist, rationalist, historicist, and pragmatic. If one uses, for example, bibliometric methods, one applies an empiricist method. The best representatives of the rationalist method are the faceted classifications. An important example of historicist methods is given in Hjørland (2000), which considers the classification of the social sciences. A full comparison of all methods used in one domain is given in Hjørland (1998). In my view, the future of classification is connected to a combination of these four methods of classification and to the further clarification of the strong and weak aspects of different methods and systems. Unfortunately, these issues are not addressed in the book, and it thus fails to answer the fundamental questions about the future of classification in LIS.
Birger Hjørland
Royal School of Library and Information Science
6 Birketinget,
DK-2300 Copenhagen S, Denmark
E-mail: [email protected]
Published online 8 November 2001
DOI: 10.1002/asi.10008
References
Hjørland, B. (1998). The classification of psychology: A case study in the classification of a knowledge field. Knowledge Organization, 24(4), 162–201.
Hjørland, B. (2000). Review of I. Wallerstein et al. (1996). Open the social sciences: Report of the Gulbenkian Commission on the Restructuring of the Social Sciences. Stanford, CA: Stanford University Press. In Knowledge Organization, 27(4), 238–241.
Hjørland, B., & Albrechtsen, H. (1999). An analysis of some trends in classification research. Knowledge Organization, 26(3), 131–139.
Hjørland, B., & Kyllesbech Nielsen, L. (2001). Subject access points in electronic retrieval. Annual Review of Information Science and Technology, 35, 249–298.
Vickery, B.C. (1997). Ontologies. Journal of Information Science, 23(4), 277–286.
White, H.D., & McCain, K.W. (1998). Visualizing a discipline: An author co-citation analysis of information science, 1972–1995. Journal of the American Society for Information Science, 49(4), 327–355.
Saving the Time of the Library User Through Subject Access
Innovation: Papers in Honor of Pauline Atherton Cochrane.
William J. Wheeler. Champaign, IL: Graduate School of Library
and Information Science; 2000; 217 pp. Price $30.00 (ISBN:
0-87845-108-0.)
According to the book’s editor, “the [eight] papers in this book
run the gamut on subject access issues and are contributed by some
of the most influential scholars in the field. Some worked closely
with Pauline in the early years, others were students of hers at
Syracuse or work closely with her now at Illinois” (p. 3). The
authors of the eight articles in the book are all friends, coworkers,
and students of Professor Pauline Atherton Cochrane, and are
writing to honor her productive career of 50 years in the field of
Library and Information Science. In addition to the eight articles,
the book comes with an introduction written by William J.
Wheeler and with a short note written by Professor Marcia J. Bates
describing her long-standing professional association and friendship with Professor Cochrane. Finally, the book includes Professor
Cochrane’s curriculum vitae and an end-of-the-book index compiled by Sandra Roe. The narrative and ideas in each article are
well developed, and all articles are worth the reader’s time. Each
article can be read on its own with no particular attention given by
the reader to the order in which the articles are presented in the book. In the review that follows, room is mostly reserved for critical ideas and concepts, for brief summaries of a select few of the subject access projects the authors describe, and for the suggestions each article offers for future research.
The article by Robert Fugmann, Obstacles to Progress in
Mechanized Subject Access and the Necessity of a Paradigm
Change, is an extended introduction to many of the subject access
issues covered in the book, such as quality indexing, recall and
precision, user-centered evaluation and design, database indexing,
and much more. This article’s content can be compared to that of
Drabenstott’s, also in this book, in terms of discussion about
computerized versus intellectual indexing, for an interesting contrast of opinions about subject access, as the editor notes in his
introduction. Although some comparison of mechanized versus
intellectual indexing is offered below, this article’s real value rests
on its critical view of the “negative influence of positivistic-empiricistic philosophy . . . on indeterminacy, predictability, definability, user fallibility and misplaced jurisdictional claims of
information technology” (p. 8). It is because of the article’s critical
stance that Fugmann’s analysis of recall and precision is pushed
beyond system-based particulars of recall and precision, and it is
very artfully related to discussions about user systems evaluation,
design, and costs in many of the article’s detailed sections. Discussions about recall and precision have dominated much early positivistic research and writing on systems evaluation in information retrieval, and Fugmann is particularly concerned that a
law-like inverse relationship between recall and precision does not
always hold true. Fugmann is in good company in disputing the
law-like inverse relationship between recall and precision. As
Vinh-The Lam describes later on in the book, Professor Cochrane
in her 1978 Subject Access Project (SAP) demonstrated that an
increase in recall was not necessarily coupled with a decrease in
precision.
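To make these terms concrete, recall and precision are simple set ratios over the retrieved and the relevant documents. The following minimal sketch (in Python, with invented document counts that are not drawn from SAP or from Fugmann) illustrates how enriching records with additional access points can raise recall without forcing precision down.

def recall_precision(retrieved, relevant):
    # Standard set-based definitions:
    #   recall    = |retrieved & relevant| / |relevant|
    #   precision = |retrieved & relevant| / |retrieved|
    hits = len(retrieved & relevant)
    return hits / len(relevant), hits / len(retrieved)

# Hypothetical collection with 10 relevant documents (IDs 0-9).
relevant = set(range(10))

# Baseline indexing retrieves 4 relevant documents and 1 non-relevant one.
baseline = {0, 1, 2, 3, 90}

# Records enriched with extra access points retrieve 8 relevant and 2 non-relevant.
enriched = {0, 1, 2, 3, 4, 5, 6, 7, 90, 91}

print(recall_precision(baseline, relevant))   # (0.4, 0.8)
print(recall_precision(enriched, relevant))   # (0.8, 0.8): recall rises, precision holds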
In addition, Fugmann, in questioning the value of the positivist foundations of information science, points out that although consistency in indexing is a good thing, what human indexers should
be concerned with is an “overall predictability of both essence
selection and essence representation” in indexing (p. 17). Full-text
indexing, with its lexicalization of every word, when used in
conjunction with keyword searching, does increase recall or the
number of records that are retrieved. It has been shown that
full-text indexing is equivalent to intellectual indexing (human
intervention that stresses creation of concepts in a controlled
vocabulary for the latter to remain useful over time). But, full-text
lexicalization, as the author states in a variety of passages in his
text, does not enhance predictability in concept representations.
Concern about ideas and concepts, in addition to the meaning of individual words in full-text lexicalization, and about their inclusion in vocabularies used for indexing purposes, is of paramount importance for accommodating user interaction with an information system over time. The survival power of an information system (the continuation of a system’s ability to serve a useful purpose) is a function
of a vocabulary that includes individual words and other concepts
skillfully developed by human indexers. Human intellectual indexing contributes to a user-oriented approach in information systems
design that opposes training users simply to adjust searching
behaviors to whatever vocabulary and searching mechanisms a
system happens to offer. Furthermore, paying attention to qualitative differences in indexing and focusing research efforts on quality indicators for information systems survival power are all part of
an ongoing questioning in information science of the jurisdictional
claims of information technology. Use of artificial intelligence or
machine-based intelligent processes for lexicalization of full text
to satisfy the indeterminate nature of human searching is not an altogether futile effort. But, according to Fugmann, mechanized or
machine-based processes should be balanced or coupled with
superb and ever-vigilant intellectual processes for subject analysis
and more up-to-date, interpretive theoretical foundations for Library and Information Science.
In On MARC and Natural Text Searching: A Review of Pauline Cochrane’s Inspirational Thinking Grafted onto a Swedish Spy on Library Matters, Bjorn Tell offers an interesting narrative of his work in libraries and of his acquaintance with Professor Cochrane’s work in classification and user-oriented research issues
dating back to 1963. Albeit an admirer of Professor Cochrane’s
work on the Subject Access Project (SAP), which aimed to enhance subject access by increasing the number of access points in
a bibliographic record, Tell is also preoccupied with a more
economical and yet useful version of MARC.
For a cataloging project, charged with organizing access to
thousands of committee reports in the Swedish legislative system,
Tell used rules from SAP to extract content from the reports for
cataloging purposes. A noted difference, however, between the
original creation and use of SAP rules by Professor Cochrane and
their subsequent application by Tell was that captions to tables and
graphs were also cataloged in Tell’s cataloging project in Sweden.
For work on a project for UNESCO, Tell was given responsibility
to catalog records for the National Library of Nicaragua. He
recommended 11 initial fields as sufficient for providing user
access to existing records. Although 12 fields were approved for
the Nicaragua project, still, according to Bjorn Tell, that level of
description made for an “adequate bibliographic citation” (p. 56).
Donald W. King in his article, Blazing New Trails: In Celebration of an Audacious Career, provides a detailed insight into
Professor Cochrane’s career. The discussion in this article focuses
on research done in the 1960s and very early 1970s, and covers
studies for the American Institute of Physics (AIP), research about
Universal Decimal Classification (UDC) and Professor Cochrane’s
dissertation on relevance and development and evaluation of an
on-line information-retrieval system. In addition, King describes
lessons learned by Professor Cochrane and her research colleagues
in the area of relevance measurements and information systems
design.
In an AIP project, with a 50% rate of return on 2,000 questionnaires mailed to U.S. research physicists who were asked to describe their search needs and queries in their own words, Professor Cochrane demonstrated the value of developing user-centered criteria for the design of ideal information systems. In addition, Professor Cochrane’s research demonstrated that categories such as property, object, and method are useful in organizing index terms. In another project, Professor
Cochrane and Stella Keenan analyzed over 20,000 abstracts from
1 year of Physics Abstracts, and based on the number of articles
abstracted, they ranked journals accordingly, with six of the top journals in every subfield of physics accounting for 25% of the abstracted
articles. Furthermore, using IBM-developed software, loaded on
Datatrol, Professor Cochrane concluded that although the Universal Decimal Classification (UDC) system can be used as part of an
automated information retrieval system in a batch or an interactive
mode, a custom-made index was a better choice than UDC. The
value of the above studies should not be judged based on the
gigantic proportions of each research project. Rather, it is significant that for some time now user-centered criteria and quality of indexing have been judged and treated as critical for research in the field of Library and
Information Science. User-centered studies and quality in indexing
still remain critical in the research of many important scholars in
the field today.
The article by Raya Fidel, The User-Centered Approach: How
We Got Here, is a basic introduction to a user-centered approach
in Library and Information Science, without the author taking sides with any particular theoretical articulation of a user-centered approach. Fidel first explained the basic assumptions behind and the necessity of user-centered approaches and described some basic concepts, such as information want, information demand, information need, information use, and information impact. Second,
the article discussed different types of user studies and instruments, such as interviews, log analysis, and observation, used by
different studies. According to the author, there are two major
assumptions in a user-centered approach. One, focus in research
must be to increase or update knowledge in Library and Information Science about users and their searching behaviors and then
incorporate that knowledge in the design of better information
systems. In the past, a system’s design was based on universal
principles, and training was used to teach users to adapt to the
information system. Two, different patterns exist in how different
groups look for information. Research must first identify unique
user groups and then undertake, on an ongoing basis, an analysis of the patterns of information seeking and searching behavior within each group.
In her very timely article, Subject Access in Interdisciplinary
Research, Linda Smith talks about a research need to match
available, but scattered, information resources to the needs of
interdisciplinary scholars and the role information technology can
play in making such matching possible between users and electronic and/or digital resources. As Smith states, echoing Bates
(1996), not much research has been done to inform practice in
Library and Information Science in terms of how interdisciplinary
scholars find information across disciplines. With different terminology used for topics across various databases and different
objectives in constructing indices, and also because of shallow indexing in some subject areas, the task of finding
information across disciplines is extremely challenging.
Although digital libraries provide a digital pipeline for information to flow from many databases to a user, there are still problems with actually constructing an effective searching mechanism and interface that enable searching across different database vocabularies. In terms of mapping vocabularies, Smith
quotes Lancaster (1988), who suggested integrating vocabulary
across different databases by creating conversion tables to connect
or integrate terminology. Another suggestion from Lancaster
(1988) includes a macrovocabulary, or the bringing together of
existing indexing languages under a newly built superstructure.
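Lancaster’s conversion table is, in essence, a term-to-term lookup between the controlled vocabularies of different databases. The sketch below is only an illustration of the mechanism; the database labels and term mappings are invented for the example, and real tables (such as mappings between Library of Congress Subject Headings and Medical Subject Headings) are far larger and often many-to-many.

# Hypothetical conversion table between two databases' controlled vocabularies.
CONVERSION_TABLE = {
    ("db_a", "Heart attack"): [("db_b", "Myocardial infarction")],
    ("db_a", "Cancer"):       [("db_b", "Neoplasms")],
}

def translate(term, source, target):
    # Map a subject term from the source vocabulary to the target vocabulary,
    # falling back to the original term when no mapping is known.
    mapped = [t for (db, t) in CONVERSION_TABLE.get((source, term), []) if db == target]
    return mapped or [term]

print(translate("Heart attack", "db_a", "db_b"))  # ['Myocardial infarction']
print(translate("Quasars", "db_a", "db_b"))       # ['Quasars']: unmapped, passed through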
Tools and practical advice, however, exist to make searching
across different subject areas a bit easier, for now. For example, a
searcher may follow hyperlinks instead of going through subject
terms available in directories on-line. Following hyperlinks can be
very helpful, similar to citation chasing, in that links usually cover
similar topics or themes in various research areas. Applied research in a variety of areas now provides for overlaps in indexing
vocabularies (Olson and Strawn (1997) were mentioned for their
mapping between Library of Congress Subject Headings and Medical Subject Headings). In terms of future research, multifaceted mapping, not only among controlled vocabularies but also linking uncontrolled with controlled vocabularies to enhance search results, and integrative research on different information landscapes, user groups, and required interfaces are just some of the exciting research areas that can benefit interdisciplinary researchers.
Quoting Harter (1986, p. 170), Karen M. Drabenstott, in Web
Search Strategies, defines search strategy as “an overall plan or
approach for a search problem” (p. 115), and proceeds to discuss
search strategies in relation to traditional information retrieval and
the WWW. The author begins by arguing that search strategies
available for use on traditional information retrieval systems do
not help users achieve their overarching goal, that of finding
information quickly, in a Web-based environment. Boolean searching, because of the fairly high level of expertise that it requires, has
not had a great following among users of traditional information
retrieval systems. The author’s discussion of citation pearl growing
and building blocks for improved searching shows that there is still
a major connection to Boolean-based search techniques, which are necessary for creating the building blocks and the successive fractions of information from bigger chunks. In addition, use of only Boolean-based searching on Web-based systems to support subject access ignores users’ use of hyperlinks for surfing and citation pearl
growing (or following links that connect users to Web-sites with
similar themes).
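For readers unfamiliar with the terminology, the building-blocks strategy ORs together the synonyms within each concept facet and then ANDs the facets, while successive fractions narrows an overly large result set by ANDing in one further facet at a time. The short sketch below shows the idea; the facets and the query syntax are illustrative only and are not tied to any particular search engine.

def facet_block(terms):
    # Building block: OR together the synonyms for one concept facet.
    return "(" + " OR ".join(terms) + ")"

def building_blocks(facets):
    # AND the facet blocks together into a single query.
    return " AND ".join(facet_block(f) for f in facets)

# Hypothetical search on subject access in Web environments.
facets = [
    ["subject access", "subject searching"],
    ["web", "internet"],
]
query = building_blocks(facets)
print(query)   # (subject access OR subject searching) AND (web OR internet)

# Successive fractions: narrow a large result set by ANDing in one more facet.
query = query + " AND " + facet_block(["end users", "searchers"])
print(query)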
Drabenstott suggests a repackaging of old search strategies, as
well as offering new ones, especially for the Web, to take advantage
of statistical search techniques that, according to research, do
surprisingly well when compared to Boolean search engines and
techniques. According to Drabenstott, what users should do is understand and use Web-searching strategies, and the Web-based information systems will do the rest. There is no need for users to learn specialized Boolean searching for specific Web-based search engines, other than to understand how to develop and use different facets of the same concept during their searching.
Some of the Web-based search strategies suggested in this article
include: serendipity (surfing strategy), tool-based strategy (focuses
on searching subject-based directories, search engines and metasearch engines), amount search strategies (ways to locate, sample,
and collect information), citation pearl growing, successive fractions, URL guessing strategies, and a variety of step-by-step approaches. The discussion in this article, an optimistic view of
technologies and what users can do on-line with minimal support
from trained subject access specialists, is somewhat misplaced, but
the search strategies are solid and can be used for instruction in
courses for Web-based searching and by users, anywhere and
anytime.
A review of past and present efforts, as well as speculations
offered for the future about enhancing subject access, is the focus of the article, Enhancing Subject Access to Monographs in Online Public Access Catalogs: Table of Contents Added
to Bibliographic Records, by Vinh-The Lam. Improved subject
access, a goal for online public access catalogs (OPAC), was made
exceedingly difficult to attain or to pursue on a consistent basis in libraries due to inadequacies in LCSH. Such inadequacies included, and continue to include, headings that lack currency and are inconsistent, shallow, and unclear. As a result, many projects have been initiated to address the above inadequacies.
The Subject Access Project (SAP) by Professor Atherton during
1978 was a comprehensive effort to provide a framework for
enhancing subject access and balancing recall versus precision.
Meticulously following basic cataloging rules, the project added subject terms from tables of contents and end-of-book indexes to records in a database called Books.
Although the costs of this project are not described by Lam,
precision, Lam reported, did not decrease as recall results increased. In addition, Lam mentions studies by Karen Markey
(1983) and by DeHart and Matthews (1989) that have demonstrated the value of adding table of contents and chapter titles,
respectively, as subject terms in a record. In the 1980s, the Enhanced Subject Project (ESP) experiment by the Australian Defence Force Academy, based on Professor Atherton’s SAP work,
demonstrated user-centered benefits in terms of access and cost
effectiveness from adding table of contents items as subject terms.
Another project, this one in Sweden, also showcased the value and
cost effectiveness of adding captions from table of contents and
graphs as subject terms. All the research and practice successes led
to additional discussion in 1991 by the Library of Congress of standards for adding subject terms from indexes, abstracts, and book reviews to records. In the end, it was decided that more
attention should be paid to the table of contents (TOC). The concerns arising from the above projects and from the Library of Congress publications on standards for TOC gave rise to additional
discussion, briefly reported by Lam, about benefits of automated
processes vs. intellectual mediation by librarians and other experts.
Although Lam sees cost effectiveness in the automated addition
of TOC entries, the author laments that “convenience and low
costs have been given preference over a good and proven technique” (the technique is SAP). At this stage of research and application it is too early to judge the value of TOC entries as subject terms. Early indications, however, are that the addition of TOC entries as subject terms enhances recall and serves the interests of users, including saving them time.
Search descriptions, instead of search commands, occupy the
attention of Eric H. Johnson in his article, Objects for Distributed
Heterogeneous Information Retrieval. Johnson is concerned with
providing an explanation for how search objects, utilizing search
descriptions, are constructed to search globally distributed heterogeneous databases. According to the author, a shift in research
and practice from search commands to search descriptions is
necessary to accommodate searching across different databases
that may not use the same identifiers for search commands, and in
some cases may not include similar sets of indices or a database
schema. A search description is submitted to many different databases through the use of a client–server interaction. Johnson argues
that most clients are still dumb terminals or even personal computers with much computing and software power that goes underutilized because of excessive reliance on a server’s computing
power and output. Some researchers and practitioners have suggested creating mapping programs to connect the many systems
that support the heterogeneous databases. Johnson, however, argues for an “object layer on top of existing systems and uses a
specialized client utilizing those objects to provide unified ways of
querying and viewing distributed heterogeneous databases” (p.
174). Johnson’s solution, as argued in his article, relied on (or required) powerful personal computers that some users and institutions may not have, and it maintained a burden on users to be able to construct Boolean logic search descriptions. In all fairness, however, in his discussion Johnson suggested that technical development should move in the direction of designing more user-friendly interfaces that provide pull-down menus and drag-and-drop capabilities to support construction of Boolean logic search descriptions
by users.
In addition to the above capabilities, according to Johnson,
client software should also support query persistence, search documents with persistent concurrency, qualifier precision reduction, subject qualifier extensions, and language translation,
among other functionality. Query persistence refers to the ability to
hold and save the same query in the system. Persistent concurrency
is the ability to keep and modify the query and finally update the
search document. In Johnson’s scheme, a search document is a
window that holds properties of the object, or the search query,
along with a list of all the records pulled together from many
heterogeneous databases. Language translation is the ability to
translate queries from one language to another. The qualifier
precision reduction matches a search description’s qualifier (let’s
say AB for abstract) to a default index that another database,
lacking the AB index, may recognize. The subject qualifier extension refers to action taken to extend basic identifiers such as Author
to recognize personal and corporate variations submitted to them
as a search description by a user. The point for both qualifier
precision reduction and subject qualifier extension is to offer users
some records in return, hopefully close to what the user may have
wanted, and not to respond with an error message or some other
failure message for the submitted search description. Depending
on what system a user employs to search and whether or not
Z39.50 or Dublin Core is used, precision reduction and extension
routines may be handled differently.
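As I read Johnson’s account, a search description carries a field qualifier that a given target database may or may not support. The sketch below is my own toy rendering of the two fallback behaviors described above, not Johnson’s code or data model; the class and field names are invented for illustration.

from dataclasses import dataclass

@dataclass
class SearchDescription:
    qualifier: str        # e.g., "AB" for abstract, "AU" for author
    terms: list

def reduce_precision(desc, supported, default="ALL"):
    # Qualifier precision reduction: if the target database lacks the requested
    # index, fall back to a default index instead of failing the search.
    return desc.qualifier if desc.qualifier in supported else default

def extend_subject_qualifier(desc):
    # Subject qualifier extension (toy version): expand an author search so that
    # inverted personal-name variants are also recognized.
    if desc.qualifier != "AU":
        return desc.terms
    variants = []
    for name in desc.terms:
        variants.append(name)
        if "," in name:   # "Cochrane, Pauline" also searched as "Pauline Cochrane"
            last, first = [p.strip() for p in name.split(",", 1)]
            variants.append(first + " " + last)
    return variants

abstract_search = SearchDescription(qualifier="AB", terms=["subject access"])
print(reduce_precision(abstract_search, supported={"TI", "AU"}))   # ALL

author_search = SearchDescription(qualifier="AU", terms=["Cochrane, Pauline"])
print(extend_subject_qualifier(author_search))   # ['Cochrane, Pauline', 'Pauline Cochrane']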
In summary and in closing, although the articles did run the
gamut on subject access issues, a reader’s appreciation of enhanced subject access as a way to save users’ time and reduce library costs is much the better for such wide coverage. Content in this edited book can be useful for stimulating discussion in many graduate-level courses, and can serve as useful background for anyone writing or teaching on subject access, both in traditional areas and in Web-based information retrieval.
Anastasis (Tassos) D. Petrou
UCLA Information Studies Department
GSE & IS Building, Mailbox 951520
405 Hilgard Ave.
UCLA
Los Angeles, CA 90095-1520
E-mail: [email protected]
Published online 7 November 2001
DOI: 10.1002/asi.10021
Cyber-Marx: Cycles and Circuits of Struggle in High-Technology Capitalism. Nick Dyer-Witheford. Urbana, IL: University
of Illinois Press; 1999: 416 pp. Price: $21.95 (ISBN: 0-252-06795-9.)
Dyer-Witheford, an assistant professor at the Faculty of Information and Media Studies at the University of Western Ontario,
has written an important book that, in both a clear and a scholarly manner, discusses the political economy of information and communication technologies from a late Marxist point of view. Though
published in 1999, Dyer-Witheford’s book is gaining renewed
importance with increasingly powerful “antiglobalization” protests
and with worldwide attention on one of the book’s main subjects,
the Italian autonomous Marxist philosopher and activist, Antonio
Negri [who, together with Duke professor, Michael Hardt, has
published the blockbuster, Empire (Harvard University Press,
2000)]. Dyer-Witheford’s book has remained relatively unreviewed in information science journals, which is unfortunate,
given that it is being widely taught in the humanities and in
communications, and that it is even being taught in some of the
more progressive library and information science curriculums.
This book certainly deserves a wider audience in our field both for
the importance of its content and for its clarity of presentation of
that content.
In Cyber-Marx, Dyer-Witheford situates information and communication technologies within a postmodern Marxism. In the
opening chapters he discusses various modern and late modern
readings of the social and cultural meaning of information and
communication technologies, ranging from those of Daniel Bell’s
postindustrialism, to the different positions that classical Marxism
has taken toward such technologies, and to various post-Fordist
positions. The position that Dyer-Witheford most argues for in his
book, however, is that of Antonio Negri’s work, particularly those
positions in regard to information and communication technologies that Negri had articulated in The Politics of Subversion and in
various articles during the 1980s and early 1990s.
The first four chapters and the final, ninth, chapter of Cyber-Marx in particular provide an excellent introduction to Negri’s work and to the work of other, kindred Italian theorists,
such as Paolo Virno and Maurizio Lazzarato, on the topic of the
social meaning and importance of information and communication
technologies today. [Dyer-Witheford’s book, together with many
of the primary texts in Virno and Hardt’s collection Radical
Thought in Italy: A Potential Politics (University of Minnesota,
1996), forms a nice core introduction to autonomous Marxist
thought on this topic.] As Dyer-Witheford explains, information
and communication technologies are understood by these theorists
in terms of Marx’s discussion of “general intellect” in the latter’s
Grundrisse. The concept of “general intellect” in Marx refers to a
moment in the development of capital when the dominant means
of production are to be found in the social relations and the
intellectual capacities of the workers (an early premonition of the
dominant social conditions for production in so-called “information economies” or what is often termed “post-Fordism”). Dyer-Witheford neatly and skillfully argues for how class recomposition
occurs at the moment when it seems capital has most penetrated
into everyday life in the form of attempts to incorporate all of
social relations, affect, and language into capital production (i.e.,
as “social capital”). As capital becomes more dependent on the
“knowledge-economy” and on technologies that link workers,
however, it stretches and undermines its traditional controls over
those workers. Dialectically, class recomposition follows on the
heels of capital’s overextension, this time resulting in a “class” that
goes far beyond industrialism’s “working class.”
In addition, Dyer-Witheford’s book argues that newer information and communication technologies provide a means for constructing global social organizations outside the capitalist/neoliberal framework of “globalization.” Today, we can see the fruits
of such organizing in the so-called “antiglobalization” (really,
counter-capitalist globalization) protests and the role that “autonomous” Web-sites play in organizing these highly mobile and
diverse political forms into other models and means for globalization.
Dyer-Witheford’s book is an excellent introduction to Antonio
Negri’s work, and it is an excellent introduction to the political
theory of the “antiglobalization” protests. It takes the reader
through difficult French philosophy and Italian political theory and
history beginning in the 1960s and 1970s, and shows how concepts
developed out of that theory and history reach through the 1980s
and into the present day. The book’s bibliography is quite exhaustive of relevant sources, particularly English and French ones, and it is
useful for research. The book’s clarity of presentation makes it
quite useful for academic classes in the social and cultural studies
of information and communication, classes on the political economy of information, classes in knowledge management, critical management studies, the history of information and the history of communication, and LIS foundation courses. Generally, this
book is very useful for individuals and academic classes interested
in the political, social, and cultural meaning of recent information
and communication technologies. The book is rich in content and
reads easily.
Ron Day
Library and Information Science Program
Wayne State University
106 Kresge Library
E-mail: [email protected]
Published online 9 November 2001
DOI: 10.1002/asi.10022
A Sociological Theory of Communication: The Self-Organization of the Knowledge-Based Society. Loet Leydesdorff. Parkland, FL: Universal Publishers; 2001: 351 pp. Price: $29.95.
(ISBN: 1-58112-695-6.)
In this book, Loet Leydesdorff sets out to answer the question
“Can society be considered as a self-organizing (autopoietic) system?” (p. 1). To do so, Leydesdorff confronts the traditional
problem in sociological theory, namely the integration of individual action into the social structure without losing the essence of
either. In the process, he develops a general sociological theory of
communication, as well as a specific theory of scientific communication designed to analyze complex systems such as the European Information Society. For his efforts, Leydesdorff is most
successful in developing his general theory of communication,
reasonably successful in deriving from it a theory of scientific
communication, and least successful in applying the latter theory
to the study of an emerging European Information Society.
Loet Leydesdorff is a Senior Lecturer at the University of
Amsterdam’s Department of Communication Studies. He holds a
B.Sc. in Chemistry, M.Sc. in Biochemistry, M.A. in Philosophy,
and a Ph.D. in Sociology. He has published extensively in the areas
of scientometrics, the philosophy of science, sociology of innovation, and social network analysis (for a detailed listing of his
publications, see http://home.pscw.uva.nl/lleydesdorff/list.htm).
The book is organized into 10 chapters, each corresponding to
a previously published article (for a list, see p. viii). Fortunately,
there must have been some rewriting involved, because the book
reads mostly as one work rather than as a collection of related
articles masquerading as a monograph. The 10 chapters are organized into an introduction (Chapter 1) and three parts. Part One
(“Sociological Reflections”) contains “Towards a Sociological
Theory of Communication” (Chapter 2), “The Evolution of Communication Networks” (Chapter 3), and “The Non-Linear Dynamics of Sociological Reflections” (Chapter 4). Part Two (“Is Society
a Self-Organizing System?”) contains “New Perspectives on Empirical Theories” (Chapter 5), “A Triple-Helix of University–
Industry–Government Relations” (Chapter 6), “The European Information Society” (Chapter 7), and “Regime Changes and Sustainable Development” (Chapter 8). Part Three (“Philosophical
Reflections”) contains “Uncertainty and the Communication of
‘Time’” (Chapter 9) and “The Expectation of Social Change”
(Chapter 10).
The first five chapters are devoted to laying out the goal of the
book, reviewing the extensive literature, and developing a general
sociological theory of communication. The special theory of scientific communication is derived from the general theory in Chapter 6. Chapter 7 is an attempt to demonstrate how the theory of
scientific communication might be applied in an empirical study,
in this case to detecting an emerging European Information Society. This chapter is the least successful and satisfactory of all the chapters. It is not well integrated into the flow of the book, perhaps retaining too much of its original form as previously published. It is also the least well developed chapter, leaving one with the
sense of being dropped into the middle of a presentation that
suddenly stops before any useful or enlightening conclusions are
drawn. Chapter 8 is a further elaboration of the implications that
Leydesdorff’s theory of scientific communication will have on the
methodologies of future empirical research. This chapter is rather
sketchy, and leaves much to be desired in terms of useful insights
into designing methodologies more in line with these two new
theories. The final two chapters bring together the work of the first
five chapters and further amplify and clarify both theories of
communication, particularly in terms of “time” and “uncertainty.”
The book concludes with a 15-page bibliography, an index of
authors cited in the text, and a subject index.
The body of the book is illustrated with line drawings, charts,
and tables of varying utility. Most are useful in clarifying a concept
(e.g., Fig. 5.2, p. 146) or in supporting an argument (Fig. 6.2, p.
191), while a few seemed superfluous or pointless (e.g., Fig. 4.1,
p. 130). Several figures contain errors or omissions. For example, the values for Sweden, France, and The Netherlands that the text references to Figure 7.6 (p. 231) do not appear in the figure, while the labeling of Figure 7.7 (p. 232) and the text references to
it do not correspond well and are confusing to sort out. The
production values for the illustrations are uneven as well. Most of
the illustrations are well done (e.g., Fig. 5.3, p. 148), while a few are
not (Fig. 3.4, p. 102). The lack of a list of illustrations makes these
problems all the more annoying. Fortunately, the quality and the
accuracy of the vast majority of figures and tables are good, which
makes up for the few that are not. The text of the book is very
clearly and cleanly written, and the concepts are fully explained using a minimum of discipline-specific jargon and quantification.
With the exceptions previously noted, the presentation is thorough
and in depth.
Leydesdorff’s general theory draws on existing theoretical sociological models. The models most heavily used are those developed by Niklas Luhmann (self-organizing social systems), Jurgen
Habermas (communication as social action, uniting individual action and social organization), Claude Shannon (information as
uncertainty), and Anthony Giddens (structuration, social structure
based on the dualism of action theory and institutional analysis).
For Leydesdorff, the relationship between the individual and society is one of actors and networks, each existing as a separate
system that nonetheless “operate in each other’s environment, and
over time” (p. 155). Actors and networks are “structurally coupled
while each performs its own operation” (p. 158). Actors function
as the network nodes, while the network links the nodes. Actors/
nodes communicate with each other via the network. Communication is required for the network to survive. Networks that cease
to communicate cease to exist. It is communicate or die. Even so,
the participation of each actor/node in the network is not required,
as communication “patterns can be maintained in the network over
time,” even when some of the actors/nodes have stopped functioning or have been replaced with new actors/nodes (p. 159). “Each
change in the network requires action” at the local actors/nodes,
“but the system of reference for the change is the network” not its
constituent actors/nodes (p. 159). This results in a system with
memory, albeit a virtual one that is “physically located in the
actor” (node) (p. 159).
The communication process is generated by the network when
one or more actors/nodes create a “disturbance” by acting (transmitting “information” as defined by Shannon) through the network
(p. 159). Frequent communication causes the network to grow and
become more complex. Because the communications process can
include communications about communications, the network can
become self-referential (“self-reflexive”), the result of a “recursive
operation of structure upon the information previously contained
within the network,” within its virtual memory (p. 159). Because
the communication system is made up of decentralized actors/
nodes, communication is “distributed by nature,” and therefore,
“contains uncertainty” (p. 160).
Networks contain not only information (as per Shannon), but
also meaning. For meaning to exist, it requires an actor/node that
can “receive the message [via the network], deconstruct it with
respect to the expected information, and assess this information
reflexively with references to its own structure” or perception of its
own place in the network (p. 174). Thus, if a system can “position
messages reflexively, it is able to give the messages meaning; if
not, the system can only disturb the content by generating noise”
(p. 174).
Communication acts to create and define networks of communication, which in turn define or create social organization by
discourse. Social reality is based on language and its use. Hence,
social reality can be deconstructed into its constituent discourses at
any specified level of analysis by defining the system of reference.
Because chaos or uncertainty is inherent in the network it is also
inherent in a social system. A social system does not naturally seek
stasis or equilibrium, but is constantly “emerging,” or is only an
“expectation” (pp. 300 –301). The appropriate metaphor for describing such a situation is not biological (society as a living
system) but cybernetic, using a “model of parallel and distributed
processing” to understand the “intersecting routines” (p. 246). The
theorist or observer then understands itself “in terms of a reconstruction, as theories are constructed and reconstructed as discursive reflections of universes that are envisaged” (pp. 192–193).
The communication system is a postmodern system, decentralized
and discursive in nature. Society is defined by its communication
network(s), and therefore, should never be reified as an entity
separate from the communication process, as society can always be
deconstructed into its component discourses (p. 160).
From his general theory of communications, Leydesdorff derives a sociological theory of science or, because society is a
function of its component discourses, a model of scientific communication. This model is based on the relationship between the
cognitive tasks of the individual scientists, defined as personal
communications among a particular research group, and science as
a social organization, defined as communication between research
groups and their surrounding environment. That is, “what does the
research group do when it researches,” and “what will these
[disciplinary/research field] networks process as a signal from the
research group?” (p. 170). Each functions as its own “specific
processors, as parallel densities in the relevant network,” each
operating according to its own “rules of the game,” i.e., programs.
Within the research group, the concern is the degree to which it can
“organize its own self-referential loop as a group” (p. 171). In
relation to its colleagues (competitors), a given research group
“may lose or gain because of its changing position in terms of the
signals which it can send to specific networks [i.e., intellectual
fields of study/disciplines] in its environment,” i.e., science (p.
170).
Science then is the codification of “reflexive communications
on top of a . . . network of social relations among scientists” (p.
190). As an historical process, “science is an order emerging from
networks of communication with dynamics relatively independent
of the carrying authors,” or the actors/nodes of the network [emphasis original] (p. 190). Kuhnian scientific revolutions then are
the result of competing paradigms disturbing “one another by
performing in an emerging ‘reality’ of scientific discourses and
science-based transformations” (p. 301).
This brief summary cannot do justice to the intellectual depth and philosophical richness of the theoretical models, and their implications, presented by Leydesdorff in his book. Next to this, the
caveats presented earlier in this review are relatively minor. For all
that, this book is not an “easy” read, nor is it for the theoretically
or philosophically faint of heart. The content is certainly accessible
to those with the interest and the stamina to see it through to the
end, and would repay those who reread it with further insight and
understanding. This book is recommended especially for the reader
who is looking for a well-developed, general sociological theory of
communication with a strong philosophical basis upon which to
build a postmodern, deconstructionist research methodology.
Eric G. Ackermann
University Libraries
Virginia Polytechnic Institute and State University
P.O. Box 90001
Blacksburg, VA 24062-9001
E-mail: [email protected]
Published online 27 November 2001
DOI: 10.1002/asi.10009