
language and mind
11. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Andrei V. Nekhaev Андрей Викторович Нехаев
Yablo’s Paradox: Is the Infinite Liar Lying to Us?
Парадокс Ябло: лжет ли нам бесконечный лжец?

In 1993, the American logician S. Yablo proposed an original infinite formulation of the classical “Liar” paradox, one that calls into question the traditional notion of self-reference as the basic structure behind semantic paradoxes. The article considers the arguments underlying two different approaches to analyzing the “Infinite Liar” sentences and to understanding the genuine sources of semantic paradoxes. The first approach (V. Valpola, G.-H. von Wright, T. Bolander, etc.) places responsibility for the emergence of semantic paradoxes on the negation of the truth predicate; it deprives the “Infinite Liar” sentences of consistent truth values. The second approach is based on a modified version of anaphoric prosententialism (D. Grover, R. Brandom, etc.), in which the concepts of truth and falsehood are treated as special anaphoric operators. Logical constructs similar to the “Infinite Liar” do not attribute any definite truth values to the sentences from which they are composed, but only state certain types of relations between the semantic contents of those sentences.
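For readers unfamiliar with the construction, Yablo's sequence is usually rendered schematically as follows (a standard presentation, assumed here rather than quoted from the abstract): each sentence $S_n$ says only that every later sentence in the list is untrue.

\begin{align*}
S_1 &: \text{for all } k > 1,\ S_k \text{ is not true},\\
S_2 &: \text{for all } k > 2,\ S_k \text{ is not true},\\
&\;\;\vdots\\
S_n &: \text{for all } k > n,\ S_k \text{ is not true}.
\end{align*}

No sentence refers to itself, directly or via a loop, yet no consistent assignment of truth values to the $S_n$ exists, which is what makes the example a challenge to self-reference-based diagnoses of the semantic paradoxes.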
12. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Vladimir I. Shalak Владимир Иванович Шалак
On “Yablo’s Paradox: Is the Infinite Liar Lying to Us?” by Andrei V. Nekhaev
О статье А.В. Нехаева «Парадокс Ябло: лжет ли нам бесконечный лжец?»

It is commonly believed that paradoxes emerge due to self-reference. Interest in Yablo's paradox is caused by the fact that it contains no direct or indirect self-reference. The analysis of this paradox in the article under discussion has the following shortcomings: 1) inaccurate retelling of the cited sources, including Yablo's paradox itself; 2) attribution to the cited authors of claims they did not endorse; 3) carelessness in the use of logical symbolism; 4) confusion in the terminology related to the concepts of truth and falsehood; 5) insufficiently substantiated conclusions.
vista
13. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Vitaly V. Ogleznev Виталий Васильевич Оглезнев
The “Open Texture” of Empirical Concepts and Linguistic Anti-Reductionism of Friedrich Waismann
«Открытая текстура» эмпирических понятий и лингвистический антиредукционизм Фридриха Вайсмана

The article presents a careful analysis of the idea of the “open texture” of empirical concepts and the problems of verification in the way they were formulated by Friedrich Waismann. For Waismann, “open texture” denotes a certain type of linguistic indeterminacy, a kind of lack of definition, which must be distinguished from, yet linked to, other types such as vagueness or ambiguity. It is shown that empirical statements are not conclusively verifiable for two different reasons: the incompleteness of the description of the material object and the open texture of the terms involved. We cannot conclusively verify statements in which empirical concepts are used, because their open texture prevents us from defining these concepts exhaustively; the definition of the concept will therefore remain incomplete. Waismann’s approach to definition plays a key role here, and it is directly related to the open texture of concepts. The author proposes interpreting open texture as an immanent property of a concept, as something embedded in it a priori, which can give rise to vagueness. Nevertheless, open texture must be distinguished from vagueness. This leads to the conclusion that open texture is the possibility of vagueness: vagueness can be remedied by giving more accurate rules, whereas open texture cannot. In this sense, the “open texture” of a language allows for a more precise definition of concepts (by adjusting the definition) if appropriate circumstances arise. This justifies the thesis that the argument from open texture is the ontological basis of Friedrich Waismann’s linguistic anti-reductionism.
14. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Trevor Pinch Тревор Пинч
From Technology Studies to Sound Studies: How Materiality Matters
От исследования технологий к звуковым исследованиям

In this paper I put two areas of scholarship into dialogue: Technology Studies and Sound Studies. Within Technology Studies I discuss the influential social construction of technology approach and illustrate it with the history of the Moog electronic music synthesizer, the first commercial music synthesizer. I stress the role of the standardization of keyboards and the key role played by users in the development of this technology. I examine certain iconic sounds that the Moog synthesizer produces and discuss the stabilization of sound. It is argued that just as technologies can be traced as stabilizing over time, sounds can also be traced, with certain sounds stabilizing and being taken up by users whilst other sounds fail to stabilize. The technology required to produce a sound, performance practice, and wider cultural concerns such as the naming of sounds are crucial ingredients in the stabilization of sound.
case-studies – science studies
15. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Igor F. Mikhailov Игорь Феликсович Михайлов
Computational Knowledge Representation in Cognitive Science
Вычислительная репрезентация знаний в когнитивной науке

Cognitive research can contribute to the formal epistemological study of knowledge representation inasmuch as, firstly, it may be regarded as a descriptive science of the very same subject of which formal epistemology is the normative one, and, secondly, the notion of representation plays a constitutive role in both disciplines, though with different shades of meaning. Representation, in my view, makes sense only when paired with computation. A process may be viewed as computational if it adheres to some algorithm and is substrate-independent. Traditionally, psychology is not directly determined by neuroscience, sticking to functional or dynamical analyses at the what-level and skipping mechanistic explanations at the how-level. Therefore, any version of the computational approach in psychology is a very promising move toward connecting the two scientific realms. On the other hand, the digital and linear computational approach of classical cognitive science is of little help here, as it is not biologically realistic. Thus, what is needed on the methodological level is a shift from classical Turing-style computationalism to a generic computational theory that would comprehend the complicated architecture of neuronal computations. To this end, cutting-edge cognitive neuroscience needs a satisfactory mathematical theory applicable to natural, particularly neuronal, computations. Computational systems may be construed as natural or artificial devices that use physical processes on their lower levels as atomic operations for algorithmic processes on their higher levels. A cognitive system is a multi-level mechanism in which linguistic, visual and other processors are built on numerous levels of more elementary operations, which ultimately boil down to atomic neural spikes. The hypothesis defended in this paper is that knowledge derives not only from an individual computational device, such as a brain, but also from the social communication system, which in its turn may be presented as a kind of supercomputer with a parallel network architecture. Therefore, a plausible account of knowledge production and exchange must be based on some mathematical theory of social computations, along with that of natural, particularly neuronal, ones.
16. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Olga V. Popova Ольга Владимировна Попова
Human and Human Death as a Neuroscience Ethics Problem
Человек и его смерть как проблема этики нейронаук

The article deals with the philosophical problem field of modern neuroethics. A general picture of the state of modern neuroethics is given, and it is shown that research in this area encompasses both fundamental problems that classically belonged to philosophy (such as psychophysical dualism, the physical bases of consciousness, freedom of the will and its interrelation with brain activity) and problems of an applied orientation, which explicate the ethical, social and legal dimensions of innovation in neuroscience and call for the analysis of its social risks. It is shown that the development of neuroethics in the modern world became possible thanks to a special mode of functioning of modern science that has been called technoscience, and to the phenomenon of the ethification of technological development. In practical terms, this is expressed in studying how the results of innovative scientific and technical projects correspond to the interests, expectations and values of various social groups, and it also helps to determine the status of new technologies in relation to social reality. The article gives an idea of the existing normative field necessary for the development of neuroethics. On the basis of the discourse analysis of R. Harre, who singled out personal grammar (P-grammar), organism grammar (O-grammar) and molecular grammar (M-grammar), a philosophical analysis of brain death as a problem of modern neuroethics is carried out. A structural description of new biotechnogenic human identities is also given. The conclusion is drawn that the concept of brain death is an example of the conventional nature of scientific truth, the formation of which is influenced by various socio-cultural and economic factors; in the context of the development of neuroscience and the emergence of new methods of brain regeneration, it can be rethought.
17. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Vladislav A. Shaposhnikov Владислав Алексеевич Шапошников
To Outdo Kuhn: on Some Prerequisites for Treating the Computer Revolution as a Revolution in Mathematics
Преодолеть Куна: о некоторых предпосылках рассмотрения компьютерной революции как революции в математике

The paper deals with some conceptual trends in the philosophy of science of the 1980s‒90s which, having evolved simultaneously with the computer revolution, make room for treating it as a revolution in mathematics. The immense and widespread popularity of Thomas Kuhn’s theory of scientific revolutions made the demand to overcome this theory, at least in some respects, all but inevitable. Two such aspects are brought into focus in this paper. Firstly, there is the shift from theoretical to instrumental revolutions, which are sometimes called “Galisonian revolutions” after Peter Galison. Secondly, there is the shift from local (“little”) to global (“big”) scientific revolutions, now connected with the name of Ian Hacking; such global, transdisciplinary revolutions are at times called “Hacking-type revolutions”. The computer revolution provides a typical example of a revolution that is both global and instrumental. This change of accents in the post-Kuhnian perspective on scientific revolutions was closely correlated with the general tendency to treat science as far more pluralistic and transdisciplinary, a tendency primarily associated with the so-called Stanford School, of which Peter Galison and Ian Hacking are often seen as representatives. In particular, this new image of science gives no support to a clear-cut separation of mathematics from the other sciences. Moreover, it has formed prerequisites for the recognition of material and technical revolutions in the history of mathematics. Above all, the computer revolution can be considered in the new framework as a revolution in mathematics par excellence.
interdisciplinary studies
18. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Sergei Yu. Shevchenko Сергей Юрьевич Шевченко
Hierarchy of Technoscience Estimation: the Case of Drug Equivalence Dispute
Иерархия оценок технонауки

The article examines the semantic framework of the discussion about the equivalence and interchangeability of original drugs and generics. Generics are identical to the original drugs in terms of chemical structure; nevertheless, some patients and doctors believe that generics are less effective and have more severe side effects than the original drugs. These discussions are considered as an example of public deliberation concerning the achievements of technoscience. The conflicting parties determine identity either by the chemical structure of the drug (in D. Chalmers’s terms, its secondary intension) or by the phenomenal characteristics of the situation of its application (its primary intension). In this regard, the way to resolve the conflict is to align the hierarchy of methods for determining equivalence in biomedicine. The methodology of evidence-based medicine already has such a hierarchy, which makes it possible to determine the validity of clinical trial outcomes. According to this hierarchy, the phenomenal characteristics of the outcome of treatment (the quality and life expectancy of patients) are more important than instrumentally established indicators. Thus, new clinical data on the efficacy and safety of a generic should make it possible to revoke the recognition of drug equivalence. More generally, this means that a technoscientific achievement that directly affects a person can be assessed only by that person. The recognition of this priority is the basis of the “division of linguistic labor” during public deliberation concerning the achievements of technoscience.
archive
19. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Alexander A. Pechenkin Александр Александрович Печенкин
The Concept of Probability in Mathematics and Physics (on the 1920–30 Discussions in Soviet Scientific Literature)
Понятие вероятности в математике и физике (дискуссии 20–30-х гг. в СССР)

In the Soviet scientific literature of the 1920s‒30s the concept of probability was hotly debated. The frequency concept proposed by R. von Mises became popular among Soviet physicists belonging to the L.I. Mandelstam community; Landau and Lifshitz were also close to this concept in their famous course of theoretical physics. A. Khinchin, a mathematician who cooperated with Kolmogorov, opposed the frequency conception. In this paper we try to demonstrate that the frequency position was connected with an anthropomorphic approach to physics, whereas Khinchin’s position implied a criticism of anthropomorphism and put forward the ideal of objective knowledge.
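For orientation, the frequency concept at issue can be stated in one line (a standard rendering of von Mises’s idea, assumed here rather than taken from the abstract): the probability of an event is the limiting relative frequency of its occurrence in an unbounded sequence of trials,

\[
P(A) \;=\; \lim_{n \to \infty} \frac{n_A}{n},
\]

where $n_A$ is the number of occurrences of $A$ in the first $n$ trials. Kolmogorov’s 1933 axiomatization, by contrast, treats $P$ simply as a normalized measure and leaves its interpretation open.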
new trends
20. Epistemology & Philosophy of Science: Volume > 56 > Issue: 3
Evgeny N. Ivakhnenko Евгений Николаевич Ивахненко
Annemarie Mol on the Way to Multiple Ontologies
Аннмари Мол на пути к множественным онтологиям

The article critically examines the ideas of the Dutch philosopher and ethnologist Annemarie Mol, focusing mainly on her principal work, “The Body Multiple: Ontology in Medical Practice”. According to the author of the article, A. Mol managed to offer her own version of the “ontological turn” and, perhaps, to shift the accents in the entire theoretical repertoire of actor-network theory (ANT). Carrying out a “police investigation” in hospital Z, she was able to show the multiplicity of ontologies of the body and its disease/illness. What is called the illness is represented by a large number of actors – people, their relationships, tools, diagnostic methods, etc. – which together can be represented as an assembly or assemblage. The “choreography of the ontology” of such an assembly is contingent, since it could have been otherwise.