- For every complex problem, there is an answer that is clear, simple and wrong Joachim Sturmberg, Stefan Topolski: This is an examination of the notions of knowledge, truth and certainty as they apply to medical research and patient care. The human body does not behave in mechanistic but rather in complex adaptive ways; thus, its response to challenges is non-deterministic. This insight has important ramifications for experimental studies in health care and their statistical interrogation, which are described in detail. Four implications are highlighted: one, there is an urgent need to develop a greater awareness of uncertainties and how to respond to them in clinical practice, namely, what is important and what is not in the context of this patient; two, there is an equally urgent need for health professionals to understand some basic statistical terms and their meanings, specifically absolute risk, its reciprocal (the number needed to treat) and its inverse (the index of therapeutic impotence), as well as seeking out the effect size of an intervention rather than blindly accepting P-values; three, there is an urgent need to accurately present the known in comprehensible ways through the use of visual tools; and four, there is a need to overcome the perception that errors of commission are less troublesome than errors of omission, as neither's consequences are predictable.
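A quick worked example of the arithmetic behind these terms (illustrative numbers, not taken from the paper; conventionally the number needed to treat is the reciprocal of the absolute risk reduction):

```python
# Illustrative numbers only (not from the paper): how a striking relative risk
# reduction can correspond to a modest absolute risk reduction (ARR), and how
# the number needed to treat (NNT) is obtained as the reciprocal of the ARR.

control_event_rate = 0.02   # 2% of untreated patients experience the outcome
treated_event_rate = 0.01   # 1% of treated patients experience the outcome

arr = control_event_rate - treated_event_rate   # absolute risk reduction
rrr = arr / control_event_rate                  # relative risk reduction
nnt = 1 / arr                                   # number needed to treat

print(f"relative risk reduction: {rrr:.0%}")    # 50% -- sounds impressive
print(f"absolute risk reduction: {arr:.1%}")    # 1.0% -- far more modest
print(f"number needed to treat:  {nnt:.0f}")    # 100 patients per event avoided
```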
- Fair Wages and Foreign Sourcing Gene M. Grossman, Elhanan Helpman: On general equilibrium models for studying the impact of workers’ relative-wage concerns on resource allocation and the organization of production, Pareto efficiency, and the distinction between 'closed' and 'open' economies.
- Black Holes Do Not Erase Information The "information loss paradox" in black holes, a problem that has plagued physics for nearly 40 years, may not exist, according to a new University at Buffalo study, "Radiation from a Collapsing Object is Manifestly Unitary".
- First passage times in integrate-and-fire neurons with stochastic thresholds Wilhelm Braun, Paul C. Matthews, and Rudiger Thul: The authors consider a leaky integrate-and-fire neuron with deterministic subthreshold dynamics and a firing threshold that evolves as an Ornstein–Uhlenbeck process. The formulation of this minimal model is motivated by the experimentally observed widespread variation of neural firing thresholds. They show numerically that the mean first passage time can depend non-monotonically on the noise amplitude: for sufficiently large values of the correlation time of the stochastic threshold, the mean first passage time is maximal for non-vanishing noise. They then provide an explanation for this effect by analytically transforming the original model into a first passage time problem for Brownian motion. This transformation also allows for a perturbative calculation of the first passage time histograms, which in turn provides quantitative insights into the mechanisms that lead to the non-monotonic behaviour of the mean first passage time. The perturbation expansion is in excellent agreement with direct numerical simulations. The approach developed here can be applied to any deterministic subthreshold dynamics and any Gauss–Markov process for the firing threshold. This opens up the possibility of incorporating biophysically detailed components into the subthreshold dynamics, rendering their approach a powerful framework that sits between traditional integrate-and-fire models and complex mechanistic descriptions of neural dynamics.
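A minimal Monte Carlo sketch of the kind of model described (parameters are made up; this is not the authors' code, nor their analytical transformation):

```python
import numpy as np

# Minimal Monte Carlo sketch (made-up parameters, not the authors' code): a leaky
# integrate-and-fire voltage with deterministic subthreshold dynamics, and a
# firing threshold that follows an Ornstein-Uhlenbeck (OU) process; we estimate
# the mean first passage time (FPT) over many trials.

rng = np.random.default_rng(0)

def first_passage_time(sigma, tau_theta=5.0, tau_m=1.0, drive=1.2,
                       theta0=1.0, dt=0.01, t_max=200.0):
    """Time until the membrane potential first crosses the stochastic threshold."""
    v, theta, t = 0.0, theta0, 0.0
    while t < t_max:
        v += dt * (-v / tau_m + drive)                  # deterministic leaky integration
        theta += dt * (theta0 - theta) / tau_theta \
                 + sigma * np.sqrt(2 * dt / tau_theta) * rng.standard_normal()  # OU threshold
        t += dt
        if v >= theta:
            return t
    return t_max                                        # censored: no crossing by t_max

def mean_fpt(sigma, trials=200):
    return np.mean([first_passage_time(sigma) for _ in range(trials)])

for sigma in (0.0, 0.1, 0.3, 0.6):
    print(f"noise amplitude {sigma:.1f}: mean FPT ~ {mean_fpt(sigma):.2f}")
```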
- Deterministic Relativistic Quantum Bit Commitment Relativistic quantum cryptography exploits the combined power of Minkowski causality and quantum information theory to control information in order to implement cryptographic tasks: what progress has been made in such methodologies?
- The Probabilistic No Miracles Argument Jan Sprenger: This paper develops a probabilistic reconstruction of the No Miracles Argument (NMA) in the debate between scientific realists and anti-realists. It is demonstrated that the persuasive force of the NMA depends on the particular disciplinary context where it is applied, and the stability of theories in that discipline. Assessments and critiques of “the” NMA, without reference to a particular context, are misleading and should be relinquished. This result has repercussions for recent anti-realist arguments, such as the claim that the NMA commits the base rate fallacy (Howson, 2000; Magnus and Callender, 2004). It also helps to explain the persistent disagreement between realists and anti-realists.
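A toy Bayesian calculation (illustrative numbers, not from Sprenger's paper) showing the structure at issue: how strongly predictive success confirms approximate truth depends on the base rate of approximately true theories in the discipline.

```python
# Toy numbers (not from the paper) illustrating the structure behind the
# "base rate fallacy" charge: the probability that a predictively successful
# theory is approximately true depends heavily on the base rate of
# approximately true theories in the discipline.

def posterior_true_given_success(base_rate, p_success_if_true=0.9,
                                 p_success_if_false=0.1):
    """P(theory approximately true | theory is predictively successful), by Bayes."""
    numerator = p_success_if_true * base_rate
    denominator = numerator + p_success_if_false * (1 - base_rate)
    return numerator / denominator

for base_rate in (0.01, 0.1, 0.5):
    print(f"base rate {base_rate:4.2f} -> posterior "
          f"{posterior_true_given_success(base_rate):.2f}")
```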
- Cosmic Acceleration in a Model of Fourth Order Gravity Shreya Banerjee, Nilesh Jayswal, and Tejinder P. Singh: A fourth-order model of gravity is investigated, having a free length parameter and no cosmological constant or dark energy. The authors consider cosmological evolution of a flat Friedmann universe in this model for the case that the length parameter is of the order of the present Hubble radius. By making a suitable choice for the present value of the Hubble parameter and for the value of the third derivative of the scale factor (the 'jerk'), the authors find that the model can explain cosmic acceleration to the same degree of accuracy as the standard concordance model. If the free length parameter is assumed to be time-dependent, and of the order of the Hubble parameter of the corresponding epoch, the model can still explain cosmic acceleration, and provides a possible resolution of the cosmic coincidence problem. Redshift drift in this model is also compared with that in the standard model.
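For reference, the 'jerk' is the standard dimensionless third derivative of the scale factor used in cosmography (a textbook definition, not a formula quoted from the paper):

```latex
j(t) \;=\; \frac{1}{a}\,\frac{d^{3}a}{dt^{3}}\,\frac{1}{H^{3}},
\qquad
H(t) \;=\; \frac{\dot a}{a}
```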
- Regularities, Natural Patterns and Laws of Nature Stathis Psillos: The goal of this paper is to outline and defend an empiricist metaphysics of laws of nature. The key empiricist idea is that there are regularities without regularity-enforcers. Differently put, there are natural laws without law-makers of a distinct metaphysical kind. This outline relies on the concept of a ‘natural pattern’ and, more significantly, on the existence of a network of natural patterns in nature. The relation between a regularity and a pattern will be analysed in terms of mereology. Here is the road map. Section 2 offers a brief discussion of the relation between empiricism and metaphysics, aiming to show that an empiricist metaphysics is possible. Section 3 offers arguments against stronger metaphysical views of laws. Then, in section 4, nomic objectivism is motivated. Section 5 addresses the question ‘what is a regularity?’ and develops a novel answer to it, based on the notion of a pattern. Section 6 details an analysis of the notion of a pattern, and section 7 raises the question ‘what is a law of nature?’, the answer to which is: a law of nature is a regularity that is characterised by the unity of a natural pattern.
- Arrows without time: a shape-dynamic account of the time-asymmetry of causation Nguyen, D. N.: Contemporary approaches to the rapprochement of the time-asymmetry of causation and the time-symmetry of fundamental physics often appeal to the thermodynamic arrow. Nguyen gives an overview of these approaches and criticisms of them, and argues that appealing to the thermodynamic arrow is a problematic strategy since it requires us to commit to the excess metaphysical baggage of absolute space and absolute time. He then develops a new account drawing on recent work on the theory of shape dynamics, which avoids this difficulty.
- Philosophy, logic, science, history Tim Crane: Analytic philosophy is sometimes said to have particularly close connections to logic and to science, and no particularly interesting or close relation to its own history. It is argued here that although the connections to logic and science have been important in the development of analytic philosophy, these connections do not come close to characterizing the nature of analytic philosophy, either as a body of doctrines or as a philosophical method. We will do better to understand analytic philosophy – and its relationship to continental philosophy – if we see it as a historically constructed collection of texts, which define its key problems and concerns. It is true, however, that analytic philosophy has paid little attention to the history of the subject. This is both its strength – since it allows for a distinctive kind of creativity – and its weakness – since ignoring history can encourage a philosophical variety of ‘normal science’.
- Agent Causation as the Solution to All the Compatibilist's Problems Ned Markosian: Markosian defends the view that 'agent causation' theorists should be compatibilists; in this paper, he goes on to argue that compatibilists should be agent causation theorists.
- Cheating is evolutionarily assimilated with cooperation in the continuous 'snowdrift game' Tatsuya Sasaki, Isamu Okada: It is well known that, in contrast to the Prisoner’s Dilemma, the snowdrift game can lead to a stable coexistence of cooperators and cheaters. Recent theoretical evidence on the snowdrift game suggests that gradual evolution, for individuals choosing to contribute in continuous degrees, can result in social diversification into 100% contribution and 0% contribution through so-called evolutionary branching. Until now, however, game-theoretical studies have shed little light on the evolutionary dynamics and consequences of the loss of diversity in strategy. Here an analysis of continuous snowdrift games with quadratic payoff functions in dimorphic populations is undertaken. Subsequently, conditions are clarified under which gradual evolution can lead a population consisting of those with 100% contribution and those with 0% contribution to merge into one species with an intermediate contribution level. The key finding is that the continuous snowdrift game is more likely to lead to assimilation of different cooperation levels than to maintenance of diversity. Importantly, this implies that allowing the gradual evolution of cooperative behavior can facilitate social inequity aversion in joint ventures that otherwise could cause conflicts that are based on commonly accepted notions of fairness.
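A hedged sketch of the standard continuous snowdrift game with quadratic benefit and cost functions (the Doebeli-Hauert-Killingback setup this literature builds on; the parameter values are illustrative and the code is not Sasaki and Okada's analysis):

```python
# Hedged sketch of the standard continuous snowdrift game with quadratic benefit
# and cost functions; parameter values are illustrative, not taken from the paper.

b1, b2 = 6.0, -1.4    # benefit B(z) = b1*z + b2*z**2 of the total investment z
c1, c2 = 4.56, -1.6   # cost    C(x) = c1*x + c2*x**2 of one's own investment x

dB = lambda z: b1 + 2 * b2 * z
dC = lambda x: c1 + 2 * c2 * x

def selection_gradient(x):
    """Derivative of the payoff B(x' + x) - C(x') with respect to x', at x' = x."""
    return dB(2 * x) - dC(x)

# Adaptive-dynamics iteration of a monomorphic contribution level
x = 0.1
for _ in range(2000):
    x += 0.01 * selection_gradient(x)

# At a convergence-stable singular point, evolutionary branching requires the
# second derivative of B(x' + x) - C(x') with respect to x' to be positive there.
disruptive = 2 * b2 - 2 * c2
print(f"singular contribution level x* ~ {x:.3f}; "
      f"second derivative {disruptive:+.2f} "
      f"({'branching possible' if disruptive > 0 else 'stable intermediate'})")
```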
- Quantum Interference Links The Fate of Two Atoms: For the first time, physicists from the CNRS and Université Paris-Sud at the Laboratoire Charles Fabry (CNRS/Institut d'Optique Graduate School) have achieved interference between two separate atoms: when sent towards the opposite sides of a semi-transparent mirror, the two atoms always emerge together. This type of experiment, which was carried out with photons around thirty years ago, had so far been impossible to perform with matter, due to the extreme difficulty of creating and manipulating pairs of indistinguishable atoms. The work is published in the journal Nature dated 2 April 2015.
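The underlying two-particle interference is the Hong–Ou–Mandel effect. In standard beam-splitter notation (a textbook calculation, not reproduced from the Nature paper), two indistinguishable bosons entering input ports a and b of a balanced beam splitter always leave through the same output port:

```latex
\hat a^{\dagger}\hat b^{\dagger}\lvert 0\rangle
\;\longrightarrow\;
\tfrac{1}{2}\bigl(\hat c^{\dagger}+\hat d^{\dagger}\bigr)\bigl(\hat c^{\dagger}-\hat d^{\dagger}\bigr)\lvert 0\rangle
\;=\;
\tfrac{1}{\sqrt{2}}\bigl(\lvert 2,0\rangle - \lvert 0,2\rangle\bigr)
```

The amplitude for one particle in each output cancels, so coincidences between the two output ports vanish for indistinguishable particles.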
- Alienation and Subjectivity in Marx and Foucault Jae Hetterley: In this paper, an assessment is made of the extent to which the Foucauldian critique of the 'subject' problematizes Marxist philosophy, given that a key aspect of Marx's critique of capitalism is the idea that the capitalist mode of production produces alienated 'labour', a concept grounded in objective and 'universalistic' notions of subjectivity.
- Learning about probabilistic inference and forecasting by 'playing' with multivariate normal distributions G. D’Agostini: The properties of the normal distribution under linear transformation, as well as the easy way to compute the covariance matrix of marginals and conditionals, offer a unique opportunity to gain insight into several aspects of uncertainties in measurements. The way to build the overall covariance matrix is illustrated in a few, but conceptually relevant, cases: several observations made with (possibly) different instruments measuring the same quantity; the effect of systematics (although limited to offsets, in order to stick to linear models) on the determination of the ‘true value’, as well as on the prediction of future observations; correlations which arise when different quantities are measured with the same instrument affected by an offset uncertainty; inferences and predictions based on averages; inference about constrained values; fits under some assumptions (linear models with known standard deviations). Many numerical examples are provided, exploiting the ability of the R language to handle large matrices and to produce high quality plots. Some of the results are framed in the general problem of ‘propagation of evidence’, crucial in analyzing graphical models of knowledge.
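The paper's numerical examples are written in R; here is an equivalent minimal sketch in Python/NumPy (illustrative numbers only) of the central operation, conditioning a multivariate normal on observed components:

```python
import numpy as np

# Minimal NumPy sketch (illustrative numbers only) of conditioning a
# multivariate normal x = (x1, x2) ~ N(mu, Sigma) on an observed value of x2:
#   x1 | x2 = a  is normal with
#   mean  mu1 + S12 S22^{-1} (a - mu2)   and   cov  S11 - S12 S22^{-1} S21

mu = np.array([1.0, 2.0, 0.5])
Sigma = np.array([[1.0, 0.6, 0.3],
                  [0.6, 2.0, 0.4],
                  [0.3, 0.4, 1.5]])

idx1, idx2 = [0], [1, 2]        # infer component 0 given observed components 1, 2
a = np.array([2.5, 0.0])        # observed values of x2

S11 = Sigma[np.ix_(idx1, idx1)]
S12 = Sigma[np.ix_(idx1, idx2)]
S22 = Sigma[np.ix_(idx2, idx2)]

cond_mean = mu[idx1] + S12 @ np.linalg.solve(S22, a - mu[idx2])
cond_cov = S11 - S12 @ np.linalg.solve(S22, S12.T)

print("conditional mean:", cond_mean)
print("conditional covariance:", cond_cov)
```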
- Replication, Communication, and the Population Dynamics of Scientific Discovery Richard McElreath, Paul Smaldino: Many published research results are false, and controversy continues over the roles of replication and publication policy in improving the reliability of research. A mathematical model is developed of scientific discovery in the context of replication, publication bias, and variation in research quality. This model provides a formal framework for reasoning about the normative structure of science. It is shown that replication may serve as a ratchet that gradually separates true hypotheses from false, but the same factors that make initial findings unreliable also make replications unreliable. The most important factors in improving the reliability of research are the rate of false positives and the base rate of true hypotheses: suggestions are offered here for addressing both. The results also clarify recent debates on the communication of replications. Surprisingly, publication bias is not always an obstacle, but instead may have positive impacts: suppression of negative novel findings is often beneficial. It is also found that communication of negative replications serves the scientific community even when replicated studies have diminished power. The model presented in this paper is only a start, but it speaks directly to ongoing debates about the design and conduct of science.
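A back-of-the-envelope calculation in the spirit of the model (illustrative numbers, not the authors' equations): the probability that a positive finding reflects a true hypothesis, before and after successful replications, as a function of the base rate of true hypotheses.

```python
# Back-of-the-envelope illustration (not the authors' equations): positive
# predictive value of a finding after an initial positive result and k
# positive replications, for different base rates of true hypotheses.

def ppv(base_rate, power=0.8, alpha=0.05, replications=0):
    """P(hypothesis true | one initial positive result and k positive replications)."""
    true_branch = base_rate * power ** (1 + replications)
    false_branch = (1 - base_rate) * alpha ** (1 + replications)
    return true_branch / (true_branch + false_branch)

for base_rate in (0.01, 0.1):
    for k in (0, 1, 2):
        print(f"base rate {base_rate:4.2f}, replications {k}: "
              f"PPV = {ppv(base_rate, replications=k):.3f}")
```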
- Is the world made of loops? Alexander Afriat: In discussions of the Aharonov-Bohm effect, Healey and Lyre have attributed reality to loops σ(0) (or hoops [σ(0)]), since the electromagnetic potential A is unmeasurable and can therefore be transformed. It is argued that [A] = [A + dλ]λ and the hoop [σ(0)] are related by a meaningful duality, so that however one feels about [A] (or any potential A ∈ [A]), it is no worse than [σ(0)] (or any loop σ(0) ∈ [σ(0)]): no ontological firmness is gained by retreating to the loops, which are just as flimsy as the potentials. And one wonders how the unmeasurability of one entity can invest another with physical reality; would an eventual observation of A dissolve σ(0), consigning it to a realm of incorporeal mathematical abstractions? The reification of loops rests on the potential’s “gauge dependence”; which in turn rests on its unmeasurability; which is too shaky and arbitrary a notion to carry so much weight.
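For background (standard electromagnetism, not the paper's own notation): the potential changes under a gauge transformation, but its integral around any closed loop, and hence the holonomy that figures in the Aharonov–Bohm phase, does not:

```latex
A \;\longmapsto\; A + d\lambda,
\qquad
\oint_{\sigma}\bigl(A + d\lambda\bigr) \;=\; \oint_{\sigma} A
\quad\text{for any closed loop } \sigma \ \text{(single-valued } \lambda\text{)}
```

So the loop integral is gauge invariant even though A itself is not, which is the sense in which the loops are said to be "better off" than the potential.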
- Why Physics Uses Second Derivatives Kenny Easwaran: A defense of a causal, reductionist account of the nature of rates of change like velocity and acceleration is offered. This account identifies velocity with the past derivative of position, and acceleration with the future derivative of velocity. Unlike most reductionist accounts, this account can preserve the role of velocity as a cause of future positions and acceleration as the effect of current forces. It is shown that this is possible only if all the fundamental laws are expressed by differential equations of the same order. Consideration of the continuity of time explains why the differential equations are all second-order. This explanation is not available on non-causal or non-reductionist accounts of rates of change. Finally, it is argued that alleged counterexamples to the reductionist account involving physically impossible worlds are irrelevant to an analysis of the properties that play a causal role in the actual world.
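The claim about order can be made concrete with Newton's second law, a second-order differential equation, together with the 'past derivative' reading of velocity (standard formulas used here for illustration, not quoted from Easwaran's paper):

```latex
m\,\frac{d^{2}x}{dt^{2}} \;=\; F\!\left(x,\frac{dx}{dt},t\right),
\qquad
v(t) \;=\; \lim_{h\to 0^{+}}\frac{x(t)-x(t-h)}{h}
```

The law fixes the second derivative of position from the current state, while the one-sided limit gives the 'past derivative' that the account identifies with velocity.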
- The theory of global imbalances: mainstream economics vs. structural Keynesianism Thomas I. Palley: Prior to the 2008 financial crisis there was much debate about global trade imbalances. Prima facie, the imbalances seem a significant problem. However, acknowledging that would question mainstream economics’ celebratory stance toward globalization. That tension prompted an array of theories which explained the imbalances while retaining the claim that globalization is economically beneficial. This paper surveys those new theories. It contrasts them with the structural Keynesian explanation that views the imbalances as an inevitable consequence of neoliberal globalization. The paper also describes how globalization created a political economy that supported the system despite its proclivity to generate trade imbalances.
- Essentialism and Anti-Essentialism in Feminist Philosophy Alison Stone: This article revisits the ethical and political questions raised by feminist debates over essentialism, the belief that there are properties essential to women and which all women share. Feminists’ widespread rejection of essentialism has threatened to undermine feminist politics. Re-evaluating two responses to this problem — ‘strategic’ essentialism and Iris Marion Young’s idea that women are an internally diverse ‘series’ — it is argued that both unsatisfactorily retain essentialism as a descriptive claim about the social reality of women’s lives. It is also argued that instead women have a ‘genealogy’: women always acquire femininity by appropriating and reworking existing cultural interpretations of femininity, so that all women become situated within a history of overlapping chains of interpretation. Because all women are located within this complex history, they are identifiable as belonging to a determinate social group, despite sharing no common understanding or experience of femininity. The idea that women have a genealogy thus reconciles anti-essentialism with feminist politics.
- Relatedness and Economies of Scale in the Provision of Different Kinds of Collective Goods Jorge Pena, Georg Noldeke and Laurent Lehmann: Many models proposed to study the evolution of collective action rely on a formalism that represents social interactions as n-player games between individuals adopting discrete actions such as cooperate and defect. Despite the importance of relatedness as a solution to collective action problems in biology and the fact that most social interactions unavoidably occur between relatives, incorporating relatedness into these models has so far proved elusive. The authors address this problem by considering mixed strategies and by integrating discrete-action n-player games into the direct fitness approach of social evolution theory. As an application, the authors use their mathematical framework to investigate the provision of three different kinds of collective goods, paradigmatic of a vast array of helping traits in nature: “public goods” (both providers and shirkers can use the good, e.g., alarm calls), “club goods” (only providers can use the good, e.g., participation in collective hunting), and “charity goods” (only shirkers can use the good, e.g., altruistic sacrifice). It is shown that relatedness relaxes the collective action problems associated with the provision of these goods in different ways depending on the kind of good (public, club, or charity) and on its economies of scale (constant, diminishing, or increasing returns to scale). The authors' findings highlight the importance of explicitly accounting for relatedness, the kind of good, and economies of scale in theoretical and empirical studies of collective action.
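A hedged sketch (not the authors' direct fitness formalism) of how the kind of good and its economies of scale enter the expected payoffs of an n-player game in which each co-player cooperates independently with probability x:

```python
from math import comb

# Hedged sketch (not the authors' formalism): expected payoffs in an n-player
# game where each of the n-1 co-players cooperates independently with
# probability x, for the three kinds of goods and for different economies of
# scale in how the benefit grows with the number of providers.

def benefit(k, scale="constant", b=1.0):
    """Benefit produced by k providers under different returns to scale."""
    if scale == "constant":      # linear: each extra provider adds the same amount
        return b * k
    if scale == "diminishing":   # concave: saturating returns
        return b * (1 - 0.5 ** k) / (1 - 0.5)
    if scale == "increasing":    # convex: synergy among providers
        return b * (1.5 ** k - 1) / (1.5 - 1)
    raise ValueError(scale)

def expected_payoffs(x, n=5, c=1.0, scale="constant", good="public"):
    """Expected payoff to a focal cooperator and a focal defector."""
    pay_c = pay_d = 0.0
    for k in range(n):  # k cooperators among the n-1 co-players
        p = comb(n - 1, k) * x ** k * (1 - x) ** (n - 1 - k)
        if good == "public":     # providers and shirkers both use the good
            pay_c += p * (benefit(k + 1, scale) - c)
            pay_d += p * benefit(k, scale)
        elif good == "club":     # only providers use the good
            pay_c += p * (benefit(k + 1, scale) - c)
        elif good == "charity":  # only shirkers use the good
            pay_c += p * (-c)
            pay_d += p * benefit(k, scale)
    return pay_c, pay_d

print(expected_payoffs(0.5, scale="diminishing", good="public"))
```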
- Digging into the “Giant Resonance”, scientists find hints of new quantum physics: A collaboration between theoretical and experimental physicists from European XFEL and the Center for Free Electron Laser Science (CFEL) at DESY has uncovered previously unknown quantum states inside atoms.
- Leibniz on the Modal Status of Absolute Space and Time Martin Lin: Leibniz is a relationalist about space and time. He believes that nothing spatial or temporal is more fundamental than the spatial and temporal relations that obtain between things. These relations are direct: they are unmediated by anything spatially or temporally absolute such as points in space or moments in time. Some philosophers, for example, Newton and Clarke, disagree. They think that space and time are absolute. Their absolutism can take different forms. Newton, for example, believes that space is a substance, or more accurately, something substance-like. A substance is not a relation of any kind. Therefore, if space is a substance or substance-like, then it is absolute. Other absolutists, such as Clarke, believe that space is a monadic property of God. A monadic property is not a relation and thus if space is a monadic property, then it is absolute. Leibniz clearly thinks that absolutism is false. What is less clear is his attitude toward its modal status. Are absolute space and time merely contingently non-actual or are they impossible? In his correspondence with Clarke, Leibniz makes a number of claims regarding this issue that appear, on the face of it, to be inconsistent with one another. He argues that the Principle of the Identity of Indiscernibles (the PII) follows from God’s wisdom. God’s wisdom is the basis of only contingent truths, thus it would follow that the PII is a contingent truth. He argues against absolute space and time by way of the PII. This suggests that relationalism is also a contingent truth and so absolute space and time must be merely contingently non-actual. And yet he also appears to claim that absolute space and time are impossible. What justifies his claim that they are impossible? Is Leibniz being inconsistent?
- Reduction, Emergence and Renormalization Jeremy Butterfield: Excellent paper on the interface between philosophy and physics (science). In previous works, the author described several examples combining reduction and emergence, where reduction is understood à la Ernest Nagel, and emergence is understood as behaviour or properties that are novel (by some salient standard). Here, the aim is again to reconcile reduction and emergence, for a case which is apparently more problematic than those treated before: renormalization. Renormalization is a vast subject, so the author confines himself to emphasizing how the modern approach to renormalization (initiated by Wilson and others between 1965 and 1975), when applied to quantum field theories, illustrates both Nagelian reduction and emergence. The author's main point is that the modern understanding of how renormalizability is a generic feature of quantum field theories at accessible energies gives us a conceptually unified family of Nagelian reductions. That is worth saying since philosophers tend to think of scientific explanation as only explaining an individual event, or perhaps a single law, or at most deducing one theory as a special case of another. Here we see a framework in which there is a space of theories endowed with enough structure that it provides a family of reductions.
- Quantum Darwinism and non-Markovian dissipative dynamics from quantum phases of the spin−1/2 XX model Gian Luca Giorgi, Fernando Galve, and Roberta Zambrini: Quantum Darwinism explains the emergence of a classical description of objects in terms of the creation of many redundant registers in an environment containing their classical information. This amplification phenomenon, where only classical information reaches the macroscopic observer and through which different observers can agree on the objective existence of such an object, has been revived lately for several types of situations, successfully explaining classicality. Here quantum Darwinism is explored in the setting of an environment made of two-level systems initially prepared in the ground state of the XX model, which exhibits different phases; it is found that the different phases have different abilities to redundantly acquire classical information about the system, with the “ferromagnetic phase” being the only one able to complete quantum Darwinism. At the same time, the authors relate this ability to how non-Markovian the system dynamics is, based on the interpretation that non-Markovian dynamics is associated with a back flow of information from environment to system, thus spoiling the information transfer needed for Darwinism. Finally, the authors explore the mixing of bath registers by allowing a small interaction among them, finding that this spoils the stored information, as previously found in the literature.
- Blind Rule Following and the 'Antinomy of Pure Reason' Alex Miller: Saul Kripke identifies the 'rule-following problem' as finding an answer to the question: what makes it the case that a speaker means one thing rather than another by a linguistic expression? Crispin Wright and Paul Boghossian have argued on many occasions that this problem could be neutralised via the adoption of a form of non-reductionism about content. In recent work on 'blind rule-following', both now argue that even if a non-reductionist view can be defended in such a way as to neutralise the challenge posed by Kripke's Wittgenstein, a more fundamental problem about rule-following remains unsolved. In this paper, it is argued that, courtesy of a non-reductionist conception of content, we can successfully meet the 'Kripkensteinian' challenge in such a way that the Wright-Boghossian problems are themselves neutralised.
- Spatial Evolutionary Public Goods Game on Complete Graph and Dense Complex Networks Jinho Kim, Huiseung Chae, Soon-Hyung Yook & Yup Kim: The authors study the spatial evolutionary public goods game (SEPGG) with voluntary or optional participation on a complete graph (CG) and on dense networks. Based on analyses of the SEPGG rate equation on finite CG, they find that SEPGG has two stable states depending on the value of multiplication factor 'r', illustrating how the “tragedy of the commons” and “an anomalous state without any active participants” occur in real-life situations. When 'r' is low, the state with only loners is stable, and the state with only defectors is stable when 'r' is high. The authors also derive the exact scaling relation for 'r*'. All of the results are confirmed by numerical simulation. Furthermore, they find that a cooperator-dominant state emerges when the number of participants or the mean degree, ⟨k⟩, decreases. They also investigate the scaling dependence of the emergence of cooperation on 'r' and ⟨k⟩. These results show how the “tragedy of the commons” disappears when cooperation between egoistic individuals increases without any additional socioeconomic punishment.
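For readers unfamiliar with the payoff structure, here is the standard optional public goods game that underlies such models (illustrative parameters; this is not the authors' rate-equation analysis on the complete graph):

```python
# Standard optional public goods game payoffs (illustrative parameters):
# cooperators contribute c to a common pool, the pool is multiplied by r and
# shared equally among all participants, and loners opt out for a fixed payoff.

def payoffs(n_cooperators, n_defectors, r=3.0, c=1.0, sigma=1.0):
    """Return (cooperator, defector, loner) payoffs for one interaction group."""
    participants = n_cooperators + n_defectors
    if participants < 2:                   # no public good without at least two players
        return sigma, sigma, sigma
    share = r * c * n_cooperators / participants
    return share - c, share, sigma

for r in (1.5, 3.0, 5.0):
    pc, pd, pl = payoffs(n_cooperators=3, n_defectors=2, r=r)
    print(f"r = {r}: cooperator {pc:.2f}, defector {pd:.2f}, loner {pl:.2f}")
```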
- The Feminine Subject Susan Hekman: I cannot recommend this book highly enough - In 1949 Simone de Beauvoir asked, “What does it mean to be a woman?” Her answer to that question inaugurated a radical transformation of the meaning of “woman” that defined the direction of subsequent feminist theory. What Beauvoir discovered is that it is impossible to define “woman” as an equal human being in our philosophical and political tradition. Her effort to redefine “woman” outside these parameters set feminist theory on a path of radical transformation. The feminist theorists who wrote in the wake of Beauvoir’s work followed that path. Susan Hekman’s original and highly engaging book traces the evolution of “woman” from Beauvoir to the present. In a comprehensive synthesis of a number of feminist theorists, she covers French feminist thinkers Luce Irigaray and Helene Cixous as well as theorists such as Carol Gilligan, Carole Pateman and Judith Butler. The book examines the relational self, feminist liberalism and Marxism, as well as feminist theories of race and ethnicity, radical feminism, postmodern feminism and material feminism. Hekman argues that the effort to redefine “woman” in the course of feminist theory is a cumulative process in which each approach builds on that which has gone before. Although they have approached “woman” from different perspectives, feminist theorists have moved beyond the negative definition of our tradition to a new concept that continues to evolve.
- The Future of Differences: Truth and Method in Feminist Theory Susan Hekman: yet another gem by Susan - 'This is an ambitious book. It seeks to develop a clear theory of difference(s) on which to ground feminist epistemology and practice. Hekman's contention is that feminists must eschew equally both universalism and relativism. Her careful and insightful readings of feminist classics and contemporary scholarship have produced a text that will become a classic in its own right. Hekman's modestly stated ambition is to provide a form of analysis that engages both with differences and with general concepts. Her reading of Weber is truly a tour de force in this regard. This is a book that every feminist scholar will want to read and use.' Henrietta L. Moore, Professor of Social Anthropology and Director of the Gender Institute, London School of Economics.
- Supernumeration: Vagueness and Numbers Peter Simons: There is a notable discrepancy between philosophers and practitioners on approaches to vagueness. Philosophers almost all reject fuzzy logic and a majority accept some form of supervaluational theory. Practitioners analysing real data, on the other hand, use fuzzy logic, because computer algorithms exist for it, despite its theoretical shortcomings. These two communities should not remain separate. The solution, it is argued, is to put supervaluation and numbers together. After reviewing the principal and well-known defects of fuzzy logic, this paper shows how to use numerical values in conjunction with a supervaluational approach to vagueness. The two principal working ideas are degrees of candidature (of objects and predicates) and expected truth-value. An outline is then presented of the theory, which combines vagueness of predicates and vagueness of objects, and its pros and cons are discussed, with particular attention to the obvious principal objections: that the theory is complex, that there is arbitrariness in the selection of numbers, and that penumbral connections must be accounted for. It is then contended that all these objections can be answered.
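A minimal toy construction (for illustration only, not Simons' formal theory) of an 'expected truth-value': a vague predicate is represented by weighted admissible sharpenings, and a sentence's value is the weighted proportion of sharpenings on which it comes out true.

```python
# Toy construction for illustration only (not Simons' formal theory): a vague
# predicate is represented by a set of admissible sharpenings with weights, and
# the "expected truth-value" of a sentence is the weighted proportion of
# sharpenings on which it comes out true.

def expected_truth_value(sentence, sharpenings, weights):
    """Weighted fraction of admissible sharpenings on which the sentence is true."""
    total = sum(weights)
    return sum(w for s, w in zip(sharpenings, weights) if sentence(s)) / total

# Example: 'tall' sharpened by a cut-off height; is a 178 cm person tall?
cutoffs = [170, 175, 180, 185]      # admissible cut-offs for 'tall', in cm
weights = [0.1, 0.3, 0.4, 0.2]      # relative goodness of each sharpening
is_tall_178 = lambda cutoff: 178 >= cutoff

print(expected_truth_value(is_tall_178, cutoffs, weights))   # -> 0.4
```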