- Poetic Wandering - Walking Tour Highlights The Sights and Sounds of Literary Harvard University Georgia Bellas and Sarah Sweeney: April is National Poetry Month, though at Harvard every month could be. The University’s poetic legacy dates back hundreds of years and has helped shape the world’s literary canon. E.E. Cummings, John Ashbery, and Wallace Stevens are among the University’s well-known poetic alumni, while Maxine Kumin and Adrienne Rich attended Radcliffe. Harvard Gazette invites you to explore Harvard by foot and ear. This walking tour of campus can be completed in a lunch hour or less, and pairs classic Harvard landmarks with a sampling of the poets connected to the University. Using recordings housed at the Woodberry Poetry Room as well as new recordings, the tour also commemorates the April 13 birth of Seamus Heaney, a Nobel Prize winner and Harvard’s one-time Boylston Professor and poet-in-residence. Heaney died on Aug. 30, 2013, but his mark on Harvard is indelible.
- On The Representation of Women in Cognition Roberta Klatzky, Lori Holt, & Marlene Behrmann: Upon reading the recent Cognition special issue, titled “The Changing Face of Cognition” (February 2015), the authors of this discussion felt a collective sense of dismay. Perusing the table of contents, they were struck by the fact that among the 19 authors listed for the 12 articles, only one female author was present. While the substantive content of the issue may persuade them that the face of cognition is changing, it appears that changes in gender distribution are not to be expected. The face of cognitive science will remain unequivocally male. According to recent statistics (NSF, 2013), more than 50% of doctorates awarded in cognitive psychology and psycholinguistics were to women, and the same holds for neuropsychology and experimental psychology. A clear implication is that women scientists should play a significant role in the future of cognitive science and cognitive neuroscience. The authors ask, then, why would the journal present an image of this science’s future as envisioned largely by male scientists?
- Rational theory choice: Arrow undermined, Kuhn vindicated Seamus Bradley: In a recent paper, Samir Okasha presented an argument that suggests that there is no rational way to choose among scientific theories. This would seriously undermine the view that science is a rational enterprise. In this paper the author shows how a suitably nuanced view of what scientific rationality requires allows us to avoid Okasha’s conclusion. The author then goes on to argue that making further assumptions about the space of possible scientific theories allows us to make scientific rationality more contentful. Finally, the author shows how such a view of scientific rationality fits with what Thomas Kuhn thought.
- Quantum Criticality in Life's Proteins Gabor Vattay: Stuart Kauffman, from the University of Calgary, and several of his colleagues have recently published a paper on the arXiv server titled 'Quantum Criticality at the Origins of Life'. The idea of quantum criticality, and more generally of quantum critical states, comes, perhaps not surprisingly, from solid state physics. It describes unusual electronic states that are balanced somewhere between conduction and insulation. More specifically, under certain conditions, current flow at the critical point becomes unpredictable. When it does flow, it tends to do so in avalanches that vary by several orders of magnitude in size. Ferromagnetic metals, like iron, are one familiar example of a material that has a classical critical point. Above a critical temperature of 1043 K, the magnetization of iron is completely lost. In the narrow range approaching this point, however, thermal fluctuations in the electron spins that underlie the magnetic behavior extend over all length scales of the sample: that is the scale invariance characteristic of a critical point. In this case we have a continuous phase transition that is thermally driven, as opposed to being driven by something else like external pressure, magnetic field, or some kind of chemical influence. Quantum criticality, on the other hand, is usually associated with stranger electronic behaviors, things like high-temperature superconductivity or so-called heavy fermion metals like CeRhIn5. One strange behavior in the case of heavy fermions, for example, is the observation of a large 'effective mass', up to 1000 times normal, for the conduction electrons as a consequence of their narrow electronic bands. These kinds of phenomena can only be explained in terms of the collective behavior of highly correlated electrons, as opposed to more familiar theory based on decoupled electrons.
- Moral Loopholes in the Global Economic Environment: Why Well-Intentioned Organizations Act in Harmful Ways S. L. Reiter: Thomas Pogge’s notion of moral loopholes serves to provide support for two claims: first, that the ethical code of the global economic order contains moral loopholes that allow participants in special social arrangements to reduce their obligations to those outside the social arrangement, which leads to morally objectionable actions for which no party feels responsible and that are also counterproductive to the overall objective of the economic system; and, second, that these moral loopholes are more likely to exist as our economic order becomes more global. It will be shown that attempts to rectify the situation with voluntary corporate codes of conduct are inadequate.
- Description And The Problem Of Priors Jeffrey A. Barrett: Belief-revision models of knowledge describe how to update one’s degrees of belief associated with hypotheses as one considers new evidence, but they typically do not say how probabilities become associated with meaningful hypotheses in the first place. Here the author considers a variety of Skyrms-Lewis signaling game [Lewis (1969)] [Skyrms (2010)] where simple descriptive language and predictive practice and associated basic expectations co-evolve. Rather than assigning prior probabilities to hypotheses in a fixed language then conditioning on new evidence, the agents begin with no meaningful language or expectations then evolve to have expectations conditional on their descriptions as they evolve to have meaningful descriptions for the purpose of successful prediction. The model, then, provides a simple but concrete example of how the process of evolving a descriptive language suitable for inquiry might also provide agents with effective priors.
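To make this kind of model concrete, here is a minimal sketch of a two-state Lewis-Skyrms signaling game learned by urn-style (Roth-Erev) reinforcement; the payoffs, learning rule, and parameters are illustrative assumptions, not the specific co-evolution model analyzed in the paper.

```python
import random

# Minimal Lewis-Skyrms signaling game with Roth-Erev (urn) reinforcement.
# Two equiprobable states, two signals, two acts. The payoffs and the
# learning rule are illustrative assumptions, not the paper's exact model.

STATES = SIGNALS = ACTS = range(2)

sender = [[1.0, 1.0] for _ in STATES]      # urn weights: sender[state][signal]
receiver = [[1.0, 1.0] for _ in SIGNALS]   # urn weights: receiver[signal][act]

def draw(weights):
    """Sample an index with probability proportional to its urn weight."""
    return random.choices(range(len(weights)), weights=weights)[0]

successes = 0
trials = 100_000
for _ in range(trials):
    state = random.choice(STATES)
    signal = draw(sender[state])
    act = draw(receiver[signal])
    if act == state:                        # coordination succeeded
        sender[state][signal] += 1.0        # reinforce the dispositions used
        receiver[signal][act] += 1.0
        successes += 1

print(f"coordination rate: {successes / trials:.3f}")
# As the urns converge, each signal comes to carry information about a state,
# which is the sense in which a descriptive language and the associated
# expectations can co-evolve from initially meaningless signals.
```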
- Consensus based Detection in the Presence of Data Falsification Attacks Bhavya Kailkhura: This paper considers the problem of detection in distributed networks in the presence of data falsification (Byzantine) attacks. Detection approaches considered in the paper are based on fully distributed consensus algorithms, where all of the nodes exchange information only with their neighbors in the absence of a fusion center. In such networks, the author characterizes the negative effect of Byzantines on the steady-state and transient detection performance of conventional consensus based detection algorithms. To address this issue, the author studies the problem from the network designer’s perspective. More specifically, he first proposes a distributed weighted average consensus algorithm that is robust to Byzantine attacks. It is shown that, under reasonable assumptions, the global test statistic for detection can be computed locally at each node using the proposed consensus algorithm. Then, the author exploits the statistical distribution of the nodes’ data to devise techniques for mitigating the influence of data-falsifying Byzantines on the distributed detection system. Since some parameters of the statistical distribution of the nodes’ data might not be known a priori, a learning-based technique is proposed to enable an adaptive design of the local fusion or update rules.
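For orientation, the baseline (non-robust) iteration such algorithms build on can be sketched as an ordinary weighted-average consensus update with Metropolis weights; the topology and local statistics below are made up for illustration, and the Byzantine-mitigation weighting described in the paper is not reproduced here.

```python
import numpy as np

# Baseline distributed average-consensus sketch on an undirected graph.
# Each node repeatedly mixes its value with its neighbors' values using
# Metropolis weights. The Byzantine-robust weighted variant in the paper
# additionally adapts these weights; that part is not shown.

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}   # assumed example topology
x = np.array([4.0, 8.0, 6.0, 2.0])                   # assumed local test statistics

def metropolis_weight(i, j):
    return 1.0 / (1 + max(len(adj[i]), len(adj[j])))

for _ in range(200):
    x_new = x.copy()
    for i in adj:
        for j in adj[i]:
            x_new[i] += metropolis_weight(i, j) * (x[j] - x[i])
    x = x_new

print(x)   # every entry approaches the network-wide average (5.0 here),
           # i.e. the global test statistic becomes available at each node
```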
- Confirmation in the Cognitive Sciences: the Problematic Case of Bayesian Models Frederick Eberhardt and David Danks: Bayesian models of human learning are becoming increasingly popular in cognitive science. It is argued that their purported confirmation largely relies on a methodology that depends on premises that are inconsistent with the claim that people are Bayesian about learning and inference. Bayesian models in cognitive science derive their appeal from their normative claim that the modeled inference is in some sense rational. Standard accounts of the rationality of Bayesian inference imply predictions that an agent selects the option that maximizes the posterior expected utility. Experimental confirmation of the models, however, has been claimed because of groups of agents that “probability match” the posterior. Probability matching only constitutes support for the Bayesian claim if additional unobvious and untested (but testable) assumptions are invoked. The alternative strategy of weakening the underlying notion of rationality no longer distinguishes the Bayesian model uniquely. A new account of rationality — either for inference or for decision-making — is required to successfully confirm Bayesian models in cognitive science.
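The contrast at issue can be made concrete with a toy simulation (not the authors' experiments): under 0-1 utility, an agent that maximizes posterior expected utility always reports the more probable option, whereas a probability-matching group reports each option with its posterior probability and is therefore measurably less accurate.

```python
import numpy as np

# Toy illustration of posterior maximizing vs. probability matching under
# 0-1 utility; the 0.7 / 0.3 posterior is an arbitrary assumption.

rng = np.random.default_rng(0)
posterior = np.array([0.7, 0.3])
truth = rng.choice(2, size=100_000, p=posterior)        # which option is correct

maximizer = np.zeros_like(truth)                         # always pick option 0
matcher = rng.choice(2, size=truth.size, p=posterior)    # sample from posterior

print("maximizer accuracy:", np.mean(maximizer == truth))   # ~0.70
print("matcher accuracy:  ", np.mean(matcher == truth))     # ~0.7^2 + 0.3^2 = 0.58
```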
- Reason, Value, and Respect: Kantian Themes from the Philosophy of Thomas E. Hill, Jr. Edited by Mark Timmons and Robert N. Johnson: In thirteen specially written essays, leading philosophers explore Kantian themes in moral and political philosophy that are prominent in the work of Thomas E. Hill, Jr. The first three essays focus on respect and self-respect; the second three on practical reason and public reason. The third section covers a set of topics in social and political philosophy, including Kantian perspectives on homicide and animals. The final set of essays discusses duty, volition, and complicity in ethics. In conclusion, Hill offers an overview of his work and responses to the preceding essays.
- Moral investing: Psychological motivations and implications Enrico Rubaltelli, Lorella Lotto, Ilana Ritov, Rino Rumiati: In four experiments the authors showed that investors are not only interested in maximizing returns but have non-financial goals, too. They considered what drives the decision to invest ethically and the impact this strategy has on people’s evaluation of investment performance. In Study 1, participants who chose a moral portfolio (over an immoral one) reported being less interested in maximizing their gains and more interested in being true to their moral values. These participants also reported feeling less disappointment upon learning that a different decision could have yielded a better outcome. In Studies 2 and 3, the authors replicated these findings when investors decided not to invest in immoral assets, rather than when they chose to invest morally. In Study 4, the authors found similar results using the same industrial sector in both the moral and the immoral conditions and providing participants with information about the expected return of the portfolio they were presented with. These findings lend empirical support to the conclusion that investors have both utilitarian (financial) goals and expressive (non-financial) ones, and they show how non-financial motivations can influence the reaction to unsatisfactory investment performance.
- Rethinking Responsibility in Science and Technology Fiorella Battaglia, Nikil Mukerji, and Julian Nida-Rümelin: The idea of responsibility is deeply embedded in the “lifeworld” of human beings and is not itself subject to change. However, the empirical circumstances in which we act and ascribe responsibility to one another are subject to change. Science and technology play a great part in this transformation process. Therefore, it is important for us to rethink the idea, the role, and the normative standards behind responsibility in a world that is constantly being transformed under the influence of scientific and technological progress. This volume is a contribution to that joint societal effort.
- The 'Consistent Histories' Formalism and the Measurement Problem Elias Okon and Daniel Sudarsky: The authors defend the claim that the Consistent Histories formulation of quantum mechanics does not solve the measurement problem. In order to do so, they argue that satisfactory solutions to the problem must not only avoid anthropomorphic terms (such as measurement or observer) at the fundamental level, but must also ensure that applications of the formalism to concrete situations (e.g., measurements) do not require any input not contained in the description of the situation at hand at the fundamental level. The authors' assertion is that the Consistent Histories formalism does not meet the second criterion. It is also argued that the so-called second measurement problem, i.e., the inability to explain how an experimental result is related to a property possessed by the measured system before the measurement took place, is only a pseudo-problem. As a result, the authors reject the claim that the capacity of the Consistent Histories formalism to solve it should count as an advantage over other interpretations.
- 10 Things you might not know about Black Holes: Whether you’re a gravitational guru or an armchair astronomer, you’ll gravitate toward these lesser-known facts about black holes. Believed to be the churning engines of most galaxies, black holes push the known laws of physics to their limits, and inspire some great (and some not-so-great) sci-fi adventures. To borrow a line from Perimeter Associate Faculty member Avery Broderick: “Black holes don’t 'suck' – they’re awesome!” Here are some oft-overlooked nuggets of black hole awesomeness. Feel free to use them at parties to add a little gravitational 'gravitas' to any conversation.
- Fisher information and quantum mechanical models for finance V. A. Nastasiuk: The probability distribution function (PDF) for prices on financial markets is derived by extremization of Fisher information. It is shown how on that basis the quantum-like description for financial markets arises and different financial market models are mapped by quantum mechanical ones.
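For readers unfamiliar with the construction, the standard Fisher-information route to a quantum-like description runs roughly as follows; this is a schematic only, and the specific constraint terms U(x) standing in for financial observables are an assumption of the sketch, not taken from the paper.

```latex
% Schematic Fisher-information extremization (constraint terms U(x) assumed).
\[
  I[p] = \int \frac{\bigl(\partial_x p(x)\bigr)^2}{p(x)}\,\mathrm{d}x ,
  \qquad p(x) = \psi(x)^2 \;\Rightarrow\; I = 4\int \bigl(\partial_x \psi(x)\bigr)^2 \mathrm{d}x .
\]
% Extremizing I subject to normalization and constraints of the form
% \int U(x)\,p(x)\,dx = const gives the Euler--Lagrange condition
\[
  -4\,\partial_x^2 \psi(x) + U(x)\,\psi(x) = \lambda\,\psi(x),
\]
% a stationary Schrodinger-like equation whose solution \psi yields the
% market price PDF p = \psi^2.
```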
- The Social Impact of Economic Growth Editors Susanna Price and Kathryn Robinson explore the social aspects of Chinese economic growth in their soon-to-be-published book, Making a Difference? Social Assessment Policy and Praxis and its Emergence in China. Below, Susanna Price offers further insight into the book’s origins and the impact the book may have on the field of Asian development studies.
- Pricing postselection: the cost of indecision in decision theory Joshua Combes and Christopher Ferrie: Postselection is the process of discarding outcomes from statistical trials that are not the event one desires. Postselection can be useful in many applications where the cost of getting the wrong event is implicitly high. However, unless this cost is specified exactly, one might conclude that discarding all data is optimal. Here the authors analyze the optimal decision rules and quantum measurements in a decision theoretic setting where a pre-specified cost is assigned to discarding data. They also relate their formulation to previous approaches which focus on minimizing the probability of indecision.
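As a point of comparison, the classical analogue of pricing indecision is the Bayes decision rule with a reject (abstain) option: under 0-1 loss with a fixed cost for discarding a trial, one answers only when the posterior is confident enough. The sketch below illustrates that classical rule with assumed numbers, not the quantum-measurement formulation of the paper.

```python
import numpy as np

# Bayes decision with a priced "discard" option (classical reject rule).
# Losses: 0 for a correct report, 1 for a wrong report, c for discarding.
# This is a classical analogue, not the paper's quantum formulation.

def decide(posterior, discard_cost):
    """Return the chosen label, or None to discard (postselect out) the trial."""
    posterior = np.asarray(posterior, dtype=float)
    best = int(np.argmax(posterior))
    expected_error = 1.0 - posterior[best]   # expected loss if forced to answer
    return best if expected_error <= discard_cost else None

print(decide([0.8, 0.2], discard_cost=0.3))   # 0: expected error 0.2 < cost 0.3
print(decide([0.8, 0.2], discard_cost=0.1))   # None: discarding is cheaper
# As discard_cost -> 0 nothing is ever reported, the degenerate "throw away
# all data" behaviour that an exactly specified cost is meant to rule out.
```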
- Should Our Brains Count as Courtroom Evidence? Kamala Kelkar: Judges in the future could tap straight into criminal brains and nip second offenders before they’ve had a chance to do it again, says the Obama administration. Incriminating biomarkers could eventually be used by courts to predict recidivism and influence decisions about parole, bail and sentencing, finds the second volume of a report called Gray Matters released late last month by the president’s commission on bioethics. The judicial system already uses questionable methods to adjust sentences based on the defendant’s criminal, psychological and social background, so there’s an allure to using brain scans for possibly more efficient and objective risk assessment profiles. But the prospect of neuroprediction in the courtroom leads to a slew of ethical and moral questions. Should we assign longer sentences to a criminal functioning with what scientists say is a brain ripe for a second offense? Why not just let brain scans identify the most dangerous people and send them straight to jail before they’ve committed a crime? “There’s a lot of motivation to literally get inside the heads of criminals,” said Lisa Lee, the executive director of the commission. “What the commission was really concerned about was the careful and accurate use of neuroscience in the courtroom—given what’s on the line.”
- Mind, Reason, and Being-in-the-World: The McDowell-Dreyfus Debate Joseph K. Schear: John McDowell and Hubert L. Dreyfus are philosophers of world renown, whose work has decisively shaped the fields of analytic philosophy and phenomenology respectively. Mind, Reason, and Being-in-the-World: The McDowell-Dreyfus Debate opens with their debate over one of the most important and controversial subjects of philosophy: is human experience pervaded by conceptual rationality, or does experience mark the limits of reason? Is all intelligibility rational, or is there a form of intelligibility at work in our skilful bodily rapport with the world that eludes our intellectual capacities? McDowell and Dreyfus provide a fascinating insight into some fundamental differences between analytic philosophy and phenomenology, as well as areas where they may have something in common. Fifteen specially commissioned chapters by distinguished international contributors enrich the debate inaugurated by McDowell and Dreyfus, taking it in a number of different and important directions. Fundamental philosophical problems discussed include: the embodied mind, subjectivity and self-consciousness, intentionality, rationality, practical skills, human agency, and the history of philosophy from Kant to Hegel to Heidegger to Merleau-Ponty. With the addition of these outstanding contributions, Mind, Reason, and Being-in-the-World is essential reading for students and scholars of analytic philosophy and phenomenology.
- Objective probability-like things with and without objective indeterminism László E. Szabó: It is argued that there is no such property of an event as its “probability.” This is why standard interpretations cannot give a sound definition in empirical terms of what “probability” is, and this is why empirical sciences like physics can manage without such a definition. “Probability” is a collective term, the meaning of which varies from context to context: it means different, dimensionless, [0, 1]-valued physical quantities characterising the different particular situations. In other words, probability is a reducible concept, supervening on physical quantities characterising the state of affairs corresponding to the event in question. On the other hand, these “probability-like” physical quantities correspond to objective features of the physical world, and are objectively related to measurable quantities like relative frequencies of physical events based on finite samples, no matter whether the world is objectively deterministic or indeterministic.
- Science shows there is more to a Rembrandt than meets the eye: Art historians and scientists use imaging methods to virtually "dig" under or scan various layers of paint and pencil. This is how they decipher how a painter went about producing a masterpiece – without harming the original. A comparative study with a Rembrandt van Rijn painting as its subject found that the combined use of three imaging techniques provides valuable complementary information about what lies behind this artwork's complex step-by-step creation. The study, led by Matthias Alfeld of the University of Antwerp in Belgium, is published in Springer's journal Applied Physics A: Materials Science and Processing. Rembrandt's oil painting Susanna and the Elders is dated and signed 1647. It hangs in the art museum Gemäldegalerie in Berlin, Germany. The painting contains a considerable amount of the artist's changes, or so-called pentimenti (from the Italian verb pentire, "to repent"), underneath the current composition. This was revealed in the 1930s when the first X-ray radiography (XRR) was done on it. More hidden details about changes made with pigments other than lead white were discovered when the painting was investigated in 1994 using neutron activation autoradiography (NAAR). Alfeld's team chose to investigate Susanna and the Elders not only because of its clearly visible pentimenti, but also because of its smaller size. Macro-X-ray fluorescence (MA-XRF) scans could thus be done in a single day using an in-house scanner at the museum in Berlin. These were then compared to existing radiographic images of the painting.
- Symmetry and the Metaphysics of Physics David John Baker: The widely held picture of dynamical symmetry as surplus structure in a physical theory has many metaphysical applications. Here the author focuses on its relevance to the question of which quantities in a theory represent fundamental natural properties.
- Measuring the Value of Science Rod Lamberts: Reports about the worthy contributions of science to national economies pop up regularly all around the world – from the UK to the US and even the developing world. In Australia, the Office of the Chief Scientist recently released an analysis of science and its contribution to the economy down under, finding it's worth around A$145 billion a year. It's perfectly sensible and understandable that science (and related sectors) would feel the need to account for themselves in financial or economic terms. But in doing this we need to be wary of getting lulled into believing that this is the only – or worse, the best – way of attributing value to science. When it comes to determining the value of science, we should heed the words of the American environmental scientist and thinker, Donella Meadows, on how we think about indicators: Indicators arise from values (we measure what we care about), and they create values (we care about what we measure). Indicators are often poorly chosen. The choice of indicators is a critical determinant of the behaviour of a system. Much public debate about the value of science has been hijacked by the assumption that direct, tangible economic impact is the way to measure scientific worth. We seem now to be in a place where positing non-economic arguments for science benefits runs the risk of being branded quaintly naïve and out-of-touch at best, or worse: insensitive, irrelevant and self-serving. But relegating science to the status of mere servant of the economy does science a dramatic disservice and leaves both science and society the poorer for it. So here are five ways we can acknowledge and appreciate the societal influences and impacts of science that lie well beyond the dreary, soulless, cost-benefit equations of economics.
- Evolution and Normative Scepticism Karl Schafer: It is increasingly common to suggest that the combination of evolutionary theory and normative realism leads inevitably to a general scepticism about our ability to reliably form normative beliefs. In what follows, the author argues that this is not the case. In particular, he considers several possible arguments from evolutionary theory and normative realism to normative scepticism and explains where they go wrong. He then gives a more general diagnosis of the tendency to accept such arguments and of why this tendency should be resisted.
- A Generative Probabilistic Model For Deep Convolutional Learning Yunchen Pu, Xin Yuan, and Lawrence Carin: A generative model is developed for deep (multi-layered) convolutional dictionary learning. A novel probabilistic pooling operation is integrated into the deep model, yielding efficient bottom-up (pretraining) and top-down (refinement) probabilistic learning. Experimental results demonstrate powerful capabilities of the model to learn multi-layer features from images, and excellent classification results are obtained on the MNIST and Caltech 101 datasets.
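As a rough fix on the setting, a single layer of convolutional dictionary learning represents an image as a sum of learned filters convolved with sparse feature maps; the deep model stacks such layers, with the (probabilistically pooled) feature maps of one layer serving as the input to the next. A one-layer schematic, in notation assumed here rather than taken from the paper:

```latex
% One-layer convolutional dictionary model (schematic, notation assumed):
% image X, learned filters d_k, sparse feature maps z_k.
\[
  X \;\approx\; \sum_{k=1}^{K} d_k * z_k ,
  \qquad z_k \ \text{sparse},
\]
% with deeper layers applying the same decomposition to the pooled z_k.
```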
- Faster Algorithms for Testing under Conditional Sampling Moein Falahatgar, Ashkan Jafarpour, Alon Orlitsky: There has been considerable recent interest in distribution tests whose run-time and sample requirements are sublinear in the domain size k. The authors study two of the most important tests under the conditional-sampling model, where each query specifies a subset S of the domain and the response is a sample drawn from S according to the underlying distribution. For identity testing, they ask whether the underlying distribution equals a specific given distribution or ε-differs from it, and reduce the known time and sample complexities from Õ(ε⁻⁴) to Õ(ε⁻²), thereby matching the information-theoretic lower bound. For closeness testing, which asks whether two distributions underlying observed data sets are equal or different, the authors reduce the existing complexity from Õ(ε⁻⁴ log⁵ k) to an even sub-logarithmic Õ(ε⁻⁵ log log k), thus providing a better bound for an open problem posed at the Bertinoro Workshop on Sublinear Algorithms.
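The access model itself is easy to state: the distribution p stays hidden, and each query names a subset S of the domain and receives one draw from p conditioned on S. A minimal sketch of that oracle (the identity and closeness testers built on top of it are not reproduced):

```python
import random

# Conditional-sampling oracle sketch: the tester never sees p directly; each
# query supplies a subset S and gets back one sample from p restricted to S.
# (Only the access model; the identity/closeness testers are not shown.)

def conditional_sample(p, S):
    """Draw one element of S with probability proportional to p over S."""
    S = list(S)
    weights = [p[i] for i in S]
    if sum(weights) == 0:
        return random.choice(S)            # convention for p(S) = 0 (assumed)
    return random.choices(S, weights=weights)[0]

p = [0.5, 0.25, 0.125, 0.125]              # hidden distribution on {0, 1, 2, 3}
print(conditional_sample(p, {0, 3}))       # returns 0 w.p. 0.8, 3 w.p. 0.2
```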
- Objectivity and Conditionality in Frequentist Inference David Cox and Deborah G. Mayo: Statistical methods are used to some extent in virtually all areas of science, technology, public affairs, and private enterprise. The variety of applications makes any single unifying discussion difficult if not impossible. The authors concentrate on the role of statistics in research in the natural and social sciences and the associated technologies. Their aim is to give a relatively nontechnical discussion of some of the conceptual issues involved and to bring out some connections with general epistemological problems of statistical inference in science. In the first part of this chapter (7(I)), they considered how frequentist statistics may serve as an account of inductive inference, but because this depends on being able to apply its methods to appropriately circumscribed contexts, they need to address some of the problems in obtaining methods with the properties they wish them to have. Given the variety of judgments and background information this requires, it may be questioned whether any account of inductive learning can succeed in being “objective.” However, statistical methods do, the authors think, promote the aim of achieving enhanced understanding of the real world, in some broad sense, and in this some notion of objectivity is crucial. They begin by briefly discussing this concept as it arises in statistical inference in science.
- Generalizations Related To Hypothesis Testing With The Posterior Distribution Of The Likelihood Ratio I. Smith, A. Ferrari: The Posterior distribution of the Likelihood Ratio (PLR) was proposed by Dempster in 1974 for significance testing in the simple vs composite hypotheses case. In this hypothesis test case, classical frequentist and Bayesian hypothesis tests are irreconcilable, as emphasized by Lindley’s paradox, by Berger & Sellke in 1987, and by many others. However, Dempster showed that the PLR (with inner threshold 1) is equal to the frequentist p-value in the simple Gaussian case. In 1997, Aitkin extended this result by adding a nuisance parameter and showing its asymptotic validity under more general distributions. Here it is extended to the reconciliation between the PLR and a frequentist p-value for a finite sample, through a framework analogous to that of Stein’s theorem, in which a credible (Bayesian) domain is equal to a confidence (frequentist) domain. This general reconciliation result only concerns simple vs composite hypotheses testing. The measures proposed by Aitkin in 2010 and by Evans in 1997 have interesting properties and extend Dempster’s PLR, but only by adding a nuisance parameter. Here, two extensions of the PLR concept to the general composite vs composite hypotheses test are proposed. The first extension can be defined for improper priors as soon as the posterior is proper. The second extension follows from a new Bayesian-type Neyman-Pearson lemma and emphasizes, from a Bayesian perspective, the role of the LR as a discrepancy variable for hypothesis testing.
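For orientation, the construction behind the PLR can be sketched as follows (notation assumed here): with the unknown parameter drawn from its posterior, one looks at the posterior distribution of the likelihood ratio against the null value, and the test with inner threshold 1 reports the posterior probability that this ratio is at least 1.

```latex
% Sketch of the PLR construction (notation assumed), for a simple null theta_0:
\[
  \mathrm{PLR}(c) \;=\; \Pr\!\left\{ \frac{L(\theta_0 \mid x)}{L(\theta \mid x)} \,\ge\, c \;\middle|\; x \right\},
  \qquad \theta \sim \pi(\theta \mid x).
\]
% With c = 1 ("inner threshold 1") and a flat prior in the simple Gaussian
% location model, PLR(1) reproduces the two-sided frequentist p-value,
% which is the reconciliation referred to above.
```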
- The Embodied “We”: The Extended Mind as Cognitive Sociology Teed Rockwell: Cognitive Science began with the assumption sometimes called Cartesian Materialism: that the brain is an autonomous machine that can be studied as a closed system. The challenges of solving the puzzles presupposed by that assumption led to a recognition that mind is both embodied and embedded, i.e., it cannot be separated from either the rest of the organism or from the organism's symbiotic relationship with its environment. The unavoidable (but often ignored) implication of this conclusion is that if our environment includes other minds, our minds must also be embodied by other minds. This means that we are irreducibly social, for the same reasons that we are irreducibly embodied and embedded in an environment. This paper explores and questions the assumptions of Game Theory, the branch of computer science that assumes that society can only be understood as the interaction of isolated rational autonomous agents. If the Game Theory of the future were to follow the lead of cutting-edge cognitive science, it would replace computational models with dynamical ones. Just as Extended Cognition theories recognize that the line between mind and world is a flexible one, dynamic social theories would recognize that the line between mind and mind is equally flexible: we must be understood not as autonomous individuals with selfish interests, but rather as fluctuating tribes or families dynamically bonded, and motivated not only by selfishness, but by trust, loyalty and love.
- Neyman: Distinguishing tests of statistical hypotheses and tests of significance might have been a lapse of someone’s pen Deborah G. Mayo: Contrary to ideas suggested by the title of the conference at which the present paper was presented, the author is not aware of a conceptual difference between a “test of a statistical hypothesis” and a “test of significance” and uses these terms interchangeably. A study of any serious substantive problem involves a sequence of incidents at which one is forced to pause and consider what to do next. In an effort to reduce the frequency of misdirected activities one uses statistical tests. The procedure is illustrated on two examples: (i) Le Cam’s (and associates’) study of immunotherapy of cancer and (ii) a socio-economic experiment relating to low-income homeownership problems.
- Testing Composite Null Hypothesis Based on S-Divergences Abhik Ghosh, Ayanendranath Basu: The authors present a robust test for a composite null hypothesis based on the general S-divergence family. This requires a non-trivial extension of the results of Ghosh et al. (2015). They then derive the asymptotic and theoretical robustness properties of the resulting test, along with the properties of the minimum S-divergence estimators under the parameter restrictions imposed by the null hypothesis. An illustration in the context of the normal model is also presented.