- Truth Relativists Can't Trump Moral Progress Annalisa Coliva: "In this paper we raise a new challenge for truth-relativism, when applied to moral discourse. In §1 we set out the main tenets of this doctrine; in §2 we canvass two broad forms a relativist project can take – Descriptive Relativism and Revisionary Relativism; in §3 we briefly consider the prospects of the combination of truth-relativism with either project when dealing with disagreement arising in the relevant areas of discourse. We claim that truth-relativism faces what we dub “the Lost Disagreement Problem”, while leaving its final assessment for another occasion. In §4 we show how there is another – so far unnoticed – challenge truth-relativists must face when dealing with disputes about morals: we call it “the Progress Problem”. In §5 we show how a recent notion proposed in connection with truth-relativism and the problem of future contingents, viz. the idea of trumping, can help relativists make sense of such a problem. Yet, we conclude, in §6, that the appeal to trumping in fact forces a dilemma onto truth-relativists engaged in either a Descriptive or a Revisionary project."
- A Definable Henselian Valuation With High Quantifier Complexity Immanuel Halupczok, Franziska Jahnke: mathematical logic - an example of a parameter-free definable Henselian valuation ring is given which is neither definable by a parameter-free ∀∃-formula nor by a parameter-free ∃∀-formula in the language of rings. This answers a question of Prestel.
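(For readers who do not work in model theory: the quantifier classes mentioned in this abstract have the following standard shapes; this is the usual textbook definition, not something taken from the paper itself.)

```latex
% Standard quantifier-prefix classes in the language of rings L_ring = {+, -, \cdot, 0, 1}:
% a parameter-free \forall\exists-formula and a parameter-free \exists\forall-formula have the forms
\forall x_1 \ldots \forall x_m \,\exists y_1 \ldots \exists y_n \; \varphi(\bar{x}, \bar{y})
\qquad \text{and} \qquad
\exists x_1 \ldots \exists x_m \,\forall y_1 \ldots \forall y_n \; \psi(\bar{x}, \bar{y}),
% with \varphi and \psi quantifier-free and no constants from the valued field occurring as parameters.
```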
- Self-Knowledge for Humans Quassim Cassam: Human beings are not model epistemic citizens. Our reasoning can be careless and uncritical, and our beliefs, desires, and other attitudes aren't always as they ought rationally to be. Our beliefs can be eccentric, our desires irrational and our hopes hopelessly unrealistic. Our attitudes are influenced by a wide range of non-epistemic or non-rational factors, including our character, our emotions and powerful unconscious biases. Yet we are rarely conscious of such influences. Self-ignorance is not something to which human beings are immune. In this book Quassim Cassam develops an account of self-knowledge which tries to do justice to these and other respects in which humans aren't model epistemic citizens. He rejects rationalist and other mainstream philosophical accounts of self-knowledge on the grounds that, in more than one sense, they aren't accounts of self-knowledge for humans. Instead he defends the view that inferences from behavioural and psychological evidence are a basic source of human self-knowledge. On this account, self-knowledge is a genuine cognitive achievement and self-ignorance is almost always on the cards. As well as explaining knowledge of our own states of mind, Cassam also accounts for what he calls 'substantial' self-knowledge, including knowledge of our values, emotions, and character. He criticizes philosophical accounts of self-knowledge for neglecting substantial self-knowledge, and concludes with a discussion of the value of self-knowledge.
- Absence of gravitational-wave signal extends limit on knowable universe: Imagine an instrument that can measure motions a billion times smaller than an atom and lasting only a millionth of a second. Fermilab's Holometer is currently the only machine with the ability to take these very precise measurements of space and time, and recently collected data has improved the limits on theories about exotic objects from the early universe. Our universe is as mysterious as it is vast. According to Albert Einstein's theory of general relativity, anything that accelerates creates gravitational waves, which are disturbances in the fabric of space and time that travel at the speed of light and continue infinitely into space. Scientists are trying to measure such possible sources all the way back to the beginning of the universe.
- On the Importance of Interpretation in Quantum Physics - A Reply to Elise Crull Antonio Vassallo and Michael Esfeld: E. Crull claims that by invoking decoherence it is possible (i) to obviate many “fine grained” issues often conflated under the common designation of 'measurement' problem, and (ii) to make substantial progress in the fields of quantum gravity and quantum cosmology, without any early incorporation of a particular interpretation in the quantum formalism. It is pointed out here that Crull is mistaken about decoherence and tacitly assumes some kind of interpretation of the quantum formalism.
- Projective simulation with generalization Alexey A. Melnikov, Adi Makmal, Vedran Dunjko, and Hans J. Briegel: The ability to generalize is an important feature of any intelligent agent, not only because it may allow the agent to cope with large amounts of data, but also because in some environments an agent with no generalization ability is simply doomed to fail. In this work we outline several criteria for generalization, and present a dynamic and autonomous machinery that enables projective simulation agents to meaningfully generalize. Projective simulation, a novel physical approach to artificial intelligence, was recently shown to perform well in comparison with standard models, both on simple reinforcement learning problems and on more complicated canonical tasks, such as the “grid world” and the “mountain car problem”. Both the basic projective simulation model and the presented generalization machinery are based on very simple principles. This simplicity allows us to provide a full analytical analysis of the agent’s performance and to illustrate the benefit the agent gains by generalizing. Specifically, we show how such an ability allows the agent to learn in rather extreme environments, in which learning is otherwise impossible.
- General Covariance, Diffeomorphism Invariance, and Background Independence in 5 Dimensions Antonio Vassallo: This paper considers the “GR-desideratum”, that is, the way general relativity implements general covariance, diffeomorphism invariance, and background independence. Two cases are discussed where 5-dimensional generalizations of general relativity run into interpretational troubles when the GR-desideratum is forced upon them. It is shown how the conceptual problems dissolve when such a desideratum is relaxed. In the end, it is suggested that a similar strategy might mitigate some major issues such as the problem of time or the embedding of quantum non-locality into relativistic spacetimes.
- Coherent states, quantum gravity and the Born-Oppenheimer approximation Alexander Stottmeister, Thomas Thiemann: This article aims at establishing the (time-dependent) Born-Oppenheimer approximation, in the sense of space adiabatic perturbation theory, for quantum systems constructed by techniques of the loop quantum gravity framework, especially the canonical formulation of the latter. The analysis presented here fits into a rather general framework, and offers a solution to the problem of applying the usual Born-Oppenheimer ansatz for molecular (or structurally analogous) systems to more general quantum systems (e.g. spin-orbit models) by means of space adiabatic perturbation theory. The proposed solution is applied to a simple, finite dimensional model of interacting spin systems, which serves as a non-trivial, minimal model of the aforesaid problem. Furthermore, it is explained how the content of this article affects the possible extraction of quantum field theory on curved spacetime from loop quantum gravity (including matter fields).
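(For orientation: the molecular Born-Oppenheimer ansatz that the paper generalizes, in its standard textbook form; the notation below is the usual one and is not taken from the article.)

```latex
% Standard Born-Oppenheimer ansatz: R = slow (nuclear) coordinates, r = fast (electronic) coordinates
\Psi(R, r) \;\approx\; \chi(R)\,\phi_R(r),
% where \phi_R solves the fast problem at fixed R,
%   H_{\mathrm{el}}(R)\,\phi_R(r) = E_{\mathrm{el}}(R)\,\phi_R(r),
% and E_{\mathrm{el}}(R) then acts as an effective potential in the equation for the slow factor \chi(R).
% Space adiabatic perturbation theory turns this ansatz into a systematic expansion in the ratio of fast to slow scales.
```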
- Dummett on the Relation between Logics and Metalogics Timothy Williamson: This paper takes issue with a claim by Dummett that, in order to aid understanding between proponents and opponents of logical principles, a semantic theory should make the logic of the object-language maximally insensitive to the logic of the metalanguage. The general advantages of something closer to a homophonic semantic theory are sketched. A case study is then made of modal logic, with special reference to disputes over the Brouwerian formula (B) in propositional modal logic and the Barcan formula in quantified modal logic. Semantic theories for modal logic within a possible worlds framework satisfy Dummett’s desideratum, since the non-modal nature of the semantics makes the modal logic of the object-language trivially insensitive to the modal logic of the metalanguage. However, that does not help proponents and opponents of the modal principles at issue understand each other. Rather, it makes the semantic theory virtually irrelevant to the dispute, which is best conducted mainly in the object-language; this applies even to Dummett’s own objection to the B principle. Other forms of semantics for modal languages are shown not to alter the picture radically. It is argued that the semantic and more generally metalinguistic aspect of disputes in logic is much less significant than Dummett takes it to be. The role of (non-causal) abductive considerations in logic and philosophy is emphasized, contrary to Dummett’s view that inference to the best explanation is not a legitimate method of argument in these areas.
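(For reference, the two modal principles named in the abstract, in their standard formulations; these are the usual textbook statements rather than quotations from Williamson's paper.)

```latex
% Brouwerian axiom (B) of propositional modal logic:
(B)\quad \varphi \rightarrow \Box\Diamond\varphi
% Barcan formula (BF) of quantified modal logic, with its converse (CBF) for comparison:
(BF)\quad \forall x\,\Box\varphi(x) \rightarrow \Box\,\forall x\,\varphi(x)
\qquad
(CBF)\quad \Box\,\forall x\,\varphi(x) \rightarrow \forall x\,\Box\varphi(x)
```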
- The Foundations of Transcendental Pragmatism Alexander Schmid: Over the course of the last three centuries in America, two particular schools of philosophical, and in one case, literary thought have captured the American intellectual imagination: transcendentalism and pragmatism. While transcendentalism flourished in the middle of the 19th century and was prominent among litterateurs and essayists, pragmatism was prominent among scholars and philosophers near the end of the 19th century and the beginning of the 20th. At first glance, transcendentalism and pragmatism may seem to have no more in common than each being uniquely American. This may be true on a superficial level, but it does not preclude each movement from containing some truth. It is on that belief that this paper is grounded: after a short exposition of what transcendentalism and pragmatism mean, a new school and way of thinking, transcendental pragmatism, will be unveiled, with this paper as its founding document.
- Curve-Fitting For Bayesians? Gordon Belot: Bayesians often assume, suppose, or conjecture that for any reasonable explication of the notion of simplicity a prior can be designed that will enforce a preference for hypotheses simpler in just that sense. Further, it is often claimed that the Bayesian framework automatically implements Occam’s razor — that conditionalizing on data consistent with both a simple theory and a complex theory more or less inevitably favours the simpler theory. But it is shown here that there are simplicity-driven approaches to curve-fitting problems that cannot be captured within the orthodox Bayesian framework and that the automatic razor does not function for such problems.
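(The "automatic razor" claim that Belot targets is usually spelled out in terms of marginal likelihoods; the following is the standard form of that argument, included for orientation. It is the claim under scrutiny, not Belot's own result.)

```latex
% Posterior odds between a simple hypothesis H_s and a more complex hypothesis H_c, given data D:
\frac{P(H_s \mid D)}{P(H_c \mid D)}
  \;=\; \frac{P(H_s)}{P(H_c)}\cdot
  \frac{\int P(D \mid \theta_s, H_s)\,P(\theta_s \mid H_s)\,d\theta_s}
       {\int P(D \mid \theta_c, H_c)\,P(\theta_c \mid H_c)\,d\theta_c}
% Because H_c spreads its prior over a larger parameter space, its marginal likelihood is typically
% diluted when both hypotheses fit D comparably well (the "Occam factor"). Belot's point is that for
% certain curve-fitting problems this mechanism does not deliver the simplicity preference it is
% often credited with.
```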
- On the Irrelevance Of General Equilibrium Theory Lars P. Syll: On the claim that "the problem with perfect competition is not its “lack” of realism, but its lack of “relevancy”: it surreptitiously assumes an entity that gives prices (present and future) to price taking agents, that collects information about supplies and demands, adds these up, moves prices up and down until it finds their equilibrium value. Textbooks do not tell this story; they assume that a deus ex machina called the “market” does the job. In the real world, people trade with each other, not with “the market.” And some of them, at least, are price makers. To make things worse, textbooks generally allude to some mysterious “invisible hand” that allocates goods optimally. They wrongly attribute this idea to Adam Smith and make use of his authority so that students accept this magical way of thinking as a kind of proof. Perfect competition in the general equilibrium mode is perhaps an interesting model for describing a central planner who is trying to find an efficient allocation of resources using prices as signals that guide price taker households and firms. But students should be told that the course they follow—on “general competitive analysis”—is irrelevant for understanding market economies." (Emmanuelle Benicourt & Bernard Guerrien). Lars agrees with this assessment and adds: "We do know that - under very restrictive assumptions - equilibria do exist, are unique and are Pareto-efficient. One however has to ask oneself — what good does that do? As long as we cannot show, except under exceedingly special assumptions, that there are convincing reasons to suppose there are forces which lead economies to equilibria - the value of general equilibrium theory is negligible. As long as we cannot really demonstrate that there are forces operating - under reasonable, relevant and at least mildly realistic conditions - at moving markets to equilibria, there cannot really be any sustainable reason for anyone to pay any interest or attention to this theory."
- Relativistic Paradoxes and Lack of Relativity in Closed Spaces Moses Fayngold: Some known relativistic paradoxes are reconsidered for closed spaces, using a simple geometric model. For two twins in a closed space, a real paradox seems to emerge when the traveling twin is moving uniformly along a geodesic and returns to the starting point without turning back. Accordingly, the reference frames (RF) of both twins seem to be equivalent, which makes the twin paradox irresolvable: each twin can claim to be at rest and therefore to have aged more than the partner upon their reunion. In reality, the paradox has a resolution in this case as well. Apart from the distinction between the two RFs with respect to the actual forces in play, they can be distinguished by clock synchronization. A closed space singles out a truly stationary RF with single-valued global time; in all other frames, time is not a single-valued parameter. This implies that even uniform motion along a spatial geodesic in a compact space is not truly inertial, and there is an effective force on an object in such motion. Therefore, the traveling twin will age less upon circumnavigation than the stationary one, just as in flat space-time. Ironically, Relativity in this case emerges free of paradoxes at the price of bringing back the pre-Galilean concept of absolute rest. An example showing the absence of paradoxes is also considered for a more realistic case of a time-evolving closed space.
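(In the simplest toy version of this setup, flat spacetime with one spatial direction compactified to a circle of circumference L, the asymmetry shows up already in the elementary time-dilation formula. This is a standard illustration of the flat cylindrical case, assumed here for concreteness; it is not taken from Fayngold's own geometric model.)

```latex
% Flat spacetime with topology R x S^1, circumference L, described in the frame singled out by the compactification.
% The stationary twin waits a coordinate time T = L/v for the traveller (constant speed v) to circumnavigate;
% the traveller's elapsed proper time is
\Delta\tau_{\text{traveller}} \;=\; \frac{L}{v}\,\sqrt{1 - v^2/c^2}
\;<\; \frac{L}{v} \;=\; \Delta\tau_{\text{stationary}},
% so the circumnavigating twin ages less even though neither twin ever accelerates.
```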
- Collective Belief, Kuhn, and the String Theory Community James Owen Weatherall, Margaret Gilbert: "One of us [Gilbert, M. (2000). “Collective Belief and Scientific Change.” Sociality and Responsibility. Lanham, MD: Rowman & Littlefield. 37-49.] has proposed that ascriptions of beliefs to scientific communities generally involve a common notion of collective belief described by her in numerous places. A given collective belief involves a joint commitment of the parties, who thereby constitute what Gilbert refers to as a plural subject. Assuming that this interpretive hypothesis is correct, and that some of the belief ascriptions in question are true, then the members of some scientific communities have obligations that may act as barriers both to the generation and, hence, the fair evaluation of new ideas and to changes in their community’s beliefs. We argue that this may help to explain Thomas Kuhn’s observations on “normal science”, and go on to develop the relationship between Gilbert's proposal and several features of a group of physicists working on a fundamental physical theory called “string theory”, as described by physicist Lee Smolin [Smolin, L. (2006). The Trouble with Physics. Mariner Books: New York.]. We argue that the features of the string theory community that Smolin cites are well explained by the hypothesis that the community is a plural subject of belief."
- Why Build a Virtual Brain? Large-scale Neural Simulations as Test-bed for Artificial Computing Systems Matteo Colombo: Despite the impressive amount of financial resources invested in carrying out large-scale brain simulations, it is controversial what the payoffs are of pursuing this project. The present paper argues that in some cases, from designing, building, and running a large-scale neural simulation, scientists acquire useful knowledge about the computational performance of the simulating system, rather than about the neurobiological system represented in the simulation. What this means, why it is not a trivial lesson, and how it advances the literature on the epistemology of computer simulation are the three preoccupations addressed by the paper. Keywords: large-scale neural simulations; epistemology of computer simulation; target-directed modeling; neuromorphic technologies; brain-networking.
- Accelerating universe? Not so fast: A University of Arizona-led team of astronomers found that the type of supernova commonly used to measure distances in the universe falls into distinct populations not recognized before; the findings have implications for our understanding of how fast the universe has been expanding since the Big Bang. The discovery casts new light on the currently accepted view of the universe expanding at a faster and faster rate, pulled apart by a poorly understood force called dark energy. This view is based on observations that resulted in the 2011 Nobel Prize in Physics awarded to three scientists, including UA alumnus Brian P. Schmidt.
- The Value Of Knowledge Duncan Pritchard: It is widely held that knowledge is of distinctive value. This is the main reason that knowledge, and not mere justified true belief, has been the central notion in epistemological methodology, teleology and deontology. The 'value problem' is to explain why this is the case. In this important paper, Pritchard argues against the view that knowledge is of particular value, thus gives a negative answer to the 'value problem', and follows through the ramifications of this denial.
- An Algebraic Topological Method for Multimodal Brain Networks Comparisons Tiago Simas, Mario Chavez, Pablo Rodriguez, and Albert Diaz-Guilera: Understanding brain connectivity has become one of the most important issues in neuroscience. Connectivity data, however, can reflect either the functional relationships between brain activities or the anatomical connections between brain areas. Although one should expect a clear relationship between the two representations, establishing it is not straightforward. Here, a formalism is presented that allows for the comparison of structural and functional networks by embedding both in a common metric space. In this metric space one can then find for which regions the two networks are significantly different. The methodology can be used not only to compare multimodal networks but also to extract statistically significant aggregated networks for a set of subjects. This procedure is used here to aggregate a set of functional networks from different subjects into an aggregated network that is compared with the structural connectivity. The comparison of the aggregated network reveals some features that are not observed when the comparison is done with the classical averaged network.
- Mechanisms meet Structural Explanation Laura Felline: This paper investigates the relationship between structural explanation (SE) and the New Mechanistic account of explanation (ME). The aim of the paper is twofold: firstly, to argue that some phenomena in the domain of fundamental physics, although mechanically brute, are structurally explained; and secondly, by elaborating on the contrast between SE and ME, to better clarify some features of SE. Finally, the paper argues that, notwithstanding their apparently antithetical character, SE and ME can be reconciled within a unified account of general scientific explanation.
- Can social interaction constitute social cognition? Hanne De Jaegher, Ezequiel Di Paolo and Shaun Gallagher: An important shift is taking place in social cognition research, away from a focus on the individual mind and toward embodied and participatory aspects of social understanding. Empirical results already imply that social cognition is not reducible to the workings of individual cognitive mechanisms. To galvanize this interactive turn, the authors provide an operational definition of social interaction and distinguish the different explanatory roles - contextual, enabling and constitutive - it can play in social cognition. Then the authors show that interactive processes are more than a context for social cognition: they can complement and even replace individual mechanisms. This new explanatory power of social interaction can push the field forward by expanding the possibilities of scientific explanation beyond the individual.
- Evolving to Generalize - Trading Precision for Speed Cailin O’Connor: Biologists and philosophers of biology have argued that learning rules that do not lead organisms to play evolutionarily stable strategies (ESSes) in games will not be stable and thus not evolutionarily successful. This claim, however, stands at odds with the fact that learning generalization - a behavior that cannot lead to ESSes when modeled in games - is observed throughout the animal kingdom. In this paper, the author uses learning generalization to illustrate how previous analyses of the evolution of learning have gone wrong. It has been widely argued that the function of learning generalization is to allow for swift learning about novel stimuli. It is shown that in evolutionary game theoretic models learning generalization, despite leading to suboptimal behavior, can indeed speed learning. It is further observed that previous analyses of the evolution of learning ignored the short-term success of learning rules. If one drops this assumption, it is argued, it can be shown that learning generalization will be expected to evolve in these models. This analysis is then used to show how ESS methodology can be misleading, and to reject previous justifications of ESS play derived from analyses of learning.
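(For context, the notion of an evolutionarily stable strategy invoked above, in its standard game-theoretic formulation; this is the usual textbook definition, not anything specific to O'Connor's models.)

```latex
% A strategy s is an ESS if and only if, for every alternative strategy t \neq s (payoff function u):
u(s, s) > u(t, s)
\quad\text{or}\quad
\bigl[\, u(s, s) = u(t, s) \;\text{and}\; u(s, t) > u(t, t) \,\bigr],
% i.e. rare mutants either do strictly worse against the incumbent strategy, or tie against the
% incumbent but do strictly worse against themselves, so they cannot invade.
```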
- Asymptotic behaviour of weighted differential entropies in a Bayesian problem Mark Kelbert and Pavel Mozgunov: Consider a Bayesian problem of estimating the probability of success in a series of trials with binary outcomes. The authors study the asymptotic behaviour of weighted differential entropies for the posterior probability density function (PDF) conditional on x successes after n trials, as n → ∞. In the first part of the work Shannon’s differential entropy is considered in three particular cases: x is a proportion of n; x ∼ n^β, where 0 < β < 1; either x or n − x is a constant. In the first and second cases the limiting distribution is Gaussian and the differential entropy is asymptotically that of a Gaussian with the corresponding variance. In the third case the limiting distribution is not Gaussian, but the asymptotics of the differential entropy can still be found explicitly. Then suppose that one is interested in whether the coin is fair, and for large n in the true frequency; in other words, one wants to emphasize the parameter value p = 1/2. To do so, the concept of weighted differential entropy, introduced in earlier work, is used when a frequency γ needs to be emphasized. It is found that the weight in the suggested form does not change the asymptotic form of the Shannon, Rényi, Tsallis and Fisher entropies, but changes the constants. The main term in the weighted Fisher information is changed by a constant which depends on the distance between the true frequency and the value one wants to emphasize.
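(A minimal numerical sketch of the first regime, x a fixed proportion of n, can make the Gaussian asymptotics concrete. The uniform Beta(1,1) prior and the comparison below are illustrative choices, not the authors' code, and the paper's weighting scheme is not implemented.)

```python
# Sketch: differential entropy of the Beta posterior for a binomial success probability,
# compared with the entropy of its Gaussian approximation N(p, p(1-p)/n).
# Illustrative only: assumes a uniform Beta(1,1) prior and the regime x = p*n.
import numpy as np
from scipy.stats import beta

p = 0.3  # underlying success proportion
for n in (10, 100, 1000, 10000):
    x = int(round(p * n))                       # observed number of successes
    posterior = beta(x + 1, n - x + 1)          # Beta posterior under the uniform prior
    h_exact = posterior.entropy()               # differential entropy of the posterior
    var_gauss = p * (1 - p) / n                 # variance of the Gaussian approximation
    h_gauss = 0.5 * np.log(2 * np.pi * np.e * var_gauss)
    print(f"n={n:6d}  H(posterior)={h_exact:.4f}  H(Gaussian)={h_gauss:.4f}")
```

As n grows the two entropies agree up to a vanishing correction, which is the Gaussian regime described in the abstract.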
- Understanding Democracy and Development Traps Using a Data-Driven Approach: Why do some countries seem to develop quickly while others remain poor? This question is at the heart of the so-called poverty or development trap problem. Using mathematics on open data sets, researchers now present new insights into this issue and also suggest which countries can be expected to develop faster. The paper is published in the journal Big Data. Development economists have identified several potential causes of economic development traps, but the issue is complex. Some countries appear to be stuck not only in an economic development trap but also in a political development trap with a lack of democracy.
- Knowledge Representation meets Social Virtual Reality Carlo Bernava, Giacomo Fiumara, Dario Maggiorini, Alessandro Provetti, and Laura Ripamonti: This study designs and implements an application running inside 'Second Life' that supports user annotation of graphical objects and graphical visualization of concept ontologies, thus providing a formal, machine-accessible description of objects. As a result, a platform is offered that combines the graphical knowledge representation expected from a MUVE (multi-user virtual environment) artifact with the semantic structure given by the Resource Description Framework (RDF) representation of information.
- Biohumanities: Rethinking the relationship between biosciences, philosophy and history of science, and society Karola Stotz, Paul E. Griffiths: It is argued that philosophical and historical research can constitute a ‘Biohumanities’ which deepens our understanding of biology itself; engages in constructive 'science criticism'; helps formulate new 'visions of biology'; and facilitates 'critical science communication'. The authors illustrate these ideas with two recent 'experimental philosophy' studies of the concept of the gene and of the concept of innateness, conducted by the authors and collaborators. It is concluded that the complex and often troubled relations between science and society are critical to both parties, and argued that the philosophy and history of science can help to make this relationship work.
- Macroscopic Observability of Spinorial Sign Changes: A Reply to Gill Joy Christian: In a recent paper Richard Gill has criticized an experimental proposal which describes how to detect a macroscopic signature of spinorial sign changes under 2π rotations. Here it is pointed out that Gill’s worries stem from his own elementary algebraic and conceptual mistakes. In a recent paper a mechanical experiment has been proposed to test the possible macroscopic observability of spinorial sign changes under 2π rotations. The proposed experiment is a variant of the local model for spin-1/2 particles considered by Bell, which was later developed further by Peres, who provided pedagogical details. This experiment differs, however, from the one considered by Bell and Peres in one important respect: it involves measurements of the actual spin angular momenta of two fragments of an exploding bomb rather than their normalized spin values, ±1.
- Bohmian Dispositions Mauricio Suárez: This paper argues for a broadly dispositionalist approach to the ontology of Bohmian mechanics. It first distinguishes the ‘minimal’ and the ‘causal’ versions of Bohm’s Theory, and then briefly reviews some of the claims advanced on behalf of the ‘causal’ version by its proponents. A number of ontological or interpretive accounts of the wave function in Bohmian mechanics are then addressed in detail, including i) configuration space, ii) multi-field, iii) nomological, and iv) dispositional approaches. The main objection to each account is reviewed, namely i) the ‘problem of perception’, ii) the ‘problem of communication’, iii) the ‘problem of temporal laws’, and iv) the ‘problem of under-determination’. It is then shown that a version of dispositionalism overcomes the under-determination problem while providing neat solutions to the other three problems. A pragmatic argument is thus furnished for the use of dispositions in the interpretation of the theory more generally. The paper ends on a more speculative note by suggesting ways in which a dispositionalist interpretation of the wave function is in addition able to shed light upon some of the claims of the proponents of the causal version of Bohmian mechanics.
- Probability Without Certainty - Foundationalism and the Lewis-Reichenbach Debate David Atkinson and Jeanne Peijnenburg: Like many discussions on the pros and cons of epistemic foundationalism, the debate between C.I. Lewis and H. Reichenbach dealt with three concerns: the existence of basic beliefs, their nature, and the way in which beliefs are related. This paper concentrates on the third matter, especially on Lewis’s assertion that a probability relation must depend on something that is certain, and Reichenbach’s claim that certainty is never needed. It is noted that Lewis’s assertion is prima facie ambiguous, but argued that this ambiguity is only apparent if probability theory is viewed within a modal logic. Although there are empirical situations where Reichenbach is right, and others where Lewis’s reasoning seems to be more appropriate, it will become clear that Reichenbach’s stance is the generic one. This follows simply from the fact that, if P(E|G) > 0 and P(E|¬G) > 0, then P(E) > 0. It is finally concluded that this constitutes a threat to epistemic foundationalism.
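(The elementary step behind the authors' closing observation is just the law of total probability, spelled out here for readability and assuming both conditional probabilities are defined, i.e. P(G) > 0 and P(¬G) > 0.)

```latex
% Law of total probability over the partition {G, \neg G}:
P(E) \;=\; P(E \mid G)\,P(G) \;+\; P(E \mid \neg G)\,P(\neg G),
% and since P(G) + P(\neg G) = 1, at least one summand has strictly positive weight;
% hence P(E \mid G) > 0 and P(E \mid \neg G) > 0 jointly entail P(E) > 0.
```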
- Categorical Equivalence between Generalized Holonomy Maps on a Connected Manifold and Principal Connections on Bundles over that Manifold Sarita Rosenstock and James Owen Weatherall: A classic result in the foundations of Yang-Mills theory, due to J. W. Barrett [“Holonomy and Path Structures in General Relativity and Yang-Mills Theory.” Int. J. Th. Phys. 30(9), (1991)], establishes that given a “generalized” holonomy map from the space of piece-wise smooth, closed curves based at some point of a manifold to a Lie group, there exists a principal bundle with that group as structure group and a principal connection on that bundle such that the holonomy map corresponds to the holonomies of that connection. Barrett also provided one sense in which this “recovery theorem” yields a unique bundle, up to isomorphism. Here we show that something stronger is true: with an appropriate definition of isomorphism between generalized holonomy maps, there is an equivalence of categories between the category whose objects are generalized holonomy maps on a smooth, connected manifold and whose arrows are holonomy isomorphisms, and the category whose objects are principal connections on principal bundles over a smooth, connected manifold. This result clarifies, and somewhat improves upon, the sense of “unique recovery” in Barrett’s theorems; it also makes precise a sense in which there is no loss of structure involved in moving from a principal bundle formulation of Yang-Mills theory to a holonomy, or “loop”, formulation.
- Artificial intelligence and Making Machine Learning Easier: the Role of Probabilistic Programming Larry Hardesty: Most advances in artificial intelligence are the result of machine learning, in which computers are turned loose on huge data sets to look for patterns. To make machine-learning applications easier to build, computer scientists have begun developing so-called probabilistic programming languages, which let researchers mix and match machine-learning techniques that have worked well in other contexts. In 2013, the U.S. Defense Advanced Research Projects Agency, an incubator of cutting-edge technology, launched a four-year program to fund probabilistic-programming research. At the Computer Vision and Pattern Recognition conference in June, MIT researchers will demonstrate that on some standard computer-vision tasks, short programs — less than 50 lines long — written in a probabilistic programming language are competitive with conventional systems with thousands of lines of code. “This is the first time that we’re introducing probabilistic programming in the vision area,” says Tejas Kulkarni, an MIT graduate student in brain and cognitive sciences and first author on the new paper. “The whole hope is to write very flexible models, both generative and discriminative models, as short probabilistic code, and then not do anything else. General-purpose inference schemes solve the problems.” By the standards of conventional computer programs, those “models” can seem absurdly vague. One of the tasks that the researchers investigate, for instance, is constructing a 3-D model of a human face from 2-D images. Their program describes the principal features of the face as being two symmetrically distributed objects (eyes) with two more centrally positioned objects beneath them (the nose and mouth). It requires a little work to translate that description into the syntax of the probabilistic programming language, but at that point, the model is complete. Feed the program enough examples of 2-D images and their corresponding 3-D models, and it will figure out the rest for itself. “When you think about probabilistic programs, you think very intuitively when you’re modeling,” Kulkarni says. “You don’t think mathematically. It’s a very different style of modeling.”
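(To give a flavour of the "short model plus general-purpose inference" style described above, here is a minimal, self-contained sketch in plain Python: a few-line generative model of a biased coin and a generic likelihood-weighting routine that knows nothing about the model. This is only an illustration of the programming style; it is not the MIT system, nor code in any actual probabilistic programming language.)

```python
# Minimal sketch of the probabilistic-programming idea: a tiny generative model plus
# a generic, model-agnostic inference routine (likelihood weighting).
# Hypothetical example; not the researchers' code or language.
import random

def model():
    """Generative model: draw a coin bias from the prior."""
    bias = random.random()          # prior: bias ~ Uniform(0, 1)
    return bias

def likelihood(bias, data):
    """Probability of the observed flips (1 = heads) given a bias."""
    p = 1.0
    for flip in data:
        p *= bias if flip == 1 else (1.0 - bias)
    return p

def infer(model, likelihood, data, num_samples=100_000):
    """Generic likelihood-weighting inference: works for any (model, likelihood) pair."""
    total_weight = 0.0
    weighted_sum = 0.0
    for _ in range(num_samples):
        bias = model()
        w = likelihood(bias, data)
        total_weight += w
        weighted_sum += w * bias
    return weighted_sum / total_weight   # posterior mean of the bias

observed = [1, 1, 1, 0, 1, 1, 0, 1]      # 6 heads out of 8 flips
print("posterior mean bias ≈", infer(model, likelihood, observed))
```

The division of labour mirrors Kulkarni's description: the model itself is a handful of lines, and the same infer routine would serve a completely different model unchanged; real probabilistic programming languages supply far more sophisticated general-purpose inference than the naive sampler shown here.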