Seminars – Archive
Winter Semester 5784 (2023–24)
Speaker: Dr. Alex Blum, Max Planck Institute
Title: Inconsistency in Fundamental Physics
Quantum Field Theory is the foundation of the Standard Model of Particle Physics, our current best theory of microscopic matter and interactions. It has also been hailed as the numerically most precise theory in the history of science. Yet from its first construction in the late 1940s, the mathematical consistency of this theory was called into question and debated. In my talk, I will analyze the origins of these doubts, why they ultimately remained unresolved, and what they mean for the practice of fundamental physics and for the prospects of ultimately finding a final theory of "everything".
Speaker: Dr. Janina Wellmann, Max Planck Institute Berlin
Title: Registering Life in Motion. A Challenge to the Sciences and Humanities
Motion, previously marginalized, is now at the core of organic life. On the subcellular and molecular level, in the inner life of cells, where cargoes are delivered, molecules transported, and organelles displaced, movement turns out to be the root of all activity.
Computer animations, high-resolution microscopy and mathematical simulation reveal the organism in maximal motion, but they are also the conditio sine qua non for a world of research that can no longer think, calculate, or experiment without moving images.
In my talk, I will investigate the consequences that the new conception of life as constant becoming has for the sciences and humanities alike. I argue not only that modern biotechnology faces fundamental experimental, technological, and conceptual challenges, and that the humanities provide tools and perspectives that contribute to their understanding, but also that these challenges are best faced in dialogue and mutual enrichment.
Speaker: Dr. Guy Hetzroni, The Open University of Israel
Title: Einstein’s Principle of Equivalence and the Empirical Basis for Theoretical Arguments
The interplay of empirical and theoretical considerations in contemporary fundamental physics poses various challenges to the philosophy of science. This talk will offer new reflections on these challenges, based on a new account of certain theoretical methods applied in Einstein’s general theory of relativity and in particle physics.
In his early expositions of general relativity, Albert Einstein presented his principle of equivalence as an empirical observation extended and promoted to a fundamental principle underlying the construction of the theory. The content of the principle, however, was soon called into question, as was its very validity within the theory. Einstein later abandoned this early presentation of the theory altogether. By the late 1920s he described the construction of the theory as marking a radical break in theoretical physics, away from inductive reasoning and towards mathematically based patterns of reasoning and non-empirical guiding principles. In parallel, philosophical reflections on the epistemological foundations of the theory shaped the fundamental divides in early 20th-century philosophy of science. These debates are echoed in contemporary debates in the philosophy of physics on the epistemic status of theoretical virtues and principles, and on the justification of theories based on them.
In this talk I will revisit the relation between the early debates and the current ones. The discussion will be based on the new "methodological equivalence principle", a suggested generalization of Einstein's principle from which, I show, the basic structure of general relativity can be derived in a way that maximizes the weight of empirically based considerations. The same principle is then shown to underlie the applicability of the gauge argument in particle physics, similarly highlighting the role of empirical considerations and resolving worries related to the applicability of mathematics. Finally, I'll argue that the two cases demonstrate the relevance of heuristic and methodological considerations to foundational questions in the philosophy of physics and the philosophy of science.
Speaker: Seth Goldwasser, University of Pittsburgh
Title: Standard Aberration: Cancer Biology and the Modeling Account of Function
Cancer biology features the ascription of normal functions to parts of cancers. At least some ascriptions of function in cancer biology track local normality of parts within the global abnormality of the aberration to which those parts belong. That is, cancer biologists identify as functions activities that, in some sense, parts of cancers are supposed to perform, despite cancers themselves having no purpose. This talk provides a theory to accommodate these normal-function ascriptions, which I call the Modeling Account of Normal Function (MA). MA comprises two claims. First, normal functions are activities whose performance by the function-bearing part contributes to the self-maintenance of the whole system and, thereby, results in the continued presence of that part. Second, MA holds that there is a class of models of system-level activities (partly) constitutive of self-maintenance, members of which are improved by including a representation of the relevant function-bearing part and by making reference to some activity which that part performs, where that activity contributes to those system-level activities. I contrast MA with two other accounts that seek to explicate the ascription of normal functions in biology, namely, the organizational account and the selected-effects account. Both struggle to extend to cancer biology. However, I offer ecumenical readings that allow them to recover some ascriptions of normal function to parts of cancers.
Speaker: Dr. Lotem Elber-Dorozko, Pittsburgh
Title: How a philosophical definition of purpose in biology can help us identify good models in neuroscience
In recent years, a variety of neural network models have been extremely successful in predicting brain processes and behavior. Nonetheless, neuroscientists often disagree on whether these suggested models of cognitive capacities indeed capture essential properties of those capacities. In my talk I aim to clarify these debates by noting that attempts to understand cognitive capacities occur under the posit that the brain is a teleological system, a system that has (at least one) purpose. It seems undeniable that biological organisms have purposes; spiders build webs to catch prey and mouse pups emit sounds to call their mothers, for example. Even so, philosophers and scientists have debated for centuries how to define and assign such purpose and have not reached agreement.
Although this teleological view of the brain is an essential part of neuroscience, the difficulties it raises are rarely explicitly invoked in scientific discourse. Considering teleology in relation to neuroscientific practice sheds light on the source of the inherent vagueness that surrounds the delineation of cognitive capacities. Furthermore, it suggests that neuroscientists cannot rely on prediction of behavior and brain processes alone to support their models. Instead, models of purposeful behavior should also be assessed relative to the history that brought the capacity forth. I take object recognition, and recent convolutional neural network models of this capacity, as a case study to demonstrate these points and suggest several concrete ways in which neuroscientific practice can be amended.
Speaker: Dr. Hisham Abdulhalim, Technion
Title: Denied by an (Unexplainable) Algorithm: Teleological Explanations for Algorithmic Decisions Enhance Customer Satisfaction
Automated algorithmic decision-making has become commonplace, with firms implementing either rule-based or statistical models to determine whether to provide services to customers based on their past behaviors and characteristics. In response, policymakers are pressing firms to explain these algorithmic decisions. However, many of these algorithms are “unexplainable” because they are too complex for humans to understand. Moreover, legal or commercial considerations often preclude explaining algorithmic decision rules. We study consumer responses to goal-oriented, or teleological, explanations, which present the purpose or objective of the algorithm without revealing mechanism information that might help customers reverse or prevent future service denials. In a field experiment with a technology firm and in several online lab experiments, we demonstrate the effectiveness of teleological explanations and identify conditions when teleological and mechanistic explanations can be equally satisfying.
Whereas the epistemic value of explanations is well established, we study how explanations mitigate the negative impact of service denials on customer satisfaction. In situations where companies do not want to, or cannot, reveal the mechanism, we find that teleological explanations create equivalent value through the justifications they may offer. Our results thus show that firms may benefit by offering teleological explanations for unexplainable algorithmic behavior, positioning themselves as more ethical than others.
Spring Semester 5783 (2022–23)
Dr. Osvaldo Ottaviani, Technion
Title: What role do plants (and living beings) play in the philosophy of Leibniz?
Without being directly involved in botanical research, Leibniz always had a strong interest in this subject, as well as in many investigations into the nature of living beings. In some passages, he also claims that the inquiries of modern investigators (microscopists, anatomists, etc.) into the generation of plants and animals provided new insights not only into the structure of living beings and the universe, but also into the nature and constitution of souls and incorporeal substances. In my talk, I will try to answer the following question: in what sense could Leibniz say that the nature and structure of living beings are of the utmost importance to the establishment of his theory of simple substances, or monads? I will start from some recently edited texts in which Leibniz is concerned with the nature of certain climbing plants and with the rejection of the theory of the spontaneous generation of plants and insects. In both cases, Leibniz defends the universality of the sexual reproduction of living beings (stressing the analogy between plants and animals), while at the same time maintaining that living beings are never generated mechanically by something which is not already organic. Leibniz's account of life seeks to hold together the following claims: 1) the rejection of spontaneous generation (because nothing organized can be generated from chaos or inorganic matter); 2) the thesis of the preformation of all organisms; and 3) their status as 'machines' endowed with an infinity of organs. Thus, Leibniz employs the research and observations of the moderns on the generation of insects, plants, and animals as an empirical confirmation of his metaphysical thesis of the infinite complexity of the machines of nature. From the metaphysical point of view, the infinite composition of natural machines explains the sense in which each soul is able to express an infinite universe (like a living mirror of the universe, as Leibniz says).
Dr. Assaf Marom, Technion
Title: Anatomy education in the age of digital reproduction
Anatomy is a core subject among the basic sciences of medicine, and it has been studied in medical schools through dissection ever since anatomy was established as a science several centuries ago. However, since the last quarter of the previous century, anatomical education has been undergoing a process of transformation, mainly fueled by the development of medical imaging technology. Consequently, many medical schools have been shifting their courses to an imaging-based education, relying on computer software, medical images, and holograms, among other tools. While both traditional and imaging-based approaches to anatomical education have advantages and disadvantages, here I discuss the drawbacks of the new approach, arguing that the technological advances we witness pose scientific, pedagogic, and ethical challenges that should be addressed before rushing to implement them.
Dr. Daniel Kunzelmann (University of Basel)
Title: Doing fieldwork with(in) surveillance architectures: methodological and ethical implications for digital researchers.
Social media create seemingly transparent contexts of information. Comments, attitudes and attributions become (often) easily accessible to researchers. Not least for qualitative approaches, this provides an empirical treasure of data offering valuable insights into sometimes very intimate spheres of life.
We might understand the virtual spaces in which such data are generated as architectures of surveillance that establish a specific regime of (in)visibility. This conceptual reframing raises some fundamental methodological and research ethics questions anew: Who is allowed to see (and know) for what purpose? And how may this knowledge be generated and disseminated?
Using ethnographic material, this lecture reflects on some of the challenges that scientists face while conducting research via social media. The main focus will be on the tension between the actors' wish (or need) for anonymity and a scientific standard that demands the disclosure and traceability of empirical material. A "case-based approach" is presented that makes it possible to uphold established research-ethics standards and, at the same time, benefit from the potential of easily accessible material on social media.
Title: What Use Was Science to Philosophy—and What Use is Philosophy’s History to Science?
It is widely, though not universally, believed that philosophy from certain periods is best understood by studying it in the context of the scientific discoveries and hypotheses that preoccupied its authors. I will give a few examples of how to read and misread classical texts in the history of philosophy (Descartes, Leibniz, and Kant, who were essentially philosophers of nature) by taking their scientific context into account. It is also widely, though not universally, believed that scientists do not benefit from studying the history of philosophy. I would next like to challenge this view. After professional philosophy (inevitably) detached itself from professional science, many scientists lost the ethical orientation that had guided their predecessors. This orientation must be recovered if we are not to enter a scientific-technological dystopia.
Title: Music Technology – Bottom Up
Can electric guitars, synthesizers, and Spotify really be the focus of serious academic research? Yes! In this talk we will take a bottom-up look at the academic discipline of music technology. We'll start with an overview of Lior Arbel's work: the invention and study of several musical instruments, in software and hardware. We'll see how music technology research blends engineering, computer science, human-computer interaction, and the social sciences. We'll go on to describe the international music technology discipline and glance at its current state in Israel, or lack thereof.
Dr. Jonathan Najenson, Technion
Long-Term Potentiation (LTP) Revisited: Reconsidering the Explanatory Power of Synaptic Efficacy
Changes in synaptic strength are described as a unifying hypothesis for memory formation and storage, leading philosophers to consider the "synaptic efficacy hypothesis" as a paradigmatic explanation in neuroscience. However, in Craver's mosaic view, while synaptic efficacy plays a unifying role by integrating diverse fields within a hierarchical mechanism, it does not have explanatory power across distinct memory types. In this talk, I will show how the mechanistic approach can accommodate the explanatory power of this hypothesis by examining its role across different mechanistic hierarchies, which in turn supports the idea of unification.
Dr. Oren Bader, Technion; Heidelberg University Hospital.
Neurophenomenology as a research program – Bridging the gap between philosophy and neuroscience (Abstract below).
Francisco Varela coined the term Neurophenomenology (NP) to designate a research area that combines phenomenology, the philosophical study of the structures of subjective experience, with scientific investigations into their underlying neurocognitive mechanisms. Although Varela introduced the concept of NP over 20 years ago, only a few attempts have been made to implement this unique approach in concrete studies. In this talk, I'll discuss the benefits and limitations of an integrated philosophical and scientific research program for studying subjective experiences by reviewing results from a recent NP study on intergroup empathy. Specifically, I'll ask whether naturalizing phenomenology can help bridge the gap between empirical investigations and philosophical approaches, and what can be achieved when designing hybrid research.
Prof. Emmanuel Farjoun, Hebrew University
Exponential economic growth: Causes and Costs (Abstract below).
The talk is based on his recently published book "How Labour Powers the Global Economy – A Labor Theory of Capitalism" (co-authored with M. Machover and D. Zachariah).
Abstract: In this talk, the basic drive of the capitalist economy to grow will be reduced to basic considerations, conservation principles, and measures of labor. The limits of this growth will be explained: both lower and upper bounds are explored.
Real monetary prices do not faithfully reflect social-economic causes and costs; a deeper analysis is needed. Its main tool is a probabilistic reduction of the apparent, easy-to-observe money costs, capital costs, and natural-resource costs to labor inputs. These are measured simply in hours of work and are somewhat harder to trace. Labor inputs, in turn, reflect much deeper social-economic realities. The implications, under the present system, for the ecological and human-social future are shown to be grave. Ways to overcome these grave and otherwise inevitable implications will be discussed.
Prof. Elay Shech
Scientific Understanding, Modal Reasoning, and (Infinite) Idealization
One of the main goals of science is to provide understanding. Yet the use of idealizations and abstractions in science, which are falsehoods of sorts, is ubiquitous. How can science afford understanding in light of idealizations? How can falsity allow us to understand the truth? In this talk, I attempt to make some headway in answering such questions.
Specifically, by appealing to resources found in the scientific understanding literature, I identify in what senses idealizations afford understanding in the context of the (magnetic) Aharonov-Bohm effect. Using this example, which appeals to infinite idealizations, I will suggest that idealizations facilitate modal reasoning in a manner that is essential for understanding both scientific theory and what some phenomenon is supposed to be in the first place.
Prof. Aviram Rosochotsky (Tel Aviv University)
Relationism in Shape Dynamics
Relationism is a philosophical view according to which motion consists solely in changes to position relations between material bodies. Consequently, it rules out absolute motions – those that take place with respect to space and time which (supposedly) exist independently of matter. In my talk I'll show how Barbour and Bertotti (1982) were able to create a modification of Newton's theory of gravity which is compatible with the relationist view of motion using a theoretical device called best-matching. I will then review the treatment of motion in Shape Dynamics, which is the generalization of Barbour and Bertotti's approach in a relativistic context. The comparison between the revisions of Newtonian gravity and General Relativity will show interesting differences.
Seminars – Winter and Spring Semesters 5782 (2021–22)
Prof. Omer Einav (Molad).
Defending the Goal: Football and the Relations between Jews and Arabs in Mandatory Palestine, 1917-1948
My research uses the periodization of the Mandate to analyze relations between the Zionist and Palestinian national movements in their socio-cultural context. As a research tool, that context, with its diverse aspects, has been used in recent years as a field for exploring the dynamic created in Mandatory Palestine between the two societies that lived there, and as a reflection of nationalist aspects at the heart of existing historiography. The focus is the game of football, through which, from a distinctive angle, an attempt is made to examine the development and character of relations between Jews and Arabs.
Dr. Lotem Elber-Dorozko (Technion) and Arnon Levy (Hebrew University).
What is so good about being optimal? On appeals to optimality in the cognitive sciences
Models in cognitive science often describe people’s behavior as optimal. What are the motivations for such descriptions? We discuss three – empirical, methodological, and conceptual. The first involves a claim about the power of natural selection, namely that it can be expected to lead many cognitive capacities to be performed optimally, in the sense that they maximize fitness. On this view, appeals to optimality have explanatory value in virtue of background assumptions about evolution. However, we present several interrelated reasons to question the claim that cognitive capacities maximize fitness.
Alternatively, optimal models may serve as a good first approximation of the behavior. We suggest that it is more accurate to consider optimal models as idealizations, because 'approximation' suggests that the model can be brought closer (often, arbitrarily closer) to reality, and this does not hold for most cognitive models. It is an open question what can be learned from optimality-based models, construed as idealizations.
Finally, we consider a conceptual motivation for viewing people as optimal. Here, the idea is that optimality is part of an attempt to “make sense of the organism”, to rationalize its behavior. We accept that such a perspective is important, perhaps indispensable in studying minds. But we urge caution about tying it to the notion of optimality or to strong notions of rationality.
Dr. Sharon Bassan
A Proportionality-Based Framework for Government Regulation of Digital Tracing Apps in Times of Emergency
Times of emergency present an inherent conflict between the public interest and the preservation of individual rights. Such times require granting emergency powers to the government on behalf of the public interest and relaxing safeguards against government actions that infringe rights. The lack of a theoretical framework for assessing governmental decisions in times of emergency leads to a polarized and politicized discourse about potential policies and, often, to public distrust and lack of compliance.
Such a discourse was evident regarding Digital Tracing Apps (“DTAs”), which are apps installed on cellular phones to alert users that they were exposed to people who tested positive for COVID-19. DTAs collect the most sensitive types of information, such as health-related and location or proximity information, which violates the right to privacy and the right to be free of surveillance. This sensitive information is normally legally protected. But in emergencies there are no legal restrictions limiting the collection of such data. The common privacy-law approach supports DTA implementation under the condition that the technology preserves the privacy of users. But this Article suggests that the privacy approach focuses on micro considerations and under-addresses the implications of DTA-based policy. Instead, this Article suggests rethinking DTA implementation during COVID-19 through the doctrine of proportionality. Often used by European Union courts in areas where decisions entail meaningful implications to individual rights, the doctrine offers a clear and workable normative evaluation of tradeoffs in a more nuanced, explicable, and transparent way. Highlighting macro considerations, the doctrine of proportionality suggests that 1) DTA-based policy is less proportionate compared to traditional contact-tracing methods; 2) policies created while relying on smartphones are inequitable and biased; and 3) the sharing of sensitive personal information with private companies will have irreversible social surveillance implications. Additionally, the proportionality method not only provides a flexible methodological tool to evaluate government decisions in times of emergency but also offers an opportunity to examine how governments achieve and justify the acceptance and assimilation of new technological policy measures, which may take societies in new directions.
Dr. Paula Reichert (LMU München).
Why does time have a direction?
This talk discusses the physical origin of time directionality. In nature we observe irreversible (i.e., time-directed) processes, like the breaking of an egg, diffusion through cell membranes, and so on, despite the fact that all the fundamental laws of physics are time-symmetric. How can this apparent paradox be resolved? We discuss how macroscopic time asymmetry can be grounded on time-symmetric microscopic laws by means of a typicality argument based on special initial conditions. This demand for a special initial condition pushes the origin of time asymmetry back in time towards the beginning of the universe. In this context, we discuss how modern cosmology seeks to explain the origin of time directionality via a time-symmetric "one past, two futures" scenario in which the Big Bang is merely a special instant, a Janus point, in an otherwise eternal universe.
Professor Omri Barak (Technion)
Understanding the brain with models we don't understand
I will present the common manner in which computational models are used in neuroscience, and then an alternative that combines machine learning, reverse engineering and dynamical systems. The talk will touch upon questions such as "What defines a good model?", "How should we compare models to data?" and more.
Dr. Alik Pelman (Technion), Dr. Alon Shepon (TAU), Dr. Jerke de Vries (Van Hall Larenstein), Dr. Sigal Teper (Tel-Hai)
What is the Most Environmental (Healthy) Nutrition? Comparing a case study of self-sufficient farming to common industrial alternatives
Providing food security for a growing population, with reduced environmental impacts and resilience to climate change, is an ongoing challenge. Here, we detail a low-input subsistence Mediterranean agroforestry system that is based on traditional annual crop rotation and perennial shrubs and trees and provides adequate nutritional supply with limited labor, reduced land, and on-site water resources. Our records span a nine-year period in which 0.1 hectare provided a balanced diet to the producing farmer, in effect closing a full farm-table-farm cycle, with no synthetic fertilizers or herbicides and with zero waste. Situated within a semi-arid region that is a climate-change 'hotspot', this food system serves as a case study for further examining food production systems that provide healthy diets with lower environmental impacts and greater agrobiodiversity and resilience than conventional industrial farming practices, and even organic ones.