Seminars – Archive (including recordings)


Click here for the seminar recordings


Winter Semester 5785 (2024–25)

Speaker: Dr. Daniel A. Di Liscia, LMU Munich and Humanities and Arts, Technion

Title: Concepts in Motion: Late Medieval Interdisciplinarity and the Rise of Modern Science

Abstract

According to the Aristotelian tradition, physics, mathematics, and metaphysics are three distinct speculative disciplines, each with its own approach and, especially, distinct objects of research. These disciplines cannot be combined without causing serious complications. However, Aristotle addresses the particular case of certain disciplines that derive their principles from pure mathematics but focus on natural objects; music, astronomy, and optics are typical examples, though for different reasons in each case. From the 13th century onward, these fields became collectively known as scientiae mediae, or "middle sciences," as they are epistemologically situated somewhere between mathematics and physics. Galileo still referenced them in his Discorsi.

With this background, scholars have questioned whether the work of the "calculators," which essentially involves the mathematization of motion, could be understood as a particular case of the middle sciences. Disappointingly, however, no definitive proof has yet been provided within this context. Nonetheless, the later development of this tradition clearly indicates that the concept of mathematizing motion evolved into a new discipline, later called the science of the latitude of forms (scientia de latitudinibus formarum).

In my talk, I will present several pieces of evidence for the historical fact that the inclusion of the text De latitudinibus formarum in university curricula paved the way for a new, geometry-based science of motion, which its proponents themselves identified as a scientia media. I will substantiate this argument with both texts I have previously cited in my published works and others that have, until now, remained unknown.


Spring Semester 5784 (2024)

Speaker: Dr. Enrico Piergiacomi, Humanities and Arts, Technion

Title: Infinite Scepticism: Digital Atomism and the Case Study of Metrodorus of Chios

Abstract

According to ancient sources, Democritus’ pupil Metrodorus of Chios (4th century BC) embraced seemingly contradictory perspectives. While he exhibited an extremely sceptical stance on epistemology, asserting that «none of us knows anything, and not even this, whether we know it or do not know it», he simultaneously endorsed scientific doctrines in the realm of physics, such as atomism, the infinitude of the universe, and a mechanistic explanation of natural phenomena.

This apparent schism between his views on physics and epistemology raises the question of why Metrodorus felt justified in making assertions about the nature and processes of the world despite his profound epistemic scepticism. This paper aims to propose that Metrodorus paradoxically based his sceptical outlook on atomism.

In another fragment, Metrodorus posited that the causes of things and events are infinite, suggesting that every finite entity has both a physical foundation and a transcendent source of existence. This idea of infinitude serves both as the grounding for the continuous infinite changes observed in experience (qualitatively and quantitatively) and as the justification for epistemic doubt. If the causes of an event are infinite and the human mind is finite, our understanding can only capture a fraction of the causal spectrum beyond any given phenomenon. In other words, truth remains indeterminate.

This hypothesis also elucidates why Metrodorus doubted whether we are truly ignorant of something, namely that we may unknowingly possess knowledge. The enigmatic statement «whether we know it or do not know it» could imply that we have possibly discerned the causes of a particular phenomenon within the infinite causal spectrum. However, we may be oblivious to this knowledge due to the overwhelming nature of infinitude.

In conclusion, I propose that Metrodorus’ sceptical epistemology and atomistic physics were interconnected in an unconventional manner. The concept of infinitude serves as a natural premise leading to a tentative and indeterminate perspective on nature, subject to constant revision as more infinite causes of reality are comprehended.

Speaker: Prof. Oded Ezer, Holon Institute of Technology  

Title: The Intersection of the Art of Typography, Technology, and Human Experience

Abstract

In this seminar, I will provide insights into the core creative concepts and methodologies behind integrating typography, fiction, and contemporary visual practices. The seminar will start with a brief history of typography and include descriptions and examples of my professional and creative work. I will also explore potential collaborative opportunities and future projects. My goal is to inspire interdisciplinary collaborations that blend artistic methods with fields such as cognition, computer science, biotechnology, mathematics, education, data and decision sciences, physics, and engineering to enhance experimental narratives. I will showcase the potential for innovative joint projects and encourage discussions on new interdisciplinary partnerships.

Speaker: Dr. David Manheim, Humanities and Arts, Technion

Title: Cooperating with Stochastic Parrots

Abstract

Large Language Models are a significant new technology, both technologically and socially, but what exactly they are, and how we should view them now and as they become more sophisticated and integrated into social and business interactions in the near term, is still unclear. After reviewing what LLMs are, and why some experts consider them "Stochastic Parrots", this talk will first make a case that cooperation is a relevant concept for interacting with these models, even if they are not moral actors. Following this, the presentation will consider whether the characterization as stochastic parrots is reasonable from the perspectives of computational complexity and theory of mind, and raise some of the questions related to whether they should be considered moral actors, specifically regarding agency, intent, and consciousness. It will conclude with some thoughts on critical open questions for if and when they come to be considered moral actors socially, ethically, and societally.

Speaker: Dr. Eliyahu Keller, Architecture and Town Planning, Technion

Title: On Architectural Origins and Ends: Raimund Abraham’s Birth of Architecture

Abstract

Stories of the origin of architecture are more than mere tales. Rather, they are textual, visual, and representational devices not only of what architecture, as a profession and a discipline, aspires to be, but of the world in which architecture operates, the subjects (whether architect or user) it imagines, and the kind of culture it sought, in retrospect, to construct around and by virtue of itself. By depicting an origin against which architecture is established, such stories seek to legitimize, concretize, and give meaning to all architectural acts that followed that once primordial act.

Equally intriguing, yet scarcely investigated in comparison to architecture’s imaginary beginnings, are the tales and images of architecture’s end. Indeed, it could be argued that to imagine the ultimate demise of a discipline so self-preoccupied, if not obsessed, with its own beginnings would constitute a contradiction in terms. And yet, as the literary scholar Frank Kermode noted, if we are to understand our place in the middle, we must take into account not only how a thing begins but how it emphatically terminates. If the origins of architecture, then, are to have a function within the discipline’s story, and if the discipline itself is to have a future, then a certain formation of architecture must come to an end.

My lecture will examine the entanglement of architecture’s origins and ends through a reading of several noted architectural origin myths against a more recent work that belongs to this tradition: the last published work of the Austrian-born visionary draftsman and architect Raimund Abraham, titled ‘The Birth of Architecture’ (2009). Building on the extant literature and countless interpretations surrounding the history of architectural origin myths, as well as an exploration of Abraham’s career and personal history, I will posit Abraham’s drawing as the conclusion to the origin myths that came out of the Enlightenment and their conception of architecture’s relationship to nature. Out of this examination, Abraham’s Birth of Architecture emerges not only as the ultimate conclusion to the centuries-old Western story of nature’s subjugation but also as the first chapter in a future history of architecture, one that emerges from a recognition of the destruction that has been demanded by human progress: an origin story for an architecture in a world in which the traditional conception of nature no longer exists.

Speaker: Dr. Maayan Tsadka

Title: Acoustic Ecology and How to Expand Our Hearing

Abstract

Coming from a practice of music composition, sound, and field recording, I will share some of my practices and thoughts about using microphones, technology, and listening techniques as means of augmenting our hearing, listening, and perception.

I will share some of my creative projects and perspectives from the growing field of acoustic ecology and the use of sound recording as a research and creative tool.

Speaker: Dr. Nir Ofek, Network Biology Research Lab, Technion

Title: The Feedback Loop of Synchrony: Emergence in Group Dynamics and Its Influential Effects

Abstract

Synchrony is the phenomenon whereby individuals within a group exhibit simultaneous, coordinated behavior. In this talk, I will present our ongoing research on how synchrony impacts individual identity transformation and its significance for resilience and social cohesion. To understand how synchrony emerges, I will also present our approach to identifying the role of different connectivity setups (closed-loop versus open-loop) in facilitating or hindering group synchrony.

Speaker: Dr. Orit Wolf, Humanities and Arts, Technion

Title: The Art of "Cadenza" Writing for Classical Concerto – from Mozart and Beethoven's Era up to the 21st century

Abstract

The "Cadenza" is perhaps the most precious part of the concerto of the classical era, where composers such as Mozart and Beethoven enabled the soloist to express virtuosity and reflections upon the thematic material of the given musical text. This was an opportunity for the soloist to play alone while the orchestra was silent. While these classical masters offered their own examples of cadenzas, both emphasized the importance of the performer writing or improvising his or her own cadenza. Accordingly, "Romantic"-style cadenzas composed for classical piano concertos took the cadenza into new realms of daring and creativity. There is a rich tradition spanning from the cadenzas written by Clara Schumann, Busoni, and Brahms to the contributions of contemporary composers and improvisers. Strangely enough, during the 20th century cadenza writing declined and suffered from over-conservatism. This prompts several critical questions for performers and scholars alike: Should we adhere to the cadenzas recommended by the original composers, or should we endeavor to create our own? What are the established conventions for composing or improvising a cadenza for a classical concerto?

Additionally, what stylistic approach should be employed: that of the original composer, or a more modern or personally favoured style? Which thematic materials should be incorporated, and in what manner? Is there a prescribed duration for the cadenza, or is it inherently a free-form structure? These questions are not only essential for a comprehensive understanding of the function of the cadenza but also for shaping the future performance practice of modern cadenzas for classical concertos.


Winter Semester 5784 (2023–24)

Speaker: Dr. Christian Beck, Technion

Title: Into the Tension between Quantum Nonlocality and Relativistic Space-Time

Abstract

In this talk, I will give an overview of two rival aspects of two of our most fundamental physical theories, quantum theory and relativity. Quantum entanglement leads to nonlocal (superluminal) correlations between distant events. Bell’s theorem establishes that these correlations cannot be explained by local common causes. But according to Einsteinian relativity, there is no absolute temporal order between such events; different “observers” disagree on which occurs first. Hence, any explanation of the correlations in terms of “cause” and “effect” would be relative to a frame of reference.

To guarantee that this tension between quantum nonlocality and relativity does not lead to inconsistencies, certain constraints are imposed on relativistic quantum theories (local commutativity) which ensure that quantum nonlocality cannot be used to send signals faster than light (no-signalling). It is usually argued that the possibility of superluminal signals would give rise to causal paradoxes. I will demonstrate the basic scheme of such paradoxes and argue that, although they contain valid lines of argument, they appear to be too anthropocentric to serve as the basis for a fundamental physical principle. I will motivate another physical requirement, which I call relativistic consistency, which seems better suited than no-signalling and implies the latter.

Speaker: Dr. Nir Fresco, Ben Gurion University of the Negev

Title: Miscomputation as Malfunction in the Computational Sciences of Mind and Brain

Abstract

Miscomputation is a deviation from a norm that is set for artificially designed physical systems (e.g., desktops and smartphones), but also for species and organisms. When a system miscomputes, it computes a different mathematical function, g, rather than the function, f, that is the norm (i.e., there exists at least one input x such that g(x) ≠ f(x)). Despite the importance of miscomputation for any complete account of physical computation, few works have dealt with this phenomenon explicitly (e.g., Colombo, 2021; Dewhurst, 2014; Fresco & Primiero, 2013; Petricek, 2017; Piccinini, 2015; Primiero, 2020; Tucker, 2018). In contrast to artificial computing systems, there are no design specifications according to which the behaviour of a biological system can be classified as correct or incorrect. How, then, should miscomputation be construed in the computational sciences of mind and brain? Is any appeal to ‘miscomputation’ in these sciences merely instrumental? Is this usage aligned with how ‘miscomputation’ is used in computer science and engineering? These questions remain open. Surprisingly, most normative accounts of computation in the philosophical literature face a dilemma: either there is no such thing as miscomputation, or it cannot be explained.
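As a brief illustrative aside (not part of the abstract), the definition above can be made concrete in a few lines of Python: a system miscomputes the norm f whenever the function g it actually computes disagrees with f on at least one input. The names f_norm and g_actual and the sample input range are hypothetical choices for this sketch, not anything from the talk.

# Hypothetical sketch: miscomputation as "there exists an input x with g(x) != f(x)".

def f_norm(x: int) -> int:
    """The norm: the function the system is supposed to compute (here, doubling)."""
    return 2 * x

def g_actual(x: int) -> int:
    """The function the system actually computes, with a deliberate fault at x = 3."""
    return 2 * x if x != 3 else 7

def miscomputes(f, g, inputs) -> bool:
    """True iff g deviates from the norm f on at least one of the given inputs."""
    return any(g(x) != f(x) for x in inputs)

print(miscomputes(f_norm, g_actual, range(10)))  # True: g(3) == 7, while f(3) == 6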

Speaker: Dr. Alik Pelman, Humanities and Arts, Technion

Title: Meta-Philosophy as Intellectual Peace-Making

Abstract: 

When faced with a philosophical controversy, philosophers normally tend to decide which view they side with, and then provide arguments for that view and against the competing ones. However, one can adopt a different stance, a meta-philosophical stance, and search instead for common ‘building blocks’ from which the different competing views can all be reconstructed. Such a set of building blocks allows the generation of a common logical space of possibilities, on which the different existing views can be located, and thus analysed, compared, and contrasted. In addition, such a ‘map’ reveals new possible views yet to be taken. The challenge, of course, is to come up with an appropriate set of building blocks for a given set of competing views. In this talk, I shall propose several such meta-philosophical analyses, applied to debates in the philosophy of science, philosophy of mind, environmental ethics, and metaphysics. We shall also see that, once a set of different competing views is placed on such a common ‘map’ and a more objective, impartial overall perspective is thus gained, it often becomes less attractive to choose one side over the other, thereby fostering ‘peace-making’.

Speaker: Dr. Ariel Furstenberg, Hebrew University

Title: How committed are we to our intentions? From electrophysiology to free will

Abstract: 

In a famous philosophical paradox, Buridan's ass starves to death, because, placed equally between two identical stacks of hay, it cannot make up its mind whether to approach the right or the left haystack. We are faced daily with the need to pick between alternatives that are equally attractive (or not) to us. What are the processes that allow us to avoid paralysis and quickly choose between such equal options when there are no preferences or rational reasons to rely on? This is an empirical question with significant philosophical implications regarding issues of intentionality, human agency, free will and self-control. In this talk, I will present experiments addressing this important phenomenon using measures of human behavior, electrophysiology and neural network modeling, and discuss the cognitive and neural mechanisms involved in the process of intention formation and execution, in the face of alternatives to choose from. Specifically, I will show results revealing the temporal dynamics of rapid intention formation and, moreover, ‘change of intention’ in a free-choice picking scenario, in which the alternatives are on par for the participant. Furthermore, I will claim that arbitrary decisions, arguably based on a whim, are subject to similar control mechanisms as reasoned decisions. With these results in hand, I plan to discuss their conceptual implications, focusing primarily on the notions of intention, reasons and causes and their ensuing conceptual change. This discussion proposes a refinement of broader conceptions encompassing human agency, free will, and self-control.

Speaker: Dr. Daniel Grimmer, University of Oxford

Title: In Search of New Spacetimes: From Unofficially Humean Laws to a Kantian Spacetime

Abstract: 

What, if anything, can help us explain how nature facilitates the dynamical behavior of matter? One might attempt to answer this question via a straightforward metaphysical reading of our best physical theories. If taken literally, our best theories say that there exist things such as space, time, and laws of nature which govern the world's events and which give them a stage upon which to happen. But do we have to read our ontological commitments off of our theories so literally? Not every aspect of our best theories is taken so literally. For instance, given a theory described in one coordinate system, we have the mathematical tools to easily redescribe it in a wide variety of alternate coordinate systems (or even using no coordinates whatsoever). This strong capacity for coordinate redescription leads us to believe that coordinate systems are merely a representational artifact which we project onto the world as a means of organizing and codifying its regularities.

But what if we could also redescribe our best physical theories using different laws of nature or as being set on different spacetime manifolds (e.g., a Möbius strip vs a Euclidean plane)? What if we could do this just as easily as we can switch between different coordinate systems? In some recent work [1-3], I have shown that we can! In this talk I will argue that given this strong capacity for nomological and spatiotemporal redescription, we ought to adopt a (roughly) Humean view of the laws of nature and a (roughly) Kantian view of space and time. The laws of nature which appear in our best physical theories do not correspond to any active governing force out there in the world; instead, the laws have the metaphysical relevance that they do only because they are a particularly nice way of systematizing the dynamical behavior of matter. Similarly, the spacetime manifold which appears in our best physical theories does not correspond to any sort of stage upon which the world's drama plays out; instead, space and time have the metaphysical relevance that they do only because they are particularly nice ways of organizing and codifying the dynamical behavior of matter.

Speaker: Dr. Alex Blum, Max Planck Institute

Title: Inconsistency in Fundamental Physics

Abstract: 

Quantum Field Theory is the foundation of the Standard Model of Particle Physics, our current best theory of microscopic matter and interactions. It has also been hailed as the numerically most precise theory in the history of science. Yet from its first construction in the late 1940s, the mathematical consistency of this theory was called into question and debated. In my talk, I will analyze the origins of these doubts, why they ultimately remained unresolved, and what they mean for the practice of fundamental physics and for the prospects of ultimately finding a final theory of "everything".

Speaker: Dr. Janina Wellmann, Max Planck Institute Berlin

Title: Registering Life in Motion. A Challenge to the Sciences and Humanities

Abstract: 

Motion, previously marginalized, is now at the core of organic life. On the subcellular and molecular level, in the inner life of cells, where cargoes are delivered, molecules transported, and organelles displaced, movement turns out to be the root of all activity.

Computer animations, high-resolution microscopy and mathematical simulation reveal the organism in maximal motion, but they are also the conditio sine qua non for a world of research that can no longer think, calculate, or experiment without moving images.

In my talk, I will investigate the consequences that the new conception of life as constant becoming has for the sciences and humanities alike. I argue not only that modern biotechnology faces fundamental experimental, technological, and conceptual challenges, and that the humanities provide tools and perspectives that can contribute to understanding them, but also that these challenges can best be faced in dialogue and mutual enrichment.

Speaker: Dr. Guy Hetzroni, The Open University of Israel

Title: Einstein’s Principle of Equivalence and the Empirical Basis for Theoretical Arguments

Abstract:

The interplay of empirical and theoretical considerations in contemporary fundamental physics poses various challenges to the philosophy of science. This talk will offer new reflections on these challenges, based on a new account of certain theoretical methods applied in Einstein’s general theory of relativity and in particle physics.

In his early expositions of general relativity, Albert Einstein presented his principle of equivalence as an empirical observation extended and promoted to a fundamental principle that underlies the construction of the theory. The content of the principle, however, was soon brought into question together with its very validity in the theory. Einstein later abandoned this early presentation of the theory altogether. By the late 1920s he described the construction of the theory as indicating a radical break in theoretical physics, away from inductive reasoning and towards mathematically based patterns of reasoning and non-empirical guiding principles. In parallel, the philosophical reflections on the epistemological foundations of the theory shaped the fundamental divides in early 20th century philosophy of science. These debates are echoed in contemporary debates in the philosophy of physics on the epistemic status of theoretical virtues and principles, and the justification of theories based on them.

In this talk I will revisit the relation between the early debates and the current ones. The discussion will be based on the new "methodological equivalence principle", a suggested generalization of Einstein's principle from which, it is shown, the basic structure of general relativity can be derived in a way that maximizes the weight of empirically based considerations. The same principle is then shown to underlie the applicability of the gauge argument in particle physics, similarly highlighting the role of empirical considerations and resolving worries related to the applicability of mathematics. Finally, I'll argue that the two cases demonstrate the relevance of heuristic and methodological considerations to foundational questions in the philosophy of physics and the philosophy of science.

Speaker: Seth Goldwasser, University of Pittsburgh

Title: Standard Aberration: Cancer Biology and the Modeling Account of Function

Abstract:

Cancer biology features the ascription of normal functions to parts of cancers. At least some ascriptions of function in cancer biology track local normality of parts within the global abnormality of the aberration to which those parts belong. That is, cancer biologists identify as functions activities that, in some sense, parts of cancers are supposed to perform, despite cancers themselves having no purpose. This talk provides a theory to accommodate these normal-function ascriptions, which I call the Modeling Account of Normal Function (MA). MA comprises two claims. First, normal functions are activities whose performance by the function-bearing part contributes to the self-maintenance of the whole system and, thereby, results in the continued presence of that part. Second, MA holds that there is a class of models of system-level activities (partly) constitutive of self-maintenance, members of which are improved by including a representation of the relevant function-bearing part and by making reference to some activity which that part performs, where that activity contributes to those system-level activities. I contrast MA with two other accounts that seek to explicate the ascription of normal functions in biology, namely, the organizational account and the selected-effects account. Both struggle to extend to cancer biology. However, I offer ecumenical readings which allow them to recover some ascriptions of normal function to parts of cancers.

Speaker: Dr. Lotem Elber-Dorozko, Pittsburgh

Title: How a philosophical definition of purpose in biology can help us identify good models in neuroscience

Abstract:

In recent years, a variety of neural network models have been extremely successful in predicting brain processes and behavior. Nonetheless, neuroscientists often disagree on whether these suggested models of cognitive capacities indeed capture essential properties of those capacities. In my talk I aim to clarify these debates by noting that attempts to understand cognitive capacities occur under the posit that the brain is a teleological system – a system that has (at least one) purpose. It seems undeniable that biological organisms have purposes; spiders build webs to catch prey and mouse pups emit sounds to call their mother, for example. Even so, philosophers and scientists have debated for centuries how to define and assign such purpose and have not reached agreement.

Although this teleological view of the brain is an essential part of neuroscience, the difficulties it raises are rarely explicitly invoked in scientific discourse. Considering teleology in relation to neuroscientific practice sheds light on the source of the inherent vagueness that surrounds the delineation of cognitive capacities. Furthermore, it suggests that neuroscientists cannot rely on the prediction of behavior and brain processes alone to support their models. Instead, models of purposeful behavior should also be assessed relative to the history that caused the capacity to be brought forth. I take object recognition and recent convolutional neural network models of this capacity as a case study to demonstrate these points and suggest several concrete ways in which neuroscientific practice can be amended.

Speaker: Dr. Hisham Abdulhalim, Technion

Title: Denied by an (Unexplainable) Algorithm: Teleological Explanations for Algorithmic Decisions Enhance Customer Satisfaction

Abstract:

Automated algorithmic decision-making has become commonplace, with firms implementing either rule-based or statistical models to determine whether to provide services to customers based on their past behaviors and characteristics. In response, policymakers are pressing firms to explain these algorithmic decisions. However, many of these algorithms are “unexplainable” because they are too complex for humans to understand. Moreover, legal or commercial considerations often preclude explaining algorithmic decision rules. We study consumer responses to goal-oriented, or teleological, explanations, which present the purpose or objective of the algorithm without revealing mechanism information that might help customers reverse or prevent future service denials. In a field experiment with a technology firm and in several online lab experiments, we demonstrate the effectiveness of teleological explanations and identify conditions when teleological and mechanistic explanations can be equally satisfying.

Whereas the epistemic value of explanations is well established, we study how explanations mitigate the negative impact of service denials on customer satisfaction. In situations where companies do not want to, or cannot, reveal the mechanism, we find that teleological explanations create equivalent value through the justifications they offer. Our results thus show that firms may benefit from offering teleological explanations for unexplainable algorithmic behavior, positioning themselves as more ethical than others.


Spring Semester 5783 (2023)

Dr. Osvaldo Ottaviani, Technion
Title: What role do plants (and living beings) play in the philosophy of Leibniz?

Abstract:
Without being directly involved in botanical research, Leibniz always had a strong interest in this subject, as well as in many investigations into the nature of living beings. In some passages, he also claims that the inquiries of modern investigators (microscopists, anatomists, etc.) into the generation of plants and animals provided new insights not only into the structure of living beings and the universe, but also into the nature and constitution of souls and incorporeal substances. In my talk, I will try to answer the following question: in what sense could Leibniz say that the nature and structure of living beings is of the utmost importance to the establishment of his theory of simple substances, or monads? I will start from some recently edited texts in which Leibniz is concerned with the nature of certain climbing plants and with the rejection of the theory of the spontaneous generation of plants and insects. In both cases, Leibniz defends the universality of the sexual reproduction of living beings (stressing the analogy between plants and animals), while at the same time maintaining that living beings are never generated mechanically by something which is not already organic. Leibniz's account of life seeks to hold together the following claims: 1) the rejection of spontaneous generation (because nothing organized can be generated from chaos or inorganic matter); 2) the thesis of the preformation of all organisms; and 3) their status as ‘machines’ endowed with an infinity of organs. Thus, Leibniz employs the research and observations of the moderns on the generation of insects, plants, and animals as an empirical confirmation of his metaphysical thesis of the infinite complexity of the machines of nature. From the metaphysical point of view, the infinite composition of natural machines explains the sense in which each soul is able to express an infinite universe (as a living mirror of the universe, as Leibniz says).

Dr. Assaf Marom, Technion

Title: Anatomy education in the age of digital reproduction

Abstract:

Anatomy is a core subject among the basic sciences of medicine, and it has been studied in medical schools through dissection ever since anatomy was established as a science several centuries ago. However, since the last quarter of the previous century, anatomical education has been undergoing a process of transformation, mainly fueled by the development of medical imaging technology. Consequently, many medical schools have been shifting their courses to imaging-based education, relying on computer software, medical images, and holograms, among other tools. While both the traditional and the imaging-based approaches to anatomical education have advantages and disadvantages, here I discuss the drawbacks of the new approach, arguing that the technological advances we are witnessing pose scientific, pedagogic, and ethical challenges that should be addressed before we rush to implement them.

Dr. Daniel Kunzelmann (University of Basel)

Title: Doing fieldwork with(in) surveillance architectures: methodological and ethical implications for digital researchers.

Abstract:

Social media create seemingly transparent contexts of information. Comments, attitudes, and attributions often become easily accessible to researchers. Not least for qualitative approaches, this provides an empirical treasure trove of data, offering valuable insights into sometimes very intimate spheres of life.

We might understand the virtual spaces in which such data are generated as architectures of surveillance that establish a specific regime of (in)visibility. This conceptual reframing raises some fundamental methodological and research ethics questions anew: Who is allowed to see (and know) for what purpose? And how may this knowledge be generated and disseminated?

Using ethnographic material, this lecture reflects on some of the challenges that scientists face while conducting research via social media. The main focus will be on the tension between the actors' wish (or need) for anonymity and a scientific standard that demands the disclosure and traceability of empirical material. A "case-based approach" is presented that makes it possible to uphold established research-ethics standards and, at the same time, benefit from the potential of easily accessible material on social media.

Prof. Catherine Wilson

Title: What Use Was Science to Philosophy—and What Use is Philosophy’s History to Science? 

Abstract:

It is widely, though not universally, believed that philosophy from certain periods is best understood by studying it in the context of the scientific discoveries and hypotheses that preoccupied its authors. I will give a few examples of how to read and misread classical texts in the history of philosophy (Descartes, Leibniz, and Kant, who were essentially philosophers of nature) by taking their scientific context into account. It is also widely, though not universally, believed that scientists do not benefit from studying the history of philosophy. I would next like to challenge this view. After professional philosophy (inevitably) detached itself from professional science, many scientists lost the ethical orientation that had guided their predecessors. This orientation must be recovered if we are not to enter a scientific-technological dystopia.

Dr. Lior Arbel

Title: Music Technology – Bottom Up

Abstract:

Can electric guitars, synthesizers, and Spotify really be the focus of serious academic research? Yes! In this talk we will take a bottom-up look at the academic discipline of music technology. We’ll start with an overview of Lior Arbel’s work – inventing and researching several musical instruments, software, and hardware. We’ll see how music technology research blends engineering, computer science, human-computer interaction, and the social sciences. We’ll go on to describe the international music technology discipline and glance at its current state in Israel, or lack thereof.

Dr. Jonathan Najenson, Technion

Title: Long-Term Potentiation (LTP) Revisited: Reconsidering the Explanatory Power of Synaptic Efficacy

Abstract:

Changes in synaptic strength are described as a unifying hypothesis for memory formation and storage, leading philosophers to consider the "synaptic efficacy hypothesis" as a paradigmatic explanation in neuroscience. However, in Craver's mosaic view, while synaptic efficacy plays a unifying role by integrating diverse fields within a hierarchical mechanism, it does not have explanatory power across distinct memory types. In this talk, I will show how the mechanistic approach can accommodate the explanatory power of this hypothesis by examining its role across different mechanistic hierarchies, which in turn supports the idea of unification.

Dr. Oren Bader, Technion; Heidelberg University Hospital.

Title: Neurophenomenology as a research program – Bridging the gap between philosophy and neuroscience

Abstract:

Francisco Varela coined the term Neurophenomenology (NP) to designate a research area that combines phenomenology – the philosophical study of the structures of subjective experiences – with scientific investigations into their underlying neurocognitive mechanisms. Although Varela introduced the concept of NP over 20 years ago, only a few attempts have been made to implement this unique approach in concrete studies. In this talk, I’ll discuss the benefits and limitations of an integrated philosophical and scientific research program for studying subjective experiences by reviewing results from a recent NP study on intergroup empathy. Specifically, I’ll ask whether naturalizing phenomenology can help bridge the gap between empirical investigations and philosophical approaches, and what can be achieved when designing hybrid research.

Prof. Emmanuel Farjoun, Hebrew University

Title: Exponential economic growth: Causes and Costs

The talk is based on his recently published book "How Labour powers the global economy – a labor theory of capitalism" (coauthored by M. Machover and D. Zachariah).

Abstract: In this talk, the basic drive of the capitalist economy to grow will be reduced to basic labor considerations, conservation, and measures. The limits of this growth will be explained: both lower and upper bounds are explored.

Real monetary prices do not faithfully reflect socio-economic causes and costs; a deeper analysis is needed. Its main tool is a probabilistic reduction of the apparent, easy-to-observe money costs, capital costs, and natural-resource costs to labor inputs. These are measured simply in hours of work and are somewhat harder to trace. Labor inputs, in turn, reflect much deeper socio-economic realities. The implications of the present system for the ecological and human-social future are shown to be grave. Ways to overcome these grave and otherwise inevitable implications will be discussed.

Prof. Elay Shech

Title: Scientific Understanding, Modal Reasoning, and (Infinite) Idealization

Abstract:

One of the main goals of science is to provide understanding. Yet the use of idealizations and abstractions in science, which are falsehoods of sorts, is ubiquitous. How can science afford understanding in light of idealizations? How can falsity allow us to understand the truth? In this talk, I attempt to make some headway in answering such questions.

Specifically, by appealing to resources found in the scientific understanding literature, I identify in what senses idealizations afford understanding in the context of the (magnetic) Aharonov-Bohm effect. Using this example, which appeals to infinite idealizations, I will suggest that idealizations facilitate modal reasoning in a manner that is essential for understanding both scientific theory and what some phenomenon is supposed to be in the first place.

Prof. Aviram Rosochotsky (Tel Aviv University)

Title: Relationism in Shape Dynamics

Abstract:

Relationism is a philosophical view according to which motion consists solely in changes to position relations between material bodies. Consequently, it rules out absolute motions – those that take place with respect to space and time which (supposedly) exist independently of matter. In my talk I'll show how Barbour and Bertotti (1982) were able to create a modification of Newton's theory of gravity which is compatible with the relationist view of motion using a theoretical device called best-matching. I will then review the treatment of motion in Shape Dynamics, which is the generalization of Barbour and Bertotti's approach in a relativistic context. The comparison between the revisions of Newtonian gravity and General Relativity will show interesting differences.


Seminars – Winter and Spring 5782 (2021–22)

Prof. Omer Einav (Molad)

Title: Defending the Goal: Football and the Relations between Jews and Arabs in Mandatory Palestine, 1917-1948

Abstract:

My research uses the periodization of the Mandate in analyzing relations between the Zionist and Palestinian national movements in their socio-cultural context. As a research tool, that context, with its diverse aspects, has been used in recent years as a field for exploring the dynamic created in Mandatory Palestine between the two societies that lived there, and as a reflection of the nationalist aspects at the heart of existing historiography. The focus is the game of football, through which, from a distinctive angle, an attempt is made to examine the development and character of relations between Jews and Arabs.


Dr. Lotem Elber-Dorozko (Technion) and Arnon Levy (Hebrew University).

Title: What is so good about being optimal? On appeals to optimality in the cognitive sciences

Abstract:

Models in cognitive science often describe people’s behavior as optimal. What are the motivations for such descriptions? We discuss three – empirical, methodological, and conceptual. The first involves a claim about the power of natural selection, namely that it can be expected to lead many cognitive capacities to be performed optimally, in the sense that they maximize fitness. On this view, appeals to optimality have explanatory value in virtue of background assumptions about evolution. However, we present several interrelated reasons to question the claim that cognitive capacities maximize fitness.

Alternatively, optimal models may serve as a good first approximation for the behavior. We suggest that it is more accurate to consider optimal models as idealizations, because ‘approximation’ suggests that the model can be brought closer (often, arbitrarily closer) to reality, and this does not hold for most cognitive models. It is an open question what can be learned from optimality-based models, construed as idealizations.

Finally, we consider a conceptual motivation for viewing people as optimal. Here, the idea is that optimality is part of an attempt to “make sense of the organism”, to rationalize its behavior. We accept that such a perspective is important, perhaps indispensable in studying minds. But we urge caution about tying it to the notion of optimality or to strong notions of rationality.


Dr. Sharon Bassan

Title: A Proportionality-Based Framework for Government Regulation of Digital Tracing Apps in Times of Emergency

Abstract:

Times of emergency present an inherent conflict between the public interest and the preservation of individual rights. Such times require granting emergency powers to the government on behalf of the public interest and relaxing safeguards against government actions that infringe rights. The lack of a theoretical framework for assessing governmental decisions in times of emergency leads to a polarized and politicized discourse about potential policies and, often, to public distrust and lack of compliance.

Such a discourse was evident regarding Digital Tracing Apps (“DTAs”), which are apps installed on cellular phones to alert users that they were exposed to people who tested positive for COVID-19. DTAs collect the most sensitive types of information, such as health-related and location or proximity information, which violates the right to privacy and the right to be free of surveillance. This sensitive information is normally legally protected. But in emergencies there are no legal restrictions limiting the collection of such data. The common privacy-law approach supports DTA implementation under the condition that the technology preserves the privacy of users. But this Article suggests that the privacy approach focuses on micro considerations and under-addresses the implications of DTA-based policy. Instead, this Article suggests rethinking DTA implementation during COVID-19 through the doctrine of proportionality. Often used by European Union courts in areas where decisions entail meaningful implications for individual rights, the doctrine offers a clear and workable normative evaluation of tradeoffs in a more nuanced, explicable, and transparent way. Highlighting macro considerations, the doctrine of proportionality suggests that 1) DTA-based policy is less proportionate compared to traditional contact-tracing methods; 2) policies created while relying on smartphones are inequitable and biased; and 3) the sharing of sensitive personal information with private companies will have irreversible social surveillance implications. Additionally, the proportionality method not only provides a flexible methodological tool to evaluate government decisions in times of emergency but also offers an opportunity to examine how governments achieve and justify the acceptance and assimilation of new technological policy measures, which may take societies in new directions.


Dr. Paula Reichert (LMU München).

Title: Why does time have a direction?

Abstract

This talk discusses the physical origin of time directionality. In nature we observe irreversible (i.e., time-directed) processes, like the breaking of an egg, diffusion through cell membranes, and so on, despite the fact that all the fundamental laws of physics are time-symmetric. How can this apparent paradox be resolved? We discuss how macroscopic time asymmetry can be grounded in time-symmetric microscopic laws by means of a typicality argument based on special initial conditions. This demand for a special initial condition pushes the origin of time asymmetry back in time, towards the beginning of the universe. In this context, we discuss how modern cosmology seeks to explain the origin of time directionality via a time-symmetric ‘one past, two futures’ scenario, in which the Big Bang provides merely a special instant, a Janus point, in an otherwise eternal universe.


Professor Omri Barak (Technion)

Title: Understanding the brain with models we don't understand

Abstract

I will present the common manner in which computational models are used in neuroscience, and then an alternative that combines machine learning, reverse engineering and dynamical systems. The talk will touch upon questions such as "What defines a good model?", "How should we compare models to data?" and more.


Dr. Alik Pelman (Technion), Dr. Alon Shepon (TAU), Dr. Jerke de Vries (Van Hall Larenstein), Dr. Sigal Teper (Tel-Hai)

Title: What is the Most Environmental (Healthy) Nutrition? Comparing a case study of self-sufficient farming to common industrial alternatives

Abstract

Providing food security to feed a growing population, with reduced environmental impacts and resilience to climate change, is an ongoing challenge. Here, we detail a low-input subsistence Mediterranean agroforestry system that is based on traditional annual crop rotation and perennial shrubs and trees, and that provides adequate nutritional supply with limited labor, using reduced land and on-site water resources. Our records span a nine-year period in which 0.1 hectare provided a balanced diet to the producing farmer, in effect closing a full farm-table-farm cycle, with no synthetic fertilizers or herbicides and with zero waste. Situated within a semi-arid region that is a climate-change ‘hotspot’, this food system serves as a case study to further examine food production systems that provide healthy diets with lower environmental impacts and greater agrobiodiversity and resilience than conventional industrial farming practices, and even organic ones.