Events 2013-2014

Logic Colloquium

September 06, 2013, 4:10 PM (60 Evans Hall)

Theodore A. Slaman
Professor of Mathematics, University of California, Berkeley

On Normal Numbers

A real number is simply normal to base b if in its base-b expansion each digit appears with asymptotic frequency 1/b. It is normal to base b if it is simply normal to every power of b, and absolutely normal if it is simply normal to every integer base. By a theorem of E. Borel, almost every real number is absolutely normal. We will present three main results. First, we give an efficient algorithm, running in nearly quadratic time, that computes the binary expansion of an absolutely normal number. Second, we demonstrate the full logical independence of normality to one base from normality to another. Third, we give a necessary and sufficient condition on a set M of natural numbers for there to exist a real number X such that X is simply normal to base b if and only if b is an element of M.
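
To make the defining condition explicit, here is a standard formulation (not quoted from the abstract): if the base-b expansion of x has digits d_1 d_2 d_3 ..., then x is simply normal to base b when
\[
\lim_{n\to\infty} \frac{\#\{\, i \le n : d_i = d \,\}}{n} \;=\; \frac{1}{b}
\qquad \text{for every digit } d \in \{0, 1, \dots, b-1\}.
\]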

Logic Colloquium

September 20, 2013, 4:10 PM (60 Evans Hall)

Lotfi A. Zadeh
Professor Emeritus of Electrical Engineering and Computer Sciences, Director, Berkeley Initiative in Soft Computing (BISC), University of California, Berkeley

Truth and Meaning

The theory outlined in my lecture, call it RCT for short, is a departure from traditional theories of truth and meaning, including correspondence theory, coherence theory, possible-world semantics and truth-conditional semantics. The principal objective of RCT is the construction of a procedure which, when applied to a proposition p drawn from a natural language, leads to: (a) a mathematically well-defined meaning of p; and (b) the truth value of p.

The centerpiece of RCT is the concept of a restriction. Informally, a restriction, R(X), on a variable, X, is a limitation on the values which X can take. Typically, a restriction is described in a natural language. A simple example: “Usually X is significantly larger than a,” where a is a real number. The canonical form of a restriction is: X isr R, where X is the restricted variable, R is the restricting relation, and r is an indexical variable which defines the way in which R restricts X.

There are two key postulates in RCT. First, the meaning postulate, MP. MP asserts that the meaning of p is a restriction, X isr R, in which X, R and r are implicit in p. This restriction is referred to as the canonical form of p. Second, the truth postulate, TP, which asserts that the truth value of p is the degree to which X satisfies R.
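
In the simplest (possibilistic) case, where r is left blank and R is a fuzzy set with membership function \mu_R, the two postulates combine roughly as follows; this is a reconstruction from standard presentations of RCT, not a quotation from the abstract:
\[
\text{MP: } \; p \;\longrightarrow\; X \ \mathit{is}\ R, \qquad
\text{TP: } \; tr(p) \;=\; \mu_R(u) \ \text{ when } X \text{ takes the value } u,
\]
so that the truth value of p is the grade of membership of the value of X in the fuzzy set R.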

In RCT, a proposition, p, is associated with two truth values—internal truth value and external truth value. The internal truth value modifies the meaning of p. The external truth value relates to the degree of agreement of p with factual information. In the definition of truth value which was stated earlier, the truth value is internal. In RCT, truth values are numerical, taking values in the unit interval, or linguistic, e.g., very true, not quite true, more or less true, usually true, etc.

Slides

Logic Colloquium

October 04, 2013, 4:10 PM (60 Evans Hall)

Richard Tieszen
Professor of Philosophy, San Jose State University

Monads and Mathematics: Gödel, Leibniz, and Husserl

On the basis of his discussions with Kurt Gödel, Hao Wang (A Logical Journey: From Gödel to Philosophy, p. 166) tells us that “Gödel’s own main aim in philosophy was to develop metaphysics – specifically, something like the monadology of Leibniz transformed into exact theory – with the help of phenomenology”. Gödel began to study Edmund Husserl’s phenomenology in 1959. In 1928 Husserl (“Phenomenology”, Encyclopedia Britannica draft) wrote that “The ideal of the future is essentially that of phenomenologically based (‘philosophical’) sciences, in unitary relation to an absolute theory of monads”. In the Cartesian Meditations and other works Husserl identifies ‘monads’ (in his sense) with ‘transcendental egos in their full concreteness’. In the first part of my talk I explore some prospects for a Gödelian monadology that result from this identification, with reference to texts of Gödel, Wang’s reports, and aspects of Leibniz’s original monadology. The latter part of the talk will be on human monads, the incompleteness theorems, and (Turing) machines. (For background see my recent book, After Gödel: Platonism and Rationalism in Mathematics and Logic, Oxford University Press.)

Logic Colloquium

October 18, 2013, 4:10 PM (60 Evans Hall)

James Freitag
National Science Foundation Postdoctoral Fellow, University of California, Berkeley

Differential Algebra and the Model Theory of Groups

Sacks called differential fields the “least misleading” examples of omega-stable theories. This talk will reinforce that philosophy, because the theorems we will discuss fall into two general situations: (1) theorems from model theory which inspired results in differential algebra or (2) theorems from differential algebra which were generalized to the model-theoretic setting. Specifically, we will be discussing theorems regarding differential algebraic groups and groups definable in superstable theories (a generalization of omega-stability). We will discuss three types of theorems: (1) Jordan-Hölder theorems, (2) indecomposability theorems, and (3) results on central extensions. To open the talk, we will give a survey of the model theory of differential fields.

Logic Colloquium

November 01, 2013, 4:10 PM (60 Evans Hall)

Lara Buchak
Assistant Professor of Philosophy, University of California, Berkeley

Risk and Inequality

Decision theory concerns the evaluation of gambles. When choosing among gambles, individuals are forced to consider how those gambles will turn out under various circumstances, and decide how to trade off the possibility that a gamble will turn out well against the possibility that it will turn out poorly. How should we aggregate the values one might get in different possible circumstances, in order to arrive at a single value for a gamble? The orthodox view is that there is only one acceptable way to do this: rational individuals must maximize expected (i.e. average) utility. The contention of my recent book, however, is that the orthodox theory (expected utility theory) dictates an overly narrow way in which considerations about risk can play a role in an individual’s choices. There, I argued for an alternative, more permissive, theory of decision-making: risk-weighted expected utility theory (REU theory). This theory allows individuals to pay proportionally more attention to the worst-case scenario than to the best-case scenario.
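
In outline, and following standard presentations of rank-dependent and risk-weighted theories rather than the talk itself: for a gamble g that yields outcome x_i with probability p_i, with outcomes ordered from worst to best,
\[
EU(g) \;=\; \sum_{i=1}^{n} p_i\, u(x_i),
\qquad
REU_r(g) \;=\; u(x_1) + \sum_{i=2}^{n} r\!\Big(\sum_{j=i}^{n} p_j\Big)\big(u(x_i) - u(x_{i-1})\big),
\]
where r maps [0,1] to [0,1], is nondecreasing, and satisfies r(0) = 0 and r(1) = 1. Taking r(q) = q recovers expected utility, while a convex r (for example, r(q) = q^2) discounts improvements that occur only with small probability, one way of weighting the worst-case scenario more heavily than the best.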

Social choice theory concerns the evaluation of social distributions: distributions of goods or outcomes to individuals. To determine which social distribution is better, we must consider how distributions go for various individuals, and decide how to trade off the fact that one distribution is better for some people against the fact that it is worse for others. How should we aggregate the values that go to each person, in order to arrive at a single value for a social distribution and to say which of two social distributions is better? A traditional answer to this question is that the value of a social distribution is the average of the utility values that each individual in the society receives (utilitarianism). And one traditional justification of utilitarianism relies on the assumption that expected utility theory is the correct decision theory.

There are two ways in which decision theory has been used to justify a particular aggregation method in social choice theory. The first is to propose that facts about the values of social distributions are determined or discerned from individuals’ preferences about gambles. This is the method employed by John Rawls and John Harsanyi, for example: both consider individual preferences about gambles over social distributions in which one does not know which place one will occupy in society, and use these to determine the aggregative social welfare function. The second way to use decision theory to justify a particular aggregation method in social choice theory is to examine the conditions on the preference relation in decision theory, and explore whether analogous conditions might hold of the betterness relation. This is the rough strategy behind John Broome’s justification of utilitarianism in his book Weighing Goods.

Existing philosophical versions of these justifications are based on views about decision-making that I’ve argued are incorrect: thus, I claim, their conclusions for social choice theory are unjustified.

In this work-in-progress talk, I explore the prospects for using REU theory to justify an alternative evaluation of social distributions, one that allows us to pay special attention (but not exclusive attention) to the worst-off person as opposed to the best-off person.

Logic Colloquium

November 15, 2013, 4:10 PM (60 Evans Hall)

Dana S. Scott
University Professor, Emeritus, Carnegie Mellon University; Visiting Scholar, University of California, Berkeley

Stochastic Lambda-Calculi

Many authors have suggested ways of adding random elements and probability assessments to versions of Church’s Lambda-Calculus. When asked recently about models, the speaker realized that the so-called Graph Model (based on using enumeration operators acting on the power set of the integers) could easily be expanded to allow random variables taking values in the model. Other models can be treated similarly. The talk will discuss randomizing algorithms and how a continuation-passing semantics can be used for modeling a branching combinator using random coin tossing. These ideas can also be employed for introducing many other kinds of random combinators.
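
As a purely illustrative sketch, and not the graph-model construction discussed in the talk, a coin-tossing branching combinator in continuation-passing style might look like the following (all names here are hypothetical):

    import random

    def coin(p=0.5):
        """Toss a biased coin: True with probability p."""
        return random.random() < p

    def branch(k_left, k_right, p=0.5):
        """Branching combinator in continuation-passing style:
        hand control to one of two continuations according to a coin toss."""
        return k_left() if coin(p) else k_right()

    # Example: a random "value" obtained by running one of two thunks.
    sample = branch(lambda: "heads", lambda: "tails")
    print(sample)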

Logic Colloquium

December 06, 2013, 4:10 PM (60 Evans Hall)

Alexander S. Kechris
Professor of Mathematics, California Institute of Technology

Topological Dynamics and Ergodic Theory of Automorphism Groups of Countable Structures

I will discuss some aspects of the topological dynamics and ergodic theory of automorphism groups of countable first-order structures and their connections with logic, finite combinatorics, and probability theory. This is joint work with Omer Angel and Russell Lyons.

Logic Colloquium

January 31, 2014, 4:10 PM (60 Evans Hall)

Jan Reimann
Assistant Professor of Mathematics, The Pennsylvania State University

Effective Multifractal Spectra

Multifractal measures play an important role in the study of point processes and strange attractors. A central component of the theory is the multifractal formalism, which connects local properties of a measure (pointwise dimensions) with its global properties (average scaling behavior).

In this talk I will introduce a new, effective multifractal spectrum, where we replace pointwise dimension by asymptotic compression ratio. It turns out that the underlying measure can be seen as a universal object for the multifractal analysis of computable measures. The multifractal spectrum of a computable measure can be expressed as a “deficiency of multifractality” spectrum with respect to the universal measure. This in turn allows for developing a quantitative theory of dimension estimators based on Kolmogorov complexity. I will discuss some applications to seismological dynamics.
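
For orientation, the classical local quantity and its effective counterpart can be stated as follows (standard definitions, not necessarily in the exact form used in the talk):
\[
\dim_\mu(x) \;=\; \lim_{r\to 0} \frac{\log \mu(B(x,r))}{\log r}
\qquad \text{(pointwise dimension, when the limit exists),}
\]
while in the effective setting the role of this local scaling exponent is played by the asymptotic compression ratio
\[
\liminf_{n\to\infty} \frac{K(x \upharpoonright n)}{n}
\]
of the binary expansion of x, where K denotes prefix-free Kolmogorov complexity.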

Logic Colloquium

February 14, 2014, 4:10 PM (60 Evans Hall)

Sherrilyn Roush
Professor of Philosophy, University of California, Berkeley

The Difference Between Knowledge and Understanding

In the classic Gettier problem, cases are imagined where a person has very good reasons to believe a proposition p, and p is true, but many people think the person still doesn’t have knowledge. I characterize what is missing in these cases probabilistically, as a failure of what I call relevance matching. Relevance matching is best interpreted as understanding why p is true, and it is deterministically related to but distinct from the tracking conditions that in my view define knowledge. This view of understanding makes it a simulation among your mental states of the dispositions of factors probabilistically relevant to p’s being true, and as such avoids requiring unrealistic information- and computation-intensive beliefs and inferences about why p is true in order to understand.
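
For readers unfamiliar with the tracking conditions referred to here, they are given probabilistically in Roush's earlier work, roughly as the requirements
\[
P(\,S \text{ believes } p \mid p\,) > t
\qquad \text{and} \qquad
P(\,S \text{ does not believe } p \mid \neg p\,) > t
\]
for suitably high thresholds t; relevance matching is a further probabilistic condition distinct from these. This gloss is drawn from standard presentations of her view, not from the abstract.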

Logic Colloquium

February 28, 2014, 4:10 PM (60 Evans Hall)

Alexander Melnikov
Lecturer, Massey University, Auckland, New Zealand, and Postdoctoral Scholar, University of California, Berkeley

Infinitely Generated Abelian Groups with Solvable Word Problem

Logic Colloquium

March 14, 2014, 4:10 PM (60 Evans Hall)

G. Aldo Antonelli
Professor of Philosophy, University of California, Davis

General First-Order Models: Concepts and Results

In his dissertation work, published in 1950, Leon Henkin showed how to provide higher-order quantifiers with non-standard, or “general”, interpretations, on which, for instance, second-order quantifiers are taken to range over collections of subsets of the domain that may fall short of the full power-set. In contrast, first-order quantifiers are usually regarded as immune to this sort of non-standard interpretation, since their semantics is ordinarily taken to be completely determined once a first-order domain of objects is selected.

The asymmetry is particularly evident from the point of view of the modern theory of generalized quantifiers, according to which a first-order quantifier is construed as a predicate of subsets of the domain. But the generalized conception still views first-order quantifiers as predicates over the full power-set. Accordingly, the possibility that they, similarly to their second-order counterparts, might denote arbitrary collections of subsets has not been pursued in full generality.
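
In the notation of generalized quantifier theory, the clause at issue is the following (a standard formulation, consistent with the abstract):
\[
\mathcal{M} \models Qx\,\varphi(x)
\quad\Longleftrightarrow\quad
\{\, a \in D : \mathcal{M} \models \varphi[a] \,\} \in Q^{\mathcal{M}},
\]
where D is the first-order domain and Q^{\mathcal{M}} \subseteq \mathcal{P}(D). On the standard semantics Q^{\mathcal{M}} is fixed by D alone (for instance, \exists is interpreted by the collection of nonempty subsets of D and \forall by \{D\}); a Henkin-style general interpretation instead allows the second-order component from which Q^{\mathcal{M}} is drawn to fall short of the full power set.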

This talk introduces a Henkin-style semantics for arbitrary first-order quantifiers, exploring some of the resulting properties, and emphasizing the effects of imposing various further closure conditions on the second-order component of the interpretation. Among other results, we show by a model-theoretic argument that in certain cases the notion of validity relative to models satisfying the closure conditions is axiomatizable.

Alfred Tarski Lectures

March 31, 2014, 4:10 PM (3 LeConte Hall)

Stevo Todorcevic
University of Toronto and CNRS, Paris

The Measurability Problem for Boolean Algebras

Alfred Tarski Lectures

April 02, 2014, 4:10 PM (3 LeConte Hall)

Stevo Todorcevic
University of Toronto and CNRS, Paris

Chain-conditions of Horn-Tarski

Alfred Tarski Lectures

April 04, 2014, 4:10 PM (3 LeConte Hall)

Stevo Todorcevic
University of Toronto and CNRS, Paris

Combinatorial and Set-theoretic Forcing

Logic Colloquium

April 12, 2014, 4:10 PM (60 Evans Hall)

Zoe Chatzidakis
Senior Researcher, CNRS, Paris; Research Professor, MSRI, Berkeley (Spring 2014)

Model Theory of Difference Fields and Applications

I will present classical results on the model theory of difference fields, and some recent applications by Hrushovski and myself.

Logic Colloquium

April 25, 2014, 4:10 PM (60 Evans Hall)

Lou van den Dries
Professor of Mathematics, University of Illinois (Urbana-Champaign); Research Professor, MSRI (Spring 2014)

The Differential Field of Transseries

Transseries originated some 25 years ago, mainly in Ecalle’s work, but also independently in the study of Tarski’s problem on the field of reals with exponentiation. Some 20 years ago I formulated some conjectures about the differential field of transseries, and started exploring its model-theoretic properties, first jointly with Matthias Aschenbrenner, and a few years later also in collaboration with Joris van der Hoeven. Around 2011 we finally saw a clear path towards a proof, sharpening the conjectures in the meantime. In the last few weeks we finished the job, with one proviso: as part of a division of labor and for lack of time, some final details have only been checked by some of us, but not yet by all three of us. In any case, I will motivate these conjectures, and give an account of their status.

Slides