Seminars

Seminars are listed in reverse chronological order; scroll down the list to find an activity of interest. Seminars are normally given in English; otherwise they are marked as Spanish Only.


Development and Evaluation of an Internet-Based Tutorial Module (i-TModule) For Statistics Learning Among Postgraduate Students

Event Date: Apr 26, 2017 in Education, Seminars

Abstract: Because students’ ability to use statistics, which is mathematical in nature, is one of the concerns of instructors, embedding the pedagogical characteristics of learning within an e-learning system is ‘value added’. It could facilitate the traditional method of learning mathematics, which is usually teacher-centered. Nowadays, many different types of online learning platforms and Learning Management Systems (LMSs), such as Moodle, are used in the teaching and learning process, especially in universities, but there is a lack of...

Topological Dynamics of Piecewise Λ-Affine Maps of the Interval

Event Date: Apr 17, 2017 in Dynamical Systems, Seminars

Abstract: Let 0 < a < 1, 0 ≤ b < 1 and I = [0,1). We call the interval map φa,b : x ∈ I → ax + b mod 1 a contracted rotation. Once a is fixed, we are interested in the dynamics of the one-parameter family φa,b, where b runs over the interval [0, 1). Any contracted rotation has a rotation number ρa,b which describes the asymptotic behavior of φa,b. In the first part of the talk, we analyze the numerical relation between the parameters a, b and ρa,b and discuss some applications of the map φa,b. Next, we introduce a generalization of...
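The map in the abstract is simple to iterate. As a minimal numerical sketch (my own illustration, not material from the talk; the parameter values are arbitrary), the rotation number ρa,b can be estimated as the asymptotic frequency with which an orbit of φa,b(x) = ax + b mod 1 wraps around the circle:

```python
# Sketch: estimate the rotation number rho_{a,b} of the contracted
# rotation phi_{a,b}(x) = a*x + b mod 1 by counting how often the
# orbit crosses 1 (i.e. completes a full turn).

def rotation_number(a, b, x0=0.0, n_iter=100_000):
    """Estimate rho_{a,b} as the long-run frequency of wraps."""
    x, wraps = x0, 0
    for _ in range(n_iter):
        y = a * x + b
        if y >= 1.0:      # orbit crosses 1: one full turn around the circle
            y -= 1.0
            wraps += 1
        x = y
    return wraps / n_iter

# Example: for a = 0.5, b = 0.8 the orbit is attracted to a period-2
# cycle, so the estimate converges to 1/2.
print(rotation_number(0.5, 0.8))
```

Sweeping b over [0, 1) for fixed a is a simple way to visualize the numerical relation between a, b and ρa,b discussed in the talk.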

Quantitative multiple recurrence for two and three transformations

Event Date: Apr 10, 2017 in Dynamical Systems, Seminars

Abstract: In this talk I will provide some counterexamples for quantitative multiple recurrence problems for systems with more than one transformation. For instance, I will show that there exists an ergodic system $(X,\mathcal{X},\mu,T_1,T_2)$ with two commuting transformations such that for every $\ell < 4$ there exists $A\in \mathcal{X}$ such that \[ \mu(A\cap T_1^n A\cap T_2^n A) < \mu(A)^{\ell} \] for every $n \in \mathbb{N}$. The construction of such a system is based on the study of “big” subsets of...

Recurrences for generating polynomials

Event Date: Apr 07, 2017 in Discrete Mathematics, Seminars

Abstract: In this talk we consider sequences of polynomials that satisfy differential–difference recurrences. Our interest is motivated by the fact that polynomials satisfying such recurrences frequently appear as generating polynomials of integer-valued random variables that are of interest in discrete mathematics. It is, therefore, of interest to understand the properties of such polynomials and their probabilistic consequences. We will be primarily interested in the limiting distribution of the corresponding random variables...

Stability of Hamiltonian systems which are close to integrable: introduction to KAM and Nekhoroshev theory

Event Date: Mar 29, 2017 in Optimization and Equilibrium, Seminars

Abstract: We give a panorama of the classical theories of stability for Hamiltonian systems close to integrable, which are of two kinds: stability in measure over infinite time (KAM theory), and effective stability over finite but very long time (Nekhoroshev theory).

Limit distributions related to the Euler discretization error of Brownian motion about random times

Event Date: Mar 28, 2017 in Núcleo Modelos Estocásticos de Sistemas Complejos y Desordenados, Seminars

Abstract: In this talk we study the simulation of barrier-hitting events and extreme events of one-dimensional Brownian motion. We call “barrier-hitting event” an event where the Brownian motion hits for the first time a deterministic “barrier” function; and call “extreme event” an event where the Brownian motion attains a minimum on a given compact time interval or unbounded closed time interval. To sample these events we consider the Euler discretization approach of Brownian motion; that is, simulate the...
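As a minimal sketch of the discretization scheme named in the abstract (the constant barrier, step size and seed below are my own choices, not the speaker's), the Euler approach simulates Brownian motion on a uniform time grid and declares a barrier-hitting event at the first grid point at or past the barrier; the walk typically overshoots the barrier, which is exactly the kind of discretization error under study:

```python
# Sketch: Euler discretization of Brownian motion on a uniform grid,
# used to approximate the first hitting time of a constant barrier.

import random

def euler_brownian_hitting(barrier=1.0, dt=1e-4, t_max=10.0, seed=0):
    """Return (t, w) at the first grid point with w >= barrier, or None."""
    rng = random.Random(seed)
    w, t, sqrt_dt = 0.0, 0.0, dt ** 0.5
    while t < t_max:
        w += rng.gauss(0.0, 1.0) * sqrt_dt   # Gaussian increment ~ N(0, dt)
        t += dt
        if w >= barrier:
            return t, w     # w - barrier >= 0 is the discretization overshoot
    return None             # barrier not reached before t_max

hit = euler_brownian_hitting()
if hit is not None:
    t_hit, w_hit = hit
    print(f"hit at t={t_hit:.4f}, overshoot={w_hit - 1.0:.6f}")
```

Refining dt shrinks but never removes the overshoot; the limit distributions in the title describe this residual error.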

DASH: Deep Learning for the Automated Spectral Classification of Supernovae

Event Date: Jan 27, 2017 in Other Areas, Seminars

Abstract: We have reached a new era of ‘big data’ in astronomy, with surveys now recording an unprecedented number of spectra. In particular, new telescopes such as LSST will soon increase the spectral catalogue by a few orders of magnitude. Moreover, the Australian sector of the Dark Energy Survey (DES) is currently in the process of spectroscopically measuring several thousands of supernovae. To meet this new demand, novel approaches that are able to automate and speed up the classification process of these spectra are...

Yaglom limits can depend on the initial state

Event Date: Jan 16, 2017 in Seminars, Stochastic Modeling

Abstract: To quote the economist John Maynard Keynes: “The long run is a misleading guide to current affairs. In the long run we are all dead.” It makes more sense to study the state of an evanescent system given that it has not yet expired. For a substochastic Markov chain with kernel K on a state space S with killing, this amounts to the study of the Yaglom limit; that is, the limiting probability that the state at time n is y given that the chain has not been absorbed, i.e. lim_{n\to\infty} K^n(x,y)/K^n(x,S). We give an example...
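For a finite substochastic kernel the Yaglom limit can be computed directly from matrix powers, and by Perron–Frobenius it does not depend on the initial state x; the talk's point is that on infinite state spaces this independence can fail. A toy sketch with a 2-state kernel of my own choosing (not an example from the talk):

```python
# Sketch: Yaglom limit K^n(x, y) / K^n(x, S) for a finite
# substochastic kernel, computed via matrix powers.

import numpy as np

# Substochastic kernel on S = {0, 1}: each row sums to 0.9,
# so mass 0.1 is "killed" at every step.
K = np.array([[0.4, 0.5],
              [0.3, 0.6]])

Kn = np.linalg.matrix_power(K, 50)
for x in range(2):
    row = Kn[x]
    # Conditional law of the state at time n = 50 given survival;
    # the same limit for both x, namely the quasi-stationary
    # distribution (the normalized left Perron eigenvector of K).
    print(x, row / row.sum())
```

Here both rows converge to (0.375, 0.625), the left eigenvector of K for its top eigenvalue 0.9; the counterexamples in the talk show how this picture breaks down with infinitely many states.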

Provably efficient high dimensional feature extraction

Event Date: Dec 28, 2016 in Discrete Mathematics, Optimization and Equilibrium, Seminars

Abstract: The goal of inference is to extract information from data. A basic building block in high dimensional inference is feature extraction, that is, to compute functionals of given data that represent it in a way that highlights some underlying structure. For example, Principal Component Analysis is an algorithm that finds a basis to represent data that highlights the property of data being close to a low-dimensional subspace. A fundamental challenge in high dimensional inference is the design of algorithms that are provably efficient...
