# Schedule

## Spring 2019

Unless stated otherwise, the seminar takes place on Mondays, starting at 13:30 in Seminar Room 211, General Research Building 10 (No. 60 on the campus map).

#### April 22

The first meeting of the semester.

Let us gather, welcome the new members of the Department, and set a plan for the semester ahead.

We can play some warm-up board games to get to know each other and sharpen our cognitive skills.

#### May 20

##### Speaker: Zhe Zhang

*Discrete Mathematics Research Group*

**Topic: An Integer Programming-Based Method to Control Problems in Boolean Networks**

#### May 27

##### Speaker: Naveed Ahmed Azam

*Discrete Mathematics Research Group*

**Topic: An Improved Method for Enumerating Pairwise Compatibility Graphs with a Given Number of Vertices**

#### June 3

**Cancelled by speaker**

#### June 17

##### Speaker: Katsuki Kobayashi

*Applied Mathematical Analysis Laboratory*

**Topic: Orthogonal Polynomials and Discrete Integrable Systems**

Orthogonal polynomials and discrete integrable systems are closely related to each other.
In this talk, I will explain how to derive a discrete integrable system from orthogonal polynomials.
In addition, I will use the theory of orthogonal polynomials to relate discrete integrable systems to eigenvalue problems.
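One concrete instance of the eigenvalue connection (a small numerical sketch of my own, not material from the talk): the zeros of an orthogonal polynomial coincide with the eigenvalues of the tridiagonal Jacobi matrix built from its three-term recurrence coefficients. For the Chebyshev polynomials of the first kind, whose zeros are known in closed form, this can be checked directly:

```python
import numpy as np

n = 5
# Symmetric Jacobi matrix for Chebyshev polynomials of the first kind:
# zero diagonal; off-diagonal entries 1/sqrt(2), then 1/2 thereafter.
off = np.array([1 / np.sqrt(2)] + [0.5] * (n - 2))
J = np.diag(off, 1) + np.diag(off, -1)

# Eigenvalues of the n x n Jacobi matrix ...
eig = np.sort(np.linalg.eigvalsh(J))
# ... equal the zeros of T_n, cos((2k-1)*pi/(2n)), k = 1..n.
zeros = np.sort(np.cos((2 * np.arange(1, n + 1) - 1) * np.pi / (2 * n)))
print(np.allclose(eig, zeros))  # True
```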

#### June 24

##### Speaker: Hardik Tankaria

*System Optimization Research Group*

**Topic: Progressive Batching Stochastic Variance Reduced Gradient (SVRG) with L-BFGS Method**

The stochastic gradient descent (SGD) method and its variants have become effective tools for solving optimization problems in machine learning. One important task is to reduce the variance of the stochastic gradient and make the method stable. To reduce variance, we use the stochastic variance reduced gradient (SVRG) with a progressive-batch approach, in which the sample size increases whenever a sufficient variance-reduction condition is satisfied. We propose incorporating the limited-memory BFGS (L-BFGS) method as second-order information into SVRG, with an averaged update that satisfies the curvature condition, and we exploit the overlap between consecutive samples when forming the gradient differences used by L-BFGS. We combine this technique with the proximal gradient method, updating the iterate by a convex combination of the L-BFGS and proximal updates, with a learning rate chosen via the smoothness parameter.
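For background, the variance-reduction mechanism underlying the talk can be sketched with plain SVRG (no progressive batching, no L-BFGS) on a least-squares problem; problem sizes and step size here are my own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true                      # consistent system: minimum loss is 0

def grad_i(x, i):                   # gradient of the i-th term (1/2)(a_i.x - b_i)^2
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):                   # gradient of the averaged loss
    return A.T @ (A @ x - b) / n

x, lr = np.zeros(d), 0.01
for epoch in range(30):
    snap, mu = x.copy(), full_grad(x)   # snapshot point and its full gradient
    for _ in range(n):
        i = rng.integers(n)
        # SVRG step: stochastic gradient, recentered by the snapshot gradient
        x -= lr * (grad_i(x, i) - grad_i(snap, i) + mu)

print(np.linalg.norm(A @ x - b) ** 2)   # far below the initial loss ||b||^2
```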

#### July 1

##### Speaker: Man Zheng

*Control Systems Theory Group*

**Topic: Hyperparameters Estimation for Bayesian Positive System Identification via the EM Algorithm**

The Bayesian method has become a crucial and practical technique for identification problems with limited data. Recently, Bayesian identification has been applied to positive linear systems, whose impulse responses are constrained to be nonnegative. Previous research has shown that the truncated normal distribution is a maximum-entropy prior for positive linear systems. The estimation of the hyperparameters can then be handled by heuristic algorithms such as the genetic algorithm (GA), but it can also be solved by a non-heuristic approach such as the expectation-maximization (EM) algorithm. In this paper, we develop an EM-based optimization algorithm for hyperparameter estimation using the truncated normal distribution. We also compare simulation results between the EM algorithm and the GA; the simulations show that the EM algorithm performs more stably and precisely than the GA.
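The EM alternation for hyperparameter estimation can be illustrated on a deliberately simplified model, using a plain Gaussian prior instead of the truncated normal from the talk (all model sizes and names are my own assumptions): the E-step computes the posterior of the impulse response under the current prior, and the M-step updates the prior variance to the posterior second moment.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, sigma2, lam_true = 20, 100, 0.01, 2.0
Phi = rng.normal(size=(n, d))
g = rng.normal(scale=np.sqrt(lam_true), size=d)   # true impulse response
y = Phi @ g + rng.normal(scale=np.sqrt(sigma2), size=n)

lam = 1.0                          # initial guess for the prior variance
for _ in range(50):
    # E-step: Gaussian posterior of g under the prior N(0, lam * I)
    P = np.linalg.inv(Phi.T @ Phi / sigma2 + np.eye(d) / lam)
    m = P @ Phi.T @ y / sigma2
    # M-step: set lam to the posterior expectation of ||g||^2 / d
    lam = (m @ m + np.trace(P)) / d

print(lam)   # close to lam_true, up to sampling error in g
```

With a truncated normal prior, the E-step would instead require the moments of a truncated Gaussian posterior, which is where the talk's algorithm departs from this sketch.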

#### July 22

##### Speaker: Tomoyuki Mao

*Physical Statistics Research Group*

**Topic: Estimation of Physiological State Focused on Chaotic Properties of Heart Rate Variability**

In physiology, autonomic nervous system (ANS) activity is evaluated by analyzing heart rate variability (HRV) with statistical or frequency-domain methods. In my research, I am trying to measure the chaotic properties of HRV using the Chaos Degree. I propose an Improved Chaos Degree for computing, from data, a value equivalent to the Lyapunov exponent.
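For reference, the Lyapunov exponent that the Improved Chaos Degree aims to approximate can be computed directly when the dynamics are known; the logistic map below is a standard toy system standing in for HRV data (this sketch is mine, not the speaker's method), and at r = 4 the exact exponent is ln 2:

```python
import numpy as np

r, x = 4.0, 0.3
for _ in range(100):                 # discard the transient
    x = r * x * (1 - x)

n, total = 100_000, 0.0
for _ in range(n):
    total += np.log(abs(r * (1 - 2 * x)))   # log |f'(x)| along the orbit
    x = r * x * (1 - x)

lyap = total / n
print(lyap)   # approx ln 2 = 0.693 for r = 4
```

A positive value indicates sensitive dependence on initial conditions; the point of a data-driven estimator is to recover such a value without access to the map itself.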

#### July 29

##### Speaker: Shoya Motonaga

*Dynamical Systems Group*

**Topic: TBA**