Syllabus

The timetable can be found at this link.

The personal webpages of the lecturers are here.

The booklet, including the abstracts for the posters and the list of participants, is here.

Practical information regarding the school, as well as tips to survive in La Laguna, can be found here.

Overview -- Djorgovski 

1.- A broad introduction to astroinformatics in the context of data science
        - Transformation of science driven by computing and information technology.
        - From Virtual Observatory to Astroinformatics and beyond.
        - Methodology transfer in data science.

2.- Data visualization
        - Basics of data visualization.
        - Using color.
        - Multidimensional data visualization and Virtual Reality.
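
To make the "using color" point above concrete, here is a minimal matplotlib sketch that encodes a third and a fourth dimension of a purely synthetic dataset through color and marker size; the data and the choice of colormap are illustrative assumptions, not material from the lectures.

    # Minimal sketch: encoding extra dimensions of synthetic data with color
    # and marker size (the dataset is invented for illustration).
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    x, y = rng.normal(size=(2, 500))             # two spatial dimensions
    z = x**2 + y**2                              # third dimension -> color
    w = rng.uniform(5, 50, size=500)             # fourth dimension -> marker size

    fig, ax = plt.subplots()
    sc = ax.scatter(x, y, c=z, s=w, cmap="viridis", alpha=0.7)
    fig.colorbar(sc, ax=ax, label="third dimension (color)")
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    plt.show()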

3.- Classification challenges in time domain astronomy, part 1
        - The challenges of transient classification.
        - A brief overview of the ML methodology and feature selection.
        - Characterizing the light curves.
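
As a toy illustration of "characterizing the light curves", the sketch below computes a few simple summary statistics from a simulated light curve; the particular feature set is an assumption made for the example, not the one used in the lectures.

    # Illustrative light-curve features of the kind a classifier might use;
    # the specific statistics chosen here are assumptions.
    import numpy as np
    from scipy import stats

    def light_curve_features(mag):
        """A few simple summary statistics of a single light curve."""
        mag = np.asarray(mag, dtype=float)
        return {
            "amplitude": 0.5 * (mag.max() - mag.min()),
            "std": mag.std(),
            "skew": stats.skew(mag),
            "median_abs_dev": np.median(np.abs(mag - np.median(mag))),
            "beyond_1std": np.mean(np.abs(mag - mag.mean()) > mag.std()),
        }

    # Toy example: a noisy sinusoidal light curve.
    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0, 100, 200))
    m = 15.0 + 0.3 * np.sin(2 * np.pi * t / 3.7) + rng.normal(0, 0.05, 200)
    print(light_curve_features(m))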

4.- Classification challenges in time domain astronomy, part 2
        - Archival data mining in time domain.
        - Periodicity searches.
        - Predictive data mining.
        - Automated follow-up decision making.
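
To illustrate the periodicity-search topic in this outline, the sketch below runs a Lomb-Scargle periodogram (astropy.timeseries) on a simulated, irregularly sampled light curve; the period, sampling and noise level are invented for the example.

    # Periodicity-search sketch: Lomb-Scargle periodogram of a simulated,
    # irregularly sampled light curve (period and noise level are invented).
    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(42)
    true_period = 2.73                          # days (made up)
    t = np.sort(rng.uniform(0, 180, 300))       # irregular sampling
    y = 14.0 + 0.4 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.05, 300)

    frequency, power = LombScargle(t, y).autopower()
    best_period = 1.0 / frequency[np.argmax(power)]
    print(f"recovered period: {best_period:.3f} d (true: {true_period} d)")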

Challenges -- Juric

This series will discuss data challenges and solutions in forthcoming large astronomical surveys. It will cover the topic broadly, with additional focus on the Large Synoptic Survey Telescope (LSST), the largest ground-based survey of the next decade, and the Zwicky Transient Facility (ZTF), the closest time-domain precursor to LSST, which is operating today.


Machine Learning: Unsupervised -- Baron

  1. Introduction: the differences between supervised and unsupervised learning and their effect on our scientific methodology.
  2. Clustering algorithms
  3. Dimensionality reduction algorithms
  4. Decision Trees and Random Forests
  5. Anomaly detection and object-based retrieval algorithms.
  6. Summary: advanced concepts and algorithms, current status of the field, and open questions.
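
As a minimal illustration of the clustering and dimensionality-reduction topics above, the sketch below projects synthetic high-dimensional data onto two principal components and clusters it with k-means; the dataset, number of clusters and library (scikit-learn) are illustrative choices.

    # Dimensionality reduction + clustering with scikit-learn on synthetic data;
    # the data and parameter choices are illustrative only.
    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # Synthetic data: 3 groups in a 10-dimensional space.
    X, _ = make_blobs(n_samples=600, n_features=10, centers=3, random_state=0)

    # Dimensionality reduction: project onto the first two principal components.
    X2 = PCA(n_components=2).fit_transform(X)

    # Clustering: k-means in the reduced space (k=3 is assumed known here).
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X2)
    print(np.bincount(labels))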

Machine Learning: Supervised -- Biehl

My lectures will comprise four main chapters (not necessarily aligned with the scheduled hours). After a brief general introduction (chapter A), the core of the course introduces selected "classical" and some more "modern" systems for classification and regression (chapters B and C). Basic architectures and concepts will be presented, and key training strategies and methods will be discussed in greater detail for some of them. Chapter D covers general aspects of supervised learning, such as the bias/variance dilemma and the risk of overfitting and its control.


A)  Brief Introduction
          - classification, regression, and other types of supervised learning
          - machine learning and statistical modelling
B)  Classification
          - Perceptron and Support Vector Machine
          - Distance-based classification:
              K-Nearest-Neighbor, Learning Vector Quantization
              adaptive distance measures and relevance learning
C)  Regression
          - classical regression methods (linear & logistic regression)
          - (shallow) feedforward neural networks, alternative architectures
            e.g. Radial Basis Function Networks, Extreme Learning Machines
          - classification based on regression systems
D) Validation and Regularization
          - evaluation schemes and performance measures
          - bias and variance dilemma, over- and under-fitting
          - regularization techniques
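
A minimal sketch connecting chapters B and D: distance-based classification with k-nearest neighbours, evaluated by cross-validation for several values of k (small k tends toward overfitting, large k toward underfitting). The dataset and the range of k are illustrative assumptions.

    # k-nearest-neighbour classification with 5-fold cross-validation;
    # dataset and k values are illustrative choices.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)

    for k in (1, 5, 15, 51):
        clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=k))
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"k={k:3d}  mean accuracy = {scores.mean():.3f}")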

Tutorials: In the tutorials, students will be asked to implement (code) example algorithms and apply them to toy or benchmark data sets. In addition, ready-made machine learning toolboxes will be explored and used.
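
As an example of the kind of from-scratch exercise described above, here is a minimal perceptron learning rule applied to a linearly separable toy dataset; the data, learning rate and number of epochs are illustrative choices.

    # From-scratch perceptron on a linearly separable toy dataset.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)    # labels in {-1, +1}

    w, b, eta = np.zeros(2), 0.0, 0.1             # weights, bias, learning rate
    for epoch in range(20):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:            # misclassified -> update
                w += eta * yi * xi
                b += eta * yi
                errors += 1
        if errors == 0:                           # converged on separable data
            break

    accuracy = np.mean(np.sign(X @ w + b) == y)
    print(f"epochs: {epoch + 1}, training accuracy: {accuracy:.3f}")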


Deep Learning -- Huertas-Company


1.- Basics of "Shallow" Neural Networks

  • Perceptron, neuron definition
  • Layer of neurons, hidden layers
  • Activation Functions
  • Optimization (gradient descent)
  • Backpropagation
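
A minimal numpy sketch of the ideas in this first part: a single hidden layer with a nonlinear activation, trained by gradient descent with backpropagation on the XOR toy problem. The architecture, learning rate and number of iterations are illustrative choices.

    # One-hidden-layer network trained by gradient descent + backpropagation
    # on XOR; architecture and hyperparameters are illustrative.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input -> hidden
    W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output
    lr = 0.5

    for step in range(5000):
        # Forward pass
        h = np.tanh(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # Backward pass (cross-entropy loss with sigmoid output)
        d_out = (out - y) / len(X)
        d_h = (d_out @ W2.T) * (1.0 - h**2)   # tanh derivative

        # Gradient-descent updates
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0, keepdims=True)

    print(out.round(2).ravel())   # should approach [0, 1, 1, 0]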

2.- Deep Learning: convolutional neural networks for classification and regression

  • Convolutions as neurons
  • CNNs (pooling, dropout)
  • Vanishing Gradient / Batch normalization
  • Data Augmentation
  • Transfer Learning
  • CNN as feature extractor for astronomy
  • Optimizing your net: hyperparameter search
  • Visualizing CNNs (deconvnets, inceptionism, integrated gradients)
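
For concreteness, a minimal tf.keras sketch of a small CNN with several of the ingredients listed above (convolutions, pooling, batch normalization, dropout, and simple augmentation layers, which require a recent TensorFlow 2.x). The input shape, number of classes and layer sizes are placeholders, not a recommended architecture.

    # Minimal CNN classifier sketch in tf.keras; shapes and sizes are placeholders.
    from tensorflow.keras import layers, models

    num_classes = 5            # placeholder: e.g. 5 morphological classes
    input_shape = (64, 64, 1)  # placeholder: 64x64 single-band cutouts

    model = models.Sequential([
        layers.Input(shape=input_shape),
        # Simple data augmentation (only active during training).
        layers.RandomFlip("horizontal_and_vertical"),
        layers.RandomRotation(0.1),
        # Convolutional feature extractor.
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        # Classifier head.
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()
    # Training would then be model.fit(train_images, train_labels, ...),
    # with a validation set to monitor overfitting.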

3.- CNNs as Image Generators

  • Variational Autoencoders
  • Generative Adversarial Networks
  • U-nets
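
For orientation only, a structural tf.keras sketch of the two networks that make up a GAN: a generator mapping a latent noise vector to an image, and a discriminator classifying images as real or generated. Shapes and layer sizes are placeholders, and the adversarial training loop is omitted.

    # Structural sketch of a GAN's two networks; shapes are placeholders and
    # the adversarial training loop is not shown.
    from tensorflow.keras import layers, models

    latent_dim = 64            # placeholder size of the noise vector
    image_shape = (32, 32, 1)  # placeholder image size

    generator = models.Sequential([
        layers.Input(shape=(latent_dim,)),
        layers.Dense(8 * 8 * 64, activation="relu"),
        layers.Reshape((8, 8, 64)),
        layers.Conv2DTranspose(64, 4, strides=2, padding="same", activation="relu"),
        layers.Conv2DTranspose(32, 4, strides=2, padding="same", activation="relu"),
        layers.Conv2D(1, 3, padding="same", activation="sigmoid"),  # 32x32x1 image
    ], name="generator")

    discriminator = models.Sequential([
        layers.Input(shape=image_shape),
        layers.Conv2D(32, 3, strides=2, padding="same", activation="relu"),
        layers.Conv2D(64, 3, strides=2, padding="same", activation="relu"),
        layers.Flatten(),
        layers.Dense(1, activation="sigmoid"),   # probability "real"
    ], name="discriminator")

    generator.summary()
    discriminator.summary()
    # In training, the two are optimized adversarially: the discriminator to
    # separate real from generated images, the generator to fool it.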