Research
Research Projects
- Towards Cognitive Control: Transfer Learning for Robust Steering of Myoelectric Devices (CITEC)
- Relevance learning for temporal neural maps (DFG) (RLNM)
- Prototype-based learning for large and multimodal data sets (CITEC) (PLM)
- Discriminative Dimensionality Reduction (DFG) (DIDI)
- Learning Feedback in Intelligent Tutoring Systems (DFG-SPP) (FIT)
Source Code
Unless stated otherwise, all code packages are implemented in Matlab (R).
- related to prototype-based learning:
- Python Generalized Learning Vector Quantization Toolbox: This Python 3 toolbox provides a scikit-learn compatible implementation of the most common Generalized Learning Vector Quantization (GLVQ) algorithms, namely classic GLVQ, relevance GLVQ, Generalized Matrix LVQ (GMLVQ), local GMLVQ (LGMLVQ), and limited rank GMLVQ (LiRaMGLVQ). All implementations use the fast limited-memory BFGS solver of scipy to optimize the cost function. The toolbox can be conveniently installed as a pip package via "pip install sklearn-glvq". The source code is also available on GitHub. A minimal usage sketch is shown below this list.
- Linear Supervised Transfer Learning Toolbox: This Matlab (R) toolbox (a Python 3 version is available here) provides several algorithms to learn a linear mapping from an n-dimensional source space to an m-dimensional target space, such that a classification or clustering model trained in the source space becomes applicable in the target space. The source space model is assumed to be either a vector quantization model (such as learning vector quantization and variations thereof, neural gas, or k-Means) or a (labelled) mixture of Gaussians. If you use this code please cite: Expectation maximization transfer learning and its application for bionic hand prostheses. A simplified sketch of the underlying idea is shown below this list.
- Relevance/Matrix LVQ Toolbox: This toolbox is joint work with Prof. M. Biehl at the University of Groningen and contains standard Generalized LVQ (GLVQ), GLVQ with relevance/matrix learning, and more. More details can be found here.
- proto-dist-ml: This Python 3 package implements prototype-based methods for distance data, namely relational neural gas, relational generalized learning vector quantization, and median generalized learning vector quantization (see below for more information on these methods). All models are scikit-learn compatible. Installation is possible via "pip3 install proto-dist-ml".
- Relational Neural Gas: This is a Java 7, fully Matlab (R)-compatible implementation of relational neural gas as suggested by Hammer and Hasenfuss (2007). Relational neural gas is a clustering algorithm which softly assigns data points to prototypes based on their distance ranking. In most cases, relational neural gas is considerably more robust than K-Means and also provides a good initialization heuristic for other prototype-based models, such as median relational generalized LVQ or relational generalized LVQ. A minimal numpy sketch of the relational distance computation is shown below this list.
- Median Generalized LVQ for Distance Data: This is a Java 7, fully Matlab (R)-compatible implementation of median generalized learning vector quantization (MGLVQ) for distance and dissimilarity data as proposed by Nebel, Hammer, Frohberg, and Villmann (2015). Median versions of LVQ use data points as prototypes, that is, each prototype corresponds exactly to a data point from the training data. This particular implementation of median LVQ takes distances or dissimilarities as input, which is why we call it relational.
- Relational Generalized LVQ: This code implements relational generalized learning vector quantization, which can directly deal with arbitrary symmetric dissimilarity data. If you use this code please cite: Learning vector quantization for (dis-)similarities. This code was part of the best paper contribution by B. Mokbel et al. at ESANN 2014.
- Fast Soft Competitive Learning: This code contains batch relational neural gas. If you use this code please cite: Fast approximated relational and kernel clustering.
- Core Soft Competitive Learning: This code contains some basic code snippets for core soft competitive learning. If you use this code please cite: Soft Competitive Learning for large data sets.
- Kernelized Generalized LVQ: This code contains a kernelized version of GLVQ. If you use this code please cite: Efficient Kernelized Prototype-based Classification.
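The following is a minimal usage sketch for the Python GLVQ toolbox on the iris data set. The import path and estimator name (glvq.GlvqModel) are assumptions based on the package's scikit-learn compatible design; please check the documentation of your installed version.

```python
# Minimal GLVQ usage sketch; glvq.GlvqModel is an assumed import/estimator name.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

from glvq import GlvqModel  # installed via "pip install sklearn-glvq"

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GlvqModel(prototypes_per_class=1)  # classic GLVQ
model.fit(X_train, y_train)                # optimizes the GLVQ cost via L-BFGS
print("test accuracy:", model.score(X_test, y_test))
```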
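The next sketch illustrates the core idea behind linear supervised transfer learning: learn a linear map W such that a few labeled target samples land close to the prototypes of an LVQ model trained in the source space. This is a simplified least-squares variant for illustration only, not the toolbox's expectation maximization algorithm, and it assumes one prototype per class.

```python
# Simplified linear transfer learning sketch (least squares, one prototype per class).
import numpy as np

def learn_linear_transfer(X_target, y_target, prototypes, proto_labels):
    """Returns W such that X_target @ W.T approximates the class prototypes."""
    # regression target for each sample: the source prototype of its class
    targets = np.stack([prototypes[proto_labels == y][0] for y in y_target])
    # least-squares solution of X_target @ A = targets, with W = A.T
    A, *_ = np.linalg.lstsq(X_target, targets, rcond=None)
    return A.T

# Usage: map new target data into the source space and classify with the
# unchanged source model, e.g. y_pred = source_model.predict(X_new @ W.T).
```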
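Finally, here is a minimal batch relational neural gas sketch in numpy, following the idea of Hammer and Hasenfuss (2007): prototypes are represented as convex combinations of the data points, and prototype-to-point distances are computed directly from the matrix D of (squared) pairwise dissimilarities. This is for illustration only; use the toolbox implementations above for serious work.

```python
# Batch relational neural gas sketch; D is a squared dissimilarity matrix.
import numpy as np

def relational_neural_gas(D, n_prototypes=3, n_iter=50, seed=0):
    n = D.shape[0]
    rng = np.random.default_rng(seed)
    alpha = rng.random((n_prototypes, n))
    alpha /= alpha.sum(axis=1, keepdims=True)        # convex coefficients per prototype
    for t in range(n_iter):
        # anneal the neighborhood range from K/2 down to a small value
        lam0, lam_end = n_prototypes / 2.0, 0.01
        lam = lam0 * (lam_end / lam0) ** (t / max(n_iter - 1, 1))
        # relational distance: d(w_k, x_i) = (alpha_k D)_i - 1/2 alpha_k D alpha_k^T
        AD = alpha @ D
        dist = AD - 0.5 * np.sum(AD * alpha, axis=1, keepdims=True)
        ranks = np.argsort(np.argsort(dist, axis=0), axis=0)  # prototype rank per point
        h = np.exp(-ranks / lam)                     # soft assignment by distance rank
        alpha = h / h.sum(axis=1, keepdims=True)     # batch update of the coefficients
    return alpha, dist.argmin(axis=0)                # coefficients and crisp cluster labels
```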
- related to dissimilarity representations:
- edist: This Python 3 library provides edit and alignment distances between sequences and trees. It implements the Levenshtein distance, dynamic time warping, the affine edit distance, the tree edit distance, and more. The library also fully supports backtracing, parallel computation, and metric learning. Critical routines are implemented in Cython, yielding fast runtimes. Installation is possible via "pip3 install edist". A short usage sketch is shown below this list.
- TCS Alignment Toolbox: This Java toolbox provides several algorithms to align two input sequences, where the sequential data may be multimodal and multidimensional. We also provide additional tools to inspect the alignment results in more detail, or even to calculate derivatives of the alignment with respect to metric parameters, such that you can optimize the alignment parameters according to some cost function. It is written in Java 1.7 and is also compatible with Matlab (version 2013b or higher). If you use this code please cite: A Toolbox for Adaptive Sequence Dissimilarity Measures for Intelligent Tutoring Systems.
- Time Series Prediction for Relational and Kernel Data: This Matlab (R) toolbox provides algorithms to predict the future location of some object in a kernel or distance embedding space. This makes it possible to apply time series prediction to non-vectorial data, such as sequences, trees, and graphs. The inputs for this toolbox are time series of relational or kernel data, given as distance or kernel matrices and successor mappings. The outputs are affine coefficients over training data points, which can be used to locate the predicted point relative to the training data or new data, and to apply other relational or kernel-based approaches to the predicted point. In more detail, this toolbox implements kernel regression (Nadaraya-Watson regression), Gaussian processes, and the robust Bayesian committee machine, and provides a demo script demonstrating its functionality. If you use this code please cite: Time Series Prediction for Graphs in Kernel and Dissimilarity Spaces. A minimal Nadaraya-Watson sketch is shown below this list.
- Nyström Toolbox: This toolbox can be used to approximate symmetric (dis)similarity matrices. It supports different manipulations of the matrices in linear time, such as transformations between dissimilarities and similarities, eigenvalue decomposition, and corrections that turn the similarities into positive semi-definite kernel matrices. If you use this code please cite: Metric and non-metric proximity transformations at linear costs. A minimal sketch of the Nyström approximation is shown below this list.
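The following is a short usage sketch for edist. The function names standard_sed and standard_ted, as well as the node-list/adjacency-list tree format, are assumptions based on the library's documentation; please check the docs of your installed version.

```python
# Short edist usage sketch; function names are assumed from the docs.
import edist.sed as sed
import edist.ted as ted

# Levenshtein distance between two sequences (here: strings as symbol lists)
print(sed.standard_sed(list("kitten"), list("sitting")))  # -> 3

# tree edit distance; trees are given as a node list plus an adjacency list
x_nodes = ["a", "b", "c"]; x_adj = [[1, 2], [], []]   # a(b, c)
y_nodes = ["a", "c"];      y_adj = [[1], []]          # a(c)
print(ted.standard_ted(x_nodes, x_adj, y_nodes, y_adj))  # -> 1 (delete b)
```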
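The next sketch shows the Nadaraya-Watson idea behind relational time series prediction: the predicted successor is expressed as a convex combination of training successors, with weights given by a Gaussian kernel on the dissimilarities to the training points. This is a minimal illustration of the concept, not the toolbox's code.

```python
# Nadaraya-Watson coefficients for prediction in a dissimilarity space.
import numpy as np

def predict_coefficients(d_to_training, bandwidth=1.0):
    """d_to_training: dissimilarities from the current point to all training points.
    Returns convex coefficients over the training successors."""
    w = np.exp(-d_to_training**2 / (2 * bandwidth**2))
    return w / w.sum()

# Usage: alpha = predict_coefficients(d); the predicted point is
# sum_i alpha[i] * successor_i, interpretable in any kernel/dissimilarity embedding.
```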
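Finally, here is a minimal sketch of the Nyström approximation itself: a symmetric similarity matrix K is approximated from a random subset of m landmark columns as K[:, L] @ pinv(K[L, L]) @ K[L, :], which reduces storage and enables linear-time downstream manipulations. The landmark selection here is plain random sampling, for illustration only.

```python
# Rank-m Nyström approximation of a symmetric similarity matrix K.
import numpy as np

def nystroem(K, m=50, seed=0):
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    L = rng.choice(n, size=m, replace=False)       # landmark indices
    C = K[:, L]                                    # n x m column slice
    W_pinv = np.linalg.pinv(K[np.ix_(L, L)])       # pseudo-inverse of the m x m block
    return C @ W_pinv @ C.T                        # rank-m approximation of K
```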
- related to dimensionality reduction:
- Classifier Visualization Toolbox: This Matlab toolbox enables the user to visualize a trained high-dimensional classification model in two dimensions. The approach is general, such that any classification model which provides some kind of certainty estimate can be visualized this way. If you use this code, please cite Using Discriminative Dimensionality Reduction to Visualize Classifiers.
- Kernel Mapping Toolbox: This toolbox implements a general dimensionality reduction framework that allows training a kernel mapping on a small training data set and then extending this mapping to new data in linear time. It also makes use of the Fisher information to efficiently generate discriminative dimensionality reductions. If you use this code please cite: Parametric nonlinear dimensionality reduction using kernel t-SNE, Neurocomputing 2015. A minimal sketch of the kernel mapping idea is shown below this list.
- Quality for Dimensionality Reduction: This Matlab code package demonstrates different schemes to evaluate the quality or reliability of a low-dimensional embedding of high-dimensional data. The quantitative evaluation is entirely unsupervised and enables a coarse-grained overall assessment of the embedding as well as a detailed visual inspection on a point-wise basis. If you use this code, please cite: Visualizing the quality of dimensionality reduction. A minimal sketch using the trustworthiness measure is shown below this list.
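The following sketch illustrates the kernel mapping idea behind kernel t-SNE: train t-SNE on a small subset, then fit a normalized Gaussian kernel regression from the subset to its embedding, which extends the mapping to new data in linear time. The median-distance bandwidth is a crude heuristic introduced here for illustration, not the toolbox's parameter choice.

```python
# Kernel t-SNE style out-of-sample mapping sketch.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.manifold import TSNE

def fit_kernel_map(X_train, sigma=None, random_state=0):
    Y_train = TSNE(n_components=2, random_state=random_state).fit_transform(X_train)
    D = cdist(X_train, X_train)
    sigma = sigma or np.median(D)                   # heuristic bandwidth (assumption)
    K = np.exp(-D**2 / (2 * sigma**2))
    K /= K.sum(axis=1, keepdims=True)               # normalized kernel rows
    A, *_ = np.linalg.lstsq(K, Y_train, rcond=None) # fit mapping coefficients
    return A, X_train, sigma

def apply_kernel_map(X_new, A, X_train, sigma):
    K = np.exp(-cdist(X_new, X_train)**2 / (2 * sigma**2))
    K /= K.sum(axis=1, keepdims=True)
    return K @ A                                    # 2D embedding of new points
```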
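As a minimal example of one quantitative quality measure for embeddings, the sketch below computes trustworthiness, which checks how many low-dimensional neighbors are also high-dimensional neighbors (1.0 is best). It uses scikit-learn's implementation and covers only one of the schemes in the toolbox.

```python
# Compare the trustworthiness of two embeddings of the digits data.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE, trustworthiness

X, _ = load_digits(return_X_y=True)
for name, Y in [("PCA", PCA(n_components=2).fit_transform(X)),
                ("t-SNE", TSNE(n_components=2, random_state=0).fit_transform(X))]:
    print(name, trustworthiness(X, Y, n_neighbors=12))
```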
For further information on the activities of the group please have a look at the publications and the team member pages.