About the Lab
The MLDS lab focused on developing machine learning models and algorithms for challenging problems in computational social science, computational ecology, computational behavioral science, and computational medicine.
The MLDS lab’s research continues in multiple labs within the College of Information and Computer Sciences, including the REML lab, the SLANG lab, and Prof. Sheldon’s research group.
Publications
2017
Rostaminia, Soha; Mayberry, Addison; Ganesan, Deepak; Marlin, Benjamin; Gummeson, Jeremy: iLid: Low-power Sensing of Fatigue and Drowsiness Measures on a Computational Eyeglass. Journal Article. In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 1, no. 2, pp. 23, 2017.
Abstract: The ability to monitor eye closures and blink patterns has long been known to enable accurate assessment of fatigue and drowsiness in individuals. Many measures of the eye are known to be correlated with fatigue, including coarse-grained measures like the rate of blinks as well as fine-grained measures like the duration of blinks and the extent of eye closures. Despite a plethora of research validating these measures, we lack wearable devices that can continually and reliably monitor them in the natural environment. In this work, we present a low-power system, iLid, that can continually sense fine-grained measures such as blink duration and Percentage of Eye Closures (PERCLOS) at high frame rates of 100fps. We present a complete solution including design of the sensing, signal processing, and machine learning pipeline; implementation on a prototype computational eyeglass platform; and extensive evaluation under many conditions including illumination changes, eyeglass shifts, and mobility. Our results are very encouraging, showing that we can detect blinks, blink duration, eyelid location, and fatigue-related metrics such as PERCLOS with less than a few percent error.
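The PERCLOS metric mentioned in the abstract is conventionally the fraction of time the eyelid covers at least 80% of the pupil (the "P80" definition). The following is a minimal illustrative sketch of computing PERCLOS and blink durations from a per-frame eye-openness signal, not the paper's implementation; the 0.2 openness threshold and the 100fps default (taken from the abstract's frame rate) are assumptions.

```python
import numpy as np

def perclos(openness, threshold=0.2):
    """Fraction of frames in which the eye is at least 80% closed.

    openness: per-frame eye-openness values in [0, 1] (1.0 = fully
    open). Frames with openness <= threshold count as 'closed'
    under the standard P80 definition.
    """
    openness = np.asarray(openness, dtype=float)
    return float(np.mean(openness <= threshold))

def blink_durations(openness, fps=100, threshold=0.2):
    """Durations (in seconds) of contiguous closed-eye runs."""
    closed = np.asarray(openness, dtype=float) <= threshold
    # Find run boundaries by diffing the zero-padded boolean signal.
    edges = np.diff(np.concatenate(([0], closed.astype(int), [0])))
    starts = np.flatnonzero(edges == 1)
    ends = np.flatnonzero(edges == -1)
    return (ends - starts) / fps
```

For example, a 5-frame signal with 2 closed frames yields a PERCLOS of 0.4.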
Adams, Roy J.; Marlin, Benjamin M.: Learning Time Series Detection Models from Temporally Imprecise Labels. Conference. In: The 20th International Conference on Artificial Intelligence and Statistics, 2017.
Abstract: In this paper, we consider a new low-quality label learning problem: learning time series detection models from temporally imprecise labels. In this problem, the data consist of a set of input time series, and supervision is provided by a sequence of noisy time stamps corresponding to the occurrence of positive class events. Such temporally imprecise labels commonly occur in areas like mobile health research where human annotators are tasked with labeling the occurrence of very short duration events. We propose a general learning framework for this problem that can accommodate different base classifiers and noise models. We present results on real mobile health data showing that the proposed framework significantly outperforms a number of alternatives including assuming that the label time stamps are noise-free, transforming the problem into the multiple instance learning framework, and learning on labels that were manually re-aligned.
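One of the baselines the abstract contrasts against is transforming the problem into the multiple instance learning framework. A minimal sketch of that transformation, purely for illustration: each noisy timestamp is expanded into a "bag" of candidate frames, under the assumption that the true event lies somewhere within a tolerance window (the `radius` parameter here is hypothetical, not from the paper).

```python
import numpy as np

def bags_from_noisy_timestamps(timestamps, n_frames, radius):
    """Multiple-instance-style relabeling for imprecise labels.

    Each noisy event timestamp is expanded into a positive 'bag'
    of candidate frame indices within +/- radius; at least one
    frame in the bag is presumed to contain the true event.
    """
    bags = []
    for t in timestamps:
        lo, hi = max(0, t - radius), min(n_frames, t + radius + 1)
        bags.append(np.arange(lo, hi))
    return bags
```

A bag-level classifier would then be trained so that each positive bag contains at least one positively scored frame.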
Dadkhahi, Hamid; Marlin, Benjamin: Learning Tree-Structured Detection Cascades for Heterogeneous Networks of Embedded Devices. Proceedings, 2017. (To appear.)
Abstract: In this paper, we present a new approach to learning cascaded classifiers for use in computing environments that involve networks of heterogeneous and resource-constrained, low-power embedded compute and sensing nodes. We present a generalization of the classical linear detection cascade to the case of tree-structured cascades where different branches of the tree execute on different physical compute nodes in the network. Different nodes have access to different features, as well as access to potentially different computation and energy resources. We concentrate on the problem of jointly learning the parameters for all of the classifiers in the cascade given a fixed cascade architecture and a known set of costs required to carry out the computation at each node. To accomplish the objective of joint learning of all detectors, we propose a novel approach to combining classifier outputs during training that better matches the hard cascade setting in which the learned system will be deployed. This work is motivated by research in the area of mobile health where energy efficient real time detectors integrating information from multiple wireless on-body sensors and a smart phone are needed for real-time monitoring and the delivery of just-in-time adaptive interventions. We evaluate our framework on mobile sensor-based human activity recognition and mobile health detector learning problems.
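The "hard cascade" deployment setting described above can be illustrated with a toy sketch of a linear chain of detector stages; the paper generalizes this to trees whose branches run on different physical nodes. The classifier functions and thresholds below are hypothetical stand-ins, not the learned detectors from the paper.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class CascadeNode:
    """One compute node in a detection cascade.

    classifier returns a real-valued score from this node's
    feature subset; if the score clears the threshold, the
    example is forwarded to the child node (e.g. from a wrist
    sensor to the phone), otherwise the cascade rejects early,
    saving the downstream node's computation and energy cost.
    """
    classifier: Callable[[dict], float]
    threshold: float
    child: Optional["CascadeNode"] = None

def run_cascade(root, features):
    """Hard-cascade evaluation: early rejection, final acceptance."""
    node = root
    while node is not None:
        if node.classifier(features) < node.threshold:
            return False   # rejected early at this node
        node = node.child
    return True            # survived every stage: detection fires
```

For example, a cheap accelerometer check on a wrist node can gate a more expensive gyroscope-based check on the phone.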
Dadkhahi, Hamid; Duarte, Marco F.; Marlin, Benjamin M.: Out-of-Sample Extension for Dimensionality Reduction of Noisy Time Series. Journal Article. In: IEEE Transactions on Image Processing, vol. 26, no. 11, pp. 5435–5446, 2017.
Abstract: This paper proposes an out-of-sample extension framework for a global manifold learning algorithm (Isomap) that uses temporal information in out-of-sample points in order to make the embedding more robust to noise and artifacts. Given a set of noise-free training data and its embedding, the proposed framework extends the embedding for a noisy time series. This is achieved by adding a spatio-temporal compactness term to the optimization objective of the embedding. To the best of our knowledge, this is the first method for out-of-sample extension of manifold embeddings that leverages timing information available for the extension set. Experimental results demonstrate that our out-of-sample extension algorithm renders a more robust and accurate embedding of sequentially ordered image data in the presence of various noise and artifacts when compared with other timing-aware embeddings. Additionally, we show that an out-of-sample extension framework based on the proposed algorithm outperforms the state of the art in eye-gaze estimation.
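The core idea of trading spatial fidelity against temporal smoothness can be illustrated with a much simpler stand-in for the paper's spatio-temporal compactness term: given noisy per-frame embedding coordinates X, solve min_Y Σ_t ||y_t − x_t||² + λ Σ_t ||y_t − y_{t−1}||², which reduces to the linear system (I + λL)Y = X with L the path-graph Laplacian. This sketch is not the published Isomap extension, only an illustration of temporal regularization.

```python
import numpy as np

def temporally_smoothed_embedding(raw_embedding, lam=1.0):
    """Temporally regularize a per-frame embedding of a time series.

    Solves min_Y sum_t ||y_t - x_t||^2 + lam * sum_t ||y_t - y_{t-1}||^2
    in closed form as (I + lam * L) Y = X, where L is the Laplacian of
    the path graph linking consecutive frames. lam = 0 recovers the
    raw embedding; larger lam pulls consecutive frames together.
    """
    X = np.asarray(raw_embedding, dtype=float)
    T = X.shape[0]
    L = np.zeros((T, T))
    for t in range(T - 1):  # path-graph Laplacian over frames
        L[t, t] += 1.0
        L[t + 1, t + 1] += 1.0
        L[t, t + 1] -= 1.0
        L[t + 1, t] -= 1.0
    return np.linalg.solve(np.eye(T) + lam * L, X)
```

Note that a temporally constant sequence is a fixed point of the smoother for any λ, since the Laplacian annihilates constants.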
2016
Bernstein, Garrett; Sheldon, Daniel R.: Consistently Estimating Markov Chains with Noisy Aggregate Data. Conference. In: AISTATS, Cadiz, Spain, 2016.