Publications

In recent years, building problem-specific deep networks by unrolling iterative algorithms has achieved great success, for …

Solving continuous minimax optimization is of extensive practical interest, yet notoriously unstable and difficult. This paper …
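
As background on why this setting is hard, below is a minimal sketch of simultaneous gradient descent-ascent on the toy bilinear game f(x, y) = xy; the function name, step size, and iteration count are illustrative choices rather than anything from the paper. On this game the iterates spiral away from the saddle point at the origin, which is exactly the kind of instability the abstract alludes to.

```python
# Toy bilinear minimax problem: min_x max_y f(x, y) = x * y.
# Step size and iteration count are arbitrary, illustrative choices.
def gradient_descent_ascent(x0=1.0, y0=1.0, lr=0.1, steps=200):
    x, y = x0, y0
    for _ in range(steps):
        gx, gy = y, x      # df/dx = y, df/dy = x (evaluated simultaneously)
        x -= lr * gx       # descent step on x
        y += lr * gy       # ascent step on y
    return x, y

# The iterates spiral outward instead of converging to the saddle point (0, 0):
# each step multiplies x**2 + y**2 by the factor (1 + lr**2).
print(gradient_descent_ascent())
```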

The record-breaking performance of deep neural networks (DNNs) comes with heavy parameterization, leading to external dynamic …

Deep, heavily overparameterized language models such as BERT, XLNet and T5 have achieved impressive success in many NLP tasks. However, …

Multiplication (e.g., convolution) is arguably a cornerstone of modern deep neural networks (DNNs). However, intensive multiplications …
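
Purely as an illustration of the underlying idea (not this paper's scheme), the sketch below shows how multiplying by a power of two reduces to a bit shift, and how a general multiplication can be approximated by first rounding one operand to the nearest power of two; both function names are made up for this example.

```python
import numpy as np

def shift_mul(x, k):
    """Multiply an integer x by 2**k with a bit shift instead of a multiplier."""
    return x << k  # same result as x * (2 ** k)

def approx_mul(x, w):
    """Approximate x * w by rounding w to the nearest power of two, so the
    product reduces to a sign flip plus a shift-like rescaling.
    Illustrative only; not the quantization scheme of any specific paper."""
    sign = np.sign(w)
    k = np.round(np.log2(np.abs(w) + 1e-12))
    return sign * x * 2.0 ** k

print(shift_mul(5, 3))        # 40, same as 5 * 8
print(approx_mul(3.0, 0.30))  # 0.75, since 0.30 rounds to 2**-2 = 0.25
```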

Meta-learning improves the generalization of machine learning models to previously unseen tasks by leveraging experiences from …
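
For readers unfamiliar with the inner/outer-loop structure that most gradient-based meta-learning shares, here is a minimal first-order MAML-style sketch on a toy family of 1-D regression tasks; the task distribution, learning rates, and step counts are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    """Each task is a 1-D regression problem y = a * x with its own slope a."""
    a = rng.uniform(-2.0, 2.0)
    x = rng.uniform(-1.0, 1.0, size=16)
    return x, a * x

def mse_grad(theta, x, y):
    """Gradient of the mean squared error for the linear model y_hat = theta * x."""
    return 2.0 * np.mean((theta * x - y) * x)

theta = 0.0                     # shared initialization learned across tasks
inner_lr, outer_lr = 0.1, 0.01

for _ in range(2000):
    x, y = sample_task()
    # Inner loop: one gradient step adapts theta to the sampled task.
    theta_adapted = theta - inner_lr * mse_grad(theta, x, y)
    # Outer loop (first-order approximation): move the shared initialization
    # in the direction that helps the adapted parameters.
    theta -= outer_lr * mse_grad(theta_adapted, x, y)
```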

Many applications require repeatedly solving a certain type of optimization problem, each time with new (but similar) data. Data-driven …
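
A toy illustration of that "repeated, similar instances" setting (not the data-driven approach the abstract goes on to describe): simply warm-starting a classical solver from the previous instance's solution already cuts the iteration count, and learned methods aim to exploit such shared structure far more aggressively. The problem sizes and tolerances below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))

def solve_gd(b, x0, tol=1e-6, max_iters=10_000):
    """Gradient descent for min_x 0.5 * ||A x - b||^2, started from x0."""
    lr = 1.0 / np.linalg.norm(A, 2) ** 2
    x = x0.copy()
    for i in range(max_iters):
        g = A.T @ (A @ x - b)
        if np.linalg.norm(g) < tol:
            return x, i
        x -= lr * g
    return x, max_iters

# A stream of similar instances: each right-hand side is a small
# perturbation of the previous one.
b = rng.standard_normal(50)
x_prev = np.zeros(20)
for t in range(5):
    b = b + 0.05 * rng.standard_normal(50)
    x_cold, iters_cold = solve_gd(b, np.zeros(20))
    x_warm, iters_warm = solve_gd(b, x_prev)   # reuse the previous solution
    x_prev = x_warm
    print(f"instance {t}: cold start {iters_cold} iters, warm start {iters_warm} iters")
```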

Activity recognition in wearable computing faces two key challenges: i) activity characteristics may be context-dependent and change …

We present SmartExchange, a hardware-algorithm co-design framework to trade higher-cost memory storage/access for lower-cost …

(Frankle & Carbin, 2019) shows that there exist winning tickets (small but critical subnetworks) for dense, randomly initialized …

Convolutional neural networks (CNNs) have been increasingly deployed to Internet of Things (IoT) devices. Hence, many efforts have been …

Deep neural networks based on unfolding an iterative algorithm, for example, LISTA (learned iterative shrinkage thresholding …
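
As background, the sketch below contrasts the classical ISTA iteration for the LASSO problem with a LISTA-style unrolled forward pass in which each layer carries learned weights and thresholds; the end-to-end training that would fit (W1, W2, theta) is omitted, and all dimensions are assumed.

```python
import numpy as np

def soft_threshold(v, theta):
    """Soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def ista(A, b, lam, steps=100):
    """Classical ISTA for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x = soft_threshold(x - (A.T @ (A @ x - b)) / L, lam / L)
    return x

def lista_forward(b, layers):
    """Unrolled forward pass: each layer's learned (W1, W2, theta) plays the role
    of (I - A^T A / L), A^T / L, and lam / L in the ISTA update above.
    The weights are assumed to come from training, which is not shown here."""
    x = np.zeros(layers[0][0].shape[0])
    for W1, W2, theta in layers:
        x = soft_threshold(W1 @ x + W2 @ b, theta)
    return x
```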

In recent years, unfolding iterative algorithms as neural networks has achieved empirical success in solving sparse recovery problems. …