LR-CSNet: Low-Rank Deep Unfolding Network for Image Compressive Sensing

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Standard

LR-CSNet: Low-Rank Deep Unfolding Network for Image Compressive Sensing. / Zhang, Tianfang; Li, Lei; Igel, Christian; Oehmcke, Stefan; Gieseke, Fabian; Peng, Zhenming.

2022 IEEE International Conference on Computer and Communications (ICCC), Chengdu, China. IEEE, 2023. pp. 1951-1957.


Harvard

Zhang, T, Li, L, Igel, C, Oehmcke, S, Gieseke, F & Peng, Z 2023, LR-CSNet: Low-Rank Deep Unfolding Network for Image Compressive Sensing. in 2022 IEEE International Conference on Computer and Communications (ICCC), Chengdu, China. IEEE, pp. 1951-1957, International Conference on Computer and Communications, 09/12/2022. https://doi.org/10.1109/ICCC56324.2022.10065722

APA

Zhang, T., Li, L., Igel, C., Oehmcke, S., Gieseke, F., & Peng, Z. (2023). LR-CSNet: Low-Rank Deep Unfolding Network for Image Compressive Sensing. In 2022 IEEE International Conference on Computer and Communications (ICCC), Chengdu, China (pp. 1951-1957). IEEE. https://doi.org/10.1109/ICCC56324.2022.10065722

Vancouver

Zhang T, Li L, Igel C, Oehmcke S, Gieseke F, Peng Z. LR-CSNet: Low-Rank Deep Unfolding Network for Image Compressive Sensing. In 2022 IEEE International Conference on Computer and Communications (ICCC), Chengdu, China. IEEE. 2023. p. 1951-1957 https://doi.org/10.1109/ICCC56324.2022.10065722

Author

Zhang, Tianfang ; Li, Lei ; Igel, Christian ; Oehmcke, Stefan ; Gieseke, Fabian ; Peng, Zhenming. / LR-CSNet: Low-Rank Deep Unfolding Network for Image Compressive Sensing. 2022 IEEE International Conference on Computer and Communications (ICCC), Chengdu, China. IEEE, 2023. pp. 1951-1957

Bibtex

@inproceedings{f16c7fe4fcb641e5b7acb52ddb7204ca,
title = "LR-CSNet: Low-Rank Deep Unfolding Network for Image Compressive Sensing",
abstract = "Deep unfolding networks (DUNs) have proven to be a viable approach to compressive sensing (CS). In this work, we propose a DUN called low-rank CS network (LR-CSNet) for natural image CS. Real-world image patches are often well-represented by low-rank approximations. LR-CSNet exploits this property by adding a low-rank prior to the CS optimization task. We derive a corresponding iterative optimization procedure using variable splitting, which is then translated to a new DUN architecture. The architecture uses low-rank generation modules (LRGMs), which learn low-rank matrix factorizations, as well as gradient descent and proximal mappings (GDPMs), which are proposed to extract high-frequency features to refine image details. In addition, the deep features generated at each reconstruction stage in the DUN are transferred between stages to boost the performance. Our extensive experiments on three widely considered datasets demonstrate the promising performance of LR-CSNet compared to state-of-the-art methods in natural image CS.",
author = "Tianfang Zhang and Lei Li and Christian Igel and Stefan Oehmcke and Fabian Gieseke and Zhenming Peng",
year = "2023",
doi = "10.1109/ICCC56324.2022.10065722",
language = "English",
pages = "1951--1957",
booktitle = "2022 IEEE International Conference on Computer and Communications (ICCC), Chengdu, China",
publisher = "IEEE",
note = "International Conference on Computer and Communications, ICCC ; Conference date: 09-12-2022 Through 12-12-2022",
url = "http://www.iccc.org/2022.html",

}
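
As an aside for readers skimming the abstract above: the "low-rank generation modules (LRGMs)" are described there only as learning low-rank matrix factorizations of image patches. The PyTorch sketch below illustrates one plausible form such a module could take under that description; the class name, layer sizes, feature extractor, and the per-row/per-column linear heads are illustrative assumptions and are not taken from the paper.

import torch
import torch.nn as nn


class LowRankGenerationModule(nn.Module):
    # Hypothetical sketch of an LRGM: predict two factor matrices U (H x r) and
    # V (W x r) from the current reconstruction and return the rank-r image
    # L = U V^T. All architectural choices here are assumptions for illustration.
    def __init__(self, height: int, width: int, rank: int = 4, channels: int = 16):
        super().__init__()
        self.rank = rank
        # Shared convolutional feature extractor (assumption).
        self.features = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )
        # Heads predicting the two factors of the low-rank matrix (assumption).
        self.u_head = nn.Linear(channels * width, rank)   # one r-dim code per image row
        self.v_head = nn.Linear(channels * height, rank)  # one r-dim code per image column

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, 1, H, W) current estimate of the image patch
        b, _, h, w = x.shape
        f = self.features(x)                               # (B, C, H, W)
        rows = f.permute(0, 2, 1, 3).reshape(b, h, -1)     # (B, H, C*W)
        cols = f.permute(0, 3, 1, 2).reshape(b, w, -1)     # (B, W, C*H)
        u = self.u_head(rows)                              # (B, H, r)
        v = self.v_head(cols)                              # (B, W, r)
        low_rank = torch.bmm(u, v.transpose(1, 2))         # (B, H, W), rank <= r
        return low_rank.unsqueeze(1)                       # (B, 1, H, W)


if __name__ == "__main__":
    lrgm = LowRankGenerationModule(height=33, width=33, rank=4)
    patch = torch.randn(2, 1, 33, 33)
    print(lrgm(patch).shape)  # torch.Size([2, 1, 33, 33])

The returned matrix has rank at most r because it is the product of an H×r and an r×W factor, which is the sense in which such a module would impose the low-rank prior mentioned in the abstract.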

RIS

TY - GEN

T1 - LR-CSNet: Low-Rank Deep Unfolding Network for Image Compressive Sensing

T2 - International Conference on Computer and Communications

AU - Zhang, Tianfang

AU - Li, Lei

AU - Igel, Christian

AU - Oehmcke, Stefan

AU - Gieseke, Fabian

AU - Peng, Zhenming

N1 - Conference code: 8

PY - 2023

Y1 - 2023

N2 - Deep unfolding networks (DUNs) have proven to be a viable approach to compressive sensing (CS). In this work, we propose a DUN called low-rank CS network (LR-CSNet) for natural image CS. Real-world image patches are often well-represented by low-rank approximations. LR-CSNet exploits this property by adding a low-rank prior to the CS optimization task. We derive a corresponding iterative optimization procedure using variable splitting, which is then translated to a new DUN architecture. The architecture uses low-rank generation modules (LRGMs), which learn low-rank matrix factorizations, as well as gradient descent and proximal mappings (GDPMs), which are proposed to extract high-frequency features to refine image details. In addition, the deep features generated at each reconstruction stage in the DUN are transferred between stages to boost the performance. Our extensive experiments on three widely considered datasets demonstrate the promising performance of LR-CSNet compared to state-of-the-art methods in natural image CS.

AB - Deep unfolding networks (DUNs) have proven to be a viable approach to compressive sensing (CS). In this work, we propose a DUN called low-rank CS network (LR-CSNet) for natural image CS. Real-world image patches are often well-represented by low-rank approximations. LR-CSNet exploits this property by adding a low-rank prior to the CS optimization task. We derive a corresponding iterative optimization procedure using variable splitting, which is then translated to a new DUN architecture. The architecture uses low-rank generation modules (LRGMs), which learn low-rank matrix factorizations, as well as gradient descent and proximal mappings (GDPMs), which are proposed to extract high-frequency features to refine image details. In addition, the deep features generated at each reconstruction stage in the DUN are transferred between stages to boost the performance. Our extensive experiments on three widely considered datasets demonstrate the promising performance of LR-CSNet compared to state-of-the-art methods in natural image CS.

U2 - 10.1109/ICCC56324.2022.10065722

DO - 10.1109/ICCC56324.2022.10065722

M3 - Article in proceedings

SP - 1951

EP - 1957

BT - 2022 IEEE International Conference on Computer and Communications (ICCC), Chengdu, China

PB - IEEE

Y2 - 9 December 2022 through 12 December 2022

ER -

ID: 338603504