AutoClustering: A feed-forward neural network based clustering algorithm

Research output: Conference contribution

Abstract

Since a clustering process can be regarded as a map from data to cluster labels, it is natural to employ a deep learning technique, in particular a feed-forward neural network, to realize a clustering method. In this study, we discuss a novel clustering method realized only by a feed-forward neural network. Unlike self-organizing maps and growing neural gas networks, the proposed method is compatible with deep learning neural networks. The proposed method has three parts: a map from records to clusters (encoder), a map from clusters to their exemplars (decoder), and a loss function that measures the positional closeness between the records and the exemplars. To accelerate clustering, we propose an improved activation function at the encoder, which continuously migrates from a soft-max function to a max function. Although most clustering methods require the number of clusters in advance, the proposed method naturally provides the number of clusters as the number of unique one-hot vectors obtained as a result. We also discuss the existence of local minima of the loss function and their relationship to clusters.
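
To make the architecture described in the abstract concrete, the following is a minimal sketch (not the authors' code) written with PyTorch: a feed-forward encoder produces a soft cluster assignment, a linear decoder maps the assignment to an exemplar, and the loss is the squared distance between each record and its exemplar. The temperature-controlled softmax used here is only one plausible way to "migrate a soft-max function to a max function continuously"; the layer sizes, temperature schedule, toy data, and names such as AutoClusteringSketch are illustrative assumptions, not the paper's actual implementation.

# A minimal sketch of the AutoClustering idea described in the abstract.
# All layer sizes, the temperature schedule, and the toy data below are
# illustrative assumptions; the paper's exact architecture, activation
# function, and training settings may differ.
import torch
import torch.nn as nn


class AutoClusteringSketch(nn.Module):
    def __init__(self, in_dim: int, max_clusters: int, hidden: int = 32):
        super().__init__()
        # Encoder: feed-forward map from a record to a (soft) cluster assignment.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, max_clusters),
        )
        # Decoder: maps a (one-hot-like) cluster assignment to its exemplar.
        # With a single linear layer, each column of the weight matrix plays
        # the role of one cluster's exemplar.
        self.decoder = nn.Linear(max_clusters, in_dim, bias=False)

    def assignment(self, x: torch.Tensor, temperature: float) -> torch.Tensor:
        # Temperature-controlled softmax: one simple way to move continuously
        # from a soft-max toward a (arg)max -- as the temperature approaches 0,
        # the output approaches a one-hot vector.
        logits = self.encoder(x)
        return torch.softmax(logits / temperature, dim=-1)

    def forward(self, x: torch.Tensor, temperature: float) -> torch.Tensor:
        a = self.assignment(x, temperature)
        exemplar = self.decoder(a)  # exemplar associated with the assignment
        # Loss: positional closeness between each record and its exemplar.
        return ((x - exemplar) ** 2).sum(dim=-1).mean()


def run_sketch():
    torch.manual_seed(0)
    # Toy data: two well-separated Gaussian blobs in 2-D.
    x = torch.cat([torch.randn(100, 2) + 4.0, torch.randn(100, 2) - 4.0])

    model = AutoClusteringSketch(in_dim=2, max_clusters=10)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)

    for step in range(500):
        # Anneal the temperature so assignments harden from softmax-like
        # toward one-hot (max-like) over the course of training.
        temperature = max(0.05, 1.0 - step / 500)
        loss = model(x, temperature)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # The number of clusters is read off as the number of distinct one-hot
    # vectors (i.e. distinct argmax indices) actually produced for the data.
    with torch.no_grad():
        labels = model.assignment(x, temperature=0.05).argmax(dim=-1)
    print("estimated number of clusters:", labels.unique().numel())


if __name__ == "__main__":
    run_sketch()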

Original language: English
Title of host publication: Proceedings - 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018
Editors: Jeffrey Yu, Zhenhui Li, Hanghang Tong, Feida Zhu
Publisher: IEEE Computer Society
Pages: 659-666
Number of pages: 8
ISBN (Electronic): 9781538692882
DOI: 10.1109/ICDMW.2018.00102
Publication status: Published - Feb 7, 2019
Event: 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018 - Singapore, Singapore
Duration: Nov 17, 2018 - Nov 20, 2018

Publication series

Name: IEEE International Conference on Data Mining Workshops, ICDMW
Volume: 2018-November
ISSN (Print): 2375-9232
ISSN (Electronic): 2375-9259

Conference

Conference: 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018
Country: Singapore
City: Singapore
Period: Nov 17, 2018 - Nov 20, 2018

Fingerprint

Feedforward neural networks
Clustering algorithms
Self organizing maps
Labels
Chemical activation
Neural networks
Gases
Deep learning

ASJC Scopus subject areas

  • Computer Science Applications
  • Software

Cite this

Kimura, M. (2019). AutoClustering: A feed-forward neural network based clustering algorithm. In J. Yu, Z. Li, H. Tong, & F. Zhu (Eds.), Proceedings - 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018 (pp. 659-666). [8637379] (IEEE International Conference on Data Mining Workshops, ICDMW; Vol. 2018-November). IEEE Computer Society. https://doi.org/10.1109/ICDMW.2018.00102

AutoClustering: A feed-forward neural network based clustering algorithm. / Kimura, Masaomi.

Proceedings - 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018. ed. / Jeffrey Yu; Zhenhui Li; Hanghang Tong; Feida Zhu. IEEE Computer Society, 2019. p. 659-666 8637379 (IEEE International Conference on Data Mining Workshops, ICDMW; Vol. 2018-November).

Research output: Conference contribution

Kimura, M 2019, AutoClustering: A feed-forward neural network based clustering algorithm. in J Yu, Z Li, H Tong & F Zhu (eds), Proceedings - 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018., 8637379, IEEE International Conference on Data Mining Workshops, ICDMW, vol. 2018-November, IEEE Computer Society, pp. 659-666, 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018, Singapore, Singapore, 18/11/17. https://doi.org/10.1109/ICDMW.2018.00102
Kimura M. AutoClustering: A feed-forward neural network based clustering algorithm. In Yu J, Li Z, Tong H, Zhu F, editors, Proceedings - 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018. IEEE Computer Society. 2019. p. 659-666. 8637379. (IEEE International Conference on Data Mining Workshops, ICDMW). https://doi.org/10.1109/ICDMW.2018.00102
Kimura, Masaomi. / AutoClustering: A feed-forward neural network based clustering algorithm. Proceedings - 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018. editor / Jeffrey Yu; Zhenhui Li; Hanghang Tong; Feida Zhu. IEEE Computer Society, 2019. pp. 659-666 (IEEE International Conference on Data Mining Workshops, ICDMW).
@inproceedings{f994127e2b624edababb413f0aea2147,
title = "AutoClustering: A feed-forward neural network based clustering algorithm",
abstract = "Since a clustering process can be regarded as a map of data to cluster labels, it should be natural to employ a deep learning technique, especially a feed-forward neural network, to realize the clustering method. In this study, we discussed a novel clustering method realized only by a feed-forward neural network. Unlike self-organizing maps and growing neural gas networks, the proposed method is compatible with deep learning neural networks. The proposed method has three parts: A map of records to clusters (encoder), a map of clusters to their exemplars (decoder), and a loss function to measure positional closeness between the records and the exemplars. In order to accelerate clustering performance, we proposed an improved activation function at the encoder, which migrates a soft-max function to a max function continuously. Though most of the clustering methods require the number of clusters in advance, the proposed method naturally provides the number of clusters as the number of unique one-hot vectors obtained as a result. We also discussed the existence of local minima of the loss function and their relationship to clusters.",
keywords = "Autoencoder, Clustering, Feed forward neural network",
author = "Masaomi Kimura",
year = "2019",
month = "2",
day = "7",
doi = "10.1109/ICDMW.2018.00102",
language = "English",
series = "IEEE International Conference on Data Mining Workshops, ICDMW",
publisher = "IEEE Computer Society",
pages = "659--666",
editor = "Jeffrey Yu and Zhenhui Li and Hanghang Tong and Feida Zhu",
booktitle = "Proceedings - 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018",

}

TY - GEN

T1 - AutoClustering

T2 - A feed-forward neural network based clustering algorithm

AU - Kimura, Masaomi

PY - 2019/2/7

Y1 - 2019/2/7

N2 - Since a clustering process can be regarded as a map from data to cluster labels, it is natural to employ a deep learning technique, in particular a feed-forward neural network, to realize a clustering method. In this study, we discuss a novel clustering method realized only by a feed-forward neural network. Unlike self-organizing maps and growing neural gas networks, the proposed method is compatible with deep learning neural networks. The proposed method has three parts: a map from records to clusters (encoder), a map from clusters to their exemplars (decoder), and a loss function that measures the positional closeness between the records and the exemplars. To accelerate clustering, we propose an improved activation function at the encoder, which continuously migrates from a soft-max function to a max function. Although most clustering methods require the number of clusters in advance, the proposed method naturally provides the number of clusters as the number of unique one-hot vectors obtained as a result. We also discuss the existence of local minima of the loss function and their relationship to clusters.

AB - Since a clustering process can be regarded as a map from data to cluster labels, it is natural to employ a deep learning technique, in particular a feed-forward neural network, to realize a clustering method. In this study, we discuss a novel clustering method realized only by a feed-forward neural network. Unlike self-organizing maps and growing neural gas networks, the proposed method is compatible with deep learning neural networks. The proposed method has three parts: a map from records to clusters (encoder), a map from clusters to their exemplars (decoder), and a loss function that measures the positional closeness between the records and the exemplars. To accelerate clustering, we propose an improved activation function at the encoder, which continuously migrates from a soft-max function to a max function. Although most clustering methods require the number of clusters in advance, the proposed method naturally provides the number of clusters as the number of unique one-hot vectors obtained as a result. We also discuss the existence of local minima of the loss function and their relationship to clusters.

KW - Autoencoder

KW - Clustering

KW - Feed forward neural network

UR - http://www.scopus.com/inward/record.url?scp=85062848726&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85062848726&partnerID=8YFLogxK

U2 - 10.1109/ICDMW.2018.00102

DO - 10.1109/ICDMW.2018.00102

M3 - Conference contribution

AN - SCOPUS:85062848726

T3 - IEEE International Conference on Data Mining Workshops, ICDMW

SP - 659

EP - 666

BT - Proceedings - 18th IEEE International Conference on Data Mining Workshops, ICDMW 2018

A2 - Yu, Jeffrey

A2 - Li, Zhenhui

A2 - Tong, Hanghang

A2 - Zhu, Feida

PB - IEEE Computer Society

ER -