Metadata | Description | Language |
---|---|---|
Author(s): dc.contributor | Universidade Estadual Paulista (UNESP) | - |
Author(s): dc.creator | Valem, Lucas Pascotti [UNESP] | - |
Author(s): dc.creator | Pedronette, Daniel Carlos Guimarães [UNESP] | - |
Accession date: dc.date.accessioned | 2022-08-04T22:12:25Z | - |
Availability date: dc.date.available | 2022-08-04T22:12:25Z | - |
Issue date: dc.date.issued | 2022-04-28 | - |
Issue date: dc.date.issued | 2021-08-24 | - |
Full source: dc.identifier | http://dx.doi.org/10.1145/3460426.3463645 | - |
Full source: dc.identifier | http://hdl.handle.net/11449/222408 | - |
Source: dc.identifier.uri | http://educapes.capes.gov.br/handle/11449/222408 | - |
Description: dc.description | Image and multimedia retrieval has become established as a prominent task in an increasingly digital and visual world. Supported mainly by decades of development of hand-crafted features and by the success of deep learning techniques, many different feature extraction and retrieval approaches are currently available. However, the frequent requirement for large training sets remains a fundamental bottleneck, especially in real-world and large-scale scenarios. In the scarcity or absence of labeled data, choosing which retrieval approach to use becomes a central challenge. A promising strategy consists in estimating the effectiveness of ranked lists without requiring any ground-truth data. Most existing measures exploit statistical analysis of the ranked lists and measure the reciprocity among the lists of images in the top positions. This work innovates by proposing a new, self-supervised method for this task, the Deep Rank Noise Estimator (DRNE). An algorithm is presented for generating synthetic ranked-list data, which is modeled as images and used to train a Convolutional Neural Network proposed for effectiveness estimation. The proposed model is a variant of the DnCNN (denoising CNN), which interprets the incorrectness of a ranked list as noise to be learned by the network. The approach was evaluated on 5 public image datasets and on different tasks, including general image retrieval and person re-ID. The complementarity between the proposed approach and related rank-based approaches was also exploited and evaluated through fusion strategies. The experimental results show that the proposed method achieves up to 0.88 Pearson correlation with the MAP measure in general retrieval scenarios and 0.74 in person re-ID scenarios. | - |
Description: dc.description | Department of Statistics, Applied Math. and Computing, São Paulo State University (UNESP), SP | - |
Format: dc.format | 294-302 | - |
Language: dc.language | en | - |
Relation: dc.relation | ICMR 2021 - Proceedings of the 2021 International Conference on Multimedia Retrieval | - |
Source: dc.source | Scopus | - |
Keywords: dc.subject | Content-based image retrieval | - |
Keywords: dc.subject | Convolutional neural networks | - |
Keywords: dc.subject | Denoising | - |
Keywords: dc.subject | Effectiveness estimation | - |
Keywords: dc.subject | Query performance prediction | - |
Keywords: dc.subject | Self-supervised learning | - |
Keywords: dc.subject | Unsupervised learning | - |
Title: dc.title | A denoising convolutional neural network for self-supervised rank effectiveness estimation on image retrieval | - |
Appears in collections: | Repositório Institucional - Unesp |
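The abstract mentions two ideas that can be illustrated concretely: encoding the top-k ranked lists around a query as an image-like matrix, and scoring ranking quality through neighbor reciprocity without ground-truth labels. The sketch below is a minimal, hypothetical illustration of that general idea only; the encoding (`ranked_list_image`) and the naive density score are assumptions for illustration and are not the DRNE encoding or the DnCNN-based estimator defined in the paper.

```python
import numpy as np

def ranked_list_image(ranked_lists, query, k=10):
    """Encode a query's top-k neighborhood as a k x k binary 'image':
    cell (i, j) is 1 when the j-th neighbor of the query's i-th
    neighbor also appears in the query's own top-k list.
    (Hypothetical encoding, for illustration only.)"""
    top_k = ranked_lists[query][:k]
    top_set = set(top_k)
    img = np.zeros((k, k), dtype=np.float32)
    for i, neighbor in enumerate(top_k):
        for j, nn in enumerate(ranked_lists[neighbor][:k]):
            if nn in top_set:
                img[i, j] = 1.0
    return img

def reciprocity_score(img):
    """A naive unsupervised effectiveness proxy: the denser the
    reciprocal-neighbor image, the more self-consistent the ranking.
    (DRNE instead feeds such images to a trained denoising CNN.)"""
    return float(img.mean())

# Toy example: a perfectly self-consistent neighborhood scores 1.0,
# while a query whose neighbors point elsewhere scores lower.
consistent = {0: [0, 1, 2, 3], 1: [1, 0, 3, 2],
              2: [2, 3, 0, 1], 3: [3, 2, 1, 0]}
scattered = {0: [0, 4, 5, 6], 4: [4, 1, 2, 3],
             5: [5, 1, 2, 3], 6: [6, 1, 2, 3]}
print(reciprocity_score(ranked_list_image(consistent, 0, k=4)))
print(reciprocity_score(ranked_list_image(scattered, 0, k=4)))
```

Under this toy encoding, label-free comparisons like the one above are what make effectiveness estimation possible when no ground-truth relevance data is available.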