GrowBag keyword graphs: the graphs summarize 146 occurrences of 121 keywords (graphs not shown).
Results
Found 6509 publication records; showing all 6509 according to the current facet selection.
Hits |
Authors |
Title |
Venue |
Year |
Link |
Author keywords |
105 | Lizhong Chen, Yongyou Hu, Hongye Su, Jian Chu |
A comparative study for solution methods of a multicomponent distillation model. |
SMC (5) |
2004 |
DBLP DOI BibTeX RDF |
|
86 | Shengli Liu 0001, Henk C. A. van Tilborg, Marten van Dijk |
A Practical Protocol for Advantage Distillation and Information Reconciliation. |
Des. Codes Cryptogr. |
2003 |
DBLP DOI BibTeX RDF |
advantage distillation, unconditional security, secret key agreement, information reconciliation |
77 | Xiaoyu Wang, Zhiguo Lu, Aoying Zhou |
Topic Exploration and Distillation for Web Search by a Similarity-Based Analysis. |
WAIM |
2002 |
DBLP DOI BibTeX RDF |
|
72 | Yiming Yang, Abhimanyu Lad |
Modeling Expected Utility of Multi-session Information Distillation. |
ICTIR |
2009 |
DBLP DOI BibTeX RDF |
Multi-session distillation, utility evaluation based both on novelty and relevance, stochastic modeling of user browsing behavior |
72 | Ikkyu Choi, Minkoo Kim |
Topic distillation using hierarchy concept tree. |
SIGIR |
2003 |
DBLP DOI BibTeX RDF |
hierarchy concept tree, link analysis, topic distillation |
68 | Mostafa Keikha |
Investigation on smoothing and aggregation methods in blog retrieval. |
SIGIR |
2010 |
DBLP DOI BibTeX RDF |
user generated data, temporal analysis, blog search |
64 | Kaizhong Jiang, Zhao Lu, Yuan-Qiong Wu, Junzhong Gu |
An Algorithm of Topic Distillation Based on Anchor Text. |
ISECS |
2008 |
DBLP DOI BibTeX RDF |
|
64 | José J. Macias-Hernandez, Plamen P. Angelov, Xiaowei Zhou |
Soft sensor for predicting crude oil distillation side streams using evolving takagi-sugeno fuzzy models. |
SMC |
2007 |
DBLP DOI BibTeX RDF |
|
64 | Tao Qin 0001, Tie-Yan Liu, Xu-Dong Zhang 0001, Guang Feng, Wei-Ying Ma |
Subsite Retrieval: A Novel Concept for Topic Distillation. |
AIRS |
2005 |
DBLP DOI BibTeX RDF |
|
64 | Yiqun Liu 0001, Min Zhang 0006, Shaoping Ma |
Effective Topic Distillation with Key Resource Pre-selection. |
AIRS |
2004 |
DBLP DOI BibTeX RDF |
|
63 | Kevin Kyung Nam, Mark S. Ackerman |
Arkose: reusing informal information from online discussions. |
GROUP |
2007 |
DBLP DOI BibTeX RDF |
collaborative distillation, community knowledge, CSCW, online communities, design rationale, information organization, information reuse, incremental formalization, knowledge communities |
58 | Krisztian Balog, Maarten de Rijke, Wouter Weerkamp |
Bloggers as experts: feed distillation using expert retrieval models. |
SIGIR |
2008 |
DBLP DOI BibTeX RDF |
feed distillation, people-topic associations, language modeling |
58 | Zhuoming Xu, Xiao Cao, Yisheng Dong, Yahong Han |
s-HITSc: an improved model and algorithm for topic distillation on the Web. |
Soft Comput. |
2006 |
DBLP DOI BibTeX RDF |
Site granularity, Content analysis, Topic distillation, Web IR, Hyperlink analysis |
58 | Mingfang Wu, Gheorghe Muresan, Alistair McLean, Muh-Chyun (Morris) Tang, Ross Wilkinson, Yuelin Li, Hyuk-Jin Lee, Nicholas J. Belkin |
Human versus machine in the topic distillation task. |
SIGIR |
2004 |
DBLP DOI BibTeX RDF |
evaluation, interactive information retrieval, topic distillation, search result organization |
50 | Nick S. Jones, Lluis Masanes |
Key Distillation and the Secret-Bit Fraction. |
IEEE Trans. Inf. Theory |
2008 |
DBLP DOI BibTeX RDF |
|
50 | Akemi Gálvez, Andrés Iglesias 0001 |
Binary Distillation Column Design Using Mathematica. |
International Conference on Computational Science |
2003 |
DBLP DOI BibTeX RDF |
|
50 | Gianluigi Greco, Sergio Greco, Ester Zumpano |
STED: A System for Topic Enumeration and Distillation. |
ITCC |
2002 |
DBLP DOI BibTeX RDF |
Web searching and mining, semi-structured Data, Information Processing on the Web, Databases and Information Retrieval |
49 | Jiyin He, Wouter Weerkamp, Martha A. Larson, Maarten de Rijke |
Blogger, stick to your story: modeling topical noise in blogs with coherence measures. |
AND |
2008 |
DBLP DOI BibTeX RDF |
blog distillation, coherence measures, language models |
45 | Craig Macdonald, Iadh Ounis |
Key blog distillation: ranking aggregates. |
CIKM |
2008 |
DBLP DOI BibTeX RDF |
blog distillation, feed search, expert search |
45 | Yiming Yang, Abhimanyu Lad, Ni Lao, Abhay Harpale, Bryan Kisiel, Monica Rogati |
Utility-based information distillation over temporally sequenced documents. |
SIGIR |
2007 |
DBLP DOI BibTeX RDF |
flexible user feedback, passage ranking, utility-based distillation, adaptive filtering, novelty detection, evaluation methodology, unified framework |
45 | Vassilis Plachouras, Fidel Cacheda, Iadh Ounis |
A decision mechanism for the selective combination of evidence in topic distillation. |
Inf. Retr. |
2006 |
DBLP DOI BibTeX RDF |
Decision mechanism, Selective combination of evidence, Query scope, Aggregates, Web information retrieval, Topic distillation |
36 | Mostafa Keikha, Fabio Crestani |
Effectiveness of Aggregation Methods in Blog Distillation. |
FQAS |
2009 |
DBLP DOI BibTeX RDF |
|
36 | Zhengfang Wang, Yong Wang, Jan Zhang, Dawei Qu, Xiuhua Liu |
Grey Correlation Analysis of Corrosion on Oil Atmospheric Distillation Equipment. |
FSKD (5) |
2008 |
DBLP DOI BibTeX RDF |
|
36 | Geoff W. Hamilton |
Distillation: extracting the essence of programs. |
PEPM |
2007 |
DBLP DOI BibTeX RDF |
superlinear improvement, program transformation, termination, generalisation, tail-recursion |
36 | Matthias Christandl, Artur Ekert, Michal Horodecki, Pawel Horodecki, Jonathan Oppenheim, Renato Renner |
Unifying Classical and Quantum Key Distillation. |
TCC |
2007 |
DBLP DOI BibTeX RDF |
|
36 | Yongfeng He, Quanyi Fan |
A Levinson Predictor Based Compensatory Fuzzy Neural Network and Its Application in Crude Oil Distillation Process Modeling. |
ISNN (2) |
2006 |
DBLP DOI BibTeX RDF |
|
36 | Almila Bahar, Evren Güner, Canan Özgen, Ugur Halici |
Design of State Estimators for the Inferential Control of an Industrial Distillation Column. |
IJCNN |
2006 |
DBLP DOI BibTeX RDF |
|
36 | Jin-Hau Kuo, Chin-Wei Fang, Jen-Hao Yeh, Ja-Ling Wu |
An importance measurement for video and its application to TV news items distillation. |
ICIP |
2004 |
DBLP DOI BibTeX RDF |
|
36 | Andris Ambainis, Ke Yang 0005 |
Towards the Classical Communication Complexity of Entanglement Distillation Protocols with Incomplete Information. |
CCC |
2004 |
DBLP DOI BibTeX RDF |
|
36 | Elizabeth Margaglio, Rosalba Lamanna, Pierre-Yves Glorennec |
Analysis and Comparison of Recurrent Neural Networks for the Identification of a Pilot Plant Distillation Column. |
IBERAMIA-SBIA |
2000 |
DBLP DOI BibTeX RDF |
|
36 | Yeha Lee, Seung-Hoon Na, Jong-Hyeok Lee |
An improved feedback approach using relevant local posts for blog feed retrieval. |
CIKM |
2009 |
DBLP DOI BibTeX RDF |
blog distillation, feed search, pseudo-relevance feedback |
36 | Tao Qin 0001, Tie-Yan Liu, Xu-Dong Zhang 0001, De-Sheng Wang, Wen-Ying Xiong, Hang Li 0001 |
Learning to rank relational objects and its application to web search. |
WWW |
2008 |
DBLP DOI BibTeX RDF |
learning to rank relational objects, relational ranking svm, pseudo relevance feedback, topic distillation |
36 | Theodora Tsikrika, Mounia Lalmas |
Combining Evidence for Relevance Criteria: A Framework and Experiments in Web Retrieval. |
ECIR |
2007 |
DBLP DOI BibTeX RDF |
best entry point, Dempster-Shafer theory, topic distillation |
36 | Einar Landre, Harald Wesenberg, Harald Rønneberg |
Architectural improvement by use of strategic level domain-driven design. |
OOPSLA Companion |
2006 |
DBLP DOI BibTeX RDF |
context map, distillation, responsibility layer, complexity, enterprise architecture, domain-driven design |
36 | Hooshang Jazayeri-Rad |
The nonlinear model-predictive control of a chemical plant using multiple neural networks. |
Neural Comput. Appl. |
2004 |
DBLP DOI BibTeX RDF |
Multi-component distillation column, Nonlinear model-predictive control, Neural networks |
31 | Amy Ciric, Zeynep H. Gümüs |
MINLP: Reactive Distillation Column Synthesis. |
Encyclopedia of Optimization |
2009 |
DBLP DOI BibTeX RDF |
Reactive distillation, Bilevel programming, MINLP |
31 | Angelo Lucia |
Successive Quadratic Programming: Applications in Distillation Systems. |
Encyclopedia of Optimization |
2009 |
DBLP DOI BibTeX RDF |
Lagrangian function, Quadratic programming subproblem, Hessian matrix of a Lagrangian function, Range and null space decomposition, Full space methods, Distillation, Mass and energy balance equations, Ideal and nonideal phase equilibrium equations, Successive quadratic programming |
31 | Soumen Chakrabarti |
Integrating the document object model with hyperlinks for enhanced topic distillation and information extraction. |
WWW |
2001 |
DBLP DOI BibTeX RDF |
segmentation, minimum description length principle, topic distillation, document object model |
27 | Yi Fang |
Entity information management in complex networks. |
SIGIR |
2010 |
DBLP DOI BibTeX RDF |
entity profiling, social network analysis, entity retrieval |
27 | Hervé Chabanne, Guillaume Fumaroli |
Noisy Cryptographic Protocols for Low-Cost RFID Tags. |
IEEE Trans. Inf. Theory |
2006 |
DBLP DOI BibTeX RDF |
|
27 | David Hawking, Paul Thomas 0001 |
Server selection methods in hybrid portal search. |
SIGIR |
2005 |
DBLP DOI BibTeX RDF |
|
27 | Ian Soboroff |
On evaluating web search with very few relevant documents. |
SIGIR |
2004 |
DBLP DOI BibTeX RDF |
web search, measurement error |
27 | Ji-Zheng Chu, Po-Feng Tsai, Wen-Yen Tsai, Shi-Shang Jang, David Shan-Hill Wong, Shyan-Shu Shieh, Pin-Ho Lin, Shi-Jer Jiang |
An Experimental Study of Model Predictive Control Based on Artificial Neural Networks. |
KES |
2003 |
DBLP DOI BibTeX RDF |
|
27 | Jiying Wang, Frederick H. Lochovsky |
Data-rich Section Extraction from HTML pages. |
WISE |
2002 |
DBLP DOI BibTeX RDF |
|
27 | Xiaodong Zhang 0001 |
Parallel Triangular Decompositions of an Oil Refining Simulation. |
International Conference on Supercomputing |
1993 |
DBLP DOI BibTeX RDF |
|
23 | |
Distillation. |
Encyclopedia of Database Systems |
2009 |
DBLP DOI BibTeX RDF |
|
23 | Mostafa Keikha, Mark James Carman, Fabio Crestani |
Blog distillation using random walks. |
SIGIR |
2009 |
DBLP DOI BibTeX RDF |
random walks, blog retrieval |
23 | Kamand Kamangar, Dilek Hakkani-Tür, Gökhan Tür, Michael Levit |
An iterative unsupervised learning method for information distillation. |
ICASSP |
2008 |
DBLP DOI BibTeX RDF |
|
23 | Thomas Sierocinski, Antony Le Béchec, Nathalie Théret, Dimitri Petritis |
Semantic Distillation: A Method for Clustering Objects by their Contextual Specificity. |
NICSO |
2007 |
DBLP DOI BibTeX RDF |
|
23 | Borhan Molazem Sanandaji, Karim Salahshoor, Alireza Fatehi |
Multivariable GA-Based Identification of TS Fuzzy Models: MIMO Distillation Column Model Case Study. |
FUZZ-IEEE |
2007 |
DBLP DOI BibTeX RDF |
|
23 | Moinuddin K. Qureshi, M. Aater Suleman, Yale N. Patt |
Line Distillation: Increasing Cache Capacity by Filtering Unused Words in Cache Lines. |
HPCA |
2007 |
DBLP DOI BibTeX RDF |
|
23 | Maciej Lawrynczuk, Piotr Tatjewski |
An Efficient Nonlinear Predictive Control Algorithm with Neural Models and Its Application to a High-Purity Distillation Process. |
ICAISC |
2006 |
DBLP DOI BibTeX RDF |
|
23 | Yafen Li, Qi Li, Huijuan Wang, Ningsheng Ma |
Soft Sensing Based on LS-SVM and Its Application to a Distillation Column. |
ISDA (1) |
2006 |
DBLP DOI BibTeX RDF |
|
23 | João Alberto Fabro, Flávio Neves Jr., Lúcia V. R. Arruda |
A Proposal of a Fuzzy-Neuro Predictive Control, Tuned by Genetic Algorithms, with application to the Start-Up Control of a Distillation Column. |
HIS |
2005 |
DBLP DOI BibTeX RDF |
|
23 | Vassilis Plachouras, Iadh Ounis |
Distribution of relevant documents in domain-level aggregates for topic distillation. |
WWW (Alternate Track Papers & Posters) |
2004 |
DBLP DOI BibTeX RDF |
distribution of relevant documents, aggregates, web IR |
23 | Ding-Yi Chen, Xue Li 0001 |
PLD: A Distillation Algorithm for Misclassified Documents. |
WAIM |
2004 |
DBLP DOI BibTeX RDF |
|
23 | Hao Tang, Quanyi Fan, Bowen Xu, Jianming Wen |
A Technological Parameter Optimization Approach in Crude Oil Distillation Process Based on Neural Network. |
ISNN (2) |
2004 |
DBLP DOI BibTeX RDF |
|
23 | Ke Yang 0005 |
On the (Im)possibility of Non-interactive Correlation Distillation. |
LATIN |
2004 |
DBLP DOI BibTeX RDF |
|
23 | Jie Tang 0001, Juan-Zi Li, Kehong Wang, Yue-Ru Cai |
Loss Minimization Based Keyword Distillation. |
APWeb |
2004 |
DBLP DOI BibTeX RDF |
|
23 | Kai Dadhe, Volker Roßmann, Kazim Durmus, Sebastian Engell |
Neural Networks as a Tool for Gray Box Modelling in Reactive Distillation. |
Fuzzy Days |
2001 |
DBLP DOI BibTeX RDF |
|
23 | Javier Fernández de Cañete, T. Cordero, D. Guijas, J. Alarcon |
An Adaptive Neuro-Fuzzy Approach to Control a Distillation Column. |
Neural Comput. Appl. |
2000 |
DBLP DOI BibTeX RDF |
|
22 | Amit C. Awekar, Pabitra Mitra, Jaewoo Kang |
Selective hypertext induced topic search. |
WWW |
2006 |
DBLP DOI BibTeX RDF |
searching, link analysis, topic distillation |
22 | Wen-Syan Li, Necip Fazil Ayan, Okan Kolak, Quoc Vu, Hajime Takano, Hisashi Shimamura |
Constructing multi-granular and topic-focused web site maps. |
WWW |
2001 |
DBLP DOI BibTeX RDF |
decision tree algorithm, logical domain, multi-granularity, topic distillation, site map |
22 | Wen-Syan Li, K. Selçuk Candan |
Integrating content search with structure analysis for hypermedia retrieval and management. |
ACM Comput. Surv. |
1999 |
DBLP DOI BibTeX RDF |
link analysis, organization, topic distillation |
18 | Nicolas Boizard, Kevin El Haddad, Céline Hudelot, Pierre Colombo |
Towards Cross-Tokenizer Distillation: the Universal Logit Distillation Loss for LLMs. |
CoRR |
2024 |
DBLP DOI BibTeX RDF |
|
18 | Lirong Wu, Haitao Lin, Zhangyang Gao, Guojiang Zhao, Stan Z. Li |
A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation. |
CoRR |
2024 |
DBLP DOI BibTeX RDF |
|
18 | Phuc Phan, Hieu Tran, Long Phan |
Distillation Contrastive Decoding: Improving LLMs Reasoning with Contrastive Decoding and Distillation. |
CoRR |
2024 |
DBLP DOI BibTeX RDF |
|
18 | Yiru He, Shiqian Wang, Junyang Yu, Chaoyang Liu, Xin He, Han Li |
Joint weighted knowledge distillation and multi-scale feature distillation for long-tailed recognition. |
Int. J. Mach. Learn. Cybern. |
2024 |
DBLP DOI BibTeX RDF |
|
18 | Dongping Liao, Xitong Gao, Chengzhong Xu 0001 |
Impartial Adversarial Distillation: Addressing Biased Data-Free Knowledge Distillation via Adaptive Constrained Optimization. |
AAAI |
2024 |
DBLP DOI BibTeX RDF |
|
18 | Lin Zhu, Qiang Hao, Xingyan Zeng, Chunhua Zhang, Jianting Yu, Liping Lv, Qian Zhou |
Control of side-stream pressure-swing distillation and extractive distillation for separating azeotropic mixture of cyclohexane and acetone. |
Comput. Chem. Eng. |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Linfeng Li, Weixing Su, Fang Liu, Maowei He, Xiaodan Liang |
Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms. |
Neural Process. Lett. |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Shuiping Ni, Xinliang Ma, Mingfu Zhu, Xingwang Li 0001, Yu-Dong Zhang 0001 |
Reverse Self-Distillation Overcoming the Self-Distillation Barrier. |
IEEE Open J. Comput. Soc. |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Songling Zhu, Ronghua Shang, Bo Yuan, Weitong Zhang, Yangyang Li, Licheng Jiao |
DynamicKD: An Effective Knowledge Distillation via Dynamic Entropy Correction-Based Distillation for Gap Optimizing. |
CoRR |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Zhendong Yang, Ailing Zeng, Zhe Li, Tianke Zhang, Chun Yuan, Yu Li 0003 |
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels. |
CoRR |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Min Wei, Jingkai Zhou, Junyao Sun, Xuesong Zhang |
Adversarial Score Distillation: When score distillation meets GAN. |
CoRR |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Songling Zhu, Ronghua Shang, Ke Tang, Songhua Xu, Yangyang Li 0001 |
BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning. |
Knowl. Based Syst. |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Zhendong Yang, Ailing Zeng, Zhe Li, Tianke Zhang, Chun Yuan, Yu Li 0007 |
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels. |
ICCV |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Mengyuan Zhao, Yonghong Song |
Abnormal-Aware Loss and Full Distillation for Unsupervised Anomaly Detection Based on Knowledge Distillation. |
ICIP |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Norah Alballa, Marco Canini |
A First Look at the Impact of Distillation Hyper-Parameters in Federated Knowledge Distillation. |
EuroMLSys@EuroSys |
2023 |
DBLP DOI BibTeX RDF |
|
18 | Zeyuan Allen-Zhu, Yuanzhi Li |
Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning. |
ICLR |
2023 |
DBLP BibTeX RDF |
|
18 | Yutao Qin, Yu Zhuang, Chao Wang, Lei Zhang 0040, Linlin Liu, Jian Du 0003 |
Multi-objective optimization and comparison of the entrainer-assisted pressure-swing distillation and extractive distillation separation sequences for separating a pressure-insensitive binary azeotrope. |
Comput. Chem. Eng. |
2022 |
DBLP DOI BibTeX RDF |
|
18 | Jae-woong Lee, Minjin Choi 0001, Lee Sael, Hyunjung Shim, Jongwuk Lee |
Knowledge distillation meets recommendation: collaborative distillation for top-N recommendation. |
Knowl. Inf. Syst. |
2022 |
DBLP DOI BibTeX RDF |
|
18 | Kien Do, Hung Le, Dung Nguyen 0001, Dang Nguyen 0002, Haripriya Harikumar, Truyen Tran 0001, Santu Rana, Svetha Venkatesh |
Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation. |
CoRR |
2022 |
DBLP DOI BibTeX RDF |
|
18 | Yu Wang, Xin Li 0106, Shengzhao Wen, Fukui Yang, Wanping Zhang, Gang Zhang, Haocheng Feng, Junyu Han, Errui Ding |
Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling. |
CoRR |
2022 |
DBLP DOI BibTeX RDF |
|
18 | Junzhuo Li, Xinwei Wu, Weilong Dong, Shuangzhi Wu, Chao Bian 0006, Deyi Xiong |
Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework. |
CoRR |
2022 |
DBLP DOI BibTeX RDF |
|
18 | Kien Do, Hung Le, Dung Nguyen 0001, Dang Nguyen 0002, Haripriya Harikumar, Truyen Tran 0001, Santu Rana, Svetha Venkatesh |
Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation. |
NeurIPS |
2022 |
DBLP BibTeX RDF |
|
18 | Sumanth Chennupati, Mohammad Mahdi Kamani, Zhongwei Cheng, Lin Chen |
Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation. |
CoRR |
2021 |
DBLP BibTeX RDF |
|
18 | Guangyu Guo, Longfei Han, Junwei Han, Dingwen Zhang |
Pixel Distillation: A New Knowledge Distillation Scheme for Low-Resolution Image Recognition. |
CoRR |
2021 |
DBLP BibTeX RDF |
|
18 | Yunjie Ge, Qian Wang 0002, Baolin Zheng, Xinlu Zhuang, Qi Li 0002, Chao Shen 0001, Cong Wang 0001 |
Anti-Distillation Backdoor Attacks: Backdoors Can Really Survive in Knowledge Distillation. |
ACM Multimedia |
2021 |
DBLP DOI BibTeX RDF |
|
18 | Ilia Sucholutsky, Matthias Schonlau |
Soft-Label Dataset Distillation and Text Dataset Distillation. |
IJCNN |
2021 |
DBLP DOI BibTeX RDF |
|
18 | Sumanth Chennupati, Mohammad Mahdi Kamani, Zhongwei Cheng, Lin Chen |
Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation. |
BMVC |
2021 |
DBLP BibTeX RDF |
|
18 | Laëtitia Shao, Max Moroz, Elad Eban, Yair Movshovitz-Attias |
Neighbourhood Distillation: On the benefits of non end-to-end distillation. |
CoRR |
2020 |
DBLP BibTeX RDF |
|
18 | Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu |
Channel Distillation: Channel-Wise Attention for Knowledge Distillation. |
CoRR |
2020 |
DBLP BibTeX RDF |
|
18 | Zeyuan Allen-Zhu, Yuanzhi Li |
Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning. |
CoRR |
2020 |
DBLP BibTeX RDF |
|
18 | Cody Blakeney, Xiaomin Li, Yan Yan 0002, Ziliang Zong |
Craft Distillation: Layer-wise Convolutional Neural Network Distillation. |
CSCloud/EdgeCom |
2020 |
DBLP DOI BibTeX RDF |
|
18 | Adrien Gresse, Mathias Quillot, Richard Dufour, Jean-François Bonastre |
Apprentissage automatique de représentation de voix à l'aide d'une distillation de la connaissance pour le casting vocal (Learning voice representation using knowledge distillation for automatic voice casting). |
JEP-TALN-RECITAL (1) |
2020 |
DBLP BibTeX RDF |
|
18 | Wangshu Zhang, Junhong Liu, Zujie Wen, Yafang Wang, Gerard de Melo |
Query Distillation: BERT-based Distillation for Ensemble Ranking. |
COLING (Industry) |
2020 |
DBLP DOI BibTeX RDF |
|
18 | Yingjie Ma, Yiqing Luo, Xigang Yuan |
Towards the really optimal design of distillation systems: Simultaneous pressures optimization of distillation systems based on rigorous models. |
Comput. Chem. Eng. |
2019 |
DBLP DOI BibTeX RDF |
|
18 | Lei Zhao, Xinyu Lyu, Wencheng Wang, Jun Shan, Tao Qiu |
Comparison of heterogeneous azeotropic distillation and extractive distillation methods for ternary azeotrope ethanol/toluene/water separation. |
Comput. Chem. Eng. |
2017 |
DBLP DOI BibTeX RDF |
|
18 | William L. Luyben |
Comparison of extractive distillation and pressure-swing distillation for acetone/chloroform separation. |
Comput. Chem. Eng. |
2013 |
DBLP DOI BibTeX RDF |
|
Displaying results #1 - #100 of 6509 (100 per page).