New Project Proposal to NCSS I/UCRC - Bayesian Optimization for Deep Learning in Sensor Applications
New Project Proposal to NCSS I/UCRC
Bayesian Optimization for Deep Learning in Sensor Applications
Presenter: Giulia Pedrielli
Project Leads: Gautam Dasarathy (PI), Giulia Pedrielli, and Andreas Spanias
Date: June 29, 2021
ASU 2021-6-1 Rev 2
Copyright © 2020 NSF Net-Centric I/UCRC. All Rights Reserved.
Problem Statement
› Why is this research needed?
– Several problems in science, engineering, and medicine can be modeled as the optimization of an expensive black-box function:
  › Hyperparameter tuning of deep neural networks
  › Optimal testing for cyber-physical systems (e.g., self-driving cars)
– Bayesian optimization (BO) is a promising approach to solving such black-box optimization problems.
– However, scaling BO to high dimensions is extremely computationally intensive, which limits its practicality.
› What is the specific problem to be solved?
– Develop novel strategies for scaling BO to high dimensions, with application to DNN tuning and sensor data science.
– Develop new algorithms and theory for effective scaling by incorporating graph structure.
› Challenges
– Requires new theory for systematic scaling that combines combinatorial properties (e.g., graphs, NN layout) with continuous parameters (e.g., weights).
– Requires new analytical techniques for devising optimal sampling protocols in such mixed spaces.
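To make the black-box setting above concrete, the following is a minimal sketch of a standard BO loop: a Gaussian-process surrogate fit to past evaluations, with the next query chosen by expected improvement. This is an illustration of the general technique only, not the project's proposed algorithm; the kernel, length scale, grid-based acquisition, and toy objective are all simplifying assumptions.

```python
# Minimal BO sketch: GP surrogate + expected-improvement acquisition.
# Illustrative only; a real expensive objective replaces the toy quadratic.
import numpy as np
from scipy.stats import norm

def rbf_kernel(A, B, length=0.2):
    """Squared-exponential kernel between row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * length ** 2))

def gp_posterior(X, y, Xq, noise=1e-6):
    """GP posterior mean and std at query points Xq given data (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xq)
    Kss = rbf_kernel(Xq, Xq)
    alpha = np.linalg.solve(K, y)
    mu = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = np.clip(np.diag(Kss - Ks.T @ v), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    """EI for minimization: expected amount by which a query beats `best`."""
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def bayes_opt(f, bounds, n_init=3, n_iter=12, seed=0):
    """Sequentially query f where EI is largest over a fixed 1-D grid."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_init, 1))
    y = np.array([f(x[0]) for x in X])
    grid = np.linspace(lo, hi, 200).reshape(-1, 1)
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, grid)
        x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next[0]))
    return X[np.argmin(y)], y.min()

# Cheap stand-in for an expensive black-box objective (minimum at x = 0.3).
x_best, f_best = bayes_opt(lambda x: (x - 0.3) ** 2, bounds=(0.0, 1.0))
```

Each surrogate fit costs O(n³) in the number of evaluations, and acquisition optimization grows exponentially with input dimension under naive gridding, which is precisely the scaling bottleneck this proposal targets.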
Project Description
› How will this project approach the problem?
– Create a novel framework for combining Bayesian optimization techniques with structure encoded as graphs.
– Leverage these techniques to perform DNN training for sensor applications.
– Create and disseminate (via open-source software) a general-purpose framework for BO with graph structure.
› Preliminary results from this or previous projects:
– PI Dasarathy has developed novel theory and algorithms for BO and sequential ML frameworks that leverage structure; see, for instance, [1-4].
– Co-PI Pedrielli has developed several techniques for accelerating BO with local search and for scaling BO by leveraging low-fidelity models; see, for instance, [5-6].
– Co-PI Spanias has synergies with SenSIP on BO for ML [7].
Structured BO sampler (figure)
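One common way to encode graph structure in a GP surrogate, illustrated here as a hedged sketch rather than the project's actual sampler, is a diffusion kernel built from the graph Laplacian: covariance between nodes decays with graph distance, so observations at one node inform the surrogate at nearby nodes. The path graph and the bandwidth `beta` below are arbitrary choices for illustration.

```python
# Sketch: a GP covariance over graph nodes via the graph Laplacian
# (diffusion kernel, K = exp(-beta * L)). Illustrative assumption only.
import numpy as np

def diffusion_kernel(adj, beta=0.5):
    """Matrix exponential exp(-beta * L) for the Laplacian L = D - A."""
    L = np.diag(adj.sum(1)) - adj
    # L is symmetric, so use its eigendecomposition for the exponential.
    w, V = np.linalg.eigh(L)
    return V @ np.diag(np.exp(-beta * w)) @ V.T

# Path graph on 4 nodes: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
K = diffusion_kernel(A)
```

The resulting K is a valid (symmetric positive semi-definite) GP covariance, and adjacent nodes (e.g., 0 and 1) are more strongly correlated than distant ones (e.g., 0 and 3), which is how graph structure can bias sampling toward promising regions of a structured space.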
Project Differentiators
› What results does this project seek that are different (better) than others?
– A novel approach to scaling BO to high dimensions by leveraging structure.
– Novel strategies for designing efficient and effective DNNs whose hyperparameters are tuned using insights from applications.
› What specific innovations or insights are sought by this research that distinguish it from related work?
– Computationally efficient algorithms for BO, especially for inputs from mixed spaces.
– Ready extension to other areas, such as:
  › Other ML applications (e.g., GAN training)
  › Biomedical and computational chemistry applications (e.g., prediction of secondary structure, drug discovery)
Connection to NCSS Competencies/Capabilities
– Sensors
– Neural & ML Nets
Statement of Work
Briefly describe the work to be performed, task budgets, and deliverables for the 5 most important tasks planned for this project.

Task#  | Description                                                                              | Budget | Deliverable
Task-1 | Development of a novel graph-based sampler for efficient sampling from structured spaces | 3 MOS  | Preliminary results on the sampler and validation on publicly available datasets. Open-source software.
Task-2 | Development of integrated BO with structured and continuous spaces                       | 4 MOS  | End-to-end system. Performance analysis. Open-source software.
Task-3 | DNN training for sensor applications using our novel BO framework                        | 2 MOS  | Presentation and trained NN, with validation on publicly available datasets.
Task-4 | Compare algorithms with prior work and establish final results                           | 3 MOS  | Software, final report. Prepare IEEE paper.
Sponsorship and Collaboration
› Efforts to involve multiple companies in project sponsorship:
– Raytheon
– On Semi
– NXP
› Multi-university collaboration: Describe efforts to involve multiple universities in sponsorship of the proposed research (whether or not they were successful).
– This work will likely be performed mostly at ASU.
References
– [1] Lejeune, D., Dasarathy, G., and Baraniuk, R. (2020). Thresholding Graph Bandits with GrAPL. In International Conference on Artificial Intelligence and Statistics (AISTATS), pp. 2476–2485. PMLR.
– [2] Kandasamy, K., Dasarathy, G., Schneider, J., and Póczos, B. (2017). Multi-fidelity Bayesian optimisation with continuous approximations. In International Conference on Machine Learning (ICML), pp. 1799–1808. PMLR.
– [3] Kandasamy, K., Dasarathy, G., Oliva, J. B., Schneider, J., and Póczos, B. (2016). Gaussian process bandit optimisation with multi-fidelity evaluations. Advances in Neural Information Processing Systems (NeurIPS) 29, 992–1000.
– [4] Kandasamy, K., Dasarathy, G., Oliva, J., Schneider, J., and Póczos, B. (2019). Multi-fidelity Gaussian process bandit optimisation. Journal of Artificial Intelligence Research (JAIR) 66, 151–196.
– [5] Mathesen, L., Pedrielli, G., Ng, S. H., et al. (2021). Stochastic optimization with adaptive restart: a framework for integrated local and global learning. Journal of Global Optimization 79, 87–110.
– [6] Zabinsky, Z. B., Pedrielli, G., and Huang, H. (2019). A Framework for Multi-fidelity Modeling in Global Optimization Approaches. In: Nicosia, G., Pardalos, P., Umeton, R., Giuffrida, G., and Sciacca, V. (eds), Machine Learning, Optimization, and Data Science (LOD 2019). Lecture Notes in Computer Science, vol. 11943. Springer, Cham.
– [7] Malu, M., Dasarathy, G., and Spanias, A. (2021). Bayesian Optimization Survey. IEEE IISA 2021, July 2021.