Date: Thu, 6 Aug 2020 13:30:13 +0000
From: Jammie Chang <[log in to unmask]>

Notice and Invitation

Oral Defense of Doctoral Dissertation

The Volgenau School of Engineering, George Mason University



Zhuwei Qin



Bachelor of Science, Tianjin University of Science and Technology, 2014

Master of Science, Oregon State University, 2016



Interpretable Deep Learning for Efficient Mobile Computing



Thursday, August 6, 2020, 10:00 AM-11:00 AM



Via Zoom Meeting:



https://gmu.zoom.us/j/97522143463



All are invited to attend.



Committee



Dr. Xiang Chen, Chair

Dr. Kun Sun

Dr. Zhi Tian

Dr. Brian L. Mark





Abstract

Driven by advances in artificial intelligence and deep learning, more and more intelligent applications are emerging on mobile devices. As one of the most representative deep learning technologies, deep neural networks (DNNs) have become a primary tool in computer vision. However, the heavy computation, memory, and energy demands of DNN models restrict their deployment on resource-constrained mobile devices. In this dissertation, I focus on research solutions that enable efficient DNN processing by qualitatively interpreting the networks' internal working mechanisms (i.e., neuron functionality). I propose a set of computation optimization approaches for DNN execution on mobile devices that build on improved model interpretability. I first propose a functionality-oriented convolutional filter pruning method that optimizes the DNN for fast inference. To further adapt DNNs to diverse mobile applications, I propose a class-adaptive DNN reconfiguration framework. Finally, I propose a collective edge learning system that enables DNN training on mobile systems. Together, these contributions provide a novel mobile DNN optimization approach that combines neural network interpretation with mobile system characteristics for greater performance gains.
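
For readers unfamiliar with convolutional filter pruning, the snippet below is a minimal, generic sketch of magnitude-based filter pruning in PyTorch. It is not the functionality-oriented criterion developed in the dissertation; the prune_conv_filters helper, the L1-norm score, and the keep_ratio parameter are illustrative assumptions, and a real pipeline would also shrink the input channels of downstream layers and fine-tune the pruned model.

# Generic sketch of filter pruning (assumed L1-norm criterion, not the
# dissertation's functionality-oriented method).
import torch
import torch.nn as nn

def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Return a smaller Conv2d keeping only the filters with the largest L1 norms."""
    weight = conv.weight.data                  # shape: (out_channels, in_channels, kH, kW)
    scores = weight.abs().sum(dim=(1, 2, 3))   # one importance score per output filter
    n_keep = max(1, int(keep_ratio * weight.size(0)))
    keep_idx = torch.argsort(scores, descending=True)[:n_keep]

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = weight[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()
    return pruned

# Example: keep half of the 64 filters in a 3x3 convolution.
conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
smaller = prune_conv_filters(conv, keep_ratio=0.5)
print(smaller.weight.shape)   # torch.Size([32, 3, 3, 3])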


