Torrent details for "Marchisio A. Energy Efficiency and Robustness of Advanced ML Architectures 2025 [andryold1]"    Log in to bookmark

Torrent details
Category:
Language: English
Total Size: 25.24 MB
Info Hash: 07d1b5bedb396431b6fa2a711d0d830b90d388d5
Added By:
Added: 14-09-2024 12:30
Views: 72
Health:
Seeds: 5
Leechers: 0
Completed: 253

Description
Textbook in PDF format

Machine Learning (ML) algorithms have achieved a high level of accuracy, and ML-based applications are widely used in many systems and platforms. However, developing efficient ML-based systems requires addressing energy efficiency and robustness together, whereas existing techniques typically optimize for a single objective or pursue only a limited set of goals. This book tackles these challenges by exploiting the unique features of advanced ML models and investigates cross-layer concepts and techniques that engage both hardware-level and software-level methods to build robust and energy-efficient architectures for these advanced ML networks. More specifically, the book improves the energy efficiency of complex models such as CapsNets through a specialized flow of hardware-level designs and software-level optimizations that exploit application-driven knowledge of these systems and their error tolerance through approximations and quantization. It also improves the robustness of ML models, in particular SNNs executed on neuromorphic hardware, owing to their inherently cost-effective features. Finally, it integrates multiple optimization objectives into specialized frameworks for jointly optimizing the robustness and energy efficiency of these systems. This is an important resource for students and researchers in computer and electrical engineering who are interested in developing energy-efficient and robust ML.
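To illustrate the kind of error-tolerant approximation mentioned above, the following minimal Python sketch performs uniform post-training quantization of a weight tensor to a chosen bit-width. It is an illustrative assumption only, not the specialized hardware/software flow proposed in the book; the function name and parameters are hypothetical.

import numpy as np

def quantize_uniform(weights, n_bits=8):
    # Map float weights onto 2**n_bits evenly spaced levels and back,
    # trading a small approximation error for cheaper storage and arithmetic.
    w_min, w_max = weights.min(), weights.max()
    levels = 2 ** n_bits - 1
    scale = (w_max - w_min) / levels if w_max > w_min else 1.0
    codes = np.round((weights - w_min) / scale)      # integers in [0, levels]
    return codes * scale + w_min                     # dequantized approximation

w = np.random.randn(4, 4).astype(np.float32)
print("max abs error at 4 bits:", np.abs(w - quantize_uniform(w, 4)).max())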
Among Machine Learning (ML) systems, Deep Neural Networks (DNNs) have emerged as an established milestone for several applications, such as computer vision, medicine, finance, and robotics. This has led to the need to deploy the DNN inference workload across various devices, including embedded systems with constrained resources. However, the current trends in the ML community point in the opposite direction, since newer networks tend to be deeper and more complex. For instance, Capsule Networks (CapsNets) are a particular type of DNN based on capsules, which are arrays of neurons, that learn high-level features with better capabilities than traditional DNNs. As a result, the next generation of computing platforms executing advanced DNNs would exhibit high complexity and consume high energy, thus challenging their feasible implementation in resource-constrained devices.
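For context on what a capsule computes, here is a minimal sketch of the squash nonlinearity from the original CapsNet formulation, which maps a capsule's raw output vector to a vector whose length lies in [0, 1) and can be read as an existence probability; the shapes in the usage line are illustrative assumptions.

import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Shrink short vectors toward zero and long vectors toward unit length.
    norm_sq = np.sum(s * s, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

caps_out = squash(np.random.randn(32, 8))   # e.g. 32 capsules with 8-D pose vectors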
On the other hand, Spiking Neural Networks (SNNs) have emerged as an efficient computation infrastructure for event-based DNNs, which more closely resemble our current understanding of how the human brain functions. This has led to the development of the neuromorphic computing paradigm, whose hardware architectures support the execution of energy-efficient event-based SNNs.
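As a rough picture of the event-based computation SNNs perform, the sketch below simulates a single leaky integrate-and-fire neuron over a binary spike train; the time constant, threshold, and weight are arbitrary illustrative values, not parameters of any particular neuromorphic platform.

import numpy as np

def lif_neuron(input_spikes, tau=20.0, v_thresh=1.0, v_reset=0.0, w=0.5):
    # One simulation step per input sample: leak the membrane potential,
    # add the weighted input spike, and fire when the threshold is crossed.
    v, out = v_reset, []
    for s in input_spikes:
        v += -v / tau + w * s
        if v >= v_thresh:
            out.append(1)
            v = v_reset
        else:
            out.append(0)
    return out

spikes_in = (np.random.default_rng(0).random(100) < 0.3).astype(int)
print("output spikes:", sum(lif_neuron(spikes_in)))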
Another fundamental aspect to consider when deploying advanced Deep Learning (DL) architectures is security. When dealing with safety-critical applications, the system requires high robustness against a variety of vulnerability threats. An adversary can compromise the integrity of a DL system through attacks at different levels of the hardware and software stacks, perturbing the inputs, the memory, or the computational engine. As a result, defensive countermeasures must be applied at different abstraction layers of the system, which typically incur some energy and computation overhead. Moreover, while the security of traditional DNNs has been extensively studied, investigating the security of advanced DL systems offers unique opportunities to exploit their peculiar features.
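As a concrete example of the input perturbations such attacks rely on, the following sketch implements the standard Fast Gradient Sign Method (FGSM) in PyTorch; model stands for any differentiable classifier and is a placeholder, and this is a generic attack used for illustration, not a technique taken from the book.

import torch
import torch.nn.functional as F

def fgsm_perturb(model, x, y, eps=0.03):
    # Shift each input by eps in the direction that increases the loss,
    # producing an adversarial example within a small L-infinity ball.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    return (x_adv + eps * x_adv.grad.sign()).clamp(0.0, 1.0).detach()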
Introduction
Background and Related Work
Hardware and Software Optimizations for Capsule Networks
Adversarial Security Threats for DNNs and CapsNets
Integration of Multiple Design Objectives into NAS Frameworks for CapsNets and DNNs
Efficient Optimizations for Spiking Neural Networks on Neuromorphic Hardware
Security Threats for SNNs on Discrete and Event-Based Data
Conclusion and Outlook
