
Robust Physical-World Attacks on Deep Learning Visual Classification

This paper appears at CVPR 2018.

Robust Physical-World Attacks on Deep Learning Visual Classification

Kevin Eykholt (1), Ivan Evtimov* (2), Earlence Fernandes (2), Bo Li (3), Amir Rahmati (4), Chaowei Xiao (1), Atul Prakash (1), Tadayoshi Kohno (2), and Dawn Song (3)

(1) University of Michigan, Ann Arbor
(2) University of Washington
(3) University of California, Berkeley
(4) Samsung Research America and Stony Brook University

Abstract

Recent studies show that state-of-the-art deep neural networks (DNNs) are vulnerable to adversarial examples, resulting from small-magnitude perturbations added to the input. Given that emerging physical systems are using DNNs in safety-critical situations, adversarial examples could mislead these systems and cause dangerous situations. Therefore, understanding adversarial examples in the physical world is an important step towards developing resilient learning algorithms.
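The "small-magnitude perturbations" the abstract refers to can be illustrated with a minimal fast-gradient-sign sketch on a toy logistic classifier. This is a generic illustration of adversarial perturbations, not the paper's RP2 attack; the weights and input below are made-up values chosen so the perturbation flips the prediction.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturbation(w, x, y, eps):
    """Perturb input x by eps in the direction of the loss gradient's sign."""
    p = sigmoid(w @ x)
    # Gradient of the logistic loss w.r.t. the input x.
    grad_x = (p - y) * w
    # Small-magnitude perturbation: each feature moves by at most eps.
    return x + eps * np.sign(grad_x)

w = np.array([2.0, -1.0, 0.5])    # hypothetical trained weights
x = np.array([0.4, 0.3, -0.4])    # clean input, classified as class 1
y = 1.0                           # true label

x_adv = fgsm_perturbation(w, x, y, eps=0.2)
print(int(sigmoid(w @ x) > 0.5), int(sigmoid(w @ x_adv) > 0.5))  # prints: 1 0
```

Even though each feature changed by only 0.2, the classifier's decision flips; the paper studies how to make such perturbations survive real-world conditions like distance and viewing angle.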

[…] examples on deep learning models that interact with the physical world through vision. Our overarching goal with this work is to inform research in building robust vision models and to raise awareness of the risks that future physical learning systems might face. We include more examples and videos of the drive-by tests on our webpage.
