BlurNet: Defense by Filtering the Feature Maps
Submitted by grigby1 on Mon, 12/28/2020 - 11:48am
Keywords: Scalability, malicious examples, Metrics, neural nets, Neural networks, Perturbation methods, pubcrawl, resilience, Resiliency, robust physical perturbations, Robustness, RP2, malicious adversary, security of data, standard blur kernels, standard-architecture traffic sign classifiers, standards, stop signs, substitute model, targeted misclassification rates, traffic engineering computing, victim model, white stickers, white-box attacks, frequency analysis, adaptive attack evaluation, adaptive filtering, adversarial defense, adversarial images, Adversarial Machine Learning, Adversarial robustness, attack algorithms, black stickers, blackbox transfer attack, BlurNet, depthwise convolution layer, Adaptation models, gradient information, high frequency noise, image recognition, image restoration, input image, Kernel, layer feature maps, learning (artificial intelligence), low-pass filters, lowpass filtering behavior
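
The keywords above describe BlurNet's central idea: the high-frequency noise introduced by physical perturbations (e.g., white and black stickers on stop signs) can be suppressed by low-pass filtering early feature maps with a depthwise convolution using standard blur kernels. The sketch below is a minimal illustration of that idea in PyTorch; the 3x3 box kernel, the channel count, and the placement after the first convolution layer are assumptions for illustration, not the paper's exact configuration.

    # Minimal sketch: low-pass filtering of feature maps via a fixed
    # depthwise convolution. Kernel choice and placement are illustrative
    # assumptions, not the paper's exact architecture.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class FeatureMapBlur(nn.Module):
        """Applies a fixed (non-learned) blur to each channel independently."""

        def __init__(self, channels: int, kernel_size: int = 3):
            super().__init__()
            # Normalized box kernel acts as a simple low-pass filter.
            kernel = torch.ones(channels, 1, kernel_size, kernel_size)
            kernel /= kernel_size * kernel_size
            self.register_buffer("kernel", kernel)
            self.padding = kernel_size // 2
            self.channels = channels

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # groups=channels makes this a depthwise convolution:
            # one blur kernel per feature-map channel.
            return F.conv2d(x, self.kernel, padding=self.padding,
                            groups=self.channels)


    class BlurredStem(nn.Module):
        """Hypothetical first stage of a traffic sign classifier with the blur."""

        def __init__(self):
            super().__init__()
            self.conv1 = nn.Conv2d(3, 32, kernel_size=3, padding=1)
            self.blur = FeatureMapBlur(channels=32)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.blur(F.relu(self.conv1(x)))


    if __name__ == "__main__":
        stem = BlurredStem()
        images = torch.randn(4, 3, 32, 32)  # e.g., traffic-sign-sized inputs
        print(stem(images).shape)           # torch.Size([4, 32, 32, 32])

The fixed box kernel above could be swapped for a Gaussian kernel or, as the "adaptive filtering" keyword suggests, for a learned depthwise filter; either variant keeps the same depthwise low-pass structure.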