Neural Networks Intuitions: 13. Out-of-Distribution Detection and HOD Loss — Paper Explanation


The term “unseen” can refer to two different things here:
- inputs that do not correspond to any of the class patterns the network was trained on.
- inputs that differ greatly in texture, lighting conditions, environment, etc. from those present in the training set.
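A common baseline for flagging both kinds of unseen input is the maximum softmax probability (MSP): a well-trained classifier tends to produce a peaked softmax on familiar inputs and a flatter one on novel classes or heavily shifted appearance. The sketch below is illustrative, not the paper's HOD-loss method; the function names and the threshold value are assumptions for the example.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    # Maximum softmax probability: high for confident
    # in-distribution inputs, lower for unseen inputs
    # (novel classes or shifted texture/lighting tend
    # to flatten the softmax).
    return softmax(logits).max(axis=-1)

def is_ood(logits, threshold=0.5):
    # Flag an input as out-of-distribution when its
    # confidence falls below the threshold.
    return msp_score(logits) < threshold

# A peaked (in-distribution) prediction vs. a near-uniform one.
in_dist_logits = np.array([6.0, 0.5, 0.2])
ood_logits = np.array([1.1, 1.0, 0.9])
```

Thresholding MSP is only a baseline; methods like the HOD loss discussed in the post aim to improve on it by shaping the training objective itself.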
