The definition needs clarification
While I try to understand the meaning of selectivity, this article is not of much help right now. A figure would help.
As I understand it, selectivity is the largest ratio between an interferer (large) and a wanted signal (small) that still yields a detectable signal. The interferer and the wanted signal are measured at the receiver input, but filters or other circuitry remove most of the interferer before it reaches the detector. The selectivity of a receiver would then depend not only on the filters, but also on the signal-to-noise ratio (SNR) that the detector needs.
Or do I misunderstand the word selectivity?
No, you are confusing sensitivity and selectivity. Selectivity in a receiver essentially refers to its ability to reject an unwanted signal, usually larger than the desired signal, at a different frequency but still within the RF passband of the receiver. It is typically tested by injecting a "desired" tone at some reference power level together with a second tone offset in frequency from the first (but at a larger power level), and measuring the ratio of the two tones at the output of the receiver. You are correct that the ratio is often measured indirectly (e.g. by bit error rate), but selectivity is backed out as a ratio from this measurement.
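A minimal numeric sketch of the two-tone test described above, with all power levels and the filter rejection figure chosen purely as illustrative assumptions:

```python
# Hypothetical two-tone selectivity test; all dBm/dB values are example
# numbers, not from any particular standard or receiver.
desired_in_dbm = -60.0      # "desired" tone injected at the reference level
interferer_in_dbm = -30.0   # offset tone, injected at a larger power level

# Assume the receiver's filtering attenuates the offset interferer by
# 50 dB relative to the desired tone before the detector.
filter_rejection_db = 50.0

# Ratio of interferer to desired tone at the receiver input (dB):
input_ratio_db = interferer_in_dbm - desired_in_dbm

# Ratio of desired tone to residual interferer at the receiver output (dB):
output_ratio_db = desired_in_dbm - (interferer_in_dbm - filter_rejection_db)

print(f"input interferer/desired ratio:  {input_ratio_db:.1f} dB")
print(f"output desired/interferer ratio: {output_ratio_db:.1f} dB")
```

In practice one sweeps the interferer level up until the detection criterion (e.g. a target bit error rate) just fails, and quotes the input ratio at that point as the selectivity.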
Sensitivity essentially refers to the weakest signal the receiver can detect in the presence of noise and other interferers. This depends on the modulation format of the signal and the SNR required at the detector. The "noise and other interferers" that corrupt the desired signal consist of thermal and shot noise and various forms of nonlinear distortion produced by interfering tones at other frequencies. Even though these interfering tones might be removed by the selectivity of the receiver, they can still produce distortion that falls at the same frequency as the desired signal and thus cannot be filtered out.
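The dependence on bandwidth, receiver noise, and required SNR can be put into the standard link-budget form for sensitivity (a sketch; the noise figure and required SNR below are example values, not tied to any specific receiver or modulation):

```python
import math

def sensitivity_dbm(bandwidth_hz, noise_figure_db, required_snr_db):
    """Minimum detectable signal level in dBm: thermal noise floor
    (kTB at 290 K, i.e. -174 dBm/Hz) integrated over the channel
    bandwidth, plus the receiver's noise figure, plus the SNR the
    detector needs for the chosen modulation format."""
    return (-174.0 + 10 * math.log10(bandwidth_hz)
            + noise_figure_db + required_snr_db)

# Example: 1 MHz channel, 6 dB noise figure, modulation needing 10 dB SNR.
print(f"sensitivity: {sensitivity_dbm(1e6, 6.0, 10.0):.1f} dBm")
```

Any distortion products that land in-channel (from interferers that survived as nonlinear mixing terms) effectively raise the noise floor above the thermal term, degrading the sensitivity computed here.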