We tested the accuracy with which the human auditory system represents abstract rules in the auditory environment by means of the MMN and P3a event-related brain potentials (ERPs). For this purpose, ERPs were recorded to pairs of auditory stimuli in 16 adults. The abstract rule was the direction of the frequency change within the pair. Standard pairs (p = 0.8) were formed by two pure tones of the same frequency. Deviant pairs (p = 0.2) were formed by tones of different frequency, the second tone being 2, 4, 6, or 8 musical steps of the tempered scale higher or lower in frequency than the first one. Each deviant type was presented in a separate block, yielding four conditions of ascending and four of descending frequency. There were thirteen frequency levels and five different physical pairs for each stimulus type. Stimulus pairs were presented with a stimulus-onset asynchrony of 700 ms. The EEG was recorded from ten electrodes (Fp1, Fp2, F3, F4, Fz, C3, C4, Cz, LM, RM; reference: nose). Epochs of 700 ms, including a 100 ms pre-stimulus baseline, were averaged time-locked to the first stimulus of each pair. The amplitudes of the MMN and P3a increased linearly as a function of the magnitude of the within-pair frequency change. These results indicate that, in addition to extracting abstract rules from among discrete stimuli, the auditory system can represent these relationships quantitatively.
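The deviant magnitudes above can be made concrete with a small sketch. Assuming a "musical step of the tempered scale" means one semitone of 12-tone equal temperament (an assumption; the abstract does not define the step size, and the 440 Hz first tone is purely illustrative), the second-tone frequency of a deviant pair follows the ratio 2^(n/12):

```python
import math

def step_ratio(steps: int) -> float:
    """Frequency ratio for `steps` semitones in 12-tone equal temperament."""
    return 2.0 ** (steps / 12.0)

def deviant_second_tone(f1_hz: float, steps: int) -> float:
    """Frequency of the second tone in a deviant pair, `steps` semitones
    above (positive) or below (negative) the first tone.
    Illustrative only: the study's actual tone frequencies are not given here."""
    return f1_hz * step_ratio(steps)

# The four deviant magnitudes of the study (2, 4, 6, 8 steps), ascending
# and descending, relative to a hypothetical 440 Hz first tone:
for n in (2, 4, 6, 8):
    up = deviant_second_tone(440.0, n)
    down = deviant_second_tone(440.0, -n)
    print(f"±{n} steps: {up:.1f} Hz ascending, {down:.1f} Hz descending")
```

Under this assumption the deviance grows geometrically in hertz but linearly in musical steps, which is the scale on which the MMN and P3a amplitudes were found to increase linearly.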