TL;DR: I can consistently detect the difference between a tone in the 30–60 kHz range and digital silence in a blind test.

As mentioned in my profile post here, I did a blind test between a 30 kHz sine wave and digital silence, could hear a difference, and got a 10/10 in the blind test. Prior to that I didn't think I should be able to hear a difference, but I did. I just repeated the experiment and got perfect 10/10 scores all the way up to 60 kHz. I also measured the output of both my DAC and the speaker drivers to check for subharmonics (or other types of distortion) that I might have been hearing instead, but I couldn't find any. Again I tried to match the level to around 80 dB at 20 kHz. According to my own measurements I should have decent output up to about 60 kHz.

With the tones playing I mainly felt a bit of pressure on my head, a bit like a headache and a bit like when the acoustic reflex sets in (which it certainly does at those levels for sounds in the audible range). I wouldn't say I heard a high-pitched tone, but I felt the noise floor increasing and maybe felt a bit uneasy in general. Detecting the tones got more and more difficult past 40 kHz, and 60 kHz wasn't easy: not as difficult as the 24-bit vs. 20-bit files, more on the level of 512 kbit/s Ogg Vorbis vs. lossless Red Book.

The 70 kHz tone was odd in that I'm sure I heard a very quiet difference tone, but to the best of my ability I couldn't measure one. It sounded like a tone in the 2 kHz range. I could not hear a difference between silence and the 80, 85, and 90 kHz tones.

I know it's no proof, but I took a screenshot of a 10/10 score I got in one of the blind tests at 50 kHz. I'm not making this up, and hearing the difference here was a lot easier than the difference between 20-bit and 24-bit files. I later added 70, 80, and 85 kHz files.
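For anyone who wants to try replicating this, here is a minimal sketch of how the test files could be generated: one ultrasonic sine tone and a matching stretch of digital silence, both at a sample rate high enough to represent the tone. This is just my assumption of a workable setup (192 kHz sample rate, 16-bit mono WAV, arbitrary 0.5 amplitude); file names are placeholders, and the actual level matching to ~80 dB has to be done acoustically with your own measurements, not in the file.

```python
import wave
import numpy as np

def write_wav(path, samples, rate=192_000):
    """Write mono 16-bit PCM (floats in [-1, 1]) to a WAV file."""
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)        # 16-bit
        f.setframerate(rate)
        f.writeframes((samples * 32767).astype("<i2").tobytes())

rate = 192_000               # Nyquist at 96 kHz covers all tones tested here
dur = 5.0                    # seconds per trial file
t = np.arange(int(rate * dur)) / rate

# 30 kHz sine at an arbitrary digital amplitude; adjust playback gain
# (measured acoustically) to hit the target SPL
tone = 0.5 * np.sin(2 * np.pi * 30_000 * t)
write_wav("tone_30k.wav", tone, rate)

# digital silence of identical length and format for the blind comparison
write_wav("silence.wav", np.zeros_like(tone), rate)
```

The two files can then be loaded into any ABX tool; keeping length, format, and sample rate identical rules out giveaway differences other than the tone itself.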