Effect of Internet Service Quality on Streaming?

I’ve struggled for a long time with the sound getting much, much worse around dinner time, and in some rare cases I don’t get depth, clarity, dynamics, and imaging back until around midnight. Like many people, I’ve attributed this to noise on my AC lines. But recently I’ve been wondering if internet service quality is at least contributing to the issue in some manner. When I run tests, it appears that jitter and latency are both noticeably worse at the times when the sound is poor. That got me wondering if anyone knows whether one type of internet service is better than another for HiFi streaming. For example, is one technology (DSL, cable, fiber) better than another, or does it not matter? And what about speed? I’m particularly interested in anyone who has real-world experience from experimenting in this area…

As I have understood streaming over the many years that I have been doing it (remember the Griffin iMic?), the "streamed" files are downloaded, cached and replayed by the given software: Apple Music/iTunes, Spotify, Qobuz et al.
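To illustrate that download/cache/replay pattern, here is a minimal, purely illustrative Python sketch (none of this is any player’s actual code; all names are made up). The point is that the cache fills at whatever rate the network allows, while playback reads from the cache at its own clock, decoupled from network timing:

```python
import io

def fill_cache(source: io.BytesIO, cache: io.BytesIO, chunk_size: int = 4096) -> int:
    """Copy the 'stream' into a local cache in chunks; returns bytes cached."""
    total = 0
    while True:
        chunk = source.read(chunk_size)
        if not chunk:
            break
        cache.write(chunk)
        total += len(chunk)
    return total

def replay(cache: io.BytesIO) -> bytes:
    """Playback reads from the local cache, not from the network."""
    cache.seek(0)
    return cache.read()

audio = bytes(range(256)) * 64        # stand-in for a downloaded track
cache = io.BytesIO()
fill_cache(io.BytesIO(audio), cache)  # network timing only affects this step
assert replay(cache) == audio          # once cached, playback sees identical bits
```

Under this model, network jitter affects how smoothly the cache fills, not the bits the player ultimately reads, as long as the buffer never runs dry.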

Although some of you may refute that, it seems that the problem of line, processor, cable, etc. noise has largely been solved by well-wrought playback software.

Just measuring the noise floor in a quiet NJ den, I get readings of 27–30 dB.

Whose ears and brain can process that amount of ambient hash and hear the "effect" of AC, cable, fiber transmission?


@nyev, did you say you had measured your power to see how bad it is?

Ever try an ADD-POWR Sorcer X4? They seem to work well with bad power systems.

If you live in the NW or So Cal I can help you try one.

@jkevinoc , I believe you are correct about the way that modern playback software works. But I don’t believe the “captured” and cached stream would necessarily be a bit-perfect copy of an equivalent purchased local file at the same bit rate. Possibly due to differences in the streaming company’s source file, and possibly due to errors introduced in the stream that the cached copy originally came from (due to noise or jitter). And yes, I am aware that the network stack has error detection and correction built in (Ethernet frames carry a CRC, and TCP retransmits lost or damaged packets), but I believe noise and jitter are still factors.
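The bit-perfect question is at least checkable in principle: if you can get at both copies on disk, hashing them settles it, since identical hashes mean identical bits. A minimal sketch (the byte strings here are hypothetical stand-ins for the two files):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 digest of a byte string (read a file's bytes for real files)."""
    return hashlib.sha256(data).hexdigest()

purchased = bytes(range(16))       # stand-in for a purchased local file
cached = bytes(purchased)          # stand-in for the cached stream copy
print(sha256_of(purchased) == sha256_of(cached))    # True -> bit-perfect

altered = purchased[:-1] + b"\xff" # change the last byte
print(sha256_of(purchased) == sha256_of(altered))   # False -> not bit-perfect
```

Note this only tests the delivered bits; it says nothing about whether the streaming service started from the same master as the purchased file.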


@jeffseight , I am based in western Canada. Never heard of the device you are referring to but I’ll look into it - thanks for the suggestion.

I’ve not been able to measure the quality of my power, as I don’t have a meter that reads THD. I mentioned above that I had measured my internet service quality using Fusion network tests (a website you go to and click to start the test); it measures jitter, latency, and download and upload speed. And as I mentioned, when the sound is bad, jitter and latency go from their normal 3 ms and 12 ms to spikes of around 40 ms and 70 ms, respectively. I’m simply guessing that AC quality is also poorer during those times.
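For anyone who wants to log this themselves rather than rely on a web test, latency and jitter can be computed from a series of ping round-trip times; a common simple definition of jitter is the average change between consecutive samples. A sketch (the RTT numbers below are made up, loosely modeled on the values above):

```python
from statistics import mean

def latency_and_jitter(rtts_ms):
    """Mean round-trip time, plus jitter as the mean absolute
    difference between consecutive samples."""
    lat = mean(rtts_ms)
    jit = mean(abs(b - a) for a, b in zip(rtts_ms, rtts_ms[1:]))
    return lat, jit

quiet = [12, 12, 13, 11, 12]   # hypothetical late-night samples (ms)
busy  = [12, 70, 15, 65, 40]   # hypothetical dinner-time samples (ms)
print(latency_and_jitter(quiet))
print(latency_and_jitter(busy))
```

Feeding in the output of `ping` run hourly would show whether the degradation really tracks the evening usage peak.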


@nyev Could the TV be sending out some sort of interference? Could it be one of the components (doubtful)? What kind of lights are in use nearby? Is there some big electricity-hungry appliance nearby? Microwave tower nearby?

Perhaps I missed it but have you tried another streamer?

’Round suppah time is when internet usage in residential areas is highest.

FYI, I use Starlink, and that’s way up from the 12 Mbps service I had before it.