Will Quantum Computing Bridge the Analog Gap?


Some time back I posed the question, 'Will quantum computing bridge the digital/analog gap...will it finally, once and for all, take us to that next musical level?'
For those too young to remember, early digital was abysmal. Just god awful. Techies loved it...'It's so quiet,' no ticks and pops.
Music lovers decried the total lack of harmonic structure. The fallacy in thinking, I suppose, was that 'digital is perfect'...ones and zeros...there can be no mistake.
Enter Bill Conrad and Lew Johnson, California Audio Labs, Boothroyd Stuart...those who looked at the 'big picture,' saw that the original signal, no matter how well encoded, was NOT getting converted back to analog intact, and started fiddling with the audio circuitry, yada yada.
Thirty years hence...quantum computing tells us there are 'alternate states' of the digital signal...that, hey, maybe they're not just ones and zeroes...maybe there can be MORE information...and what if we could 'read' those alternate states?
I tried to pose that 'next step' thinking but kept running into the 'well, the files are only this big' mentality.
The whole point is...NO, the files are virtually infinite, if/when we apply 'alternate' states of superposition, in which the binary becomes not 'either a one or a zero' but part of each, in micro/nanoseconds...approaching, to my addled brain, what could be...an almost analog state.
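
(To make that 'part of each' idea concrete, here's a toy sketch in Python with NumPy. It's purely illustrative, no claim about real quantum hardware: a qubit's state is a pair of continuous amplitudes, and n qubits together span 2^n of them.)

import numpy as np

# A single qubit: state = alpha*|0> + beta*|1>, with |alpha|^2 + |beta|^2 = 1.
# The amplitudes are continuous; that's the 'part of each' idea above.
alpha, beta = np.sqrt(0.7), np.sqrt(0.3)
qubit = np.array([alpha, beta], dtype=complex)

# n independent qubits combine by tensor product into 2**n amplitudes.
n = 3
state = qubit
for _ in range(n - 1):
    state = np.kron(state, qubit)
print(len(state))  # 8 amplitudes for 3 qubits
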
I just found this:

"A quantum computer would be unlike traditional silicon-based computers that use binary codes of ones and zeros. Quantum computing is based on "superpositions" and qubits that would allow a computer to store information as both zero and one at the same time. The struggle now is to make qubits more reliable, as information becomes lost over time due to quantum fluctuations, a phenomenon known as "fault tolerance."

The new "superhighway" technology will allow intense computing tasks like code-breaking, climate modeling and biomedical simulations to be carried out thousands of times faster, the Rice physicists explained.

"In principle, we don't need many qubits to create a powerful computer. In terms of information density, a silicon microprocessor with one billion transistors would be roughly equal to a quantum processor with 30 qubits," said Du."
So I'll throw this highly speculative question out there...will quantum computing take us there?

Who knows...I'm hoping I live to see it.

Good listening,

Larry
lrsky
The Virtual Girlfriend has been in the works; maybe the Quantum Girlfriend is the better upgrade. I'm still interested in the Analog Girlfriend. Of course, those require constant tweaking :)
"Quantum" is not a synonym for "magic" or circumventing the laws of thermodynamics.
Daverz,
Not sure I understand?

Before, when I posed the quantum question, I kept getting brought back to the 'size of the file' objection.
To my limited brain...the whole nature of this is: given 'superposition,' in which the binary code is no longer just ones and zeroes but virtually infinite 'versions' of those previously limited data points...aren't we talking about 'virtual analog' here?
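
As a sanity check on my own logic, here's the catch in toy form (Python again, purely illustrative): you can hide a continuous value in a qubit's amplitudes, but each readout collapses to a single 0 or 1. Recovering the 'analog' part takes many repeated measurements, and, as I understand it (Holevo's theorem), n qubits only ever hand back n classical bits per run.

import numpy as np

rng = np.random.default_rng(0)

# Hide an 'analog' value p in a qubit: the chance of measuring 1 is p.
p = 0.3141592
shots = rng.random(10_000) < p   # simulate 10,000 one-bit readouts
print(shots[:8].astype(int))     # each readout is a single bit, e.g. [0 1 0 ...]
print(round(shots.mean(), 4))    # only the average over many runs approximates p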

Would Einstein, da Vinci, or Galileo step forward and clear this up? Or at least someone with a better grasp of this...
Tell me where my logic is failing...

Good listening,
Larry
The simple answer is that no one really knows yet what quantum computing will involve. Roger Penrose has written a book outlining his idea that it is quantum processes that allow human consciousness to function.

While "quantum" has become a buzzword for all kinds of total nonsense, the truth is that quantum processes lie at the very heart of our reality, at a much deeper level than, say, the laws of thermodynamics, which are descriptive laws rather than prescriptive ones. There is no fundamental reason why they must be true, although they appear to hold in our local reality. The laws of physics say that any reaction can run equally well in either direction, and there is no good reason why they don't.

Quantum theory has predicted totally weird things like quantum entanglement, which Einstein considered absurd ("spooky action at a distance"). Whatever quantum computing turns out to be like, it is a good bet that it will be different from anything we expect.