Quantum computers as experiments whose outcomes are hard to predict
One possibly interesting and possibly obvious point of view is that a quantum computer is an experimental device that executes a quantum probabilistic experiment for which the probabilities cannot be efficiently calculated theoretically by a classical computer.
This is how quantum computing was originally theorized by the likes of Richard Feynman: they noticed that "Hey, here's a well formulated quantum mechanics problem, for which I know the algorithm to solve (calculate the probability of outcomes), but it would take time exponential in the problem size".
The converse is then of course that if you were able to encode useful problems in such an experiment, then you would have a computer that allows for exponential speedups.
This can be seen very directly by studying one specific quantum computer implementation. E.g. if you take the simplest one to understand, the photonic quantum computer, you can build systems for which calculating the probabilities that photons will exit through certain holes and not others takes exponential time.
The obvious aspect of this idea comes from quantum logic gates: they are needed precisely because you can't compute the circuit's matrix explicitly, as it grows exponentially with the number of qubits. Knowing the full explicit matrix is impossible in practice, and knowing the matrix is equivalent to knowing the probabilities of every outcome.
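To make the exponential blowup concrete, here is a minimal sketch of brute-force simulation, assuming NumPy and the standard textbook Hadamard gate (neither of which appears in the text above): an n-qubit state has 2^n complex amplitudes, so just storing the outcome probabilities, let alone computing them, scales exponentially with n.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n):
    # Build the full 2^n x 2^n operator as a Kronecker product of the
    # gate on `target` and identities on every other qubit, then apply it.
    op = np.array([[1.0]])
    for qubit in range(n):
        op = np.kron(op, gate if qubit == target else np.eye(2))
    return op @ state

n = 3  # number of qubits: the state vector has 2**n amplitudes
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in |00...0>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
for qubit in range(n):
    state = apply_single_qubit_gate(state, H, qubit, n)

probabilities = np.abs(state) ** 2  # one probability per measurement outcome
print(len(probabilities))  # 2**n outcomes: doubles every time a qubit is added
```

Bumping n up by one doubles the memory and roughly quadruples the work of each matrix application, which is exactly why classical simulation of such experiments hits a wall so quickly.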
Quantum logic gate
At Section "Quantum computing is just matrix multiplication" we saw that making a quantum circuit actually comes down to designing one big unitary matrix.
We have to say though that that was a bit of a lie.
Quantum programmers normally don't just produce those big matrices manually from scratch.
Instead, they use quantum logic gates.
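As a concrete illustration of how gates relate back to that one big matrix, here is a minimal sketch, assuming NumPy and the usual textbook matrices for the Hadamard and CNOT gates (not given in the text above): a two-gate, two-qubit circuit composed into a single 4x4 unitary via Kronecker products and matrix multiplication.

```python
import numpy as np

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
# CNOT with qubit 0 as control and qubit 1 as target,
# basis ordered |00>, |01>, |10>, |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# "H on qubit 0, then CNOT" as a single 4x4 unitary:
# lift H to the 2-qubit space with a Kronecker product,
# then multiply later gates on the left.
U = CNOT @ np.kron(H, I)

state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
print(U @ state)  # (|00> + |11>) / sqrt(2): a Bell state
```

So the big unitary is still there, it is just built up implicitly by composing small, well-understood gates instead of being written down by hand.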