People will be more interested if they see how the stuff they are learning is useful.
Useful 99% of the time means you can make money with it.
Achieving novel results for science, or charitable goals (e.g. creating novel tutorials), are also equally valid. Note that those also imply being able to make a living out of something, just that you will be getting donations and not become infinitely rich, and that is fine.
Projects don't of course need to reach the level of a novel result. But they must at least aim at moving towards that.
This is one of the greatest challenges of education, since a huge part of the useful information is locked under enterprise or military secrecy, or even open academic incomprehensibility, making it nearly impossible for front-line educators to actually find and teach real use cases.
CEO: Jeremy O'Brien
Raised $215M in 2020: www.bloomberg.com/news/articles/2020-04-06/quantum-computing-startup-raises-215-million-for-faster-device
Good talk by the CEO from before he started the company, which gives insight into what they are very likely doing: Video "Jeremy O'Brien: "Quantum Technologies" by GoogleTechTalks (2014)"
PsiQuantum appears to be particularly secretive, even more than other startups in the field.
They want to reuse classical semiconductor fabrication technologies; notably, they have close ties to GlobalFoundries.
The Supermen: The Story of Seymour Cray by Charles J. Murray (1997)
Borrow from the Internet Archive for free: archive.org/details/supermenstory00murr
The initial chapters shed good light on the formation of the military-industrial complex. Being backed by the military, especially just after World War II, was in itself enough credibility to start and foster a company.
It is funny to see how the first computers were very artisanal, made on a one-off basis.
Amazing how Control Data Corporation raised capital IPO style as a startup without a product. The dude was selling shares at dinner parties in his home.
Very interesting mention on page 70 of how Israel bought the UNIVAC 1103, which Cray contributed greatly to design, and everyone knew that it was to make thermonuclear weapons, since that was what the big American labs were using such machines for. This mention should be added to: en.wikipedia.org/wiki/Nuclear_weapons_and_Israel, but that page is Extended Protected... the horrors of Wikipedia.
Another interesting insight is how "unintegrated" computers were back then. They were literally building computers out of individual vacuum tubes, and then individual semiconducting transistors, one gate at a time. Then things got more and more integrated as time went on. That is why the now outdated word "microprocessor" existed: when processors started to fit into a single integrated circuit, they were truly "micro" compared to the monstrosities that had existed previously.
Also, because integration was so weak initially, it was important to manually consider the lengths of the wires that signals had to travel, and to try to put components closer together to shorten the critical path and thus allow higher clock speeds. These constraints are of course also present in modern computer design, but they were just so much more visible in those days.
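To get a feel for why this matters, here is a quick back-of-envelope sketch (not from the book): assuming, as a rough rule of thumb, that signals travel through wiring at about half the speed of light, a one-meter wire on the critical path by itself already caps the clock at roughly 150 MHz, even before counting any gate delays.

```python
# Illustrative back-of-envelope only, not from the book: how wire length
# on the critical path bounds clock speed. The 0.5 * c propagation speed
# and the 1 m wire length are assumptions made up for this example.
c = 3e8                  # speed of light in m/s
v = 0.5 * c              # assumed signal propagation speed in wiring
wire_length = 1.0        # assumed critical-path wire length in meters

delay = wire_length / v  # one-way propagation delay in seconds
max_clock = 1 / delay    # clock ceiling if this wire alone set the cycle time

print(f"propagation delay: {delay * 1e9:.1f} ns")                       # ~6.7 ns
print(f"clock ceiling from this wire alone: {max_clock / 1e6:.0f} MHz") # ~150 MHz
```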
The book unfortunately does not give much detail on Cray's personal life, as mentioned in this book review: www.goodreads.com/review/show/1277733185?book_show_action=true. The childhood section is brief, his wedding is described in one paragraph, and his divorce in one sentence. Part of this is most likely because he was very private about his family: note how Wikipedia had missed his first wedding, and likely misattributes children to the second wedding; see en.wikipedia.org/wiki/Talk:Seymour_Cray section "Weddings and Children".
Cray's work philosophy is highlighted many times in the book, and it is something worth keeping in mind:
- if a design is not working, start from scratch
- don't be the very first pioneer of a technology, let others work out the problems for you first, and then come second and win
Cray's final downfall was when he opted to use gallium arsenide, a promising but hard-to-work-with material, instead of silicon as his way to try and speed up computers, see also: gallium arsenide vs silicon. Also, he went against the extremely strong current of the late 80s and early 90s pointing rather towards massively parallel systems based on off-the-shelf silicon Intel processors, a current that had DARPA support, and which was by far the path that won, very dramatically, as of 2020, see: Intel supercomputer market share.