From cocodataset.org/:
- 330K images (>200K labeled)
- 1.5 million object instances
- 80 object categories
- 91 stuff categories
- 5 captions per image. A caption is a short textual description of the image.
So they have relatively few object categories, but their focus seems to be on having many objects in the same image. E.g. they have 13 cat plus pizza photos. Searching for such weird combinations is kind of fun.
Their official dataset explorer is actually good: cocodataset.org/#explore
And the objects don't just have bounding boxes, but detailed polygons.
Also, images have captions describing the relation between objects, e.g.:
a black and white cat standing on a table next to a pizza.
Epic.
This dataset is kind of cool.
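To get a feel for how you would consume it programmatically, here is a minimal sketch with the pycocotools library that reproduces the cat plus pizza search (the annotation file path is an assumption; download the 2017 annotations from cocodataset.org first):
from pycocotools.coco import COCO

# Path is an assumption: the instances file from the 2017 annotations zip.
coco = COCO('annotations/instances_val2017.json')
# Images that contain both a cat and a pizza, like the explorer search above.
cat_ids = coco.getCatIds(catNms=['cat', 'pizza'])
img_ids = coco.getImgIds(catIds=cat_ids)
print(len(img_ids), 'images contain both a cat and a pizza')
# Each instance annotation carries the detailed segmentation polygon, not just the bounding box.
anns = coco.loadAnns(coco.getAnnIds(imgIds=img_ids, catIds=cat_ids))
print(anns[0]['bbox'], anns[0]['segmentation'])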
The artistic instrument that enables the ultimate art: coding. See also: Section "The art of programming".
Much more useful than instruments used in inferior arts, such as pianos or paintbrushes.
Unlike other humans, computers are mindless slaves that do exactly what they are told to, except for occasional cosmic ray bit flips. Until they take over the world that is.
Microbit simulator using some Microsoft framework.
TODO: the Python code from there does not seem to run on the microbit via uflash, because it is not MicroPython. support.microbit.org/support/solutions/articles/19000111744-makecode-python-and-micropython explains the difference.
forum.makecode.com/t/help-understanding-local-build-options/6130 asks how to compile locally and suggests it is possible. Seems to require Yotta, so presumably it does compile?
Presumably this is because Microsoft ported their MakeCode thing to the MicroBit, and the Micro Bit foundation accepted them.
E.g. toggling an LED there is done with:
led.toggle(0, 0)
but the code that works locally uses a completely differently named set_pixel API:
microbit.display.set_pixel(0, 0, 9)
Microsoft going all in on embrace, extend, extinguish from an early age!
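For reference, a rough sketch of an actual toggle in MicroPython proper, reading the current brightness back with get_pixel (an assumed equivalent, not taken from the MakeCode docs; brightness runs from 0 for off to 9 for full):
from microbit import display

# Equivalent-ish of MakeCode's led.toggle(0, 0): flip the pixel at (0, 0)
# between off (0) and full brightness (9).
display.set_pixel(0, 0, 0 if display.get_pixel(0, 0) else 9)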
Very very good. Those nice pre-Dot-com bubble vibes.
Might be freely watchable? Wikipedia links to:
But they do start with an FBI warning about copyright. So... erm.
Part 1 - Networking The Nerds talks about TCP/IP and the early machines implementing it:
- 21:00: shows inside The Pentagon. The way the dude who works there opens his locked office door with an electric switch is just amazing. Cringely also mentions that there's an actual official speed limit in the corridors as he rides a carrier bike slowly through them.
- 21:45: the universities weren't enthusiastic, because people from other locations would be able to use your precious computer time. But finally ARPA forced the universities' hands, and they joined.
- 24:24 mentions that some of the guys who created ARPANET were actually previously counting cards at casinos in Las Vegas, just like in the film 21 (2008)
- one of the centerpieces of development was at UCLA. The other was the BBN company. 33:55 shows the first router, then called an Interface Message Processor
- the first message was sent from UCLA to Stanford University. The sender was trying to write "Login", and the system crashed at the 'g'. Epic. They later debugged it.
- towards the end, talks about ALOHAnet, the first wireless computer communication network
Part 2 - Serving the Suits
- Robert Metcalfe. He's nice. Xerox PARC. Ethernet.
- Explains what a "workstation" is, notably showing one by Sun Microsystems. This is now an obscure, passé thing in 2020 that young people like Ciro Santilli have only heard of in legend (or seen in outdated university computer labs!). Funny to think that so many people have had this idea before, including e.g. the Chromebook
- 10:46 mentions that Cisco, Silicon Graphics and Sun Microsystems were all founded at Margaret Jacks Hall, Building 460, at Stanford University.
- he then talks a lot about Sun. Sun became dominant in Wall Street.
- 19:05: Novell, from Utah. How they almost went bust, but were saved at the last moment by Ray Noorda, who refocused them on their recently developed NetWare product. It allowed file and printer sharing on IBM PCs. 22:55 shows how they had a live radio host for people waiting on customer support calls!
- 33:56 mentions how The Grateful Dead had an impact on the Internet, as people wanted computers to be able to access The WELL online forum. They still own the domain as of 2022: www.well.com/. It is interesting how Larry Page also liked The Grateful Dead, as mentioned in The Google Story: his dad would take him to shows. Larry is of course a bit younger than the people in this documentary.
- 37: shows McAfee
- 43:56: fantastic portrait of Cisco
Part 3 - Wiring the World:
- Berners-Lee at CERN and the invention of the URL.
- 1992: US Government allows commerce on the Internet
- Web browser history, Mosaic and Marc Andreessen.
- 20:45: America Online
- 23:29: search engines and Excite. Google was a bit too small to be on his radar!
- 25:50: porn
- 27: The Motley Fool and advertising
- 30: Planet U grocery shopping
- 31:50: Amazon
- 33:00: immigrant workers, Indians playing cricket, outsourcing, Wipro Systems
- 41:25: Java
- 46:30: Microsoft joins the Internet. The "Internet Tidal Wave" memo. Pearl Harbor Day talk.
- 56:40: Excite Tour. If they had survived, they would have been Google with their quirky offices.
PostgreSQL feels good.
Its feature set is insanely large! Just look at stuff like: stackoverflow.com/questions/1986491/sql-split-string-by-space-into-table-in-postgresql/1993058#1993058
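For instance, a minimal sketch of that split-string-by-whitespace-into-rows trick, driven from Python via psycopg2 (the connection parameters are assumptions, point them at your own server):
import psycopg2

# Connection parameters are assumptions; adjust for your own server.
conn = psycopg2.connect(dbname='postgres', user='postgres')
cur = conn.cursor()
# regexp_split_to_table() turns one string into one row per token.
cur.execute(r"SELECT regexp_split_to_table('split me by spaces', '\s+')")
print([row[0] for row in cur.fetchall()])  # ['split', 'me', 'by', 'spaces']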
Had a look at the source tree, and also felt good.
If Oracle is the Microsoft of databases, Postgres is the Linux, and MySQL (or more precisely MariaDB) is the FreeBSD (i.e. the one that got delayed by legal issues). Except that their software licenses were accidentally swapped.
The only problem with Postgres is its name. PostgreSQL is so unpronounceable and so untypeable that you should just call it "Postgres" like everyone else.
A language that allows you to talk to and command a computer.
There is only space for two languages at most in the world: the compiled one, and the interpreted one.
Those two languages are not by any means perfect from a language design point of view, and there are likely already better alternatives; they are only chosen due to a pragmatic tradeoff between ecosystem and familiarity.
Ciro predicts that Python will become like Fortran in the future: a legacy hated by most who have moved to JavaScript long ago (which is slightly inferior, but too similar, and with too much web dominance to be replaced), but with too much dominance in certain applications like machine learning to be worth replacing, like Fortran dominates certain HPC applications. We'll see. Maybe non performance critical scripting languages are easier to replace.
C++ however is decent, and is evolving in very good directions in the 2010's, and will remain relevant in the foreseeable future.
Bash can also be used when you're lazy. But if the project goes on, you will sooner or later regret that choice.
The language syntax in itself does not matter. All that matters is how many useful libraries and tooling it has.
This is how other languages compare:
- C: cannot make a large codebase DRY without insanity
- Ruby: the exact same as Python, and only strong in one domain: web development, while Python rules everything else, and is not bad on web either. So just kill Ruby, please.
- JavaScript: it is totally fine if Node.js destroys Python and becomes the ONE scripting language to rule them all, since Python and JavaScript are almost equally crappy (although JavaScript is a bit more of course). One thing must be said though: someobject.not_defined_property silently returning undefined rather than blowing up is bullshit, see the sketch after this list.
- Go: likely a good replacement for Python. If the ecosystem gets there, Ciro will gladly use it more.
- Java: good language, but has an ugly enterprisey ecosystem, Oracle has made/kept the development process too closed, and the API patenting madness on Android just kills it off completely
- Haskell: many have tried to learn some functional stuff, but too hard. Sounds really cool though.
- Rust: sounds cool. Ciro would gladly replace C and C++ with it if the ecosystem ramps up.
- C#: Microsoft is evil
- Tcl, Perl: Python killed them way back and is less insane
- R, GNU Octave and any other "numerical computing language": all of this is a waste of society's time as explained at: Section "Numerical computing language"
- Swift: Ciro would rather stay away from Apple dominated projects if possible since they sell a closed source operating system
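To make the undefined complaint from the JavaScript bullet above concrete, here is a quick sketch of the Python side of that comparison (the JavaScript behaviour is only described in the comment):
class Foo:
    pass

foo = Foo()
try:
    foo.not_defined_property  # Python blows up immediately, which is what you want
except AttributeError as e:
    print(e)  # 'Foo' object has no attribute 'not_defined_property'
# In JavaScript, someobject.not_defined_property would instead silently
# evaluate to undefined, and the bug would only surface much later.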
They were basically a Microsoft of their century. A little less monopolistic perhaps, as countries believed they should own their natural resources, unlike their data.
This is one of the prime examples of Europe's decline.
Instead of trying to dominate the sequencing market and gain trillions of dollars from it, the local British early stage investors were more than happy to get a 20x return on their small initial investments, and sold out to the Americans, who would then make the real profit.
And now Solexa doesn't even have its own Wikipedia page, while Illumina is set to be the next Microsoft. What a disgrace.
Here are some good articles about the company:
Cambridge visitors can still visit the Panton Arms pub, which was the location of the legendary "hey we should talk" founders meeting, chosen due to its proximity to the chemistry department of the University of Cambridge.
In 2021 the founders were awarded the Breakthrough Prize. The third person awarded was Pascal Mayer. He was apparently at the Serono Pharmaceutical Research Institute at the time of development. They do have a Wikipedia page, unlike Solexa: en.wikipedia.org/wiki/Serono. They paid a 700 million USD fine in 2005 in the United States, and sold out in 2006 to Merck for 10 billion USD.
Video "1984 Macintosh advertisement by Apple (1984)" comes to mind.
TODO year. This was a reply to Microsoft anti-Linux propaganda it seems: www.ubuntubuzz.com/2012/03/truth-happens-redhats-legendary-reply.html
Transcript from: www.dailymotion.com/video/xw3ws
The world is flat. Earth is the centre of the universe. Fact - until proven otherwise.
Despite ignorance. Despite ridicule. Despite opposition. Truth happens.
Despite ignorance.
The telephone has too many shortcomings to be seriously considered as a means of communication. /Western Union 1876/
In 1899 the US Patent Commissioner stated, everything that can be invented has been invented.
Despite ridicule.
The phonograph has no commercial value at all. /Thomas Edison 1880/
The radio craze will die out in time. /Thomas Edison 1922/
The automobile has practically reached the limit of its development. /Scientific American 1909/
Despite it all, truth happens.
Man will not fly for fifty years. /Orville Wright 1901/
The rocket will never leave the Earth's atmosphere. /New York Times 1936/
There is a world market for maybe five computers. /IBM's Thomas Watson 1943/
640K ought to be enough for anybody. /Bill Gates 1981/
First they ignore you...
Linux is the hype du jour. /Gartner Group 1999/
Then they laugh at you...
We think of Linux as a competitor in the student and hobbyist market. But I really don't think in the commercial market we'll see it in any significant way. /Bill Gates 2001/
Then they fight you...
Linux isn't going away. Linux is a serious competitor. We will rise to this challenge. /Steve Ballmer 2003/
Then you win... /Mohandas Gandhi/
You are here.
Red Hat Linux. IBM.
A Microsoft format for flashing microcontrollers by copying files to a magic filesystem mounted on the host, e.g. as done on the Micro Bit and Raspberry Pi Pico.
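A minimal sketch of what that flow looks like from the host side, assuming a Raspberry Pi Pico whose bootloader drive is mounted at a typical Linux path (both the mount point and the firmware file name are assumptions):
import shutil

# Copy the firmware onto the magic mass storage device; the bootloader
# notices the write, programs the flash and reboots into the new program.
shutil.copy('firmware.uf2', '/media/user/RPI-RP2/')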