Follows the "certified teacher only" approach, which is in Ciro Santilli's opinion a fatal flaw of most e-learning systems out there. OurBigBook.com won't suffer from that!
But that is a very, very good project.
All notes appear to have been extracted from existing notes, as noted at the bottom of each page.
Appears to have mixed licenses. E.g.:
- phys.libretexts.org/Bookshelves/University_Physics/Book%3A_University_Physics_(OpenStax)/Book%3A_University_Physics_III_-_Optics_and_Modern_Physics_(OpenStax)/06%3A_Photons_and_Matter_Waves/6.06%3A_De_Broglies_Matter_Waves is CC BY
- but we had seen another one that was CC BY-NC-SA
- phys.libretexts.org/Courses/University_of_California_Davis/UCD%3A_Physics_9HE_-_Modern_Physics/06%3A_Emission_and_Absorption_of_Photons/6.1%3A_Transitions_Between_Stationary_States CC BY-SA
- chem.libretexts.org/Bookshelves/Introductory_Chemistry/Introductory_Chemistry_(CK-12) uses the custom "CK-12 license" which seems a bit like CC BY-NC-SA
- some don't even have a free license, e.g.: phys.libretexts.org/Bookshelves/Quantum_Mechanics/Quantum_Mechanics_(Fowler)/00%3A_Front_Matter/04%3A_Licensing
TODO how does it work exactly? Do they ask for permission from authors in every case, including when the content has an open license? Or when it has an open license, do they just use it? In some cases, the notes have no license, so they must have asked.
TODO what is the source code that authors write? LaTeX or something else? LaTeX feels extremely likely given that it is what most original materials were already written in.
They are attempting a "model up this entire university" thing: phys.libretexts.org/Courses, which is good. E.g. they have a bunch of quantum mechanics ones under: phys.libretexts.org/Bookshelves/Quantum_Mechanics
Appears to be mostly UC Davis-based.
They claim to use this closed-source backend: www.nice.com/resources/cxone-expert-knowledge-management. Seriously? For a publicly funded project with low-tech requirements?? It is mind-blowing.
Some issues:
- the internal cross references are somewhat broken as of 2022.
- their URLs are HUGE! All components of every ancestor are in them. E.g. check this out: phys.libretexts.org/Bookshelves/Quantum_Mechanics/Introductory_Quantum_Mechanics_(Fitzpatrick)/12%3A_Time-Dependent_Perturbation_Theory/12.13%3A_Forbidden_Transitions Insane.
OK let's database it:
The general result from eigendecomposition of a matrix:
$$M = Q D Q^{-1}$$
becomes:
$$M = O D O^T$$
where $O$ is an orthogonal matrix, and therefore has $O^{-1} = O^T$.
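As a quick numerical sanity check of this result, a minimal NumPy sketch (the symmetric matrix here is just a made-up example, not from the original notes):

import numpy as np

# A small real symmetric matrix.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
# eigh is NumPy's eigendecomposition for symmetric/Hermitian matrices;
# it returns the eigenvalues and the matrix O with eigenvectors as columns.
eigenvalues, O = np.linalg.eigh(M)
D = np.diag(eigenvalues)
# O is orthogonal: its transpose is its inverse.
assert np.allclose(O @ O.T, np.eye(2))
# And O D O^T reconstructs M.
assert np.allclose(O @ D @ O.T, M)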
Let's see if there's anything in records/mx.xz.
mx.csv is 21GB.
They do have " quotes in the files to escape commas, so mx.py:

import csv
import sys

# Stream mx.csv and keep only column 0 (the domain) and column 3 (the MX host),
# re-emitting them as CSV on stdout.
writer = csv.writer(sys.stdout)
with open('mx.csv', 'r') as f:
    reader = csv.reader(f)
    for row in reader:
        writer.writerow([row[0], row[3]])

Would have been better with csvkit: stackoverflow.com/questions/36287982/bash-parse-csv-with-quotes-commas-and-newlines
then:
# uniq not amazing as there are often two or three slightly different records repeated on multiple timestamps, but down to 11 GB
python3 mx.py | uniq > mx-uniq.csv
sqlite3 mx.sqlite 'create table t(d text, m text)'
# 13 GB
time sqlite3 mx.sqlite ".import --csv --skip 1 'mx-uniq.csv' t"
# 41 GB
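# Indexes on d and m (and the pair) speed up the per-domain and per-MX-host lookups done below.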
time sqlite3 mx.sqlite 'create index td on t(d)'
time sqlite3 mx.sqlite 'create index tm on t(m)'
time sqlite3 mx.sqlite 'create index tdm on t(d, m)'
# Remove dupes.
# Rows: 150m
time sqlite3 mx.sqlite <<EOF
delete from t
where rowid not in (
select min(rowid)
from t
group by d, m
)
EOF
# 15 GB
time sqlite3 mx.sqlite vacuum
Let's see what the hits use:
awk -F, 'NR>1{ print $2 }' ../media/cia-2010-covert-communication-websites/hits.csv | xargs -I{} sqlite3 mx.sqlite "select distinct * from t where d = '{}'"
At around 267 total hits, only 84 have MX records, and from those that do, almost all of them have exactly:

smtp.secureserver.net
mailstore1.secureserver.net

with only three exceptions:

dailynewsandsports.com|dailynewsandsports.com
inews-today.com|mail.inews-today.com
just-kidding-news.com|just-kidding-news.com

We need to count those out of the totals!

sqlite3 mx.sqlite "select count(*) from t where m = 'mailstore1.secureserver.net'"

which gives ~18M, so nope, it is too much by itself...
Let's try to use that to reduce av.sqlite from 2013 DNS Census virtual host cleanup a bit further:

time sqlite3 mx.sqlite '.mode csv' "attach 'aiddcu.sqlite' as 'av'" '.load ./ip' "select ipi2s(av.t.i), av.t.d from av.t inner join t as mx on av.t.d = mx.d and mx.m = 'mailstore1.secureserver.net' order by av.t.i asc" > avm.csv
avm stands for "av with mx" pruning. This leaves us with only ~500k entries. With one more fingerprint we could do a Wayback Machine CDX scanning run.

Let's check that we still have most of our hits in there:

grep -f <(awk -F, 'NR>1{print $2}' /home/ciro/bak/git/media/cia-2010-covert-communication-websites/hits.csv) avm.csv

At 267 hits we got 81, so all are still present.
secureserver is a hosting provider; we can see their blank page e.g. at: web.archive.org/web/20110128152204/http://emmano.com/. security.stackexchange.com/questions/12610/why-did-secureserver-net-godaddy-access-my-gmail-account/12616#12616 comments:
secureserver.net is the name GoDaddy use as the reverse DNS for IP addresses used for dedicated/virtual server hosting
Some amazing people have put book source code on GitHub. This is a list of such repos.
Mentioned at: aws.amazon.com/ec2/instance-types/g4/
TODO meaning of "dn"? "n" presumably means Nvidia, but what is the "d"? Compare it to g4ad.xlarge which has AMD GPUs. aws.amazon.com/ec2/instance-types/g4/ mentions:
G4 instances are available with a choice of NVIDIA GPUs (G4dn) or AMD GPUs (G4ad).
Price:
- 2025-03-10: 0.526 USD / Hour
Then, when a specific metric is involved, sometimes we want to automatically add it to products.
E.g., in a context considering the common Minkowski inner product matrix, where $\eta$ is the 4x4 Minkowski metric matrix and $x$ is a vector in $\mathbb{R}^4$, a product such as $x^\mu y_\mu$ is implicitly taken to mean $x^\mu \eta_{\mu\nu} y^\nu$, which leads to the change of sign of some terms.
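For concreteness, a minimal worked example of those sign changes, assuming the $(+,-,-,-)$ convention $\eta = \mathrm{diag}(1, -1, -1, -1)$ (the sign convention is an assumption here; the other common choice just flips every sign):
$$x^\mu y_\mu \equiv x^\mu \eta_{\mu\nu} y^\nu = x^0 y^0 - x^1 y^1 - x^2 y^2 - x^3 y^3$$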
Solutions for the Schrodinger equation with multiple particles
Bibliography:
- Quantum Mechanics for Engineers by Leon van Dommelen (2011) "5. Multiple-Particle Systems"