Chomsky hierarchy Updated 2025-07-16
This is the classic result of formal language theory, but there is too much slack between context-free and context-sensitive: the membership problem for context-sensitive languages is already PSPACE-complete (larger than NP!).
A good summary table that breaks each category down much further can be seen at the bottom of en.wikipedia.org/wiki/Automata_theory, in the navigation box entitled "Automata theory: formal languages and formal grammars".
Chordate Updated 2025-07-16
Chordate is a sad clade.
You read the name and think: hmm, neural cords!
But then you see that this is one of its members:
Yup. That's your cousin. And it's a much closer cousin than something like arthropods, which at least have heads, eyes and legs like you.
Chrome Android extension Updated 2025-07-16
Lol, it is not possible, what a joke. Notably this makes it harder to have a superior third-party password manager like Proton Pass (though there seems to be an autocomplete app as an alternative path), or an ad blocker. Fuck Google.
Also, Chromium is not available on Google Play by default; you can install the APK, but you will miss updates.
Chromium (web browser) Updated 2025-07-16
Google is trying to kill it as of 2021: www.omgubuntu.co.uk/2021/01/chromium-sync-google-api-removed The lack of sync is a major, major blow. So selfish. Google makes billions, and it won't give up a little bit of settings storage...
Chromosome Updated 2025-07-16
Church of England Updated 2025-07-16
Political division:
  • nominal leader: British monarch
  • top-level archdioceses/provinces of Canterbury and York, one archbishop each, who is also the bishop of the Canterbury and York dioceses respectively
  • within provinces: one cathedral and bishop per diocese
CIA 2010 covert communication websites Updated 2025-07-16
This article is about covert agent communication channel websites used by the CIA in many countries from the mid-2000s until the early 2010s, when they were uncovered by the counterintelligence of some of the targeted countries, notably Iran and China, circa 2010-2013.
This article uses publicly available information to publicly disclose for the first time a few hundred of what we feel are extremely likely candidate sites of the network. The starting point for this research was the September 2022 Reuters article "America’s Throwaway Spies", which for the first time gave nine example websites. Their analyst from Citizen Lab claims to have found 885 websites in total, but did not publicly disclose them. Starting from only the nine disclosed websites, we were then able to find a few hundred websites that share so many similarities with them, i.e. a common fingerprint, that we believe they are beyond reasonable doubt part of the same network.
If you enjoy this article, consider dropping some Monero at: 4A1KK4uyLQX7EBgN7uFgUeGt6PPksi91e87xobNq7bT2j4V6LqZHKnkGJTUuCC7TjDNnKpxDd8b9DeNBpSxim8wpSczQvzf so I can waste it on my foolish attempts to improve higher education. Other sponsorship methods: Section "Sponsor Ciro Santilli's work on OurBigBook.com".
https://raw.githubusercontent.com/cirosantilli/media/master/CIA_Star_Wars_website_promo.jpg
The discovery of these websites by Iranian and Chinese counterintelligence led to the imprisonment and execution of several assets in those countries, and to the subsequent shutdown of the channel by the CIA once it noticed that things had gone wrong. The Wikipedia page that most likely covers the disastrous outcome of the websites being found out is 2010–2012 killing of CIA sources in China, although it contained no mention of the websites before Ciro Santilli edited it in.
Of particular interest is that, based on their language and content, some of the websites seem to have targeted other democracies such as Germany, France, Spain and Brazil.
If anyone can find other websites, or has better techniques, feel free to contact Ciro Santilli at: Section "How to contact Ciro Santilli". Contributions will be clearly attributed if desired. Some of the techniques used so far have been very heuristic, and that, added to the limited amount of data, makes it almost certain that some websites have been missed. Broadly speaking, there are two types of contributions that would be possible:
The fact that Citizen Lab reported exactly 885 websites being found makes it feel like they might have found a better fingerprint which we have not managed to find yet. We have not yet had to pay for our data. If someone wants to donate to the research, some ideas include:
Disclaimers:
May this article serve as a tribute to those who spent their days making, using, and uncovering these websites under the shadows.
Hostprobes quick look on two ranges:
208.254.40:
... similar down

208.254.40.95	1334668500	down	no-response
208.254.40.95	1338270300	down	no-response
208.254.40.95	1338839100	down	no-response
208.254.40.95	1339361100	down	no-response
208.254.40.95	1346391900	down	no-response
208.254.40.96	1335806100	up	unknown
208.254.40.96	1336979700	up	unknown
208.254.40.96	1338840900	up	unknown
208.254.40.96	1339454700	up	unknown
208.254.40.96	1346778900	up	echo-reply (0.34s latency).
208.254.40.96	1346838300	up	echo-reply (0.30s latency).
208.254.40.97	1335840300	up	unknown
208.254.40.97	1338446700	up	unknown
208.254.40.97	1339334100	up	unknown
208.254.40.97	1346658300	up	echo-reply (0.26s latency).

... similar up

208.254.40.126	1335708900	up	unknown
208.254.40.126	1338446700	up	unknown
208.254.40.126	1339330500	up	unknown
208.254.40.126	1346494500	up	echo-reply (0.24s latency).
208.254.40.127	1335840300	up	unknown
208.254.40.127	1337793300	up	unknown
208.254.40.127	1338853500	up	unknown
208.254.40.127	1346454900	up	echo-reply (0.23s latency).

208.254.40.128	1335856500	up	unknown
208.254.40.128	1338200100	down	no-response
208.254.40.128	1338749100	down	no-response
208.254.40.128	1339334100	down	no-response
208.254.40.128	1346607900	down	net-unreach
208.254.40.129	1335699900	up	unknown

... similar down
Suggests a range of exactly 127 - 96 + 1 = 32 IPs (.96 through .127).
208.254.42:
... similar down

208.254.42.191	1334522700	down	no-response
208.254.42.191	1335276900	down	no-response
208.254.42.191	1335784500	down	no-response
208.254.42.191	1337845500	down	no-response
208.254.42.191	1338752700	down	no-response
208.254.42.191	1339332300	down	no-response
208.254.42.191	1346499900	down	net-unreach

208.254.42.192	1334668500	up	unknown
208.254.42.192	1336808700	up	unknown
208.254.42.192	1339334100	up	unknown
208.254.42.192	1346766300	up	echo-reply (0.40s latency).
208.254.42.193	1335770100	up	unknown
208.254.42.193	1338444900	up	unknown
208.254.42.193	1339334100	up	unknown

... similar up

208.254.42.221	1346517900	up	echo-reply (0.19s latency).
208.254.42.222	1335708900	up	unknown
208.254.42.222	1335708900	up	unknown
208.254.42.222	1338066900	up	unknown
208.254.42.222	1338747300	up	unknown
208.254.42.222	1346872500	up	echo-reply (0.27s latency).
208.254.42.223	1335773700	up	unknown
208.254.42.223	1336949100	up	unknown
208.254.42.223	1338750900	up	unknown
208.254.42.223	1339334100	up	unknown
208.254.42.223	1346854500	up	echo-reply (0.13s latency).

208.254.42.224	1335665700	down	no-response
208.254.42.224	1336567500	down	no-response
208.254.42.224	1338840900	down	no-response
208.254.42.224	1339425900	down	no-response
208.254.42.224	1346494500	down	time-exceeded

... similar down
Suggests a range of exactly 223 - 192 + 1 = 32 IPs (.192 through .223).
Let's have a look at file 68. Outcome: no clear hits like on 208. One wonders why.
It does appear that long runs of consecutive up IPs are a sort of fingerprint. The question is how unique it is.
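Before going to SQL, here is a minimal Python sketch of the same idea, assuming the IP / timestamp / status / detail hostprobe format shown above and an arbitrary minimum run length (script name and threshold are just placeholders): collect every IP that was ever reported up and print the runs of consecutive addresses.
#!/usr/bin/env python3
# Usage: ./runs.py 208
# Print runs of at least MIN_RUN consecutive IPs that were ever "up" in a
# hostprobe dump with lines like "208.254.40.96 <timestamp> up <detail>".
import ipaddress
import sys

MIN_RUN = 20  # arbitrary threshold for illustration

ups = set()
with open(sys.argv[1]) as f:
    for line in f:
        fields = line.split()
        if len(fields) >= 3 and fields[2] == 'up':
            ups.add(int(ipaddress.IPv4Address(fields[0])))

run = []
for i in sorted(ups):
    if run and i != run[-1] + 1:
        if len(run) >= MIN_RUN:
            print(f'{ipaddress.IPv4Address(run[0])} - {ipaddress.IPv4Address(run[-1])} ({len(run)})')
        run = []
    run.append(i)
if len(run) >= MIN_RUN:
    print(f'{ipaddress.IPv4Address(run[0])} - {ipaddress.IPv4Address(run[-1])} ({len(run)})')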
First, let's import the data into SQLite:
n=208
# Keep the IPs that were ever up; uniq -c counts, per IP, how many probes saw it up.
time awk '$3=="up"{ print $1 }' $n | uniq -c | sed -r 's/^ +//;s/ /,/' | tee $n-up-uniq
t=$n-up-uniq.sqlite
rm -f $t
time sqlite3 $t 'create table tmp(cnt text, i text)'
time sqlite3 $t ".import --csv $n-up-uniq tmp"
time sqlite3 $t 'create table t (i integer)'
# str2ipv4() comes from the custom ./ip loadable extension and converts
# dotted quads to 32-bit integers.
time sqlite3 $t '.load ./ip' 'insert into t select str2ipv4(i) from tmp'
time sqlite3 $t 'drop table tmp'
time sqlite3 $t 'create index ti on t(i)'
This reduces us to 2 million IP rows from the total possible 16 million IPs.
OK now just counting hits on fixed windows has way too many results:
sqlite3 208-up-uniq.sqlite "\
SELECT * FROM (
  SELECT min(i), COUNT(*) OVER (
    ORDER BY i RANGE BETWEEN 15 PRECEDING AND 15 FOLLOWING
  ) as c FROM t
) WHERE c > 20 and c < 30
"
Let's instead try runs where the last IP minus the first is exactly 31, i.e. 32 consecutive IPs:
sqlite3 208-up-uniq.sqlite <<EOF
-- Classic gaps-and-islands: ROW_NUMBER() - i is constant within a run of
-- consecutive IPs, so grouping by it isolates each run.
SELECT f, t - f as c FROM (
  SELECT min(i) as f, max(i) as t
  FROM (SELECT i, ROW_NUMBER() OVER (ORDER BY i) - i as grp FROM t)
  GROUP BY grp
  ORDER BY i
) where c = 31
EOF
That gives 271 results. Hmm. A bit more than we'd like...
Another route is to also count the ups:
n=208
time awk '$3=="up"{ print $1 }' $n | uniq -c | sed -r 's/^ +//;s/ /,/' | tee $n-up-uniq-cnt
t=$n-up-uniq-cnt.sqlite
rm -f $t
time sqlite3 $t 'create table tmp(cnt text, i text)'
time sqlite3 $t ".import --csv $n-up-uniq-cnt tmp"
time sqlite3 $t 'create table t (cnt integer, i integer)'
time sqlite3 $t '.load ./ip' 'insert into t select cast(cnt as integer), str2ipv4(i) from tmp'
time sqlite3 $t 'drop table tmp'
time sqlite3 $t 'create index ti on t(i)'
Let's see how many consecutive runs there are when we also require each IP to have been seen up at least 3 times:
sqlite3 208-up-uniq-cnt.sqlite <<EOF
SELECT f, t - f as c FROM (
  SELECT min(i) as f, max(i) as t
  FROM (SELECT i, ROW_NUMBER() OVER (ORDER BY i) - i as grp FROM t WHERE cnt >= 3)
  GROUP BY grp
  ORDER BY i
) where c > 28 and c < 32
EOF
Let's check on 66:
grep -e '66.45.179' 66
Not representative at all... e.g. several confirmed hits are down:
66.45.179.215   1335305700      down    no-response
66.45.179.215   1337579100      down    no-response
66.45.179.215   1338765300      down    no-response
66.45.179.215   1340271900      down    no-response
66.45.179.215   1346813100      down    no-response
Let's check relevancy of known hits:
grep -e '208.254.40' -e '208.254.42' 208 | tee 208hits
Output:
208.254.40.95	1355564700	unreachable
208.254.40.95	1355622300	unreachable
208.254.40.96	1334537100	alive, 36342
208.254.40.96	1335269700	alive, 17586

..

208.254.40.127	1355562900	alive, 35023
208.254.40.127	1355593500	alive, 59866
208.254.40.128	1334609100	unreachable
208.254.40.128	1334708100	alive from 208.254.32.214, 43358
208.254.40.128	1336596300	unreachable
The rest of the 208.254.40 range is mostly unreachable. Then for 208.254.42:
208.254.42.191	1335294900	unreachable
...
208.254.42.191	1344737700	unreachable
208.254.42.191	1345574700	Icmp Error: 0,ICMP Network Unreachable, from 63.111.123.26
208.254.42.191	1346166900	unreachable
...
208.254.42.191	1355665500	unreachable
208.254.42.192	1334625300	alive, 6672
...
208.254.42.192	1355658300	alive, 57412
208.254.42.193	1334677500	alive, 28985
208.254.42.193	1336524300	unreachable
208.254.42.193	1344447900	alive, 8934
208.254.42.193	1344613500	alive, 24037
208.254.42.193	1344806100	alive, 20410
208.254.42.193	1345162500	alive, 10177
...
208.254.42.223	1336590900	alive, 23284
...
208.254.42.223	1355555700	alive, 58841
208.254.42.224	1334607300	Icmp Type: 11,ICMP Time Exceeded, from 65.214.56.142
208.254.42.224	1334681100	Icmp Type: 11,ICMP Time Exceeded, from 65.214.56.142
208.254.42.224	1336563900	Icmp Type: 11,ICMP Time Exceeded, from 65.214.56.142
208.254.42.224	1344451500	Icmp Type: 11,ICMP Time Exceeded, from 65.214.56.138
208.254.42.224	1344566700	unreachable
208.254.42.224	1344762900	unreachable
Let's try with 66. First, there is way too much data (9 GB), so let's cut it down:
n=66
time awk '$3~/^alive,/ { print $1 }' $n | uniq -c | sed -r 's/^ +//;s/ /,/' | tee $n-alive-uniq-c
OK down to 45 MB, now we can work.
grep -e '66.45.179' -e '66.104.169' -e '66.104.173' -e '66.104.175' -e '66.175.106' '66-alive-uniq-c' | tee 66hits
Nah, it's full of holes:
4,66.45.179.187
12,66.45.179.188
2,66.45.179.197
1,66.45.179.202
2,66.45.179.205
2,66.45.179.206
1,66.45.179.207
We won't be able to find new ranges here.
Let's see if there's anything in records/mx.xz.
mx.csv is 21GB.
They do use " in the files to quote fields containing commas, so:
mx.py
# Keep only columns 0 (domain) and 3 (MX host) of the huge mx.csv, handling
# quoted fields that contain commas.
import csv
import sys
writer = csv.writer(sys.stdout)
with open('mx.csv', 'r') as f:
    reader = csv.reader(f)
    for row in reader:
        writer.writerow([row[0], row[3]])
Would have been better with csvkit: stackoverflow.com/questions/36287982/bash-parse-csv-with-quotes-commas-and-newlines
then:
# uniq not amazing as there are often two or three slightly different records repeated on multiple timestamps, but down to 11 GB
python3 mx.py | uniq > mx-uniq.csv
sqlite3 mx.sqlite 'create table t(d text, m text)'
# 13 GB
time sqlite3 mx.sqlite ".import --csv --skip 1 'mx-uniq.csv' t"

# 41 GB
time sqlite3 mx.sqlite 'create index td on t(d)'
time sqlite3 mx.sqlite 'create index tm on t(m)'
time sqlite3 mx.sqlite 'create index tdm on t(d, m)'

# Remove dupes.
# Rows: 150m
time sqlite3 mx.sqlite <<EOF
delete from t
where rowid not in (
  select min(rowid)
  from t
  group by d, m
)
EOF

# 15 GB
time sqlite3 mx.sqlite vacuum
Let's see what the hits use:
awk -F, 'NR>1{ print $2 }' ../media/cia-2010-covert-communication-websites/hits.csv | xargs -I{} sqlite3 mx.sqlite "select distinct * from t where d = '{}'"
At the point where we had around 267 total hits, only 84 of them had MX records, and of those that do, almost all have exactly:
smtp.secureserver.net
mailstore1.secureserver.net
with only three exceptions:
dailynewsandsports.com|dailynewsandsports.com
inews-today.com|mail.inews-today.com
just-kidding-news.com|just-kidding-news.com
But we need to check how common these are out of the totals!
sqlite3 mx.sqlite "select count(*) from t where m = 'mailstore1.secureserver.net'"
which gives ~18M, so nope, that MX record is far too common to be a useful fingerprint by itself...
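As an aside, to spot-check the MX fingerprint on a fresh candidate domain today (most of the domains in question are long dead, so this only helps when vetting new candidates), a minimal sketch assuming the dnspython package is available; the domain below is a placeholder:
#!/usr/bin/env python3
# Check whether a candidate domain currently has a secureserver.net MX record.
# Requires dnspython; the domain below is a placeholder.
import dns.resolver

domain = 'example.com'
try:
    answers = dns.resolver.resolve(domain, 'MX')
    exchanges = sorted(str(r.exchange).rstrip('.') for r in answers)
except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer, dns.resolver.NoNameservers):
    exchanges = []
print(domain, exchanges)
if any(e.endswith('secureserver.net') for e in exchanges):
    print('matches the secureserver.net MX fingerprint')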
Let's try to use that to reduce av.sqlite from 2013 DNS Census virtual host cleanup a bit further:
time sqlite3 mx.sqlite '.mode csv' "attach 'aiddcu.sqlite' as 'av'" '.load ./ip' "select ipi2s(av.t.i), av.t.d from av.t inner join t as mx on av.t.d = mx.d and mx.m = 'mailstore1.secureserver.net' order by av.t.i asc" > avm.csv
where avm stands for "av with MX pruning". This leaves us with only ~500k entries. With one more fingerprint we could do a Wayback Machine CDX scan.
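For reference, such a scan queries the public Wayback Machine CDX API once per candidate domain. A minimal sketch for a single placeholder domain, filtering the returned capture URLs for the .jar and cgi-bin .cgi patterns that turn out to be strong indicators (see further below); a real scan would loop over the ~500k avm.csv domains and would need throttling:
#!/usr/bin/env python3
# List archived URLs of one domain that match the .jar / cgi-bin .cgi patterns.
# The domain is a placeholder; a real scan would loop over avm.csv.
import re
import requests

domain = 'example.com'
resp = requests.get(
    'https://web.archive.org/cdx/search/cdx',
    params={
        'url': domain,
        'matchType': 'domain',  # include subdomains and all paths
        'output': 'json',
        'collapse': 'urlkey',   # one row per unique URL
        'fl': 'timestamp,original',
    },
    timeout=60,
)
rows = resp.json() if resp.text.strip() else []
pattern = re.compile(r'\.jar$|cgi-bin/.*\.cgi', re.IGNORECASE)
for timestamp, original in rows[1:]:  # first row is the header
    if pattern.search(original):
        print(f'https://web.archive.org/web/{timestamp}/{original}')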
Let's check that we still have most of our hits in there:
grep -f <(awk -F, 'NR>1{print $2}' /home/ciro/bak/git/media/cia-2010-covert-communication-websites/hits.csv) avm.csv
At 267 hits we got 81 matches, i.e. all 84 hits that have MX records minus the three exceptions above, so everything we expect is still present.
secureserver.net is a hosting provider domain; we can see their blank page e.g. at web.archive.org/web/20110128152204/http://emmano.com/. security.stackexchange.com/questions/12610/why-did-secureserver-net-godaddy-access-my-gmail-account/12616#12616 comments:
secureserver.net is the name GoDaddy use as the reverse DNS for IP addresses used for dedicated/virtual server hosting
ns.csv is 57 GB. This file is too massive; working with it directly is a pain.
We can also cut down the data a lot with stackoverflow.com/questions/1915636/is-there-a-way-to-uniq-by-column/76605540#76605540 and tld filtering:
awk -F, 'BEGIN{OFS=","} { if ($1 != last) { print $1, $3; last = $1; } }' ns.csv | grep -E '\.(com|net|info|org|biz),' > nsu.csv
This brings us down to a much more manageable 3.0 GB, 83 M rows.
Let's just scan it once real quick to start with, since likely nothing will come of this avenue:
grep -f <(awk -F, 'NR>1{print $2}' ../media/cia-2010-covert-communication-websites/hits.csv) nsu.csv | tee nsu-hits.csv
cat nsu-hits.csv | csvcut -c 2 | sort | awk -F. '{OFS="."; print $(NF-1), $(NF)}' | sort | uniq -c | sort -k1 -n
As of 267 hits we get:
      1 a2hosting.com
      1 amerinoc.com
      1 ayns.net
      1 dailyrazor.com
      1 domainingdepot.com
      1 easydns.com
      1 frienddns.ru
      1 hostgator.com
      1 kolmic.com
      1 name-services.com
      1 namecity.com
      1 netnames.net
      1 tonsmovies.net
      1 webmailer.de
      2 cashparking.com
     55 worldnic.com
     86 domaincontrol.com
so yeah, most of those are likely going to be humongous just by looking at the names.
The smallest one by far out of the total is frienddns.ru, with only 487 hits; all the others are either quite large or are fake hits due to CSV quirks. Did a quick Wayback Machine CDX scan there, but no luck, alas.
Let's check the smaller ones:
inews-today.com,2013-08-12T03:14:01,ns1.frienddns.ru
source-commodities.net,2012-12-13T20:58:28,ns1.namecity.com -> fake hit due to grep e-commodities.net
dailynewsandsports.com,2013-08-13T08:36:28,ns3.a2hosting.com
just-kidding-news.com,2012-02-04T07:40:50,jns3.dailyrazor.com
fightwithoutrules.com,2012-11-09T01:17:40,sk.s2.ns1.ns92.kolmic.com
fightwithoutrules.com,2013-07-01T22:46:23,ns1625.ztomy.com
half-court.net,2012-09-10T09:49:15,sk.s2.ns1.ns92.kolmic.com
half-court.net,2013-07-07T00:31:12,ns1621.ztomy.com
Doubt anything will come out of this.
Let's do a bit of counting out of the total:
grep domaincontrol.com ns.csv | awk -F, '{print $1}' | uniq | wc
gives ~20M domains using domaincontrol.com. Let's see how many domains there are in the first place:
awk -F, '{print $1}' ns.csv | uniq | wc
so it accounts for 1/4 of the total.
We've noticed that often when there is a hit range:
  • there is only one IP for each domain
  • there is a range of about 20-30 of those
and that this combination does not seem to be very common. Let's see if it makes for a reasonable fingerprint or not.
Note that although this is the most common case, we have also found multiple hits which viewdns.info maps to the same IP.
First we create a table u (unique) that only has domains which are the only domain for their IP. Let's see by how much that lowers the 191 M total unique domains:
time sqlite3 u.sqlite 'create table t (d text, i text)'
time sqlite3 av.sqlite -cmd "attach 'u.sqlite' as u" "insert into u.t select min(d) as d, min(i) as i from t where d not like '%.%.%' group by i having count(distinct d) = 1"
The not like '%.%.%' removes subdomains from the counts so that CGI comms are still included, and the distinct in count(distinct d) is there because we have multiple entries at different timestamps for some of the hits.
Let's start with the 208 subset to see how it goes:
time sqlite3 av.sqlite -cmd "attach 'u.sqlite' as u" "insert into u.t select min(d) as d, min(i) as i from t where i glob '208.*' and d not like '%.%.%' and (d like '%.com' or d like '%.net') group by i having count(distinct d) = 1"
OK, after fixing bugs in the above we are down to 4 million lines of unique domain/IP pairs, and they contain all of the original hits! Almost certainly more are to be found!
This data is so valuable that we've decided to upload it to: archive.org/details/2013-dns-census-a-novirt.csv. The format is:
8,chrisjmcgregor.com
11,80end.com
28,fine5.net
38,bestarabictv.com
49,xy005.com
50,cmsasoccer.com
80,museemontpellier.net
100,newtiger.com
108,lps-promptservice.com
111,bridesmaiddressesshow.com
The numbers in the first column are the IPs in 32-bit integer representation, which is more convenient for range searches.
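For reference, the conversion is the usual base-256 positional mapping; a small Python illustration (the str2ipv4()/ipi2s() SQLite helpers loaded from ./ip above presumably implement the same thing):
import ipaddress

# 32-bit integer <-> dotted quad.
i = int(ipaddress.IPv4Address('208.254.40.96'))
print(i)                         # 3506317408
print(ipaddress.IPv4Address(i))  # 208.254.40.96

# A block such as 208.254.40.0-208.254.40.255 then becomes a simple integer
# interval, e.g. WHERE i BETWEEN 3506317312 AND 3506317567 in SQL.
print(int(ipaddress.IPv4Address('208.254.40.0')),
      int(ipaddress.IPv4Address('208.254.40.255')))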
To make a histogram of the distribution of the single-hostname IPs:
#!/usr/bin/env bash
bin=$((2**24)) # bucket IPs by their first byte: 2^32 / 2^24 = 256 buckets
sqlite3 2013-dns-census-a-novirt.sqlite -cmd '.mode csv' >2013-dns-census-a-novirt-hist.csv <<EOF
select i, sum(cnt) from (
  select floor(i/${bin}) as i,
         count(*) as cnt
    from t
    group by 1
  union
  select *, 0 as cnt from generate_series(0, 255)
)
group by i
EOF
gnuplot \
  -e 'set terminal svg size 1200, 800' \
  -e 'set output "2013-dns-census-a-novirt-hist.svg"' \
  -e 'set datafile separator ","' \
  -e 'set tics scale 0' \
  -e 'unset key' \
  -e 'set xrange[0:255]' \
  -e 'set title "Counts of IPs with a single hostname"' \
  -e 'set xlabel "IPv4 first byte"' \
  -e 'set ylabel "count"' \
  -e 'plot "2013-dns-census-a-novirt-hist.csv" using 1:2:1 with labels' \
;
This gives the following useless noise; there is basically no pattern:
https://raw.githubusercontent.com/cirosantilli/media/master/cia-2010-covert-communication-websites/2013-dns-census-a-novirt-hist.svg
The JavaScript of each website appears to be quite small and similarly sized. It is always minified, but with things reordered a bit from site to site.
First we have to know that the Wayback Machine adds some stuff before and after the original code. The actual code there starts at:
ap={fg:['MSXML2.XMLHTTP
and ends in:
ck++;};return fu;};
We can use a JavaScript beautifier such as beautifier.io to be able to read the code better.
It is worth noting that there are a lot of inline <script> tags as well, which seem to matter.
Further analysis would be needed.
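One cheap automated check along these lines: fetch an archived snapshot of a candidate page and look for the two markers above. A minimal sketch; the snapshot timestamp and URL are placeholders, and the id_ suffix asks the Wayback Machine for the original bytes without its own injected wrapper code:
#!/usr/bin/env python3
# Check an archived snapshot for the two JavaScript markers described above.
# The snapshot timestamp and URL are placeholders.
import requests

snapshot = 'https://web.archive.org/web/20110101000000id_/http://example.com/'
body = requests.get(snapshot, timeout=60).text
start_marker = "ap={fg:['MSXML2.XMLHTTP"
end_marker = 'ck++;};return fu;};'
print('has start marker:', start_marker in body)
print('has end marker:', end_marker in body)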
There are two keywords that are killers: "news" and "world", plus their translations and closely related words. Everything else is hard. So a good start is:
grep -e news -e noticias -e nouvelles -e world -e global
iran + football:
  • iranfootballsource.com: the third hit for this area after the two given by Reuters! Epic.
3 easy hits with "noticias" ("news" in Portuguese and Spanish), uncovering two brand new IP ranges:
  • 66.45.179.205 noticiasporjanua.com
  • 66.237.236.247 comunidaddenoticias.com
  • 204.176.38.143 noticiassofisticadas.com
Let's see some French "nouvelles/actualites" for those tumultuous Maghrebis:
  • 216.97.231.56 nouvelles-d-aujourdhuis.com
news + world:
  • 210.80.75.55 philippinenewsonline.net
news + global:
  • 204.176.39.115 globalprovincesnews.com
  • 212.209.74.105 globalbaseballnews.com
  • 212.209.79.40: hydradraco.com
OK, I've decided to do a complete Wayback Machine CDX scan of "news"... Searching for .JAR or https.*cgi-bin.*\.cgi is a killer, particularly the .jar hits. Here's what came out:
  • 62.22.60.49 telecom-headlines.com
  • 62.22.61.206 worldnewsnetworking.com
  • 64.16.204.55 holein1news.com
  • 66.104.169.184 bcenews.com
  • 69.84.156.90 stickshiftnews.com
  • 74.116.72.236 techtopnews.com
  • 74.254.12.168 non-stop-news.net
  • 193.203.49.212 inews-today.com
  • 199.85.212.118 just-kidding-news.com
  • 207.210.250.132 aeronet-news.com
  • 212.4.18.129 sightseeingnews.com
  • 212.209.90.84 thenewseditor.com
  • 216.105.98.152 modernarabicnews.com
Wayback Machine CDX scanning of "world":
  • 66.104.173.186 myworldlymusic.com
"headline": only 140 matches in 2013-dns-census-a-novirt.csv and 3 hits out of 269 hits. Full inspection without CDX led to no new hits.
"today": only 3.5k matches in 2013-dns-census-a-novirt.csv and 12 hits out of 269 hits, TODO how many on those on 2013-dns-census-a-novirt? No new hits.
"world", "global", "international", and spanish/portuguese/French versions like "mondo", "mundo", "mondi": 15k matches in 2013-dns-census-a-novirt.csv. No new hits.
whoisxmlapi WHOIS history March 22, 2011:
  • Registrar Name: NETWORK SOLUTIONS, LLC.
  • Created Date: January 26, 2010 00:00:00 UTC
  • Updated Date: November 27, 2010 00:00:00 UTC
  • Expires Date: January 26, 2012 00:00:00 UTC
  • Registrant Name: Corral, Elizabeth|ATTN ACTIVEGAMINGINFO.COM|care of Network Solutions
  • Registrant Street: PO Box 459
  • Registrant City: PA
  • Registrant State/Province: US
  • Registrant Postal Code: 18222
  • Registrant Country: UNITED STATES
  • Administrative Name: Corral, Elizabeth|ATTN ACTIVEGAMINGINFO.COM|care of Network Solutions
  • Administrative Street: PO Box 459
  • Administrative City: Drums
  • Administrative State/Province: PA
  • Administrative Postal Code: 18222
  • Administrative Country: UNITED STATES
  • Administrative Email: xc2mv7ur8cw@networksolutionsprivateregistration.com
  • Administrative Phone: 5707088780
  • Name servers: NS23.DOMAINCONTROL.COM|NS24.DOMAINCONTROL.COM
Previously it was unclear if there were any .org hits, until we found the first one with clear comms: web.archive.org/web/20110624203548/http://awfaoi.org/hand.jar
Later on, two more clear ones were found with expired domain trackers:
further settling their existence. Later on, newimages.org also came to light.
Others that had been previously found in IP ranges but without clear comms:
  • 65.61.127.177: material-science.org
  • 212.4.17.61: tech-stop.org
  • 74.116.72.244 arborstribune.org
.org is very rare, and has been excluded from some of our search heuristics. That was a shame, but likely not much was missed.
