Test buy 2023-04-10 in the UK:
- fee: 0.99 pounds, minimum buy: 1.99 pounds
- bought 10 pounds' worth; after the 0.99 fee this totalled 0.00039162 BTC (£8.92), presumably after further fees/spread
- bitcoin price on Google on that day: 22,777.54 GBP / BTC
- bitcoin transaction fees were about 2.7 BTC on that day
Sending 5 pounds to wallet
12dg2FaiZLp3VzDtLvwPinaKz41TQcEGbs
- network fee: 0.00001989 BTC
- total bitcoin cost: -0.00023928 BTC
- new balance: 15,234 satoshi (39,162 - 23,928).
- total spent: £5.45
- time est.: about 30 minutes
This worked and I received 21939 satoshis (23928 - 1989) on Electrum on one of the outputs of transaction 1177268091cbeaacbcaac5dc4f6d1774c4ec11b4bcffafa555cd2775eafb954c.
Sending 1 satoshi back! The lowest fee in Electrum is 1120 satoshis, targeting 25 blocks (4 hours). Let's do it. Failed: the server forbids dust, the minimum is 1000 satoshis. OK, sending 1000 satoshis instead, with a 1139 satoshi fee.
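A minimal Python sketch just to double-check the satoshi arithmetic above (all figures are copied from the log, nothing new):

```python
# All amounts in satoshis (1 BTC = 100,000,000 satoshis), copied from the log above.
bought = 39_162       # 0.00039162 BTC from the £10 test buy
sent = 23_928         # 0.00023928 BTC total cost of the £5 send
network_fee = 1_989   # 0.00001989 BTC network fee

print(bought - sent)       # 15234: the new exchange balance
print(sent - network_fee)  # 21939: the amount received on Electrum

# Sterling sanity check against the Google price of the day.
gbp_per_btc = 22_777.54
print(round(sent / 1e8 * gbp_per_btc, 2))  # 5.45: matches "total spent: £5.45"
```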
CNN convolution kernels are also learnt
CNN convolution kernels are not hardcoded. They are learnt and optimized via backpropagation: you just specify their size! For example, in PyTorch, as used at activatedgeek/LeNet-5, you'd do just:
nn.Conv2d(1, 6, kernel_size=(5, 5))
This can also be inferred from stackoverflow.com/questions/55594969/how-to-visualise-filters-in-a-cnn-with-pytorch, where we see that the kernels are not perfectly regular as you'd expect from something hand-coded.
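As a minimal sketch of that point (assuming a recent PyTorch install; nothing here is specific to LeNet-5), you can see that the kernels are just randomly initialized learnable parameters rather than hand-coded filters:

```python
import torch.nn as nn

# Same layer as above: 1 input channel, 6 output channels, 5x5 kernels.
conv = nn.Conv2d(1, 6, kernel_size=(5, 5))

# The kernels live in conv.weight as ordinary learnable parameters.
print(conv.weight.shape)          # torch.Size([6, 1, 5, 5])
print(conv.weight.requires_grad)  # True: they get updated by backpropagation

# Freshly constructed, they are just random noise, not hand-designed edge detectors.
print(conv.weight[0, 0])
```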
Object detection model.
You can get some really sweet pre-trained versions of this, typically trained on the COCO dataset.
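For example, here is a hedged sketch using torchvision's Faster R-CNN weights pre-trained on COCO (the specific model and weight-loading call are assumptions for illustration, not something this section specifies):

```python
import torch
import torchvision

# Faster R-CNN with a ResNet-50 FPN backbone, pre-trained on the COCO dataset.
# The weights are downloaded on first use.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Run on a dummy image; real usage would load an actual photo as a 3xHxW tensor in [0, 1].
with torch.no_grad():
    predictions = model([torch.rand(3, 480, 640)])

# Each prediction is a dict of boxes, COCO class labels and confidence scores.
print(predictions[0]["boxes"].shape)
print(predictions[0]["labels"][:5])
print(predictions[0]["scores"][:5])
```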
This is about transactions that are interesting not because of their inscriptions, but for some other reason, such as transaction size, etc.
mlcommons.org/en/: their homepage is not amazingly organized, but it does the job.
Benchmark focused on deep learning. It has two parts: training and inference. Furthermore, a specific network model is specified for each benchmark in the closed category, so it goes beyond just specifying the dataset.
Results can be seen e.g. at:
- training: mlcommons.org/en/training-normal-21/
- inference: mlcommons.org/en/inference-datacenter-21/
And there are also separate repositories for each part.
E.g. on mlcommons.org/en/training-normal-21/ we can see what the benchmarks are:
| Dataset | Model |
|---|---|
| ImageNet | ResNet |
| KiTS19 | 3D U-Net |
| OpenImages | RetinaNet |
| COCO dataset | Mask R-CNN |
| LibriSpeech | RNN-T |
| Wikipedia | BERT |
| 1TB Clickthrough | DLRM |
| Go | MiniGo |
The most important thing this project provides appears to be the `.onnx` file format, which represents ANN models, pre-trained or not. Deep learning frameworks can then output such `.onnx` files for interchangeability and serialization. Some examples:
- activatedgeek/LeNet-5 produces a trained `.onnx` file from PyTorch
- MLperf v2.1 ResNet can use an `.onnx` file as a pre-trained model
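As a hedged sketch of what the framework-side export looks like in PyTorch (the tiny model here is made up for illustration; it is not the LeNet-5 or ResNet mentioned above):

```python
import torch
import torch.nn as nn

# A throwaway model just to have something to serialize.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

model = TinyNet()
dummy_input = torch.randn(1, 4)

# Write the graph and weights to the .onnx interchange format.
torch.onnx.export(model, dummy_input, "tiny.onnx")

# Any ONNX-aware runtime, e.g. onnxruntime, can then load tiny.onnx independently of PyTorch.
```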
Some interesting analysis by Parth Shukla (twitter.com/pparth, www.linkedin.com/in/parth-shukla-59583b20/):
Apparently most of the routers were Chinese. No surprise there.
Data format overview: opendata.stackexchange.com/questions/1951/dataset-of-domain-names/21077#21077
TODO: was this data also obtained illegally, like the Carna botnet data?
The Neurokernel Project aims to build an open software platform for the emulation of the entire brain of the fruit fly Drosophila melanogaster on multiple Graphics Processing Units (GPUs).
The student-organized bar of the École. There's a corresponding Binet that takes care of it.
- www.facebook.com/events/d41d8cd9/b%C3%B4bar-polytechnique/306343312823548/
- www.leparisien.fr/faits-divers/un-bar-clandestin-decouvert-a-polytechnique-25-06-2006-2007106594.php: in 2006, more than 30 years after 1975, the police finally discovered that the bar was not licensed to sell alcohol