For the discovery of green fluorescent protein.
It is quite amusing that the starting point for identifying the heat receptor was capsaicin, since capsaicin stimulates that very same receptor!!!
Shows up when trying to solve the 2D wave equation on a circular domain in polar coordinates with separation of variables, where we have to decompose the initial condition in terms of a Fourier-Bessel series, exactly like the Fourier series appears when solving the wave equation in Cartesian coordinates.
For the same fundamental reasons, it also appears when solving the Schrödinger equation for the hydrogen atom.
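A quick sketch of how that comes about in the circular drum case: writing the 2D wave equation in polar coordinates,
\frac{\partial^2 u}{\partial t^2} = c^2 \left( \frac{\partial^2 u}{\partial r^2} + \frac{1}{r} \frac{\partial u}{\partial r} + \frac{1}{r^2} \frac{\partial^2 u}{\partial \theta^2} \right)
and separating variables as u(r, \theta, t) = R(r) \Theta(\theta) T(t), with T'' = -c^2 \lambda^2 T and \Theta'' = -n^2 \Theta, leaves the radial factor satisfying
r^2 R'' + r R' + (\lambda^2 r^2 - n^2) R = 0
which is Bessel's equation: the solutions bounded at the origin are R(r) = J_n(\lambda r), the boundary condition J_n(\lambda a) = 0 on a disk of radius a quantizes \lambda, and expanding the initial condition over those J_n(\lambda r) is the Fourier-Bessel series mentioned above.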
One of the holiest age-old debugging techniques!
Git has some helpers that get you to bisection Nirvana: stackoverflow.com/questions/4713088/how-to-use-git-bisect/22592593#22592593
Obviously this is not restricted to software engineering: it is used in all areas of engineering, e.g. the video "Air-tight vs. Vacuum-tight" by AlphaPhoenix (2020) uses it in vacuum engineering.
The cool thing about bisection is that it is a brainless process: unlike when using a debugger, you don't have to understand anything about the system, and it narrows the cause of the problem down for you incredibly well. Not having to think is great!
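To make the "brainless" part concrete, here is a minimal sketch of the underlying algorithm in JavaScript; the version list and the isBad test are made up for illustration, and git bisect essentially runs this same loop over commits for you:
// Generic bisection: the versions are assumed to be all good up to some
// point and all bad afterwards; find the first bad one.
// isBad is any yes/no check you can run on a version: a unit test,
// a manual look at the output, anything. No understanding of the
// system internals is required.
function bisect(versions, isBad) {
  let lo = 0;                     // assumed known good
  let hi = versions.length - 1;   // assumed known bad
  while (hi - lo > 1) {
    const mid = Math.floor((lo + hi) / 2);
    if (isBad(versions[mid])) {
      hi = mid; // the bug is already present here
    } else {
      lo = mid; // still fine here
    }
  }
  return versions[hi]; // first bad version
}

// Hypothetical usage: 16 versions, bug introduced in v11.
const versions = Array.from({ length: 16 }, (_, i) => `v${i}`);
console.log(bisect(versions, v => Number(v.slice(1)) >= 11)); // v11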
JavaScript snippet:
new class extends OurbigbookCanvasDemo {
  init() {
    super.init('webgl', {context_type: 'webgl'});
    // Cover the whole canvas and clear it to black.
    this.ctx.viewport(0, 0, this.ctx.drawingBufferWidth, this.ctx.drawingBufferHeight);
    this.ctx.clearColor(0.0, 0.0, 0.0, 1.0);
    // Vertex shader: takes a single float attribute and uses it as the
    // x coordinate of a 64-pixel point.
    this.vertexShaderSource = `
#version 100
precision highp float;
attribute float position;
void main() {
  gl_Position = vec4(position, 0.0, 0.0, 1.0);
  gl_PointSize = 64.0;
}
`;
    // Fragment shader: paint every fragment of the point a fixed purple.
    this.fragmentShaderSource = `
#version 100
precision mediump float;
void main() {
  gl_FragColor = vec4(0.18, 0.0, 0.34, 1.0);
}
`;
    // Compile both shaders.
    this.vertexShader = this.ctx.createShader(this.ctx.VERTEX_SHADER);
    this.ctx.shaderSource(this.vertexShader, this.vertexShaderSource);
    this.ctx.compileShader(this.vertexShader);
    this.fragmentShader = this.ctx.createShader(this.ctx.FRAGMENT_SHADER);
    this.ctx.shaderSource(this.fragmentShader, this.fragmentShaderSource);
    this.ctx.compileShader(this.fragmentShader);
    // Link them into a program. After linking, the individual shader
    // objects are no longer needed and can be detached and deleted.
    this.program = this.ctx.createProgram();
    this.ctx.attachShader(this.program, this.vertexShader);
    this.ctx.attachShader(this.program, this.fragmentShader);
    this.ctx.linkProgram(this.program);
    this.ctx.detachShader(this.program, this.vertexShader);
    this.ctx.detachShader(this.program, this.fragmentShader);
    this.ctx.deleteShader(this.vertexShader);
    this.ctx.deleteShader(this.fragmentShader);
    if (!this.ctx.getProgramParameter(this.program, this.ctx.LINK_STATUS)) {
      console.log('error ' + this.ctx.getProgramInfoLog(this.program));
      return;
    }
    // Feed attribute location 0 (the shader's single `position` attribute)
    // from a buffer, one float per vertex.
    this.ctx.enableVertexAttribArray(0);
    var buffer = this.ctx.createBuffer();
    this.ctx.bindBuffer(this.ctx.ARRAY_BUFFER, buffer);
    this.ctx.vertexAttribPointer(0, 1, this.ctx.FLOAT, false, 0, 0);
    this.ctx.useProgram(this.program);
  }
  draw() {
    // Every frame: clear the screen, upload a new x position that
    // oscillates with time, and draw a single point.
    this.ctx.clear(this.ctx.COLOR_BUFFER_BIT);
    this.ctx.bufferData(this.ctx.ARRAY_BUFFER, new Float32Array([Math.sin(this.time / 60.0)]), this.ctx.STATIC_DRAW);
    this.ctx.drawArrays(this.ctx.POINTS, 0, 1);
  }
}
Webpack is like a magic hydra that can eat any type of file and bundle it into a single output: .js, .ts, .css, .scss, .jsx, .tsx, require, import, importing CSS from .js, it doesn't matter at all, it just digests everything into the same dump. When it works, you are just left in awe and with a single .js file. When it doesn't, you're fucked and have to debug for several hours.
Demos under: webpack/. To run all of them, by default:
cd webpack/min
npm install
npm run build
xdg-open index.html
To easily make changes and reload the .js output live, let this run on a terminal:
npx webpack watch
Examples:
- webpack/min: minimal hello world. Doesn't do much, just copies index.js to dist/index.js.
- webpack/require: require and import demo. Both work from the same file. dist/index.js now contains all of:
  - notindex.js
  - notindex2.js
  - Lodash, a common third-party helper library specified in the package.json and installed with npm
- webpack/node: produce Node.js output, as opposed to the default web output. To test it run:
  npm run build
  node dist/index.js
  Achieved simply with:
  target: 'node'
  as documented at: webpack.js.org/concepts/targets/ See also the config sketch after this list.
- webpack/sequelize: attempts at getting Sequelize to work with webpack. It's just not supported by Sequelize.
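Not the actual file from those demos, but a minimal webpack.config.js sketch of what the webpack/node case boils down to, using the documented target: 'node' option (the entry and output names here are just illustrative):
// webpack.config.js: hedged sketch, not the exact contents of webpack/node.
const path = require('path');

module.exports = {
  entry: './index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'index.js',
  },
  // The key line: produce a bundle meant to be run with
  // `node dist/index.js` instead of in a browser.
  // https://webpack.js.org/concepts/targets/
  target: 'node',
};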
www.threekit.com/blog/gltf-everything-you-need-to-know: comparison of several formats.
This is likely a system that uploads text to the blockchain.
One example can be seen on the marijuana plant.
Messages are uploaded one line per transaction, and thus may be cut up in the blk.txt file, and possibly even appear out of order.
But because each line starts with j( you can generally piece things back together regardless. TODO identify. The first occurrence seems to be in tx e8c61e29c6b829e289f8d0fc95f9eb2eb00c89c85cfa3a9c700b15805451ae6a:
j(DOCPROOF@?pnvf=!;AG
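As a very rough illustration of that reassembly step, assuming you have already dumped the candidate payload lines from blk.txt into an array of strings (the dump itself and any reordering across transactions are not shown; the prefix handling is just a guess at what works in practice):
// Minimal sketch: keep only the lines that look like they belong to this
// uploader, i.e. lines starting with "j(", and strip that prefix.
function extractUploaderLines(lines) {
  return lines
    .filter(line => line.startsWith('j('))
    .map(line => line.slice('j('.length));
}

// Hypothetical input.
console.log(extractUploaderLines([
  'j(DOCPROOF@?pnvf=!;AG',
  'some unrelated transaction data',
]));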
This looks a lot like the beans that Brazilians venerate, and it can easily be found in the United Kingdom as of 2020.
The more exact type seems to be pinto bean, but this is close enough.
2021-03: same but 2.5 teaspoons, seems to be the right amount.
2021-02-10: attempt 3: 500 g, 1 hour 30 minutes, no pressure, uncontrolled water. Salt with one chorizo: put 3 teaspoons, it was a bit too much, going to do 2 next time and see.
2020-12-14: attempt 3: 250g of beans, 1.5l of water, 30 minutes pressure.
2020-11-30: attempt 2: 275 ml of dry beans, about 50% of the 500 g bag, putting 1650 ml (6x) of water in the pressure cooker. Still had to throw out some water.
Density dry raw: 216 g/250 ml = 432 g / 500 ml = 500 g / 580 ml = 864 g/L
500 g dry expands in water after 12 hours to: 1200 ml
Therefore 500 g dry ≈ 580 ml, which expands to 1200 ml, i.e. about 2x.
Therefore, filling the cooker to its 2.5 L maximum with 8x the dry volume of water from this recipe, I can use:
2500 ml = volume of expanded beans + volume of water = 2 x dry volume + 8 x dry volume = 10 x dry volume
and so:
dry volume = 2500 / 10 = 250 ml
which is about 250 / 580 = 43% of the 500 g bag.
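The same calculation as a small script, handy if the cooker volume, the water ratio or the expansion estimate changes (all the constants are just the estimates from above):
// How much dry bean volume fits in the pressure cooker?
const cookerMl = 2500;      // usable cooker volume
const expansionFactor = 2;  // soaked volume / dry volume, estimated above
const waterRatio = 8;       // water volume per dry bean volume (recipe)
const bagDryMl = 580;       // measured volume of the 500 g bag

// cookerMl = expansionFactor * dryMl + waterRatio * dryMl
const dryMl = cookerMl / (expansionFactor + waterRatio);
console.log(dryMl);            // 250 ml
console.log(dryMl / bagDryMl); // ~0.43 of the 500 g bag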
After the first try, I found that 8x the volume of water is way, way too much. Going to try 6x next time.
Opens a virtual MIDI piano GUI. It just works on Ubuntu 20.04: askubuntu.com/questions/34391/virtual-midi-piano-keyboard-setup/1298026#1298026
VMPK is a virtual device that replicates what you would get by connecting a physical MIDI keyboard to your computer. It is not a software synthesizer on its own, but it does connect to a working synthesizer by default (Sonivox EAS), which makes it produce sounds out of the box.
TODO: then I messed with my sound settings, and it stopped working with the default "MIDI Connection" > "MIDI Out Driver" > "Network" setting. But it still works with "SonivoxEAS".
A hello world of actually connecting it manually to a specific software synthesizer on Advanced Linux Sound Architecture with aconnect can be found at: askubuntu.com/questions/34391/virtual-midi-piano-keyboard-setup/1298026#1298026
Save to a MIDI file: askubuntu.com/questions/709673/save-as-midi-when-playing-from-vmpk-qsynth/1298231#1298231
Reasonable default key mappings to the keyboard, covering 2 octaves.
Pressing 3 keys simultaneously did not work (tested "ZQI"). This might just be a limitation of my keyboard, however.
TODO how to save to a .mid file? askubuntu.com/questions/709673/save-as-midi-when-playing-from-vmpk-qsynth