All that remains here are practice pieces for vibe coding.

0028

The board game Reversi (Othello) can be used to create rhythms, right?
It's an 8x8 grid, but if you split it into upper and lower halves and lay them side by side, it becomes 4x16. That's perfect for representing one measure of sixteenth notes across four sounds: hi-hat, handclaps, snare drum, and bass drum. White pieces get stronger accents; black pieces, slightly weaker ones.
This time, the video uses sounds generated in real time, but I also made it possible to play MIDI sounds, so it can produce realistic sounds too.
But it's a bit sluggish.
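The board-to-rhythm mapping described above can be sketched roughly like this. Everything here is an assumption for illustration: the function name `board_to_pattern`, the cell encoding (0 = empty, 1 = white, 2 = black), and the velocity values are all hypothetical, not the app's actual code.

```python
# Hypothetical sketch of the Reversi-to-drum-grid mapping.
# Cell encoding (assumed): 0 = empty, 1 = white (strong accent), 2 = black (weaker).
INSTRUMENTS = ["hihat", "clap", "snare", "kick"]
VELOCITY = {1: 127, 2: 90}  # assumed accent levels: white louder than black

def board_to_pattern(board):
    """board: 8 rows x 8 cols. Upper half (rows 0-3) fills steps 0-7,
    lower half (rows 4-7) fills steps 8-15, giving a 4x16 step grid."""
    pattern = {name: [0] * 16 for name in INSTRUMENTS}
    for r in range(8):
        for c in range(8):
            piece = board[r][c]
            if piece == 0:
                continue
            inst = INSTRUMENTS[r % 4]          # rows 0-3 and 4-7 share the same 4 instruments
            step = c + (8 if r >= 4 else 0)    # lower half becomes the second half of the bar
            pattern[inst][step] = VELOCITY[piece]
    return pattern
```

A sequencer would then scan each 16-element row once per measure, triggering the instrument at every nonzero step with the stored velocity.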

So: Reversi turns into a rhythm machine.
To shoot this video, I bought a 500-yen Reversi set at Daiso. Was it really worth the price of a meal? Still, I have a feeling Reversi could be used for other ideas too.

0027

In this vibe coding session, I played around with my brain MRI data.
The upper half of the screen displays the internal structure of the head. Moving the face away from the camera shows a cross-section of the front of the head, while moving closer displays a cross-section of the back.
Although I had horizontal cross-section data, I lacked vertical cross-section data. So, I generated 3D data from the horizontal cross-section data and reconstructed the vertical cross-section images from that.
However, since there were only about 30 horizontal cross-sections, the reconstructed images based on them inevitably ended up quite rough.
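The reconstruction step described above (building a volume from a small stack of horizontal slices, then cutting vertical slices out of it) can be sketched as follows. This is a minimal sketch under assumptions: the function name, the simple per-column linear interpolation, and the `z_scale` upsampling factor are all illustrative, not the actual implementation.

```python
import numpy as np

def reconstruct_vertical_slice(axial_slices, x, z_scale=8):
    """axial_slices: list of 2D arrays (H x W), the horizontal cross-sections
    stacked bottom to top (only ~30 of them here, hence the rough result).
    Returns a vertical (sagittal) slice at column x, linearly interpolated
    along the stacking axis to compensate for the coarse slice spacing."""
    volume = np.stack(axial_slices, axis=0)      # shape: (n_slices, H, W)
    sagittal = volume[:, :, x].astype(float)     # shape: (n_slices, H)
    n, h = sagittal.shape
    z_old = np.arange(n)
    z_new = np.linspace(0, n - 1, n * z_scale)   # upsample the vertical axis
    out = np.empty((len(z_new), h))
    for col in range(h):
        out[:, col] = np.interp(z_new, z_old, sagittal[:, col])
    return out
```

With only ~30 source slices, interpolation can smooth the result but cannot recover detail between slices, which matches the roughness mentioned above.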

I got the idea of overlaying my MRI data, but since I had no vertical cross-section data, I made it reconstruct them from the horizontal cross-sections. As a result, the MRI images came out rough, which is a shame.
I thought I was using Gemini-CLI for free, but without noticing I'd been charged on a pay-as-you-go plan, and I cried.

0026

I wanted to try using voice recognition, but I couldn't figure out how to make it into something interesting. ;(
In the end, it turned into a very simple sampler or looper thing.

I spent a few days wondering whether I could do something with speech recognition, but no idea struck me as interesting. I wanted to give it at least one more twist, but I gave up.

You can try this app in your web browser.

0025

This work has no sound.
I wanted to implement the 'marbling' technique used in painting on a computer, so I tried making this.
But unfortunately, neither I nor Gemini-CLI (nor the CPU power of my old Mac) was up to the task, and I couldn't implement it as I'd hoped. :(
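For reference, the core of digital marbling is often modeled with the ink-drop deformation from "mathematical marbling": dropping a circular ink blob pushes every existing point radially away from the drop center. The sketch below shows only that one deformation; the function name and the point-list representation are assumptions, and the actual app may work entirely differently.

```python
import math

def drop_ink(points, cx, cy, r):
    """One ink-drop deformation from mathematical marbling: a drop of
    radius r at (cx, cy) moves each point p to
    p' = c + (p - c) * sqrt(1 + r^2 / |p - c|^2),
    so a point at distance d from the center ends up at sqrt(d^2 + r^2)."""
    out = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        d2 = dx * dx + dy * dy
        if d2 == 0:
            out.append((x, y))          # a point exactly at the center stays put
            continue
        s = math.sqrt(1.0 + r * r / d2)
        out.append((cx + dx * s, cy + dy * s))
    return out
```

Marbling renderers apply many such drops (plus stroke deformations) to the boundary curves of each color region, which is where the CPU cost piles up.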

I learned there's an art technique called marbling and wanted to try implementing it on a computer.
I've already spent days on this, and in the end Gemini-CLI (gemini-2.5-pro) ran away, saying "Conclusion: I do not have the ability to solve this problem." lol
So it's less polished than I'd hoped, but I'll leave it at this.

0024

This is a minor update to the previous version.
The previous version could control 9 MIDI Control Changes, but this time I've made it possible to control up to 15 MIDI Control Changes based on 15 body parameters. Well, this video only uses some of them.
I also added more overlay images to the video to make it a bit easier to understand what's happening.
It doesn't seem to be popular, which is a bit disappointing, but I personally quite like it.

A small touch-up of the previous version: the number of controllable MIDI Control Changes is now 15.
But lately I've realized it's impossible to control each body part fully independently.
So in the end I only use some of the parameters.

You can try this app in your web browser. Please paste the sample script at the bottom of its page into the Strudel web page. MIDI settings are required.

0023

This is an experiment controlling MIDI with video.
It acquires eight values from body part positions and fingertip distances, converts them into MIDI control change data, and passes them to the music scripting language Strudel.
In Strudel, I created a script that uses MIDI control change values to control any sound.
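The conversion step described above (normalized body-tracking values into MIDI Control Change messages) can be sketched like this. The function name, the starting controller number, and the channel are assumptions for illustration; the actual app presumably sends these bytes via the Web MIDI API.

```python
def values_to_cc(values, channel=0, first_cc=1):
    """Convert normalized tracking values (0.0-1.0) into MIDI Control Change
    messages as [status, controller, value] byte triples."""
    msgs = []
    for i, v in enumerate(values):
        v = min(max(v, 0.0), 1.0)          # clamp out-of-range tracking noise
        status = 0xB0 | (channel & 0x0F)   # CC status byte on the given channel
        msgs.append([status, first_cc + i, round(v * 127)])
    return msgs
```

On the Strudel side, a pattern can then read each controller value and route it to any sound parameter, which is what the script mentioned above does.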

Having learned about the music scripting language Strudel, I set up a script that can be controlled via MIDI (an instrument communication protocol), then generated MIDI data from body movements and passed it to Strudel.
In the sense of controlling sound with the body, it's no different from what I've done before, but using Strudel improves the musical quality considerably.
Even so, I have no musical training, so the result is still amateurish; it can't be helped.

You can try this app in your web browser. Please paste the sample script at the bottom of its page into the Strudel web page. MIDI settings are required.

0022

Copernicus thought the Earth was moving.
So when I'm doing push-ups, maybe it's not me moving, but the floor?
This video is proof.
Anyway, I was hoping for a bit more funny footage...
I thought about deleting this video, but I've deleted so many lately, so I'll just upload it. ;p

An idea that hit me while watching the anime "チ" (Orb: On the Movements of the Earth).
It's not me moving, it's the Earth!
I made it, but the footage turned out even more boring than expected.
On Instagram you can see a graph of how many seconds into a Reel viewers drop off, and the results for this video look pretty grim...

0021

Meta's new glasses apparently have electromyography sensors built in, allowing you to zoom using arm movements.
I can't afford them, but it made me want to try making something a bit similar myself.
Pinch in with both hands to zoom in; bring your palms closer together to zoom out.
It's a silent, unobtrusive little creation, but I feel like I've gotten just a little bit closer to the future too.
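The zoom control described above can be sketched as a mapping from the change in hand distance to a zoom factor. Everything here is an assumption: the function name, the sensitivity, and the zoom limits are illustrative, and the real app distinguishes between the pinch gesture and palm distance, which this sketch collapses into one delta.

```python
def update_zoom(zoom, prev_dist, dist, sensitivity=2.0, lo=0.5, hi=4.0):
    """Update a zoom factor from normalized (0.0-1.0) hand-distance readings:
    a growing distance zooms in, a shrinking one zooms out, clamped to [lo, hi]."""
    if prev_dist is None:
        return zoom                        # no previous frame yet; keep current zoom
    zoom *= 1.0 + (dist - prev_dist) * sensitivity
    return min(max(zoom, lo), hi)
```

Using the frame-to-frame delta rather than the absolute distance means the gesture behaves like a relative control: you can release and re-grab without the zoom jumping.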

Controlling zoom in and zoom out with both hands.
It's simple vibe coding, but this level is just right.