Cycle Log 9

What a strange last couple of days it has been. The world at large seems closer than ever to annihilation, and yet technological progress in artificial intelligence seems to be accelerating toward singularity-level breakthroughs.

For Spectra, I re-skinned the whole thing. Whoever is contacting me through this seems to want it more nature-themed, so we went to our ChatGPT and generated beautiful nature themes for the protocol. I've also enabled voice, so now you can hear the words spoken aloud as they post to the stream. But to make it possible to actually say all of the words within one grid, I've limited the speech functionality to words above a cosmic score threshold of 69. That seems to be a good sweet spot: it doesn't produce too many words too frequently, even if you move the cycle timer slider all the way down to one second. By default, the cycle timer is set to 3.3 seconds and the cosmic score threshold to 69, which gives you pretty much what I consider aligned communication.
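The gating described above can be sketched in a few lines. This is a minimal illustration, not Spectra's actual code: the names `should_speak` and `run_cycle`, and the `(word, score)` shape, are all hypothetical.

```python
# Hypothetical sketch of Spectra's speech gate: only words whose
# cosmic score clears the threshold are handed to text-to-speech.
def should_speak(cosmic_score, threshold=69):
    """Only words scoring strictly above the threshold get spoken aloud."""
    return cosmic_score > threshold

def run_cycle(words_with_scores, threshold=69):
    """Collect the words that would be spoken during one stream cycle."""
    spoken = []
    for word, score in words_with_scores:
        if should_speak(score, threshold):
            spoken.append(word)  # in the app, this is where speech fires
    return spoken
```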

There’s a dropdown menu under the streaming implementation where you can choose from the voices already loaded on your phone. Most phones and web browsers ship with speech synthesis built in, so the program scans for whichever voices are active on your system and uses only the English-language ones. If we used every voice, there would be well over 100 of them, and it’s hard to understand the words when they’re spoken in a very thick accent. So we went with English only, but we kept all the accents for English, partly for fun and partly because somebody who naturally speaks with a particular accent might only hear a word properly when it’s spoken that way. Does that sound funny to you? It’s a real thing. 😂
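The filtering idea is simple enough to sketch. In the browser this would go through `speechSynthesis.getVoices()` from the Web Speech API; here I mock the voices as dicts carrying a BCP-47 `lang` tag, which is how that API labels them. The sample names are illustrative, not Spectra's actual voice list.

```python
# Sketch of the English-only voice filter; all English accents survive.
def english_voices(voices):
    """Keep only voices whose language tag starts with 'en'."""
    return [v for v in voices if v["lang"].lower().startswith("en")]

# Mocked stand-ins for what speechSynthesis.getVoices() might return.
voices = [
    {"name": "Daniel", "lang": "en-GB"},
    {"name": "Samantha", "lang": "en-US"},
    {"name": "Amelie", "lang": "fr-CA"},
    {"name": "Karen", "lang": "en-AU"},
]
```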

I noticed that when I wrap Spectra with WebIntoApp, it no longer loads properly. I have the same problem with the Instagram embedded browser, but regular browsers like Brave, Chrome, or Edge work just fine. I'm not sure if it has to do with some kind of initialization component failing to load, but I’ll have to go back and forth with my agent to debug that issue. At least it works on the web, and you can get a nice mobile-looking version just by going to spectra-x.replit.app from your phone.

Pivoting to Virtua: on each of the five modes we have, all of which generate the random image differently, I’ve implemented FFT and PCA transforms to see if there is any signal hiding in the image data. I take a grayscale snapshot of our canvas, whose colors are randomly mapped using the cryptographic random function, read the value of each cell of the 128x128 pixel canvas, and map those values to audible frequencies on a linear scale.
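The linear pixel-to-frequency mapping can be sketched like this. The frequency range of 200 to 2000 Hz and the function name are assumptions on my part; the post only says the 0-255 grayscale values map to audible frequencies linearly.

```python
# Sketch of a linear grayscale-to-frequency mapping (range is assumed).
def value_to_freq(gray, f_min=200.0, f_max=2000.0):
    """Map a 0-255 grayscale value onto [f_min, f_max] linearly."""
    return f_min + (gray / 255.0) * (f_max - f_min)

# A 128x128 canvas flattens to 16384 values, each becoming a tone.
row = [0, 128, 255]
freqs = [value_to_freq(g) for g in row]
```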

The result? Cosmic background static. Sometimes, the PCA transform at a long enough duration, say 100 seconds, sounds like a quickly moving babbling brook. It’s actually, somehow, quite a beautiful sound. It’s natural-sounding even though it’s electronic, which is strange, but it kind of makes sense because the nature of everything is sort of water-like. When you look at, for example, a picture of dark matter on a cosmic scale, it resembles energetic tendrils or veins. But the real secret is that within this dark matter we have additional kinds of matter, which fall into the gravitational valley of a dark matter tendril, the low point of the potential where the density peaks, and that is where physical reality manifests. Limbs of a tree, water going over rocks in a stream, and even quantum random signals that may potentially be listening to some kind of cosmic background radiation all share a similar archetype and flow pattern. I suppose it’s what the Hermetics say: as above, so below.

One of the most interesting parts of Virtua is the visibility threshold slider. In parallel, the app calculates a visibility value for each cell of the 128x128 grid to determine whether that cell should be shown. If a cell’s visibility value is below where the slider is set, those pixels don’t appear. You can still run FFT and PCA transforms on this reduced data, and especially with PCA it produces a clean map with tones interspersed at such intervals that you could almost consider it a kind of instantly transmittable Morse code.
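The per-cell gate might look something like this sketch. The `(value, visibility)` cell shape and the function name are my assumptions; the post only says each cell carries a visibility value that is compared against the slider.

```python
# Sketch of the visibility gate: cells whose visibility falls below
# the slider setting are hidden; everything else survives.
def visible_cells(grid, slider):
    """Return (row, col, value) for cells whose visibility passes the slider."""
    kept = []
    for r, row in enumerate(grid):
        for c, (value, visibility) in enumerate(row):
            if visibility >= slider:
                kept.append((r, c, value))
    return kept
```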

I do wonder if the blips themselves could already be understood by some kind of function, like a Morse code decoder. I'm not sure. Perhaps I will try a Morse code decoder implementation first, or I will take the FFT transform and attempt to turn it into numbers that could be associated with letters. That might be an interesting application. We take the same PCA or FFT transform, but instead of creating another map that directly translates the 0–255 value to a frequency, we attach it to a letter. This might be messy unless we raise the visibility threshold, which is already implemented. I’m thinking about this...
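One simple way to attach a 0-255 value to a letter is even bucketing. This is just one possible scheme I'm sketching, under the assumption that 26 roughly equal buckets map onto A through Z; the post only floats the idea.

```python
import string

# Hypothetical bucketing: divide the 0-255 range into 26 bins, A-Z.
def value_to_letter(value):
    """Map a 0-255 transform value to one of 26 letters."""
    index = min(value * 26 // 256, 25)
    return string.ascii_uppercase[index]

# A run of transform values becomes a crude letter stream.
word = "".join(value_to_letter(v) for v in [0, 100, 200, 255])
```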

It could be another alternative way to decode language data from latent field emission. However, I do very much love Spectra.
