Tuesday, February 1, 2022

Precursor to NFTs

In January of 2014, I tweeted some digital images I had designed that one could fairly call a precursor to Non-Fungible Tokens, which are all the rage today. I was trying to design content that could not be forged.

I came up with names like the "ALEXCOIN". I had been studying the rise of Bitcoin and the Blockchain for years and believed that I could use blockchain technology to create art that could not be forged. I didn't have the technical skill to actually make what would today be called NFTs, so I did it my way, via Twitter.

Another name I used was the "ANTIFACECOIN". The concept of "Antiface" and the "Antiface Strategy" were things I had been working on for over a decade at that point. The idea, again, was to create art that could not be forged, art where I could be the only possible creator. At the time, I thought I had succeeded with these little images. This one, in particular, was an actual painting done in acrylic. The others are digital designs meant to be self-portraits. That was one of the aspects of my concept: these "Coins" were also self-portraits, or self-referring in a sense.

Here in the following tweet, I am calling them "AntifaceCards". I saw them as collectibles. I was really ahead of my time!

Here is proof, in any case, that I was working on the concept of NFT or Non-Fungible Token as early as January, 2014. I don't know if the term NFT even existed then. But certainly I was thinking at the time of the idea of art that could not be forged, using a technological support or backbone such as the Blockchain.

NB: It turns out that I beat the first NFT by several months: it was created in May, 2014, and mine in January, 2014.

"The first known "NFT", Quantum, was created by Kevin McCoy and Anil Dash in May 2014, consisting of a video clip made by McCoy's wife Jennifer. McCoy registered the video on the Namecoin blockchain and sold it to Dash for $4, during a live presentation for the Seven on Seven conference at the New Museum in New York City. McCoy and Dash referred to the technology as "monetized graphics". A non-fungible, tradable blockchain marker was explicitly linked to a work of art, via on-chain metadata (enabled by Namecoin). This is in contrast to the multi-unit, fungible, metadata-less "colored coins" of other blockchains and Counterparty." - Non-fungible token - Wikipedia

Friday, February 19, 2021

Morphological Transformations

I started playing around with the OpenCV library in Python ("cv2"). Here are some of the mathematical morphological operations that can be done on an image of your choosing:

Here is an example of an "erosion" operation done on an image of a gradient. First, here is the input image that I used, of a gradient I also generated programmatically via Python:

And here is the processed image, using the erosion operation with a 3 x 3 kernel of 1s:
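In code, the operation amounts to something like this (a minimal sketch; the filenames are placeholders):

    import cv2
    import numpy as np

    # the 3 x 3 kernel of 1s described above
    kernel = np.ones((3, 3), np.uint8)

    # "gradient.png" is a placeholder filename for the input image above
    img = cv2.imread("gradient.png", cv2.IMREAD_GRAYSCALE)
    eroded = cv2.erode(img, kernel, iterations=1)
    cv2.imwrite("gradient_eroded.png", eroded)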

As I said, I've been playing around with these mathematical morphological operations, erosion, dilation, edge detection and sharpening of images, all using the OpenCV library. The functions are pretty simple to use, and the library is really powerful, giving you everything you need at the touch of a simple function. One can also use function composition to apply one operation on top of another. I wrote some helper functions to facilitate the task of using the morphological transformation functions.
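For instance, the helpers might look something like this (a sketch; the names and the choice of operations are mine, reusing the kernel and image from the snippet above):

    # small wrappers around the cv2 morphological operations
    def erode(image):
        return cv2.erode(image, kernel, iterations=1)

    def dilate(image):
        return cv2.dilate(image, kernel, iterations=1)

    def compose(f, g):
        """Function composition: compose(f, g)(x) == f(g(x))."""
        return lambda x: f(g(x))

    # dilation followed by erosion, i.e. a morphological "closing"
    closing = compose(erode, dilate)
    cv2.imwrite("gradient_closed.png", closing(img))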

Monday, February 15, 2021

Generating a Gradient Programmatically

I've posted many times on this blog about my experiments in image generation with Python. I've been working on what I call my "Noisefield" project, where I generated bitmaps with random black and white pixels. I have now done something similar, except this time I generated a black to white gradient, using the PIL library in Python.

The idea is pretty simple. You create a vector or array of values from 0 to 255, then repeat it 256 times to make a square matrix, or bitmap. Because PIL fills in the pixels from the top-left, moving left to right and then down, line by line, every row ends up being the same 0-to-255 ramp, and that way you get a gradient. I took the "LINE" variable, which I created with a for loop and the range() function, and multiplied it by 256 into a variable called MATRIX. I then wrapped the MATRIX in an iterator, which I consumed in the image-generation function by calling next(MATRIX). The result is a smooth gradient.
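A minimal sketch of that approach (here I use putdata() to fill in the image, which replaces the iterator-and-next() detail described above):

    from PIL import Image

    SIZE = 256
    LINE = [i for i in range(SIZE)]  # one row of values, 0 to 255
    MATRIX = LINE * SIZE             # repeat the row 256 times

    img = Image.new("L", (SIZE, SIZE))  # 8-bit grayscale image
    img.putdata(MATRIX)                 # fills left to right, top to bottom
    img.save("gradient.png")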

ADDENDUM: I have found a simpler way of generating a black to white gradient, using the np.linspace function in Numpy.
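Something like this (a sketch, assuming a 256 x 256 grayscale image):

    import numpy as np
    from PIL import Image

    row = np.linspace(0, 255, 256, dtype=np.uint8)  # 0 to 255 in 256 steps
    gradient = np.tile(row, (256, 1))               # stack the row into a square
    Image.fromarray(gradient).save("gradient_linspace.png")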

FOR YOUR INFORMATION: I have written a few blog posts in the past about generating bitmaps with random black and white pixel values, or grey values between 0 and 255. I found a much quicker way to do this using the OpenCV library (cv2). In passing, the np.random.randint() function gives the same results as the uniform_noise ("randu") function in the second Gist below:
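A sketch of both versions (the array shape and filenames are placeholders):

    import numpy as np
    import cv2

    # uniform noise via NumPy
    noise = np.random.randint(0, 256, (256, 256), dtype=np.uint8)

    # the same thing via OpenCV, filling a preallocated array in place
    uniform_noise = np.zeros((256, 256), dtype=np.uint8)
    cv2.randu(uniform_noise, 0, 256)  # upper bound exclusive: values 0..255

    cv2.imwrite("uniform_noise.png", uniform_noise)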

Here's another way to generate noise, Gaussian noise in this case, via OpenCV (cv2). The Gaussian distribution is "smoother" in appearance, here with mean 128 and standard deviation 20:
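Reusing the imports above, this is roughly (a sketch, with the mean and standard deviation passed straight to cv2.randn):

    gaussian_noise = np.zeros((256, 256), dtype=np.uint8)
    cv2.randn(gaussian_noise, 128, 20)  # mean 128, standard deviation 20
    cv2.imwrite("gaussian_noise.png", gaussian_noise)

    # for the "coarser" look described next: mean 0, standard deviation 256
    # cv2.randn(gaussian_noise, 0, 256)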

With mean 0 and standard deviation 256, we get a "coarser" noise distribution:

As stated, the uniform_noise (cv2.randu) function gives the same result as the previous np.random.randint() call, a uniform noise distribution:

If I add the following, "ret,thresh1 = cv2.threshold(uniform_noise,64,255,cv2.THRESH_BINARY)" and then write "cv2.imwrite("Noise.jpg",thresh1)", I get a noisy distribution with many more whites, because of the threshold value being at 64. For every pixel, the same threshold value is applied. If the pixel value is smaller than the threshold, it is set to 0; otherwise it is set to a maximum value, in this case 255, which is pure white. This way it is possible to adjust the sparseness, if you will, of the noisy distribution using the threshold function:
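Put together, that snippet (reusing uniform_noise from above) is just:

    # pixels at or below 64 become 0; everything above becomes 255
    ret, thresh1 = cv2.threshold(uniform_noise, 64, 255, cv2.THRESH_BINARY)
    cv2.imwrite("Noise.jpg", thresh1)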

The opposite would be true if we set the threshold higher, to say 250. Then it's mostly black, or at a pixel value of 0:

Friday, February 12, 2021

Finding The Mean Part II

A short while ago, I wrote a blog post called "Finding The Mean", where I gave the code of a function I wrote that generates an array of n random real numbers between 0 and 1. I wrote the function in Python, in R, and in GNU Octave. My goal was to write the function in as many programming languages as possible. I have now written a version in C.

I will be trying to write the same function in various other programming languages, probably including Ruby, JavaScript, Scala, and many others, potentially in some functional programming languages like Haskell or SML (Standard ML). Stay tuned!

In passing, the main idea behind this program is the fact that as an array or list of random real numbers between 0 and 1 grows in size, the average will tend to converge on 0.50. It's just a mathematical fact, known as the Law of Large Numbers. "According to the law, the average of the results obtained from a large number of trials should be close to the expected value and will tend to become closer to the expected value as more trials are performed."

Here is a mathematical representation of the Law of Large Numbers:
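    \bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i \;\longrightarrow\; \mu \qquad \text{as } n \to \infty

where the X_i are independent draws and \mu is the expected value (0.50 in the case of uniform random numbers between 0 and 1).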

ADDENDUM: Here is some Scala code that does the same thing, but it only seems to work in the REPL. I'm not familiar enough with writing actual Scala programs, but as I said, it works in the REPL. Just change the value of n.

I was also able to translate the code into Ruby. It works in the REPL, and it also worked as a simple Ruby program, though I think it lacks some of the usual formalism of Ruby programs. When I tried to turn it into a function (via a method definition), I got an "undefined method" error, with the .sum method for arrays not being recognized. But I got it to work in the following way, decomposed into separate lines of code. You just have to change the value of n. Remember, this is all to prove the Law of Large Numbers:

The idea behind this proof of the Law of Large Numbers is simply that given an array of random real numbers (floats) between 0 and 1, the larger the length n of the array, the closer the mean of the values of the array will become to 0.50. That is, the mean of a large number of such random numbers, or the expected value, is 0.50. And if you try these functions with 10,000 or more values in the array, you'll see that it converges on 0.50. This is my so-called "proof" of the Law of Large Numbers.
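In Python, the whole experiment fits in a few lines (a sketch using the avg_probs() name I give the function in the original post; the n values are just examples):

    import random

    def avg_probs(n):
        """Mean of n random floats drawn uniformly from [0, 1)."""
        return sum(random.random() for _ in range(n)) / n

    for n in (10, 1_000, 100_000, 1_000_000):
        print(n, avg_probs(n))  # the mean drifts toward 0.50 as n grows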

Monday, July 13, 2020

Pixelating an image

I was thinking of ways of pixelating an image, using simple Python functions. I started with an image of white noise and ran the pixelation function.


The trick I found is simple. You resize the image to a much smaller size, then resize it back to the original size. We can see this in the code. Below are the actual images, the one used as input and the one that came out as output.

"Image.NEAREST" is key here. It's nearest neighbor interpolation that is being used when you resize back up to the original size.

NOISEFIELD_200.png

As always, we start with Gaussian white noise. Then we use the basic image processing functions and get the following.

Pixelated image, result.png


I'm trying to build an app that can generate noise images and then use different functions to "modulate the noise field" as I like to call it. Pixelation is one of the basic image processing functions that I want to have in my future app.


A.G. (c) 2020. All Rights Reserved.



Thursday, April 23, 2020

Finding The Mean

If you perform a series of coin tosses of a fair coin over a large number of trials, you expect the split of heads and tails to be around 50/50. What I did here below is generate an array, a list, of random real numbers between 0 and 1, and then find the mean of the values of the array. What you see is me taking the sum of a list in Python, itself generated through a list comprehension using the random module to choose random values (float numbers). The idea is to generate numbers between 0 and 1 randomly and average them to test the law of large numbers: I want to end up with a value that is close to the expected value over a large number of trials. In my mind, and I'm not a mathematician, the values themselves are drawn uniformly, but the mean of many of them should settle into a bell-curve, Gaussian kind of behavior. I'm a big fan of all things related to noise, and this experiment makes me think of Gaussian white noise, as a distribution in whatever dimensions.




I'm not a Python expert, but this is the quickest way I could find to get the mean of the array of size n of random values that I wanted. I tried other ways before, but when I refactored the code, I got into a functional style of programming and wrote it as a mathematical function. That's why I used a lambda: it's an anonymous function, as far as I can tell. Again, I'm not an expert in theoretical computer science either. I just know that I used the timeit module and found this function to be relatively quick. I tried the statistics module in Python 3.8 and it was super slow.
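The function was along these lines (a reconstruction from the description above, not necessarily my exact code):

    import random

    # sum a list comprehension of n random floats, then divide by n
    avg = lambda n: sum([random.random() for i in range(n)]) / n

    print(avg(1_000_000))  # usually prints 0.50, 0.4999, or 0.5001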

I tried doing everything in Numpy, but it was slower than using builtin functions like sum. Instead of importing mean functions from different modules, it was better just to sum the array of size n and divide by n to get the mean, or expected value. Basically, I create an array of size n of what are essentially probabilities: real numbers, float numbers in Python, between 0 and 1. Then I run many trials, say 1 million, get the mean (average value), and it is usually 0.50 or else 0.4999.

I was really wondering about code optimization in Python. As I said, I'm not an expert in Python or in theoretical computer science, and I'm certainly not a mathematician. I just needed a way to come up with a large number of probabilities in a list (an array, a one-dimensional vector). I found I could do this easily with a list comprehension, i.e. [random.random() for i in range(n)], in Python, using the random module. I tried different ways of calculating those random probabilities, and my thesis was borne out: the mean after a million trials is 0.50, or 0.4999, or else 0.5001. It turns out that one should use builtin functions when possible, because they are usually faster. It's almost like writing the function in C: the builtin itself runs without bytecode interpretation, is what I understood from my quick overview of the subject.
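One way to check that kind of claim with the timeit module (a sketch; the list size and repeat count are arbitrary):

    import timeit

    setup = "import random; from statistics import mean"
    builtin = "sum([random.random() for i in range(100_000)]) / 100_000"
    stats = "mean([random.random() for i in range(100_000)])"

    print(timeit.timeit(builtin, setup=setup, number=10))  # builtin sum
    print(timeit.timeit(stats, setup=setup, number=10))    # statistics.mean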

I was starting to look into the internals of Python to try to see why builtin functions would be faster. This is the best summary I could find so far:

Use Built-in Data Types
This one is pretty obvious. Built-in data types are very fast, especially in comparison to our custom types like trees or linked lists. That’s mainly because the built-ins are implemented in C, which we can’t really match in speed when coding in Python. - Making Python Programs Blazingly Fast

Addendum:
"A simple rule of thumb (but one you must back up using profiling!) is that more lines of bytecode will execute more slowly than fewer equivalent lines of bytecode that use built-in functions." - p.55, High Performance Python: Practical Performant Programming for Humans by Ian Ozsvald and Micha Gorelick
* * *
I did a little more research. After looking into profiling and code optimization, I got to thinking of implementing my function in other programming languages, with the idea that maybe it would be faster in Fortran or C or whatnot. I'm mostly just familiar with Python, but I was able to write the function, in what I think is working code, in both the R language and in GNU Octave:


The file is named avg_probs.m with my function in GNU Octave code. Here it is in the R programming language.

Those were a few of the other languages that I was able to "translate" my function into, my "average of probabilities" function, as I am calling it now, or avg_probs().

Thursday, September 27, 2018

Refcards: The Beginnings of an Interface

- The latest development in the evolution of the Refcards-System is the creation of a primitive interface;
- Now I can add records to my SQLite3 database through an interface;
- The interface has a button called "Add entry" which, upon clicking, "gets" the text from the Entry and Text widgets in the interface and "INSERTs" it directly into the database (a rough sketch of this flow appears below);
- I've been using Tkinter to create the interface ("frontend") and SQLite3 in Python for the database ("backend");
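
A minimal sketch of that flow in code; the table name, columns, and widget layout here are hypothetical, not the actual Refcards schema:

    import sqlite3
    import tkinter as tk

    conn = sqlite3.connect("refcards.db")  # hypothetical database file
    conn.execute("CREATE TABLE IF NOT EXISTS refcards (title TEXT, body TEXT)")

    root = tk.Tk()
    title_entry = tk.Entry(root)
    body_text = tk.Text(root, height=5)

    def add_entry():
        # "get" the text from the widgets, then "INSERT" it into the database
        title = title_entry.get()
        body = body_text.get("1.0", tk.END)
        conn.execute("INSERT INTO refcards (title, body) VALUES (?, ?)",
                     (title, body))
        conn.commit()

    title_entry.pack()
    body_text.pack()
    tk.Button(root, text="Add entry", command=add_entry).pack()
    root.mainloop()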