SMART SKIN

CHROMATOPHORES + NEURAL ACTIVITY + PATTERNS

Hele-Shaw Ferrohydrodynamics for Rotating and Axial Magnetic Fields
Department of Electrical Engineering and Computer Science, Laboratory for Electromagnetic and Electronic Systems, MIT, 2002

50 microliters of ferrofluid in a 1.1 mm gap Hele-Shaw cell with an applied 100 gauss DC axial uniform magnetic field and an in-plane magnetic field rotating clockwise at 25 Hz

200 microliters of ferrofluid in a 1.1 mm gap Hele-Shaw cell with an applied 20 gauss in-plane magnetic field rotating clockwise at 25 Hz, followed by a 100 gauss DC axial magnetic field


IDEAS

Unlearning Machine / Bioluminescent Cloud

I have to admit that I felt a bit worried or scared before taking Nature of Code, mainly because, well, I don't have a coding background of any kind.

However, the idea of attempting to understand and explore how nature and physics work through code seemed extremely exciting to me, even if I could not follow the math.

To my surprise, instead of feeling scared or troubled by numbers, I actually feel as if every class is a new door or window to infinite possibilities! Even if I can't do the math, I do visualize everything in a strange way. It is as if I were doing a mental drawing with layers of interconnections in space, and as I keep realizing those connections, new ideas emerge: new conceptions of reality.

I also realized that I can't understand all the algorithmic components in so little time, but I can explore them more deeply and find meaning. This has, in turn, resulted in an amazing learning process: I encountered Richard Feynman, Seymour Papert, John Tyler Bonner, and so many other books and sources of information that feel like a never-ending source of ideas.

For my final project, I would like to explore one of these ideas. I know they are way beyond my capabilities at this point, but maybe I can take a first step, or sketch a silhouette, and see what happens.

I have noticed that I approach Nature of Code a lot like drawing: just as I would critique my sketches for a design project or a piece of art, I find myself critiquing my attempts in code. I wish I could play around freely and continue to have those "aha" moments with code.

So my idea has some layers. Interesting layers, I think : )

Once I realized the capabilities of particle systems, forces, and possibly autonomous agents that can learn and somehow adapt, I immediately thought about chromatophores in the skin of squids and other cephalopods. I wondered if there could be a way to replicate or simulate the biological interaction that occurs between brain, skin, and environment. Furthermore, my real question is: could we maybe understand something about how our brain works through this kind of study? Could we understand the process of learning, and maybe of unlearning?
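
As a very first step toward this idea, I imagine something like the small p5.js sketch below: a grid of "chromatophore" cells whose size follows a traveling activation wave that stands in for neural activity. Everything in it (the Chromatophore object, cellSize, the sine-wave "signal") is just my own illustrative assumption, not a biological model.

var cells = [];
var cellSize = 30;

function setup() {
  createCanvas(600, 400);
  noStroke();
  // fill the canvas with a grid of chromatophore-like cells
  for (var x = cellSize / 2; x < width; x += cellSize) {
    for (var y = cellSize / 2; y < height; y += cellSize) {
      cells.push(new Chromatophore(x, y));
    }
  }
}

function draw() {
  background(20);
  for (var i = 0; i < cells.length; i++) {
    // the "neural signal": a wave traveling across the skin over time
    var signal = 0.5 + 0.5 * sin(cells[i].x * 0.02 + frameCount * 0.05);
    cells[i].display(signal);
  }
}

function Chromatophore(x, y) {
  this.x = x;
  this.y = y;
  this.display = function(activation) {
    // activation (0..1) expands the pigment sac and shifts its color
    fill(200 + 55 * activation, 80 * activation, 60, 220);
    ellipse(this.x, this.y, cellSize * activation, cellSize * activation);
  };
}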

Camouflage / brain motor patterns and neural activity / emotions / evolution

(If technology provides the tools for a certain "evolution" of humankind, if it can give you the tools to be "superhuman", how can we actually assimilate it, grow, and contribute to technology if being "human" is all we know? How can we redesign the learning, or unlearning, process in us humans so that we reassess our own limits?)

Another idea would be to use Nature of Code in my ongoing experiment with clouds of bioluminescent algae.

Maybe I can simulate and predict what would happen to phytoplankton when suspended in a cloud's water droplets. Potentially, I could understand what kinds of microorganisms would survive my actual physical experiment.
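
A first, very rough sketch of that direction, under big assumptions: each phytoplankton cell is treated as a random walker confined to a circular "droplet". Nothing here models real cloud physics or cell survival; it only frames the question as a simulation I could grow later.

var plankton = [];
var dropletRadius = 150;

function setup() {
  createCanvas(400, 400);
  // start all cells at the center of the droplet
  for (var i = 0; i < 100; i++) {
    plankton.push(createVector(width / 2, height / 2));
  }
}

function draw() {
  background(10, 30, 60);

  // the droplet boundary
  noFill();
  stroke(255, 80);
  ellipse(width / 2, height / 2, dropletRadius * 2, dropletRadius * 2);

  noStroke();
  fill(120, 255, 200, 180); // a bioluminescent green-blue
  for (var i = 0; i < plankton.length; i++) {
    // random walk, like the "SEGURO AZAR" walker below
    plankton[i].x += random(-2, 2);
    plankton[i].y += random(-2, 2);

    // keep each cell inside the droplet
    var center = createVector(width / 2, height / 2);
    var offset = p5.Vector.sub(plankton[i], center);
    if (offset.mag() > dropletRadius) {
      offset.setMag(dropletRadius);
      plankton[i] = p5.Vector.add(center, offset);
    }
    ellipse(plankton[i].x, plankton[i].y, 4, 4);
  }
}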

  • Building architecture from the frequencies of our brains
  • The movement of microorganisms (unicellular) being affected/changed by human-made frequencies, and visualizing this with projection mapping or AR: how frequencies affect our bodies at a cellular level, and how we move
  • TO AFFECT THE CHROMATOPHORES IN THE SQUID, WE CAN USE THE FREQUENCIES FROM THE BRAIN. THEN IT WILL BE LIKE COMMUNICATING THROUGH COLOR, WITH THE POTENTIAL THAT MAYBE WE CAN INVERT THAT ONE DAY AND OUR SKIN COULD ACT AS A SCREEN
  • Creating a choreography with frequencies (which we can trace or make visible as patterns or paths of movement)

 

A CERTAIN CHANCE

https://alpha.editor.p5js.org/full/Bk-RWGtVx

// mic input and the running list of points drawn on mouse movement
var mic;
var micLevel = 0;
var points = [];
var x, y;   // current position of the random walker
var px, py; // previous position, so a connecting line can be drawn

function setup() {
  createCanvas(windowWidth, windowHeight);
  textSize(25);
  fill(0);
  text("SEGURO AZAR ", 0, 60);

  // listen to the microphone
  mic = new p5.AudioIn();
  mic.start();

  // start the walker at the center of the canvas
  x = width / 2;
  y = height / 2;
  px = x;
  py = y;
  noStroke();
  //background(0);
}

function draw() {
  // Move the walker a random step
  x += random(-4, 4);
  y += random(-4, 4);

  // Draw a line from the previous location to this location
  stroke(0, 0, 200);
  line(px, py, x, y);

  // Remember the current location for the next frame
  px = x;
  py = y;

  // Read the current mic amplitude (0..1) and redraw every point with it
  micLevel = mic.getLevel();
  console.log(micLevel);

  for (var i = 0; i < points.length; i++) {
    points[i].display(micLevel);
  }

  noStroke();
  fill(247, 170);
}

// Every time the mouse moves, drop a new point at the mouse position
function mouseMoved() {
  var newPoint = new Point();
  newPoint.xpos = mouseX;
  newPoint.ypos = mouseY;
  newPoint.display(micLevel);
  // newPoint.displayOffset(micLevel);
  points.push(newPoint);
}

function Point() {
  this.xpos = 0;
  this.ypos = 0;

  // Two ellipses whose vertical position is pushed around by the mic level
  this.display = function(micLevel) {
    //fill(255, 0, 0);
    ellipse(this.xpos, constrain(this.ypos - micLevel * this.ypos * 10, 0, this.ypos), 60, 10);
    //fill(0, 0, 255);
    ellipse(this.xpos, constrain(this.ypos - 40 + micLevel * this.ypos * 10, 0, this.ypos), 10, 10);
    //point(this.xpos + micLevel, this.ypos + micLevel);
  };

  // this.displayOffset = function(micLevel) {
  //   ellipse(this.xpos, constrain(this.ypos + micLevel * this.ypos * 5, 0, this.ypos), 1, 1);
  // };

  // The title text is stamped every time a new point is created
  // (the background is never cleared, so it accumulates)
  textSize(30);
  //fill(250);
  text("CERTAIN CHANCE ", 40, 100);
  text("CERTAIN CHANCE ", 80, 200);
  text("CERTAIN CHANCE ", 90, 300);
  text("CERTAIN CHANCE ", 100, 400);
}