FINAL PROJECT Process

SYSTEM DIAGRAM

PCOMP_sequencer.png

Song mode is used to chain together a sequence of patterns for each track. During a live performance, this can be used to chain patterns into an entire song arrangement. Seen from a bigger perspective, matrix mode also lets you record 64 songs, one per pad.

Song mode gives you an overview of the project and an 8x8 layout of the clips. Each column in song mode represents a track. Tracks are used to hold audio samples, drum kits, and instruments.

Each track is made up of clips, which are short sequences of audio. Only one clip in a track can play at a time.

The four pads under the screen serve as play, stop, and record buttons, which also provide copy, paste, and clear functionality; the fourth pad is a function button that lets you navigate back and forth through the menus. The green buttons on the left side control the song, clip, sound, and sequence views on the screen.

In song mode, each pad represents a single clip in a track. Lit pads represent clips, and dim pads are empty clip slots. A clip will continue to play until another clip in the same track is launched or the clip is stopped; to stop playback, press stop. In addition to launching clips in a track, you can launch multiple clips across tracks in the same row. Each row of clips is used to arrange a different part of the song.
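The clip-launching rule above can be sketched as a small data model. This is a hypothetical illustration: the 8x8 grid, the one-clip-per-track rule, and row (scene) launching come from the description, but the class and method names are mine.

```javascript
// Hypothetical model of the 8x8 song-mode grid: one playing clip per track.
class SongMode {
  constructor(tracks = 8) {
    this.playing = new Array(tracks).fill(null); // playing clip row per track, or null
  }
  launch(track, row) {
    // Launching a clip replaces whatever was playing in that track.
    this.playing[track] = row;
  }
  stop(track) {
    this.playing[track] = null;
  }
  launchRow(row) {
    // Launch the same row (scene) across all tracks at once.
    this.playing = this.playing.map(() => row);
  }
}
```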

In clip view, you can view and edit the parameters of the clip container itself. Clip mode is used to edit the parameters of individual clips; in this mode, the view shows all available pads vertically with their corresponding data. To enter clip edit mode, press the second green button from the right. A song consists of up to 64 slots, each holding a set of patterns to be played on the 8 tracks. In addition to using the clip launch pads, you can also use the piano keyboard to record notes chromatically.

In sound mode, you can edit the parameters of the audio clip itself. Holding a step lets you edit the notes contained within that step.


MATERIAL LIST

  • 16 rotary encoders (NOT FIXED: which one to buy)

  • OUTEMU (Gaote) Red Switch 3 Pin Keyswitch DIY Replaceable Switches for Mechanical Gaming Keyboard (20 PCS x 2) (FIXED)

  • Keycaps (approx. 19 mm x 19 mm)

  • Matrix Buttons (NOT FIXED)

  • 4.0’’ touch screen (FIXED)

  • Knob cap (no indicator) for the rotary encoder x 16


DESCRIPTION OF FINAL PROJECT

IDEA

a2468431fde2fc1f62820ab6b9c70ca8_original.png

We are going to create a hardware music sequencer that allows users to create music more intuitively by combining Ableton Live’s session view with Elektron’s step sequencer.

The idea is to create a more user-friendly, easy-to-use concept for live electronic music performance for musicians.

CONCEPT

Asset 1 copy.png

Ableton Live’s current piano roll interface makes it hard to control where to put each note on the timeline, because there is no dedicated hardware controller that lets users interact more efficiently.

Although the session view is useful for arranging patterns, it’s not enough for musicians during a live performance because it relies on the mouse.

A keyboard, on the other hand, doesn’t have a display that shows you where to place the trigger notes.

Asset 1.png

A suitable approach is to combine these two features in one hardware platform.

CODE

Pure Data and the Ofelia library.

MATERIAL LIST

Asset 1 copy 2.png
  • 7’’ touch screen: The screen that displays the 8x8 button matrix.

  • Rotary encoder

  • Raspberry Pi

  • Keyboard (pressure-sensitive buttons)

  • Piezoelectric sensor (to measure pressure)

  • LEDs (to light the buttons)

DESIGN 

The instrument is designed as a rectangular shape.

The left side of the instrument holds the 8x8 buttons.

There will be record, play, and stop buttons on the left side of the display screen. The top left side of the instrument contains knobs to control the sound.

The bottom right part of the instrument contains the keyboard.

PROJECT IDEALIZATION

IDEA 1

Another idea is to create an artwork controlled by servo motors. I’m inspired by Daniel Rozin’s kinetic mirrors, which can echo your movements. His installations and sculptures have the unique ability to change and respond to the presence and point of view of the viewer.

This idea of exploring the subjectivity of self-perception blows my mind. Of course, this kind of awesome art piece comes to life through a huge amount of work. But I could learn a lot in the process of exploring movement detection, Kinect motion tracking, and motors. The action that appears, and the sequencing between the motors, is crucially important and not easy.

The way servo motors can create an electronic work of art interests me a lot, but putting the user at the center of this project and making it interactive is even more interesting. At that point it becomes something else: the experience between the object and the person.

But I’m not sure what to use as a material, or what the action would be. The idea of creating a piece that has no meaning without a user feels, to me, like there can’t be a world without life in it, or happiness without sharing.

Daniel Rozin- Angles Mirror


IDEA 2

The Bouba-Kiki effect is a non-arbitrary mapping between speech sounds and the visual shape of objects. This effect was first observed by German-American psychologist Wolfgang Köhler in 1929.


I always wonder if some shapes have certain "sounds" to people, even if they have different native languages. For example, does everyone match certain physical characteristics, like sharpness or roundness, with certain sounds? Are there certain human sounds with meanings that can cross the language barrier?

I want to investigate this at some point in my life: the human brain’s fascinating connection between shapes and sounds. I’ve researched the Bouba-Kiki effect, first observed by Wolfgang Köhler, which shows that abstract visual properties can be linked to sound. I asked people, “What do you think about these shapes?”, and the answer I got was always the same: the image on the right side is Kiki and the one on the left is Bouba. When I asked why, the answers were interesting. Some said it relates to the letters in their mind: Kiki is composed of long, straight letters like “l”, while Bouba has fat, round letters like “o” and “u”. But what if they relate to the shapes through sharpness or color? An experimental project about sound and how people relate it to abstract objects and shapes would answer the questions in my mind.

From this point, I thought about an instrument that creates sound depending on the shapes of objects, but it should be interactive with the user too. So, maybe an instrument that creates specific sounds depending on the user themselves. But would it relate to how the user looks, how they interact with the instrument, or how they communicate with it?

MIDTERM PROJECT FINAL

Meet the fortune teller. His practice is to predict information about people’s lives. For a long time he has been traveling back and forth through space, and his divinations are dark, so be careful!

z1.jpg
zo1.jpg
zi4.jpg
Screen Shot 2019-10-24 at 11.50.48 PM.jpg

USER SIDE

In order to start, the user has to say “hi” or an equivalent word like “hello” to begin the conversation. After you ask about your future, the fortune teller robot’s LED lights turn on, and this is when you get your printed fortune. Later, to end the experience, the user has to say “bye” or an equivalent word like “goodbye”.

FABRICATION AND DESIGN

The fortune teller is made of a wooden box covered by a metal mesh, which represents the maturity of the fortune teller. Traveling through space and time made him tired, so I gave the box an old and weary appearance with the metallic cover and by leaving the screws of the corner components visible outside the box. Before covering the wood, I painted it with gold spray paint so the wood itself wouldn’t show through the metal mesh’s holes. At the beginning of this project, my intention was to use an ultrasonic sensor to turn the LEDs on, but as I’ll explain later in this post, it didn’t work well, so I decided to keep it as part of the design of the box, standing in for the eyes. To make the face look bored after years of answering humans’ selfish questions, I gave the eyes a blasé appearance with eyebrows.

The box has instructions on the front side. The instructions are simple and short, covering how to start the experience and how to end it: “Say hi, ask about the future, say bye.” The instructions are handmade, cut from black cardboard. The Arduino is fastened to the bottom of the box, and the wires are hidden inside it. Two cables come out of the box: one powers the printer, and the other is the connection between the Arduino and the computer, a standard A-B USB cable.

IMG_8103.jpeg

As I mentioned before, my intention at the beginning of the project was to use 22 5-volt LED lights, so I drilled 22 holes in the top of the box. I built a series connection between the anode and cathode legs of the 22 LEDs. I spent a lot of time building this circuit, but apparently the LEDs’ legs were touching each other inside the holes, causing a short circuit. I would have had to drill the holes much bigger, so I had to take everything apart. To save time, we changed the idea from 22 individual LEDs to an LED strip. But this time it wasn’t going to light up based on distance; instead, the LED blinks before the message prints.

To replace the LED holes I had drilled, I laser-cut a rectangular opening for the light to come out. Out of design concern, I decided to cover the opening from the back of the wood with a red plastic sleeve; because it’s translucent, the light passes through. We also changed the LED color to red in the Arduino code. But I still think the light coming out of the box isn’t clearly visible; maybe I should have used another material.

Also, the data I got from the ultrasonic sensor wasn’t accurate. I used the code below, adding a heat and temperature sensor to get more accurate readings, and it worked. But, unfortunately, we couldn’t use it.
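The temperature-compensation idea can be sketched as follows. This is a hedged illustration, not the code in the screenshot: the round-trip echo timing and the linear speed-of-sound approximation (c ≈ 331.4 + 0.606·T m/s) are standard for HC-SR04-style sensors, but the function and names here are mine.

```javascript
// Hypothetical temperature-compensated distance helper: the speed of sound
// depends on air temperature, so a fixed constant gives inaccurate readings.
function distanceCm(echoMicros, tempC) {
  const soundCmPerMicro = (331.4 + 0.606 * tempC) / 10000; // m/s -> cm/us
  return (echoMicros * soundCmPerMicro) / 2; // echo is a round trip, so halve it
}
```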

Screen Shot 2019-10-25 at 12.40.10 AM.png
IMG_7932 2.jpeg

CODE

IMG_4847.jpeg
IMG_4714.jpeg
IMG_5586.jpeg
IMG_5575.jpeg
Animated GIF-downsized_large (4).gif
Screen Shot 2019-10-25 at 1.23.10 AM.png
Screen Shot 2019-10-25 at 12.40.10 AM.png

The p5.js side of the code is built on Daniel Shiffman’s Speech Recognition code with p5.speech.

The connection between the Arduino and the computer happens via serial communication with p5.serialcontrol. Here you can see the Arduino side of the communication.

https://idmnyu.github.io/p5.js-speech/

The code includes a library that can detect speech and console.log what it perceives. What we’ve done is make three arrays. One of them includes keywords that the code can detect if the user says them. These keywords were selected from the words people most commonly use when asking about their future; for example, people generally use words like “health”, “love”, “job”, and “career”. But you could argue it’s too small an array to cover all the possibilities.

The second and third arrays include the answers, which are then printed on the Arduino side of the code. The second array is made of specific answers for when certain keywords are used, and the third array is made of abstract answers in case none of the keywords is used; it gives general answers like “Be a New Yorker, walk fast and get out” or “You know the answer”. I want to mention again that the robot’s concept is dark because Halloween is on the way, so the responses are dark for entertainment.
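A minimal sketch of that three-array lookup: the example keywords and the two fallback lines come from this post, but everything else, including the function name pickAnswer, is illustrative rather than our actual code.

```javascript
// Sketch of the keyword/answer lookup (array contents are illustrative;
// only a few entries come from the blog post).
const keywords = ["health", "love", "job", "career"];
const specificAnswers = {
  love: "You are gonna soon regret things you did for love. Think twice.",
  // ...one dark answer per keyword
};
const fallbackAnswers = [
  "Be a New Yorker, walk fast and get out.",
  "You know the answer.",
];

// Given a recognized speech transcript, pick an answer.
function pickAnswer(transcript) {
  const words = transcript.toLowerCase().split(/\s+/);
  const hit = keywords.find((k) => words.includes(k) && specificAnswers[k]);
  if (hit) return specificAnswers[hit];
  // No keyword detected: fall back to a random abstract answer.
  return fallbackAnswers[Math.floor(Math.random() * fallbackAnswers.length)];
}
```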

The screenshots above show the Arduino side of the code.

You can reach the p5.js part of the code below:

https://editor.p5js.org/bestesaylar/sketches/CwGv4y6jF

Screen Shot 2019-10-25 at 12.43.50 AM.png

CONCEPT

Question: Am I gonna find true love?

Answer: You are gonna soon regret things you did for love. Think twice.

Question: When am I gonna die?

Answer: You are not gonna die in a week. The rest is unclear.

CREDITS

We got so much help from residents at ITP with the coding part of the project. I would like to thank everyone who helped us during these stressful two weeks: Huiyi Chen, Maxwell da Silva, August Luhrs, and the people whose names I can’t remember. I would also like to thank my partner Elena Glazkova for sharing this experience with me.

PROCESS

MIDTERM PROJECT_First half

For the midterm project, we came up with the idea of a fortune-teller robot that responds to users’ questions by printing predictions. The predictions will be dark; we thought this would be more fitting for the Halloween concept. It will also have the ability to detect whether the user is approaching it.

DESIGN

It will be a wooden box, which we’ll later cover with another material to make it look like a robot. We’re going to make a hole for the ultrasonic sensor and one for the printer paper. We’ll also make holes on the top of the box for the LEDs, which will turn on with distance detection, so that the ultrasonic sensor looks like eyes, the printer looks like a mouth, and the LEDs form the hair.

COMPONENTS 

  1. Arduino Uno: We planned to use a Nano, but it turned out that thermal printers normally need 5 to 9 volts to print dark, readable letters.

  2. Adafruit Thermal Printer: We bought Product ID 2753. In spite of trying everything, we couldn’t make it work. Then, thanks to our friend August, we borrowed another version, Product ID 600, tested it successfully, and are going to use it in our circuit. So, for the first week of the project, our success was getting the printer tested after spending a whole Thursday on it. The product sheet said this particular printer doesn’t need a separate power supply; it took us a lot of time to figure that out. Test code for Arduino is available after downloading the Adafruit Thermal Printer library from the Adafruit website; the examples can then be reached from Arduino > File > Examples.

  3. Ultrasonic Sensor: It is going to be used for distance detection. When someone gets closer to the robot, the LEDs will turn on.

  4. LEDs

CODE

We are using Daniel Shiffman’s Speech Recognition code with p5.speech, a p5.js library. We made the code listen continuously and return the text of the speech.

You can reach the code we have so far below.
https://editor.p5js.org/bestesaylar/sketches/ZhG66Ds0J

*It still needs to be worked on. We need an array of keywords so that, during speech, an event listener will detect the keywords and, depending on them, choose random answers from an answer array.

So, we are not going to write the answers as text in the Arduino code. Instead, answers will be selected by sending specific keywords or numbers to the Arduino, for serial communication reasons.

IMG_4189.png

PROCESS

IMG_0353.JPG
IMG_0375.JPG
IMG_6782.png
IMG_9113.png
IMG_9978.png
IMG_0376.JPG
Screen Shot 2019-10-18 at 12.03.17 AM.png
Screen Shot 2019-10-18 at 12.02.58 AM.png
Screen Shot 2019-10-18 at 12.03.05 AM.png

SERIAL COMMUNICATION_serial input

I’ve created a sketch of multiple bubbles that change color randomly when they’re close to colliding, plus a separate circle in the middle of the sketch whose x position I control with a potentiometer. To communicate with the microcontroller serially, I used the p5.serialport library and p5.serialserver: the browser doesn’t have direct access to the serial port, but it can communicate with a server program on the computer that exchanges data with the port. I’ve included the library in the HTML code.

 
 <script language="javascript" type="text/javascript" src="p5.serialport.js"></script>

Alternatively, you can skip the installation of the p5.serialport.js file and use a copy from an online content delivery system by using the following line in your index.html instead:

<script language="javascript" type="text/javascript" src="https://cdn.jsdelivr.net/npm/p5.serialserver@0.0.28/lib/p5.serialport.js"></script>
Screen Shot 2019-10-17 at 11.27.29 PM.png

The p5.js Sketch

To start off, the programming environment needs to know what serial ports are available in the operating system, so I’ve added this code to the sketch file.

Serial Events

JavaScript, the language on which P5.js is based, relies heavily on events and callback functions. An event is generated by the operating system when something significant happens, like a serial port opening, or new data arriving in the port. In the sketch, I wrote a callback function to respond to that event. The serialport library uses events and callback functions as well. It can listen for the following serialport events:

Screen Shot 2019-10-17 at 11.31.20 PM.png
  • list – the program asks for a list of ports.

  • connected – when the sketch connects to a webSocket-to-serial server

  • open – a serial port is opened

  • close – a serial port is closed

  • data – new data arrives in a serial port

  • error – something goes wrong.

I’m already using a callback for the ‘list’ event in the code: I set a callback for the ‘list’ event, then called it with serial.list(). Generally, I should set my callbacks before I use them like this.

To use the rest of the serialport library’s events, I need to set callback functions for them as well. So I added a new global variable called portName and initialized it with the name of my serial port. Then I changed my setup() function to include callbacks for open, close, and error, like so.


The function that matters most, though, is serialEvent(), the one that responds to new data: each time a new byte arrives in the serial port, this function is called. Lastly, I made serialEvent() do some work. I added a new global variable at the top of my sketch called inData, like so:

var serial;

var portName = '/dev/cu.usbmodem1421'; //

var inData;

The sensor value onscreen changes as I turn the potentiometer.
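A minimal sketch of the serialEvent() callback and the mapping described above, assuming p5.serialport's serial.read(); the helper byteToX and the 400-pixel width are illustrative, not the exact code from my sketch.

```javascript
// Minimal serialEvent() sketch: p5.serialport calls this each time
// a new byte arrives on the port.
let inData = 0;

function serialEvent() {
  inData = Number(serial.read()); // `serial` is the p5.SerialPort instance
}

// Pure helper: map a sensor byte (0-255) onto a canvas of the given width.
function byteToX(b, width) {
  return (b / 255) * width;
}
```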

 

MELODY

I’ve tried the Frère Jacques melody.

Screen Shot 2019-10-17 at 11.03.06 PM.png
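For reference, the tone frequencies behind such a melody can be derived from MIDI note numbers with the equal-temperament formula; the Arduino sketch itself plays these frequencies in hertz with tone(). This is a sketch in JavaScript for consistency with the rest of this blog's code, not the Arduino code in the screenshot.

```javascript
// Equal-tempered frequency for a MIDI note number (A4 = MIDI 69 = 440 Hz).
function midiToFreq(n) {
  return 440 * Math.pow(2, (n - 69) / 12);
}

// Opening phrase of Frère Jacques: C4 D4 E4 C4.
const phrase = [60, 62, 64, 60].map(midiToFreq);
```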

TONE AND FREQUENCY OUTPUT USING ARDUINO

In this part of the lab, I’ve concentrated on generating simple tones on an Arduino. The Nano’s 3.3V pin is connected to the left side red column of the breadboard, and the GND pin to the left side black column. I’ve connected a potentiometer in the left center section of the breadboard and used it to control the frequency. One wire from the potentiometer is connected to positive, another to ground, and the third to the Arduino’s analog input pin A0. The positive wire of the speaker is connected to digital pin 9 of the Arduino.
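The potentiometer-to-frequency mapping can be sketched like this; the 100-1000 Hz output range is my assumption for illustration, not necessarily what the code in the screenshot uses.

```javascript
// Hypothetical mapping from a 10-bit analog reading (0-1023) to a tone
// frequency in hertz; the 100-1000 Hz range is an assumption.
function readingToFreq(reading) {
  const lo = 100, hi = 1000;
  return lo + (reading / 1023) * (hi - lo);
}
```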

CODE

Screen Shot 2019-10-03 at 10.34.23 PM.png

Servo Motor Control

In this lab, I’ve controlled a servomotor’s position using the value returned from an analog sensor. The servomotor and the analog input are attached to an Arduino Nano. 3.3 volts and ground are connected to the voltage and ground buses of the breadboard as usual. The force-sensing resistor is mounted below the Nano. A 10-kilohm resistor connects one leg of the FSR to the left side ground bus; a wire connects the row joining these two to analog in 0 on the Nano, and another wire connects the other pin to the left side voltage bus. The servomotor’s voltage and ground connections go to the voltage and ground buses on the right side of the breadboard, and its control wire is connected to pin D3 of the Nano.
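The sensor-to-servo mapping can be sketched as follows, assuming the usual Arduino Servo convention of 0-179 degrees and a 10-bit analog reading; the function name is mine.

```javascript
// Map a 10-bit FSR reading (0-1023) to a servo angle (0-179 degrees),
// mirroring Arduino's map(reading, 0, 1023, 0, 179) idiom.
function readingToAngle(reading) {
  return Math.round((reading / 1023) * 179);
}
```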

Screen Shot 2019-10-03 at 10.33.12 PM.png

LAB 2_ DIGITAL INPUT AND OUTPUT WITH AN ARDUINO

In this lab, I’ve connected several digital input circuits and a digital output circuit to a microcontroller. I’ve used LEDs, pushbuttons, and switches during development to test whether everything’s working.

The materials I’ve used are an Arduino Nano 33 IoT, jumper wires, a solderless breadboard, a speaker, LEDs, 220-ohm and 10-kilohm resistors, and a pushbutton.

I’ve mounted my Arduino Nano at the top of the solderless breadboard, straddling the center divide, with the USB connector facing up. The top pins of the Nano are in the top row of the breadboard. The Nano’s 3.3V pin is connected to the left side red column of the breadboard, and the GND pin to the left side black column. I’ve connected a pushbutton to digital input 2 on the Arduino. Then I’ve connected a 220-ohm resistor and an LED in series to digital pin 3, and another pair to digital pin 4 of the Arduino.

The Nano’s 3.3V pin (physical pin 2) is connected to the left side red column of the breadboard. The Nano’s GND pin (physical pin 14) is connected to the left side black column.

Circuit


Schematic Diagram


 

After I’d checked my board’s type (Arduino Nano 33 IoT) and port (/dev/tty.usbmodem-XXXX), I wrote a program that reads the digital input on pin 2: when the pushbutton is pressed, it turns the white LED on and the red one off; when the pushbutton is released, it turns the red LED on and the white LED off. Lastly, I compiled my sketch and uploaded it.
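The pushbutton logic above, factored as a pure function; this is a sketch of the rule, not the Arduino sketch itself.

```javascript
// Given the button state, return which LED is lit: white while pressed,
// red while released (exactly one LED on at a time).
function ledStates(buttonPressed) {
  return { white: buttonPressed, red: !buttonPressed };
}
```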

Program Module


Preview


 

In this lab, I’ve connected a variable resistor to a microcontroller and read it as an analog input, which lets me read changing conditions from the physical world and convert them to changing variables in a program. In addition to the materials above, I needed a potentiometer and a force-sensing resistor.

As I always do, I’ve connected power and ground on the breadboard to power and ground from the microcontroller. Then, I’ve connected the wiper of a potentiometer to analog in pin 1 of the module and its outer connections to voltage and ground. Next, I’ve connected a 220-ohm resistor to digital pin 9, put the anode of an LED on the other side of the resistor, and the cathode to ground. The +3.3V and ground pins of the Arduino are connected by red and blue wires to the left side rows: the outer row for ground and the inner row for positive. I’ve mounted the potentiometer that I soldered earlier on the left side of the board, and added an additional red LED and resistor, connecting the other end of that resistor to the Nano’s digital pin 8. As a second part of the lab, I replaced the LED with a speaker to try audible output.

Force sensing resistor


Program Module_2


Audible Outputs

Physical Computing_analog_audible output

Program module_3


LAB2

IMG_9754.JPG

I've made a game. The rule of the game is to put the coin into the holes in the platform. If you succeed, the LEDs on the breadboard turn on.

There are two holes in the platform. I've inserted conductive wires in them, and the coin is covered with conductive fiber, so the switch is the coin in a hole: if the conductive fiber around the coin touches the wires inside the hole, the light turns on.

Each hole represents an LED. If you manage to put the coin in the hole at the longer distance, the red LED turns on; if you put it in the hole at the shorter distance, the white light turns on. (The wire windings inside each hole are connected to the corresponding LED.)

The circuit is wired in parallel.

The materials I've used for this project are:

Breadboard

Power supply

Red LED

White LED

2 x 220-ohm resistors

Wires under board


Breadboard


Schematic Diagram


Process


Preview


 

FINAL

Animated GIF-downsized_large (3).gif