Friday, May 12, 2017

FinalProject - NMD 342 - Sk8 n Scratch





https://www.youtube.com/watch?v=uaYORJm9xE8&feature=youtu.be





For my final I made a rideable skate deck that can be flipped over and used as a DJ board, using rotary encoders to control audio on my laptop: one slider for a distortion effect, two song-skip triggers, and an on/off control. The code is written in Arduino and Max/MSP.




Thursday, May 11, 2017

Video Response: Schematic Diagrams.

Schematics let us represent electronics in a more abstract way. The basic symbols of a schematic are: power (+), ground (−), resistor, variable resistor, potentiometer, and diode (allows electricity to flow in one direction but blocks it going the other way).

Tools Videos Response


Some tools for electronics work are wire strippers, diagonal cutters, and needle-nose pliers.
Now, how to solder properly. The first thing you do is clean the tip with a damp sponge or brass wool, then tin it by applying a little bit of solder. You then heat the surface of the joint you want to solder and apply the solder to it. Solder takes about three seconds to solidify. When done, you want to leave a small blob of solder on the tip to prevent oxidation.

Self-evaluation and Group Assessment

Self-evaluation and Group Assessment
NMD342  05/11
Alex Turanski


I thought the course was a good opportunity to learn valuable new skills and to be in an environment of engaged people who are willing to argue an opinion on how advanced technology is and how it changes our lives. We did plenty of research on everything that relates to physical computing and interactive design. From memory, I remember talking in class about ubiquitous computing and the future of how technology will be everywhere without us even knowing it. We read and discussed some physical computing examples from the articles everyone in class was asked to share. We never got around to my article, though. I was going to talk about how Amazon Go is turning everything in the grocery store into tangible sensors that we can pick up and leave the store with, making shopping more convenient. Some of the other students' articles talked about recording data from the brain by observing the electrical energy of synapses firing in our heads when we make connections. This could perhaps help future ubiquitous computing development make technology change an environment based on what we are thinking. Someone had an article about an interactive table that uses a scanner to collect data on hand movements and uses it to control pegs on a table that can move up and down and replicate what the hand is doing. This kind of technology will allow business meetings to be more interactive for long-distance communication. Someone else talked about Google and their workflow in designing a user interface for their site and applications.
   The projects were good experience for understanding sensors, practicing teamwork, and thinking ahead to get something done. We also had to write a blog on the articles we researched and the readings we went through. The first project I remember was the scratch-made sensor. I chose to make a flex sensor using two copper foils that I soldered to copper wire to plug into the breadboard. When I first got into that project in the first few weeks of class, I thought I was way in over my head because I had forgotten all the Arduino code I learned a few semesters back, and honestly didn't think I would even stay to make it through the semester. However, I was very curious to see what I could learn from the research, and I was both entertained and educated in the class discussions. There was definitely a lot said. And not said. But it was all in all worth my time attending. The Rube Goldberg was an interesting teamwork project. For mine, I had to build a trigger sensor with a laser and photoresistor that activates a DC motor used as an elevator to pull a golf ball up to the starting point. I had all my code working for a smaller motor, but it couldn't pull the weight of the elevator and ball, so I had to switch to a DC motor, which changed the code in a way where I couldn't figure out how to make the elevator go up and down once. I asked a few people on my team for help and we couldn't figure it out, so my part wasn't done on presentation day. I learned a lot in that project. Working with other people to create a device with separate parts that work together is a great way to get experience building prototypes for cool new things. I enjoyed working on the project with Seth and Chris. They were very helpful in getting the project up and going. Seth built a sensor with a magnet that gets hit when the ball strikes a paddle on a bearing, which triggers the pathway that was supposed to go to my elevator.
Chris made a trap door that is triggered when water fills up a tube and activates a water sensor, which opens the trap door so the ball can roll down some PVC pipe that we cut in half. We all worked about the same amount on the construction of the project. If only the first motor could have pulled the weight of the golf ball, my part would have worked with the code. The other Rube Goldberg projects were actually really creative! I really liked the basketball hoop idea and the interaction in the spinning elevator project.
   The Sk8 n Scratch prototype project was fun to make. I literally came up with the name while painting the project ten minutes before class. It came out really nice with the black paint and electrical tape wrapped around the wires. If I hadn't used two potentiometers, the board would be totally rideable with rotary encoders. The potentiometers would break because they can only spin about 270 degrees and then stop, like a volume knob. With that in mind, I've learned to use four rotary encoders when I start making my second prototype. I will also turn the black box into a touchscreen display where the corners load audio control patches. Software programmed with Max/MSP and a Raspberry Pi will be written specifically for the Sk8 n Scratch bearing sensors. These sensors must be used with a rotary encoder that connects to a Bluetooth module installed on the trucks. When you put the wheels on the trucks, the rotary encoder in the wheel connects with the Bluetooth module, which sends integer data to the center touchscreen display. You will be able to upload patches and your own playlist of music to play on the scratch pad from your computer or phone. I never would have had this idea if I hadn't taken this class. I learned a lot about thinking through how to create technology that fits in with what we already do in our everyday lives and routines. Almost every day I use my pennyboard to cruise around campus or town, and on pennyboard adventures out in the woods, I thought the Sk8 n Scratch would be a cool way to bring something new and convenient to something I use all the time. I say convenient because the Sk8 n Scratch pennyboard will have speakers on it, or the option to connect over Bluetooth to other portable speakers. This class definitely motivated me to get the first prototype done.
  There was definitely some competitive talk in the class discussions and some pretty interesting energy in opinionated conversations. When you try to figure out how the psychology of a target market correlates with how technology can be implemented in people's lives, there are a lot of different ideas that can swing back and forth in the discussion. People have different experiences, and it is sometimes hard to get a point across when others can't relate to why you would use a certain method of technology to make software interactive in a fun way that triggers people's curiosity. People have different curiosities and experiences, which makes these discussions difficult sometimes. However, there were a lot of questions asked and a lot of attempted answers. Sometimes we were definitely able to spark our creative thinking, which is why I thought class was worth attending. Other times it just seemed like a looped argument. But that's usually how it is. Yin and yang applies to everything, including conversation.
  The best skills I learned in this class are how to think about future technology, how to apply well-designed interaction design to physical computing, and knowledge of ubiquitous technology and how we can use it to make our lives more convenient and efficient. This class also forced me to learn Arduino even more, which I am thankful for, because now I actually have a pretty good understanding of how the code and sensors work, which will expand my creative thinking toward innovative and meaningful ideas for products and services. It will also allow me to build more effective prototypes. I wouldn't really have done anything differently in this class. I made a good, honest attempt at the Rube Goldberg project and asked plenty of questions when I needed help. I read the articles, took some good notes on what stood out to me, and my final project came out a lot better than expected. So for me at least, the class and group projects were a success. I'm glad I stuck with it. It was good practice for engaged conversation and using teamwork to get projects completed.

Sunday, April 2, 2017

Designing Tangible Interaction - Response

Designing Tangible Interaction


It is good to follow principled design, which means making decisions based on some kind of collective wisdom about design rather than personal preference. Tangible user interfaces can be designed using complex sensor-based data collection, conductive fabrics, mechanical devices, and physical computing. Many installations of tangible interaction use an interactive space with sensors that track users' behaviors, and can integrate tangible objects in the space as well. Movements of the human body can provide direct input into interactive technologies. Our hand motion, or even where our eyes look, can be recorded as motion data that can be used to make physical objects more interactive. Application areas for this technology include learning and education, domestic appliances, games, interactive music installations or instruments, museum installations, tools to support planning and decision making, and health and fitness gyms. The four major design principles this article mentions are tangibility and materiality, physical embodiment of data, bodily interaction, and embeddedness in real spaces and contexts. The article also gave some really good questions to ask yourself when designing for tangible interaction. There were roughly 30 questions, but out of those, some that stood out to me as important were: Do people and objects meet and invite interaction? How can the human body relate with the space? Can you communicate through your body movement? Are actions publicly available? Can you hand over control anytime and fluidly share an activity? These are all really important questions to ask yourself in order to make a successful tangible interaction design.

Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms - Response

Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms.


Human-computer interaction through "Tangible Bits" enables users to be aware of background bits at the periphery of human perception using ambient display media such as light, sound, airflow, and water movement in an augmented space. Tangible bits are used to bridge the gap between cyberspace and the physical environment, creating a background for human activities. We humans are now almost constantly wired to some form of tangible bits by always having cyberspace in our physical environment. This article raised an interesting question: how can we move past the graphical user interfaces (GUIs) shown on computers and other rectangular displays and make it all truly ubiquitous and invisible? One way to make this possible is wireless networking combined with tangible user interfaces (TUIs). A form of TUI would be installing interactive surfaces in living, teaching, and meeting environments. This could mean transforming each surface in an architectural space (walls, ceilings, doors, windows) into an active interface between the physical and virtual worlds. To connect cyberspace even more with the periphery of human perception, we can use ambient media, the use of sound, light, airflow, and water movement as background interfaces, to make the technology feel and seem more natural to a user. One use of water as ambient media in tangible interaction could be running your finger up and down the stream from a faucet to change the temperature, so that you can feel and control the heat of the water at the same time just by touching it. The article ends by explaining how these studies can allow users to grasp and manipulate foreground bits by coupling bits with physical objects, and enable users to be aware of background bits at the periphery using ambient media in an augmented space. This is all really cool for understanding how we can merge technology into our everyday lives without even knowing it.
The delivery of computation should be transparent in the future. The most common approach to augmented reality is sending digital visual information to a head-mounted display device or video projection. ClearBoard is a way to make architectural spaces more virtual and interactive. A graspable user interface allows direct control of virtual objects through physical handles called "bricks." Bricks can be attached to virtual objects, making virtual reality more physical and graspable. LiveWire is a wire connected to a network with bits flowing through it, and it is tangibly interactive through motion, sound, and even touch. ambientROOM is a graphically intensive interaction that uses light, shadow, sound, airflow, and water flow as means of communicating information to human perception. A cool question this made me think of is: how can you make a user feel rain through augmented reality without getting wet? Using all these functions in an ambientROOM takes advantage of our brain's natural ability to work as a parallel processor and attention manager. This article mentions that the most compelling interface for spanning virtual and physical space is optical stimulation, using bits to manipulate light and shadows. However, GUIs still fall short of embracing the richness of human senses and the skills people have developed through a lifetime of interaction with the physical world.

Tangible User Interface for Children - An Overview - Response.

Tangible User Interface for Children - An Overview


This article starts off strong by stating that technology should support children's curiosities, their love of repetition, and their need for control. Advances in ubiquitous computing, or "the method of enhancing computer use by making many computers available throughout the physical environment, but making them effectively invisible to the user," will bring a more heuristic learning environment to children. Ubiquitous computing is an integration of human factors, computer science, engineering, and social sciences. Combined, these offer lots of stimulus to enhance curiosity in children. Computers and networks are already encouraged in schools by governments. A very popular form of this is called edutainment: a game-like educational environment that is fun for children. If kids can have fun, then they can learn a lot more easily. A way to make edutainment possible is by putting a tangible user interface (TUI) into a school's learning material. TUIs require little time to learn how to use, which benefits children's learning. TUIs offer an alternative way of interaction that is fun for children. TUIs support trial-and-error activity, support teamwork development, and supply a fun social experience. This is all important because physical action benefits learning. Physical materials give rise to mental images, which can then guide and constrain future problem solving. This is a great article for further understanding how technology can enhance our learning capability by stimulating curiosity with TUI environments, making learning more heuristic all around.

Interaction Design: What is it, and How can we use it? - Response.

Interaction Design: What is it, and How can we use it? 

Interaction design is about shaping digital things for people to use. It is the practice of designing interactive digital products, environments, systems, and services. It connects the digital world to the human one. The concepts and principles are relatively the same in all good interaction design. This article gave some great examples of these, such as:
Motion, Space, Time, Sounds, Aesthetics, Color, Typography, Contrast, and Readability. These are all pretty intuitive to understand through New Media. There is a psychological way in which the brain responds positively to color contrast, sounds, and all those other examples, which can be put into an interactive product, environment, system, or service. Going over these concepts will definitely further one's understanding of interaction design.

An Encompassing View on Tangible Interaction: A Framework: RESPONSE

An Encompassing View on Tangible Interaction:  A Framework:

This article talked about the design process in both digital and physical systems. It described a data-centered view, an expressive-movement-centered view, and a space-centered view of design. The data-centered view defines tangible user interfaces as utilizing physical representation and manipulation of digital data, offering interactive coupling of physical artifacts with computationally mediated digital information. The expressive-movement-centered view emphasizes bodily interaction with objects, exploiting the "sensory richness and action potential of physical objects," so that "meaning is created in the interaction." The space-centered view focuses on combining real space and real objects with digital displays or sound installations. Tangible interaction, as we understand it, is giving physical form to digital information. To start designing something with tangible interaction, you must start with the frameworks. A framework systematically maps out an abstract design space, showing the human interaction experience within it. The framework should focus on the social interaction experience; social interaction and collaboration are the most important features of tangible interaction. There are four types of tangible interaction.
These are: Tangible Manipulation, Spatial Interaction, Embodied Facilitation, and Expressive Representation. These are all good areas to study when attempting to expand the use of technological externalization. Externalization is when users can think and talk with or through objects, using them as props to act with. Are physical and digital representations seemingly naturally coupled? Willow Glass is a great new invention where flexible touchscreen glass can be attached to any surface, expanding one's technological reach with tangible manipulation working together with spatial interaction. Refrigerators and tabletops will be wirelessly connected to your phone, making technology more ubiquitous in our everyday environment.


Thursday, March 2, 2017

Electricity Videos Response

These videos talked about the basics of how electronics work. They started with Ohm's Law, which relates voltage (V), current (I), and resistance (R) in an electrical circuit. The equations are V = I × R, I = V / R, and R = V / I. Resistance is important to understand because it is what controls the flow of an electrical current. The resistance of air is really high, while copper has very little. Many other materials have different amounts of resistance. Voltage is how much electrical potential is driving the circuit. If too much voltage runs to a sensor or an LED, it can break it, which is why it is important to understand resistance, so you can control the current and voltage. Fixed resistors are necessary for proper electronic function.


Multi-Sensor Context-Awareness in Mobile Devices and Smart Artefacts - Reflection.


This article talked about augmenting mobile devices with awareness of their environment and situation as context. Context is what surrounds, and in mobile and ubiquitous computing the term is primarily used in reference to the physical world that surrounds the use of a mobile device. Examples of situational context are being in a meeting, driving in a car, and user activities such as sleeping, watching TV, cycling, and so on. Context in mobile devices is receiving considerable attention in various fields of research, including mobile computing, wearable computing, augmented reality, ubiquitous computing, and human-computer interaction. Most research into context-aware mobile devices has considered the use of single but powerful sensors, specifically position sensors and cameras. Position sensors provide access to location as rather specific but particularly useful context, though its usefulness largely depends on pre-captured knowledge about locations. Cameras similarly provide access to potentially rich information that can be derived by means of feature extraction and video analysis. Electronics with awareness of their environment have the potential to make a customer's life a lot easier because they can react to our everyday routines and physical actions.

Physical Telepresence: Shape Capture and Display for Embodied, Computer-mediated Remote Collaboration - Reaction

This article talked about how we can interact with each other in remote locations through physical telepresence. This will hopefully solve the problem of not being able to give someone a handshake in another country, and many other things like that. The goal of physical telepresence is to extend the physical embodiment of remote participants and combine it with the physical embodiment of remote TUIs. The article gave a few examples by explaining some telemanipulation scenarios: video-mediated communication (used for collaborative websites and family communication), mixed reality (large spatial environments that can create collaborative workspaces and give users an experience of immersion in an environment), and telerobotics (representing remote people with telepresence). Telemanipulation can be used to handle hazardous materials or other objects from a distance. It can also be used by surgeons for more precise control and to remove tremors. This is why it is important to study and design physical telepresence and manipulation: it can expand human capabilities. The authors primarily emphasize the potential of interactions with shared digital models of arbitrary shapes, linked physical objects, and manipulation of remote objects through direct gesture. Using shape capture and display, users can reach through the network and pick up a remote physical object. This is called direct gesture control, and it allows a user to interact directly with a remote tangible object through transmitted physical embodiment. "Tangible tokens" is the term used for stand-ins that represent a remote tangible object. As a user moves the token, the model of the remote object is updated, and the remote object is moved to reflect the changed state. Tangible tokens can be updated and transformed with actions like scaling, translation, rotation, shearing, stretching, and other distortions. The authors also talked about some problems with direct gesture control and what they did to fix them.
Because the physical shape rendering contains some noise and the motors require a certain time to reach their target position, their detection algorithm thresholds the deformation image and compensates for the time delay. That is something to keep in mind when designing for physical telepresence.

Designing Tangible Interaction - Reaction

When designing tangible interaction, you must follow the rules of principled design, which means making decisions based on some kind of collective wisdom about design rather than personal preference. Next, the article talks about materials that can be used in the design of tangible interaction. Some things that can be used are complex sensor-based data collection, conductive fabrics, mechanical devices, and physical computing. It also talks about interactive spaces, which are cool because the interaction in certain spaces can track and record data on the movement and behaviors of the users. This works at two scales: we can interact with small objects that we can grab and move around within arm's reach, and with large objects in a large space that require us to move our whole body around. Some applications that interactive spaces can bring us include learning and education, domestic appliances, games, interactive music installations or instruments, museum installations, tools to support planning and decision making and, increasingly, health and fitness, and more. There are four major design principles to make all this work: tangibility and materiality, physical embodiment of data, bodily interaction, and embeddedness in real spaces and contexts. This article also listed some really good questions to ask yourself or your group members for a tangible interactive design. Some of them are: How can the human body relate with the space? Can you create a meaningful place with atmosphere? Does shifting stuff around have meaning? Can everybody see and follow what's happening? Can you hand over control anytime and fluidly share an activity? Does the representation build on users' experience and connect with their skills? What is the entry threshold for interaction? These questions were, I think, the most important thing to take out of this article.

Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms - Reaction


Tangible bits have to do with bringing to human perception ambient display media such as light, sound, airflow, and water movement in augmented space. The goal of tangible bits is to bridge the gap between cyberspace and the physical environment, as well as between the foreground and background of human activities, in a way that makes computing truly ubiquitous yet invisible. Wireless connection will make this more possible as we start connecting all our everyday electronic devices with Bluetooth or the internet. Another way to bring this to the world is through interactive surfaces, where we transform architectural spaces like walls, ceilings, doors, and windows into active interfaces. A way to make cyberspace really connect with human perception is to integrate ambient media into spatial areas. Ambient media has to do with sound, light, airflow, and water movement, which stimulate our senses and make an augmented reality much more real. This is all about exploring ways to improve the quality and broaden the bandwidth of interaction between people and digital information: allowing users to grasp and manipulate foreground bits by coupling bits with physical objects, and enabling users to be aware of background bits by using ambient media in an augmented space. This article sparked a question for me: how can we make a user feel augmented rain without getting them wet? When it comes to having direct control of virtual objects through physical handles, we call the handles "bricks." Bricks can be attached to virtual objects to serve as dedicated transducers, each occupying its own space, and can be used through motion, sound, and touch. We continue to study tangible bits and augmented reality because graphical user interfaces still fall short of embracing the richness of human senses and the skills people have developed through a lifetime of interaction with the physical world.

Tangible User Interface for Children - An Overview: - Reaction

This is a very cool article because education and a good learning environment are the most important things for young children, and bringing TUIs to education makes it more of a social and interactive experience, as well as supporting children's curiosities, their love of repetition, and their need for control. This article talked about ubiquitous computing, which is the method of enhancing computer use by making many computers available throughout the physical environment, but making them effectively invisible to the user. This can be applied to computer science, engineering, and social science, and is now being encouraged more widely in schools around the world. The word "edutainment" stood out to me, which is a way to bring more technology to schools by making education a game-like experience. If the technology is interactive, it will make learning more fun, which will get the child's mind more engaged with the subject. Education using TUIs has benefits such as: requiring little time to learn how to use the interface, offering an alternative way of interaction that is fun for children, supporting trial-and-error activity, and being interactive in a way that can develop teamwork and support a fun social experience. I think this was a very important article to read because learning how to make technology help children with educational lessons instead of distracting them is a process that will definitely benefit this world.

Interaction Design: What is it, and How can we use it? - Reaction


Interaction design is the practice of designing interactive digital products, environments, systems, and services. It connects the digital world with the human one. Some concepts to think about when starting interaction design are defining how users can interact with the interface, considering feedback and response time, and simplifying learnability. Those are what I found to be the most important concepts in this article. Next, it talked about some principles, which are relatively the same for the process of creating any good design. These principles are Motion (there are contemporary motions for touchscreen devices that most people already know, like swiping left to go through photos, which can make for an easier user experience), Space (is the space only in a screen, or does it also take place in the physical world?), Time (the amount of time a user spends with each interaction is very important for keeping them interested enough to move forward), Sounds (while some users love sound, others are quickly annoyed; when using sound in interactions, always account for users who will disable the function, since the design needs to work just as effectively without it), Aesthetics (does the look have any emotional impact?), Color (consider bright and engaging hues, especially for mobile and watch apps), Typography, Contrast, and Readability.

An Encompassing View on Tangible Interaction: A Framework: - Reaction

Tangible user interfaces and interaction require designing systems around embodied interaction, tangible manipulation, physical representation of data, and embeddedness in real space. Tangible user interfaces utilize physical representation and manipulation of digital data, which makes it possible to create interactive physical artifacts. There is endless potential in what TUIs can do with our surrounding physical world. There are different views of TUIs. One is the expressive-movement-centered view, which has the goal of putting meaning into the interaction with physical objects. Another is the space-centered view, which relies on combining real space and real objects with digital displays or sound installations. TUIs are supposed to exploit the richness of bodily movement, and with a space-centered view we can make our whole body become a sensor for a physical environment connected to software data. To start designing TUIs, you must start with the frameworks. Frameworks map out an abstract design and show the human interaction experience. There are four types of tangible interaction the framework can focus on: Tangible Manipulation, Spatial Interaction, Embodied Facilitation, and Expressive Representation. In interaction, we read and interpret representations, then act on and modify them. Some questions useful for interpreting would be: Are the representations meaningful, and do they have long-lasting importance? Can users think and talk with or through objects, using them as props to act with? Are physical and digital representations seemingly naturally coupled? These are all questions that are useful for designing an effective framework.