About my blog

The aim of the ITD course (ID4220) at the Delft University of Technology is to provide Design For Interaction Master students with in-depth theoretical and practical interaction design knowledge to help develop future products based on user-product social interaction. ITD proceeds through a sequence of iterations focusing on various aspects of the brief and the design, and culminates in an experiential prototype.


This blog is managed by Walter A. Aprile: please write if you have questions.

Archive

Disclaimer

The opinions expressed by employees and students of TU Delft, and the comments posted here, do not necessarily reflect the opinion(s) of TU Delft. TU Delft is therefore not responsible for the content visible on the TU Delft weblogs. TU Delft does, however, consider it important, and valuable, that employees and students can express their opinions in this environment facilitated by TU Delft.

Posted in March 2011

Throwing the ball

While making the concept more engaging and more expressive, we became captivated by throwing.

Do you have something to share with your colleagues? Load it onto the ball and throw the ball! The other person has to catch it and place the ball on the RFID reader, and the item to share opens directly on the computer screen.

It is about: sharing (information), instantness, connectedness, playfulness, ease, focus… What more can we do with this concept?
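As a rough sketch of the software side of this concept (all tag IDs and file paths below are hypothetical, not from our prototype), the reader's logic can start as a simple lookup from the ball's RFID tag to the item that was loaded onto it:

```python
# Minimal sketch: map the RFID tag ID read from the ball to the shared item.
# Tag IDs and file paths are made-up examples for illustration.

SHARED_ITEMS = {
    "04A2F1": "/shared/holiday_photos.pdf",
    "04B7C3": "/shared/meeting_notes.txt",
}

def item_for_tag(tag_id):
    """Return the path of the item loaded on the ball, or None if unknown."""
    return SHARED_ITEMS.get(tag_id.upper())
```

On the catcher's side, placing the ball on the reader would call `item_for_tag` and open the returned file on screen.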

Week 5-7 – Decision Making I

It’s Alive… Really, it is!

How it all began: 
 
They live in symbiosis with human beings, but how do they communicate? Does the human even realize he's dealing with a living creature that needs attention and sustenance?

Hard, serious work

 
We got the heart to beat!
CPR heroes with Arduino…
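For the beating heart, the timing logic can be sketched as a "lub-dub" brightness curve that the Arduino would feed to an LED. This is only an illustration of the idea (the bump positions and widths are made-up numbers, not measured from our prototype):

```python
import math

def heartbeat_brightness(t, bpm=60):
    """LED brightness in 0..1 for a 'lub-dub' heartbeat pulse at a given rate.
    A sketch of the timing logic; on the Arduino this value would be scaled
    to 0..255 and sent to the LED pin. Bump positions/widths are illustrative."""
    period = 60.0 / bpm
    phase = (t % period) / period                         # 0..1 within one beat
    lub = math.exp(-((phase - 0.10) / 0.05) ** 2)         # first, stronger bump
    dub = 0.6 * math.exp(-((phase - 0.35) / 0.05) ** 2)   # second, weaker bump
    return min(1.0, lub + dub)
```

Sampling this function in a loop and writing the result to a PWM pin gives the pulsing effect.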

Squeezing or Shaking?

Feeding is usually done by putting something in someone, not by shaking or squeezing. What can we do to convey this to the user?

Objects associated with shaking:

Objects associated with squeezing:

We need to find a way to combine this with the interaction metaphors posted earlier so we can keep our alien baby alive!
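One way to tell the two gestures apart in software is to look at different sensors: shaking shows up as large swings in accelerometer readings, squeezing as sustained pressure on a force sensor. A very rough sketch (the sensor names and thresholds are hypothetical and would need tuning on the real prototype):

```python
def classify_gesture(accel_samples, force):
    """Rough gesture classifier: 'squeeze' if the force sensor reads high,
    'shake' if the accelerometer swings widely, 'idle' otherwise.
    Thresholds are made-up values, to be tuned on the actual hardware."""
    swing = max(accel_samples) - min(accel_samples)  # peak-to-peak acceleration
    if force > 0.5:        # normalised force sensor reading, 0..1
        return "squeeze"
    if swing > 2.0:        # peak-to-peak acceleration in g
        return "shake"
    return "idle"
```

Checking force before motion means a squeeze still wins even if the ball wobbles while being gripped.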

 

Tampering the toolkit

 

Hello everybody!

During the second week of nutcracking, we focused on developing the concept idea and discussing technologies for our next prototype.

Our concept idea is a toolkit that helps students learn basic electronics: they can start with simple examples, move on to more complex tasks, see alternative solutions, and apply them to their own projects.

We are now improving the sketches and searching for "learning paths" through Arduino systems to adapt into our interactive toolkit design.

 We will keep you informed about this progress during the ITD day this week.

 

More struggling

Continuing from last week, we keep struggling with our concept. On one hand, the ‘client’ would like to see sensor data and tagging aspects combined in a mobile device. On the other hand, the ‘teachers’ would like to see daily-life aspects implemented in a highly interactive product. Our previous concepts were lacking in one or more of these respects. By discussing and brainstorming with our client, we gained a good insight into this problem and how to solve it. Therefore we now present concept 1.3.

In this concept the elderly person has a watch with which he can communicate. The caregivers have a mobile phone with the same interface in the form of an application. The communication between the elderly person and the caregivers is done via simple symbols, e.g. a square or a triangle. We do not attach a meaning to these symbols; that is done by the elderly person and the caregivers themselves. In addition, the brightness of the background colour (bright being pure blue in this case, inactive being white) communicates the ‘activity’ of the elderly person, i.e. the amount of movement. The last part of the communication is the background colour itself. Using RFID, we can determine the elderly person's location, e.g. at home or outdoors, and represent it with a colour (blue versus green, for example). This could also add an extra layer of communication: “When the caregiver sees that the elderly person is still moving around at 23.00, he knows that he can still call without waking him.”
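The colour encoding described above can be sketched as a simple blend: the location picks a base colour, and the activity level fades the background from white (inactive) towards that pure colour (fully active). The location names and RGB values here are illustrative choices, not final design decisions:

```python
# Illustrative sketch of the watch's background colour encoding.
LOCATION_COLOURS = {"home": (0, 0, 255), "outdoors": (0, 255, 0)}  # blue / green

def background_colour(location, activity):
    """activity in 0..1: 0 -> white (inactive), 1 -> the pure location colour.
    Linearly interpolates each RGB channel between white and the base colour."""
    r, g, b = LOCATION_COLOURS[location]
    mix = lambda c: round(255 + (c - 255) * activity)
    return (mix(r), mix(g), mix(b))
```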

 

Besides concept generation, the programmers have already started working on a model. Here the activity is programmed as a row of lights indicating the activity level on a bar, like an audio level meter. Next to that, we started playing with sensors such as encoders and RFID. Only with a steady concept can the maker and programmers continue their work, so we still have to discuss this concept thoroughly next Friday to make sure every aspect has been thought through.
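The level-meter behaviour is essentially one mapping: an activity level between 0 and 1 becomes a number of lit LEDs on the bar. A minimal sketch, assuming an eight-LED bar (the LED count is an assumption, not our actual hardware):

```python
def leds_lit(activity, num_leds=8):
    """Map an activity level in 0..1 to the number of LEDs to light on the bar,
    like an audio level meter. num_leds is a hypothetical hardware parameter."""
    activity = max(0.0, min(1.0, activity))  # clamp out-of-range sensor values
    return round(activity * num_leds)
```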

Doing

 
Good afternoon, 

 

This week the theme is: Doing!

 It is time to make tough decisions so we can start DOING.

After the lecture Sound feedback in interaction design by Reinier Jansen, we got inspired to integrate the feedback of our system into the yoga movement. In this case the program should be linked to specific points along the curtain rail, because we want different sounds and light intensities at different moments. Since we want the transitions to be fluent, we will work with dimmers to control the level of feedback.
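The dimming logic can be sketched as interpolation between a few anchor points along the rail: the feedback intensity is defined at specific rail positions and blended linearly in between, so light and sound change fluently as the curtain moves. The anchor values below are made-up examples, not tuned settings:

```python
# Sketch: feedback intensity defined at a few rail positions (0..1),
# linearly interpolated in between for fluent dimming.
ANCHORS = [(0.0, 0.0), (0.5, 0.4), (1.0, 1.0)]  # (rail position, intensity)

def intensity_at(pos):
    """Return the dimmer intensity (0..1) for a curtain position along the rail."""
    for (x0, y0), (x1, y1) in zip(ANCHORS, ANCHORS[1:]):
        if x0 <= pos <= x1:
            return y0 + (y1 - y0) * (pos - x0) / (x1 - x0)
    return ANCHORS[-1][1] if pos > 1.0 else ANCHORS[0][1]
```

The same curve could drive both the light dimmer and the sound volume, keeping the two feedback channels in sync.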

The lecture also made us think about sounds that match the movement. In an earlier stage we had already thought of birds (a common and pleasant morning sound), but now we are also thinking about integrating the breathing in and out of the yoga movement.

 

Another decision made this week is the context of our window:

The window will not be for personal use, but will be a special feature for

HOTELS

This also brings possibilities to change the sound feedback depending on the location.

For example: a seaside hotel will wake you up with sounds of waves and seagulls, while a hotel in forest surroundings will wake you up with frolicking birds.  

 

Another feature, not mentioned on the blog before is the

ACTIVATION OF THE SYSTEM:

In an earlier stage we wanted a sensor to trace the movement of the user and then ‘call him’ with a pulsing light and/or sound.

Now we decided to build in an alarm.

At a specific time, set by the user, the system will start ‘calling’ the user. The curtain should tempt the user to get out of bed and open it.

 

 In the workshop we presented our ideas.

  

Now that the concept idea is quite steady, everybody can get to work.

The role division still works well, and where help is needed we take some time for discussion.

Team Happy3 is still Happy.   

 

 

The Plan

And the first testing…!

Still a lot of work to do, but it seems to be rolling! Stay tuned! 

Happy Regards, Team Happy3 

 

Week 5 – Dilemmas

Tagging 4 – Cracking the big fat nut

Howdy! It’s about time for an update from our side!

After the standalone presentation we got some pretty positive feedback and heard that we should look a bit more into our interactions. That is exactly what this nutcracking is going to be all about! We spent the last week going through literature about emotions and looking for (sort of) easy ways to measure your heartbeat. We've decided to take the following four emotions into consideration: anger, fear, sadness and excitement. We've also brainstormed about the types of interactions you would have with your product according to your emotion, and about how the product could react to that. We decided on the following interactions: squeeze, hit, hug, shake, throw, caress and kiss. Our nutcracking is going to be all about getting all these types of interactions into one product. So how can the sensor tell the difference between a hug and a squeeze? We're going to start simple and keep adding sensors/interactions one at a time. Hopefully we will have them all by the next presentation!

For the heartbeat detection we have decided (after getting some help from Daniel) that the best approach for this course is to simply buy a heartbeat detector, since this course is not about ‘how to build a heartbeat sensor’ but about the interactions that come with it. We have already found some good places on the internet, and components for the Arduino, to do this. We will leave it out of this iteration, but plan to incorporate it in the next.

Lastly, we have some pictures of brainstorms, scenarios, etc. from the last weeks!

 

 

© 2011 TU Delft