Programming error help

Can anyone help me with the error I keep getting below? It only happens when I have Processing use my USB camera.

2008-11-06 15:45:56.095 java[10117:18503] Found image sensor of class:MyHDCS1000Sensor, camera of class:MyQCExpressADriver


2008-11-06 15:47:47.965 java[10117:18503] *** -[NSLock lock]: deadlock (<NSLock: 0x1c116f10> '(null)')

2008-11-06 15:47:47.966 java[10117:18503] *** Break on _NSLockError() to debug.

2008-11-06 15:47:48.069 java[10117:1a203] *** -[NSLock unlock]: lock (<NSLock: 0x1c116f10> '(null)') unlocked from thread which did not lock it

2008-11-06 15:47:48.070 java[10117:1a203] *** Break on _NSLockError() to debug.
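
For reference, the sketch is just the standard Processing video-capture setup, along the lines of the minimal example below (the frame size and rate are placeholders). The MyQCExpressADriver class in the log appears to belong to the macam webcam driver, so the lock seems to be taken inside the native driver rather than in the sketch code.

import processing.video.*;

Capture cam;

void setup() {
  size(320, 240);
  // Open the default USB camera; the deadlock shows up after this call,
  // somewhere in the native layer, not in the sketch itself.
  cam = new Capture(this, width, height, 30);
}

void captureEvent(Capture c) {
  c.read();  // grab the newest frame as it arrives
}

void draw() {
  image(cam, 0, 0);
}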

Memory Tracer

Overview

Memory Tracer (working title) will be an interactive audio detective game. According to an ancient Asian belief, every object has a spirit and a memory. In this game, a guest has to find out who the murderer is by picking up physical objects and listening to the memories of those objects.

 

Gameplay

CSI has invented a new high-tech machine. It takes you to the past and lets you hear what happened in that space. You can also hear the story behind each object you pick up (like a memory). There has been a murder in a closed space. The guest is the detective, and you have a partner (an assistant) who acts like an audio guide. The guest puts on the headphones and travels through time. You are now at the moment the murder happened, or even before. You hear the suspects' dialogue. You pick up the pieces of evidence and hear the story behind them. You can fast-forward or rewind what you are hearing, and finally guess which of the suspects is the murderer.

 

System Diagram

 

Challenges

1) A good, interesting story line

2) Appropriate sensors for each object, so that it reacts whenever the user picks it up or touches it (see the sketch after this list)

3) Good sound recording quality
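
As a starting point for challenge 2, here is a minimal sketch of the Processing side. It assumes each sensed object's Arduino prints its raw sensor reading as one line over serial, and that the Minim library plays the object's memory clip once the reading crosses a threshold; the threshold value and the clip file name are placeholders to be tuned.

import processing.serial.*;
import ddf.minim.*;

Serial port;
Minim minim;
AudioPlayer memory;
int threshold = 400;  // guessed pickup/touch threshold; tune per sensor

void setup() {
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
  minim = new Minim(this);
  memory = minim.loadFile("knife_memory.mp3");  // placeholder clip name
}

void serialEvent(Serial p) {
  int reading = int(trim(p.readString()));
  // Play the object's "memory" once when it is picked up or touched.
  if (reading > threshold && !memory.isPlaying()) {
    memory.rewind();
    memory.play();
  }
}

void draw() { }  // audio only; draw() just keeps the sketch running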

 

Project Schedule

1st week : story line and game design

2nd week : story line and game design, search for sensors

3rd week : record dialogues, compose/find music

4th week : make it all happen

Virtual Creature Creator

Abstract:

The idea is to create a creature physically and bring it to life virtually.

Related Works:

  • Posey

This is a construction kit that allows the user to build a physical interface for applications running on a computer.

  • Designosaur

Users can build a dinosaur out of physical parts, and the dinosaur is displayed in a modeling application.

Design Details:

Users will be able to combine physical parts in different orientations to create a unique creature, which will come to life in Processing. They will be able to define a set of physical and behavioral properties using these parts. With the pieces connected to an Arduino board, the board will send data about the orientation and combination of the physical components to Processing, which will then create a virtual creature in an environment with different physics applied.

User can then interface with the creature in Processing.
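
To make the Arduino-to-Processing data flow concrete, here is a minimal sketch of the receiving side, under the assumption that the board sends one comma-separated line per attached part, e.g. "leg,2,90" (part type, socket number, orientation in degrees). The message format and the stick-figure rendering are only illustrations.

import processing.serial.*;

Serial port;
ArrayList parts = new ArrayList();  // raw part descriptions

void setup() {
  size(400, 400);
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
}

void serialEvent(Serial p) {
  // One line per physical part: type, socket, orientation
  String[] f = split(trim(p.readString()), ',');
  if (f.length == 3) parts.add(f);
}

void draw() {
  background(255);
  // Placeholder rendering: one limb per reported part, rotated
  // by the reported orientation around the creature's body.
  translate(width/2, height/2);
  ellipse(0, 0, 60, 60);  // body
  for (int i = 0; i < parts.size(); i++) {
    String[] f = (String[]) parts.get(i);
    pushMatrix();
    rotate(radians(int(f[2])) + i * TWO_PI / max(1, parts.size()));
    line(0, 0, 80, 0);  // limb
    popMatrix();
  }
}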

I am also thinking about possible interactions between the physical and virtual components.

This is for fun and probably aimed at a younger age group, though I would play with it too.


Skills List:

  • Processing, and sending useful data from Arduino to Processing
  • Using hardware components and getting orientation/combination data

Parts List:

  • Physical parts to build creature

4-week timeline

  • Week 1: Figure out how hardware will work and look at different parts
  • Week 2: Build small scale parts to make sure concept works between hardware and Processing
  • Week 3: Build up the physical parts and the Processing application
  • Week 4: Finish project and wrap up documentation

Magic Wand

The Pitch

This wand allows you to manipulate the environment magically; you never know what you can do, as your powers grow cumulatively over time. Your wand glows radiantly when something you can use your power on is near. Flick your wand to find out what your new powers are.

Practically Speaking

The wand is an IR sensor/receiver that interacts in a special setting (theme park, family entertainment center, mall) with special props that look innocuous but have hidden components that can be manipulated by the wand.

 

How it Works

Most of the intelligence is built into the devices being interacted with; this keeps the wands cheap, since there will be many more wands than interaction props. The IR sensor on the wand "listens" for devices that can be interacted with. Different objects have different difficulty levels, so some form of communication is necessary beyond simple on/off sensing. The props' sensors are networked, keeping an online total of your score.
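
One cheap way to get communication beyond simple on/off sensing is to pulse the wand's IR LED in a pattern that packs an object ID and a difficulty level into a single byte. The bit layout below is just one possible choice, and the logic is written in Processing syntax for readability; on the real prop it would run on the prop's microcontroller.

// One received byte: high 5 bits = object ID (0-31), low 3 bits = difficulty (0-7).
int objectId(int packet) {
  return (packet >> 3) & 0x1F;
}

int difficulty(int packet) {
  return packet & 0x07;
}

boolean canInteract(int packet, int playerLevel) {
  // The wand can only affect props at or below the player's current power level.
  return difficulty(packet) <= playerLevel;
}

void setup() {
  int packet = (12 << 3) | 2;  // example: object 12, difficulty 2
  println("object " + objectId(packet) + ", difficulty " + difficulty(packet)
          + ", usable: " + canInteract(packet, 3));
}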

 

Minimum

A working wand and one interaction device.

 

Goals

A wand with a clean, polished form factor, highly reliable sensing, and a really fun demo interaction.

 

Required materials

  • Arduino Mini
  • Force sensor
  • IR sensor/receiver
  • RFID tags?

 

Draft Schedule

Week One: Schematic diagrams of the device and interaction, industrial design of the wand, purchase components

Week Two: Build Wand electronics

Week Three: Build Interaction electronics

Week Four: Build prop

Keep me alive! Community role-playing game

Abstract

The main idea of this project is to strengthen community bonds using a physical and web-based game. Influenced by the emotional design readings, I found it interesting to involve people in a common, shared game, with no limit on expanding and exploring affective relationships among participants. People generally make an effort when they feel shared responsibility, and they act together within the community when something depends on them. The concept of this game is to take care of an object that depends on "its family" and vice versa. The difference between this project and existing similar projects, like Tamagotchi or Webkinz, is that this one combines the physical and the virtual world to explore feelings and role sharing among the people that use the object. The project's intent is to develop affective, strategic, and community competences across all ages.

Physical sketch

The objects that compose this game are called "dudies". Each has a sphere, a spring, and a small box on the bottom, and each part has a specific function. Participants use their own dudie, and all dudies live and act as a family. Each dudie uses a wireless connection to a computer, which is connected to a web server where all dudies share their emotions. Because they live as a family, dudies depend on each other, which means that each player plays alone as part of a group that must keep them all alive. If one dudie dies, the whole family dies. These objects have some common characteristics, such as age, level of anger, level of happiness, self-intelligence, and life. The objective of the game is for each participant to contribute with real-life actions to keep the dudies happy and alive.

So the questions at this point are: how do humans interact with these objects and get responses? How do they understand dudies' needs and act to keep them alive?

There are some physical actions humans can perform on a dudie, and those actions have real-time consequences for all dudies' lives. The actions users can perform are:

– Users should show affection for dudies by touching them in a very specific way.

– Dudies will feel happy if they know that their owners spend time close to them.

– Dudies, like kids and animals, love to play, so owners should dedicate some time to playing with them, in a reasonable manner.

– Users can also put their dudies together to save the family if it is at risk of death. This action will increase the level of happiness to the maximum.

– Dudies have the capability to calculate the time users spend with them.

Dudies will display the following outputs:

– Dudies will increase or decrease their temperature depending on their happiness level.

– When dudies feel lonely, they warn their owners using vibration.

– Tapping the sphere makes it shake; players should do this, but carefully, because too many taps will result in anger or fury. Anger and fury are represented using lights and/or sounds.

As we can see, these objects are able to recognize the inputs coming from the environment and transform them intelligently to display adequate outputs. These outputs are shared, which means that dudies have a shared “machine brain”.

The web server is where this "machine brain" transforms and compiles the information coming from all dudies. Although they depend on the Internet to communicate feelings, each dudie is able to live in isolation for a while, surviving alone by storing and working from the last global state; once reconnected, it updates the global mood and "gets back to the family again".

Since there is no limit on how many dudies join the game, the system is able to self-adapt and recalculate its ratios regardless of the number of players. This means there is a global emotional mood that is calculated and updated based on the state of all dudies. The funniest thing in this game is that you never know who is playing well or badly; the only thing you know is that you must do things right to keep the family alive. This may be frustrating for someone who is playing well, but the objective is to develop community bonds.
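
As a sketch of how the shared "machine brain" could stay independent of the player count, assume each dudie reports a happiness value from 0 to 100 and the family mood is simply the average of the latest reports; the names, scale, and death threshold below are all placeholders (written in Processing syntax for consistency with the rest of the project).

import java.util.*;

HashMap moods = new HashMap();  // dudie ID -> last reported happiness (0-100)

void report(String dudieId, float happiness) {
  moods.put(dudieId, new Float(happiness));
}

float familyMood() {
  // Averaging over whoever is currently reporting makes the game
  // self-adapt to any number of players.
  if (moods.isEmpty()) return 50;  // neutral default before any report
  float sum = 0;
  Iterator it = moods.values().iterator();
  while (it.hasNext()) sum += ((Float) it.next()).floatValue();
  return sum / moods.size();
}

boolean familyAlive() {
  return familyMood() > 10;  // death threshold is a guess; tune in playtesting
}

void setup() {
  report("dudie-a", 80);
  report("dudie-b", 20);
  println("family mood: " + familyMood() + ", alive: " + familyAlive());
}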

 

System diagram

 

Skills list

Learn how to use accelerometers, heating resistors, and vibration motors. I'll also have to learn how to use new libraries in Processing and develop some programming techniques, especially using the file system and/or databases.

 

 

PROJECT DOCUMENTATION

Source code

Final paper

Proposal for Desktop Coworker

Objective:
Robots traditionally exist in the workplace strictly in the subordinate service of humans. The objective of this project is to create a small robot that fills the workplace role of a coworker, capable of working alongside its human counterparts. Like real colleagues, this robotic coworker should be capable of accomplishing tasks (or at least appearing to), but may also be susceptible to distractions.
 
Approach:
Create an arm-like robot that sits at a desk and types on a computer keyboard. The robot occasionally looks up and stares at interesting things. An Arduino will control three servo motors to position the robot, based on positions sent from Processing. Processing will receive a signal directly from a web camera mounted to the end of the "arm" and process the image, as well as verify input on the keyboard. Interesting distractions will be identified as points of high contrast and/or intensely moving things.
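
A minimal sketch of the distraction detector, assuming "interesting" is approximated by the largest frame-to-frame brightness change in the camera image; the sampling step and motion threshold are guesses, and the winning coordinate would then be handed to the servo-positioning math.

import processing.video.*;

Capture cam;
PImage prev;

void setup() {
  size(320, 240);
  cam = new Capture(this, width, height, 30);
}

void captureEvent(Capture c) {
  c.read();
}

void draw() {
  image(cam, 0, 0);
  if (prev != null) {
    int bestX = 0, bestY = 0;
    float bestDiff = 0;
    cam.loadPixels();
    prev.loadPixels();
    for (int y = 0; y < height; y += 4) {        // coarse grid for speed
      for (int x = 0; x < width; x += 4) {
        int i = y * width + x;
        float d = abs(brightness(cam.pixels[i]) - brightness(prev.pixels[i]));
        if (d > bestDiff) { bestDiff = d; bestX = x; bestY = y; }
      }
    }
    if (bestDiff > 40) ellipse(bestX, bestY, 10, 10);  // mark the "distraction"
  }
  prev = cam.get();  // keep a copy for the next frame
}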

Sketch + Diagram:

Precedents:
Double-Taker (Snout) by Golan Levin. A rooftop robot arm "looks at" visitors entering the Pittsburgh Center for the Arts.

ASIMO Robot by Honda. Robot intended to be a humanoid aid in the workplace.

Typing Robot by "pacoliketaco" (alias). A typing robot based on Cartesian axis positioning.

Skills List:
– Control multiple servo motors from an Arduino
– Use trigonometry to dynamically position the end of the arm in 3D space (see the sketch after this list)
– Control the Arduino through serial signals from Processing
– Create a virtual internal representation of the physical robot
– Use image processing to determine points of interest
– Move three-dimensionally, based on a 2-D image coordinate and the known 3-D position and angle of the camera
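
For the trigonometry item, the core is standard two-joint inverse kinematics: given a target at distance d and arm segments of lengths a and b, the law of cosines gives the joint angles, and the third servo rotates the whole plane toward the target. A sketch in Processing syntax, with placeholder segment lengths:

float a = 100;  // shoulder-to-elbow segment length (placeholder units)
float b = 100;  // elbow-to-fingertip segment length

// Returns {shoulder, elbow} angles in radians for a 2-D target in the
// arm's plane, or null when the target cannot be reached.
float[] solveArm(float x, float y) {
  float d = dist(0, 0, x, y);
  if (d > a + b || d < abs(a - b) || d == 0) return null;  // out of reach
  float elbow = acos((a*a + b*b - d*d) / (2*a*b));          // law of cosines
  float shoulder = atan2(y, x) + acos((a*a + d*d - b*b) / (2*a*d));
  return new float[] { shoulder, elbow };
}

void setup() {
  float[] angles = solveArm(120, 80);
  if (angles != null) {
    println("shoulder " + degrees(angles[0]) + " deg, elbow " + degrees(angles[1]) + " deg");
  }
}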

Parts list:
– 3 servo motors capable of supporting moderate torque.
– Macally Portable Goose Neck USB Video Web Cam (currently on order)
– piano wire and tubing for frame

Minimum Conditions:
– Robot capable of typing gibberish on a keyboard.
– Robot looks up in an arbitrary direction to simulate distraction.

Maximum Conditions:
– Robot capable of composing legible text on a keyboard.
– Robot looks around for moving and shiny things.

Week 1:
– Spec and order motors
– Create scale diagram of robot
– Calculate positioning algorithms
– Start working on image processing using built-in camera

Week 2:
– Get the Arduino to control 3 servos simultaneously
– Assemble robot based on diagrams
– Write and test positioning code

Week 3:
– Write higher level control functions to press keys and look in a particular direction
– Write an auto-composing text program in Processing (I have some experience with this already)
– Test image processing with external camera

Week 4:
– Link the auto-composing text program with the robot to type words
– Program in periodic “distraction time” for robot
– Get robot to look at areas of interest