
Tuesday, January 30, 2007

Ethelene - 003

Sunday, January 28, 2007

Ethelene - 002

Saturday, January 27, 2007

Ethelene - 001


This comic is hence tentatively named 'Ethelene', after a friend of mine.

Tuesday, January 23, 2007

Heroine

There, our first heroine.

Edit: A comic-ish version.




edited: P/S: I value comments.

Sunday, January 21, 2007

Multi touch screen display

Here are some video clips which demonstrate the capabilities and possibilities you can achieve with a multi-touch screen display, currently one of the most active research areas pursued by computer scientists working on haptic interfaces.

This one was created by Jefferson Han, who apparently dropped out during his senior year at Cornell University to join a startup company. He's now a research scientist at NYU. This piece of technology is based on frustrated total internal reflection, an optical phenomenon he took advantage of.


Here's another version, with a transparent screen. It's called TouchLight. This one was created by Andy Wilson, who's part of Microsoft Research. He got his BA in CS from Cornell University and his PhD from the MIT Media Lab.




Here's another one that Wilson made. It's called PlayAnywhere.





I'd have to say Jeff's gives the faster response time, which is crucial for an input device. Wilson's TouchLight is good and novel because it uses a transparent screen, but as we can see from the demo, there aren't many applications yet that exploit that feature. Jeff's demos are plentiful and really open up the many possibilities of this novel and new form of human-computer interaction.


One thing I should stress is that all these technologies have existed for some time, but due to many factors, including market acceptance and technological maturity, they're often delayed by years or even decades.


So my point is, at this point in time, many futuristic technologies are already available, waiting to be crafted into novel applications or to be perfected.


Allow me to digress. In the 60s, we possessed the theoretical technology to travel at up to nearly 0.1c. This is very impressive, considering the difference from the speed of light is just a single order of magnitude. Nevertheless, that technology made use of nuclear warheads, which meant possible pollution and hazardous radiation for life on Earth as the rocket made its way to space. The technology has since been abandoned, and we still rely on chemical energy, an energy form we've been using since we started off as boiling proteins and nucleic acids in some random unknown warm pool during primordial times. Now that it's the 21st century, I expected we would have reached the point of making good use of nuclear energy and, of course, of finding novel and safe ways to dispose of the radioactive fuel rods. But alas, our progression to a Type I civilization has stalled. Despite the potential danger to the environment, I still believe it will take technological advancement and a leap of faith to make the transition into an advanced yet peaceful and safe civilization.

Motion Detect

Here's a snapshot of the things I played with to investigate motion detection. I was sort of hoping to find some invariant in the shape of my hand, but alas, not a single one reliable enough was found. I hope to be able to control things on my screen just by waving my hand in front of the webcam. I've several ideas, each one different depending on how I want to design the interaction.

If I just want a really simple back/forward instruction, I can look for blobs at the bottom and on the sides of the screen. First I find the difference of two successive frames, filter off the noise and lone-pixel changes (I want to look for big changes), then find the biggest connected component by means of DFS or BFS, find the major/minor axes using a very simple approximation, and probably get the centroid as well. So we have a matrix of the biggest motion, along with its mean position. Then, within an interval of 0.6 plus or minus 0.2 seconds, I'll look for another calculated mean position that differs from the previous one by a fixed percentage, and then yet another mean position that's roughly the same as the first. This approximates a hand lifting from the bottom (the first motion matrix), to the side (the second motion matrix), and then back to the bottom (the third motion matrix).

Of course, to avoid other motions, such as the body/head or people walking by, we have to make some assumptions: the user will not move too much; the hand movement usually goes from the bottom to the side and then to the bottom again (if we only look for such patterns, it should cut out lots of false positives); it generates the largest motion; and we set a threshold for pixel change between frames for that biggest connected component.
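
Here's a minimal sketch of the frame-differencing part of that idea in Python, using only numpy. The thresholds, the 4-connectivity choice, and the function name are placeholder assumptions, not anything final.

import numpy as np
from collections import deque

def largest_motion_blob(prev_frame, frame, diff_thresh=30, min_pixels=50):
    """Difference two successive grayscale frames, drop small changes,
    and return (pixel_count, centroid_row, centroid_col) of the biggest
    connected component of motion, or None if nothing big enough moved."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    mask = diff > diff_thresh                      # keep only big changes
    h, w = mask.shape
    visited = np.zeros_like(mask, dtype=bool)
    best = None
    for sr in range(h):
        for sc in range(w):
            if mask[sr, sc] and not visited[sr, sc]:
                # BFS flood fill over the 4-connected neighbourhood
                queue = deque([(sr, sc)])
                visited[sr, sc] = True
                pixels = []
                while queue:
                    r, c = queue.popleft()
                    pixels.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if (0 <= nr < h and 0 <= nc < w
                                and mask[nr, nc] and not visited[nr, nc]):
                            visited[nr, nc] = True
                            queue.append((nr, nc))
                if best is None or len(pixels) > best[0]:
                    rows = [p[0] for p in pixels]
                    cols = [p[1] for p in pixels]
                    best = (len(pixels),
                            sum(rows) / len(rows),
                            sum(cols) / len(cols))
    if best is None or best[0] < min_pixels:       # treat tiny blobs as noise
        return None
    return best

The bottom-side-bottom gesture would then be recognized by comparing the centroids from three such calls spaced roughly 0.4 to 0.8 seconds apart.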


There's also another idea, which basically tries to identify which pixels are foreground and which are background. This would ensure that only objects close enough to the webcam can affect the on-screen objects. It has the added benefit of not constraining the hand's shape or movement, and of allowing other objects to control on-screen objects too, simply by waving them near the webcam. This is hard, however, because we're approximating objects' distances from a single monocular frame, and over such a short range, roughly 0 to 1 foot.
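
A very rough sketch of how that might look, assuming a simple running-average background model rather than any real monocular depth estimate (the learning rate and threshold below are guesses):

import numpy as np

class BackgroundModel:
    """Keep a slowly adapting average of the scene; pixels that deviate
    strongly from it are treated as foreground, i.e. something held
    close to the webcam."""

    def __init__(self, first_frame, alpha=0.05, thresh=40):
        self.bg = first_frame.astype(float)
        self.alpha = alpha      # how quickly the background adapts
        self.thresh = thresh    # deviation needed to count as foreground

    def foreground_mask(self, frame):
        frame = frame.astype(float)
        mask = np.abs(frame - self.bg) > self.thresh
        # only update the background where nothing changed, so a hand
        # held in front of the camera stays classified as foreground
        self.bg[~mask] = ((1 - self.alpha) * self.bg[~mask]
                          + self.alpha * frame[~mask])
        return mask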

Another idea is to use Voronoi diagrams. However, differing lighting conditions, reflections, different skin colors, users possibly wearing gloves or long sleeves, and the closeness of the hand's skin color to that of the head and body mean this method is inaccurate, if it works at all.

Another method is to detect edges, then connect regions with small color differences. Of all the regions, look for the one that most closely approximates the shape of a hand. The processing needed is quite a lot, and it imposes a shape restriction on the hand, which after some time will not be very comfortable.
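
For the edge step, a Sobel gradient magnitude is probably the simplest starting point; the region merging and the hand-shape matching, which are the expensive parts, are left out of this sketch:

import numpy as np

def sobel_magnitude(gray):
    """Edge strength of a grayscale image via 3x3 Sobel kernels
    (slow pure-Python convolution, just to show the idea)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    img = gray.astype(float)
    padded = np.pad(img, 1, mode='edge')
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    h, w = img.shape
    for r in range(h):
        for c in range(w):
            window = padded[r:r + 3, c:c + 3]
            gx[r, c] = np.sum(window * kx)
            gy[r, c] = np.sum(window * ky)
    return np.hypot(gx, gy)       # large values mark edge pixels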

MF267

Pilot: Flight MF267 on approach. Requesting permission to land.
ATC: Roger flight MF267. Clearing runway 13. Please remain airborne at 2000 feet.
Pilot: Roger that. Fuel's running very low. Please be quick.
ATC: We got that. Please wait for another 2 minutes.
Pilot: Roger control tower. Approaching traffic circuit.
ATC: Roger flight MF267. Remain at 2000 feet. Traffic circuit is congested.
Pilot: Roger that.
ATC: Runway cleared. Flight MF267, you are cleared to land.
Pilot: Roger that. I'm off the runway. Joining the traffic circuit.
ATC: Roger flight MF267. We sense a crosswind at 43 degrees relative to runway, about 50 knots.
Pilot: Roger that. Control's a bit hard here. Joining downwind.
ATC: We got you.
Decreasing engine power. Deploying 2nd level flaps. Now deploying gear.

Pilot: Control tower, my right gear is stuck. Please advise.
ATC: Roger that. You are too low now. Regain altitude and knock off the gear.
Pilot: Roger control tower. Leaving traffic circuit.
Pilot: Control Tower! My gear is still stuck. Fuel is running out. Please advise!
ATC: Roger that. Please remain calm.
Pilot: Control Tower! Right engine losing power! Can't control!
ATC: Roger that. Retract your flaps. Lose some altitude to regain airspeed.
Pilot: Roger Control Tower! Flaps retracted. Regaining control! Gear still stuck! In Base now! Checkerboard in sight! Can't risk crash and burn! Control tower please advise!
ATC: Please remain calm. You have to land with single gear.
Pilot: Roger that! Crosswind is strong. Right engine will hit the ground!
Pilot: Engine shut down! Losing power, all systems down!
ATC: Roger that. Turbine can't generate enough power at current airspeed. Your plane is fly-by-wire. You will lose control in a moment.

Aligning at 45 degrees off the runway. Deploying 2nd level flaps. 70 feet. 50 feet. Pulling ailerons. Re-aligning with runway. 20 feet. 10 feet.
Pilot: Control tower! Right engine hit the ground! I repeat! Right engine hit the ground!
ATC: Roger pilot. We see that. Deploying fire brigades, and..
ATC Officer: Oh my gosh...he's going to tilt over!
Pilot: Control Tower...help!! we

A deafening boom staggers the control room. Glass shatters into pieces. All the screens go black.