Friday, November 2, 2012

Programmers Anonymous notes, 1110


Robot news:

Human edges out robot car on race track (BBC)

... 
The robot car in the race has been developed by researchers at the Centre for Automotive Research at Stanford University (Cars).
Called Shelley, the autonomous vehicle is fitted with sensors that work out its position on the road, feed back information about the grip of its tyres and help it plot the best route around the circuit.
Prof Chris Gerdes, head of the Cars Lab at Stanford, said Thunderhill was chosen because its 15 turns present the car's control systems with a wide variety of challenges. Some corners can be taken at high speed, some are chicanes, others are sharp and come at the end of long straights down which the car hit a top speed of 115mph (185kph).
... 
"As we set up these systems in the future, it's important not to build autonomous vehicles that are merely a collection of systems designed for human support but to think a little bit more holistically about making them as good as the very best human drivers," said Prof Gerdes. "It's not so much the technology as the capability of the human that is our inspiration now."

3D-printed custom exoskeleton
Two-year-old Emma wanted to play with blocks, but a condition called arthrogryposis meant she couldn't move her arms. So researchers at a Delaware hospital 3D printed a durable custom exoskeleton with the tiny, lightweight parts she needed. 



"The current methods we have for monitoring or interacting with living systems are limited," said Lieber. "We can use electrodes to measure activity in cells or tissue, but that damages them. With this technology, for the first time, we can work at the same scale as the unit of biological system without interrupting it. Ultimately, this is about merging tissue with electronics in a way that it becomes difficult to determine where the tissue ends and the electronics begin." 
The research addresses a concern that has long been associated with work on bioengineered tissue -- how to create systems capable of sensing chemical or electrical changes in the tissue after it has been grown and implanted. The system might also represent a solution to researchers' struggles in developing methods to directly stimulate engineered tissues and measure cellular reactions.


Robot learns to recognise itself in mirror

So far the robot has been programmed to recognise a reflection of its arm, but ultimately Mr Hart wants it to pass the "full mirror test".
The so-called mirror test was originally developed in 1970 and has become the classic test of self-awareness.
Usually performed on animals, the test gives the creature time to get used to the mirror; it is then anaesthetised and marked on the face with an odourless, non-tactile dye.
The animal's reaction to its reflection is used as a gauge of its self-awareness, based on whether it inspects the mark on its own body or reacts as though the mark were not on itself.
Increasingly, scientists have used similar tests to analyse self-awareness in robots, but none has yet programmed a robot to fully recognise itself from appearance alone. To date, only a few non-human species pass these tests, including some primates, elephants and dolphins. Human babies are unable to pass the test until they are 18 months old.
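
A simple way to make the arm-recognition step concrete is to correlate the motion the robot commands with the motion it observes: a tracked region that moves in lockstep with the motor commands is probably the robot's own body, or its reflection. Here is a rough Python sketch of that intuition; the synthetic signals and the 0.9 threshold are illustrative assumptions, not the actual method used on Hart's robot:

import random

def pearson(xs, ys):
    # Pearson correlation between two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def looks_like_self(commanded, observed, threshold=0.9):
    # A tracked region whose motion closely follows the motor commands
    # is classified as the robot's own body (or its reflection).
    return pearson(commanded, observed) >= threshold

commanded = [random.uniform(-1, 1) for _ in range(50)]        # commanded arm velocities
reflection = [v + random.gauss(0, 0.05) for v in commanded]   # the mirror echoes them, plus sensor noise
stranger = [random.uniform(-1, 1) for _ in range(50)]         # someone else moving nearby

print("reflection classified as self:", looks_like_self(commanded, reflection))  # True
print("stranger classified as self:  ", looks_like_self(commanded, stranger))    # False, with high probability

Passing the full mirror test would take much more than this, since the robot would have to notice a mark as out of place against a learned model of its own appearance rather than just matching motion.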



I Made the Robot Do It (NY Times, by Thomas L. Friedman)

And therein lie the seeds of a potential revolution. Rethink’s goal is simple: that its cheap, easy-to-use, safe robot will be to industrial robots what the personal computer was to the mainframe computer, or the iPhone was to the traditional phone. That is, it will bring robots to the small business and even home and enable people to write apps for them the way they do with PCs and iPhones — to make your robot conduct an orchestra, clean the house or, most important, do multiple tasks for small manufacturers, who could not afford big traditional robots, thus speeding innovation and enabling more manufacturing in America.




Thursday, November 1, 2012

Talk:User_interface#Edits_done_on_26_June_2005

http://en.wikipedia.org/wiki/Talk:User_interface#Edits_done_on_26_June_2005

" Can't say your narrowmindedness and your comfort with censorship appeals to me

So you're a Human Computer-Interaction Student? Well, well. Let me tell you who I am. I designed a touchscreen interface paradigm that has been copied the world over and covers the globe. Millions of people use it. Just about every touchscreen interface in the world borrows freely and liberally from my work, none of which was patented and none of which requires royalties. And you don't think my mention of touchscreens in a discussion of user interfaces is noteworthy. You can surely understand my position if I think you have your head up your ass.
Every time I walk into a restaurant I see people using touchscreens and the software paradigm that I developed 20 years ago. Every time I go into the post office I see touchscreens. In supermarkets, I see people using touchscreens to cash out their own purchases. In many of the new cars I see people using touchscreens. On all the new consumer electronics gear I see touchscreens replacing buttons, knobs & dials. In the library, I see people checking out materials with touchscreens. The latest airplanes all have touchscreens. And you have the nerve to dismiss all of this? It's clear that you don't get out much. There are 4,000 stories in Google News today worldwide with the word touchscreen or touch-screen in them. I don't think you have any idea what the concept of User Interface means in 2005. And you think you're going to tell us all about user interfaces on the cell phones and the airplanes they're designing and building today without talking about touchscreens? Let us all know how that goes, won't you? It should be quite a laugh. GeneMosher"

The Wikipedia User Interface entry is pretty good too, something for everyone to disagree with.

For reference, I got 25,500 results for "touchscreen" on Google News today.