
Posted by: Unknown, Friday, November 29, 2013

A Google Glass developer with a clear vision of what Glass can be, Brandyn White sees how Glass can be a force for good with a feature that Glass doesn't even have yet.




Brandyn White's eye-tracking peripheral could become part of the next generation of Google Glass.


(Credit: James Martin/CNET)

Brandyn White's adventures hacking on Google Glass began not in a fancy Silicon Valley lab, but in a St. Petersburg, Fla., car repair shop in the mid-1990s.


His dad gave him a Tandy personal computer. The classic Windows desktop tower, "which was old then," White said with a laugh, was part of a payment his dad had received for fixing a customer's car. Limited in what he could do with the Tandy, White soon picked up a programming book from his school library. He was 10.


By the time he was a teenager, White had started a company called Connor Software, which involved him "knocking off" -- his words -- other software and giving it away for free.



"Nothing's changed, I still have the same mentality," he told CNET after a Google Glass hackathon in San Francisco.


But also by that time, the open source fan had received several other now-classic 1990s computer towers from his dad. White's mom was not happy with their blocky, beige appearance and told him to reduce their number to one. Since they were running servers and he was unwilling to get rid of them, White got into casemodding to make them more visually appealing.


The teenage White then learned how to run a business, experience he gained at 16 while managing Florida's Pinellas County Credit Union. Fast-forward two decades, and you get the current Brandyn White: a 27-year-old programmer working toward his Ph.D. in computer science at the University of Maryland; co-founder of the computer vision consulting firm Dapper Vision; and a developer and hacker who wants to change the world through Google Glass.


To that end, he's built an eye tracker hardware attachment for Glass. It adds eye-tracking features that the current Explorer Edition lacks, but he thinks it's destined to be much more than a kludgy prototype Glass peripheral.


Question: How did your interest in Glass lead you to eye tracking?

White: I've been very interested in wearables for a long time, and I'm interested in doing automatic recognition in visual systems. You have this problem where you want to use machine learning, where you want lots and lots of little things. You need to know when you're in the kitchen, in the grocery store, when you're [physically] picking something up in the grocery store.


"Right now, there's no way to control Glass that's acceptable in a large group of people."

--Brandyn White, Google Glass developer


I'm interested in solving the meta problem of how do you find things, find everything.


My [Ph.D] research project is knowing everything about what you're doing, so that once I have that information I can add a lot of value. You just need to tack on to that data.


People have the sense that Glass is the creepiest thing in the world, so I want to build a database that's on their side. It has strong cryptography, it's open source, and people know how their data is being used.


I want this to be your advocate, your agent. We want to set it up so that if we ever wanted to be evil, somebody could just fork it and set up something better.


Let's talk about why you built an eye-tracker for Google Glass. Why is it important?

White: If you want things to line up between the real world and augmented reality, you have to know where the person is looking. All that information gives you an intimate understanding of where they are, just using the accelerometer.


When it comes to wearables, not everybody can do wristbands, but many people can wear Glass. It's very hackable, it's very powerful. Right now, there's no way to control Glass that's acceptable in a large group of people. Touching it [via the touchpad on the side] shows that you're not paying attention. You can't use voice commands, or pull out your phone.


Eye tracking, though, that's a use-case scenario.
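White doesn't describe the interaction mechanics here, but the standard way to turn gaze into an unobtrusive control is dwell selection: if the wearer's eyes rest on a target for long enough, that counts as a tap. The Python sketch below is purely illustrative; Glass exposes no public gaze stream, so the sample format, the dwell time, and the pixel radius are all assumptions.

import math

# Illustrative sketch only: dwell-based selection from a stream of gaze points.
# Glass exposes no such gaze API; the sample format, 0.8 s dwell time, and
# 30 px radius below are assumptions, not part of any real Glass SDK.

DWELL_SECONDS = 0.8      # how long the eyes must rest on a target to "click"
DWELL_RADIUS_PX = 30     # how far the gaze may wander and still count as dwelling

def detect_dwell(gaze_points):
    """gaze_points: iterable of (timestamp_s, x_px, y_px) in display coordinates.

    Yields (x, y) whenever the gaze stays within DWELL_RADIUS_PX of one spot
    for at least DWELL_SECONDS -- the gaze equivalent of a tap.
    """
    anchor = None   # (t, x, y) where the current dwell started
    fired = False   # so one long fixation triggers only a single selection
    for t, x, y in gaze_points:
        if anchor is None:
            anchor, fired = (t, x, y), False
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) > DWELL_RADIUS_PX:
            anchor, fired = (t, x, y), False   # gaze moved on; restart the dwell
        elif not fired and t - t0 >= DWELL_SECONDS:
            fired = True
            yield (x0, y0)

if __name__ == "__main__":
    # Simulated gaze: a short wander, then a ~2 s fixation near (200, 120).
    samples = [(0.00, 50, 50), (0.05, 60, 55)]
    samples += [(0.10 + i * 0.05, 200 + (i % 3), 120 - (i % 2)) for i in range(40)]
    for hit in detect_dwell(samples):
        print("dwell selection at", hit)

Because the trigger is a fixation rather than a touch or a spoken command, this is the kind of input that would not read as inattention in the group settings White describes.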


Can't Google Glass be controlled with blinks? Isn't that the same as eye tracking?

White: Blink detection is a private feature [that most developers can't program for], but it's public enough that I can talk about it. Glass has an infrared emitter on the inside and a proximity sensor, but you can modify it to do things like get the ambient reflection off your face, such as knowing when you blink. But you can't get real eye tracking.
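What White describes is essentially a one-dimensional signal: the infrared reflection off the face changes briefly when the eyelid closes. A minimal sketch of that idea follows; the raw readings, sampling rate, and thresholds are invented for illustration, since Glass does not publicly expose this sensor stream.

# Hedged sketch of the blink idea White describes: the IR proximity reading
# jumps briefly when the eyelid closes, so a blink looks like a short pulse
# above the running baseline. All numbers here are illustrative assumptions;
# Glass does not publicly expose this raw sensor stream.

def detect_blinks(readings, sample_hz=100, threshold=1.3, min_ms=50, max_ms=400):
    """readings: list of raw proximity values, one per sample.

    Returns the sample indices where a blink starts: a run of samples that
    rises above `threshold` times the baseline and lasts between min_ms and
    max_ms (shorter is noise, longer is the eye simply staying shut).
    """
    baseline = sum(readings[:10]) / max(1, len(readings[:10]))
    min_len = int(min_ms * sample_hz / 1000)
    max_len = int(max_ms * sample_hz / 1000)
    blinks, start = [], None
    for i, value in enumerate(readings):
        if value > threshold * baseline:
            if start is None:
                start = i                      # pulse begins
        else:
            if start is not None:
                if min_len <= i - start <= max_len:
                    blinks.append(start)       # pulse had blink-like duration
                start = None
            baseline = 0.95 * baseline + 0.05 * value   # track slow sensor drift
    return blinks

if __name__ == "__main__":
    # One second of simulated data: flat baseline with a single ~120 ms blink.
    signal = [10.0] * 100
    for i in range(40, 52):
        signal[i] = 16.0
    print(detect_blinks(signal))   # -> [40]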






Blinks are not as good as looking at something with your eye. Using eye gestures is much simpler.


What's the benefit of eye gestures? What can they do that other input methods can't?

White: I'm interested in helping people with disabilities. I've been trying to push computer vision researchers into accessibility more. If people took what they already know, it'd make an enormous difference. But there's no money in it, no grant money.


The reason that visual impairments are important to me is that my background, computer vision, gets used for surveillance. And surveillance makes me feel bad. I've built technology that can detect things in the world. If I could tell you that there's a couch in front of you, that's almost never useful to you. Even if you're blind, you probably know that it's there.


"I'm putting myself into a position where I'm not making any money, so I've been making my money consulting. I'd prefer to stay independent."

--Brandyn White, Google Glass developer


But if I could use the aggregate data, that could be useful. And for the visually impaired, that in turn could be useful. A sighted person could wear Glass and use it to identify all the objects in a room, couch and keys and table and remote control. Then if a visually-impaired person had it, Glass could tell them where things are. Or the visually-impaired person could use Glass to remember where he put his keys down.
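The aggregate-data idea reduces to a very small piece of software: a shared memory of what was recognized where, which a visually impaired wearer can query later. The sketch below assumes the recognition step already exists and simply shows the bookkeeping; none of the names are real Glass or Dapper Vision APIs.

import time
from collections import defaultdict

# Minimal sketch of the shared object memory described above: a sighted wearer
# (or an automatic recognizer) logs what was seen and where, and a visually
# impaired wearer later asks where something was last seen. The recognition
# itself is assumed to happen elsewhere; nothing here is a real Glass API.

class ObjectMemory:
    def __init__(self):
        # label -> list of (timestamp, location description) sightings
        self._sightings = defaultdict(list)

    def record(self, label, location, timestamp=None):
        """Log a recognized object, e.g. record('keys', 'kitchen counter')."""
        self._sightings[label].append((timestamp or time.time(), location))

    def last_seen(self, label):
        """Answer 'where are my keys?' with the most recent sighting, or None."""
        sightings = self._sightings.get(label)
        if not sightings:
            return None
        _, where = max(sightings)
        return f"{label} last seen: {where}"

if __name__ == "__main__":
    memory = ObjectMemory()
    memory.record("keys", "kitchen counter", timestamp=1)
    memory.record("remote control", "left couch cushion", timestamp=2)
    memory.record("keys", "hallway table", timestamp=3)
    print(memory.last_seen("keys"))     # keys last seen: hallway table
    print(memory.last_seen("wallet"))   # None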


So people who are unable to use their hands or voice can still use the Internet, through an eye-tracking-enabled Glass. How did you get into building the eye-tracking hardware?

White: One of my colleagues in the media lab introduced me to the Pupil Project, which has the goal of building an open source eye tracker.


I took their ideas and designs, and I built a much better tracker that they're now going to use. We basically took the two guys that built this device previously and they're going to start working with us on the next version.
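The Pupil Project's actual pipeline involves careful ellipse fitting and per-user calibration, but the core "dark pupil" idea is simple enough to sketch: under infrared illumination the pupil is the darkest sizeable blob in the eye camera's image, so threshold for dark pixels and take the centre of the largest region. The OpenCV code below (assuming OpenCV 4) is an illustration of that idea, not the Pupil or Dapper Vision implementation.

import cv2
import numpy as np

# Illustrative "dark pupil" detector: threshold the eye image for very dark
# pixels and take the centroid of the largest dark blob. Real trackers add
# ellipse fitting, glint rejection, and calibration that maps pupil position
# to a gaze point; this is a sketch of the idea, not the Pupil algorithm.

def find_pupil_center(eye_frame_bgr, dark_threshold=40):
    """Return (x, y) of the estimated pupil centre in pixels, or None."""
    gray = cv2.cvtColor(eye_frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)

    # Keep only very dark pixels; the pupil should dominate this mask.
    _, mask = cv2.threshold(gray, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None

    pupil = max(contours, key=cv2.contourArea)
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

if __name__ == "__main__":
    # Synthetic test frame: light grey background with a dark disc as the pupil.
    frame = np.full((240, 320, 3), 180, dtype=np.uint8)
    cv2.circle(frame, (160, 120), 25, (10, 10, 10), -1)
    print(find_pupil_center(frame))   # roughly (160.0, 120.0)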



The eye-tracking hardware cost White only $25 to build.


(Credit: James Martin/CNET)

Right now it has to plug in to a host computer. We also have to do something to augment the power. So if you can imagine it looking just like Glass, it allows for very natural gestures. We wouldn't have to do the head twitch [the Google Glass activation gesture].


It's big and clunky and enormous [relative to Glass itself]. It works, and it cost $25. The goal of doing it was to see what it could do. I don't know that Google is going to have anything to do with it for sure, but if it's successful, if it shows that it's useful and easier to use, then it serves its purpose.


What's your stake in this?

White: My goal is impact, not making money. Money is nice. I could work for Google and add a new feature to Glass, or I could make a video and impact the next version of Glass.


I'm putting myself into a position where I'm not making any money, so I've been making my money consulting. I'd prefer to stay independent.


I'm part of the Glass research program, so I get fairly close access. I've already impacted the next version of Glass. I put out a video [two weeks ago] showing eye tracking. Everybody thinks Glass has it, but it doesn't. The reason I put it out was to show that it can be done, and it can be useful.



White believes that there are several important use cases for eye tracking, including eye-gesture controls and eye-position-based responses from Glass.


(Credit: James Martin/CNET)

Does Google Glass with eye-tracking constitute any kind of privacy violation? Won't this mean that Google just knows even more about you?

White: Right now the rules say that you can't have ads on anything, but it's unavoidable. People are going to market back to you some other way.


So, ads on Glass are unavoidable?

White: The worst thing that could happen with Glass would be to siphon off user data and sell them on things. That would kill off wearable computing. I want people to have applications on their device that know everything about their lives, and not be creepy about it.


It's not a bad thing that Glass can see everything that I can see; it just requires a higher level of trust. It should go from "I can check my email," which I can already do, to "Oh, I can see how this can make my life better!"


