Most approaches to capturing 3D models of real-world objects involve multiple cameras that are rarely cheap and sometimes tricky to calibrate. The University of Glasgow has developed a method that ditches those cameras altogether: its system uses four single-pixel sensors to stitch together a 3D image from the reflected intensity of light patterns cast by a projector. Reducing the pixel count lowers the cost per sensor to just a few dollars and extends sensitivity as far as terahertz wavelengths. Real-world products are still a long way off, but the university sees its invention as useful for cancer detection and other noble pursuits. Us? We’d probably just waste it on creating uncanny facsimiles of ourselves.
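The reconstruction idea is worth a sketch: each single-pixel detector records one total-intensity number per projected pattern, and an image emerges by correlating those numbers with the known patterns. Below is a toy illustration of that principle (basic "ghost imaging" with random binary patterns in NumPy); the scene, pattern scheme and sizes are invented for illustration and are not Glasgow's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 16                          # tiny scene, purely illustrative
scene = np.zeros((H, W))
scene[4:12, 6:10] = 1.0             # a bright rectangle as the "object"

n_patterns = 4000
# The projector casts known random binary light patterns onto the scene.
patterns = rng.integers(0, 2, size=(n_patterns, H, W)).astype(float)

# A single-pixel detector records only the total reflected intensity
# for each pattern -- one number per flash.
measurements = (patterns * scene).sum(axis=(1, 2))

# Correlating the measurements with the known patterns recovers the image.
centered = measurements - measurements.mean()
recon = np.tensordot(centered, patterns, axes=(0, 0)) / n_patterns

# The object's region should come out noticeably brighter.
inside = recon[4:12, 6:10].mean()
outside = recon[scene == 0].mean()
```

With enough patterns, `inside` clearly exceeds `outside`; Glasgow's system adds multiple detectors at different angles to recover shape, not just a flat image.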
Via: New Scientist
Source: University of Glasgow
If you’re attending Google I/O this week, you’ll be part of an experiment by the Google Cloud Platform Developer Relations team. On its blog today, the team outlined its plan to gather environmental data about your surroundings as you meander around the Moscone Center.
In the blog post, Michael Manoochehri, a Developer Programs Engineer, outlines his team’s plan to place hundreds of Arduino-based environmental sensors around the conference space to track things like temperature, noise levels, humidity and air quality in real-time. The project was spawned by a fascination with which areas of the conference are the most popular, so it will be interesting to see what the gathered data actually tells us.
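For a sense of what one of those motes might stream to the cloud, here's a minimal sketch of a simulated reading serialized as JSON; the field names and value ranges are hypothetical, not the team's actual schema:

```python
import json
import random
import time

def read_mote(mote_id):
    """One simulated reading from an Arduino-style mote.
    Field names and ranges are hypothetical, not Google's schema."""
    return {
        "mote_id": mote_id,
        "timestamp": int(time.time()),
        "temperature_c": round(random.uniform(18, 28), 1),
        "humidity_pct": round(random.uniform(30, 60), 1),
        "noise_db": round(random.uniform(40, 85), 1),
    }

# Batch readings the way a gateway might before shipping them upstream.
payload = json.dumps([read_mote(i) for i in range(3)])
```

Once readings land in a common format like this, aggregating hundreds of motes into real-time visualizations is mostly a query problem.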
At first glance, this seems a little bit creepy, but it’s no different than a venue adjusting the cooling system based on the temperature inside at any given moment. As with anything that Google does, this could have implications for tracking indoor events or businesses in the future, as Manoochehri shared:
Networked sensor technology is in the early stages of revolutionizing business logistics, city planning, and consumer products. We are looking forward to sharing the Data Sensing Lab with Google I/O attendees, because we want to show how using open hardware together with the Google Cloud Platform can make this technology accessible to anyone.
Notice the wrap-up of wanting to show people how open hardware combined with Google’s Cloud Platform benefits everyone. Ok, sure. What could data like this mean for businesses, though? Well, a clothing store would be able to track how many people came in and browsed, which areas of the store were hot-spots for interest and then figure out how their displays converted. It’s like real-world ad-tracking. It makes sense, but still seems a long way off.
What will be interesting is not each dataset that is collected, but what all of them tied together tell us about our surroundings:
Our motes will be able to detect fluctuations in noise level, and some will be attached to footstep counters, to understand collective movement around the conference floor.
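The fluctuation detection the quote mentions can be sketched as a rolling standard deviation over a mote's noise readings; this is a toy illustration, not the team's actual pipeline:

```python
from statistics import pstdev

def fluctuation_scores(noise_db, window=5):
    """Rolling standard deviation over a mote's noise readings
    (toy fluctuation detector)."""
    return [pstdev(noise_db[i:i + window])
            for i in range(len(noise_db) - window + 1)]

steady = [50.0, 50.5, 49.8, 50.2, 50.1, 50.3, 49.9]   # quiet hallway
burst = [50.0, 50.5, 72.0, 88.0, 51.0, 50.3, 49.9]    # a crowd passes by
```

A spike in the rolling score flags a sudden crowd, while a steady hallway stays near zero.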
Of course, none of this information is personally identifiable, but the thought of our collective steps, movements and other ambient output being turned into something usable by Google is intriguing to say the least…and yes, kind of creepy.
If this particular team can share all of the data it collects in an easy to digest way, then businesses will be clamoring to toss sensors all over their stores and drop the data on whatever cloud platform that will host it the cheapest. Google would like to be that platform.
During the event, the team will hold a workshop on what it calls the “Data Sensing Lab,” so if you’re interested in learning more about what the team is gathering as you walk around, this would be the place to go. You’ll also be able to see some of the real-time visualizations on screens set up throughout the conference floor.
We’ll be covering all of the action as we’re being covered by Google.
Epson’s 3D display glasses, the Moverio BT-100, have been floating around as a development platform for a couple of years, and APX Labs is the latest to hack the headset. APX Labs is a software firm best known for creating Terminator Vision augmented reality tech for the US military, and it decided to use the BT-100 as a vehicle to develop and showcase a smart glasses platform it has built for both business and consumer applications. In order to get the functionality it needed, APX grafted a 5-megapixel camera, a mic and a full suite of motion sensors providing nine-axis head tracking onto a Moverio headset.
All that gear is shoved into a 3D-printed module and attached to the BT-100 to turn it into a pair of smart glasses. In addition to the camera and sensors, APX also hacked an Epson daughter board onto the Moverio’s controller to allow an HDMI video feed from a smartphone to be shown on the displays. The result? A system that understands where you are and what you’re seeing and hearing, plus a UI that lets users glean information from the world around them using voice commands and head gestures. That should sound familiar to fans of Google Glass, but by using Epson’s binocular displays, these smart glasses can convey depth in a way Mountain View’s monocle cannot. (Not to mention that Glass doesn’t even do AR apps… yet.) The hardware we got to see was a crude prototype built for demo purposes only, but the software platform shows promise and Epson’s got a version-two Moverio headset in the works — so perhaps you can see a bit of the future of smart glasses in the video after the break.
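Nine-axis head tracking boils down to sensor fusion: fast-but-drifting gyro integration corrected by slow-but-stable gravity readings. A minimal single-axis complementary filter gives the flavor (a sketch only; APX's actual tracker is surely more sophisticated):

```python
import math

def complementary_pitch(gyro_rates, accels, dt=0.01, alpha=0.98):
    """Estimate pitch by fusing gyro rates (rad/s) with accelerometer
    tilt readings (ax, az) -- a toy one-axis complementary filter."""
    pitch = 0.0
    for rate, (ax, az) in zip(gyro_rates, accels):
        accel_pitch = math.atan2(ax, az)            # gravity-based tilt
        pitch = alpha * (pitch + rate * dt) + (1 - alpha) * accel_pitch
    return pitch

# A stationary head tilted 45 degrees: zero rotation, constant gravity vector.
estimate = complementary_pitch([0.0] * 500, [(1.0, 1.0)] * 500)
```

The gyro term responds instantly to head motion, while the small accelerometer term slowly pulls the estimate back toward the true tilt, cancelling drift; a magnetometer plays the same correcting role for yaw.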
While Microsoft’s main investment in sensor technology has been Kinect, the software maker hasn’t yet ventured into wearable devices. With devices like Jawbone’s UP and Nike’s FuelBand soaring in popularity, and Google’s Glass set to debut later this year, there’s clearly a shift toward wearable computing. Speaking at Microsoft’s TechForum event this week, Don Mattrick, president of the company’s Interactive Entertainment Business (responsible for Xbox), offered his own predictions for the future of wearable tech.
Militaries want soldiers to carry an increasing amount of tech onto the battlefield, but that isn’t necessarily convenient — or comfortable. MIT and the US Army have started early work on uniforms with fiberoptic sensors that would alleviate much of that burden. By weaving in microfibers cut from a mix of specialized, fluidized materials, the partnership can build data links that cover the entire body without breaking or adding significant bulk. They could serve as basic elements of a communication system, but MIT has broader ambitions: the sensors could track wounds through heat signatures, and just might prevent friendly fire incidents by sending a don’t-shoot signal when targeted with a laser sight. The fibers still have to get much thinner before the Army can offer smart uniforms as standard issue, but the wearable tech may keep soldiers nimble and, just possibly, save a few lives.
Filed under: Wearables
Source: MIT
Validity Sensors, the San Jose-based maker of fingerprint scanning sensors and authentication technology, announced today that it has closed $10 million of a $20 million series E financing round. (It will close the second half in the next month.) The investment was led by TeleSoft Partners, with participation from Validity’s previous investors, including Crosslink Capital, Panorama Capital, Qualcomm Ventures and Venture Tech Associates. The round brings Validity’s total funding to $78.6 million.
While there are tons of security apps and password lockers that help keep mobile devices, computers and sensitive digital info secure, the prevailing form of authentication still comes in the form of good ole passwords and PINs. Of course, most people use the same password across multiple accounts, or have a tendency to forget the complex ones login pages ask them to create.
As we’ve all learned, these forms of authentication are difficult to remember, ineffective and fairly easy to hack. With the exploding growth of mobile payment transactions and cloud-based services, new (or better) forms of security are needed to protect our data both in the cloud and on the go — especially considering the expected growth of mobile payments, and how frequently we’ll be using our phones to pay bills and receive coupons and location-based offers in the next few years. That’s where Validity Sensors wants to enter the picture.
Validity and companies like it believe that, even with advances in multi-factor authentication technology (facial, voice, etc.), fingerprints are still the best and simplest way to verify identity. The company has developed fingerprint sensor tech that enables authentication, device login, access to digital and mobile wallets, password management, app launching and so on — for smartphones, tablets and notebooks.
In the future, this tech could expand to content control for home media, home automation and monitoring, and, really, access control for a wide range of things (namely robot butlers). Collectively, all these applications need a simple way to securely authenticate the user’s identity — a need that isn’t going away any time soon.
The company’s mobile fingerprint solution gives handset designers a way to identify users, protect mobile payments and launch (and log users into) email, social network, shopping and banking apps — all with the swipe of a finger. Partners can integrate Validity’s technology in under-glass solutions or add it to home and power buttons on mobile devices and notebooks. Currently, Validity’s solutions support the Android and Windows operating systems.
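At its core, a swipe-to-login flow is template matching under a distance threshold: enroll a template once, then accept a swipe whose template is close enough. The toy below uses Hamming distance on bit templates purely for illustration; real sensors like Validity's use far more sophisticated minutiae matching, and this is not their algorithm:

```python
import random

def hamming(a, b):
    """Number of differing bits between two equal-length templates."""
    return sum(x != y for x, y in zip(a, b))

def verify(stored, candidate, max_distance=4):
    """Toy matcher: accept if the swipe's bit template is close enough
    to the enrolled one. Real minutiae matching is far more involved."""
    return hamming(stored, candidate) <= max_distance

random.seed(1)
enrolled = [random.randint(0, 1) for _ in range(64)]

# A genuine swipe: the enrolled template with a couple of noisy bits.
genuine = enrolled.copy()
genuine[3] ^= 1
genuine[40] ^= 1

# An impostor: an unrelated random template.
impostor = [random.randint(0, 1) for _ in range(64)]
```

The threshold trades false accepts against false rejects — the same tuning problem every biometric vendor faces, whatever the matching algorithm.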
Since launching its products in 2008, Validity has shipped more than 30 million sensors to OEMs, focusing initially on PCs. More recently, it has turned its attention to the smartphone and tablet markets, and its new $20 million round will be used to support that push.
A few more potential upsides for Validity? In May, the company nabbed Sebastian Taveau, the former head of PayPal’s mobile ecosystem, making him CTO.
Secondly, in July, Apple bought its largest competitor, AuthenTec, for $356 million. Among other things, AuthenTec is known for making fingerprint sensor chips that are embedded in computing devices to enhance security and identification — sounds familiar, right? Apple’s acquisition came about a month after AuthenTec had signed a deal with Samsung to become its security and device management partner for its Android devices.
By pushing more aggressively into the mobile space and bringing on capital from strategic, mobile and software investors, Validity is hoping for a comparable outcome.
Question by heartless: A site or book to learn robotics? Or, to be specific, how to put sensors in hardware? I don’t have any idea how hardware works, but I need to learn and study now because of an AI project. Is there a good site or book on the basics of robotics, or on hardware applications like connecting sensors and making them work — like a color-detecting sensor, or a microphone for sound detection?
thanks for any help, ^^
Answer by gc-p: Look in here, which tells you about some kinds of sensors.
What do you think? Answer below!
While filmmakers have gone gaga over huge-sensor’d video cameras, there’s still a need for smaller chips and pro features — to that end, Panasonic has just announced the AG-AC90 AVCCAM. Destined for event and corporate users, it features three smallish 1/4.7-inch CMOS sensors (“3MOS” in company-speak), a 12X zoom, native 1,920 x 1,080 at 60p, 60i, 30p and 24p, and a five-axis image stabilizer. As for video quality, there’s a new “premium professional” recording mode with 28 Mbps throughput at 60p, on top of 24 Mbps and 17 Mbps modes. With two memory card slots, the camcorder supports Panasonic’s proprietary UHS-1 cards, and fortunately works with SDXC and SDHC to boot. It’s slated to arrive in “late fall 2012,” according to the company, and will ring the register at $2,250. So, if the first thing that pops into your mind is not DOF, but zebras, timecode and XLR inputs, check the PR for all those specs.
Filed under: Digital Cameras
Starting next month, around one thousand frontline personnel in Afghanistan will begin testing the Soldier Body Unit, a sensor kit for recording the effects of explosions on the human body. While that’s not the most pleasant of subjects, the blast sensors have been rushed out to collect as much data as possible before soldiers head home in 2014. The US Army’s Rapid Equipping Force and the Georgia Tech Research Institute, which developed the sensors, hope to gather info on concussions and traumatic brain injuries to improve aftercare. The data will also be used in the field to stop would-be super-soldiers from heading back out after a concussion and risking an even worse injury. Further sensors will be carried on military vehicles, to help measure the effects of IED blasts on passengers. Adding two pounds in extra equipment probably won’t make the Soldier Body Unit too popular, but it’s thought the kit could weigh in at half a pound once it’s been refined.
Filed under: Misc. Gadgets
Question by : Will the Kinect sensors be confused by my penis if I play in the nude? I don’t want to lose a game because my penis flopped around too much and the wrong actions were performed.
Answer by Matt: Haha, I automatically love you!
Add your own answer in the comments!