When they’re not breaking world records, fuel-hating Wave Glider seabots like to indulge in other hobbies, like shark tracking. One of the vessels has just been launched off the coast near San Francisco (vid after the break), adding a mobile worker to the existing local network of buoy-mounted receivers. These monitor the movements of electronically tagged sea life, including the fearsome Great White, picking up signals within a 1,000-foot range while researchers from Stanford University analyze the data from the safety of the shore. Better still, the free Shark Net iOS app gives anyone the chance to track these things, and activity should increase as the monitoring network (hopefully) expands along the west coast and more bots are introduced. You didn’t think the world’s fascination with sharks was limited to a single week, did you?
Kinect’s depth image is processed to obtain a 3D point cloud of the scene, which is then used to follow the object as it moves (in this case, a moving box). The calculated transformation is sent to a CAD visualization tool, where a virtual object performs the same movements as the real one.
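The pipeline described here (back-project the depth image into a point cloud, then fit a rigid transform between successive clouds) can be sketched roughly as below. The camera intrinsics and the Kabsch-style fit are illustrative assumptions, not the researchers' actual code:

```python
import numpy as np

# Hypothetical intrinsics for a Kinect-style depth camera (assumed values).
FX, FY = 525.0, 525.0   # focal lengths in pixels
CX, CY = 319.5, 239.5   # principal point

def depth_to_point_cloud(depth):
    """Back-project a depth image (meters) into an Nx3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

def rigid_transform(src, dst):
    """Least-squares R, t with dst_i ~= R @ src_i + t (Kabsch algorithm)."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

Feeding `rigid_transform` the box's points from two consecutive frames would yield the rotation and translation that a CAD tool could then apply to the virtual model.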
IC Realtime systems are renowned for ease of access across the complete range of ICT media. This gives our customers truly global reach and a multiplicity of choices through which both audio and visual access may be gained to a specific networked camera location. Access to these systems is rigorously controlled through log-in and user-password protocols. Our advanced video surveillance systems communicate with both Mac and PC platforms. Once networked, the systems can be accessed through our software via a standard web browser, mobile phones (including iPhone, Android, Symbian, Nokia and BlackBerry handsets), iPads, PCs and Android tablets. IC Realtime systems are also programmed to communicate with licensed alarm receiving centres through the Sentinel Plus software programme. Our systems are designed to give customers real choice and flexibility wherever they are in the world, whether at home, at work, on vacation or travelling.
Wind Maps is a website that lets you see current (or past) wind patterns in the U.S. This picture doesn’t really do it justice, because on the website all those little lines are moving in spirals and being all trippy. It’s something you could definitely get high and watch if you were desperate and forgot there was any other site on the internet. Alternatively, if you just want to know which way the wind’s blowing, here’s what you do: spit (NOT shit). If it comes back and hits you, the wind’s blowing towards you. If the wind breaks right behind you, you farted. If somebody says something, it probably wasn’t silent. That or it smells. Blame someone else and make a quick exit (I recommend a cape flourish/smoke bomb combo).
Wind Maps (click a day or ‘view the live map’ to see one in action) via A mesmerizing, real-time map of US wind patterns [io9]
Thanks to Jaucet and blitz, who agree you don’t need a weatherman to know which way the wind blows (just throw a handful of torn grass).
With real-time translation of text common on the web and instantaneous speech-to-text gaining popularity, it seems that transliteration is cool again. But less obvious, and more difficult, methods of input are yet to be implemented. Case in point: sign language. The complicated and often contextual gestures form a vast visual vocabulary that isn’t easily captured or interpreted.
A team of British researchers, however, is making the attempt, creating a tool that translates a set of standard signs into readable text in real time. It’s called the Portable Sign Language Translator, and it should be out next year.
The signer would gesture as normal towards a camera on a phone or PC, and it would instantly translate based on a database of signs. Right now they are planning to support British Sign Language, but the system is perfectly capable of handling ASL, Makaton, and international languages and alphabets.
It is possible, however, that the static set of known symbols may still be limiting to signers, so the app will also allow the user to create their own signs for more complicated or personal objects.
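The lookup the translator performs, matching a gesture against a database of known signs and falling back to user-defined ones, might look something like the sketch below. The feature vectors, sign names, and distance threshold are all invented for illustration; the real system's representation is not described:

```python
import math

# Hypothetical sign database: each sign maps to a feature vector that some
# gesture tracker would extract from camera frames (values are made up).
SIGN_DB = {
    "hello":     [0.9, 0.1, 0.0, 0.4],
    "thank you": [0.2, 0.8, 0.3, 0.1],
    "help":      [0.1, 0.2, 0.9, 0.7],
}

def add_custom_sign(name, features):
    """Let a signer register a personal sign, as the article describes."""
    SIGN_DB[name] = features

def translate(features, threshold=0.5):
    """Return the closest known sign, or None if nothing is close enough."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(SIGN_DB, key=lambda name: dist(SIGN_DB[name], features))
    return best if dist(SIGN_DB[best], features) <= threshold else None
```

The threshold keeps the matcher from forcing every gesture into the nearest sign, which is roughly what you'd want before letting users define their own.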
The obvious application is day-to-day communication between someone who cannot speak and someone who cannot understand sign language. But a visual, gestural language could be useful in other situations as well, and not just to people with disabilities. Multimodal communication is becoming the standard for interacting with our technology; heretofore we have communicated largely through inorganic tools, so to speak, such as the mouse and keyboard. Directly interacting with a machine that understands our voice, gestures, and position is going to produce extremely rich interaction methods in the future.
In the meantime, the app is being developed by Technabling, a company spun off from the University of Aberdeen. They plan to release it as a product next year, though there is no word on platforms or price. It is being funded by the UK’s Department for Business, Innovation and Skills and the Small Business Research Initiative.
Constantly feeling stressed out but with no way to prove or quantify it? Then this mini stress meter, developed by Professor Nitta of Tokyo Metropolitan University, might do the trick for you. The device is basically a pulse-wave sensor and a modified computer mouse rolled into one.
The way it works: the user places a finger on the mouse for 10 seconds, the meter measures blood flow in the fingertip, and the system analyzes the variation in the pulse wave. The stress level is displayed on a computer screen in real time, on a four-level scale.
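Professor Nitta's actual algorithm isn't published here, but a common proxy for this kind of pulse-wave variability analysis is RMSSD over inter-beat intervals, with lower variability typically read as higher stress. A hypothetical sketch, including an invented mapping to the four-level scale:

```python
import math

def rmssd(ibi_ms):
    """Root mean square of successive differences between inter-beat intervals (ms)."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_level(ibi_ms):
    """Map pulse variability onto a four-level scale (thresholds are invented,
    not Nitta's). Lower RMSSD = less variability = higher assumed stress."""
    r = rmssd(ibi_ms)
    if r >= 50:
        return 1   # relaxed
    if r >= 30:
        return 2
    if r >= 15:
        return 3
    return 4       # highly stressed
```

Ten seconds of fingertip readings yields only a handful of beats, which is presumably why the quoted correlation with blood hormone levels is around 70% rather than higher.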
Regarding the definition of the term “stress”, professor Nitta explains:
You may wonder if what we’re measuring is really stress. In this regard, the clearest indicator of stress is the amount of hormones in the blood. Data from such blood analysis has about a 70% correlation with the results of our software. So it’s probably fair to interpret this measurement as an indicator of stress, like a blood test.
This video, shot by Diginfo TV (in English), provides more insight:
Via Japan Probe
Lovers often split up and then get back together. According to Mashable, however, the tiff between Google and Twitter over Realtime Search is taking on a cold air of finality — even though it seemed kinda temporary at the time. The Big G just reiterated plans to restore its social networking search function, based on Google+ and “other sources,” but it made no mention of its former sweetheart. Oh well, a wise person once told us that when it comes to relationships, you should never press rewind.
I am extremely unmusical (on the verge of being tone-deaf), so I can’t decide whether this new iPhone app is good or bad: Japan-based musical instrument maker Kawai has developed a camera app that scans sheet music printed on paper and plays it back in real time. Dubbed Gakufu Camera [JP], the app is said to be the first of its kind.
Kawai claims the app also works with handwritten notes, notes printed in different colors, and sheets viewed under weak lighting. Gakufu Camera also offers a few other bells and whistles, for example a function that lets you store scanned notes first and play the melody back later.
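Kawai hasn't published how playback works, but once the notes are recognized off the page, the job reduces to mapping each note to a pitch and a duration. A sketch assuming equal temperament (A4 = 440 Hz) and a simple beat clock; none of this is Kawai's actual engine:

```python
NOTE_INDEX = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
              "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

def note_to_freq(name, octave):
    """Equal-temperament frequency in Hz, tuned to A4 = 440 Hz."""
    semitones_from_a4 = NOTE_INDEX[name] - NOTE_INDEX["A"] + 12 * (octave - 4)
    return 440.0 * 2 ** (semitones_from_a4 / 12)

def schedule(notes, bpm=120):
    """Turn (name, octave, beats) tuples into (start_sec, freq_hz, dur_sec) events."""
    sec_per_beat = 60.0 / bpm
    t, events = 0.0, []
    for name, octave, beats in notes:
        dur = beats * sec_per_beat
        events.append((t, note_to_freq(name, octave), dur))
        t += dur
    return events
```

An event list like this could then be fed to any synthesizer for real-time playback, which is roughly what "scan and play" has to boil down to.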
Gakufu Camera is only available in the Japanese App Store at the moment, for iOS 4.0 and up (price: 350 yen/$4.50). But as Kawai is a global company and the app is already available in Japanese and English, expect it to hit other markets sooner rather than later (we’ll keep you posted).
This video shows the app in action (explanations in Japanese, but music fans will get it, I think):
Via Asiajin via IT Media [JP]
So, here’s the skinny: when SceneTap launches in a month or so, it’ll provide Android and iOS users with a frightening amount of analysis before they hit the town. As the story goes, the startup will be tapping into an infrastructure of cameras spread across an untold quantity of bars. The goal? To provide a real-time snapshot of the demographics at any location on any given night. According to the company, demographic information, social commentary and “other comprehensive features” will be shown, all of which will help people decide where they’d like to go. Privacy freaks will (hopefully) be comforted by the fact that no actual recording is going on and each person is tracked anonymously. Hailed as a “new type of social network,” SceneTap will initially cover 50 clubs, and of course, there’s no DUI checkpoint feature for those hoping to do something as impractical as drink and drive afterwards. Head on past the break for the rest of the deets, and be sure to ping the company if you’re hoping for a Snooki Sighting push alert in version 2.0.
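The privacy claim suggests an aggregate-only design: the camera pipeline would emit only a coarse demographic bucket per person, and the server would keep counts per venue rather than images or identities. This is a guess at the architecture, not SceneTap's actual system:

```python
from collections import Counter

class VenueSnapshot:
    """Aggregate-only tally for one venue: stores counts per coarse demographic
    bucket, never an image or identity (an assumed design, not SceneTap's)."""

    def __init__(self):
        self.counts = Counter()

    def person_entered(self, bucket):
        """bucket is a coarse label, e.g. 'male_20s' (labels invented here)."""
        self.counts[bucket] += 1

    def person_left(self, bucket):
        if self.counts[bucket] > 0:
            self.counts[bucket] -= 1

    def snapshot(self):
        """The real-time view an app client would fetch: totals only."""
        return {"total": sum(self.counts.values()), "mix": dict(self.counts)}
```

Since only counters ever leave the venue, there is nothing to "record" in the first place, which is presumably the point of the anonymity pitch.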