Thursday, May 14, 2009

iUnika Gyy netbook weighs 1.5 pounds, will cost $176

Source: http://www.engadget.com/2009/05/14/iunika-gyy-netbook-weighs-1-5-pounds-will-cost-176/

Hey, remember the $199 Impulse TNX-9500, the "world's cheapest laptop"? Yeah, it was just the beginning. Say hello to the iUnika Gyy, which manages to shave its price down to €130 ($176) by using a slower 400MHz MIPS processor and ditching that costly XP license for Linux. Yeah, it'll run like a dog. On the other hand, just like the Impulse, there's something delightfully appealing about an el-cheapo laptop that weighs just 1.5 pounds, and if the company manages to produce its promised €160 ($220) solar-powered version, we could totally find ourselves picking one up on a whim. We'll see -- it's due in July.

[Via Engadget Spanish; images courtesy of hoyTecnología]

ASUS Eee PC 1008HA 'Seashell' review roundup

Source: http://www.engadget.com/2009/05/14/asus-eee-pc-1008ha-seashell-review-roundup/


For those who hold ASUS' Eee PC netbook line near and dear, the 1008HA 'Seashell' is definitely a breath of fresh air. It doesn't look like an Eee, it doesn't feel like an Eee and it doesn't boast a replaceable battery like an Eee; needless to say, only two of those three facts were lauded by reviewers across the web. Much like Apple's MacBook Air, the battery in this here machine is not user-serviceable, and while tests proved that it could last well over three hours with "normal" use, ASUS has yet to make clear what plans it has for offering replacements. In any case, nearly everything else about the machine was found to be on par with expectations or above, with performance satisfactory for basic tasks and the keyboard / trackpad exceptionally yummy. Still, it feels as if ASUS is charging a bit much for a familiar lineup of internals, but those willing to pay for style should definitely take a closer look.

Read - Trusted Reviews ("a very refined and classy netbook")
Read - T3 ("a good all-round package")
Read - CNET UK ("great styling and a relatively light chassis")
Read - Bit-Tech ("definitely worth considering, but looks come at a cost")
Read - WhatLaptop ("a compelling proposition")
Read - PCPro ("If you don't mind paying a premium for fine design, then the Seashell is a tantalizing prospect")

Wednesday, May 13, 2009

Task.fm Turns Natural Language Commands into Future Reminders [Task Management]

Source: http://feeds.gawker.com/~r/lifehacker/full/~3/sTQclCaJZmM/taskfm-turns-natural-language-commands-into-future-reminders

Task.fm is a simple web application that turns your natural language commands into email and SMS reminders.

Task.fm takes commands like "meet with Jim tomorrow" or "Replace fish tank filter in 21 days" and converts them into future reminders, delivered by email or SMS. The email reminders are free, but the SMS reminders require credits with the service—100 messages cost $8. The fee structure isn't outrageous, but we're in agreement that nothing sounds as good as free.
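To give a feel for how this style of parsing works, here's a toy Python sketch of the general technique (not Task.fm's actual engine; the pattern table and function names are ours): a few regular expressions pull the time phrase out of a command, and a datetime offset turns it into a fire time.

```python
import re
from datetime import datetime, timedelta

# Toy natural-language scheduler: each pattern maps a time phrase to an offset.
PATTERNS = [
    (re.compile(r"\btomorrow\b", re.I),        lambda m: timedelta(days=1)),
    (re.compile(r"\bin (\d+) days?\b", re.I),  lambda m: timedelta(days=int(m.group(1)))),
    (re.compile(r"\bin (\d+) hours?\b", re.I), lambda m: timedelta(hours=int(m.group(1)))),
]

def parse_reminder(text, now=None):
    now = now or datetime.now()
    for pattern, offset in PATTERNS:
        match = pattern.search(text)
        if match:
            # Strip the time phrase; what's left is the task description.
            task = (text[:match.start()] + text[match.end():]).strip()
            return task, now + offset(match)
    return text, None  # no recognizable time phrase; leave unscheduled

print(parse_reminder("Replace fish tank filter in 21 days"))  # ('Replace fish tank filter', <21 days out>)
print(parse_reminder("meet with Jim tomorrow"))               # ('meet with Jim', <tomorrow>)
```

Anything the pattern table doesn't cover falls through unscheduled, which is exactly the kind of gap described below.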

The language engine does have some shortcomings, as well. It doesn't parse commands like "every other" or "next Monday," which makes it less convenient for creating repeating reminders. Regardless of those language hiccups, Task.fm accepted the majority of our test reminders without a problem. If you have a favored service for generating email or SMS-based reminders, sound off in the comments below.

Cheap DIY Wi-Fi Tethering Dongle for Your DSLR [DIY]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/Nj4qBdqKnhM/cheap-diy-wi+fi-tethering-dongle-for-your-dslr

If you can't afford an $800 wireless transmitter for your camera but need a way to quickly transmit photos from your DSLR to your computer, here's a DIY wireless tethering solution that costs under $40.

Using a wireless USB tether—specifically, a Cables Unlimited Wireless Adapter Kit—Peter Tsai, a photography professor, created an easy and cheap tethering module that reportedly worked seamlessly with his Nikon DSLR. Apparently, it could even transfer photos wirelessly from his camera to his computer quicker than Nikon's official $800 WT-4a transmitter: although the dongle took slightly longer to sync with his computer, once connected it was reportedly able to transfer RAW photos in eight seconds and JPEGs in four. Tsai also said you could use Nikon's Camera Control Pro 2 software to remotely control the camera from your computer.

However, Tsai pointed out that this particular hack only works with PCs, and that the wireless adapter kit needed a bulky AC power brick to run. He managed to solder on a 4-AAA battery pack instead, but says he's still looking for a better fix, and hopes to build an enclosure for the homemade adapter that doubles as a camera handgrip. [PeteTek via Wired]

Giz Explains: GPGPU Computing, and Why It'll Melt Your Face Off [Giz Explains]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/QcfmVfYBayQ/giz-explains-gpgpu-computing-and-why-itll-melt-your-face-off

No, I didn't stutter: GPGPU—general-purpose computing on graphics processing units—is what's going to bring hot screaming gaming GPUs to the mainstream, with Windows 7 and Snow Leopard. Finally, everybody's face melts! Here's how.

What a Difference a Letter Makes
GPU sounds—and looks—a lot like CPU, but they're pretty different, and not just 'cause dedicated GPUs like the Radeon HD 4870 can be massive. GPU stands for graphics processing unit, while CPU stands for central processing unit. Spelled out, you can already see the big differences between the two, but it takes some experts from Nvidia and AMD/ATI to get to the heart of what makes them so distinct.

Traditionally, a GPU does basically one thing: speed up the processing of image data that you end up seeing on your screen. As AMD Stream Computing Director Patricia Harrell told me, they're essentially chains of special-purpose hardware designed to accelerate each stage of the geometry pipeline, the process of matching image data or a computer model to the pixels on your screen.

GPUs have a pretty long history—you could go all the way back to the Commodore Amiga, if you wanted to—but we're going to stick to the fairly recent past. That is, the last 10 years, when, as Nvidia's Sanford Russell tells it, GPUs started adding cores to distribute the workload. See, graphics calculations—the calculations needed to figure out what pixels to display on your screen as you snipe someone's head off in Team Fortress 2—are particularly suited to being handled in parallel.

An example Russell gave to illustrate the difference between a traditional CPU and a GPU is this: If you were looking for a word in a book and handed the task to a CPU, it would start at page 1 and read all the way through to the end, because it's a "serial" processor. Each page goes by quickly, but the whole job still takes time, because it has to go in order. A GPU, which is a "parallel" processor, "would tear [the book] into a thousand pieces" and read them all at the same time. Even if each individual word is read more slowly, the book can be finished sooner, because thousands of words are being read simultaneously.
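To make the book analogy concrete, here's a minimal Python sketch (synthetic "book," eight worker processes, and a chunking scheme of our own invention, not anything from Nvidia): the serial version scans front to back, while the parallel version hands each worker its own slice to scan simultaneously.

```python
from multiprocessing import Pool

def chunk_contains(args):
    word, chunk = args
    return word in chunk  # each "reader" scans only its own piece of the book

def main():
    book = ("lorem ipsum dolor sit amet " * 200000).split() + ["gigaflops"]
    target = "gigaflops"

    # CPU-style: one serial reader, page 1 to the end
    found_serial = any(word == target for word in book)

    # GPU-style: tear the book into pieces and read them all at once
    pieces = [book[i::8] for i in range(8)]
    with Pool(processes=8) as pool:
        found_parallel = any(pool.map(chunk_contains, [(target, p) for p in pieces]))

    print(found_serial, found_parallel)  # True True

if __name__ == "__main__":
    main()
```

Each worker is slower than one tight loop, but with enough pieces in flight the total wall-clock time drops; it's the same trade the GPU makes.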

All those cores in a GPU—800 stream processors in ATI's Radeon 4870—make it really good at performing the same calculation over and over on a whole bunch of data. (Hence a common GPU spec is flops, or floating point operations per second, measured in current hardware in terms of gigaflops and teraflops.) The general-purpose CPU is better at some stuff though, as AMD's Harrell said: general programming, accessing memory randomly, executing steps in order, everyday stuff. It's true, though, that CPUs are sprouting cores, looking more and more like GPUs in some respects, as retiring Intel Chairman Craig Barrett told me.
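As a rough sanity check on those numbers (the 750MHz core clock and two floating-point ops per stream processor per cycle are the commonly cited figures for the HD 4870, not specs quoted in this article), the 800 stream processors multiply out to the card's headline rating:

$$ 800\ \text{SPs} \times 750\,\text{MHz} \times 2\ \tfrac{\text{flops}}{\text{SP}\cdot\text{cycle}} = 1.2\ \text{teraflops} $$

That's why GPU spec sheets lean on flops: lots of modest cores, all multiplying and adding at once, add up fast.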

Explosions Are Cool, But Where's the General Part?
Okay, so the thing about parallel processing—using tons of cores to break stuff up and crunch it all at once—is that applications have to be programmed to take advantage of it. It's not easy, which is why Intel at this point hires more software engineers than hardware ones. So even if the hardware's there, you still need the software to get there, and it's a whole different kind of programming.

Which brings us to OpenCL (Open Computing Language) and, to a lesser extent, CUDA. They're frameworks that make it way easier to use graphics cards for kinds of computing that aren't related to making zombie guts fly in Left 4 Dead. OpenCL is the "open standard for parallel programming of heterogeneous systems" standardized by the Khronos Group—AMD, Apple, IBM, Intel, Nvidia, Samsung and a bunch of others are involved, so it's pretty much an industry-wide thing. In semi-English, it's a cross-platform standard for parallel programming across different kinds of hardware—using both CPU and GPU—that anyone can use for free. CUDA is Nvidia's own architecture for parallel programming on its graphics cards.
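For a taste of what that looks like in practice, here's a minimal OpenCL sketch driven from Python (assuming the PyOpenCL bindings and a working OpenCL driver; the kernel and variable names are our own): instead of looping over a million pixels one at a time, the kernel body runs once per pixel, in parallel, on whatever device the context picks up.

```python
import numpy as np
import pyopencl as cl  # assumes PyOpenCL plus an OpenCL driver are installed

ctx = cl.create_some_context()   # grab a GPU (or CPU) device
queue = cl.CommandQueue(ctx)

pixels = np.random.rand(1_000_000).astype(np.float32)

# The kernel is plain OpenCL C; its body executes once per work-item.
program = cl.Program(ctx, """
    __kernel void brighten(__global float *px) {
        int i = get_global_id(0);            // which pixel am I?
        px[i] = fmin(px[i] * 1.5f, 1.0f);    // brighten, clamp to 1.0
    }
""").build()

buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                hostbuf=pixels)
program.brighten(queue, pixels.shape, None, buf)  # launch one work-item per pixel
cl.enqueue_copy(queue, pixels, buf)               # read the results back
```

The host code is bookkeeping; the interesting part is that the kernel never loops, because the hardware supplies the parallelism.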

OpenCL is a big part of Snow Leopard, and Windows 7 will use some graphics card acceleration too (though we're really looking forward to DirectX 11). Either way, GPU acceleration is going to be a major piece of future OSes.

So Uh, What's It Going to Do for Me?
Parallel processing is pretty great for scientists. But what about regular people? Does it make their stuff go faster? Not everything, and to start, it's not going too far from graphics, since that's still the easiest to parallelize. But converting, decoding and creating videos—stuff you're probably doing more now than you did a couple years ago—will improve dramatically soon. Say bye-bye to 20-minute renders. Ditto for image editing; there'll be less waiting for effects to propagate with giant images (Photoshop CS4 already uses GPU acceleration). In gaming, beyond straight-up graphical improvements, physics engines can get more complicated and realistic.

If you're just Twittering or checking email, no, GPGPU computing is not going to melt your stone-cold face. But anyone with anything cool on their computer is going to feel the melt eventually.
