Friday, January 16, 2009

Intel's Barrett on Paranoia, the Core Craze and the End of Gigahertz [Interview]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/58xQMcXjtn0/intels-barrett-on-paranoia-the-core-craze-and-the-end-of-gigahertz

At first, Intel chairman Craig Barrett struck me as a testy old dude.

This would be fair, considering his company was about to announce a sudden 90% plunge in profits. So it's understandable that, when I asked him about Nvidia's recent coup, getting Apple to swap out Intel's chipset for the GeForce 9400M, he said with more than a hint of disdain, "You're obviously a Mac user." Here's a guy who is used to making judgments, and making them quickly.

But when I told him I also built my desktop with an Intel Core 2 Duo Wolfdale chip, he reversed his decision. Laughing, he said, "You're alright for a kid that wears black Keds." This wasn't his first reference to my sneakers—they were Adidas, actually—and it wasn't his last either.

At 69, he is definitely one of the oldest guys running a powerhouse innovation company like Intel, and when he's sitting there in front of you, he conveys an attitude that he's seen it all. He hung up his lab coat for a tailored suit long ago, but talking to him, you can still tell that his degree from Stanford isn't some MBA, but a PhD in materials science. Nerdspeak flows easily out of his mouth, and he closes his eyes while calmly making a point, like a college professor. At the same time, you get a sense of the agitation within. After all, he'll be the first to tell you that in business, he still lives by the mantra of his Intel CEO predecessor Andy Grove: "Only the paranoid survive."

In the end, I really liked the guy. He's tough but fair, like an Old Testament king. Here are excerpts from our conversation, chip guru to chip fanboy, about vanquishing your competition, the limitations of clock speed, the continuing rage of the multi-core race and how to stay paranoid in your golden years.

What's the endgame of the multi-core arms race? Is there one?
If everything works well, they continue to get Moore's Law from a compute power standpoint. [But] you need software solutions to go hand-in-hand with the hardware solutions...There's a whole software paradigm shift that has to happen.

How involved is Intel in the software side of making that happen?
Probably the best measure is that if you look at the people we hire each year, we still hire more software engineers than hardware engineers.

Where do you see Larrabee, Intel's in-development, dedicated high-end GPU, taking you?
The fundamental issue is that performance has to come from something other than gigahertz... We've gotten to the limit we can, so you've got to do something else, which is multiple cores, and then it's either just partitioning solutions between cores of the same type or partitioning solutions between heterogeneous cores on the same chip.

You see, everybody's kind of looking at the same thing, which is, 'How do I mix and match a CPU- and a GPU-type core, or six of these and two of those, and how do you have the software solution to go hand-in-hand?'

So what do you think of the competition coming from Nvidia lately?
At least someone is making very verbal comments about the competition anyway.

Do you see Nvidia as more of a competitor than AMD? How do you see the competitive landscape now?
We still operate under the Andy Grove scenario that only the paranoid survive, so we tend to be paranoid about where competition comes from any direction. If you look at the Intel history, our major competitor over the years has been everybody from IBM to NEC to Sun to AMD to you-name-it. So the competition continually changes, just as the flavor of technology changes.

As visualization becomes more important—and visualization is key to what you and consumers want—then is it the CPU that's important, or the GPU, or what combination of the two and how do you get the best visualization? The competitive landscape changes daily. Nvidia is obviously more of a competitor today than they were five years ago. AMD is still a competitor.

Would you say the same competitive philosophy applies to the mobile space?
Two different areas, obviously. The netbook is really kind of a slimmed down laptop. The Atom processor takes us in that space nicely from a power/performance standpoint. Atom allows you to go down farther in this kind of fuzzy area in between netbooks, MIDs [mobile internet devices] and smartphones. The question there is, 'What does the consumer want?'

The issue is, 'What is the ultimate device in that space?' ...Is it gonna be an extension of the internet coming down, or is there gonna be an upgrowth of the cellphone coming up?

Are you planning on playing more directly in phones, then?
Those MIDs look more and more like smartphones to me...All they need to do is shrink down a little bit and they're a damn good smartphone. They have the capability of being a full-internet-functionality smartphone as opposed to an ARM-based one—maybe it looks like the internet you're used to or, maybe it doesn't.

Intel and Microsoft "won" the PC Revolution. There's a computer on basically every office desk in the country. What's beyond that? Mobile, developing countries?
Well, it's a combination. There's an overriding trend toward mobility for convenience. We can shrink the capability down to put it in a mobile form factor, and the cost is not that much more than a desktop, point one. Point two, if you go to the emerging economies where you think that mobile might be lacking, really the only way to get good broadband connectivity in most of the emerging markets is not with wired connectivity or fixed point connectivity, it's gonna be broadband wireless and that facilitates mobile in emerging markets as well.

So where does that take Intel going in the next five years?
It's pushing things like broadband wireless, WiMax...It's broadband wireless capability, that's the connectivity part. It's mobility with more compute power and lower energy consumption to facilitate battery life and all that good stuff. And it's better graphics. That's kind of Larrabee and that whole push.

You've passed AMD on every CPU innovation that it had before you did, such as on-die memory controllers, focus on performance per watt, etc. How do you plan to stay ahead?
The basic way you stay ahead is that you have to set yourself with aggressive expectations. There's nothing in life that comes free. You're successful when you set your expectations high enough to beat the competition. And I think the best thing that we have going for us is...the Moore's Law deal.

As long as we basically don't lose sight of that, and continue to push all of our roadmaps, all of our product plans and such to follow along Gordon's law, then we have the opportunity to stay ahead. That doubling every 18 months or so is the sort of expectation level you have to set for yourself to be successful.
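The "doubling every 18 months or so" cadence Barrett sets as the bar compounds startlingly fast. A back-of-envelope sketch (illustrative figures only, not Intel roadmap data) shows what that expectation implies over the 40-year run he mentions:

```python
# Moore's Law as "double every 18 months" -- a rough growth-rate model,
# not actual transistor-count data.
def moores_law_factor(years, doubling_months=18):
    """Growth multiple after `years` at one doubling per `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

# One 18-month period is exactly one doubling.
print(moores_law_factor(1.5))   # 2.0

# Over the ~40 years Barrett cites, the multiple exceeds 100 million.
print(f"{moores_law_factor(40):.3g}")
```

That nine-order-of-magnitude compounding is why he frames the roadmap as an expectation to be defended rather than a trend to be observed.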

Would you consider that the guiding philosophy, the banner on the wall?
That's the roadmap! That is the roadmap we have. If you dissect a bit, you tend to find that the older you get, the more conservative you get typically and you kinda start to worry about Moore's Law not happening. But if you bring the bright young talent and say, 'Hey, bright young talent, we old guys made Moore's Law happen for 40 years, don't screw it up,' they're smart enough to figure it out.




Workflow Charts Finally Put to Good Use Show Fundamental Men vs Women Differences [At The Office]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/9baGVMytD24/workflow-charts-finally-put-to-good-use-show-fundamental-men-vs-women-differences

It's Friday. You are at the office, watching another PowerPoint by Jimmy, the product development bozo. "Stupid Jimmy," you think, "these are the only two workflow charts we need after this long work week." UPDATED

Now my question is: Is it really this way? Because one of the most delightful, smartest, and sexiest women I've ever met was drunk after three hours in a bar drinking beer with orange slices in it, while that night I was drunk with cocktails and my drinks matched my shoes. And I pee sitting down. At least at home, because it's more comfy, but that's another story.

Women. Men. TS/TVs. Speak. [Thanks Oscar]

Update: It gets better.




Apple To Use Nvidia's Atom-Powered Ion Platform For Something: Mac Mini or Apple TV? [Rumor]

Source: http://feeds.gawker.com/~r/gizmodo/full/~3/lgn0Ecg7pAw/apple-to-use-nvidias-atom+powered-ion-platform-for-something-mac-mini-or-apple-tv

Tom's Hardware says it's for the sad and neglected Mac Mini. Apple Insider says it could make more sense inside a revamped Apple TV. Either way, a dual-core Atom with Nvidia's 9400M sounds nice.

Tom's Hardware is pointing to an Nvidia source that confirmed Apple was the first to receive Ion test units, and said that Apple most certainly had an Ion-powered Mac Mini in the pipe. Apple Insider is more inclined to believe it's for the Apple TV, since a move to Atom would be a step up from its aging 1GHz Intel Crofton proc.

Both products make the most sense for a low-power, low-cost processor, but a dual-core Atom 330 running at 1.6GHz would certainly be a step down from the Mini's current Core 2 Duo at 1.83 and 2GHz, so I would place my chips in the Apple TV stack. However, the Nvidia 9400M would bring a nice boost to the Mini's paltry HD video capabilities, and OS X config file snooping has recently turned up evidence for a 9400M-powered Mini (though that doesn't mean it will also use an Atom processor).

Or maybe they'll simply combine these two fairly confused product lines into one diminutive full OS X machine (please) that's perfect for the living room. Tom's source says the new Ion-powered product will hit around March. We'll see about that. [Tom's Hardware, Apple Insider]




Sanyo's PLC-XF71 projector packs 10,000 lumens for extreme brightness

Source: http://www.engadget.com/2009/01/16/sanyos-plc-xf71-projector-packs-10-000-lumens-of-extreme-bright/

While not high definition like the company's most recent projectors, Sanyo's PLC-XF71 manages to compensate with 10,000 lumens of serious brightness. By comparison, the recently unveiled PDG-DHT100JL sports 6,500 lumens and the sub-$2,000 PLV-1080HD just 1,200. Beyond that, it's got a 1024 x 768 picture and a 3000:1 contrast ratio. If you don't mind trading resolution for intense luminance, look for it to show up this month for a papered Abe Lincoln under $17,000.


Sanyo's PLC-XF71 projector packs 10,000 lumens for extreme brightness originally appeared on Engadget on Fri, 16 Jan 2009 09:39:00 EST. Please see our terms for use of feeds.



Panasonic's Lumix DMC-FX150 reviewed, perfect for higher-end casual photographers

Source: http://www.engadget.com/2009/01/16/panasonics-lumix-dmc-fx150-reviewed-perfect-for-higher-end-cas/

If you've been waiting for SLR quality pics out of a camera you can slip into your pocket (and we're not talking cargo pants here), Panasonic's 14 megapixel Lumix DMC-FX150 is sadly not your product. However, if you've been looking for something that can take shots approaching the quality of something like a Canon G10 but do so in a more slender form factor, keep reading. PhotographyBLOG's review of this higher-end of the point 'n shoot range finds it to be quite good, capturing great images in bright light with very few chromatic aberrations. However, darker shots (bane of the pocket cam market) are still somewhat problematic, as the built-in optical IS fails to keep images sharp and noise appears at ISO 800 and above. Despite those annoyances the $399 camera (yours for about $100 less if you don't mind bargain hunting) scored overall high marks, becoming one of the best quality shooters you can buy and have a hope of fitting in your skinny jeans.


Panasonic's Lumix DMC-FX150 reviewed, perfect for higher-end casual photographers originally appeared on Engadget on Fri, 16 Jan 2009 10:07:00 EST. Please see our terms for use of feeds.

