
16.10.12

Apple May Split Ties With Samsung Sooner Rather Than Later

The latest news on the growing Apple/Samsung split confirms that the two companies are pulling away from each other. Reports now indicate that the A6, Apple's first custom CPU design, was built entirely without input from Samsung. Up to this point, the two manufacturers have been vital partners, even as they attacked each other in courtrooms worldwide.

The Korea Times reports that the relationship between the two has become one-dimensional, with Samsung relegated strictly to manufacturing the A6. That might seem like a small difference, but it's a portent of major changes to come. Typically, a foundry like TSMC or GlobalFoundries employs its own engineering teams that work directly with a customer to implement a processor design. This theoretically provides some of the same benefits an IDM (Integrated Device Manufacturer) would enjoy; the engineering teams are foundry employees, but they have a vested interest in working with particular customers rather than securing business for the foundry as a whole.

Cutting Samsung's experts out of the design process may have made Apple's job a bit harder, but it limits the amount of information Samsung has on exactly how the chip was designed and what the process looked like. That particularly matters given that portions of the A6 were hand-optimized rather than being built on standard logic blocks. It's a sign of how little Apple trusts its biggest manufacturing partner and still more evidence that the two companies are going to break their business arrangement.



The Apple A6 core. The silver areas are hand-tuned for maximum performance.

An unnamed Samsung official has confirmed to The Korea Times that "Samsung’s agreement with Apple is limited to manufacturing the A6 processors. Apple did all the design and we are just producing the chips on a foundry basis... There are three kinds of chip clients. Some want us to handle everything from chip design, architecture and manufacturing. Some want us to just design and manufacture. Some want us to just make the chips. Apple is now the third type."

Changes like this have been forecast for some time and, as we've previously discussed, could highlight some major shakeups in the foundry business. Samsung is now openly manufacturing for other companies, including Nvidia, that have previously been exclusive TSMC customers. With Apple now sourcing its quad-core chips from the Taiwanese manufacturer, Samsung has to find other customers if it wants to continue as a player in the foundry space. Up until now, Samsung has used a wide variety of hardware in its mobile phones, but with Texas Instruments planning to exit the mobile business and Samsung's own fab space at risk of sitting idle, we may see the company put more effort into developing its own Exynos-branded chips.

Apple MacBook Air 13 (Ivy Bridge) vs Ultrabooks

So here it is, the mid-2012 refresh of Apple's trend-setting MacBook Air line. It's been four long years since Apple first introduced the MacBook Air to the world; at the time it was almost considered a luxury item, with a comparatively steep price tag versus other ultralight machines. That changed a couple of years later, and in the process of making the MacBook Air more affordable, Apple helped shift the mobile market away from chunky desktop replacements and toward thin-and-light designs that still pack competent computing power. In all likelihood, the MacBook Air inspired Intel's Ultrabook specification, created so that Windows users would have access to the same overall experience on their platform of choice.

The MacBook Air is the real McCoy, so to speak, and though technically not an Ultrabook, the newest models cross over to Intel's 3rd generation Ivy Bridge Core processor microarchitecture. In fact, Ivy Bridge is at the heart of Apple's mid-2012 refresh, bringing with it not only greater processing power, but also a graphics bump from Intel HD Graphics 3000 to Intel HD Graphics 4000, topped off with DX11 compatibility and improved power efficiency to boot. Simply put, Intel benefits by remaining platform-agnostic, so long as both platforms buy their weapons from the Santa Clara chip maker. Apple left the PowerPC architecture what seems like an eternity ago for x86, and it's obviously not looking back, at least on the desktop and mobile side of the house.


Ivy Bridge isn't the only new addition to the latest-generation MacBook Air line. System memory is doubled to 4GB (configurable up to 8GB), SuperSpeed USB 3.0 finally makes its debut, Apple upgraded the power connector to MagSafe 2, the FaceTime camera is now 720p, flash storage is supposedly twice as fast as the previous generation's, and Thunderbolt makes its inevitable appearance. None of these upgrades is particularly groundbreaking, though collectively, it's an enticing assortment of enhancements, all of which are packed into the same thin and light frame as before. But how does the new MacBook Air compete with the current crop of Ultrabooks, and is it enough to warrant an upgrade?

AMD A10 and A8 Trinity APU: Virgo Desktop Experience

We’ll be taking a somewhat different, two-tiered approach with our coverage of AMD’s new Trinity-based APUs for desktop systems today. AMD is lifting the veil on its new product line-up, along with graphics performance and power consumption data, but we can’t give you the full monty just yet, due to the multi-tiered launch approach AMD decided to take with these products. If you want to see how well AMD’s latest desktop APUs overclock, how their processor cores perform, or how they’re priced, you’re going to have to stop by again in a few days. For now, though, we’ve got graphics performance and power consumption characteristics to talk about, and we have some rather interesting side-by-side comparisons in store as well.

AMD Trinity APU Die Shot
We’ve already shown you what AMD’s Trinity-based APUs can do in their mobile form, but the desktop variants are somewhat different animals. Although they’re based on the same piece of silicon, Trinity-based APUs for desktop systems have much more power and thermal headroom to play with. As such, the chips are clocked much higher, in regard to both their CPU and GPU cores. In fact, one of the chips we’ll be showing you here today, the A10-5800K, can Turbo all the way up to 4.2GHz. Take a look at desktop Trinity’s main features and specifications below and take a few minutes to see how the A10’s GPU performs versus an Intel Core i3 in the video here. We’ll move on to more details and performance data on the pages ahead.

AMD Teams Up With Dataram To Give You 4GB RAMDisk For Free, 6GB If You Have Radeon Memory

When AMD launched its Radeon-branded memory last fall, it came as a bit of a surprise. In fact, in some ways it felt like AMD was trying anything to avoid keeping all of its eggs in one basket, given the company's obvious struggles these past few years. Some thought that launch would be the end of it, that we'd never hear of AMD memory again - but we've been proven wrong this morning.
In a blog post entitled "Give your PC a boost with AMD Radeon RAMDisk," AMD offers us a free version of Dataram's RAMDisk software that will allow you to utilize up to 4GB of your memory for the cause, or up to 6GB if you happen to own AMD's branded sticks. Why would you want a RAMDisk? To speed up processes that constantly read from and write to the disk. While SSD users might not see a major performance advantage, using a RAMDisk means that these reads and writes hit your RAM, not your SSD, potentially prolonging the drive's life.
AMD notes these perks when using a RAMDisk:
  • Get gaming in as little as 4 seconds, or up to 1,700% faster than from an HDD
  • Get significantly faster reading and writing speeds vs HDDs and SSDs
  • RAMDisk read performance: Up to 25600 MB/s with DDR3-1600, and even higher with faster RAM!
  • Protect your data with load and save features
Admittedly, it's going to be difficult to see advantages with the first bullet point if you can only allocate 4GB or 6GB of RAM, and that's where the full-blown "AMD Radeon RAMDisk Xtreme" for $18.99 comes in; that version allows you to use up to 64GB of memory. Generally speaking, though, your RAMDisk shouldn't hog too high a percentage of your system memory if you want to actually use the rest as regular RAM, so to help you with that math, AMD lays out suggestions on its purchase page.
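Curious how much of a difference a RAMDisk makes in practice? A quick sequential-write benchmark shows it plainly. Below is a minimal sketch in Python, not anything from AMD's or Dataram's tooling; it assumes you've already created a RAMDisk and assigned it the drive letter R:, with C: as your regular system drive, and the file paths are placeholders.

import os
import time

def write_speed(path, size_mb=256):
    """Time a sequential write of size_mb megabytes to path; returns MB/s."""
    data = os.urandom(1024 * 1024)  # 1MB of random bytes
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(data)
        f.flush()
        os.fsync(f.fileno())  # make sure the data actually hits the device
    elapsed = time.perf_counter() - start
    return size_mb / elapsed

# R:\ is assumed to be your RAMDisk's drive letter; C:\ is the system disk.
print("RAMDisk: %.0f MB/s" % write_speed(r"R:\bench.tmp"))
print("HDD/SSD: %.0f MB/s" % write_speed(r"C:\bench.tmp"))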
I admit that while I've long considered using a RAMDisk day-to-day, I've only ever dabbled with it briefly. Do you use a RAMDisk? If so, would you consider it to be genuinely useful - perhaps even something you wouldn't want to go without?

Motorola DROID RAZR M Smartphone Review

It seems like ages ago that the world first heard the now-unmistakable "DROID" sound bite. DROID was originally billed as the anti-iPhone, pointing out things that Apple's iOS platform couldn't do while proclaiming proudly that "DROID does". The campaign seems to have worked; the brand has taken off, proving to be a rousing success for Verizon and Motorola. In fact, many people (incorrectly) refer to Android as just "DROID," similar to how many people refer to iOS as "iPhone."



Over the years, Motorola has heavily leveraged the DROID brand. The relaunch of the RAZR was also a monumental decision. Just under a year ago, we reviewed the first DROID RAZR, which shipped with Android 2.3.5, a Kevlar back, and pretty impressive specifications for the time. Since then, the DROID RAZR Maxx has launched (with the Maxx HD supplanting it earlier this year). Yet, it seems there's room for a few more models in the RAZR family.

Enter the RAZR M, which is billed as a mainstream variant aimed at those who walk into a Verizon Wireless store looking to spend less than $100 on a new phone. Here's a quick guided tour of the device in action that we put together for you...

Raspberry Pi Has More Memory for the Same Price

The tiny Raspberry Pi mini PC got a major performance boost today, thanks to a RAM upgrade. The super-inexpensive computer now has 512MB of RAM instead of 256MB – at the same $35 price. And with the initial shipping hiccups behind them, the Raspberry Pi Foundation and its distributors are proving to be pretty flexible: any Model B orders that haven’t been fulfilled are being automatically upgraded from the 256MB version to the new 512MB computer.


The Raspberry Pi Model B Linux-Based Mini Computer
The Raspberry Pi is a super-small computer. The Foundation gives priority to the system’s price over its performance, which means that it’s likely to remain affordable, but that upgrades are done slowly and with a lot of consideration. It also means that super-charged models are out of the question, at least for now. The computer comes without a case, though third-party cases are available.

So far, the Raspberry folks seem to be pretty on top of caring for their customers. Automatically upgrading orders is a nice move, and so is having a firmware upgrade available on the day those computers land on customer doorsteps.

13” MacBook Pro with Retina Display Could Debut Alongside iPad Mini

Apple wasn't fooling anyone when it neglected to give its 13-inch MacBook Pro model the same Retina Display facelift that the 15-inch model received. Let's face it, Apple loves staggered release schedules, partly to build up hype around a particular product or technology. We all know it's only a matter of time before the 13-inch model gets the same Retina Display upgrade as its larger 15-inch sibling, and if the latest reports are accurate, Apple will roll it out at the same time as the rumored iPad Mini.

That information comes from 9to5Mac, which in turn was tipped off by "a consistently reliable source at a high-profile U.S. retailer." According to the unnamed but apparently reliable source, the new 13-inch MacBook Pro with Retina Display will sport a thinner and lighter chassis, just like the 15-inch version that launched four months ago.

15" MacBook Pro
Apple is getting ready to introduce a 13" MacBook Pro with Retina Display, just like its existing 15" model shown above.

There will be two configurations to choose from, with varying processor and storage options. Both will cost more than the non-Retina Display versions, which currently go for $1,199 (2.5GHz dual-core Intel Core i5, 4GB 1600MHz memory, 500GB 5400 RPM hard drive) and $1,499 (2.9GHz Intel Core i7 processor, 8GB 1600MHz memory, 750GB 5400 RPM HDD). As an additional point of reference, the price difference between 15-inch MacBook Pro models with and without Retina Displays breaks down as follows:
  • $1,799: Non-Retina Display with 2.3GHz processor, 4GB 1600MHz memory, 500GB HDD, GeForce GT 650M GPU
  • $2,199: Non-Retina Display with 2.6GHz processor, 8GB 1600MHz memory, 750GB HDD, GeForce GT 650M GPU
  • $2,199: Retina Display with 2.3GHz processor, 8GB 1600MHz memory, 256GB SSD, GeForce GT 650M GPU
  • $2,799: Retina Display with 2.6GHz processor, 8GB 1600MHz memory, 512GB SSD, GeForce GT 650M GPU
It's all speculation at this point, but if we had to venture a guess, we'd say the 13-inch model with Retina Display will debut at around $1,499.

Nintendo Wii Bundle Drops to $130 Ahead of Wii U Launch, Includes a Second Game

With Nintendo getting ready to roll out its next generation Wii U console on November 18, 2012, the company decided now was a good time to cut the price of its existing Wii hardware, or at least to announce it. Effective October 28, you'll be able to snag a black Wii bundle for $130 (MSRP), down $20 from the previous price. To sweeten the pot, Nintendo found room on the included Wii Sports disc to also hold a second title, Wii Sports Resort.

"Nearly six years after it launched, people are still attracted to the pure, inclusive fun of the Wii console," said Scott Moffitt, Nintendo of America’s executive vice president of Sales & Marketing. "A new suggested retail price and the inclusion of two great games make it an easy choice for families looking for a great value this holiday season."


Wii Bundle

The new bundle will replace the existing $150 bundle, which includes the New Super Mario Bros. Wii game. Both the existing and upcoming bundles also include a single Wii Remote Plus controller with Nunchuk.

Nintendo's Wii U console will sell for $300 when it launches next month and will come with a GamePad, HDMI cable, Wii U sensor bar, and 8GB of storage. There will also be a "Deluxe Edition" that sells for $350 with 32GB of storage, a GamePad charging cradle, and stands for the GamePad and console.

Micron RealSSD P320h PCI Express SSD Review

Most of the PCIe SSD cards on the market today, with the exception of products from Fusion-io, still rely on SATA or SAS-based NAND controllers on the back end of the device to interface with the NAND array. PCIe cards from OCZ, Intel, LSI, and others use controllers from LSI SandForce or the like. Fusion-io was the first company to employ a true native PCI Express-to-NAND processor in its products, though Micron has also been cooking up its own native PCIe SSD technology for some time now.

Today we're looking at the Micron P320h, a PCI Express SSD that was introduced to the market well over a year ago and has actually been shipping to OEM customers for some time, but is just now hitting the market for general availability. Micron partnered with IDT, a veteran semiconductor manufacturer out of San Jose that specializes in high-speed serial switching and memory interface technology. A match made in high-bandwidth heaven, between a bellwether memory giant and a cutting-edge high-speed logic manufacturer? Perhaps. Read on as we find out.


iBuyPower Valkyrie CZ-17 Gaming Notebook Review

Given all the frenzy around ultrabooks, Microsoft's Surface, and the low power potential of future chips like Haswell, you might think that desktop replacement-class laptops had fallen out of favor with modern laptop manufacturers. iBuyPower's CZ-17 "Valkyrie" is proof that they haven't -- and while its size and weight won't appeal to road warriors or anyone who needs a svelte portable, it packs a number of significant features into its chassis, at a price that won't break the bank. It also makes a few compromises along the way; we'll show you where they are.

 
iBuyPower's CZ-17 Valkyrie
Specifications & Features
Processor Options: Intel Core i7-3610QM (2.3GHz w/ Turbo Boost to 3.3GHz, 6MB L3 cache)
Dimensions: Height: 2.2" / Width: 11.3" / Depth: 16.9"
Weight: 6.9lbs (with 9-cell battery)
Display: 17.3" Full HD (1920x1080), matte
System Memory: 8GB DDR3-1333MHz (2x DIMM sockets)
Graphics: NVIDIA GeForce GTX 675M 4GB
Battery: 48Whr, 6-cell (built-in); up to 7 hours of use and 10 days of standby claimed
AC Adapter: 65W AC adapter
Hard Drive Options: 750GB 7200 RPM HDD (many other options available)
Wireless Connectivity: Atheros Killer NIC 802.11b/g/n
Sound: Unspecified speakers
Webcam: 3.0 megapixel webcam
Ports and Connectors: Mic / headphone / line-in / line-out jacks (audio), 2x USB 2.0, 3x USB 3.0, 4-in-1 card reader, Gigabit LAN
Operating System: Windows 7 Home Premium w/ Windows 8 upgrade option
Pricing: $1,459 (as configured)



The spec sheet gives you a good idea of where iBuyPower has focused its attention. The Valkyrie CZ-17 uses a 17.3" 1920x1080 panel with a matte finish. If you hate glossy panels, this alone should put the system on your radar. 1080p panels are tough to find at this price point -- Dell's 17" Inspirons all use a 1600x900 resolution, and Alienware's cheapest M17x starts at $1,649 for a 1920x1080 panel -- and theirs is still glossy.
Wireless networking is courtesy of Atheros' Killer NIC, there are multiple USB 3.0 ports, and the system doesn't skimp on RAM or GPU power. The GTX 675M is a rebadged GTX 580M, but as we'll see, it's more than capable of powering a system of this type.

AT&T And Sprint Getting LG's Optimus G Android Phone In November

LG's Optimus G is undoubtedly the company's flagship Android phone for the moment. We say "for the moment," because that may change if LG introduces the new Nexus phone. At any rate, the superphone will soon be hitting Sprint and AT&T, and both carriers will be selling it for $199.99 on a two-year agreement. One of the hallmark features is the 13MP camera on the rear, coupled with a 4.7" (1280x768) display on the front. Other specs include Android 4.0, a 1.5GHz quad-core Snapdragon S4, 2GB of RAM, and LTE. Oddly, the AT&T model gets downgraded to an 8MP camera. As for dates? Sprint gets the phone on November 11th, while AT&T gets it on November 2nd.


NVIDIA GeForce GTX 650 Ti Round-Up: EVGA, ZOTAC, GB

NVIDIA has been on a tear as of late, releasing a constant stream of GPUs over the last few weeks. In mid-August, the GPU giant released the GeForce GTX 660 Ti and less than a month later, followed up with the GeForce GTX 660 and GeForce GTX 650. Here we are now, less than a month since that release and NVIDIA is at the ready again with yet another new GPU called the GeForce GTX 650 Ti.

As its name implies, the GeForce GTX 650 Ti is a step up from the standard “non-Ti” GeForce GTX 650. The new GeForce GTX 650 Ti, however, isn’t powered by the same GPU as the GTX 650. Whereas the GeForce GTX 650 has a GK107 GPU at its heart, the new GeForce GTX 650 Ti uses the same GK106 as the higher-end GTX 660 series cards, albeit with a few blocks of the chip disabled. The end result is a card that’s much more powerful than the standard GeForce GTX 650, but is priced only slightly higher.

Take a gander at the specifications and reference card below and we’ll follow up with a look at a trio of custom GeForce GTX 650 Ti cards from a few of NVIDIA’s board partners and a full performance profile on the pages ahead…

NVIDIA GeForce GTX 650 Ti Reference Card.
NVIDIA GeForce GTX 650 Ti
Specifications & Features
Processing Units
Graphics Processing Clusters: 2 or 3
SMXs: 4
CUDA Cores: 768
Texture Units: 64
ROP Units: 16
Clock Speeds
Base Clock: 925 MHz
Boost Clock: N/A
Memory Clock (Data Rate): 5400 MHz
Memory
Total Video Memory: 1024MB or 2048MB
Memory Interface: 128-bit
Total Memory Bandwidth: 86.4 GB/s
L2 Cache Size: 256KB
Texture Filtering Rate (Bilinear): 59.2 GigaTexels/sec
Physical & Thermal
Fabrication Process: 28 nm
Transistor Count: 2.54 Billion
Connectors: 2 x Dual-Link DVI, 1 x Mini HDMI
Form Factor: Dual Slot
Power Connectors: 1 x 6-pin
Recommended Power Supply: 400 watts
Thermal Design Power (TDP): 110 watts
Thermal Threshold: 98° C

Like the GeForce GTX 660, the new GeForce GTX 650 Ti is based on the GK106 GPU. Although the GK106 features all of the same technology as NVIDIA’s more powerful Kepler-based graphics processors, this GPU is somewhat smaller and scaled-down versus its higher-end counterparts.

GK106: GeForce GPU Block Diagram (Note - One SMX is Disabled on the GTX 650 Ti)
Here is a high-level block diagram of the GK106 GPU powering the GeForce GTX 650 Ti. The actual chips are manufactured using TSMC’s 28nm process node and are composed of approximately 2.54B transistors. In its full configuration, the GPU features three Graphics Processing Clusters (GPCs) and a total of 960 CUDA cores arranged in five SMXs. There are also 80 texture units and 24 ROPs within the GPU, along with 384K of L2 cache. On the GeForce GTX 650 Ti, however, one of the SMXs has been disabled, which results in a total of 768 active CUDA cores, with 64 texture units, 16 ROPs, and 256K of L2 cache.
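To make that unit math concrete, here's a quick sketch assuming the standard Kepler per-SMX figures of 192 CUDA cores and 16 texture units (those per-SMX counts are our addition; NVIDIA doesn't spell them out here):

# Per-SMX resources assumed for Kepler: 192 CUDA cores, 16 texture units.
CORES_PER_SMX = 192
TEXTURE_UNITS_PER_SMX = 16

full_smx_count = 5    # complete GK106 die
active_smx_count = 4  # GTX 650 Ti ships with one SMX disabled

print(full_smx_count * CORES_PER_SMX)            # 960 cores in the full GPU
print(active_smx_count * CORES_PER_SMX)          # 768 active cores on the 650 Ti
print(full_smx_count * TEXTURE_UNITS_PER_SMX)    # 80 texture units (full)
print(active_smx_count * TEXTURE_UNITS_PER_SMX)  # 64 texture units (650 Ti)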

A memory partition on the GK106 has also been disabled on the GeForce GTX 650 Ti. Instead of the GTX 660's 192-bit interface, memory on the GTX 650 Ti is linked to the GPU via a 128-bit interface. In comparison to the GeForce GTX 660, the new 650 Ti has less cache, a narrower memory interface, and fewer CUDA cores, texture units, and ROPs, which results in lower compute performance, fillrate, and memory bandwidth.
 
Some Pics of the NVIDIA GeForce GTX 650 Ti Reference Card.
NVIDIA’s reference specifications call for a base GPU clock of 925MHz on the GeForce GTX 650 Ti, 1GB of memory, and a memory clock of 1350MHz (5400MHz effective). At those frequencies, GeForce GTX 650 Ti cards will offer up to 86.4GB/s of memory bandwidth and 59.2GTexels/s of texture fillrate. Many of NVIDIA’s partners, however, are ready with factory overclocked models (which we’ll show you on the next page) that will offer somewhat higher performance characteristics. Many cards from NVIDIA's partners will also feature 2GB frame buffers instead of the reference card's 1GB.
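Those bandwidth and fillrate figures fall straight out of the clock speeds and unit counts in the spec table; here's a quick back-of-the-envelope check using only the numbers quoted above:

# Peak memory bandwidth = effective data rate x bus width (in bytes).
effective_mem_clock_mhz = 5400  # DDR data rate
bus_width_bits = 128
bandwidth_gbs = effective_mem_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Peak bilinear texture fillrate = base clock x texture units.
base_gpu_clock_mhz = 925
texture_units = 64
fillrate_gtexels = base_gpu_clock_mhz * 1e6 * texture_units / 1e9

print(f"{bandwidth_gbs:.1f} GB/s")          # 86.4 GB/s
print(f"{fillrate_gtexels:.1f} GTexels/s")  # 59.2 GTexels/s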

The GeForce GTX 650 Ti has a TDP of 110 watts and requires a single, supplemental 6-pin PCI Express power feed. Cards are two slots wide, but sport a short 5.65" PCB. The output configuration consists of 2 x DVI outputs and 1 x mini-HDMI output, but the GPU supports up to four independent displays, so some partners may release cards with four outputs.