Saturday, November 9, 2013

Samsung Galaxy S5 Ready To Beat Apple Next-Gen Chip

Samsung is once again setting its sights on building a better smartphone chip than Apple's 64-bit A7. It looks like Samsung might be working on a smaller, much faster and much more efficient chip for the Samsung Galaxy S5, using its Exynos technology to produce a rival to the powerful A7 found at the heart of the iPhone 5S.

According to reports, Samsung is said to be working on a 14nm (14 nanometer) chip for its new flagship, which would be half the size of the 28nm chip in the Samsung Galaxy S4. According to South Korean site DDaily, which cites 'industry sources', the chip is tentatively titled the Exynos 6. The smaller process should cut the chip's power consumption considerably, and by using less power it also produces less heat, minimizing the chances of the Samsung Galaxy S5 overheating. Smaller chips have their transistors packed closer together, which means they can work faster and more efficiently while using less power. Most smartphones at the moment, including Samsung's, use 28nm chips, and the next logical step down in size would be 20nm, but Samsung seems to be skipping that altogether and jumping straight down to 14nm.

The Samsung Galaxy S5 might also use a powerful 64-bit chip. The move to 64 bits holds advantages of its own: a 64-bit chip can address more memory at once, allowing for devices with more than 4 GB of RAM. It is also better able to multitask and to tackle demanding apps and processes than a 32-bit chip, improving performance in the process.

If all this is true, then the Exynos 6 should be enough to outdo Apple, which recently made waves with its move to a 64-bit chip in the iPhone 5S. Samsung would not only be able to match Apple with a 64-bit chip of its own, but actually one-up the Cupertino company with a chip that is faster, cooler and smaller. The move to 64-bit architecture also opens the floodgates for more powerful Android handsets with a theoretically near-unlimited amount of RAM in the future.
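To see why the 4 GB ceiling exists in the first place, it comes down to simple address arithmetic: a 32-bit pointer can only distinguish 2^32 bytes (4 GiB), while a 64-bit pointer can in principle address 2^64 bytes. The short C sketch below just works through those numbers; it is a generic illustration of the addressing math, not code tied to any Samsung or Apple chip.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A 32-bit pointer can address at most 2^32 distinct bytes: 4 GiB.
       This is the hard RAM ceiling for a 32-bit chip. */
    uint64_t max_32bit = (uint64_t)1 << 32;
    printf("32-bit address space: %llu bytes (%llu GiB)\n",
           (unsigned long long)max_32bit,
           (unsigned long long)(max_32bit >> 30));

    /* A 64-bit pointer could in principle address 2^64 bytes (16 EiB),
       which is why moving to a 64-bit chip lifts the 4 GB RAM limit. */
    printf("64-bit address space: 2^64 bytes = 16 EiB\n");
    return 0;
}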
