Today we have some news that is kind of unexpected. Razer, the company known for gaming peripherals and gaming laptops such as the Razer Blade, has decided to enter the Ultrabook market with the launch of the Razer Blade Stealth. Not only is an Ultrabook unexpected from Razer, it is also priced aggressively enough to undercut much of the competition.
Razer did not cut any corners either when designing the Stealth. Just like its more powerful and higher priced siblings, it is built out of a CNC-milled aluminum chassis, a signature design feature of Razer laptops. But despite the solid frame, the laptop is still only 0.52-inches thick and weighs in at just 2.75 lbs. On the styling front, it keeps the black finish of other Razer laptops, but also outdoes them with a full “Chroma” keyboard with individually lit RGB keys. Ever since reviewing the Razer Blade I’ve been hoping they would do this, so it’s great to see the RGB keyboard come to the Stealth model.
The 12.5-inch display comes in two options. The base model has a QHD (2560×1440) resolution, but you can also opt for a UHD (3840×2160) model with full Adobe RGB color gamut coverage. I need to check in with Razer on how they are going to handle the wider color gamut, and will let you know after we get some hands-on time on the show floor.
The Stealth, as an Ultrabook, is going to be powered by Ultrabook class components, which in this case is the Intel Core i7-6500U processor. This Skylake chip features two cores, hyperthreading, and a base/turbo frequency of 2.5 GHz / 3.1 GHz. I was hoping that Razer would also offer a model with Intel’s Iris GPU, but that won’t be the case, at least at launch. The only memory option is 8 GB of LPDDR3-1866, and storage options range from 128 GB to 256 GB of PCIe storage on the QHD model, and 256 GB to 512 GB on the UHD model. The battery life will need to be tested, but the laptop has a 45 Wh battery, so it’s not going to be class leading in that regard.
For connectivity, the Stealth will have two USB 3.0 ports, and a USB 3.1 Type-C connector with Thunderbolt 3 support. Thunderbolt 3 is a key component of the Stealth, thanks to the accessory that Razer is also launching.
The Razer Core is a Thunderbolt 3 connected external GPU enclosure, which also acts as a docking station for the Stealth. Over a single cable, the laptop gets an external display output, a full set of docking connections with four USB 3.0 ports and Gigabit Ethernet, and support for up to a 375W GPU.
The Core features a built-in 500W power supply, and the GPU support is for any single card which is full-length and double-wide, which means pretty much any GPU out there. The Core also features two additional Chroma lighting zones so that you can tailor it to your liking.
Razer has not yet announced any updates to the Razer Blade or Razer Blade Pro, but I would expect that both of these will also feature support for the Core when they do get their next refresh.
The Core supports plug and play with validated graphics cards, without the need to reboot.
The addition of the Razer Core brings back some of the gaming performance that Razer has been known for, although with a U series CPU it will be interesting to see what level of GPU is required to become CPU bound, especially with DX 12. If we can track down a review unit, we’ll try to sort that out.
The Razer Blade Stealth will be on sale starting today, with a starting price of just $999. Considering the high resolution panel and Core i7, this undercuts most, if not all, of the Ultrabook competition on price. The top-end model with the UHD display and 512 GB of storage will be $1599.
The webcam market may seem fairly pedestrian, but Razer is trying to kickstart it with a new webcam designed for the modern game streamer. Many of the major game streaming sites have begun the move to 60 FPS video, but the modern webcam is basically stuck at 30 FPS. Razer is directly targeting this market with the Stargazer webcam.
The 60 FPS video capture can be done at 720p, and the camera also supports 1080p at 30 FPS. It also features an automatic noise-cancelling, dual-array microphone.
The Stargazer is powered by the Intel RealSense SR300 camera, which means that it also brings 3D to the mix. This may sound like a waste, but it brings quite a few benefits. The first obvious one is Windows Hello support, for facial recognition in Windows 10.
The part that Razer is most excited about though is the Dynamic Background Removal capability, which means that the 3D camera can filter out the entire image except for the person. Traditionally when doing game streaming, the game is on most of the screen with the person playing shown as a box in one of the corners, but with the 3D camera Razer can focus on just the gamer, eliminating the box around the video and leaving just the person. Achieving this has generally required gamers to invest in an elaborate green-screen setup, and the Stargazer brings a similar result for much less cost.
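Conceptually, depth-based background removal is just a masking operation: keep the pixels whose depth reading falls in the near field, and blank everything else. Below is a minimal NumPy sketch of that idea on a tiny synthetic frame; this is an illustrative approximation of the technique, not Razer's or Intel's actual pipeline, and the 1200 mm cutoff is an assumed value.

```python
import numpy as np

def remove_background(rgb, depth, max_depth_mm=1200):
    """Keep only pixels closer than max_depth_mm; zero out the rest.

    rgb:   (H, W, 3) uint8 color frame
    depth: (H, W) depth map in millimeters (0 = no reading)
    """
    # Foreground = valid depth readings within the near-field cutoff
    mask = (depth > 0) & (depth < max_depth_mm)
    out = np.zeros_like(rgb)
    out[mask] = rgb[mask]  # copy only foreground pixels
    return out, mask

# Tiny synthetic frame: a "person" at ~800 mm in front of a wall at ~3000 mm.
rgb = np.full((4, 4, 3), 200, dtype=np.uint8)
depth = np.full((4, 4), 3000)
depth[1:3, 1:3] = 800  # near-field subject

out, mask = remove_background(rgb, depth)
print(mask.sum())  # 4 foreground pixels survive
```

A real implementation would add smoothing of the mask edges and hole-filling for missing depth readings, but the core idea is this simple threshold.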
On the other side of gaming, the Stargazer can be used to scan real objects into a digital world, for use as in-game assets, potentially speeding up development.
Finally, the Stargazer supports gesture and facial recognition with up to 78 points on the face and 22 points on each hand. Developers can leverage this for in-game actions, and it is something that Intel is promoting with its RealSense camera system, so we’ll have to see if it gains traction with developers.
It may be just a webcam, but as one of the first Windows Hello compatible devices to launch, it already has a place with some people. The game streaming crowd will gain the bulk of the benefits from this, and that market is growing quite a bit.
The Stargazer will be available starting in Q2 for $199.99 USD.
ZOTAC is primarily known for its NVIDIA GeForce-based video cards, but in recent years the company started to sell motherboards, small form-factor personal computers and various accessories. Last year ZOTAC introduced its entry-level solid-state drives to add another revenue stream. At the International CES 2016, ZOTAC announced its new-generation PCIe SSDs, which are expected to address higher-end market segments.
The new solid-state drives from ZOTAC will be powered by Phison’s PS5007-E7 controller as well as multi-level cell (MLC) NAND flash memory produced by Toshiba. ZOTAC claims that its new SSDs will have sequential read performance of up to 2400 MB/s and sequential write performance of up to 1200 MB/s. The new solid-state drives from ZOTAC will come in half-length half-height PCI Express 3.0 x4 card form-factor and will fully support the NVMe protocol. The first model in ZOTAC’s PCIe SSD lineup will feature 480 GB capacity and will be available sometime in February, according to the manufacturer.
ZOTAC does not reveal too many details about its new solid-state drives, but since they are based on the Phison PS5007-E7 controller, expect support for NVMe 1.2, error correction with 120-bit/2KB BCH code, NVMe L1 power sub-states, end-to-end data path protection, advanced global wear-leveling, an AES-256 engine and so on. The AIC form-factor also means that the controller will be able to use all eight of its transfer channels, thus maximizing performance.
Since the Phison PS5007-E7 controller was developed not only for gaming PCs, but also for enterprise and datacenter applications, it can enable SSDs with up to 350,000 4KB random read IOPS (input/output operations per second) and up to 250,000 random write IOPS. While consumer SSDs featuring the PS5007-E7 may not hit maximum IOPS performance, they will definitely be considerably faster than any previous-generation solid-state drives.
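As a sanity check on what those random IOPS figures mean in bandwidth terms, converting 4KB IOPS to MB/s is a simple multiplication. The quick back-of-the-envelope calculation below is my own arithmetic on the quoted peak figures, not a Phison specification:

```python
def iops_to_mbps(iops, block_bytes=4096):
    """Convert fixed-block random IOPS into MB/s (1 MB = 10^6 bytes)."""
    return iops * block_bytes / 1e6

# Peak figures quoted for the PS5007-E7 at a 4KB block size
print(iops_to_mbps(350_000))  # 1433.6 MB/s of random reads
print(iops_to_mbps(250_000))  # 1024.0 MB/s of random writes
```

In other words, even at the random-I/O ceiling the controller would move well over 1 GB/s, which is why these drives need a PCIe link rather than SATA's ~600 MB/s.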
ZOTAC’s first-generation SSDs were arguably a business experiment for the company and its parent, PC Partner Group, which specializes in the production of graphics cards, motherboards and other similar products, but not storage devices. ZOTAC’s initial SSDs use the Serial ATA interface and deliver moderate levels of performance. The cautious approach makes a lot of sense: nowadays end-users demand SSDs with maximum durability and reliability, and ZOTAC has yet to become a well-known maker of solid-state drives; if its products are not rock-solid, its brand will be harmed. As a result, the company decided to focus on maximum quality rather than on maximizing sequential reads and writes.
With its new PCIe SSDs, ZOTAC plans to deliver rather extreme levels of performance. ZOTAC’s solid-state drives with PCI Express 3.0 x4 interface may not be as fast as Samsung’s 950 Pro (at least, on paper), but if the price and performance have the right balance, many end-users will gladly buy them.
We’ve reviewed Huawei Honor devices before, but by and large they were designed to target China and similar markets. There were also a number of growing pains as seen in our Huawei Honor 6 review. However, in the time since that review Huawei has done quite a bit of growing up when it comes to resolving some of their weaknesses and improving upon their strengths. Their Kirin SoCs started off with some notable issues in implementations, but with the Kirin 950 we’ve seen a major leap in performance and power efficiency. To keep their momentum going, Huawei Honor is bringing their first phone to the US, the Honor 5X.
Huawei Honor 5X
SoC: Qualcomm Snapdragon 615, 1.5/1.2 GHz 4+4 Cortex A53
NAND: 16GB + microSD
Display: 5.5" 1920×1080 IPS LCD
Dimensions: 151.3mm x 76.3mm x 8.15mm, 158g
Camera (rear): 13MP, f/2.0, 28mm equivalent, Sony IMX214
Camera (front): 5MP, f/2.4, 22mm equivalent, OmniVision OV5648
Battery: 3000 mAh (11.4Wh)
OS: Android 5.1.1 with EMUI 3.1
Connectivity: 802.11b/g/n (2.4 GHz only), Bluetooth 4.1, GPS/GNSS, Micro USB 2.0
Network: 2G / 3G / 4G LTE (Category 4)
The basic specs aren’t really going to be all that fascinating at this point as Snapdragon 615 is a known quantity. Huawei continues their trend of shipping odd WiFi configurations as this device only supports 2.4 GHz 802.11b/g/n WiFi. The rear camera is a rather well-understood Sony IMX214 sensor and the front camera sensor is a similarly common OmniVision OV5648 sensor.
However, the Honor 5X actually manages to hit the right point for price and features. The display is a 5.5” 1080p LCD, with an aluminum unibody design. There’s also the usual dual SIM capability along with a decently sized battery and an FPC1020 fingerprint scanner shared with the Ascend Mate7. At 200 USD, this has the potential to beat out the Moto G for best value smartphone in that price range.
Subjectively, the in-hand feel and overall build quality is shockingly good for the price. The Ascend P8 Lite that we reviewed last year was pretty much par for the course when it came to materials and in-hand feel for a ~200 USD phone, so to go from some rather hard and cheap-feeling plastic to an aluminum unibody that is basically comparable to the HTC One M9 in feel is quite a leap in the course of less than a year. The comparison to the One M9 is rather apt in this case, as the design of the phone is such that the phone has a brushed finish that can be seen, but not really felt in the hand.
Unfortunately, the performance of the Honor 5X is a bit wanting. I suspect that Cortex A53s alone aren’t quite enough to make Android run perfectly smoothly; while the phone was perfectly smooth in some cases, in transitions like opening and closing app folders I saw noticeable frame drops and similar issues.
Casual use of the fingerprint scanner was also quite impressive. The Honor 5X behaves pretty much identically to the Ascend Mate7 in that the fingerprint scanner will automatically detect and scan a fingerprint even when the screen is off, so with fingerprint unlock set up it’s possible to unlock the phone by simply placing a finger over the scanner and waiting for the phone to wake up and unlock automatically.
As previously mentioned, Huawei is selling the Honor 5X for 199.99 USD. It will be available for preorder starting January 6th, with general availability starting January 31st on HiHonor.com and Amazon. Although it would have been exciting to see something like Snapdragon 650 show up in this phone, at the price it’s going for it could be a viable option if Huawei has managed to nail down the details without any show-stopping issues.
Today during their pre-CES launch event honor announced a new smartphone as well as a new fitness tracker called the honor band Z1. The honor band Z1 caught my interest because it acts as a fitness band but doesn't follow the rectangular form factor that many fitness devices such as the Microsoft Band have adopted, which allows it to also act as something similar to a typical circular watch.
The honor band Z1 sports a 1.06" 128 x 128 PMOLED display. This is obviously a much lower resolution than high end smartwatches, but for the intended applications of the honor band Z1 it makes sense in order to preserve battery life. The stainless steel case has a diameter of 38mm, a thickness of 9.5mm, and a mass of 25 grams. It's powered by a Cortex M4 based STM32F411 CPU from STMicroelectronics, which is paired with a 70 mAh internal battery. Honor states that the battery will last for 3-4 days of normal use, including daytime fitness use and sleep tracking. Like most wearables, the honor band Z1 is IP68 certified.
In addition to fitness tracking, the honor band Z1 does support some forms of communication, including notification mirroring and caller ID. It connects to your smartphone via Bluetooth 4.1, and supports iOS 7.0 and newer, as well as Android 4.4.4 KitKat and newer. The bands come in black, white, and cream finishes, with the black band model also coming with a black steel finish. It will retail for $79.99 USD when it goes on sale at the end of the month.
There are more than 6.8 billion devices with Wi-Fi technology in use today, but the number of devices that need to share data or access the Internet wirelessly will grow exponentially in the coming years because of the various wearables, driverless cars, smart sensors and other devices that belong to the Internet-of-Things (IoT) world. The Wi-Fi Alliance this week announced the IEEE 802.11ah standard, which was developed specifically for IoT devices. The tech will be formally called Wi-Fi HaLow.
The 802.11ah spec operates in the 900 MHz band, which helps to cut down power consumption, extend transmission range, and improve propagation (the ability to transmit in the presence of interference) and penetration (the ability to transmit through various barriers, such as walls or floors). It is expected that the range of a Wi-Fi HaLow device will be twice that of modern Wi-Fi standards (i.e., double the roughly 500 meters of 802.11n), up to one kilometer, which can be further extended using relays. Actual data rates supported by 802.11ah will not be too high: the tech is based on the 802.11a/g spec and offers up to 26 channels, each providing up to 100 Kb/s of throughput.
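Taking those figures at face value (26 channels at up to 100 Kb/s each; this is my reading of the draft numbers, which the final spec may revise), the aggregate throughput ceiling works out as follows:

```python
channels = 26
per_channel_kbps = 100  # Kb/s per channel, per the draft figures above

aggregate_kbps = channels * per_channel_kbps
print(aggregate_kbps)         # 2600 Kb/s total across all channels
print(aggregate_kbps / 1000)  # 2.6 Mb/s aggregate, orders of magnitude
                              # below 802.11n/ac peak rates
```

That single-digit-Mb/s ceiling underlines that HaLow trades bandwidth for range and power, which is exactly the right trade for sensors and wearables.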
The Wi-Fi HaLow technology was designed to enable communications between devices at longer distances and/or in challenging environments (with many barriers) using relatively low amounts of power. The tech could challenge both Bluetooth and cellular networks eventually since it combines the best of both worlds: low power operation as well as relatively long range. Moreover, unlike Bluetooth and other short-range radio technologies, the 802.11ah can connect devices directly to the Internet.
Since many devices supporting the Wi-Fi 802.11ah will also be able to operate in 2.4 and 5 GHz bands, they will also support traditional 802.11n/ac technologies and will be able to send and receive data at higher transfer-rates when possible.
Wi-Fi HaLow is unlikely to replace Bluetooth completely due to the vast ecosystem that already uses that technology, but it will compete against Bluetooth in the future. Nor will it be able to replace cellular networks, which are ubiquitous.
The Wi-Fi HaLow 802.11ah technology has a lot of potential and is something that IoT needs. The 802.11ah will enable communications both for tiny battery-operated wearables as well as for various applications that cannot connect using today’s Wi-Fi technology. For example, the Wi-Fi HaLow can connect all personal health trackers at a hospital to its central servers, or enable communications between various machines at a large agriculture or industrial facility.
The Wi-Fi Alliance claims that its technologies operate in unlicensed spectrum; therefore, the Wi-Fi HaLow should not interfere with existing wireless technologies. Nonetheless, it should be noted that the 900 MHz band is licensed in some countries. Therefore, the tech may not work everywhere, which will likely slow down its adoption by the industry.
At present, the Wi-Fi 802.11ah is still a draft specification. Later this year it will be approved by the IEEE Std P802.11ah working group.
Patriot Memory has been selling solid-state drives for about eight years now. To date, virtually all of Patriot’s SSDs have used the Serial ATA interface, which became a performance-limiting factor in recent years. At the Consumer Electronics Show this week, Patriot finally announced its first SSDs with the PCI Express 3.0 x4 interface. The new Hellfire solid-state drives will be available for purchase at the end of the first quarter.
The Patriot Hellfire SSDs are based on the Phison PS5007-E7, which is an eight-channel controller that supports the NVMe 1.2 protocol, the PCI Express 3.0 x4 interface as well as various types of NAND flash memory. The PS5007-E7 controller features error correction with 120-bit/2KB BCH code along with all the modern functionality, such as NVMe L1 power sub-states, power failure protection, end-to-end data path protection, an AES-256 engine, advanced global wear-leveling and so on. The Patriot Hellfire solid-state drives use MLC NAND flash memory, but the manufacturer has yet to reveal its exact type.
Patriot’s Hellfire SSDs will come in two form-factors: M.2 2280 card with PCIe 3.0 x4 interface as well as half-length half-height add-in-card with PCIe 3.0 x4 interface. The Hellfire M.2 and the Hellfire PCIe AIC drives will be available in 240 GB, 480 GB and 960 GB capacities.
The Hellfire M.2 2280 SSDs will offer sequential read speeds of up to 2500 MB/s and write speeds of up to 600 MB/s. The Hellfire PCIe AIC will be considerably faster with sequential read speeds of up to 3000 MB/s and write speeds of up to 2200 MB/s.
One of the reasons why the Hellfire SSDs in different form-factors offer different levels of performance despite sharing the same controller and logical interface is that the Phison PS5007-E7 controller cannot use all of its channels on an M.2 2280 card. It should also be noted that Phison’s reference M.2 2280 SSD with the PS5007-E7 ASIC (application specific integrated circuit) only supports capacities up to 512 GB.
Patriot will not be the only company on the market to offer high-performance solid-state drives based on the Phison PS5007-E7 controller. Phison sells its chips along with reference designs to actual makers of SSDs, so expect multiple companies to use the PS5007-E7 inside their high-end SSDs in 2.5-inch, M.2 and AIC form-factors. For example, G.Skill demonstrated its PS5007-E7-based Phoenix Blade X SSD at Computex 2015 about six months ago.
According to Patriot, its Hellfire PCIe AIC SSD will offer performance that will be higher than that of Samsung’s 950 Pro, which is one of the fastest solid-state drives today. If other producers manage to design SSDs with similar performance based on the PS5007-E7 ASIC, it will be a huge step forward for the whole market.
Samsung has been somewhat of a small player in the notebook market lately, but today they are announcing two new devices which should appeal to anyone looking for a very portable laptop. The new Notebook 9 series laptops, in both 13.3-inch and 15.6-inch sizes, come in at a very svelte 1.85 lb (840 g) and 2.84 lb (1.29 kg) mass, respectively. The 13.3 is one of the lightest notebooks around, and the 15.6-inch model is, as far as I know, the lightest 15-inch laptop yet. As well as being light, the magnesium framed devices are also very thin, with the smaller model just 13.4 mm thick, and the larger model only 14.5 mm.
So they are small. Both of them are powered by Intel Skylake-U series processors, which have a 15 Watt TDP. Normally 15.6-inch notebooks can sport quad-core H series due to the extra size and mass, but Samsung has clearly made an effort to keep these as thin and light as possible. RAM is 4-8 GB, and storage is 128-256 GB SSDs, which is pretty typical for an Ultrabook.
Despite the ultra-thin design, the keyboards are backlit, and feature 1.5 mm of key travel, which should mean a pretty decent typing experience.
Both versions have two USB 3.0 ports, but the 15.6-inch one also has a Type-C connector with DisplayPort capability and USB 3.1 Gen 1 speeds (which are the same as 3.0).
The displays are both 1080p PLS models, so unlike last year’s Samsung notebook, there is no longer a 16:10 offering here. That’s too bad, but the new models do feature thin bezels, reducing the overall footprint of the entire notebook. Samsung claims the 15.6-inch model fits in the same footprint as a traditional 14-inch device.
Samsung claims “all-day battery life” but the battery is the one area where the march to thin and light has been impacted. The 13.3-inch model has just a 30 Wh battery, and the larger version only goes up to 39 Wh. Compared to something like the XPS 13, with a 56 Wh battery, you can see that battery life is going to be impacted.
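A rough battery-life estimate is simply watt-hours divided by average platform power draw. The ~5 W figure below is an illustrative assumption for a light Ultrabook workload, not a Samsung or Dell number, but it shows why the capacity gap matters:

```python
def runtime_hours(battery_wh, avg_draw_w):
    """Estimated runtime = battery capacity (Wh) / average power draw (W)."""
    return battery_wh / avg_draw_w

AVG_DRAW_W = 5.0  # assumed average platform draw under light use

for name, wh in [("Notebook 9 13.3", 30), ("Notebook 9 15.6", 39), ("XPS 13", 56)]:
    print(f"{name}: {runtime_hours(wh, AVG_DRAW_W):.1f} h")
# Notebook 9 13.3: 6.0 h
# Notebook 9 15.6: 7.8 h
# XPS 13: 11.2 h
```

Real-world draw varies widely with display brightness and workload, but at any fixed draw the XPS 13's battery buys nearly twice the runtime of the 13.3-inch Notebook 9.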
However, if a light notebook with a full Core U series processor is what you are after, the Samsung Notebook 9 series is likely one to check out. We’ll try to get some hands-on time with the new devices at CES.
As CES gets underway, SanDisk is announcing the X400 SSD as the successor to the X300 and X300s and as the higher-performance counterpart to the Z400s. The new X400 will be the flagship of SanDisk's line of SATA and M.2 SATA SSDs for OEMs, though by the standards of consumer SSDs sold at retail it wouldn't quite be a high-end SATA drive.
The X300s was the Self-Encrypting Drive variant of the X300, but for the X400 SanDisk is unifying the two by making encryption a standard feature, pending a firmware update due in April to provide full TCG Opal support. The X400 improves performance in most areas, though not by any huge margins. They're dropping the smallest capacities, leaving 128GB as the starting point, and mSATA is no longer an option. Both changes reflect a lack of demand for outdated drive configurations in new product designs. Like the X300, the X400 uses TLC NAND flash and relies on SLC-mode write caching to provide competitive write speeds.
SanDisk OEM Client SSD Comparison
Capacities:
X400: 128GB, 256GB, 512GB, 1TB
Z400s: 32GB, 64GB, 128GB, 256GB
X300: 64GB, 128GB, 256GB, 512GB, 1TB (2.5" only)
Form factors:
X400: 2.5", M.2 2280
Z400s: 2.5", mSATA, M.2 2242, M.2 2280
X300: 2.5", mSATA, M.2 2280
The X400 adds a 1TB M.2 option that SanDisk claims is the first single-sided 1TB M.2 drive. The X400 also adds LDPC ECC to the mix, which probably helped SanDisk increase the warranty period to 5 years.
The SanDisk X400 was sampling to OEMs as of late last year and is now available to OEMs and system integrators in volume.
During Samsung's CES press conference the company announced a brand new 2-in-1 tablet. While it was initially thought to be an Android tablet to take on the likes of the Pixel C and the iPad Pro, it turns out that the TabPro S is really a full-blown Windows 10 convertible tablet. Below are its specs.
Galaxy TabPro S
SoC: Intel Core m3
Display: 12" 2160×1440 AMOLED
Dimensions: 290.3mm x 198.8mm x 6.3mm, 693g
Camera (rear): 5MP
Camera (front): 5MP
Battery: 5200 mAh (39.5Wh)
OS: Windows 10 Home/Pro
Connectivity: 802.11a/b/g/n/ac, Bluetooth 4.1, GPS/GNSS, Micro USB 2.0
Network: 2G / 3G / 4G LTE (Category 6)
Since the TabPro S is larger than the average tablet and runs a full version of Windows, we're looking at different specifications than one would typically find in an Android device. On top of that, Samsung is able to source components from its various subsidiaries, allowing for features that don't exist on many other tablets.
Internally, the TabPro S is powered by Intel's Core m3 CPU, which is a Skylake-Y part. That CPU is paired with 4GB of RAM, and a 128GB or 256GB SSD. Samsung advertises the storage as an SSD, and given its capacity it's probably safe to assume an actual SSD rather than an eMMC solution.
The TabPro S uses a 12" 2160×1440 AMOLED display. The prospect of a Samsung tablet with an AMOLED display running Windows interests me greatly, because it opens up the possibility of manual calibration and different gamma targets like BT.1886, which would greatly improve the movie watching experience.
Like many of the productivity focused tablets that have launched recently, the TabPro S includes support for a keyboard and a digital pen. The keyboard connects to the tablet directly using pogo pins, while the pen works over Bluetooth. In addition to those accessories, there will also be an adapter that allows for the connection of USB Type A, Type C, and HDMI devices.
The Galaxy TabPro S will be launching this February in both white and blue. The keyboard cover and Bluetooth pen will be available separately. Pricing for the TabPro S and accessories is currently unknown.
Today at CES Huawei made a number of announcements. One of them is a new tablet called the Huawei MediaPad M2 10. It's a new tablet coming to the United States, with specs that sit somewhere in the mid range part of the tablet market. You can check out all of its specs in the chart below.
Huawei MediaPad M2 10
SoC: HiSilicon Kirin 930, 2GHz 4x Cortex A53 + 1.5GHz 4x Cortex A53, Mali-T628
RAM: 2GB LPDDR3 (Silver) / 3GB LPDDR3 (Gold)
NAND: 16GB + MicroSD (Silver) / 64GB + MicroSD (Gold)
Display: 10" 1920×1200 IPS
Dimensions: 239.8mm x 172.75mm x 7.35mm, 500g
Camera (rear): 13MP
Camera (front): 5MP
OS: Android 5.1 + EMUI 3.1
Extras: Active stylus (gold model only)
Connectivity: 802.11a/b/g/n/ac, Bluetooth 4.0, GPS/GNSS, Micro USB 2.0
The MediaPad M2 10 is actually one of the first Huawei tablets that I've seen coming to the North American market. On paper, it appears to be a tablet targeting the mid range segment of the market. Starting with the SoC, you get HiSilicon's Kirin 930, which consists of two quad core Cortex A53 clusters with peak frequencies of 2GHz and 1.5GHz respectively. It's paired with an ARM Mali-T628 GPU, and either 2GB or 3GB of LPDDR3 memory depending on whether you buy the silver or gold model.
Moving on to the display, the 1920×1200 IPS panel definitely isn't as high resolution as the panels shipping on high end tablets, but it's a lot better than the 1280×800 panels that used to ship on all the mid range tablets out there. Huawei has been a bit inconsistent with their calibration across their product lines, so I'm interested to see how the panel compares to the competition in that regard. Beyond the display, you get either 16GB or 64GB of storage, and a pair of 13MP and 5MP cameras.
As for the design of the MediaPad M2, it doesn't end up cutting any corners. It ships with a full aluminum unibody, and the industrial design is very similar to that of the Mate S. It isn't the thinnest or lightest tablet out there, at 7.35mm thick and 500g, but for a mid range tablet the fact that it's made of aluminum already gives it an edge over other tablets.
The Huawei MediaPad M2 10 will be available in silver and gold. The color choices also serve as a way to segment the devices, as the silver model comes with 2GB of RAM and 16GB of NAND, while the gold model comes with 3GB of RAM and 64GB of NAND. Both models will be available in the United States in the first quarter of this year, starting at $349 for the 2GB + 16GB WiFi model, and $419 for the 3GB + 64GB model which also includes the active stylus. Both models can have LTE support added on for $50.
Seagate has announced four new DAS (direct attached storage) products at CES 2016. Three of them target the premium / luxury market under the LaCie brand name.
Seagate Backup Plus Ultra Slim USB 3.0 bus-powered external hard drive
LaCie Porsche Design USB 3.0 Type-C bus-powered external hard drive (mobile model)
LaCie Porsche Design USB 3.0 Type-C external hard drive (desktop model) with power delivery
LaCie Chrome USB 3.1 Type-C external SSD
The LaCie Chrome USB 3.1 Type-C external SSD is easily the most impressive announcement of the four. Obviously, one of the key points of the LaCie products is the striking industrial design, and the Chrome is no exception.
The product contains two 512GB M.2 SATA SSDs in RAID-0 (effective user capacity is 1TB). It can support data rates of up to 940 MBps, thanks to the integrated ASMedia ASM1352R dual SATA to USB 3.1 Gen 2 bridge chip.
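The quoted 940 MBps lines up with simple RAID-0 math: capacity adds, and sequential throughput roughly doubles until the host link becomes the cap. The sketch below is a back-of-the-envelope model; the ~480 MBps per-drive figure is an assumption typical of SATA SSDs, and the ~1200 MBps link figure is an estimate of the usable payload bandwidth of a 10 Gb/s USB 3.1 Gen 2 port, neither being a LaCie spec:

```python
def raid0(drive_capacity_gb, drive_mbps, n_drives, link_cap_mbps):
    """RAID-0 stripes data across drives: capacity and bandwidth sum,
    but throughput cannot exceed what the host link can carry."""
    capacity = drive_capacity_gb * n_drives
    throughput = min(drive_mbps * n_drives, link_cap_mbps)
    return capacity, throughput

# Two 512GB SATA SSDs (~480 MBps each, assumed) behind a USB 3.1 Gen 2
# bridge with ~1200 MBps of usable payload bandwidth (estimated).
cap, tput = raid0(512, 480, 2, 1200)
print(cap, tput)  # 1024 960 -> 1TB usable, ~960 MBps, near the quoted 940 MBps
```

The takeaway is that two SATA drives in RAID-0 saturate neither the drives nor the Gen 2 link by much, which is why the Chrome needed both the dual-SSD arrangement and a 10 Gb/s bridge to hit its number.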
Seagate touts the aluminium enclosure, efficient triple cooling system, magnetized cable management (it is similar to the 2big Thunderbolt 2 product in this respect) and a removable magnetized display stand as unique features for this product.
Gallery: LaCie Chrome USB 3.1 External SSD
It must be noted that the Chrome does need an external power connector (understandable due to the need to power two M.2 SSDs). The above gallery shows us the various external aspects of the Chrome unit. The unit will retail for $1100 and be available later this quarter.
The LaCie Porsche Design USB 3.0 Type-C external hard drives have a new industrial design for the aluminium enclosure and come with a Type-C connector. Other than that, there is nothing too striking about them. The desktop model needs external power, but it also does power delivery over its Type-C port (making it ideal for devices like the MacBook). Both the Mobile and Desktop versions come with a USB Type-A to USB Type-C cable in addition to the Type-C to Type-C cable, which enables compatibility with a wider variety of systems.
The Mobile version comes in 1TB, 2TB and 4TB capacities, starting at $110. The Desktop Drive comes in 4TB, 5TB and 8TB capacities, starting at $210.
Rounding up the product launches is the Seagate Backup Plus Ultra Slim. It is a 2.5" hard drive, and the firmware features are similar to the Seagate Backup Plus we reviewed last August. This implies the integration of a Seagate Dashboard for providing more features compared to a standard external hard drive. The device also comes with 200GB of OneDrive cloud storage valid for two years. It is also compatible with the Lyve photo management software.
The technically interesting aspects include the 9.6mm thickness (Seagate indicated that it is the thinnest external hard drive in its capacity class in the market right now). It comes in 1TB and 2TB capacities with a two-platter design. Cross-platform compatibility is enabled by a free Paragon driver download (enabling Macs to read drives formatted in NTFS and Windows PCs to read drives formatted in HFS+).
The Seagate Backup Plus Ultra Slim comes in 1TB and 2TB capacities. We don't have pricing details yet, but availability is slated for later this quarter.
Today HTC has taken the wraps off of the second generation version of the HTC Vive. As you probably know, the HTC Vive is a virtual reality head-mounted display designed and made jointly by HTC and Valve. The consumer launch date for the Vive has been pushed back a couple of times now, but certain developers have had access to developer versions of the headset for some time in order to develop new titles for it or to adapt existing ones. The new Vive Pre is the second version of the Vive developer kit, and it comes with a number of improvements that bring the Vive closer to its eventual commercial launch, which will occur this year.
The Vive Pre makes some notable additions to the earlier version. First and foremost are the improvements to ergonomics. According to HTC, the headset has basically been redesigned from the ground up to be more compact and fit more comfortably onto your head while also being more stable. The displays have been made brighter and refinements to the entire display and lens stack have improved clarity over the existing model. Finally, there has been a front camera added to the headset. This may seem strange at first, but what the camera allows for is augmented reality experiences where a feed of the real world can be shown to the user and illusions can be projected onto that space by the headset.
As for the controllers, the design has been overhauled to make them more ergonomic. The buttons have been textured to make them easier to find, and the trigger has been changed to a dual stage switch which allows for interactions with multiple states, such as holding or squeezing something. There's also haptic feedback to go along with interactions, and this is something that can really help the experience when implemented in a proper and subtle manner. Finally, the tracking stations for the controllers have been made smaller and more precise.
I had a chance to try the new Vive Pre earlier, and it marked my first experience with a virtual reality headset, with the exception of the Nintendo Virtual Boy. While I can't make any statements that compare the new Vive to the old dev kit or to other VR headsets like the Oculus Rift, I can say that the experience with the headset and the controllers was unlike anything I've experienced before. The demo consisted of a virtual environment that simulated some of the challenges one would encounter when climbing Mount Everest. It included very theatrical sweeping shots where you looked over the mountains as though you were flying in the air or riding on a helicopter, as well as interactive segments that simulated crossing over a large pit, and climbing up a ladder.
What amazed me was how quickly I forgot that I was just in a hotel room wearing a rather large helmet and holding some controllers. I found myself too frightened to look right over the edge of a cliff, and felt strangely nervous about my increasing height as I climbed the ladder, even though I knew very well that I was standing on the floor the entire time. Head tracking latency was also very low, and to be honest the only thing that ever took me out of the experience was the limited resolution of the displays. That's a technology issue that will improve with time, but even with that barrier to total immersion the experience is still extremely compelling and unlike anything else.
As of right now, the HTC Vive is scheduled to launch commercially in April of this year. Whether or not that date will be pushed back again is unknown, but what I can say is that I think the Vive and other VR headsets will be worth the wait.
While NVIDIA has been rather quiet about the SoC portion of the DRIVE PX 2, it’s unmistakable that a new iteration of the Tegra SoC is present.
The GPUs and SoCs of the DRIVE PX 2 are fabricated on TSMC’s 16nm FinFET process, which is something that we haven’t seen yet from NVIDIA. The other obvious difference is the CPU configuration. While Tegra X1 had four Cortex A57s and four Cortex A53s, this new SoC (Tegra P1?) has four Cortex A57s and two Denver CPUs. As of now it isn’t clear whether this is the same iteration of the Denver architecture that we saw in the Tegra K1. However, regardless of which iteration it is, we’re still looking at a CPU that implements the ARM instruction set on a wide, in-order VLIW core, relying on dynamic code optimization to translate ARM instructions into the core's native VLIW ISA and to provide out-of-order-style scheduling in software.
Based on the description of the SoC, while NVIDIA is not formally announcing this new SoC or giving it a name at this time, the feature set lines up fairly well with the original plans for the SoC known as Parker. Before it was bumped to make room for Tegra X1, it had been revealed that Parker would be NVIDIA's first 16nm FinFET SoC, and would contain Denver CPU cores, just like this new SoC.
NVIDIA's Original 2013 Tegra Roadmap, The Last Sighting of Parker
Of course Parker was also said to include a Maxwell GPU, whereas NVIDIA has confirmed that this new Tegra is Pascal based. Though with Parker's apparent delay, an upgrade to Pascal makes some sense here. Otherwise we have limited information on the GPU at present besides its Pascal heritage; NVIDIA is not disclosing anything about the number of CUDA cores or other features.
NVIDIA Tegra Specification Comparison

                      Tegra X1                                New Tegra
  CPU                 4x ARM Cortex A57 +                     2x NVIDIA Denver +
                      4x ARM Cortex A53                       4x ARM Cortex A57
  Memory Bus Width    ?                                       ?
  Manufacturing       TSMC 20nm SoC                           TSMC 16nm FinFET
But for now the bigger story is the new Tegra's CPU configuration. Needless to say, this is at least somewhat of an oddball architecture. As Denver is a custom CPU core, we’re looking at a custom interconnect by NVIDIA to make the Cortex A57 and Denver cores work together. The question then is why NVIDIA would want to pair Denver CPU cores with the also relatively high-performing Cortex A57 cores.
At least part of the answer is going to depend on whether NVIDIA’s software stack uses the two clusters in a cluster migration scheme or in some kind of HMP scheme. Comments made by NVIDIA during their press conference indicate that they believe the Denver cores on the new Tegra will offer better single-threaded performance than the A57s. Without knowing more about the version of Denver in the new Tegra, this is somewhat surprising, as it’s pretty much public knowledge that Denver has had issues with code that doesn’t resemble a simple, non-branching loop; more troublesome yet, code generation for Denver can take up a significant amount of time. As we saw with the Denver TK1, Cortex A57s can actually be faster clock for clock if the code is particularly unfavorable to Denver.
Consequently, if NVIDIA is using a traditional cluster migration or HMP scheme where Denver is treated as a consistently faster core in all scenarios, I would be at least slightly concerned if NVIDIA decided to ship this configuration with the same iteration of Denver as in the Tegra K1. Though equally likely, NVIDIA has had over a year to refine Denver and may be rolling out an updated (and presumably faster) version for the new Tegra. Otherwise it also wouldn’t surprise me if the vast majority of CPU work for PX 2 is run on the A57 cluster while the Denver cluster is treated as a co-processor of sorts, in which only specific cases can even access the Denver CPUs.
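The translation overhead discussed above can be illustrated with a toy model. This is a simplified sketch, not NVIDIA's actual implementation: a dynamic code optimizer keeps a cache of translated code regions, so a hot, non-branching loop pays the translation cost once and amortizes it, while branchy code that keeps reaching untranslated regions pays it over and over. All costs here are invented for illustration.

```python
# Toy model of a dynamic code optimizer with a translation cache.
# Translating a region is expensive; executing an already-translated
# region is cheap. The numbers are illustrative, not measured.

TRANSLATE_COST = 100  # cycles to translate one code region (assumed)
EXECUTE_COST = 1      # cycles to run an already-translated region

def run(trace):
    """Simulate executing a trace of region IDs; return total cycles."""
    cache = set()   # regions that have already been translated
    cycles = 0
    for region in trace:
        if region not in cache:
            cycles += TRANSLATE_COST  # first visit: pay translation cost
            cache.add(region)
        cycles += EXECUTE_COST
    return cycles

# A tight loop: one region executed 1,000 times; cost is amortized.
loop_trace = [0] * 1000
# Branchy code: 1,000 distinct regions, each visited once.
branchy_trace = list(range(1000))

print(run(loop_trace))     # 100 + 1000 = 1100 cycles
print(run(branchy_trace))  # 1000 * (100 + 1) = 101000 cycles
```

Under this model the branchy trace costs roughly 100x more per instruction, which is the intuition behind why A57s can win clock for clock on Denver-unfavorable code.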
Install Shotcut video editing software on Ubuntu Linux. Shotcut is a popular, open-source video editor for Ubuntu systems, and Shotcut 16.01 is the latest release. It can be installed via PPA on Ubuntu 15.10, 15.04, 14.10, 14.04 and derivatives.
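A typical PPA install on Ubuntu follows the pattern below. Note that the PPA name shown here is a placeholder; substitute the actual PPA published by the Shotcut packager.

```shell
# Add the Shotcut PPA (replace with the packager's actual PPA name),
# refresh the package index, then install the editor.
sudo add-apt-repository ppa:example/shotcut   # hypothetical PPA name
sudo apt-get update
sudo apt-get install shotcut
```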
How to download images from websites on Ubuntu Linux. Image Downloader is a utility that can easily download images and photos from various websites, and comes with a graphical user interface. Image Downloader 0.0.1 can be installed via PPA on Ubuntu 15.10, 15.04, 14.10, 14.04 and derivatives.
Calibre E-book Reader for Linux Ubuntu (Calibre E-book Management Software). Calibre is a popular e-book reader and a powerful e-book management suite, with features such as an e-book viewer and conversion tools. Calibre 2.48 can be installed on Ubuntu 15.10, 15.04, 14.10 and 14.04 systems.