
Fedora, Debian (if you still want to use apt) or OpenSUSE.


I don't see the purpose of this. Current Windows on ARM has pretty much nonexistent developer support.


There are a couple of different things at play this time. Windows RT only allowed Store apps and Microsoft's own ARM desktop apps. Windows 10 on ARM will run UWP Store apps and x86 apps (desktop & hybrid "Centennial" apps). At Build, they announced Edge support for Progressive Web Apps, and they'll be adding PWAs to the Store automatically if they meet the right criteria for features & quality.

So not only will this platform support almost all the existing Windows software out there, it'll treat PWAs as first-class citizens.



Have there been any public previews of this? I'm still trying to figure out how they can emulate x86 without significant performance problems due to the different memory consistency models.


https://www.betaarchive.com/forum/viewtopic.php?f=62&t=37172... ARM64 Windows has leaked, and it can run in QEMU.


I have heard the concern about memory consistency models; can somebody explain the problem a bit more?
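
Roughly: x86 has a strong memory model (TSO), where ordinary stores already behave as releases and ordinary loads as acquires, while ARM's model is weak and lets the hardware reorder plain loads and stores. Compiled x86 code therefore carries no explicit ordering information, so an emulator can't tell which accesses the original program relied on for synchronization and, to be safe, has to treat nearly all of them as acquire/release. A minimal C++ sketch of the gap (the producer/consumer names are mine, purely illustrative):

    #include <atomic>
    #include <cstdio>
    #include <thread>

    std::atomic<int> data{0}, ready{0};

    void producer() {
        data.store(42, std::memory_order_release);  // x86: plain mov; ARM64: stlr
        ready.store(1, std::memory_order_release);  // x86: plain mov; ARM64: stlr
    }

    int consumer() {
        while (ready.load(std::memory_order_acquire) == 0) {}  // x86: mov; ARM64: ldar
        return data.load(std::memory_order_acquire);           // x86: mov; ARM64: ldar
    }

    int main() {
        std::thread t(producer);
        std::printf("%d\n", consumer());  // always prints 42
        t.join();
    }

On x86 the acquire/release ordering is free with ordinary mov instructions; on ARM64 each one costs an ldar/stlr (or a dmb barrier), and an emulator that can't prove an access is thread-private may end up paying that cost on close to every load and store.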


This is basically an emulator though; it will be quite slow.


They showed games (WoT I think) running using this emulation, so maybe not.


"World of Tanks Blitz" - a Phone game


We don't know yet. Overhead could be something like 5% or 25%. We'll see when the reviews come in. For what it's worth, it seems to run x86 apps just fine in Microsoft's own demo (keep in mind it's still a mobile ARM processor, so performance will be similar to Atom/Celeron/Pentium rather than Core i5, etc.).

https://www.youtube.com/watch?v=A_GlGglbu1U


I don't trust product demos, movie trailers, or press releases, so I'll reserve my enthusiasm until someone in the real world has had the chance to get their hands on this.

What I know is this: ARM emulation via QEMU on my Core i7 was pretty slow the last time I tried it. Not unusably slow, but still slow. And I have tons of RAM and a fat CPU.

These laptops, especially if they are made on the cheap side, won't have tons of RAM and a fat CPU. Unless Microsoft pulled some magic tricks, I don't see this emulation being usable.

But I could be wrong.


"faster than QEMU" isn't a particularly high bar to clear, though. In particular if you're only emulating a single process rather than a full guest OS then you can reduce the overhead quite a bit. Plus if you design something from the start for the single-process emulation case, and only need to worry about one guest and one host architecture, and focus on emulation performance as a key goal, you can likely do better than QEMU without too much trouble.


https://www.betaarchive.com/forum/viewtopic.php?f=62&t=37172... ARM64 emulation in QEMU is very slow anyway...


Emulation is slower, but most people will be using a browser/Office/VS Code, etc., so it's just for those extra few apps that might stop you from buying the product if emulation didn't exist.


That might not matter.


x86 emulation will make life easier initially. But if these devices become popular, so will UWP apps.


TBH, there is not much you can do on a phone of that size. There aren't many Windows-exclusive applications that you would want to use on a phone.

Also, it simply doesn't make sense from a businessman's perspective.


The x86 desktop application support is for the desktop use case. For a phone you'd plug it into a dock and use it as a desktop or into a laptop shell and use it like a laptop. Like this for example: https://www.youtube.com/watch?v=KNzatcI9Fqw


Can this be seen as an effect of the Ryzen launch?


Probably, but more likely because of AMD's HEDT (high-end desktop) platform, called Threadripper, which will have up to 16 cores (32 threads).

Before AMD announced Threadripper, Intel had only a 12-core chip on the roadmap for the X299 platform and charged around $1,700 for their 10-core chip. Now they will be charging $2,000 for 18 cores.

Competition is such a nice thing. Glad that AMD is back in the CPU game. Can only be good for us customers.


I'm personally looking forward to the time when this competition drastically lowers the price of mid-range CPUs, for the benefit of the normal people who don't buy $2,000 CPUs.

(Yes, I know that "normal people" don't even buy laptops anymore, let alone desktops. Please excuse my fantasy-world in which people buy desktop computers and even upgrade the amplifiers of their at least 7-piece stereo sound system)


The AMD Ryzen R7 1700 is a $320 8-core/16-thread processor. Intel's cheapest 8-core/16-thread processor is their i7-6900K, which sells for $1,049. Even their 6-core/12-thread i7-6850K is over $600.

IMO the Ryzen R7s have been a huge "mid-range" win for anyone doing any sort of multicore/CPU-intensive work. Without competition, Intel has been gouging the market for the past few years.


> drastically lowers the price of mid-range CPUs

Well, I think the current mid-range Ryzens offer significant value, and I imagine OEMs will start including them in their popular models sooner rather than later.

6 cores at 3.6 GHz with a 4.0 GHz boost for $239.

https://www.amazon.com/AMD-Ryzen-1600X-Processor-YD160XBCAEW...

More than likely we won't see any kind of price war. Instead we'll see minor price fighting on a per-category basis: Intel's mid-range vs AMD's mid-range, etc. There's no race to the bottom with a duopoly.


Competition breeds excellence. :-)

EDIT: I was getting excited about the i7-7820X until I saw it has only 28 PCIe lanes. Talk about being pushed towards the way more expensive 7900X! My relatively cheaper 6850K has 40 lanes; I wonder what the thinking behind this is?


Could someone explain why the number of PCIe lanes is important?


It limits what peripherals can be attached. For instance, a single graphics card uses 16 PCIe lanes, so having only 28 lanes supported by the processor makes it impossible to have two GPUs in SLI mode (ok, not impossible, because the motherboard chipset can add some lanes, but performance will suffer).

NVMe SSD drives also use some lanes (typically 4), and you might want two of them in RAID.
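
A quick lane budget for that worst case:

    2 GPUs in SLI at x16:  2 x 16 = 32 lanes
    2 NVMe SSDs in RAID:   2 x  4 =  8 lanes
    total:                          40 lanes

which is exactly what the 40-lane parts offer, while a 28-lane CPU tops out at x16 + x8 for the GPUs plus one x4 drive (16 + 8 + 4 = 28).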


Numerous tests show that performance is not meaningfully impacted by running two GPUs on x8 PCIe links vs x16, especially when using PCIe 3.0.
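
For rough context, PCIe 3.0 carries about 985 MB/s per lane after 128b/130b encoding:

    x8:   8 x 985 MB/s ≈  7.9 GB/s
    x16: 16 x 985 MB/s ≈ 15.8 GB/s

and the tests mentioned above suggest current GPUs rarely saturate even the x8 figure during gameplay.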


It's a huge difference for minimum frame times at 4K if you intend to get a smooth 120 fps.


That's with current-gen GPUs, though. It could change over the next few years.


What are the chances you will still be running the CPU and motherboard you buy today in a few years?


So, if you don't plan on an NVMe SSD plus two GPUs in an SLI configuration, 16 PCIe lanes will be enough.


Is there any way to see it as something else?


Apparently, Intel has years of runway vs AMD. The chips are literally designed and sitting there, only to be released if AMD comes up with something. Intel isn't scrambling in any sense, just announcing stuff that is already designed and built. All AMD did was force Intel to release it earlier than its monopoly position would otherwise have required.


This comment went as high as +4 before being downvoted to -1.


The downvote behavior is interesting. I simply mentioned that Intel has an early release of a previously constructed part. Monopolies behave in predictable ways.


IIRC, the ThinkPad line was not affected, and wasn't it an external partner that planted the rootkit?


> the ThinkPad line was not affected

Correct.

> wasn't it an external partner that planted the rootkit?

Nope. The first time it was third-party software, Superfish, but Lenovo preinstalled it along with a self-signed root certificate authority used to MITM traffic[0]. The second time it was Lenovo's own software, Lenovo Service Engine, which they also installed themselves[1].

[0]: http://www.zdnet.com/article/superfish-stop-using-your-lenov...

[1]: http://www.zdnet.com/article/lenovo-rootkit-ensured-its-soft...


Even if so, it reflects poorly on Lenovo.


Because they are companies, and need to make money?


It's Apple; what did you expect?


They open-sourced Swift. Is that Apple or not, in your opinion?


I think open sourcing a language you built is rather different from acquiring companies and locking out competition.

Apple has more incentive to open source Swift than they do to let anyone use content from a company they acquire (remember FoundationDB's fate?). I think AuthenTec went the same route, but I am uncertain.


What is the innovation here?


The OP sounds like sarcasm to me :)

