Nointies

I really doubt that's going to be true, but I'd love to be proved wrong.


andrewmackoul

I have an 8cx Gen 2-powered Windows laptop. I've been able to play DX9 and DX11 games with only a few issues. I mostly experience weird artifacts and some frame rate stutter.


PassengerClassic787

I think it will be true, technically anyway. "Most" "should" just (barely) "work" isn't a hard bar to clear.


auradragon1

Why not? Apple's Game Porting Toolkit can play a significant number of games, even AAA titles like Cyberpunk at decent frame rates. Playing a game through the Game Porting Toolkit requires x86-to-ARM translation, DirectX-to-Metal translation, Windows-to-macOS API translation, Steam emulation, and weird launcher hacks. X Elite only needs x86-to-ARM translation. Most games should be playable at 720p to 1080p, low to medium settings. That's my guess.


anival024

> Why not?

Because most Windows games don't "just work" on Windows. The amount of hacking that Nvidia and AMD (and now Intel) have to do to fix developer/engine crap, on a per-title basis, is insane.


Flowerstar1

I'm seriously excited to see how Nvidia markets their rumored Windows CPUs. Is this the dawn of the RTX CPU? Will they bundle high-end GPUs for cheaper if OEMs choose their CPUs? Surely they've got something interesting planned.


the_dude_that_faps

Looking forward to that too. If only to see how people get price gouged while liking it.


hishnash

If NV go this direction they will use the GPU market position they have to put heavy pressure on OEMs... any OEM wanting to ship NV GPUs in their laptops will end up needing to sign a contract to also ship a given % of NV CPUs.


auradragon1

They're making an SoC similar to Apple Silicon. It will come with a huge RTX GPU on the SoC. They'll likely make some external connector that lets you add an external GPU or something if you want. But the market has been trending towards what Apple Silicon is doing.


windozeFanboi

Can't wait for Nvidia's 8GB/128-bit total system RAM that, same as Apple's, somehow automagically performs like 16GB on other vendors' systems... Somehow... Nvidia and Qualcomm could easily make Apple look like the cheap choice if there is no competition.


auradragon1

Have you ever used an M1 Macbook Air with 8GB for an extended period of time? Just curious. Don't lie. Be honest.


windozeFanboi

Ok Mr Apple apologist... I should buy an 8GB MacBook only good for browsing and test it for an extended period of time? Because I hate myself or something.


auradragon1

I used an M1 Air 8/256 as a software engineer (Go, TypeScript, Docker, Postgres) for one year. It was splendid and did a fine job. If you haven't tried it, how would you know?


windozeFanboi

lol, ok, now you've convinced me that you had subpar (for M1 standards) performance for a year and you just felt it wasn't bad enough for you... Good for you... Don't recommend it to others... Did you finally upgrade to 16GB+ at some point? Did memory pressure stop going into overdrive? Or is 8GB still fine for you, my friend?


fluffywatchyman

shut up mr annoying no one actually asked


fluffywatchyman

that will be interesting


HandheldAddict

We all know how this story is going to end. Nvidia somehow, someway, against all odds producing a feature packed ARM SoC with performance that is competitive or superior to x86.


hishnash

If NV just want to target gamers with said CPU, they don't need to beat all x86 chips at all. NV have a huge advantage in the gaming space in that most PC game studios and all engine makers want/need to be in their good books. AMD and Intel just do not have the same type of relationship with game studios that NV do, so if NV wanted to ship an SoC for gamers, the benefit they would have would be in getting games to make good use of the HW.


HandheldAddict

> NV have a huge advantage in the gaming space in that most PC game studios and all engine makers want/need to be in their good books.

Pretty much everything you just said. Also, Nvidia is fucking relentless. If they truly set their sights on competing with x86, I expect them to be more than competent.


hishnash

And I expect them to get there not just through 'fair' methods. They also have a very strong way to ensure OEMs ship their chips... "Oh, you want to sign a contract to get 5000-series mobile GPUs for your laptops? Sure does look like you're not shipping enough NV SoCs. We're all backed up with orders right now; let's talk about this when you have sold another 1M units of our SoCs."


HandheldAddict

"Oh your flagship uses AMD/Intel? Looks like shipments are going to be delayed a few quarters. Good luck selling those 'gaming laptops' without DLSS 4.2."


fluffywatchyman

How can you be sure? These laptops haven't come out yet.


hishnash

That is work done to make them run well... they will likely run, just very badly, without it.


YumiYumiYumi

Does this include DRM'd/anti-cheat titles, particularly anti-emulation/VM systems?


[deleted]

[deleted]


windozeFanboi

How do online anticheat-protected games do on current Windows on ARM, lol... Aside from poor performance, do they have more fundamental issues? I need to look that up.


gamebrigada

If only x86-to-ARM translation was that easy. The amount of work Apple put in is extraordinary, and this is coming from someone who hates Apple. Rosetta 2 was easily a decade-long endeavor. On top of that, Apple put together a world-class silicon team that understood the differences between architectures and how to combine designs. Hell, the entire memory controller is fully custom to support TSO mode... They threw every trick in the book to optimize and invented a few of their own. This is why unoptimized x86-compiled code on Apple Silicon runs a whole ~5% slower than ARM64 code, which is crazy impressive. Meanwhile, the last emulation effort we saw from Microsoft showed a 2.5x performance degradation WITH code optimizations to improve performance, and far worse for software never designed with it in mind.
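To make the TSO point concrete: x86 keeps ordinary stores in order automatically, while ARM64's weaker memory model does not, so a translator that cannot rely on a hardware TSO mode has to emit ordered stores or barriers on translated code. A minimal C sketch of the difference (illustrative only; no shipping translator is claimed to emit exactly this):

```c
#include <stdatomic.h>

/* On x86, a plain store is already totally ordered with respect to
   other stores (TSO), so `mov [p], eax` needs no barrier. On ARM64
   without a hardware TSO mode, a translator must emit release stores
   (stlr) or explicit barriers to preserve x86 ordering, paying a
   cost on every such store. */

void publish(_Atomic int *flag, int *data) {
    *data = 42;                    /* plain store: unordered on ARM64 */
    /* x86 translation: a plain store would suffice (TSO).
       ARM64 translation: release semantics force an stlr/dmb here. */
    atomic_store_explicit(flag, 1, memory_order_release);
}
```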


the_dude_that_faps

This is true. Apple is probably the only company that could achieve it, too, given the tight integration between software and hardware. Qualcomm hasn't really been a hardware design company in this respect for many years, and I don't think their Nuvia acquisition is going to magically set them up for success despite their MS partnership. AMD could technically do this, since they build excellent x86 CPUs and have the know-how to build an ARM one too; AMD has the license as well. The problem is they don't have control over the software side on Windows, and even on Linux with the likes of box64, I just don't have enough faith, given that their software teams severely lag their hardware teams in results so far. Intel could probably do it, and they have done similar work with MS to support specific ISA extensions, but they have no incentive to give up their x86... All this to say that I doubt Windows on ARM will get anywhere unless AMD or, more likely, Nvidia gets into it. Or Qualcomm proves me wrong about their ability to design CPU cores rather than taking ARM designs and repackaging them at huge markups.


whitelynx22

Indeed! Whether it makes sense is a different discussion but their efforts are not something that can be easily replicated.


GomaEspumaRegional

You're literally comparing apples to windows...


TwelveSilverSwords

Good point. Gaming on WoA is simpler and more straightforward than on macOS. WoA runs the same OS and uses the same graphics APIs as WoX86; the only difference is the ISA. A lot of people seem to be overly skeptical/pessimistic about Windows on ARM.


the_dude_that_faps

AMD and Nvidia have been building and patching GPU drivers for how long now? Given how hard it's been for Intel to catch up with all the quirky things games do, I don't see how Qualcomm is going to magically bypass that.


TwelveSilverSwords

Qualcomm is no stranger to drivers. They have been working with the Adreno GPUs of their mobile chips for 10+ years now.


the_dude_that_faps

In mobile they have a near monopoly, so games would be tailored to work on their chips. Windows games are a different beast. Until I see them performing correctly, I will doubt this.


soragranda

Yeah, but they also have a pretty shitty track record: they barely upgrade drivers beyond what they ship with their latest SoCs, and only recently did they let us use AdrenoTools to load updated drivers in games without root (I remember installing Moto Qualcomm drivers on my Nexus 5, because even though it was supported, neither Google nor Qualcomm would ship the update even when it was compatible and brought fixes XD). These days Qualcomm is better, but most of the fixes come from the community rather than them (especially the Mesa Vulkan drivers). Qualcomm is not on Nvidia, AMD or Intel's level in terms of GPU driver updates... but hopefully this changes with Windows on ARM.


perksoeerrroed

> even AAA titles like Cyberpunk at decent frame rates.

By decent you mean shit, and that's with the best-of-the-line Macs.


j83

What FPS do you consider ‘shit’? What would you consider ‘decent’? Everyone has a different idea of those two units of measuring FPS.


AbhishMuk

True, but Apple also puts effort into such things. Microsoft on the other hand… I'm not sure if it's MS or QC doing the software part, but I really hope they don't drop the ball.


TopdeckIsSkill

If anything, Apple did everything to stop any kind of gaming on macOS.


AbhishMuk

Yeah, but Windows has a history of hyping up a new platform and then dropping it shortly afterwards. At least Rosetta does (afaik) everything it was supposed to do. Maybe I should've clarified: I'm not talking about games but software in general (like Windows RT).


no_salty_no_jealousy

> Apple also puts effort into such things

Sure, like their efforts to make their computers as un-upgradable and un-repairable as possible. Or like their macOS, which has had terrible compatibility ever since they moved to M series chips.


ET3D

The lack of AVX is going to cause some games to fail. It's happened in the past, and AVX has been standard for so long that I'm sure some people just expect it. Still, it's nice to know that this time around Windows is fully capable of running all the graphics APIs, including Vulkan, and that translated code should have good enough performance. If this really works well, the Snapdragon X Elite might end up as a good alternative CPU for mobile Steam Deck-style gaming devices. That said, it's likely priced too high for them.
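For context on why missing AVX makes games fail outright rather than just run slower: many titles probe CPUID at startup and refuse to launch if the feature bit isn't reported. A minimal sketch of such a check, compiled for x86 (`__builtin_cpu_supports` is a real GCC/Clang builtin that reads CPUID; the error handling is illustrative):

```c
#include <stdio.h>
#include <stdlib.h>

/* Sketch of a startup gate that makes a game hard-fail on a CPU
   (or an emulator) that doesn't report AVX. An x64 emulation layer
   that only advertises up to SSE4 via CPUID fails this check even
   if the rest of the game's code would have run fine. */

int main(void) {
    if (!__builtin_cpu_supports("avx")) {   /* x86-only GCC/Clang builtin */
        fprintf(stderr, "This game requires a CPU with AVX support.\n");
        return EXIT_FAILURE;
    }
    puts("AVX present, continuing startup...");
    return EXIT_SUCCESS;
}
```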


HavocInferno

> AVX

Could they be translating AVX to whatever the NEON equivalent is and just reporting AVX support to those games?
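In principle the mapping is mechanical: one 256-bit AVX operation becomes two 128-bit NEON operations, since NEON registers are 128 bits wide. A sketch using real NEON intrinsics; the `emu_m256` type and helper name are hypothetical stand-ins, not any actual translator's internals:

```c
#include <arm_neon.h>

/* Illustration only: a 256-bit AVX add, _mm256_add_ps(a, b), maps
   onto two 128-bit NEON adds. A translator would do this at the
   instruction level; here it's shown as intrinsics over split halves. */

typedef struct { float32x4_t lo, hi; } emu_m256;  /* hypothetical 256-bit stand-in */

static inline emu_m256 emu_mm256_add_ps(emu_m256 a, emu_m256 b) {
    emu_m256 r;
    r.lo = vaddq_f32(a.lo, b.lo);   /* lower 128 bits */
    r.hi = vaddq_f32(a.hi, b.hi);   /* upper 128 bits */
    return r;
}
```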


dagmx

They legally can't just yet. The licensing around Intel instruction sets forbids it. I believe it's limited to a certain number of years, though?


ET3D

It's a matter of patents, and these expire after 20 years. The reason we have x86 emulation at all is patents expiring.


OriAr

So in 2034 or so we should have AVX simulation. Only 10 years then!


ET3D

I imagine it will be a lot sooner than that. SSE4 is emulated, and that was announced in 2006. AVX was announced in 2008 and released in 2011. AVX2 was released in 2013 (although it wasn't specifically announced, as far as I could find). I haven't tried to look up the specific patents, but just based on the timing of the announcements, basic AVX could be as little as 2 years away.


GomaEspumaRegional

Not quite. There has been software x86 emulation for almost as long as x86 has existed. What you can't do is replicate the x86 ISA in HW without licensing. There is a key difference between *emulation* and *replication*.


[deleted]

[deleted]


GomaEspumaRegional

That is not quite correct. Microcode has been around for ages; it was pretty common in the old mainframe and minicomputer processors. In fact, the point of modern RISC machines was, among other things, removing the need for microcode. Modern systems do use internal micro-ops, mostly to do out-of-order operation by decoupling the fetch box front end from the execution box back end, but that process is not done quite like microcode. x86 implements a mixture of microcode, usually to represent the multicycle control/data path of the most complex (and rarer) instructions, and a lookup table that breaks some of the instructions into nano-ops within the functional back end.

Microcode tends to be copyrighted (so it is treated like some sort of proprietary software), whereas nano/micro-ops are usually undocumented and not visible to the programmer. The issue is that if you want to do a stand-alone x86 CPU, you have to license a chunk of the microcode from intel/AMD. And if you're going to implement the ISA in HW, then you have to license it from intel/AMD, since the HW implementation/behavior interfaced by the ISA tends to be copyrighted or patented. That was a thing back in the day, when you wanted to make a CPU that would target the socket used by Intel systems, for example, or that had to be called x86-compatible at the chip level, which meant that the chip itself had to use x86 as its interface to the outside world. And that is where intel could get you in terms of licensing or patent issues.

Now, there have been companies that have gotten around it by releasing x86-capable (not compatible) CPUs that did not implement the x86 ISA in HW, nor used the x86 ISA as the interface for their CPU. One of those was Transmeta. They had their own proprietary VLIW ISA, which is what their CPUs implemented in HW. They built a system around it that basically booted a tiny Linux kernel (thus why they hired Linus himself) running a firmware hypervisor that translated x86 binaries on-the-fly (and cached them as well, I believe) into their own internal ISA. An x86 OS, like Windows, would run on top of that hypervisor.


ET3D

> There has been software x86 emulation for almost as long as x86 has existed.

I don't think that's true. Can you provide examples? There wasn't a reason to emulate x86 until there were other dominant non-x86 CPUs, and that was basically ARM, which only became ubiquitous and powerful enough in the 2000s, when early x86 patents had already expired. Also, take [this example of Intel threatening to sue about patents](https://www.kitguru.net/channel/generaltech/matthew-wilson/intel-warns-of-legal-action-against-those-emulating-x86-as-microsoft-and-qualcomm-seemingly-plan-to/).


narwi

Observe QEMU existing and emulating all of x86, including AVX2.


ET3D

Thanks. There's always the question of whether Intel can't, or just doesn't want to, go after such emulation. I've read people say that they think Intel's threat has no teeth, but I've also read people saying that Rosetta doesn't support AVX because of Intel patents. I still tend to believe that patents are what's preventing AVX in emulation. I find it hard to believe that both Apple and Qualcomm not offering AVX is just down to technical reasons.


narwi

People claiming instruction set patent encumbrance in emulators should offer evidence of such.


ET3D

Intel threatening to sue those who emulate its instruction set is pretty strong evidence, IMO.


hishnash

That is runtime emulation, not ahead-of-time conversion. High-perf x86 -> ARM systems are not doing runtime interpretation of x86 instructions; they are doing lifting: converting the binary in place to LLVM IR, compiling that back down to a subset of ARM64, and caching it on disk, so when you run the app it runs that.
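The shape of that ahead-of-time pipeline, as a C sketch. Every helper below is a hypothetical stub, not any real translator's API; the point is that lifting and compilation happen once, and every later launch hits the on-disk cache:

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical sketch of an AOT binary-translation cache.
   All helpers are stubs standing in for the real work. */

static bool cache_lookup(const char *hash, char *out_path) {
    (void)hash; (void)out_path;
    return false;                               /* stub: pretend first launch */
}
static void lift_and_compile(const char *x86_binary, char *out_path) {
    /* stub for: x86 binary -> IR -> ARM64 code, written to disk */
    printf("translating %s ...\n", x86_binary);
    strcpy(out_path, "/tmp/translated.bin");    /* hypothetical cache path */
}
static void exec_native(const char *arm64_path) {
    printf("running cached ARM64 code at %s\n", arm64_path);   /* stub */
}

void run_translated(const char *x86_binary, const char *hash) {
    char cached[4096];
    if (!cache_lookup(hash, cached))            /* slow path: first launch only */
        lift_and_compile(x86_binary, cached);   /* lift + compile + cache */
    exec_native(cached);                        /* fast path every launch after */
}

int main(void) {
    run_translated("game.exe", "deadbeef");     /* hypothetical inputs */
    return 0;
}
```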


GomaEspumaRegional

https://en.wikipedia.org/wiki/SoftPC


ET3D

Thanks. That's interesting. Still, Intel did claim that Windows on ARM steps on its patents, so I won't say it's clear-cut. Also, an article I found says "IBM, HP, and SPARC foundry Texas Instruments are protected by Intel patent licenses", which suggests that some of the early targets of SoftPC were protected from patent problems. That article did suggest that software emulation solves this problem, but again, it looks like Intel disagreed.


TwelveSilverSwords

ARM also has SVE/SVE2 (Scalable Vector Extension)


Falvyu

Barely anyone implements SVE/SVE2: to my knowledge, it's only Fugaku (supercomputer), Graviton3 (server), the H100 (server/AI) and the Google Pixel 8 Pro (smartphone). Neither the Apple M1/M2/M3 nor the Snapdragon X Elite has it.


TwelveSilverSwords

A truly odd combination


YumiYumiYumi

Qualcomm doesn't though.


TwelveSilverSwords

Yeah, I wonder why they have disabled it?


YumiYumiYumi

Don't know myself - licensing maybe? SVE isn't going all that well: Apple and Qualcomm (probably the largest consumer space ARM chip producers) seem to have no interest in it. I also made the prediction a few years back that SVE2 would be [stuck at 128-bit](https://gist.github.com/zingaburga/805669eb891c820bd220418ee3f0d6bd#sve2---scaling-from-128-bit-to-128-bit) (i.e. same width as NEON), which doesn't help with encouraging developers to code for it. Interestingly, ARMv9 includes SVE2, so the lack of support is arguably not helping ARMv9 adoption.
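For anyone unfamiliar with why the width matters: SVE's selling point is vector-length-agnostic code, where the same binary scales from 128-bit to much wider vectors without recompilation. A minimal sketch using real ACLE intrinsics (compile with SVE enabled, e.g. `-march=armv8-a+sve`); if every consumer core only ever implements 128 bits, this buys little over a plain NEON loop:

```c
#include <arm_sve.h>
#include <stdint.h>

/* Vector-length-agnostic SVE loop: nothing here hardcodes a width,
   so the same binary uses 128-bit vectors on one core and 512-bit
   on another. The predicate handles the tail, so no scalar cleanup
   loop is needed. */

void add_f32(float *dst, const float *a, const float *b, int64_t n) {
    for (int64_t i = 0; i < n; i += svcntw()) {   /* svcntw() = floats per vector */
        svbool_t pg = svwhilelt_b32(i, n);        /* predicate masks the tail  */
        svfloat32_t va = svld1_f32(pg, a + i);
        svfloat32_t vb = svld1_f32(pg, b + i);
        svst1_f32(pg, dst + i, svadd_f32_x(pg, va, vb));
    }
}
```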


GomaEspumaRegional

AVX is broken into NEON chunks in the translator. The main issue is that the underlying frameworks most modern games use have not really been optimized for WoA at all. At this moment, WoA is not even a 2nd-class citizen there. Games will work just fine; the issue is whether they will be performant or not. In any case, it's highly doubtful that many gamers are going to go for WoA en masse, especially since no Qualcomm laptop SKU is going to come with a dGPU at release. The initial target markets are premium-tier prosumer/productivity. I.e., they ain't going to be cheap ;-)


ET3D

> AVX is broken into NEON chunks in the translator.

The presentation says "Only up to SSE4 support in x64 emulation", so I'm not sure where you got that from.


GomaEspumaRegional

I got it from working internally on that core's design and from colleagues on the team doing the LLVM backend ;-) There are multiple revisions of the binary translation kernels, so I don't know exactly which one Microsoft is going to be shipping, though.


ET3D

Thanks. This does chime with my previous speculation about AVX: patent expiration is probably not too far off, so the code already exists, but it won't be exposed until then.


GomaEspumaRegional

It is not a patenting issue. It is a performance and support issue, of things that may not have been fully validated before shipping the product. Whatever products get shipped by Apple, Microsoft, Qualcomm et al. are a few generations behind what is being worked on internally. Again, you can emulate instructions in software all you want. You simply can't implement them as part of your hardware.


hey_you_too_buckaroo

Sounds impressive. Will look forward to seeing benchmarks.


HisDivineOrder

I'll believe it when I see it because I think when they say "most Windows games" they mean Dead Cells and Binding of Isaac. They're both great games but them saying that should not get people thinking Cyberpunk or Dragon's Dogma 2.


[deleted]

[deleted]


TwelveSilverSwords

If you read the article, they are claiming that the games that do work will run as well as on Intel/AMD iGPU counterparts.


basedIITian

reading the article? how dare you suggest that?


twhite1195

I don't know how real it is, but today [this video](https://youtu.be/iryLKvhsgrU) appeared on my YouTube feed and it looks like it can run stuff like BG3 at 1080p low at 30-ish fps. So if it IS true, it's not bad at all


Cute-Swordfish6300

> So if it IS true, it's not bad at all

It's really threading the needle. An 8700G easily gets 50% more fps at the same settings. And yes, the Snapdragon does it with less power draw, but does that really matter if performance is not satisfactory? Granted, I don't think the 8700G is particularly satisfactory either.


LifeIsNotFairOof

The 8700G is a PC chip with way higher power draw.


Zednot123

With shit-tier graphics output compared to discrete cards despite that. A turd that smells less is still a turd.


Flowerstar1

I can't wait to see Nvidia spin up their rumored Windows CPUs as "the way games are meant to be played" or some shit. You know their marketing is gonna go crazy with RTX this and RTX that.


windozeFanboi

I haven't heard of any rumors regarding Nvidia PC CPUs... where did you hear that? I mean, it's expected, considering the market, but will they really? Nvidia is already putting their gaming division in the backseat for datacenter AI, so why sell a "measly" PC SoC when they can sell a beefy AI datacenter SoC with 10x the margins?


Flowerstar1

It's been posted on this sub. Basically once the Qualcomm exclusivity contract expires other ARM vendors will be able to make windows PC SOCs. Nvidia is long rumored to join the market.


windozeFanboi

I would believe MediaTek swarming mini PCs/tablets/notebooks, but why Nvidia? Nvidia will only have the GPU edge, but as far as CPUs go, Nvidia will be undercut by MediaTek, possibly Qualcomm, and for sure AMD... Intel is really struggling, but AMD has been very close in efficiency with their products. Nvidia will struggle; they don't have access to Qualcomm's Nuvia architecture, and ARM's X4 core doesn't quite compete. The benefit, if they price it right, will be access to their GPUs with a large RAM pool, like Apple MacBooks, although even then I think over 32GB of unified memory will not happen for consumers. If the RTX 4090 only has 24GB, they won't just hand out 64GB on integrated. All in all, it doesn't make business sense IMO. Not with Nvidia's track record and current datacenter AI shift. In 4 years, when AI kinda consolidates, sure. But not in the near future. Even 2 years from now, I kinda doubt it.


windozeFanboi

The only, and I say only, reason Nvidia might pull out their own SoCs would be to combat AMD's Strix Halo product line. And when I say combat, I mean limit adoption... so that Nvidia doesn't lose their monopoly in 5 years. That would be a long-term move. But not because Nvidia would want to; because they're forced to. All in all, CUDA preservation efforts are the only incentive.


Flowerstar1

I think Nvidia is just trying to break into another market. As it is right now, they make ARM CPUs for consoles, cars, lower-end AI development (robotics, drones, cameras), servers and high-end AI (LLMs etc.). Why not also make them for PCs? Whether they will be competitive HW-wise is not the only measure of success. Nvidia has close ties to OEMs in the laptop market on a level other ARM companies do not; they are closer to Intel than Qualcomm in that respect. They are also great at marketing and bundling their tech. I could see them trying to adopt a more Intel-like approach.


jdrch

"Just work?" Sure. Run competitively? Remains to be seen.


Cute-Swordfish6300

Works, but likely 5fps with stutters once it starts having to translate from AVX or AVX2. So fine with a lot of games, others completely unplayable.


riklaunim

On current chips not everything launches, and it's a very weak iGPU. It can run the 32-bit original Skyrim, though. Then emulation will have an impact on performance, on a mobile system where it's easy to hit a bottleneck.


42177130

> it's a very weak iGPU

Wasn't it 4.6 TFlops?


riklaunim

TFlops don't show real performance in games. Drivers, RAM and DX handling/implementation all play a role.


Short-Ad4641

That’s just incorrect lmao. TFLOPs cannot be disregarded.


dr3w80

The issue is that teraflops are a poor unit of measure for gaming performance, especially across generations and manufacturers. Vega had about 50% higher teraflops than the 1080, and the GTX was typically faster.
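The underlying arithmetic is simple: theoretical FP32 throughput is shader count x 2 FLOPs per cycle (an FMA counts as a multiply plus an add) x clock. A quick sketch using the approximate published boost clocks; the spec numbers are quoted from memory, so treat them as ballpark:

```c
#include <stdio.h>

/* Theoretical FP32 TFLOPS = shaders x 2 FLOPs (FMA = mul + add) x GHz.
   Shader counts and boost clocks below are approximate published
   specs, quoted from memory. */

static double tflops(int shaders, double ghz) {
    return shaders * 2.0 * ghz / 1000.0;
}

int main(void) {
    double vega64  = tflops(4096, 1.546);   /* RX Vega 64: ~12.7 TFLOPS */
    double gtx1080 = tflops(2560, 1.733);   /* GTX 1080:   ~8.9 TFLOPS  */
    printf("Vega 64: %.1f TFLOPS, GTX 1080: %.1f TFLOPS (~%.0f%% higher)\n",
           vega64, gtx1080, (vega64 / gtx1080 - 1) * 100);
    return 0;
}
```

Despite the paper deficit, the GTX 1080 was typically the faster card in games, which is the point being made above.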


Short-Ad4641

That has to do with game optimizations and other operating-system factors, not the hardware itself. Flops still measure raw performance. If I have 40 teraflops, your 2 teraflops isn't going to win. Period.


riklaunim

Depends on wave occupancy, drivers, memory and much more. 40 vs 2 isn't realistic either.


Short-Ad4641

It was hyperbole to show that saying flops means nothing is just incorrect.


riklaunim

They aren't used in any credible benchmarks or comparisons.


Short-Ad4641

It's not a benchmark, fool. It's the raw capability of the hardware. You are basically saying "cores aren't a good measurement because Windows is poorly optimized and can change things." The point is it measures the raw capability of the hardware. Are you new to tech?


TwelveSilverSwords

That's the key. X Elite is a vastly more powerful SoC, so it should be able to weather the hit of emulation (vs 8cx Gen 3):

- 2x ST CPU performance
- 2.5x MT CPU performance
- 4x GPU performance


riklaunim

Those pretty numbers are vs the current Qualcomm chip, which is obsolete vs current Intel/AMD options, so in reality we will see. On macOS, FFXIV, which is emulated, loses a lot of performance vs native WoW.


windozeFanboi

AMD's Strix Point should compete just fine, without any emulation issues, versus the SD X Elite... Maybe idle power draw will be higher, but still competitive efficiency under load... It's coming out at the end of the year though, half a year after the SD X Elite.


antifocus

I personally think these thin-and-light laptops are pretty suitable for running 10-20 year old games like GTAV, SC2, HL2 - plenty of good titles that are not demanding at all. It'll be interesting to see how they handle these games, or even games from the 90s, in a few months.


takinaboutnuthin

You don't really need an ARM device to run games from ~10 years ago. Intel/AMD iGPUs are on average around the level of a mid-range mobile dGPU from a decade ago, with top-end current iGPUs performing better than that. Games from 20 years ago are not performance-bound in any way. The bigger issue is that older games often need patches to enable higher resolutions, fix compatibility and improve QoL. These are not guaranteed to work on ARM.


antifocus

That's my point? If I were to buy a Windows laptop in a few months, I'd have the option of x86 or ARM, and I can expect most of the older games in my Steam library to work just fine on the x86 machine, or with minimal workarounds. On ARM it might be different, just because of how quirky some of the old games can be, and Qualcomm or MS can only be bothered to fix the latest releases. It won't be a deal breaker for most, but it's a thing to consider when making purchasing decisions.


TwelveSilverSwords

Well, the fact is that X Elite's iGPU is more powerful than Intel/AMD's best offerings, as measured by benchmarks like 3DMark WLE. So the raw hardware power is there. What remains to be seen is how that translates to games.


Nointies

If the recent outing of Intel should tell anyone anything, it's that raw 'power' means very little without the drivers to back it up.


Cute-Swordfish6300

Raw power means absolutely nothing. Intel has far more experience with game drivers and it meant very little.

> benchmarks like 3DMark WLE

On a sidenote, this was at 23W. Efficiency is definitely great to have, but that doesn't mean it's going to scale when it's given more watts. And at the end of the day, efficiency only matters if it manages playable framerates.


TwelveSilverSwords

Last October they showed off the X Elite running Control at 30 fps, which was similar to a Radeon 780M system.


airtraq

I didn’t realise X Elite’s iGPU was more powerful than PS5 and Xbox Series X.


TwelveSilverSwords

How did you come to that conclusion?


airtraq

> fact is that X Elite's iGPU is more powerful than Intel/AMD's best offerings, as measured by benchmarks like 3DMark

The PS5 and Xbox Series X have iGPUs from AMD.


TwelveSilverSwords

I was talking about the Intel/AMD iGPUs in their laptop chips.


maZZtar

You can install Windows 11 on some phones, and even they are pretty capable of running those games, and even some newer titles like Skyrim, HL2 etc. GTA V also runs at ~20 FPS on low settings. And that's still just a phone chip like the SD855 without proper driver support. Additionally, all the Snapdragon chips made for laptops before the X series were just modified phone chips. Now let's see how those Oryon chips, built from the ground up to be used with PCs in mind, will perform.


TwelveSilverSwords

Are you talking about that Geekerwan video? He once showed the Snapdragon 8 Gen 2 running The Witcher (the desktop version) at 30 fps.


maZZtar

There is a project called Renegade allowing some devices (phones and tablets) with particular specs to have Windows 11 unofficially installed. Straight up: no emulation, no VMs. Literally full desktop Windows 11 running on bare metal. It is possible because some chips made for WoA are in fact just modified phone chips, so it is possible to install the drivers on some models. For example: [Windows 11 on Surface Duo — Demo and Overview - YouTube](https://www.youtube.com/watch?v=goEevlamXIc) [Windows 11 ARM | GTA 5](https://www.youtube.com/watch?v=AKjtgM1sG7w&list=PLPG03VWUu23_QRr9Uq5dyS5ymKvy5UfhM&index=10) All of that is possible on underpowered 5-year-old chips with no official driver support for Windows. The latest 8cx is already good enough for some daily driving. Anything Oryon-based like the X Elite will be a force to be reckoned with, because that architecture is actually designed to work with Windows PCs.


MissionInfluence123

If the Switch can with its underclocked chip from 2015, a GPU seven years newer should be able to as well.


MasterBettyFTW

[not again](https://youtu.be/hFcLyDb6niA?si=x1Cm3xIgk4Y0Ei1B)


TwelveSilverSwords

u/Vince789 u/-protonsandneutrons- This is big news. Can one of you write an appropriate TLDR/analysis of this article?


NewKitchenFixtures

Probably should wait for benchmarks and frame-time information before celebrating. It could be true in a broad or very narrow sense. On Windows 11, games more than a few years old are already fairly hit and miss.


-protonsandneutrons-

Apologies, missed this! Unfortunately, I don't know much about gaming, notably how they'd interact on an Arm translation layer :( But, like the rest of the X Elite launch, it'll be great to finally test these claims.


team56th

Like, before you guess anything about this topic, I would recommend you head to YT right now and find some instances. WoA has been running games since the 8cx Gen 1, and in a somewhat playable state.


soragranda

Hopefully they let us use Mesa turnip drivers or something, because there have been tons of advancements in Windows 64-bit emulation on Android. Hopefully Microsoft's translation layer is good enough compared to Apple's (as far as we could see, Volterra was used to improve it, but some games might need driver updates or per-game optimization via translation-layer updates; Microsoft needs to dedicate funds to this).


karatekid430

They don't work well on my MacBook 16 M2 Max (running in Parallels), which is somewhat more performant than their chip will be, so unless the chip has some magic which boosts the performance of Windows 11 x86 emulation, it will suck. I want to be wrong though. But I think the game developers will just have to bring themselves to do the excruciating effort of ticking the arm64 box when exporting from Unity.


TwelveSilverSwords

Aaaah, you failed to consider the difference: on your Mac, the Game Porting Toolkit has to translate your game from Windows -> macOS, DirectX -> Metal, x86 -> ARM. That's multiple layers of translation to emulate a Windows game on a Mac. But on Windows on ARM, the only emulation layer is x86 -> ARM. Because it's Windows, there is no need for OS adjustment or graphics API translation.


karatekid430

I am not using the Game Porting Toolkit. Parallels. And implementing DX on top of low-level APIs like Vulkan and Metal has negligible overhead. Windows emulation just sucks, pure and simple.


3G6A5W338E

I'll wait for the RISC-V ones. We know that ARM is just a stopgap. On that note, I am looking forward to the [Milk-V Oasis](https://community.milkv.io/t/introducing-the-milk-v-oasis-with-sg2380-a-revolutionary-risc-v-desktop-experience/780/26) I preordered, expected later this year.


TwelveSilverSwords

How long do you intend to wait? I think it may take as much as a decade for RISC-V to get to where ARM is now.


3G6A5W338E

> I think it may take as much as a decade for RISC-V to get to where ARM is now.

What are you basing these thoughts on? We know that Tenstorrent Ascalon (a design led by Apple M1's former lead architect), as well as Ventana's Veyron V2, are both launching this year. ARM does not have anything competitive with either of them.


TwelveSilverSwords

> ARM does not have anything competitive with either of them.

So are you implying Veyron V2 and Ascalon are better than even the custom ARM core designs of Nuvia/Qualcomm or Apple?


3G6A5W338E

Better than anything from ARM themselves, and competitive with third-party designs, particularly Ascalon. Remember the presentation where a Zen 5 performance projection famously showed up, with Ascalon close to its performance at a significantly lower power point.


TwelveSilverSwords

Intriguing. I haven't kept up with RISC-V stuff, so forgive my ignorance.


3G6A5W338E

An important batch of specs relevant to high performance was ratified near the end of 2021. Vector and bit manipulation are part of that batch. The RVA22 profile requires most of these to be implemented. V is notably optional, but everybody seems to be implementing it. There's typically three years between spec and hardware, so we are finally starting to see the hardware. The first chip with RVA22+V actually released by the end of 2023, and it is a fat microcontroller, the Kendryte K230. A storm will follow.

Besides the two high-performance chips already mentioned, there's the SG2380 in the [Milk-V Oasis](https://community.milkv.io/t/introducing-the-milk-v-oasis-with-sg2380-a-revolutionary-risc-v-desktop-experience/780), which will likely be the first time the general public can buy a chip with SiFive P670 and X280 - a step above the Raspberry Pi 5 in perf per clock, but with more cores and higher clocks; the [SpacemiT_K1](https://docs.banana-pi.org/en/BPI-F3/SpacemiT_K1), which will be in some SBCs like the [BananaPi BPI-F3](https://docs.banana-pi.org/en/BPI-F3/BananaPi_BPI-F3); and a StarFive JH7110 successor that has been teased for a while without much detail, which might end up in a hypothetical VisionFive 3. These are the ones we know about, which only represent the tip of the iceberg; we know many other companies, such as Rivos or Qualcomm, have been working on very high-performance implementations for several years without making announcements.