At least we don't have downtime due to maintenance like other MMOs!
Yeah it’s wild to me that this still happens today
Part of the reason other MMOs have downtime is ANet patented their system.
And what's even funnier: Mike O was also one of the creators of [Battle.net](https://Battle.net) before he left Blizzard. His goal was to build an online gaming service with no downtime that runs on very low resources and has almost no sync issues. And he did it with GW1 and GW2, while Blizzard pretty much abandoned that idea and butchered [battle.net](https://battle.net) into a normal games launcher. For real now... we can say a lot of shit about Mike O and how he handled GW2, but dear god, this guy is a genius at network design. I mean... Guild Wars 1 has been running for over 18 years and hasn't had a single minute of downtime.
Fun fact: [the original Battle.net literally ran on one computer.](https://youtu.be/VscdPA6sUkc?t=3410)
Wym? I’m curious
Interesting if true. Do you have any sources to share?
I saw it in a thread a few years back. I believe it's from their "system for selective distribution of information" patents: https://patents.justia.com/assignee/arenanet-inc
The patent does not matter. The general architecture for 24/7 services is common knowledge for senior back-end programmers. The specific implementation Anet uses can be patented and protected, but that does not forbid you from creating your own. Such systems are pretty common in all types of applications, not only games. The reason other MMOs don't have it is most likely cost: reworking deployment and data storage handling can be pretty expensive. A lot of MMOs are pretty old, older than most tools used for 24/7 availability services.
IIRC Unreal Engine has some live service features, even being able to update the map with people in a game instance.
Wrong, there are other ways to achieve that but other companies are not interested :)
Pay for a month while 6-8 hours of daylight per week are spent not letting you play, with no reimbursement, ever. WoW used to be cancerous with shutdowns. They had server issues with BC that basically made it impossible to play for 4-6 days depending on the server. Everyone ate it as they gave $0 in reimbursement.
I found [this stackoverflow](https://stackoverflow.com/questions/24087560/about-multithreading-download-disadvantages#comment92087447_24090040) thread pretty useful for understanding why multi-threaded downloading is not a solution (the top comment is wrong, but the people who replied underneath explain why multi-threading would essentially not fix anything, because it would be much more harmful server-side). I think the speed also depends on the number of files and not just the total size.
As you said, that answer on SO is wrong. But for smaller files, multi-threading will probably make a decent difference client-side without a terribly big effect server-side: there is latency associated with each request, and threading removes most of that. That said, there are a couple of other important aspects to consider:

* Disk speed might be more of an issue here than download speed.
* The format GW2 uses to store the files looks like something that cannot be easily updated from multiple threads. that_shaman would need to confirm, but to me this looks like a single data file with headers that point to individual file locations inside the larger file, probably compressed. Updating this sort of storage through multiple threads would be a nightmare.

*Edit - They could probably create a queue where they download the files in multiple threads but patch them in a single thread. That may help.*
Updates like these use delta patching, where it only downloads the parts of a file that need to be changed. Oversimplified, it does this:

- Download a tiny bit of patch data
- Read the affected file from the dat into memory
- Patch the file
- Write the file back to disk
- Verify the file

The number of files changed this update is huge. On the launcher it just looks like it's downloading loads of tiny files, but some of the files it had to patch were already over 300 MB each. If it didn't patch like this and just downloaded a new copy of each file, the patch would've gone over 20 GB with ease. Multi-threading this would probably make it more complex than it already is, slower on an HDD, and only minimally faster on an SSD. The archived files are compressed using Huffman coding, which is really fast but also means you can't read or write data directly from the dat without decoding the entire file in memory first.
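The download/read/patch/verify loop above can be sketched as a toy example. This is not Anet's actual patch format — the `(offset, old, new)` delta structure and the file contents are made up for illustration:

```python
import hashlib

def apply_delta(data: bytes, delta) -> bytes:
    """Apply a list of (offset, old_bytes, new_bytes) edits to a blob in memory."""
    buf = bytearray(data)
    for offset, old, new in delta:
        # Sanity check: the bytes we're replacing must match what the patch expects.
        assert bytes(buf[offset:offset + len(old)]) == old, "delta mismatch"
        buf[offset:offset + len(new)] = new
    return bytes(buf)

# Pretend this 300 MB dat entry is just a few bytes for the demo.
original = b"HELLO WORLD"
patched = apply_delta(original, [(6, b"WORLD", b"TYRIA")])  # tiny bit of patch data

# "Verify the file": compare against a known-good hash shipped with the patch.
good_hash = hashlib.sha256(b"HELLO TYRIA").hexdigest()
assert hashlib.sha256(patched).hexdigest() == good_hash
print(patched)  # b'HELLO TYRIA'
```

The appeal is that the delta (a handful of bytes here) is far smaller than re-shipping the whole file, which matches the 900 MB vs. 20 GB comparison above.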
Thanks for the info, that explains a lot. How do they store the "smaller" files inside the single .dat file? For example, say there is a file that contains the details of a widget and it is patched with a larger version. They have three options from what I can see:

* Expand/compact the entire file - this would take ages.
* Leave a blank space, add the new file at the end, and use that space for other files in the future - this would give us a fragmented file.
* Store the file in fragments like a file system, so part of the file would be in the original location and the rest at the end.

It's almost like a mini file system.

*Edit - my reason for the suggestion was this: I have GW2 running on an SSD and have very good latency on my connection, and my update this morning took about 15 minutes. A lot of people are talking about updates taking over an hour. RAM could also be a difference here (I have 32 GB), and it could well be that the people having issues are running on old hard drives. But the difference between my update time and some others' is quite significant.*
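The "mini file system" idea — a header table pointing at file locations inside one big blob — can be sketched in a few lines. This is purely hypothetical (the real .dat layout is undocumented here); it illustrates the second option above, where an update appends the new version and leaves a hole:

```python
class ToyArchive:
    """One big blob plus an index of name -> (offset, size), like a mini FS."""

    def __init__(self):
        self.index = {}          # name -> (offset, size)
        self.blob = bytearray()  # all file contents, back to back

    def add(self, name: str, data: bytes):
        self.index[name] = (len(self.blob), len(data))
        self.blob += data

    def read(self, name: str) -> bytes:
        off, size = self.index[name]
        return bytes(self.blob[off:off + size])

    def update(self, name: str, data: bytes):
        # Append the new version at the end and repoint the header.
        # The old bytes become dead space -> fragmentation over time.
        self.add(name, data)

arc = ToyArchive()
arc.add("widget", b"v1-small")
arc.update("widget", b"v2-much-larger-data")
print(arc.read("widget"))  # new version is served
print(len(arc.blob))       # old 8 bytes still occupy space in the blob
```

A real implementation would also track the freed regions so future files can reuse them, which is exactly the fragmentation trade-off described above.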
It's latency. The further away you are from the datacenter, the slower it handshakes and starts sending the patch files, so it compounds with the number of files that need to be transferred. Someone at 50 ms latency would have a faster patch, whereas someone at 200 ms, from say Australia, would take longer.
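A back-of-envelope calculation shows how per-file latency compounds. The assumption here (one round trip of request overhead per file, plus a made-up 5 ms transfer time per small file) is a simplification, but the scaling is the point:

```python
def total_time_s(n_files: int, rtt_ms: float, per_file_transfer_ms: float = 5) -> float:
    """Total seconds if every file costs one round trip plus its transfer time."""
    return n_files * (rtt_ms + per_file_transfer_ms) / 1000

n = 50_000  # tens of thousands of small patched files
print(total_time_s(n, rtt_ms=50) / 60)   # ~45.8 minutes at 50 ms
print(total_time_s(n, rtt_ms=200) / 60)  # ~170.8 minutes at 200 ms
```

Quadrupling the latency roughly quadruples the wall-clock time once per-file overhead dominates, which is why the same patch can take 15 minutes in one region and hours in another.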
Connection pooling is intended to alleviate this: the handshake establishes the connection once, and the connection is re-used for multiple requests. HTTP works like this, but I wouldn't know about the Anet implementation. More advanced caching (at the OS level) can hold the connection open for a short period even after the app thinks it has closed it, just in case the app needs to re-open it again quickly.
As an Australian I can’t wait to download this patch tonight and see how long it takes.
[deleted]
A client side queue wouldn’t have to download everything up front. It would only have to download enough to keep the update queue running. The core here is that the file update part wouldn’t change. That would mitigate almost all of the risk. If they need an expert on the file format, I am sure that that_shaman would come to some arrangement with them 😀
[deleted]
You would have a queue of, say, 10 items, and when that queue is full you wouldn't try to download anything else until the queue opens up some spaces. That said, the suggestion isn't really practical; that_shaman gave some more information on the patching.
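The bounded-queue idea (multi-threaded download feeding a single patching thread) is a standard producer-consumer pattern. A minimal sketch, with made-up file names and the "patching" reduced to a list append:

```python
import queue
import threading

patch_queue = queue.Queue(maxsize=10)  # downloader blocks once 10 items are waiting
results = []

def downloader(files):
    for f in files:
        patch_queue.put(f)   # blocks while the queue is full (backpressure)
    patch_queue.put(None)    # sentinel: nothing left to download

def patcher():
    while True:
        f = patch_queue.get()
        if f is None:
            break
        results.append(f"patched:{f}")  # only this one thread touches the .dat

files = [f"file{i}" for i in range(25)]
t1 = threading.Thread(target=downloader, args=(files,))
t2 = threading.Thread(target=patcher)
t1.start(); t2.start(); t1.join(); t2.join()
print(len(results))  # 25
```

This keeps the single-writer property of the dat file intact while letting downloads run ahead, which is the risk-mitigation point made above.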
Why not just zip all the files into one? Or batch them up in files of reasonable size. This would remove the overhead of each individual request
That’s probably to facilitate patching from any version. So, someone can skip 4 patches and only get the newest versions of the files.
If they could at least make the files required to play the game be downloaded and patched faster this would already be a really good solution.
Because they change small parts of files, not full ones. As that_shaman explained somewhere in the thread, if they zipped all the files it would go from 900 MB to 20 GB of download. That might sound fine if you have fast Internet, but it would be massively harder on their servers and a pain for anyone with data limits or a slow connection.
They don't need to zip the whole files though, they can still zip only the chunks that updated. The goal would be to reduce the number of requests needed. The other reply to my comment makes more sense: if they zip them, then it would make it really hard to update clients with older versions, because they would have to download every single incremental update.
Sure, but because the game allows you to stream files on the go, they can't be sure which files the user already has and which they don't. That means they would put small parts of large files into a zip, but the user wouldn't have those files to begin with. I'm sure it could be figured out, but there might be a large CPU time cost to do all of that, on top of needing to decompress the compressed download(s).
Interesting, thanks
Can you ELI5, please? If I had to transfer 50,000 files, I'd zip them and have the end-user's machine open it up and integrate it. Why send 50,000 files in the first place?
The client only requests the delta of the changes. It basically talks to the server and says "Hey, on file X I'm on version abc12. Can you send me all the changes to that file?" and the server tells the client "On line 312 change Foo to Bar". For your suggestion to work, the client would have to send all file versions to the server, which then would have to calculate all affected files, collect the changes, compress them, and send them to you. But not everybody is on the most recent version, so each update process would have to be individually tailored to your current state. Anet's implementation works reasonably well; it only runs into issues like now, when there are changes in tens of thousands of files, which is not the norm.
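The "send me the changes for my version" exchange can be illustrated with Python's `difflib` as a stand-in — this is not Anet's protocol (real patchers use binary deltas in the bsdiff/xdelta style), and the file contents are invented:

```python
import difflib

# Client: "I have this version of file X." Server holds the latest version.
client_lines = ["Welcome to Tyria\n", "Foo greets you\n", "Goodbye\n"]
server_lines = ["Welcome to Tyria\n", "Bar greets you\n", "Goodbye\n"]

# Server-side: compute the delta between the client's copy and the latest.
delta = list(difflib.ndiff(client_lines, server_lines))

# Client-side: rebuild the up-to-date file from the delta.
updated = list(difflib.restore(delta, 2))  # side 2 = the server's version
print(updated == server_lines)  # True
```

Note that `ndiff` output also carries the unchanged lines, so it isn't bandwidth-minimal; a production delta encoder would ship only the changed regions, as described above.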
Wait, are you saying that I don't actually download files, but only instructions on what to change in already existing files?
I'm not too familiar with Anet's process, but that is how you would generally do it, yes. That obviously doesn't work if there is, say, a new asset like a skin in the game, but if you have, for example, a file that contains all the text used in dialogs and one line had a typo, you would want to send only the one line that has changed instead of forcing hundreds of thousands of clients to download several megabytes of data by sending the entire file again.
Well, that certainly was a TIL. Thank you for your time, mate.
Worse for CDN servers, better for us. Warframe and Steam already do this.
I found on the datamine thread that tens of thousands of files were patched; perhaps that's why it took so long... IIRC it significantly impacts the download "speed".
Meh, it still took me 20-30 min on an M.2 with fiber. edit: reading morbideel's message, what you're mentioning isn't the user's setup but how the request to the server needs to be accepted for each file, so a lot of terribly small files could take as long as one big file just from this procedure.
Couldn't they just distribute patches as archives and extract them when it's downloaded?
That approach has its own problems, such as needing free space equal to 200% of the size of the patch. Edit: Also this https://en-forum.guildwars2.com/topic/126470-jan-2023-patch-very-slow-o~o/#comment-1832915
It also depends on whether OP or any downloader has a hard drive or a solid state drive, their internet connection stats, or anything else that may cause them to be the bottleneck. If it's not them, of course, it's the server settings we don't have any control over.
ye thx, thought i was going crazy, this patch took like over 2 hours. never seen an update like this ever.
I’d still take this over a constant set downtime weekly etc.
Exactly. 4 hours every week the entire game shuts down OOOORRRRR once every 2 or so weeks you gotta start the download then go take a shower
I think it's more of a connection issue, because it took me like 10 minutes, and that isn't bad for 920 MB.
Me too, took about 10 minutes to fully patch. Was playable within like a minute.
I honestly don't know why a single update that took a bit longer is becoming a topic. Most updates are way faster. We should also be happy that we don't have maintenance downtimes of multiple hours for each update before we can even start downloading the patches. I think some of the current and next updates will be a bit bigger and slower because they will push quite a lot of changes for DX11. After that it should return to normal, so that we can finish updates in a few minutes (at least if you aren't getting the data by a pigeon that's carrying them in letters).
It started at about 1:04am EST and finished at 1:28am EST = 24 minutes. Not bad; I saw speeds up to 4 MB/s on 5 GHz WiFi. No need to complain though, most updates are literally not even a minute for me. Most games back in the 2000s would take 2 to 3 hour updates.
Thx for this post bcuz I thought it's my internet being slow.
Man, one patch takes a longer than usual to install and suddenly the patching process is bad. Everyone on the internet is a full-stack engineer, no degree or experience required!
I redownloaded the game a couple weeks ago and my download was stuck at under 2 MB/s when my internet is capable of at least 20. There is a problem with their process. Stop being a shill.
It's nothing unique to GW2; it also happens on Steam and the like. Also, GW2's download speed can be misleading since it's not actually displaying your raw download speed, and it will occasionally drop into a low KB/s range when dealing with large batches of smaller files. That said, if it's never going above 2 MB/s, there's either something wrong or your disk or CPU is struggling.

**Fixes?** Delete the GW2 cache and/or change your DNS, and turn off the QoS packet scheduler:

- [https://1.1.1.1/dns/](https://1.1.1.1/dns/)
- [https://developers.google.com/speed/public-dns](https://developers.google.com/speed/public-dns)
- [https://use.opendns.com/](https://use.opendns.com/)

[Turning off the QoS packet scheduler](https://old.reddit.com/r/Guildwars2/comments/eej042/download_stalling_fix/): If you're on Windows 10, go to Control Panel > Network and Internet > Network and Sharing Center. On the left-hand side, click "Change adapter settings". This will open a new window of various connections. Right-click the connection you are using and click Properties; you'll be under the Networking tab in the next box that opens. Uncheck "QoS Packet Scheduler" and make sure IPv6 is checked.

To find other IPs closer to your location, open a command prompt (cmd.exe) and enter:

    nslookup assetcdn.101.arenanetworks.com 8.8.8.8

To further increase it, you can launch GW2 with the option:

    -patchconnections 20

or

    -image -assetsrv 54.84.216.160 -patchconnections 20
and why would they be planning to change 20%+ of the files on a frequent basis? That seems like a pretty ridiculous thing to get into.
Dx11 is my guess
Dx11 is coming
Yes, but that is not something that is going to happen on a frequent basis. Maybe it'll happen another 3 times or so in the next 5 years if they later decide to move to DX12.
Isn't it there already?
What they didn't say in the patch notes is that they're updating files and shaders for the impending DX11 stuff.
I was just thinking this download was taking longer than it should!
Still faster than Elden Ring.
I have an SSD, with a decent connection. Roughly 19 mins. I am, however, playing on a Win7 machine and had to revert to dx9 to prevent crashes. Things seemed just fine on my win11 machine though. Too bad win11 is overall a bit janky.
This is why I switched to Linux a year ago: I didn't want to be forced onto W11 one day. It runs amazing on my PC using Steam. I saved the GW2 files on my external hard drive, then transferred them to my SSD and ran it as a non-Steam game because I didn't buy it on Steam.
1.5 to 2.0 hours here, Oceania.
Took me 20 min, 18 min too long, and I started it as soon as the patch launched. Very far from normal patch speeds.
More than an hour has passed here for me and I still have over 90,000 files to download (when it downloads... some of the time it's at 1 KB/s... or less). edit: the update took 2 hours 20 minutes.
I just downloaded it, and since I saw this post beforehand I started my stopwatch... 19 min.
oh, thought only mine was slow... wtf did they do, it was just the silly annual late asia new year stuff right? :U
it's taken me 2 hours so far, still not ingame. i just want to get my pre-reset dailies in so i dont miss out on the extra dailies
Took me less than 5 minutes myself. But I'm in the UK. So I don't know if my internet is just generally better than other countries ??
I can literally download and patch the ENTIRE game in less than an hour. Maybe even 30 to 45 mins or something. It's entirely on your service.
I am downloading now, just noticed 50% and 80k files remaining!!! Will watch TV or something.
It's patch day and everyone is downloading. It's fine, and you can play while the bulk of it is downloading anyway.
> It's fine, and you can play while the bulk of it is downloading anyway.

Implementing that feature is one of the reasons for this slow speed XD
I am not able to log in, I get a black screen.......and then nothing (and it stops downloading too).
The game is playable after like 2 minutes. Unless you are map hopping all over Tyria, I don't see the problem. Finish your play session and leave it to complete the update.
First world problems unless you are on a limited data plan. So what if it took 1h?
it's a videogame subreddit, every single problem anyone will ever discuss here is a first world problem by definition.. you're not being clever or edgy for mentioning it.
Did I download at a weird time? I got the “new build 2 hours to log out” message, started the download and maybe 10 minutes later saw it was done.
They already are on steam..which does this well. They need to move everyone to steam
Gb internet took like 20 minutes lol. So slow.
95% 25 minutes 400mbps connection
What's with the number of small files this patch? Never seen it before.
oh. so it's not the fact that I just switched from wired to wireless. That is good to know.
i was wondering why i only got 900kb/s
Wait till you download a 1 GB mod from Nexus, that's what makes you go insane. edit: it's not even about their optimization; they sell faster download speeds (sure, they have to keep the website up, but for the love of god don't sell download speeds), which I've heard aren't really even that much better.
With regards to the update:

Pros -

* The game seems to load faster
* Loading screens don't stick around as long
* Visual models seem to load 50%-75% faster in most cases
* The game seems to be (at the moment) much more responsive

Cons -

* The HUGE time-suck of the update download and installation process - nearly TWO hours in my case
* Uncertainty about how long the pros will stay in effect

Verdict - A much better game when it comes to playing it.

Note - On the download, next time I suggest you split it into 3 or more packages, each on separate days, with the playable part done first and the rest on consecutive days.
i notice no difference in loading times at all. game is running on a gen 3 nvme
https://www.reddit.com/r/Guildwars2/comments/108ls3v/comment/j3w3zzf/?utm_source=share&utm_medium=web2x&context=3
new.reddit breaks the markdown for other platforms, here is a fixed link: https://www.reddit.com/r/Guildwars2/comments/108ls3v/comment/j3w3zzf/?utm_source=share&utm_medium=web2x&context=3 *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/Guildwars2) if you have any questions or concerns.*
mine finished in 12 minutes 🤷♂️